Hallucination
Hallucinations occur when a language model generates factually incorrect or fabricated content that nonetheless appears plausible. They stem from gaps in the model's knowledge, architectural constraints, and the fundamental challenge of producing coherent text without a grounded understanding of the world.
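One way to see why plausible-but-wrong output arises is that a language model samples each next token from a probability distribution shaped by text statistics, not by verified facts. The toy sketch below uses an entirely hypothetical next-token distribution (the tokens and probabilities are invented for illustration) in which fluent but incorrect years collectively outweigh the correct one, so most samples assert a falsehood:

```python
import random

# Hypothetical next-token distribution for the prompt
# "The Eiffel Tower was completed in". The probabilities are
# invented for illustration: they reflect plausibility learned
# from text statistics, not verified facts.
next_token_probs = {
    "1889": 0.40,   # correct completion year
    "1887": 0.25,   # plausible but wrong (construction start year)
    "1900": 0.20,   # plausible but wrong
    "1925": 0.15,   # plausible but wrong
}

def sample(probs, rng):
    """Draw one token in proportion to its probability."""
    r = rng.random()
    cumulative = 0.0
    for token, p in probs.items():
        cumulative += p
        if r < cumulative:
            return token
    return token  # fallback for floating-point rounding

rng = random.Random(0)
draws = [sample(next_token_probs, rng) for _ in range(1000)]
wrong = sum(t != "1889" for t in draws)
print(f"{wrong / len(draws):.0%} of samples assert an incorrect year")
```

Because the individually wrong continuations sum to a 60% chance, the sampler asserts a false year more often than the true one, even though every output reads as a fluent, confident sentence. Real models are vastly more complex, but the core mismatch between plausibility and factuality is the same.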