Definition

A phenomenon in which an AI model generates output that sounds plausible but is factually incorrect or fabricated. Hallucinations occur because language models produce text from statistical patterns learned during training rather than from verified knowledge.
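A toy sketch of this mechanism (a bigram sampler, not any real LLM): the model picks each next word purely from observed word-pair frequencies, so it emits fluent continuations with no notion of truth.

```python
import random

# Hypothetical toy corpus: one true statement and one false one.
corpus = "the capital of france is paris . the capital of mars is olympus".split()

# Learn bigram statistics: which words follow which.
bigrams = {}
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams.setdefault(prev, []).append(nxt)

random.seed(0)
words = ["the", "capital"]
while len(words) < 8 and words[-1] in bigrams:
    # Sampling follows frequency, not verified knowledge:
    # "capital of mars" is generated as readily as "capital of france".
    words.append(random.choice(bigrams[words[-1]]))

print(" ".join(words))
```

The sampler can assert that the capital of Mars is Olympus with the same fluency as a true statement, which is the statistical-pattern failure mode the definition describes.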

Defined Term

Hallucination