Glossary

Hallucination

What it is

When an AI model generates information that sounds confident and plausible but is factually incorrect. The model isn’t “lying.” It’s generating statistically likely text that happens to be wrong.

Why it matters

Hallucination is the single biggest risk in deploying AI for business. Any system that presents AI-generated information to customers or uses it for decisions needs guardrails: human review, confidence scoring, source attribution, or factual grounding through RAG (retrieval-augmented generation).
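One of those guardrails, factual grounding, can be sketched very simply: flag any generated sentence that shares little vocabulary with the retrieved source material. The function name, the word-overlap metric, and the 0.5 threshold below are all illustrative assumptions, not a real library API; production systems use far more robust techniques (entailment models, citation checking).

```python
# Naive grounding check: flag generated sentences with low word overlap
# against retrieved source text. Purely illustrative; the threshold and
# overlap metric are assumptions, not a standard.

def _words(text: str) -> set[str]:
    """Lowercased words with trailing punctuation stripped."""
    return {w.strip(".,!?").lower() for w in text.split() if w.strip(".,!?")}

def grounding_flags(answer: str, sources: list[str],
                    threshold: float = 0.5) -> list[tuple[str, bool]]:
    """Return (sentence, is_grounded) pairs for each sentence in the answer."""
    source_vocab = _words(" ".join(sources))
    results = []
    for sentence in answer.split(". "):
        vocab = _words(sentence)
        if not vocab:
            continue
        overlap = len(vocab & source_vocab) / len(vocab)
        results.append((sentence, overlap >= threshold))
    return results

# Example: the second sentence is unsupported by the source, so it is flagged.
flags = grounding_flags(
    "Paris is the capital of France. The Eiffel Tower is 900 meters tall.",
    ["Paris is the capital of France."],
)
```

A system using a check like this would route flagged sentences to human review or suppress them rather than show them to a customer.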

Related terms