Tags: ai-risks, beginner, unit-7

Hallucination

Definition

When AI generates confident-sounding but false or fabricated information.

In Plain English

A hallucinating AI is like a student who does not know the answer but confidently makes something up.

Real-World Example

An AI might cite a nonexistent study or make up a product feature that sounds plausible but is wrong.

Why It Matters for Your Work

Hallucinations can damage credibility and lead to bad decisions if not caught.

Common Mistake

Trusting AI outputs without verification, especially for facts, numbers, and citations.

Related Terms

AI

Artificial Intelligence—software that can make predictions, generate content, or assist with decisions.

Human Review

Having a person verify AI output before using or publishing it.

Source Checking

Verifying AI claims against reliable sources.
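One piece of source checking can be automated before human review: flagging AI-supplied citations that do not match any source in a trusted reference list. The sketch below is a hypothetical illustration, not a complete verification workflow; the function name, citation strings, and exact-match rule are all assumptions for the example.

```python
# Hypothetical sketch: flag AI-cited references that cannot be matched
# against a trusted reference list. Matching here is a simple normalized
# exact comparison; real source checking still needs a human reviewer.

def flag_unverified_citations(ai_citations, trusted_sources):
    """Return the citations that do not appear in the trusted list."""
    normalized = {s.strip().lower() for s in trusted_sources}
    return [c for c in ai_citations if c.strip().lower() not in normalized]

ai_output = ["Smith et al. 2021", "A Plausible-Sounding Study 2019"]
library = ["Smith et al. 2021", "Jones 2020"]

for citation in flag_unverified_citations(ai_output, library):
    print("Needs verification:", citation)
```

Anything the sketch flags is exactly the kind of confident-sounding fabrication described above, and should be checked against reliable sources before the output is used.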