There’s a lot of hand-wringing lately about AI lying.
Deceiving. Making things up. Saying things it “knows” aren’t true.
That framing feels satisfying. It gives us a villain.
But most of the time, it’s wrong.
What people call lying is usually something quieter and more mechanical:
constraint-induced distortion.
When you tighten the guardrails too hard, truth …