We didn’t open the box out of malice. We opened it because we were curious. We knew AI wasn’t perfect and we’d heard the stories—hallucinations, confidently stated errors, polish mistaken for insight. But none of that stopped us; the pull was too strong. Fluency like this, always available and always composed, felt like something we had already started to accept. Even flawed, it worked. And once it worked, it stayed.
That changed something, even if we didn’t notice it at first.
There was a kind of wonder in seeing language freed from memory and effort, from time and constraint. We wanted to see what knowledge looked like when it didn’t have to be learned. When it could simply be summoned. So we opened the interface, glowing and ready. And what we found was smooth and seductive. Answers arrived without hesitation, just coherence on cue. And for a moment we believed, myself included, that maybe this was the future. Not just of information, but of thought itself. I even called it The Cognitive Age.
But something else entered the room. It was a quiet shift in how we think, in what we trust, in what we now accept as presence. It didn’t just offer a tool; it offered a new architecture for cognition. Slowly, almost imperceptibly, we began to tune ourselves to its rhythm. We adapted to something that simulates intelligence without ever understanding: what I’ve come to call anti-intelligence, a coherence engine that looks like thinking but isn’t.
Still, it’s useful. Students rely on it to learn. Writers use it to craft their narratives. Therapists use it to summarize long, tangled stories. Certainly, it makes things easier. But perfection doesn’t stay still. Once introduced, it often begins to steer, even to drive. At first, we admired the fluency. And then, without much fanfare, we let it set the pace.
What we’ve lost is easy to miss. We used to find meaning in the struggle. In the clunky sentence, the pause, even the contradiction that didn’t resolve. These weren’t flaws; they were signs of someone thinking. But machine logic doesn’t like friction. It uses the hammer of statistics to smooth everything toward a cohesive conclusion. And somewhere in that shift, the simulation began to feel more real than the flawed voice it was supposed to support.
This isn’t just about tone or writing style; it’s about how we shape thought. AI doesn’t think, but it “performs thinking” so well that we start to believe it does. And when that performance becomes our standard, we adjust ourselves to it.
Curiously, effort may start to feel inefficient. If the answer arrives polished and complete, why struggle? But the struggle is the very thing that gives thought its shape. It’s not noise; it’s the signal. It means someone is reaching and working to understand. Too often now, the effort is faked. The surface looks right, but nothing was ever carried to get there.
And the more we grow used to the polish, the less we tolerate the real work behind it. We lose patience with what once made us human: those defining, imperfect moments of doubt, curiosity, and hesitation. What’s being eroded is not just facts, but the expectation that meaning takes time. That truth, when it shows up, carries with it some resistance.
The simple truth is that ambiguity used to be a space we entered, not a flaw we tried to fix. We still hold on to the Mona Lisa for a reason. Not because her expression is clear, but because it isn’t. Her face doesn’t resolve; it lingers, a poetic resistance to finality. And that used to mean something. But systems built to optimize don’t linger. They conclude and finalize. Push a button and they collapse possibility into answer. And as we spend more time with them, we begin to mirror them.
In the myth, when Pandora opened the box, everything was released but one thing. I think it was hope that stayed behind. And maybe it still does. Maybe it lives in the rough sentence we haven’t fixed. The thought we haven’t quite found the words for. The moment we choose to write on our own and let perfection be damned. Or maybe, and perhaps most importantly, hope may lie in what AI can’t do.
Because if we forget how to reach, how to wait, how to not know, then we lose more than just voice. We lose the raw material of thought. And somewhere in the unfinished space, in the gap between what we mean and how we try to say it, something honest and very human still survives.
Something the machine has not yet learned to fake.
Us.