AI Doesn’t Scare Me, You Do

Are you scared of AI? I'm not. I'm scared of us.

I'm scared of what people will do with it. I'm scared of how it will accelerate behaviours that have been developing since Facebook — and honestly, those were the good days. The iPod. Logging on after work to talk to friends. Online gaming that actually meant something. None of it will ever be the same, and we didn't even notice it leaving, or notice how we got here.

My fear isn't AGI. It's not some sentient machine making decisions for humanity, or even the question of who controls it. My fear is quieter and, I think, more honest: I'm afraid of what we'll become without ever developing what we were supposed to be. If the way we develop as a species is with mechanical chip implants instead of opening our hearts to new values and humanity's importance, we're going to be in trouble.

I'm afraid creativity is leaving. Not because AI creates — but because we'll stop needing to. I'm afraid of what's real and what isn't, not just in content, but in people. In the values they hold, the intentions behind them, the way this technology keeps multiplying advantages for those who already have them — while the person with genuine talent, real core strengths, something true inside them, gets quietly replaced by someone with a better prompt, a better visual.

I'm afraid of the anger that's coming. When people spend months, years, rebuilding a sense of purpose after a machine does their job faster and cheaper — and nobody prepared them for that. The mental health cost of that hasn't even arrived yet. Covid nearly broke people's mental health: the isolation alone, the massive job loss, the change in the economy, the way the United States fueled entertainment over political rights. I'm not the same, and I'm not okay about it.

And here's what I keep coming back to: AI isn't just a tool that frees us from labor. It's a tool that fits perfectly into the hole that scrolling already made. It feeds the same appetite — for confirmation over feeling, for the written word over the one lived, for the answer over the uncertainty that actually teaches you something. Real human experiences.

If one problem arises, say we lose our job and need a new career, there's an app for that. Whatever it is, there's an app for that, and anybody can own a company now if they act fast enough. Hurry up while supplies last and you can grab the last app idea before you're completely broke and shit out of luck. Have Codex code it for you.

AGI will never have a beating heart. It will never have a stomach that senses when something's off. And I'm afraid we're building a world where we forget that we do, because it was already a problem before: the lack of compassion, the inability to stop something terrible from happening within an already corrupt system.

There is no pure soul. Not even a baby arrives clean — they're already being written into their parents' drama, their genes, their lifestyle, their unresolved everything. The path is already altered before the first word is spoken.

You deserve to understand how AI works. Not the marketing version, not the sci-fi narrative — the real one. Because it's a bit like raising a child. You shape it, you feed it data, you instill values — except in this case, everyone is doing the raising, all at once, with different intentions, and no one is fully in charge. Like Chappie. Except there are billions of them, and they're not in a movie.

Everyone is going to be using this. And I'm not sure that's safe — not for companies, not for humanity, and certainly not for the person sitting in a classroom right now trying to figure out what's worth learning anymore, or if they should end their lives.

The people who will actually live alongside AI — normal people, people with different perspectives, people who aren't angling for a TED talk or an Instagram Reel boost — their voices are barely in this conversation. Instead we get outrage cycles. Someone shares what a tech CEO said in an interview and suddenly everyone feels worthless, or threatened, or behind. The anger is real. But anger without access to the truth doesn't solve anything. It just makes the people in charge more motivated to manage the narrative.

If enough people get loud, they won't necessarily get answers. They might just get fewer of them. The classified details stay classified. The sci-fi story gets scarier. And the gap between what's actually happening and what we're allowed to know gets quietly wider.

This isn't a tech problem. It never was. It's a question about what we value in each other, and in ourselves. And right now, nobody is powerful enough to change how this will go. I don't know what comes next. But I know what's being lost. And I think, on some level, so do you. That feeling in your gut right now? That's the thing no model can replicate.

Don't ignore it.
