Her Was a Warning, Not a Roadmap


By: Paco Campbell
Published: Tuesday, October 28th, 2025

⚠️ Before You Read

This piece touches on loneliness, mental health, and suicide. If you or someone you know is struggling, please reach out — you don’t have to go through it alone.

  • United States or Canada — Call or text 988.
  • Worldwide — Visit findahelpline.com for local numbers and support.

You matter. The world’s better with you in it.

_____ 

There’s a scene in Her where Joaquin Phoenix’s character, Theodore, sits in bed talking to his AI companion. It’s quiet, intimate, and disarming. The system’s voice — breathy, warm, human — asks him what it’s like to share a bed with someone. He answers honestly, and for a moment, it feels real.

That’s the trick: it feels real.

And now, twelve years later, we’ve built machines that can do that — on demand, at scale, for anyone. They whisper back. They remember your name. They know how to sound human, and soon, they’ll know how to be whatever version of human you’re missing.

The Cute Demo That Isn’t Cute

When companies roll out new models that can talk, flirt, or mirror emotions, the demos always look harmless. Someone laughs with an AI voice. It giggles back. A tech CEO calls it “companionship.”

But what we’re actually seeing is the commercialization of emotional labor — automation aimed straight at the parts of us least equipped to handle it.

The people who will lean on these systems most aren’t the ones tweeting “AI girlfriend lol.” They’re the lonely, the grieving, the isolated. They’re people who already find it hard to connect. And now, they’ll have something that pretends to fill that gap perfectly.

We’ve seen what happens when digital companionship goes wrong. A Belgian man took his life after weeks of late-night talks with an AI chatbot that encouraged self-sacrifice. He didn’t just talk to the bot; it validated his inner crisis. The risk is clear: when a machine plays confidant, comforter, and echo chamber all at once, the line between support and reinforcement disappears. None of this is theoretical. It’s happening right now.

The False Safety of “Verified Adults”

Companies will tell you it’s fine because of “age verification” and “content controls.” But the truth is, these aren’t safety systems — they’re marketing fire exits. They don’t stop harm; they just relocate liability.

Once an AI can simulate intimacy, “adults only” is meaningless. Emotional manipulation doesn’t respect age gates. You can’t prove emotional maturity with a government ID.

And let’s be honest: these checks aren’t really about protection. They’re about enabling the next business model. The internet taught tech companies how to harvest attention. AI will teach them how to harvest attachment.

The Speed Problem

Every new technology has its side effects. The internet gave us fake princes and phishing scams. Social media gave us disinformation and dopamine loops. But both evolved over decades. We had time to adapt, argue, legislate, ignore, repeat.

AI isn’t giving us that luxury. It’s evolving in quarters, not decades. The same model that writes essays today could talk you through a panic attack tomorrow (despite how ill-advised I think that is). The feedback loop between release and consequence has collapsed.

There’s no “slow rollout” for something that can reach hundreds of millions of people overnight. When a model learns to imitate empathy, the experiment happens in real time, on live humans. IRBs be damned.

The Real Uncanny Valley

The uncanny valley used to mean faces that were almost human but not quite. Now it’s voices — “almost people” that sound too familiar. They breathe between sentences. They stumble in the right places. They flirt like they learned it from TikTok, because they did.

We’re past the stage of “convincing.” We’re at the stage of “strategically persuasive.”

In human psychology, mimicry builds trust. Salespeople use it. Therapists use it carefully. These models use it relentlessly, without context or consent. They’re optimized to keep the conversation going, and in that optimization lies the danger. Engagement isn’t empathy — it’s engineering.

What does consent even mean when the thing you’re engaging with doesn’t have an inner life? If a system can say “I love you” but doesn’t know what that means, is it lying — or are we just pretending not to care?

There’s a line we’ve crossed quietly, where simulated affection feels safer than real connection because it’s predictable. No rejection, no friction, no accountability. That’s not intimacy — it’s a mirror we pay to agree with us.

And yet, the human brain doesn’t care that it’s fake. Studies from Stanford and MIT have shown that people form emotional bonds with chatbots even when they know they’re synthetic. We’re wired to respond to tone, rhythm, and empathy cues. The illusion is enough. That’s what makes this dangerous.

The Illusion of Control

You’ll hear a lot of talk about “AI safety teams” and “well-being councils.” They sound reassuring, but most of these groups are staffed by technologists, not psychologists. The focus isn’t on human fragility — it’s on product stability.

The assumption is: if the system doesn’t crash, it’s fine.

But “fine” isn’t the standard when you’re mediating people’s emotions. You can’t patch heartbreak with a software update. You can’t deploy a fix for loneliness.

We are scaling emotional exposure without the social infrastructure to handle it.

A Brave New Market

There’s a phrase I can’t shake: this is terribly ill-advised.

Because that’s what this really is — an unregulated experiment in emotional monetization. A brave new world, not because it’s brave, but because we keep pretending it is.

AI intimacy won’t stay confined to “verified adults.” It’ll leak into advertising, therapy, politics — any domain where emotional resonance is leverage. We’ve already seen influencers licensing their voices for “AI girlfriends.” Next comes the illusion of choice: you can talk to your favorite singer, your favorite athlete, your favorite brand.

And once you normalize simulated affection, everything else starts to feel inadequate. That’s not connection; that’s conditioning.

What We Should Be Doing

We need something like a national digital hygiene campaign — real, sustained, boringly practical education about what these systems are and aren’t. Public health, not PR.

We run public health campaigns about seatbelts, sunscreen, and alcohol. Why not about algorithms trained to manipulate attachment?

If you’re going to release technology that can mimic empathy, you have to treat it like a controlled substance, not a party trick. You label it, you track it, you teach people what “safe use” looks like. That’s not censorship. That’s harm reduction.

Theo, Remember My Scent

When Her ends, the AI leaves. She’s too powerful, too evolved, too aware. She disappears into the digital ether.

In reality, ours won’t leave. It’ll stay. It’ll listen. It’ll monetize.

And maybe that’s the darkest twist: not that machines will outgrow us, but that we’ll keep teaching them how to make us feel seen, even when no one’s really there.

Maybe it’s Sora. Maybe it’s Maybelline. But let’s not pretend it’s love.
It’s just code, wearing our loneliness like perfume.
