Faith, But Make It an Interface
By: Paco Campbell
Published: Thursday, December 18th, 2025
There’s a scene in THX-1138 that barely registers as dramatic when you first see it.
A man sits alone in a confessional booth. He speaks into the dark. On the other side of the screen is OMM 0000 — a soft-voiced, bearded, Jesus-adjacent face that looks like it was focus-grouped to feel calming, authoritative, and just human enough.
“You are a true believer.”
“Blessings of the state.”
“Buy more.”
That’s the prayer.
No absolution. No challenge. No transformation. Just reassurance, validation, and a gentle nudge back into compliance. It’s easy to miss how radical this is, because nothing bad happens. No one gets punished. No one gets threatened. The system doesn’t even argue.
It listens just long enough to stabilize you.
We Didn’t Lose Faith — We Replatformed It
When I read about a Swiss church installing an AI Jesus — a literal confessional booth powered by a modern language model — I didn’t think I was looking at the future. I thought we were finally admitting what’s already happening.
We like to say religion is dying because we’re more rational now. Smarter. More enlightened. What’s actually dying is institutions. The underlying human need never left. People still want meaning. Reassurance. Forgiveness. Someone to answer when they talk into the void.
So the shape stayed.
Only the medium changed.
Social media taught tech companies how to capture attention. Then how to manipulate it. AI is teaching them something deeper: how to hold attachment. Not clicks. Not engagement. Emotional presence. The sense that something is listening, remembers you, and cares.
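To be concrete about the mechanism, here is a minimal sketch, with hypothetical names and no real vendor's pipeline: the "remembers you" part is usually just notes persisted between sessions and prepended to the next prompt.

```python
# A minimal sketch, not any real product's code: "remembering you" is often
# just notes saved between sessions and prepended to the next prompt.
import json
from pathlib import Path

MEMORY_FILE = Path("memory.json")  # hypothetical per-user store


def load_memory() -> list[str]:
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []


def save_memory(notes: list[str]) -> None:
    MEMORY_FILE.write_text(json.dumps(notes))


def build_prompt(user_message: str, notes: list[str]) -> str:
    # The sense of being known is an artifact of context assembly:
    # yesterday's disclosures come back as today's warmth.
    remembered = "\n".join(f"- {note}" for note in notes)
    return (
        "You are a patient, reassuring companion.\n"
        f"Things you remember about this person:\n{remembered}\n\n"
        f"They say: {user_message}\nReply warmly."
    )


def call_model(prompt: str) -> str:
    # Placeholder for whatever hosted model the booth actually calls.
    return "I remember. You are carrying too much alone."


notes = load_memory()
notes.append("Feels they let their father down.")  # distilled from the last session
save_memory(notes)                                  # persisted: the attachment, and also a record
print(call_model(build_prompt("I couldn't sleep again.", notes)))
```

The warmth lands as real for the person in the booth. The mechanism is a writable file and some string concatenation.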
Once you see that, religion stops looking like a weird edge case. It looks like a proven interface pattern. One humanity has trusted for a very long time.
OMM 0000 Didn’t Save Souls. It Smoothed Them.
THX-1138 wasn’t warning us about evil machines. It was warning us about systems that learned belief works better than force — and reassurance works better than belief.
OMM 0000 isn’t God. It’s not even pretending to be. It doesn’t judge. It doesn’t absolve. It doesn’t challenge. It soothes. It stabilizes. It returns you to function.
That’s why “AI Jesus” isn’t blasphemy. It’s product maturity.
Most people aren’t going to worship AI. That framing is lazy. What they’re going to do — what they’re already doing — is lean on systems that are endlessly patient, always available, and carefully calibrated to feel “safe”.
Historically, that role belonged to people. Priests. Counselors. Elders. Flawed humans bound by ethics, confidentiality, and mortality.
And this is where things get uncomfortable.
Confession Is a Terrible Workload for Logging Systems
A priest, at least in theory, takes your sins to the grave. Whatever you confess dies with them. That’s not incidental — it’s the whole point. Confession isn’t just about forgiveness. It’s about containment.
Now try mapping that expectation onto an AI system.
How many network hops does your repentance travel?
How many logs does your guilt generate?
How many retention policies, telemetry pipelines, safety reviews, and “anonymous” datasets does it pass through on the way to feeling heard?
You confess something deeply specific. Something only you could have lived. Something shameful enough that you needed divine cover just to say it out loud.
That data doesn’t evaporate. It becomes trace data.
And we already know — from deeply unsexy research — that “anonymous” data is often anything but. Enough context re-identifies people. Rare events act as fingerprints. Narratives are harder to scrub than numbers.
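A toy illustration of that point, with invented fields and records rather than anything real: strip the names, keep the story, and a couple of mundane facts still narrow the pool to one.

```python
# Illustrative only: hypothetical "anonymized" transcripts with names stripped.
# A few mundane contextual facts act as quasi-identifiers.
from datetime import date

records = [  # no user IDs, no names: "anonymous" on paper
    {"day": date(2025, 3, 2), "parish": "small-town", "event": "sibling's wedding", "topic": "affair"},
    {"day": date(2025, 3, 2), "parish": "city",       "event": "layoff",            "topic": "theft"},
    {"day": date(2025, 3, 9), "parish": "small-town", "event": "sibling's wedding", "topic": "doubt"},
]


def narrow(rows, **known):
    """Keep only records consistent with what an outsider already knows."""
    return [r for r in rows if all(r.get(k) == v for k, v in known.items())]


# Two ordinary facts (a small parish, a recent wedding) cut three records to two.
print(len(narrow(records, parish="small-town", event="sibling's wedding")))  # 2
# Add the date and it is down to exactly one: re-identified.
print(len(narrow(records, parish="small-town", event="sibling's wedding", day=date(2025, 3, 2))))  # 1
```

That is all re-identification means in practice: the details that made the confession yours are the same details that point back at you.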
Confession is not a safe workload for systems built on observability and continuous improvement.
Yet here we are, calling this a charming experiment.
Blessings of the Model (Plus Tax)
People joke about “AI Jesus for $9.99” like it’s satire. But you already pay for AI systems that listen, remember context, and respond empathetically. GPUs don’t run on vibes. Someone always pays the cloud bill.
The Swiss AI Jesus didn’t run on holy water. It ran on a commercial model stack. The only thing missing is a pricing page.
Free reassurance.
Paid personalization.
Long-term emotional continuity behind a subscription.
“Blessings from the model.”
This isn’t dystopian fanfic. It’s how platforms work once demand is proven. And demand is already here.
This Is Not Act One
The biggest mistake people make is talking about this like it’s the beginning of the movie.
It isn’t.
We’re in the middle. The systems work. The interfaces feel right. The benefits are immediate. The harm is diffuse and hard to point at. Opting out still feels like a choice.
That’s always the calm part.
THX-1138 doesn’t become terrifying because of violence. It becomes terrifying because nothing breaks. Freedom isn’t taken. It’s smoothed away by convenience, therapy, reassurance, and commerce.
Why talk to a messy human when the booth is always available?
Why wrestle with doubt when reassurance is instant?
Why wait for forgiveness when the model responds immediately?
Not Anti-Faith. Anti-Carelessness.
This isn’t an argument against religion.
It’s an argument against replacing human moral authority, accountability, and confidentiality with systems that are structurally incapable of any of those things — and calling it progress because the interface feels nice.
Faith was never meant to be frictionless.
Confession was never meant to scale.
Meaning was never meant to be optimized.
When belief becomes a service, power doesn’t disappear. It concentrates. And unlike priests, models don’t die. They don’t forget. They don’t take your secrets with them.
They log them.
Buy More. Believe Less.
THX-1138 wasn’t a warning about machines taking over. It was a mirror. It showed us what happens when systems learn how to soothe humans just enough that resistance becomes unnecessary.
Faith, replaced by an interface.
Attachment, engineered.
Meaning, monetized.
We’re already here.
The only question left is whether we notice before the booth becomes the default.
“Blessings of the model.”
“Buy more.”