AI-Feral Youth: Idiocracy Wasn’t Supposed to Be a Documentary

Photo by Fr0ggy5 / Unsplash

By: Paco Campbell
Published: Tuesday, December 2nd, 2025

The first time I saw Idiocracy, I laughed at the scene with the half-built highway — those giant concrete columns arcing into the sky like the ribs of a dead colossus. A perfect visual gag: civilization had stalled mid-sentence. A society too confused to finish what it started.

Years later, I moved to Pflugerville, TX, and drove past that exact stretch of highway (SH 130 and SH 45), now fully completed: smooth, busy, functional. Built by grownups who understood what they were doing. Engineers poured concrete, did the math, followed the plans, and delivered exactly what they were supposed to.

The movie captured an image of failure.
Real life delivered the reality of competence.

And that’s when it clicked:
Idiocracy wasn’t predicting stupidity.
It was predicting a world where systems advance faster than the wisdom to use them.

A world where the columns get built — but no one remembers why.
A world where the machines get smarter — but the people don’t.
A world where the thinking is outsourced so efficiently that cognition becomes optional.

Which brings us to AI.
And kids.
And parents.
And the enormous epistemic sinkhole we’re digging under our own feet.

The Myth of the “AI-Native”

Let’s talk about a LinkedIn post I came across: a mom proudly sharing that her young kids were arguing about whether to go to the park or watch a movie… and decided to ask ChatGPT to choose for them.

She called them “AI-natives.”

And I had to sit with that for a second. Because we used to call kids “digital natives” when they figured out how to operate an iPad at age three — which isn’t a sign of gifted cognitive development. It’s basic cause-and-effect. Touch surface, thing responds. Bingo. Even dogs can do that if you smear a little peanut butter on the screen.

We weren’t witnessing early genius. We were witnessing evolutionary reflexes colliding with polished glass. Kids poke things. The things now glow back. That’s not sorcery — that’s biology with good marketing.

But “AI-native” is different.
It implies these kids are growing up in a world they intuitively understand.

Except they don’t.
And neither does their mother.
And neither do we.

You can’t be native to a system your elders don’t understand.
You can’t inherit wisdom that doesn’t exist yet.
You can’t “grow up fluent” in a thing that is still fundamentally an opaque box to the people raising you.

And here’s the contrast that really drives it home for me: my 10-year-old nephew — a kid who’s clever enough to reverse-engineer a toy projector — isn’t allowed to use AI directly. His mom (my sister) makes a point of telling him it’s just a tool, not an oracle. Last Christmas, we gave him this little tabletop projector that casts outlines onto a drawing surface. It comes with proprietary “space,” “sea,” and “dinosaurs” SD cards. Cute, structured, guided.

What does he do? He asks his mom to photograph him in his Halloween costume and run the photo through GPT to cartoonify it. Then he loads the cartoon into the projector and hand-draws himself for a school assignment. A tiny act of kid-level creativity built on top of adult-level supervision.

He’s not an AI-native.
He’s a kid with adults teaching him where the edges are.
A kid using AI correctly because an adult showed him it’s a tool, not a brain replacement.

That’s the difference.
That’s the missing ingredient everywhere else.

These kids aren’t native.
They’re AI-feral — learning cognitive behaviors from a machine that doesn’t possess any.

We used to outsource bedtime stories.
Now we outsource judgment.

And we pretend that’s advancement.

One Lone Teacher With a Flashlight

Then I bumped into another post: a teacher who has absolutely had it with AI essays.

So she gave her class a counterintuitive homework assignment:

“Ask ChatGPT to write your report.
Then research the topic and figure out where the AI is wrong.”

This woman is fighting the entropy.
She’s teaching epistemic hygiene.
She’s building the cognitive antibodies the rest of us keep dissolving in a warm bath of convenience.

Her students learned something real:
AI can be eloquent and dead wrong at the same time.
Confidence is not competence.
Fluency is not understanding.

But here’s the thing:
She’s one teacher.
One classroom.
One semester.

She is the exception, not the norm.
She’s handing out flashlights in a cave system where most adults are still asking the stalactites for directions.

(And there’s no reception down there to ask ChatGPT if the stalactites are the ones coming down or going up.)

Meanwhile, In the Idiocracy Supermarket

And then there’s the comedy sketch.

A customer walks into a store.
A cashier greets him.
But neither knows how to respond without asking ChatGPT what the other person said and what to say next. It escalates beautifully into a miniature emergency, culminating in someone calling 911 — and the dispatcher also checking ChatGPT for what to do.

They all run out of tokens. Everyone freezes.

It’s hilarious because it’s absurd.
It’s terrifying because it’s trajectory.

This is exactly how Idiocracy treated expertise.
Not as stupidity — but as abandonment.

In the movie, President Camacho isn’t dumb; he’s helpless without the teleprompter.
Society doesn’t fail because people can’t think; it fails because nobody does.

The comedy sketch is a modern version of watering crops with Brawndo because “it’s got electrolytes.”
Not because anyone understands why, but because the machine said so.

AI doesn’t need to be malicious to create chaos.
It just needs to be convenient.

Even CEOs Are Nervous

Brian Chesky, CEO of Airbnb, recently said something shockingly sane for a tech CEO:

If AI does all the entry-level jobs, no one ever learns how to lead.

That’s it.
That’s the whole plot of Idiocracy in one sentence.

Leadership comes from:
• making early mistakes
• solving real problems
• gaining judgment through friction
• learning how systems behave under pressure

You don’t become a leader by skipping to the boss level.
You don’t get wisdom by outsourcing your first ten years of cognitive reps to a chatbot.

Chesky’s point wasn’t about job security.
It was about the succession plan for human competence.

If we automate the apprenticeship of thinking, we won’t have a future generation that knows how to finish the highway once the columns go up.

AI-Feral Youth Aren’t Failing — We Are

The danger isn’t that children are using AI.
The danger is that adults are modeling blind trust.

We:
• ask AI if we’re parenting correctly
• ask AI to make household decisions
• ask AI to interpret tone, diagnose emotion, settle arguments
• ask AI for answers before we teach kids how to ask questions

We are raising a generation inside the warm, glowing confidence of machine output without teaching them the cold, necessary skill of interrogating it.

We’re not entering an Idiocracy because people are getting dumber.

We’re entering an Idiocracy because the cognitive supply chain is collapsing:
• no elders who understand the system
• no apprenticeship of judgment
• no friction in decision-making
• no tolerance for ambiguity
• no intellectual lineage

A society can survive ignorance.
It cannot survive unexamined assurance.

Idiocracy’s joke was that people forgot how to think.
Ours is that we call it efficiency.

The Overpass Was a Warning Shot

The columns from Idiocracy became a real, functional highway because real engineers finished the job. They had the training, the knowledge, the diagrams, the mentorship, the lineage.

No one is building that cognitive infrastructure for AI.

The columns are going up — fast.
The system is rising — fast.
The kids are absorbing it — fast.

But the adults?
We’re still admiring the scaffolding and telling ourselves it’s a finished bridge.

There’s no such thing as an AI-native.

Only AI-feral kids being raised by AI-feral institutions, guided by AI-feral adults who are thrilled that something else will think for them.

The danger isn’t the machine.
It’s the silence between humans.

And the highway won’t finish itself this time.
