We’re Teaching Students to Fear the Future
We’ve seen this movie before — pens, typewriters, the internet... and now AI


I write about leadership, exploring AI, and why teaching critical thinking about it is more important than ever.
Please hit the heart ❤️, restack 🔄, subscribe 📨, and all that jazz to help spread the word! 🙌
👉 This isn’t just a piece on AI. It shows how fear is taught, how questions get shut down, and why students end up confused instead of prepared. From high school to college, the pattern stays the same. Rules get protected, even when learning suffers.
When I worked in higher education, the complaint I heard most often from students was that their professors were out of touch. They weren’t wrong. Many faculty spent their entire careers inside academia. They finished their degrees, went straight into teaching, and never actually worked in the fields they were supposed to be preparing students for.
That’s the typical trajectory for career academics.
The result is classrooms that often feel sealed off from the outside world. It's hard to connect course content or general education requirements to real life. Students are left wondering how any of it is relevant to the workplace. I get it. I felt the same when I was in college.
Now I’m watching the same thing happen with AI, only faster and with more panic.
Mixed messages
Look at what’s happening in K–12 right now. Some high schools have written into their handbooks that any use of AI is cheating unless a teacher gives explicit permission. That means even asking a tool, “How do I strengthen this argument?” is off-limits.
A video that recently popped up on Instagram (credit to Sydney Sullivan, PhD for flagging it in her recent post) showed one solution colleges are leaning into: a Google Docs writing report. The report captures what the student wrote and their entire process of changes and edits. GPTZero then analyzes the results for natural human tone, how long the writing took, and the literal number of edits.
In Arkansas, lawmakers decided the best response was to ban cell phones with a new law, the Bell to Bell, No Cell Act (thanks to Karen Spinner for pointing this one out). It means no phone use from the start of the school day to the end, including lunch, recess, and passing periods between classes. It's partly about social media, but probably mostly because they didn't want students pulling out ChatGPT.
The message is blunt. Don’t touch it. Don’t ask questions. Just stay away.
Old patterns
Then those same students get to college, and the narrative flips. Suddenly, they're told they can use AI as long as the faculty member allows it and they cite it. But how do you cite something you've never been taught to use responsibly in the first place?
It’s like banning calculators for years, then expecting students to show up in calculus class already knowing how they work. Only in this case, the calculator can also help you draft your cover letter or organize a presentation.
Education keeps stepping back into a time capsule, trying to teach 21st-century students with 20th-century methods. And the result is predictable: disengagement.
Here’s a little history lesson of things you may not have known or just never really thought about:
Ballpoint pens were discouraged in many classrooms in the 1940s and '50s because early models tended to leak and produce messy writing. Some teachers also believed fountain pens developed better handwriting.
Typewriters were resisted for decades because real writing was supposed to be done by hand.
The internet was dismissed as unreliable and just a distraction until suddenly everyone expected students to do research online.
In each case, schools wasted years telling students not to use something the rest of the world had already adopted. These were cutting-edge technologies that got side-eyed at the time but went on to reshape our daily lives.
Framing the narrative
But let’s look beyond everyday tools, and how concepts or ideas are communicated to shape how we think about them. Take Vietnam. The official explanations didn’t match the reality on the ground, and by the time the gap was obvious, the country was divided. Iraq followed a similar pattern.
Whether or not you agreed with the wars, the point is the same: when institutions spin something as entirely right or entirely wrong, people eventually stop trusting them.
And before anyone decides to blow up my comment section, I am not comparing AI to war. It’s that the framing is the same in how institutions decide what’s acceptable. They label something as good or bad, useful or dangerous. And they leave no room for critical thinking in between. You are left to pick a side.
That middle space, the one most people occupy, gets silenced. And right now, with classes about to start, students are feeling that most of all.
Missed lessons
The real danger isn’t whether a tenth grader uses ChatGPT to polish an essay. The real danger is that students aren’t being taught how to decide what they should and shouldn’t use it for. They aren’t being equipped to evaluate when it helps, when it harms, and how to tell the difference.
What they are experiencing is the stress of it all. What they are learning is that adults fear it and don’t want them to use it. And we are talking about teenagers here. Telling them something is forbidden is just begging them to want it. Making it taboo gives it even more power than openly discussing it ever would.
When we shut down conversations, we also eliminate the chance to teach students how to think for themselves. And history has already shown us what happens then. The backlash always comes.
TL;DR (But not really):
Teaching students to fear AI doesn’t protect them. It leaves them unprepared for the world they’re stepping into.
What this solves:
It shows how fear-based rules repeat history, shutting down tools instead of teaching how to use them well.
Why most people will read this post:
Because they’ve seen classrooms and workplaces cling to outdated methods long after the rest of the world moved on.
Why it matters:
Students aren’t learning how to think critically about AI. They’re just learning not to touch it. That gap will cost them later.