43 Comments
3 Questions Deep:

You are correct.

Great analogies.

It should be a must-read for everyone teaching.

Bette A. Ludwig, PhD 🌱:

I appreciate your comments. Thank you so much for the re-share and those kind words. I've been thinking about this a lot, and I actually wrote more about it in my weekly rundown that's coming out in a few hours.

We can't just do nothing and let students try to figure this out on their own.

Kim Doyal:

This is great, Bette.

I have nieces in college & a nephew entering his senior year of high school - they all use it all the time for just about everything they can (my kids are out of school & also use it all the time).

It's crazy to me how many legacy institutions/industries are so slow to adopt AI and really make use of it, create policy around it, etc.

The universities have a responsibility to their students to create standard policies, use cases, education, etc.

Hopefully they'll do it sooner rather than later.

Bette A. Ludwig, PhD 🌱:

I completely agree, Kim. But what I will say after working in higher education for 20 years is that it is incredibly slow to adapt to change. Something like this is so disruptive it's going to be jarring. In the meantime, students are left to their own devices, and they will find ways of using it. But I fear what that means without guidance.

MundaneMarvels:

Bette, thank you for this sharp and timely piece. I couldn’t agree more with your take on the current AI “free-for-all” in higher education. As you so effectively point out, this isn’t about banning or embracing AI... it’s about failing to teach students how to use it critically, responsibly, and reflectively.

We can’t keep expecting students to navigate this shift with one-line guidance or wildly inconsistent classroom policies. The cultural and cognitive integration of AI is already underway, not just in academic tasks but in emotional processing and self-regulation, as you illustrated so powerfully. If institutions don’t catch up fast, we won’t just be failing to prepare students for the future; we’ll be failing to equip them for the present.

Kind regards,

John

Bette A. Ludwig, PhD 🌱:

100%. And in my opinion, these institutions don't have a lot of time to sit on the sidelines, wait and see how all this shakes out, and then determine what measured approach they're going to take. They're going to have to start making decisions and incorporating this technology into the mix. Curriculums are going to need to be completely revamped. They just are.

Dr Priyanka Upadhyai:

I have complex feelings about AI. You are spot on that this is going to hurt education, and especially the future generations. And while I am reeling under the impact it has on students, I keep reading how AI is fast replacing therapy! On the other hand, I do use ChatGPT to perform admin tasks on the fly; it saves me some time, but then I have to cross-check everything, so who knows.

Bette A. Ludwig, PhD 🌱:

Yes, and that right there is the crux of all of this, right? You have to know the limitations, because I do the same thing - if I ask it for data or specific information, I have to cross-check it as well, because you can't trust that it's 100% accurate.

Will that change in the future? I don't know, but that's where we're at right now. So people who blindly believe whatever it spits out and then repeat that information or cite it are doing everybody, including themselves, a huge disservice.

Lisa Cunningham DeLauney:

We are way behind on regulation. Even with the internet, let alone AI. And young people are generally ahead of those setting the policy. So, we are relinquishing control or the chance to implement any strategy. But the scariest thing is the potential for machines to replace relationships. It's already happening. Not just with your friend seeking a therapist or coach. Many children now look upon their device as their best friend.

Bette A. Ludwig, PhD 🌱:

Yeah, Lisa. I don't know what the answers are, but we can't pretend it isn't happening. Unfortunately, I think a lot is going to fall to parents to decide how to deal with all of this, because K-12 and higher ed, especially higher ed, are so slow to adapt to any kind of change, and this isn't something where you can just sit back and wait to see.

Dee McCrorey:

Thank you for going into such depth on this topic, Bette. Indeed, it's a dirty little AI secret that is also depressing 😔

Bette A. Ludwig, PhD 🌱:

I really don't know what the answers are, Dee, and it's complicated. But what I do know is we can't pretend it's not happening. Unfortunately, I think parents are going to need to take it upon themselves to figure out a lot of how they're going to teach their kids how to use this technology.

Anyway, I've been thinking a lot about this lately, and I have more thoughts I'm going to post in my weekly rundown tomorrow.

David Crouch:

Interesting viewpoint. Not sure I agree with it. I think you oversimplify the issue and negate the solutions. First, no one - not one single person - understands how LLMs/LRMs actually work in terms of how they come up with what they do. It was an error in development to let them get this far with mega matrix algebra (how they actually work) without us building in proper monitors. Having lived through two AI winters, I'm not sure we won't have a third. The flaws with LLMs are multiple and insoluble with the current architectural and conceptual underpinnings. You state: “When used irresponsibly, it can do real damage.” I would suggest that there is no truly responsible way for us to use them short of incredible - what in my day was called - desk checking of every bit of output.

Just saying no at the university level IS the right answer in my view. Similar to the highly successful “just say no” to smartphones in K-12 classrooms. There is not one rigorous study that I have read that demonstrates learning improvement with the use of ungoverned AI; quite the opposite. In my view it is naive to expect that academic institutions should have already mastered and understood this complex and difficult technology and figured out meaningful policies. Where the real criticism should fall is on the vast echo of silence from governments in terms of appropriate AI safety and governance laws, despite many, many chances to do so.

Bette A. Ludwig, PhD 🌱:

So I guess what you’re saying is we should ban anything we don’t fully understand?

Pretty sure we didn’t fully understand electricity, indoor plumbing, the internet, or modern cars when they were introduced either.

And we definitely tried the “just say no” approach with sex ed and drugs in the ’80s. Pretty sure that didn’t work out all that well. Just telling teenagers not to do something is basically a guarantee they’ll do it faster. They’re already using it, David.

David Crouch:

Sorry, I continue to disagree with your entire line. Trivializing something as complex, rapidly changing, and as rapidly adopted as AI by comparing it to indoor plumbing is… well, quite astonishing. There is no footing for meaningful debate.

Of course they are already using it! I KNEW that. I note you stepped past my very valid and timely point about smartphones in K-12 and our documented research there, in favour of a comparison with sex and drugs. Again, no grounds for legitimate discussion.

I just read another profound study of the effects of Gen AI on students: “LLM users consistently underperformed at neural, linguistic, and behavioral levels.” As this was an academic research project, it was free from the necessary adjectives. I will add, after looking at the graphs: the underperformance was profound. AI poses a very critical negative force on our educational system; I have seen only the tiniest of use cases where it adds pedagogically.

Neela 🌶️:

If we don’t teach students how to use AI with intention and discernment, we’re not preparing them for the world they’re already in. I hope more educators read this and see the urgency.

Love this line, Bette - "Prompting is not thinking."

Happy Monday to you...

Bette A. Ludwig, PhD 🌱:

Been thinking about this a lot. I wrote some more about it today in my weekly rundown. There's going to need to be a complete revamping of curriculum, and given how slow education is to adapt, I don't even know what this is going to look like.

Happy Tuesday 😊

Neela 🌶️:

Well… in Linda we trust hehehehe

Happy Wednesday Bette…

Bette A. Ludwig, PhD 🌱:

That doesn't comfort me 😔 As I'm nearing the final third of my life, the direction we're going in is disheartening.

Neela 🌶️:

I’m not too far behind you hahahahaha

It is, and these days I think about this even more :(

Happy Thursday Bette

Bette A. Ludwig, PhD 🌱:

Yeah, it's a weird dichotomy because on the one hand, it's an exciting time with the technology and everything that we can do with it, but then the political direction we seem to be going in is disheartening.

Bette A. Ludwig, PhD 🌱:

That makes all this even more worrisome. We've got a person from the World Wrestling Federation running the Department of Education.

Neela 🌶️:

This entire admin makes me feel like crying.

But hey, people voted for this…

Cecilia At The Kitchens Garden:

Yes, this is a great read, Bette. Citing references is silly - Perplexity gives us the sites it has scraped to find the info. The references are already written for us.

And, on top of that, each family has a different take on AI. I agree that it should be viewed as a useful resource. Then we train ourselves to edit what the AI offers with our own input.

Bette A. Ludwig, PhD 🌱:

You're absolutely right, Cecilia, that everyone has their own opinion, and it gets very emotional on both sides. My fear is that if we don't teach students how to use it, they're left to their own devices, literally and figuratively, and they will find ways of using it that may or may not benefit them at all if we don’t intervene in some way.

Cecilia At The Kitchens Garden:

Absolutely. Morning Bette! Learning how to learn is critical. AI can be an incredible tool to assist learning but not our only tool. Our kids are clever - it won’t take much to enable them to take control of this toolbox.

Bette A. Ludwig, PhD 🌱:

Isn't it amazing, Cecilia, how creative teenagers and young adults can be when it comes to finding alternatives and defying authority? But, yes, if we don't help them understand the limitations, what you can and can't do with it, what's realistic and what isn't, and what should and shouldn't be done, they're going to decide for themselves.

Cecilia At The Kitchens Garden:

Yes they are. And I am thinking that learning how to learn in a world with AI components needs to start in the home and with young students. Many adults (parents and aunties) are still using AI in the same way as we used Google. This is a whole new beast and deserves entire classes on how to use it and not be used by it. You have begun an excellent discussion, Bette!

Bette A. Ludwig, PhD 🌱:

Thank you, Cecilia. I really appreciate that. And I completely agree with you that unfortunately, a lot of this is going to fall to parents because the educational system is going to be slower to react.

So the parents or guardians or mentors who are able to learn this technology, understand it, set some guidance around it, or hire somebody who can help with that, like an AI coach or tutor, are going to be the ones who come out ahead.

Nik Pathran:

Insightful read, Bette! “Prompting is not thinking” captures the AI puzzle perfectly.

This really highlights a missing piece, "Structured thinking before structured prompting." Without guidance, students are building habits that bypass reflection.

And more importantly, without a foundational understanding of how AI reflects our inputs, we risk outsourcing thinking without ever clarifying what we believe.

When students don’t know how to engage critically, they miss the opportunity to shape who they become. Over time, this becomes not just an academic issue...but an identity issue.

Bette A. Ludwig, PhD 🌱:

100% Nik. I'm not sure people realize the ramifications that this is going to have, especially in education, because it's going to impact students for decades to come.

What we decide now is going to have a lasting impact, and with teachers so maxed out and higher education incredibly slow to respond, it's going to fall on parents. I think the people who are able to get their children outside help, like AI tutors, are the ones who are going to get ahead in this.

Bonnie Marcus:

Thank you for addressing this important issue about AI education. It's so critical for all of us to have sound guidance on how to incorporate this tool into our careers and lives.

Bette A. Ludwig, PhD 🌱:

Strong leadership is urgently needed, especially in education. Entire curricula from K–12 through post-secondary will need to be restructured. This isn’t a short-term disruption. The choices we make now will shape entire generations for decades to come.

fenix:

It’s interesting, and honestly a bit unnerving, how differently AI is being handled across environments, whether in schools or workplaces. Based on your article, I do find it concerning that institutions aren't giving students proper guidance. Without clear policies and rules, there is the risk of repeating the mistakes of other tech revolutions (like the industrial age or social media), where lack of oversight led to major unintended consequences. With AI, no one even knows what blind spots we’re dealing with yet, which is why structure and education are so critical.

Bette A. Ludwig, PhD 🌱:

Exactly. Unfortunately, education is slow to respond to big changes like this. I really fear how this is going to evolve. When everybody is just making up their own policies within an organization, that's just a recipe for chaos and problems.

Hans Jorgensen:

Who writes the code for these "tools" and who profits? What are the long-term implications of asking a built-in hallucinating program to "think" for you? What happens when we anthropomorphize software? These are questions for students and institutions. Thanks for raising this important topic, Bette.

Bette A. Ludwig, PhD 🌱:

I agree. There need to be firmer policies in place, but this technology is evolving so quickly, and education is one of the slowest institutions to adapt to anything. I'm not sure where this is going to leave us.

Unfortunately, it's going to fall to the parents to step in, and those who have the luxury to hire someone to help tutor their children in how to use it are going to be the ones who come out ahead.

Hans Jorgensen:

Thanks for giving a heads-up to parents. The teachers I know are frustrated. Have a good week!

Bette A. Ludwig, PhD 🌱:

I'm sure they are because they're not given much help or guidance either. And when you have fuzzy policies, it leaves everyone feeling uncertain and not knowing what to do.

John Polonis:

You’re spot on that AI is a tool like anything else - teachers and parents can try to ignore it, but that’s like ignoring PCs because you don’t want to give up your typewriter.

Every part of society needs consistent rules of the road for how to use AI. The risks are too great otherwise.

The only thing worse could be ignoring it completely.

Great piece!

Bette A. Ludwig, PhD 🌱:

Thanks, John. Appreciate it! I agree. There have to be some policies in place. But the educational system is woefully underprepared for this, because entire curriculums are going to need to be revamped from K through 12 all the way to post-secondary. And with teachers so maxed out and higher education incredibly slow to adapt, I don't know where that's going to leave students.

Unfortunately, I think parents are going to have to take it upon themselves to potentially hire individuals to help their children learn how to use this tool.

Josh Gratsch:

I'm not sure which is worse: completely ignoring AI or not learning how to use it properly. If students are using it only to get answers and cheat, they're completely missing out on the real value of using GPT or similar tools as a personalized coach, advisor, or the like, to challenge thinking and appropriately expand learning. If students are going to cheat and limit their learning capacity, they will do so regardless of AI tools. If that's all professors are worried about, they're missing the bigger picture. The bottom line, from my perspective, is that AI education is not only necessary but also prudent.

Bette A. Ludwig, PhD 🌱:

Absolutely, Josh. Teaching students how to use this is non-negotiable. But there have to be clear policies in place, and the entire educational curriculum is going to need to be completely revamped from K through 12 to post-secondary. I can tell you they are not prepared.

You really have to play with it to get a better understanding of the capacity it has, and teachers are already maxed out. Higher ed is so slow to adapt to anything that they have committees to form committees to form a committee. Unfortunately, I think parents are going to have to take it upon themselves to step in, maybe even hiring coaches to help with it.
