When ChatGPT FAILS: What a service outage taught me about learning with AI
Roy Lo / March 14, 2025
Last autumn, during final exam week at the GSB, ChatGPT experienced a service outage. My classmates and I panicked.
To be clear, most Stanford GSB classes allow and even encourage students to use AI tools for assignments and exams. No one was cheating. However, that feeling of helplessness and the sudden realization of my dependency on AI made me seriously reconsider how I should use AI in my learning experience.
Prior to Stanford, I worked as a product manager at an AI tech company. I witnessed the rise of ChatGPT and generative AI as an industry insider and adopted several AI tools to boost my team’s productivity. Now, returning to school after 15 years, I'm noticing we use AI tools daily—for writing assignments, project ideas, drafting outreach emails, solving finance models, and optimizing our exam grades. But I question whether I've truly learned more than before, or more precisely, whether I still remember the knowledge I've acquired.
Tony Fadell, author of the bestseller "Build: An Unorthodox Guide to Making Things Worth Making," said: "Humans learn through productive struggle, by trying it themselves and screwing up and doing it differently next time."
AI technology eliminates this "productive struggle," threatening deeper learning, creative thinking and intellectual independence.
This isn't just a concern for elite business schools. As AI tools become increasingly embedded in education at all levels, we face profound societal implications.
The renowned researcher Manu Kapur, who developed the theory of productive failure, challenges the common assumption that learning should always feel comfortable or easy. Instead, he suggests that the presence of struggle might be evidence that meaningful learning is taking place.
He said: "Being frustrated and struggling are normal things. In fact, if you're not feeling those things, that means you're probably not learning."
There's also a growing divide between those who use AI as a shortcut and those who use it as an enhancement. Stanford education professor Candace Thille, who studies learning science and technology, argues that AI can either scaffold learning or replace it, depending on how it's implemented. The difference determines whether we're developing a generation of independent thinkers or AI-dependent workers.
Because AI is evolving so rapidly, there is no established practice for integrating it effectively into our learning. Much like today’s world, which is full of opportunities but lacks clear regulations, everyone is trying to find a learning method that works for them.
My wife recently began continuing education at Stanford to learn Python programming with LLM assistance. A decade ago, when she was a software engineer, coding an algorithm or fixing a bug often took hours or even days, but that continuous iteration gave her a clear understanding of how systems operate. After months of debugging, she became an expert. In contrast, learning to code with an LLM has cut the time to build a workable application to mere minutes. Yet she often just copies and pastes code without grasping the underlying rationale; she can complete assignments without truly understanding a single line of code. She feels less fulfillment than before and remains unconfident in Python and in the applications she's building.
This shift in learning practices affects not just individual learning but also societal values. While we've become obsessed with the high-quality results AI produces, are we overvaluing the speed of generation while leaving little room for mistakes? Are we implicitly judging unpolished raw material as reflecting a lack of effort? Are we inadvertently devaluing the patience and persistence needed for deep expertise? Is the pressure to produce polished work immediately changing our relationship with the messy yet essential process of learning?
I propose a framework for "AI-assisted learning that preserves productive struggle." Modern LLMs work better with thoughtful prompts, so why not use prompting as a "learning contract" between ourselves and AI? We can structure these interactions to maximize retention and skill development while still leveraging AI's capabilities. The template might look like this:
CONTEXT: I'm learning about [specific topic] at [your level of expertise].
MY CURRENT UNDERSTANDING:
[3-5 sentences explaining your current understanding of the concept]
SPECIFIC CHALLENGE:
[Describe the specific aspect you're struggling with or question you have]
WHAT I'VE TRIED SO FAR:
[List 2-3 approaches you've already attempted]
GUIDANCE REQUEST:
Please provide me with:
1. A hint or guiding question to help me think through this problem (not the complete answer)
2. An explanation of an underlying principle or concept I should understand
3. A suggestion for how to approach this type of problem in the future
LEARNING GOAL:
My goal is to [understand this concept/develop this skill/be able to solve similar problems independently].
ONLY AFTER I'VE ATTEMPTED TO SOLVE WITH YOUR GUIDANCE:
I'll follow up with my solution attempt, and then you can provide feedback and the complete answer if needed.
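Since the template above is just structured text, it can be wrapped in a small helper and reused across topics. Here is a minimal sketch in Python (my own illustration; the function name and field names are assumptions chosen to mirror the template's sections, not part of any existing tool):

```python
def build_learning_prompt(topic, level, understanding, challenge, attempts, goal):
    """Assemble the 'learning contract' prompt from the template sections."""
    # Number the prior attempts as a list, per the WHAT I'VE TRIED section.
    tried = "\n".join(f"{i}. {a}" for i, a in enumerate(attempts, 1))
    return (
        f"CONTEXT: I'm learning about {topic} at {level} level.\n\n"
        f"MY CURRENT UNDERSTANDING:\n{understanding}\n\n"
        f"SPECIFIC CHALLENGE:\n{challenge}\n\n"
        f"WHAT I'VE TRIED SO FAR:\n{tried}\n\n"
        "GUIDANCE REQUEST:\nPlease provide me with:\n"
        "1. A hint or guiding question to help me think through this problem "
        "(not the complete answer)\n"
        "2. An explanation of an underlying principle or concept I should understand\n"
        "3. A suggestion for how to approach this type of problem in the future\n\n"
        f"LEARNING GOAL:\nMy goal is to {goal}.\n\n"
        "ONLY AFTER I'VE ATTEMPTED TO SOLVE WITH YOUR GUIDANCE:\n"
        "I'll follow up with my solution attempt, and then you can provide "
        "feedback and the complete answer if needed."
    )

# Example (hypothetical finance-class scenario):
prompt = build_learning_prompt(
    topic="discounted cash flow valuation",
    level="an intermediate",
    understanding="I can discount individual cash flows but get lost at terminal value.",
    challenge="Choosing a defensible terminal growth rate.",
    attempts=["Used the risk-free rate directly", "Borrowed a classmate's 2.5% assumption"],
    goal="solve similar valuation problems independently",
)
print(prompt)
```

The point of the wrapper is the contract itself: the hint-first GUIDANCE REQUEST and the "only after I've attempted" clause travel with every prompt, so the struggle is preserved by default rather than by willpower.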
I've taken over 20 classes in the past three quarters at the GSB, and best practices for adopting AI in our learning clearly remain unsettled. Some professors encourage or even advocate AI use. Others add complexity to their exams so that AI can't easily solve the problems, while some still oppose students using AI for assignments and exams at all. I believe the responsibility for learning ultimately rests with students. AI gives us more opportunity and flexibility to make learning meaningful; let's not shortcut the fun and joy in the process.
If we don't intentionally preserve struggle in our AI-assisted education, we risk developing a generation brilliant at prompting machines but incapable of the independent thinking that drives human progress.
By the way, I wrote this opinion piece using the AI-learning prompt template, and it helped me think more deeply and broadly.