AI Won’t Replace Your Kid’s Teacher, But It Will Make Them Better

Oct 16, 2025 3:02:23 PM

by Chris Stewart


A few messages arrived asking if I’m going soft on AI in education.

I’ve sounded cynical about the idea in the past (even as I wrote cautiously in favor of it).

Am I buying into the hype after joking often about robots in classrooms?

Have I decided it’s okay for tech companies to automate teachers out of jobs?

The fear is real. It’s not entirely unfounded. Let’s admit it: Silicon Valley has a terrible track record of “disrupting” industries by destroying livelihoods, extracting value, and leaving wreckage behind. Uber didn’t improve taxi drivers’ lives. Amazon didn’t strengthen local retail. Why should we trust tech companies with teachers?

Let me, as America’s first Stupidologist, separate the stupid from the smart with regard to tech and teaching.

The answer is more interesting than either the utopian or the dystopian story we’re hearing about AI.

What AI Actually Can’t Do

 Let’s start with what should be obvious but apparently needs saying: AI can’t read your kid’s face.

It can’t see when they’re losing focus, tuning out, or about to give up. It can’t hear the frustration in their voice when they say “I get it,” but clearly don’t. It can’t notice the body language that says, “I’m embarrassed to ask this question in front of everyone.”

A human teacher constantly picks up on these signals, making dozens of micro-adjustments every minute based on what they see and hear. That feedback loop is invisible but essential. Expert teachers are always monitoring and adjusting: reading the room, changing pace, circling back to check understanding, knowing when to push and when to ease off.


More than pedagogy, it’s relationship. It’s care. It’s human connection. AI, as cool as it might be, doesn’t have this. It responds to what you type. That’s it.


And AI doesn’t care whether you succeed. This sounds obvious, but it matters enormously. An effective teacher cares. They’ll push you when you’re being lazy. They’ll encourage you when you’re discouraged. They’ll stay after class because they’ve invested themselves in your success. AI is indifferent. It will answer your questions patiently, but on its own, it won’t challenge you to think harder, won’t celebrate your breakthroughs, and won’t be disappointed when you give up.

The best teacher I ever had told me my first college paper was “competent but cowardly.” That was rude. And that single sentence changed my intellectual life. Not because of the information it conveyed, but because someone I respected cared enough to challenge me to do better. I can’t see AI, in its current form, doing that with the same nuance and touch. It can’t look you in the eye and demand you live up to your potential.

The Things That Actually Take Time

AI can’t replace human teachers, but it can be the most powerful assistant they’ve ever had.

Teachers spend enormous amounts of time on low-value tasks—things that don’t require their unique human capacities.

Explaining the same concept for the tenth time to different students at different paces. Generating practice problems. Providing immediate feedback on routine questions. Grading objective assignments. Creating lesson plan variations for different ability levels. Administrative documentation that exists primarily to satisfy bureaucratic requirements.

A middle school math teacher might waste roughly six hours a week just grading homework and quizzes. Six hours that could be spent having one-on-one conversations with struggling students, providing deeper feedback on complex work, or actually thinking about her practice instead of drowning in paperwork.


Research shows that teachers using AI for these administrative tasks gain an additional 5.9 hours per week for higher-value work. That’s not unemployment. That’s liberation from stupidity—the kind of institutional stupidity that wastes brilliant people’s time on tasks that could be automated.


Think about what teachers could do with an extra six hours a week. Creative lesson planning. Mentorship and relationship building. Professional development. Time to think.

This is about AI amplifying teachers and freeing them to do what they do uniquely well, rather than replacing them. It’s about curing the stupidity of how we currently deploy human intelligence in schools.

The Stanford Study Everyone Misunderstands

The Stanford study I’ve mentioned before showed notable gains in student mastery when tutors used AI assistance, but those gains didn’t come from AI tutoring students directly. They came from AI assisting human tutors. The AI provided real-time suggestions and explanations to the tutor, who then decided what to share with students. It helped tutors ask better questions, provide more precise explanations, and identify gaps in student understanding. The human remained in control. The human made the pedagogical decisions. The human built the relationship. The AI just made the human more effective.

The most significant gains went to students with less experienced tutors. AI didn’t make expert tutors much better—they were already great. But it made novice tutors nearly as effective as experts, improving outcomes by 9 percentage points.

This is the model. AI as a teaching assistant, not a replacement.

The Calculator Parallel

There once was a calculator panic. Remember that? Some of you won’t. When calculators became cheap and ubiquitous in the 1970s, math teachers had a fit. “Students won’t learn basic arithmetic!” they warned.

After I did poorly in a math class, my dad told me life was going to be hard because the whole world was going to be computerized. And that would require math. The idea that computerization would make math easier for people who failed at it didn’t cross his mind.

Ha! Take that, Dad.

Yes, we do less computation by hand now, but the Luddites were mostly wrong about what that meant.

Calculators freed students to focus on mathematical reasoning rather than computational tedium. We can now teach more sophisticated mathematics earlier because we’re not spending weeks drilling long division.


The calculator didn’t replace mathematical thinking. It changed what mathematical thinking looked like. And it made math teachers more effective because they could focus on concepts instead of computation.


I keep coming back to this because it’s the right mental model for AI tutoring. The technology handles the tedious parts so humans can focus on the profound parts. But that only happens if we choose to structure it that way. And right now, I’m not sure we’re making that choice deliberately enough.

What This Means for Teacher Training

 If AI is going to amplify teachers rather than replace them, we need to train teachers differently. And right now, colleges of education are mostly ignoring AI or treating it as a threat.

This is stupid—and I say that as a professional Stupidologist.

Teachers need to learn how to deploy AI tutors effectively with students. They need to know when to use AI and when human instruction is essential. They need to understand how to verify AI outputs, because hallucinations are real and regular. They need to interpret AI-generated data about student understanding. They need to design learning experiences that combine AI and human strengths.


This isn’t optional. It’s urgent. The teachers who learn to work alongside AI will be dramatically more effective than those who resist it.


And we need to pay them accordingly. A teacher who can effectively deploy AI to individualize instruction for 30 students is doing the work that previously required multiple teachers. That’s worth more, not less.

But here’s what I actually worry about, and this is where my education reform background kicks in: not that AI will replace teachers, but that we’ll use AI as an excuse to underfund schools. The logic will go like this: “Now that we have AI tutors, we don’t need as many teachers. We can increase class sizes. We can cut budgets.”

This would be catastrophic. And stupid. Predictably, preventably stupid.

Two Different Futures

There are two ways this could go, and I’ve been in education long enough to know which one we’re headed toward if we don’t intervene.

The right model: smaller classes, more teacher time for high-value interaction, AI handling routine tasks, better outcomes for everyone. Teachers spend less time grading and explaining basics, more time mentoring and inspiring. Students get immediate feedback on routine work and personalized explanations when they’re stuck, but they get their intellectual courage, their sense of what matters, their connection to ideas from humans who care about them.

The wrong model: same large classes, AI as a cheap substitute for human attention, teachers overwhelmed and demoralized, worse outcomes masked by metrics. Technology deployed not to improve education but to cut costs. The triumph of efficiency over efficacy, which is just another word for stupidity.

Which model we get depends on policy choices, not technology. The technology is neutral. How we use it isn’t. And right now, we’re letting market forces and budget pressures make these choices for us.

What Parents Should Demand

If your kid’s school is implementing AI tutoring, and it probably will be soon, you need to ask specific questions. Not because you’re being difficult, but because these questions determine whether your child’s education gets better or worse.

How are teachers being trained to use this? If the answer is “they’ll figure it out,” that’s a red flag. What’s the data privacy policy? Who owns the data about how your child learns? What are they doing with it? How do you verify AI outputs? What’s the process for catching when the AI gives wrong information?

Most importantly: Is this replacing teacher time or augmenting it? If class sizes are growing, that’s the wrong direction. If teachers are getting less planning time, that’s the wrong direction. If the school is cutting staff because “we have AI now,” that’s the wrong direction.


The goal isn’t to resist AI in education. The technology works—the research is clear. The goal is to ensure it’s implemented in ways that support teachers and students, not replace one with the other. This is what accountability looks like in the age of AI.


The Future I Want

Without being overly sentimental, here’s the vision I see: Every teacher has an AI assistant that handles routine tasks, provides instant data about student understanding, and suggests interventions for struggling students. Teachers spend their time doing what only humans can do—inspiring, mentoring, building relationships, making education meaningful.

Class sizes shrink because we’re more efficient, not because we’re cutting humans. Students get immediate feedback on routine work, but they get their intellectual formation from people who care about them. AI and humans, each doing what they’re best at.

This is possible. The technology exists. The research shows it works.

For my alarmist friends who support public schooling, the real scandal isn’t AI in education. It’s that we’ve known for 40 years what works—one-on-one tutoring, direct instruction, and practice—and we’ve condemned generations of teachers to explaining the same concept to the same struggling student for the tenth time because we couldn’t afford anything better.

Now we can.

I think teachers should be demanding this—not resisting, but demanding that we implement it in ways that elevate their work rather than undermine it. Because the alternative isn’t preserving the status quo. The alternative is letting tech companies and budget-cutters decide how this technology gets used.

And we’ve seen how that movie ends. It ends stupidly.

This piece was originally published on Verbatim.

Chris Stewart

An award-winning writer, speaker, and blogger, Chris Stewart is a relentless advocate for children and families. Based in outstate Minnesota, Chris is CEO of brightbeam, a nonprofit media group that runs campaigns to highlight policies and practices that support thriving kids. He was the founding Director of the African American Leadership Forum, was an elected member of the Minneapolis Board of Education, and founded and served as the CEO of Wayfinder Foundation. Above all, Chris is a serial parent, a Minecraft enthusiast, and an epic firestarter on Twitter where he has antagonized the best of them on the political left and right. You’ll often see Chris blogging at citizenstewart.com and “tweeting” under the name “Citizen Stewart.”

