Four billion dollars is a lot of faith to put in something we don't fully understand yet. And I want to be clear: I'm not opposed to AI in classrooms at all. I use it myself, every week, to create in-class games and surface research I'd otherwise miss. AI is a genuinely powerful tool, one that can reshape how classrooms work and how students learn.
What I'm opposed to is the specific story that many ed-tech companies and AI vendors are telling school leaders:
- "AI can handle instruction, so you don't need as many teachers."
- "AI can cut teacher workload enough to justify cutting staff."
- "AI tutors can fully replace human instruction."
A school visit in Rio de Janeiro showed me exactly what this looks like in practice. A teacher asked a student to walk through a problem she'd just "solved" on the platform. The girl froze. She'd answered correctly, the algorithm said so, but she couldn't explain a single step. She'd learned to navigate the interface, not the mathematics. The platform had optimized for the metric it could measure, which is exactly what platforms do. Nobody had designed it to notice the difference.
That visit has stayed with me. Because the pressure I see building across ed-tech right now is moving in exactly that direction, at scale, with billions of dollars behind it.
The Bet Being Made With Someone Else's Children
The research on this is getting clearer. Studies on AI tutoring systems consistently show gains on procedural tasks: computation, grammar drills, vocabulary recall. The stuff that's easy to measure and easy to automate. But when researchers look at conceptual transfer, the ability to take knowledge and apply it somewhere new, the results go flat. Students learn the surface of a thing and miss the structure underneath it.
That limitation matters, because conceptual understanding is the whole point.
What the Algorithm Will Never Know
During my years at IASEA, training teachers across five Brazilian states, I heard variations of the same thing from veteran educators. One teacher in a public school in Rio de Janeiro put it plainly when I asked what she thought about the AI tools her district had started piloting.
She didn't hesitate.
> "The algorithm doesn't know that Marcus's mother just went to the hospital. I do."

— Public school teacher, Rio de Janeiro
She meant it literally. A student's home situation affects how he hears feedback that day, how much risk he's willing to take in front of his classmates, whether a gentle push will land or break something. A teacher who knows him can read all of that without a word being spoken and adjust accordingly. A platform cannot. It will deliver the next lesson node regardless.
John Hattie's meta-analyses, spanning decades of education research, consistently rank teacher-student relationship quality among the strongest predictors of academic achievement. The relational dimension of learning is the real work, especially for students who've learned from experience that institutions don't particularly care about them.
The schools most likely to replace teachers with AI platforms are the ones serving those exact students. That should make us angry.
Who Pays When We Get This Wrong
Teacher attrition in under-resourced schools runs roughly 50% higher than in affluent districts. I saw this firsthand in Rio de Janeiro's favelas through my work with Viva Rio, where education programs served hundreds of communities. These schools face real, grinding shortages. So when an AI vendor walks in with a solution that can "cover" more students with fewer teachers, the pitch lands. The budget math seems to work.
But the students in those schools end up with a lesser education, dressed up in a sleek interface.
The districts with the worst teacher shortages face the strongest pressure to automate instruction. But those are precisely the students for whom the teacher relationship matters most. The substitution logic runs exactly backward.
When I co-founded Flip Education, I had investors ask me, again and again, why we needed teachers at all if the methodology was sound. Couldn't we just build an app? I understood the question. But after more than a decade designing active learning programs and training teachers to implement them, I know that active learning, the kind where students construct understanding together, fails without human facilitation.
Someone has to read when a group discussion is becoming performative rather than generative and redirect it. Someone has to notice which student hasn't spoken in twenty minutes and create a low-stakes entry point for them. Someone has to decide, in real time, that the lesson plan needs to be abandoned because something more important just surfaced. These are judgment calls, and they require knowing the room. Algorithms optimize; they don't judge.
What AI Should Actually Do in a Classroom
None of this means the technology is useless. The question is what we're asking it to do.
AI that saves teachers three hours of administrative work a week is valuable, because those three hours go back to students. AI that helps a teacher identify which students are struggling with a specific concept before a lesson is valuable, because it sharpens human judgment rather than replacing it. AI that generates differentiated practice problems, or surfaces a research summary, or gives a teacher a second read on an assignment rubric: all of that is useful in the same way any good tool is useful.
The distinction that matters is whether the technology is augmenting teacher capacity or substituting teacher presence. The first is a win for students. The second is a cost-cutting measure wearing an equity argument as a costume.
The Line I'm Drawing
I'm writing this because I've watched what happens when the wrong bet gets made, in schools across Rio de Janeiro and São Paulo, in a hundred quiet conversations with teachers who feel the pressure to make themselves smaller so the platform can do more.
Education has always had a complicated relationship with silver bullets. Every decade brings a new one. The technology changes; the underlying promise stays the same: that we can engineer our way out of the slow, relational, irreducibly human work of helping someone learn.
We can't. And the sooner the industry is honest about that, the sooner we can figure out what AI in education should actually look like.
Teachers are the mechanism by which learning happens. Everything else is infrastructure.