Every Child Deserves an AI Tutor
A second‑grader learns more math in hours than most kids learn in weeks. By lunch, she’s done with academics and heads to a workshop where she operates a food truck, trains for a 5K, or builds a drone.
That isn’t science fiction. It’s a day at Austin’s Alpha School, where students finish core academics with AI tutors in about two hours and spend the rest of the day on hands-on projects. A recent profile says Alpha’s K–2 students scored in the top 0.1 percent on MAP tests last year. Now the founders are trying to scale the model through a digital platform called Timeback, which is free to everyone.
Alpha parent and software entrepreneur Joe Liemandt is building Timeback, betting that a sub-$1,000 tablet can teach any child in two hours a day, and that kids will love it.
But fear-driven regulation could stall this tool.
At Alpha, data is key. Students are measured on mastery, not seat time. Coaches track motivational tactics for each child. Timeback productizes that approach, bundling AI learning apps, progress tracking, and game mechanics. It uses a vision model to observe how students work, flag unhelpful habits, and offer real-time feedback.
Timeback recalls the Young Lady’s Illustrated Primer from Neal Stephenson’s sci-fi novel The Diamond Age: an adaptive tutor that reads context, notices missteps, and turns everyday life into tailored quests to help a child lead an interesting, fulfilling life.
I want a Primer for my six‑ and three-year-old daughters. I was homeschooled from second grade by parents who didn’t attend college. The early commercial internet bent my path toward computer science and tech policy. Those tools weren’t built for kids, and internet privacy laws didn’t exist.
Thirty years later, my children should have a richer, more intentional digital education than I had.
Instead, the main “innovation” around kids and technology has been regulatory. People fear new tools when children are involved, and politics rewards saying no “for the kids,” especially when tech gathers data.
But knowing more about our children is often good. The world is becoming more legible as digital tools let us observe, measure, and understand reality—and ourselves—in new detail. Think of the microscope or X-ray: instruments that improved care because we could finally see what mattered.
Today’s sensors, software, and AI can do the same for learning. Properly used, legibility expands safe autonomy: families get clearer feedback, educators better diagnostics, and students the one-on-one attention a single teacher can’t provide to 25 kids. An AI that notices a child’s anti-patterns, such as hesitations, misclicks, and recurring distractions, can coach the child out of those bad habits. Trust grows through experience: developers partnering with parents and educators, learning quickly, and changing course when evidence demands it.
Yet some privacy fundamentalists see only threats. “Child safety” is a potent slogan, but blanket bans and rigid privacy laws freeze the feedback loops that make tutoring effective. Rules that hard-code privacy constraints and micromanage design will blind the very tools that could deliver customized education at scale.
Timeback also compresses screen time, opening hours for real-world projects. Jonathan Haidt and other critics call for phone limits because online attractions replace outdoor, real-life experiences. Yet at Alpha, AI-powered instruction frees children for exactly the independent exploration those critics urge.
Timeback isn’t the only AI education tool, and it may not replicate Alpha’s results without human guides, peers, and projects. Skepticism is healthy, and so is pluralism. Not every child, parent, or community will want this model. Fine. But the way to strengthen kids isn’t to deny families new options in a system that has offered the same broken ones since the Industrial Revolution.
A century ago, we didn’t stop doctors from using X-rays because they made the human body too visible. We learned from what these tools revealed while enforcing norms around consent and care. Education is on the cusp of a similar shift. AI education tools are early and will benefit from experimentation. But if we regulate out of fear, we’ll keep children in classrooms that are opaque by design: to parents, to learners, and in their results.
Stephenson’s Primer works because it truly sees the child. Close observation lets it coach rather than merely inform, building competence and confidence through timely feedback and stories that fit the learner. We expect the same from good teachers and mentors. Policy should give educational innovators access to the information they need to deliver cutting-edge education to our children.
The world is getting more legible. Our kids deserve an education that is, too. Let’s not blind the AI tutors that could help them read, build, and explore the future.