Zhejiang University

01/13/2026 | Press release | Distributed by Public on 01/14/2026 03:40

Does AI outperform teachers, or is it the other way around?

Over the past seven decades, artificial intelligence (AI) has shifted from a distant idea to a real presence in our classrooms, quietly but radically changing how knowledge is delivered and absorbed.

As this "third partner" joins teachers and students, these big questions are getting harder to ignore: How can AI genuinely support teaching and learning? How can it help instructors teach more effectively? And how can it become a tool that helps students explore beyond the textbook and into the unknown?

Zhejiang University recently offered its latest response with the publication of AI for Education: A Collection of Cases. The book showcases new classroom scenarios enabled by generative AI (GAI), but more importantly, it reflects on what education should look like in an AI-shaped future.

The classroom's "third partner"

In a traditional classroom, the roles are clear: teachers teach, students learn. But amid the explosive rise of AI, Zhejiang University has introduced a "third partner" into the learning space. Why? Because real classrooms are asking for it.

Take, for example, Automatic Control Theory, a lower-division course with around 110 students. In a midterm survey, students from the College of Energy Engineering described it as demanding and time-consuming for both learners and instructors. Many said they understood the theory in class but struggled to translate it into problem-solving skills. Applying concepts to real engineering scenarios felt even harder.

That's where generative AI (GAI) comes in. In this course, GAI supports learning by instantly producing MATLAB simulation code for state-space equations and generating dynamic visualizations. Abstract concepts become easier to "see," not just memorize. For instructors, it reduces the workload of producing foundational materials; for students, it provides immediate, targeted support. It can even help teachers spot where students are struggling and generate supplementary resources on the spot, making classroom time more focused and efficient.
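For illustration only: the course works in MATLAB, and the actual generated code is not reproduced in this collection, but the kind of state-space simulation such a tool produces can be sketched in Python. The system matrices and the forward-Euler integration below are assumptions chosen for a simple, stable example, not material from the course.

```python
# Minimal sketch of simulating a state-space model x' = A x + B u, y = C x,
# the kind of code a GAI assistant might generate on request in this course.
import numpy as np

def simulate_state_space(A, B, C, u, x0, dt=0.01, steps=500):
    """Integrate x' = A x + B u with a constant input u using forward-Euler;
    return the output trajectory y(t) = C x(t)."""
    x = np.asarray(x0, dtype=float)
    ys = []
    for _ in range(steps):
        x = x + dt * (A @ x + B * u)   # forward-Euler update
        ys.append(C @ x)
    return np.array(ys)

# Example: an overdamped second-order system (poles at s = -1 and s = -2)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([0.0, 1.0])
C = np.array([1.0, 0.0])

y = simulate_state_space(A, B, C, u=1.0, x0=[0.0, 0.0])
print(y[-1])  # approaches the steady-state value u/2 = 0.5
```

Plotting `y` over time (e.g. with matplotlib) gives exactly the kind of dynamic visualization the course describes, letting students "see" how pole locations shape the response.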

Math education is seeing similar momentum. The ChatMate platform, for example, has attracted more than 10,000 subscribers within a year. It can interpret complex mathematical expressions, offer personalized Q&A, and suggest study paths, earning a reputation among students as a reliable "math learning companion."

Zhejiang University has also launched the AI STEP program, an iterative initiative designed to explore the next phase of AI for Education and AI in Education. Since 2024, the university has funded 212 empirical teaching research projects under its "AI for Education" series, encouraging instructors to test human-AI collaboration in real classrooms.

These projects span virtually every discipline, from engineering, agriculture, and medicine to humanities and social sciences. Some focus on redesigning lessons with AI in mind, others on practical classroom applications, and still others on measuring how GAI affects student learning.

Many of these examples may sound almost miraculous, but they're not "magic." They're the result of AI's strength in pattern recognition at scale: what some describe as a kind of algorithmic "collective intelligence" built on massive datasets.

That strength brings new efficiency and new possibilities. But it also comes with limits. AI's outputs are shaped by what it has been trained on and how it is programmed to respond. In that sense, AI is less like an individual mind and more like a distilled reflection of many minds: capturing common knowledge and shared patterns of thinking, but not the full depth of human judgment or creativity.

A "new partner" yet far from omnipotent

For all its promise, AI is not a magic wand, and students and faculty at Zhejiang University are learning that firsthand.

"AI doesn't really think," students observed in the Learning Science and Technology course. To them, it often behaves less like a mind and more like a highly efficient, finely tuned search engine: adept at pulling together information, but not always capable of genuine reasoning. That limitation has shown up in practical settings, too. While building an AI-powered programming teaching assistant for the Mathematics Department, instructors found that the system could sometimes produce answers that sounded convincing yet fell apart under logical scrutiny.

Rather than being brushed aside, those misfires became some of the most valuable moments in class. They offered a concrete window into what researchers call "hallucinations": errors in which large language models generate plausible-looking but incorrect content. And they helped students recognize a clear-cut dividing line: AI can be impressively fluent, even "smart," but human judgment is still needed to separate surface-level correctness from real understanding.

Likewise, in updating the course Medical and Pharmaceutical AI, the project team noted that AI is skilled at rapid recombination, sorting, summarizing, and optimizing along data-driven pathways. But it struggles to do what humans often do best: leap beyond patterns, challenge assumptions with emotion and ethics in mind, and connect ideas through creative analogy. As the team put it, truly new knowledge often emerges from the collision of different human perspectives, something AI can imitate, but not fully reproduce. To keep this "new partner" aligned with fast-moving academic frontiers, the team has made real-time updates and comprehensive content coverage a long-term priority.

There's also a more practical constraint: AI improves only when it is fed. Humans learn out of curiosity and passion; AI "learns" through continuous training on specialized data. In teaching innovation supported by the "WisdomBot" large model, researchers found that the biggest barrier wasn't a lack of capability but a lack of inputs: high-quality, domain-specific datasets robust enough to support reliable performance.

And then there is the human dimension that technology still struggles to reach.

In Learning Science and Technology, a student compared feedback from AI and from their advisor on a paper. The AI offered thorough comments on structure, grammar, and formatting. But the advisor asked something harder and more important: "Does your core argument actually hold up?" Then came a real debate, the kind that propels a student to defend, revise, and refine their thinking. That mix of deep critique, intellectual pushback, and personal encouragement, what some students describe as the "temperature" of teaching, is still difficult for AI to replicate.

In many ways, it's these limitations that highlight what remains uniquely human in education. As AI becomes a long-term presence in classrooms, the question is no longer simply what AI can do. It's how teaching and learning will evolve around it and how we define the relationship among teacher, student, and machine in the years ahead.

The infinite possibilities of tomorrow's teaching

If AI is becoming a permanent presence in higher education, Zhejiang University's message is clear: the future isn't about replacing people; it's about reshaping roles.

In this emerging model, teachers move from being "the voice at the front of the classroom" to becoming guides who stay close to students' thinking. In medical classrooms, for example, tools like the MediLearn Assistant can instantly answer questions about diseases and walk students through standardized diagnostic and treatment processes. But rather than making the instructor less important, faculty say it pushes their work to a higher level.

With routine explanations handled in seconds, teachers can focus on what matters most: helping students connect fragmented facts into clinical reasoning, and pushing beyond "the right answer" to questions that don't have one: ethical dilemmas, uncertainty, and the human side of care. The job becomes less about delivering information and more about designing learning experiences that build judgment, critical thinking, and creativity.

Students, too, are expected to step into a more active role. Instead of simply absorbing information, they become "knowledge explorers" and learn how to ask better questions, test assumptions, and work collaboratively.

That's especially important when the "third partner" is both highly knowledgeable and oddly passive: AI can generate fluent responses, but it doesn't take responsibility for what it says. Students have to learn how to interrogate outputs, verify claims, and keep digging until they understand where the model's knowledge ends. In an ocean of information, the goal is to find one's own direction rather than becoming an echo of the machine. Along the way, universities stress the need for students to stay alert to ethical risks and to uphold academic integrity.

In this vision, AI is positioned as a "tireless teaching assistant." One example is a Socratic-style agent built on the "WisdomBot" large model. Instead of functioning as a simple answer machine, it tries to guide students through reasoning by repeatedly asking questions like "Why?" and "What if…?"-nudging learners toward their own conclusions rather than handing them a final response.

Language learning offers another glimpse of what this might look like at scale. The E-lang platform can adjust difficulty in real time during conversations and introduce new topics deliberately, encouraging learners to move beyond their comfort zones. The idea, developers say, is not substitution, but support: allowing teachers to spend more time on mentoring and development, while helping students grow into more capable, independent learners.

This collection of cases doesn't pretend to deliver a blueprint for the future. Instead, it documents experiments in progress, along with the uncertainty, surprises, and insights that come with real exploration.

Its underlying argument is simple: technological change may be inevitable, but the warmth and depth of education remain human choices.

The future, in this telling, is not a classroom overturned by AI. It is a classroom reshaped with intention, where humans and machines evolve together, and where the "third partner" helps open a new chapter in learning, creativity, and wisdom.

Source: Zhejiang University
Translator: FANG Fumin
Editor: HAN Xiao
