10/01/2025 | Press release
Dr. Laura Barré '91 tells her students that studying with AI is like studying with a very confident C+ student - but approaching it with a critical mind may turn that flaw into an advantage.
Barré, an assistant clinical professor of nutritional science in the College of Human Ecology, is among hundreds of faculty members at Cornell experimenting with new ways to put generative artificial intelligence (GenAI) to work in the classroom - and ways to block it out.
Their methods range from using AI to generate patient case studies for students to assess, to asking students to debate AI-generated arguments to demonstrate their understanding of the class material. Others have returned to old-school methods of assessment, like oral exams and in-class assignments using paper and pencil.
"There is a recognition that generative AI brings both good and harms to the classroom environment, some really fantastic opportunities and advances for faculty work and for student learning, and also some very real and deep challenges," said Steven Jackson, vice provost for academic innovation and chair of the Cornell GenAI Education Working Group, which has been developing new ideas, policies and practices involving GenAI in education.
In spring 2025, the working group conducted an all-campus survey of more than 700 faculty and nearly 2,000 students. The group found 70% of students were using GenAI tools once a week or more, compared with 44% of faculty. But both groups reported excitement about the potential of GenAI, alongside fear that it can stymie real learning.
Barré, for example, is using GenAI to simulate patients: a single written prompt generates multiple versions of a patient with whom students can "converse" as the medical provider.
When the AI patient hallucinates new information or offers odd lab results, she sees that as an advantage.
"I leverage that, because that's how patients are," Barré said. "Out there in the clinical world, patients are not textbooks. They may not always follow our expectations for how a disease might present or how the labs might look."
She also uses AI to generate unique case studies. Previously, she found that if she didn't rewrite the case studies, students sometimes came to class with a copy from the previous year. AI saves her writing time and cuts down on cheating, she said.
Rene Kizilcec, working group member and associate professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, sees potential for AI to help faculty engage students in new interactive ways and offer students timely support with coursework.
He is chief scientist for HiTA AI, which makes an AI-enhanced learning management system. He and other faculty members are currently piloting its tools. Faculty upload their syllabi, readings, videos, lecture slides and assignments. Students can ask the tool to create study guides and comprehension quizzes or clarify course materials.
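The article does not describe HiTA AI's internals, but the underlying pattern it describes (grounding a chat model on instructor-uploaded material, then answering student requests against it) can be sketched in a few lines; the file layout, prompts, and model name below are assumptions, not HiTA AI's actual implementation.

```python
# Sketch of the general pattern behind an AI-enhanced course assistant:
# ground the model on instructor-uploaded material, then let students
# ask for study aids. Illustrative only; not HiTA AI's implementation.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

# Instructor-side: collect course materials into the model's context.
course_text = "\n\n".join(
    p.read_text() for p in Path("course_materials").glob("*.txt")
)

def ask_course_assistant(request: str) -> str:
    """Answer a student request using only the uploaded materials."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model
        messages=[
            {"role": "system",
             "content": "Answer using only this course material:\n" + course_text},
            {"role": "user", "content": request},
        ],
    )
    return response.choices[0].message.content

print(ask_course_assistant("Make a five-question comprehension quiz on week 3."))
```

At the scale of a real course, a system would retrieve only the excerpts relevant to each question rather than send everything at once; this condensed version just shows the grounding idea.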
He said the tool helped the 280 students in his learning analytics course.
"The last time I offered the course, students asked 75,000 questions in one semester through this tool," he said. "Every week I got analytics on what the most common questions were, what students were struggling with, what I could do better to help students or address those questions in advance."
Kizilcec, who is also director of the Future of Learning Lab, uses the tool to assign AI-led activities that engage students conversationally about readings.
For example, an activity may open a conversation about an assigned reading by challenging the student to identify and clarify a common misconception in a statement. Or students may debate the AI, forcing them back to the reading to counter an argument. That's exactly the kind of deep engagement Kizilcec is hoping for.
"Our responsibility is to make sure that students are ready for lifelong learning in a world where AI is pervasive," he said. "They need to learn to control and reason over the outputs of AI and not surrender their agency."
The Center for Teaching Innovation is hosting workshops to share learning about GenAI among faculty members and is developing a campus-wide program in Critical AI Literacy, in collaboration with Cornell University Library.
To address concerns about academic integrity, working group member and Provost Fellow Liz Karns, senior lecturer of statistics and data science in Cornell Bowers, worked with the Office of General Counsel to set guidelines for faculty members on preventing abuse of AI in the classroom and on steps to follow if they suspect a student has used AI in an unsanctioned way.
Some faculty members are changing practices to prevent students from using GenAI to replace real learning. In Karns' classes, a project that used to take all semester now takes five minutes to complete via AI. She has switched to in-class, paper-and-pencil assessments and has added 15-minute, one-on-one conversations about course content that she calls "job interviews." These take more of her time, but she sees the adjustments as the best way to protect learning - at least in her classes.
"This is a transition point in higher education, and it will take a few years to settle down, and academic integrity will follow along in that process," Karns said. "Our assessments are changing, and our evaluation methods are changing."
Silvia Amigo-Silvestre, senior lecturer of romance studies in the College of Arts and Sciences, has also returned to pen-and-paper assessments and oral examinations. But she has found a use for GenAI in her Spanish composition and conversation classes.
Students upload their writing into a tool that Amigo-Silvestre has prompted to provide actionable feedback but not corrections. For example, it might highlight a word as misused or overused and suggest the student consider an alternative. The student can then change the word or ask the AI for more clarification. In the end, the student may decide to stick to their original choice, but they've had to think critically about it.
"It points the student in the right direction, but the student has to find the answers," Amigo-Silvestre said.
Using the tool is optional. She said some students have concerns about the environmental impact of AI or becoming reliant on it.
"Generative AI is a powerful tool for learning," she said, "and I believe in finding creative and ethical ways to use it that encourage students to keep thinking critically, because that is why they are here."
Jackson, who is also professor of information science in Cornell Bowers, said this is a moment of challenge, opportunity and transformation in the classroom.
"We're going to get through this in the same way we've got through it in other moments," he said, "which is by really relying on the skill and the imagination, the creativity and the expertise, of faculty and students working together to come up with the solutions and the responses that work best in their particular disciplinary and classroom environment."