10/22/2025 | News release | Distributed by Public on 10/21/2025 13:48
University life is stressful. Between exams, rent payments and relationships, it's easy to feel overwhelmed.
While Toronto Metropolitan University (TMU) students have access to free mental health support, not everyone does. That's why some young people are turning to AI tools like ChatGPT for help. But is it safe?
"I don't think we should be seeking therapeutic advice from a general purpose chatbot, like ChatGPT," says Richard Lachman, TMU professor and expert in AI and ethics.
Professor Lachman says that ChatGPT is trained on everything on the internet. That means its advice could come from a mental health textbook or from Reddit comments.
"The answers are not necessarily best practices or best for you because ChatGPT does not understand the situation you're in," he said.
As many as one-third of young people have tried mental health conversations with AI. The appeal is clear: ChatGPT is free, available 24/7, and easy to use. Some people feel more comfortable discussing difficult topics with AI than with a therapist.
"People are more willing to share and take risks with chatbots because they feel like they're not being judged," said Professor Lachman. "But it's not always therapeutically beneficial. It may just affirm what you're feeling, instead of questioning or disagreeing, which is sometimes what we actually need to hear."
Content warning: This section discusses self-harm and suicide.
One rare but serious problem is that ChatGPT has encouraged users to self-harm.
The parents of an American teenager are suing OpenAI, the creators of ChatGPT, after their son died by suicide. They say chat logs show the AI validated his harmful thoughts instead of helping him.
OpenAI says ChatGPT is designed to direct people to professional help, such as crisis hotlines. But the company admits the AI doesn't always behave as intended.
Another concern is the lack of oversight, says Professor Lachman.
"No one is vetting these apps and private companies. We don't know what testing they're putting their apps through. Even Sam Altman, the CEO of OpenAI, admits we don't know how these things do the things they do," said Professor Lachman.
Some people use AI not for therapy, but to manage feelings of loneliness.
While AI can be entertaining and make you feel heard, the relationship it creates is parasocial: one-sided, with no one on the other end.
"AI has no empathy," said Professor Lachman. "You won't get the pride that comes with a true mutual friendship."
He adds that every hour spent talking to the AI is one less hour spent building real connections with people.
Professor Lachman's advice: Avoid using ChatGPT for mental health support.
If you feel like the internet is your only option, start by using campus resources to learn how to recognize when you're experiencing a mental health crisis.
If you're not in crisis and still want to use ChatGPT, Professor Lachman says to treat it like a research project: question its answers rather than taking them at face value.
In addition to annual Wellbeing Week programming, TMU offers several supports for students on campus. There are also off-campus services.