13 May 2025
TikTok is failing to address serious risks of harm to young users' mental and physical health almost 18 months after Amnesty International highlighted these risks in a groundbreaking report.
The 2023 research revealed that children were at risk of being drawn into toxic "rabbit holes" of depression- and suicide-related content on TikTok's 'For You' feed.
In an investigation using accounts set up to simulate 13-year-olds online, Amnesty International found that within 20 minutes of starting a new account and signalling an interest in mental health, more than half of the videos in TikTok's 'For You' feed related to mental health struggles. Within a single hour, multiple recommended videos romanticised, normalised or encouraged suicide.
This was one of Amnesty International's key findings in a large-scale research project published in November 2023 on the risks to children and young people's rights posed by one of the most popular social media platforms of our time.
Ahead of the 2025 Mental Health Awareness Week, Amnesty International asked TikTok what changes the company had implemented since then.
TikTok's response listed familiar 'well-being' measures, most of them already in place when the research was conducted, and failed to acknowledge the app's "rabbit hole" problem. It also failed to produce evidence of any new targeted measures to address it.
TikTok's 2024 risk assessment under the European Union's Digital Services Act (DSA) acknowledges that "certain types of concentrated content, though not violative of TikTok's Community Guidelines, may cause harm by inadvertently reinforcing a negative personal experience for some viewers. For example, there may be an impact to mental well-being, particularly for Younger Users, associated with concentrated content relating to extreme dieting and body-image-related content."
In the same section of the risk assessment, TikTok lists mitigation measures such as the proactive enforcement of its Community Guidelines, maintaining content eligibility standards and applying "dispersion techniques" to the 'For You' feed as well as user tools, including filtering options and a refresh function to reset the feed.
All these measures were already in place in 2023, when Amnesty International conducted its research and demonstrated that young users were exposed to systemic risks on the platform despite these measures.
Despite TikTok's growing user base, particularly in countries with young populations like Kenya, where the median age is 20, the platform has yet to conduct basic child rights due diligence to address any risks posed to its youngest users.
TikTok's response to our latest research questions on what it is doing to make the app safer for young users reveals that seven years after becoming available internationally, the company is still waiting for an external provider to complete a child rights impact assessment for the platform, a key responsibility under international human rights standards for businesses.
A young human rights activist in Kenya interviewed in March 2025 about his experiences on TikTok spoke about his struggle with excessive use of the platform, calling it "very addictive" and sharing that he tries to minimise his usage to four hours per day.
The case is illustrative of Amnesty International's continuing concerns in relation to TikTok's addictive design and the limited use of easily dismissible time limits, which rely on user or parental action to curb excessive and unhealthy use of the platform.
TikTok states that it works with experts in children's health to help design the app to respond to children's needs and that it has implemented new prompts to encourage teens to limit their usage.
This nonetheless leaves children and their parents to counteract the addictive potential of a platform designed to maximise engagement through design choices not dissimilar from those employed by casinos.
Amnesty International's research also highlighted that TikTok's privacy-intrusive business model tracks everything a user does on the platform to predict their interests, emotional state, and well-being.
TikTok cloaks this invasion of users' right to privacy as a choice. In its response to Amnesty International's questions, TikTok states, "like other apps, TikTok collects information that users choose to provide, along with data that supports things like app functionality, security, and overall user experience" and that "viewing a video doesn't necessarily implicate someone's identity".
And yet TikTok's 'For You' feed clearly picks up on a person's emotional state when it amplifies masses of depression and even suicide-related content and then uses their susceptibility to this content to recommend more of it, regardless of the potential harms.
Young people's testimonies collected by Amnesty International highlight that many young people do not feel well-informed about what data is being collected by the app and do not feel "in control" of the recommended content or how their data is used.
Amnesty International shared its analysis of TikTok's continuing failure to address its risks to children and young people's rights with the company. Reacting to our allegation of TikTok's violation of the right to privacy, the company said, "Amnesty International's suggestion that TikTok is somehow aware of and utilises a user's emotional state in order to recommend content is a mischaracterisation of how our platform works."
Echoing its DSA risk assessment report, TikTok's response to our findings also stated that the company is "using machine learning models to avoid recommending a series of similar videos on themes that do not violate TikTok's Community Guidelines but are potentially problematic if viewed repeatedly." In addition to the mitigation measures discussed above, TikTok said that it had "developed a screen time dashboard that provides visibility for a user into how and when they are on the platform."
Some states are now implementing laws to force platforms like TikTok to assess and mitigate risks to users' rights and health. TikTok should not wait to be forced to make changes, though; it must act now.
Amnesty International is a global movement of more than 10 million people who take injustice personally. We are campaigning for a world where human rights are enjoyed by all, and we can only do it with your support.
Act now or learn more about our human rights campaigns.