04/01/2026 | Press release
Dozens of tech firms have been told to submit their latest risk assessments to Ofcom by summer, as the regulator keeps up pressure on platforms to put safety by design front and centre of their operating models.
The online safety watchdog has today issued legally binding notices to the firms behind more than 40 of the largest and riskiest sites and apps in the world, formally requesting more than 70 risk assessments from them. Failure to provide a sufficient response, on time, could result in enforcement action.
Risk assessments are fundamental to keeping users safer online. In order to put in place appropriate safety measures to protect people, especially children, providers must first understand how harm could take place on their platforms, and how their features and functionalities could increase those risks of harm.
Under the UK's Online Safety Act, tech firms must assess and mitigate the risk of people in the UK encountering illegal content, and platforms likely to be accessed by children must also assess and mitigate the risk of under-18s being exposed to certain types of harmful material.
Providers should review their risk assessments at least once a year, and must update them before making any significant change to their service's design or operation, or if Ofcom makes any significant change to its assessment of risks.
Later this year, 'categorised' services - which we expect to include some of the most widely-used social media and search services - will have to publish summaries of their risk assessments, forcing them to be transparent about their view of the risks they pose.
Part of Ofcom's job is to make sure firms carry out suitable and sufficient risk assessments.
To monitor industry compliance with their duties, we routinely issue formal information requests. Firms are required, by law, to respond to all such requests from Ofcom in an accurate, complete and timely way. We have issued several fines for failures to do this, and taken action regarding the suitability of platforms' risk assessments.
In 2025, we requested providers' first risk assessments. We reviewed more than 100 of these from a range of large and small services, spanning over 10,000 pages. We told 11 platforms that we had serious concerns with their risk assessments, and all submitted revised versions or supplementary information.
This included Snapchat materially improving its illegal content risk assessment - in direct response to action from our enforcement team - which will ensure it puts a broad range of safety measures in place, commensurate with the risks to UK users that it has identified.
We have issued formal information requests to 30 providers, covering 43 services, which have until 31 July to submit their Year 2 illegal harms risk assessments and children's risk assessments to us.
We will use the responses we receive to identify gaps in risk assessments and drive improvements.