OFCOM - Office of Communications

24 March 2026 | Press release

Protecting people online from self-harm content and cyberflashing

People in the UK will be better protected online from illegal self-harm material and unsolicited nude images, under new proposals published today by Ofcom.

The regulator is consulting on updates to its codes of practice and guidance to reflect the Government's recent creation of new priority offences under the UK's Online Safety Act.

Duties on platforms

The Act lists over 130 priority offences. Under the Act, tech firms must assess the risk of these offences occurring on their sites and apps, put appropriate measures in place to mitigate that risk, and take down priority illegal content quickly when they become aware of it.

Ofcom's codes of practice and guidance set out ways platforms can comply with these duties.

New priority offences

In December 2025, the Government added cyberflashing and encouraging or assisting serious self-harm to the list of priority offences in the Act. To reflect this change in the law, we are consulting on updates to our Risk Assessment Guidance, Risk Profiles, Register of Risks, Illegal Content Judgements Guidance and Illegal Content Codes of Practice.

This means that providers will have to assess the risk of unsolicited nude images and illegal self-harm content appearing on their services. They will also have to take appropriate safety measures to protect users from these harms. We are proposing that various existing measures in our codes should apply to these offences, including:

  • allowing users to report illegal content through reporting and complaints processes that are easy to find, access and use;
  • making sure content moderation functions are appropriately resourced and individuals working in moderation are trained to identify illegal content;
  • having content moderation systems and processes designed to take down illegal content swiftly when a platform becomes aware of it;
  • when testing their algorithms, checking whether and how design changes impact the risk of illegal content being recommended to users;
  • enabling users to block or mute other users and disable comments on their content;
  • providing crisis prevention information in response to search queries regarding self-harm; and
  • enabling users to easily report predictive search suggestions they believe may direct people towards priority illegal content.

Next steps

We are inviting responses to our consultation by 5pm on Friday 24 April 2026. We will take all feedback into account before making our final decisions, which we expect to publish in summer 2026.
