03/06/2026 | Press release
Ofcom has today launched an investigation into whether the provider of two online image boards has failed to comply with duties to protect people in the UK from illegal content.
Due to the nature of these sites, we have decided not to name them or their provider.
It is illegal in the UK to share non-consensual intimate images (NCII) or child sexual abuse material (CSAM). Under the UK's Online Safety Act, providers of 'user-to-user' services are required to assess and mitigate the risk of UK users encountering this type of content on their platforms.[1]
This is something that disproportionately impacts women and girls, and making sure sites and apps tackle this is one of Ofcom's highest priorities.[2]
When the new duties on tech firms came into force last year, we immediately launched a programme of enforcement action against services that are used to distribute CSAM. As a result, some have deployed automated tools to detect and swiftly remove this vile content, while others have withdrawn from the UK.
In total, under the Act we have launched investigations into nearly 100 platforms - including X, when Grok was used to create and share demeaning sexual deepfakes of women and children. We have issued nearly a dozen fines for non-compliance, including against a nudification site, which has withdrawn from the UK.
We also recently announced that we will be fast-tracking our decision on proposed new requirements for tech firms to use technology to block non-consensual intimate images at source, bringing it forward to May.
Our job is to judge whether platforms have taken appropriate steps to comply with their legal obligations - it's not to tell platforms which specific posts or accounts to take down.
We have engaged extensively with victims, survivors and advocacy groups, and carried out an initial assessment of two sites used to facilitate image-based sexual abuse. Today, we have opened a formal investigation to establish whether the provider of these sites has failed to comply with its duties under the Act.
We will provide an update on this investigation as soon as possible.
The Online Safety Act sets out the process Ofcom must follow when investigating a company and deciding whether it has failed to comply with its legal obligations.[4]
Our first step is to gather and analyse evidence to determine whether a breach has occurred. If, based on that evidence, we consider that a compliance failure has taken place, we will issue a provisional decision to the company, which will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.
If our investigation finds that a company has broken the law, we can require it to take specific steps to come into compliance or to remedy harm caused by the breach. We can also impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
In the most serious cases of ongoing non-compliance, we can make an application to a court for 'business disruption measures', through which a court could impose an order requiring payment providers or advertisers to withdraw their services from a platform, or requiring internet service providers to block access to a site in the UK.
As in other industries, companies that provide an online service to people in the UK must comply with UK laws. The Online Safety Act is concerned with protecting people in the UK. It does not require platforms to restrict what people in other countries can see.[5]
Notes to editors