02/03/2026 | Press release
Ofcom has today set out the next steps in its investigation into X, and the limitations of the UK's Online Safety Act in relation to AI chatbots.
Ofcom was one of the first regulators in the world to act on concerning reports of the Grok AI chatbot account on X being used to create and share demeaning sexual deepfakes of real people, including children, which may amount to criminal offences.[1]
After contacting X on 5 January, giving it a chance to explain how these images had been shared at such scale, we moved quickly to launch a formal investigation on 12 January into whether the company had done enough to assess and mitigate the risk of this imagery spreading on its social media platform, and to take it down quickly where it was identified.
Since then, X has said it has implemented measures to try to address the issue. We have been in close contact with the Information Commissioner's Office, which is launching its own investigation today.[2] Other jurisdictions have also launched investigations in the weeks since we opened ours, including the European Commission on 26 January.
Our investigation remains ongoing and we continue to work closely with the ICO and others to ensure tech firms keep users safe and protect their privacy.
Broadly, the Online Safety Act regulates user-to-user services, search services and services that publish pornographic content.
Chatbots are not subject to regulation at all unless they fall into one or more of these categories - that is, unless they function as user-to-user services, function as search services, or publish pornographic content.
Whether they fall within one or more of these categories depends on how they work, and even where some AI chatbots are in scope of the Act, this does not mean that all content they generate will be regulated. Images and videos that are created by a chatbot without it searching the internet are not generally in scope. They will only be in scope if they are pornographic - in which case they need to be age-gated - or can be shared with other users of the chatbot.[3]
We can only take action on online harms covered by the Act, using the powers we have been granted. Any changes to these powers would be a matter for Government and Parliament. The Secretary of State has said that the Government will look at how chatbots should be regulated and we are supporting that work.[4]
When we opened our investigation into X, we said we were assessing whether we should also investigate xAI, as the provider of the standalone Grok service. We continue to demand answers from xAI about the risks it poses. We are examining whether to launch an investigation into its compliance with the rules requiring services that publish pornographic material to use highly effective age checks to prevent children from accessing that content.
Because of the way the Act relates to chatbots, as explained above, we are currently unable to investigate the creation of illegal images by the standalone Grok service in this case.
In our investigation into X, we are currently gathering and analysing evidence to determine whether X has broken the law, including using our formal information-gathering powers. The week after we launched our investigation, we sent legally binding information requests to X to make sure we have the information we need from the company, and we continue to send further requests.
Firms are required, by law, to respond to all such requests from Ofcom in an accurate, complete and timely way, and they can expect to face fines if they fail to do so.
We must give any company we investigate a full opportunity to make representations on our case. If, based on the evidence, we consider that the company has failed to comply with its legal duties, we will issue a provisional decision setting out our views and the evidence upon which we are relying. The company will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.
We know there is significant public interest in our investigation into X. We are progressing the investigation as a matter of urgency. We will provide updates and will be as open as possible during this process. It is important to note that enforcement investigations such as these take time - typically months.
We must follow strict rules about how and when we can share information publicly, as is the case for any enforcement agency, and it would not be appropriate to provide a running commentary on the substantive details of a live investigation. Running a fair process is essential to ensuring that any final decisions are robust and effective, and that they stick.[5]
While in the most serious cases of ongoing non-compliance we can apply for a court order requiring broadband providers to block access to a site in the UK, the law sets a high bar for such applications, and a specific process must be followed before we can do this. It would be a significant regulatory intervention and is not one we are likely to make routinely, given the impact it could have on freedom of expression in the UK.
If you come across something online that you think might be harmful or illegal, you should report it to the platform. If you think it may be child sexual abuse material, report it anonymously to the Internet Watch Foundation.
If you are worried about intimate images of yourself or someone you know, more information on how to get help and report harmful content is available on our website.
Notes to editors: