02/25/2026 | Press release | Distributed by Public on 02/25/2026 04:56
The Information Commissioner gives the keynote speech.
Check against delivery.
Good morning and thank you for having me today.
IAPP London is always an important date in the UK's data protection calendar, and one I'll always make time for - it's a valuable opportunity to speak to this audience of colleagues.
This is my last appearance here as Information Commissioner. The corporation sole that has served since 1984 will be retired and replaced by a new Board when the relevant provisions of the Data (Use and Access) Act are brought into effect.
This is also my first, and last, appearance as the Chair of that new Board. I won't dwell on the details of our new governance and what this means in practice - there's a fireside chat with Chief Executive Paul Arnold later today for those who would like a deeper dive.
But it is a good opportunity for reflection. I want to talk briefly about where we've come from, what we've managed to achieve in the last four years, and what is to come.
When I arrived in January 2022, the streets of London were still bare, with the Covid pandemic continuing to keep people away from work - and each other.
I inherited an organisation that had dealt with extraordinary challenges and change. In the four years prior to my appointment, the ICO had needed to implement the GDPR, figure out and prepare for the impact of Brexit, and deal with the unprecedented data and operational challenges of Covid. It had grown fast, and some growing pains were evident.
It took me a little while to understand the scale of the challenges. You helped. I went on my 'listening tour' and heard what you wanted, expected and needed from the ICO.
It became clear to me that you - and industry and consumers - wanted clarity and certainty. Timely guidance, advice and enforcement. There was a sense of anxiety and weariness at the scale and pace of change.
In my first keynote here, I responded to that sense of anxiety with a message of reassurance and the promise of certainty.
Now for my last address, I want to reflect on how we met these challenges at the ICO by choosing both our focus and tools wisely, and how we have built an organisation that I hope will be more resilient to future change.
Change has been the only constant during my tenure at the ICO, to adapt a quote from the ancient Greek philosopher Heraclitus. He famously said: "No man ever steps in the same river twice."
Over the past few years, we've had to adapt once again to a near constant process of law reform, culminating in the Data (Use and Access) Act. There's been a continuous flow of innovation, countless areas where the ICO has needed to step up, move quickly and provide certainty - generative AI, biometric technology, cookies to name a few.
It is against this backdrop of constant change, infinite possible demand for our services and organisational constraints that we have had to make choices.
New issues flow into all parts of the business all the time, pulling us in different directions and making demands on our finite time and resources.
I know you can relate to that challenge - DPOs are in the same boat. You are expected to do more with less, to adapt and change, upskill and stay informed, often within a fixed or shrinking budget.
With near endless demand in a changing environment of new business models, new technology, crises, expectations, risks - the list of things you could prioritise can start to feel insurmountable.
That's why we all - you in your role, and the ICO in ours - need to very deliberately choose our focus. If we tried to respond to all the possible calls on our time and resources at once, we'd be reacting to noise instead of making meaningful interventions to protect the public. We'd be doing lots of stuff, but would we be maximising our impact?
We are rightly challenged on those decisions. Some think we should devote more resources to data protection complaints from the public. Our complaints volumes are rising, from over 40,000 in 2024/25 to 66,000 so far in 2025/26. This is showing no signs of slowing down - we expect we may reach 75,000 by the end of the financial year.
Many complaints bodies and ombudsmen are reporting the same uptick. I do not believe it would be sustainable or responsible to increase our allocation of resources at the same rate at which those complaints numbers are growing.
Others believe the ICO should investigate every breach report that comes in and pursue every organisation with the full suite of enforcement tools at our disposal.
Others call for us to prioritise greater certainty for businesses - more guidance, more resources and more tools that support innovation and reduce friction - and to spend more of our resources working 'upstream', preventing poor practices from putting people's data at risk and creating further demand on our enforcement and complaints teams.
At a recent Select Committee hearing, it was suggested that we should be auditing all Government use of third-party IT applications, and all cloud storage contracts.
These are all valid arguments. But we can't do them all at once - certainly not to 100%. As the regulator, we must make those allocative decisions, and I, as Commissioner, have to be accountable for them.
You may not agree with how we divvy up our resources, but we have increasingly made those choices transparent. So at least you can see the trade-offs that a whole economy, principles-based regulator has to make. That's what I will continue to do as Chair.
We have chosen to focus on interventions that raise data protection standards across the board - for example, our approach to public sector enforcement and recent Memorandum of Understanding with Government. This means using all the regulatory tools available to us to drive change - producing guidance and advice, engaging upstream with companies and leading criminal prosecutions.
It also means taking enforcement action with fines, reprimands and warnings where needed, as you will have seen with our £14m fine to Reddit yesterday.
One of my first enforcement acts in 2022 was fining Clearview AI £7.5m after finding the company illegally scraped billions of images from publicly accessible websites and social media without people's knowledge or consent. We ordered the US-based company to stop collecting data of UK residents and delete existing data held in its systems.
Clearview appealed, as they were entitled to do. Then we appealed. Now we're waiting on a Court of Appeal hearing and frankly, there is little chance that it will be resolved before my time here concludes.
This was an important case for us to pursue - due to the sensitivities of biometric data, the impact on people's rights and the wider implications for issues of jurisdiction over foreign companies processing UK citizens' data. It sent a clear message: we will act to protect the public, wherever in the world a company is based.
But this protracted litigation over so many years is resource intensive, and slow. It often doesn't give industry, or the public, the answers they need, when they need them. That doesn't mean we shouldn't use it - we absolutely should, but we have had to choose our enforcement tools carefully.
Regardless of your organisation or sector, your focus should be on the things that align with your purpose and matter the most to the people you serve, not what is shouting the loudest.
Ask yourself: What outcomes am I responsible for? What would make the biggest difference to my customers and stakeholders if it went well? And what might cause the greatest harm if left unaddressed?
We've made some strategic choices at the ICO by asking ourselves those very questions. Our priorities have been where we believe we could make the most meaningful difference to people in the UK - AI and biometrics, children's privacy and online tracking. As with all our work, we are guided by the potential for harm, but also by the opportunity for public benefit.
Focusing your resources where they matter most, when they matter most, is a skill. It doesn't always pay off. But our work to drive improvements in children's privacy is a great example of real success when you get this right.
Back in 2020, my predecessor first launched the ICO's Children's code, designed to keep children safe on the internet and hold online service providers to account. I'm pleased to say we've now improved online privacy for up to 11 million children, translating the code from policy into meaningful change for the public.
Enforcement action has been crucial here - in 2023, we fined TikTok over £12m for misusing children's personal data. Similar to the Clearview case, we've chosen to pursue TikTok through the courts so we can hold them to account. That work fed directly into our recommender systems investigations, which echo timely concerns we're now seeing in research and litigation in California about social media addiction.
We've also concluded investigations into Imgur and Reddit, fining both companies for failing to use children's data lawfully, including not checking the age of their UK users.
Focusing on children's privacy means we've been able to use that whole spectrum of regulatory tools to successfully drive behaviour change. Some of the largest online platforms have improved their default settings, reduced targeted advertising and included parental controls. We intervened when Snap offered its AI chatbot to children under 13, and we're now turning our attention to the safety of mobile games.
There's been plenty of discussion recently - in the press, in Parliament and in many households - about whether under 16s should be on social media at all. Where does that responsibility lie, and with whom?
This is obviously a far broader societal question, but it shows that safeguarding children online is more important than ever. We made the right choice to invest our resources and focus our attention on making the online world a safer place. And we will continue to do so.
Let's imagine you've now set your focus. Are you in the best possible position to tackle this? Does your organisation need to alter its approach, or its strategy, to be most effective?
At our own conference last October, I talked about the importance of agility. We can't always anticipate what change comes next, but we can control our response and how we adapt.
Improving our agility has been a priority, and I hope the choices we've made over the last few years have made the ICO a more agile organisation. This means we're able to get ahead of harms, to see what's coming next and intervene early - whether with position papers, as we did with generative AI, with guidance, or with enforcement.
I think the ICO is now better prepared for change and capable of adapting quickly to unexpected circumstances, shifting demands and emerging technologies as a result of those choices.
After all, when I first attended IAPP London, ChatGPT didn't exist. Yet within just five days of its release in 2022, it had one million users. I believe that figure is now 10% of the world's population.
Today, AI is part of both our cultural conversations and global infrastructure. We've all seen how quickly it can evolve. It has huge potential to improve how we do our jobs and serve the public, if used with proper oversight.
The fervour around DeepSeek last year, or even AI agents, illustrates the speed at which new tech can appear and demand attention. We need to stand ready to match this speed - and help organisations do the same. Businesses are in a race to market. We have to make sure that people's privacy is not put at risk to win that race.
We've chosen to be a regulator on the front foot, looking ahead with our Tech Horizons reports. By understanding the potential risks of novel technologies - such as neurotech, genomics and agentic AI - we can help to mitigate them before they become real-world harms a few years down the line.
But not all risks can be predicted. We've had to prepare ourselves to pivot when new AI use cases call for scrutiny - such as our live investigation into the Grok AI system and its potential to produce harmful sexualised content using personal data.
If you're already thinking about where your focus could be, I imagine many of you in this room will be turning to AI. We are also here to help you and your organisation engage with emerging technology, and to innovate and invest responsibly. We've been building safe spaces for innovation, supporting firms to explore new technology with the guardrails in place to protect people's data.
For example, our Regulatory Sandbox offers a controlled environment to test out innovative concepts, products or services with us before taking them to market. It's open for expressions of interest, if you decide your focus this year is an innovative use of data.
Despite all this change, our core duties are the same - upholding the law, protecting people's privacy and empowering organisations to use data responsibly. But my office can't remain static when the world we regulate is always on the move.
There will be more changes this year to build our resilience and best position us to meet the challenges of the next decade. As well as our governance transition and relocating our head office to Manchester, we also have a new corporate strategy in the works - an opportunity to recalibrate our focus.
So much of our work remains rooted in those 'bread and butter' data protection issues - whether that's improving how local councils handle requests for care records or raising awareness of the harmful 'ripple effect' of data breaches. That key philosophy - that there's always a person behind the data - underpins all the choices I have made during my time as Commissioner.
That's my takeaway for you - have a think about where you should focus your attention to best serve your organisation, your customers and your stakeholders. You've got a packed agenda ahead of you, so use these sessions to build your knowledge and help you make an informed choice about your focus.
Once you've made that decision - think about how you can achieve it. This is where I hope the ICO can help you. Visit our website, subscribe to our newsletters, join one of our webinars.
Whether you want to focus on securing your cyber defences, adopting a new AI product safely, or improving your SAR response time, you'll find the support you need.
Just like at that first IAPP London in 2022, I want to leave you with a message of reassurance and certainty. As the regulator, we're here to help you navigate change.
That's why we've worked hard to get the most important DUAA guidance out as quickly as possible. And we're expanding our Data Essentials training to help small and medium businesses feel more confident using people's data responsibly in an evolving world.
I'll leave it there as I know we've got some time for questions.
As I touched on at the start, my first act as Commissioner was a 'listening tour.' I felt it was important to hear directly from you about your concerns and what you needed from my office.
This is still the case today. I'm still listening. If there's a gap where you would value our steer or a change we haven't anticipated, please do let us know. I'd encourage you to engage with our upcoming consultations and help us to shape where we go next.
Thank you for listening and enjoy the rest of the event.