November 12, 2025
Security
Trust, security, and privacy guide every product and decision we make.
Each week, 800 million people use ChatGPT to think, learn, create, and handle some of the most personal parts of their lives. People entrust us with sensitive conversations, files, credentials, memories, searches, payment information, and AI agents that act on their behalf. We treat this data as among the most sensitive information in your digital life, and we're building our privacy and security protections to match that responsibility.
Today, that responsibility is being tested.
The New York Times is demanding that we turn over 20 million of your private ChatGPT conversations. They claim they might find examples of you using ChatGPT to try to get around their paywall.
This demand disregards long-standing privacy protections, breaks with common-sense security practices, and would force us to turn over tens of millions of highly personal conversations from people who have no connection to the Times' baseless lawsuit against OpenAI.
They have tried this before. Originally, the Times wanted you to lose the ability to delete your private chats. We fought that and restored your right to remove them. Then they demanded we turn over 1.4 billion of your private ChatGPT conversations. We pushed back, and we're pushing back again now. Your private conversations are yours, and they should not become collateral in a dispute over online content access.
We respect strong, independent journalism and partner with many publishers and newsrooms. Journalism has historically played a critical role in defending people's right to privacy throughout the world. However, this demand from the New York Times does not live up to that legacy, and we're asking the court to reject it. We will continue to explore every option available to protect our users' privacy.
We are accelerating our security and privacy roadmap to protect your data. OpenAI is one of the most targeted organizations in the world. We have invested significant time and resources building systems to prevent unauthorized access to your data by adversaries ranging from organized criminal groups to state-sponsored intelligence services.
However, if the Times succeeds in its demand, we will be forced to hand over the very data we're protecting, your data, to third parties, including the Times' lawyers and paid consultants.
Our long-term roadmap includes advanced security features designed to keep your data private, including client-side encryption for your messages with ChatGPT. We believe these features will help keep your private conversations private and inaccessible to anyone else, even OpenAI. We will build fully automated systems to detect safety issues in our products. Only serious misuse and critical risks (such as threats to someone's life, plans to harm others, or cybersecurity threats) may ever be escalated to a small, highly vetted team of human reviewers. These security features are in active development, and we will share more details about them, along with other short-term mitigations, in the near future.
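To give a rough sense of what client-side encryption means in practice (this is a generic sketch of the principle, not OpenAI's actual design or implementation): a message is encrypted on the user's device with a key that never leaves it, so the service that stores the message holds only ciphertext it cannot read.

```python
# Generic illustration of client-side encryption (hypothetical sketch,
# not OpenAI's design): plaintext is encrypted on the user's device,
# and only ciphertext ever leaves it, so whoever stores the message
# cannot read it without the device-held key.
from cryptography.fernet import Fernet

# Key generated and kept on the user's device; the server never sees it.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

message = b"A private conversation with the assistant."
ciphertext = cipher.encrypt(message)  # this is all a server would store

# Only the holder of device_key can recover the original message.
assert cipher.decrypt(ciphertext) == message
```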
Privacy and security protections must become more powerful as AI becomes more deeply integrated into people's lives. We are committed to a future where you can trust that your most personal AI conversations are safe, secure, and truly private.
-Dane Stuckey, Chief Information Security Officer, OpenAI
Why are The New York Times and other plaintiffs demanding this?
What led to this stage of the process?
Did you offer any other solutions to the Times?
Is the NYT obligated to keep this data private?
How are these 20 million chats selected?
Is my data potentially impacted?
Are business customers potentially impacted?
What are you doing to protect my personal information and privacy?
How will you store this data?
Who will be able to access this data?
Does this court order violate GDPR or my rights under European or other privacy laws?
Will you keep us updated?