European Commission - Directorate General for Communications Networks, Content and Technology

02/17/2026 | Press release | Distributed by Public on 02/17/2026 06:11

Two years of the Digital Services Act: platforms reverse almost 50 million content moderation decisions

In just two years, online platforms have reversed almost 50 million decisions affecting users' content or accounts, helping users exercise their Digital Services Act (DSA) rights online in the EU.

With the DSA, users in the EU are more empowered online, online platforms face greater accountability, and the online environment is more transparent.

This instrument, the first of its kind in the world, gave users the right to challenge platforms' content moderation decisions that restrict, suspend, delete or 'shadow ban' their content or accounts. Since the DSA began to apply, 30% of the 165 million content moderation decisions that users appealed through platforms' internal complaint mechanisms have been reversed.

Notably, in the first half of 2025, 99% of content moderation decisions were taken by platforms to enforce their own terms and conditions, rather than to remove content reported as illegal under EU or national law.

In the first half of 2025, out-of-court settlement bodies reviewed over 1,800 disputes related to content on Facebook, Instagram and TikTok in the EU, overturning the platforms' decisions in 52% of the closed cases - restoring content and accounts, in a faster and cheaper way than going to court.

The DSA has also driven concrete changes in user safety and wellbeing. Thanks to this legislation, targeted advertisements to minors on online platforms have been prohibited in the EU since 2024. The DSA also obliges online marketplaces to counter the spread of illegal goods, improve the traceability of traders, and quickly inform customers who purchased any illegal product on their marketplace, offering options for redress.

An additional merit of this legislation is that researchers and civil society now have unprecedented access to information on platforms' processes and content moderation practices in the EU, enabling them to hold platforms accountable for their decisions.

Find more information about the impact of the Digital Services Act on digital platforms.

European Commission - Directorate General for Communications Networks, Content and Technology published this content on February 17, 2026, and is solely responsible for the information contained herein. Distributed via Public Technologies (PUBT), unedited and unaltered, on February 17, 2026 at 12:11 UTC.