05/07/2026 | Press release
The European Committee of the Regions calls for a 'safety-by-design' approach, mandatory child rights impact assessments and a leading role for local and regional authorities in media literacy.
The digital space is not safe enough for children and young people, and the responsibility for addressing this situation should fall upon platforms and regulators, rather than minors themselves. This is the main conclusion drawn from the opinion on 'The Protection of Youth and Minors in the Digital Sphere', drafted by rapporteur Heike Raab (DE/PSE), Secretary of State of the Rhineland-Palatinate Government, which was adopted unanimously at the plenary session of the European Committee of the Regions on 6 May 2026.
While acknowledging that digitalisation broadens access to information, fosters creativity, and facilitates cross-border participation, the CoR warns that it simultaneously amplifies harms and risks. Exposure to incitement to hatred has increased in recent years, cyber-bullying is causing children to withdraw from social life, and disinformation undermines minors' participation in democratic processes. Furthermore, deep-fakes and chatbots created by generative AI introduce a further layer of risk that existing regulatory frameworks have yet to address adequately.
Cities and regions assert that the very design of platforms is part of the problem. Opaque recommendation algorithms and deliberately addictive interaction mechanisms cause direct harm to minors. The opinion calls for regulatory measures to prohibit or restrict practices that encourage addiction, such as 'loot boxes' in video games, and demands that design mechanisms promoting compulsive use be made fully transparent.
Platforms must be held clearly and effectively accountable
According to the CoR, a small number of international providers dominate the market, while benefiting from a preferential liability regime that leaves little direct legal responsibility for content. Local and regional authorities demand that this imbalance be corrected and firmly reject any transfer of responsibility to minors. They also reject blanket bans on social networks, which would impose restrictions on young people's rights to information, privacy, and participation. An age-based prohibition cannot replace meaningful obligations for platforms or requirements for 'safety by design'.
The opinion is nonetheless clear that age-based regulation and platform accountability are not mutually exclusive. The Committee of the Regions acknowledges that, on the basis of mandatory age verification, a minimum age of 14 for access to certain social media services could be envisaged, combined with enforceable age-appropriate design standards for platforms serving users up to 16 years of age. Rather than a blunt instrument, the CoR presents this as a targeted, proportionate response to the specific protection needs of different age groups.
CoR members welcome the fact that Europe is at the forefront of regulation with instruments such as the Audiovisual Media Services Directive (AVMSD) and the Digital Services Act (DSA), but advocate for consistent regulatory enforcement. In this regard, they urge the rigorous application of Articles 28 and 34 of the DSA to large platforms and call for clarification of the precedence of the AVMSD over the DSA regarding media content. Mandatory impact assessments on children's rights for all digital services are also requested.
On technical protection, and in line with the approach taken by the European Commission, cities and regions support age verification systems as an effective solution, provided they are proportionate and fully respect privacy, and do not exclude vulnerable groups. They also request that Member States implement AVMSD rules concerning influencers. Platforms must adopt a 'safety by design' approach tailored to minors, eliminating dark patterns such as infinite autoplay, manipulative notifications, and reward loops.
Local authorities - partners in enhancing media literacy
The opinion identifies local and regional authorities as key actors and multipliers in strengthening and promoting media literacy. Territorial disparities in connectivity, digital literacy, and access to advisory and support services can exacerbate risks in urban and rural regions alike, and particularly in those with structural weaknesses, thereby worsening social inequalities. Any EU measure must take these differences into account.
The CoR emphasises the need to raise children's awareness of both the opportunities and the risks of the digital sphere, and calls for guidelines for teachers on disinformation to be reviewed and strengthened.
Glenn Micallef, Commissioner for Intergenerational Fairness, Youth, Culture and Sport, who participated in the debate during the plenary session, underscored that protecting young people in the digital environment has become a societal responsibility, crucial for supporting minors and, in some cases, saving lives. He stressed that this objective should be pursued through three core areas: regulation, prevention, and empowerment.
Quotes
Rapporteur Heike Raab (DE/PSE), Secretary of State of the Rhineland-Palatinate Government and Chair of the SEDEC Commission: "Digital technologies are shaping our lives, including those of our children. We want them to benefit from these innovations, but we must also protect them from the dangers of the internet. Age-appropriate use of digital services and improved media literacy are vital for our cities and regions. Platforms must take responsibility."
Glenn Micallef, Commissioner for Intergenerational Fairness, Youth, Culture and Sport: "We would never send children into deep water before they learn to swim. So why should we let them navigate the digital world without the tools to do so safely? That is why we must invest in digital literacy for children, parents, and educators."
Contact:
Víctor Moreno Morales de Setién
Tel: +32 475999662
[email protected]