JRC - Joint Research Centre

04/16/2026 | Press release | Distributed by Public on 04/16/2026 06:05

Fractured reality: how algorithms fuel polarisation and affect democracy

Platform algorithms blend true, misleading and false information, making it ever harder for people to know what is true.
© DC Studio - stock.adobe.com 2026

Trustworthy and accurate information is essential to democracy. The increased digitalisation of the information space and the use of algorithms to influence what people see online make it ever harder for audiences to find common ground on what is true or false. Misleading information and the emergence of conflictual echo chambers and platforms - where people are exposed to toxic, negative and ideologically segregated information - divide society into opposing camps. This erosion of a shared sense of how we see the world leads to so-called 'fractured realities', which pose a clear threat to the future of European democracy.

A new JRC report looks into the rise of polarisation and the erosion of public trust in governing institutions in the EU and other democracies worldwide. It outlines the challenges facing digital information spaces in the EU, and their effect on democracy. The report also sets out policy recommendations to safeguard the EU's digital information space and promote the EU's democratic resilience.

How information overload amplifies the attention economy

Today, the vast and growing amount of online information is increasingly rife with false or misleading information. As a result, meaningful and factual information often gets lost in the noise.

This information overload tends to make people spend more time on negative, emotional or conflict-driven content. Information that triggers an emotional reaction or that is in line with what users already believe is more likely to go viral.

This has led to increased polarisation, narrowed political or social views, emotional outrage and higher engagement with low-quality content. Lower trust, higher visibility of misleading actors, and distorted political discourse can pose a serious threat to democracy.

The three challenges facing the democratic information space

The study shows that there are three main challenges:

  • technology that is optimised to exploit human cognition, i.e. the way people think and what they pay attention to,
  • the platforms' underlying business models,
  • geopolitics.

On the human cognition level, platform users increasingly consume information in a passive way, i.e. they rely on what they are offered. The so-called "News Finds Me" perception describes how users no longer think that they must search for the news to be well-informed. The content automatically appears on their feeds, and users feel well-informed despite only seeing a particular facet of reality.

On a larger scale, this siloed consumption of information is driven by business models designed to make people spend more time than intended on platforms. Moreover, the study contains examples of how foreign-controlled platforms have used algorithms to further their own interests, which could, in turn, fuel extremist narratives.

A better understanding of misinformation

Simply fighting misinformation is not enough to curb the dangers posed by fractured realities. A new understanding of the power of online platforms and their products is needed. Recently, the term "fantasy-industrial complex" has emerged as a new way of looking at today's mix of mis- and disinformation, deception and facts. In the fantasy-industrial complex, different actors - including politicians, media outlets, influencers and citizens - co-create their own curated versions of reality. The goal of this information manipulation is no longer to make citizens believe false claims, but to distract and sow distrust.

As a result, today's increasingly dominant digital information space favours extreme, divisive and emotive positions. This makes it difficult for people to agree on what is real, hampering the consensus that democracy relies on.

Winning the technological competition alone is not enough

The report identifies opportunities for the EU to take the lead in building digital democratic resilience and information integrity.

Digital sovereignty could play a key role in safeguarding democracy and limiting foreign influence. It could help shape technology and business models and protect the integrity of information. This includes achieving sovereignty over critical software, hardware and data infrastructures.

Recommendations also include:

  • supporting decentralised alternatives,
  • encouraging business model change,
  • restoring user autonomy online,
  • creating alternative public spaces online and offline, free from the attention economy.

By encouraging better business models and digital sovereignty, the EU can foster a healthy information space that supports a consensual yet plural understanding of reality, while preserving citizens' digital autonomy.

Background

This report informs the European Commission's work to protect and promote resilient democracies, including in the context of the rollout of measures under the European Democracy Shield. This initiative provides a strategic approach to safeguarding, strengthening and promoting democracy in the EU, including a series of actions to strengthen the collective capacity to counter foreign information manipulation and interference and disinformation threats.

Furthermore, the Digital Services Act, applicable since 2024, helps protect democracy by requiring Very Large Online Platforms and Search Engines to assess and mitigate systemic risks their services could pose to citizens and societies, including the spread of disinformation and the use of design choices that harm users' mental and physical wellbeing.

Related Links

JRC report: Fractured reality: How democracy can win the global struggle over the information space

Fractured reality page on the Knowledge for policy website

Knowledge for policy website

Details

Publication date
16 April 2026
Author
Joint Research Centre
JRC portfolios 2025-27