News beyond the echo chamber 

15 December 2025
When people worry about ‘the algorithm’, they often imagine invisible machines quietly narrowing our horizons. Damian Trilling sees a more complex story and explores how news recommender systems can also broaden what we see online. 

Trilling’s work begins with a straightforward question that carries far-reaching implications: when we open a news app or scroll through a feed, which stories appear on our screens, and why? His research shows that the answer is rarely as simple as claiming that ‘the algorithm keeps us in a bubble.’ 

Some systems can steer users toward a narrow set of topics and viewpoints. Others, however, broaden what people see – and users themselves still decide how closely to follow the suggestions they receive. There is no single algorithm deciding everything, he emphasises, but many different systems that can be designed in very different ways. 

That nuance matters when debates turn to ‘filter bubbles’ and ‘echo chambers’. ‘Of course, fully algorithmic news curation can have unwanted effects,’ he says, ‘especially on media where there is a very tight connection between a user’s action and the algorithm, such as on TikTok.’  

At the same time, most users do not simply accept whatever algorithms suggest, nor do they restrict themselves to just a handful of topics. In experiments, Trilling and his team have shown that it is possible to build news recommenders that gently widen people’s scope without alienating them. Diversity, in other words, does not need to come at the expense of relevance or enjoyment. 

What does ‘fair’ really mean? 

These findings raise a thorny question: what would a ‘fair’ news algorithm look like? For Trilling, fairness is not a single technical target that engineers can code into a system. It is a contested, deeply political idea. During elections, for instance, should all parties appear equally often in recommendations? Should their presence reflect polling numbers, parliamentary size, or other criteria? Each choice reflects a different vision of how democracy should work. That is why he argues that ‘algorithmic fairness is not primarily a technical problem, but maybe even more a conceptual one.’  
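The competing criteria Trilling mentions can be made concrete with a small sketch. The snippet below contrasts two of the fairness notions named above: giving every party equal exposure versus mirroring polling numbers. The party names and poll figures are invented for illustration, and a real recommender would of course involve far more than this.

```python
# Illustrative sketch: two contested notions of 'fair' party exposure
# in a news recommender. All names and numbers are hypothetical.

poll_share = {"Party A": 0.40, "Party B": 0.35, "Party C": 0.25}

def equal_exposure(parties):
    """Every party appears equally often, regardless of size."""
    return {p: 1 / len(parties) for p in parties}

def proportional_exposure(polls):
    """Exposure mirrors polling numbers."""
    total = sum(polls.values())
    return {p: share / total for p, share in polls.items()}

print(equal_exposure(poll_share))         # each party gets one third
print(proportional_exposure(poll_share))  # exposure mirrors the polls
```

Both functions return valid exposure distributions, yet they encode different visions of democratic representation, which is exactly why the choice between them is conceptual rather than technical.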

In this context, the Digital Services Act (DSA) offers a critical tool. Instead of prescribing a single model of fairness, its transparency provisions enable detection of systematic, unexplained advantages or disadvantages for specific actors, viewpoints, or sources.  

Access to ‘exposure data’ – information about what users have actually seen, not only what they have liked or shared – under the DSA could one day make it possible to study such patterns in detail.  
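A hypothetical sketch of the kind of analysis such exposure data could enable: comparing how often each source was actually shown to users against its share of published articles, and flagging large, unexplained deviations for scrutiny. The outlet names, counts, and thresholds below are invented for illustration; they are not from Trilling's research or any real DSA dataset.

```python
# Hypothetical example: detecting systematic exposure skew from
# (invented) 'what was shown' counts versus publication counts.

from collections import Counter

shown = Counter({"Outlet X": 620, "Outlet Y": 300, "Outlet Z": 80})
published = Counter({"Outlet X": 400, "Outlet Y": 350, "Outlet Z": 250})

def exposure_skew(shown, published):
    """Ratio of exposure share to publication share per source (1.0 = neutral)."""
    shown_total = sum(shown.values())
    pub_total = sum(published.values())
    return {
        s: (shown[s] / shown_total) / (published[s] / pub_total)
        for s in published
    }

for source, ratio in exposure_skew(shown, published).items():
    flag = "  <- flag for scrutiny" if ratio < 0.5 or ratio > 1.5 else ""
    print(f"{source}: {ratio:.2f}{flag}")
```

Here Outlet X is shown far more often than its output alone would predict and Outlet Z far less, the sort of systematic, unexplained advantage or disadvantage the DSA's transparency provisions could help researchers detect.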

At the same time, Trilling points to a new dilemma. Research funders increasingly expect open research data, yet data accessed under the DSA will be subject to strict restrictions. Projects relying on these channels will therefore have to find fresh ways to be transparent without violating legal constraints.


Follow the news flow 

His ERC Starting Grant project NEWSFLOWS zooms out from individual feeds to the broader media ecosystem. Its starting point is that individual stories now travel in an ‘unbundled’ way: they are published on one site, shared and reshaped on social media, filtered by algorithms, and then picked up by journalists and politicians elsewhere.

Rather than isolated bubbles, Trilling’s work focuses on feedback loops – reinforcing or self-correcting cycles in which user reactions, editorial decisions and recommender systems continuously influence one another. 

To capture this complexity, the team uses computational methods such as online field experiments, invites citizens to donate data about their news use, and applies automated content analysis to trace how different pieces of information spread. They study both algorithmic feedback loops (in recommendation systems) and human feedback loops (for example, how audience metrics influence newsroom decisions), revealing how stories gain momentum and what this means for democratic decision-making. 

For Trilling, the stakes are clear. A better understanding of news flows is not just an academic exercise. It can help media organisations and technology companies respond to calls for ‘responsible AI’ in a concrete way – by building products that support pluralism, respect user agency, and avoid invisible distortions of the public sphere. 

Damian Trilling is Professor of Journalism Studies at Vrije Universiteit Amsterdam.