
‘Technology ensures that we’re all served our own personalised news cycle. As a result, we only get to hear the opinions that correspond to our own. The result is polarisation’. Or so the oft-heard theory goes. But in practice, it seems this isn’t really true, or at least not for the average Dutch person. However, according to communication scientist Judith Möller, the influence of filter bubbles, as they are known, could indeed be stronger when it comes to groups with radical opinions.

Judith Möller: 'My theory is that filter bubbles do indeed exist, but that we’re looking for them in the wrong place.'

First of all, we need to differentiate between the so-called echo chamber and the filter bubble. As an individual, you voluntarily take your place in an echo chamber (such as in the form of a forum, or a Facebook or WhatsApp group), meaning you surround yourself with people who tend towards the same opinion as yourself. ‘Call it the modern form of compartmentalisation’, says communication scientist Judith Möller, who recently received a Veni grant for her research. ‘People have always had the tendency to surround themselves with like-minded people, and that’s no different on social media.’

Various news sources in parallel prevent a filter bubble

In the filter bubble, by contrast, you are presented only with news and opinions that match you as an individual, on the basis of algorithms and without you being aware of this process. It’s said that this bubble is leading to the polarisation of society: everyone is constantly exposed to ‘their own truth’, while other news gets filtered out. But Möller says that there is no evidence to support this, at least in the Netherlands. ‘We use various news sources in parallel – meaning not only Facebook and Twitter, but also radio, television and newspapers – so we run little risk of ending up in a filter bubble. Besides that, the amount of “news” on an average Facebook timeline is less than 5%. Moreover, it turns out that many people on social media are actually more likely to encounter news that they normally wouldn’t read or seek out, so that’s almost a bubble in reverse.’

Bubbles at the fringes of the opinion spectrum

Nonetheless, a great deal of money is being invested in the use of algorithms and artificial intelligence, such as during election periods. Möller: ‘So there must be something in it. My theory is that filter bubbles do indeed exist, but that we’re looking for them in the wrong place. We shouldn’t look at the mainstream, but at groups with radical and/or divergent opinions who don’t fit into the “centre”. This is where we see the formation of “fringe bubbles”, as I call them – filter bubbles at the edges of the opinion spectrum.’

People with fringe opinions can suddenly become very visible

From spiral of silence to spiral of noise

As one example, the researcher cites the anti-vaccination movement. ‘Previously, this group was confronted with the “spiral of silence”: if you said in public, for instance to friends or family, that you were sceptical about vaccination, you wouldn’t get a positive response. And so, you’d keep quiet about it. But this group found each other on social media, and as a consequence of filter technology, the proponents of this view encountered the “spiral of noise”: suddenly it seems as if a huge number of people agree with you.’

The news value of radical and divergent opinions

And so, it can happen that people with fringe, radical or divergent opinions suddenly become very vocal and visible. ‘Then they become newsworthy, they appear in normal news media and hence are able to address a wider public. The fringe bubble shifts towards the centre. This has been the case with the anti-vaccination movement, the climate sceptics and the yellow vests, but it also happened with the group who opposed the Dutch Intelligence and Security Services Act – no-one was interested initially, but in the end, it became major news and it even resulted in a referendum.’

Consequences can be both positive and negative

‘In my research I aim to go in search of divergent opinions like these, and then I’ll try to determine how algorithms influence radical groups, to what extent filter bubbles exist and why groups with radical opinions ultimately manage, or don’t manage, to appear in news media.’
The consequences of these processes can be both positive and negative, believes Möller. ‘Some people claim that this attention leads people from the “centre” to feel attracted to the fringe areas of society, in turn leading to more extreme opinions and a reduction in social cohesion, which is certainly possible. On the other hand, this process also brings advantages: after all, in a democracy we also need to listen to minority opinions.’