In a world overloaded with information, more and more people are turning to technology to make their decisions for them. Sometimes this just means following recommendations made by Netflix or Spotify. But what happens when technology decides for us what we get to hear about the world? When it chooses which news we get to hear and which we don’t? Valeria Resendez examined this issue during her PhD research. She will defend her dissertation on 25 October at the University of Amsterdam.

Conversational agents (CAs) – such as Google Assistant and Alexa – are increasingly being used to deliver news directly to users. By engaging users in dialogue, CAs can make the experience feel more personal, private and interactive. But is the information they pass on always reliable? The trust we place in technology used to rest purely on its functionality and reliability, but as our relationships with technology become ever more personal, what it means to trust it becomes a far more complicated question.

To trust or not to trust?

Resendez: ‘Imagine using a CA that wakes you up each day with the weather, the meetings you’ve got planned, and the most important news headlines. And then one morning it claims that the recent elections were stolen through massive election fraud. This leaves you confused and worried. Is it true? Where is the information coming from? It is not always possible to know where CAs are sourcing their news from. But if we’ve decided we trust the CA, we’re more likely to just accept what it says.’

By acting as information gatekeepers, CAs can control the flow of information that reaches the public. Access to information naturally shapes so-called public issue salience (i.e., the degree to which a person is concerned about an issue). The greater the trust users place in a recommender to deliver information, the more likely they are to accept its recommendations, and the more their views on a particular subject may be shaped. CAs could therefore begin to shape public opinion on various topics. Resendez’s research raises the possibility that this is already happening, at least on a small scale, and that the underlying mechanisms urgently need further study, given their potential impact on the democratic functioning of societies.

More and more powerful

However, Resendez also found that trust in conversational agents is limited by certain factors inherent to the technology. The dialogue element of interacting with a CA actually leads to a slight decrease in trust, since users find receiving news through dialogue less enjoyable than reading it on a conventional website. Resendez: ‘We all have innate expectations of what a conversation should involve, and CAs are not yet able to interact like another human would, although that may begin to change as the technology becomes more and more powerful.’

Additionally, the brand of the CA can play a role in whether we trust it. If, for example, you trust Amazon, you are more likely to trust Alexa. This also goes for the CAs produced by news organisations. But as trust in the mainstream media continues to drop, will trust in news gathered for us by technology companies continue to grow?

‘By allowing technologies to decide on the best course and timing of different actions, we’re actually unloading cognitive tasks onto these agents,’ says Resendez. ‘We all know that today’s news environment is loaded with pitfalls for users, so we’re delegating very delicate tasks to these technologies and this requires a higher level of trust in their abilities to fulfil our expectations. Are we sure they’re currently worthy of that trust?’

Defence details

Valeria Resendez Gómez: Speaking the news. How Conversational Agents Influence Trust and Issue Salience. Supervisors: Prof. C.H. de Vreese and Prof. N. Helberger. Co-supervisor: Prof. T.B. Araujo.

Time and location

Friday, 25 October, 11:00, Aula