ABS researcher Andrea Weihrauch recently gave a presentation in Washington DC to the White House Presidential Innovation Fellows.

The Presidential Innovation Fellows are a group of the nation's brightest technologists, designers, and strategists. They are selected to work in federal agencies for a period of one year, with the aim of building a resilient government and contributing to stronger public services. The Fellows use data science, design, engineering, and systems thinking to achieve this goal.

Weihrauch was invited to speak by one of the Fellows currently working at NASA. The invitation was based on her research into how technology can be used for societal benefit, and on her research collaborations with non-profit organisations such as UNICEF and the Dutch Tax and Customs Administration.

In her presentation ‘Artificial Intelligence Agents and Diverse Representation - The Complex Reality of Human Stereotypes in a Technology World’, Weihrauch discussed the responsibilities involved in creating entirely new classes of agents that fall somewhere between humans and machines: humanoid robots, human-voiced voice assistants and avatars, for example. She pointed out that designing most voice assistants (like Alexa or Siri) with female voices has consequences. Siri’s ‘female’ servility – echoed by the many other digital assistants projected as young women – is a powerful illustration of gender biases encoded into technology products. It can even strengthen the belief that typical voice assistant tasks should be performed by female(-identifying) humans.

'Black and white robots and chatbots'

She also talked about her own research on robots with ethnic features, which face biases similar to those humans hold against other humans. As an example, she showed the Fellows EMMA, the computer-generated virtual assistant of the U.S. Citizenship and Immigration Services. While not strictly a robot, this chatbot is likely perceived as a White female.

The ABS researcher felt that the subject resonated with the Fellows. One of them summarised her talk by saying: ‘we do not want a reality where black robots have to work harder than white ones’. Weihrauch: ‘This is exactly what I fear as well. We have a chance to do things better in a new technological reality, instead of just reproducing biases and stereotypes, and sometimes even reinforcing them’.