For the Zoom link, please send an email to firstname.lastname@example.org.
Artificial intelligence applications play an increasingly important role in our daily lives, but these technological advances come with serious societal risks. In recent years, for instance, we have seen many cases of machine learning applications that exhibit unfairly biased behaviour towards particular groups or individuals. This has led to growing concerns about harmful discrimination and the reproduction of structural inequalities once these technologies become institutionalized in society. As a result, considerable effort is currently being invested in identifying and resolving algorithmic discrimination.
In AI research and policy, however, remedies against algorithmic discrimination are often narrowly framed as design problems rather than as complex, structural, socio-political problems. As a consequence, these remedies are delegated to AI researchers and technology companies. The result is a highly technocentric, individualist approach that hinders inclusive AI governance and democratic decision-making.
Lanzing and Schulz’s innovative objective is to challenge this technocentric approach. They aim to provide AI researchers with a clear conceptual framework for the notion of bias in machine learning, one that is grounded in the needs of AI researchers while also highlighting the normative and socio-political dimensions of the problem.
Dr. Marjolein Lanzing is an Assistant Professor in Philosophy of Technology at the University of Amsterdam. Previously, she worked on the Googlization of Health as a postdoc on the ERC project Digital Good (PI: Tamar Sharon) at the Interdisciplinary Hub for Security, Privacy and Data Governance (Radboud University). She completed her PhD research, 'The Transparent Self: A Normative Investigation of Changing Selves and Relationships in the Age of the Quantified Self', at the 4TU Centre for Ethics and Technology (Eindhoven University of Technology). Marjolein studies ethical and political concerns raised by new technologies, in particular concerns regarding privacy and surveillance (autonomy, discrimination, manipulation and commodification), and what these mean for the way we understand ourselves and our social relationships. She is a board member of Bits of Freedom, an NGO that protects online freedom and (digital) civil rights.
Dr. Katrin Schulz is an Assistant Professor (UD 1) in experimental methods for AI and logic at the Institute for Logic, Language and Computation (ILLC). A central theme in her research is how we make predictions and what makes them successful. She studies these questions from the perspectives of linguistics, philosophy, cognitive science and artificial intelligence. As part of this, Katrin also works on stereotyping and bias, and on the role that new media and AI play in reinforcing them in society. She leads an NWO Open Competition Digitalisation SSH project, 'The biased reality of online media – Using stereotypes to make media manipulation visible'. Together with Leendert van Maanen (Utrecht University), Jelle Zuidema (University of Amsterdam) and two PhD students, she is developing tools and methods for (i) measuring bias in computational language models and (ii) using these measures to quantify the influence of media coverage on the beliefs of media consumers.
Together, Marjolein and Katrin study the conceptualisation of algorithmic injustice in AI. By clarifying the limitations of how this type of injustice is currently framed, they aim to provide angles for more effective interventions against the harm caused by new AI technologies.
Dr. Eva Groen-Reijman will moderate the PEPTalk. She is a lecturer in ethics and political philosophy and a postdoctoral researcher on democratic theory and political microtargeting in the NWO-funded interdisciplinary project Safeguarding Democratic Values in Digital Political Practices. She received her PhD (cum laude) from the University of Amsterdam for her thesis 'Deliberative Political Campaigns'.