Nov 30, 2019

ALGORITHMIC BIAS: AI Traps and Possible Escapes

Caroline Sinders (Machine Learning Designer/User Researcher, Artist, Digital Anthropologist, USA/DE) and Sarah Grant (Media Artist and Educator, Radical Networks, USA/DE) in conversation with Ruth Catlow (Co-Founder and Co-Artistic Director, Furtherfield, UK).

Algorithms are not neutral and unbiased; instead, they often reflect, reinforce and automate the current and historical biases and inequalities of society, such as social, racial and gender prejudices. This panel frames the issue and aims to discuss some possible escapes. Caroline Sinders discusses what an intersectional feminist AI could look like, and how we could get it. Sarah Grant organises Radical Networks, a community event and arts festival for critical investigations in telecommunications. She will explore how the repeated biases and behaviours we find on the Internet could become patterned into, and spread through, AI systems.

ACTIVATION: Collective Strategies to Expose Injustice

The Art of Exposing Injustice - Part 4

The 18th conference of the Disruption Network Lab www.disruptionlab.org/activation

Photo credit: Maria Silvano