Discriminating Systems

The artificial intelligence industry is in the midst of a diversity and inclusion crisis: while the representation of women in computer science has recently fallen below 1960s levels, inclusion across lines of race, gender, and ability in AI appears to be even worse. This is reflected in biased artificial intelligence systems: voice recognition that can’t ‘hear’ women’s voices, facial identification systems that struggle to identify people with darker skin pigmentation and trans people, and autonomous vehicles that pose safety risks to children and disabled people. Discriminating Systems surveys the landscape of gender, race, and power in the field of artificial intelligence, and its implications for algorithmic injustice.


West, S.M. (2021). Intersectionality and Human-Machine Communication. In McEwen, R., Jones, S. and Guzman, A. (Eds.) The SAGE Handbook on Human-Machine Communication. New York: SAGE Publishing.

West, S.M. (2020). Redistribution and Rekognition: A Feminist Critique of Algorithmic Fairness. Catalyst, 6(2). https://catalystjournal.org/index.php/catalyst/article/view/33043.

West, S.M., Whittaker, M. and Crawford, K. (2019). Discriminating Systems: Gender, Race and Power in AI. AI Now Institute. https://ainowinstitute.org/discriminatingsystems.pdf.

© 2020 by Sarah Myers West