Algorithms, Surveillance, and Data Ethics

This week’s unit on algorithms, surveillance, and data ethics took an intriguing look at the various ways people are monitored and represented in the digital sphere. I was previously aware of these issues in DH in an abstract way, but they came into sharper focus when I read Ruha Benjamin’s edited collection Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life as part of my ENG802 Race, Gender and the Human class in Spring 2020. It provided a vast number of examples of how surveillance operates in society and particularly targets marginalized communities, alongside generative ways of responding. The book felt especially relevant at the time given the massive shift online during the COVID-19 pandemic, and I’ve been wanting to learn more since then.

Sharon Block’s article “Erasure, Misrepresentation and Confusion: Investigating JSTOR Topics on Women’s and Race Histories” for Digital Humanities Quarterly describes the various ways in which the academic journal database JSTOR’s algorithm mislabels scholarship relating to women and BIPOC issues, and how its attempts to rectify the issue are passive, relying on scholars to do the extra work of reporting incorrect classifications. Block points to several searches in which essays about women’s history came back with “Men” as the top keyword. Rather than acknowledge there might be a problem in need of fixing, JSTOR responded that the term simply happened to be used more often; by searching the texts themselves, Block was able to show this was false, since female-associated words appeared far more frequently. Similarly, Block found that JSTOR often problematically conflates terms associated with Black women, skewing the perceptions of articles in its database and misrepresenting Black women’s histories. Block acknowledges that while faculty might be able to deduce these problems and work around keywords, scholarship is ultimately being misrepresented and is at risk of being misused by students or non-academic researchers attempting to navigate JSTOR’s search.
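Block’s check against JSTOR’s explanation is, at its core, a simple term-frequency comparison: count how often female-associated versus male-associated words actually appear in an article’s text and see whether the assigned topic label matches. A minimal sketch of that kind of check in Python follows; the word lists, function names, and sample text are hypothetical placeholders for illustration, not Block’s actual term sets or method.

from collections import Counter
import re

# Hypothetical word lists for illustration; these are not Block's actual term sets.
FEMALE_ASSOCIATED = {"women", "woman", "female", "she", "her", "gender"}
MALE_ASSOCIATED = {"men", "man", "male", "he", "his"}

def term_frequencies(text):
    """Count lowercase word tokens in a document."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def compare_keyword_claim(text):
    """Tally female- vs. male-associated terms as a rough check
    on an assigned topic label such as 'Men'."""
    counts = term_frequencies(text)
    return {
        "female_associated": sum(counts[w] for w in FEMALE_ASSOCIATED),
        "male_associated": sum(counts[w] for w in MALE_ASSOCIATED),
    }

# Placeholder text standing in for an article's full text.
sample = "Women organized the strike; the women workers demanded that men in management listen."
print(compare_keyword_claim(sample))  # {'female_associated': 2, 'male_associated': 1}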

In her essay “Finding Fault with Foucault: Teaching Surveillance in the Digital Humanities” for The Journal of Interactive Technology & Pedagogy, Christina Boyles outlines the importance of separating conversations about surveillance from Foucauldian theory, particularly when teaching students who are fairly new to the topic. Although Boyles herself initially used Foucault when helping students conceptualize what surveillance means in society, she has come to see that this approach does not fully represent today’s surveillance state and how it disproportionately impacts Black and Brown bodies. Boyles advocates adopting a decolonial approach to understanding surveillance and developing ethical communal values for dealing with issues of surveillance. She suggests that this can happen through lessons that assess non-digital as well as digital modes of surveillance and through an ethics of care that maintains awareness of positionality and levels of risk.

Safiya Umoja Noble’s highly influential book Algorithms of Oppression: How Search Engines Reinforce Racism has played a big role in shaping scholars’ understandings of problematic algorithms (and was in fact cited by both Block and Boyles in their work). Noble demonstrates the ways in which algorithms display bias and the danger of separating this technology from the people who program it. In particular, Noble highlights how Google searches reproduce racism by generating results that are harmful to BIPOC groups; for instance, her search of the term “black girls” initially returned primarily pornographic sites, although Google has since fixed this. She complicates the notion that search engines are apolitical and points to the way search results are largely shaped by paid advertisers, a fact of which very few users of search engines are aware. Noble also addresses the ethical issues of portrayals of identity on the web, along with the right to be forgotten. She problematizes terms such as the digital divide and discusses the problem of placing the responsibility on younger generations to fix the situation when there are plenty of female and BIPOC coders, social scientists, and humanists who could help improve the field. Yet Google continues to brush off responsibility, claiming it cannot control the algorithm, even though it has demonstrated that with enough pushback (or laws against certain search results) it can influence the algorithm, although this raises further questions about which harms get recognized and excluded. Ultimately, Noble hopes that “this book can open up a dialogue about radical interventions on socio-technical systems in a more thoughtful way that does not further marginalize people who are already in the margins. Algorithms are, and will continue to be, contextually relevant and loaded with power” (171).
