GUEST FORUM: Addressing Discriminatory Data: Talk with Wendy Hui Kyong Chun

[Critical AI’s Guest Forum welcomes writers on topics of potential interest to our readers inside and outside of the academy. The post below describes “Discriminating Data,” the fourth AY 21-22 event organized by the Digital Humanities and Media Studies Initiative as part of the University of Connecticut Humanities Institute’s focus on “The Future of Truth.”]

by Julia Brush, PhD Candidate in English, University of Connecticut

Discriminatory data is everywhere. It undergirds our recognition software and predictive algorithms, and it supports the formation of echo chambers of misinformation with serious consequences on both individual and collective scales. While discussions of Big Data’s privacy and surveillance problems are of great importance, it is also imperative that we consider the discrimination that is built into our technology and how that discrimination consequently perpetuates itself with dastardly effects. By acknowledging that our technology is discriminatory, can we move forward not only to address the problem but also to create a future in which encoded discrimination is rectified?

This question animated Wendy Hui Kyong Chun’s talk “Discriminating Data,” presented on November 18 as part of the University of Connecticut’s Digital Humanities and Media Studies Initiative. Chun, the Canada 150 Research Chair in New Media at Simon Fraser University and the leader of the Digital Democracies Institute, discussed her recently published book, Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition, in conversation with Yohei Igarashi. When asked how she came to this project, Chun shared that she has been writing and thinking about this book since the beginning of her academic career as a student. Building on her background in Systems Design Engineering alongside her training as a humanities scholar, Chun punctuates her work with mathematical illustrations from Alex Barnett to emphasize the need for basic programming knowledge in the humanities. Chun stresses in the introduction to her book that interdisciplinary collaboration has the power to “create different worlds in which we can live in difference, and in which freedom finally becomes meaningful, because it is freedom for all” (xi).

However, in order to look toward meaningful freedom, we must reckon with the realities of how discrimination dominates our existing structures. In her talk, Chun explored correlation, homophily, authenticity, and recognition to provide contexts for our ongoing relationships with discriminatory data, as well as to posit alternative engagements that would counter the dangers of discrimination. Chun explained how the correlation of our data, through linear regression models, creates a present and marks out a future that is dependent upon a specifically curated past. Joining a conversation already on view in Cathy O’Neil’s Weapons of Math Destruction (2016), Chun discussed both the limitations and the possibilities of these computational technologies, stressing the importance of their history as well as the ways these programs operate.
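To make the mechanics of that claim concrete, here is a minimal sketch in Python (the data and variable names are invented for illustration and are not drawn from Chun’s book): a linear regression fitted to a curated set of past observations can only extend the trend that past encodes, so its “prediction” of the future is a projection of that past.

```python
import numpy as np

# Invented example: a curated "past" whose pattern already reflects prior
# decisions (the numbers here are made up purely for illustration).
years = np.array([2015, 2016, 2017, 2018, 2019, 2020])
scores = np.array([3.1, 3.0, 2.8, 2.7, 2.5, 2.4])

# Fit a simple least-squares linear regression to the past observations.
slope, intercept = np.polyfit(years, scores, deg=1)

# The "prediction" for the future is just the past trend, extended.
future_years = np.array([2021, 2022, 2023])
predicted = slope * future_years + intercept
print(dict(zip(future_years.tolist(), predicted.round(2).tolist())))
```

Whatever pattern the curated data carries, discriminatory or otherwise, is what the fitted line extends; nothing in the procedure questions the past it was given.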

Chun looked to the values that our algorithms take for granted and render axiomatic. With the rise of homophily, or the concept that “neighborhoods” are created through similarities between people, comes the use of predictive algorithms to suggest connections, promote content, and drive consumerism. This idea that “birds of a feather flock together,” however innocuous it may appear, stokes hate between individuals who differ, as difference becomes a threat to homophilic formations. At the same time, homophily gains traction through rage and disturbance, creating bases of supposedly like-minded individuals who conspire together to achieve dominance over their perceived others. One need only think of the Cambridge Analytica scandal to see how correlation creates and manipulates these dangerous solidarities for political and commercial gain.
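As a rough illustration of how homophily gets operationalized (a toy sketch, not any platform’s actual recommendation system), a recommender can rank potential “neighbors” purely by similarity of past behavior, so that the connections it suggests are those most like what a user already does and likes.

```python
import numpy as np

# Invented toy data: rows are users, columns mark items each user engaged with.
interactions = np.array([
    [1, 1, 0, 0, 1],   # user 0
    [1, 1, 0, 0, 0],   # user 1
    [0, 0, 1, 1, 0],   # user 2
    [0, 1, 1, 1, 0],   # user 3
], dtype=float)

def cosine_similarity(a, b):
    """Similarity of two engagement vectors (1.0 means identical tastes)."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Rank everyone else by similarity to user 0: the "neighborhood" proposed
# is, by construction, made up of the most similar users.
target = interactions[0]
others = [i for i in range(len(interactions)) if i != 0]
neighbors = sorted(others, key=lambda i: cosine_similarity(target, interactions[i]), reverse=True)
print(neighbors)  # most similar first; the most different user comes last
```

Because similarity is the only criterion, sameness is rewarded by construction and the most different user is ranked last, which is precisely the dynamic in which difference registers as a threat.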

As Chun noted in her talk, homophily assumes that people are static, do not experience change, and cannot connect to create communities of difference. Chun stated that, like the inner workings of complex machine learning software, “people are opaque”; while correlation can make predictions, the discriminatory data beneath these correlations presumes that discrimination and aversion to difference are perpetual: a prediction that helps to create a self-fulfilling prophecy.

Conducive to maintaining these homophilic structures is the valuation of “authenticity,” which Chun identified as a contradictory premise. Authenticity, or the perception of authenticity, is imbricated in the political and social spheres of the United States. Claims of authenticity profess a devotion to one’s own “truth,” but this truth is only acceptable when it appeals to the authority of those empowered by it; namely, those who retain power within the existing structures of discrimination. In the same vein, Chun spoke of the contradiction between the supposed freedom delivered by these technologies and the high degree of control they actually exert. Control and authority, in both cases, serve to secure freedom for those who are already free, reifying discrimination as a “truth” that can be “authenticated” through homophily.

Despite these challenges, Chun ultimately believes that there is opportunity to change the course of history, both in how we reckon with the past and in how we move into the future. In reckoning with the past, Chun pointed to Ariella Aïsha Azoulay’s “potential histories” as a means through which we can expand and reconfigure our technologies and understandings to reflect those who have been historically erased, silenced, and displaced.

Further, as Chun argued, the challenge is not to fix the model but to fix the world. As an example, she used climate change models to stress that while a model may predict a certain future, this does not mean that action cannot change its course. The findings suggest that global temperatures will continue rising, but those findings are predicated on humans making no changes to stop that rise. If we use the same framework to think through discriminatory data, we can take ownership of past discriminatory practices and resolve to address them going forward.

While the amassing of so-called Big Data can selectively show us what is and what has been, data does not determine what actions we might take to create a new future. This effort, of course, does not belong to just one collective or discipline, but requires connection through difference to take on the challenges ahead.
