DATA RELATIONALITIES WITHIN SURVEILLANCE CAPITALISM: Workshop #4 with SALOME VILJOEN on DATA RELATIONALITIES

Image: Marc Smith/Flickr, CC BY-SA

[The Ethics of Data Curation is the first in a two-part series of AY 2021-22 workshops organized through a Rutgers Global and NEH-supported collaboration between Critical AI@Rutgers and the Australian National University. Below is the fourth in a series of blogs about each workshop meeting. Click here for the workshop video and the discussion that followed.]

by Kayvon J. Paul (RU Law ’23), Student Fellow, Rutgers Institute of Information Policy and Law

Data governance, as an alternative to the data exploitation model that prevails in the age of information capitalism, was the focus of Salome Viljoen’s November 11 presentation, DATA RELATIONALITIES, the fourth installment in an ongoing interdisciplinary workshop on the Ethics of Data Curation. Viljoen studies data governance and the role of information law in structuring inequality in the information economy. Her answer to the status quo, not least the rise of surveillance capitalism, is what she calls a relational theory of data governance.

Surveillance capitalism, according to Harvard business professor Shoshana Zuboff, emphasizes corporate control and profit-making at the expense of privacy and through the exploitation of personal data. “[S]urveillance capitalism works a dispossession[,] a domination[,] an expropriation[,] and a robbery by usurping people’s control over the data associated with their lives” (Zuboff 2020).

Personal data is the new gold. The more access to personal data a company has, the more valuable that company becomes. This data can be used to predict behavior and to market products. Generally, companies can sell data to other companies and use the personal data they glean for any and all purposes the law allows. Given that information law is relatively new and that information technologies constantly grow and evolve, there is not much at present that companies cannot do with personal data. Common everyday examples of data exploitation include:

  • Mobile apps selling your geographical location data.
  • Internet companies selling your browsing data.
  • Cable and streaming companies selling your viewership and demographic data.
  • Credit companies selling your application and credit history data.
  • E-commerce platforms selling customer and purchasing data.

In processes that, these days, are often described as “artificial intelligence,” many companies run personal data through predictive analytics software to exploit this information and find ways to increase their profits. This may include using personal data to target advertising, deny credit applications, or deny insurance policies, among other uses. It also includes complying with government requests for information (which may help a company build goodwill in exchange for favorable regulatory treatment later on). There is no question that surveillance capitalism has a disparate impact on marginalized communities and low-income individuals. As Michele Gilman, who facilitated the event, noted in her opening remarks, marginalized people often have their digital profiles marked as vulnerable, are often excluded from data algorithms, and are heavily surveilled in all aspects of their lives. Gilman argued that data justice should be at the forefront of any data governance regime.

But how does one go about achieving that justice; or, more specifically, what is the solution to the problem of surveillance capitalism? The exploitation of personal data will only worsen with the continuing digitalization of society, especially with Facebook (now known as Meta Platforms) and Google recently announcing their intentions to build out the “Metaverse,” the so-called third frontier of digital social interaction (with the internet being the first and social media being the second). What type of data governance regime will position us effectively to prevent corporations from exploiting personal data, hold them accountable for the use of such data, and pave the way for the assertion of private property rights? This is one of the biggest issues facing local, state, national, and international regulatory bodies throughout the world. The result so far is a congeries of complex, unorganized, inconsistent, and often overlapping regulatory regimes that still favor big technology and data mining companies.

During her presentation, Viljoen provided a brief overview of what legal theorists describe as the propertarian and dignitarian views of data governance. The former emphasizes that individuals should be compensated for their interests in the personal data from which companies disproportionately profit. To level the playing field, propertarian theory suggests that individuals should be able to command a labor or property “market rate” for their personal data, obtainable at the point of collection to prevent gross unjust enrichment. The dignitarian view, by contrast, asserts that compensation at the point of data collection does too little to prevent unjust enrichment and nothing to prevent future discrimination based on personal data. In other words, a higher threshold of consent and control over the use of personal data over time is warranted.

In her article for the Yale Law Journal (the complete version of the argument that workshoppers read in brief in “Data as Property?”), Viljoen highlights a conceptual flaw that the propertarian and dignitarian perspectives share: both miss that the point of data production in the digital economy is to put people into population-based relations with one another. Emphasizing that these data relations fuel not just the social value but also the harms of data collection and use in a digital society, Viljoen proceeds from the premise that the basic purpose of data production is to relate people to one another. She argues that the law needs to recognize interests in data relations so that those interests can be operationalized and given legal force.

Viljoen offers the example of a population of data subjects whose collected data becomes the basis for behavior models that make predictions or attempt to influence behavior. The affected parties, therefore, are not just the people from whom the data was originally collected but also the people, now and in the future, who share relevant features with that group. Only a relationally driven data governance regime, Viljoen argues, can counter such a data exploitation regime, as the toy sketch below illustrates.
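To make the relational point concrete, here is a minimal, hypothetical sketch (not drawn from Viljoen’s talk; the data, the age brackets, and the threshold rule are invented for illustration). A rule inferred from data that some people chose to share ends up governing a decision about someone who never shared anything but falls into the same population grouping.

```python
# Toy illustration of population-based inference: a rule built from data
# collected from consenting subjects is applied to a person who never
# shared any data but shares a feature (here, an age bracket) with them.

# Hypothetical records collected from consenting subjects:
# (age_bracket, number_of_late_payments)
collected = [
    ("18-25", 3),
    ("18-25", 4),
    ("26-40", 1),
    ("26-40", 0),
]

def build_model(records):
    """Compute a population-level rule: average late payments per age bracket."""
    totals = {}
    for bracket, late in records:
        totals.setdefault(bracket, []).append(late)
    return {bracket: sum(vals) / len(vals) for bracket, vals in totals.items()}

def predict_denial(bracket, model, threshold=2.0):
    """Deny credit if the bracket's average late payments meets the threshold."""
    return model.get(bracket, 0.0) >= threshold

model = build_model(collected)

# This applicant contributed nothing to the dataset, yet the inference
# drawn from others in their bracket now shapes the decision about them.
print(predict_denial("18-25", model))  # True: denied by population association
```

The point of the sketch is simply that the harm (and the value) of data collection travels along these population-level relations, which is why Viljoen argues that governance frameworks focused only on the individual data subject miss the target.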

Viljoen’s presentation on data relationalities provoked a wide-ranging interdisciplinary discussion. What is clear is that policymakers need to take seriously this relationally driven, data-empowerment model of data governance.

With Big Tech companies investing billions in the creation of a new so-called metaverse, portending an ever more complete digitalization of society, a robust data governance regime is long overdue. 
