[Data Ontologies is the second in a two-part series of AY 2021-22 workshops organized through a Rutgers Global and NEH-supported collaboration between Critical AI @ Rutgers and the Australian National University. Below is the seventh in a series of blogs about each workshop meeting. Click here for the workshop video and the discussion that followed.]
by Atif Akin (Art & Design, Rutgers)
Tega Brain is an Australian-born artist and environmental engineer whose work examines how technology shapes ecology. She is an Industry Assistant Professor of Integrated Digital Media at New York University. She has created wireless networks that respond to natural phenomena, systems for obfuscating fitness data, and online smell-based dating services. Brain received a Bachelor of Environmental Engineering and a Bachelor of Arts from the University of New South Wales in 2006, and completed a Master of Art at the Queensland University of Technology in 2012.
Sam Lavigne is an artist and educator whose work deals with data, surveillance, cops, natural language processing, and automation. Born in San Francisco, Lavigne studied Comparative Literature at the University of Chicago and holds a Master of Professional Studies from the Interactive Telecommunications Program (ITP) at New York University. He has since taught at ITP/NYU, The New School, and the School for Poetic Computation, and was formerly a Magic Grant fellow at the Brown Institute at Columbia University and Special Projects editor at the New Inquiry. Sam is currently an assistant professor in the Department of Design at the University of Texas at Austin.
Sam Lavigne defined scrapism as a practice combining aspects of data journalism, conceptual art, and hoarding, one that offers a methodology for making sense of a world in which everything we do is mediated by internet companies. Web scraping describes techniques for automatically downloading and processing web content, or converting online text and other media into structured data that can then be used for various purposes.
In other words, it is the automation of web browsing, or "surfing" in the language of the 1990s. The same technique is employed by major Silicon Valley companies to extend their existing databases; search engines are, at their core, web scrapers.
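To make the idea concrete, here is a minimal sketch of that conversion from web content to structured data, using only the Python standard library. The page content is inlined (a hypothetical apartment-listing snippet) rather than fetched over the network, and all names in it are invented:

```python
from html.parser import HTMLParser

# Hypothetical listing page, inlined so the example runs offline.
PAGE = """
<html><body>
  <a class="listing" href="/apt/1">Sunny 2BR in Queens</a>
  <a class="listing" href="/apt/2">Loft in Brooklyn</a>
</body></html>
"""

class LinkScraper(HTMLParser):
    """Collect (href, title) records for every <a> tag:
    unstructured HTML in, structured data out."""
    def __init__(self):
        super().__init__()
        self.records = []
        self._href = None  # href of the <a> tag we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        if self._href is not None and data.strip():
            self.records.append({"href": self._href, "title": data.strip()})
            self._href = None

scraper = LinkScraper()
scraper.feed(PAGE)
print(scraper.records)
```

In a real scraper the `PAGE` string would come from an HTTP request, and the extracted records would be written to a file or database for restructuring and reuse.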
Sam discussed the first web crawler, the World Wide Web Wanderer, created in 1993 to follow links across the web in an attempt to determine how big the internet was. A brief history of the world wide web then showed how web pages are in fact the front ends of databases, and how to navigate those databases without their visual signifiers and prompts.
Artists and designers often navigate between the visible and the invisible. Today, internet and web protocols are key agents in rendering some things visible and hiding others. Websites are only the front ends of large databases, and what a user can see and access is strictly controlled, surveilled, and limited. Web scraping is a strategy for tackling these interfaces and taking part in that decision-making by restructuring and republishing the underlying databases. This is the idea behind Sam and Tega's scraping, or "redatabasing," projects.
New York Apartment is one such project: a website advertising a fictitious New York City apartment for sale that covers more than 300 million square feet and spans the five boroughs.
Tega presented “Get Well Soon!”, underlining the parallels between healthcare and real estate in a neoliberal capitalist world where both are commodities. The project is essentially a giant get-well-soon card compiled from comments scraped from the GoFundMe website:
Next, Tega presented Synthetic Messenger, a botnet that artificially inflates the value of climate news. Every day it searches the internet for news articles covering climate change; then 100 bots visit each article and click on every ad they can find.
It was very useful to see the tools and platforms that Tega and Sam have used. They have also created and contributed to several open-source software projects such as:
- p5.vscode: an extension for Visual Studio Code to manage p5.js projects
- p5.riso: a p5.js library for generating files suitable for Risograph printing
- videogrep: a Python script for creating automatic video supercuts
- audiogrep: a Python script for creating automatic audio supercuts
- vidpy: video editing and compositing in Python
- IndigEmoji: a sticker pack featuring Indigenous emoji (contributor)
- p5.js editor: a desktop editor for the p5.js project (deprecated)
Of all the projects I have reviewed around scrapism, and around Sam and Tega as scrapers, I would personally highlight Zoom Escaper, a tool to help you escape Zoom meetings and other video-conferencing scenarios. It allows you to self-sabotage your audio stream, making your presence unbearable to others.