OJAL KHUBCHANDANI
The research, writing, and editing of this post were part of an undergraduate project undertaken for a Rutgers Honors College Seminar in Fall 2021, “Fictions of Artificial Intelligence.” The author’s bio follows the post.

In his novella The Lifecycle of Software Objects (2010), Ted Chiang depicts a world in which human characters build deep emotional connections with entities that we might recognize as artificial intelligence (AI). For instance, the human protagonists Ana and Derek lovingly adopt their pet-like “digients,” “artificial intelligences that have been created within a digital world” (Liu n.p.). Their fictional connection exemplifies a genuine, meaningful relationship between humans and AI, one that contrasts with the exploitative relationships of many humans working in AI-driven industries today. More specifically, we might look to what Mary L. Gray and Siddharth Suri call “ghost workers,” laboring behind the scenes of today’s technology, to see how technological advances have led to new forms of exploitation.

According to Gray and Suri’s Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass, ghost workers are a specific type of gig worker who use platforms such as Rev, Amazon’s Mechanical Turk (MTurk), and LeadGenius, among many others. They provide the human assistance needed to ensure that software functions properly and responds accurately to users (Gray and Suri xiii). Such ghost work creates the illusion of seamlessly autonomous technology. The tasks include monitoring content, labeling pictures, transcribing audio, sorting through reviews, and more. Through these tasks, ghost workers bridge the gap between users’ needs and technology’s current limitations.
To maintain the image of autonomous AI, these platforms rely on grading or rating systems that vary from platform to platform. Such systems evaluate workers unfairly and, in turn, create unsustainable conditions for them. For instance, Rev, an audio transcription and video captioning platform, grades workers on two criteria: formatting and accuracy. For each criterion, a worker receives a grade out of five points from a third party, and these grades are combined into the worker’s average score. This score determines the worker’s Revver level and, with it, their privileges. The platform has three levels: Rookie, Revver, and Revver+. As workers’ average scores rise and their levels improve, they gain early access to jobs, extended deadlines, and higher-quality assignments. When their ratings fall, they lose these advantages.
Though the system might appear fair, even a single outlier grade can drastically affect a worker’s average score, their access to work, and ultimately their paycheck. Once the score drops, raising it again can be a significant challenge (Gray and Suri 82). Because a lower grade qualifies workers only for more complex, time-consuming, and lower-paying jobs, they must struggle through many difficult tasks to improve their grades. Another poor performance on a difficult assignment can trap them in a cycle of low grades and hard projects. This cycle harms both the ghost worker and the provider, the person or company that posted the task on the platform. A worker who relies on ghost work wages may be unable to earn a sufficient income and eventually leave the site. At the same time, because most ghost workers have more than one source of income, they can easily walk away, and the provider may lose a quality worker (15). With so little tolerance for poor outlier scores, the system can reduce the platform’s efficiency, increase the number of incomplete tasks, and leave technical issues unresolved. Consequently, the grading system is a key determinant of the technology’s usability.
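To see how fragile an average-based rating can be, consider a minimal sketch of the mechanism described above. The five-point scale, the two criteria, and the level names follow Gray and Suri’s account of Rev; the combination formula, the level thresholds, and the sample grades are illustrative assumptions, not Rev’s actual implementation.

```python
# A minimal, hypothetical model of an average-based rating system like the
# one described above. The five-point scale, the two criteria, and the level
# names follow the description of Rev; the combination formula, the
# thresholds, and the sample grades are assumptions made for illustration.

def job_score(accuracy, formatting):
    """Combine the two per-job grades (each out of five) into one score."""
    return (accuracy + formatting) / 2

def average_score(scores):
    """A worker's running average across all graded jobs."""
    return sum(scores) / len(scores)

def revver_level(avg):
    """Map an average score to a level (thresholds are hypothetical)."""
    if avg >= 4.5:
        return "Revver+"
    if avg >= 3.5:
        return "Revver"
    return "Rookie"

# Ten consistently strong jobs keep a worker at the top level.
history = [job_score(4.8, 4.8) for _ in range(10)]
avg = average_score(history)
print(round(avg, 2), revver_level(avg))   # 4.8 Revver+

# A single outlier grade on one difficult assignment drags the average
# below the (hypothetical) threshold and demotes the worker.
history.append(job_score(1.0, 1.0))
avg = average_score(history)
print(round(avg, 2), revver_level(avg))   # 4.45 Revver
```

Under these assumed thresholds, one bad score out of eleven is enough to strip the worker of top-level privileges, mirroring the dynamic described above.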

The grading system is also a means of concealing the human labor that goes into functioning AI. By reducing workers to scores and classifying them as additional costs of maintaining the technology, it dehumanizes them and deemphasizes the critical role of ghost work in technological advancement. LeadGenius offers an alternative. This ghost work platform does not use a grading system and recognizes the significance of its employees’ contributions. The company provides “personalized and actionable B2B lead information that helps its clients attain their global revenue growth goals” (Columbus n.p.). For instance, a law office could pay the company to search for people recently arrested for a minor offense (Gray and Suri 25). The office would receive the leads’ information and call them to offer its assistance for a fee. While AI can easily flag recent minor crimes in an area, people are better equipped to determine the offense’s impact on the offender’s future. In LeadGenius’s model, ghost workers provide better results for clients than technology alone could. Along with improved results, the model creates better working conditions for ghost workers.
LeadGenius’s employment model deviates from those of many other ghost work platforms: the company values its employees’ work, compensates them appropriately, and creates a positive working environment. In an interview with MIT Technology Review, the company’s founder, Prayag Narula, explained that LeadGenius pays its employees hourly wages “enough to support a small family in the regions we operate” (Knight n.p.). A new employee starts on a ninety-day trial and receives an eight percent pay increase if they log in, stay active for twenty hours per week, and show up punctually for shifts (Gray and Suri 23). In addition to the increased pay, employees have the freedom to set their own shifts and keep a flexible schedule, a crucial advantage of ghost work (25). With this different system, LeadGenius claims to create better working conditions for its employees, allowing them to continue reaping the benefits of ghost work. Happier employees only help the company’s business, as they produce stronger leads for its clients. An improved ghost work employment model creates value for everyone involved in the process.

The scope of ghost work clearly demonstrates that humans play an integral role in technology. However, the majority of online platforms, such as Rev and MTurk, treat this labor as an inconvenience rather than an asset. Most ghost work platforms obscure human labor, minimizing it to an extra step in the process, in order to preserve the illusion of autonomous AI. The LeadGenius model rejects this belittling notion, which allows the company to foster higher-quality work. Ghost work is the backbone of all technological innovation, and companies would benefit from developing better working conditions for the humans behind the machines. As AI becomes increasingly integrated into society, platforms should not exploit ghost workers but should provide them with the resources they need to support AI innovation and ensure its usability. Ghost workers should be recognized as the human part of artificial intelligence rather than as just another cost in the system.
Ojal Khubchandani is a freshman in Rutgers University’s SAS Honors Program, majoring in Finance and Business Analytics and Information Technology. She is interested in technology, the stock market, and the gig economy. Her goals include earning a CFA certification and pursuing a career in investment banking.