Several surveys show how artificial intelligence companies outsource to Africa, sometimes in deplorable conditions.
For barely two dollars an hour, Kenyan workers have been laboring to "make ChatGPT less toxic," according to an instructive investigation by Time. Artificial intelligence (AI) applications have revolutionized the internet. But it would be wrong to imagine that simple robots created this new kind of app on their own: to prevent it from producing violent, sexist and racist remarks, the AI had to be trained, since it drew its information from a web full of discriminatory content. OpenAI, the creator of ChatGPT, sent tens of thousands of text excerpts to a Kenyan company. The subcontracting mission reportedly began in November 2021.
Although Sama is a San Francisco-based company, it employs workers in Kenya, Uganda and India, and works for Google, Meta and Microsoft. While it presents itself as a company committed to "ethical AI" and claims to have helped lift more than 50,000 people out of poverty, the facts tell a different story: the contractors working for OpenAI earned between $1.32 and $2 per hour.
"Our mission is to ensure that artificial general intelligence benefits all of humanity, and we work hard to build safe and useful AI systems that limit bias and harmful content," OpenAI says. But this hard work is in reality reserved for African "little hands." The mission entrusted to the Kenyan workers is reminiscent of how Facebook exploited employees to moderate its content.
"Torture"
"It's torture," summarizes a Sama employee tasked with reading zoophilic content. So much so that, in February 2022, eight months before the contract was due to expire, Sama terminated its subcontracting agreement because of the terrible working conditions. According to Time, Sama pocketed about $200,000 for its mission, for which Kenyan workers had to read content full of sexual abuse, hate speech and violence. "Well-being" counseling was made available to employees, and Sama insists it hired "professionally trained and licensed mental health therapists." The employees dispute this.
Beyond the psychological hardship, it was the start-up's working conditions that shocked: while Kenya has no mandated minimum wage, the contractors working for OpenAI earned barely as much as a receptionist in Nairobi, Time reports. "Time-consuming and undervalued tasks are generally outsourced by technology companies to a host of precarious workers, usually located in countries of the Global South," note Clément Le Ludec, a sociologist of digital technology, and Maxime Cornet, a doctoral student in the sociology of AI.
They found a similar operation in Madagascar, involving data work. "Our study also shows the reality of 'French-style AI': on the one hand, French technology companies rely on GAFAM services for data hosting and computing power; on the other hand, data-related activities are carried out by workers located in former French colonies, particularly Madagascar, thus confirming long-established outsourcing patterns," they write.
Invisibilization of subcontractors
The problem: African workers "are located at the end of a long outsourcing chain, which partly explains the low wages of these skilled workers." As for the companies themselves, they are generally owned by foreigners. Of the 48 companies offering digital services in free zones studied by the two researchers, barely 9 are owned by Malagasy nationals, compared with 26 by French nationals.
"This pattern is reminiscent of what the researcher Jan Padios calls 'colonial recall,'" according to the two researchers. "Formerly colonized countries have linguistic skills and a cultural proximity to the client countries, from which the service companies benefit." It is an economic post-colonialism that all too often leads to the invisibilization of artificial intelligence workers who, without these investigations, would remain in total anonymity.