Labelers training AI say they’re overworked, underpaid and exploited by big American tech companies


The familiar narrative is that artificial intelligence will take away human jobs: machine learning will let cars, computers and chatbots teach themselves – making us humans obsolete.

Well, that’s not very likely, and we’re gonna tell you why. There’s a growing global army of millions toiling to make AI run smoothly. They’re called “humans in the loop”: people sorting, labeling, and sifting reams of data to train and improve AI for companies like Meta, OpenAI, Microsoft and Google. It’s grunt work that needs to be done accurately, fast, and – to do it cheaply – it’s often farmed out to places like Africa –

Naftali Wambalo: The robots or the machines, you are teaching them how to think like human, to do things like human.

We met Naftali Wambalo in Nairobi, Kenya, one of the main hubs for this kind of work. It’s a country desperate for jobs… because of an unemployment rate as high as 67% among young people. So Naftali, father of two, college educated with a degree in mathematics, was elated to finally find work in an emerging field: artificial intelligence.

Lesley Stahl: You were labeling. 

Naftali Wambalo: I did labeling for videos and images. 

Naftali and digital workers like him spent eight hours a day in front of a screen studying photos and videos, drawing boxes around objects and labeling them, teaching the AI algorithms to recognize them.

Naftali Wambalo: You’d label, let’s say, furniture in a house. And you say “This is a TV. This is a microwave.” So you are teaching the AI to identify these items. And then there was one for faces of people. The color of the face. “If it looks like this, this is white. If it looks like this, it’s Black. This is Asian.” You’re teaching the AI to identify them automatically.

Naftali Wambalo (60 Minutes)

Humans tag cars and pedestrians to teach autonomous vehicles not to hit them. Humans circle abnormalities to teach AI to recognize diseases. Even as AI is getting smarter, humans in the loop will always be needed because there will always be new devices and inventions that’ll need labeling. 

Lesley Stahl: You find these humans in the loop not only here in Kenya but in other countries thousands of miles from Silicon Valley. In India, the Philippines, Venezuela – often countries with large, low-wage populations, well educated but unemployed.

Nerima Wako-Ojiwa: Honestly, it’s like modern-day slavery. Because it’s cheap labor–

Lesley Stahl: Whoa. What do you –

Nerima Wako-Ojiwa: It’s cheap labor. 

Like modern-day slavery, says Nerima Wako-Ojiwa, a Kenyan civil rights activist, because big American tech companies come here and advertise the jobs as a ticket to the future. But really, she says, it’s exploitation.

Nerima Wako-Ojiwa: What we’re seeing is an inequality.

Lesley Stahl: It sounds so good. An AI job! Is there any job security?

Nerima Wako-Ojiwa: The contracts that we see are very short-term. And I’ve seen people who have contracts that are monthly, some of them weekly, some of them days. Which is ridiculous.

She calls the workspaces AI sweatshops, with computers instead of sewing machines.

Nerima Wako-Ojiwa: I think that we’re so concerned with “creating opportunities,” but we’re not asking, “Are they good opportunities?”

Because every year a million young people enter the job market, the government has been courting tech giants like Microsoft, Google, Apple, and Intel to come here, promoting Kenya’s reputation as the Silicon Savannah: tech-savvy and digitally connected.

Nerima Wako-Ojiwa: The president has been really pushing for opportunities in AI –

Lesley Stahl: President?

Nerima Wako-Ojiwa: Yes.

Lesley Stahl: Ruto?

Nerima Wako-Ojiwa: President Ruto. Yes. The president does have to create at least one million jobs a year the minimum. So it’s a very tight position to be in.

Nerima Wako-Ojiwa (60 Minutes)

To lure the tech giants, Ruto has been offering financial incentives on top of already lax labor laws. But the workers aren’t hired directly by the big companies, which engage outsourcing firms – also mostly American – to hire for them.

Lesley Stahl: There’s a go-between.

Nerima Wako-Ojiwa: Yes.

Lesley Stahl: They hire? They pay?

Nerima Wako-Ojiwa: Uh-huh (affirm). I mean, they hire thousands of people.

Lesley Stahl: And they are protecting the Facebooks from having their names associated with this?

Nerima Wako-Ojiwa: Yes yes yes.

Lesley Stahl: We’re talking about the richest companies on Earth. 

Nerima Wako-Ojiwa: Yes. But then they are paying people peanuts.

Lesley Stahl: AI jobs don’t pay much?

Nerima Wako-Ojiwa: They don’t pay well. They do not pay Africans well enough. And the workforce is so large and desperate that they could pay whatever, and have whatever working conditions, and they will have someone who will pick up that job.

Lesley Stahl: So what’s the average pay for these jobs?

Nerima Wako-Ojiwa: It’s about $1.50, $2 an hour.

Naftali Wambalo: $2 per hour, and that is gross before tax. 

Naftali, Nathan, and Fasica were hired by an American outsourcing company called SAMA, which employs over 3,000 workers here and has hired for Meta and OpenAI. In documents we obtained, OpenAI agreed to pay SAMA $12.50 an hour per worker, much more than the $2 the workers actually got – though, SAMA says, that’s a fair wage for the region.

Humans in the loop: Naftali, Nathan, and Fasica (60 Minutes)

Naftali Wambalo: If the big tech companies are going to keep doing this– this business, they have to do it the right way. So it’s not because you realize Kenya’s a third-world country, you say, “This job I would normally pay $30 in U.S., but because you are Kenya $2 is enough for you.” That idea has to end.

Lesley Stahl: OK. $2 an hour in Kenya. Is that low, medium? Is it an OK salary? 

Fasica: So for me, I was living paycheck to paycheck. And I have saved nothing because it’s not enough.

Lesley Stahl: Is it an insult?

Nathan: It is, of course. It is.

Fasica: It is.

Lesley Stahl: Why did you take the job?

Nathan: I have a family to feed. And instead of staying home, let me just at least have something to do.

And not only did the jobs not pay well – they were draining. They say deadlines were unrealistic and punitive, often with just seconds to complete complicated labeling tasks.

Lesley Stahl: Did you see people who were fired just ’cause they complained?

Fasica: Yes, we were walking on eggshells.

They were all hired per project and say SAMA kept pushing them to complete the work faster than the projects required, an allegation SAMA denies.

Lesley Stahl: Let’s say the contract for a certain job was six months, OK? What if you finished in three months? Does the worker get paid for those extra three months? 

Male voice: No – 

Fasica: KFC.

Lesley Stahl: What? 

Fasica: We used to get KFC and Coca Cola. 

Naftali Wambalo: They used to say thank you. They give you a bottle of soda and KFC chicken. Two pieces. And that is it. 

Worse yet, workers told us that some of the projects for Meta and OpenAI were grim and caused them harm. Naftali was assigned to train AI to recognize and weed out pornography, hate speech and excessive violence, which meant sifting through the worst of the worst content online for hours on end. 

Naftali Wambalo: I looked at people being slaughtered, people engaging in sexual activity with animals. People abusing children physically, sexually. People committing suicide. 

Lesley Stahl: All day long?

Naftali Wambalo: Basically- yes, all day long. Eight hours a day, 40 hours a week.

The workers told us they were tricked into this work by ads like this that described these jobs as “call center agents” to “assist our clients’ community and help resolve inquiries empathetically.” 

Fasica: I was told I was going to do a translation job.

Lesley Stahl: Exactly what was the job you were doing?

Fasica: I was basically reviewing content which are very graphic, very disturbing contents. I was watching dismembered bodies or drone attack victims. You name it. You know, whenever I talk about this, I still have flashbacks.

Lesley Stahl: Are any of you a different person than they were before you had this job?

Fasica: Yeah. I find it hard now to even have conversations with people. It’s just that I find it easier to cry than to speak.

Nathan: You continue isolating you– yourself from people. You don’t want to socialize with others. It’s you and it’s you alone. 

Lesley Stahl: Are you a different person?

Naftali Wambalo: Yeah. I’m a different person. I used to enjoy my marriage, especially when it comes to bedroom fireworks. But after the job I hate sex.

Lesley Stahl: You hated sex?

Naftali Wambalo: After countlessly seeing those sexual activities pornography on the job that I was doing, I hate sex.

SAMA says mental health counseling was provided by, quote, “fully licensed professionals.” But the workers say it was woefully inadequate.

Naftali Wambalo: We want psychiatrists. We want psychologists, qualified, who know exactly what we are going through and how they can help us to cope.

Lesley Stahl: Trauma experts.

Naftali Wambalo: Yes. 

Lesley Stahl: Do you think the big company, Facebook, ChatGPT, do you think they know how this is affecting the workers?

Naftali Wambalo: It’s their job to know. It’s their f***ing job to know, actually– because they are the ones providing the work.

Lesley Stahl and Naftali Wambalo in Nairobi, Kenya (60 Minutes)

These three and nearly 200 other digital workers are suing SAMA and Meta over “unreasonable working conditions” that caused psychiatric problems.

Nathan: It was proven by a psychiatrist that we are thoroughly sick. We have gone through a psychiatric evaluation just a few months ago and it was proven that we are all sick, thoroughly sick. 

Fasica: They know that we’re damaged but they don’t care. We’re humans just because we’re black, or just because we’re just vulnerable for now, that doesn’t give them the right to just exploit us like this. 

SAMA – which has terminated those projects – would not agree to an on-camera interview. Meta and OpenAI told us they’re committed to safe working conditions including fair wages and access to mental health counseling. Another American AI training company facing criticism in Kenya is Scale AI, which operates a website called Remotasks.

Lesley Stahl: Did you all work for Remotasks?

Group: Yes.

Lesley Stahl: Or work with them? 

Ephantus, Joan, Joy, Michael, and Duncan signed up online, creating accounts and clicking for work to do remotely, paid per task. The problem is, sometimes the company just didn’t pay them.

Ephantus: When it gets to the day before payday, they close the account and say that “You violated a policy.” 

Lesley Stahl: They say, “You violated their policy.”

Voice: Yes.

Lesley Stahl: And they don’t pay you for the work you’ve done— 

Ephantus: They don’t. 

Lesley Stahl: Would you say that that’s almost common, that you do work and you’re not paid for it?

Joan: Yeah. 

Lesley Stahl: And you have no recourse, you have no way to even complain?

Joan: There’s no way.

The company says any work that was done “in line with our community guidelines was paid out.” In March, as workers started complaining publicly, Remotasks abruptly shut down in Kenya altogether.  

Lesley Stahl: There are no labor laws here? 

Nerima Wako-Ojiwa: Our labor law is about 20 years old, it doesn’t touch on digital labor. I do think that our labor laws need to recognize it– but not just in Kenya alone. Because what happens is when we start to push back, in terms of protections of workers, a lot of these companies, they shut down and they move to a neighboring country.

Lesley Stahl: It’s easy to see how you’re trapped. Kenya is trapped: They need jobs so desperately that there’s a fear that if you complain, if your government complained, then these companies don’t have to come here. 

Nerima Wako-Ojiwa: Yeah. And that’s what they throw at us all the time. And it’s terrible to see just how many American companies are just-just doing wrong here– just doing wrong here. And it’s something that they wouldn’t do at home, so why do it here?

Produced by Shachar Bar-On and Jinsol Jung. Broadcast associate, Aria Een. Edited by April Wilson.


