OpenAI Paid African Workers $2/Hour to Make ChatGPT Safer
OpenAI paid Kenyan workers between $1.32 and $2 per hour to read graphic descriptions of murder, child sexual abuse, suicide, and incest. The work took place between November 2021 and February 2022 in Nairobi, Kenya’s capital.
Now here’s the kicker: If you were a hotel receptionist working in Nairobi around the same time, you’d be in a better position. You’d make around $1.52 an hour to smile at customers and answer phone calls. Best of all, your day job wouldn’t give you nightmares.
OpenAI’s involvement in the story wasn’t direct. It hired a contractor called Sama, which, like OpenAI, is based in San Francisco. Sama is in the business of recruiting white-collar workers from India, Uganda, and Kenya and having them perform a very specific task: label data for tech companies like Google, Microsoft, and Meta.
Data labeling is a fancy way to say you put data into buckets. Developers call it the “cat/not-a-cat” game where you review a picture and specify whether or not it’s a cat. Your answer then becomes a label attached to the picture, and congratulations! You’ve just created a nice input that will make your Machine Learning algorithm a bit smarter. The same “cat/not-a-cat” principle applies to text and video.
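The “cat/not-a-cat” game boils down to attaching a label to each piece of raw data. A minimal sketch of what that looks like in code (the filenames and labels here are invented for illustration):

```python
# Raw, unlabeled data: just references to images.
images = ["photo_001.jpg", "photo_002.jpg", "photo_003.jpg"]

# A human annotator reviews each image and assigns a label:
# 1 = cat, 0 = not a cat.
annotations = {
    "photo_001.jpg": 1,
    "photo_002.jpg": 0,
    "photo_003.jpg": 1,
}

# The labeled dataset is what a supervised ML algorithm trains on:
# a list of (input, label) pairs.
labeled_dataset = [(img, annotations[img]) for img in images]
print(labeled_dataset)
```

The finished product — those (input, label) pairs — is the “nice input” that makes the algorithm a bit smarter, whether the data is images, text, or video.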
Data labeling sounds like an easy job until your boss asks you to classify gut-wrenching content. Imagine the only way you could feed your family is to read explicit scenes of rape and torture nine hours a day, every single day.
Now stop imagining because somewhere in Africa and South Asia, there are people doing that right now. Hundreds, perhaps thousands of them.
ChatGPT is stupid — and that’s why it’s toxic
Machine Learning models like ChatGPT are dumb. You can call them Artificial Intelligence if you want, but that doesn’t make them any smarter. Their only capability is to transform a bunch of text into another bunch of text — and while doing so, they have precisely zero understanding of either the questions or the answers.
In a sense, ChatGPT is like a calculator but for words.
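To make the “calculator for words” idea concrete, here is a toy next-word predictor. This is emphatically not how ChatGPT is built — real models are vastly more sophisticated — but the principle is the same: pick a statistically likely continuation of the text, with no understanding of what any of it means. The tiny corpus is invented for illustration:

```python
from collections import Counter, defaultdict

# A ten-word toy corpus; a real model sees hundreds of billions of words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which in the corpus.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None if unseen."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat": it follows "the" twice, vs. once each for "mat" and "fish"
```

The predictor produces plausible-looking output purely from counting — it has no idea what a cat or a mat is. Scale that statistical trick up enormously and you get something that sounds fluent while understanding nothing, which is exactly why toxic training data becomes toxic output.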