Facebook accused of poor treatment of African content moderators

Meta-owned social media platform Facebook has come under fire for its treatment of African content moderators, according to a recent TIME article.

TIME revealed that content moderators in Nairobi, Kenya, were subjected to poor pay and working conditions by Sama, the company responsible for Facebook’s Sub-Saharan Africa content moderation since 2019. Sama also provides data-labeling services for tech giants like Google and Microsoft.

In 2021, Facebook said it spent more than $5 billion on safety measures. It contracts the services of more than 15,000 content moderators globally, most of whom are employed by third parties like Sama.

“The work that we do is a kind of mental torture,” one employee, who currently works as a Facebook content moderator for Sama, told TIME. “Whatever I am living on is hand-to-mouth. I can’t save a cent. Sometimes I feel I want to resign. But then I ask myself: what will my baby eat?”

These working conditions led six Ethiopian employees to resign in January. Facebook is also struggling to curb content on its platform that is fueling real-world violence in Ethiopia.

According to TIME, at least two Sama content moderators resigned after being diagnosed with mental illnesses including post-traumatic stress disorder (PTSD), anxiety, and depression. Many others described how they had been traumatized by the work but were unable to obtain formal diagnoses due to their inability to afford access to quality mental healthcare. Some described continuing with work despite trauma because they had no other options.

In the summer of 2019, when content moderators threatened to strike unless they were given better pay and working conditions, Sama responded not by negotiating but by flying two highly paid executives from San Francisco to Nairobi to deal with the uprising.

South African Daniel Motaung, the leader of the attempted strike, who was in the process of formally filing trade union papers, was fired shortly after. Sama accused him of taking action that put the relationship between the company and Facebook at “great risk.” Several employees told TIME that Sama told other participants in the labor action that they were expendable and should either resign or get back to work. The workers abandoned the protest before the seven days were up, and there was no pay increase.

Sama denied that there was any strike or labor action. 

According to payslips seen by TIME, Sama pays foreign employees monthly pre-tax salaries of around 60,000 Kenyan shillings ($528), which includes a monthly bonus for relocating from elsewhere in Africa. After tax, this equates to around $440 per month, or a take-home wage of roughly $2.20 per hour, based on a 45-hour work week. Sama employees from within Kenya, who are not paid the monthly relocation bonus, receive a take-home wage equivalent to around $1.46 per hour after tax.

The average salary at Sama, including benefits, was approximately 2.5 times the Kenyan minimum wage, according to a 2021 study carried out by three MIT researchers. But workers say these wages only cover the basic costs of living and don’t allow them to save or improve their financial situations.

Employees also complained of working long hours (up to nine hours per day, including breaks) and of having their screen time monitored. Copies of Sama content moderators’ performance reviews seen by TIME showed that moderators were measured against target metrics for average time spent and quality. This evidence contradicts Facebook’s public statements that it does not set such expectations for its contractors.

“A common misconception about content reviewers is that they’re driven by quotas and pressured to make hasty decisions,” Ellen Silver, Facebook’s vice president of operations, said in a 2018 blog post. “Let me be clear: content reviewers aren’t required to evaluate any set number of posts … We encourage reviewers to take the time they need.” 

A Meta spokesperson, Ben Walters, corroborated this statement.

The social media giant has also put some features in place to help protect moderators, like the option to render videos in black and white or add blurring. But one Sama employee said he doesn’t use these options because of the pressure to meet quotas. “How can you clearly see whether content is violating or not unless you can see it clearly? If it’s black and white or blurred, I cannot tell,” the employee said. “Some people use the option, but I don’t. Because if I wrongly action that [content], I will be marked down.”

This report adds to a seemingly endless series of public controversies surrounding Facebook in recent years. Even as it claims innocence in this case, it is imperative that the social media giant hold contractors like Sama to higher standards, or its name will continue to be dragged through the mud. – TechCabal