Meta is facing a second set of lawsuits in Africa over the psychological distress experienced by content moderators employed to take down disturbing social media content including depictions of murders, extreme violence and child sexual abuse.
Lawyers are gearing up for court action against a company contracted by Meta, which owns Facebook and Instagram, after meeting moderators at a facility in Ghana that is understood to employ about 150 people.
Moderators working for Majorel in Accra claim they have suffered from depression, anxiety, insomnia and substance abuse as a direct consequence of the work they do checking extreme content.
The allegedly gruelling conditions endured by workers in Ghana are revealed in a joint investigation by the Guardian and the Bureau of Investigative Journalism.
It comes after more than 140 Facebook content moderators in Kenya were diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content.
The workers in Kenya were employed by Samasource, an outsourcing company that carries out content moderation for Meta using workers from across Africa. Majorel, the company at the centre of the allegations in Ghana, is owned by the French multinational Teleperformance.
One man, who cannot be named for legal reasons, said he attempted suicide owing to the nature of his work. He claims his contract was subsequently terminated and he has since returned to his home country.
Facebook and other large social media companies employ armies of content moderators, often based in the poorest parts of the world, to remove posts that breach their community standards and to train AI systems to do the same.
Moderators are required to review distressing and often brutal pictures and videos to establish whether they should be removed from Meta’s platforms. According to workers in Ghana, they have seen videos of a person being skinned alive and a woman being beheaded.
The moderators claim that the mental health care offered by the firm was unhelpful and not delivered by medical doctors, and that personal disclosures made by staff about the effects of their work were circulated among managers.
Teleperformance disputed this, saying it employed licensed mental health professionals who are registered with the local regulatory body and hold a master's degree in psychology, counselling or another mental health field.

The legal case is being prepared by a UK-based nonprofit, Foxglove. It would be the second brought by content moderators in Africa, after Samasource workers in Kenya sued in December.
Foxglove said it was “urgently investigating these shocking abuses of workers” with a view to using “every tool at our disposal, including potential legal action” to improve working conditions.
It is working with a Ghanaian firm, Agency Seven Seven, to prepare two possible lawsuits. One would allege psychological harm and could involve a group of moderators; the other would allege unfair dismissal, on behalf of the moderator from east Africa whose contract was terminated after he attempted suicide.
Foxglove’s co-executive director Martha Dark said: “These are the worst conditions I have seen in six years of working with social media content moderators around the world.
“In Ghana, Meta is displaying nothing short of a complete disregard for the humanity of its key safety workers upon whom all its profits rely – content moderators. They are treated as objects who can be used up, burned out and replaced with no care whatsoever for the permanent damage to their mental and physical wellbeing.”
Dark said basic wages for content moderators in Accra were below living costs, incentivising them to work overtime, pay for which is understood to be even lower than normal rates. Moderators also face deductions from their pay for failing to meet performance targets, she added.
Contracts seen by the Guardian show that the base wage starts at about 1,300 Ghanaian cedis a month – just over £64. This is supplemented by a system of performance-related bonuses, the upper range of which amounts to about 4,900 cedis (£243) a month, significantly less than the estimated cost of living in Accra.
A Teleperformance spokesperson said that content moderators enjoyed “strong pay and benefits, including monthly pay that is roughly 10 times the country’s minimum wage for domestic moderators, and 16 times the minimum wage for those who have relocated from other countries, when including project allowance, transportation allowance, language premium and more – all of which are automatically paid to the moderator and are not performance-based”.
Foxglove’s researcher Michaela Chen said she had seen photos of moderators’ living quarters, in which they were “crammed five to a flat, two to a room”. She said there appeared to be a culture of secrecy, including surveillance from managers, who follow workers into the toilets during breaks.
This extends to moderators’ work for Meta. She said: “Workers spend all day working on Meta’s platforms, moderating to Meta’s standards and using Meta’s systems, but at the same time, moderators are told constantly: ‘You do not work for Meta,’ and are forbidden from telling anyone they do.”
Teleperformance said that moderators are "offered housing in… one of the most upscale and well-known residential and commercial neighbourhoods in Accra", which the spokesperson described as "safe, with strong security".
Carla Olympio, a partner at Agency Seven Seven, said she believed a personal injury case could succeed in Ghana’s courts and would set a precedent establishing that worker protections extend to psychological harms as well as physical injury.
“[There is] currently a gap in our laws because they haven’t necessarily caught up with the new developments that cover technology and virtual work,” she said.
Rosa Curling, co-executive director at Foxglove, said it would ask the court to "order immediate changes to the content moderators' workplace", including proper safeguards and psychiatric care.
A spokesperson for Teleperformance said: "At TP in Ghana, we take our content moderation work seriously. From the very beginning during the interview process, within the employee contract and through employee training and resiliency testing, we are fully transparent with our prospective moderators regarding the content they might see during their work to help keep the internet safe for our communities. We have robust people management systems and workplace practices, including a robust wellbeing program staffed by fully licensed psychologists to support our content moderators throughout their content moderation journey."
Meta said the companies it works with are “contractually obliged to pay their employees who review content on Facebook and Instagram above the industry standard in the markets they operate”.
The tech company said it takes "the support of content reviewers seriously", including by detailing expectations around counselling, training and other support in its contracts with the companies to which it outsources work.
It said all content moderators sign client confidentiality agreements to protect user information and for their own safety, but that moderators may discuss their jobs with doctors and counsellors, and some aspects of the work with family members.