Meta and Sama, its main subcontractor for content moderation in Africa, face a lawsuit in Kenya over alleged unsafe and unfair working conditions if they fail to meet 12 demands on workplace conditions brought before them.
The law firm representing Daniel Motaung, who was laid off from his job at Sama for organizing a strike over poor working conditions and pay, accused the subcontractor of violating various rights, including the right to health. The firm has given Meta and Sama 21 days to respond to the demands or face a lawsuit.
In the demand letter, the law firm asked Meta and Sama to adhere to the country's labor, privacy and health laws, recruit qualified and experienced health professionals, and provide the moderators with adequate mental health insurance and better compensation.
Lawyers for Motaung said that moderators at Sama work in conditions that are unsafe and degrading and that pose a risk of post-traumatic stress disorder. It is a practice, they said, that keeps Facebook's profit margins high but at the cost of thousands of people's health and the safety of Facebook worldwide.
The Time story detailed how the moderators were recruited under the false pretense that they were taking up call center jobs. According to the story, the content moderators learned the true nature of their work only after signing their employment contracts and relocating to Sama's hub in Nairobi.
The moderators sift through and remove social media posts that perpetuate hate, misinformation and violence.
Employees are expected to abide by many requirements, including not revealing the nature of their jobs to outsiders. The content moderators in Africa earn the lowest wages, even as the firm fashions itself as ethical. After the exposé, the firm increased employee pay.
The law firm alleged that Sama failed to grant Motaung and his colleagues adequate psychosocial support and mental health measures, allowing them only 30 minutes a day with a counselor.
According to the law firm, Motaung was not prepared for the kind of work he was to do; he remembers the first video he moderated being a beheading, and no psychological support had been offered to him before that point.
The leader of the legal action said the case is important to her because she, like so many others, uses Facebook to discuss the news.
This isn't an ordinary labor case: the working conditions of Facebook's moderators affect all of us. The very safety and integrity of our democratic process depend on a Facebook that adequately staffs and supports its front-line workers against hate and misinformation.