Meta’s terrible content broke him. Now he wants it to pay


This case is the first brought by a content moderator outside of the company’s home country. In May 2020, Meta (then Facebook) reached a $52 million settlement with US-based moderators who developed PTSD while working for the company. But previous reporting has found that many of the company’s international moderators, doing nearly identical jobs, are paid less and receive less support while working in countries with fewer mental health services and weaker labor rights. While US-based moderators make around $15 per hour, moderators in places like India, the Philippines, and Kenya make far less, according to 2019 reporting from The Verge.

“The whole point of sending content moderation work abroad and far away is to hold it at arm’s length and reduce the cost of this business function,” says Paul Barrett, deputy director of the Center for Business and Human Rights at New York University and the author of a 2020 report on outsourced content moderation. Yet content moderation is critical to keeping platforms running, screening out the kind of content that would drive users—and advertisers—away. “Content moderation is a core business function, not something peripheral or an afterthought. But there’s a powerful irony in the fact that the whole arrangement is set up to offload responsibility,” he says. (A summarized version of Barrett’s report was included as evidence in the current case in Kenya on behalf of Motaung.)

Barrett says that for other industries that outsource work, such as apparel, it would be unthinkable today for companies to claim they bear no responsibility for the conditions in which their clothes are made.

“I think tech companies, being younger and in some ways more arrogant, think they can pull this trick off,” he says.

A Sama moderator, who spoke to WIRED on condition of anonymity out of fear of reprisal, described having to review thousands of pieces of content daily, often deciding what can and cannot stay on the platform in 55 seconds or less. Sometimes that content can be “something graphic, hate speech, bullying, incitement, something sexually suggestive,” they say. “You should expect anything.”

Crider, of Foxglove Legal, says that the systems and processes Sama moderators come into contact with — and which have proven to be mentally and emotionally damaging — were designed by Meta. (The case also alleges that Sama engaged in labor abuses through anti-union activity, but does not allege that Meta was part of that effort.)

“This is a broader complaint about the system of work being inherently harmful, inherently toxic, and exposing people to an unacceptable level of risk,” Crider says. “That system is functionally identical whether the person is in Mountain View, Austin, Warsaw, Barcelona, Dublin, or Nairobi. And so, from our perspective, the point is that it’s Facebook that designed a system that is a driver of injury and a risk of PTSD for people.”

Crider says that in many countries, particularly those rooted in British common law, courts often look to rulings in other, similar jurisdictions to help shape their own, and that Motaung’s case could be a blueprint for cases brought by outsourced moderators in other countries. “While it doesn’t set any formal precedent, I hope this case can set a landmark for other jurisdictions considering how to grapple with these large multinationals.”
