Meta's Horrific Content Broke Him. Now He Wants It to Pay


The case is the first brought against Meta by a content moderator outside the company's home country. In May 2020, Meta (then Facebook) reached a $52 million settlement with US-based moderators who had developed PTSD while working for the company. But earlier reporting found that the company's international moderators, who do much of the same work, face lower pay and receive less support while working in countries with fewer mental health services and weaker labor rights. While US-based moderators made $15 an hour, moderators in countries like India, the Philippines, and Kenya make much less, according to 2019 reporting from The Verge.

"The whole point of sending content moderation work overseas and far away is to hold it at arm's length and reduce the cost of this business function," said Paul Barrett, deputy director of New York University's Center for Business and Human Rights and author of a 2020 report on outsourced content moderation. But content moderation is critical to platforms' survival: it keeps the kind of content that would drive users and advertisers away off the platform. "Content moderation is a core business function, not something peripheral or an afterthought. But there is something deeply ironic about the whole arrangement being designed to offload responsibility," he said. (A condensed version of Barrett's report is now included as evidence in the Motaung case in Kenya.)

Barrett says it would be unthinkable today for outsourcers in other industries, such as apparel, to claim that they bear no responsibility for the conditions in which their clothes are made.

"Tech companies think they can pull this trick off because they're younger and in some ways more arrogant," he said.

One Sama moderator, who spoke to WIRED on condition of anonymity out of concern about retaliation, said they have to review thousands of pieces of content every day, often deciding what can and cannot stay on the platform in 55 seconds or less. Sometimes that content can be "graphic, hate speech, bullying, inflammatory, sexual," they say. "You should expect anything."

Cori Crider, of Foxglove Legal, says the systems and processes that Sama's moderators are exposed to (and which have been shown to be mentally and emotionally damaging) were all designed by Meta. (While the suit describes Sama's alleged involvement in labor abuses such as union busting, it does not claim that Meta was part of that effort.)

"This is a complaint about a system of work that is inherently harmful, inherently toxic, and exposes people to unacceptable levels of risk," Crider said. "That system is practically the same whether the person is in Mountain View, Austin, Warsaw, Barcelona, Dublin, or Nairobi. And from our point of view, the point is that Facebook designed this system to be one that is dangerous and exposes people to PTSD."

Crider said that in many countries, particularly those rooted in British common law, courts often look to decisions from similar jurisdictions to inform their own, and that the Motaung case could offer a blueprint for moderators bringing cases in other countries. Although it would not set formal precedent, she said, she hopes the case will serve as a signal to other jurisdictions on how to deal with these large multinationals.


