Published on 12:00 AM, November 27, 2020

Content moderators at Facebook demand safer working conditions

More than 200 Facebook content moderators, along with some full-time employees, have voiced concerns to the social media company over workplace safety.

In an open letter to Facebook and the company's content moderation contractors, Accenture and Covalen, the group demanded the tech company "stop needlessly risking moderators' lives". The protest was triggered after some of the moderators — who deal with sexual abuse and graphic violence content — were required to return to the office in the middle of the pandemic. Shortly after the return, a moderator reportedly tested positive for COVID-19.

In the letter, the group wrote, "After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office." It further reads, "Moderators who secure a doctor's note about a personal COVID risk have been excused from attending in person. Moderators with vulnerable relatives, who might die were they to contract COVID from us, have not."

The group demands that Facebook maximize the amount of work people can do from home and allow those who are high-risk (or live with someone who is high-risk) to work from home indefinitely. They also want the company to offer hazard pay, healthcare, and psychiatric care, and to employ moderators directly rather than outsource them.

In response, Facebook's VP of Integrity Guy Rosen said on a press call that the company is "not able to route some of the most sensitive and graphic content to outsourced reviewers at home".

"This is really sensitive content. This is not something you want people reviewing from home with their family around," he added.

The moderators argue that Facebook's algorithms are nowhere near ready to successfully moderate the site's content. They claim the algorithms cannot spot satire, cannot sift journalism from disinformation, and are unable to respond quickly to self-harm or child abuse.

The group currently represents content moderators across the U.S. and Europe and has the support of legal advocacy firm Foxglove. In a tweet, Foxglove called this the "biggest joint international effort of Facebook content moderators yet."