Facebook Content Moderators Demand Safer Work Conditions Amid Pandemic

Open Letter Highlights Workplace Safety Concerns

Over 200 Facebook content moderators—alongside some full-time employees—have issued a public demand for safer working conditions. In an open letter addressed to Facebook and its outsourcing partners (Accenture and Covalen), the group urged the company to “stop needlessly risking moderators’ lives.” This follows reports from The Intercept that moderators handling extreme content (e.g., graphic violence, sexual abuse) were required to return to offices during the COVID-19 pandemic. Shortly after their return, a moderator reportedly tested positive for the virus.

Key Demands from Moderators

The letter outlines several critical requests:

  • Remote Work for High-Risk Employees: Moderators demand indefinite work-from-home options for those at high risk or living with vulnerable individuals.
  • Hazard Pay & Healthcare: They call for hazard pay, comprehensive healthcare, and psychiatric support.
  • Direct Employment: Instead of outsourcing, they urge Facebook to employ moderators directly.
  • Transparency & Process Reform: The group challenges Facebook’s claim that moderation must occur on-site for security reasons, arguing most content could be reviewed remotely.

“Without our work, Facebook is unusable. Its empire collapses. Your algorithms cannot spot satire. They cannot sift journalism from disinformation.” — Content Moderators

Facebook’s Response

A Facebook spokesperson stated:

“We appreciate the valuable work content reviewers do and prioritize their health and safety. The majority of our 15,000 global moderators work remotely and will continue to do so during the pandemic. All have access to healthcare and wellbeing resources from day one.”

However, Facebook’s VP of Integrity, Guy Rosen, later clarified that highly sensitive content (e.g., criminal material) cannot be reviewed remotely due to security concerns (CNBC).

Broader Implications

The Human Cost of Content Moderation

Moderators are the frontline defense against harmful content, yet their working conditions often lack adequate safeguards. The letter emphasizes:

  • Mental Health Risks: Exposure to traumatic content without sufficient psychological support.
  • Physical Safety: COVID-19 exposure in office settings.

A Call for Systemic Change

The moderators argue that Facebook’s reliance on AI is insufficient:

  • Algorithms fail to detect nuances like satire or distinguish journalism from disinformation.
  • Human moderators remain essential for rapid intervention in cases of self-harm or child abuse.

International Support

The campaign has gained backing from legal advocacy group Foxglove, which called it the “biggest joint international effort of Facebook content moderators yet” (Twitter).

Editor’s Note: This article was updated to clarify that full-time Facebook employees also support these demands in solidarity with moderators.
