Research Highlights

Who Moderates the Social Media Giants? A Call to End Outsourcing

A new report from the NYU Stern Center for Business and Human Rights examines how social media companies have outsourced the critical function of content moderation to third-party vendors and offers recommendations for improvement.
Content moderation—deciding what stays online and what gets taken down—is an indispensable aspect of the social media industry. Along with the communication tools and user networks the platforms provide, content moderation is one of the fundamental services social media offers—perhaps the most fundamental. Without it, the industry’s highly lucrative business model, which involves selling advertisers access to the attention of targeted groups of users, just wouldn’t work.

And yet, a new report from the NYU Stern Center for Business and Human Rights, “Who Moderates the Social Media Giants? A Call to End Outsourcing,” finds that social media companies have made the striking decision to marginalize the people who do content moderation, outsourcing the vast majority of this critical function to third-party vendors—the kind of companies that run customer-service call centers and back-office billing systems. Some of these vendors operate in the U.S.; others operate in such places as the Philippines, India, Ireland, Portugal, Spain, Germany, Latvia, and Kenya. They hire relatively low-paid workers to sit in front of computer workstations and sort acceptable content from unacceptable.

Authored by Paul Barrett, the Center’s deputy director, the report focuses primarily on Facebook as a case study and:
  • Offers an overview of the current situation
  • Explores the interplay between the coronavirus pandemic and content moderation
  • Describes the origin and early development of moderation
  • Examines problems with Facebook’s content moderation, with an emphasis on the lack of adequate health care for the people who do it and the generally chaotic environment in which they work
  • Looks at the lack of adequate moderation in at-risk countries in regions such as South Asia
  • Offers recommendations for improving the situation, including: end outsourcing of content moderators and raise their station in the workplace; double the number of moderators to improve the quality of content review; provide all moderators with top-quality, on-site medical care; explore narrowly tailored government regulation; and more
Read the full report on the Center’s website.