Research Highlights

Enhancing the FTC’s Consumer Protection Authority to Regulate Social Media Companies

A new report from the NYU Stern Center for Business and Human Rights concludes that the social media industry's self-regulation has proven inadequate and that it is time for Congress and the Federal Trade Commission to step in.

Congress is considering dozens of bills designed to rein in social media platforms such as Facebook, Instagram, YouTube, and Twitter. The profusion of proposed legislation reflects a broadly held view that the companies that own these platforms have failed to self-regulate adequately. This failure has resulted in the amplification of misinformation, hate speech, incitement of political violence, and other harmful content. While most lawmakers apparently agree that something must be done about the malign effects of social media, there isn’t a consensus on how to proceed. The surfeit of legislative proposals reflects clashing motivations and competing regulatory strategies.

A new report from the NYU Stern Center for Business and Human Rights, “Enhancing the FTC’s Consumer Protection Authority to Regulate Social Media Companies,” offers principles and policy goals to help lawmakers and regulators sort through the bills pending before Congress and shape an agenda for the FTC to use its consumer protection authority to incentivize better corporate conduct.

Co-authored by Paul Barrett, Center for Business and Human Rights deputy director and senior research scholar, and Lily Warnke, a former fellow at the Center, the report identifies goals Congress should aim to achieve through legislation, including:
  • Enhance the consumer protection authority of the Federal Trade Commission and direct the agency to conduct sustained oversight of social media companies.
  • Empower the FTC to enforce a new mandate that the companies maintain procedurally adequate content moderation systems, which deliver on promises made to users about platform rules and enforcement practices.
  • Direct the agency to enforce new transparency requirements, including that social media companies disclose how their algorithms rank, recommend, and remove content, as well as information about the kind of content that reaches large audiences and the factors that cause such content to “go viral.”
  • Oblige social media companies to maintain comprehensive, searchable advertising libraries, which disclose who has paid for each ad.
  • Respect the First Amendment by prohibiting the FTC from any involvement in substantive content decisions, including decisions to take down or leave up content.
  • Amend Section 230 of the Communications Decency Act of 1996 to clarify that its terms do not shield social media platforms from FTC enforcement actions or from private civil claims related to categories of particularly harmful content for which Congress determines platforms must be held accountable.
Read the full report on the Center’s website.