Research Highlights

Regulating Social Media: The Fight Over Section 230 — and Beyond

A new report from the NYU Stern Center for Business and Human Rights argues that Section 230 should be improved to push internet companies to accept greater responsibility for curbing harmful content.

Recently, Section 230 of the Communications Decency Act of 1996 has come under sharp attack from members of both political parties, including presidential candidates Donald Trump and Joe Biden. The foundational law of the commercial internet, Section 230 does two things: It protects platforms and websites from most lawsuits related to content posted by third parties. And it guarantees this shield from liability even if the platforms and sites actively police the content they host. This protection has encouraged internet companies to innovate and grow, even as it has raised serious questions about whether social media platforms adequately self-regulate harmful content.

A new report from the NYU Stern Center for Business and Human Rights, “Regulating Social Media: The Fight Over Section 230 — and Beyond,” assesses proposals for revising the law. The report concludes that Section 230 ought to be preserved, but that it can be improved: the law should be used as leverage to push platforms to accept greater responsibility for the content they host.

Authored by Paul Barrett, the Center’s deputy director, the report offers three recommendations: 
  • Keep Section 230: The law has helped online platforms thrive by protecting them from most liability related to third-party posts and by encouraging active content moderation. It has been especially valuable to smaller platforms with modest legal budgets. But the benefit Section 230 confers ought to come with a price tag: the assumption of greater responsibility for curbing harmful content.
  • Improve Section 230: The measure should be amended so that its liability shield provides leverage to persuade platforms to accept a range of new responsibilities related to policing content. Internet companies may reject these responsibilities, but in doing so they would forfeit Section 230’s protection, open themselves to costly litigation, and risk widespread opprobrium.
  • Create a Digital Regulatory Agency: There’s a crisis of trust in the major platforms’ ability and willingness to superintend their sites. Creation of a new independent digital oversight authority should be part of the response. While avoiding direct involvement in decisions about content, the agency would enforce the responsibilities required by a revised Section 230.
Read the full report on the Center’s website.