Research Highlights

Tackling Domestic Disinformation: What the Social Media Companies Need to Do

A new report from the NYU Stern Center for Business and Human Rights examines domestically generated disinformation in the US and urges social media platforms to take a harder line in addressing the problem.

A growing amount of misleading and false content infests social media. A new report from the NYU Stern Center for Business and Human Rights focuses on domestically generated disinformation in the US: the nature and scope of the problem, what the social media platforms have done about it, and what more they need to do.

Authored by Paul Barrett, the Center’s deputy director, the report notes that domestic disinformation comes from disparate sources, including message boards, websites and networks of accounts on Facebook, Twitter and YouTube.

Increasingly, social media platforms are removing disinformation from Russia and other foreign countries because of its fraudulent nature and potential to disrupt democratic institutions. In contrast, some commentators have argued that misleading content produced by US citizens is difficult to distinguish from ordinary political communication protected by the First Amendment. According to this view, platforms shouldn’t be encouraged to make judgments about what’s true and untrue in politics.

But, the report argues, platforms are already making similar judgments when their algorithms rank and recommend posts, tweets and videos. They also remove certain categories of harmful content, such as harassment and hate speech. The report urges platforms to add provably false information to the removal list, starting with content affecting politics or democratic institutions. The First Amendment, which precludes government censorship, doesn’t constrain social media venues owned and operated by nongovernmental entities. The real question confronting the platforms, according to the report, is how to make their evaluations of factually questionable content more reasoned, consistent and transparent.

The report is divided into four parts:
  • Part one provides an overview of the subject and the report’s central argument: that platforms ought to take a harder line on domestic disinformation, which pollutes the marketplace of ideas.
  • Part two describes the various forms that domestic disinformation takes.
  • Part three assesses steps the platforms have taken to address domestic disinformation.
  • Part four outlines the report’s recommendations to the platforms, detailing the steps they need to take to intensify the fight against domestic disinformation.
Read the full report here.