Research Highlights

Disinformation and the 2020 Election: How the Social Media Industry Should Prepare

Is the 2020 U.S. presidential election secure from disinformation? Although the November 2018 midterm elections saw relatively little interference, there is no guarantee that antagonists will refrain from digital meddling in the more consequential 2020 contest, according to a new report from the NYU Stern Center for Business and Human Rights.

The report, “Disinformation and the 2020 Election: How the Social Media Industry Should Prepare,” assesses some of the forms and sources of disinformation likely to play a role during the next presidential election campaign.

Authored by Paul Barrett, the Center’s deputy director, the report explores these risks and analyzes what the major social media companies (Facebook, Twitter, and YouTube, which is owned by Google) have done to harden their defenses against disinformation. The report also offers nine recommendations for additional steps the companies should take to prepare for 2020, including:
  • Detect and remove deepfake videos: Realistic but fraudulent videos have the potential to undermine political candidates and exacerbate voter cynicism.
  • Remove provably false content in general: The platforms already remove hate speech, voter-suppression content, and other harmful categories; the report recommends adding provably false content to that list.
  • Hire a senior content overseer: Each company needs an executive with clout to supervise the process of guarding against disinformation.
  • Improve industry-wide collaboration on disinformation: For example, when one platform takes down abusive accounts, others should do the same with affiliated accounts.
  • Teach social media literacy in a more direct, sustained way: Users have to take responsibility for recognizing false content, but they need more help to do it.
Read the full report on the Center’s website.