Opinion

How Facebook Fails To Fight QAnon

By Michael Posner

Last week, Marjorie Taylor Greene easily won a Republican Congressional primary in Georgia and is likely to be elected in November to represent Georgia’s 14th Congressional District. Greene is a deeply divisive candidate who has delivered a steady stream of inflammatory attacks on Muslims, immigrants, African Americans, and Jews. She is also an avid supporter of QAnon, a movement that occupies a dark corner of the internet and whose followers believe a worldwide network of Satan-worshiping pedophiles controls politicians and the media. With more than a dozen Congressional candidates expressing support for QAnon, its growing collection of misguided and often hateful content poses an emerging threat to our democracy. Its explosive growth also underscores the inadequacy of Facebook’s efforts to address this type of political disinformation.

The QAnon conspiracy theories first emerged in October 2017, when an anonymous post on the fringe platform 4chan claimed classified information showed the existence of an elaborate plot by the “deep state” against Donald Trump and his supporters. It evolved from “Pizzagate,” the conspiracy theory that Hillary Clinton ran a pedophilia ring from a Washington pizza shop, and has grown substantially by appealing to adherents of a broader range of conspiracy theories. Outside of 4chan, many of QAnon’s followers have found a home on Facebook, where their brand of political disinformation has attracted millions of followers and given them outsized prominence. According to West Point’s Combating Terrorism Center, QAnon “represents a militant and anti-establishment ideology,” which “finds resonance with other far-right extremist movements, such as the various militant, anti-government, white nationalist, and neo-Nazi extremist organizations across the United States.” The FBI has called QAnon a motivator of extremist violence. The group poses a significant threat, not only through potential violence, but also through the spreading of disinformation and degrading of democratic discourse.

This threat is fueled and accelerated by Facebook. As Charles Warzel observed recently in The New York Times, “QAnon’s rise is the direct result of a world in which media and politics are distorted by the dizzying scale of social networks, by their lack of adequate content moderation, and by the gaming of algorithms and hashtags. While the social media platforms didn’t create QAnon, they created the conditions for it to thrive.” Indeed, the sensationalist nature of QAnon’s content makes it much more likely to be shared by Facebook’s users or recommended by the company’s own algorithms. A recent internal investigation by Facebook, reported by NBC News, uncovered thousands of groups and pages, with millions of members and followers, that support the QAnon-inspired conspiracy theories. The top 10 groups identified in Facebook’s investigation had more than one million members collectively. These internal findings have been corroborated by outside assessments. The Guardian documented more than 170 QAnon groups, pages, and accounts across Facebook and Instagram with more than 4.5 million aggregate followers. It also documented dedicated communities of QAnon followers on Facebook in at least 15 countries.

Read the full Forbes article.
___
Michael Posner is a Professor of Business and Society and Director of the NYU Stern Center for Business and Human Rights.