Opinion

Dealing With Disinformation: Facebook and YouTube Need to Take Down Provably False "News"

In effect, both Facebook and Google/YouTube are hedging their bets, opting to demote disinformation with the hope that no one will see it, rather than removing it altogether. Now they need to take the next step.
By Michael Posner
Mark Zuckerberg recently announced that Facebook will shift its focus to “private, encrypted services where people can be confident what they say to each other stays secure.” While some commentators initially applauded the statement as a 180-degree turn from the company’s past practices, its real import is unclear because Zuckerberg offered no plan of action. As he later acknowledged, “this isn’t a product announcement, it’s a statement of principles.”

Perhaps most telling is what his statement did not address: whether Facebook will change its core business model. Today, the company draws more than two billion of us worldwide to its site, collects our personal data, and then uses this information to generate billions of dollars in advertising revenue each year. If we assume that this advertising model will not soon disappear, then Facebook and the other leading Internet networking platforms, Twitter and Google’s YouTube, will continue to face the serious challenges posed by those who seek to promote deliberately false information on their sites, especially in the political realm. These platforms are selling user attention, or engagement, to those who advertise on their sites. They know that users are drawn to emotionally evocative content, especially content that is sensational and negative. Much of the disinformation online fits that bill and therefore is promoted by the platforms’ algorithms. Some of this disinformation comes from foreign actors like the Russian government. But more of it comes from a wide range of domestic sources, whose political orientation runs the gamut from right to left.

Facebook and these other companies have a mixed record in responding to disinformation. On the plus side, they have stepped up their efforts to remove postings by the Russians, often because their sources are disguised and because they disrupt our democratic discourse. They also are removing certain categories of harmful content, such as harassment or hate speech, whether from foreign or domestic sources. In addition, the companies have begun to take useful but still insufficient steps to combat deliberate disinformation, or “fake news.”

Read the full Forbes article.
___
Michael Posner is a Professor of Business and Society and Director of the NYU Stern Center for Business and Human Rights.