Opinion

Confronting Facebook's Growing Pains

By Michael Posner

"Among the top negative emotions gaining the greatest attention are anger and fear. The companies have built algorithms that serve up what we want to see, which often means content that prompts anger and fear."
On Wednesday, The New York Times published an in-depth account of Facebook’s poor corporate responses to political disinformation, hate speech and other harmful content on its platform. The Times described the company’s approach as “delay, deny and deflect.” A day later, Facebook issued a sharp rebuttal, insisting that it is making progress on these challenges, citing, for example, recent improvements in its efforts to take down hate speech. This back-and-forth follows a familiar pattern: the company reacts to each new set of revelations, expresses contrition for not acting more quickly, and then offers new assurances that it now has the problem under control.

Even though the company has taken a number of important corrective actions (for example, banning military officials in Myanmar who had used the platform to inflame ethnic hatred), Facebook’s catch-us-and-we-will-fix-it routine is not sustainable. The company’s brand reputation is now in serious jeopardy. Facebook needs to reexamine three aspects of its core business model that are hindering progress. These and other aspects of the governance of social media companies are examined in a report from the Center for Business and Human Rights at the NYU Stern School of Business.

First, Facebook and its fellow Internet giants, Google and Twitter, should accept a more significant role in regulating content on their sites. For years, these companies have portrayed themselves as plumbers running pipes, with no control over the content that flows through them. They routinely deflect responsibility, saying they are not like the editors of The New York Times and are not arbiters of truth. The reality is that the Internet companies are neither plumbers nor traditional news editors. They are something in between, and they now need to help shape and enforce the rules governing this new paradigm.

Their resistance to acknowledging this obvious truth has been driven by company executives and lawyers, who fear that embracing a more expansive self-oversight role will endanger the legal-liability shield the Internet companies enjoy under Section 230 of the 1996 Communications Decency Act. This anxiety about liability may help explain why, as the Times reported, Facebook’s top management restrained the company’s former chief security officer, Alex Stamos, when he wanted to move more aggressively in early 2017 against Russian disinformation.

Read the full Forbes article.
___
Michael Posner is a Professor of Business and Society and Director of the NYU Stern Center for Business and Human Rights.