Mark Zuckerberg long allowed misinformation, hate speech, and conspiracy theories to fester on Facebook, framing his critics’ calls for stronger action as a threat to free speech, and never quite mentioning that the toxic content endemic to Facebook is good for his bottom line. “I believe that people should be able to see for themselves what politicians that they may or may not vote for are saying,” Zuckerberg said in a Capitol Hill hearing in late 2019, “and judge their character for themselves.”
But, as everything that’s transpired since has made obvious if it wasn’t already, there’s a societal cost to that hands-off approach. And Facebook, like other social media companies, has been forced in recent months to take action. Over the past year, the social media giant has scrambled to stop the spread of fake news and conspiracy theories about the COVID-19 pandemic, implemented changes to its policies and algorithms to tackle harmful disinformation about the 2020 election and its incendiary aftermath, and even adopted a more aggressive approach to hate speech, lies, and extremism, including its extraordinary decision, in the wake of this month’s attack on the Capitol, to indefinitely suspend Donald Trump. These measures have marked a shift from the laissez-faire approach Zuckerberg had long taken, yet they’ve tended to be too little, too late, and too temporary, treating the most glaring symptoms of the problem without addressing the underlying condition. Zuckerberg may have curbed some misinformation by banning new political ads in the week leading up to the election, but the polarization and distrust his platform helped sow remain deeply embedded in American politics.
Maybe it’s a sudden attack of conscience, maybe it’s the threat of regulation, but Facebook will soon take its most significant step yet to reduce its harmful role in politics: trying to retreat from that sphere altogether. Zuckerberg, whose company posted massive profits in 2020, announced on Wednesday that Facebook will aim to “reduce the amount of political content” on the platform and will no longer recommend civic and political groups, a practice that has helped draw users into partisan echo chambers. Political discussions will still be allowed, Zuckerberg said, but the company’s “theme” for 2021 is to be a “force for bringing people closer together.”
“We need to make sure that the communities people connect with are healthy and positive,” he wrote in a Facebook post after an earnings call.
Zuckerberg cited community feedback indicating that “people don’t want politics and fighting to take over their experience on our services” as a motivation for the coming changes, but pressure from Washington has likely been at least as much of an incentive. Democratic Representatives Anna Eshoo and Tom Malinowski last week blasted Zuckerberg and other tech leaders for “polluting the minds of the American people” and called for social media companies to “fundamentally rethink algorithmic systems that are at odds with democracy” in the wake of the January 6 Capitol riot. “We recognize the recent steps Facebook has taken to crack down on harmful accounts such as those related to QAnon, by removing specific posts that incite violence and banning specific users,” the lawmakers wrote to Zuckerberg. “But content moderation on a service with more than 2.7 billion monthly users is a whack-a-mole answer to a systemic problem, one that is rooted in the very design of Facebook.”
Zuckerberg’s announcement Wednesday would seem to indicate he’s open to taking such action, but there’s good reason to be skeptical. While he said Facebook would commit itself to trying to “turn down the temperature and discourage divisive conversations,” he offered no real details about how the company will do so. And while it may be true that community feedback indicates users want a nicer Facebook, the company’s internal research suggests otherwise: when the platform recently tested algorithmic changes that reduced the visibility of “bad for the world” content, it found that user engagement dropped along with it. The so-called “nicer news feed” the experiment temporarily fostered in America’s post-election powder keg may have been better for society, but it likely wasn’t better for business. As long as harmful and divisive content contributes to Facebook’s growth, the company will have an incentive to stop at half-measures.
But perhaps the biggest obstacle to Facebook fixing what it helped break in American politics is that doing so may no longer be within the company’s power. Zuckerberg long ignored, downplayed, and papered over the problems with his creation; now they’ve metastasized into something far larger and harder to control. Any meaningful effort to address the systemic issues with social media is welcome and long overdue. But taming the monster isn’t so easy. The changes are a “good step,” Malinowski wrote after Zuckerberg’s announcement Wednesday. “But we’ll see how it works in practice.”