Until recently, Mark Zuckerberg liked to think of Facebook as a basically innocuous social utility. But that view has taken a beating in recent months, as news shared on Facebook had a huge — and, critics say, pernicious — effect on the presidential election.
In an open letter released on Thursday, Zuckerberg grapples with the implications of Facebook’s growing power.
“For the past decade, Facebook has focused on connecting friends and families,” he writes. “With that foundation, our next focus will be developing the social infrastructure for community.”
Many of the ideas he discusses are uncontroversial, like helping people find their loved ones during natural disasters. But the most significant section — and likely the most controversial — focuses on filter bubbles, clickbait, and fake news.
“Giving everyone a voice has historically been a very positive force for public discourse,” Zuckerberg writes. “But the past year has also shown it may fragment our shared sense of reality. Our goal must be to help people see a more complete picture, not just alternate perspectives.”
Zuckerberg vows to stamp out “hoax content” on Facebook and to alert users when an article has been ruled false by fact-checkers. But he also recognizes that the biggest problem with news on Facebook isn’t outright fakery so much as stories that are distorted or sensationalistic. These stories deepen partisan divisions and make compromise impossible.
“Even if we eliminated all misinformation, people would just emphasize different sets of facts to fit their polarized opinions,” Zuckerberg warns. “That’s why I’m so worried about sensationalism in media.”
It’s a good line, and Zuckerberg says he wants to address this problem. He says that if people become less likely to share an article after reading it, that’s a clue that it probably doesn’t deliver on its headline, so Facebook will push it down the list in users’ news feeds.
Still, Zuckerberg ignores what seems to me the fundamental reason that sensationalistic content thrives on Facebook: the fact that Facebook’s news feed chooses stories based on “engagement.” That is, the more often users click, share, and otherwise engage with a particular story, the higher Facebook ranks it in people’s news feeds.
As any online journalist will tell you, “engagement” and “sensationalism” are very often synonyms. A one-sided, partisan article will typically generate more clicks than a nuanced article that gives each side of an issue its due.
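To see why the share-after-read adjustment Zuckerberg describes may not be enough, consider a toy sketch of engagement-based ranking. This is purely hypothetical — Facebook has never published its actual ranking algorithm, and the scoring weights, field names, and penalty threshold below are all invented for illustration:

```python
# Hypothetical sketch of engagement-based feed ranking (NOT Facebook's
# actual algorithm). Stories are scored by clicks and shares; stories
# that readers rarely share *after* clicking (a sign the headline
# over-promised) have their score penalized, per Zuckerberg's proposal.

from dataclasses import dataclass

@dataclass
class Story:
    title: str
    clicks: int              # times users clicked the story
    shares: int              # times users shared it
    post_click_shares: int   # shares made after actually reading it

def rank(stories):
    def score(s):
        engagement = s.clicks + 2 * s.shares  # assumed weights
        # If few readers share after reading, halve the score.
        read_share_rate = s.post_click_shares / max(s.clicks, 1)
        penalty = 0.5 if read_share_rate < 0.05 else 1.0
        return engagement * penalty
    return sorted(stories, key=score, reverse=True)

feed = rank([
    Story("Nuanced policy analysis", clicks=100, shares=20, post_click_shares=15),
    Story("You won't BELIEVE this", clicks=500, shares=50, post_click_shares=5),
])
print([s.title for s in feed])
```

Note what happens in this toy example: the clickbait story is penalized (its score drops from 600 to 300) but still outranks the nuanced article (score 140), because raw engagement dominates. That is precisely the structural problem: as long as engagement is the primary signal, a downranking tweak trims sensationalism without dethroning it.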
It’s no mystery why Facebook chose to heavily prioritize engagement when the site was young: The company wanted users to come back to the site as often as possible. But the calculation could be different now that Facebook is so ubiquitous. Facebook doesn’t have to manipulate people into visiting constantly — most people do so already out of habit and because they want to see their friends’ baby pictures. If anything, people might start to feel better about their Facebook experience if every news headline they saw weren’t trying to manipulate them into clicking on it.