After the Jan. 6 Capitol riot, Facebook parent Meta Platforms Inc. said it wanted to scale back how much political content it showed users. The company went further than almost anyone knew.
The results of the effort are gradually reshaping political discourse on the world’s biggest social-media platform, even though the company backed off the most aggressive approach: hitting the mute button on all recommendations of political content.
The company’s sometimes tortured efforts over the past 18 months to play down politics and other divisive topics on the platform are outlined in internal documents reviewed by The Wall Street Journal.
At first, Facebook overhauled how it promoted political and health-related content. With surveys showing users were tired of strife, the platform began favoring posts that users considered worth their time over ones that merely riled them up, the documents show. Debates were fine, but Facebook wouldn’t amplify them as much.
Meta’s leaders decided, however, that wasn’t enough. In late 2021, tired of endless claims about political bias and censorship, Chief Executive Mark Zuckerberg and Meta’s board pushed for the company to go beyond incremental adjustments, according to people familiar with the discussions. Presented with a range of options, Mr. Zuckerberg and the board chose the most drastic, instructing the company to demote posts on “sensitive” topics as much as possible in the newsfeed that greets users when they open the app—an initiative that hasn’t previously been reported.
The plan was in line with calls from some of the company’s harshest critics, who have alleged that Facebook is either politically biased or commercially motivated to amplify hate and controversy. For years, advertisers and investors have pressed the company to clean up its messy role in politics, according to people familiar with those discussions.
It became apparent, though, that the plan to mute politics would have unintended consequences, according to internal research and people familiar with the project.
The result was that views of content from what Facebook deems “high quality news publishers,” such as Fox News and CNN, fell more than views of material from outlets users considered less trustworthy. User complaints about misinformation climbed, and charitable donations made through the company’s fundraising feature fell in the first half of 2022. And perhaps most important, users didn’t like it.
One internal analysis concluded that Facebook could achieve some of its goals by heavily demoting civic content—coverage of political, community and social issues—in the newsfeed, but it would be at “a high and inefficient cost.”
At the end of June, Mr. Zuckerberg pulled the plug on the most extreme plan. Unable to suppress political controversy through blunt force, Facebook has fallen back on more gradual changes to how its newsfeed promotes what the company deems “sensitive topics,” such as health and politics.
“As Mark said almost two years ago, people wanted to see less politics overall on Facebook while still being able to engage with political content if they want to, and that’s exactly what we’ve been doing,” said a spokeswoman for the company. “Over the last several years, we tested various approaches, ultimately implementing changes that reduce politics, while giving people the experiences they want.”
The current approach still reduces how much of that content users see. Meta now estimates politics accounts for less than 3% of total content views in users’ newsfeed, down from 6% around the time of the 2020 election, the documents show. But instead of reducing views through indiscriminate suppression or heavy-handed moderation, Facebook has altered the newsfeed algorithm’s recommendations of sensitive content toward what users say they value, and away from what simply makes them engage, according to documents and people familiar with the efforts.
“It turns out that maybe the way we recommend political content shouldn’t be the same way we recommend entertainment,” said Ravi Iyer, a former Meta data-science manager who worked on the issue before leaving in September to become a managing director of the University of Southern California’s Psychology of Technology Institute. Mr. Iyer said Meta’s current approach is a departure from the engagement-focused model long used not just by Facebook but by many other big social-media platforms.
Mr. Iyer said there should be more focus on the way platforms allow certain content to go viral, rather than subjective decisions about what to leave up or take down. “Having employees judge good vs. bad speech often creates more problems than it solves,” he said. “Our goal should be fewer judgment calls.”
Meta’s recent moves risk alienating some digital publishers who built businesses around catering to the newsfeed’s algorithms.
Courier Newsroom, a network of eight digital publications that describes itself as “the largest left-leaning local news network in the country,” said it struggled for traction on Facebook as it revved up for the 2022 midterms. Despite increasing its output of articles by 14.5% in October, the company found that its organic impressions on Facebook—meaning those not boosted by paid advertising—fell by 42.6% from a month earlier.
“Facebook remains one of the most powerful and far-reaching distribution platforms in the world,” said Courier Newsroom’s publisher. “By further limiting the reach of trusted news publishers on their platform, they will only exacerbate the information crisis in America they’ve helped to create.”
At the progressive-leaning news site Mother Jones, which primarily covers politics and social issues, total Facebook content views in 2022 were roughly 35% of what they were in the year-earlier period. “It’s grim to see in such stark terms how much power a single tech company has over the news that people can access,” said Monika Bauerlein, Mother Jones CEO.
For one conservative commentator whose Facebook page is among the most popular on the platform, user engagement—a measure of likes, comments and shares—has continued to grow, according to analytics tool CrowdTangle. He, too, thinks Facebook’s strategic shift away from politics is misguided, in part, he said, because its user base skews older and tends to be interested in political issues.
“They could have used their older audience to fund their younger, cooler Meta,” he said. “Instead they said, ‘I’ve got an idea. Let’s take our gravy train of older users that like conservative content and wipe them off the platform.’ ”
Facebook for years has had a conflicted relationship with politics, given that there are few other topics that get users as riled up—and as engaged on social media.
Speaking at Georgetown University in 2019, Mr. Zuckerberg defended social media’s role in politics and society. “I believe people should be able to use our services to discuss issues they feel strongly about—from religion and immigration to foreign policy and crime,” he said, contending that Facebook’s role in public discourse was ultimately healthy.
A Journal article in 2021 cited internal company research showing that steps to promote engagement had favored inflammatory material, with publishers and political parties reorienting their posts toward outrage and sensationalism.
Facebook said that it had an integrity team in place to address efforts to exploit its algorithm, and that it wasn’t responsible for partisan divisions in society.
Researchers continued to document Facebook’s preferential treatment of toxic content even after the article’s publication. An October 2021 presentation noted that Facebook’s algorithm had long provided incentives for the creation of “outrage-inducing, untrustworthy content.”
In the wake of the 2020 election and the Jan. 6 riot, pressure mounted on the company to address the widespread perception that it was either toxic or biased.
The company’s postelection surveys showed users believed Facebook had unhealthy effects on civic discourse, and that they wanted to see less politics on the platform. Lawmakers grilled company executives in congressional hearings.
Mr. Zuckerberg appeared to be tiring of the topic. “Politics has kind of had a way of creeping into everything,” he said on a January 2021 earnings call, weeks after the riot. Users were sick of politics, he said, not just in the U.S. but around the globe. For the first time publicly, he said Facebook was looking at ways to de-emphasize political content in the newsfeed. “I just don’t think that it’s serving the community particularly well to be recommending that content right now,” he said.
That February, Facebook announced it would temporarily begin showing less political content to “a small percentage of people” as it experimented with ways to better serve its users. The goal wasn’t restricting political discussion, the company said, but “respecting each person’s appetite for it.”
It began the experiment in the U.S., Brazil and Canada, a departure from its usual practice of testing major changes in secondary markets.
Six months later, the company said it had made progress toward identifying “what posts people find more valuable than others,” and that it would put less emphasis on reshares and comments in the future.
The announcement played down the magnitude of the change. Facebook wasn’t just de-emphasizing reshares and comments on civic topics. It was stripping those signals from its recommendation system entirely.
“We take away all weight by which we uprank a post based on our prediction that someone will comment on it or share it,” a later internal document said.
That was a more aggressive version of what some Facebook researchers had been pushing for years: addressing inaccurate information and other integrity issues by making the platform less viral. Because the approach didn’t involve censoring viewpoints or demoting content via imprecise artificial-intelligence systems, it was deemed “defensible,” in company jargon.
Views of civic content in newsfeed fell by nearly a third, internal data showed. With the company no longer amplifying posts it predicted were most likely to draw lots of replies, comments on civic posts dropped by two-thirds. Anger reactions fell by more than half on civic content, and nearly a quarter platform-wide. Bullying, inaccurate information and graphic violence fell, too.
But there was a cost. Although users told Facebook that their feeds were more “worth your time,” they also logged on to the platform 0.18% less. Although the company had previously abandoned changes that had resulted in smaller reductions, this time it accepted the cost.
The board, however, remained worried about what was referred to internally as political “salience”—the degree to which people associated Facebook with high-controversy topics, according to documents and people familiar with the matter.
Users reported to Facebook that civic content continued to cause bad experiences, generating profanity, bullying and misinformation reports at disproportionate rates. User surveys showed that people were convinced that the platform had an unhealthy societal effect.
The board and Mr. Zuckerberg settled on broadly suppressing civic content in the newsfeed. But turning off politics wasn’t as easy as flipping a switch. The newsfeed algorithm runs on 30,000 lines of code, and inaccuracies in content detection meant Facebook would be able to reduce views of civic content by only 70%. Nor would the changes directly address controversies arising from forums of like-minded users, called Facebook Groups, a recurring hot spot for issues such as vaccine misinformation and election denial.
The company planned to roll out the changes in the summer of 2022, before the U.S. midterm elections. Documents show the company planned to inform users of the change and was considering how to offer them choices.
The company tried to predict the fallout by testing variations on the algorithm change on more than a quarter of Facebook’s U.S. users. The tests showed that for many news publishers, the effects were going to be severe.
Depending on the mix of suppression features deployed, projected Facebook traffic to Fox News, MSNBC, the New York Times, Newsmax, the Atlantic and The Wall Street Journal would initially fall by as much as 40% to 60% beyond the already enacted reductions, with the effects likely diminishing as publishers found ways to adapt.
That wasn’t seen as a problem. An accompanying analysis noted Facebook hoped to “reduce incentives to produce civic content.”
Just testing the broad civic demotion on a fraction of Facebook’s users caused a 10% decline in donations to charities via the platform’s fundraising feature. Humanitarian groups, parent-teacher associations and hospital fundraisers would take serious hits to their Facebook engagement.
An internal presentation warned that while broadly suppressing civic content would reduce bad user experiences, “we’re likely also targeting content users do want to see….The majority of users want the same amount or more civic content than they see in their feeds today. The primary bad experience was corrosiveness/divisiveness and misinformation in civic content.”
Even more troubling was that suppressing civic content didn’t appear likely to convince users that Facebook wasn’t politically toxic. According to internal research, the percentage of users who said they thought Facebook had a negative effect on politics didn’t budge with the changes, staying consistently around 60% in the U.S.
All that data persuaded Mr. Zuckerberg, in late June, to abandon the most far-reaching planned changes, leaving the company to once again rethink how to promote sensitive content. It built a way to cordon off civic and health posts from its near-constant efforts to boost engagement.
The company is still debating whether it should also restrict how it promotes other types of content. When newsfeed dialed back the reward for producing inflammatory posts on politics and health, some publishers switched to more sensationalistic crime coverage.
“This opens the door to think about what the lines for sensitive and nonsensitive content should be,” said Mr. Iyer, the former Facebook data-science manager. “It would be great for the world to weigh in on this conversation.”
A few weeks after Mr. Zuckerberg dropped plans for the broad changes, the company issued a terse addendum to its February 2021 blog post in which it announced plans to experiment with showing users less political content.
“Our tests have concluded,” the company wrote, that “placing less emphasis on shares and comments for political content is an effective way to reduce the amount of political content people experience in their Feed. We have now implemented these changes globally.”
A current planning document lists “Reducing Bad Experiences with Sensitive Topics” as a “can’t fail priority” for Facebook’s newsfeed integrity team.
“We need to accelerate progress across Facebook in this area,” the document said.
Write to Jeff Horwitz at Jeff.Horwitz@wsj.com, Keach Hagey and Emily Glazer
Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.