Facebook whistleblower says company incentivizes "angry, polarizing, divisive content"

Frances Haugen spent 15 years working for some of the largest technology companies in the world, including Google, Pinterest and, until May, Facebook.

Haugen quit Facebook of her own accord and left with thousands of pages of internal research and communications, which she shared with the Securities and Exchange Commission. 60 Minutes obtained the documents from a congressional source.

On Sunday, in her first interview, Haugen told 60 Minutes correspondent Scott Pelley about what she called "systemic" problems with the platform's ranking algorithm that led to the amplification of "angry content" and divisiveness. Evidence of that, she said, is in the company's own internal research.

"Facebook's mission is to connect people all around the world," said Haugen. "When you have a system that you know can be hacked with anger, it's easier to provoke people into anger. And publishers are saying, 'Oh, if I do more angry, polarizing, divisive content, I get more money.' Facebook has set up a system of incentives that is pulling people apart."

Haugen said Facebook changed its algorithm in 2018 to promote what it calls "meaningful social interactions" through "engagement-based ranking." She explained that content that draws engagement, such as reactions, comments, and shares, gets wider distribution.
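In rough terms, engagement-based ranking can be pictured as scoring each post by a weighted sum of its interactions and sorting the feed by that score. The sketch below is purely illustrative; the weights and field names are assumptions, not Facebook's actual model, which is not public.

```python
# Illustrative sketch of engagement-based ranking. The weights and
# fields are hypothetical; Facebook's real scoring system is not public.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    reactions: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Assumed weights: comments and shares count for more than reactions,
    # so content that provokes responses rises in the ranking.
    return 1.0 * post.reactions + 5.0 * post.comments + 10.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring posts get the widest distribution.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under a scheme like this, a post that angers people into commenting and sharing will mechanically outrank a post that is merely read, which is the incentive structure Haugen describes.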

Haugen stated that some of Facebook's own research found that "angry content" is more likely to receive engagement, something that content producers and political parties are aware of.

"One of the most shocking pieces of information that I brought out of Facebook that I think is essential to this disclosure is political parties have been quoted, in Facebook's own research, saying, we know you changed how you pick out the content that goes in the home feed," said Haugen. "And now if we don't publish angry, hateful, polarizing, divisive content, crickets. We don't get anything. And we don't like this. We know our constituents don't like this. But if we don't do these stories, we don't get distributed. And so it used to be that we did very little of it, and now we have to do a lot of it, because we have jobs to do. And if we don't get traffic and engagement, we'll lose our jobs."

Facebook declined an on-camera interview with 60 Minutes. The company told 60 Minutes it conducted internal and external research before altering its algorithm.

"The goal of the Meaningful Social Interactions ranking change is in the name: improve people's experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends -- which research shows is better for people's well-being -- and deprioritizing public content," said Lena Pietsch, Facebook's Director of Policy Communications, in a statement to 60 Minutes. "Research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook even existed, and that it is decreasing in other countries where internet and Facebook use has increased. We have our role to play and will continue to make changes consistent with the goal of making people's experience more meaningful, but blaming Facebook ignores the deeper causes of these issues - and the research."

The company said it continues to make alterations to its platform with the goal of "making people's experience more meaningful" and is conducting "new tests to reduce political content on Facebook based on research and feedback."

Haugen, a 37-year-old data scientist with an MBA from Harvard, is scheduled to testify before Congress this week.

FACEBOOK'S FOREIGN IMPACT

Facebook is one of the largest internet platforms in the world. It boasts 2.8 billion users worldwide, roughly 60% of all internet-connected people on Earth.

Despite that massive reach, former employee turned whistleblower Frances Haugen told 60 Minutes that the company does not offer the same safety systems for every language or country in which Facebook operates.

"It's really important to remember that Facebook makes different amounts of money for every country in the world," Haugen said. "Every time Facebook expands to a new one of these linguistic areas, it costs just as much, if not more, to make the safety systems for that language as it did to make English or French, right. Because each new language costs more money but there's fewer and fewer customers. And so, the economics just doesn't make sense for Facebook to be safe in a lot of these parts of the world."

Facebook told 60 Minutes it works with 80 independent third-party fact-checkers who review content in 60 languages.

"Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business," said Facebook's Pietsch. "Our incentive is to provide a safe, positive experience for the billions of people who use Facebook. That's why we've invested so heavily in safety and security."

MISINFORMATION AND HATEFUL CONTENT

In August, Facebook touted its efforts against COVID-19 misinformation and hate speech. The company issued a public report stating it had removed 3,000 accounts, pages, and groups for violating its rules against spreading COVID-19-related misinformation. Facebook also said it had removed 20 million pieces of COVID-19 misinformation from the platform, and that hate speech removals had increased 15-fold since the company began reporting them.

Former employee Frances Haugen believes Facebook isn't telling the full story in its transparency reports.

"We have no independent transparency mechanisms that allow us to see what Facebook is doing internally," Haugen told Scott Pelley. "And we have seen from things like the community enforcement report that when Facebook is allowed to create its own homework, it picks metrics that are in its own benefit. And the consequence is they can say we get 94% of hate speech and then their internal documents say we get 3% to 5% of hate speech. We can't govern that."

In a statement to 60 Minutes, Facebook said it has dedicated extensive resources to keeping people safe. The company says it has 40,000 people working on safety and security, and that it has invested $13 billion in such measures over the last six years.

"We've invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority," Facebook's Pietsch said to 60 Minutes. "If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago. We have a strong track record of using our research — as well as external research and close collaboration with experts and organizations — to inform changes to our apps."

Haugen: Facebook needs to declare "moral bankruptcy"

Before she quit Facebook, Frances Haugen worked in the social media platform's Civic Integrity unit, which she said was tasked with making sure the company was "a good force in society."

Haugen described working on an understaffed counterespionage team tasked with combating foreign actors who used the platform for malicious ends.

"Our team at any given time only could work on a third of the cases that we had," Haugen told 60 Minutes. "A third of them. Like, we literally had to sit the and make a list and be like, 'Who will we actually protect?' And there's no reason we had to do that. We could've had, you know, two, three, ten times as many people. And we intentionally didn't build detection systems because we already couldn't handle the cases we had."

Facebook told 60 Minutes it is constantly getting better at rooting out hate speech and removing it. The company said hate speech accounts for 5 of every 10,000 views on the platform, and that hate speech prevalence has dropped 50% over the last three quarters.

"Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place," said Facebook's Pietsch. "We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."

Haugen said she believes the social media giant should declare "moral bankruptcy" and level with the public on its past failures.

"The reason I came forward is Facebook has been struggling," Frances Haugen told 60 Minutes. "They've been hiding information…And we need to not solve problems alone, we need to solve them together. And that's why I came forward."
