Facebook's response to 60 Minutes' report, "The Facebook Whistleblower"

Lena Pietsch, Facebook's director of policy communications, responded to 60 Minutes' report, "The Facebook Whistleblower," with the following statements on behalf of the social media company:

"Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."

On the claim that internal research showed that the company is not doing enough to eradicate hate, misinformation, and conspiracy:

"We've invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago. We have a strong track record of using our research — as well as external research and close collaboration with experts and organizations — to inform changes to our apps."

On the claim that incentives within Facebook are misaligned, and the desire for engagement on the platform and profit outweighs safety in some instances:

"Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business. Our incentive is to provide a safe, positive experience for the billions of people who use Facebook. That's why we've invested so heavily in safety and security."

On the claim that the change in "Meaningful Social Interactions" in 2018 amplified polarizing and hateful content:

"The goal of the Meaningful Social Interactions ranking change is in the name: improve people's experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends -- which research shows is better for people's well-being -- and deprioritizing public content. Research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook even existed, and that it is decreasing in other countries where Internet and Facebook use has increased. We have our role to play and will continue to make changes consistent with the goal of making people's experience more meaningful, but blaming Facebook ignores the deeper causes of these issues - and the research."

On the claim that safety measures were put in place, and then rolled back, and made Facebook less safe in the lead up to January 6:

"We spent more than two years preparing for the 2020 election with massive investments, more than 40 teams across the company, and over 35,000 people working on safety and security. In phasing in and then adjusting additional emergency measures before, during and after the election, we took into account specific on-platform signals and information from our ongoing, regular engagement with law enforcement. When those signals changed, so did the measures. It is wrong to claim that these steps were the reason for January 6th -- the measures we did need remained in place through February, and some like not recommending new, civic, or political groups remain in place to this day. These were all part of a much longer and larger strategy to protect the election on our platform -- and we are proud of that work."

Additional statement on Facebook's response to dangerous organizations before the January 6 Capitol insurrection:

"We banned hundreds of militarized social movements, took down tens of thousands of QAnon pages, groups and accounts from our apps, and removed the original #StopTheSteal Group. This is in addition to our removal and repeated disruption of various hate groups, including Proud Boys, which we banned in 2018. Ultimately, the responsibility resides with those who broke the law, and the leaders who incited them. Facebook has taken extraordinary steps to address harmful content and we'll continue to do our part. We also aggressively worked with law enforcement, both before January 6 and in the days and weeks since, with the goal of ensuring that evidence linking the people responsible for January 6th to their crimes is available for prosecutors."

On Instagram pausing the launch of a version for younger users:

A statement from Instagram: "While we stand by the value that this experience would provide to families, we've decided to pause this project to give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the importance of this project for younger teens online today. The reality is that kids are already online, and we believe that developing age-appropriate experiences designed specifically for them is far better for parents than where we are today."
