Facebook whistleblower fears Meta's plan for the metaverse


Meta Platforms, the owner of Facebook, is racing to build the "metaverse," and Frances Haugen is worried. The company's notorious struggles to moderate content on its social media platform bode ill for Meta's ability to control what is posted in the virtual world, said Haugen, the former Facebook product manager who told lawmakers in October that Facebook prioritizes profit over user safety and programs its algorithms to promote divisive content.

"It's the exact same problems you're going to see in VR," Haugen said in an interview with CBS News, adding, "Facebook hasn't actually designed safety by design into it from the start."

Haugen said platforms like TikTok, where a small portion of the content generates most of the views, are easier to moderate than Facebook's more distributed model. In the virtual spaces where Meta is betting big, moderating content, removing disinformation and tracking violators will be a challenge because the interactions are not recorded.

"You don't know who's the person who said the horrible comment to you," Haugen said, while noting that there are technological solutions that could protect people in the metaverse, such as logging activity.

"You could be keeping the last 20 minutes all the time and for audio at least, it's not that much," she said. "But it shows that they don't have safety by design because that's an easy feature to have."

Meta all in on the metaverse

Meta CEO Mark Zuckerberg has declared that the metaverse — a for now mostly theoretical network of 3D virtual environments accessed with augmented and virtual reality headsets — will be the "successor to the mobile internet." Meta is spending $10 billion this year to build products and protocols that support video games, concerts and workplace collaboration tools, a significant sum for a company that reported $29.01 billion in revenue last quarter.

Experts estimate that Zuckerberg's vision of an open virtual ecosystem could cost anywhere from $800 billion to $1 trillion, and would require participation from the company's biggest rivals, including Microsoft, Google, Apple and others.

To that end, Meta has taken pains to show that it plans to work with other stakeholders in developing the metaverse. In September, the company said building the interconnected networks will take up to 15 years, and promised to collaborate with governments and academic researchers on key issues. It launched a two-year, $50 million research program to work with civil rights groups and nonprofits "to determine how to build these technologies responsibly."

Last month, the company also announced a partnership with the Digital Wellness Lab at Boston Children's Hospital to develop a youth digital literacy program for the metaverse.

Haugen isn't reassured by Meta's pledge to collaborate. She said that prioritizing the construction of a virtual reality world suggests that Zuckerberg is "dissociating" from the company's current challenges, and she urged lawmakers to put pressure on Facebook to change its algorithms and content recommendation practices.

She suggested that lawmakers and regulators consider the risks that parents, community groups and activists attribute to Facebook's algorithms and "pair the company's assessment of its risks with the community's assessment."

"Facebook should have to articulate what it is going to do to address each harm because as long as Facebook is operating in the dark, they will not do enough on any of these," she said.

At a Senate hearing last week, the head of Meta-owned Instagram, Adam Mosseri, promised transparency around the use of algorithms and ranking models.

"I can commit to you today that we will provide meaningful access to data so that third-party researchers can design their own studies and make their conclusions about the effects of well-being on young people," he said. "And on ranking, I can commit to do all I can to explain how ranking works and to find other ways for us to be transparent about algorithms."


Meta has defended its content moderation practices. The company notes that it publishes a content enforcement report every quarter and is on track to spend more than $5 billion on safety issues this year. Meta also said it is working with independent academic researchers to examine the role Facebook played during the 2020 election.

"Every day our teams have to balance protecting the ability of billions of people to express themselves with the need to keep our platform safe," Meta spokesperson Nkechi Nneji said in a statement to CBS News. "While there's more work to do, we continue making progress as a result of those investments," she added.

Yet Haugen, who worked on algorithms during her time at Facebook, said the ranking models and the challenges associated with them are not going away anytime soon.

"The fundamental problem of our time is, do we want to be governed by algorithms or do we want to be governed by people," she said. 


Meta has argued that algorithms make the social media experience more meaningful for users. The company says its recommendation technology is designed to increase positive sessions and bring family and friends closer.

Nick Clegg, Meta's vice president of global affairs, wrote an essay titled "It Takes Two to Tango" earlier this year in which he described content ranking as a "dynamic partnership between people and algorithms."

"The personalized 'world' of your News Feed is shaped heavily by your choices and actions. It is made up primarily of content from the friends and family you choose to connect to on the platform, the Pages you choose to follow, and the Groups you choose to join," Clegg wrote. "Ranking is then the process of using algorithms to order that content."

Legislation in progress?

Lawmakers have said for months that bipartisan legislation to regulate social media companies is on the way. They have held several hearings, calling in tech CEOs and industry experts to answer questions about algorithms, content moderation and user privacy. Top executives from Meta, Twitter, Google, YouTube, Snapchat and TikTok have all testified before the House and Senate this year, but meaningful legislation has yet to materialize.

Haugen, who signed a book deal with Little, Brown and Co. on Thursday, said she is remaining patient, while asserting that the "normal mechanisms for controlling a trillion-dollar company do not exist."

"We have overcome tremendous, seemingly impossible things before," she said. "The Soviet Union fell, the British left India, Apartheid ended. All these things were impossible. They were impossible until they were inevitable."

