Facebook unveils new tools to "nudge" young users away from harmful content


Fresh off a congressional grilling over allegations that its products harm children, Facebook is rolling out several new features it says will better protect young users, including prompting teens to take a break from its photo-sharing app, Instagram, and "nudging" teens away from content they are repeatedly viewing that is not conducive to their well-being.

The Menlo Park, California-based company is also planning to introduce new optional controls so that parents or guardians can supervise what their teens are doing online. These initiatives come after Facebook announced late last month that it was pausing work on its Instagram for Kids project.

Critics, though, say the plan lacks details and are skeptical that the new features would be effective.

No magic wand

The new controls were outlined Sunday by Nick Clegg, Facebook's vice president for global affairs, who made the rounds on news shows including CNN's "State of the Union" and ABC's "This Week with George Stephanopoulos."

"We are constantly iterating in order to improve our products," Clegg told Dana Bash on "State of the Union" Sunday. "We cannot, with a wave of the wand, make everyone's life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use."


Clegg said that Facebook has invested $13 billion over the past few years to keep its platform safe and that the company has 40,000 people working on these issues. And while Clegg said that Facebook has done its best to keep harmful content off its platforms, he said he was open to more regulation and oversight.

"We need greater transparency," he told CNN's Bash. He noted that the systems that Facebook has in place should be held to account, if necessary, by regulation so that "people can match what our systems say they're supposed to do from what actually happens."

The flurry of interviews came after whistleblower Frances Haugen, a former data scientist with Facebook, went before Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens. She also accused Facebook of being dishonest in its public fight against hate and misinformation. Haugen's accusations were supported by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company's civic integrity unit.


A question of algorithms

Josh Golin, executive director of Fairplay, a watchdog group focused on children's media and marketing, said that he doesn't think introducing controls to help parents supervise teens would be effective, since many teens set up secret accounts. When Facebook's head of global safety, Antigone Davis, testified before the Senate last month, she sparred with Senator Richard Blumenthal over the question of teens' secret accounts.

In a question that has since gained internet notoriety, Blumenthal asked Davis if Facebook would "commit to ending finsta," using slang for a secondary secret account, to which Davis replied, "We don't actually do finsta."

Fairplay's Golin was also dubious about how effective it would be to nudge teens to take a break or move away from harmful content. He noted that Facebook needs to show exactly how it would implement these features and offer research demonstrating that such tools work.

"There is tremendous reason to be skeptical," he said. He added that regulators need to restrict what Facebook does with its algorithms.

He said he also believes that Facebook should cancel its Instagram project for kids.


When Clegg was grilled by both Bash and Stephanopoulos, in separate interviews, about the role of algorithms in amplifying misinformation ahead of the January 6 riot, he responded that if Facebook removed the algorithms, people would see more, not less, hate speech and more, not less, misinformation.

Clegg told both hosts that the algorithms serve as "giant spam filters."

Democratic Senator Amy Klobuchar of Minnesota, who chairs the Senate Commerce Subcommittee on Competition Policy, Antitrust, and Consumer Rights, told Bash in a separate interview Sunday that it's time to update children's privacy laws and offer more transparency in the use of algorithms.

"I'm just tired of hearing 'trust us', and it's time to protect those moms and dads that have been struggling with their kids getting addicted to the platform and been exposed to all kinds of bad stuff," Klobuchar said. 

She added, "I appreciate that he is willing to talk about things, but I believe the time for conversation is done," referring to Clegg's plan. "The time for action is now."
