FTC says Facebook failed to protect children's privacy, proposes sweeping changes

U.S. regulators say Facebook misled parents and failed to protect the privacy of children using its Messenger Kids app, including misrepresenting the access to private user data that it provided to app developers.

As a result, the Federal Trade Commission on Wednesday proposed sweeping changes to a 2020 privacy order with Facebook — now called Meta — that would prohibit the company from profiting from data it collects on users under 18, including data collected through its virtual-reality products. The FTC said the company has failed to fully comply with the 2020 order.

Meta would also be subject to other limitations, including restrictions on its use of facial-recognition technology, and would be required to provide additional privacy protections for its users.

"Facebook has repeatedly violated its privacy promises," said Samuel Levine, director of the FTC's Bureau of Consumer Protection. "The company's recklessness has put young users at risk, and Facebook needs to answer for its failures."

Singled out?

Meta called the announcement a "political stunt."

"Despite three years of continual engagement with the FTC around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory," Meta said in a prepared statement.

It added, "Let's be clear about what the FTC is trying to do: usurp the authority of Congress to set industrywide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil."

Meta added that it plans to "vigorously fight this action" and expects to prevail.

Extension of a parent's account

Facebook launched Messenger Kids in 2017, pitching it as a way for children to chat with family members and friends approved by their parents. The app doesn't give kids separate Facebook or Messenger accounts. Rather, it works as an extension of a parent's account, and parents get controls, such as the ability to decide with whom their kids can chat.

At the time, Facebook said Messenger Kids wouldn't show ads or collect data for marketing, though it would collect some data it said was necessary to run the service.

But child-development experts raised immediate concerns.

In early 2018, a group of 100 experts, advocates and parenting organizations contested Facebook's claims that the app was filling a need kids had for a messaging service. The group included nonprofits, psychiatrists, pediatricians, educators and the children's music singer Raffi Cavoukian.

"Messenger Kids is not responding to a need — it is creating one," the letter said. "It appeals primarily to children who otherwise would not have their own social media accounts." Another passage criticized Facebook for "targeting younger children with a new product."

Facebook, in response to the letter, said at the time that the app "helps parents and children to chat in a safer way," and emphasized that parents are "always in control" of their kids' activity.

Gaps in parental control

The FTC now says this has not been the case. The 2020 privacy order, which required Facebook to pay a $5 billion fine, required an independent assessor to evaluate the company's privacy practices. The FTC said the assessor "identified several gaps and weaknesses in Facebook's privacy program."

The FTC also said Facebook, from late 2017 until 2019, "misrepresented that parents could control whom their children communicated with through its Messenger Kids product."

"Despite the company's promises that children using Messenger Kids would only be able to communicate with contacts approved by their parents, children in certain circumstances were able to communicate with unapproved contacts in group text chats and group video calls," the FTC said.

As part of the proposed changes to the FTC's 2020 order, Meta would also be required to pause launching new products and services without "written confirmation from the assessor that its privacy program is in full compliance" with the order.

One outspoken supporter of the FTC's proposal is the American Economic Liberties Project. "Over, and over, and over again, Meta has shown itself willing to break the law and sacrifice users' safety and privacy for profits," said Krista Brown, senior policy analyst at the anti-monopoly group. 

Brown added, "The FTC's proposed changes to its 2020 privacy order take on this dangerous business model directly, protecting kids and families."
