Why this week's social media verdicts could finally hold tech giants to account
Back-to-back verdicts this week against Meta and YouTube could usher in a new chapter in accountability for tech companies, while opening the door to fresh legal challenges, experts tell CBS News.
Two cases, decided in New Mexico and California, are the first to hold social media companies liable for harming young people.
On Tuesday, a New Mexico jury ordered Meta to pay $375 million in civil penalties for failing to protect young users from predators and misleading them about the safety of its apps.
In a separate verdict issued Wednesday in Los Angeles, a jury ruled that Meta and YouTube were negligent in how they designed and operated their platforms, resulting in mental health harm to the plaintiff, a 20-year-old named Kaley, or "KGM." Jurors in that case ordered the companies to pay a total of $6 million in damages.
Meta and YouTube told CBS News they disagree with the verdicts and are planning to appeal.
While the ultimate impact of these cases remains uncertain, experts say the mounting legal and public pressure could portend major changes in how companies design their apps, deliver content and integrate safety features into their platforms. That would mark a victory for American parents, a majority of whom support stricter restrictions on their children's social media use.
The verdicts could also set the stage for how thousands of similar cases — brought by individual plaintiffs, state attorneys general and school districts — play out.
"This is a watershed moment," said J.B. Branch, the AI governance and technology policy counsel at Public Citizen, a consumer advocacy organization. "This is the crack that could potentially open the floodgates to some accountability that Americans have been looking for."
These rulings could reshape tech accountability in several ways, experts say.
A focus on product liability
Internet companies have long been protected by Section 230 of the 1996 Communications Decency Act, which shields them from liability for third-party content posted on their platforms.
However, lawyers in the Los Angeles case took a new tack by focusing on product liability, arguing that Google and Meta's design and operation of their platforms caused addictive behavior and harm.
"This is the first time that anyone has won a judgment against these companies for the very design and the features, as opposed to what other people post," Devorah Heitner, a researcher who studies young people's relationship with technology, told CBS News in an interview.
Legal experts anticipate an increase in product liability cases against social media companies after the Los Angeles trial showed that the legal theory resonated with the jury.
"I believe this is the path forward," said Matthew Bergman, the founding attorney of the Social Media Victims Law Center. Bergman's firm represented Kaley and has filed 1,500 other cases on behalf of families who say they were adversely impacted by social media in some way.
Deeper scrutiny on AI
In addition to social media platforms, this week's verdicts could also put artificial intelligence tools developed by big tech companies under the microscope, especially if product liability arguments gain traction.
Companies like OpenAI and Anthropic have rolled out AI-powered chatbots at lightning speed in the last few years. But some argue that the rush to get into the market has come at the expense of safety. Multiple families have filed lawsuits alleging that AI chatbots were responsible, or played a role, in their loved ones' suicides.
"We are indeed in a new era of Internet law litigation," Jess Miers, an assistant professor at the University of Akron School of Law, told CBS News in an email. "We can and should expect the majority of cases against online services (and now generative AI companies) to be product liability cases."
An increase in lawsuits
ByteDance, Google, Snap and Meta are facing thousands of other lawsuits alleging that their platforms caused harm, including from dozens of state attorneys general. Individual plaintiffs and school districts have also filed litigation against the tech giants.
Because thousands of families have filed similar lawsuits, KGM and a handful of other plaintiffs have been selected for bellwether trials — essentially test cases that let both sides see how their arguments play out before a jury, and that could eventually lead to a broader settlement reminiscent of the Big Tobacco and opioid litigation.
Bergman said a group of cases that have been consolidated in California state court and in federal court are "currently awaiting outcomes of these bellwethers to determine whether there's a path to a negotiated resolution, or whether trial is in the works."
In addition to influencing the body of existing cases, Bergman said these verdicts could embolden more children and their parents to come forward, opening the door to more litigation against big tech companies.
"I think there are many families that have been afraid to take on big tech despite the injuries that their children have sustained," he said. "It is our hope and expectation that this verdict will assuage their reluctance and encourage them to seek the same kind of accountability that they would seek if their child were injured by any other dangerous product."
Changes to social media platforms
As part of the Los Angeles trial, Meta and YouTube were ordered to pay damages, but were not required to make any specific changes to their platforms. However, legal experts say the decision could compel social media companies to reconsider their app designs and how they deliver content in order to insulate themselves against future liability.
Clay Calvert, nonresident senior fellow in technology policy studies at the nonpartisan American Enterprise Institute, said he expects the pressure will only mount if the verdicts are upheld on appeal and if other pro-plaintiff verdicts follow.
The changes could uproot some of the central components of apps, including the algorithms that decide what types of content users see in their feeds, experts tell CBS News. Companies could also move to limit screen time, provide warnings to children who use the apps as well as to their parents, and introduce stricter age verification rules.
"These trials are likely to result in changes to endless scroll and changes to the algorithm, potentially for everyone," Heitner said.
—With reporting by Emily Pandise.