
Supreme Court hears case that could reshape the "fundamental architecture" of the internet


Washington — Kati Morton was a reluctant adopter of YouTube.

A therapist working toward her license in California, Morton was first encouraged by her then-boyfriend, now her husband, to post videos on the platform as a way to disseminate mental health information.

The year was 2011, and Morton, like many others, thought YouTube primarily consisted of videos of cats playing the piano and make-up tutorials. But after seeing other content posted on the site, Morton decided to give it a shot.

Her audience started small, with her videos garnering a handful of views. But in the more than a decade since then, Morton's YouTube channel has grown to more than 1.2 million subscribers.

Crucial to the growth of Morton's audience is YouTube's system for recommending content to users, which the company began building in 2008. It relies on a highly complex algorithm to predict what videos will interest viewers and keep them watching. Today, half of Morton's views come from recommendations, she said.

"If you could see the entire life of the channel, it was really, really slow and steady," Morton told CBS News. "And then through recommendations, as well as collaborations, things have grown as you're able to reach a broader audience and YouTube is better able to understand the content."

YouTube's recommendation algorithm, and those used by platforms like TikTok, Facebook and Twitter, are now at the heart of a legal dispute that will go before the Supreme Court on Tuesday, in a case that involves the powerful legal shield that helped the internet grow.

"We're talking about rewriting the legal rules that govern the fundamental architecture of the internet," Aaron Mackey, senior staff attorney at the Electronic Frontier Foundation, told CBS News of what's at stake in the case, known as Gonzalez v. Google. 

"A backbone of online activity"

Section 230 of the Communications Decency Act immunizes internet companies from liability over content posted by third parties and allows platforms to remove content considered obscene or objectionable. The dispute before the Supreme Court marks the first time the court will consider the scope of the law, and the question before the justices is whether Section 230's protections for platforms extend to targeted recommendations of information.

The court fight arose after terrorist attacks in Paris in November 2015, when 129 people were murdered by ISIS members. Among the victims was 23-year-old Nohemi Gonzalez, an American college student studying abroad who was killed at a bistro in the city. 

Gonzalez's parents and other family members filed a civil lawsuit in 2016 against Google, which owns YouTube, alleging that the tech company aided and abetted ISIS in violation of a federal anti-terrorism statute by recommending videos posted by the terror group to users.

Google moved to dismiss the complaint, arguing it was immune under Section 230. A federal district court in California agreed and, regarding YouTube's recommendations, found that Google was protected under the law because the videos at issue were produced by ISIS.

The U.S. Court of Appeals for the 9th Circuit affirmed the district court's ruling, and Gonzalez's family asked the Supreme Court to weigh in. The high court said in October it would take up the dispute.

The court fight has elicited input from a range of parties, many of which are backing Google in the case. Platforms like Twitter, Meta and Reddit — all of which rely on Section 230 and its protections — argue algorithmic recommendations allow them to organize the millions of pieces of third-party content that appear on their sites, enhancing the experience for users who would otherwise be forced to sift through a mammoth volume of posts, articles, photos and videos.

"Given the sheer volume of content on the internet, efforts to organize, rank, and display content in ways that are useful and attractive to users are indispensable," lawyers for Meta, the parent company of Facebook and Instagram, told the court.


Even the company that operates the online dating services Match and Tinder pointed to Section 230 as "vital" to its efforts to connect singles, as the law allows "its dating platforms to provide recommendations to its users for potential matches without having to fear overwhelming litigation."

But conservatives are using the case as a vehicle to rail against "Big Tech" firms and amplify claims that platforms censor content based on political ideology.

Citing lower court decisions they believe have led to a "broad grant of immunity," a group of Republican senators and House members told the Supreme Court that platforms "have not been shy about restricting access and removing content based on the politics of the speaker, an issue that has persistently arisen as Big Tech companies censor and remove content espousing conservative political views, despite the lack of immunity for such actions in the text of" Section 230.

The case has presented the justices with a rare opportunity to hear directly from the co-authors of the legislation at issue. Ron Wyden, now a Democratic senator from Oregon, and Chris Cox, a former GOP congressman from California, crafted Section 230 in the House in 1996. The bipartisan pair filed a friend-of-the-court brief explaining the plain meaning of their law and the policy balance they sought to strike.

"Section 230 protects targeted recommendations to the same extent that it protects other forms of content curation and presentation," they wrote. "Any other interpretation would subvert Section 230's purpose of encouraging innovation in content moderation and presentation. The real-time transmission of user-generated content that Section 230 fosters has become a backbone of online activity, relied upon by innumerable internet users and platforms alike."

Google, they argued, is entitled to liability protection under Section 230, since the platform's recommendation algorithm is merely responding to user preferences by pairing them with the types of content they seek. 

"The algorithm functions in a way that is not meaningfully different from the many curatorial decisions that platforms have always made in deciding how to present third-party content," Wyden and Cox said. 

The battle also highlights competing views about the internet today and how Section 230 has shaped it. For tech companies, the law has laid the groundwork for new platforms to come online, an industry of online creators to form and free expression to flourish. For Gonzalez's family and others, the algorithmic recommendations have proven deadly and harmful.

Like the Gonzalezes, Tawainna Anderson has fought to hold a social media platform responsible for content it recommends to users.

Last May, Anderson sued TikTok and its parent company, China-based ByteDance, after her 10-year-old daughter Nylah died in late 2021 after trying to perform the dangerous "Blackout Challenge," in which users are pushed to strangle themselves until they pass out and then share videos of the experience.

The challenge, which went viral on TikTok, was recommended to Nylah through her account's "For You" page, a curated feed of third-party content powered by TikTok's algorithmic recommendation system.

"They are actually feeding it to our children. They are sending them videos that they never even searched before," Anderson told CBS News chief legal correspondent Jan Crawford. 

Anderson's lawsuit sought to hold TikTok accountable for deliberately funneling dangerous content to minors through the challenges and encouraging behavior that put their lives in danger. TikTok asked the federal district court in Pennsylvania to dismiss the suit, invoking Section 230. 

U.S. District Judge Paul Diamond tossed out the case in October, writing that the law shielded TikTok from liability because it was promoting the work of others. But he acknowledged in a brief order that TikTok made the Blackout Challenge "readily available on their site" and said its algorithm "was a way to bring the challenge to the attention of those likely to be most interested in it."

"The wisdom of conferring such immunity is something properly taken up with Congress, not the courts," Diamond wrote.

Mackey, of the Electronic Frontier Foundation, noted that if people disagree with the reach of Section 230 as the courts have interpreted it, the right remedy is for Congress, not the Supreme Court, to rewrite the law.

"When they passed it, they set this balance and said not that they didn't believe there wouldn't be harmful content, but they believed on balance the creation of opportunities and forums for people to speak, for the growth of the internet and development of a tool that became central to our lives, commerce, political expression — that was what they valued more," Mackey said. "Congress is free to rewrite that balance."

A new creator economy

In the 27 years since Section 230 became law, the explosive growth of the internet has fueled a multi-billion-dollar industry of independent online creators who rely on large tech platforms to reach new audiences and monetize their content.

In Morton's case, her YouTube channel has allowed her to expand beyond her office in Santa Monica, California, and reach patients around the country, including in areas where mental health resources may be scarce.

"The ability for me to get over a million views on YouTube means that I'm able to reach so many more people, and mental health information isn't held behind a paywall," she said.

Alex Su, a lawyer by training who runs the TikTok account LegalTechBro, first began sharing content on LinkedIn in 2016 as a way to drive awareness of his employer, a technology company. After building up a following of lawyers and others in the legal industry on LinkedIn, Su began experimenting with TikTok in 2020.

His TikTok videos, which touch on insider experiences of working at a law firm, resonated with other lawyers and people with ties to the profession. Su said LinkedIn's recommendation system has been instrumental in helping him reach his target audience and market his company's services.

"These algorithms let me go viral among people who can relate to my jokes," he told CBS News. "If I put this type of content in front of a general audience, they probably wouldn't find it as funny."

Internet companies and supporters of Section 230 note the law has allowed new and emerging companies to grow into industry leaders without incurring significant litigation costs fending off frivolous claims.

Su, an early adopter of LinkedIn and TikTok for those in the legal field, noted that creators are often quick to take advantage of new platforms, where they can reach new audiences.

"I think it's no accident that there are these shifts where new entrants come in and you can take advantage of it as a content creator because then you can go viral on that platform with a new audience quickly," he said. "Without those different platforms, I would not have been able to grow in the way that I did."

Few clues from the court

The Supreme Court has given little indication of how it may approach Section 230. Only Justice Clarence Thomas has written about lower courts' interpretations of the legal shield.

"Courts have long emphasized non-textual arguments when interpreting [Section] 230, leaving questionable precedent in their wake," Thomas wrote in a 2020 statement urging the court to consider whether the law's text "aligns with the current state of immunity enjoyed by internet platforms."

The Supreme Court could issue a ruling that affirms how lower courts have interpreted Section 230, or it could narrow the law's immunity.

But internet companies warned the court that limiting the scope of Section 230 could drastically change how they approach content posted to their sites. Facing a greater risk of costly litigation and fewer protections, companies may be more cautious about allowing potentially problematic content, permitting only material that has been vetted and poses little legal risk.

"If you're concerned about censorship, the last thing you want is a legal regime that is going to punish platforms for keeping things online," Mackey said. "It's going to be increased censorship, more material will be taken down, a lot won't make it alone in the first place." 

A decision from the Supreme Court is expected by the summer.
