
Supreme Court grapples with online First Amendment rights as social media teems with misinformation


As big tech firms wrestle with how to keep false and harmful information off their social networks, the Supreme Court is wrestling with whether platforms like Facebook and Twitter, now called X, have the right to decide what users can say on their sites. 

The dispute centers on a pair of laws passed in the red states of Florida and Texas over the question of First Amendment rights on the internet. The Supreme Court is considering whether the platforms are like newspapers, which have free speech rights to make their own editorial decisions, or whether they're more like telephone companies, which merely transmit everyone's speech.

If the laws are upheld, the platforms could be forced to carry hate speech and false medical information, the very content most big tech companies have spent years trying to remove through teams of content moderators. But conservatives claim that, in the process, the companies have engaged in a conspiracy to suppress their speech.

As in this case: a 2022 tweet from Congresswoman Marjorie Taylor Greene falsely claiming there were "extremely high amounts of COVID vaccine deaths."

Twitter eventually banned Greene's personal account for "multiple violations" of its COVID policy.

Facebook and YouTube also removed or labeled posts they deemed "misinformation."

Confronted with criticism from conservatives like Congressman Jim Jordan that the social media companies were censoring their views, and looking to cut costs, the platforms began downsizing their fact-checking teams.


So today, social media is teeming with misinformation. Like these posts suggesting tanks are moving across the Texas-Mexico border. But it's actually footage from Chile.

These are AI-generated images of – well, see for yourself.

With social media moderation teams shrinking, a new target is the academic researchers who study misinformation, many of whom began working closely with the platforms after evidence of Russian interference online in the 2016 election.

Lesley Stahl: Are researchers being chilled? 

Kate Starbird: Absolutely. 

Kate Starbird is a professor at the University of Washington, a former professional basketball player, and a leader of a misinformation research group created ahead of the 2020 election. 

Kate Starbird: We were very specifically looking at misinformation about election processes, procedures, and election results. And if we saw something about that, we would pass it along to the platforms if we thought it violated their-- one of their policies. 

Here's an example: a November 2020 tweet saying that election software in Michigan "switched 6,000 votes from Trump to Biden." 

The researchers alerted Twitter, which then decided to label it with a warning.

Lesley Stahl: I understand that some of the researchers, including you, have-- had some threats against them, death threats.

Kate Starbird: I have received one. Sometimes they're threats with something behind them. And sometimes they are just there to make you nervous and uncomfortable. And it's hard to know the difference.

Lesley Stahl: This campaign against you is meant to discredit you. So we won't believe you.

Kate Starbird: Absolutely. It's interesting that the people that pushed voter fraud lies are some of the same people that are trying to discredit researchers that are trying to understand the problem.

Lesley Stahl: Did your research find that there was more misinformation spread by conservatives?

Kate Starbird: Absolutely. I think-- not just our research, research across the board, looking at the 2020 election found that there was more misinformation spread by people that were supporters of Donald Trump or conservatives. And the events of January 6th kind of underscore this.

Kate Starbird: The folks climbing up the Capitol Building were supporters-- of Donald Trump. And they were-- they were misinformed by these false claims. And-- and that motivated those actions.


Ohio Republican Congressman Jim Jordan is chairman of the House Judiciary Committee. 

Lesley Stahl: So how big a problem is mis and disinformation on the web?

Rep. Jim Jordan: Well, I'm sure there's some. But I think, you know-- our concern is the bigger problem of the attack on First Amendment liberties. 

Congressman Jordan's Judiciary Committee produced a report that concluded there's a "censorship industrial complex" where the federal government and tech companies colluded with academic researchers to disproportionately silence conservatives, which Kate Starbird vigorously denies.

But Congressman Jordan says her group unfairly flagged posts like this tweet by Newt Gingrich: 

"Pennsylvania democrats are methodically changing the rules so they can steal the election" 

He complains that government officials put pressure on social media companies directly –

Rep. Jim Jordan: A great example, 36 hours into the Biden administration, the-- the Biden White House sends-- a email to Twitter and says, "We think you should take down this tweet ASAP." 

Just a call alone from the government, he says, can be unnerving. 

Rep. Jim Jordan: You can't have the government say, "Hey, we want you to do X," government who has the ability to regulate these private companies, government which has the ability to tax these private companies.

He says that White House email to Twitter involved a tweet from…

Rep. Jim Jordan: Robert F. Kennedy Jr. and everything in the tweet was true.

That tweet implied falsely that baseball legend Hank Aaron's death was caused by the COVID vaccine. 

Lesley Stahl: Did they take it down?

Rep. Jim Jordan: Turned out they didn't. Thank goodness. 

And that post is still up.

Kate Starbird says the social media platforms also often ignored the researchers' suggestions.

Kate Starbird: The statistics I've seen are just for the Twitter platform. But I-- my understanding is-- is that they've responded to about 30% of the things that we sent them. And I think the-- on the majority of those, they put labels. 

Lesley Stahl: But just a third.

Kate Starbird: Just a third, yeah.

Lesley Stahl: And do you suspect that Facebook was the same? 

Kate Starbird: Oh, yeah.

Katie Harbath: These platforms have their own First Amendment rights. 

Katie Harbath spent a decade at Facebook, where she helped develop its policies around election misinformation. When she was there, she says, it was not unusual for the government to ask Facebook to remove content, which she says is proper as long as the government isn't being coercive.


Katie Harbath: Conservatives are alleging that the platforms were taking down content at the behest of the government which is not true. The platforms made their own decisions. And many times we were pushing back on the government.

Lesley Stahl: Can we talk about a specific case? It's of Nancy Pelosi. It's a doctored tape where she's-- she looks drunk. 

This was the video of then-House Speaker Pelosi posted to Facebook in 2019, slowed down to make it seem that she was slurring her words.

Lesley Stahl: Did it come down?

Katie Harbath: It did not. 

Lesley Stahl: Why?

Katie Harbath: Because it didn't violate the policies that they had.

Lesley Stahl: So did she put pressure on the company to take it down?

Katie Harbath: She was definitely not pleased.

Lesley Stahl: Is that a yes?

Katie Harbath: Yes. And it really damaged the relationship that the company had with her.

The conservatives' campaign faced a setback at the Supreme Court on Monday when a majority of the justices seemed poised to reject their effort to limit attempts by the government to influence social media.

The court is deciding, in separate cases, whether the platforms are like news organizations with a First Amendment right to control who and what information appears on their sites.

Congressman Jordan argues that the tech companies shouldn't remove most of what they call "misinformation."

Rep. Jim Jordan: I think you let the American people, respect the American people, their common sense, to figure out what's accurate, what isn't.

Lesley Stahl: Well, what about this idea that they-- the 2020 election was stolen? You think that these companies should allow people to say that and individuals can make up their own mind and that there should be--

Rep. Jim Jordan: I think the American people are smart. Look-- I've not said that. What I've said is there were concerns about the 2020 election. I think Americans agree with that. 

Lesley Stahl: No they don't--

Rep. Jim Jordan: You don't think they think there were concerns with the 2020 election?

Lesley Stahl: Most people don't question the result. That's all I'm saying. They don't question whether--

Rep. Jim Jordan: Fair enough.

Lesley Stahl: Biden won or not. Right? Right? Most people don't question

Rep. Jim Jordan: Oh, OK. No--

Lesley Stahl: The outcome.

Rep. Jim Jordan: Right.

X essentially did what Jordan proposes. After Elon Musk took over in 2022, most of its fact checkers were fired, and the site is now rife with trash talk and lies. You would hardly know that this clip, said to be footage from Gaza, is really from a video game. Eventually X users added a warning label.

In this post, pictures of real babies killed in Israeli strikes are falsely dismissed as dolls.

Darrell West: The toothpaste is out of the tube and we have to figure out how to deal with the resulting mess.


Darrell West, a senior fellow in technology innovation at the Brookings Institution, says the clash over "what is true" is fraying our institutions and threatening democracies around the world.

Darrell West: Half of the world is voting this year and the world could stick with democracy or move toward authoritarianism. The danger is disinformation could decide the elections in a number of different countries.

In the U.S., he says, the right wing has been flooding the internet with reams of misleading information in order to confuse the public. And he's alarmed by the campaign to silence the academic researchers, who have had to spend money and time on demands from Jim Jordan's Judiciary Committee. 

Lesley Stahl: There are people who make the accusation that going after these researchers, misinformation researchers, is tantamount to harassment. And that your goal really is to chill the research. 

Rep. Jim Jordan: I find it interesting that you use the word "chill," because in-- in effect, what they're doing is chilling First Amendment free speech rights. When, when they're working in an effort to censor Americans, that's a chilling impact on speech.

Lesley Stahl: They say what you're doing, they do, is a violation of their First Amendment right.

Rep. Jim Jordan: So us pointing out, us doing our constitutional duty of oversight of the executive branch-- and somehow w-- (LAUGH) we're censoring? That makes no sense.

Lesley Stahl: We Americans, we're looking at the same thing and seeing a different truth.

Rep. Jim Jordan: We might see different things, I don't-- I don't think you can see a different truth, because truth is truth.

Lesley Stahl: Okay. The-- the researchers say they're being chilled. That's their truth.

Rep. Jim Jordan: Yeah.

Lesley Stahl: You're saying they're not. So what's the truth? 

Rep. Jim Jordan: They can do their research. God bless 'em, do all the research you want. Don't say we think this particular tweet is not true-- and-- or-- or--

Lesley Stahl: Well, that's their First Amendment right to say that. 

Rep. Jim Jordan: Well, they can say it, but they can't take it down.

Lesley Stahl: Well, they can't take it down and they don't. They just send their information to the companies. 

Rep. Jim Jordan: But when they're coordinating with government, that's a different animal.

Lesley Stahl: Okay, well, of course, they deny they're coordinating.

We just went round and round. 

Starbird says she and her team feel intimidated by the conservatives' campaign, so while they will continue releasing their research reports on misinformation, they will no longer send their findings to the social media platforms.

Produced by Ayesha Siddiqi. Associate producer, Kate Morris. Broadcast associates, Wren Woodson and Aria Een. Edited by Matthew Lev.
