Balance between fighting misinformation and protecting speech on social media gets more complicated


As the U.S. 2024 presidential election gets underway, social media companies are caught in an unenviable position: trying to stop the spread of misinformation while also facing more and more allegations of censorship.

Claims of censorship have, in some cases, stymied efforts to combat false election news shared online. The problem is not unique to the U.S.: high-stakes elections are being held in dozens of countries around the world this year, and some worry that misinformation could influence the results.

"Half of the world is voting this year and the world could stick with democracy or move toward authoritarianism," Darrell West, a senior fellow of technology innovation at the Brookings Institution, said. "The danger is, disinformation could decide the elections in a number of different countries."

How combating misinformation online has changed in recent years

Academic researchers began working closely with social media platforms after evidence surfaced of Russian interference in the 2016 election.  

Big tech companies have wrestled with keeping false and harmful information off their platforms for years. They've suspended and banned accounts. The companies have removed or labeled posts deemed "misinformation," sometimes adding warnings.

Darrell West, a senior fellow of technology innovation at the Brookings Institution. (60 Minutes)

Fighting misinformation became a central focus for the platforms as the COVID-19 pandemic began. Robert F. Kennedy Jr. was temporarily banned from Instagram after posting false claims about coronavirus vaccines. On Twitter, Rep. Marjorie Taylor Greene was suspended after she claimed COVID vaccines and masks didn't work.

Misinformation continued to spread online during the 2020 election. 

"We were very specifically looking at misinformation about election processes, procedures and election results," said Kate Starbird, a professor at the University of Washington and a leader of the Election Integrity Partnership, a group she helped launch in 2020. "If we saw something about that, we would pass it along to the platforms if we thought it violated one of their policies."

Researchers flagged a November 2020 tweet saying that election software in Michigan switched 6,000 votes from Trump to Biden. Twitter labeled the post with a warning.

Starbird said her research has found that more misinformation is spread by conservatives. 

"Not just our research, research across the board looking at the 2020 election found that there was more misinformation spread by people that were supporters of Donald Trump or conservatives," Starbird said. "And the events of January 6th kind of underscore this."

Kate Starbird, a professor at the University of Washington and head of the Center for an Informed Public. (60 Minutes)

But some researchers like Starbird, who says she received a death threat for her work on misinformation, have stopped communicating with social media platforms. 

Confronted with criticism from conservatives who claimed their views were being censored, and facing pressure to cut costs, social media platforms began downsizing their fact-checking teams.

Why some in Congress say combating misinformation is stifling freedom of speech

House Judiciary Committee chairman Jim Jordan, a Republican from Ohio, argues that tech companies shouldn't remove most of what they call misinformation. 

"I think you let the American people, respect the American people, their common sense, to figure out what's accurate, what isn't," Jordan said in an interview.

While Jordan acknowledges there is misinformation online, he sees a bigger problem in what he views as an attack on First Amendment liberties. His committee last year produced a report that concluded there was a "censorship industrial complex" where the federal government and tech companies colluded with academic researchers to disproportionately silence conservatives — an allegation that Starbird vigorously denies. 

Jordan said her group has unfairly flagged posts, such as one by Newt Gingrich, who in 2020 tweeted: "Pennsylvania democrats are methodically changing the rules so they can steal the election."

Jordan also complains that government officials put pressure on social media companies directly. 

Rep. Jim Jordan. (60 Minutes)

"You can't have the government say, 'Hey, we want you to do X,'" Jordan said. "Government who has the ability to regulate these private companies, government which has the ability to tax these private companies."

Katie Harbath, who spent a decade at Facebook working on the company's policies around election misinformation, said the platforms have their own First Amendment rights.

She said that while she was at Facebook, it was not unusual for the government to ask the company to remove content, something she considered appropriate as long as the government was not being coercive.

"Conservatives are alleging that the platforms were taking down content at the behest of the government, which is not true," Harbath said. "The platforms made their own decisions."

Many times, the companies pushed back. In 2019, a doctored video of then-House Speaker Nancy Pelosi was posted online, slowed down to make it seem as if she was slurring. The video stayed up because it didn't violate Facebook's policies, Harbath said. 

"She was definitely not pleased," Harbath said of Pelosi. 

Court battles over misinformation and free speech online

The conservatives' campaign faced a setback at the Supreme Court on Monday when a majority of the justices seemed poised to reject their effort to limit attempts by the government to influence social media.

In other cases, the court will look at laws passed in Texas and Florida to determine whether tech companies are like news organizations, with a First Amendment right to control who and what information appears on their sites, or like telephone companies, entities merely transmitting speech.

If those state laws are upheld, the platforms could be forced to carry hate speech and false medical information, some warn. West, the senior fellow of technology innovation at the Brookings Institution, said the clash over what's true is fraying our institutions and threatening democracies around the world.

"The toothpaste is out of the tube and we have to figure out how to deal with the resulting mess," West said.
