A protected right? Free speech and social media

A decade ago this very month, in Cairo's Tahrir Square, social media was being praised. Its role as an organizing tool during the pro-democracy rallies had many calling the Arab Spring the "Facebook Revolution" instead.

But for all its glowing promise, we quickly learned social media is only as good as how it's used.

"The major difference between now and then is, more than ever before, our experiences on social media are determined by hidden decisions made by the social media companies themselves," said Ramesh Srinivasan, who was in Tahrir Square back then researching how Twitter and Facebook were giving voice to the voiceless.

"It used to be something more of an open pipe," he said.

Srinivasan is now an author and a professor in UCLA's Department of Information Studies. "What we are seeing when we go online is likely to be that which is most sensational or inflammatory. They're predicting whatever is most likely to grab people's attention."

Keeping us engaged on social media is how Big Tech makes money, and the past four years have proven that lies and conspiracy theories are, unfortunately, more engaging than the truth.

"And the danger in that is, what?" asked correspondent Lee Cowan.

"The danger is it's gonna present us with an extremely distorted view of reality," Srinivasan replied. "The fringe becomes the new normal."

The Trump presidency was a case study. His reality TV roots taught him that controversy gets ratings, and he used his social media feeds in much the same way. No one has a bigger "bully pulpit" than the President of the United States, and no one before Mr. Trump used it with such abandon online.

The attack on the Capitol changed all that.

After years of defending his presence on their platforms, Facebook, Twitter, YouTube and other social media giants booted Mr. Trump, claiming he'd incited a riot. 

Amazon removed an entire site from its servers: Parler, which had become the place favored by many conservatives.

"De-platformed" was a word we learned a lot about this past week.

"I think Big Tech has made a terrible mistake, and very, very bad for our country," President Trump said Tuesday. "They shouldn't be doing it. But, uhh, there's always a counter-move when they do that."

While many applauded the move, the precedent of shutting out the leader of the free world made others uncomfortable, too. It's a huge power to wield, and one currently concentrated in the hands of a very few.

It's not a new argument; the CEOs of the major tech companies have been called on the carpet before.

"Who the hell elected you, and put you in charge of what the media are allowed to report and what the American people are allowed to hear?" Sen. Ted Cruz asked tech executives during an October 2020 hearing on the Communications Decency Act.

In 2019 Rep. Alexandria Ocasio-Cortez asked Facebook CEO Mark Zuckerberg, "So, you won't take down lies or you will take down lies? I think it's just a pretty simple yes or no."

But claiming Big Tech is running afoul of the First Amendment by de-platforming those it deems harmful may be missing the larger point.

Cowan asked, "Is kicking someone off any of these social media sites an infringement of their free speech rights?"

"No, it isn't," replied Daphne Keller, who directs the Program on Platform Regulation at Stanford University's Cyber Policy Center. "They are not subject to the First Amendment. They are not the government."

When it comes to digital speech and the First Amendment, she said, it gets messy pretty fast.

Cowan said, "This isn't just a free speech argument on the part of users; it's also a free speech argument on the part of the providers as well, right?"

"People who want to sue platforms, and force them to carry speech they don't want to, have a double First Amendment problem," Keller said. "First of all, those people don't have a First Amendment claim against the platforms, and second of all, the platforms do have a First Amendment argument against being forced to carry speech they disagree with."

But what if the argument over regulation was reframed – less about speech, and more about changing how Big Tech exposes us to that speech?

Yaël Eisenstat used to work for Facebook as one of the heads of election integrity, where she saw firsthand just what these companies do with all that content.

"This idea that it's just this free flow of information is false. It's a curated flow of information," she said. "It's a business model that is predicated on gathering as much of our human behavioral data as possible, to create these little boxes of who we are, to then target us with ads."

That's all fine if we're shopping for sneakers, she said. But those same algorithms apply to our politics, too. We'll willingly follow ideas that pop up down the rabbit hole, and those who want their messages to spread know the more controversial, the better.

"I don't think that Mark Zuckerberg set out with the idea that, 'I wanna create a platform where the most outrageous, salacious, hate-filled speech wins.' I don't think that was his goal," Eisenstat said. "But instead of holding the platform responsible for what somebody posts, it's the tools that I want them held responsible for, not the actual speech on the platform, except for, of course, if the speech breaks the law."

"But it sounds like what you're saying, though, is that'd be changing the business model, pretty much," said Cowan.

"One hundred percent."

Big Tech has promised more transparency, and better enforcement of its own rules when it comes to the spread of disinformation and misinformation.

Facebook removed more hate speech this year than ever before; Twitter, the same. Even TikTok is being more proactive.

But that will likely not be enough going forward.

Cowan asked Ramesh Srinivasan, "Can we trust them to do this kind of regulation on their own, though?"

"No. We should not be trusting Twitter or any private company to magically serve the public interest. I think, if anything, the last four years have taught us that we can't do that."   

A year before the end of his second term, President Bill Clinton talked about the challenges of regulating the internet: "That's sorta like trying to nail Jell-O to the wall."

That was more than two decades ago. Times change, but the value of good stewardship doesn't.

"This discussion is about how we wanna live, how we wanna be, as a country and as a people," said Srinivasan. "It's a discussion about our humanity, at the end of the day. I do believe that you can force people to tie their actions to beliefs that might be a little more virtuous than their mere bottom lines."

    
Story produced by John Goodwin. Editor: Ben McCormick. 
