How "prebunking" misinformation works

How "prebunking" misinformation works

This week on 60 Minutes, correspondent Lesley Stahl reported on the ongoing debate over how social media companies, including Meta, X and Google, moderate harmful content, like false medical information and hate speech, on their platforms.

Some critics say these companies are not doing enough to combat the proliferation of this content on their sites. At the same time, some politicians, like Rep. Jim Jordan, are accusing the companies of colluding with the government to silence conservative voices online.

60 Minutes spoke with Sander van der Linden, a professor of social psychology and director of the Social Decision-Making Laboratory at Cambridge University.

Van der Linden's research centers on how people interact with misleading or false information on social media, and why some end up believing things that are either half-true or completely false.

Van der Linden told Stahl that misinformation, content that is entirely false, exists on social media, but misleading information, like half-truths and biased narratives, is much more prevalent. 

The professor gave an example of misleading information that went viral during the COVID-19 pandemic: an article published in the South Florida Sun Sentinel under the highly misleading headline, "A 'healthy' doctor died two weeks after getting a COVID-19 vaccine; CDC is investigating why."

The article was seen by tens of millions of people on Facebook, where it became the most-viewed article of the first quarter of 2021. Skeptics and conspiracy theorists online used it to suggest COVID-19 vaccines didn't work and could even kill people.

"It's highly misleading because it's suggesting that the doctor died because of the COVID vaccine. But of course…still to this day there's been no evidence that the vaccine actually was the cause of death for this doctor," van der Linden explained.

Van der Linden's research and work at Cambridge are based on a theory called psychological inoculation: if you understand how misinformation can manipulate you before you see it, you are less likely to believe it.

He said one way to "inoculate" yourself against misinformation is through a process called "prebunking."

"By deconstructing it and refuting it in advance, people can actually build up psychological or mental antibodies, so they become more resistant to misinformation in the future," he told Stahl. 

Van der Linden and his colleagues at Cambridge have worked with a variety of partners, including social media platforms, government agencies, and public health organizations, to develop educational videos and video games that show people the common manipulative techniques used to propagate misinformation.

One game, called "Go Viral!," was released during the pandemic in partnership with the U.N.'s World Health Organization. It puts players in the shoes of a propagandist spreading lies about COVID-19 and vaccines. Users earn a higher score in the simulation by making false claims, citing made-up studies and phony experts, and writing posts with emotionally manipulative language.

Van der Linden's team at Cambridge also worked with tech giant Google to create videos that illustrate common misinformation tactics. The videos were viewed by millions of people on YouTube, placed where an ad would normally play before a video starts. 

One of those videos used a scene from "Star Wars: Episode III – Revenge of the Sith" to help explain the concept of a false dichotomy. Obi-Wan Kenobi is presented with a narrow set of choices, designed to make him choose sides when more options are available.

To gauge the project's effectiveness, people who saw that video and a group who didn't were given an online quiz. Those who had seen it were, on average, better at identifying manipulative tactics.

But lately, according to van der Linden, the political debate around content moderation has had a chilling effect on social media companies' willingness to implement new "prebunking" initiatives and expand existing ones.

"Because they fear that there's going to be critiques and that users are not going to like it because of the politicization of doing anything about misinformation," he told 60 Minutes.

"This is the lowest hanging fruit. This is just empowering people to identify manipulation. And even that is, you know, to some extent controversial for them."

The video above was produced by Will Croxton. It was edited by Matthew Lev, Will Croxton and Sarah Shafer. 

"Bad News" was made by The University of Cambridge/Tilt Studio/Gusmanson Design

 "Go Viral!" was made by The University of Cambridge/DROG/Tilt Studio/Gusmanson Design with support from the World Health Organization

 "Prebunking Manipulation Techniques: False Dichotomies" was made by The University of Cambridge/The University of Bristol/Google Jigsaw
