A new report by the Center for Countering Digital Hate finds that "just 12 anti-vaxxers are responsible for almost two-thirds of anti-vaccine content circulating on social media platforms." The so-called "Disinformation Dozen" remain prominent figures on platforms like Facebook and Twitter, despite repeatedly violating their terms of service, according to the center.
"Living in full view of the public on the internet are a small group of individuals who do not have relevant medical expertise and have their own pockets to line, who are abusing social media platforms to misrepresent the threat of COVID and spread misinformation about the safety of vaccines," reads the report from the nonprofit organization, which works to disrupt "the spread of digital hate and misinformation."
The center identified the dozen "because they have large numbers of followers, produce high volumes of anti-vaccine content or have seen rapid growth of their social media accounts in the last two months." Among them are the head of a prominent anti-vaccine group; Dr. Joseph Mercola, who has made a fortune selling dietary supplements as alternatives to vaccines; and Ty and Charlene Bollinger, founders of "The Truth About Cancer."
"They're not just individuals... these are individuals who have behind them large companies that they themselves run, which pump out misinformation with the aim of persuading people not to follow the clinical guidance, and to instead buy their false cures, or to buy access to the information they claim is the truth about coronavirus and about vaccines," Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, told CBSN on Wednesday.
The center analyzed over 812,000 vaccine-related Facebook and Twitter posts shared between February 1 and March 16 of this year. Sixty-five percent of the anti-vaccine posts came from members of the "Disinformation Dozen."
"Anti-vaccine activists on Facebook, YouTube, Instagram and Twitter reach more than 59 million followers, making these the largest and most important social media platforms for anti-vaxxers," according to the report. The center also found that these accounts specifically target Black Americans.
The organization has called for Facebook, Instagram, YouTube and Twitter to deplatform the "Disinformation Dozen" as well as the organizations they are linked to. Three of the 12 have been comprehensively removed from at least one platform, but none have been removed from all.
"Facebook, Google and Twitter have put policies into place to prevent the spread of vaccine misinformation; yet to date, all have failed to satisfactorily enforce those policies," reads the report. "All have been particularly ineffective at removing harmful and dangerous misinformation about coronavirus vaccines, though the scale of misinformation on Facebook, and thus the impact of their failure, is larger."
A spokesman for Twitter told CBS News it had removed 22,400 tweets under its COVID-19 "misleading information policy" and "challenged" nearly 12 million accounts. Dani Lever, a spokesperson for Facebook, said the company has removed more than 2 million pieces of content since February.
Facebook's Mark Zuckerberg told lawmakers on the House Energy and Commerce Committee that his platform separates misinformation into categories, the most serious being what could cause "imminent physical harm." False claims about the coronavirus, or its vaccines, he said, could lead to someone getting sick, and therefore fall into that most serious category.
"That's the broad approach that we have … that sort of explains some of the differences between some of the different issues and how we approach them," Zuckerberg said.
That "broad approach" has been criticized by a group of attorneys general, who have urged Facebook and Twitter to "take immediate steps" to crack down on online "anti-vaxxer" falsehoods amid the ongoing effort to vaccinate the public against COVID-19.
In a letter to Zuckerberg and Twitter CEO Jack Dorsey on Wednesday, the attorneys general pressed the social media giants to fully "identify and enforce" the companies' terms of service to combat vaccine disinformation and misinformation.
"A small group of individuals use your platforms to downplay the dangers of COVID-19 and spread misinformation about the safety of vaccines," the group said, citing CCDH's report. "These individuals lack medical expertise and are often motivated by financial interests."
In addition to deplatforming the dozen, the center also recommends platforms establish a clear threshold for enforcement action, display corrective posts to users exposed to disinformation, add warning screens when users click links to misinformation sites, institute an Accountability API (application programming interface), and ban private and secret anti-vaccine Facebook Groups.
"(Their) content has the potential of harming human life," Ahmed told CBSN. "Not just the individuals themselves, but more cruelly, the people they love and our communities as a whole."