OAKLAND (AP) — Facebook said it will restrict the right-wing conspiracy movement QAnon and will no longer recommend that users join groups supporting it, although the company isn't banning it outright.
Facebook said Wednesday it is removing QAnon groups and accounts that discuss potential violence, as well as a variety of U.S.-based militia and anarchist groups that support violence. But the company will continue to allow people to post material that supports these groups, so long as they don't violate policies against hate speech, abuse and other provocations.
QAnon groups have flourished on Facebook in recent months, and experts say social media has aided the rise of the fringe movement. Twitter announced a similar crackdown recently and TikTok has banned QAnon altogether from its searches, along with related terms such as "WWG1WGA," shorthand for the group's motto "Where We Go One, We Go All."
Google said it has removed tens of thousands of QAnon-related videos from its YouTube service and banned hundreds of channels for violating its policies, but it also does not ban QAnon outright.
The QAnon conspiracy theory is centered on the baseless belief that President Donald Trump is waging a secret campaign against enemies in the "deep state" and a child sex trafficking ring run by satanic pedophiles and cannibals. For more than two years, followers have pored over tangled clues purportedly posted online by a high-ranking government official known only as "Q." Some extreme supporters of Trump adhere to the theory, often likened to a cult.
The conspiracy theory emerged in a dark corner of the internet but has recently crept into mainstream politics. Trump has retweeted QAnon-promoting accounts and its followers flock to his rallies wearing clothes and hats with QAnon symbols and slogans.
Last week, Marjorie Taylor Greene, a House candidate who openly supports QAnon, won her Republican primary in Georgia. She's part of a growing list of candidates who have expressed support for QAnon. Lauren Boebert, another candidate who has expressed support for the movement, recently upset a five-term congressman in a Republican primary in Colorado.
Facebook said it will only remove groups and accounts outright if they discuss potential violence, including in veiled language. It said it is not banning QAnon outright because the group does not meet criteria necessary for the platform to designate it a "dangerous organization." But it is expanding this policy to address the movement because it has "demonstrated significant risks to public safety."
But experts say this doesn't go far enough.
"Facebook's actions today may ultimately come to be viewed as 'too little, too late,'" said Ethan Porter, a professor of media and public affairs at George Washington University. "It will probably make a dent. But will it solve the problem? Not at all. At this point, the most fervent QAnon believers are not only entrenched on the platform, but likely heading to the halls of Congress. Yet this may give them trouble with new recruits."
An FBI bulletin last May warned that conspiracy theory-driven extremists have become a domestic terrorism threat. The bulletin specifically mentioned QAnon. Earlier last year, the Southern Poverty Law Center warned that the movement is becoming increasingly popular with anti-government extremists.
Facebook's "limited action now is an insufficient one given the long established fact that the group encourages violence, spreads false information that causes real world harm, and knows how to adapt to continue leveraging the Facebook platform," said Cindy Otis, a former CIA analyst and vice president of analysis at Alethea Group, a company that helps combat disinformation.
Facebook will still restrict the material it doesn't remove, initially by no longer recommending it. For instance, when people join a QAnon group, Facebook will not recommend similar groups to join. Neither will it suggest QAnon references in searches or, in the near future, allow it in ads.
Otis said Facebook's decision not to actively push users "down the rabbit hole of QAnons" is a good step as far as it goes, but still insufficient.
"Keeping only the most immediately dangerous content off the platform does little when you've already got QAnon believers running (and winning) for Congress," she said.
The social network said it has removed over 790 groups, 100 pages and 1,500 ads tied to QAnon on Facebook and has blocked over 300 hashtags across Facebook and Instagram. There are 1,950 other groups and 440 pages Facebook says it has identified that remain on the platform but face restrictions, along with 10,000 accounts on Instagram.
For militia organizations and those encouraging riots, including some who may identify as antifa, the company said it has removed over 980 groups, 520 pages and 160 ads from Facebook.
"These movements and groups evolve quickly, and our teams will follow them closely and consult with outside experts so we can continue to enforce our policies against them," Facebook said.
Social media, including Facebook, has clearly aided QAnon's rise, even though most Americans have likely never heard of QAnon, according to a March report by the Pew Research Center.
"I don't want to overstate QAnon's influence among the mass public — it's widely disliked and widely disbelieved," Porter said. "But Facebook has helped it net some true believers."
It's not clear whether any actions the companies are taking now will make up for earlier inaction.
"Clearly, QAnon at times has been dangerous and violent," Porter said. "But even if that alone isn't sufficient to ban QAnon — and I'm not sure it should be — the very top of Facebook should think seriously about what kind of public square they have built, and what they want their legacy to be."
© Copyright 2020 The Associated Press. All Rights Reserved. This material may not be published, broadcast, rewritten or redistributed.