YouTube is choosing not to take down videos claiming that Donald Trump won the 2020 presidential election, even as misinformation about the outcome continues to spread on the platform. The decision highlights what digital media critics call a more permissive approach to misinformation at YouTube compared with social media peers Facebook and Twitter.
"President Trump won four more years in the office last night. North Carolina, Florida, Pennsylvania, Michigan, Wisconsin are all Trump's," says a woman in the video, which resembles a newscast. "This has been a decisive victory for Trump."
YouTube added a box below the video noting the election results are not final and including a link to a Google search on the vote counts. (Google, like YouTube, is owned by Alphabet.) On Saturday, YouTube updated the box to say that the Associated Press had declared Biden the winner of the election.
On Tuesday afternoon, a Google video search of the term "Biden loses" turned up one video in the top slot titled, "Breaking News!!! Joe Biden Loses President-Elect Status." In the video, a man on a news set tells his viewers to "spread the news everywhere, tweet, email, text message everybody." The chyron on the bottom of the screen reads, "Binden (sic) loses president-elect status."
Despite the videos' claims, former Vice President Joe Biden remains projected to win the electoral votes needed to be declared the 46th president of the United States, narrowly defeating President Trump. CBS News projected Biden would win Pennsylvania on Saturday, putting him over the top four days after Election Day. On Tuesday, Mr. Biden's lead in Pennsylvania had risen to nearly 50,000 votes. No major news outlet has reversed its call that Mr. Biden got more votes than Mr. Trump in Pennsylvania.
CBS News had already declared Mr. Biden the winner in Michigan and Wisconsin, while Mr. Trump remains ahead in North Carolina.
Violence prohibited, but wrong information OK
The video claiming Mr. Trump won was posted to YouTube on Wednesday by One America News Network, which has nearly 1 million subscribers. The video has gotten more than 460,000 views, with 38,000 "thumbs up."
Yet another video on YouTube titled, "They are trying to Steal the Vote," asks viewers to send money to a PayPal account called magafirstnews. The person in the video says he has been banned for 30 days on Facebook.
In a statement to CBS MoneyWatch, a spokesperson for YouTube said its policy prohibits videos that include "harassment, hate speech and incitement to violence." Videos that solely make claims about the outcome of the election, however, don't fall into any of those categories and are permitted.
"Content making claims about the outcome of the election that also violates our policies are removed," the spokesperson said.
Declining to remove the video falsely anointing Mr. Trump as the winner of the election could fuel unsubstantiated claims that the vote was stolen and lead to civil unrest, media critics of YouTube's decision said.
"It is crazy," said Dipayan Ghosh, a professor at Harvard University who studies digital media. "It's misinformation that can have real-world impact. This is the type of video that can lead to violence. YouTube should have taken it down."
YouTube did take down a video by Steve Bannon earlier this week, after the former White House adviser's remarks violated its policies.
"Our teams are working around the clock to quickly remove election-related videos that violate our policies," said Ivy Choi, a YouTube spokesperson. "Expressing views on the outcome of a current election or process of counting votes is allowed under our policy."
Nonetheless, Choi also said YouTube has taken actions to limit the views of the One America News video and similar videos by not surfacing them "high up" in searches or in YouTube's own recommendations. Asked why the videos were appearing high up in Google searches, Choi referred CBS News to Google. A Google spokesperson did not return a request for comment.
One America News Network did not respond to a request for comment.
Other social media companies have been more assertive in policing election misinformation on their platforms, Harvard's Ghosh said. Earlier this week, Facebook removed a group from its website after some of its more than 350,000 members called for violence and others made false claims about voter fraud during the election.
Twitter added a warning label to at least six tweets by Mr. Trump alleging either voter fraud or that his opponents were trying to "steal" the election. The label reads, "Some or all of the content shared in this Tweet is disputed and might be misleading about an election or other civic process." Twitter users were also prevented from liking, replying to or copying a link to the tweets.
TikTok has also been quick to remove videos revealed to be fake, including one of a poll worker burning ballots, according to Angelo Carusone, president of nonprofit watchdog Media Matters for America.
"What they are trying to do is deal with problems as they come up ad hoc," Carusone said of YouTube. "That's a decision that has business advantages, but the risk is that on reflection it will look like they didn't do enough to stop claims that the election was stolen."
Earlier this year, Apple co-founder Steve Wozniak sued YouTube for not taking down videos that used his image to scam users out of bitcoin. YouTube has declined to comment on the Wozniak suit, but a spokesperson for the company told CBS MoneyWatch in July it has taken down millions of videos that violated its policies.
"The reason you go after misinformation is that it has real-world harm," Carusone said. Videos like those posted by One America News Network that say Mr. Trump won "not only create doubt in the election, but also raise the temperature and increase the likelihood of disruptions and violence," he added.
YouTube's Choi noted that its policies are "on par with or in some instances more aggressive" than its social media rivals in dealing with misinformation and said "the way we are treating the OAN video is consistent with other tech platforms."
"Our hate and harassment policies clearly state that conspiracy theory content used to justify real-world violence is not allowed on our platform," Choi said. "We've been very clear publicly that over the last few years, we've heavily invested in the systems and policies that allow us to quickly remove violative content, raise up authoritative content and reduce borderline content."
Editor's note: This story was updated to make clearer the sources of criticism of YouTube's misinformation policies and include additional comment from YouTube to those criticisms.