Facebook founder Mark Zuckerberg explained Friday why Facebook has left up President Trump's posts about the Minneapolis protests. "Our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies," Zuckerberg wrote.
Mr. Trump posted on Twitter and Facebook early Friday that Minneapolis protesters were "thugs" and wrote "when the looting starts, the shooting starts." The second tweet in the thread was flagged by Twitter as "glorifying violence."
Mr. Trump's full Facebook post remained visible one day later. Zuckerberg wrote that he has been "struggling with how to respond to the President's tweets and posts all day. Personally, I have a visceral negative reaction to this kind of divisive and inflammatory rhetoric."
But, Zuckerberg wrote, he is "responsible for reacting not just in my personal capacity but as the leader of an institution committed to free expression."
Later Friday, Mr. Trump tried to clarify his posts, writing that "looting leads to shooting ... I don't want this to happen, and that's what the expression put out last night means."
He also said he wasn't aware of the origin of the words, which are attributed to a Miami police chief in 1967 during a period of unrest in that city.
Protests have erupted nationwide over the death of George Floyd, who died in Minneapolis after a white officer kneeled on his neck. Four officers in the incident have been fired, and on Friday, Derek Chauvin was taken into custody and charged with third-degree murder and manslaughter.
"Although the post had a troubling historical reference, we decided to leave it up because the National Guard references meant we read it as a warning about state action, and we think people need to know if the government is planning to deploy force," Zuckerberg wrote.
The Verge reported on Friday afternoon that Facebook employees have criticized the company's decision to leave up Mr. Trump's posts about Minneapolis, which Twitter has also flagged.
"I have to say I am finding the contortions we have to go through incredibly hard to stomach," one employee wrote in a comment, according to The Verge. "All this points to a very high risk of a violent escalation and civil unrest in November and if we fail the test case here, history will not judge us kindly."
Facebook has faced scrutiny for allowing misinformation to spread on its platform during the 2016 election.
Zuckerberg told O'Donnell last week that Facebook would remove posts that spread misinformation about the coronavirus. For example, it took down a widely debunked video.
"There's harmful misinformation, which is the type of thing that puts people in imminent physical risk," Zuckerberg told O'Donnell. "So, if you're telling someone that social distancing doesn't work, or that something that's proven to be a cure when it isn't, we want to take that off our services completely. There's other misinformation which is not generally going to cause physical harm, it's just stuff that's wrong. We want to stop it from going viral, and there we work with independent fact-checkers, which has led to us showing about 50 million warning labels on content that people have seen."