Last Updated Apr 17, 2017 5:33 PM EDT
Facebook is speaking out against a video, posted on its platform, that shows the killing of an elderly Ohio man.
“This is a horrific crime and we do not allow this kind of content on Facebook. We work hard to keep a safe environment on Facebook, and are in touch with law enforcement in emergencies when there are direct threats to physical safety,” Facebook said in a statement.
The video, which appears to show suspect Steve Stephens fatally shooting Robert Godwin Sr., was taken down after more than two hours, drawing criticism of Facebook’s response time. Facebook has a video review team on call 24 hours a day, seven days a week, according to the company.
While some criticized Facebook for not taking the video down faster, the company issued a statement Monday afternoon saying it didn’t receive a report about the shooting video until “more than an hour and 45 minutes after it was posted.”
“We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better,” the company said.
The graphic video raises questions about whether Facebook might change its policies on video. As it stands now, any Facebook user can post video, no strings attached. Users can then flag content they find objectionable, which Facebook employees review for possible removal based on the company’s policies.
“I think it’s entirely possible that this incident could change the game,” Wired editor-in-chief Nicholas Thompson told “CBS This Morning” on Monday.
“What I think will happen now is Facebook will have to, A) look at their algorithms to try to figure out whether this can be stopped, and B) think about the culture; there is a real culture of violence that has perpetrated itself inside of video sharing and social media platforms, and can that be changed?”
If it decides to revisit its policies, Facebook is in a sensitive position. While the company says it is dedicated to banning all content that violates the company’s standards, Facebook also hopes to leave space for victims and witnesses of violence to share what’s unfolding in front of them.
Video has emerged as a powerful tool for both citizens and activists to act as reporters of major events as they happen in real time. Last year, a Minnesota woman, Diamond Reynolds, live-streamed her fiancé Philando Castile’s last moments of life after he was shot by police, with her four-year-old daughter watching from the backseat of the car. The video instantly went viral around the world, shining a harsh light on police violence. It temporarily disappeared from the social network because of a “technical glitch,” according to Facebook, and was restored later with a warning about its graphic nature.
In a post last year, Facebook explained its internal logic for reviewing videos.
“One of the most sensitive situations involves people sharing violent or graphic images of events taking place in the real world. In those situations, context and degree are everything. For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video,” the company said.
Facebook’s own sense of responsibility has evolved over the years, Thompson said: it no longer sees itself as just a platform, but now views itself as an editorial publisher, too.
“If you think about the video that was posted yesterday and imagine it hadn’t been shot by the perpetrator, but it had been shot by someone witnessing, who wanted to use the live video feed as a way to try to stop it, that’s an entirely different case,” Thompson said. “But it’s the exact same video. That’s why these issues are complicated.”