When Facebook CEO Mark Zuckerberg took the stage last week to make his biggest speech of the year, he did something he probably wasn't expecting to do just a few days earlier: He acknowledged Facebook's role in a tragedy.
Two days before Zuckerberg's speech, a man shared a video of himself shooting and killing an elderly man in Cleveland. A few minutes later, he took to Facebook Live, the social network's livestreaming service, to confess to the crime. Then, about an hour before Zuckerberg took the stage at F8, Facebook's annual software conference, police said the shooter had killed himself following a nationwide manhunt.
"We have a lot of work [to do] and we will keep doing all we can to prevent tragedies like this from happening," Zuckerberg said.
Facebook has promised a review of the reporting tools people use to flag those violent videos, as well as a continuation of its effort to develop artificial intelligence to try to prevent those videos from spreading.
But apparently those changes can't come fast enough: Less than two weeks later, tragedy has struck again. On Monday evening, a man in Thailand used Facebook Live to broadcast the killing of his baby daughter before taking his own life. The video was on the site for approximately 24 hours before it was taken down.
These slayings join a growing list of incidents that point to the social network's inability to fully control the service. Facebook has nearly 2 billion users per month. That's nearly 2 billion people with the ability to tap a button, point a phone's camera at something, and broadcast anything on Facebook for friends to see.
"Unlike conventional posts, livestreaming at massive scale is virtually impossible to police," said Peter Csathy, the founder and chairman of Creatv Media, a firm that advises media and technology companies. "We are now seeing terrible extremes as a result."
Facebook, of course, isn't the only tech company dealing with woes related to violent content. Twitter and Google-owned YouTube have similar problems. But Facebook, with its massive user base and its perch as the king of social media, has become the poster child for the firestorm.
Facebook didn't respond to a request for comment on the killings in Thailand, nor to questions about the future of Facebook Live or how the company can improve safeguards on the technology.
It's a delicate situation. Zuckerberg has staked the future of Facebook on videos posted to the site. When the company expanded the features for livestreaming last year, Zuckerberg said we're entering a "golden age of video."
"I wouldn't be surprised if you fast-forward five years and most of the content that people see on Facebook and are sharing on a day-to-day basis is video," he said at the time.
Not a guarantee
Facebook's push into live video has had profound reverberations for the social network.
The company has gotten flak for taking down clips with social significance, like a livestream showing the aftermath of the shooting of a black man at a traffic stop in July. After the clip was pulled because of a "technical glitch," it was restored following an outcry that its significance to social movements like Black Lives Matter justified its availability, with a warning.
On the flip side, Facebook has faced outrage when it fails to swiftly remove broadcasts documenting horrific crimes, like the videos depicting the Cleveland and Thai slayings.
Facebook has also made a big bet on making sure people always have their phone cameras ready, waiting to post to Facebook. Last week, moments after Zuckerberg sent his condolences to the family and friends of the Cleveland victim, he launched into a demo of a new Facebook augmented reality platform. The aim is to let outside software developers create new ways to overlay digital images on whatever your camera lens sees in the real world.
Though Facebook didn't directly admit it, the platform was partly a response to the popularity of rival Snapchat, which has revolutionized how people use augmented reality camera filters. In other words, Facebook's reliance on camera phones and videos is a big deal for its business.
As for the objectionable videos, Justin Osofsky, Facebook's vice president of global operations, said the company relies on "thousands" of people around the world to review flagged content. Facebook users report "millions" of items a week, in more than 40 languages.
But even as Facebook works on better AI and on streamlining the process for people to report graphic videos, there's no guarantee something like the Cleveland or Thai killings won't happen again on the platform.
"Nothing is 100 percent certain," Csathy said. "No matter how much resources are devoted to fight those pandemics, something will always get through."
This article originally appeared on CNET.com.