
YouTube had to disable comments during congressional hearing about white nationalism

YouTube turned off comments on live streams of a congressional hearing about hate crimes and white nationalism after the streams were flooded with hate speech and racist remarks. Less than an hour into the hearing, YouTube announced that all comments would be disabled.

"Hate speech has no place on YouTube. We've invested heavily in teams and technology dedicated to removing hateful comments / videos," YouTube said on its press Twitter account, YouTubeInsider. "Due to the presence of hateful comments, we disabled comments on the livestream of today's House Judiciary Committee hearing."

Ironically, the hearing largely focused on the spread of hateful and extremist content on social media platforms. And one of the accounts live-streaming the hearing was Red Ice, a white nationalist channel based in Sweden, which raised money as the hearing went on.

The hearing included representatives from Google (which owns YouTube) and Facebook, as well as conservative activist Candace Owens. There was also emotional testimony from Dr. Mohammad Abu-Salha, whose son-in-law and two daughters were murdered in a 2015 shooting in Chapel Hill, North Carolina. The alleged killer had posted online comments criticizing Islam and other religions, though he did not face hate crime charges.

"They are seared into my memory": Father describes murder of two daughters and son-in-law 05:14

The committee's chair, Rep. Jerrold Nadler, D-New York, had harsh words for social media companies in his opening statement.

"These platforms are utilized as conduits to spread vitriolic hate messages into every home and country," Nadler said. "Efforts by media companies to counter this surge have fallen short, and social network platforms continue to be used as ready avenues to spread dangerous white nationalist speech."

Later on, Nadler referenced a Washington Post story about the hateful comments posted online during the hearing. "This just illustrates part of the problem we're dealing with," he said.

The Facebook and Google reps said their companies were taking steps to crack down on extremist posts, including changes to algorithms that recommend content to users.

Lawmakers called the hearing after the mass shootings at two mosques in Christchurch, New Zealand, which left 50 people dead. The gunman live-streamed one of the rampages on Facebook and posted a manifesto referencing extremist movements and memes online.

Some New Zealand companies called for a Facebook advertising boycott after the terror attack, and tech giants like YouTube, Twitter and Reddit came under fire for delays in stopping the spread of the shooting video.  
