To grasp the phenomenal scale of YouTube: consider that people spend 1 billion hours watching videos on it, every day. It is the most used social network in the U.S. More queries are typed into the website's search bar than anywhere else online except Google, which owns YouTube.
But the site has come under increasing scrutiny, accused of propagating white supremacy, peddling conspiracies and profiting from it all. The company recently agreed to pay a record $170 million to settle allegations that it targeted children with ads. YouTube is being forced to concentrate on cleansing the site.
We visited the company's headquarters in San Bruno, California, to meet Susan Wojcicki, the 51-year-old CEO in charge of nurturing the site's creativity, taming the hate and handling the chaos.
Susan Wojcicki: We have 500 hours of video uploaded every single minute to YouTube.
Lesley Stahl: Fi-- say that again.
Susan Wojcicki: So we have 500 hours of video uploaded every minute to YouTube.
Lesley Stahl: That is breathtaking.
Susan Wojcicki: It, it is, it is. We have a lot of video.
And a lot of influence on our lives, and how we pass our time.
Over a billion people listen to music on YouTube every month: it's the planet's top music site. There's a children's channel with over 44 billion views.
Lesley Stahl: Do you let your children watch YouTube, including the young ones?
Susan Wojcicki: So I allow my younger kids to use YouTube Kids, but I limit the amount of time that they're on it. I think too much of anything is not a good thing. But there's a lot you can learn on YouTube. I think about how YouTube in many ways is this global library. You wanna see any historical speech, you could see it. You want to be able to learn a language--
Lesley Stahl: Make a soufflé?
Susan Wojcicki: --wanna laugh, you just wanna see something funny. A soufflé! Oh, yeah, cooking. Cooking's a great example.
So's watching people binge eat. A growing number of American adults are turning to it for their news, sports, medical information. It's now mankind's largest "how to" collection: how to tie a tie, tie the knot, or speak Thai.
The site has produced whole new pastimes where millions watch strangers open boxes, whisper, sleep. YouTube's artificial intelligence algorithms keep recommending new videos so users watch more and more and more.
Wojcicki invited us to the weekly all-staff meeting. She's surprisingly down-to-earth for one of the most powerful people in Silicon Valley, where her trajectory started in an unlikely way.
Susan Wojcicki: I owned a garage. And I was worried about covering the mortgage. So I was willing to rent my garage to any student. But then two students appeared. One was named Sergey Brin. The other was named Larry Page. They are the founders of Google.
Lesley Stahl: Yes, they are.
Susan Wojcicki: But at the time they were just students. They looked like any other students.
Larry and Sergey ended up hiring her as their first marketing manager: she was Google employee 16. As the company grew, so did her role and so did her family. She has five children. Google bought YouTube on her recommendation, for over $1.6 billion, and eight years later she became CEO, with a mandate to make it grow and make it profitable. And she did. Its estimated worth is $160 billion.
YouTube makes most of its money from ads, splitting revenue with people who create all kinds of videos. From do-it-yourself lessons to hip-hop lessons. The more popular ones can become multimillion dollar entrepreneurs.
YouTube also makes money from political ads, a thorny issue because some of them have been used to spread lies on social media.
Lesley Stahl: Facebook is facing a lot of controversy because it refuses to take down a President Trump ad about Biden which is not true. Would you run that ad?
Susan Wojcicki: So that is an ad that, um, right now would not be a violation of our policies.
Lesley Stahl: Is it on YouTube right now?
Susan Wojcicki: It has been on YouTube.
Lesley Stahl: Can a politician lie on YouTube?
Susan Wojcicki: For every single video I think it's really important to look at it. Politicians are always accusing their opponents of lying. That said, it's not okay to have technically manipulated content that would be misleading. For example, there was a video uploaded of Nancy Pelosi. It was slowed down just enough that it was unclear whether or not she was in her full capacity because she was speaking in a slower voice.
Susan Wojcicki: The title of the video actually said drunk, had that in the title. And we removed that video.
Lesley Stahl: How fast did you remove it?
Susan Wojcicki: Very fast.
But not completely. We just did a search and there it was, still available. The company keeps trying to erase the purported name of the impeachment whistleblower, but that too is still there, which raises doubts about its systems' ability to cleanse the site.
In the 2016 election cycle, YouTube failed to detect Russian trolls, who posted over 1,100 videos, almost all meant to influence African-Americans, like this video.
Video: Please don't vote for Hillary Clinton. She's not our candidate. She's a f---ing old racist b----.
YouTube is an "open platform" meaning anyone can upload a video, and so the site has been used to spread disinformation, vile conspiracies, and hate. This past March, a white supremacist livestreamed his killing of dozens of Muslims in Christchurch, New Zealand. He used Facebook, but for the next 24 hours copies of that footage were uploaded on YouTube tens of thousands of times.
Susan Wojcicki: This event was unique because it was really a made-for-Internet type of crisis. Every second there was a new upload. And so our teams around the world were working on this to remove this content. We had just never seen such a huge volume.
Lesley Stahl: I can only imagine when you became CEO of YouTube that you thought, "Oh, this is gonna be so fun. It's "people are uploading wonderful things like--
Susan Wojcicki: Funny cat videos.
Lesley Stahl: --funny. And look at what we're talking about here. Are you worried that these dark things are beginning to define YouTube?
Susan Wojcicki: I think it's incredibly important that we have a responsibility framework, and that has been my number one priority. We're removing content that violates our policies. We removed, just in the last quarter, 9 million videos.
Lesley Stahl: You recently tightened your policy on hate speech.
Susan Wojcicki: Uh-huh.
Lesley Stahl: Why-- why'd you wait so long?
Susan Wojcicki: Well, we have had hate policies since the very beginning of YouTube. And we--
Lesley Stahl: But pretty ineffective.
Susan Wojcicki: What we really had to do was tighten our enforcement of that to make sure we were catching everything and we use a combination of people and machines. So Google as a whole has about 10,000 people that are focused on controversial content.
Lesley Stahl: I'm told that it is very stressful to be looking at these questionable videos all the time. And that there's actually counselors to make sure that there aren't mental problems with the people who are doing this work. Is that true?
Susan Wojcicki: It's a very important area for us. We try to do everything we can to make sure that this is a good work environment. Our reviewers work five hours of the eight hours reviewing videos. They have the opportunity to take a break whenever they want.
Lesley Stahl: I also heard that these monitors, reviewers, sometimes, they're beginning to buy the conspiracy theories.
Susan Wojcicki: I've definitely heard about that. And we work really hard with all of our reviewers to make sure that, you know, we're providing the right services for them.
Susan Wojcicki showed us two examples of how hard it is to determine what's too hateful or violent to stay on the site.
Susan Wojcicki: So this is a really hard video to watch.
Lesley Stahl: Really hard.
Susan Wojcicki: And as you can see, these are prisoners in Syria. So you could look at it and say, "Well, should this be removed, because it shows violence, it's graphic." But it's actually uploaded by a group that is trying to expose the violence.
So she left it up. Then she showed us this World War II video.
Lesley Stahl: I mean it's totally historical footage that you would see on the History Channel.
But she took it down.
Lesley Stahl: Why?
Susan Wojcicki: There is this word down here that you'll see, 1418.
1418 is code used by white supremacists to identify one another.
Susan Wojcicki: For every area we work with experts, and we know all the hand signals, the messaging, the flags, the songs, and so there's quite a lot of context that goes into every single video to be able to understand what are they really trying to say with this video.
The struggle for Wojcicki is policing the site, while keeping YouTube an open platform.
Susan Wojcicki: You can go too far and that can become censorship. And so we have been working really hard to figure out what's the right way to balance responsibility with freedom of speech.
But the private sector is not legally beholden to the First Amendment.
Lesley Stahl: You're not operating under some-- freedom of speech mandate. You get to pick.
Susan Wojcicki: We do. But we think there's a lot of benefit from being able to hear from groups and underrepresented groups that otherwise we never would have heard from.
But that means hearing from people with odious messages about gays, women and immigrants.
Wojcicki explained that videos are allowed as long as they don't cause harm: but her definition of "harm" can seem narrow.
Susan Wojcicki: So if you're saying, "Don't hire somebody because of their race," that's discrimination. And so that would be an example of something that would be a violation against our policies.
Lesley Stahl: But if you just said, "White people are superior" by itself, that's okay.
Susan Wojcicki: And nothing else, yes.
But that is harmful in that it gives white extremists a platform to indoctrinate.
And what about medical quackery on the site? Like claims that turmeric can reverse cancer, that bleach cures autism, or that vaccines cause it.
Once you watch one of these, YouTube's algorithms might recommend you watch similar content. But no matter how harmful or untruthful, YouTube can't be held liable for any content, due to a legal protection called Section 230 of the Communications Decency Act.
Lesley Stahl: The law under 230 does not hold you responsible for user-generated content. But in that you recommend things, sometimes 1,000 times, sometimes 5,000 times, shouldn't you be held responsible for that material, because you recommend it?
Susan Wojcicki: Well, our systems wouldn't work without recommending. And so if--
Lesley Stahl: I'm not saying don't recommend. I'm just saying be responsible for when you recommend so many times.
Susan Wojcicki: If we were held liable for every single piece of content that we recommended, we would have to review it. That would mean there'd be a much smaller set of information that people would be finding. Much, much smaller.
She told us that earlier this year, YouTube started re-programming its algorithms in the U.S. to recommend questionable videos much less and point users who search for that kind of material to authoritative sources, like news clips. With these changes Wojcicki says they have cut down the amount of time Americans watch controversial content by 70%.
Lesley Stahl: Would you be able to say to the public: we are confident we can police our site?
Susan Wojcicki: YouTube is always going to be different than something like traditional media where every single piece of content is produced and reviewed. We have an open platform. But I know that I can make it better. And that's why I'm here.
Produced by Shachar Bar-On. Associate producer, Natalie Jimenez Peel. Broadcast associate, Maria Rutan.