
Yuval Noah Harari on the power of data, artificial intelligence and the future of the human race

When Yuval Noah Harari published his first book, "Sapiens," in 2014 about the history of the human species, it became a global bestseller and turned the little-known Israeli history professor into one of the most popular writers and thinkers on the planet. But when we met with Harari in Tel Aviv this summer, it wasn't our species' past that concerned him, it was our future. Harari believes we may be on the brink of creating not just a new, enhanced species of human, but an entirely new kind of being - one that is far more intelligent than we are. It sounds like science fiction, but Yuval Noah Harari says it's actually much more dangerous than that.

Anderson Cooper: You said, "We are one of the last generations of Homo sapiens. Within a century or two, Earth will be dominated by entities that are more different from us than we are different from chimpanzees."

Yuval Noah Harari: Yeah. 

Anderson Cooper: What the hell does that mean? That freaked me out.

Yuval Noah Harari: You know, we will soon have the power to re-engineer our bodies and brains, whether it is with genetic engineering or by directly connecting brains to computers, or by creating completely non-organic entities, artificial intelligence which is not based at all on the organic body and the organic brain. And these technologies are developing at breakneck speed.

Anderson Cooper: If that is true, then it creates a whole other species. 

Yuval Noah Harari: This is something which is way beyond just another species. 

Yuval Noah Harari is talking about the race to develop artificial intelligence, as well as other technologies like gene editing - which could one day enable parents to create smarter or more attractive children - and brain-computer interfaces that could result in human/machine hybrids.

Anderson Cooper: What does that do to a society? It seems like the rich will have access whereas others won't.

Yuval Noah Harari: One of the dangers is that we will see in the coming decades a process of greater inequality than in any previous time in history, because for the first time, it will be real biological inequality. If the new technologies are available only to the rich or only to people from a certain country, then Homo sapiens will split into different biological castes, because they really have different bodies and different abilities.

Harari has spent the last few years lecturing and writing about what may lie ahead for humankind.

Harari at Davos in 2018: In the coming generations we will learn how to engineer bodies and brains and minds. 

He has written two books about the challenges we face in the future -- "Homo Deus" and "21 Lessons for the 21st Century" -- which along with "Sapiens" have sold more than 35 million copies and been translated into 65 languages. His writings have been recommended by President Barack Obama, as well as tech moguls Bill Gates and Mark Zuckerberg.

Anderson Cooper: You raise warnings about technology. You're also embraced by a lot of folks in Silicon Valley. 

Yuval Noah Harari: Yeah. 

Anderson Cooper: Isn't that sort of a contradiction?

Yuval Noah Harari: They are a bit afraid of their own power. They have realized the immense influence they have over the world, over the course of evolution, really. And I think that spooks at least some of them. And that's a good thing. And this is why they are, to some extent, open to listening.

Anderson Cooper: You started as a history professor. What do you call yourself now?

Yuval Noah Harari: I'm still a historian. But I think history is the study of change, not just the study of the past; it covers the future as well.

Harari got his Ph.D. in history at Oxford, and lives in Israel, where the past is still very present. He took us to an archeological site called Tel Gezer.

Harari says cities like this were only possible because about 70,000 years ago our species - Homo sapiens - experienced a cognitive change that helped us create language, which then made it possible for us to cooperate in large groups and drive Neanderthals and all other less cooperative human species into extinction. 

Harari fears we are now the ones at risk of being dominated, by artificial intelligence. 

Yuval Noah Harari: Maybe the biggest thing that we are facing is really a kind of evolutionary divergence. For millions of years, intelligence and consciousness went together. Consciousness is the ability to feel things, like pain and pleasure and love and hate. Intelligence is the ability to solve problems. But computers or artificial intelligence, they don't have consciousness. They just have intelligence. They solve problems in a completely different way than us. Now in science fiction, it's often assumed that as computers become more and more intelligent, they will inevitably also gain consciousness. But actually, it's much more frightening than that: in a way, they will be able to solve more and more problems better than us without having any consciousness, any feelings.

Anderson Cooper: And they will have power over us?

Yuval Noah Harari:  They are already gaining power over us. 

Some lenders routinely use complex artificial intelligence algorithms to determine who qualifies for loans, and global financial markets are moved by decisions made by machines analyzing huge amounts of data in ways even their programmers don't always understand.

Harari says the countries and companies that control the most data will in the future be the ones that control the world.

Yuval Noah Harari: Today in the world, data is worth much more than money. Ten years ago, you had these big corporations paying billions and billions for WhatsApp, for Instagram. And people wondered, "Are they crazy? Why do they pay billions to get this application that doesn't produce any money?" And the reason why? Because it produced data.

Anderson Cooper: And data is the key?

Yuval Noah Harari: The world is increasingly kind of cut up into spheres of data collection, of data harvesting. In the Cold War, you had the Iron Curtain. Now we have the Silicon Curtain between the USA and China. And where does the data go? Does it go to California, or does it go to Shenzhen and to Shanghai and to Beijing?

Harari is concerned the pandemic has opened the door for more intrusive kinds of data collection, including biometric data.

Anderson Cooper: What is biometric data?

Yuval Noah Harari: It's data about what's happening inside my body. What we have seen so far is corporations and governments collecting data about where we go, who we meet, what movies we watch. The next phase is surveillance going under our skin.

Anderson Cooper: I'm wearing, like, a tracker that tracks my heart rate, my sleep. I don't know where that information is going.

Yuval Noah Harari: You wear the KGB agent on your wrist willingly.

Anderson Cooper: And I think it's benefiting me.

Yuval Noah Harari: And it is benefiting you. I mean, the whole thing is that it's not just dystopian. It's also utopian. I mean, this kind of data can also enable us to create the best health care system in history. The question is what else is being done with that data? And who supervises it? Who regulates it? 

Earlier this year, the Israeli government gave its citizens' health data to Pfizer to get priority access to its vaccine. The data did not include individual citizens' identities.

Anderson Cooper: So what does Pfizer want the data of all Israelis for?

Yuval Noah Harari: Because to develop new medicines, new treatments, you need the medical data. Increasingly, that's the basis for medical research. It's not all bad.

Harari has been criticized for pointing out problems without offering solutions, but he does have some ideas about how to limit the misuse of data. 

Yuval Noah Harari: One key rule is that if you get my data, the data should be used to help me and not to manipulate me. Another key rule is that whenever you increase surveillance of individuals, you should simultaneously increase surveillance of the corporations and governments and the people at the top. And the third principle is: never allow all the data to be concentrated in one place. That's the recipe for a dictatorship.

Harari speaking at The Future of Education: Netflix tells us what to watch and Amazon tells us what to buy. Eventually within 10 or 20 or 30 years such algorithms could also tell you what to study at college and where to work and whom to marry and even whom to vote for. 

Without greater regulation, Harari believes we are at risk of becoming what he calls "hacked humans."

Anderson Cooper: What does that mean?

Yuval Noah Harari: To hack a human being is to get to know that person better than they know themselves, and based on that, to increasingly manipulate you. This outside system has the potential to remember everything. Everything you ever did. And to analyze and find patterns in this data and to get a much better idea of who you really are. I came out as gay when I was 21. It should've been obvious to me when I was 15 that I'm gay. But something in the mind blocked it. Now, if you think about a teenager today, Facebook can know that they are gay or Amazon can know that they are gay long before they do, just based on analyzing patterns.

Anderson Cooper: And based on that, you can tell somebody's sexual orientation?

Yuval Noah Harari: Completely. And what does it mean if you live in Iran or if you live in Russia or in some other homophobic country and the police know that you are gay even before you know it?

Anderson Cooper: When people think about data they think about companies finding out what their likes and dislikes are but the data that you're talking about goes much deeper than that?

Yuval Noah Harari: Like, think 20 years ahead, when the entire personal history of every journalist, every judge, every politician, every military officer is held by somebody in Beijing or in Washington. Your ability to manipulate them is like nothing before in history.

Harari lives outside Tel Aviv with his husband, Itzik Yahav. They have been together for nearly 20 years. It was Yahav who read Harari's lecture notes for a history course and convinced him to turn them into his first book – "Sapiens."   

Itzik Yahav: I read the lessons. I couldn't stop talking about it. For me, it was clear that it could be a huge bestseller.

Yahav is now Harari's agent, and together they started a company called Sapienship. They are creating an interactive exhibit that will take visitors through the history of human evolution and challenge them to think about the future of mankind. 

Harari also just published the second installment of a graphic novel based on "Sapiens." And he's teaching courses at Israel's Hebrew University in ethics and philosophy for computer scientists and bioengineers. 

Harari teaching: When people write code, they are reshaping politics and economics and ethics, and the structure of human society.

Anderson Cooper: When I think of coders and engineers, I don't think of philosophers and poets. 

Yuval Noah Harari: It's not the case now, but it should be the case, because they are increasingly solving philosophical and poetical riddles. If you're designing, you know, a self-driving car, the self-driving car will need to make ethical decisions. Like suddenly, a kid jumps in front of the car. And the only way to prevent running over the kid is to swerve to the side and be hit by a truck. And your owner, who is asleep in the backseat, might be killed. You need to tell the algorithm what to do in this situation. So you need to actually solve the philosophical question of who to kill.

Last month the United Nations suggested a moratorium on artificial intelligence systems that seriously threaten human rights until safeguards are agreed upon, and advisers to President Biden are proposing what they call a "bill of rights" to guard against some of the new technologies. Harari says just as Homo sapiens learned to cooperate with each other many thousands of years ago, we need to cooperate now. 

Yuval Noah Harari: Certainly. Now, we are at the point when we need global cooperation. You cannot regulate the explosive power of artificial intelligence on a national level. I'm not trying to kind of prophesy what will happen. I'm trying to warn people about the most dangerous possibilities, in the hope that we will do something in the present to prevent them.

Produced by Denise Schrier Cetta. Associate producer, Katie Brennan. Broadcast associate, Annabelle Hanflig. Edited by Stephanie Palewski Brumbach.
