Stephen Hawking warns artificial intelligence could be threat to human race

Stephen Hawking has warned that artificial intelligence could one day "spell the end of the human race."

Speaking to the BBC, the eminent theoretical physicist said the artificial intelligence developed so far has been useful but expressed fears of creating something that far exceeded human abilities.

"It would take off on its own, and re-design itself at an ever increasing rate," Hawking said. "Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."

Hawking, who has the motor neuron disease ALS, spoke using a new system developed by Intel and Swiftkey. Their technology, already in use in a smartphone keyboard app, learns how the professor thinks and then proposes words he might want to use next.
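The learning described here — observing which words tend to follow which, then proposing likely continuations — is the core idea behind predictive keyboards. As a rough illustration only (this is not SwiftKey's or Intel's actual system, and the function names are invented for this sketch), a minimal bigram model captures the mechanism:

```python
# Toy sketch of next-word prediction: count which words follow which
# in sample text, then suggest the most frequent followers.
# Illustrative only -- real predictive keyboards use far richer models.
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count how often each word follows another in the text."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def suggest(model, word, k=3):
    """Return up to k words most often seen after `word`."""
    return [w for w, _ in model[word.lower()].most_common(k)]

corpus = (
    "the universe is vast the universe is old "
    "the theory is elegant the theory is simple"
)
model = train_bigrams(corpus)
print(suggest(model, "the"))  # words most often following "the"
print(suggest(model, "is"))   # words most often following "is"
```

A system personalized to one writer, as described above, would simply train on that writer's own past text, so its suggestions reflect his habitual phrasing.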

"I expect it will speed up my writing considerably," he said.

Hawking praised the "primitive forms" of artificial intelligence already in use today, though he stopped short of drawing a connection to the machine learning that powers the predictive capabilities of his own speaking device.

Hawking's comments were similar to those made recently by SpaceX and Tesla founder Elon Musk, who called AI a threat to humanity.

"With artificial intelligence, we are summoning the demon," Musk said during an October centennial celebration of the MIT Aeronautics and Astronautics Department. Musk had earlier sent a tweet saying that AI is "potentially more dangerous than nukes."

More broadly, Hawking told the BBC that he saw plenty of benefits from the Internet, but cautioned that it, too, had a dark side.

He called the Internet a "command center for criminals and terrorists," adding, "More must be done by the Internet companies to counter the threat, but the difficulty is to do this without sacrificing freedom and privacy."

