LONDON -- Physicist Stephen Hawking has warned that new technologies will likely bring about "new ways things can go wrong" for human survival.
When asked how the world will end - "naturally" or whether man would destroy it first - Hawking said that increasingly, most of the threats humanity faces come from progress made in science and technology. They include nuclear war, catastrophic global warming and genetically engineered viruses, he said.
Hawking made the comments while recording the BBC's annual Reith Lectures on Jan. 7. His lecture, on the nature of black holes, was split into two parts and will be broadcast on radio on Jan. 26 and Feb. 2.
The University of Cambridge professor said that a disaster on Earth - a "near certainty" in the next 1,000 to 10,000 years - will not spell the end of humanity, because by that time humans are likely to have spread out into space and to other stars.
"However, we will not establish self-sustaining colonies in space for at least the next hundred years, so we have to be very careful in this period," he joked, provoking laughter from the audience.
"We are not going to stop making progress, or reverse it, so we have to recognize the dangers and control them. I'm an optimist, and I believe we can," he added.
Hawking has been outspoken in the past about his concern that artificial intelligence could pose a threat if the growing field is not handled properly. In a Reddit "Ask Me Anything" session last year, he wrote: "The real risk with AI isn't malice but competence. A super-intelligent AI will be extremely good at accomplishing its goals, and if those goals aren't aligned with ours, we're in trouble."