The Cambridge professor said that humans had to be "careful" for the next 100 years to ensure survival. His statement follows a 2015 warning on the risk of using artificial intelligence in conflicts.
World-renowned physicist Stephen Hawking has once again issued a warning on technological and scientific threats to human survival after singling out artificial intelligence in 2015.
Hawking said that scientific breakthroughs and new technologies will likely contribute to "new ways things can go wrong" for human survival, citing genetically engineered viruses and nuclear war.
Last year, Hawking joined some 1,000 technology industry leaders in signing an open letter warning that developing artificial intelligence for use in "killer robots" could trigger an unprecedented arms race.
Responding to a question during the recording of the Reith Lectures, produced by British broadcaster BBC, the 74-year-old Cambridge University professor noted, however, that a global disaster "would not be the end of the human race."
"Although the chance of a disaster to planet earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next 1,000 or 10,000 years," Hawking said.
"By that time, we should have spread out into space, and to other stars, so a disaster on earth would not mean the end of the human race."
The physicist added that humans would not establish "self-sustaining colonies in space" for at least another 100 years, warning that "we have to be very careful in this period."
ls/kms (AP, dpa)