
Technology

'Autonomous weapons': Tech leaders warn UN of killer robots

With significant advances in robotics and AI, technology leaders have called for safeguards to avoid "destabilizing effects." They warned that "once this Pandora's box is opened, it will be hard to close."

More than 100 leaders in the artificial intelligence (AI) and robotics industries signed an open letter on Monday warning of a "third revolution" in warfare, timed to coincide with a scheduled meeting of the UN's Convention on Certain Conventional Weapons to discuss the subject.

"As companies building the technologies in artificial intelligence and robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm," said the letter.


The letter's signatories included major industry figures such as SpaceX CEO Elon Musk and Mustafa Suleyman, who co-founded Google's DeepMind and heads its applied AI unit.

The signatories urged the UN to work hard at "finding means to prevent an arms race in these weapons, to protect civilians from their misuse and to avoid destabilizing effects of these technologies."

'Pandora's Box'

This is not the first time industry leaders have warned about the disastrous consequences of "Lethal Autonomous Weapon Systems," or intelligent weapons. In 2015, more than 1,000 leading scientists warned against so-called "killer robots."

World-renowned physicist Stephen Hawking has described AI as the "worst mistake ever made," while Musk has called it the greatest conceivable threat to our existence.


"These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways," said the open letter published Sunday.

"We do not have long to act. Once this Pandora's box is opened, it will be hard to close."
