"Killer robots" could threaten basic human rights, activists warn


Drones have been used to carry out military strikes for nearly a century, with the first version of a pilotless plane tested during World War I. All along, operators on the ground have controlled these unmanned aircraft. But that may soon change drastically. Thanks to advances in engineering, technology and artificial intelligence, drones could be capable of operating and firing weaponry autonomously in the not-too-distant future.

Advancing the technology may offer the benefits of rapid response time, lower costs, and reduced physical and emotional risks to human soldiers. But these so-called "killer robots" are also raising concerns around the world.

"Fully autonomous weapons represent the step beyond remote-controlled armed drones," writes Human Rights Watch (HRW) in a report released today in conjunction with Harvard Law School's International Human Rights Clinic. "Unlike any existing weapons, these robots would identify and fire on targets without meaningful human intervention."

According to HRW, the governments of the United States, Israel, China, Russia, South Korea and the United Kingdom are believed to be investigating the potential of autonomous drones, with the U.S. and U.K. already sending devices on test flights.

The HRW report argues that such autonomous weapons would undermine basic human rights and the principle of human dignity.

"As inanimate machines, fully autonomous weapons could truly comprehend neither the value of individual life nor the significance of its loss," said the report. "Allowing them to make determinations to take life away would thus conflict with the principle of dignity."

Another fear is that autonomous drones would not be able to distinguish between combatants and civilians.

There are also concerns around accountability for attacks carried out by autonomous drones. "International law mandates accountability in order to deter future unlawful acts and punish past ones, which in turn recognizes victims' suffering," said the report. "It is uncertain, however, whether meaningful accountability for the actions of a fully autonomous weapon would be possible. The weapon itself could not be punished or deterred because machines do not have the capacity to suffer."

Experts from the 117 member nations of the U.N. Convention on Conventional Weapons will meet in Geneva, Switzerland, this week to start considering regulations for the technology. They are expected to address ethical, operational, and legal questions, among others.

The report adds to concerns expressed over the past year and a half. In May 2013, U.N. special rapporteur Christof Heyns called for a moratorium on "killer robots." In November, a Vatican representative addressed the issue during a debate at the U.N., predicting that questions surrounding the technology and its potential uses would "grow in relevance and urgency."

Furthermore, technology developed for military use often trickles down to civilian use. The Campaign to Stop Killer Robots, which HRW coordinates, raises fears that autonomous weapons could someday be used by police in law enforcement.

In the report, HRW calls for countries to pass laws against "killer robots" and support an internationally binding agreement to "prohibit the development, production, and use of fully autonomous weapons."
