November 10, 2015
Ramp Up Action to Ban Killer Robots
Blinding Lasers Prohibition Offers Precedent
(Geneva, November 9, 2015) – Governments should agree to expand and formalize their international deliberations on fully autonomous weapons, with the ultimate aim of preemptively banning them, Human Rights Watch and the International Human Rights Clinic at Harvard Law School said in a joint report released today. These weapons, also known as lethal autonomous weapons systems or killer robots, would be able to select and attack targets without further human intervention.
The 18-page report, “Precedent for Preemption,” details why countries agreed to preemptively ban blinding laser weapons in 1995 and says that the process could be a model for current efforts to prohibit fully autonomous weapons. Countries participating in the annual meeting of the Convention on Conventional Weapons (CCW) will decide by consensus on November 13, 2015, whether to continue their deliberations on lethal autonomous weapons systems next year.
“Concerns over fully autonomous weapons have pushed them to the top of the international disarmament agenda, but countries need to pick up the pace of discussions,” said Bonnie Docherty, senior clinical instructor at Harvard Law School and senior arms division researcher at Human Rights Watch, which is a co-founder of the Campaign to Stop Killer Robots. “Governments can take direct action now with commitments to ban weapons with no meaningful human control over whom to target and when to attack.”
The report calls on countries to initiate a more robust process through creation of a group of governmental experts on fully autonomous weapons under the CCW.
Artificial intelligence experts, roboticists, and other scientists predict that fully autonomous weapons could be developed within years, not decades. The preemptive ban on blinding lasers, which is in a protocol attached to the conventional weapons treaty, shows that a prohibition on future weapons is possible.
“The prospect of fully autonomous weapons raises many of the same concerns as blinding lasers did two decades ago,” said Docherty, lead author of the new report exploring the history of the prohibition on lasers that would permanently blind their victims. “Countries should adopt the same solution by banning fully autonomous weapons before they reach the battlefield.”
The report shows that threats to the principles of humanity and dictates of public conscience, as well as notions of abhorrence and social unacceptability, helped drive countries to ban blinding lasers. Fully autonomous weapons present similar dangers.
Countries were further motivated by the risk of widespread proliferation of blinding lasers to parties that have little regard for international law, a risk echoed in discussions of fully autonomous weapons, Human Rights Watch and the Harvard Law School clinic said. As with blinding lasers 20 years ago, a ban on fully autonomous weapons could clarify and strengthen existing law without limiting the development of related legitimate technology.
The groups acknowledged notable differences in the specific legal problems and technological character of the two weapons but found that those differences made banning fully autonomous weapons even more critical.
In other publications, the Clinic and Human Rights Watch have elaborated on the challenges that fully autonomous weapons would face in complying with international humanitarian law and international human rights law and analyzed the lack of accountability that would exist for the unlawful harm caused by such weapons.
Several of the 121 countries that have joined the CCW – including the United States, United Kingdom, China, Israel, Russia, and South Korea – are developing weapons systems with various degrees of autonomy and lethality. The countries party to the treaty held nine days of informal talks on lethal autonomous weapons systems in 2014 and 2015, but they should now ramp up their deliberations, Human Rights Watch and the Harvard clinic said.
Docherty and Steve Goose, director of the arms division at Human Rights Watch, will present the report at a side event briefing at 2 p.m. on November 9 in Conference Room XI at the United Nations in Geneva. At the end of the week, Goose will assess the meeting’s decision on fully autonomous weapons, joined by other Campaign to Stop Killer Robots representatives, at a side event briefing at 1 p.m. on November 13 in Conference Room XI.
“Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition” is available at:
NOTE: Mana Azarmi, JD ’16, Federica du Pasquier, MA ’16, and Marium Khawaja, LLM ’16, contributed research to this report.
For more Human Rights Watch reporting on fully autonomous weapons, please visit:
For more information on the Campaign to Stop Killer Robots, please visit:
For more information, please contact:
In Geneva, Bonnie Docherty (English): +1-617-669-1636 (mobile); or email@example.com