Blog: Campaign to Stop Killer Robots
April 9, 2015
The “Killer Robots” Accountability Gap
Obstacles to Legal Responsibility Show Need for Ban
(Geneva, April 9, 2015) – Programmers, manufacturers, and military personnel could all escape liability for unlawful deaths and injuries caused by fully autonomous weapons, or “killer robots,” Human Rights Watch said in a report released today. The report was issued in advance of a multilateral meeting on the weapons at the United Nations in Geneva.
The 38-page report, “Mind the Gap: The Lack of Accountability for Killer Robots,” details significant hurdles to assigning personal accountability for the actions of fully autonomous weapons under both criminal and civil law. It also elaborates on the consequences of failing to assign legal responsibility. The report is jointly published by Human Rights Watch and Harvard Law School’s International Human Rights Clinic.
“No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party,” said Bonnie Docherty, senior Arms Division researcher at Human Rights Watch and the report’s lead author. “The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”
June 16, 2014
Posted by Cara Solomon
Last week, the UN Human Rights Council took a fresh look at fully autonomous weapons, or “killer robots.” Previous international debate had focused on the weapons’ ability to comply with laws of war; the Council, by contrast, examined the issue through the lens of international human rights law, which applies in times of peace as well as armed conflict. In this June 9 post originally published by JURIST, Senior Clinical Instructor Bonnie Docherty argued that killer robots threaten the most fundamental human rights.
Fully autonomous weapons, which could select and fire on targets without meaningful human intervention, have the potential to revolutionize the nature of warfare, bringing greater speed and reach to military operations. In the process, though, this emerging technology could endanger both civilians and soldiers.
Nations have been considering the multiple challenges these weapons would pose to the laws of war, also called international humanitarian law. But little attention has been given to the implications for human rights law. If these weapons were developed and used for policing, for example, they would threaten the most basic of these rights, including the right to life, the right to a remedy, and the principle of human dignity.
Fully autonomous weapons, also known as autonomous weapons systems or “killer robots,” do not yet exist, but research and technology in a number of countries are moving rapidly in that direction. Because these machines would have the power to determine when to kill, they raise a host of legal, ethical and scientific concerns. Human Rights Watch and Harvard Law School’s International Human Rights Clinic are advocating for a pre-emptive prohibition on fully autonomous weapons. The Campaign to Stop Killer Robots, a global coalition of 52 nongovernmental organizations coordinated by Human Rights Watch, is making the same call.
May 14, 2014
Posted by Joseph Klingler, JD '14
In Geneva today, the Clinic and Human Rights Watch released the latest in a series of publications calling for a preemptive ban on the development, production, and use of fully autonomous weapons. The weapons, also called “killer robots,” would be capable of selecting and firing upon targets without any meaningful human control.
The joint paper, entitled “Advancing the Debate on Killer Robots,” systematically rebuts 12 arguments that have been raised by critics of a ban. Its release coincides with a major international disarmament conference dedicated to fully autonomous weapons, being held at the UN in Geneva this week. More than 400 delegates from governments, international organizations, and civil society have gathered to discuss the weapons under the framework of the Convention on Conventional Weapons, a treaty that restricts weapons considered to cause unnecessary suffering or to have indiscriminate effects.
Clinical students Evelyn Kachaje, JD ’15, and Joseph Klingler, JD ’14, who along with Yukti Choudhary, LLM ’14 helped Senior Clinical Instructor Bonnie Docherty draft the paper, are attending the talks. The Clinic is working with the Campaign to Stop Killer Robots, a coalition of nongovernmental organizations, to increase momentum towards an eventual treaty banning fully autonomous weapons.
On Monday, before the conference began, the Clinic and Human Rights Watch released “Shaking the Foundations: The Human Rights Implications of Killer Robots.” The report found that fully autonomous weapons threaten fundamental human rights and principles: the right to life, the right to a remedy, and the principle of dignity.
December 3, 2013
Posted by Bonnie Docherty
Five years ago this week, 94 countries gathered in Oslo to sign the Convention on Cluster Munitions. The historic ceremony, held in the hall where the Nobel Peace Prize is awarded, was a moment of celebration and inspiration.
The groundbreaking treaty banned a class of weapons that cause serious harm to civilians. It also showed that humanitarian disarmament, which prioritizes humanitarian concerns over security interests, had become an established means of governing weapons.
While the anniversary of the Convention on Cluster Munitions offers an occasion to reflect on an earlier success, the past month also marked a breakthrough for those working to prevent future civilian casualties. At an international disarmament conference in Geneva, 117 countries turned their attention toward another threat: fully autonomous weapons, also known as “killer robots.” On November 15, the last day of the conference, states parties to the Convention on Conventional Weapons (CCW) unanimously agreed to take up the issue next year.
Cluster munitions have caused civilian casualties during and after conflicts for half a century. Fully autonomous weapons, which would select and fire on targets without meaningful human intervention, might do the same over the coming decades. They do not exist yet, but technology is moving rapidly in their direction.
The Campaign to Stop Killer Robots, a coalition of nongovernmental organizations (NGOs) coordinated by Human Rights Watch, has called for a preemptive prohibition of fully autonomous weapons because of their potential to revolutionize warfare and endanger civilians. The International Human Rights Clinic has supported its efforts through several joint advocacy publications with Human Rights Watch, including one released at CCW in November.
CCW is usually a slow-moving forum, so the forthcoming discussions do not mean a treaty banning fully autonomous weapons will be negotiated in 2014. But the fact that parties to the convention, including such military powers as China, Russia, and the United States, have acknowledged the importance of the issue is truly remarkable. It is a tribute in large part to the effort of advocates working on the issue, including the Clinic’s students.
May 29, 2013
Posted by Jonathan Nomamiukor, JD ’13
Last summer, after two years at Harvard Law School, I elected to take a leave of absence to join President Obama’s re-election campaign. My decision had less to do with any affinity for the President and more to do with my disillusionment with law school in general. I had enrolled with aspirations to enter public service, believing that by simply attending classes in the same building as Charles Hamilton Houston, the famed civil rights lawyer, I’d follow in his footsteps.
After a month of lectures about water property lines, chicken sexing, and figuring out whether a tomato was a fruit or a vegetable, I began to question whether law school was really the right choice for me. If my goal was to combat systemic inequities, could an education that focused on how to work within the status quo—rather than challenge it—be the best path? As the saying goes: will the master’s tools ever be good enough to dismantle the master’s house?
In London recently, I had the opportunity to find out. I traveled there with a team from the International Human Rights Clinic, which I joined after returning to HLS in January. For the past few months, we had been working on the controversial topic of fully autonomous weapons: weapons that, unlike today’s remotely piloted drones, could target and kill without any human intervention. These weapons don’t exist yet, but technology is moving rapidly in that direction, and precursors are already in use.
A coalition of nongovernmental organizations (NGOs) had gathered to launch a campaign to ban these “killer robots,” and I was there with my clinical supervisor, Bonnie Docherty, also a senior arms researcher at Human Rights Watch, to participate in it. At a pre-launch forum for campaigners, Docherty was busy giving a presentation in one room while I slipped into a session on the ethics involved with fully autonomous weapons.