Blog: Killer Robots

June 20, 2016

Losing Control: The Dangers of Killer Robots

Posted by Bonnie Docherty


This piece originally appeared in The Conversation on June 16, 2016



New technology could lead humans to relinquish control over decisions to use lethal force. As artificial intelligence advances, the possibility that machines could independently select and fire on targets is fast approaching. Fully autonomous weapons, also known as “killer robots,” are quickly moving from the realm of science fiction toward reality.

These weapons, which could operate on land, in the air or at sea, threaten to revolutionize armed conflict and law enforcement in alarming ways. Proponents say these killer robots are necessary because modern combat moves so quickly, and because having robots do the fighting would keep soldiers and police officers out of harm’s way. But the threats to humanity would outweigh any military or law enforcement benefits.

Removing humans from the targeting decision would create a dangerous world. Machines would make life-and-death determinations outside of human control. The risk of disproportionate harm or erroneous targeting of civilians would increase. No person could be held responsible.

January 7, 2016

“Fighting for Disarmament”: Bonnie Docherty’s work featured in Harvard Gazette

This Q&A by reporter Liz Mineo ran in the Harvard Gazette on January 3, 2016

After researching the devastating humanitarian effects of the deadly cluster munitions used in Afghanistan in 2002, Bonnie Docherty joined a worldwide campaign to eliminate them.

Six years after she started her probe, cluster bombs were banned. Her investigation into the use of cluster munitions in Afghanistan, and later in Iraq and Lebanon, was highly influential in the adoption of a 2008 treaty, joined by 118 countries, that bans these weapons.


Bonnie showing examples of inert cluster munitions. Credit: Jon Chase, Harvard staff photographer

For Docherty, a lecturer on law and a senior instructor at the International Human Rights Clinic at Harvard Law School, the battle to protect civilians from unnecessary harm continues.

Last month, Docherty traveled to Geneva to advocate for stronger regulations on incendiary devices, which she calls “exceptionally cruel weapons” that have been used in Syria, Libya, and Ukraine.

Docherty, who is also a senior researcher in the arms division at Human Rights Watch, recently sat down for an interview to talk about these weapons, killer robots, and her guiding principle: to protect civilians from suffering caused by armed conflicts.

GAZETTE: Before you became a disarmament advocate, you were a reporter for a local newspaper. Can you tell us about this part of your life?

DOCHERTY: After college, I was a reporter for The Middlesex News, now the MetroWest Daily News, outside of Boston, for three years. I covered mostly local news, government meetings, and environmental issues, but I had the opportunity to go to Bosnia and embed with the peacekeepers for about 10 days in 1998. There was an Army lab in my town; that’s how I got the invitation to go to Bosnia. I had been interested in armed conflicts, but that trip definitely increased my interest in that field.

GAZETTE: How did you make the jump from suburban journalism to human rights and disarmament issues?

DOCHERTY: After I left the newsroom, I went to Harvard Law School. Right after graduation, I went to Human Rights Watch, which was a perfect mix of journalism and law because you go out in the field and you apply the law to what you find. My start date was Sept. 12, 2001, by happenstance, so whatever was planned was changed. Six months later, I was in Afghanistan researching the use of cluster munitions, which was my first exposure to disarmament issues.

GAZETTE: What are cluster munitions, and why are they so dangerous?

DOCHERTY: Cluster munitions are large weapons, such as bombs or rockets, that contain dozens or hundreds of small munitions called submunitions. They’re problematic because they have a broad area effect, spreading over an area the size of a football field, and because many of them don’t explode on impact; like landmines, they can lie dormant and explode years or decades later.

GAZETTE: How did your involvement with cluster munitions begin?


Bonnie holds an inert submunition fragment from a cluster munition. Credit: Jon Chase, Harvard staff photographer

DOCHERTY: I went to Afghanistan, Iraq, Lebanon, and later Georgia to document the use of these weapons. I’ve spoken with dozens of victims of cluster munitions, but the story I remember the most is when I was in Lebanon with two students from Harvard Law’s International Human Rights Clinic in 2006. We were there doing field research after Israel used cluster munitions in Lebanon. We were at a restaurant, and someone asked us to go to the town of Halta immediately. When we arrived, we found out that two hours earlier a 12-year-old boy had been killed by a cluster submunition. He had been playing with his brother, who had been throwing pinecones at him. The boy picked up something to throw back at his brother. It turned out to be a submunition. His friend said, “Oh, no. That’s dangerous, drop it,” and when he went to throw it away, it exploded next to his head. When we were there, they were still cleaning up the pool of blood from his body. The Lebanese army found 10 or 12 submunitions lying around right next to a village, waiting to kill or injure civilians, farmers, children.

GAZETTE: Your research on cluster munitions led you to become one of the world’s most widely known advocates against these weapons. How did this happen?



November 10, 2015

Clinic and HRW: Ramp up Action to Ban Killer Robots

PRESS RELEASE


Ramp Up Action to Ban Killer Robots
Blinding Lasers Prohibition Offers Precedent

 

(Geneva, November 9, 2015) – Governments should agree to expand and formalize their international deliberations on fully autonomous weapons, with the ultimate aim of preemptively banning them, Human Rights Watch and the International Human Rights Clinic at Harvard Law School said in a joint report released today. These weapons, also known as lethal autonomous weapons systems or killer robots, would be able to select and attack targets without further human intervention.

The 18-page report, “Precedent for Preemption,” details why countries agreed to preemptively ban blinding laser weapons in 1995 and says that the process could be a model for current efforts to prohibit fully autonomous weapons. Countries participating in the annual meeting of the Convention on Conventional Weapons (CCW) will decide by consensus on November 13, 2015, whether to continue their deliberations on lethal autonomous weapons systems next year.

“Concerns over fully autonomous weapons have pushed them to the top of the international disarmament agenda, but countries need to pick up the pace of discussions,” said Bonnie Docherty, senior clinical instructor at Harvard Law School, and senior Arms Division researcher at Human Rights Watch, which is a co-founder of the Campaign to Stop Killer Robots. “Governments can take direct action now with commitments to ban weapons with no meaningful human control over whom to target and when to attack.”

The second Convention on Conventional Weapons informal meeting of experts on lethal autonomous weapons systems at the UN in Geneva in April 2015. © 2015 United Nations Office at Geneva

The report calls on countries to initiate a more robust process through creation of a group of governmental experts on fully autonomous weapons under the CCW.

Artificial intelligence experts, roboticists, and other scientists predict that fully autonomous weapons could be developed within years, not decades. The preemptive ban on blinding lasers, which is in a protocol attached to the conventional weapons treaty, shows that a prohibition on future weapons is possible.

“The prospect of fully autonomous weapons raises many of the same concerns as blinding lasers did two decades ago,” said Docherty, lead author of the new report exploring the history of the prohibition on lasers that would permanently blind their victims. “Countries should adopt the same solution by banning fully autonomous weapons before they reach the battlefield.”

The report shows that threats to the principles of humanity and dictates of public conscience, as well as notions of abhorrence and social unacceptability, helped drive countries to ban blinding lasers. Fully autonomous weapons present similar dangers.

Countries were further motivated by the risk of widespread proliferation of blinding lasers to parties that have little regard for international law, a risk echoed in discussions of fully autonomous weapons, Human Rights Watch and the Harvard Law School clinic said. As with blinding lasers 20 years ago, a ban on fully autonomous weapons could clarify and strengthen existing law without limiting the development of related legitimate technology.

The groups acknowledged notable differences in the specific legal problems and technological character of the two weapons but found that those differences make banning fully autonomous weapons even more critical.

In other publications, the Clinic and Human Rights Watch have elaborated on the challenges that fully autonomous weapons would face in complying with international humanitarian law and international human rights law and analyzed the lack of accountability that would exist for the unlawful harm caused by such weapons.

Several of the 121 countries that have joined the CCW – including the United States, United Kingdom, China, Israel, Russia, and South Korea – are developing weapons systems with various degrees of autonomy and lethality. The countries party to the treaty held nine days of informal talks on lethal autonomous weapons systems in 2014 and 2015, but they should now ramp up their deliberations, Human Rights Watch and the Harvard clinic said.

Docherty and Steve Goose, director of the arms division at Human Rights Watch, will present the report at a side event briefing at 2 p.m. on November 9 in Conference Room XI at the United Nations in Geneva. At the end of the week, Goose will assess the meeting’s decision on fully autonomous weapons, joined by other Campaign to Stop Killer Robots representatives, at a side event briefing at 1 p.m. on November 13 in Conference Room XI.

“Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition” is available at:
www.hrw.org/node/283112/

NOTE: Mana Azarmi, JD ’16, Federica du Pasquier, MA ’16, and Marium Khawaja, LLM ’16, contributed research to this report.

For more Human Rights Watch reporting on fully autonomous weapons, please visit:
http://www.hrw.org/topic/arms/killer-robots

For more information on the Campaign to Stop Killer Robots, please visit:
www.stopkillerrobots.org

For more information, please contact:
In Geneva, Bonnie Docherty (English): +1-617-669-1636 (mobile); or bdocherty@law.harvard.edu

 


April 9, 2015

Clinic and HRW Release Report: “Mind the Gap: The Lack of Accountability for Killer Robots”

PRESS RELEASE

The “Killer Robots” Accountability Gap

Obstacles to Legal Responsibility Show Need for Ban

 

(Geneva, April 9, 2015) – Programmers, manufacturers, and military personnel could all escape liability for unlawful deaths and injuries caused by fully autonomous weapons, or “killer robots,” Human Rights Watch said in a report released today. The report was issued in advance of a multilateral meeting on the weapons at the United Nations in Geneva.

The 38-page report, “Mind the Gap: The Lack of Accountability for Killer Robots,” details significant hurdles to assigning personal accountability for the actions of fully autonomous weapons under both criminal and civil law. It also elaborates on the consequences of failing to assign legal responsibility. The report is jointly published by Human Rights Watch and Harvard Law School’s International Human Rights Clinic.

“No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party,” said Bonnie Docherty, senior Arms Division researcher at Human Rights Watch and the report’s lead author. “The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”


June 4, 2014

Taking on “Killer Robots”

Posted by Bonnie Docherty

As readers of this blog will know, last month Senior Clinical Instructor Bonnie Docherty traveled with students to Geneva for the first multilateral meeting of the Convention on Conventional Weapons devoted to fully autonomous weapons, or “killer robots.” Below is her recap of the week’s events, originally published on May 23, 2014 in the online forum Just Security.

 

“Taking on ‘Killer Robots’”

 

New weapons that could revolutionize killing are on the horizon. Lethal autonomous weapons systems, also called fully autonomous weapons or “killer robots,” would go beyond today’s armed drones. They would be able to select and fire on targets without meaningful human intervention. In other words, they could determine themselves when to take a human life.

Representatives from 87 countries gathered at the United Nations in Geneva last week to discuss concerns about this technology and possible ways to respond. The conference was the first multilateral meeting dedicated to lethal autonomous weapons systems. It represented a crucial step in a process that should result in a ban on these problematic weapons before it is too late to change course.

Human Rights Watch and Harvard Law School’s International Human Rights Clinic are calling for a pre-emptive prohibition on the development, production, and use of these weapons. The Campaign to Stop Killer Robots, a global coalition of 51 nongovernmental organizations coordinated by Human Rights Watch, is making the same call.

Overall, the talks in Geneva were productive and positive. The conference, under the auspices of the Convention on Conventional Weapons (CCW), attracted hundreds of delegates from governments, the United Nations, the International Committee of the Red Cross, and nongovernmental groups, setting a record for a CCW meeting. Participants engaged in four days of substantive discussions about the technical, ethical, legal, and operational concerns raised by fully autonomous weapons.

This “informal meeting of experts” was also noteworthy for its timeliness, unusual for a CCW conference. It took place just a year and a half after Human Rights Watch and the Harvard clinic issued a groundbreaking report on these weapons, Losing Humanity: The Case against Killer Robots, which the UN website credited with bringing the issue to “the international community’s attention.”

The meeting illuminated both areas of emerging agreement and ongoing points of contention. At their next meeting in November, states parties to the Convention on Conventional Weapons should show that they are serious about taking action to deal with fully autonomous weapons and adopt a mandate for even deeper discussions in 2015.

Areas of Emerging Agreement

Four promising themes emerged at the recent meeting. First, there was widespread support for continuing discussions. The countries made clear that they saw last week as merely an initial foray into the issue. Many delegates also explicitly recognized the importance of continuing to involve nongovernmental groups, including the Campaign to Stop Killer Robots and its member organizations.

Second, a significant number of countries expressed particular concern about the ethical problems raised by fully autonomous weapons. The chair’s final report noted that these countries “stressed the fact that the possibility for a robotic system to acquire capacities of ‘moral reasoning’ and ‘judgment’ was highly questionable.” Furthermore, these machines could not understand and respect the value of life, yet they would be given the power to determine when to take it away. Fully autonomous weapons would thus threaten to undermine human dignity.

Third, many countries emphasized that weapons systems should always fall under “meaningful human control.” While the parameters of this concept will require careful definition, obligating nations to maintain that control is vital to averting a watershed in the nature of warfare that could endanger civilians and soldiers alike.

Finally, countries frequently noted in their statements the relevance of international human rights law as well as international humanitarian law. Human rights law applies in peace and war, and it would govern the use of these weapons not only on the battlefield but also in law enforcement operations. In a new report released last week, Shaking the Foundations: The Human Rights Implications of Killer Robots, Human Rights Watch and the Harvard clinic found that fully autonomous weapons could contravene the rights to life and a remedy as well as the principle of dignity.

Legal Debate

The most contentious part of the discussion surrounded the application of international humanitarian law to fully autonomous weapons. The debate echoed many of the points raised in a second paper that Human Rights Watch and the Harvard clinic released at the meeting. “Advancing the Debate on Killer Robots” responds directly to 12 critiques of a ban on the weapons.

The meeting revealed a divergence of views about the adequacy of international humanitarian law to deal with fully autonomous weapons. Critics of a ban argue that problematic use of these weapons would violate existing law and that supplementary law is unnecessary. A new treaty banning the weapons, however, would bring clarity, minimizing the need for case-by-case determinations of lawfulness and facilitating enforcement. It would also increase the stigma against the weapon, which can influence even states not party to a treaty to abide by a ban. In addition, a treaty dedicated to fully autonomous weapons could address proliferation, unlike traditional international humanitarian law, which focuses on use.

The debate about the adequacy of international humanitarian law to deal with fully autonomous weapons is reminiscent of arguments made in earlier Convention on Conventional Weapons meetings about cluster munitions. The adoption of the 2008 Convention on Cluster Munitions by 107 states resolved that dispute. Prohibitions on five other weapons that cause unacceptable humanitarian harm (antipersonnel landmines, blinding lasers, chemical weapons, biological weapons, and poison gas) provide additional precedent for new law. While most states are reserving judgment on the best solution to deal with the problems posed by fully autonomous weapons, five countries called for a ban last week.

Participants in last week’s meeting also disagreed about when action should be taken. Critics of a ban supported a wait-and-see approach, arguing that improvements in technology could address the obstacles to compliance with international humanitarian law. There are serious doubts, however, that robots could ever replicate certain complex human qualities, such as judgment, necessary to comply with principles of distinction and proportionality. Furthermore, grave ethical concerns, the likelihood of proliferation and a robotic arms race, an accountability gap, and the prospect of premature deployment all suggest a technological fix would not suffice to address the weapons’ problems.

Action should be taken now before countries invest more in the technology and become less willing to give it up. The pre-emptive ban on blinding lasers in Protocol IV to the Convention on Conventional Weapons can serve as a useful model.

Next Steps

Despite some points of disagreement, the meeting advanced efforts to deal with fully autonomous weapons. Nations need to keep up momentum, however, to avoid having such meetings become what some have called a “talk shop.” In the short term, individual countries should establish national moratoria on fully autonomous weapons.

In November, the parties to the Convention on Conventional Weapons should adopt a mandate to study the issue in greater depth in 2015. They should agree to hold three to four weeks of formal meetings, known as a Group of Governmental Experts. They should also be clear that the meetings would be a step toward negotiating a new protocol on fully autonomous weapons. Such intense discussions would move the debate forward. They would show that the treaty members are committed to addressing this issue and that the Convention on Conventional Weapons is re-emerging as an important source of international humanitarian law.


May 14, 2014

A Second Release in Geneva: “Advancing the Debate on Killer Robots”

Posted by Joseph Klingler, JD '14

In Geneva today, the Clinic and Human Rights Watch released the latest in a series of publications calling for a preemptive ban on the development, production, and use of fully autonomous weapons. The weapons, also called “killer robots,” would be capable of selecting and firing upon targets without any meaningful human control.

The joint paper, entitled “Advancing the Debate on Killer Robots,” systematically rebuts 12 arguments that have been raised by critics of a ban. Its release coincides with a major international disarmament conference dedicated to fully autonomous weapons, being held at the UN in Geneva this week. More than 400 delegates from governments, international organizations, and civil society have gathered to discuss the weapons under the framework of the Convention on Conventional Weapons, a treaty that restricts weapons deemed to cause unnecessary or indiscriminate harm.

Clinical students Evelyn Kachaje, JD ’15, and Joseph Klingler, JD ’14, who along with Yukti Choudhary, LLM ’14, helped Senior Clinical Instructor Bonnie Docherty draft the paper, are attending the talks. The Clinic is working with the Campaign to Stop Killer Robots, a coalition of nongovernmental organizations, to increase momentum towards an eventual treaty banning fully autonomous weapons.

On Monday, before the conference began, the Clinic and Human Rights Watch released “Shaking the Foundations: The Human Rights Implications of Killer Robots.” The report found that fully autonomous weapons threaten fundamental human rights and principles: the right to life, the right to a remedy, and the principle of dignity.

May 12, 2014

Keep “Killer Robots” Out of Policing

 

PRESS RELEASE

 

Keep ‘Killer Robots’ Out of Policing

Fully Autonomous Weapons Threaten Rights in Peace, War

 

(Geneva, May 12, 2014) – Fully autonomous weapons, or “killer robots,” would jeopardize basic human rights, whether used in wartime or for law enforcement, Human Rights Watch said in a report released today, on the eve of the first multilateral meeting on the subject at the United Nations.

The 26-page report, “Shaking the Foundations: The Human Rights Implications of Killer Robots,” is the first report to assess in detail the risks posed by these weapons during law enforcement operations, expanding the debate beyond the battlefield. Human Rights Watch found that fully autonomous weapons would threaten rights and principles under international law as fundamental as the right to life, the right to a remedy, and the principle of dignity.

“In policing, as well as war, human judgment is critically important to any decision to use a lethal weapon,” said Steve Goose, arms division director at Human Rights Watch. “Governments need to say no to fully autonomous weapons for any purpose and to preemptively ban them now, before it is too late.”

International debate over fully autonomous weapons has previously focused on their potential role in armed conflict and questions over whether they would be able to comply with international humanitarian law, also called the laws of war. Human Rights Watch, in the new report, examines the potential impact of fully autonomous weapons under human rights law, which applies during peacetime as well as armed conflict.

Nations should adopt a preemptive international ban on these weapons, which would be able to identify and fire on targets without meaningful human intervention, Human Rights Watch said. Countries are pursuing ever-greater autonomy in weapons, and precursors already exist.

The release of the report, co-published with Harvard Law School’s International Human Rights Clinic, coincides with the first multilateral meeting on the weapons. Many of the 117 countries that have joined the Convention on Conventional Weapons are expected to attend the meeting of experts on lethal autonomous weapons systems at the United Nations in Geneva from May 13 to 16, 2014. The members of the convention agreed at their annual meeting in November 2013 to begin work on the issue in 2014.


November 13, 2013

Clinic and Human Rights Watch Urge International Talks on ‘Killer Robots’

Posted by Cara Solomon

Senior Clinical Instructor Bonnie Docherty is in Geneva today at the annual meeting of the Convention on Conventional Weapons, making the case for a pre-emptive ban on fully autonomous weapons, or “killer robots.” By her side are two students from the International Human Rights Clinic: Lara Berlin, JD ’13, and Ben Bastomski, JD ’15.

The Clinic has been working closely with Human Rights Watch (HRW) for more than a year on the threat of fully autonomous weapons, which would have the ability to identify and fire on human targets without meaningful human intervention. Today, they released their latest joint paper on the topic and urged international talks to begin. Thanks to Bonnie, Lara, Ben, and Elina Katz, JD ’14, for their work on the paper.

For more information, read the HRW press release below.

 

 

PRESS RELEASE

UN: Start International Talks on ‘Killer Robots’

Conventional Weapons Meeting Provides Opportunity for Action

 

(Geneva, November 13, 2013) – Governments should agree this week to begin international discussions in 2014 on fully autonomous robot weapons, with a view to a future treaty banning the weapons, Human Rights Watch said today.

Human Rights Watch, together with the Harvard Law School International Human Rights Clinic, issued a report making the case for a pre-emptive ban to government delegates attending the annual meeting in Geneva of the Convention on Conventional Weapons (CCW).

“As technology races ahead, governments need to engage now in intensive discussions on the potential dangers of fully autonomous weapons,” said Mary Wareham, arms division advocacy director at Human Rights Watch and coordinator of the Campaign to Stop Killer Robots. “Deliberations about killer robots need to include nongovernmental groups, and be underpinned by a clear sense of urgency and purpose if they are to result in concrete action.”


October 21, 2013

Clinic Calls for a Ban on Killer Robots

Posted by Bonnie Docherty

At a UN meeting in New York today, the International Human Rights Clinic and Human Rights Watch called for urgent action to stop the development of fully autonomous weapons, or “killer robots.” The Clinic and HRW released a question and answer document earlier in the day that makes plain the seriousness of the threat from these weapons, which would have the ability to identify and fire on human targets without meaningful human intervention. The document builds on a November 2012 report jointly published by the Clinic and HRW, entitled Losing Humanity: The Case Against Killer Robots.

Clinical students Kenny Pyetranker, J.D. ’13, Jonathan Nomamiukur, J.D. ’13, and Harin Song, J.D. ’14 contributed both research and writing to the paper released today. Please see here for the full press release from HRW.

 


April 16, 2013

U.S. Takes First Step on Fully Autonomous Weapons, but Stricter Controls Needed

Posted by Bonnie Docherty

Today we released a joint paper with Human Rights Watch advocating for stricter U.S. policy on fully autonomous weapons, sometimes known as “killer robots.” The paper critiques a new U.S. Department of Defense policy on these weapons, which represents a positive step but is not a panacea for the problem.

See below for the press release from Human Rights Watch, and stay tuned for news of the Campaign to Stop Killer Robots, which launches in London next week.

PRESS RELEASE

U.S.: Ban Fully Autonomous Weapons

U.S. Policy on Autonomy in Weapons Systems is First in the World

(Washington, DC, April 16, 2013) – Temporary US restrictions on lethal fully autonomous weapons should be strengthened and made permanent. Fully autonomous weapons, sometimes called “killer robots,” would be able to select and attack targets on their own without any human intervention.

In acknowledgement of the challenges such weapons would pose, the US Department of Defense issued a directive on November 21, 2012, that, for now, requires a human being to be “in-the-loop” when decisions are made about using lethal force. This was the department’s first public policy on autonomy in weapons systems and the first policy announcement by any country on fully autonomous weapons.

“This policy shows that the United States shares our concern that fully autonomous weapons could endanger civilians in many ways,” said Steve Goose, Arms Division director at Human Rights Watch. “Humans should never delegate to machines the power to make life-and-death decisions on the battlefield. US policy should lay the basis for a permanent, comprehensive ban on fully autonomous weapons.”

The briefing paper by Human Rights Watch and the Harvard Law School International Human Rights Clinic reviews the content of the new directive and notes that it is a positive step. For up to 10 years, Directive Number 3000.09 generally allows the Department of Defense to develop or use only fully autonomous systems that deliver non-lethal force. In effect, it constitutes the world’s first moratorium on lethal fully autonomous weapons.
