Blog: Killer Robots

January 19, 2022

IHRC’s Bonnie Docherty Shares Thoughts on the Sixth Review Conference of the Convention on Conventional Weapons

By Sarah Foote with Bonnie Docherty

Countries party to the Convention on Conventional Weapons (CCW), a major international disarmament treaty, convened last month at the United Nations in Geneva for the treaty's Sixth Review Conference. They focused much of their attention on two topics: killer robots, which they refer to as lethal autonomous weapons systems, and incendiary weapons. Students from the International Human Rights Clinic, under the supervision of Bonnie Docherty, have contributed to civil society efforts to push for negotiation of a new treaty on killer robots, weapons systems that would select and engage targets without meaningful human control. The Clinic and Human Rights Watch have also spearheaded advocacy to initiate a process to revisit and strengthen CCW Protocol III, which governs incendiary weapons. That protocol has loopholes that undermine its ability to protect civilians from the horrors of incendiary weapons, which cause excruciating burns and lifelong suffering.

In the conversation below, Bonnie Docherty reflects on the Review Conference, its outcomes, and the next steps for these critical humanitarian issues.

Q. Due to COVID, you weren’t able to travel to Geneva for the Review Conference of the Convention on Conventional Weapons held last December. Were you able to watch the talks?

Bonnie Docherty: I watched all of the sessions, from 4 a.m. to 12 p.m., for two and a half weeks through the UN Web TV live stream. Delegates from some countries and organizations did attend in person. However, due to COVID and the Omicron variant, many civil society representatives and diplomats did not attend for safety reasons. I participated actively through text messages, WhatsApp, emails, and meetings via Zoom with diplomats and colleagues. I used these tools to advocate for our issues and stay up to date with the people on the ground.

Although I could not make remote interventions myself, a Human Rights Watch representative read a statement that expressed our position on killer robots and incendiary weapons. A colleague from Mines Action Canada also delivered a statement I wrote on behalf of eight civil society organizations regarding incendiary weapons.

Photo: Lode Dewaegheneire of Mines Action Canada.

Q. What were the most important takeaways from the CCW discussions?

Bonnie Docherty: With regard to incendiary weapons, the outcome of the Review Conference was disappointing on paper because Russia refused to agree to put Protocol III on the agenda for next year. The CCW operates by consensus, so any one state can block progress. It was very discouraging after all our efforts to put forward a reasonable request: to hold dedicated discussions of the topic next year.

That said, there were powerful and encouraging statements from the many states that supported holding these discussions, as well as impassioned pleas to stop the cruelty that incendiary weapons can cause. These countries understood the true human impact of these weapons, which was important progress. They also recognized victims and the harm they have suffered.

Regarding autonomous weapons systems, the Review Conference made clear that progress on this issue cannot be made in a consensus body. Hopefully, the failure of the Conference to agree to negotiate a legally binding instrument will inspire states to go to a different forum and adopt a new treaty to make real change.

Continue Reading…


December 1, 2021

Killer Robots: Negotiate New Law to Protect Humanity

Legal Uncertainty, Growing Concerns Show Urgent Need for Regulation

Governments should agree to open negotiations on a new treaty to retain meaningful human control over the use of force, Human Rights Watch and the International Human Rights Clinic at Harvard Law School said in a report released today. Countries will be meeting at the United Nations in Geneva in December 2021 to decide whether to begin negotiations to adopt new international law on lethal autonomous weapons systems, also known as “killer robots.”

The 23-page report, “Crunch Time on Killer Robots: Why New Law Is Needed and How It Can Be Achieved,” by Human Rights Watch and the Harvard Law School International Human Rights Clinic, finds that international law should be strengthened and clarified to protect humanity from the dangers posed by weapons systems that select and engage targets without meaningful human control.

“After eight years discussing the far-reaching consequences of removing human control from the use of force, countries now need to decide how to respond to those threats,” said Bonnie Docherty, associate director of armed conflict and civilian protection at the Harvard International Human Rights Clinic and senior arms researcher at Human Rights Watch. “There’s an urgent need for a dedicated treaty to address the shortcomings of international humanitarian law and update it to deal with the legal, ethical, and societal challenges of today’s artificial intelligence and emerging technologies.”

At the United Nations in Geneva, the Campaign to Stop Killer Robots called on governments not to allow the development of weapons systems that would select and attack targets without any human intervention. © 2018 Clare Conboy.

The Sixth Review Conference of the Convention on Conventional Weapons (CCW), scheduled to be held from December 13-17, is a major juncture for international talks on killer robots. At the last CCW meeting on killer robots in September, most countries that spoke called for a new legally binding instrument on autonomous weapons systems. Chile, Mexico, and Brazil urged treaty members to agree to initiate negotiations of new international law. Other proponents included the ‘Group of Ten’ states (Argentina, Costa Rica, Ecuador, El Salvador, Palestine, Panama, Peru, Philippines, Sierra Leone, and Uruguay) and states of the Non-Aligned Movement.  

There are various possible forums for negotiating a new treaty on autonomous weapons systems. Other than the CCW, options include a stand-alone process, as was used for the treaties banning antipersonnel landmines and cluster munitions, and the United Nations General Assembly, where the nuclear weapons ban treaty was negotiated.

Existing international humanitarian law is not adequate to address the problems posed by autonomous weapons systems, Human Rights Watch and the Harvard Clinic said. There is widespread support for developing new law, and any divergence of views reinforces the need to clarify existing law. A new treaty would address the concerns raised by these weapons systems under international humanitarian law, ethics, international human rights law, accountability, and security.

Such a treaty should cover weapons systems that select and engage targets on the basis of sensor, rather than human, inputs. Most treaty proponents have called for a prohibition on weapons systems that by their nature select and engage targets without meaningful human control, such as complex systems using machine-learning algorithms that produce unpredictable or inexplicable effects.

Continue Reading…


August 2, 2021

Killer Robots: Urgent Need to Fast-Track Talks

Shared Vision Forms Sound Basis for Creating a New Ban Treaty

(Washington, DC, August 2, 2021) – Governments should make up for lost time by moving urgently to begin negotiations on a new treaty to retain meaningful human control over the use of force, the International Human Rights Clinic and Human Rights Watch said in a report released today. Representatives from approximately 50 countries will convene on August 3, 2021, at the United Nations in Geneva for their first official diplomatic meeting on lethal autonomous weapons systems, or “killer robots,” in nearly a year.

The 17-page report, “Areas of Alignment: Common Visions for a Killer Robots Treaty,” co-published by the two groups, describes the strong objections to delegating life-and-death decisions to machines expressed by governments at the last official Convention on Conventional Weapons (CCW) meeting on killer robots. That meeting, held in September 2020, featured proposals from many countries to negotiate a new international treaty to prohibit and restrict autonomous weapons.

“International law needs to be expanded to create new rules that ensure human control and accountability in the use of force,” said Bonnie Docherty, associate director of armed conflict and civilian protection at the Clinic and senior arms researcher at Human Rights Watch. “The fundamental moral, legal, and security concerns raised by autonomous weapons systems warrant a strong and urgent response in the form of a new international treaty.”

Nearly 100 countries have publicly expressed their views on killer robots since 2013. Most have repeatedly called for a new international treaty to retain meaningful human control over the use of force, including 32 that have explicitly called for a ban on lethal autonomous weapons systems. Yet a small number of militarily advanced countries – most notably Israel, Russia, and the United States – regard any move to create new international law as premature. They are investing heavily in the military applications of artificial intelligence and developing air, land, and sea-based autonomous weapons systems.

Governments have expressed support for banning autonomous systems that are legally or morally unacceptable, the groups said. There is strong interest in prohibiting weapons systems that by their nature select and engage targets without meaningful human control, including complex systems that use machine-learning algorithms to produce unpredictable or inexplicable effects. There are further calls to ban antipersonnel weapons systems that rely on profiles derived from biometric and other data collected by sensors to identify, select, and attack individuals or categories of people.

“Killing or injuring people based on data collected by sensors and processed by machines would violate human dignity,” Docherty said. “Relying on algorithms to target people will dehumanize warfare and erode our humanity.”

Continue Reading…


October 20, 2020

Killer Robots: Precedent for a Ban Treaty


Shared Concerns, Desire for Human Control Should Spur Regulation

Illustration: A man warding off a missile coming from a person’s mind. © 2020 Brian Stauffer for Human Rights Watch.

(Washington, DC, October 20, 2020) – A treaty to ban fully autonomous weapons, or “killer robots,” is essential and achievable, the International Human Rights Clinic said in a report released today.

The 25-page report, “New Weapons, Proven Precedent: Elements of and Models for a Treaty on Killer Robots,” outlines key elements for a future treaty to maintain meaningful human control over the use of force and prohibit weapons systems that operate without such control. The treaty should consist of both positive obligations and prohibitions, as well as elaborate on the components of “meaningful human control.”

“International law was written for humans, not machines, and needs to be strengthened to retain meaningful human control over the use of force,” said Bonnie Docherty, associate director of armed conflict and civilian protection in the International Human Rights Clinic at Harvard Law School. “A new international treaty is the only effective way to prevent the delegation of life-and-death decisions to machines.”

The report was co-published with Human Rights Watch, for which Docherty is senior arms researcher. HRW coordinates the Campaign to Stop Killer Robots.

Continue Reading…


June 20, 2016

Losing Control: The Dangers of Killer Robots

Posted by Bonnie Docherty

This piece originally appeared in The Conversation on June 16, 2016

New technology could lead humans to relinquish control over decisions to use lethal force. As artificial intelligence advances, the possibility that machines could independently select and fire on targets is fast approaching. Fully autonomous weapons, also known as “killer robots,” are quickly moving from the realm of science fiction toward reality.

These weapons, which could operate on land, in the air or at sea, threaten to revolutionize armed conflict and law enforcement in alarming ways. Proponents say these killer robots are necessary because modern combat moves so quickly, and because having robots do the fighting would keep soldiers and police officers out of harm’s way. But the threats to humanity would outweigh any military or law enforcement benefits.

Removing humans from the targeting decision would create a dangerous world. Machines would make life-and-death determinations outside of human control. The risk of disproportionate harm or erroneous targeting of civilians would increase. No person could be held responsible.

Continue Reading…


January 7, 2016

“Fighting for Disarmament”: Bonnie Docherty’s work featured in Harvard Gazette


This Q&A by reporter Liz Mineo ran in the Harvard Gazette on January 3, 2016

After researching the devastating humanitarian effects of the deadly cluster munitions used in Afghanistan in 2002, Bonnie Docherty joined a worldwide campaign to eliminate them.

Six years after she started her probe, cluster bombs were banned. Her investigation of the use of cluster munitions in Afghanistan, and later in Iraq and Lebanon, was highly influential in a 2008 treaty, joined by 118 countries, that bans these weapons.

Photo: Bonnie Docherty showing examples of inert cluster munitions. Credit: Jon Chase, Harvard staff photographer

For Docherty, a lecturer on law and a senior instructor at the International Human Rights Clinic at Harvard Law School, the battle to protect civilians from unnecessary harm continues.

Last month, Docherty traveled to Geneva to advocate for stronger regulations on incendiary devices, which she calls “exceptionally cruel weapons” that have been used in Syria, Libya, and Ukraine.

Docherty, who is also a senior researcher in the arms division at Human Rights Watch, recently sat down for an interview to talk about these weapons, killer robots, and her guiding principle: to protect civilians from suffering caused by armed conflicts.

GAZETTE: Before you became a disarmament advocate, you were a reporter for a local newspaper. Can you tell us about this part of your life?

DOCHERTY: After college, I was a reporter for The Middlesex News, now the MetroWest Daily News, outside of Boston, for three years. I covered mostly local news, government meetings, and environmental issues, but I had the opportunity to go to Bosnia and embed with the peacekeepers for about 10 days in 1998. There was an Army lab in my town; that’s how I got the invitation to go to Bosnia. I had been interested in armed conflicts, but that trip definitely increased my interest in that field.

GAZETTE: How did you make the jump from suburban journalism to human rights and disarmament issues?

DOCHERTY: After I left the newsroom, I went to Harvard Law School. Right after graduation, I went to Human Rights Watch, which was a perfect mix of journalism and law because you go out in the field and you apply the law to what you find. My start date was Sept. 12, 2001, by happenstance, so whatever was planned was changed. Six months later, I was in Afghanistan researching the use of cluster munitions, which was my first exposure to disarmament issues.

GAZETTE: What are cluster munitions, and why are they so dangerous?

DOCHERTY: Cluster munitions are large weapons, such as bombs or rockets, that contain dozens or hundreds of small munitions called submunitions. They’re problematic because they have a broad area effect — they spread over an area the size of a football field — and because many of them don’t explode on impact; they lie around like landmines and can explode years or decades later.

GAZETTE: How did your involvement with cluster munitions begin?

Photo: Bonnie Docherty holds an inert submunition fragment from a cluster munition. Credit: Jon Chase, Harvard staff photographer

DOCHERTY: I went to Afghanistan, Iraq, Lebanon, and later Georgia to document the use of these weapons. I’ve spoken with dozens of victims of cluster munitions, but the story I remember the most is when I was in Lebanon with two students from Harvard Law’s International Human Rights Clinic in 2006. We were there doing field research after Israel used cluster munitions in Lebanon. We were at a restaurant, and someone asked us to go to the town of Halta immediately. When we arrived, we found out that two hours earlier a 12-year-old boy had been killed by a cluster submunition. He had been playing with his brother, who had been throwing pinecones at him. The boy picked up something to throw back at his brother. It turned out to be a submunition. His friend said, “Oh, no. That’s dangerous, drop it,” and when he went to throw it away, it exploded next to his head. When we were there, they were still cleaning up the pool of blood from his body. The Lebanese army found 10, 12 submunitions lying around right next to a village, waiting to kill or injure civilians, farmers, children.

GAZETTE: Your research on cluster munitions led you to become one of the world’s most widely known advocates against these weapons. How did this happen?

Continue Reading…


November 10, 2015

Clinic and HRW: Ramp up Action to Ban Killer Robots


PRESS RELEASE


Ramp Up Action to Ban Killer Robots


Blinding Lasers Prohibition Offers Precedent


(Geneva, November 9, 2015) – Governments should agree to expand and formalize their international deliberations on fully autonomous weapons, with the ultimate aim of preemptively banning them, Human Rights Watch and the International Human Rights Clinic at Harvard Law School said in a joint report released today. These weapons, also known as lethal autonomous weapons systems or killer robots, would be able to select and attack targets without further human intervention.

The 18-page report, “Precedent for Preemption,” details why countries agreed to preemptively ban blinding laser weapons in 1995 and says that the process could be a model for current efforts to prohibit fully autonomous weapons. Countries participating in the annual meeting of the Convention on Conventional Weapons (CCW) will decide by consensus on November 13, 2015, whether to continue their deliberations on lethal autonomous weapons systems next year.

“Concerns over fully autonomous weapons have pushed them to the top of the international disarmament agenda, but countries need to pick up the pace of discussions,” said Bonnie Docherty, senior clinical instructor at Harvard Law School, and senior Arms Division researcher at Human Rights Watch, which is a co-founder of the Campaign to Stop Killer Robots. “Governments can take direct action now with commitments to ban weapons with no meaningful human control over whom to target and when to attack.”

Photo: The second Convention on Conventional Weapons informal meeting of experts on lethal autonomous weapons systems at the UN in Geneva in April 2015. © 2015 United Nations Office at Geneva

The report calls on countries to initiate a more robust process through creation of a group of governmental experts on fully autonomous weapons under the CCW.

Artificial intelligence experts, roboticists, and other scientists predict that fully autonomous weapons could be developed within years, not decades. The preemptive ban on blinding lasers, which is in a protocol attached to the conventional weapons treaty, shows that a prohibition on future weapons is possible.

“The prospect of fully autonomous weapons raises many of the same concerns as blinding lasers did two decades ago,” said Docherty, lead author of the new report exploring the history of the prohibition on lasers that would permanently blind their victims. “Countries should adopt the same solution by banning fully autonomous weapons before they reach the battlefield.”

The report shows that threats to the principles of humanity and dictates of public conscience, as well as notions of abhorrence and social unacceptability, helped drive countries to ban blinding lasers. Fully autonomous weapons present similar dangers.

Countries were further motivated by the risk of widespread proliferation of blinding lasers to parties that have little regard for international law, a risk echoed in discussions of fully autonomous weapons, Human Rights Watch and the Harvard Law School clinic said. As with blinding lasers 20 years ago, a ban on fully autonomous weapons could clarify and strengthen existing law without limiting the development of related legitimate technology.

The groups acknowledged notable differences in the specific legal problems and technological character of the two weapons but found that those differences make banning fully autonomous weapons even more critical.

In other publications, the Clinic and Human Rights Watch have elaborated on the challenges that fully autonomous weapons would face in complying with international humanitarian law and international human rights law and analyzed the lack of accountability that would exist for the unlawful harm caused by such weapons.

Several of the 121 countries that have joined the CCW – including the United States, United Kingdom, China, Israel, Russia, and South Korea – are developing weapons systems with various degrees of autonomy and lethality. The countries party to the treaty held nine days of informal talks on lethal autonomous weapons systems in 2014 and 2015, but they should now ramp up their deliberations, Human Rights Watch and the Harvard clinic said.

Docherty and Steve Goose, director of the arms division at Human Rights Watch, will present the report at a side event briefing at 2 p.m. on November 9 in Conference Room XI at the United Nations in Geneva. At the end of the week, Goose will assess the meeting’s decision on fully autonomous weapons, joined by other Campaign to Stop Killer Robots representatives, at a side event briefing at 1 p.m. on November 13 in Conference Room XI.

“Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition” is available at:
www.hrw.org/node/283112/

NOTE: Mana Azarmi, JD ’16, Federica du Pasquier, MA ’16, and Marium Khawaja, LLM ’16, contributed research to this report.

For more Human Rights Watch reporting on fully autonomous weapons, please visit:
http://www.hrw.org/topic/arms/killer-robots

For more information on the Campaign to Stop Killer Robots, please visit:
www.stopkillerrobots.org

For more information, please contact:
In Geneva, Bonnie Docherty (English): +1-617-669-1636 (mobile); or [email protected]


April 9, 2015

Clinic and HRW Release Report: “Mind the Gap: The Lack of Accountability for Killer Robots”


PRESS RELEASE


The “Killer Robots” Accountability Gap


Obstacles to Legal Responsibility Show Need for Ban

(Geneva, April 9, 2015) – Programmers, manufacturers, and military personnel could all escape liability for unlawful deaths and injuries caused by fully autonomous weapons, or “killer robots,” Human Rights Watch said in a report released today. The report was issued in advance of a multilateral meeting on the weapons at the United Nations in Geneva.

Report cover shows a robot overtaking an office building.

The 38-page report, “Mind the Gap: The Lack of Accountability for Killer Robots,” details significant hurdles to assigning personal accountability for the actions of fully autonomous weapons under both criminal and civil law. It also elaborates on the consequences of failing to assign legal responsibility. The report is jointly published by Human Rights Watch and Harvard Law School’s International Human Rights Clinic.

“No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party,” said Bonnie Docherty, senior Arms Division researcher at Human Rights Watch and the report’s lead author. “The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”

Continue Reading…

June 4, 2014

Taking on “Killer Robots”

Posted by Bonnie Docherty

Last month, Senior Clinical Instructor Bonnie Docherty traveled with students to Geneva for the first multilateral meeting of the Convention on Conventional Weapons devoted to fully autonomous weapons, or “killer robots.” Below is her recap of the week’s events, originally published on May 23, 2014, in the online forum Just Security.

Continue Reading…


May 14, 2014

A Second Release in Geneva: “Advancing the Debate on Killer Robots”

Posted by Joseph Klingler, JD '14

In Geneva today, the Clinic and Human Rights Watch released the latest in a series of publications calling for a preemptive ban on the development, production, and use of fully autonomous weapons. The weapons, also called “killer robots,” would be capable of selecting and firing upon targets without any meaningful human control.

The joint paper, entitled “Advancing the Debate on Killer Robots,” systematically rebuts 12 arguments that have been raised by critics of a ban. Its release coincides with a major international disarmament conference dedicated to fully autonomous weapons, being held at the UN in Geneva this week. More than 400 delegates from governments, international organizations, and civil society have gathered to discuss the weapons under the framework of the Convention on Conventional Weapons, a treaty that governs problematic weapons.


Clinical students Evelyn Kachaje, JD ’15, and Joseph Klingler, JD ’14, who along with Yukti Choudhary, LLM ’14, helped Senior Clinical Instructor Bonnie Docherty draft the paper, are attending the talks. The Clinic is working with the Campaign to Stop Killer Robots, a coalition of nongovernmental organizations, to increase momentum toward an eventual treaty banning fully autonomous weapons.

On Monday, before the conference began, the Clinic and Human Rights Watch released “Shaking the Foundations: The Human Rights Implications of Killer Robots.” The report found that fully autonomous weapons threaten fundamental human rights and principles: the right to life, the right to a remedy, and the principle of dignity.
