Blog: Human Rights Watch

June 16, 2014

The Human Rights Implications of Killer Robots

Posted by Cara Solomon

Last week, the UN Human Rights Council took a fresh look at fully autonomous weapons, or “killer robots.” Previous international debate had focused on the weapons’ ability to comply with laws of war; the Council, by contrast, examined the issue through the lens of international human rights law, which applies in times of peace as well as armed conflict. In this June 9 post originally published by JURIST, Senior Clinical Instructor Bonnie Docherty argued that killer robots threaten the most fundamental human rights.


Fully autonomous weapons, which could select and fire on targets without meaningful human intervention, have the potential to revolutionize the nature of warfare, bringing greater speed and reach to military operations. In the process, though, this emerging technology could endanger both civilians and soldiers.

Nations have been considering the multiple challenges these weapons would pose to the laws of war, also called international humanitarian law. But little attention has been given to the implications for human rights law. If these weapons were developed and used for policing, for example, they would threaten the most basic of these rights, including the right to life, the right to a remedy and the principle of human dignity.

Fully autonomous weapons, also known as autonomous weapons systems or “killer robots,” do not yet exist, but research and technology in a number of countries are moving rapidly in that direction. Because these machines would have the power to determine when to kill, they raise a host of legal, ethical and scientific concerns. Human Rights Watch and Harvard Law School’s International Human Rights Clinic are advocating for a pre-emptive prohibition on fully autonomous weapons. The Campaign to Stop Killer Robots, a global coalition of 52 nongovernmental organizations coordinated by Human Rights Watch, is making the same call.

June 4, 2014

Taking on “Killer Robots”

Posted by Bonnie Docherty

As readers of this blog will know, last month Senior Clinical Instructor Bonnie Docherty traveled with students to Geneva for the first multilateral meeting of the Convention on Conventional Weapons devoted to fully autonomous weapons, or “killer robots.” Below is her recap of the week’s events, originally published on May 23, 2014, in the online forum Just Security.


“Taking on ‘Killer Robots’”


New weapons that could revolutionize killing are on the horizon. Lethal autonomous weapons systems, also called fully autonomous weapons or “killer robots,” would go beyond today’s armed drones. They would be able to select and fire on targets without meaningful human intervention. In other words, they could determine themselves when to take a human life.

Representatives from 87 countries gathered at the United Nations in Geneva last week to discuss concerns about this technology and possible ways to respond. The conference was the first multilateral meeting dedicated to lethal autonomous weapons systems. It represented a crucial step in a process that should result in a ban on these problematic weapons before it is too late to change course.

Human Rights Watch and Harvard Law School’s International Human Rights Clinic are calling for a pre-emptive prohibition on the development, production, and use of these weapons. The Campaign to Stop Killer Robots, a global coalition of 51 nongovernmental organizations coordinated by Human Rights Watch, is making the same call.

Overall, the talks in Geneva were productive and positive. The conference, under the auspices of the Convention on Conventional Weapons (CCW), attracted hundreds of delegates from governments, the United Nations, the International Committee of the Red Cross, and nongovernmental groups, setting a record for a CCW meeting. Participants engaged in four days of substantive discussions about the technical, ethical, legal, and operational concerns raised by fully autonomous weapons.

This “informal meeting of experts” was also noteworthy for its timeliness, unusual for a CCW conference: it took place just a year and a half after Human Rights Watch and the Harvard clinic issued a groundbreaking report on these weapons, Losing Humanity: The Case against Killer Robots, which the UN website credited with bringing the issue to “the international community’s attention.”

The meeting illuminated both areas of emerging agreement and ongoing points of contention. At their next meeting in November, states parties to the Convention on Conventional Weapons should show that they are serious about taking action to deal with fully autonomous weapons and adopt a mandate for even deeper discussions in 2015.

Areas of Emerging Agreement

Four promising themes emerged at the recent meeting. First, there was widespread support for continuing discussions. The countries made clear that they saw last week as merely an initial foray into the issue. Many delegates also explicitly recognized the importance of continuing to involve nongovernmental groups, including the Campaign to Stop Killer Robots and its member organizations.

Second, a significant number of countries expressed particular concern about the ethical problems raised by fully autonomous weapons. The chair’s final report noted that these countries “stressed the fact that the possibility for a robotic system to acquire capacities of ‘moral reasoning’ and ‘judgment’ was highly questionable.” Furthermore, these machines could not understand and respect the value of life, yet they would be given the power to determine when to take it away. Fully autonomous weapons would thus threaten to undermine human dignity.

Third, many countries emphasized that weapons systems should always fall under “meaningful human control.” While the parameters of this concept will require careful definition, obligating nations to maintain that control is vital to averting a watershed in the nature of warfare that could endanger civilians and soldiers alike.

Finally, countries frequently noted in their statements the relevance of international human rights law as well as international humanitarian law. Human rights law applies in peace and war, and it would govern the use of these weapons not only on the battlefield but also in law enforcement operations. In a new report released last week, Shaking the Foundations: The Human Rights Implications of Killer Robots, Human Rights Watch and the Harvard clinic found that fully autonomous weapons could contravene the rights to life and a remedy as well as the principle of dignity.

Legal Debate

The most contentious part of the discussion concerned the application of international humanitarian law to fully autonomous weapons. The debate echoed many of the points raised in a second paper that Human Rights Watch and the Harvard clinic released at the meeting. “Advancing the Debate on Killer Robots” responds directly to 12 critiques of a ban on the weapons.

The meeting revealed a divergence of views about the adequacy of international humanitarian law to deal with fully autonomous weapons. Critics of a ban argue that problematic use of these weapons would violate existing law and that supplementary law is unnecessary. A new treaty banning the weapons, however, would bring clarity, minimizing the need for case-by-case determinations of lawfulness and facilitating enforcement. It would also increase the stigma against the weapon, which can influence even states not party to a treaty to abide by a ban. In addition, a treaty dedicated to fully autonomous weapons could address proliferation, unlike traditional international humanitarian law, which focuses on use.

The debate about the adequacy of international humanitarian law to deal with fully autonomous weapons is reminiscent of arguments made in earlier Convention on Conventional Weapons meetings about cluster munitions. The adoption of the 2008 Convention on Cluster Munitions by 107 states resolved that dispute. Prohibitions on five other weapons that cause unacceptable humanitarian harm—antipersonnel landmines, blinding lasers, chemical weapons, biological weapons, and poison gas—provide additional precedent for new law. While most states are reserving judgment on the best solution to deal with the problems posed by fully autonomous weapons, five countries called for a ban last week.

Participants in last week’s meeting also disagreed about when action should be taken. Critics of a ban supported a wait-and-see approach, arguing that improvements in technology could address the obstacles to compliance with international humanitarian law. There are serious doubts, however, that robots could ever replicate certain complex human qualities, such as judgment, that are necessary to comply with the principles of distinction and proportionality. Furthermore, grave ethical concerns, the likelihood of proliferation and a robotic arms race, an accountability gap, and the prospect of premature deployment all suggest a technological fix would not suffice to address the weapons’ problems.

Action should be taken now before countries invest more in the technology and become less willing to give it up. The pre-emptive ban on blinding lasers in Protocol IV to the Convention on Conventional Weapons can serve as a useful model.

Next Steps

Despite some points of disagreement, the meeting advanced efforts to deal with fully autonomous weapons. Nations need to keep up momentum, however, to avoid having such meetings become what some have called a “talk shop.” In the short term, individual countries should establish national moratoria on fully autonomous weapons.

In November, the parties to the Convention on Conventional Weapons should adopt a mandate to study the issue in greater depth in 2015. They should agree to hold three to four weeks of formal meetings, known as a Group of Governmental Experts. They should also be clear that the meetings would be a step toward negotiating a new protocol on fully autonomous weapons. Such intense discussions would move the debate forward. They would show that the treaty members are committed to addressing this issue and that the Convention on Conventional Weapons is re-emerging as an important source of international humanitarian law.


May 12, 2014

Keep “Killer Robots” Out of Policing


PRESS RELEASE


Keep ‘Killer Robots’ Out of Policing

Fully Autonomous Weapons Threaten Rights in Peace, War


(Geneva, May 12, 2014) – Fully autonomous weapons, or “killer robots,” would jeopardize basic human rights, whether used in wartime or for law enforcement, Human Rights Watch said in a report released today, on the eve of the first multilateral meeting on the subject at the United Nations.

The 26-page report, “Shaking the Foundations: The Human Rights Implications of Killer Robots,” is the first report to assess in detail the risks posed by these weapons during law enforcement operations, expanding the debate beyond the battlefield. Human Rights Watch found that fully autonomous weapons would threaten rights and principles under international law as fundamental as the right to life, the right to a remedy, and the principle of dignity.

“In policing, as well as war, human judgment is critically important to any decision to use a lethal weapon,” said Steve Goose, arms division director at Human Rights Watch. “Governments need to say no to fully autonomous weapons for any purpose and to preemptively ban them now, before it is too late.”

International debate over fully autonomous weapons has previously focused on their potential role in armed conflict and questions over whether they would be able to comply with international humanitarian law, also called the laws of war. Human Rights Watch, in the new report, examines the potential impact of fully autonomous weapons under human rights law, which applies during peacetime as well as armed conflict.

Nations should adopt a preemptive international ban on these weapons, which would be able to identify and fire on targets without meaningful human intervention, Human Rights Watch said. Countries are pursuing ever-greater autonomy in weapons, and precursors already exist.

The release of the report, co-published with Harvard Law School’s International Human Rights Clinic, coincides with the first multilateral meeting on the weapons. Many of the 117 countries that have joined the Convention on Conventional Weapons are expected to attend the meeting of experts on lethal autonomous weapons systems at the United Nations in Geneva from May 13 to 16, 2014. The members of the convention agreed at their annual meeting in November 2013 to begin work on the issue in 2014.

November 13, 2013

Clinic and Human Rights Watch Urge International Talks on ‘Killer Robots’

Posted by Cara Solomon

Senior Clinical Instructor Bonnie Docherty is in Geneva today at the annual meeting of the Convention on Conventional Weapons, making the case for a pre-emptive ban on fully autonomous weapons, or “killer robots.” By her side are two students from the International Human Rights Clinic: Lara Berlin, JD ’13, and Ben Bastomski, JD ’15.

The Clinic has been working closely with Human Rights Watch (HRW) for more than a year on the threat of fully autonomous weapons, which would have the ability to identify and fire on human targets without human intervention. Today, they released their latest joint paper on the topic and urged international talks to begin. Thanks to Bonnie, Lara, Ben, and Elina Katz, JD ’14, for their work on the paper.

For more information, read the HRW press release below.


PRESS RELEASE

UN: Start International Talks on ‘Killer Robots’

Conventional Weapons Meeting Provides Opportunity for Action


(Geneva, November 13, 2013) – Governments should agree this week to begin international discussions in 2014 on fully autonomous robot weapons, with a view to a future treaty banning the weapons, Human Rights Watch said today.

Human Rights Watch, together with the Harvard Law School International Human Rights Clinic, issued a report making the case for a pre-emptive ban to government delegates attending the annual meeting in Geneva of the Convention on Conventional Weapons (CCW).

“As technology races ahead, governments need to engage now in intensive discussions on the potential dangers of fully autonomous weapons,” said Mary Wareham, arms division advocacy director at Human Rights Watch and coordinator of the Campaign to Stop Killer Robots. “Deliberations about killer robots need to include nongovernmental groups, and be underpinned by a clear sense of urgency and purpose if they are to result in concrete action.”

May 29, 2013

Using the Tools of the Trade: The Campaign to Stop Killer Robots

Posted by Jonathan Nomamiukor, JD ’13

Last summer, after two years at Harvard Law School, I elected to take a leave of absence to join President Obama’s re-election campaign. My decision had less to do with any affinity for the President and more to do with my disillusionment with law school in general. I had enrolled with aspirations to enter public service, believing that by simply attending classes in the same building as Charles Hamilton Houston, the famed civil rights lawyer, I’d follow in his footsteps.

After a month of lectures about water property lines, chicken sexing, and figuring out whether a tomato was a fruit or a vegetable, I began to question whether law school was really the right choice for me. If my goal was to combat systemic inequities, could an education that focused on how to work within the status quo—rather than challenge it—be the best path? As the saying goes: will the master’s tools ever be good enough to dismantle the master’s house?

In London recently, I had the opportunity to find out. I traveled there with a team from the International Human Rights Clinic, which I joined after returning to HLS in January. For the past few months, we had been working on the controversial topic of fully autonomous weapons, which are essentially drones that can target and kill without any human intervention. These weapons don’t exist yet, but technology is moving rapidly in that direction, and precursors are already in use.

A coalition of nongovernmental organizations (NGOs) had gathered to launch a campaign to ban these “killer robots,” and I was there with my clinical supervisor, Bonnie Docherty, also a senior arms researcher at Human Rights Watch, to participate in it. At a pre-launch forum for campaigners, Docherty was busy giving a presentation in one room while I slipped into a session on the ethics involved with fully autonomous weapons.

April 16, 2013

U.S. Takes First Step on Fully Autonomous Weapons, but Stricter Controls Needed

Posted by Bonnie Docherty

Today we released a joint paper with Human Rights Watch advocating for stricter U.S. policy on fully autonomous weapons, sometimes known as “killer robots.” The paper critiques a new U.S. Department of Defense policy on these weapons, which represents a positive step but is not a panacea for the problem.

See below for the press release from Human Rights Watch, and stay tuned for news of the Campaign to Stop Killer Robots, which launches in London next week.

PRESS RELEASE

U.S.: Ban Fully Autonomous Weapons

U.S. Policy on Autonomy in Weapons Systems is First in the World

(Washington, DC, April 16, 2013) – Temporary US restrictions on lethal fully autonomous weapons should be strengthened and made permanent. Fully autonomous weapons, sometimes called “killer robots,” would be able to select and attack targets on their own without any human intervention.

In acknowledgement of the challenges such weapons would pose, the US Department of Defense issued a directive on November 21, 2012, that, for now, requires a human being to be “in-the-loop” when decisions are made about using lethal force. This was the department’s first public policy on autonomy in weapons systems and the first policy announcement by any country on fully autonomous weapons.

“This policy shows that the United States shares our concern that fully autonomous weapons could endanger civilians in many ways,” said Steve Goose, Arms Division director at Human Rights Watch. “Humans should never delegate to machines the power to make life-and-death decisions on the battlefield. US policy should lay the basis for a permanent, comprehensive ban on fully autonomous weapons.”

The briefing paper by Human Rights Watch and the Harvard Law School International Human Rights Clinic reviews the content of the new directive and notes that it is a positive step. For up to 10 years, Directive Number 3000.09 generally allows the Department of Defense to develop or use only fully autonomous systems that deliver non-lethal force. In effect, it constitutes the world’s first moratorium on lethal fully autonomous weapons.

March 21, 2013

Clinic and Human Rights Watch: Obama Should Urge Jordan to Stop Sending Asylum Seekers Back to Syria

Posted by Meera Shah

In a joint press release with Human Rights Watch today, the International Human Rights Clinic called on President Obama to use his visit to Jordan as an opportunity to urge the Jordanian government to stop returning asylum seekers to Syria.

While Jordan has accommodated more than 350,000 refugees since the start of the Syrian conflict in March 2011, it is routinely and unlawfully rejecting Palestinian refugees, single men, and undocumented people seeking asylum at its border with Syria. Based in part on the Clinic’s field research conducted in Jordan and Lebanon over January term, the extended press release documents the difficulties faced by asylum seekers in these categories as they attempt to flee the fighting in Syria.

Below you’ll find the first part of the press release. Here is the full document.

PRESS RELEASE

Jordan: Obama Should Press King on Asylum Seeker Pushbacks

Palestinian Refugees, Single Men, and Undocumented Unlawfully Forced Back to Syria

(New York, March 21, 2013) – Jordan is routinely and unlawfully rejecting Palestinian refugees, single males, and undocumented people seeking asylum at its border with Syria, Human Rights Watch and Harvard Law School’s International Human Rights Clinic (the Harvard Clinic) said today.

While attention during U.S. President Barack Obama’s visit to Jordan on March 22, 2013 will focus on the large number of Syrian refugees that Jordan has welcomed and accommodated since the start of the Syrian crisis in March 2011, its rejection of these categories of asylum seekers fleeing the violence should not be ignored, Human Rights Watch and the Harvard Clinic said.  President Obama should seek assurances from King Abdullah that Jordan will not reject any asylum seekers at its border with Syria. The risks to their lives in Syria are too serious to send anyone back at the present time.

“King Abdullah’s support for 350,000 Syrian refugees deserves President Obama’s praise, but Obama should not give Jordan a free pass to force Palestinian refugees and asylum seekers back to Syria,” said Bill Frelick, refugee program director at Human Rights Watch. “Jordan should recognize that everyone—and that includes Palestinian refugees, single men, and undocumented people—has the right not to be forcibly sent back to Syria to face the risk of death or serious harm.”

In two separate trips to Jordan and Lebanon, in January and February, Human Rights Watch and the Harvard Clinic conducted in-depth interviews with more than 120 Syrian and Palestinian refugees from Syria. Human Rights Watch and the Harvard Clinic documented that, as a matter of policy, Jordan is turning back people from Syria at its border without adequately considering the risk to them. Such a policy violates the international law principle of nonrefoulement, which forbids governments from returning refugees and asylum seekers to places where their lives or freedom would be threatened.


November 21, 2012

Op-Ed: The Trouble with Killer Robots

Posted by Cara Solomon

Since its release on Monday, the Clinic’s joint report with Human Rights Watch on “killer robots” has been attracting quite a bit of attention. Check out articles in The Guardian and AFP, as well as segments on Democracy Now and the BBC.

Bonnie also wrote an excellent Op-Ed about the issue for Foreign Policy magazine, which is reprinted in full below.

The Trouble with Killer Robots

Why we need to ban fully autonomous weapons systems, before it’s too late

by Bonnie Docherty

Imagine a mother who sees her children playing with toy guns as a military force approaches their village. Terrified, she sprints toward the scene, yelling at them to hurry home. A human soldier would recognize her fear and realize that her actions are harmless. A robot, unable to understand human intentions, would observe only figures, guns, and rapid movement. While the human soldier would probably hold fire, the robot might shoot the woman and her children.

Despite such obvious risks to civilians, militaries are already planning for a day when sentry robots stand guard at borders, ready to identify intruders and kill them without an order from a human soldier. Unmanned aircraft, controlled only by pre-programmed algorithms, might carry up to 4,500 pounds of bombs that they could drop without real-time authorization from commanders.

November 19, 2012

“The Case Against Killer Robots”: An International Human Rights Clinic and Human Rights Watch Report

PRESS RELEASE

Ban Killer Robots Before It’s Too Late

Fully Autonomous Weapons Would Increase Danger to Civilians

(Washington, DC, November 19, 2012) – Governments should pre-emptively ban fully autonomous weapons because of the danger they pose to civilians in armed conflict, Human Rights Watch and the International Human Rights Clinic at Harvard Law School said in a report released today. These future weapons, sometimes called “killer robots,” would be able to choose and fire on targets without human intervention.

The 50-page report, “Losing Humanity: The Case Against Killer Robots,” outlines concerns about these fully autonomous weapons, which would inherently lack human qualities that provide legal and non-legal checks on the killing of civilians. In addition, the obstacles to holding anyone accountable for harm caused by the weapons would weaken the law’s power to deter future violations.

“Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, Arms Division director at Human Rights Watch. “Human control of robotic warfare is essential to minimizing civilian deaths and injuries.”

“Losing Humanity” is the first major publication about fully autonomous weapons by a nongovernmental organization and is based on extensive research into the law, technology, and ethics of these proposed weapons. It is jointly published by Human Rights Watch and the Harvard Law School International Human Rights Clinic.

Human Rights Watch and the International Human Rights Clinic called for an international treaty that would absolutely prohibit the development, production, and use of fully autonomous weapons. They also called on individual nations to pass laws and adopt policies as important measures to prevent development, production, and use of such weapons at the domestic level.

“It’s critical to take action now,” said Bonnie Docherty, senior clinical instructor at the International Human Rights Clinic and senior researcher at Human Rights Watch. “The technology is alluring, and the more nations invest in it, the harder it will be to convince them to give it up.”

November 1, 2012

Today, Friday, Nov. 2: The International Job Search; Event on Syria Cancelled

Posted by Cara Solomon

Unfortunately, today’s event on Syria and the limits of human rights advocacy has been cancelled; it will likely be rescheduled for December 7. We’ll keep you updated on that. But please note there’s another important event at noon today, geared mostly toward 1Ls who are interested in a public interest job abroad. Details below:

“The International Job Search”

An Informational Session with Representatives from HRP, OPIA, International Legal Studies and the Office of Student Financial Services

12 – 1 pm

Austin West

This session is designed to provide advice to 1Ls seeking summer public interest opportunities abroad. Topics for discussion include:

• How to determine what kinds of jobs to pursue
• How to research employers
• How to land a summer job
• Application logistics, including timing
• Summer funding programs, including the Chayes International Public Service Fellowship and Human Rights Program Summer Internships
