Fully Autonomous Weapons Threaten Rights in Peace, War
(Geneva, May 12, 2014) – Fully autonomous weapons, or “killer robots,” would jeopardize basic human rights, whether used in wartime or for law enforcement, Human Rights Watch said in a report released today, on the eve of the first multilateral meeting on the subject at the United Nations.
The 26-page report, “Shaking the Foundations: The Human Rights Implications of Killer Robots,” is the first report to assess in detail the risks posed by these weapons during law enforcement operations, expanding the debate beyond the battlefield. Human Rights Watch found that fully autonomous weapons would threaten rights and principles under international law as fundamental as the right to life, the right to a remedy, and the principle of dignity.
“In policing, as well as war, human judgment is critically important to any decision to use a lethal weapon,” said Steve Goose, arms division director at Human Rights Watch. “Governments need to say no to fully autonomous weapons for any purpose and to preemptively ban them now, before it is too late.”
International debate over fully autonomous weapons has previously focused on their potential role in armed conflict and questions over whether they would be able to comply with international humanitarian law, also called the laws of war. Human Rights Watch, in the new report, examines the potential impact of fully autonomous weapons under human rights law, which applies during peacetime as well as armed conflict.
Nations should adopt a preemptive international ban on these weapons, which would be able to identify and fire on targets without meaningful human intervention, Human Rights Watch said. Countries are pursuing ever-greater autonomy in weapons, and precursors already exist.
The release of the report, co-published with Harvard Law School’s International Human Rights Clinic, coincides with the first multilateral meeting on the weapons. Many of the 117 countries that have joined the Convention on Conventional Weapons are expected to attend the meeting of experts on lethal autonomous weapons systems at the United Nations in Geneva from May 13 to 16, 2014. The members of the convention agreed at their annual meeting in November 2013 to begin work on the issue in 2014.
Human Rights Watch believes the agreement to work on these weapons in the Convention on Conventional Weapons forum could eventually lead to new international law prohibiting fully autonomous weapons. The convention preemptively banned blinding lasers in 1995.
Human Rights Watch is a founding member and coordinator of the Campaign to Stop Killer Robots. This coalition of 51 nongovernmental organizations in two dozen countries calls for a preemptive ban on the development, production, and use of fully autonomous weapons.
Human Rights Watch issued its first report on the subject, “Losing Humanity: The Case against Killer Robots,” in November 2012. In April 2013, Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions, issued a report citing a wide range of objections to the weapons and calling for all nations to adopt national moratoria and begin international discussions about how to address them.
Fully autonomous weapons could be prone to killing people unlawfully because the weapons could not be programmed to handle every situation, Human Rights Watch found. According to roboticists, there is little prospect that these weapons would possess human qualities, such as judgment, that facilitate compliance with the right to life in unforeseen situations.
Fully autonomous weapons would also undermine human dignity, Human Rights Watch said. These inanimate machines could not understand or respect the value of life, yet they would have the power to determine when to take it away.
Serious doubts exist about whether there could be meaningful accountability for the actions of a fully autonomous weapon. There would be legal and practical obstacles to holding anyone – superior officer, programmer, or manufacturer – responsible for a robot’s actions. Both criminal and civil law are ill-suited to the task, Human Rights Watch found.
“The accountability gap would weaken deterrence for future violations,” said Bonnie Docherty, senior researcher in the arms division at Human Rights Watch and lecturer at the Harvard clinic as well as author of the report. “It would be very difficult for families to obtain retribution or remedy for the unlawful killing of a relative by such a machine.”
The human rights impacts of killer robots compound a host of other legal, ethical, and scientific concerns about these weapons, including the potential for an arms race, the prospect of proliferation, and questions about their ability to protect civilians adequately on the battlefield or the street, Human Rights Watch found.
For more Human Rights Watch reporting on fully autonomous weapons, please visit:
For more information on the Campaign to Stop Killer Robots, please visit: