May 16, 2024

Human Rights and Legal Research Centre

Strategic Communications for Development

Human Rights Concern: International Committee of the Red Cross (ICRC) calls for new legal rules to address and ban the use of killer robots or autonomous weapons during armed conflicts


Speaking during a virtual briefing on the new ICRC position on autonomous weapon systems, Mr Peter Maurer, the President of the ICRC, reiterated the need for the international community to develop a legal instrument regulating the use of autonomous weapons. During the briefing, he remarked, “We must decide what role we want human beings to play in life-and-death decisions during armed conflicts.”

Continuous technological development is benefiting the global community, but it also poses a threat in the area of weaponry and its use during armed conflicts. Addressing the public during the briefing, which could go down in history if the ICRC’s campaign is taken up by the international community, Mr Maurer said, “New developments in digital technologies are taking place at a startling pace, affecting the way we live and the way we work – even the way we think. They hold great promise for humanity, and I’ve talked regularly about how we at the ICRC are embracing the digital transformation to enhance humanitarian action around the world…New developments in digital technologies also affect the ways in which wars are fought. New weapon technologies give rise to serious humanitarian, legal and ethical dilemmas, and these dilemmas will be the focus of my talk today.”

Explaining the ICRC’s position on the use of autonomous weapons, Mr Peter Maurer said that once these weapons are activated, humans have less control over what, how and where they target. He also raised a pointed question: if, in the course of operations, a robot or autonomous weapon goes out of control, what will be the consequences?

“Autonomous weapons raise many challenging questions from several perspectives – military, technical, legal, ethical, philosophical and of course humanitarian. This complexity also contributes to the political challenges governments face in building shared understandings of potential risks and necessary solutions.”

The ICRC also examined the effects that increased use of autonomous weapons would have on civilians and says they pose a threat: “These weapons are already being used: in limited circumstances, usually far from civilians, and against highly specific types of target, for example, to defend warships at sea from incoming missiles… And yet, current technology and military developments are fuelling interest in the use of autonomous weapons that attack a wider range of targets, over greater areas and longer durations, and even in urban areas – complex and dynamic environments.”

He went on to give an example of the threat that the continued use of autonomous weapons poses to the civilian population: “Consider, for example, situations with a significant civilian population in the area of military operations. In the event an autonomous weapon is used: How will civilians be protected when the user of an autonomous weapon does not know exactly where or when, or what, they will destroy? Or imagine an autonomous weapon’s sensor is triggered by civilian buses with a similar shape to soldiers’ transport vehicles and starts striking all buses over a wide area without the user being able to intervene and deactivate?”

According to the ICRC, the use of autonomous weapons is also a potential escalator of wars and conflicts: “Autonomous weapons also increase the danger that conflicts will escalate, for example if there is no time, or means, to switch off an autonomous weapon before it is too late.”

Turning to the humanitarian consequences of the use of these weapons, the ICRC president said that robots will not recognize, for example, wounded soldiers or those who have surrendered on the front line, even though they are protected under international law: “The potential humanitarian consequences are concerning for the ICRC. These weapon systems raise serious challenges for compliance with international humanitarian law, whose rules require context-specific judgements by combatants. For instance, how will injured soldiers be spared when there is no one there to recognize they are hors de combat?”

Moreover, autonomous weapons raise fundamental ethical concerns for humanity, in effect substituting human decisions about life and death with sensor, software and machine processes.

“There is a distinct risk that we will see human control and judgement in life-and-death decisions gradually eroded to a point that is unacceptable. This has been stressed by many in foreign ministries, armed forces and humanitarian organizations, as well as by roboticists and artificial intelligence experts in industry.”


Reiterating the effects of these weapons on humans, the ICRC representative said: “Unfettered design and use of autonomous weapons present a fundamental challenge. They risk eroding current protections for the victims of war under international humanitarian law and the principles of humanity.”

Following the virtual briefing on the ICRC’s position on the use of autonomous weapons, the organization offered three specific recommendations, grounded in humanitarian principles, as follows:

First, our view is that unpredictable autonomous weapons should be ruled out, notably because of their indiscriminate effects, and that this would be best achieved through a prohibition of unpredictable autonomous weapons.

Second, we believe that the use of autonomous weapons to target human beings should be ruled out. This recommendation is grounded in ethical considerations to safeguard humanity and the need to uphold the international humanitarian law rules for the protection of civilians and combatants hors de combat. This would in our view be best achieved through a prohibition of anti-personnel autonomous weapons.

Third, and finally, we recommend that other autonomous weapons should be regulated, including through a combination of four types of limits:

  • first, limits on the types of target, such as constraining them to typically military objects, like tanks or incoming missiles
  • second, limits on the duration, geographical scope and scale of use
  • third, limits on situations of use, such as situations where civilians are not present
  • fourth, requirements for human-machine interaction, notably to ensure effective human supervision, and timely intervention and deactivation.

In his conclusion, the president of the ICRC said: “These recommendations do not bar the development and use of new digital technologies of warfare in other ways, such as to increase weapons’ precision or to enhance human decision-making.”


“We shape technology. And in turn, technology shapes us. These developments do not occur in a vacuum. But beyond calculations of costs and benefits, decisions about what technology should be used for are based on human values.”

Read the full virtual briefing through the link below: Peter Maurer: “We must decide what role we want human beings to play in life-and-death decisions during armed conflicts” | ICRC
