Autonomous Weapons and the Laws of War

Autonomous weapons are moving from engineering concept to armies' arsenals around the world, and reports of their combat use are becoming more frequent. The first surprise about drones is that non-state armed groups were the first to use them in various conflicts, writes Vadim Kozyulin.

There were 31 armed conflicts in the world in 2020. The events in Nagorno-Karabakh in September-November 2020 stood out because the regular armies of Azerbaijan and Armenia took part in the hostilities; this might be considered the first clash of relatively modern, evenly matched armies in the 21st century. Attack drones were used in the war on a large scale, and their effectiveness was confirmed many times by video recordings whose authenticity is beyond doubt. In addition to 20th-century weapons, the sides used advanced arms: reconnaissance drones and unmanned combat aerial vehicles (UCAVs).

The rough equality of capability matters because it makes it possible to assess to what extent autonomous weapon systems are compatible with international humanitarian law (IHL).

Many experts were surprised that the parties to the conflict rarely used manned aircraft – planes or helicopters. In fact, this was the first war in which the primary aviation missions – reconnaissance, target detection and strikes – were carried out by drones.

Should autonomous attack systems be outlawed?

Apprehension that lethal autonomous weapon systems (LAWS) will not comply with IHL requirements is today the main argument of advocates of an international ban on their use. The problem worries many respected international NGOs, among them the Campaign to Stop Killer Robots, Article 36, Human Rights Watch and the International Committee for Robot Arms Control.

Opponents of autonomous weapon systems fear that LAWS will be unable to reliably distinguish civilians from combatants, to apply force proportionate to the anticipated military advantage, or to take precautions for the safety of civilians and civilian facilities, and that they could thereby inflict superfluous injury or unnecessary suffering. In other words, combat drones would violate the laws of warfare that are described in detail in international legal agreements.

In addition to the risks of violating IHL, experts note that drones will be hard to control. It will also be difficult to attribute responsibility for their actions. This creates a threat that the use of force by machines may violate human rights and fundamental moral principles.

Elon Musk warned about the need to exercise extreme caution in dealing with artificial intelligence (AI), saying it was likely to pose the “biggest existential threat”. Stephen Hawking shared his view: “Success in creating effective AI could be the biggest event in the history of our civilization. Or the worst. We just don’t know... Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of civilization.”

In the context of these concerns, it is worth looking at the use of autonomous weapon systems in the recent Nagorno-Karabakh conflict. Many human rights organisations recorded numerous IHL violations by both sides.

Azerbaijan used UAVs of different types on a broad scale: 

  • Turkish Bayraktar TB2 attack drones armed with precision-guided missiles and bombs;
  • Several types of small kamikaze drones, in particular the SkyStriker, Harop, Orbiter-1K and Orbiter-3 loitering munitions. Some of these aircraft home in on the radio emissions of air defence radars, while others rely on optical, infrared and other sensors;
  • Israeli Heron TP and Hermes 450 reconnaissance and patrol drones;
  • Old An-2 biplanes converted into unmanned aircraft and used as bait: air defence systems that fired on them thereby revealed their positions.

Armenia was weaker in terms of drone use. Its armed forces had an arsenal of small, locally made drones fitted for reconnaissance and artillery fire direction but not for attack: the X-55, Krunk and Baze.

It should be noted that UAVs and loitering munitions do not yet qualify as LAWS because they are not completely autonomous and are controlled remotely by operators. In practice, however, this is impossible to verify: technically, a loitering munition may be fully autonomous and carry out its mission without an operator. At any rate, it can be considered a prototype, or precursor, of LAWS.

For the most part, aerial drones are designed for strikes at hardware rather than troops. It is clear that autonomous weapon systems can achieve with pinpoint strikes objectives that in the 20th century required massive attacks.

In an interview with Interfax, ICRC President Peter Maurer said: “What I can say is that I have not seen in my eight years as president of the ICRC, and the organization has not seen in 157 years, a war that has not been accompanied by violations of the Geneva Conventions and international humanitarian law.” At the same time, the conflict in Nagorno-Karabakh has shown how autonomous weapon systems can conduct hostilities in the 21st century while observing the rules of warfare.

International humanitarian law: testing harmony with higher mathematics

IHL sets out principles that compel an attacker to do the following:

  • Distinguish between civilians and combatants;
  • Distinguish between civilian facilities and military installations;
  • Observe the ban on indiscriminate attacks (attacks that are not aimed at specific military targets but strike military installations and civilians or civilian facilities without distinction);
  • Observe the principle of proportionality in attack (this principle bans attacks that can be expected to cause loss of civilian life and damage to civilian facilities excessive in relation to the anticipated military advantage);
  • Take precautions in attack (the attacker must verify the character of its targets, choose weapons and methods of warfare accordingly, assess the consequences of an attack in advance, and warn civilians of possible threats in a timely manner);
  • Limit the means and methods of warfare (observe the ban on excessive destruction and superfluous suffering).

A practical analysis of the principles of IHL makes it clear that, under the law, military commanders must have almost machine-like computational and mathematical abilities. They must be able to:

  • Identify images;
  • Distinguish between military personnel and civilians;
  • Identify and distinguish between military and civilian facilities;
  • Know the technical characteristics of their own weapons;
  • Know the capabilities of enemy weapons;
  • Analyse enemy defensive actions and their combat effectiveness;
  • Calculate the minimum forces and weapons required to achieve military objectives;
  • Determine how to inflict minimal harm on civilians while accomplishing a combat mission.

To this end, commanders must have broad knowledge, strong analytical abilities and mathematical skills. They must command a vast database, high computing power and a capacity for processing Big Data. In other words, they must have the same qualities as neural networks, or artificial intelligence. The very principle of proportionality points to the mathematical foundation of the rules of warfare.
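
To see why this invites automation, here is a deliberately toy sketch of the proportionality test reduced to a single arithmetic comparison. The function name, the common numeric scale for harm and advantage, and the input values are all assumptions; IHL prescribes no such formula.

```python
# A toy illustration of the proportionality test as arithmetic.
# All names, numbers and the shared "utility" scale are hypothetical.

def is_proportionate(expected_civilian_harm: float,
                     anticipated_military_advantage: float) -> bool:
    """Toy rule: the attack fails if expected civilian harm is excessive
    relative to the anticipated military advantage."""
    return expected_civilian_harm <= anticipated_military_advantage

# Hypothetical estimates produced by hypothetical upstream models:
if is_proportionate(expected_civilian_harm=0.2,
                    anticipated_military_advantage=0.9):
    print("Toy rule: strike not excessive; proceed to precautions check")
else:
    print("Toy rule: strike must be cancelled or re-planned")
```

Even this caricature shows where the real difficulty lies: producing trustworthy estimates for the two inputs, not performing the comparison itself.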

A computer is probably capable of making the necessary calculations much faster and more accurately than a human mind burdened by prejudice, feelings and reflection; the use of autonomous weapon systems in Nagorno-Karabakh seems to confirm this. These calculations will become more reliable once armed forces establish network-centric systems that pool computing capacity and other resources in military clouds, which the world's advanced armies are now developing. Even if the current prototypes (precursors) of LAWS cannot compete with people in observing IHL, the next generation of autonomous weapons may become model soldiers, leaving humans behind just as computers surpassed them in chess and Go.

If autonomous weapon systems violate IHL, recording systems (“black boxes” that log the parameters of an attack) will serve as an instrument for investigation.
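
As a hedged illustration of what such a black box might log, here is a minimal sketch; every field name and value is a hypothetical placeholder, not any real system's format.

```python
# A minimal sketch of a tamper-evident "black box" entry for an engagement.
# All fields and values are hypothetical illustrations.
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class EngagementRecord:
    timestamp_utc: str        # when the engagement decision was made
    platform_id: str          # which vehicle carried out the attack
    target_class: str         # what the onboard classifier reported
    operator_confirmed: bool  # whether a human approved the strike
    munition: str             # weapon employed

    def sealed(self) -> str:
        """Serialise the record and append a SHA-256 digest so that later
        tampering with the stored entry can be detected by investigators."""
        payload = json.dumps(asdict(self), sort_keys=True)
        digest = hashlib.sha256(payload.encode()).hexdigest()
        return f"{payload}|sha256:{digest}"

entry = EngagementRecord("2020-10-14T09:31:07Z", "UCAV-17",
                         "armoured vehicle", True, "guided munition")
print(entry.sealed())
```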

One reservation is necessary: at this point, our view of autonomous weapon systems is based only on combat drones, which mostly target hardware, not people. The situation could change if autonomous ground vehicles appear on the battlefield. Several dozen countries are currently developing unmanned ground vehicles (UGVs). Land-based robots could shock the world with their “efficiency” in combat actions where contact with people is inevitable.

However, it seems that the main problem with lethal autonomous weapon systems will be not their inability to abide by IHL but their lack of any concept of human decency. The smartest computer will remain a soulless machine, unable to comprehend the Martens Clause or to observe the principles of humanity and the dictates of public conscience. Humaneness can hardly be digitised. This shortcoming will remain for a long time and will probably always be the birthmark of even the smartest machine. This is why people must not lose control of autonomous weapons: no matter how advanced a machine may be, it must not be given the right to kill people.
 
Technology versus common sense

The value of human decency could be a sufficient argument in favour of banning or restricting LAWS. However, this appears to be impossible for a number of purely technical and political reasons:

  • Tests of remotely controlled land-based robots in some countries have shown that the effective range of radio communication with an operator does not exceed 2-5 km, whereas a land-based robot is expected to operate at distances of up to 500 km. Limited radio communication is thus a technical obstacle that pushes the military towards fully autonomous LAWS.
  • Modern hostilities involve the active use of electronic warfare to suppress or degrade enemy command, communications and intelligence, and this form of armed struggle is markedly on the rise. Electronic suppression is considered an effective way to counter autonomous weapon systems.
Designers have no universal solution for maintaining control over remotely controlled combat robots that come under electronic warfare attack. The possible scenarios for a UGV that loses communication with its operator are as follows (a toy code sketch of such a fallback policy appears after this list):

  • Maneuver to a region with stable communications;
  • Return to base;
  • Remain in place until further instructions;
  • Go offline.

The military liken the last option to “an armored train standing ready on a siding”, to quote a popular Soviet song.
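
A minimal sketch of such a lost-link policy, with entirely hypothetical states, thresholds and ordering, might look like this:

```python
# A toy lost-link fallback policy for a UGV. States, thresholds and
# priorities are assumptions, not any fielded system's logic.
from enum import Enum, auto

class LostLinkAction(Enum):
    HOLD_POSITION = auto()   # remain in place until further instructions
    SEEK_COMMS = auto()      # maneuver to a region with stable communications
    RETURN_TO_BASE = auto()  # drive back to the base facility
    GO_OFFLINE = auto()      # the "armored train on a siding" option

def lost_link_policy(seconds_without_link: int,
                     fuel_fraction: float) -> LostLinkAction:
    """Choose a fallback behaviour once the control link is jammed or lost."""
    if seconds_without_link < 30:
        return LostLinkAction.HOLD_POSITION   # short dropout: wait it out
    if seconds_without_link < 300:
        return LostLinkAction.SEEK_COMMS      # try to re-establish the link
    if fuel_fraction > 0.3:
        return LostLinkAction.RETURN_TO_BASE  # enough fuel to come home
    return LostLinkAction.GO_OFFLINE          # safe default: go inert

print(lost_link_policy(seconds_without_link=120, fuel_fraction=0.6))
# -> LostLinkAction.SEEK_COMMS
```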

LAWS technology does not spring ready-made from the heads of military strategists or political leaders. It is developed in laboratories, fostered by industry and promoted to armies by lobbyists. In other words, military technology is developed in the course of an arms race with weak public control, and for some time this race remains concealed behind the walls of design laboratories.

LAWS verification is technically difficult but possible. Video clips on the internet confirm that the use of precision munitions can be fairly well recorded: an operator shoots video that is likely kept in defence ministry archives. In theory, then, a specific instance of the use of these weapons can be studied, and an international agreement could require such video evidence to be presented in certain cases. Such materials could confirm that an operator controlled a combat robot during an attack.
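
As a sketch of how such recordings could be made verifiable, one standard technique is a keyed digest computed at recording time and kept in a manifest; the key handling and manifest format below are assumptions for illustration.

```python
# A minimal sketch of tamper-evident strike footage: compute a keyed digest
# when the video is recorded; an inspector later recomputes it to confirm
# the disclosed file is unaltered. Key handling here is hypothetical.
import hashlib
import hmac

SIGNING_KEY = b"hypothetical ministry-held key"

def evidence_tag(video_bytes: bytes) -> str:
    """Keyed digest committing to the footage at recording time."""
    return hmac.new(SIGNING_KEY, video_bytes, hashlib.sha256).hexdigest()

footage = b"...recorded video stream..."  # placeholder for real bytes
tag = evidence_tag(footage)
print(f"manifest entry: hmac-sha256={tag}")

# Later verification by an inspector given the footage and the manifest:
assert hmac.compare_digest(evidence_tag(footage), tag)
```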

However, the priority of secrecy will prevail in most of the world's advanced armies during verification procedures. Verification mechanisms would have to penetrate not only the organisational structures of military units but also the technical design and software of cutting-edge weapons. Moreover, it would not be difficult to evade LAWS verification: in the final analysis, the difference between an autonomous and a remotely controlled robot lies in its software, which, unlike a train on a siding, is easy to hide. At this point, control over LAWS comes down to the problem of global political mistrust.

The world is returning to the Cold War era. There is a shortage of trust, and new information and cyber technologies are merely exacerbating the problem. With the entire arms control system falling apart under pretexts both real and far-fetched, it is difficult to hope for mutual understanding in a field as sensitive as controlling LAWS.

Complete prohibition of LAWS appears unlikely for these reasons. However, it is possible to sign a political declaration whereby the signatories pledge to preserve human control over autonomous weapon systems. This could become an acceptable alternative to the uncontrolled proliferation of killer robots across the globe. The idea is supported by the 11 principles adopted in September 2019 by the Group of Governmental Experts on Lethal Autonomous Weapons Systems, one of which reads: “Human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines.” After all, humankind differs from machines in its humanity and its concern for human dignity.
Views expressed are of individual Members and Contributors, rather than the Club's, unless explicitly stated otherwise.