Why engineers need to be more proactive in their engagement with the laws of war

As autonomous weapons play an increasing role in conflicts around the world, manufacturers cannot afford to sit idly by and adapt their designs only in response to court rulings.

The war in Ukraine has made headlines over allegations of war crimes and the possibility of prosecution at the International Criminal Court. Horror aside, these legal questions may seem remote to most engineers and computer scientists. They are not: the tech industry has lessons to learn, especially now that drones are playing such a prominent role in the evolution of warfare.

War is governed by international humanitarian law (IHL), also known as the law of armed conflict (LOAC), through international treaties and protocols negotiated at the United Nations. IHL assumes that wars will occur, but that lethal action may only be taken by identifiable individuals with the power to act within the law. Responsibility, and hence liability, for damage to persons or property must therefore always lie with an identifiable person using a weapon system with predictable behavior.

Armed drones are an example of lethal autonomous weapon systems (LAWS). Who is legally responsible for the lethal effects of an autonomous weapon?

The UN’s Group of Governmental Experts (GGE) on LAWS asserts that such weapons are covered by current IHL. The problem is identifying the human responsible for the lethal decision. The GGE believes that “this should be considered throughout the life cycle of the weapon system”, clearly placing responsibility on technologists across the supply chain, not just on military commanders. How can engineers meet these responsibilities?

Article 35 of Additional Protocol I to the Geneva Conventions marks a significant difference between IHL and national laws for autonomous systems: it stipulates that the right to choose new “means or methods of warfare” is not unlimited. Article 36 requires a State to determine whether a new weapon complies with IHL, imposing a review and testing regime. Operational use is covered by rules of engagement, established after extensive legal, military and technical collaborative work. Once the GGE has completed its work, all LAWS, including armed drones, will have to complete Article 36 reviews covering the entire lifecycle.

Engineers can fulfill their IHL responsibilities using systems engineering approaches that include identifying the chain of authority to act and defining limits on the use of armed force. National laws do not appear to have an equivalent overarching requirement that a product carry identifiable liability for the consequences of its actions before use, except, perhaps, that it must be fit for purpose.
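To make the chain-of-authority idea concrete, here is a minimal sketch in Python. It assumes an entirely hypothetical design (names such as ReleaseOrder and AuthorizationChain are invented for illustration) in which no lethal-effect command is accepted unless it names an identifiable, pre-delegated human, and every decision is logged against that person.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ReleaseOrder:
    """A lethal-effect request that must name an accountable human."""
    commander_id: str   # identifiable person responsible under IHL
    target_ref: str     # reference produced by the targeting process
    issued_at: datetime

class AuthorizationChain:
    """Hypothetical guard: only pre-delegated commanders may authorize."""
    def __init__(self, delegated_commanders: set[str]):
        self._delegated = delegated_commanders
        self.audit_log: list[tuple[str, str, datetime]] = []

    def authorize(self, order: ReleaseOrder) -> bool:
        ok = order.commander_id in self._delegated
        # Every decision, accepted or refused, is logged against a person,
        # keeping responsibility traceable through the system's life cycle.
        self.audit_log.append(
            (order.commander_id, "ACCEPTED" if ok else "REFUSED", order.issued_at)
        )
        return ok

chain = AuthorizationChain(delegated_commanders={"CDR-0042"})
order = ReleaseOrder("CDR-0042", "TGT-7", datetime.now(timezone.utc))
assert chain.authorize(order)  # traceable to an identifiable individual
```

The point is architectural: accountability becomes a property enforced by the design rather than an afterthought.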

Autonomous systems such as driverless vehicles are an area of engineering where national legal regimes are already struggling to adapt to rapidly changing technology. New technologies are introduced into existing legal frameworks, which then attempt to catch up and identify who is responsible for any accidents. For example, the opinion of the Law Commissions of England and Wales and of Scotland is that self-driving cars are incompatible with current UK law.

Autonomy and machine learning, with their non-deterministic behaviors, make change inevitable. Systems engineering approaches used for IHL include deriving each system’s technical specifications from generic IHL principles, architectural analysis of human responsibility, and limiting subsystem behavior by design. LAWS will require more Article 36 reviews during the concept and design phases, with the goal of reaching agreement among lawyers, engineers and computer scientists on the application of specific new technologies in a given weapon system. The result is identified responsibility for every aspect of a weapon’s behavior, reducing technical, legal and financial risk for all parties.
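As a hedged illustration of “limiting subsystem behavior by design”, the sketch below (again with invented names, such as EngagementEnvelope) shows a deterministic guard sitting below any non-deterministic targeting component: commands outside limits derived from the legal review are refused by construction.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EngagementEnvelope:
    """Limits derived from the Article 36 review and rules of engagement."""
    max_range_km: float
    permitted_zone: tuple[float, float, float, float]  # lat/lon bounding box

    def permits(self, lat: float, lon: float, range_km: float) -> bool:
        lat_min, lon_min, lat_max, lon_max = self.permitted_zone
        return (
            range_km <= self.max_range_km
            and lat_min <= lat <= lat_max
            and lon_min <= lon <= lon_max
        )

def release_weapon(envelope: EngagementEnvelope,
                   lat: float, lon: float, range_km: float) -> str:
    # The guard is deterministic and sits below any non-deterministic
    # (e.g. machine-learnt) targeting subsystem: out-of-envelope requests
    # are refused by construction, not by operator vigilance.
    if not envelope.permits(lat, lon, range_km):
        return "REFUSED: outside legally reviewed envelope"
    return "PASSED TO HUMAN AUTHORIZATION"

envelope = EngagementEnvelope(max_range_km=10.0,
                              permitted_zone=(50.0, 30.0, 51.0, 31.0))
print(release_weapon(envelope, 50.5, 30.5, 8.0))   # inside limits
print(release_weapon(envelope, 52.0, 30.5, 8.0))   # refused by design
```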

The lesson for producers of any autonomous system is that system specifications should be based on the legal limitations on a product’s use. These then underpin all phases of design, testing, upgrading and disposal. This can be done, and it will reduce liability risks to manageable levels. However, it requires lawyers, regulators, marketers, engineers and computer scientists to collaborate closely throughout the product lifecycle, from concept to disposal.
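One way legally derived specifications can underpin the testing phase is to express each limit as an executable check. A small sketch, reusing the hypothetical EngagementEnvelope from above:

```python
def test_envelope_refuses_out_of_limits():
    # Each legal limit from the review becomes a verifiable requirement.
    env = EngagementEnvelope(max_range_km=10.0,
                             permitted_zone=(50.0, 30.0, 51.0, 31.0))
    assert not env.permits(lat=52.0, lon=30.5, range_km=8.0)   # outside zone
    assert not env.permits(lat=50.5, lon=30.5, range_km=12.0)  # over range
    assert env.permits(lat=50.5, lon=30.5, range_km=8.0)       # within limits
```

Each assertion traces back to a stated legal limit, so a failing test flags a liability issue before the product is fielded.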

Should we sit back, do nothing about our potential liabilities, let the courts decide who is liable, and then retrospectively change our designs and design processes to meet their rulings? No. As professional engineers, we must be proactive and initiate change based on sound engineering principles, turning liability into a manageable risk like any other.

Tony Gillespie is a visiting professor in the Department of Electronic and Electrical Engineering at UCL and author of the IET book ‘Systems Engineering for Ethical Autonomous Systems’. He co-authored the first paper translating the Geneva Conventions into engineering requirements and has written several papers on AI, autonomy and responsibilities.
