
PRIF Review 2023

Research Alliance CNTR Starts

What Does Artificial Intelligence Mean for Arms Control?

Illustration of artificial intelligence showing human bodies with brains and computer circuits highlighted in color.
Image: Placidplace via Pixabay (edited)

Artificial intelligence today influences almost all areas of civilian life as well as the military. Recent technological advances have demonstrated its transformative potential for society. The introduction of AI into military use in particular is being discussed from many different angles. While some fear a loss of control once human involvement is no longer technically necessary, ethical principles are being renegotiated and there are widespread calls for responsible use. At the same time, the use of AI systems also has the potential to improve arms control. As one of the cross-cutting topics in the Cluster for Natural and Technical Science Arms Control Research (CNTR), AI touches on almost all research topics in the project, which was launched in 2023.

Machine learning (ML), i.e. the development of statistical algorithms that can learn from data and generate new data, and artificial intelligence (AI) in the broader sense have made considerable progress in recent years. The potential applications, significance and dangers of these technologies are being discussed in almost all areas of life. As model developers continue to build more capable AI systems, the possibilities for both beneficial and harmful uses of AI will keep growing.
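To make the definition above concrete, the following minimal sketch (in Python, using synthetic data and the scikit-learn library, both chosen here purely for illustration) shows a statistical model that first learns its parameters from data and then generates new data resembling it:

```python
# Minimal sketch of the two parts of the definition above: a statistical
# model that (1) learns its parameters from data and (2) generates new data
# resembling what it was trained on. Data and model choice are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic "observations": two overlapping clusters of measurements.
data = np.concatenate([rng.normal(0.0, 1.0, (500, 2)),
                       rng.normal(4.0, 1.5, (500, 2))])

# Learn from data: estimate the parameters of a two-component mixture model.
model = GaussianMixture(n_components=2, random_state=0).fit(data)

# Generate new data: draw fresh samples from the fitted distribution.
new_samples, _ = model.sample(10)
print(new_samples)
```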

AI has a major impact on military technologies, for example through the optimization of battlefield management systems, of military administration and bureaucracy such as logistics and recruitment, or through the automatic recognition of military targets. It therefore has the potential to accelerate warfare.

As a biosecurity expert with a deep foundation in both the theoretical and practical aspects of bioengineering and genetics, I am exploring the advantages and potential dangers posed by emerging (bio)technologies, along with strategies for risk mitigation.

Kadri Reis

Aside from reinforcing existing dynamics, AI and ML might also have the potential to overturn power structures, or at least to challenge them. The dual-use character of these technologies – their potential to be used in both legitimate and illegitimate ways – affects current arms control and creates new problems for it.

For instance, malicious non-state or rogue actors could use highly capable models for assistance in creating biological threats. Large Language Models (LLMs) could potentially help to automate laboratories and research in general: experiments can be designed, planned and eventually run on automated equipment. Several tools that predict chemical reactions and perform retrosynthesis have already been developed, with the aim of automating chemical synthesis and thereby reducing the workload of researchers.
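Schematically, such a closed loop between a planning model and automated equipment might be structured as sketched below. Every function name, data structure and return value in this sketch is a hypothetical placeholder for illustration, not a real model or laboratory interface:

```python
# Purely schematic sketch of a closed loop between a planning model and
# automated lab equipment. Every function, data structure and value here is
# a hypothetical placeholder stubbed with toy data, not a real model or
# laboratory API.
from dataclasses import dataclass

@dataclass
class ExperimentPlan:
    target: str
    steps: list

def propose_plan(goal: str) -> ExperimentPlan:
    # Placeholder for a language model drafting and planning the experiment.
    return ExperimentPlan(target=goal,
                          steps=["prepare reagents", "run reaction", "analyze product"])

def run_on_equipment(plan: ExperimentPlan) -> dict:
    # Placeholder for the interface to automated laboratory hardware.
    return {"target": plan.target, "yield_percent": 42.0}

def evaluate(result: dict) -> bool:
    # Placeholder analysis step deciding whether the run met its goal.
    return result["yield_percent"] > 50.0

plan = propose_plan("compound X")        # design and plan the experiment
result = run_on_equipment(plan)          # execute it on automated equipment
if not evaluate(result):                 # unsuccessful runs inform the next proposal
    plan = propose_plan("compound X, adjusted conditions after low yield")
print(plan)
```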

As a political scientist specializing in arms control research, I look into the intersections of technological developments and political decision-making, and their impact on biological and chemical disarmament.

Una Jakob

That being said, AI may also help make arms control more effective and objective under certain conditions, for example in the evaluation of imagery from inspections or when distinguishing between a seismic event and a nuclear weapons test. AI has the potential to help estimate the yield of nuclear explosions and analyze them, thereby generating proliferation-relevant information on, for example, the design of warheads. Relevant technologies, such as seismic waveform analysis, already exist for both virtual testing and real-world nuclear test detection, but machine learning algorithms may make processing the large amounts of data involved both faster and easier.
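As a rough illustration of the discrimination task mentioned above, the following sketch trains a standard classifier to separate two classes of events. The two waveform-derived features and the synthetic training data are assumptions made purely for illustration; real discrimination pipelines rely on far richer seismic features:

```python
# Rough illustration of event discrimination with machine learning:
# a classifier trained to separate earthquakes from explosions. The two
# waveform-derived features and the synthetic data are stand-ins chosen
# for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 400

# Assumption for this sketch: the two event types differ, on average,
# in two features extracted from the recorded waveforms.
earthquakes = rng.normal([1.0, 0.3], 0.3, (n, 2))   # label 0
explosions = rng.normal([0.4, 0.8], 0.3, (n, 2))    # label 1
X = np.vstack([earthquakes, explosions])
y = np.array([0] * n + [1] * n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
clf = RandomForestClassifier(random_state=1).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```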

With my multidisciplinary background in mechanical engineering and war studies, I can assess both the technical limits of using AI for mechanical design purposes and its tactical usefulness in the theater.

Liska Suckau

In engineering, computer-aided design and simulation are not a novelty, but machine learning has the potential to accelerate optimization and thus to speed up the development of new designs, new materials and more efficient production chains, and to enable higher degrees of machine autonomy. In the military realm, the vision of a lighter design holds the promise of a tactical, operational or even strategic advantage over the adversary. One example is the extended reach of a fighter jet: its mission time is limited, among other things, by fuel consumption, which a lighter design could reduce.
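The following sketch illustrates, under simplified assumptions, how a machine-learned surrogate model can accelerate such optimization: a handful of expensive simulation runs (here replaced by a made-up analytic stand-in) are used to fit a cheap model, which is then searched densely for the lightest feasible design:

```python
# Illustrative sketch, under simplified assumptions, of surrogate-assisted
# design optimization. The "expensive simulation" is a made-up analytic
# stand-in for a structural analysis; wall thickness is the assumed design
# variable, and 2.0 mm is taken as the minimum thickness that still meets
# the strength requirement.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_simulation(thickness_mm):
    # Stand-in for a costly simulation run returning the structural mass.
    return 5.0 * thickness_mm + 0.5 * np.sin(3.0 * thickness_mm)

# A few sampled designs: the only "expensive" evaluations we pay for.
t_samples = np.linspace(2.0, 6.0, 8).reshape(-1, 1)
mass_samples = expensive_simulation(t_samples).ravel()

# Fit the cheap surrogate, then search it densely at negligible cost.
surrogate = GaussianProcessRegressor().fit(t_samples, mass_samples)
t_grid = np.linspace(2.0, 6.0, 500).reshape(-1, 1)
predicted_mass = surrogate.predict(t_grid)
best_t = float(t_grid[np.argmin(predicted_mass)][0])
print(f"lightest design within the feasible range: thickness = {best_t:.2f} mm")
```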

In the cyber sphere, the positive aspects of AI and ML generally carry over to cyberspace and software architecture as well. Here, the loss of human control is particularly relevant, since machine reaction times keep getting shorter while human response times do not. Nevertheless, AI-enhanced algorithms have the potential to detect slightly altered code (instead of looking only for exact matches) and to reveal cyber attacks more easily by identifying their “digital fingerprints”.
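The underlying idea of matching slightly altered code rather than exact signatures can be illustrated with a deliberately simple similarity measure; the n-gram length, the toy samples and the threshold below are arbitrary assumptions for illustration only:

```python
# Deliberately simple illustration of approximate matching: compare byte-level
# n-gram sets of two code samples and flag high overlap. The n-gram length,
# the toy samples and the threshold are arbitrary choices for this sketch.
def ngrams(data: bytes, n: int = 4) -> set:
    return {data[i:i + n] for i in range(len(data) - n + 1)}

def similarity(a: bytes, b: bytes) -> float:
    # Jaccard similarity of the two n-gram sets (1.0 means identical content).
    grams_a, grams_b = ngrams(a), ngrams(b)
    return len(grams_a & grams_b) / max(1, len(grams_a | grams_b))

known_sample = b"connect(); download_payload(); execute(); hide_traces();"
suspicious   = b"connect(); download_payload(); execute(); erase_logs();"

score = similarity(known_sample, suspicious)
print(f"similarity: {score:.2f}")
if score > 0.5:  # arbitrary threshold for this illustration
    print("flagged: resembles a known fingerprint despite modifications")
```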

As a computer scientist, I am analyzing the progress of AI and its use in weapon systems as well as how it can be regulated through arms control measures.

Thomas Reinhold

Introducing AI and ML to Lethal Autonomous Weapon Systems (LAWS) and drones will likely enhance their autonomy and counter their current problems of latency and disrupted or faulty communication links, but it may at the same time undermine human control. Whether AI will positively influence verification and arms control processes for LAWS and drones cannot yet be determined, since these weapons are still not subject to any concrete regulation.

As an economist with a basic background in econometrics, I can understand the fundamentals of modern learning models based on statistics and the means-ends rationality underlying AI.

Niklas Schörnig

These are only a few of the many ways in which AI and ML can pose challenges, but also create opportunities, for arms control. The example of AI illustrates the importance of monitoring and understanding new technologies and developments in the natural sciences within the context of peace and conflict research. In order to give scientifically sound recommendations for action, it is important not only to identify emerging problems at an early stage, but also to have the technical competence to address them. The Cluster for Natural and Technical Science Arms Control Research (CNTR) therefore consists of an interdisciplinary project team that brings together a variety of perspectives on AI and the cluster’s other research topics. (sal)

Infobox

About CNTR

The Cluster for Natural and Technical Science Arms Control Research (CNTR) researches militarily relevant new technologies and developments in the natural sciences from an interdisciplinary perspective. The researchers in the cluster analyze the effects on international security, classify them on a scientifically sound basis and develop recommendations for action to strengthen arms control. CNTR is a joint project between PRIF, the Technical University of Darmstadt and Justus Liebig University Giessen and is funded by the Federal Foreign Office for a period of four years (January 2023 to December 2026).