AutoNorms
Weaponised Artificial Intelligence, Norms, and Order
An International Research Project
Funded by the European Research Council (ERC) and hosted by the Centre for War Studies (CWS) at the University of Southern Denmark (SDU)
Weaponised artificial intelligence (AI) in the form of autonomous weapons systems (AWS) could diminish the role of meaningful human decision-making in warfare.
States already use more than 130 weapons systems that can track targets autonomously. Future autonomous weapons systems will increasingly incorporate AI into their critical functions, leaving machines, rather than humans, to make life-or-death decisions. This development is likely to change the international norms governing the use of force.
The EU-funded AutoNorms project will develop a new theoretical approach to study how norms, understood as standards of appropriateness, manifest and change in practices. It will investigate norm emergence and change across four contexts of practices (military, transnational political, dual-use and popular imagination) in the US, China, Japan and Russia. It will also review the impact AWS could have on the current international security order.
Our latest articles

Consequences of Using AI-Based Decision-Making Support Systems for Affected Populations
The following essay builds on remarks delivered by Ingvild Bode as part of the Expert Workshop “AI and Related Technologies in Military Decision-Making on the …

Regulation and Prohibition of Autonomous Weapons Systems: A Future Outside the CCW?
On 21 October 2022, Austria, on behalf of 70 states, delivered a joint statement on Lethal Autonomous Weapons Systems (LAWS) at the 77th United Nations …