AutoNorms
Weaponised Artificial Intelligence, Norms, and Order
An International Research Project
Funded by the European Research Council (ERC) and hosted by the Centre for War Studies (CWS) at the University of Southern Denmark (SDU)
Weaponised artificial intelligence (AI) in the form of autonomous weapons systems (AWS) could diminish the role of meaningful human decision-making in warfare.
States already use more than 130 weapons systems that can track targets autonomously. Future autonomous weapons, however, will increasingly integrate AI into their critical functions, leaving machines rather than humans to make life-or-death decisions. This development is likely to change international norms governing the use of force.
The EU-funded AutoNorms project will develop a new theoretical approach to study how norms, understood as standards of appropriateness, manifest and change in practices. It will investigate norm emergence and change across four contexts of practices (military, transnational political, dual-use and popular imagination) in the US, China, Japan and Russia. It will also assess the impact AWS could have on the current international security order.
Our latest articles

‘Responsible AI’ in the Military Domain: Implications for Regulation
This blog is based on the regulation subpanel of the Realities of Algorithmic Warfare breakout session, held at the REAIM Summit 2023. Watch the full…

AutoNorms at the UN GGE on LAWS in March 2023
The AutoNorms team regularly participates in meetings of the United Nations Group of Governmental Experts (GGE) on emerging technologies in the area of lethal autonomous weapons systems (LAWS).