Research Article


Can the UN GGE Go Beyond the Eleven Guiding Principles on LAWS?

The first session of the Group of Governmental Experts (GGE) on emerging technologies in the area of lethal autonomous weapons systems (LAWS) took place from 3 to 13 August 2021 in Geneva.

After a delay caused by the Covid-19 pandemic, states parties to the UN Convention on Certain Conventional Weapons (CCW) could formally continue the discussion on these technologies ahead of the CCW Sixth Review Conference, taking place in December 2021. It is then that the CCW High Contracting Parties will review the mandate of the GGE. As several delegations noted, there is some pressure for the group to demonstrate its progress in its final report, to be adopted before the Review.

The work on LAWS within the framework of the CCW has been going on for eight years. The guiding principles agreed in 2019 are considered to be the main achievement of those years. These principles are broad formulations of how states see the development and use of LAWS-related technologies. For instance, they state that international humanitarian law continues to apply fully to all weapons systems and that human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines.

However, as pointed out by the Campaign to Stop Killer Robots, these were “simply intended to guide the deliberations”, not “supposed to be an end in themselves”. Many delegations therefore treated the principles as a lowest common denominator and a basis to build upon, while arguing for the need to elaborate on them. But is it possible to make progress on the substance?

The August session has shown a mixed picture. It began with a broad discussion about topics such as the challenges posed by LAWS to International Humanitarian Law (IHL); finding a common understanding on the characteristics of these weapons systems; the human element in the use of force and the concept of “meaningful human control”; a review of potential military applications of these technologies; and possible ways to address the risks posed by LAWS.

In these debates, there were some broad elements of agreement, mostly along the lines of the 2019 principles. For instance, all states parties agreed that some form of human involvement must be retained in either the use of force or the use of LAWS. Similarly, no state party doubted the principle that the use of all weapons systems should be guided and regulated by existing IHL. Many delegations also agreed that there was no operational interest in having fully autonomous weapons because it would be counter-productive to develop systems which cannot be controlled.

However, given the sensitivity of the topic, the disagreements were easier to notice. There is no common definition of LAWS (or AWS, as some states prefer), which poses a problem for the states that believe a shared definition is necessary to move the dialogue forward. There is also no common understanding of the benefits and challenges of autonomy in weapons systems: some delegations, such as Palestine and Chile, say there is undeniable proof of the risks of LAWS, while others, including India and France, are against "demonising technology". Meanwhile, the US disagrees with the claim that AWS inherently pose challenges, as it has "found this not to be the case".

In terms of developing a normative-operational framework for LAWS, which is one of the tasks that the GGE received from CCW states parties, the spectrum of proposals is wide. As pointed out by Richard Moyes from Article 36, a member of the Campaign to Stop Killer Robots, most delegations have suggested some kind of framework, either in the form of a legal instrument, commitments, or a best practices guide for militaries to follow. However, the group of states arguing that existing IHL provides a sufficient framework, including Israel, Russia and the US, continues to maintain its position.

For states interested in weaponised artificial intelligence, such as the US, Israel and the UK, the incentives to pursue their technological development seem too important to support strict regulations. Meanwhile, other states' and campaigners' appeals to regulate LAWS based on humanitarian and moral concerns do not resonate with states parties such as Russia, who do not want to "divide weapons into bad or good". As John Williams writes, this is representative of a broader problem with regulating AI's advances in all aspects of society.

The Chair’s proposals: Exposing divisions further

Towards the end of the first week, the Chair, Marc Pecsteen de Buytswerve, drafted a paper on elements of possible consensus recommendations for the final report to be submitted to the CCW. Several delegations raised concerns about the form of the paper, with some arguing that it looked like a legally binding treaty or a political declaration.

After the paper was re-drafted, there were issues with some of the new language that the Chair introduced in an effort to find consensus. States such as the US, Russia, India, and Israel, demanded that any introduction of new wording should be justified, arguing that previously agreed terms work perfectly well. Others, such as the Philippines, Panama, and Austria, considered it important to show that the GGE made progress and did not keep re-using the same words.

Overall, the session was productive in the sense that every delegation clearly expressed the kinds of proposals and language it would like to see in a final report. Because the GGE is based on consensus, a balance has to be found between all the perspectives. The Chair seemed optimistic about the potential to do so in the next draft of his paper, due to be discussed at the GGE meeting in September. As the first session ended, however, many questions remained about how this will be done. “Let’s see” was the conclusion voiced by many participants and observers from academia and civil society.

What are the ways forward?

Many options are possible for the future of the global debate on LAWS. Among them is an extension of the GGE mandate to continue the discussions, probably after the group submits a final report roughly along the lines of the 2019 guiding principles to the Review Conference.

Another would be for the interested states parties to move the debate on LAWS into another venue, such as an ad hoc process led by an interested state, possibly a small or middle power. Pointing to the example of other disarmament talks, such as those on anti-personnel landmines or cluster munitions, some experts have suggested this option. Would this be a productive approach?

On the one hand, the states most active in developing these technologies would most likely not participate, which could lead to another case of "haves vs. have-nots" and more polarisation in the debate. On the other hand, moving beyond the CCW to agree a legally binding treaty could launch the process of setting legal norms on LAWS. Even if it is not a universal ban, it could have an impact on the practices of states to which LAWS technologies can proliferate.

Featured image credit: Anna Nadibaidze
