AI, Autonomy, and Dual-Use Technology: The Legal Dilemmas of Modern Warfare
Artificial Intelligence (AI) is redefining the nature of modern warfare, raising questions about the ethical deployment and operation of military systems. International Humanitarian Law (IHL) seeks to balance the principles of humanity, military necessity, and proportionality, standards whose application becomes increasingly blurred with the advent of AI. Dual-use technologies such as AI are essential to commercial innovation and research and development (R&D), but their military applications generate regulatory challenges relating to export controls, due diligence obligations, and the allocation of liability and accountability for harm caused by autonomous systems.
Understanding Dual-Use Technologies
The term "dual-use technology" emerged during World War II in the context of export controls on goods with both civilian and military applications; it denotes technologies that serve civilian purposes as well as military ones, and their impact on modern warfare is manifold. Until the late 20th century, defence research was the driving force behind technological progress in civilian sectors. The rise of high-tech commerce has since reversed this relationship between military and civilian advancement, with transactions now taking place between multinational companies worldwide. Technology companies are taking the lead in defence innovation: private firms hold the technical know-how and employ the high-tech experts, supplying their own governments and others through exports.
Overarching Legal Issues
No single international legal instrument defines dual-use technology; instead, it is governed through multilateral export control regimes that harmonise control lists, prevent proliferation, and establish global standards. The four major multilateral non-proliferation export control regimes are: 1) the Wassenaar Arrangement, 2) the Missile Technology Control Regime (MTCR), 3) the Australia Group, and 4) the Nuclear Suppliers Group (NSG). Together, these regimes form the foundational governance architecture for dual-use technologies; however, their traditional scope and enforcement mechanisms face significant challenges when applied to rapidly evolving and intangible technologies such as AI.
The legal and philosophical debates on dual-use AI and Lethal Autonomous Weapon Systems (LAWS) are vast. The debate centres on fully autonomous systems capable of selecting and engaging targets on the battlefield without human intervention. The complex integration of AI into military systems fundamentally changes the nature of warfare, requiring careful deliberation on how the enduring principles of the Law of Armed Conflict (LOAC) apply. Article 36 of Additional Protocol I to the Geneva Conventions provides the legal framework for the review of new weapons, means, and methods of warfare.
Within this context, the key IHL principles for the fusion of AI and military systems remain fundamental. The principle of distinction requires parties to distinguish between civilians and combatants, and between civilian objects and military objectives, with the status of objects assessed by their actual rather than intended use (Article 48, Additional Protocol I); indiscriminate attacks are prohibited (Article 51(4)). The principle of proportionality prohibits attacks expected to cause incidental civilian harm that would be excessive in relation to the concrete and direct military advantage anticipated, judged ex ante on all available information (Article 51(5)(b), Additional Protocol I). Precautionary obligations require all feasible measures to verify targets and minimise civilian harm during planning and execution (Article 57, Additional Protocol I). Accountability under international humanitarian law spans both state and individual responsibility; while attribution may be complex for AI-enabled or autonomous systems, legal responsibility persists when harm occurs.
Against this legal backdrop, the ongoing debate remains nascent, largely framed by the Convention on Certain Conventional Weapons (CCW). The CCW emphasises three interrelated objectives: first, limiting the methods and means of warfare to reduce harm to both combatants and civilians; second, prioritising the protection of civilian populations from the consequences of hostilities; and third, prohibiting means and methods of warfare that inflict superfluous injury or unnecessary suffering.
India’s Export Control and AI
India’s export controls rest on the Foreign Trade (Development & Regulation) Act, 1992 (FTDR Act), which provides the statutory framework for regulating international trade, and on the Foreign Trade Policy (FTP) issued under it. Dual-use items are regulated through the Special Chemicals, Organisms, Materials, Equipment, and Technologies (SCOMET) list. India is a member of the MTCR, the Wassenaar Arrangement, and the Australia Group, which facilitates the SCOMET list’s alignment with global export control regimes. India is not a formal member of the Nuclear Suppliers Group (NSG), though it adheres to NSG guidelines and has sought membership.
Following a 2016 decision of the CCW States Parties, a Group of Governmental Experts (GGE) on LAWS was convened under the aegis of the United Nations Office for Disarmament Affairs (UNODA), with India’s Ambassador Amandeep Singh Gill serving as its first chair. The GGE’s deliberations addressed the legal challenges that LAWS pose under IHL, particularly the questions of liability and accountability.
Conclusion
The integration of artificial intelligence into military systems reveals structural challenges in applying IHL to LAWS and dual-use technologies. The effective application of IHL principles relies on meaningful human control, thorough weapons review processes under Article 36 of Additional Protocol I, and export control and due diligence systems suited to intangible, commercially driven technologies such as AI. States bear the primary responsibility for ensuring compliance with IHL, especially when private actors across borders develop, supply, or enable military capabilities. This requires coordination between governments and the private sector through public-private partnerships and a broader multi-stakeholder approach.