Traditional frameworks of international law are struggling to keep pace in an era of rapid technological advancement. From artificial intelligence (AI) and quantum computing to autonomous weapon systems, emerging and disruptive technologies (EDTs) are reshaping not only the geopolitical landscape but also the nature and modus operandi of warfare. As states compete to dominate these frontiers, the need for robust, adaptive and forward-looking legal frameworks has become more urgent than ever.
Historically, international law has evolved reactively, typically only after the disruptive effects of emerging technologies have already manifested. Cyber norms, for instance, began to take shape only after state-sponsored hacking and data warfare surged in the twenty-first century. In the case of EDTs, however, such a reactionary approach could prove perilous. Their dual-use nature, serving both military and civilian purposes, combined with their borderless reach and rapid pace of innovation, calls for a proactive approach to formulating legal mechanisms for their regulation.
The most profound challenge in regulating EDTs lies in their complexity and rapid evolution. Unlike conventional weapons or nuclear technologies, which are physically identifiable and can be managed through state-centric treaties, these technologies are largely intangible and mostly developed by private entities. AI algorithms operate within a decentralized global system; they are proprietary and evolve over time. Who bears accountability when an AI system makes a lethal decision on the battlefield remains an open question in international law. Domains such as cyberspace and outer space are already marked by regulatory ambiguity, and the deployment of EDTs in these areas risks widening the normative gap manifold. The UN Charter and the Geneva Conventions fall short of providing explicit legal guidance on issues such as algorithmic bias and autonomous weapon systems.
In the domain of AI and Lethal Autonomous Weapons Systems (LAWS), the United Nations Convention on Certain Conventional Weapons (CCW) has initiated debate on formulating norms and potential regulatory bans. Ethical, humanitarian, legal and security concerns led to discussions on the use and development of LAWS within the UN Group of Governmental Experts on LAWS (GGE) under the CCW in 2016. The CCW framework, established in 1980, became an integral part of international humanitarian law (IHL) by banning or restricting weapons that could produce indiscriminate effects. In 2019, the GGE released eleven guiding principles on the use and development of LAWS. These principles, which include accountability, risk mitigation and compliance with humanitarian law, have laid the foundation for regulation; in essence, they affirm that IHL applies to all weapons systems, including LAWS.
However, progress within the GGE toward a binding norm to regulate LAWS has been slow, owing to the shortcomings of its consensus-driven process, in which dissent from a single member state is enough to reject a proposal. This slow pace has raised concerns about a rapidly narrowing window for establishing an effective regulatory mechanism for these weapons, and has prompted the emergence of processes outside the GGE. These parallel discussions emphasize the risks associated with EDTs and advocate urgent negotiation of a binding treaty to regulate and ban LAWS. The UN Secretary-General and the International Committee of the Red Cross, for instance, have called for a treaty to regulate and ban autonomous weapons systems by 2026. At the same time, states worry that multiple parallel processes may fragment the normative and regulatory debate on EDTs and undermine consensus on common approaches to regulating them.
Furthermore, the movement to ban autonomous weapons has gained momentum alongside the GGE's ongoing efforts to regulate EDTs. Civil society organizations such as Stop Killer Robots, together with the International Committee of the Red Cross, are at the forefront in expressing dissatisfaction with the GGE's work, highlighting its failure to make progress on regulating autonomous weapons. Militarily significant states, for their part, have resisted the formulation of a new binding instrument, arguing that existing international humanitarian law is sufficient to regulate these weapons.
Nonetheless, efforts to address these gaps are under way at multiple multilateral forums. In the First Committee of the United Nations General Assembly (UNGA) on Disarmament and International Security in December 2023, Austria tabled a resolution, co-sponsored by over forty states, setting a provisional agenda for discussion on LAWS. At the 2024 UNGA session the issue was discussed for the first time, signalling member states' intent to take the discussion beyond the GGE. At the 2023 Latin American and Caribbean Conference on the Social and Humanitarian Impacts of Autonomous Weapons, the Belén Communiqué was adopted, calling for swift treaty negotiations to regulate LAWS. Major military powers such as France, Russia and the United States did not endorse the communiqué, yet they participated in the conference as observers. Similar efforts to regulate EDTs were made in the CARICOM Declaration by Caribbean states in 2023 and the Freetown Communiqué by the Economic Community of West African States in 2024.
Nevertheless, diverging national interests continue to delay progress. States such as Brazil and Austria call for a preemptive ban on "killer robots", while major powers such as the United States and Russia advocate non-binding guidelines, mindful of the strategic and economic dividends of AI militarization.
Soft law, comprising norms and codes of conduct, has emerged as a practical alternative given the challenges of treaty-making in a divided international system. The Tallinn Manual, though non-binding, serves as a reference point for applying existing international law to the cyber domain, outlining how principles such as sovereignty and proportionality might apply to cyber conflicts. The biggest hurdle in cyberspace, however, lies in enforcement, as both attribution and legal accountability remain elusive.
Likewise, the absence of a binding global treaty on LAWS remains a major obstacle to managing the harmful effects of EDTs. A binding treaty on LAWS has been under discussion in the GGE since 2016, but consensus has yet to be reached. Moreover, a ban on fully autonomous weapons does not fall within the scope of existing law such as the Geneva Conventions, while frameworks like the EU AI Act and UNESCO's AI ethics framework are limited in scope and lack enforcement mechanisms. Though the evolution of law will always lag behind the pace of innovation, it need not become obsolete. By embracing a dynamic, inclusive and principled approach to technological advancement, the international community can transform these technologies into a force for global good. International law must evolve faster in this era of innovation; otherwise, technology may end up controlling humanity rather than the other way around.
This article was published in another form at https://moderndiplomacy.eu/2025/07/04/how-international-law-is-adapting-to-emerging-and-disruptive-technologies/
Ms Nawal Nawaz is Research Assistant at the Center for International Strategic Studies (CISS), Islamabad.






