In 2025, autonomous weapon systems were no longer discussed primarily as future concepts or ethical hypotheticals. They became a routine part of modern warfare. From the battlefields of Ukraine to the dense urban environment of Gaza, autonomy, artificial intelligence, and robotics were increasingly embedded in how wars were fought, planned, and sustained. The year did not mark the arrival of fully independent “killer robots” operating without human involvement, but it did confirm a decisive shift: autonomy had become structural rather than experimental.
Most systems deployed in 2025 remained human-supervised, yet the role of software expanded rapidly. Algorithms increasingly handled navigation, target recognition, threat prioritisation, and mission planning, while human operators intervened mainly at key decision points. This hybrid model blurred the distinction between remotely operated weapons and autonomous systems, complicating both military doctrine and legal interpretation.
Rather than replacing soldiers, autonomy reduced reaction times, extended operational reach, and allowed a small number of operators to manage large numbers of platforms simultaneously. This logic shaped developments across air, land, and sea.
Ukraine: A Living Laboratory for Autonomous Warfare
The war in Ukraine continued to function as the most visible testing ground for autonomous and AI-enabled weapons in 2025. Aerial drones with onboard computer vision became more resilient to jamming and less dependent on continuous operator input. They could identify vehicles, fortifications, or heat signatures autonomously, adjusting flight paths and attack angles in real time.
On the ground, uncrewed ground vehicles moved beyond logistics and reconnaissance. Armed tracked robots, often remotely supervised but capable of autonomous movement and target tracking, were used for ambushes, perimeter defence, and high-risk operations. These systems reduced exposure for infantry while extending firepower into contested zones.
At sea, Ukraine expanded its use of autonomous surface and underwater drones. Maritime drones had already altered naval dynamics in the Black Sea, and in 2025 claims of autonomous underwater strikes highlighted how autonomy was reshaping naval warfare as well. The trend pointed toward a future in which smaller states could challenge traditional naval powers using relatively low-cost autonomous systems.
Gaza: AI in Dense Urban Conflict
In Gaza, autonomous and AI-assisted systems were integrated into intelligence, surveillance, and targeting workflows. Rather than acting independently, algorithms processed vast amounts of data from sensors, drones, and communications to generate target recommendations and prioritisation lists. Human commanders remained formally responsible for strike decisions, but automation increasingly shaped the speed and scale of operations.
The use of AI-driven decision support in such a dense civilian environment intensified debates around proportionality, accountability, and civilian protection. Human rights organisations questioned whether meaningful human control could be maintained when algorithms filtered, ranked, and accelerated lethal decisions under combat pressure.
Swarms, Scale and Speed
One of the most significant shifts in 2025 was the move toward scale. Militaries invested heavily in swarm concepts, in which dozens or even hundreds of autonomous drones operate together, coordinating their movements and adapting collectively to threats. While fully autonomous swarms remained limited, supervised swarm operations became increasingly practical, with a single operator overseeing many systems rather than piloting each one individually.
This approach promised saturation, redundancy, and psychological impact, while also lowering costs compared to traditional platforms. It reinforced a broader trend: autonomy favoured numbers, speed, and adaptability over individual platform sophistication.
Beyond the Battlefield: Industry and Arms Dynamics
The acceleration of autonomous weapons was not limited to active war zones. Defence industries expanded rapidly, with autonomy spreading into underwater drones, loitering munitions, robotic air defence, and autonomous logistics. Lessons learned in Ukraine and Gaza fed directly into procurement programmes worldwide, tightening the feedback loop between conflict and innovation.
This dynamic raised concerns about proliferation. Autonomous systems, particularly aerial and maritime drones, proved easier to replicate, modify, and export than traditional weapons, lowering barriers for both state and non-state actors.
Law, Ethics and the Lag of Governance
Throughout 2025, international discussions on lethal autonomous weapons continued, particularly at the United Nations. While there was broad agreement on the importance of human control, states remained divided on definitions, thresholds, and enforcement. Technology continued to advance faster than regulation, leaving gaps in accountability and legal clarity.
The core issue remained unresolved: when algorithms meaningfully shape lethal outcomes, responsibility becomes harder to assign, even if a human remains formally “in the loop”.
A New Normal for Warfare
By the end of 2025, autonomous weapon systems were no longer framed as emerging technology. They were part of the standard toolkit of modern armed conflict. The wars in Ukraine and Gaza demonstrated that autonomy does not arrive as a single dramatic leap, but as a steady accumulation of software, sensors, and machine intelligence embedded into existing weapons.
The result is a form of warfare that is faster, more data-driven, and increasingly shaped by machines—while humans struggle to keep pace, both operationally and ethically.
