UNSW Expert Calls for Ban on AI-Powered Autonomous Weapons

by Marco van der Hoeven

Comparing the potential consequences of artificial intelligence (AI) weaponry to the horrors caused by chemical and biological weapons in the past, a leading AI expert from the University of New South Wales (UNSW) has made a strong case for banning autonomous systems from the battlefield.

In the latest episode of UNSW’s ‘Engineering the Future’ podcast series, Scientia Professor Toby Walsh, chief scientist at UNSW’s AI Institute, expressed grave concerns about the rising use of lethal autonomous weapons in current conflicts, notably the ongoing situation in Ukraine. He argues that these weapons should be added to the UN’s Convention on Certain Conventional Weapons, which regulates emergent weapon technologies.

The Geneva Conventions, the first of which was established in 1864, set boundaries on the weapons and tactics of warfare with the aim of limiting its brutality. Chemical and biological weapons, after the devastating effects witnessed during World War I, have been banned since the Geneva Protocol of 1925. Walsh now advocates adding AI-powered weaponry to this list of prohibited armaments.

Walsh, who was previously banned from Russia after questioning its claims about developing a more “humanitarian” AI-powered anti-personnel landmine, also highlights other autonomous weapons currently deployed in the Ukraine conflict that he believes should be banned.

The professor elaborates on the potentially transformational impact of AI on warfare: “I’m pretty sure historians will look back at the Ukrainian conflict and note how drones, autonomy, and AI started altering the nature of combat – and not in a favorable way,” he remarks.

Walsh raises the point that handing over the act of killing to machines may fundamentally alter warfare’s character. From a legal standpoint, machines cannot be held accountable under international humanitarian law, which relies on principles like distinction and proportionality. “Law is about holding people accountable,” Walsh notes, pointing out that machines lack this responsibility.

He further stresses that the unpredictable, chaotic battlefield is the worst possible environment for robots, which lack human judgment. The moral implications of replacing humans with machines in warfare are, according to Walsh, even graver. “War is sanctioned because it’s one person’s life against another,” he says. “When you replace a soldier with a machine, you remove the potential for empathy, understanding, and accountability.”

Holding a hopeful outlook for the future, Walsh concludes, “I believe we will eventually recognize the dangers of autonomous weapons, much like we did with chemical and biological ones. My concern is that we might only do so after witnessing the devastating consequences of their misuse.”
