Ahead of discussions at the United Nations in Geneva on autonomous weapons, the Future of Life Institute has published a sequel to Slaughterbots, the grim warning video that in 2017 alerted the world to the dangers of autonomous weapons. In the new video, those lethal autonomous weapons have arrived.
The Future of Life Institute, working with the Campaign to Stop Killer Robots and the International Red Cross, defines slaughterbots, also called “lethal autonomous weapons systems” or “killer robots”, as weapons systems that use artificial intelligence (AI) to identify, select, and kill human targets without human intervention. Whereas in the case of unmanned military drones the decision to take life is made remotely by a human operator, in the case of lethal autonomous weapons the decision is made by algorithms alone.
Slaughterbots are pre-programmed to kill a specific “target profile.” The weapon is deployed into an environment where its AI searches for that “target profile” using sensor data, such as facial recognition. When the weapon encounters someone the algorithm perceives to match its target profile, it fires and kills. The campaigners argue that weapons which kill by algorithm, rather than by human judgement, are immoral and a grave threat to national and global security.
The International Committee of the Red Cross (ICRC) says of killer robots: “We do not need to be resigned to an inevitable future with Slaughterbots. The global community has successfully prohibited classes of weaponry in the past, from biological weapons to landmines.”
As with those efforts, the ICRC recommends that states adopt new legally binding rules to regulate lethal autonomous weapons. Importantly, the ICRC does not recommend a prohibition of all military applications of AI, only of specific types of autonomous weapons. Many applications of military AI already in use do not raise such concerns, such as automated missile defense systems.
The United Nations’ Convention on Certain Conventional Weapons (CCW) in Geneva previously established a Group of Governmental Experts on Lethal Autonomous Weapons to debate this issue and to develop a new “normative and operational framework” for consideration by states. The group has produced a set of eleven non-binding Guiding Principles on Lethal Autonomous Weapons from which to develop a new instrument, and is expected to share the output of its discussions in a report to states for the Sixth Review Conference later this month.