Bad Robots: Autonomous Weapons and How Fast We Are Approaching Killer Robots

Bad Robot Outcome:
The proliferation of AI is making things once thought impossible both possible and easily accomplished. Technological advances in arms and military theatres are accelerating the adoption of autonomous weapons, and killer robots are increasingly becoming a reality. Such weapons make killing humans easy and provide an accessible means of mass violence for authoritarian states and non-state actors, terrorists among them.

The Story

A war is raging between two countries. The invading country wants to minimise human casualties on its own side, so it deploys an army of thousands of small killer drones with instructions to kill anything in the city it seeks to conquer. Within a few hours, the city is quiet and safe for occupation, but there is one problem: thousands of dead bodies line the streets.
Such a scenario sounds like a scene from 'The Terminator', but it is a reality that closes in on us daily. Battles are already being fought with automated weapons. The recent conflict between Armenia and Azerbaijan over the Nagorno-Karabakh region showed how much difference armed drones can make in an armed conflict, and the killing of Iranian general Qasem Soleimani by the US was carried out by a drone operated from thousands of miles away.

The fallout

Keeping a human in the loop has always been an essential aspect of military operations, but that safeguard is slowly fading away. Advances in AI, robotics, and image recognition are bypassing the human in the loop to create weapons systems that can find targets and apply lethal force without human involvement. Most powerful militaries are continually refining their autonomous weapons to eliminate the human in the loop. Some countries already use kamikaze drones, commonly known as loitering munitions: once launched, these drones can loiter in the air, wait to identify a target, and engage it automatically without any human intervention.

Loitering munitions and other military projects, such as DARPA's CODE, continually increase autonomy, programming drones to adjust and improvise as they carry out their missions. Russia and the US, in particular, have pursued missiles that can home in on a target and attack it without human input.

These advances leave many asking: what could go wrong?

In the short film Slaughterbots, terrorists release a swarm of tiny drones into a classroom. The drones attack the students with deadly force while the terrorists sit miles away from the actual theatre of attack. The film shows how advances in autonomous weapons set the stage for terrorists and other non-state actors to carry out mass murders of civilians.


Such drones are increasingly common in actual conflicts, such as the Gaza-Israel conflict and in Iraq, where drones are used to target civilians and military installations. Advances in drone technology will make autonomous drones cheap, and their resulting proliferation raises the probability of local conflicts turning into genocides. There is no telling what could happen if these weapons went rogue, but as sci-fi films have shown, it would be apocalyptic.

Our view

To avoid the eventual consequences of proliferating autonomous weapons, countries have been pushing for a pre-emptive ban on such weapons, similar to the 1972 ban on bioweapons. Although reaching consensus on such a ban faces real challenges, countries must ensure that AI weapons development remains careful, open, collaborative, and transparent to avoid an autonomous arms race that could spiral the world into chaos of unprecedented magnitude.