
The Advent of Killer Robots
An unmanned drone. Image: Adobe Stock.
  • Next-generation warfare involves human-machine teaming (HMT).
  • Human-AI interfaces are more error prone than traditional aircraft controls.
  • Fully autonomous robots offer a solution but pose ethical challenges.

At the core of the military’s use of human-machine teaming (HMT) is artificial intelligence that programs flights and targets drone strikes. But while drones remove combat pilots from dangerous situations, the complexity of the human-AI interface increases the probability of human error. The solution may lie in fully autonomous drones, but ethical problems might prevent “killer robots” from ever becoming a reality.

Drone pilots face many challenges

Human factors, like difficulty interpreting the AI interface and working in an unstimulating environment, are responsible for a higher percentage of major mishaps in Predator drones than in manned aircraft [1]. The amount of information presented on multiple screens, the control requirements, and the necessary human information processing exceed those of most work domains [2]. Phil Hall, a pilot for NASA’s Global Hawk research drone, told NBC News that combat pilots in traditional aircraft can sense problems immediately, “but when there’s a drone involved,” he said, “there’s a bit of translation, and there’s only so much of the situation you can read” [3].

Another reason for mishaps: the job of a drone pilot is uncomfortable and boring. Mary “Missy” Cummings, associate professor of aeronautics and astronautics and engineering systems at MIT, says that the unstimulating work environment makes it difficult for a drone pilot to “jump into action” in the rare event human intervention is needed [4].

Is a fully autonomous drone the answer?

The military has touted fully autonomous drones as a solution to the challenges posed by human-AI interfaces. Autonomous weapons can react in a fraction of the time a human would need, and their superior aim may reduce collateral harm [5].

The technology is advancing so rapidly that “large-scale adversarial swarms” are “an imminent threat,” U.S. Naval Postgraduate School engineering professor Isaac Kaminer told Forbes, outlining his institution’s plan to model swarms of millions of drones [6]. Other countries are also developing the technology. For example, India has announced plans to build a swarm of 1,000 autonomous drones [7]. According to a U.N. report, Turkey has already deployed autonomous weapons: the STM Kargu, a rotary-wing kamikaze drone produced in Turkey, attacked members of the Libyan National Army without requiring data connectivity with an operator [8].

STM Kargu. Credit: Armyinform via Wikimedia Commons.

The U.S. National Security Commission on AI states that properly designed, tested, and used autonomous weapon systems could reduce the risk of accidental engagements and decrease collateral damage. However, machine learning systems rely on data to draw conclusions about what to target, writes Zachary Kallenborn, a research affiliate with the Unconventional Weapons and Technology Division of the National Consortium for the Study of Terrorism and Responses to Terrorism. This data dependence makes them brittle, he continues: subtle environmental factors like color differences, tree branches, or fog can result in a case of mistaken identity. More alarmingly, he writes that nefarious governments could use the systems to carry out ethnic cleansing [5].
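
Kallenborn’s brittleness argument is easy to illustrate with a toy model. The Python sketch below is purely hypothetical: it uses synthetic data and a deliberately simple nearest-centroid classifier, with no connection to any real targeting system, and its class labels are invented for illustration. It shows how a mild, fog-like brightness shift can push the same object across a decision boundary learned from clean training data.

import numpy as np

# Hypothetical toy example: a nearest-centroid classifier over synthetic
# 8x8 grayscale "images" (64 pixel values per image).
rng = np.random.default_rng(0)

# Training data: class A objects image darker, class B objects brighter.
class_a = rng.normal(loc=0.30, scale=0.05, size=(100, 64))  # "armored vehicle"
class_b = rng.normal(loc=0.45, scale=0.05, size=(100, 64))  # "civilian truck"

centroid_a = class_a.mean(axis=0)
centroid_b = class_b.mean(axis=0)

def classify(image):
    # Assign the label of the nearest class centroid.
    dist_a = np.linalg.norm(image - centroid_a)
    dist_b = np.linalg.norm(image - centroid_b)
    return "armored vehicle" if dist_a < dist_b else "civilian truck"

# A clean image of a class A object is classified correctly...
clean = rng.normal(loc=0.30, scale=0.05, size=64)
print(classify(clean))   # -> armored vehicle

# ...but a uniform brightness shift, as fog or glare might cause,
# moves the very same object into the other class.
foggy = clean + 0.12
print(classify(foggy))   # -> civilian truck

Real targeting models are vastly more capable than this sketch, but the failure mode is the same one Kallenborn describes: inputs that drift outside the training distribution get confidently misread.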

The most concerning issue is that machines cannot comprehend the value of human life or make ethical choices. The trajectory of AI-enabled warfare is eerily reminiscent of the Terminator franchise, in which Judgment Day, a full-blown nuclear attack by the machines on humans, happened shortly after human decisions were removed from strategic defense. We already have AI that can write its own code; whether “killer robots” will be able to write their own mission parameters remains to be seen. In the meantime, you might want to heed Sarah Connor’s advice and stock up on that two million sunblock.

References

Image credit: STM Kargu, Armyinform.com.ua, CC BY 4.0 <https://creativecommons.org/licenses/by/4.0>, via Wikimedia Commons

[1] Too Many Screens: Why Drones Are So Hard To Fly, So Easy To Crash

[2] Fortifying Remote Warriors: Addressing Wellness Issues Among Intelligence Airmen

[3] In the virtual cockpit: What it takes to fly a drone

[4] Driving drones can be a drag

[5] National Security Commission on Artificial Intelligence Final Report

[6] The U.S. Navy Plans To Foil Massive ‘Super Swarm’ Drone Attacks By Using The Swarm’s Intelligence Against Itself

[7] In a first, India demonstrates combat drone swarm system

[8] UN Security Council Letter