Robots In War – New Technology, Old Problems

ADM-20 “Quail” decoy drone
in service 1957

Combat Drones – the very term conjures images of Arnold Schwarzenegger strutting his stuff and intoning "I'll be back." The reality, however, is quite different from the sci-fi fantasies found in the works of Isaac Asimov or of the Wachowskis. Robotic systems are well established on the modern battlefield, and the new concerns may be nothing more than debates over closing the barn door long after the cows have fled.

The idea of a combat drone is nothing new. The first recorded unmanned aerial attack even pre-dates the airplane: in 1849, Austria launched hydrogen-filled balloons carrying explosives against Venice, dropping them behind the enemy lines. Serious development, however, came with the World Wars. World War I saw early work on what were called radio planes, but it was World War II that saw them come into their own, as targets for anti-aircraft gunners to practice on. Here is a picture of one "Norma Jeane" working at a munitions factory, where she is helping to build one of these target drones. You may know her today by her stage name, Marilyn Monroe:

Of course, generals concerned over the lives of the men under their command turned to these drones for use as "aerial torpedoes": unmanned ground-attack aircraft. The idea was to create an unmanned attack-vehicle control system that could be retrofitted to obsolete or worn-out aircraft no longer safe for men to fly. Some were nothing more than flying bombs, but others were outfitted with bombs to drop and cannons to fire along pre-determined lines. It was during the testing of these systems that Joseph Kennedy Jr., older brother of future president John F. Kennedy, lost his life. Even with the accidents during testing, aerial torpedoes entered service and were credited with destroying a number of enemy installations and warships. The Germans invested heavily in this technology as well, with the infamous V-1 "buzz bomb" seeing service in the final year of the war. The Japanese, reportedly confused at seeing aircraft apparently committing suicide, responded in kind, spawning the Kamikaze.

After the war, research continued. Propeller-driven craft gave way to jets, and a new family of drones was born. Eventually, advances in solid rocket motor technology enabled the creation of guided missiles as a derivative of drone research, but drone development continued and grew ever more sophisticated. Many systems took on "fire and forget" capabilities, finding their own targets based on criteria fed to them before launch.

During the 1980s, crude computerized intelligence became practical, enabling the rise of systems such as the Phalanx ship-defense system, which can detect and respond to a wide variety of anti-ship weapons on its own. Advances in communications technology, meanwhile, enabled remote-controlled drones to be flown over far greater distances; by the 1990s, an operator could exercise fine control over a drone from the other side of the globe.

This added capability enabled drones to do far more on the battlefield than launch a half-dozen rocket-propelled grenades, as the 1980s-era Iranian Mohajer (above) could do. A modern drone such as the MQ-9 Reaper (below) can employ smart weapons, striking specific ground targets after hours of loitering overhead.
The concern arises from combining this new computerized intelligence with the enhanced capabilities of drones. Eventually, the ability of systems such as the Phalanx to pick targets and engage without a human in the loop will be brought to unmanned aircraft such as the Reaper.

It is this concern which brought forth a report from Human Rights Watch, "Losing Humanity: The Case Against Killer Robots," which paints a bleak picture based on worst-case scenarios, citing systems such as the Phalanx and questioning their ability to distinguish friend from foe. Countering that argument are papers such as "Accelerating AI" by John McGinnis, which argues that as computerized intelligence transitions to true artificial intelligence, a morality system is a key component of that transition. Backing McGinnis up is Georgia Tech professor Ronald C. Arkin, whose research suggests that machines, their judgement unclouded by emotions such as fear or aggression, pick their targets methodically and carefully. With a hard-wired judgement system, they respond in predictable ways, resulting in less, not more, bloodshed in simulations.

Some people point to civilian deaths caused by drones such as the Predator as evidence of the dangers of drone warfare. When studied, however, the mistakes came not from the machines but from the men operating them: an incorrect identification, a technician's error. The deaths of innocents in drone strikes have been due to human error; the drones did not make the mistakes. Supporters of drone warfare cite this as evidence that the man in the machine, not the machine itself, is the problem. The New York Times interviewed Dr. John Arquilla of the Information Operations Center at the Naval Postgraduate School, who had this to say:

A lot of people fear artificial intelligence. I will stand my artificial intelligence against your human any day of the week and tell you that my A.I. will pay more attention to the rules of engagement and create fewer ethical lapses than a human force.

He further went on to say:

Some of us think that the right organizational structure for the future is one that skillfully blends humans and intelligent machines. We think that that’s the key to the mastery of 21st-century military affairs.

At no point is he arguing for machines to make the decisions, but rather for them to be part of the system that makes and executes decisions. Returning to the earlier example, the Phalanx is only switched on when the ship is on alert and prepared to defend itself if attacked. That is the arrangement Dr. Arquilla believes is the future, and based on the successful record of the Phalanx and of drones over the past century, history would tend to agree with him.

For good or ill, automated combat machines are part of modern warfare, and the focus should be on ensuring the safety of these systems. After all, it is not just nations like the United States or China that can field drones; the capability is within reach of nearly everyone, great powers and terror states alike. The key to their safe use is the widespread adoption of an "Ethical Governor" of the kind Arkin has proposed.
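To make the idea concrete, here is a minimal sketch of the kind of hard-wired, deterministic gate an "Ethical Governor" describes: every engagement a targeting system proposes must pass a fixed set of constraints before weapon release is permitted. All of the names, fields, and rules below are illustrative assumptions for this essay, not the design of any real system or of Arkin's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class ProposedEngagement:
    target_type: str              # e.g. "armored_vehicle", "building"
    positively_identified: bool   # sensor confidence met the ID threshold
    civilians_in_blast_radius: int
    inside_engagement_zone: bool  # within the pre-authorized area

def ethical_governor(e: ProposedEngagement) -> bool:
    """Permit weapon release only if every hard constraint holds.
    The rules are fixed and deterministic, so behavior is predictable:
    the property the simulations described above rely on."""
    if not e.positively_identified:
        return False   # never fire on an unconfirmed target
    if not e.inside_engagement_zone:
        return False   # never fire outside the authorized area
    if e.civilians_in_blast_radius > 0:
        return False   # hard floor: no engagement with civilians at risk
    return True

# A confirmed vehicle, in-zone, with no civilians nearby is permitted;
# the identical engagement with civilians present is vetoed.
ok = ethical_governor(ProposedEngagement("armored_vehicle", True, 0, True))
vetoed = ethical_governor(ProposedEngagement("armored_vehicle", True, 2, True))
print(ok, vetoed)  # True False
```

The point of the sketch is that the governor sits between the targeting intelligence and the weapon: the machine may propose, but a fixed rule set, not fear or aggression, disposes.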

Mankind is a competitive animal, driven by nature to compete for almost everything. There is hope that this competition can remain peaceful, but some will always be driven to violence. The combat drone at least offers the hope that, should violent means be the only option available, the cost in human life can be minimized.