WAR: THE AUTONOMOUS AGE
“Wars are a human phenomenon, arising from human needs for human purposes. This makes intimate human participation at some level critical, or the entire exercise becomes pointless.”–Colonel Thomas K. Adams, U.S. Army
War at its root is a violent business, or rather a violent human business. Since the dawn of humankind, conflict has troubled us.
Could humans be removed from the fight? Is artificial intelligence a threat to humanity?
Our world is changing, and rapidly. That is clearest in the growth of our beloved technology: automation has reached new heights with recent breakthroughs in machine learning, artificial intelligence, and cutting-edge robotic and biotech systems.
Humanity has reached a point where technology is growing and changing faster than we can process it, and faster than we can place any ethical restraint on it. Nowhere is this more evident than in the realm of artificial intelligence.
Automation vs. Autonomous
Artificial intelligence (AI) has become a muddled and oversimplified term amid this flurry of technological leaps. Although AI has no firm definition, the field can be divided into two camps: automated and autonomous systems.
Automated systems compute results from a fixed set of instructions or criteria: for a given set of inputs, the same output is always generated.
An autonomous system, by contrast, is first trained on thousands of scenarios and their resulting outputs. Once trained, the system can make a reasonable guess at an output for the inputs it receives. That output can differ depending on the system’s past experience.
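The distinction can be sketched in a few lines of code. This is an illustrative toy, not a weapons system: the threshold, the sensor values, and the labels are all invented for the example. The automated function applies fixed rules, while the “autonomous” one is a minimal nearest-neighbour learner whose answer depends on the examples it has seen.

```python
def automated_decision(temperature: float) -> str:
    """Automated: fixed rules, so the same input always yields the same output."""
    return "alarm" if temperature > 100.0 else "ok"

class NearestNeighbour:
    """A tiny 'autonomous' model: its output depends on past training examples."""
    def __init__(self):
        self.examples = []  # list of (input value, label) pairs seen so far

    def train(self, value: float, label: str) -> None:
        self.examples.append((value, label))

    def decide(self, value: float) -> str:
        # Guess the label of the closest previously seen example.
        closest = min(self.examples, key=lambda ex: abs(ex[0] - value))
        return closest[1]

model = NearestNeighbour()
model.train(20.0, "ok")
model.train(120.0, "alarm")

print(automated_decision(50.0))  # "ok" -- the rules never change
print(model.decide(50.0))        # "ok" -- closest known example is (20.0, "ok")

model.train(55.0, "alarm")       # new experience shifts the decision boundary
print(model.decide(50.0))        # now "alarm" for the very same input
```

The last two lines are the essay’s point in miniature: the rule-based function is perfectly predictable, while the learned model’s answer for an identical input changes as its experience grows.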
Where AI still differs from human thinking is in experience and intuition: the statistically most likely answer is not always the best one, and humans remain superior in this regard.
Currently, a theoretical autonomous weapons system could efficiently target and destroy a particular building or an isolated target. What it could not do is identify a target in passing and then reason about whether only that target would be affected by a strike. A human might see a target of opportunity and then call off the attack because, for example, a school bus is nearby.
Progress toward artificial reasoning is slow, but it is steady, and heavy investment continues to narrow the gap between human and computer.
Show me the Money
The US military budgeted over 70 billion dollars for technology and unmanned-equipment research in 2017, building momentum from the successful use of such systems in the Gulf War of the 1990s and beyond.
The Canadian military follows suit with its new strategic defence policy, “Strong, Secure, Engaged” (2017), which calls for increased attention to cybersecurity and automated defence technology.
For all the attention the military puts on AI and drone technology, it is the commercial sector that is winning the AI race. In the United States, for example, commercial industry spends three times more on autonomous-vehicle research than the military does.
Talent acquisition contributes to the technology gap. Competition for expertise in software and robotics engineering is fierce, and jobs in these fields are far more attractive in civilian industry. Today’s commercial drones show far more promise and capability than comparable military systems.
Why does a strong commercial drone market look like a problem from a military perspective? Conceivably, the military could give up on R&D in favour of off-the-shelf solutions to serve its needs and save money. However, anyone, terrorists included, would have free access to the same market. The military would also effectively defer the hard ethical and legal decisions to commercial entities: Google might decide how robots should ‘think’ and act, and government agencies could lose control of their law- and policy-making ability.
Unmanned systems proved themselves in the Gulf War of the 1990s and have shown their worth in subsequent conflicts. UAVs (unmanned aerial vehicles) are invaluable as surveillance and intelligence agents, and it was inevitable that they would be retrofitted with weapons, a decision that expanded their utility astronomically.
The word from the front is that soldiers love these new robotic systems. They minimize risk to troops by taking over dangerous tasks (dismantling mines, for instance) and provide valuable intelligence on demand (surveillance). Robots not only frustrate opposing forces; experts who study the psychological effects of robotic systems say these machines have a tremendous impact on morale in war. Being hunted by a relentless machine is demoralizing and psychologically horrifying. However, the appearance of robotics on the battlefield can have adverse effects on the users as well.
In some cases, especially where the warrior mentality is strong, the use of robots can make an army look apprehensive and unwilling to take risks. This ‘dishonouring’ effect can breed contempt among true warrior types, who may view the entity using the autonomous system as cowardly. Some believe this mentality might even provoke aggressive attacks on the heart of the user’s country.
Finally, with military recruitment in decline, autonomous attack platforms might fill the personnel gap. But would AI soldiers make the right decisions?
Permission to Fire
“One of the most misnamed weapons in our system is the unmanned aerial vehicle. It may not have a person in the cockpit, but there’s someone flying it. In addition to the person on the joystick flying the thing, there’s someone over their shoulder. There are actually more people probably flying it than a manned airplane.” — Jim Mattis, US Defense Secretary
Unmanned systems as they are today are far from unmanned. It is true the actual drone system does not contain a human; however, it often takes many people to operate such a platform. From data analysts to pilots to strategic planners, the average unmanned drone requires a team of operatives to deploy.
As AI technology becomes more sophisticated, computer systems are replacing human operators. These automated systems are far more adept at analyzing the growing volume of intelligence data and can decide how to act in microseconds. But one major area of controversy remains: will we give a machine the ability to choose to fire a weapon?
The decision to fire on an enemy is not taken lightly. Currently, it must follow international law and the chain of command, and even then a human operator must push the launch button.
There are many laws governing war and what is just cause for aggressive action. In the case of an unmanned attack drone, and the violation of law, who would accept responsibility for the war crime? The design engineer, the manufacturer, the buyer, the government? How would codes and ethics be programmed into the machine? Who decides the ethics that govern a lethal robot? These are among today’s most pressing questions about autonomous weapon systems.
Arming unmanned systems and allowing them the authority to fire those weapons at their ‘will’ is the source of much controversy and ethical debate, with some calling for an outright ban on autonomous weapons.
Worldwide, a number of prominent scientists advocate disarming unmanned robots. One such anti-weaponization advocate is Ian Kerr, the Canada Research Chair in Ethics, Law & Technology. Kerr and his colleagues are actively lobbying the Canadian government to disarm and ban unmanned weapons platforms. One can see how heated the conversation gets by visiting Kerr’s website, www.iankerr.ca.
Many have compared arming automated systems, and granting those systems the reasoning capacity to fire, to the development of atomic weapons. Autonomous weapons are the most significant technological change in warfare since the atom bomb’s creation. If developed to their full potential, fully autonomous weapons systems will change humanity in much the same way. Who will drive the change?
References

“Arsenal plane.” U.S. Dept. of Defense, Air Force illustration, 16 Feb. 2016, www.defense.gov/Photos/Photo-Gallery/igphoto/2001663429/. Air Force Secretary Deborah Lee James introduced this artist’s concept of the Strategic Capabilities Office’s arsenal plane in a video released at an Air Warfare Symposium in Orlando, Fla., Feb. 26, 2016.
Kerr, Ian. “Blog.” Ian Kerr, 29 Oct. 2015, www.iankerr.ca/.
Cummings, M. L. “Artificial Intelligence and the Future of Warfare.” Chatham House, the Royal Institute of International Affairs, Jan. 2017, pp. 1–18. International Security Department and US and the Americas Programme.
Dyer, Geoff. “US military: Robot wars.” Financial Times, 7 Feb. 2016, www.ft.com/content/849666f6-cbf2-11e5-a8ef-ea66e967dd44.
Big Think Editors. “Big Idea: Technology Grows Exponentially.” Big Think, 26 Mar. 2011, bigthink.com/think-tank/big-idea-technology-grows-exponentially.
Fitzgerald, Ben, and Jacqueline Parziale. “As technology goes democratic, nations lose military control.” Bulletin of the Atomic Scientists, vol. 73, no. 2, 2017, pp. 102–107, doi:10.1080/00963402.2017.1288445.
“Operation Innovation.” Government of Canada, National Defence, Royal Canadian Air Force, 19 Oct. 2017, www.rcaf-arc.forces.gc.ca/en/article-template-standard.page?doc=operation-innovation%2Fizkjrliv.
Hansen, Ken. “What’s happening to Canada’s defence spending?” Macleans.ca, 6 Mar. 2018, www.macleans.ca/opinion/whats-happening-to-canadas-defence-spending/.
Harris, Shane. @War: The Rise of the Military-Internet Complex. Houghton Mifflin Harcourt, 2014.
“Killer robots: Experts warn of ‘third revolution in warfare’.” BBC News, BBC, 21 Aug. 2017, www.bbc.com/news/technology-40995835.
Sisk, Richard. “Mattis’ Pet Peeve: Calling Drones ‘Unmanned Aerial Vehicles’.” Military.com, 21 Feb. 2018, www.military.com/defensetech/2018/02/21/mattis-pet-peeve-calling-drones-unmanned-aerial-vehicles.html.
Singer, Peter W. Wired for War: The Robotics Revolution and Conflict in the Twenty-First Century. Penguin Books, 2010.
Canada, Department of National Defence. Strong, Secure, Engaged: Canada’s Defence Policy. 2017.