stimuli. To invest machines with these capacities, engineers have developed complex algorithms, or computer-based sets of rules, to govern their operations. An AI-equipped aerial drone, for example, could carry sensors to distinguish enemy tanks from other vehicles on a crowded battlefield and, when some are spotted, choose on its own to fire at them with its onboard missiles. AI can also be employed in cyberspace, for example to watch for enemy cyberattacks and counter them with a barrage of counterstrikes. In the future, AI-invested machines may be empowered to determine whether a nuclear attack is underway and, if so, initiate a retaliatory strike.4 In this sense, AI is an “omni-use” technology, with multiple implications for war-fighting and arms control.5

Many analysts believe that AI will revolutionize warfare by allowing military commanders to bolster or, in some cases, replace their personnel with a wide variety of “smart” machines. Intelligent systems are prized for the speed with which they can detect a potential threat and their ability to calculate the best course of action to neutralize that peril. As warfare among the major powers grows increasingly rapid and multidimensional, including in the cyberspace and outer space domains, commanders may choose to place ever-greater reliance on intelligent machines to monitor enemy actions and initiate appropriate countermeasures. This could provide an advantage on the battlefield, where rapid and informed action can prove the key to success, but it also raises numerous concerns, especially regarding nuclear “crisis stability.”

Analysts worry that machines will accelerate the pace of fighting beyond human comprehension and may take actions that result in the unintended escalation of hostilities, even leading to the use of nuclear weapons. Not only are AI-equipped machines vulnerable to error and sabotage, they lack the ability to assess the context of events and may initiate inappropriate or unjustified escalatory steps that occur too rapidly for humans to correct. “Even if everything functioned properly, policymakers could nevertheless effectively lose the ability to control escalation as the speed of action on the battlefield begins to eclipse their speed of decision-making,” writes Paul Scharre, director of the technology and national security program at the Center for a New American Security.6

As AI-equipped machines assume an ever-growing number and range of military functions, policymakers will have to determine what safeguards are needed to prevent unintended, possibly catastrophic consequences of the sort suggested by Scharre and many others. Conceivably, AI could also bolster nuclear stability by providing enhanced intelligence about enemy intentions and reducing the risk of misperception and miscalculation; such options also deserve attention. In the near term, however, control efforts will largely focus on one particular application of AI: fully autonomous weapons systems.

Autonomous Weapons Systems

Autonomous weapons systems, sometimes called lethal autonomous weapons systems, or “killer