Peer-reviewed article

Stuxnet and the Limits of Cyber Warfare

Security Studies, 2013; Taylor & Francis; Volume 22, Issue 3; Language: English

DOI

10.1080/09636412.2013.816122

ISSN

1556-1852

Authors

Jon R. Lindsay

Topic(s)

Information and Cyber Security

Abstract

Stuxnet, the computer worm that disrupted Iranian nuclear enrichment in 2010, is the first instance of a computer network attack known to cause physical damage across international boundaries. Some have described Stuxnet as the harbinger of a new form of warfare that threatens even the strongest military powers. The influential but largely untested Cyber Revolution thesis holds that the internet gives militarily weaker actors asymmetric advantages, that offense is becoming easier while defense is growing harder, and that the attacker's anonymity undermines deterrence. However, the empirical facts of Stuxnet support an opposite interpretation: cyber capabilities can marginally enhance the power of stronger over weaker actors, the complexity of weaponization makes cyber offense less easy and defense more feasible than generally appreciated, and cyber options are most attractive when deterrence is intact. Stuxnet suggests that considerable social and technical uncertainties associated with cyber operations may significantly blunt their revolutionary potential.

Acknowledgments

Jon R. Lindsay is an assistant research scientist with the University of California Institute on Global Conflict and Cooperation (igcc), located at uc San Diego. He holds a PhD in political science from the Massachusetts Institute of Technology and an ms in computer science from Stanford University, and he has served as an officer in the us Navy. He would like to thank Erik Gartzke, Robert Giesler, Brendan Green, Tim Junio, Sean Lawson, Carrie Lee Lindsay, Charles Perrow, Joshua Rovner, and the editors and anonymous reviewers at Security Studies for their valuable comments and advice on previous drafts.

Notes

The original announcement of “Rootkit.TmpHider” was posted by Sergey Ulasen of VirusBlokAda on an information security forum on 12 July 2010, http://www.anti-virus.by/en/tempo.shtml. For an accessible account of Stuxnet's discovery, see Kim Zetter, “How Digital Detectives Deciphered Stuxnet, the Most Menacing Malware in History,” Wired Threat Level Blog, 11 July 2011, http://www.wired.com/threatlevel/2011/07/how-digital-detectives-deciphered-stuxnet.

Aleksandr Matrosov, Eugene Rodionov, David Harley, and Juraj Malcho, “Stuxnet under the Microscope,” eset white paper, 20 January 2011. The dubious honor of “most sophisticated malware” has perhaps passed to a Stuxnet relative named Duqu or to the Flame spyware (which is twenty times the file size of Stuxnet).

Mark Clayton, “Stuxnet Malware Is ‘Weapon’ Out to Destroy … Iran's Bushehr Nuclear Plant?” Christian Science Monitor, 21 September 2010.

David E. Sanger, “Obama Order Sped Up Wave of Cyberattacks Against Iran,” New York Times, 1 June 2012.

William J. Broad, John Markoff, and David E. Sanger, “Israel Tests on Worm Called Crucial in Iran Nuclear Delay,” New York Times, 15 January 2011.

Mark Clayton, “The New Cyber Arms Race,” Christian Science Monitor, 7 March 2011 (“cyber equivalent”).

In the vein of “a new era of warfare,” the cover of the 3 July 2010 edition of The Economist depicted a digitized mushroom cloud. “Stuxnet: Computer Worm Opens New Era of Warfare,” Transcript, 60 Minutes, CBS News, 4 March 2012.

“Russia Says Stuxnet Could Have Caused New Chernobyl,” Reuters, 26 January 2011.
Sanger, “Obama Order.”

Arguments for the Cyber Revolution thesis by former senior us officials include Mike McConnell, “Cyberwar is the New Atomic Age,” New Perspectives Quarterly 26, no. 3 (Summer 2009): 72–77; Richard A. Clarke and Robert Knake, Cyber War: The Next Threat to National Security and What to Do about It (New York: HarperCollins, 2010); Joel Brenner, America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime, and Warfare (New York: Penguin Press, 2011). On Stuxnet as an rma, see James P. Farwell and Rafal Rohozinski, “Stuxnet and the Future of Cyber War,” Survival 53, no. 1 (February–March 2011): 23–40; Joseph S. Nye Jr., “Nuclear Lessons for Cyber Security?” Strategic Studies Quarterly 5, no. 4 (Winter 2011); Paulo Shakarian, “Stuxnet: Cyberwar Revolution in Military Affairs,” Small Wars Journal (April 2011); Sean Collins and Stephen McCombie, “Stuxnet: The Emergence of a New Cyber Weapon and Its Implications,” Journal of Policing, Intelligence and Counter Terrorism 7, no. 1 (2012): 80–91.

Remarks by Secretary Panetta on Cybersecurity to the Business Executives for National Security, us Dept. of Defense, New York City, 11 October 2012, http://www.defense.gov/transcripts/transcript.aspx?transcriptid=5136.

Barack Obama, “Taking the Cyberattack Threat Seriously,” Wall Street Journal, 19 July 2012.

“Senate Select Intelligence Committee Holds Hearing on Worldwide Threats,” Defense Intelligence Agency, 31 January 2012, http://www.dia.mil/public-affairs/testimonies/2012-01-31.html.

Adm. Mike Mullen, quoted in Marcus Weisgerber, “DoD to Release Public Version of Cyber Strategy,” Defense News, 8 July 2011. This is an astonishing claim coming from a man well familiar with the world's nuclear arsenals.

James A. Lewis and Katrina Timlin, Cybersecurity and Cyberwarfare: Preliminary Assessment of National Doctrine and Organization (Washington, dc: Center for Strategic and International Studies, United Nations Institute of Disarmament Research, 2011).

See, inter alia, Nicholas Burns and Jonathon Price, Securing Cyberspace: A New Domain for National Security (Aspen, co: Aspen Institute, 2012); Kristin M. Lord and Travis Sharp, America's Cyber Future: Security and Prosperity in the Information Age (Washington, dc: Center for a New American Security, 2011); David J. Betz and Timothy C. Stevens, “Cyberspace and the State: Toward a Strategy for Cyber-Power,” International Institute for Strategic Studies (IISS) Adelphi Paper, no. 424 (2011); Paul Cornish, David Livingstone, Dave Clemente, and Claire Yorke, “On Cyber Warfare,” Royal Institute of International Affairs, Chatham House Report (November 2010); Franklin D. Kramer, Stuart H. Starr, and Larry K. Wentz, eds., Cyberpower and National Security (Washington, dc: National Defense University Press, 2009).
Adam P. Liff, “Cyberwar: A New ‘Absolute Weapon’? The Proliferation of Cyberwarfare Capabilities and Interstate War,” Journal of Strategic Studies 35, no. 3 (June 2012); Thomas Rid, “Cyber War Will Not Take Place,” Journal of Strategic Studies 35, no. 1 (February 2012): 5–32; Martin C. Libicki, Cyberdeterrence and Cyberwar (Santa Monica, ca: rand, 2009); Evgeny Morozov, “Cyber-Scare: The Exaggerated Fears over Digital Warfare,” Boston Review (July/August 2009); Myriam Dunn Cavelty, “Cyber-Terror: Looming Threat or Phantom Menace? The Framing of the us Cyber-Threat Debate,” Journal of Information Technology & Politics 4, no. 1 (2007): 19–36; Martin C. Libicki, Conquest in Cyberspace: National Security and Information Warfare (Cambridge University Press, 2007); Gregory J. Rattray, Strategic Warfare in Cyberspace (Cambridge, ma: Massachusetts Institute of Technology (mit) Press, 2001); Bradley A. Thayer, “The Political Effects of Information Warfare: Why New Military Capabilities Cause Old Political Dangers,” Security Studies 10, no. 1 (Autumn 2000): 43–85; Peter D. Feaver, “Blowback: Information Warfare and the Dynamics of Coercion,” Security Studies 7, no. 4 (Summer 1998): 88–120.

On the direct technical effects of Stuxnet on Iranian computer systems, I draw on forensic investigation by the computer security firms Symantec, eset, and Langner Communications; Nicolas Falliere, Liam O Murchu, and Eric Chien, “W32.Stuxnet Dossier, version 1.4,” Symantec, 4 February 2011, http://www.symantec.com/content/en/us/enterprise/media/security_response/whitepapers/w32_stuxnet_dossier.pdf; Aleksandr Matrosov, Eugene Rodionov, David Harley, and Juraj Malcho, “Stuxnet under the Microscope, version 1.31,” white paper, eset, 20 January 2011, http://go.eset.com/us/resources/white-papers/Stuxnet_Under_the_Microscope.pdf; Ralph Langner, “Stuxnet Attack Code Deep Dive” (presentation at Digital Bond scada Security Scientific Symposium (S4) in Miami, fl, 18–19 January 2012), http://www.digitalbond.com/2012/01/31/langners-stuxnet-deep-dive-s4-video; a synthesis of technical details accessible to lay readers and a detailed interactive timeline can be found in Zetter, “How Digital Detectives Deciphered Stuxnet.” To assess Stuxnet's indirect strategic effects on Natanz, I rely on International Atomic Energy Agency (iaea) inspection reports (http://www.iaea.org/newscenter/focus/iaeairan/iaea_reports.shtml) and Institute for Science and International Security (isis) analyses of Iranian enrichment operations (http://isisnucleariran.org/). I supplement these with contemporary press reporting, particularly David E. Sanger's path-breaking New York Times investigation of Olympic Games.

For a detailed history of computerization in the American private and public sector, see James W. Cortada, The Digital Hand, 3 vols. (New York: Oxford University Press, 2004–2008). The “productivity paradox” debate over the relationship between IT inputs and firm performance has been resolved following clarification of the critical role of organizational structure and process; Erik Brynjolfsson, Lorin M. Hitt, and Shinkyu Yang, “Intangible Assets: Computers and Organizational Capital,” Brookings Papers on Economic Activity no. 1 (2002): 137–81.

For a textbook introduction to technical cybersecurity, see Ross J. Anderson, Security Engineering: A Guide to Building Dependable Distributed Systems, 2nd ed. (Indianapolis, in: Wiley Publishing, 2008).

For a good introduction to offensive cyber operations, including attack/disruption and exploitation/theft, see William A. Owens, Kenneth W. Dam, and Herbert S. Lin, eds., Technology, Policy, Law, and Ethics Regarding u.s. Acquisition and Use of Cyberattack Capabilities (Washington, dc: National Academies Press, 2009).
Ross Anderson, Chris Barton, Rainer Böhme, Richard Clayton, Michel J. G. Van Eeten, Michael Levi, Tyler Moore, and Stefan Savage, “Measuring the Cost of Cybercrime,” Proceedings of the Workshop on the Economics of Information Security (June 2012); Kirill Levchenko et al., “Click Trajectories: End-To-End Analysis of the Spam Value Chain,” Proceedings of the IEEE Symposium on Security and Privacy (May 2011): 431–46; Misha Glenny, DarkMarket: How Hackers Became the New Mafia (New York: Vintage, 2011); Cormac Herley and Dinei Florêncio, “Nobody Sells Gold for the Price of Silver: Dishonesty, Uncertainty and the Underground Economy,” Economics of Information Security and Privacy (2010): 33–53.

Bryan Krekel, Patton Adams, and George Bakos, “Occupying the Information High Ground: Chinese Capabilities for Computer Network Operations and Cyber Espionage,” prepared for the us-China Economic and Security Review Commission by Northrop Grumman, 7 March 2012; Office of the National Counterintelligence Executive, “Foreign Spies Stealing us Economic Secrets in Cyberspace,” report to Congress on Foreign Economic Collection and Industrial Espionage 2009–2011, October 2011; Shadows in the Cloud: An Investigation into Cyber Espionage 2.0, joint report of the Information Warfare Monitor and Shadowserver Foundation, 6 April 2010, http://shadows-in-the-cloud.net; “Gauss: Abnormal Distribution,” Kaspersky Lab Global Research and Analysis Team Report, August 2012, http://www.securelist.com/en/analysis/204792238/Gauss_Abnormal_Distribution.

Christian Czosseck, Rain Ottis, and Anna-Maria Talihärm, “Estonia after the 2007 Cyber Attacks: Legal, Strategic and Organisational Changes in Cyber Security,” Journal of Cyber Warfare and Terrorism 1, no. 1 (2011); John Bumgarner and Scott Borg, “Overview by the us-ccu of the Cyber Campaign Against Georgia in August of 2008,” us Cyber Consequences Unit Report, August 2009; Ronald Deibert, John Palfrey, Rafal Rohozinski, and Jonathan Zittrain, eds., Access Contested: Security, Identity, and Resistance in Asian Cyberspace (Cambridge, ma: mit Press, 2011); Evgeny Morozov, The Net Delusion: The Dark Side of Internet Freedom (New York: PublicAffairs, 2011).

Military doctrine for cyber concepts has not yet stabilized, and debate continues on the distinctions between cyber warfare, computer network operations, information operations, electronic warfare, etc. In this paper I focus on the use of computer hacking to cause mechanical damage in the service of strategic objectives. Cyber warfare clearly encompasses the tactical modalities of cyber attack (degradation of normal hardware or software functionality), exploitation (covert theft or use of data or computational resources), and defense (efforts to prevent adversarial attack or exploitation); my emphasis in this paper is on the primary aggressive move of attack.

David A. Fulghum, “Why Syria's Air Defenses Failed to Detect Israelis,” Aviation Week, Ares Blog, 3 October 2007. Some sources dispute whether the Israelis used cyber attack or more traditional forms of electronic jamming; Ellen Nakashima, “u.s. Accelerating Cyberweapon Research,” Washington Post, 18 March 2012.

Raphael Satter, “us General: We Hacked the Enemy in Afghanistan,” Associated Press, 24 August 2012.

Martin Libicki, Cyberdeterrence and Cyberwar (Santa Monica, ca: rand, 2009) distinguishes “operational cyberwar—cyberattacks to support warfighting” from “strategic cyberwar, cyberattacks to affect state policy”; see Libicki, Cyberdeterrence, 6. The Cyber Revolution thesis treated in this paper emphasizes the latter threat, particularly via ics attack.
ics are the industrial plant equivalent of military command and control (C4ISR) systems; they include the embedded controllers that drive machines like generators, valves, and production lines; embedded sensors that monitor their performance; Supervisory Control and Data Acquisition (scada) systems that allow human operators to visualize and manage the process; and the network architecture that connects it all together. For a primer on ics security, see Joseph Weiss, Protecting Industrial Control Systems from Electronic Threats (New York: Momentum Press, 2010).
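To make this division of labor concrete, here is a minimal, purely illustrative control-loop sketch in C. All names (read_pressure, set_valve, report_to_scada) are invented stand-ins for vendor APIs, not code from any real ics: an embedded controller repeatedly reads a sensor, nudges an actuator toward a setpoint, and reports state to the scada layer that operators watch.

```c
#include <stdio.h>

static double pressure = 90.0;                  /* simulated plant state */

static double read_pressure(void) { return pressure; }

static void set_valve(double v)                 /* crude simulated plant response */
{
    pressure += (v - 0.5) * 4.0;
}

static void report_to_scada(double p, double v) /* the operators' view */
{
    printf("scada: pressure=%5.1f valve=%4.2f\n", p, v);
}

int main(void)
{
    const double setpoint = 100.0;
    for (int tick = 0; tick < 10; tick++) {
        double p = read_pressure();
        double v = 0.5 + 0.05 * (setpoint - p);  /* proportional control */
        if (v < 0.0) v = 0.0;                    /* clamp actuator range */
        if (v > 1.0) v = 1.0;
        set_valve(v);
        report_to_scada(p, v);
    }
    return 0;
}
```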
Anna Mulrine, “cia Chief Leon Panetta: The Next Pearl Harbor Could Be a Cyberattack,” Christian Science Monitor, 9 June 2011. According to Scott Berinato, “The Future of Security,” Computerworld, 30 December 2003, the first use of the phrase “digital Pearl Harbor” was in 1991 by then rsa Data Security president D. James Bidzos.

rma discourse since the 1990s has focused on the impact of networks on military operational efficiency, but it has also always included a strain of futurism about information warfare as a substitute for traditional operations altogether, e.g., James Adams, The Next World War: The Weapons and Warriors of the New Battlefields of Cyberspace (London: Arrow, 1998); John Arquilla and David F. Ronfeldt, Networks and Netwars: The Future of Terror, Crime, and Militancy (Santa Monica, ca: rand, 2001).

Widely cited as an example of supply-chain sabotage is an elaborate 1982 counterintelligence operation in which the cia allegedly tampered with Canadian software that the Soviets planned to steal. Once the Soviets installed it in controllers on the Trans-Siberian oil pipeline, this Trojan horse caused “the most monumental non-nuclear explosion and fire ever seen from space” and “significant damage to the Soviet economy,” according to Thomas C. Reed, At the Abyss: An Insider's History of the Cold War (New York: Random House, 2004), 268–69. However, Rid, “Cyber War Will Not Take Place,” finds little corroborating evidence for Reed's story, which should have had eyewitnesses aplenty.

Electrical blackouts in Brazil in 2007 and 2009 have been blamed on hackers, but no supporting evidence has emerged, while simpler explanations have been offered in each case: Marcelo Soares, “Brazilian Blackout Traced to Sooty Insulators, Not Hackers,” Wired Threat Level Blog, 9 November 2009, http://www.wired.com/threatlevel/2009/11/brazil_blackout; also, a Wikileaks cable from the American Embassy in Brasilia dated 1 December 2009, 11:27 a.m. gmt, discounts the possibility of a cyber attack in the 2009 blackout.

Other examples of physical damage include malicious experiments likely created for hacker bragging rights, like the 1999 Chernobyl or Spacefiller virus, which could overwrite Basic Input Output System (bios) data and effectively turn a computer into a useless brick. On the INL Aurora demonstration, see Jeanne Meserve, “Staged Cyber Attack Reveals Vulnerability in Power Grid,” CNN, 26 September 2007.

On the historical absence of cyberwar, see Sean Lawson, “Beyond Cyber-Doom: Assessing the Limits of Hypothetical Scenarios in the Framing of Cyber-Threats,” Journal of Information Technology & Politics 10, no. 1 (December 2012): 86–103; Michael Stohl, “Cyber Terrorism: A Clear and Present Danger, the Sum of All Fears, Breaking Point or Patriot Games?” Crime, Law and Social Change 46, nos. 4–5 (December 2006): 223–38.

Bill Gertz, “Computer-Based Attacks Emerge As Threat of Future, General Says,” Washington Times, 13 September 2011. Alexander also cited “the August 2003 electrical power outage in the Northeast u.s. that was caused by a tree damaging two high-voltage power lines. Electrical power-grid software that controlled the distribution of electricity to millions of people improperly entered ‘pause’ mode and shut down all power through several states.”

William J. Lynn III, “Defending a New Domain: The Pentagon's Cyberstrategy,” Foreign Affairs 89, no. 5 (September–October 2010): 97–108, quote at 98–99.

Obama, “Taking the Cyberattack Threat Seriously.”

These and other trends lowering barriers to entry for cyber attack are described in Kenneth J. Knapp and William R. Boulton, “Cyber-Warfare Threatens Corporations: Expansion Into Commercial Environments,” Information Systems Management (Spring 2006).

Clayton, “The New Cyber Arms Race.”

On cascading attacks, see Scott Borg, “Economically Complex Cyberattacks,” IEEE Security and Privacy 3, no. 6 (December 2005): 64–67.

A classic example of malformed input is a buffer overflow attack, in which the attacker provides an input parameter larger than the space allocated for it by the programmer, who has failed to check the length of the input; the input string thus overwrites memory for the function's internal control variables, which were supposed to be inaccessible but can now be changed arbitrarily. Note that some types of attacks exploit physical connections rather than logical inputs. Although most malware goes through the front door to exploit programming flaws, side channel attacks can exploit information from the physical implementation of a system, such as excess heat generated by correct passwords. Furthermore, even the best-designed systems can and often do fail through social engineering techniques, such as phishing scams that exploit human gullibility.
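The following deliberately unsafe C sketch illustrates the overflow pattern just described. It is conceptual only: a struct is used here to make the memory adjacency concrete, whereas real stack layouts vary by compiler, and every name in it is invented.

```c
#include <stdio.h>
#include <string.h>

struct session {
    char buffer[8];   /* space the programmer allocated for the input  */
    int  authorized;  /* control variable meant to be out of reach     */
};

static void handle_input(const char *input)
{
    struct session s = { {0}, 0 };
    strcpy(s.buffer, input);        /* BUG: length of input never checked */
    printf("authorized=%d\n", s.authorized != 0);
}

int main(void)
{
    handle_input("ok");             /* fits in the buffer: flag stays 0    */
    handle_input("AAAAAAAAAAA");    /* eleven bytes spill into the flag    */
    return 0;
}
```

The second call writes past the eight-byte buffer into the adjacent control variable, which was "supposed to be inaccessible," exactly the failure the note describes.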
Andy Greenberg, “Shopping for Zero-Days: A Price List for Hackers’ Secret Software Exploits,” Forbes, 23 March 2012, http://www.forbes.com/sites/andygreenberg/2012/03/23/shopping-for-zero-days-an-price-list-for-hackers-secret-software-exploits/.

For a typical statement of the cyber offense dominance claim, see Kenneth Lieberthal and Peter W. Singer, “Cybersecurity and u.s.-China Relations,” Brookings Institution, February 2012, 14–16.

One interesting area where the problem of identifying and solving a novel signature has been inverted between offense and defense is the use on websites of a scrambled phrase known as a “captcha” to discriminate humans from machines: the defender can rapidly generate new scrambled phrases, while the attacker has a more costly solving problem. Criminals have solved this problem, however, not technically but by outsourcing captcha solving to people willing to solve a thousand per dollar. See Marti Motoyama, Kirill Levchenko, Chris Kanich, Damon McCoy, Geoffrey M. Voelker, and Stefan Savage, “Re: captchas—Understanding Captcha-Solving Services in an Economic Context,” Proceedings of the USENIX Security Symposium, Washington, dc, August 2010.

“Senate Select Intelligence Committee Holds Hearing on Worldwide Threats,” Defense Intelligence Agency, 31 January 2012, http://www.dia.mil/public-affairs/testimonies/2012-01-31.html.

Ross Anderson and Tyler Moore, “The Economics of Information Security,” Science 314, no. 5799 (27 October 2006): 610–13; Terrence August and Tunay I. Tunca, “Network Software Security and User Incentives,” Management Science 52, no. 11 (November 2006): 1703–20; Johannes M. Bauer and Michel J. G. Van Eeten, “Cybersecurity: Stakeholder Incentives, Externalities, and Policy Options,” Telecommunications Policy 33, no. 10 (2009): 706–19; Ludovic Piètre-Cambacédès, Marc Tritschler, and Göran N. Ericsson, “Cybersecurity Myths on Power Control Systems: 21 Misconceptions and False Beliefs,” IEEE Transactions on Power Delivery 26, no. 1 (Fall 2011): 161–72.

DIA, “Transcript,” 31 January 2012.

On the complexity of attribution, see David D. Clark and Susan Landau, “Untangling Attribution,” Proceedings of a Workshop on Deterring Cyberattacks, ed. National Research Council (Washington, dc: National Academies Press, 2010), 25–40. For richer discussion of the challenges of cyber deterrence (which might mean deterring cyber attacks or using the threat of cyber attack to deter other activity), see National Research Council, Proceedings of a Workshop.

General Martin Dempsey, speech at the Commonwealth Club of California, 27 July 2012, http://www.commonwealthclub.org/events/archive/podcast/general-martin-dempsey-chairman-joint-chiefs-staff-72712.

I am grateful to Erik Gartzke for framing the gap between the “logic of possibility” and the “logic of consequence” in cyber warfare discourse; see Gartzke, “The Myth of Cyber War: Bringing War on the Internet Back Down to Earth” (paper presented at the International Studies Association Annual Convention, San Diego, April 2012).

iaea, “Implementation of the npt Safeguards Agreement and Relevant Provisions of Security Council Resolutions 1737 (2006), 1747 (2007), 1803 (2008) and 1835 (2008) in the Islamic Republic of Iran,” GOV/2010/10, 18 February 2010.

Whitney Raas and Austin Long, “Osirak Redux? Assessing Israeli Capabilities to Destroy Iranian Nuclear Facilities,” International Security 31, no. 4 (Spring 2007): 7–33; Office of the Director of National Intelligence, “Iran: Nuclear Intentions and Capabilities,” November 2007, http://www.dni.gov/files/documents/Newsroom/Press%20Releases/2007%20Press%20Releases/20071203_release.pdf.

The fep layout is described in iaea, “Implementation of the npt Safeguards,” and in David Albright and Corey Hinderstein, “The Iranian Gas Centrifuge Uranium Enrichment Plant at Natanz: Drawing from Commercial Satellite Images,” Institute for Science and International Security, 14 March 2003.

Natanz enriches uranium hexafluoride (uf6) gas, which it obtains from the Isfahan uranium conversion facility, to make leu in two facilities: a small above-ground pilot fuel enrichment plant (pfep) for research, and a much larger underground fuel enrichment plant (fep) for industrial production. While inspectors have never detected enrichment over 5 percent leu at the fep, the pfep has produced small amounts of 20 percent leu, ostensibly for medical and scientific research. If Iran were to make a breakout dash to enrich enough 93 percent heu for a few bombs within a few months, it would almost certainly have to use the industrial-sized fuel enrichment plant at Natanz. See David Albright, Paul Brannan, Andrea Stricker, Christina Walrond, and Houston Wood, “Preventing Iran from Getting Nuclear Weapons: Constraining Its Future Nuclear Options,” Institute for Science and International Security, 5 March 2012.

The precise configuration of Natanz's networks has not been revealed to iaea inspectors, but we can gain some insight into the defensive challenge from Siemens-recommended best practices for ics security and through analysis of the pattern of exploits employed by Stuxnet, as discussed in Eric Byres, Andrew Ginter, and Joel Langill, “How Stuxnet Spreads: A Study of Infection Paths in Best Practice Systems,” Tofino Security white paper, 22 February 2011. The Iranians probably diverged significantly from best practices, but the operational implications of this are ambiguous, as discussed below: the divergence may either have provided more vulnerabilities to exploit, or it may have invalidated target intelligence. According to Byres et al., the outer level of the fep would have been the enterprise network, which hosted most of the everyday business and administrative computers. Within that was the perimeter network—sometimes called “the demilitarized zone” among ics administrators—where servers managed the computer equipment in the control systems and provided data to end users in the enterprise network. Firewall servers on the perimeter network gateways would have been set to “deny by default,” so that they allowed incoming connections only from authorized users with legitimate credentials and outgoing connections only to specifically approved servers for maintenance. This network may indeed have had physical connections from the fep's exterior networks to sensitive ics to facilitate remote management and troubleshooting—there might not have been an “air gap”—but there would nonetheless have been multiple logical layers of defenses to penetrate. The perimeter network protected simatic systems, and there may have been different system partitions for each of the different cascade modules in the fep's two production halls. Each of these included the process control network, which hosted human interface servers for the simatic operator and engineering systems, as well as the control system network, which hosted the automation system running the controllers and peripherals driving industrial processes.
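A minimal C sketch of the “deny by default” posture just described, with invented zone and host names rather than any real Natanz configuration: the gateway checks each connection against an explicit allowlist and refuses anything that does not match.

```c
#include <stdio.h>
#include <string.h>

struct rule { const char *src_zone; const char *dst_host; int port; };

/* only explicitly named flows are permitted (values invented) */
static const struct rule allow[] = {
    { "enterprise", "dmz-historian", 443  },
    { "dmz",        "enterprise-db", 1433 },
};

static int permit(const char *src, const char *dst, int port)
{
    for (size_t i = 0; i < sizeof allow / sizeof allow[0]; i++)
        if (strcmp(src, allow[i].src_zone) == 0 &&
            strcmp(dst, allow[i].dst_host) == 0 &&
            port == allow[i].port)
            return 1;                    /* matched an allow rule */
    return 0;                            /* deny by default       */
}

int main(void)
{
    printf("%d\n", permit("enterprise", "dmz-historian", 443)); /* 1 */
    printf("%d\n", permit("internet",   "control-net",   502)); /* 0 */
    return 0;
}
```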
David Albright, Paul Brannan, and Christina Walrond, “Stuxnet Malware and Natanz: Update of isis December 22, 2010 Report,” Institute for Science and International Security, 15 February 2011, 2. Symantec has not publicly released the names of these companies.

Epidemiological data came from Stuxnet itself: as it copies itself from computer to computer, each instance keeps a log of all the machines infected by its lineage (evidence of developers interested in debugging or accountability). From samples of the worm collected in the wild, Falliere et al. traced a total of twelve thousand infections to five internet domain names, which have not been publicly disclosed. One of these domains was infected on three separate occasions, one was infected twice, two were infected only once, and one had three different computers infected at once (as if an infected thumb drive were repeatedly connected), for a total of ten known initial infections.
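As a toy illustration of this epidemiology, the following C sketch assumes each recovered sample's log lists the machines in its lineage in infection order, so the first entry of each distinct lineage marks an initial infection; all machine names are invented.

```c
#include <stdio.h>
#include <string.h>

#define MAX_HOPS 4

/* each recovered sample = the ordered list of machines in its lineage */
static const char *samples[][MAX_HOPS] = {
    { "domainA-pc1", "domainA-pc7", "lab-pc3", NULL },
    { "domainA-pc1", "domainA-pc9", NULL,      NULL },
    { "domainB-pc2", "eng-pc5",     "eng-pc6", NULL },
};

int main(void)
{
    const char *roots[8];
    int nroots = 0;

    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++) {
        const char *root = samples[i][0];  /* first log entry = initial infection */
        int seen = 0;
        for (int j = 0; j < nroots; j++)
            if (strcmp(roots[j], root) == 0)
                seen = 1;
        if (!seen)
            roots[nroots++] = root;        /* count each distinct entry point once */
    }

    printf("initial infections: %d\n", nroots);
    for (int j = 0; j < nroots; j++)
        printf("  %s\n", roots[j]);
    return 0;
}
```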
There are three known versions of Stuxnet, but based on iaea inspection data, only the first version appears to have done any damage at Natanz. The three different compilations of Stuxnet attacked multiple sites in three waves: June and July 2009, March 2010, and April and May 2010. The iaea observed that about one thousand centrifuges were disconnected in January 2010, as covered later, but in subsequent inspections the Iranians were already bringing them back under vacuum when the second and third waves hit. These second and third versions thus appear to have had no dramatic effect, as the total number of enriching cascades began to increase after August 2010. Considering only insertions of the first version, Stuxnet's damage thus resulted from four initial infections, each in a different domain in Iran.

The delay between compilation and infection could have been due to the logistic challenges of testing and getting the worm to the human agents who would launch the attack, or to internal bureaucratic processes within the attacking organization, such as legal review. I assume that compilation, the process that packages human-readable programs into the executable binary file, occurred on computers at the attack's home facility, although remote compilation is technically possible. The attack waves, defined as the infections associated with a single compilation, are distributed across the ten initial infections as four, one, and five. The minimum time between compilation and infection was twelve hours, the next shortest was over six days, and the maximum was twenty-eight days.

Contractors might have been especially attractive as mules, as they could have unwittingly received the malware at tradeshows. Employees or contractors might also have carried infected simatic files directly to computers in the interior control system while performing maintenance, thus bypassing safeguards in the perimeter network altogether and vastly simplifying Stuxnet's infiltration. Alternatively, attackers could have sent phishing emails to employees with infected attachments that would open and drop the worm. See Byres et al., “How Stuxnet Spreads,” 13.

A lot of attention has been paid to a zero-day vulnerability in Windows shortcut (.lnk) files that enables a hacked shortcut to surreptitiously load malware binaries as soon as the icon is simply viewed onscreen (MS10-046). This vulnerability appeared for the first time in the second version of Stuxnet, compiled on 1 March 2010. As I argue elsewhere, most of the centrifuge damage attributed to Stuxnet occurred prior to March 2010; thus it might not have been the celebrated .lnk vulnerability that delivered the payload that actually did the work at Natanz. The first version of Stuxnet used a less sophisticated autorun.inf vulnerability to propagate via removable media; Falliere, “W32.Stuxnet Dossier,” 31–32.

Sanger, “Obama Order.”

William Yong, “Iran Says It Arrested Computer Worm Suspects,” New York Times, 10 October 2010. Of course, Iran was likely
