Zero Day: Buggy Code and the Cyberweapons Arms Race

Malware and weaponized software flaws exploit the inherent risks that come with building tightly coupled, complex infrastructure.

A computer monitor bares jagged teeth as a man cowers in fear beneath the desk.
Paul Blow

Cadbury likely never expected to find itself implicated in the most destructive and costly cyberattack in history. But on June 27, 2017, the British candy company and organizations across the globe — FedEx, the Danish shipping giant Maersk, the drug firms Pfizer and Merck, even hospitals in Pennsylvania — watched, horrified, as an apparent ransomware attack disrupted operations and rendered computer systems useless.

The attack originally centered on Ukraine, hitting power companies, airports, banks, and virtually the entire federal government before wrecking networks worldwide. The culprit was first thought to be a piece of ransomware called Petya, but it was something else entirely. NotPetya, as the worm came to be known, was self-spreading malware that indiscriminately trashed hard drives while masquerading as ransomware. “It was the equivalent of using a nuclear bomb to achieve a small tactical victory,” remarked Tom Bossert, White House homeland security advisor at the time. Estimates suggest that NotPetya inflicted damages amounting to more than $10 billion.

NotPetya was a blunt reminder that infrastructures of software and hardware link us together — and that failures can quickly exploit shared vulnerabilities, spin out, and cascade. Everything, it seems, is based on the same buggy code.

The implications of this predicament are at the heart of Nicole Perlroth’s new book, This Is How They Tell Me the World Ends: The Cyberweapons Arms Race. Perlroth examines what happens when nations treat this mutual infrastructure not as a resource to be fixed, but as a target to be compromised and attacked. She builds on her reporting in The New York Times and follows the work of journalists such as Kim Zetter, Andy Greenberg, David Sanger, and others in tracing the growing use of digital tools for espionage, sabotage, and war. Her account makes the case that, absent meaningful checks and balances, the development and use of what she describes as “cyberweapons” undermines infrastructure security and makes all of us less safe.


The story of NotPetya is knotty and remains partly shrouded in uncertainty. In 2016, a group calling itself the “Shadow Brokers” started publicly releasing some of the most powerful digital tools developed and used by the National Security Agency. It was a serious breach. A number of these tools relied on previously unknown and undisclosed software vulnerabilities, known among hackers as “zero days” because defenders have had zero days to fix them by the time they are exploited. Hackers search for these bugs and sell them to clients around the world. Exploits and attacks built on such flaws are difficult to detect and stop. Once the Shadow Brokers doxed the NSA, criminals and nation-states could potentially repurpose and retarget these tools for their own ends.

Of all of the material leaked by the Shadow Brokers, one particular tool and vulnerability stood out — EternalBlue, a powerful exploit that took advantage of bugs in Microsoft Windows. The NSA had been using EternalBlue for years without disclosing the underlying flaws to Microsoft or the public because it was a powerful and useful tool. “It netted some of the very best counterterrorism intelligence we got,” a former government hacker tells Perlroth.

“The intelligence it produced was so critical, one former intelligence analyst told me,” she writes, “that turning it over was never seriously considered. Instead the NSA held on to EternalBlue for seven years — over a period that saw some of the most aggressive cyberattacks on American networks in history — and prayed it would never be found.”

Only when it was clear that the Shadow Brokers had accessed the tool did the NSA report the vulnerabilities to Microsoft, and a patch was publicly released in March 2017. A few weeks later, before many Windows machines had been updated, the Shadow Brokers publicly released EternalBlue. It was quickly repackaged, first by North Korean hackers as the ransomware attack WannaCry, and then by Russian hackers affiliated with the Russian Main Intelligence Directorate (GRU) in the form of NotPetya.

NotPetya repurposed EternalBlue and combined it with a version of Mimikatz, an open-source application that’s widely used by cybercriminals to extract credentials. The resulting worm was initially launched through a compromised version of M.E.Doc, a popular Ukrainian accounting application. NotPetya spread quickly, pulling passwords from infected machines and moving laterally to compromise other connected computers. The attack might have started in Ukraine, but it quickly moved around the world, seeking out and compromising computers running the same unpatched versions of Microsoft Windows.
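The propagation pattern described above — compromise one machine, harvest what it knows, then reach every vulnerable neighbor — is, at its most abstract, a graph traversal. The sketch below is a toy simulation on a made-up network, not NotPetya’s actual code; the function name, hosts, and the idea of reducing lateral movement to a simple “reachable and unpatched” check are all illustrative assumptions.

```python
from collections import deque

def simulate_spread(edges, unpatched, seed):
    """Breadth-first worm propagation over a toy network graph.

    edges: dict mapping each host to the hosts it can reach.
    unpatched: set of hosts still vulnerable to the exploit.
    seed: the initially compromised host.
    Returns the set of hosts compromised by the end of the run.
    """
    compromised = {seed}
    queue = deque([seed])
    while queue:
        host = queue.popleft()
        for neighbor in edges.get(host, []):
            # In this simplification, the worm only needs the neighbor to be
            # reachable and unpatched; harvested credentials stand in for
            # the actual lateral-movement step.
            if neighbor in unpatched and neighbor not in compromised:
                compromised.add(neighbor)
                queue.append(neighbor)
    return compromised

# A four-host network where a single patched machine ("d") halts one branch.
edges = {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
print(simulate_spread(edges, unpatched={"b", "c"}, seed="a"))
```

Even this toy model shows why one patched machine matters: it prunes an entire branch of the traversal, which is the logic behind the patching arguments later in the piece.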

This was not a clever act of espionage. NotPetya encrypted hard drives as it progressed, and panicked users had no way to decrypt them. “Even if you paid the ransom, there was no chance of getting any data back,” writes Perlroth. “This was a nation-state weapon designed to exact mass destruction.”

Perlroth frames NotPetya as a symbol and logical endpoint of the heedless and largely unaccountable development of cyberweapons — a story of US hubris run aground. In her telling, the rush by the US to develop offensive tools and techniques undermined security at home and abroad. It tacitly sanctioned other nations to develop destructive cyber capabilities; it fueled the growth of a market for vulnerabilities, exploits, and, crucially, skilled hackers that feeds not only the US but despots around the world; and it preserved the insecurity of devices and software in order to make exploits and attacks possible.

The digital arms race democratized the ability to spy and conduct sabotage, with often disastrous results. As Perlroth reports, ex-NSA hackers leave and join specialized boutique firms to sell their skills to foreign nations, such as the United Arab Emirates, where the rules and limits on hacking appear to be little more than idle pleasantries. Spyware firms like NSO Group, which find financial backing in the US (Francisco Partners, a US-based private equity company, bought a controlling stake years ago), use zero days to spy on journalists, dissidents, civil rights activists, and others in Mexico, Saudi Arabia, and a laundry list of other countries. NotPetya, then, is simply the most concrete manifestation of a bitter irony: The tools developed by the US to ostensibly improve national security are turned back against a vulnerable world.

But the story of NotPetya is complicated; it does not fit a single tidy narrative. NotPetya also suggests alternative and more difficult lessons. Besides a once-secret NSA tool, at its core was a piece of malware, Mimikatz, that had been publicly known for years. The fact that this well-known tool was allowed to remain potent points to larger systemic issues in software security that have little to do with nations secretly attempting to corrupt and target infrastructure. And though EternalBlue, as Perlroth notes, was disclosed to Microsoft before the attack, creating and deploying patches is difficult, particularly for critical infrastructure.

“Often any software updates to these systems need to be approved at high levels, and often only occur during narrow maintenance windows or when it is safe to pull systems offline — which can easily mean once or twice a year,” she writes. “Even critical patches, like the one Microsoft had rolled out for EternalBlue’s underlying bugs [before NotPetya], are not applied if there is even the smallest chance of disruption.”

With this in mind, the NotPetya attack becomes a cautionary tale about the inherent risks that come with building tightly coupled and complex infrastructure. The problem is that the software ecosystem that we have created makes these catastrophic failures possible in the first place. Improving our collective security, then, isn’t simply a question of limiting offensive operations (and Perlroth, ultimately, does not argue that it is). It’s a matter of creating real consequences for abhorrent behavior in the short term and minimizing the targets in the long term.

Limiting the harms associated with the digital arms race is about drawing and then enforcing clear lines and limits on how these weapons are used. Not all exploits and attacks are alike. In some cases, state-based attacks targeting critical infrastructure might be justified. A universal prohibition of aggression in cyberspace is not only unlikely, but also undesirable. Perlroth catalogs the harms caused by the development of malicious intrusions and attacks. But she also notes some successes, arguing that Stuxnet, a cyberattack that targeted the Iranian enrichment facility in Natanz, likely prevented a larger regional armed confrontation over Iran’s nuclear program.

“Stuxnet was a masterpiece,” she writes. “It kept Israeli jets on the ground. Fewer people are dead because of it. It set Iran’s nuclear programs back years and arguably helped push Tehran to the negotiating table.”

Stuxnet was a scalpel; NotPetya was a grenade. There is a real distinction here, but all too often it doesn’t appear to have been made into a difference. Russia, whose own businesses and networks were attacked by NotPetya when it boomeranged back, was ultimately let off relatively easily. The US eventually imposed sanctions and named some of those responsible, but it wasn’t a response equal to the attack. The problem is that the US and its allies have not moved to impose real costs and consequences when states engage in large, indiscriminate attacks; that they haven’t confronted regimes that target journalists and civil society members, through digital means or otherwise; and that diplomacy between rivals and allies can break down to the point where a risky sabotage operation is seen as the only way to prevent a devastating war. Solving these issues can make fixing even the most critical bugs start to look easy.

A larger reconsideration is also in order. Buggy code is going to exist, and malicious actors are going to try to take advantage of those flaws. Making vulnerabilities more difficult to find and harder to exploit remains important. But addressing the larger forces that drive infrastructure insecurity is vital. For decades, public policy and corporate design have pushed our infrastructure into increasingly lean and intricate formations. Complex software binds the world together, sitting behind what were once discrete and isolated systems. Even the smallest failure can have devastating, cascading effects. Bolting on additional security and safety practices has often made these systems even more complex and tightly integrated, creating new possibilities for the very failures they were designed to mitigate. A larger rethink is in order, which Perlroth considers in the closing pages of her book.

“A secure architecture entails identifying the most critical systems — the crown jewels — whether that is customer data, medical records, trade secrets, production systems, or the brakes and steering systems in our cars, and compartmentalizing them from noncritical systems, allowing interoperability only where critical,” she writes.
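Perlroth’s prescription — compartmentalize the crown jewels and allow interoperability only where critical — is essentially a default-deny policy. A minimal sketch, with assumed zone names and an illustrative allowlist (none of this comes from the book):

```python
# Toy default-deny segmentation: traffic between zones is blocked unless the
# (source zone, destination zone) pair is explicitly allowed.
# Zone names and the allowlist below are hypothetical examples.
ALLOWED_FLOWS = {
    ("office", "billing"),    # staff workstations may reach the billing app
    ("billing", "records"),   # the billing app may query medical records
}

def flow_permitted(src_zone, dst_zone, allowed=ALLOWED_FLOWS):
    """Permit same-zone traffic or an explicitly allowed cross-zone pair."""
    return src_zone == dst_zone or (src_zone, dst_zone) in allowed

# A compromised office workstation cannot reach the records zone directly;
# an attacker would have to traverse (and separately compromise) billing.
print(flow_permitted("office", "records"))
```

The design choice is the direction of the default: interoperability is the exception that must be argued for, rather than the baseline that must be argued away — which is what makes a worm’s breadth-first spread stop at zone boundaries.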

This is easier said than done. Economic incentives, network effects, and brute corporate power typically push in the other direction, toward monopoly, complexity, and tight coupling. These forces, as much if not more than the cyberweapons arms race, are behind our current predicament of instability and insecurity. Encouraging looseness and diversity within infrastructure makes vulnerabilities and failures smaller, and turns potential catastrophes into relatively manageable problems. The world, in other words, does not end.


