Hold North Korea Accountable for WannaCry—and the NSA, Too

Credit to Author: Andy Greenberg | Date: Tue, 19 Dec 2017 19:33:56 +0000

Seven months after the WannaCry ransomware ripped across the internet in one of the most damaging hacking operations of all time, the US government has pinned that digital epidemic on North Korea. And while cybersecurity researchers have suspected North Korea's involvement from the start, the Trump administration intends the official charges to carry new diplomatic weight, showing the world that no one can launch reckless cyberattacks with impunity. "Pyongyang will be held accountable," White House cybersecurity chief Tom Bossert wrote in an opinion piece for the Wall Street Journal.

But for some in the cybersecurity community who watched WannaCry's catastrophe unfold, North Korea isn't the only party that requires accountability. They argue that if guilty parties are going to be named—and lessons are to be learned from naming them—those names should include the US government itself. At least some of the focus, they say, belongs on the National Security Agency, which built and then lost control of the code that was integrated into WannaCry, and without which its infections wouldn't have been nearly as devastating.

"As we talk about to whom to attribute the WannaCry attack, it’s also important to remember to whom to attribute the source of the tools used in the attack: the NSA," says Kevin Bankston, the director of the New America Foundation's Open Technology Institute. "By stockpiling the vulnerability information and exploit components that made WannaCry possible, and then failing to adequately shield that information from theft, the intelligence community made America and the world’s information systems more vulnerable."

For many cybersecurity researchers, in fact, WannaCry has come to represent the dangers not only of rogue states using dangerous hacking tools, but of the US government building those tools and using them in secret, too.

WannaCry's origins stretch back to April, when a group of mysterious hackers calling themselves the Shadow Brokers publicly released a trove of stolen NSA code. The tools included an until-then-secret hacking technique known as EternalBlue, which exploits flaws in a Windows protocol known as Server Message Block to remotely take over any vulnerable computer.

While the NSA had warned Microsoft about EternalBlue after it was stolen, and Microsoft had responded with a patch in March, hundreds of thousands of computers around the world hadn't yet been updated. When WannaCry appeared the next month, it used the leaked exploit to worm through that massive collection of vulnerable machines, taking full advantage of the NSA's work.
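For a rough sense of the exposure involved, here is a minimal Python sketch of the kind of crude inventory check defenders ran in WannaCry's wake: it sweeps a local address range and flags hosts that still answer on TCP port 445, the SMB service EternalBlue targeted. The 192.168.1.0/24 range and half-second timeout are assumptions for illustration, and an open port alone says nothing about patch status; a real assessment would use a dedicated MS17-010 scanner rather than a bare port check.

```python
# Sketch: flag hosts on a local range that still expose SMB (TCP 445),
# the service EternalBlue attacked. Illustrative only; the network range
# and timeout are assumptions, and an open port does not prove a host is
# unpatched, only that it exposes SMB at all.
import ipaddress
import socket

NETWORK = "192.168.1.0/24"  # hypothetical local range
TIMEOUT = 0.5               # seconds to wait per host

def smb_port_open(host: str) -> bool:
    """Return True if the host accepts a TCP connection on port 445."""
    try:
        with socket.create_connection((host, 445), timeout=TIMEOUT):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    exposed = [str(ip) for ip in ipaddress.ip_network(NETWORK).hosts()
               if smb_port_open(str(ip))]
    print(f"{len(exposed)} host(s) exposing SMB on {NETWORK}")
    for host in exposed:
        print("  " + host)
```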

Exactly how the Shadow Brokers obtained the NSA's highly protected arsenal of digital penetration methods remains a conundrum. But in recent years, two NSA staffers have been indicted for taking home top-secret materials, including collections of highly classified hacking tools. In one of those cases, NSA staffer Nghia Hoang Pho also ran Kaspersky antivirus on his home computer, allowing the Russian security firm to upload that trove of NSA code to its own servers, although the company insists that it subsequently destroyed its copy of the code as soon as it realized what it had scooped up. It's not clear if either of the two staffers' security breaches led to the Shadow Brokers' theft.

'To have a discussion about accountability for North Korea without the discussion of how they got the material for the attack in the first place is irresponsible at best, and deceptive at worst.'

Former NSA Analyst Jake Williams

Despite those security breaches, Bossert's 800-word statement about "accountability" for North Korea's hackers who created and launched WannaCry didn't once mention the NSA's accountability for creating, and failing to secure, the ingredients for that disaster, notes Jake Williams, a former NSA hacker himself and the founder of Rendition Infosec. "If someone blew up a bomb in New York City and the Syrian government had given them the fissile material to make it, we’d be holding them accountable," says Williams. "North Korea couldn't have done this without us. We enabled the operation by losing control of those tools."

In a press conference Tuesday, Bossert did indirectly acknowledge the role of the NSA's leak in making WannaCry possible when questioned about it. "The government needs to better protect its tools, and things that leak are very unfortunate," he said. "We need to create security measures to better protect that from happening."

But at other times in his press conference, Bossert seemed to avoid any direct statement that North Korea had used leaked NSA code in its malware, while also shifting blame to the previous administration. "The underlying vulnerability of the software that [North Korea] exploited predated and pre-existed our administration taking power," Bossert said. "I don’t know what they got and where they got it, but they certainly had a number of things cobbled together in a pretty complicated, intentional tool that does harm that they didn't entirely create themselves."

That muddied statement is the opposite of accountability, Williams argues. "We bear a large piece of the blame on this," he says. "To have a discussion about accountability for North Korea without the discussion of how they got the material for the attack in the first place is irresponsible at best and deceptive at worst."

To the NSA's credit, it did in fact inform Microsoft about its EternalBlue tool, in time for Redmond to push out a patch before WannaCry occurred. But that patch doesn't absolve the NSA of responsibility for having created and lost control of EternalBlue in the first place, Williams says.

Thanks to the complications of patching millions of Windows computers, a large fraction of machines never received Microsoft's security fix. And WannaCry wasn't the only malware to exploit that gap: other hackers, including the likely Russian operators behind the destructive NotPetya worm, used EternalBlue too. Even now, Williams points out, attackers still reuse the NSA's original code rather than recreating EternalBlue's attack from scratch, a sign that the exploit is complex enough that it might never have existed without the NSA's leak. "Absent that, I don't know if we’d see a weaponized exploit for this vulnerability," Williams says.

The question of accountability for WannaCry is just one case in a long-running debate about whether and when the NSA should maintain hacking tools that exploit secret vulnerabilities in software, rather than reveal those vulnerabilities to software companies who can fix them.

For the last decade, the NSA has abided by rules known as the Vulnerabilities Equities Process, which determine when the government should reveal hackable flaws to vendors and when it should exploit them in secret. The Trump administration has promised a more transparent implementation of the VEP than the Obama administration's, and has said that more than 90 percent of the vulnerabilities the government finds will be reported to companies so that they can be fixed. "Vulnerabilities exist in software," Bossert said in his press conference Tuesday. "When we find vulnerabilities, we generally identify them and tell the companies so they can patch them."

But some critics point out that even the Trump administration's revamped VEP has problems. The review board that chooses which vulnerabilities will be released and which hoarded in the dark is weighted toward intelligence and law enforcement agencies, according to the Open Technology Institute. It doesn't include what the OTI describes as "meaningful reporting requirements" to Congress or the public about how vulnerabilities are treated. And the VEP remains White House policy rather than law, so it's subject to change at any time.

All of which means that the discussion of accountability for WannaCry—and any other cyberattack that uses the NSA's leaked hacking tools—should include accountability for our own government's role in those debacles, too.

"Without continued reforms to the White House’s vulnerability equities process and ultimate codification of that process into law," says the OTI's Bankston, "one of our biggest enemies when it comes to cybersecurity will continue to be ourselves."
