by Mimansa Ambastha (L.L.M. 2019)
Modern information technology is riddled with vulnerabilities, from software code and algorithms to hardware security systems. Security professionals around the world agree that no technology can claim to be 100% secure against manipulation, because commercial pressures to favor functionality over airtight security leave every product with inherent flaws. One such flaw is a ‘zero-day vulnerability’: a vulnerability that remains unknown to the software vendor or manufacturer and that anyone with the requisite capability can exploit to launch immediate cyber-attacks (leaving security professionals zero days to fix the problem). While zero-day vulnerabilities threaten user safety, they also provide opportunities for government agencies to build targeted Internet surveillance tools for law enforcement purposes, mass Internet surveillance tools for intelligence purposes, and cyber-weapons for military use.[1] Herein lies the dilemma: state agencies are tasked with protecting the nation, a task that involves both securing the nation’s systems and gathering valuable intelligence against actual and potential adversaries. The former requires an agency to disclose any vulnerability to the vendor so that it may be patched; the latter requires restricting disclosure and exploiting the vulnerability against potential adversaries at the cost of general cybersecurity. Treated as an “equities” issue between conflicting national security value propositions, this dilemma gave birth to the Vulnerability Equities Process (“VEP”),[2] a high-level inter-agency deliberation process that guides the United States government (“USG”) in deciding whether to disclose or restrict information about zero-day vulnerabilities.
The VEP is the result of a decade-long government review of cyber capabilities. In 2008, the George W. Bush Administration directed a working group to develop a joint plan for improving the government’s offensive capabilities and protecting both government and public information systems.[3] This working group recommended adopting a VEP to strike a balance between the government’s “offensive and defensive mission interests” upon the discovery of a vulnerability.[4] In 2010, a working group led by the Obama Administration’s Director of National Intelligence took up this recommendation, producing the VEP in a document titled “Commercial and Government Information Technology and Industrial Control Product or System Vulnerabilities Equities Policy and Process.”[5] However, it wasn’t until 2014 that the VEP became publicly known, due in part to a blog post by then Special Assistant to the President and Cybersecurity Coordinator, Michael Daniel,[6] and a lawsuit[7] filed by the Electronic Frontier Foundation under the Freedom of Information Act that sought access to VEP workings surrounding the ‘Heartbleed bug’. Finally, in January 2016, the government released a redacted version of the 2010 VEP. After public insistence, this was followed by a superseding VEP “charter” released by the Trump Administration in November 2017.[8]
The VEP creates an Equities Review Board (“ERB”) to make disclosure decisions. The ERB comprises representatives from multiple stakeholder agencies,[9] with the NSA acting as Executive Secretariat. The process covers only newly discovered zero-day vulnerabilities—that is, those discovered after the effective date of the initial VEP[10] and not publicly known to the vendor or supplier.[11] These could include exploits[12] developed by an agency or acquired on the open market. Once a member agency identifies such a vulnerability, it may submit relevant information to the Executive Secretariat specifying its nature, the impacted products and services, and a recommendation for disclosure or restriction.[13] Other agencies claiming equities in the identified vulnerability can also submit their recommendations, which the ERB uses to reach a decision by majority vote.[14] Decisions should generally favor disclosure, and should favor restriction only in the exceptional case of a demonstrable, overriding interest in using the vulnerability for lawful intelligence, law enforcement, or national security purposes.[15]
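To make the mechanics described above concrete, the sketch below models the ERB’s submit-and-vote flow in Python. It is purely illustrative: the class names, agencies, and tie-breaking rule are assumptions invented for exposition, and the actual process is a classified inter-agency deliberation rather than a computation.

```python
from dataclasses import dataclass, field
from enum import Enum

class Decision(Enum):
    DISCLOSE = "disclose to the vendor for patching"
    RESTRICT = "retain for lawful intelligence, law enforcement, or national security use"

@dataclass
class Submission:
    agency: str               # member agency claiming an equity in the vulnerability
    recommendation: Decision  # that agency's recommended handling

@dataclass
class VulnerabilityCase:
    description: str                                  # nature of the flaw and impacted products/services
    submissions: list = field(default_factory=list)   # Submission objects from interested agencies

def erb_decision(case: VulnerabilityCase) -> Decision:
    """Illustrative model of the Equities Review Board vote: a simple
    majority among submitting agencies, with the charter's presumption
    in favor of disclosure breaking ties or an empty docket."""
    restrict = sum(s.recommendation is Decision.RESTRICT for s in case.submissions)
    disclose = len(case.submissions) - restrict
    # Restriction prevails only with a clear majority asserting a
    # demonstrable, overriding operational interest.
    return Decision.RESTRICT if restrict > disclose else Decision.DISCLOSE

# Hypothetical example: two agencies favor disclosure, one favors retention.
case = VulnerabilityCase(
    description="zero-day in a widely deployed enterprise product",
    submissions=[
        Submission("DHS", Decision.DISCLOSE),
        Submission("Commerce", Decision.DISCLOSE),
        Submission("DoD", Decision.RESTRICT),
    ],
)
print(erb_decision(case).value)  # -> "disclose to the vendor for patching"
```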
Exploiting vulnerabilities is an established and often necessary tool for the USG to gain valuable intelligence and conduct foreign espionage. Indeed, given that America’s allies and rivals around the world are engaged in a cyber arms race,[16] immediate disclosure of all strategic vulnerabilities by the government might amount to unilateral disarmament.[17] Further, digital exploits have helped law enforcement dismantle crime rings,[18] nab kidnappers,[19] shut down child pornography websites,[20] monitor the digital trails of criminal operations, and hack suspects’ computers or mobile devices for investigation and evidence.[21] However, it is equally true that a growing number of undisclosed and unpatched vulnerabilities threatens the resilience of the nation’s digital infrastructure. In fact, recent reports indicate that 49% of Americans do not feel confident about the federal government’s ability to protect their data.[22] Moreover, vulnerabilities significantly expand the government’s surveillance capabilities over its citizens, especially in light of recent amendments to Rule 41 of the Federal Rules of Criminal Procedure, which authorize “remote access” warrants for mass hacking of computers.[23] This raises further concerns about the government using such powers to crack down on political dissidents and opposing perspectives. Vulnerabilities can also be used to manipulate public opinion and tamper with voting machines.[24] Thus, the efficiency of the VEP’s decision-making ‘balancing act’ is crucial for a well-functioning democracy. In its current form, however, the VEP leaves much to be desired and has created new problems of its own.
One of the biggest concerns about the VEP is that it provides impetus for a growing market in undisclosed vulnerabilities. While this market has existed for some time, it was previously dominated by companies offering “bug bounties” to those who found security vulnerabilities in their products.[25] This incentivized hackers to disclose vulnerabilities so they could be patched, improving software safety. It also encouraged private vendors to develop safer products in the first place, rather than shoulder the cost and bad press of announcing and patching each new vulnerability.[26] Over the past decade, however, cyberspace has become a favored battlefront for nations and non-state actors alike to remotely trigger debilitating cyber-attacks like Stuxnet, WannaCry, and NotPetya, leading to failures of critical infrastructure and massive economic disruption.[27] This in turn has caused demand for vulnerability exploits to grow exponentially, creating a highly lucrative market: Western nations, led by the US,[28] offer 10 to 100 times[29] the rewards offered by software companies.[30] This has led to the rise of private malware vendors[31] who commonly offer zero-day exploits in popular products and services, including Microsoft Word, Adobe Reader, and Apple’s iOS operating system.
[Table: Estimated Price List for Zero-Day Exploits for Popular Software Products[32]]
The enticing purchase prices of zero-day exploits have caused researchers to stop disclosing them through bug-bounty programs, opting instead to sell them to the highest bidder.[33] Defense contractors like ManTech, Booz Allen Hamilton, Harris, and Raytheon, for example, have reportedly acquired formal US government contracts to infiltrate targeted software.[34] Even mere brokers, who arrange transactions between government agencies and hackers, make millions of dollars.[35] At the cost of public cybersecurity, this ‘reverse’ market trend has dismantled the earlier structure that encouraged public disclosure of vulnerabilities. Concerns that software programmers may deliberately create vulnerabilities in a company’s products in order to sell them to government agencies later[36] further weaken market incentives for secure software.[37]
Malware vendors often sell exploits non-exclusively to multiple government agencies at once.[39] Even where exploits are purchased with accompanying exclusivity agreements,[40] a vulnerability’s continued secrecy is never guaranteed, since someone else may independently discover it.[41] In this sense, every decision to retain a vulnerability without fixing it arguably increases the risk to national systems.
High purchase prices point to another disturbing criticism of the VEP: even though the USG claims to disclose up to 90% of vulnerabilities through the process,[42] the government could simply choose not to disclose the few high-severity flaws that pose the greatest security concerns. Given the five- to six-figure prices it pays for vulnerabilities, it seems logical that the USG is paying so much only for high-stakes vulnerabilities that will be retained for law enforcement or intelligence purposes, not for the benevolent purpose of disclosing them. This underscores an oft-repeated concern that many such purchased vulnerabilities may be deliberately kept out of the VEP disclosure process under the open-ended language of the ‘exceptions’ to disclosure. Agencies can overuse these exceptions, or stretch them beyond their reasonable interpretation, to keep vulnerabilities from entering the process in the first place. And although the VEP provides for an annual internal audit that “may” be shared with Congress,[45] the audit offers little help without meaningful descriptions of how vulnerabilities were identified. This is especially true if agencies adopt differing substantive interpretations of what counts as a VEP vulnerability, or submit operational end-to-end exploits instead of technically specified vulnerabilities.[46] For example, the FBI purchased a so-called “black-box exploit”[47] to access a suspect’s iPhone after the 2015 San Bernardino attack. The FBI later stated that it was unable to submit information on the vulnerability to the VEP because it had not purchased the rights to the technical details from the third-party seller.[48] This raises additional concerns that purchase contracts could be intentionally structured to withhold the technical components that would trigger the VEP, thereby avoiding the process entirely.[49]
If the treasure trove of vulnerabilities amassed by the USG were stolen or leaked, it could quickly become ammunition for devastating cyber-attacks. Case in point: the 2017 “WannaCry” ransomware attack, which exploited a vulnerability in Windows’ Server Message Block (SMB) protocol, crippling some 10,000 organizations (including hospitals and transportation systems) in 150 countries and causing an estimated $4-8 billion in damage.[50] The exploit behind it, “EternalBlue,” had been developed and used by the NSA for years before it was stolen and leaked in a 2016 breach.[51] Despite the NSA’s warnings before WannaCry’s onset and Microsoft’s hastily issued patch, many systems remained unpatched and were compromised.[52] A few months later, “EternalBlue” became the basis for the NotPetya malware attack, described as the most destructive global cyberattack in history, which caused up to $10 billion in damage worldwide.[53] 2017 also saw WikiLeaks release dozens of exploits used by the CIA to hack Android and iOS smartphones, the sheer number of which suggested violations of the VEP.[54] Worse still, WikiLeaks indicated that the vulnerabilities it released had been in the hands of hackers before it even published them.[55] Such security breaches cast a harsh new light on vulnerability retention under the VEP. Eliminating them is crucial if the VEP is to remain justifiable.
The VEP was not fully declassified until 2017, and it will surely be fine-tuned in the coming years with help from the public. There is much room for improvement. As an executive creation, the VEP does not carry the legislative sanction typically required when citizens grant their government powers of surveillance and policing in exchange for greater security. In fact, neither citizens nor their elected representatives currently have any participating or deciding role in the VEP, despite the substantial threats that unpatched vulnerabilities pose to their interests. Comprehensive legislation on this topic would be difficult, given the constantly evolving nature of vulnerabilities and their impacts, not to mention the need for secrecy, high-level expertise, and quick decision-making. To strike a balance, the VEP’s composition could be diversified by adding an elected representative (or two) and civilian agencies that can counter any bias toward retention that intelligence and military agencies may harbor. The Executive Secretariat function should also be transferred from the NSA to the Department of Homeland Security to remove any appearance of NSA bias.[58] Agencies in general must maintain the highest possible security standards around their stockpiles of exploits. Vulnerabilities enormously empower agencies’ national security operations, but citizens cannot be expected to accept the VEP bargain if agencies fail in their primary responsibility of keeping these vulnerabilities out of the wrong hands.
There also needs to be additional transparency concerning the VEP’s implementation. The process no doubt attempts to make disclosure the rule and restriction the exception, and lays down fair criteria[59] for calculating the net benefit to overall national security. These include factors such as the vulnerability’s demonstrated and future value, its operational effectiveness, the possibility of patching and risk mitigation, the likelihood of third-party discovery, and the existence of alternatives.[60] However, participating agencies retain too much discretion in defining vulnerabilities, which keeps many of them from entering the process in the first place. Even when a vulnerability is known to be undisclosed, there is no way of knowing whether that was a determination of the ERB under the VEP or whether the vulnerability simply never entered the process at all.[61] It is no wonder, then, that both the House of Representatives[62] and the Senate[63] have introduced bills imposing basic reporting requirements for vulnerabilities. Passage of such laws would be a good first step toward improving the quantity and quality of disclosures. Further, civilian agencies participating in the VEP, while beyond the scope of these bills, should voluntarily report the same information as their intelligence counterparts to improve overall transparency.
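As a rough illustration of how the factors listed above might be weighed against one another, the following sketch scores a hypothetical vulnerability. The factor names track the criteria as summarized in this piece, but the numeric weights, scores, and threshold are assumptions chosen purely for exposition; the real deliberation is qualitative and case-specific.

```python
# Hypothetical weighting of the equity factors summarized above. The
# weights, scores, and threshold are invented for illustration; they do
# not appear anywhere in the VEP charter.
EQUITY_FACTORS = {
    # factor: (weight toward retention, assessed score between 0 and 1)
    "demonstrated and future intelligence value": (0.30, 0.8),
    "operational effectiveness of the exploit":   (0.20, 0.7),
    "difficulty of patching or risk mitigation":  (0.15, 0.3),
    "low likelihood of third-party discovery":    (0.20, 0.4),
    "absence of alternative means":               (0.15, 0.5),
}

def retention_score(factors):
    """Weighted sum of assessed scores; a higher value favors retention."""
    return sum(weight * score for weight, score in factors.values())

score = retention_score(EQUITY_FACTORS)
# Under the charter's presumption of disclosure, only an unusually high
# score would justify restriction (the 0.75 threshold is arbitrary).
decision = "restrict" if score > 0.75 else "disclose"
print(f"retention score = {score:.2f} -> {decision}")  # 0.58 -> disclose
```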
Finally, there are strong indications that the VEP in fact legitimizes and bolsters the market for zero-day vulnerabilities, reducing vendors’ remediation of software flaws. This issue, while perhaps difficult to address, is significant: by incentivizing hackers to keep vulnerabilities secret, government participants in this market may become indirectly complicit in attacks on network infrastructure. One possible way to address the issue is to look beyond binary disclose-or-retain determinations and instead consider the alternatives contemplated in the VEP itself. Restriction of a vulnerability should be justified only when it provides an enormous benefit that is irreplaceable and unobtainable by other means. Importantly, the USG needs to prioritize sharing information with private vendors rather than alienating them: much of America’s IT infrastructure and cyber-capability is privately developed, and lasting trust and cooperation with the private sector will, realistically, be necessary.
The (present) impossibility of creating ‘perfect’ technology leaves us to make peace with the existence of vulnerabilities—hidden or otherwise—that will continue to play a role in cybersecurity and national security. However, we should keep watch for excessive government use, which frequently undermines the cybersecurity of citizens and enterprises alike. The norms described above surrounding transparency, participation, and cooperation with civilian and market stakeholders—though cumbersome—ought to be implemented as an initial step toward fairly weighing the choices that the two security propositions present. After all, the government is of the people, by the people, and for the people, before anything else.
[1] Bruce Schneier, ‘The Vulnerabilities Market and the Future of Security’, Forbes, May 30, 2012 (https://www.forbes.com/sites/bruceschneier/2012/05/30/the-vulnerabilities-market-and-the-future-of-security/#31fffacd7536)
[2] Vulnerabilities Equities Policy and Process for the United States Government, November 15, 2017 (https://www.whitehouse.gov/sites/whitehouse.gov/files/images/External%20-%20Unclassified%20VEP%20Charter%20FINAL.PDF)
[3] Ari Schwartz and Rob Knake, “Government’s Role in Vulnerability Disclosure,” Discussion Paper 2016-04, June 2016, at http://www.belfercenter.org/sites/default/files/legacy/files/vulnerability-disclosure-web-final3.pdf.
[4] The White House, “HSPD-54/HSPD-23 Cybersecurity Policy,” presidential directive, January 8, 2008, at https://epic.org/privacy/cybersecurity/EPIC-FOIA-NSPD54.pdf.
[5] Director of National Intelligence, “Commercial and Government Information Technology and Industrial Control Product or System Vulnerabilities Equities Policy and Process,” directive, at https://www.eff.org/files/2015/09/04/document_71_-_vep_ocr.pdf.
[6] Michael Daniel, ‘Heartbleed: Understanding When We Disclose Cyber Vulnerabilities’, April 28, 2014 ( https://obamawhitehouse.archives.gov/blog/2014/04/28/heartbleed-understanding-when-we-disclose-cyber-vulnerabilities)
[7] Complaint, Electronic Frontier Foundation v. National Security Agency (https://www.eff.org/files/2014/07/01/eff_v_nsa_odni_-_foia.pdf )
[8] Supra note 2.
[9] Agencies include Department of Defense (including the NSA), Department of Justice (including the FBI), Department of State, Department of Energy, Department of Homeland Security, Central Intelligence Agency, Defense Intelligence Agency, Office of Management and Budget, Department of the Treasury, Department of Commerce etc. Supra note 2 at 3.
[10] February 16, 2010, which is the date when the VEP was first conceptualized as the “Commercial and Government Information Technology and Industrial Control Product or System Vulnerabilities Equities Policy and Process” under a working group led by the Director of National Intelligence during the Obama Administration.
[11] Supra note 2, Annex A at 11.
[12] An exploit is the manner of transforming a vulnerability into an actual tool to breach a system.
[13] Supra note 2 at 7-9.
[14] Id.
[15] Supra note 2 at 1.
[16] ‘China tips the scale of global cybersecurity by hoarding vulnerabilities’, AccessNow, September 20, 2018 (https://www.accessnow.org/china-tips-the-scale-of-global-cybersecurity-by-hoarding-vulnerabilities/)
[17] White House Statement – Rob Joyce, ‘Improving and Making the Vulnerability Equities Process Transparent is the Right Thing to Do’, November 15, 2017 ( https://www.whitehouse.gov/articles/improving-making-vulnerability-equities-process-transparent-right-thing/)
[18] ‘The Secret World of Vulnerability Hunters’, The Christian Science Monitor, February 10, 2017, (https://www.csmonitor.com/World/Passcode/2017/0210/The-secret-world-of-vulnerability-hunters)
[19] Id.
[20] ‘The FBI Used a ‘Non-Public’ Vulnerability to Hack Suspects on Tor’, MotherBoard, November 29 2016, (https://motherboard.vice.com/en_us/article/kb7kza/the-fbi-used-a-non-public-vulnerability-to-hack-suspects-on-tor)
[21] Supra note 18
[22] ‘How Americans have viewed government surveillance and privacy since Snowden leaks’, Pew Research Center, June 4, 2018 (http://www.pewresearch.org/fact-tank/2018/06/04/how-americans-have-viewed-government-surveillance-and-privacy-since-snowden-leaks/)
[23] Jennifer Daskal, ‘Rule 41 Has Been Updated: What’s Needed Next’, JustSecurity, December 5, 2016 (https://www.justsecurity.org/35136/rule-41-updated-needed/); ‘Help Us Stop the Updates to Rule 41’, Electronic Frontier Foundation, June 16, 2016 (https://www.eff.org/deeplinks/2016/06/help-us-stop-updates-rule-41)
[24] ‘The Vulnerabilities of Our Voting Machines’, Scientific American, November 1, 2018 ( https://www.scientificamerican.com/article/the-vulnerabilities-of-our-voting-machines/)
[25] Supra note 1.
[26] Id.
[27] Id.
[28] ‘Shopping For Zero-Days: A Price List For Hackers’ Secret Software Exploits’, Forbes, March 23, 2012 (https://www.forbes.com/sites/andygreenberg/2012/03/23/shopping-for-zero-days-an-price-list-for-hackers-secret-software-exploits/#25419cf72660)
[29] In August 2017, Apple launched a $200,000 bug bounty program for finding vulnerabilities in its products. Zerodium, a highly popular online marketplace for vulnerabilities, offered $1.5 million for a “fully functional zero-day exploit” for cracking iOS 10, Apple’s mobile operating system. See Supra note 18.
[30] ‘Meet The Hackers Who Sell Spies The Tools To Crack Your PC (And Get Paid Six-Figure Fees)’, Forbes, Mar 21, 2012 ( https://www.forbes.com/sites/andygreenberg/2012/03/21/meet-the-hackers-who-sell-spies-the-tools-to-crack-your-pc-and-get-paid-six-figure-fees/#c54c2d31f745)
[31] Examples include Northrop Grumman Corporation, Vupen Security, Netragard Inc., Endgame, Raytheon etc.
[32] Supra note 28.
[33] Id.
[34] Supra note 18.
[35] Supra note 28.
[36] Supra note 1.
[37] Marcia Hofmann & Trevor Timm, ‘”Zero-day” exploit sales should be key point in cybersecurity debate’, Electronic Frontier Foundation, March 29, 2012 (https://www.eff.org/deeplinks/2012/03/zero-day-exploit-sales-should-be-key-point-cybersecurity-debate)
[39] Supra note 30.
[40] Id.
[41] Supra note 18.
[42] ‘Feds Explain Their Software Bug Stash—But Don’t Erase Concern’, Wired, November 15, 2017 (https://www.wired.com/story/vulnerability-equity-process-charter-transparency-concerns/)
[43] Michelle Richardson, ‘Locking in Transparency on the Vulnerabilities Equities Process,’ JustSecurity, July 27, 2018 (https://www.justsecurity.org/59795/locking-transparency-vulnerabilities-equities-process/)
[44] Supra note 2 at 10.
[45] Id at 5.
[46] Michelle Richardson and Mike Godwin, ‘What the White House Needs to Disclose about its Process for Revealing Cybersecurity Vulnerabilities’, JustSecurity, November 2, 2017, (https://www.justsecurity.org/46647/white-house-disclose-process-revealing-cybersecurity-vulnerabilities/)
[47] A black-box exploit provides working functionality to breach a system without revealing the internal structure or workings of the vulnerability it is based on. See Gao, Tsao et al., Testing and Quality Assurance for Component-Based Software, Artech House, p. 170 (2003), ISBN 978-1-58053-735-3.
[48] ‘FBI: Sorry, But We’re Keeping the iPhone Crack Secret’, Fortune, April 27, 2016 (http://fortune.com/2016/04/27/fbi-apple-iphone-crack/)
[49] Vulnerabilities Equities Process, Electronic Privacy Information Center, (https://epic.org/privacy/cybersecurity/vep/)
[50] ‘Why Governments Won’t Let Go of Secret Software Bugs’, Wired, May 16, 2017 (https://www.wired.com/2017/05/governments-wont-let-go-secret-software-bugs/)
[51] ‘NSA officials worried about the day its potent hacking tool would get loose. Then it did.’ Washington Post, May 16, 2017 (https://www.washingtonpost.com/business/technology/nsa-officials-worried-about-the-day-its-potent-hacking-tool-would-get-loose-then-it-did/2017/05/16/50670b16-3978-11e7-a058-ddbb23c75d82_story.html?utm_term=.57719c9bd5aa)
[52] Id.
[53] ‘The Untold Story of NotPetya, the Most Devastating Cyberattack in History’, Wired, August 22, 2018 (https://www.wired.com/story/notpetya-cyberattack-ukraine-russia-code-crashed-the-world/ )
[54] ‘How the CIA’s Hacking Hoard Makes Everyone Less Secure’, Wired, March 2017 (https://www.wired.com/2017/03/cias-hacking-hoard-makes-everyone-less-secure/)
[55] ‘Vault 7: CIA Hacking Tools Revealed’, WikiLeaks Press Release March 7, 2017 (https://wikileaks.org/ciav7p1/)
[58] Supra note 3 at 15.
[59] Supra note 2, Annex B at 13.
[60] Id.
[61] Supra note 46.
[62] Section 1510 of H.R. 6237 ( https://www.congress.gov/bill/115th-congress/house-bill/6237/text )
[63] Section 721 of S. 3153 ( https://www.congress.gov/bill/115th-congress/senate-bill/3153/text )