Same old story: 40 years of debating encryption (long version)

A while ago, Tresorit, an encryption software company, asked me to write a guest post called "The encryption debate: 40 years, the same arguments" for their company blog. Please check it out. Below you can read the unredacted, longer version of my guest blog.

Alarm bells must have been ringing at GCHQ, Britain’s top-secret signals intelligence agency, in 1976 (Diffie & Hellman, 1976). In a scientific paper, US researchers had just theorized, out in the open, about one of the most influential technologies – public-key encryption with asymmetric keys – a technique that promised almost unbreakable encryption. British intelligence agencies had discovered the same technique around 1973, but kept it top secret. If the enemy – the Soviet Union – were to use public-key cryptography, NATO’s ability to listen in on the Kremlin’s communications could be severely hampered. Meanwhile, the National Security Agency, a close Five Eyes partner of GCHQ, launched a secret initiative to contain the potential danger of widespread public use of the secret art of encryption. NSA allegedly tried to cut off government funding for encryption research – via the National Science Foundation – and even approached scientists, asking them not to publish their findings, to keep the secret of encryption hidden in the dark. Publishers and researchers were advised via gag orders not to publish research on cryptologic matters (Rid, 2016, p. 309). Academics protested and argued that science demands that such powerful knowledge be discussed and evaluated out in the open (Bari Kolata, 1980). NSA director Bobby Inman was not amused and directly addressed the scientific community, outlining the dangers of the widespread use of encryption:
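To see why this alarmed the SIGINT world, it helps to look at what Diffie and Hellman actually proposed: two parties can agree on a shared secret over a completely public channel, without ever meeting to exchange keys. The following is a minimal Python sketch of that key exchange; the tiny textbook prime and generator are illustrative assumptions only, and real deployments use moduli of thousands of bits or elliptic curves.

```python
# Minimal sketch of the Diffie-Hellman key exchange from the 1976 paper.
# p and g are toy textbook values; production systems use much larger primes
# (or elliptic curves) so that recovering the private exponents is infeasible.
import secrets

p = 23  # public prime modulus (toy-sized, illustrative only)
g = 5   # public generator

# Each side chooses a private exponent and publishes only g^x mod p.
alice_private = secrets.randbelow(p - 2) + 1
bob_private = secrets.randbelow(p - 2) + 1
alice_public = pow(g, alice_private, p)  # sent over the open channel
bob_public = pow(g, bob_private, p)      # sent over the open channel

# Both sides compute the same shared secret; an eavesdropper who only sees
# p, g and the two public values cannot (for large p) feasibly derive it.
assert pow(bob_public, alice_private, p) == pow(alice_public, bob_private, p)
```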

"There is a very real and critical danger that unrestrained public discussion of cryptologic matters will seriously damage the ability of this government to conduct signals intelligence and the ability of this government to carry out its mission of protecting national security information from hostile exploitation." (Inman, 1979)

For the first, but not the last, time in history the 'going dark' problem was born: the dystopian vision of a near future in which all communication would be encrypted and thus unreadable for SIGINT agencies worldwide. Around the same time, the digital or personal computer revolution began, and the problem of cybercrime and hacking became visible. Encryption was too useful against these threats. A ban was not feasible. Instead, NSA began to lobby its government to limit the export of cryptography. The Reagan administration thus classified cryptography as a dual-use technology, regulated like munitions.

In 1993, the same year the World Wide Web started its global diffusion, NSA tried another pitch by proposing the Clipper chip with the help of the FBI. Together they argued that terrorists, child molesters and criminals would be using encryption (without providing any numbers on how many of the approximately 900 annual wiretaps encountered encrypted messages) and thus could not be caught because the FBI was going dark too (Senate, 1994). This narrative would become a standard in government attempts to restrict encryption. The proposal followed a clear logic: if encryption was too valuable to be banned, maybe it could be influenced in NSA’s favor by setting a government standard that industry had to adopt in telephones and computer systems. The idea had a certain appeal among policymakers of the Clinton administration because it presented a compromise: the public could use safe, encrypted communication while NSA could maintain its eavesdropping capability. NSA developed Clipper with a built-in backdoor: not just the sender and recipient could read the encrypted communication, but also the government, which held a third key in escrow in a secret database.
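Conceptually, key escrow means that the session key protecting a conversation is wrapped not only for the recipient but also under a key the government keeps on file. The sketch below is a loose analogy in Python using the `cryptography` package's Fernet primitive; it does not reproduce Clipper's actual Skipjack cipher or its Law Enforcement Access Field, it only illustrates why a third wrapped copy of the key amounts to a backdoor.

```python
# Conceptual sketch of key escrow (not the real Clipper/Skipjack design):
# the message is encrypted with a session key, and that session key is then
# wrapped twice: once for the recipient, once under an escrow key that a
# third party (the government, in the Clipper proposal) holds in a database.
from cryptography.fernet import Fernet

recipient_key = Fernet.generate_key()  # held by the recipient
escrow_key = Fernet.generate_key()     # held in the escrow database

session_key = Fernet.generate_key()    # fresh key for this conversation
ciphertext = Fernet(session_key).encrypt(b"meet me at noon")

# The sender transmits the ciphertext plus two wrapped copies of the session key.
wrapped_for_recipient = Fernet(recipient_key).encrypt(session_key)
wrapped_for_escrow = Fernet(escrow_key).encrypt(session_key)

# The recipient unwraps the session key and reads the message ...
recovered_key = Fernet(recipient_key).decrypt(wrapped_for_recipient)
plaintext = Fernet(recovered_key).decrypt(ciphertext)

# ... but whoever holds the escrow key can do exactly the same, which is why
# the escrow database itself becomes a single, high-value point of failure.
escrowed_key = Fernet(escrow_key).decrypt(wrapped_for_escrow)
assert Fernet(escrowed_key).decrypt(ciphertext) == plaintext == b"meet me at noon"
```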

In 1994, a fierce public reaction followed. Republicans and Democrats, civil libertarians, hackers and tech entrepreneurs opposed the initiative, fearing that the Internet could spread with an inferior technical standard in place (Levy, 1994). Technical experts argued that a classified, unevaluated algorithm was a security risk and that the escrow database was not just economically costly and legally complicated, but also highly insecure in light of hacking attempts on U.S. digital infrastructures. Senators in Congress pointed out that no sane terrorist would use a technology that they knew the US government could listen in on. The bad guys would simply use a more secure alternative, freely available on the Internet (Senate, 1994). Ignoring these profound arguments, the government maintained its policy of export control and aligned itself with the particular interests of the FBI and NSA, arguing that Clipper was far superior to everything on the market. That argument was disproven in 1994, when a security researcher discovered a vulnerability in Clipper (Abelson et al., 2015).

The result of this episode was a general consensus among lawmakers, the public and the expert community that systems with a backdoor are fundamentally less secure than those without (Kehl, Wilson, & Bankston, 2015). Backdoors do not exist just for law enforcement, but also for hackers and for the intelligence agencies of other states. Around the same time, Russia and China, following the US example, had just launched their information warfare programs, utilizing hacking to disrupt digitized critical infrastructures (Baocun & Fei, 1995). At the end of the millennium, the Western democracies realized that "the overall health of the American computer industry was far more important to the security [SIGINT gathering] mission of NSA … America is simply more secure with unbreakable end-to-end encryption" (Hayden, 2016). In 1999, Germany published a policy paper mandating safe encryption and promising not to interfere with technical standards or ban encryption, but instead to promote its widespread use (Eckpunkte, 1999). Likewise, the US began lifting strict export controls and even began funding secure communication technologies such as Tor (Kehl et al., 2015).

Unfortunately, this consensus did not last very long. After the terrorist attacks of 9/11, most Western states came to see the Internet and encryption as a potential national security threat. Both democratic and authoritarian states used the pretext of the war on terror to launch surveillance initiatives like the NSA terrorist surveillance program (Risen & Lichtblau, 2005), the EU data retention directive (2006), Internet censorship (2008) or the first state-hacking initiatives. Encrypted services like VPNs or the Tor network became primary tools for circumventing nation-state surveillance. The Arab Spring of 2011 demonstrated how state surveillance and hacking could be used against citizens, and that encryption was an important, privacy-enhancing tool for journalists and activists worldwide. Therefore, it is no wonder that governments yet again set out to restrict the widespread use of encryption. Authoritarian countries like China, the United Arab Emirates or Iran were arguably the first to do so, for example by blocking VPNs or banning encrypted services like Tor that are used to circumvent Internet censorship.

But it was not just authoritarian regimes that began to target encryption. Around the same time, NSA and GCHQ began to undermine encryption standards with classified programs like Bullrun and Edgehill. These programs exploited weaknesses in routers and in software implementations of encryption, allowing VPN encryption to be bypassed. In 2013, Edward Snowden revealed the existence of multiple other programs, for example that NSA was vacuuming up unencrypted traffic between Google data centers by tapping fiber-optic cables on US domestic soil, and potentially even globally. As a consequence, many Internet giants like Google, Apple or Facebook began implementing encryption in their products, not just to shield customers from mass surveillance by nation-states (with Russia and China potentially doing the same), but also from cybercrime and hackers.

Then, in January 2015, the UK’s Conservative Prime Minister David Cameron launched a proposal to ban end-to-end encryption in messaging services such as WhatsApp or Apple’s iMessage in order to "deny terrorists a safe space to communicate". Like Clipper, the proposal would have mandated that manufacturers embed a backdoor in their systems that could be accessed by law enforcement. Manufacturers who did not comply would be prohibited from selling their software in the UK. In the summer of 2015, German interior minister Thomas de Maizière joined the chorus, demanding exceptional access for law enforcement. In the US, FBI Director James Comey began to lobby against Apple’s iOS security because it prevented brute-force attacks that guess the passcodes of iPhones. The US intelligence community complained about a legislative environment hostile to banning encryption, one that "could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement" (Nakashima & Peterson, 2015). The climate did indeed change with the Paris attacks of November and the San Bernardino shooting of December 2015. Since then, we have seen a coordinated campaign by Western democracies arguing, like their authoritarian counterparts, that the government should have access to encrypted communications, either via backdoors or via software exploits. The latest initiative by France and Germany is just another example of this general trend.
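The stakes in the Apple dispute come down to simple arithmetic. Without the escalating delays and attempt limits that iOS enforces, a short numeric passcode offers little protection; the back-of-the-envelope calculation below assumes roughly 80 milliseconds per guess, the figure Apple has cited for its on-device passcode key derivation, and is meant only as an order-of-magnitude illustration.

```python
# Back-of-the-envelope arithmetic for brute-forcing numeric device passcodes.
# The ~80 ms per attempt is an assumed figure (Apple has said its passcode key
# derivation is tuned to take about 80 ms on device); only the order of
# magnitude matters here.
SECONDS_PER_GUESS = 0.08

for digits in (4, 6):
    combinations = 10 ** digits
    worst_case_hours = combinations * SECONDS_PER_GUESS / 3600
    print(f"{digits}-digit passcode: {combinations:,} combinations, "
          f"~{worst_case_hours:.1f} hours to exhaust at 80 ms per guess")

# A 4-digit passcode falls in under a quarter of an hour and a 6-digit one in
# roughly a day, which is why iOS adds escalating delays and an optional wipe
# after ten failed attempts, the very protections the FBI wanted bypassed.
```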

This could be a crucial moment in time. In 1994, a vast majority of citizens and politicians opposed government regulation and weakening of encryption, for very good reasons. It was perceived as an unreasonable and potentially dangerous state encroachment on digital technologies. But it is not just privacy that demands strong encryption; national security does too. Former NSA director Michael Hayden, for example, argues convincingly: "the number one threat facing America is the cyber threat … I think the government has a right to demand this [weakening encryption], I just don’t know if it’s a wise thing for the government to demand this. My judgement is that we are probably better served by not punching any holes into a strong encryption system, even well-guarded holes." However, although almost all the good arguments of the Clipper debate are still valid, 2016 is not a time of reason and careful evaluation of the future implications of security demands. The constant fear of terrorism, the rise of right-wing law-and-order populism around figures like Donald Trump, Viktor Orbán or the German AfD, which disregards liberal values such as free speech and privacy, and the constant lobbying of law enforcement agencies for new surveillance capabilities are clouding the rational judgment of policymakers and citizens. Outside the expert and tech community, there is no grassroots momentum against government exceptional access and the weakening of encryption. Additionally, the interests of authoritarian regimes and democratic societies are in a peculiar alignment: both want to restrict the widespread use of encryption under the pretext of counter-terrorism. This is potentially dangerous because, in the worst case, it could lead to an international initiative prohibiting encryption. If the European Union, with its focus on human rights, privacy and freedom of speech, does not act as a steward of privacy-enhancing encryption, no one else will.

Sources

  • Abelson, H., Anderson, R., Bellovin, S. M., Benaloh, J., Blaze, M., Diffie, W., . . . Weitzner, D. J. (2015). Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications. Computer Science and Artificial Intelligence Laboratory Technical Report, MIT-CSAIL-TR-2015-026.
  • Bari Kolata, G. (1980). Cryptography: A New Clash Between Academic Freedom and National Security. Science, 209(4460), 995-996.
  • Baocun, W., & Fei, L. (1995). Information Warfare. Liberation Army Daily, June 13.
  • Deibert, R., Palfrey, J., Rohozinski, R., & Zittrain, J. (Eds.). (2008). Access Denied: The Practice and Policy of Global Internet Filtering. London: MIT Press.
  • Diffie, W., & Hellman, M. (1976). New directions in cryptography. IEEE Transactions on Information Theory, 22(6), 644-654. Retrieved from http://math.boisestate.edu/~liljanab/MATH308/NewDirectionsCryptography.pdf
  • Hayden, M. V. (2016). Hayden: The Pros and Cons of Access to Encrypted Files. Retrieved from https://www.youtube.com/watch?v=6HNnVcp6NYA
  • Inman, B. R. (1979). The NSA Perspective on Telecommunications Protection in the Nongovernmental Sector. Cryptologia, 3, 129-135. doi:10.1080/0161-117991853954
  • Kehl, D., Wilson, A., & Bankston, K. S. (2015). Doomed to repeat history? Lessons from the Crypto Wars of the 1990s. Report from the New America Foundation.
  • Levy, S. (1994). Battle of the Clipper Chip. The New York Times. Retrieved from http://www.nytimes.com/1994/06/12/magazine/battle-of-the-clipper-chip.html
  • Nakashima, E., & Peterson, A. (2015). Obama faces growing momentum to support widespread encryption. Retrieved from https://www.washingtonpost.com/world/national-security/tech-trade-agencies-push-to-disavow-law-requiring-decryption-of-phones/2015/09/16/1fca5f72-5adf-11e5-b38e-06883aacba64_story.html
  • Rid, T. (2016). Maschinendämmerung: Eine kurze Geschichte der Kybernetik. Propyläen Verlag.
  • Risen, J., & Lichtblau, E. (2005). Bush lets US spy on callers without courts. New York Times, 16, A1. Retrieved from http://www.ftlcomm.com/ensign/desantisArticles/2006_934/desantis939/wiretap_NYT.pdf
  • Senate, U. S. (1994). The administration’s Clipper chip key escrow encryption program: Hearing before the Subcommittee on Technology and the Law of the Committee on the Judiciary, United States Senate.