Over the past five years or so, government intelligence and law enforcement agencies in the United States and the United Kingdom (U.K.) have been alleging that the widespread availability of end-to-end encryption services and applications prevents them from performing their duties by concealing users’ communications and data from lawful surveillance. Their shorthand for this trend is “going dark,” a phrase meant to convey that technology is shutting off their ability to obtain access to electronic communications, which they warn will impair their ability to investigate criminal activity and prevent terrorist attacks.
But these same encryption technologies are necessary for the security and privacy of people worldwide, not to mention the global economic infrastructure that depends on strong encryption to operate. The solution, according to law enforcement and intelligence agencies, is to require that all encryption-enabled applications be somehow modified so that user secrecy is maintained while still allowing government access to these data, pursuant to the applicable legal process.
Critics of these proposed government mandates based their objections mainly on two grounds. First, what the government agencies were proposing (essentially mandating a “back door” in all cryptographic applications) would irrevocably compromise the security of those applications, such that other parties, including criminals, hostile state actors, and authoritarian governments, could use the same “back doors” to gain illicit access. This was unacceptable to privacy and human rights advocates as well as to corporations and financial institutions, whose existence also depends on the security and integrity of end-to-end encryption. Second, critics disputed the premise that surveillance was indeed “going dark,” and questioned the degree to which the government-proposed solutions would actually change the accessibility of terrorists’ and criminals’ communications.
These criticisms resonated broadly, and there was sufficient political pushback in 2014 and 2015 to convince the Obama Administration to declare that it would not seek legislation requiring government access to encrypted communications in the short term. And law enforcement and intelligence agency opponents of strong end-to-end encryption know that selling their solution to the general public is an uphill battle.
But that hasn’t stopped law enforcement and intelligence agencies from seeking alternative paths to their goal. Earlier this year, a bipartisan group of U.S. Senators introduced a new bill aimed at stemming the distribution of child pornography. The bill, titled the EARN IT Act (short for the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020), was written to force internet providers and platforms to support automated scanning for child sexual abuse material (CSAM). Because end-to-end encryption makes this kind of scanning difficult, EARN IT would require companies either to (a) come up with a solution (e.g., a “back door”) that would allow such scanning of encrypted data, or (b) stop using encryption altogether. As analysts have pointed out, EARN IT uses our universal disgust for CSAM as a means to an end: stopping the use of strong end-to-end encryption.
Lately, governments have been eyeing yet another means of implementing anti-encryption policy: app stores. Platforms like Apple limit the applications their customers can load on their devices, requiring every application to be registered with and approved by the platform before it can be made available for purchase or download. That gatekeeping makes the platforms natural chokepoints for de facto government regulation, often without the need for messy political processes, because governments can go straight to the companies to implement their preferred policies.
Here is a recent example: in 2019, Apple complied with a request by the Chinese government to remove the HKmap.live app from its App Store because the app was being used by protestors in Hong Kong to crowdsource the locations of police checkpoints. The Chinese market represents 17% of Apple’s global revenue, and any qualms Apple may have had about the privacy and security of its Hong Kong customers were apparently outweighed by the prospect of losing access to China entirely. While the governments of democracies like the United States and the U.K. might object to the human rights abuses that such a move could enable, they also see that following the same model might let them achieve their policy goals while avoiding the legislative and regulatory processes that have frustrated their plans so far.
Indeed, this use of soft power to effect technology policy change has not been lost on western governments like the United States and the U.K. On October 11, the U.S. Department of Justice published a statement titled “International Statement: End-To-End Encryption and Public Safety,” signed by high-ranking government officials including Priti Patel, the U.K. Secretary of State for the Home Department, and William Barr, the U.S. Attorney General, along with senior ministers from Australia, New Zealand, and Canada (which, together with the United States and the U.K., make up the “Five Eyes”), as well as India and Japan. The statement makes the requisite noises about understanding the need for strong encryption to protect “personal data, privacy, intellectual property, trade secrets, and cyber security,” but then pivots to the “challenges to public safety” that strong end-to-end encryption poses. The senior government officials “urge industry to address our serious concerns” and “call on technology companies to work with governments” to take steps to “[e]mbed the safety of the public in system designs,” “[e]nable law enforcement access to content,” and “[e]ngage in consultation with governments…to facilitate legal access.” The statement then echoes the same concerns regarding CSAM and child exploitation found in the EARN IT bill, citing multiple national and international organizations with strong interests in preventing and prosecuting child sexual exploitation.
The implications of the statement are fairly clear to technology companies: start implementing systems that give us access to user data, or face the political and economic consequences of being labeled globally as a hindrance to government efforts to protect children from abuse. The statement points to Facebook Messenger as the source of nearly 12 million of the 18.4 million worldwide reports of CSAM in 2018, reports governments say will disappear if they are not granted back-door access to end-to-end encrypted systems. The statement notes that “measures to increase privacy—including end-to-end encryption—should not come at the expense of children’s safety,” presenting a very unpalatable either-or question for technology companies to answer publicly.
This is the kind of pressure governments know can make global technology companies sit up and take notice. Further, a network effect can follow these larger companies’ decisions. If enormous companies like Apple, Google, and Facebook are pressured to limit user access to end-to-end encryption by allowing only “back-doored” apps on their platforms, the vast majority of user traffic will be open to government inspection, simply because users are generally at the mercy of these platforms when it comes to the apps they are able to use. Sure, there are sometimes open source and other DIY software options that offer end-to-end encryption, but the vast majority of users have neither the time nor the wherewithal to build these solutions, and on devices like Apple’s iPhone and iPad they cannot load unapproved apps at all. In other words, it is possible to effect policy globally through the judicious application of pressure to a few key technology platforms. And if one country’s government succeeds via this approach, that may be enough to drive technology platforms in other countries to follow suit out of necessity, in large part because cross-border data agreements and safe harbors depend on this kind of comity.
Thus, if the HKmap.live model continues to be successful, governments may finally be able to achieve their goal of crippling end-to-end encryption without having to go through the long, and politically expensive, process of legislation. But the privacy and security implications of such a move would remain: we would be left with compromised versions of the strong encryption we have come to expect in our internet communications. Most information security experts and computer scientists agree that this would quite likely mean more data breaches, more cybercrime, fewer protections for protestors and dissidents, and reduced trust in the critical institutions and applications that increasingly rely on strong encryption, such as electric grids, banking, and voting. Given how much contemporary society depends on secure, reliable internet communication, it would not be hyperbole to call such a change potentially catastrophic.
Rather than cave to this kind of soft power, technology companies should recognize this activity for what it is: an end run that lets government law enforcement and intelligence agencies gain greater control over the communications of their citizens without having to go through the kinds of democratic approval processes that have been a hindrance to them thus far. A decision of this magnitude cannot be a mere business decision, nor should the world’s technology giants treat it as such.