Last week, Deputy Attorney General Rod Rosenstein gave a speech about encryption at the U.S. Naval Academy, solidifying the Trump administration’s anti-encryption stance. During his campaign, the president advocated a boycott of Apple products when the company went to court to fight the Federal Bureau of Investigation’s (FBI) attempt to force Apple to hack into the San Bernardino shooter’s iPhone. However, until this summer, the administration was mostly silent on the encryption debate.
In August, Rosenstein accused unnamed technology companies of being “unwilling…to help enforce court orders” to provide access to encrypted devices, and said that “legislation may be necessary” unless companies “work with [law enforcement] to stop criminals from defeating law enforcement.” On Tuesday, Rosenstein was even more explicit. He argued that “the advent of ‘warrant-proof’ encryption is a serious problem” and that “our society has never had a system where evidence of criminal wrongdoing was totally impervious to detection, especially when officers obtain a court-authorized warrant.” He asserted that “there is no constitutional right to sell warrant-proof encryption.” Finally, he said that there is such a thing as “responsible encryption” that provides security while also guaranteeing access for law enforcement (i.e., exceptional access), and offered several suggestions for what those approaches to encryption could look like.
Before responding to Rosenstein’s assertions about security and the feasibility of his exceptional access proposals, it is important to set the record straight about his many legal arguments.
First, it is not true that we are newly experiencing the “advent of ‘warrant-proof’ encryption.” Encryption was not recently invented or discovered, as Rosenstein implies. Ciphers have been used to secure sensitive communications and information for millennia, including by our founding fathers. The use of full-disk and end-to-end encryption has certainly increased with the rise of the Internet and the adoption of digital and connected devices, and companies have increasingly provided encryption as a default setting and in a manner in which they do not maintain access to the key. However, unbreakable encryption is nothing new.
Further, the concept of “warrant-proof encryption” is a myth, a made-up term that the Department of Justice and the FBI use to describe any situation in which strong encryption poses an obstacle to an investigation. The reality is that encryption is just math, a set of very complex equations, and when encryption is developed, the only concern is making sure that the math is right. Because implementing those equations correctly is so difficult, law enforcement can often identify and exploit vulnerabilities in the code to access the contents of a particular device, as it did with the San Bernardino shooter’s iPhone. But when it can’t, that is not because the encryption is “warrant-proof”; it is simply because the math is right. In other words, if the code were deliberately breakable, or “warrant-friendly,” to make up another term, it would not do what encryption is meant to do (keep data highly secure); it would leave that data vulnerable to exploitation.
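To make the “encryption is just math” point concrete, here is a minimal sketch, in Python, of a one-time pad, a cipher that has been provably unbreakable for roughly a century. The message, key handling, and function names are purely illustrative and do not reflect any particular product’s implementation.

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with the corresponding key byte.
    With a truly random key that is used once and is as long as the
    message, the ciphertext is provably unrecoverable without the key."""
    if len(key) != len(plaintext):
        raise ValueError("one-time pad key must match message length")
    return bytes(p ^ k for p, k in zip(plaintext, key))

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # Decryption is the identical XOR; only the key holder can reverse it.
    return otp_encrypt(ciphertext, key)

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))  # random, kept secret, never reused

ciphertext = otp_encrypt(message, key)
print(ciphertext.hex())              # gibberish to anyone without the key
print(otp_decrypt(ciphertext, key))  # b'meet at the usual place'
```

Without the key, every possible plaintext of the same length is equally consistent with that ciphertext; no court order changes the math, and that is all “the math is right” means here.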
Even if we were to adopt Rosenstein’s framing, it is still not the case that encryption is “warrant-proof” when a company chooses not to maintain access to the key. The owner of the device or account is still capable of providing access if legally compelled. Think about law enforcement access to materials inside a house that a suspect owned and occupied: there would not be a third party from whom you could seek the evidence; you would have to get a search warrant and go directly to the suspect’s home to execute it. The access that Rosenstein claims “warrant-proof” encryption takes away is merely a third-party pathway to evidence. Companies using strong encryption without retaining a key are not eliminating any possibility of access.
Second, Rosenstein’s statement about the history of law enforcement’s ability to access evidence requires significant correction: our society has always had a system in which evidence of criminal wrongdoing can be totally impervious to detection, including when officers obtain a court-authorized warrant. Papers and other physical items that might someday be introduced as evidence can be destroyed. Individuals can plan crimes entirely in their heads, and unless that planning was recorded in a way law enforcement could access, no warrant could make it available as evidence. People have always engaged in ephemeral conversations that, unless law enforcement was present to witness them, would not be accessible as evidence in a criminal investigation. By Rosenstein’s definition, all of those things should be considered “warrant-proof,” yet he is not advocating that Congress outlaw throwing out or destroying papers, or require that every thought be transcribed and every conversation recorded, simply because criminals could otherwise take advantage and evade law enforcement.
As these examples show, a warrant guarantees law enforcement the right to attempt to access or obtain evidence (that is, to search for it), but it does not guarantee that officers will be able to obtain the evidence they seek. Moreover, no free society would stand for the level of intrusion those hypothetical requirements would entail. Since physical papers and ephemeral conversations have largely given way to digital documents and online communications, requiring software and hardware developers to establish a means of guaranteeing access to those data would be a modern-day version of prohibiting the destruction of any papers or requiring ubiquitous recording of all conversations, something law enforcement would never have considered in the analogue era and still would not consider, in that form, today.
Third and finally, several federal courts have held that there is, in fact, a constitutional right to sell or otherwise make available unbreakable encryption. The First Amendment to the Constitution guarantees the right to free speech, which includes not only the right of individuals to express themselves as they see fit, but also protection from compelled speech. U.S. federal courts have repeatedly held that code, including the code used to implement encryption, is speech, and thus that its development and distribution cannot be infringed upon by the government.
In 1995, the U.S. Court of Appeals for the Ninth Circuit held in Yniguez v. Arizonans for Official English that “speech in any language consists of the ‘expressive conduct’ of vibrating one’s vocal chords, moving one’s mouth and thereby making sounds, or of putting pen to paper, or hand to keyboard” and is protected by the First Amendment. A year later, the U.S. District Court for the Northern District of California relied heavily on the precedent set by Yniguez when it ruled in Bernstein v. U.S. Department of State, a case brought by a U.C. Berkeley graduate student and teaching assistant who had been prohibited by the State Department from publishing encryption source code that he developed. The court ruled in favor of Bernstein’s right to publish the code online, holding that:
Nor does the particular language one chooses change the nature of language for First Amendment purposes. This court can find no meaningful difference between computer language, particularly high level languages as defined above, and German or French. All participate in a complex system of understood meanings within specific communities. Even object code, which directly instructs the computer, operates as a “language.” When the source code is converted into the object code “language,” the object program still contains the text of the source program. The expression of ideas, commands, objectives and other contents of the source program are merely translated into machine-readable code.
Finally, in 2000, the U.S. Court of Appeals for the Sixth Circuit held that code is speech in Junger v. Daley, a case brought by a professor at Case Western Reserve University against the U.S. Department of Commerce. The professor wanted to publish encryption source code for his students to access, but believed that regulations promulgated during the first debate over encryption, dubbed the Crypto Wars, prohibited such publication. Like the court in Bernstein, the Sixth Circuit held that encryption source code is speech protected by the First Amendment.
Many have also argued that requiring individuals or entities to write software according to vague, government-mandated specifications would amount to unconstitutional compelled speech. In the San Bernardino case, Apple argued that the government should not be permitted to conscript the company into developing a new operating system to circumvent the encryption protecting the shooter’s iPhone because, among other things, any such requirement would amount to unconstitutional compelled speech. Dozens of companies, advocacy groups, technologists, researchers, and cryptographers, from Amazon, Google, and Facebook to the Center for Democracy & Technology (CDT) and the Electronic Frontier Foundation (EFF), along with many others, argued the same in the briefs they submitted to the court in support of Apple. Because the FBI was able to obtain outside assistance to hack into the phone, the case was mooted and the court never ruled on the issue.
During his speech, Rosenstein also made several proposals for how companies could address law enforcement’s needs. These approaches included key escrow schemes and offering different levels of encryption to different users based on their industry or use needs. A detailed response to the technical requirements and feasibility of his proposals is best left to technologists and cryptographers. However, it is worth noting here that many technologists have written extensively about the cybersecurity risks created by systems that purport to provide exceptional access to plaintext for law enforcement.
For example, the authors of the paper Keys Under Doormats examined and critiqued various exceptional access (i.e., encryption backdoor) schemes, and a broad group of technologists and other experts recently participated in a series of workshops and a study on encryption organized by the National Academies of Sciences, Engineering, and Medicine. A published transcript of the workshops shows that those experts strongly recommended against adopting solutions like the ones Rosenstein proposed: they have inherent and significant security flaws, the extreme complexity of implementing them would virtually guarantee failure, and users could still evade law enforcement by layering on their own unbreakable encryption or by switching from U.S. products to the open-source or foreign-developed software that dominates the global encryption market.
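To illustrate why those critiques recur, below is a deliberately simplified sketch of the kind of key escrow arrangement Rosenstein alluded to. It assumes Python’s third-party cryptography package, and the escrow authority, function names, and sample data are hypothetical, invented for illustration rather than drawn from any actual proposal.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# A hypothetical escrow authority holds a single master key.
escrow_master_key = Fernet.generate_key()
escrow_authority = Fernet(escrow_master_key)

def escrowed_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt data under a fresh per-message key, but also wrap that key
    under the escrow authority's master key so it can be recovered later."""
    message_key = Fernet.generate_key()
    ciphertext = Fernet(message_key).encrypt(plaintext)
    wrapped_key = escrow_authority.encrypt(message_key)  # the exceptional-access copy
    return ciphertext, wrapped_key

ciphertext, wrapped_key = escrowed_encrypt(b"quarterly financials")

# Whoever controls the escrow master key (lawfully, through insider abuse, or
# via a breach) can unwrap every message key ever escrowed and read everything.
recovered_key = escrow_authority.decrypt(wrapped_key)
print(Fernet(recovered_key).decrypt(ciphertext))  # b'quarterly financials'
```

The escrow master key becomes a single point of failure for every device and message it covers, and a user determined to evade it could simply pre-encrypt the plaintext, with something as basic as the one-time pad sketched above, before it ever reaches the escrowed system.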
Furthermore, the major cyberattacks and data breaches that have dominated the news for years, like those at Equifax and Yahoo!, have made it clear that the only kind of “responsible encryption” is the unbreakable kind. The debate we should be having now is not about how to water down technology to meet law enforcement’s needs, but rather about how to ensure that law enforcement can adapt quickly and effectively as technology evolves. Rosenstein is right that society must make a “fully-informed decision” on this issue. Civil society and industry have engaged earnestly in this round of the encryption debate since it began in 2014. For society to be truly fully informed, as Rosenstein urges, we must all be clear about the boundaries and protections of constitutional law, the limitations law enforcement has always faced in conducting investigations, and the security risks and benefits of requiring an encryption backdoor.