Apple’s recent announcement that it will encrypt its newest iPhones is again pushing to the fore the question of whether the law should be updated to require companies to have systems that would enable them to comply with court orders for information. In other words, does the law properly balance privacy and security in this area?
It is easy to get side-tracked by the inflammatory nature of the Apple announcement. The novelty in Apple’s approach is that it has chosen to justify its encryption policy as a means of rendering itself unable to respond to lawful court orders. Heretofore, such encryption measures were justified as a way to create more secure systems for users, with the preclusion of the government’s ability to obtain such information, even through lawful court-issued warrants, as a by-product. That a large American corporation such as Apple – which enjoys myriad benefits from the American legal system – decided to tout its new encryption as an anti-law-enforcement measure is certainly one result of the Snowden leaks and the backlash against companies perceived as being too cozy with the government.
However, the substance of Apple’s announcement – using encryption that results in systems that are harder both to hack and for the government to lawfully access – is not new. The “going dark” problem has been the subject of debate for years, and in recent years it has taken on increasing urgency in domestic and international law enforcement and intelligence communities. Apple’s announcement, particularly its formulation of encryption as an anti-government tool to thwart court-ordered interception, has served to resurrect an issue that had lain dormant since the Snowden leaks.
Currently, a limited number of companies that provide certain means of communications — such as traditional telephone firms like AT&T — are required by a law called the Communications Assistance for Law Enforcement Act (CALEA) to have systems that permit government access through a court order. Not covered are email, chat, instant messaging, and other modes of communicating that are common now but were not in 1994, when CALEA was passed. The law has not been updated to take into account that we communicate in very different ways than when CALEA was written. As a result, companies like Apple need only provide the government with “technical assistance” in implementing court-ordered interceptions, and technical assistance does not require building a system that the government can access pursuant to a court order.
Thus, Apple cannot be legally faulted for its new policy – it has a legitimate business interest in ensuring the integrity of its systems for its customers, and there is no law on the books that requires it to build systems that enable the government – even through lawful process – to access that data. Sure, many would find it preferable if Apple chose to keep a key to its encryption and spent its time and effort creating the safest system within that paradigm. And it may be foolhardy for companies to thumb their noses at the government, which they may one day call on for help in other areas. But Apple is not legally required to take a different tack. Because companies are increasingly touting encryption, Congress needs to assess whether the growing inability to access such means of communication, even with a court order, is coming at too great a cost.
This debate is not an easy one. It involves a host of questions about what precisely the benefits and costs would be with respect to privacy and security: for example, whether bad actors already have alternative means to communicate that are fully encrypted, whether the government has alternative means to access such data lawfully, what the incremental costs of a company retaining a “key” would be, and whether there are methods to reduce those costs, among other questions. Furthermore, hearings on these issues could not be entirely public, since airing governmental capabilities would provide a roadmap to those intent on avoiding government detection.
In assessing this balance, it is worth remembering that this country has already been through a legal paradigm similar to the one it now faces from the proliferation of encryption schemes such as Apple’s. In the late nineteenth century, the Supreme Court held in Boyd v. United States that there was an inviolate “zone of privacy” — including a person’s private papers and effects — that the government could not breach, even with a court order based on probable cause. This is the zone that Apple’s encryption would re-establish: even if the government meets the highest evidentiary threshold before a neutral and detached magistrate, the resulting warrant would be a dead letter. If history is not to repeat itself at great cost to all of us, there is a cautionary tale in what happened to Boyd. The holding was whittled away bit by bit, with the Court devising legal fictions to permit court-authorized access to information in that zone. As a consequence, Boyd is now largely obsolete, to the point that the current Fourth Amendment debate focuses on when the government must proceed by a warrant based on probable cause, not on whether the government is forbidden from obtaining the information under any circumstances. The demise of Boyd’s sacrosanct zone of privacy came about because it became increasingly clear that its rule undermined other important values — principally public safety, which suffered from the inability to investigate people who used private zones for illicit purposes. In short, creating a lawless zone was untenable.
To my mind – although, as in many areas of the law, there is no perfect solution — the cost of a system in which we may be more at risk of illegal hacking is outweighed by the vital role lawful electronic interception plays in thwarting crime – including devastating terrorist attacks. Law enforcement and intelligence officials, including most recently FBI Director James Comey, have noted that we all – including criminals — increasingly use non-telephonic means to communicate, and the ability to monitor electronic communications is decreasing with every new encryption tool added to such communication systems. Law enforcement authorities in the US and overseas rightfully note how such data is critical to solving everyday crimes, such as kidnapping, fraud, and child pornography and exploitation, among many others. At least as important, preventing terrorist attacks requires such ability, as intelligence agencies note (although, because the Snowden leaks created a public perception that the intelligence community has too much, not too little, access to information, the government is likely to rely on the ramifications of encryption for traditional law enforcement in the public debate on this issue).
This is a judgment Congress needs to make, and soon. In weighing the interests, however, it is no answer to say that the government should revert to means other than lawful intercepts obtained through court orders based on probable cause to prevent crimes. Electronic communication is here to stay, and it plays a vital role in how crimes are perpetrated, allowing people to communicate with conspirators and carry out their nefarious plans. In this regard, the government and privacy advocates both need to be consistent in their arguments: it is the latter who usually remind us that the advent of smartphones and “big data” makes traditional Fourth Amendment line-drawing obsolete. And they have a point, as the Supreme Court is starting to recognize. But by the same token, it is increasingly important to retain the ability to monitor such communications, after meeting the necessary Fourth Amendment standard upon a showing to an independent Article III court.
The question is whether the 1994 law should be updated so that companies must maintain the ability to comply with court orders for information. The 9/11 Commission Report admonished us all to be vigilant about our laws and systems, with due regard for civil liberties and privacy, and not to wait for the next disaster to examine these laws and practices. The Apple announcement should serve as a wake-up call for Congress to hold hearings to assess whether we have the balance right between safety and privacy.