Gather around, young’uns: Back in the antediluvian early 90s, when the digital world was young, a motley group of technologists and privacy advocates fought what are now, somewhat melodramatically, known as the Crypto Wars. There were many distinct battlefields, but the overarching question over which the Crypto Wars were fought was this: Would ordinary citizens be free to protect their communications and private files using strong, truly secure cryptography, or would governments seek to force programmers and computer makers to build in backdoors that would enable any scheme of encryption to be broken by the authorities? Happily for both global privacy and the burgeoning digital economy—which depends critically on strong encryption—the American government, at least, ultimately saw the folly of seeking to control this new technology. Today, you are free to lock up your e‑mails, chats, or hard drives without providing the government with a spare key. (The conflict was featured on the cover of Wired Magazine’s second issue, and later detailed in Steven Levy’s lively book Crypto.)


Fast forward to 2014: Apple has announced that the new version of its mobile operating system, iOS, features full-disk encryption to protect users’ data, and that, in contrast to earlier versions of iOS, Apple will no longer retain the backdoor that previously allowed the company to access at least some of the phone owner’s encrypted information. The announcement has been greeted with alarm by cyberlaw professor Orin Kerr, in a series of Washington Post blog entries that seem designed to prove Santayana’s hoary dictum about the perils of ignoring history. Apple, Kerr avers, is playing a “dangerous game” by implementing “a policy that only thwarts lawful search warrants.” Police investigations, he fears, will now be stymied by criminals who refuse to unlock their phones, rendering search warrants to access those devices little more than “a nice piece of paper with a judge’s signature.”


Normally, Kerr’s writing on electronic privacy is marked by an understanding of modern telecommunications technology nearly as impressive as his legal erudition, but in this case, I fear, he has succumbed to an uncharacteristic fit of technopanic. While he writes as though the corporate anarchists at Apple are brazenly thumbing their noses at police with a radical new policy, the truth is more nearly the opposite: It is Apple’s backdoor access that was the aberration, even for Apple. If you encrypt your MacBook’s hard drive with Apple’s FileVault, or your Windows computer with Microsoft’s BitLocker, then unless you choose to send the company a backup copy of your encryption key, it can no more unlock those encrypted files than a bookbinder can decipher the private code you employ in your personal diary. Strong encryption is not even new to smartphones: Google’s Android operating system—the world’s most popular mobile platform, running on twice as many devices as iOS—has featured full-device encryption since 2011, and Google has never had backdoor access to those encrypted files. And, of course, there has always been a wide array of third-party apps and services offering users the ability to encrypt their sensitive files and messages, with the promise that nobody else would hold the keys. Does encryption occasionally stymie legitimate law enforcement investigations? Of course—though far less often than you might think. The point to remember, though, is that criminals had access to backdoor-free encryption for many years before Apple announced its new policy, without ushering in a terrifying new age of unstoppable criminals and impotent police.
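To make the point concrete, here is a minimal sketch (not Apple’s, Google’s, or Microsoft’s actual design, just an illustration using Python’s third-party cryptography package) of what user-held keys mean in practice: once data is encrypted under a key that never leaves the owner’s hands, the company that wrote the software is in the same position as any other outsider.

```python
# Minimal sketch of user-held-key encryption (illustrative only, not any vendor's design).
from cryptography.fernet import Fernet, InvalidToken

user_key = Fernet.generate_key()                 # generated on the user's device, never uploaded
ciphertext = Fernet(user_key).encrypt(b"private diary entry")

# The software vendor sees only ciphertext. Trying any other key gets it nowhere:
vendor_key = Fernet.generate_key()
try:
    Fernet(vendor_key).decrypt(ciphertext)
except InvalidToken:
    print("Without the user's key, the data is unreadable.")

# Only the key holder can recover the plaintext:
print(Fernet(user_key).decrypt(ciphertext))      # b'private diary entry'
```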


Still, Kerr is right that encryption will now be far easier to use and far more prevalent: Unbreakable encryption is not novel, but the decision to ship iOS and Android devices encrypted by default is. Previously, at least, criminals had to be savvy enough to choose encryption and use it consistently—and many weren’t. Encryption by default, because it protects average crooks as well as sophisticated cybercriminals, is likely to be a practical impediment in many more investigations. Criminals can still be punished for refusing a court order to unlock their devices, but they may escape more serious charges that would be provable only with that encrypted evidence. Does this strengthen the case, as Kerr suggests, for legislation requiring device manufacturers to build in backdoors or retain sensitive data? It does not, for several reasons.


First, as Kerr belatedly acknowledges in a follow-up post, there are excellent security reasons not to mandate backdoors. Indeed, had he looked to the original Crypto Wars of the 90s, he would have seen that this was one of the primary reasons similar schemes were almost uniformly rejected by technologists and security experts. More or less by definition, a backdoor for law enforcement is a deliberately introduced security vulnerability, a form of architected breach: It requires a system to be designed to permit access to a user’s data against the user’s wishes, and such a system is necessarily less secure than one designed without such a feature. As computer scientist Matthew Green explains in a recent Slate column (and, with several eminent colleagues, in a longer 2013 paper), it is damn near impossible to create a security vulnerability that can only be exploited by “the good guys.” Activist Eva Galperin puts the point pithily: “Once you build a back door, you rarely get to decide who walks through it.” Even if your noble intention is only to make criminals more vulnerable to police, the unavoidable cost of doing so in practice is making the overwhelming majority of law-abiding users more vulnerable to criminals.
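For readers who want to see why this is structural rather than a matter of implementation care, here is a hypothetical key-escrow sketch (no real product’s scheme; the names and layout are invented for illustration). The per-device data key is wrapped once for the owner and once for a single escrow key, and whoever steals, leaks, or compels that one escrow key can unlock every device built this way.

```python
# Hypothetical key-escrow sketch (illustrative only): the per-device data key is
# wrapped twice, once for the owner and once for an "authorized access" escrow key.
# That second wrapping is the backdoor; the data stays safe only as long as the
# escrow key itself is never stolen, leaked, or compelled.
from cryptography.fernet import Fernet

ESCROW_KEY = Fernet.generate_key()        # a single key held "for the good guys"

def provision_device(secret: bytes):
    data_key = Fernet.generate_key()      # the key that actually protects the data
    owner_key = Fernet.generate_key()     # stays with the device owner
    record = {
        "ciphertext": Fernet(data_key).encrypt(secret),
        "wrapped_for_owner": Fernet(owner_key).encrypt(data_key),
        "wrapped_for_escrow": Fernet(ESCROW_KEY).encrypt(data_key),  # the built-in breach
    }
    return record, owner_key

device, _owner_key = provision_device(b"tax records, photos, messages")

# Anyone who obtains the one escrow key (hacker, insider, foreign government) can
# recover the data key for every device provisioned this way:
data_key = Fernet(ESCROW_KEY).decrypt(device["wrapped_for_escrow"])
print(Fernet(data_key).decrypt(device["ciphertext"]))
```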


Second, and at the risk of belaboring the obvious, there are lots of governments out there that no freedom-loving person would classify as “the good guys.” Let’s pretend—for the sake of argument, and despite everything the experts tell us—that somehow it were possible to design a backdoor that would open for Apple or Google without being exploitable by hackers and criminals. Even then, it would be awfully myopic to forget that our own government is not the only one that would predictably come to these companies with legal demands. Yahoo, for instance, was roundly denounced by American legislators for coughing up data the Chinese government used to convict poet and dissident Shi Tao, released just last year after nearly a decade in prison. Authoritarian governments, of course, will do their best to prevent truly secure digital technologies from entering their countries, but they’ll be hard pressed to do so when secure devices are being mass-produced for western markets. An iPhone that Apple can’t unlock when American cops come knocking for good reasons is also an iPhone it can’t unlock when the Chinese government comes knocking for bad ones. A backdoor mandate, by contrast, makes life easy for oppressive regimes by guaranteeing that consumer devices are exploitable by default—presenting U.S. companies with a presence in those countries with a horrific choice between enabling repression and endangering their foreign employees.

Third—least obviously, but perhaps most importantly—any backdoor or retention mandate both presupposes and, if it is to work, must actively encourage centralized over decentralized computing and communications architectures. When Kerr contemplates requiring “cellular phone manufacturers” to enable police access to their devices, he tacitly assumes that the manufacturer controls the software running on the device. That may describe Apple’s notoriously tightly integrated ecosystem—but it is hardly the norm for computing devices. Most, of course, come preinstalled with an operating system and some default software packages chosen by the manufacturer, but if the user wants to install new software or a different operating system, she is free to do so. That software may be released by a huge corporation like Apple or Google, with teams of lawyers on retainer to comply with lawful orders and subpoenas, by a tiny startup, by a lone developer working from his basement, or by a dispersed global community of open-source coders.


As writer Cory Doctorow explains in his insightful essay “Lockdown: The Coming War on General‐​Purpose Computing,” the only real way to make mandates of the kind Kerr discusses effective is to prohibit computers (and smartphones, of course, are just small computers with embedded cellular radios) that are truly controlled by their lawful owners:

We don’t know how to build a general‐​purpose computer that is capable of running any program except for some program that we don’t like, is prohibited by law, or which loses us money. The closest approximation that we have to this is a computer with spyware: a computer on which remote parties set policies without the computer user’s knowledge, or over the objection of the computer’s owner. Digital rights management always converges on malware.

If you saddle Apple, or any other device manufacturer, with a legal obligation to help police unlock a device, you necessarily encourage it to centralize control over the software running on that device. Apple, again, is already pretty centralized, but there’s not much point in requiring Google to release an insecure version of Android if any user can just install a patch that removes the vulnerability. You can require Apple to store iMessage chats for the convenience of police, but if users can simply install an open-source, peer-to-peer chat application that isn’t designed to spy on them, all that does is drive privacy-conscious users (including, of course, criminals—but by no means criminals alone) away from iMessage. In the long run, the options are an ineffective mandate that punishes companies that choose centralized models, or a somewhat more effective mandate that will still be circumvented by sophisticated criminals… but only at the cost of destroying or marginalizing the open computing architectures that have given us decades of spectacular innovation. Even if we set aside very serious concerns about privacy and security, these are both terrible options.


Fourth and finally, we should step back and maintain a little perspective about the supposedly dire position of 21st-century law enforcement. In his latest post in the Apple series, Kerr invokes his influential “equilibrium-adjustment theory” of Fourth Amendment law. The upshot of Kerr’s theory, radically oversimplified, is that technological changes over time can confer advantages on both police investigators and criminals seeking to avoid surveillance, and the law adjusts over time to preserve a balance between the ability of citizens to protect their privacy and the ability of law enforcement to invade it with sufficiently good reason. As I hope some of my arguments above illustrate, technology does not necessarily provide us with easy Goldilocks policy options: Sometimes there is just no good way to preserve capabilities to which police have grown accustomed without imposing radical restrictions on technologies used lawfully by millions of people—restrictions likely to prove as futile in the long run as they are costly. But this hardly means that evolving technology is bad for law enforcement on net.


On the contrary, even if we focus narrowly on the iPhone, it seems clear that what Apple taketh away from police with one hand, it giveth with the other: The company’s ecosystem, considered as a whole, provides a vast treasure trove of data for police even if that trove no longer includes backdoor access to physical devices. The ordinary, unsophisticated criminal may be better able to protect locally stored files than he was a decade ago, but in a thousand other ways he can expect to be far more minutely tracked in both his online and offline activities. An encrypted text-messaging system may be worse from the perspective of police than an unencrypted one, but is it really any worse than the system of pay phones that once allowed criminals to communicate without leaving any record for police to sift through after the fact? Meanwhile, activities that would once have left no permanent trace by default—from looking up information to moving around in the physical world to making a purchase—now leave a trail of digital breadcrumbs that would have sounded like a utopian fantasy to an FBI agent in the 1960s. Law enforcement officials may moan that they are “going dark” when some particular innovation makes their jobs more difficult (while improving the security of law-abiding people’s private data), but when we consider the bigger picture, it is far easier to agree with the experts who have dubbed our era the Golden Age of Surveillance. Year after year, technology opens a thousand new windows to our government monitors. If we aim to preserve an “equilibrium” between government power and citizen privacy, we should accept that it will occasionally close one as well.