The FBI, true to form, claims that these broad new mandates aren’t an expansion of power but merely an update aimed at “preserving our ability to execute our existing authority.” And the proposal does bear a superficial similarity to the Communications Assistance for Law Enforcement Act of 1994, which required telecoms to ensure that their new digitally switched networks were as wiretap-friendly as the old analog phone system.
But the current proposal is far more radical, in part because the Internet is not much like a traditional phone network. To see why, consider Skype, a popular program that allows users to conduct secure text chats, phone conversations, video conferences, and file transfers. Skype is designed as a distributed peer-to-peer network, meaning there's no central hub or switching station through which calls are routed; only the login server that registers members as they sign on to the network is centralized. Calls are encrypted end-to-end, meaning that only the end users who are parties to a call hold the secret keys that secure the conversation against online snoops. There's no device Skype can install at its headquarters that would let it provide police with access to the unencrypted communications; to comply with such a mandate, the company would have to redesign the network wholesale along a more centralized model, rendering it less flexible, adaptable, and reliable, as well as less secure.
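To make the point concrete, here is a minimal sketch, in Python with the open-source PyNaCl library, of the end-to-end property just described. It is not Skype's actual protocol, merely an illustration of why a relay in the middle, or a provider served with a court order, has nothing useful to hand over.

```python
# A sketch of end-to-end encryption using PyNaCl (libsodium bindings).
# Illustrative only: this is not Skype's real key exchange or cipher suite.
from nacl.public import PrivateKey, Box

# Each party generates a key pair locally; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the public halves ever cross the network.
alice_box = Box(alice_private, bob_private.public_key)
bob_box = Box(bob_private, alice_private.public_key)

# What any intermediary (or a compelled service operator) actually sees: ciphertext.
wire_bytes = alice_box.encrypt(b"meet at the usual place")

# Only Bob, who holds his own private key, can recover the plaintext.
assert bob_box.decrypt(wire_bytes) == b"meet at the usual place"
```

Because the provider never holds the private keys, there is nothing a wiretap order can compel it to produce short of rewriting the client software itself.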
Skype is just one of the thousands of firms, large and small, that would be burdened with the obligation to design their systems for breach. We've already seen how this can create security vulnerabilities on traditional phone networks: in 2005, it was discovered that unknown hackers had exploited wiretap software built into Vodafone Greece's systems for law-enforcement use to eavesdrop on the cellular phone conversations of senior cabinet officials and even the prime minister. Designing for surveillance means, more or less by definition, designing a less secure, more vulnerable infrastructure. It's for just this reason that similar proposals were wisely rejected during the Crypto Wars of the 1990s, a decision that helped give rise to a thriving online economy wholly dependent on strong encryption.
It’s not just hackers who could exploit such vulnerabilities, of course. A network architecture designed for the convenience of American law enforcement also necessarily makes eavesdropping easy for the many regimes whose idea of a “national-security threat” includes political dissent or blasphemous speech. And there’s always the threat of interception by insiders: An engineer at Google was recently fired for using his privileged access to snoop into the private accounts of several teenage users. One way to alleviate such concerns is for firms like Google to enable end-to-end encryption, so users can feel secure that even the company’s own employees won’t have the keys needed to read their communications. The government’s proposal would deny them the ability to make that promise.
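What that promise looks like in practice is easy to sketch. In the hypothetical snippet below (again Python with PyNaCl, and not a description of any particular company's systems), the encryption key is generated and kept on the user's device, so an insider with full access to the provider's storage sees only an opaque blob.

```python
# Hypothetical sketch: client-side encryption keeps insiders out.
from nacl.secret import SecretBox
from nacl.utils import random

user_key = random(SecretBox.KEY_SIZE)   # generated and held on the user's device
client_box = SecretBox(user_key)

# This ciphertext is all the provider ever stores.
stored_blob = client_box.encrypt(b"draft: a private message to a friend")

# An employee who dumps the database gets bytes, not messages.
print(stored_blob.ciphertext.hex()[:32] + "...")

# The user, who holds the key, can always read the data back.
assert client_box.decrypt(stored_blob) == b"draft: a private message to a friend"
```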
Companies in the burgeoning cloud-computing sector know full well that businesses and consumers alike are eager to take advantage of the convenience and flexibility of cloud services but are also skittish about entrusting extremely valuable data to third parties. At a recent Capitol Hill hearing, a panoply of high-tech executives testified that the complexity and unpredictability of American surveillance law, as well as the relatively weak protections afforded to data stored in the cloud, were hampering the adoption of cloud services and placing U.S. companies at a disadvantage relative to foreign competitors. The government’s proposal would only exacerbate the problem.
These might be costs worth bearing if the government's plan had a prayer of actually working, but it doesn't. Plenty of open-source encryption tools are already freely available on the Internet, and sophisticated terrorists and criminal enterprises will have all the more incentive to use them once we've announced that the encryption built into communication services can't be trusted. That's a genie there's no way to rebottle.
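A few lines of code show how low the bar is. With nothing more exotic than the widely used open-source `cryptography` package, two parties who have exchanged a key out of band can wrap their messages in strong encryption before those messages ever touch a monitored service; the sketch below is illustrative, not an operational manual.

```python
# Layering freely available encryption over any channel, trusted or not.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()   # exchanged once, out of band
channel = Fernet(shared_key)

# Whatever intercept capability a carrier is forced to build in, it captures
# only this token, never the message inside it.
token = channel.encrypt(b"the mandate does not reach this layer")
assert channel.decrypt(token) == b"the mandate does not reach this layer"
```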
Fortunately, law enforcement still has a recourse that makes it unnecessary to impose architectural mandates on tech companies or weaken the security of all our communications. Investigators can get old-fashioned physical search warrants and bug the devices their suspects use. Less convenient, to be sure, but with the advantage of not imposing massive economic and privacy costs on everyone who isn't a suspect.
There is one type of surveillance, however, that genuinely would be rendered impractical by widespread use of secure communications. Known individual suspects can be targeted by other means, but if the government wanted to conduct wholesale surveillance, in which the entire communications stream is automatically analyzed and filtered by artificial-intelligence software hunting for suspicious communications by unknown parties (as several accounts have suggested the National Security Agency did under the warrantless wiretapping program authorized by President George W. Bush), it really would need a back door at the system level. But while governments may consider it a bug when network architecture renders such sweeping surveillance infeasible, citizens should probably regard it as a feature.
It’s hard to blame harried law-enforcement officials for wishing they could freeze time or control disruptive technological changes. They can’t, of course, but they could do a great deal of damage to both the high-tech economy and the security of global communications before they figure that out.