Encryption and a Prescription for Conscription

You’ve probably seen the stories about U.S. law enforcement’s efforts to require a backdoor in the encryption used in electronic communications. More recently, the U.S. Department of Justice (DOJ) convinced a judge to force Apple to help hack an iPhone used by the perpetrators of the San Bernardino attack. These related but distinct issues are written about frequently, but remain poorly understood. Complex matters of civil liberties, technology and privacy, along with the emotional impact of the underlying terrorism cases, muddy the waters, leaving the issue ripe for knee-jerk reactions.

The iPhone has become the lightning rod for both of these controversies given its popularity and its use of end-to-end encryption. This type of encryption is designed so that only the sender and the intended recipient can decipher a message; no one in between – including the communications provider, app developer or device designer – should be able to decrypt it. End-to-end encryption is used by a variety of applications, including those that exchange texts, email, voice and video communications.
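To make that concrete, here’s a minimal sketch of the idea in Python using the PyNaCl library. The library choice and the names are mine for illustration; no particular app’s implementation is implied:

    # Minimal end-to-end encryption sketch using PyNaCl (libsodium bindings).
    # Illustrative only: real messaging apps layer key agreement, ratcheting
    # and authentication on top of primitives like these.
    from nacl.public import PrivateKey, Box

    # Each party generates a key pair; private keys never leave the device.
    alice_private = PrivateKey.generate()
    bob_private = PrivateKey.generate()

    # Alice encrypts to Bob using her private key and his public key.
    sending_box = Box(alice_private, bob_private.public_key)
    ciphertext = sending_box.encrypt(b"meet at noon")

    # Only Bob, holding his private key, can decrypt. The carrier, the app
    # developer and the device maker see only ciphertext in transit.
    receiving_box = Box(bob_private, alice_private.public_key)
    assert receiving_box.decrypt(ciphertext) == b"meet at noon"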

Apple, Google and other companies use end-to-end encryption because it enhances their users’ privacy and security by decreasing the number of people who can access and decipher encrypted communications. Of course, this also means that Apple doesn’t have the ability to peek into – or allow law enforcement to peek into – those communications, even in the course of an important criminal investigation like the one in San Bernardino.

U.S. law enforcement, frustrated by Apple’s inability to comply with search warrants and other legal process, has attempted two approaches to secure access to encrypted data: (1) requiring companies like Apple and Google to implement a backdoor encryption key for government use; and (2) forcing Apple to create special versions of its iPhone software to help the government defeat Apple’s own privacy and security safeguards. We’ll address each approach in turn.

Encryption Backdoor

Last year, U.S. law enforcement pushed to require technology companies to include in their products a backdoor encryption key that would allow the U.S. government to access user data during court-approved searches and electronic surveillance. It wasn’t the first such attempt, but it was notable for its scope and the initial support it received on Capitol Hill after the terrorist attacks in Paris and San Bernardino. The push for new legislation stemmed specifically from law enforcement’s concern about terrorists “going dark” – using encryption that makes a terrorist more difficult to surveil before an attack and harder to investigate afterward. A backdoor encryption key would make encryption implemented in U.S. products transparent to the government.
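It’s worth spelling out what “transparent to the government” would mean mechanically. The sketch below is one hypothetical key-escrow design (my own illustration in Python with PyNaCl, not the text of any actual proposal) in which the key protecting each message is wrapped once for the recipient and once for a government-held escrow key:

    # Hypothetical key-escrow ("backdoor key") sketch. The message key is
    # wrapped for the recipient AND for a government-held escrow key.
    from nacl.public import PrivateKey, SealedBox
    from nacl.secret import SecretBox
    from nacl.utils import random

    recipient_key = PrivateKey.generate()
    escrow_key = PrivateKey.generate()  # held by the government

    # Encrypt the message once with a fresh symmetric key...
    message_key = random(SecretBox.KEY_SIZE)
    ciphertext = SecretBox(message_key).encrypt(b"meet at noon")

    # ...then wrap that key for both the recipient and the escrow agent.
    wrapped_for_recipient = SealedBox(recipient_key.public_key).encrypt(message_key)
    wrapped_for_escrow = SealedBox(escrow_key.public_key).encrypt(message_key)

    # Whoever holds the escrow private key can recover every message key,
    # which is why a leaked or stolen escrow key would break everything.
    recovered_key = SealedBox(escrow_key).decrypt(wrapped_for_escrow)
    assert SecretBox(recovered_key).decrypt(ciphertext) == b"meet at noon"

The single point of failure is easy to see: the escrow private key decrypts everything, for everyone, forever.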

A request for legislation requiring companies to make encryption transparent to the government raises a plethora of privacy questions.  And as much as I enjoy debating the privacy implications, I think it’s unnecessary here given the impracticality of such legislation.  Simply put, it won’t work, and it would do lasting harm to the U.S. economy.

The market is already full of strong encryption options that wouldn’t be affected by U.S. regulation. These include communications programs for mobile devices and computers, open source applications with contributors all over the world, and established programming libraries that developers can incorporate into an application with little effort. By passing a law requiring U.S. software developers to include a government backdoor, the U.S. would put its own companies at a disadvantage and encourage an explosion of additional encryption development outside the U.S. Users would quickly flock to these non-U.S. offerings and abandon software developed in the U.S., which customers already wary of surveillance in a post-Snowden world would likely view as tainted.

To maintain competitiveness in the face of such regulation, some U.S. companies may consider moving overseas rather than complying with a law that makes their products less attractive to privacy- and security-conscious customers. Larger U.S. companies that couldn’t move offshore would modify their platforms to allow third-party encryption plugins so that their customers could take advantage of strong encryption developed elsewhere. The result is that most people would end up using encryption the government couldn’t pierce, and U.S. companies would be at a distinct disadvantage in the marketplace. Nobody wins.

Forcing Companies to Defeat their Own Security

After early praise from many members of Congress, efforts to pass a law requiring the backdoor encryption key discussed above stalled recently as privacy advocates weighed in and highlighted the problems inherent in that legislation. Frustrated with no way to access the phone used by the perpetrators of the attack in San Bernardino, the DOJ adopted a new strategy – invoking a law originally enacted in 1789, which says only that “[courts] may issue all [orders] necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”

The DOJ used this catch-all statute, referred to as the All Writs Act, to convince a federal magistrate judge to issue an order well beyond the scope of traditional subpoenas and search warrants. Because Apple’s privacy and security safeguards make it impossible for Apple to access its customers’ iPhones, the DOJ convinced the court to effectively commandeer Apple’s development team, requiring them to create and provide to the DOJ a custom version of iOS, the iPhone’s operating system, for the sole purpose of circumventing Apple’s own safeguards. With those safeguards out of the way, the DOJ hopes to execute a brute-force attack against the phone’s passcode, gaining access in relatively little time.
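The arithmetic behind “relatively little time” is worth a quick sketch. Apple’s iOS security documentation has cited roughly 80 milliseconds per passcode attempt, because key derivation is bound to the device’s hardware key; the numbers below assume that commonly cited estimate rather than anything case-specific, and assume the custom iOS removes the escalating retry delays and the ten-attempt auto-erase:

    # Back-of-the-envelope timing for a passcode brute force, assuming the
    # roughly 80 ms per attempt that Apple's security guide has cited, with
    # retry delays and auto-erase disabled.
    SECONDS_PER_ATTEMPT = 0.08  # assumed estimate, not case-specific data

    def worst_case_hours(digits: int) -> float:
        attempts = 10 ** digits  # every possible numeric passcode
        return attempts * SECONDS_PER_ATTEMPT / 3600

    print(f"4-digit passcode: {worst_case_hours(4):.2f} hours")  # ~0.22 (13 min)
    print(f"6-digit passcode: {worst_case_hours(6):.2f} hours")  # ~22.22

A four-digit passcode falls in minutes and even a six-digit one falls in about a day, which is why the safeguards being removed matter far more than the strength of the encryption itself.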

Two points here are worth further elaboration. First, courts issue subpoenas and search warrants all the time, but those orders generally compel a person to turn over information or a physical thing. Here, Apple did not have a program, method, or even any information that could assist the government. Thus, in a completely unprecedented move, the DOJ requested, and the court ordered, that Apple create an entirely new version of iOS. By conscripting Apple and its developers, the court effectively deputized the company, turning it into an agent of the government against its will.

The second point worth highlighting is this: Congress wasn’t involved in this significant expansion of government authority. Yes, it had the opportunity to be involved in a related issue when it considered the encryption legislation, but declined, allowing the debate to continue. Only after Congressional support for the encryption legislation waned did DOJ take the unprecedented move of seeking a court order. This move has been described by some as a creative use of all of the tools at DOJ’s disposal, but strikes many as a clear sidestepping of Congress’ authority to legislate this important issue.    

In an open letter about the case this week, Apple CEO Tim Cook characterized the order as a “dangerous precedent,” and noted that there are no guarantees that the insecure version Apple has been ordered to create wouldn’t somehow be reused, reverse engineered, or otherwise disclosed. Dissemination of a tool or method that unlocks the iPhone’s privacy and security safeguards would have a huge impact on all of Apple’s hundreds of millions of customers, and would almost certainly inflict an enormous reputational and financial cost on the company.   

The DOJ’s extraordinary demands in this case aren’t limited to mobile phone companies. The arguments made by DOJ attorneys could just as easily be applied to cloud, security and other companies, forcing government-dictated changes in technology and driving customers to use non-U.S. solutions that aren’t subject to court-compelled hacking by the vendor that created them. For this reason, we will continue to educate legislators and encourage a real partnership between technology companies and U.S. law enforcement. If you have questions or want to express a different view on these issues, please email me at shane.mcgee@fireeye.com.