For the past couple of years we’ve been on a six-month cycle where US federal law enforcement offices1 demand that tech companies implement “responsible encryption” that would enable law enforcement to decrypt content under certain legal circumstances. Every time those demands surface, the tech sector refuses, insisting that compliance would be bad for their products and put their users at risk. Having focused on applied cryptography for a while in graduate school, I often find myself answering questions about how a “responsible encryption” system might be designed, and why it would actually weaken security for everyone but the bad guys. So today we’re going to take a step away from cryptocurrencies and trading, and talk about the cryptography that makes it possible.
In the early history of modern cryptography, the US government set out to employ every capable cryptographer in the country to prevent strong encryption from getting out to the world. Over time bits and pieces leaked out or were independently developed, and it became clear that private encryption was inevitable.
The US Government released the Data Encryption Standard (DES), which used 64-bit keys (offering 56 bits of actual security, since 8 of the bits are parity). It was the first widely used encryption standard outside of government. A 56-bit key takes a lot of computing power to crack, and in 1977, when DES was released, very few entities had the computing power to crack it in any reasonable amount of time. The US Government, however, was an exception. They weren’t too concerned about bad guys using DES, because if they needed to they could point a lot of computers at it and get the data they needed.
As computers advanced, DES became weaker. By 1999 there were private systems capable of cracking DES in under 24 hours. With modern cloud computing, the ability to crack a 56-bit key is within reach of virtually anyone.
By 2002, the Advanced Encryption Standard (AES) had been adopted as a Federal standard. AES supports 128-bit, 192-bit, and 256-bit keys.
A Bit About Bits
When we say a key is 56 bits, 128 bits, or 256 bits, it’s easy to get lost. The important thing to understand is that every bit you add doubles the number of possible keys. A 1-bit key has two possibilities - 0 or 1. A 2-bit key has four possibilities - 0, 1, 2, or 3. Going from a 64-bit key to a 65-bit key doubles your security. Going from a 64-bit key to a 128-bit key multiplies the number of possible keys by 2^64 - roughly 18 quintillion.
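The doubling is easy to see in a few lines of Python (a toy calculation, not a security tool):

```python
# Toy calculation: the size of the keyspace doubles with every added bit.
def keyspace(bits: int) -> int:
    """Number of distinct keys of the given length."""
    return 2 ** bits

print(keyspace(1))                    # 2 possible keys: 0 or 1
print(keyspace(2))                    # 4 possible keys: 0, 1, 2, or 3
print(keyspace(65) // keyspace(64))   # adding one bit doubles the count
print(keyspace(128) // keyspace(64))  # 64 extra bits: 2**64 times as many
```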
To put things in perspective, if you could capture all the energy the Earth receives from the sun and feed it to a theoretically optimal computer (one running at the limits of physics), you could try every possible 128-bit key (an attack known as brute forcing) in about a second. If you could capture all the energy emitted by every star in the galaxy and use it with a theoretically optimal computer, it would take 2.392 million years to brute force a 256-bit key. With modern computers and available energy sources, breaking a 128-bit key is not remotely feasible, even for an organization with the resources of the US Government.
The Desire for a Backdoor
When AES was adopted in 2002, the US Government lost its backup plan of trying every possible DES key. For the past sixteen years, everyone has had the ability to encrypt data so securely that nobody can access it without the key.
This has been crucial to economic development. We use it for e-commerce, to keep financial information away from thieves. We use it to protect medical data, keeping personal information away from employers, insurance companies, and other groups that might like to take advantage of it.
Of course, criminals use it too. Terrorists encrypt their communication. Organized crime can use encrypted e-mail or messaging. There are online marketplaces where you can buy just about anything with complete anonymity. And it goes without saying that the government would like to stop these activities.
How Could it Work?
So far, most of what I’ve talked about is called “symmetric encryption”. With symmetric encryption you have a single key that is used both to encrypt your data and later to decrypt it. You have to keep that key secret, because anybody who has it can decrypt your data.
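A minimal sketch of the symmetric idea, using XOR as a stand-in for a real cipher like AES (this is a toy for illustration only, not something to use in practice):

```python
import os

# Toy symmetric cipher: the SAME key both encrypts and decrypts.
# XOR with a repeating key is a stand-in for a real cipher such as AES.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                    # the one shared secret
ciphertext = xor_cipher(b"hello", key)  # encrypt with the key
plaintext = xor_cipher(ciphertext, key) # decrypt with the SAME key
assert plaintext == b"hello"
```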
There’s also a concept of “asymmetric encryption”. With asymmetric encryption, you can have a public key that anyone can use to encrypt data, and a private key that is required to decrypt data that was encrypted with the public key. (Asymmetric encryption can also be used for digital signing, but that’s another topic.)
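The asymmetric idea can be shown with textbook RSA and tiny primes. These are hypothetical demo numbers; real systems use keys of 2048 bits or more plus a padding scheme such as OAEP, and nothing this small is secure:

```python
# Toy textbook RSA - an illustration of asymmetric encryption only.
p, q = 61, 53            # two small primes (demo values, wildly insecure)
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: the modular inverse of e (Python 3.8+)

message = 42                       # any number below n
ciphertext = pow(message, e, n)    # anyone can encrypt with the public (e, n)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
```

The key pair (e, n) can be published freely; only d must stay secret.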
If we wanted to create an encryption system with a back door, the government could publish a public key and require that anyone who encrypts data also encrypt their secret key with the government’s public key, storing or transmitting it alongside the encrypted data (a scheme often called key escrow). Then if the government needs access to the data, it can use its private key to decrypt the secret key, and use the secret key to decrypt the data.
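The whole scheme can be sketched end to end with toy stand-ins: XOR in place of a real symmetric cipher, and textbook RSA with tiny primes as the government key pair. Every name and number here is a hypothetical demo value; nothing in this sketch is secure:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Stand-in for a real symmetric cipher: XOR with a repeating key.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# "Government" key pair (toy RSA; a real deployment would use 2048+ bits).
p, q = 61, 53
n, e = p * q, 17                   # public key: (n, e)
d = pow(e, -1, (p - 1) * (q - 1))  # private key, held only by the government

# Sender: pick a secret key, encrypt the data, and escrow the key.
secret_key = os.urandom(1)         # toy 1-byte key (a real key is 128+ bits)
ciphertext = xor_cipher(b"meet at noon", secret_key)
escrowed = pow(int.from_bytes(secret_key, "big"), e, n)  # key under gov public key

# Government, with a warrant: decrypt the escrowed key, then the data.
recovered_key = pow(escrowed, d, n).to_bytes(1, "big")
plaintext = xor_cipher(ciphertext, recovered_key)
assert plaintext == b"meet at noon"
```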
So What’s Wrong With That?
The system above raises a critical question - who has access to the keys, and under what circumstances can they be used?
If the government’s private key were stored on specialized hardware, disconnected from the Internet, requiring physical access in a vault protected by state-of-the-art systems and armed guards, it’s plausible that their private key would stay secure. If the FBI gets a warrant, they can take the encrypted secret key to that vault, get it decrypted, and take it back to decrypt the data.
But we’ve already established that the tools to encrypt data very securely have existed since 2002. A bad guy who wanted to keep the government away from their data need only use technology that existed almost two decades ago. They could play the government’s game and provide a key encrypted with the government’s public key, but until the government decrypted that key and tried to decrypt the data, it would have no way of knowing whether the key was the real thing. The bad guy might have escrowed a different key than the one used to encrypt their data, or encrypted their data twice and made only one of the keys available to the government.
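The cheat is easy to demonstrate with the same toy stand-ins as before (XOR for the cipher, tiny textbook RSA for the government key pair; all values are hypothetical and insecure):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Stand-in for a real symmetric cipher: XOR with a repeating key.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# "Government" key pair (toy RSA demo values).
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))

real_key = b"\x5a"   # the key actually used to encrypt the data
decoy_key = b"\xa7"  # the key handed over for escrow
ciphertext = xor_cipher(b"rob the bank", real_key)
escrowed = pow(int.from_bytes(decoy_key, "big"), e, n)

# Until the government decrypts the escrowed key AND tries it against
# the data, the decoy is indistinguishable from a legitimate escrow.
recovered = pow(escrowed, d, n).to_bytes(1, "big")
assert xor_cipher(ciphertext, recovered) != b"rob the bank"  # garbage out
```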
What this means is that it’s easy for bad guys to cheat. Honest citizens using off-the-shelf technology will have their data encrypted in a way the government can access, but the people we actually have reason to catch will use additional technology to keep their data from the government. That puts honest citizens at risk in the event that the government’s key is compromised, while criminals and terrorists have as much protection as ever.
Now, if we want to catch the bad guys who are cheating, we could check encrypted data at random. We could put the government’s private key in data centers around the country to check that encrypted traffic has the properly encrypted secret key, and that the data is accessible after decryption. We might also need to have police stop people on the streets to make sure their phones and laptops are encrypted with the proper back doors. If we catch people without the proper back doors, we charge them with a crime right then, regardless of what they were trying to hide.
But in that scenario the government’s private key is everywhere. There are copies in lots of data centers. Police forces might have their own copy. You no longer need a warrant to get access to it, and it’s not stored in a vault with state-of-the-art physical security and armed guards. Then it’s only a matter of time before the government’s private key falls into the wrong hands. When that happens, those wrong hands will be able to decrypt every honest citizen’s financial data, medical records, etc. for as long as that private key has been in use.
Even if the government is aggressively looking for data encrypted without the mandated back door, criminals will still have options. They could use steganography to hide their encrypted data inside other data - that picture of their kids might encode the plans to rob a bank, and it would be nearly impossible to prove that there even was a hidden message, much less its content.
Even if the government does a perfect job protecting its keys, quantum computers create a risk for anyone who follows the rules. As quantum computing advances, Shor’s algorithm will chip away at asymmetric cryptography. Without a back door, the symmetric secret key that protects the data is safe from Shor’s algorithm. But if honest Americans encrypt that secret key with the government’s public key, the development of quantum computers will eventually make it accessible.
The Unavoidable Trade-off
When you’re dealing with a back door encryption system, the better you protect the back door key, the easier it will be for criminals to cheat; and the harder you make it for criminals to cheat, the more likely it becomes that every honest citizen’s data will be compromised.
It’s undoubtedly frustrating for law enforcement to be thwarted when they can’t access the communications or data of a suspect. But back door encryption systems will put every honest American’s data at risk, while a committed criminal already has the tools to keep their data safe.
1 As a side note - this post takes a US-centric view because OpenRelay is US-based, and because the US is currently debating the topic. US policy wouldn’t necessarily affect the encryption available to those in other countries, but the general concepts would apply to the people of whatever nation or state implemented the policies discussed above.