Encryption backdoors are a terrible idea – here’s a historical example to showcase why…
On Tuesday, speaking at a cybersecurity conference in New York, US Attorney General William Barr paid homage to a long-standing international law enforcement tradition: making legal demands for the tech industry to break its own encryption.
If you’re familiar with Hashed Out, you already know we’ve been extremely outspoken on this topic – it’s an awful idea. And if you’re familiar with me, you can safely assume that our CEO is having a cardiac event as he’s reading this headline. So, quickly, a disclaimer. Though the case we’re about to lay out is tangentially related to politics, this is not a political article. It’s a thought exercise.
Besides, originally the example I chose was gun control, which is much more contentious than whether it should be legal to sell booze.
This is going to be a fairly US-centric discussion, but then, it was the US Attorney General that made the most recent round of comments. And when it comes to tech, the US is a pretty good bellwether.
The question is simple: what if we applied the same attitudes we have about gun control – and, in the past, prohibition – to the encryption debate?
And no, it’s not that the only way to stop a bad guy with encryption is a good guy with encryption…
It’s that weakening or even banning encryption doesn’t punish criminals. It punishes law-abiding citizens. As we’re about to discuss, it’s really not that hard to spin up an encrypted messenger app and distribute it among you and your friends. Or your terrorist buddies. Or your human trafficker pals. Or really any ne’er-do-well that criminals share an orbit with.
So, today we’re going to kick the tires on the “going dark” encryption debate, talk about creating your own encrypted messenger app, and then we’ll talk about whether these law enforcement agencies are even making their case in good faith or if this is just window dressing for an eventual surveillance state.
Let’s hash it out.
The government wants to fight for its right to (be) party (to your encrypted conversations)
When you hear US Attorney General William Barr say:
“As we use encryption to improve cybersecurity, we must ensure that we retain society’s ability to gain lawful access to data and communications when needed to respond to criminal activity…”
It sounds perfectly reasonable on its face. It’s not.
I really don’t want to turn this into a civics lesson but there’s a lot of ground to cover here and we’re going to need to contextualize it properly. There’s a technical reason that this isn’t a reasonable request, and then there’s also a bit of spin on it that misrepresents what’s actually at stake here, too.
Let’s start with the technical side – there’s just no way to do what law enforcement is asking in the way that it’s asking.
To effectively create the type of legal access to encryption that is being requested – and not just in the US, Australia just passed a truly terrible piece of legislation, too – you really only have two viable options.
- Key Escrow
- Encryption Backdoors
Of the two methods, I’d argue that key escrow represents the lesser of the two evils. But really, that’s like being asked if you’d rather be constipated or incontinent – they’re both shitty options.
Key Escrow would involve a trusted third party storing ALL the keys – the idea being law enforcement could take a warrant to the entity administering the secure key store. There are a ton of problems with this idea.
For starters, that’s a ton of keys. The average enterprise is managing over 80,000 private keys. And managing those keys is a massive challenge for those companies. We work with a ton of them. They fall into one of two categories, either they’re drowning in keys and constantly losing money because of compromises and mismanagement – or they’re our customers.
All kidding aside though, the biggest companies in the world struggle to manage their private keys. And that’s in spite of all the money and resources they invest in it. Now, in order to operate a key escrow system, you have to either trust one (or several) of those Enterprise companies with the resources to manage all those extra keys at scale – OR you have to trust the government.
And trusting the government extends beyond just trusting it to maintain the key store on both technical and security levels, it means trusting the government not to misuse the keystore. What kind of oversight gets baked into that? Any? We’re still reeling from the Shadow Brokers and Edward Snowden pulling the curtain back on the United States’ domestic surveillance program. Credibility isn’t exactly soaring.
And then there’s the security aspect. Unless you’re air gapping the location of the keys – or storing them printed on paper – you’re going to need to have Fort Knox levels of security to ensure it remains un-breached. And before saying, “well I guess we’ll air gap them,” that ignores how you’re going to constantly upload and update this database of keys. Because new keys are generated every second and you have to store them all. And you have to keep all of the old ones stored, too – lest the conversations that used them be lost forever.
That’s just not a tenable solution. Even if the key storage is decentralized and done proprietarily by the vendors, you’re only increasing the attack surface and the number of parties you have to trust with security.
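To make the trust problem concrete, here’s a minimal sketch (all names hypothetical, in Python) of what mandatory key escrow boils down to: every key generated gets a copy deposited with a third party, and whoever controls – or breaches – that store effectively holds everyone’s keys:

```python
import secrets

# Toy "trusted third party" key store. In reality this would be
# billions of keys, growing every second, and retained forever.
escrow = {}

def generate_and_register(user_id: str) -> bytes:
    """Generate a user's key and deposit a mandatory copy with the escrow agent."""
    key = secrets.token_bytes(32)  # 256 bits of key material
    escrow[user_id] = key          # the copy law enforcement would subpoena
    return key

def warrant_lookup(user_id: str) -> bytes:
    # In theory, only reachable with a court order. In practice,
    # anyone who compromises this store gets every user's key.
    return escrow[user_id]

alice_key = generate_and_register("alice")
assert warrant_lookup("alice") == alice_key
```

The sketch is trivially simple on purpose: the hard part isn’t the lookup, it’s that the dictionary in the middle is the most valuable attack target on the internet.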
The other method involves backdooring encryption, which sounds incredibly abstract but really just involves the random number generators that are used in these cryptosystems. Random Number Generators (RNGs) – which were at the center of the PKI industry’s recent serial number snafu – generate the seeds that are used to create encryption keys. If you know the seed that was used, it’s a lot easier to guess the private key.
When you include an encryption backdoor, you’re essentially divulging what seed range you’re using to whatever party you’re weakening your encryption for. This would be less of a problem if these RNGs were truly random, but one of the biggest problems facing our industry right now is that they’re really not. What makes things worse is that a lot of these different implementations use the same seeds. So it’s not a backdoor that’s opened on a case-by-case basis – putting that backdoor in undermines ALL encryption done with that cryptosystem. It’s not limited to just the device that law enforcement has a warrant to search.
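Here’s a toy demonstration of why a known seed range is fatal. If an attacker knows the (hypothetically backdoored) range a seed was drawn from, recovering the key is just a brute-force loop. This sketch uses Python’s non-cryptographic random module purely for illustration:

```python
import random

def keygen(seed: int) -> bytes:
    # Deterministic key derivation from a seed.
    # random.Random is NOT cryptographically secure - that's the point.
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(32))

# Victim's key, generated from a seed confined (by the backdoor)
# to a small, known range.
victim_seed = 1234  # assume the backdoor limits seeds to 0..9999
victim_key = keygen(victim_seed)

# Anyone who knows the range - law enforcement or otherwise -
# recovers the key by trying every seed.
recovered = next(keygen(s) for s in range(10_000) if keygen(s) == victim_key)
assert recovered == victim_key
```

The brute force above finishes in a fraction of a second. A real backdoor wouldn’t shrink the seed space quite this crudely, but the principle is identical: anyone who learns the backdoor’s parameters can do the same search, not just the party it was built for.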
There is no master key solution. And even if there were, that’s a fatally flawed idea, too. So, the idea is either to create the most high-value target in the history of the internet and task some company or government agency to administer it safely and efficiently. Or undermine encryption for everyone.
This isn’t just about legal access
Remember earlier we tossed out a parallel to prohibition? Let’s talk about that, because on their face these two things might seem kind of unrelated – they’re not.
In 1920 the US banned the sale, production and import of alcohol. All alcohol. There was a litany of reasons, a lot of them related to puritanism and “fixing the ills” present in society. Fun fact, by 1830, when temperance movements were first being started, the average American was drinking 1.7 bottles of hard liquor per week.
US Alcohol Consumption by Year
Either way, prohibition was presented as a completely reasonable, moral initiative.
And it was an epic failure.
Part of that is due to the aforementioned argument. Banning booze, or even heavily restricting it, only serves to disenfranchise law-abiding citizens. Criminals are still going to do whatever they want. They don’t follow laws – that’s kind of what the whole “being a criminal” thing is all about.
And that’s exactly what happened with prohibition. Instead of staying sober, an entire black market liquor economy popped up, only rather than being controlled, regulated and taxed by the government, it was run by criminals. You can still go to Chicago to this day and see the tunnels that Al Capone used to sneak his liquor through.
This encryption debate is being presented as a reasonable request by law enforcement, too – just a way to ensure they can access criminals’ phones. That’s incredibly misleading. This is a Fourth Amendment issue. The US Fourth Amendment reads, in part:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
Now, the obvious place to point is at the tail end where it gets into legal warrants. And that’s all good. But there’s a big difference between asking to serve a warrant that is legitimately granted through the proper channels and pre-emptively compelling a company to break its own product’s security so it can comply with a warrant once one is eventually granted.
Now, I’m not a legal scholar; I can’t speak to case law and legal precedents when it comes to this sort of thing. But that’s entirely the point. These are points of contention that NEED to be decided by a court, not via some extra-judicial decree by the US Justice Department. There’s a lot of very murky, grey area surrounding this debate, and neither the tech industry nor the law enforcement apparatus in this country has the objectivity to weigh both sides.
And here’s the thing: going back to the prohibition parallel, encryption backdoors and key escrow are only going to hamstring regular, law-abiding citizens. And while it’s easy to say those law-abiding citizens have nothing to worry about, the NSA said the same thing about the Patriot Act. I’m not saying you shouldn’t trust the US government. I’m saying I don’t.
More importantly, the solution wouldn’t even solve the problem.
It isn’t that hard to spin up your own encryption
There will always be other secure messenger and full-disk encryption apps that operate outside of the legal jurisdiction of the countries demanding legal access to our devices.
And even if there aren’t – it’s not really that hard to make your own encrypted messenger app.
Think I’m kidding? You’d need moderate technical skills and two things to make one:
- Knowledge of an SMS notification framework like Amazon SNS
- A software library that supports the type of encryption you want to use
You wouldn’t even need to use an obscure software library. Hell, you could probably use an SSL/TLS library if you wanted. Just for kicks and giggles I put an ad up on Fiverr looking for someone to design one for me and got six responses from developers in the first hour.
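As a rough illustration of how little is involved, here’s a toy message cipher in pure Python standard library – a hash-counter keystream XORed over the plaintext. To be clear, this is a sketch, not production crypto; a real app would lean on a vetted library, as noted above. But the building blocks are freely available to anyone:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Hash-counter keystream (illustrative only; use a vetted AEAD in practice)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)  # fresh nonce per message
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

shared_key = secrets.token_bytes(32)  # assume exchanged out of band
msg = b"meet at the usual spot"
assert decrypt(shared_key, encrypt(shared_key, msg)) == msg
```

Roughly thirty lines, all standard library, and none of it touches any key escrow or backdoored RNG. Wrap it around an SMS or HTTP transport and you have the skeleton of a private messenger – which is exactly why mandated backdoors in the popular apps only catch the people who weren’t trying to hide.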
Full disk encryption isn’t any harder.
All that breaking encryption will do is send these criminals further underground.
Just like booze during the age of prohibition, those encryption solutions will still be produced. Criminals will still be tunneling right underneath us. Only now those tunnels will be encrypted.
Let’s think about the people law enforcement is most interested in keeping tabs on:
- Organized Crime
- Human Traffickers
- Terrorist Organizations
- Drug rings
Obviously, there’s more than that but it’s been a few years since I’ve played Grand Theft Auto so my criminal recall isn’t what it used to be. Regardless of how incomplete that list is, one thing most modern criminal enterprises have in common is that they’re making money on, or facilitating their crimes with, the internet. Whether it’s selling counterfeit goods or some light corporate espionage and hacking, these criminals are not dumb. At least not on an organizational level. They have people who can code, who can program a little. They’re going to know how to spin up their own encryption solutions, or at the very least they’ll know how to find someone who can.
And guess what, they’re not going to be sending the FBI their seeds and nonces or logging their keys with the designated escrow service. And any metadata or intel that law enforcement may have been able to collect from the apps they were originally using will now be completely lost.
Use what you already have
It’s also worth pointing out that undermining encryption isn’t even necessary. The Apple-FBI case that served as the opening salvo for this debate was rendered moot when the FBI was able to contract a third party to crack the phone. The FBI also lied about the number of devices it couldn’t access.
And then there was the guidance from the EU’s Article 29 Working Party (WP29), which is worth the comparison because it shows how other Western nations view the encryption debate. WP29’s guidance can basically be summarized as “use what you already have,” pointing to options like:
- Access to unencrypted data held by data controllers
- Access to metadata held by data controllers
- Compelling alleged criminals/persons of interest to provide their key
- Targeted interception tools
- Tools for guessing or intercepting passwords
Undermining, or even outright banning – as the Trump administration has considered – some forms of encryption is a nuclear option, but we haven’t reached nuclear levels yet.
And if we ever do, as was stated earlier, banning and undermining the most popular encrypted messaging services is only going to push the criminal element further underground and make them harder to track and catch.
As always, leave any questions or comments below…
** Regrettably, no alcohol was consumed during the writing of this article