Can vs. Should: Real-World Equivalents in the Debate on Encryption Backdoors

Last month, we watched as Apple was once again pressured to break iPhone security mechanisms in order to assist in a government investigation. This time, the target was the Pensacola shooter’s iPhone. In 2016, when Apple was under similar pressure over the San Bernardino shooter’s iPhone, virtually all tech companies supported Apple’s position that such self-defeating technology would be too dangerous to create. I don’t think positions have changed much since then, despite our disgust with these perpetrators and their evil actions.

A while back I wrote an article describing what I saw as the slippery slope of mandated backdoors. I still see it the same way as we enter 2020, but with the drumbeat continuing in many government circles, I think we need to start framing the debate as a real-world privacy issue rather than a technology challenge. Part of the problem in assessing the validity of what the government is considering is that it’s mired in techno-jargon and quickly gets folks like me talking about best practice and the nuances of symmetric and asymmetric encryption, end-to-end encryption, forward secrecy, key management, and zero-trust security design. How it all actually works is indeed a factor in the debate and can (and should) help us understand the technical risk that would come with such a mandate, but it’s also the thing that’s distracting the average person from seeing the issue for what it truly is. Therefore, I think it would be useful to describe what a government-mandated backdoor looks like in the real, i.e., physical, world.
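For readers who do want a glimpse of the jargon, here is a minimal sketch of what “end-to-end” means in practice. It uses the Fernet scheme from the third-party Python cryptography package as a stand-in for whatever cipher a real messenger deploys; real systems negotiate keys asymmetrically rather than sharing one up front, and the names and message here are invented for illustration.

```python
# A minimal sketch of end-to-end encryption, using the Fernet scheme from
# the third-party `cryptography` package (pip install cryptography) as a
# stand-in for whatever cipher a real messenger uses. Real systems use
# asymmetric key agreement rather than a pre-shared key like this one.
from cryptography.fernet import Fernet

# The key exists only at the endpoints; the carrier never sees it.
shared_key = Fernet.generate_key()
alice = Fernet(shared_key)
bob = Fernet(shared_key)

ciphertext = alice.encrypt(b"meet at noon")

# Anyone in the middle -- the provider, an eavesdropper -- sees only this:
print(ciphertext)                 # opaque bytes, useless without the key

# Only a holder of the key can recover the message.
print(bob.decrypt(ciphertext))    # b'meet at noon'
```

The entire debate comes down to the one thing this sketch makes visible: where that shared key is allowed to live. A backdoor mandate is, in code terms, a requirement that a copy of it exist somewhere neither endpoint controls.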

Let’s say there was a little black box in all of our homes that recorded all of our conversations. The box could only be opened with a key owned by the government, which could use it any time it could convince a judge there was a 51% chance of finding evidence relevant to an investigation. First questions first. Would you destroy the box? Maybe play a radio next to it? What if the mandate forced home builders to install the boxes in a way that they could not be destroyed or tampered with? Every word you say is irreversibly and indefinitely archived, and at any time in the future the government can access the archive and see what was said. Short of not talking, there’s nothing you can do.

What’s your first concern in that scenario?  Are you worried about how the key to the box will be managed or are you worried about how you’ve lost the ability to have a truly private conversation in your home?

That’s my problem with the backdoor debate. All the focus on technology makes us ask the wrong questions, and wrong questions lead to wrong answers. Is there a key management problem in the little-black-box scenario? Absolutely. No way that key stays secret for long. No way there aren’t unscrupulous builders out there rigging access for the highest bidder. No way I’m confident my data will be safe after a break-in. Still, that’s my secondary concern. Do I want to live in a world where every word I say is entered onto a permanent record? That’s the question.
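If the key-management worry sounds abstract, here is a toy model of it, again in Python with the Fernet scheme from the cryptography package. The homes and recordings are invented, and a real escrow system would be far more elaborate, but the single point of failure is the same.

```python
# A toy model of the escrow problem: one master key, held by a third
# party, that opens every archive. Invented data, for illustration only.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()   # the one key that opens every box
escrow = Fernet(escrow_key)

# Every home's archive is sealed with the same escrowed key.
archives = {
    "home_1": escrow.encrypt(b"every word said in home 1"),
    "home_2": escrow.encrypt(b"every word said in home 2"),
}

# Whoever obtains escrow_key -- by warrant, bribery, or breach --
# reads everything, for every home, for all time.
for home, blob in archives.items():
    print(home, escrow.decrypt(blob))
```

One leaked value unlocks every archive, past and future. That is the cost hiding behind “no way that key stays secret for long.”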

For too long the technology industry has been building an alternate universe where real-world rules and privacy expectations do not apply. The massive proliferation of online user data has been exploited by malicious actors seeking to gain advantage over governments, businesses, and individuals, to the tune of billions of dollars lost and countless lives harmed. To its credit, government has gotten its hands dirty and exploited the same system to turn the tables on many of those doing the harm. After all, fair is fair.

But now, some in the technology industry are trying to right the ship, and users are on board. Apple users want a product they know can be accessed only by them, even if, God forbid, they lose their phone. Wickr users want a product they know can give them the security of a face-to-face conversation when they can’t be there in person. Consumers are aware of and experienced with the technology failures of the past, and they are demanding more from their products to protect them. They deserve it.

So, what should government do in response to these industry changes? Resist them? Does it make sense to stop providers like Apple and Wickr from implementing security mechanisms that will ultimately prevent far more crime than law enforcement could possibly prosecute without them? No. Not in the greater-good sense, not in the practical sense, and not in the personal sense of how free I feel when I’m forced to put my data at greater risk than it has to be.

I see no wisdom in gathering more evidence at the expense of creating more victims. That’s the technical backdoor debate in a nutshell. It’s still a secondary concern, however, to my human right to secure my data in the manner I see fit. As we and our representatives in Congress debate this issue going forward, let’s focus less on whether it can be done and more on whether we should do it.
