End-to-end encryption for civilian messaging services is a dearly-held dream of many outside the intelligence and security communities. It certainly isn’t something that I myself disagree with; I’d like to think that the messages I send to my loved ones are, in fact, being read only by my loved ones. However, every time that somebody uses an app with E2EE to send a message or make a call, members of the worldwide intelligence communities cradle their heads in their hands and cry.

Yesterday, Google jumped on the ‘encryption-for-all!’ bandwagon, announcing its new messaging service Allo, whose messages not even Google itself will be able to decrypt (theoretically, and for now) when the app is operating in Incognito mode. After all, to the average citizen it is perfectly reasonable to take steps to ensure one’s privacy, especially when you know full well that there are those out there with the capacity to intercept and read your unencrypted (and therefore insecure) messages should they choose to.
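For the non-technical reader, the core idea behind end-to-end encryption is simple: the keys needed to read a message exist only on the two devices involved, so everything that passes through the provider’s servers is just ciphertext. Below is a minimal sketch of that idea in Python using the PyNaCl library. To be clear, this is not Allo’s actual protocol, just an illustration of why the service in the middle can’t read along.

```python
# Minimal sketch of end-to-end encryption (illustrative only, not Allo's protocol).
# Requires PyNaCl (libsodium bindings): pip install pynacl
from nacl.public import PrivateKey, Box

# Each person generates a keypair on their own device.
# Private keys never leave the device; only public keys are exchanged.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"Dinner at 7?")

# This ciphertext is all the messaging service ever sees.
# Only Bob, who holds his private key, can decrypt it.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'Dinner at 7?'
```

Because the server never holds either private key, it has nothing to hand over even if compelled, which is exactly what makes the policy argument below so thorny.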

In fact, Google is late to the game on this one. As Wired pointed out earlier today, Facebook (with Messenger and WhatsApp) as well as Apple (iMessage, FaceTime) have been quietly encrypting your communications for some time now. More people are aware of this now, due both to the consequences of the Snowden revelations and the extremely public showdown between Apple and the FBI over getting into the iPhone of the San Bernardino shooter.

And that’s the real rub. For all that we are entitled to privacy (and so we should be, not disagreeing with that!), our intelligence services and security organizations have the duty to protect against threats to the security of the State and the citizens therein (that would be us). Of course, the problem is that privacy for everyone means privacy for everyone, including criminals and terrorists. Apple cannot build the FBI a backdoor into an iPhone, because that sets a dangerous precedent for the future. Not to mention, once that capacity exists it can’t be taken back, and absolutely nobody can guarantee that it won’t eventually trickle down to those who will use it maliciously. This is an ethical as well as a legal dilemma, and there really is no simple (or, so far, complex) solution.

If we are determined to secure our privacy, then we must also accept that those who would cause harm are protected by that same right to privacy and secure communications. When, in the future, the FBI or MI5 or the Australian Federal Police are lambasted for failing to uncover evidence of an attack or plot, they will be able to reply, with the full weight of evidence and history behind them, that they could not collect the intelligence or the evidence necessary to do so. And the public won’t have a leg to stand on, because it will be true.

But that doesn’t mean I want Apple to build backdoors into iPhones, because that is my private property and I haven’t done anything wrong, thank you very much!

See the problem?

However, technology cannot be rolled back, so end-to-end encryption is here to stay. We continue to charge our intelligence and security services with protecting us from threats, so they are going to keep collecting intelligence and evidence as necessary. Now more than ever, the individual right to privacy and the right of the State to ensure national security are becoming incompatible. So the next question is: what do we do about it now?

