Requiring cloud providers to provide back doors for law enforcement will weaken security for everyone.

 

A back door in security is a way for a government, law enforcement, or some other entity to access encrypted or locked information.  If I protect my data with an encryption key, it's like locking my front door.  A back door opens up another way to get at my information without my knowledge or assistance.
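One common back-door proposal is "key escrow": a copy of the user's key is held by a third party. Here's a toy sketch of the idea in Python, using a throwaway XOR cipher purely for illustration (this is NOT real encryption, and the variable names are mine, not from any actual proposal):

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR cipher, for illustration only -- not real cryptography.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"my private data"
user_key = secrets.token_bytes(len(message))  # the "front door" key
escrow_copy = user_key                        # a back door is simply a second copy

ciphertext = xor_cipher(message, user_key)

# Whoever holds the escrowed copy can decrypt the data
# without the owner's knowledge or assistance.
recovered = xor_cipher(ciphertext, escrow_copy)
print(recovered)  # b'my private data'
```

The point of the sketch: the back door isn't a separate mechanism bolted onto the cipher; it's just another holder of the same secret, with all the same power the owner has.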

 

Among friends, relatives and colleagues who know that I'm working on a security start-up, the Apple case has come up in conversation often.  The issue was that the government wanted Apple's help to unlock the iPhone belonging to one of the terrorists in the San Bernardino attacks, and Apple refused.  Apple said that if it helped the government break into this one iPhone, it would then have to help the US government and others around the world break into countless devices, which could lead to violations of citizens' privacy.  The government had a pretty strong point: who could argue against the fight against terrorism?

 
This argument has also been used by the members of the Five Eyes intelligence alliance – made up of Australia, Canada, New Zealand, the U.K. and the U.S. – who have asked the technology industry to make it easier for governments with lawful authority to obtain decrypted versions of encrypted data.
 

The San Bernardino case happened to be a visible moment that some in the government carefully chose to make a strong point about encryption technology.  FBI Director James Comey testified before Congress back in July of last year that “encryption threatens to lead us all to a very, very dark place.”  He wanted Congress to enact laws that require the government to have a key to any encrypted information.  Almost a year later, Congress is starting to work on exactly that.  Senators Dianne Feinstein and Richard Burr have released an initial draft of a bill that would require any entity providing encryption services to help government agencies decrypt that information upon the issuance of a court order.  In essence, this bill would require back doors to be built into any product or service that provides encryption.

 

Here’s why such back doors are a very bad idea:

 

Really strong encryption technology has been widely available for a long time.

 

What does really strong mean?  Strong encryption means that, with today's technology, it would take all the computing power on the planet millions of years to decrypt a single message.  The same was true a few decades ago, but computing power has increased exponentially, to the point where data encrypted with yesteryear's technology can be easily cracked today.  Fortunately, encryption technology has advanced as well, and there's no reason to expect any less advancement in the future.  The point is that, at any given moment in time, strong encryption remains strong.
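To see why "millions of years" is, if anything, an understatement, here's a back-of-the-envelope estimate for brute-forcing a single 128-bit key, under a deliberately generous assumption (mine, for illustration) of a billion machines each testing a billion keys per second:

```python
# Brute-force estimate for a 128-bit key.
keys = 2 ** 128                         # size of the keyspace
guesses_per_second = 10 ** 9 * 10 ** 9  # 10^18 guesses/second, a generous assumption
seconds_per_year = 60 * 60 * 24 * 365

years = keys // guesses_per_second // seconds_per_year
print(f"{years:.2e} years")  # on the order of 10^13 years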

 

Once upon a time, encryption technology in the US was carefully controlled.  Products using strong encryption couldn’t be exported, for example.  But software has a way of crossing borders effortlessly, and the US hasn’t had a monopoly on the brilliant mathematicians who invent the stuff.  So controls were lifted, and now it’s possible to download very strong encryption software over the Internet for free.  The algorithms are in the public domain – they can be accessed and understood by anyone who wants to explore them.  The software based on these algorithms has been examined and tested extensively and made very robust.  It’s all available to anyone very easily.  And, in fact, it’s exactly the open nature of this software that has made it so good.  Weaknesses are spotted more quickly when lots of people are freely examining the code.  So any holes tend to get patched faster than they would have back in the old days when the technology was more carefully controlled.

 

So why are back doors necessarily bad?  Can’t one just put a good lock on the back door, give the key only to trusted people in law enforcement, and still keep users’ data secure?

 

Here’s the problem: a back door’s key is just a long string of numbers.  Like software, it can be copied and transmitted very easily.  Because of that, the keys could easily be used to get at your data for the wrong reasons.  Whether or not you believe some person in some government agency will abuse the privilege of having access to your back door, it’s virtually impossible for any entity – including the US or any other government – to guarantee that some hacker won’t get hold of it.  Witness the multitude of agencies, including the IRS, whose computer systems have been breached in recent years.
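To make "just a long string of numbers" concrete, here's a short sketch using Python's standard `secrets` module. A 256-bit key fits in 64 hexadecimal characters, short enough to paste into an email or a text message, and a copy is indistinguishable from the original:

```python
import secrets

# A 256-bit key is nothing more than a number -- here, 64 hex characters.
key = secrets.token_hex(32)
print(key)  # e.g. '9f86d081884c7d65...'

# Copying it is effortless; the copy is every bit as powerful as the original.
stolen_copy = key
print(stolen_copy == key)  # True
```

Unlike a physical master key, an escrowed key can be exfiltrated in a single line of text, without anyone noticing it's gone.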

 

So it’s certain that requiring back doors into all businesses’ and individuals’ data will leave their information less secure than it would be with no back doors at all.

 

But what about the bad guys?  The purpose of requiring back doors in the first place was to make it easier for law enforcement to stop the bad guys.  Remember that strong encryption without back doors is widely available around the globe.  Will the bad guys use this cheap, widely available strong encryption, or will they instead obey the law and use the weaker stuff with back doors?  Of course, they’ll use the good stuff.

 

The result would be a world where, in reality, the government has no better tools to catch the bad guys, and the good guys’ data is less secure.

 

Some argue that smart people in the tech industry can solve this problem if only they put their minds to it.  This presupposes that there’s some magic technology that can be placed in the government’s possession such that it will never be leaked or abused.  For as long as secrets have existed, humanity has never been able to solve this problem.  It’s absurd to think some smart engineers can do so now.