Hacker News
Why Security Backdoors are Bad (2016) (medium.com/cm30)
101 points by CM30 on May 4, 2017 | 60 comments



I've said it before and I'll say it again: too much focus is being put on how backdoors can be abused, and too little on why mandating them is a terrible idea in general.

If we decide we don't have the right to conceal our digital correspondence from the government, what else should we not be allowed to conceal? The conversations you have in your car? In your home? Besides the right to privacy, what other rights make fighting terrorism harder? Freedom of association and movement? Freedom of speech, which is regularly used for recruitment?

Finally, if they really want someone's data, in a covert way, they're already able to plant hardware keyloggers, or spy on someone as they enter a password, etc. But it all requires manpower. What backdoors would let them do is monitor encrypted communications covertly and in bulk. To build up a database of all your chats and forum posts, and mine it for any anti-current-politics sentiment.


Building on this line of thought, I never completely understood the description of crypto as a safe to the public. To me, it's more about mandating how I decide to communicate: requiring I speak a language the FBI can understand, that I always speak loudly and clearly enough such that their recording devices can accurately capture what I'm saying, that I always write my letters in a language they can understand, etc.

Crypto isn't a safe, it's the ability to talk in a language only two people understand. Unlike a safe, they have access to the data. They just can't understand it.
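The distinction can be made concrete with a toy one-time pad (an illustrative sketch of my own, not a recommendation): the eavesdropper holds the ciphertext in full, but without the shared key it tells them nothing.

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# The shared key is the "private language" between two parties.
message = b"meet me at noon"
key = os.urandom(len(message))          # known only to sender and receiver

ciphertext = xor_bytes(message, key)    # anyone can see (and store) this...
recovered = xor_bytes(ciphertext, key)  # ...but only key holders can read it

assert recovered == message
```

Unlike a safe, nothing physical stops anyone from copying the ciphertext; a backdoor mandate is a mandate on what language you may speak, not on what box you may own.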


Eben Moglen made this point during the first crypto wars, in 1999: http://moglen.law.columbia.edu/publications/yu-encrypt.html ("The right to speak PGP is the right to speak Navajo")


> ...that I always write my letters in a language they can understand, etc.

And more importantly, with legible handwriting! The FBI has limited resources and we wouldn't want them to waste time deciphering your handwriting only to learn what you've said to your lover, while a terrorist's handwriting goes unanalyzed.

I like this line of reasoning, that we must modify our behavior to enable surveillance. It switches things around from the traditional perspective, which is that we're doing extra things to increase our privacy, to instead be that we're doing extra things to decrease our privacy (using weak crypto, no crypto, adding and removing ssl here, etc).


I agree about the importance of keeping crypto secure, but I think the safe analogy is pretty apt (in situations without illegal stuff happening, which you seem to be implying). People can't understand encrypted data (at least, I'd be super impressed with you if you could understand encrypted data as is), people have to decrypt the data using an encryption key. The FBI's current issue is that with data in a safe, there's a clearly defined legal penalty for refusing to unlock a safe when given a legitimate court order if you're the only person with the safe combo (you can't plead the 5th to unlocking the safe). With encryption, there's currently no corresponding penalty.


> If we decide we don't have the right to conceal our digital correspondence from the government, what else should we not be allowed to conceal? The conversations you have in your car? In your home?

Let's say the government decides it has zero tolerance for domestic violence, so it goes ahead and implements "preventive measures": cameras and microphones in every home, in every room. Now, they might achieve a great reduction in violence, might even catch one terrorist. As always - think about the children: we could stop all those terrible parents who beat their offspring and catch all those wife-beating husbands. And, let's think about what that would do to society...


A stab at a shorter version. Feel free to steal, improve, and then take credit. Takes his best point and then adds some examples of why it is scary:

If device makers and online services are required to set aside a master key for the government, how would we be protected from abuse if the government was ever to become corrupt?

If pervs at the Department of Homeland Security wanted to rifle through pictures of our children, they could. With enough government corruption, if the local sheriff, or the US President, or anyone in between, wanted to read the most private thoughts shared with us by our loved ones, they could... keeping in mind that even if we have nothing to hide, our loved ones might like their privacy.

The more corrupt the government, the more ways they would dream up to abuse the power of this kind of mandatory master key.

In the case of the US, thankfully our government is headed by people with the highest ethical standards... oh, wait, it isn't!

Besides privacy invasion by rogue government employees, the other problem is the fact that such keys can, and likely will, leak out to bad people who will use them to get into our bank accounts and take our money. Even the US NSA cannot protect its most dangerous hacking tools from being released on the internet. It won't be able to protect this master key capability either.

Requiring a master key for the government would be a very misguided and dangerous policy.


Given how much Democrats and Republicans detest each other's candidates, you'd think it would have sunk in by now that relinquishing basic rights and allowing the government to create powerful apparatuses prone to abuse is never a good idea, no matter how much you trust the current people in power. Even if you loved Bush, next up you get a closet-Muslim itching to take your guns away (/s), and even if you loved Obama, now you get a narcissist with fascist tendencies behind the reins (/s?).


Ask not what your country can do for you. Ask what your country can do for them.

Why worry about precedent? When I'm in power I'll just change the rules!


The best counter-propaganda to the "give us backdoors" demand is this:

"FBI demands rights to look at your children's dick pics!"

It would be great to see some headlines like that.

And then in fine print down below: "And some other pics too"


But my children are innocent little angels and would never do that! But they do get dick pics from those other, evil children - isn't it great that the FBI can finally track them down and punish them?


FBI demands access to baby bath photos! Promises that they will be used for justice.


> Imagine your government made it mandatory to leave keys in a certain place so police could enter a property in a hurry.

Isn't that literally what the TSA lock on luggage is? Granted, I wouldn't regard a suitcase made of a bunch of plastic or fabric as very secure against any form of attack, but why do all these arguments against backdoors never mention this? And with the TSA master keys leaked a couple of years ago, all it takes is one malicious airport worker to open your bags and sniff your panties, erm, steal your laptop.


> Isn't that literally what the TSA lock on luggage is?... And with the TSA master keys leaked a couple of years ago, all it takes is one malicious airport worker to open your bags and sniff your panties, erm, steal your laptop.

Good point.

> but why do all these arguments against backdoors never mention this?

However, I don't think arguments against back doors should mention TSA luggage locks! It's a bad prior.

You don't want people in the "airport" mindset when you're trying to convince them not to allow the government to snoop.

People who enter airports (or especially customs) basically give up all of their rights to privacy. And most Americans are apparently OK with this. Begrudgingly, perhaps, but ultimately most people accept it and go on with their life.

So, "think about airports" is a really terrible persuasive setting.

Instead, you want people thinking about their bedroom. About their car. About their child's playroom. About the settings where they live 99% of their life. Because that's the setting that government back doors in consumer electronics expose. And that's the setting where people get most uncomfortable about carte blanche government access.


Someone tell El Presidente that with a backdoor, others will be able to read his tax returns, and, bingo, he will veto any such bill!

Also, 'security backdoor' is an oxymoron. There is no such thing. It is either secure or has a backdoor.


> Imagine your government made it mandatory to leave keys in a certain place so police could enter a property in a hurry.

I'm sure this actually happened somewhere in the US... it might just have been proposed but I distinctly remember reading about mailbox-like containers at the bottom of drives locked with a master key. Does anyone else?

Edit: More than once apparently, "lockbox" was the term I was missing when googling earlier: http://wcfcourier.com/news/local/update-cedar-falls-city-cou... http://archive.northjersey.com/community-news/2.4225/rescuer...


You might also be thinking of a Knox Box. A key securely stored in a firetruck can be used to open a small box on your property, which contains a master key for the facility.

As scary as that sounds, it isn't /completely/ stupid. The system is designed to make using this backdoor exceptionally noisy. Your Knox Box should have a tamper switch inside, so that any time it is opened, alarms go off. At the other end, when the key is removed from a firetruck for use, alarms go off.


I remember reading about those; allegedly, those are used by estate agents when showing houses for sale. Apparently those boxes are often really easy to break into.


Current state of the art in real estate lockboxes includes wireless technology so the devices can communicate usage. They are in the IoT space of vulnerabilities.


These are fairly standard in larger building complexes in Germany. We call them Feuerwehrschlüsseldepot (key depot for the fire dept.); they're very well secured, both physically and organizationally.


Damn, it's 2017 and we're still arguing on why leaving a gaping hole in your house is dangerous.


I remember thinking a long while ago that the whole "clipper chip" thing was going to be the end of this debate. I'm completely astonished by how this just continues to play out, over and over.


It doesn't seem so astonishing when you recognize the inordinate leverage the oligarchy has in controlling the government and framing the "public debate".


Yawn.

"The oligarchy" who you never identify would include the owners and shareholders of businesses who would suffer catastrophic business losses in the event an encryption-defeating law were passed.

Various billionaires are adversarial with respect to the laws they support and the media narratives their companies push (e.g. Washington Post vs Fox News). They don't act in concert.


oligarchy != billionaires club


Who are the oligarchy?


Those who are above the rule of law


This is a tautology and not an answer to my question. It means that you don't actually know how to define your belief system.


Or perhaps you just don't like my definition and its implications so much so that you're unable to consider it. Is there no one who is above the rule of law in your opinion?


Also, don't think about this tech just in the first world, but in the most corrupt, Third World hellhole where human rights are routinely violated and ask yourself why would we sentence brave people to even more misery and suffering?


Average Joe looked at your post, enjoyed the colorful icon at the top for a brief moment, and then his eyes glazed over.

If you really want to resonate with him, publish this in comic book form. I'm serious.


Or in other words, an article I wrote a while back explaining (to complete idiots) why a security backdoor is a bad idea in general. Why no, you can't 'have an encryption hole just for police/authorities to use'.

Perhaps a few politicians may even reconsider their attitudes towards encryption and security in general.

Eh, probably not.


Politicians should be the first to worry about privacy.

  1 - They are not guaranteed to always be in power.

  2 - They will be the first to be targeted. 

  3 - They have the most to lose.

I just don't understand why they don't care about it.


And we should worry about our politician's security too. What if they are blackmailed while in power? That creates a terrible national security situation.

This line of argument would hopefully resonate with the constituencies of the politicians pushing hardest for these backdoors (although it's a fairly bipartisan effort).


With backdoors, which I'm not advocating for incidentally, the TLAs know they're being blackmailed and can intervene. Without that access politicians can still be blackmailed, but it's harder for the TLAs to know about it.


Is that necessarily true? With a scheme like key escrow, is there a sure-fire way to detect a hacker as opposed to an authorized user accessing data? At the very least it seems that if the NSA itself or some other agency were hacked (again) then this problem would still exist. The NSA has notably lacked this type of audit trail in their systems before, hence their inability to track Snowden.


Congress was absolutely aghast when they found out they were being spied on by the three letter agencies, whenever they met with foreign representatives. But they don't care at all when it's applied to the general public, because they assume they'll have clout or connections to keep that at bay.

I've never heard anything come out of that above kerfuffle, presumably because there was some backroom dealing done to ensure it's not a problem for them specifically.


They trust the rule of law?

Surely politicians should only be more worried if they fear their actions being exposed - and the only reason I see for that is if they're guilty of something. Otherwise they've only as much reason to worry as anyone else, if you're of the crowd that considers "I've nothing to hide" to mean privacy doesn't matter.


iPhones have a built-in security backdoor called trusting Apple's code signing. That is the foundation of their walled garden.

So far Apple seem to protect it pretty well. They show that the "only the good guys can control the backdoor" argument is not without merit.


For one system, over a few years, when few people are trying to break in, sure. But the history of those trusted systems is terrible overall. Look at video game consoles, for example: it's not just one or two, but all the major companies that tried, generally several times each.

EX: https://nakedsecurity.sophos.com/2012/10/25/sony-ps3-hacked-...

And no, the US government has not done much better.


Another example along those lines is the NES Classic Mini. Doesn't connect to the internet, doesn't let you write anything to the console and doesn't come with anything like games to buy or extra storage mediums to use.

Yet within weeks, people had hacked it wide open and stuck hundreds of ROMs on it.


I do not support the walled garden, mandatory code signing from anybody but the owner and so on. People should control their devices.

I'm just pointing out that Apple shows that keeping keys secret really works.

I don't think console jailbreaks come from leaked keys (except the PS3's) but from ordinary exploits.


There have been several leaked keys for different systems. Remember, Blu-ray had the same issue five different times, in '06, '08, '09, '11, and '12. https://en.wikipedia.org/wiki/AACS_encryption_key_controvers...

Also, maintaining back doors would be a significantly harder problem than just keeping keys on a server used to sign things. Apple can, for example, keep the keys on an offline private server, but that's not really viable if tens or hundreds of thousands of people across multiple agencies of state, local, and federal law enforcement need access to break into things.

Worse you could quickly find nobody outside the US would be willing to use our software or services.


> So far Apple seem to protect it pretty well.

That we know of.

It's also important to remember that the only public trial against that backdoor was only for show, and despite not officially breaking it, the attacker got everything on the phone anyway.


To prevent backdoor abuse, you could have a shared key controlled by various parties.

A scheme such as Shamir's Secret Sharing - "An algorithm in cryptography created by Adi Shamir. It is a form of secret sharing, where a secret is divided into parts, giving each participant its own unique part, where some of the parts or all of them are needed in order to reconstruct the secret."

Source - https://en.wikipedia.org/wiki/Shamir's_Secret_Sharing
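For illustration, a minimal Shamir sketch over a prime field (Python; the prime choice, parameters, and function names here are my own, and any real deployment would use a vetted library with larger primes and a secure random source):

```python
import random

# Minimal Shamir secret sharing: a degree-(t-1) polynomial with the secret
# as its constant term, evaluated at n points. Any t points reconstruct it.
PRIME = 2**127 - 1  # a Mersenne prime, large enough for small demo secrets

def make_shares(secret: int, threshold: int, n_shares: int):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=3, n_shares=5)
assert recover(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
assert recover(shares[1:4]) == 123456789
```

The design point is that no single share holder (say, one agency) can recover the key alone; the threshold is the policy knob.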


Mathematically, you're correct. Unfortunately, real world systems have bugs, and even multiple parties can be compromised. Back doors are a bad idea.


But that's explicitly not a backdoor. That's straightforward: "If permission is granted from 2 opposing parties, the encryption can be reversed". And that would have to be added at ciphertext creation.

And the Bitcoin protocol has a similar thing, with escrow key permissions. Again, intended behavior, not some "super sekret backdoor".


If I'm the encrypter, and I don't want the two parties to read it, that's a backdoor.


That's being rather obtuse. I can easily parse that the contents are encrypted to X keys. I'm thinking of something obvious like GPG here.

I would accept the idea of a backdoor IFF the encrypting program hid the fact that it also encrypted to an escrow key without your knowledge, and kept that a secret.


No, I'm not being obtuse. I understand what you're saying; I believe that I understand why you think it's reasonable; and I totally reject the claim that it is not a backdoor. I don't care how many keys there are, and I don't care what the underlying technology is. If a group of people, no matter how many (other than the sender and receiver) can choose, by a designed feature of the encryption, to decrypt the communication, that's a backdoor, by definition.

And I assert that, no matter how carefully designed, a backdoor is always a bad idea.

Having two parties that have to agree is a mitigating feature. It makes it less bad. It doesn't make it good, though.


I've thought of systems like this as well.

To be clear, I do not want James Comey, Donald Trump, Joe Biden, or any "heroes" or "villains" of our power structures to be given any access to my encrypted data, whether they call it "lawful" or not.

That said, I don't think the "all backdoor implementations will inevitably have bugs" argument holds water in all cases, so this battleground will move to a philosophical one based on reputation and the ideals we want to uphold. This means we can't just keep shouting, "you're too stupid to make a sufficiently secure system" at the government over and over again.

So what would an effective and "secure" backdoor system look like?

Say the manufacturer creates a device with a protection similar to Apple's Secure Enclave, where there is a key i burned into the system at manufacture, which is not accessible in any way after it has been programmed in.

A second, independent, random key j is created at the same time, and the value i' = "i xor j" can be retrieved from the device with physical access only, e.g. by programming i' into a separate memory section inside the IC itself that is not accessible on the data bus while the system is running normally; it is only powered on by the hardware if one of the microprocessor's pins is jumpered at system boot.

Now, to retrieve the value i, you need both i' and j, and you can only get i' with hardware access. So even if the database containing all the j-values ever created gets compromised, nobody can break into your phone without physical access.

Now, of course, if the list of j-values got published, that would be a huge embarrassment for the US, but it would not mean that everybody's bank account and private emails would be accessible to every script kiddie overnight.

This is not a 100% risk-free solution, but it is exactly the kind of so-called "balanced" approach that the government is going to try and sell us, because it does actually ensure that the gub'mint can't break into your encrypted data, at least without physical access.
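A minimal sketch of the split-key scheme described above (Python; the 32-byte key size and variable names are my assumptions, not from any real design):

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# i: device key, burned in at manufacture, never leaves the chip.
# j: escrowed in the government database.
# i': readable only with physical access to the hardware.
i = os.urandom(32)
j = os.urandom(32)
i_prime = xor_bytes(i, j)

# A leak of the escrow database (all the j values) alone reveals nothing:
# j is statistically independent of i. Recovering i requires both pieces.
assert xor_bytes(i_prime, j) == i
```

This is just the two-of-two case of secret sharing: compromising either half of the split, the database or the hardware, gains the attacker nothing by itself.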

So the argument to actually use against Jim Comey is simply to point out that the rest of the free world won't require this of their manufacturers, so nobody outside the US will ever buy an Apple product again, since they won't feel any US manufacturer can be trusted, even if the system is technically secure.

Also, any terrorist can still create an unbreakable crypto system by running custom software on a $20 Raspberry Pi, and good luck getting ISIS to cooperate with a lawful warrant for assistance in decrypting their operative's device.


The "it will be found and abused" argument is wrong on so many levels. Let's start with the facts:

- All modern software in fact has a public, well-known backdoor.

- That backdoor is called the private key for software updates.

- Despite being a well-known feature, no criminals have abused it so far, and we consider it completely safe.

- The argument of "it can't be done" is actually null and void. It was already done and even implemented.

- Why do you trust those companies more than your government? I don't see how a corrupt cop is more likely than a corrupt developer. Being a developer obviously makes your point of view biased.

- Where the government can be limited by whatever laws you choose to implement, those companies and their security practices are entirely up to them.

- It's not a question of whether it's possible; it's a question of whether it should be done, and if so, how.

- Open source is not much more trustworthy. At the end of the day, most sane people do not recompile every binary, and with CVEs and security patches coming all the time you still need someone to provide the updated binaries.
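The update-signing point can be made concrete. A minimal sketch (Python): the key name is a stand-in of mine, and real update systems use asymmetric signatures (e.g. Ed25519) rather than the HMAC used here for brevity, but the trust structure is the same: whoever holds the signing key decides what a device will accept.

```python
import hashlib
import hmac

# Hypothetical vendor signing key; in practice guarded in a vendor HSM.
SIGNING_KEY = b"vendor-private-key"

def sign_update(payload: bytes) -> bytes:
    """Vendor side: produce a signature over the update payload."""
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()

def device_accepts(payload: bytes, signature: bytes) -> bool:
    """Device side: install only updates bearing a valid vendor signature."""
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

update = b"firmware v2.0"
assert device_accepts(update, sign_update(update))          # vendor-signed: accepted
assert not device_accepts(b"malware", sign_update(update))  # tampered: rejected
```

Note, though, that this "backdoor" only lets the key holder push new code, visibly, to devices going forward; it doesn't silently decrypt past traffic, which is where the analogy to communications backdoors breaks down.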


P.S. If you really are overly paranoid about the government, you should've assumed the NSA already has all those private keys, but reserves them for the most critical cases, and only a select few know about it. You would also assume the Russians / Chinese have access too, because you're overly paranoid and everything can be breached, and they don't care about these companies being American.


One thing is an easter egg for enabling debug features to help diagnosis; another is a super-user backdoor, which in my opinion should be legislated: i.e. if it exists, it should be disabled by default, or carry a big warning (a label on the box, a removable sticker, or whatever), so you can choose before the purchase.


Average joes probably aren't reading things on an obscure Medium blog that gets linked on Hacker News. This community has read, in depth, similar writings by Schneier and the rest.


But those who read this here might share it with five average joes, perhaps one of which might really get upset about the situation and call their congresscritter. If this argument is presented in several different formats and enough average joes hear about it and get upset enough to act on it, then maybe--just maybe--we can fight this bill off, again. And perhaps, over time, public awareness will grow enough that Jim Comey eventually stops making flawed arguments and begins working with the system instead of trying to game it.


I think it's naive to believe that the FBI (or any other intelligence or law enforcement agency) will ever be willing to see their own power checked. The way to check the power of those agencies is by having Congress pass exceptionally strong laws that explicitly forbid them from abusing their power.


I think the immediate purpose of this article is not necessarily to mobilize the average joes to collectively demand better security law. Mobilizing the average joes is less a problem of theory and more a widespread organizational effort. To have that effort in the first place though, you need people willing to take a leadership role who deeply understand the theory behind their actions.


I shortened it from a three minute read to a one-second one: Someone will find it.


And publish it. And then everyone will have it.



