FBI Wants It to Be Impractical to Deploy Strong Encryption Without Key Escrow (rietta.com)
293 points by rietta on Mar 16, 2016 | 179 comments



Remember how New York’s (physical) master keys became easily accessible[1] despite the fact that they were supposedly so carefully managed? All that effort, all that trouble, and now not only is there essentially no security at all but the master keys created a security hole that did not need to exist.

The security of encryption is similarly proportional to the security of keys. The fewer things you have to secure, the easier it is to keep them secret. The “master key” concept in New York only served to create something of great value that people wanted to acquire, and massively increased the risk when that fell into the wrong hands. Obviously the same thing could happen with an encryption key, except it is worse because you don’t even have to be in the same country as the source of the key to acquire it or use it.

[1] http://www.nydailynews.com/new-york/pols-public-outraged-sho...


Interesting article. I find this (quite popular) type of argument amusing:

> "This is a serious security breach," said Councilman Peter Vallone (D-Queens), who heads the Council's Public Safety Committee. "We know terrorists are planning to attack our subways, and the MTA and NYPD better find these magical morons quickly, and then make them disappear for a year in jail."

Like the only thing between terrorists and the subway platforms is that they can't afford a ticket, so they have to go through the staff gate.


But that is so revealing of the mindset.

  * We can have strong encryption just for the good guys
  * We can have master keys that only approved staff will use
  * We can block all the bad things on the internet and it'll be like they don't exist
  * If we have a back door into an encrypted device, only the good guys will use it
It's like no politician ever read about crime. Staff can't be blackmailed or bribed. No one working for the govt ever used resources for their own ends. No govt agency ever tried to blackmail an inconvenient public figure. Because a terrorist once made a terrible attempt at a bomb in his trainers, quick, we'd better make millions of people remove their shoes at airports. Clearly no terrorist has the wit to try a different approach.

The list is so damn long, yet people who really should know better get it wrong with depressing regularity. Are most politicians really that stupid? What are their advisers advising FFS?


It's a huge lack of systems thinking. It's like they believe that the universe somehow cares about what they were trying to accomplish.

* If we reward schools for increasing student test scores, then we'll have better schools.

* If we fund a "war on drugs", we'll reduce the damage drugs do.

* If we enact rent controls and mandate the construction of below-market rate housing units, it'll help people afford housing.

This belief - that "having a goal and doing something that pattern-matches to helping" works - is incredibly dangerous in policy-makers. It's also incredibly difficult to fix, since the incentives for politicians are to make rationalizations that are convincing to voters, and that kind of reasoning is much easier to convey.


I don't know about the universe; I think it's rather a weird sort of overly-strong faith in the capacity of other humans. Legislation at every level seems to treat the people implementing and enforcing the legislation as a fleet of magic genies, who will Do What You Mean and achieve exactly your goal with no side-effects.

Though also, do note, if you apply systems thinking to legislators, establishing that they're "doing something that pattern-matches to helping" is exactly in line with their current incentives. You need much more direct oversight to give them an effective feedback loop for the consequences of their actions, rather than just the "optics" of the actions themselves.

Things like https://www.trudeaumetre.ca/, but done by an opposition-minded third party who won't give them an inch but won't lie like campaign ads, would be a good start. Ultimately, something like prediction markets on the upholding of each elected candidate's campaign promises during their tenure would be amazing, in the sense that sufficiently-large bets would be a form of lobbying without lobbying.


Thanks for your comment about applying systems thinking to legislators. It much more clearly captures the point I was trying to make about political incentives.

The best ideas I have for fixing political structures are weird utopia-ish things that you can't get to from democracy. Things like government-by-prediction-market, which would distribute decision-making to people who think they have a better idea than everyone else and pay for performance.


> If we enact rent controls and mandate the construction of below-market rate housing units, it'll help people afford housing.

To be fair, this does in fact help many people afford housing. It just doesn't fix the endemic issue.


These are political favors to local interest groups. It actively hurts the ability to find housing for everyone not blessed by the powers that be.

Dense housing is affordable housing. When there's an empty apartment that you're moving into, where do you think the old resident moved to? There's only a few options - a new place got built that they moved into, the old resident died, the old resident moved out of the housing area, or the recursive option that cashes out into one of the previous ones. If you stop building units ("affordable housing" mandates) or arbitrarily keep low-income residents from making housing bids (rent-control), the only bids for housing are going to be from tech workers and market-rate housing gets ridiculous.

Fundamentally, the problem is that there's X people who want to live in the area, and only Y housing units. The price is going to go up until the market clears. Handing out political favors so that the people who vote for you don't feel this reality isn't solving the problem: only building more housing will.


> the problem is that there's X people who want to live in the area, and only Y housing units

I don't think this is the same problem you would address by rent control.

> Handing out political favors so that the people who vote for you don't feel this reality isn't solving the problem: only building more housing will.

And yet, there are many people living in rent-controlled housing who would disagree with you. The person and problems you're painting do not exist outside your head.


>I don't think this is the same problem you would address by rent control.

Of course not. Rent control is fundamentally a tool for getting political benefits at the expense of economic ones. It explicitly creates winners in the voting district that implements it at the cost of losers everywhere else. It's in the same moral class as dumping toxic waste into a river.

>And yet, there are many people living in rent-controlled housing who would disagree with you.

I'm not at all surprised that the beneficiaries of political favors support handing out those favors.


> Rent control is fundamentally a tool for getting political benefits at the expense of economic ones. It explicitly creates winners in the voting district that implements it at the cost of losers everywhere else. It's in the same moral class as dumping toxic waste into a river.

I'd love to hear the argument for this. Again, many people living in rent-controlled houses would disagree. Just because people are poor you cannot write them off as votes.


http://www.econlib.org/library/Enc/RentControl.html

>Economists are virtually unanimous in concluding that rent controls are destructive. In a 1990 poll of 464 economists published in the May 1992 issue of the American Economic Review, 93 percent of U.S. respondents agreed, either completely or with provisos, that “a ceiling on rents reduces the quantity and quality of housing available.”


Ok, so economists agree that "a ceiling on rents reduces the quantity and quality of housing available". Thankfully, economists don't run our country. How does this support the statement "Rent control is fundamentally a tool for getting political benefits at the expense of economic ones."? Quantity and quality of housing are arguably much less important than ensuring there is a viable working class—just see how terrible a place SF or Manhattan is to live.


There's two different questions that policies should answer - what sort of stuff do we get out of it, and who gets that stuff. These are often referred to as allocative and distributive, though I'm remembering the precise technical wording from memory so I could be wrong on that.

That's the distinction that drove me to make the political versus economic benefits distinction. Rent control as a whole makes things worse - there's less housing, it's of worse quality, and it drives up the cost of market rate housing. On the whole, it's a bad allocation of resources: we'd much rather have cities with more housing that's of higher quality (because people are more than willing to pay for it). The problem is that this sort of policy creates winners and losers, and the losers are often the ones who get to set local policy.

There's two solutions - have a ban on rent control at a higher jurisdictional level (Washington State does this), or bribe current residents to lift rent control. The latter should make everyone better off - we pay for keeping the working class viable with the overall economic gains made by lifting rent controls.

Anyhow, my overall point is that we should hand out the who-benefits stuff without making society worse off as a whole, simply because the locals who happen to live in an area are able to hold overall policy hostage.


>pay for keeping the working class viable with the overall economic gains made by lifting rent controls

The economic gains from lifting rent control are going to go to slumlords, not get dispersed to everyone. Also, housing isn't an elastic supply market. When rent goes up, it doesn't necessarily mean the quantity of housing will increase as a result, since housing is often limited by zoning, drainage, and many other things besides the profitability of rentals. And when you consider the destabilizing effect of people having to move every few years due to rental price fluctuations, it's not really a clear-cut benefit to "bribe current residents to lift rent control", if that were even an option.

>we should hand out the who-benefits stuff without making society worse off as a whole, simply because the locals who happen to live in an area are able to hold overall policy hostage

But this is how democratic communities work... local communities decide what is 'making society worse' where they live; as they should, since it obviously affects them the most.


>local communities decide what is 'making society worse' where they live; as they should, since it obviously affects them the most.

Local optimization is not the best strategy, even according to those doing the local optimizing. Both participants in a Prisoner's Dilemma are better off if they hand their decision-making over to a third party, conditional on the other person also doing so. The Bay Area as a whole wants affordable housing; people just don't want to make the Bay as a whole better off by making unfair sacrifices in their own neighborhood.

It's a classic coordination problem, caused in large part because decision-making is too local compared to the regional benefits of housing construction.
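The Prisoner's Dilemma referenced above can be made concrete with the standard textbook payoff table (illustrative numbers, not from the thread; "C" = cooperate, "D" = defect):

```python
# Payoffs as (my utility, their utility); higher is better.
payoff = {
    ("C", "C"): (3, 3),  # both cooperate: good outcome for both
    ("C", "D"): (0, 5),  # I cooperate, they defect: I get exploited
    ("D", "C"): (5, 0),  # I defect, they cooperate: I exploit them
    ("D", "D"): (1, 1),  # both defect: bad outcome for both
}

def best_response(their_action):
    """My payoff-maximizing action given what the other player does."""
    return max("CD", key=lambda my_action: payoff[(my_action, their_action)][0])

# Defecting is the dominant strategy no matter what the other player does...
print(best_response("C"), best_response("D"))  # D D
# ...yet mutual cooperation beats the mutual-defection equilibrium:
print(payoff[("C", "C")][0] > payoff[("D", "D")][0])  # True
```

This is the local-optimization trap: each neighborhood's individually rational move ("D": block construction nearby) produces a region-wide outcome everyone likes less than coordinated cooperation.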


You support his point. It is a systemic issue, political patches do not work.


I don't really see many people claiming that rent control is going to solve rent for everyone. Seems like a blatant straw-man fallacy.


* If we fund a "war on climate change", we'll reduce the damage of (or hey, even stop!) climate change.


That's what being politician means. This job isn't about solving problems, it's about making an appearance that you solve them.

Suppose a terrorist did something with his shoe that he wasn't supposed to. Here is how it should be solved:

"Based on our research, the probability of dying because of the terrorist attack is XXX, which we can reduce by YYY using blah-blah, therefore we should allocate ZZZ funds in that area."

Would you vote for this guy? Well, most people won't even understand what he is talking about.

So here is what happens in practice:

"Okay, I don't know how to decrease national debt or prevent poverty, so let's find something else. Oh hey, terrorism! I know how to solve it. Forbid shoes at the airport. Sounds simple enough. So we can say 'we successfully fight terrorism'. And just in case people don't care enough, make this issue the most important one and add 'hey, think of the children' into the mix."

So the general population becomes scared of terrorism 'cause guys running for office told them they should be.

It's not politicians who are stupid, it's people who vote for them.


They are not stupid. They are just driven by the wrong set of incentives - one which emphasizes short-term local success: being able to see the terrorist's phone, maybe find something there, get a good press release and a promotion, and maybe find some other terrorist (that rarely happens, but has to be at least part of the law enforcement motivation) or drug seller (that happens much more often and is really what it's all about). Long-term problems do not figure much in that calculation - if the system gets corrupted and ruins security, it's not their fault; they didn't do it, they just did their job. And note that with the drug war going on, which law enforcement is losing miserably, and a terrorist threat to which they don't really have a good answer (not their fault here - nobody does, at least not in the enforcement realm), they may not feel they have the luxury of long-term planning. Now add mass media, which provides extremely sensationalistic short-term reporting with almost no long-term focus, and you get the perfect collection of incentives driving this behavior.


Change it up:

  * We can have guns just for the good guys
  * We can have guns that only approved staff will use
  * We can block all the bad people from having guns and it'll be like guns don't exist!
  * If we have guns, only the good guys will use them.

Seems like any dangerous technology can follow this mindset. :P


If only good guys used guns, nobody would need them.

It actually is "we know bad guys already have guns, so better if good guys have them too".

Same with crypto, btw.


That's not completely true. If bad guys didn't have guns, we'd still want good guys to have them. I don't want police having to take on a guy with knives and a baseball bat, themselves only armed with knives and a baseball bat.


I do. The police shouldn't have access to weaponry that citizens cannot obtain except in very specialized circumstances, like say a particular unit in a city like Detroit or Chicago.


Don't make the mistake of assuming that the police are always the good guys.

You actually can't assume that anyone is always going to be the "good guy", or that anyone else will always be a "bad guy".

And if you could program a smart gun to have a sense of morality, capable of judging between appropriate and inappropriate uses of lethal force, you no longer need a human to carry it around, do you?

If the bad guys can't have guns, good guys can't have them either, because sometimes they are exactly the same people in different circumstances. That's the big hole in good guy vs. bad guy reasoning.


Why not? It works.

Cops without guns: https://youtu.be/cX5CPx4RKWw

Cop with gun: https://youtu.be/RdoeBXt06Bc


Those cops were very brave, but you're going to be hard pressed to find even a tiny minority of Americans willing to put cop lives in that kind of danger.


The British approach is safer for both officers and the public. By emphasising containment and de-escalation, British officers avoid the kind of chaotic and unpredictable confrontations that lead to fatalities.

Only one British police officer was killed in the line of duty last year - PC David Phillips, who was run over during a pursuit. In the same year, eight American officers were killed by vehicular assault, 36 were fatally shot, three died after being assaulted and two died of accidental gunshot wounds. In 2014, no British police officer was killed on duty.

Even accounting for the difference in population and the prevalence of firearms, there is a substantial disparity. Britain does have armed criminals, but our police are not routinely armed.

There were riots across Britain in 2011, precipitated in large part by the fatal shooting by police of Mark Duggan. Only one other person was shot by the police in 2011. In the same year, the FBI estimated that 400 Americans were killed by police officers, but no official count exists.

There is considerable interest amongst US police forces in learning from the British approach, as shown in this documentary:

https://www.youtube.com/watch?v=66pr23xUKZc


And the police of Britain are so anaemic that they let _THOUSANDS_ of children be abused rather than risk confrontation.

You are not selling me.


British cops do that quite regularly.


Government agencies & the military have to use computers. Those devices either will be secure or they won't be. How does the CIA, NSA, and DoD feel about that?


The system is concerned with legibility to itself first, more than reducing crime. Without legibility, there is no avenue to even be effective through! Specific instances of staff being blackmailed or bribed are simply addressed through the traditional channels of power, and thus creation of their possibility is not a problem.


Yes, the good guys will use it in many ways that were never intended....

http://www.nytimes.com/2014/07/21/us/politics/edward-snowden...


It's basically the "us vs. them" attitude - i.e. dehumanizing the "enemy". As soon as everyone realizes that it's just "humans vs. humans" we'll get more intelligent discourse, and, perhaps, some sort of actual progress.


> Are most politicians really that stupid?

They aren't stupid, they just don't share your interests.


"It's like no politician ever read about crime. Staff can't be blackmailed or bribed. "

More likely they know their audience (most voters) don't know about crime, terrorism and encryption to know how bad the plans are.


> Like the only thing between terrorists and the subway platforms is that they can't afford ticket, so they have to go through the staff gate.

Obviously, terrorists must always break all laws in the process of committing terrorism. They have to illegally park their cars, jaywalk to the location, commit some acts of public indecency on the way, litter at random, and pop their heads into a crowded theatre just to yell "fire!"


And terrifyingly enough, the Council's Public Safety Commitee said to this: "We know terrorists are planning to attack our subways, and the MTA and NYPD better find these magical morons quickly, and then make them disappear for a year in jail."

So, clearly the solution is to JAIL the users of the keys, instead of actually replacing this with a better system.


Although not in favor of key escrow systems, I'm not convinced that they cannot be implemented in a secure fashion. There are fundamental differences between physical keys and digital keys that make analogies between the two unconvincing to me.

With a physical master key, all it takes is one possessor of a copy of the key to agree in order to get access to what the key is protecting, or compromising one possessor.

With digital data, such as an encryption key, an escrow system could be combined with a secret sharing system so that you need to gain access to multiple shares before you can get the underlying encryption key. With a sufficient number of shares required, distributed among a large enough number of independent escrow agents in different legal jurisdictions, the probability of someone getting your key illegitimately via the escrow system can be made arbitrarily small.
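The k-of-n threshold idea described above is exactly what Shamir's secret sharing provides: a degree k-1 polynomial whose constant term is the secret, evaluated at n points, any k of which recover it. A minimal sketch (a toy over a demo-sized prime field; real escrow would need authenticated shares and far more care):

```python
import random

PRIME = 2**127 - 1  # Mersenne prime; field must exceed the secret

def make_shares(secret, k, n, prime=PRIME):
    """Split `secret` into n shares such that any k recover it."""
    # Random polynomial of degree k-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares, prime=PRIME):
    """Lagrange interpolation at x=0 yields the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        # pow(den, -1, prime) is the modular inverse (Python 3.8+)
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

shares = make_shares(123456789, k=3, n=5)
assert recover(shares[:3]) == 123456789   # any 3 of 5 suffice
assert recover(shares[2:]) == 123456789
```

With fewer than k shares, every candidate secret remains equally likely, which is the information-theoretic property the parent comment relies on: no subset of escrow agents below the threshold learns anything.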


At best you are only going to be able to patch the security hole that having key escrow created in the first place. You can never do better than that, and it's quite likely that the system would be less than 100% secure in practice.


People challenging this anti-crypto movement should really push this example because it is really perfect. Before we consider key escrow, please explain exactly how physical key escrow was breached in New York and tell us how that will be prevented when the key in question isn't even something you have to physically get.


This should always be presented as an additional argument, though, not the only argument. Because an argument based solely on securing the backdoor implies that if it could be kept secure enough, there's no other reason not to do it.

Even if we could somehow guarantee absolutely that the backdoor will only be used by the intended users, backdoors are still not acceptable. For much the same reason that it's not acceptable to put a camera in every home or car, even if the footage were "only" accessible by the government with a warrant.


The answer to the question raised by the New York key escrow example is that it is impossible to guarantee a backdoor is 100% secure and only accessible by the intended users.

I'm not sure the camera analogy will carry weight. It sounds hyperbolic to most people as it tries to equate actually being recorded in your private affairs to the possibility of materials being accessed when a warrant is issued.

Most people have no problem with law enforcement accessing evidence when a warrant is issued. The only way to win over the general population is to demonstrate that encryption is fundamentally different than simple access.


Something similar happened with the playstation 3 and its master key.

https://en.wikipedia.org/wiki/PlayStation_3_homebrew


Why? They're not a taxi company.


> The fewer things you have to secure, the easier it is to keep them secret.

Except when this approaches zero security. The line is thin, and real expertise is needed to find the sweet spot. I've seen entire corporations deploy apparently bulletproof security (think Google's data centres) but fail to use DNSSEC or to background-check their security guards.

Avoid weak links like these; they are bad for business.


So... the FBI is essentially arguing we should all keep our doors unlocked because they have had to do some investigations in the past where they came to a home that was locked and it was hard for them to enter the home.


The FBI wants a giant warehouse that houses a copy of every house key, but we don't need to worry, because no one will ever manage to break into the warehouse.


And everyone with legitimate access to the warehouse will be 100% trustworthy no matter what for the entire span of time they are granted access, and nobody without legitimate access will ever be allowed in, even someone like a co-worker of someone with access, and even under the full supervision of someone with legitimate access.


An even stronger guarantee: That the definition of trustworthy and "good guy" are unchanging, and even in a dystopian future where a rogue actor is in control of government, those keys are safe because they understand the morality of the people who created them.


I'd say something to invoke Godwin's law, but it seems pretty clear that genocidal governments that sweep into power love having access to copious amounts of detailed records.


They don't even have to be genocidal; they could just be Richard Nixon.


Or Donald Trump.


The connection between outlawing maths and burning books doesn't take that much of a mental leap.


> And everyone with legitimate access to the warehouse will be 100% trustworthy no matter what for the entire span of time they are granted access

Of course. These are upstanding FBI agents we're talking about. There's no way one would ever cheat or steal or blackmail anyone.

http://www.cnn.com/2015/03/30/politics/federal-agents-charge...


It's worse than that. They want to ship those keys to their agents, only they'd arrive instantly and the only authentication is probably a password (their kid's name and birthday) with no human intervention.


I hate how we always have to tiptoe around saying that the primary threat would be a non-government actor breaking into the warehouse. I'm much more worried about the FBI itself having the keys.


With the key inside the warehouse, no one can get in.


Or that you can get a lock on your door, so long as it's one of a short list of locks where they have a spare key.

This has been tried, and works about as well as you might expect: https://theintercept.com/2015/09/17/tsa-doesnt-really-care-l...


The only reason they aren't actually advocating for exactly that is because it's easy for them to break into a locked home. If there were some new technology that made breaking into a house almost totally impossible, this is exactly what would happen.



The FBI already has access to pretty much any locked door they want if they get a warrant. Their problem is that warrants don't work against encryption.


The front door is a less useful metaphor than your safe. We have safes in addition to locked front doors because it's accepted getting into the house is generally not that hard, whether you be law enforcement or a criminal.


A warrant is just as effective against nearly all safes. I have little doubt that if a safe was in FBI custody as long as the San Bernardino shooter's phone has been, the FBI would have been able to legally and physically get whatever was inside.

The FBI simply wants the digital world to mirror the physical world and a 100% unbreakable physical lock is almost impossible to produce in the physical world. It is easy to see why they would want that to be true in the digital world.


In this case, the digital world mirrors the physical world. Since this phone doesn't have a Secure Enclave, they could trivially break into it. The fact that they've said they can't makes them liars. Very public liars. And are liars considered trustworthy?

https://www.aclu.org/blog/free-future/one-fbis-major-claims-...

[EDIT for added link]


> the FBI would have been able to legally and physically get whatever was inside.

What if the safe has a self-destruct mechanism for the contents in case of breach?


And what if government agencies bought that safe to handle classified material because it was certified to have met or exceeded the standards for handling such data. Then the FBI forces the safe manufacturer to make a way to get into the safe. Now the other agency can come after the manufacturer for lying to them during the procurement process because there is demonstrably a way for an unauthorized person to break into the safe.


My parents worked for the NSA before there was an NSA. A guy came in to do a talk about how there is no such thing as security. He had a table with a bunch of common locks and even some safes. He proceeded to unlock each one in a matter of seconds to demonstrate his point.

Modern encryption however is changing the game. It's no longer true that security is just a matter of perception. It's becoming a reality and that makes for an interesting and bold new world we're entering.


Maybe, but the game is still the same. There are weak points for encryption just as there were weak points with the safe that he unlocked.



It's a secret - break codes and write code ;). While the NSA was started in the 50s, it was not widely known until the 70s: https://www.eff.org/nsa-spying/timeline


Getting in a safe is also not that hard if you're the FBI, I think. I'm pretty sure that every safe can be cracked in a day or two without destroying whatever is inside.


I wonder if anyone has explained to them there is this thing called open-source software. Sure you may be able to convince/force Apple to give you some sort of key escrow system but do you think you can convince the GPG developers?

If you implement key escrow and it's public knowledge that encryption systems that implement it are useless then people that actually want to hide stuff will simply use GPG and other uncompromised systems.

The only thing this sort of system is good for is enhancing the reach of the surveillance apparatus. That is, spying on innocent people. As for why they want to do this... no one knows, but it's awfully concerning.


I see this objection raised so frequently, and I feel it really misses the point badly.

The tech community tells itself that it won the first "crypto wars". You cannot win "wars" against governments in that sort of sense and the first crypto war was never actually won at all. I think in light of events in recent years we need to reinterpret the events of the 90's in a new light - the tech industry didn't win, rather, after realising how awful and worthless the software the cypherpunks produced really was, the government simply got bored of playing.

Nobody, and I mean nobody, gives one tiny shit about GPG. GPG is so bad, such truly unusable software, that terrorists would literally rather die or risk lifetime imprisonment than use it:

   http://privacy-pc.com/articles/how-terrorists-encrypt-threatscape-overview.html
Governments don't care about GPG now, they don't care about some theoretical open source program that you could install from abroad, they only care about the encryption their adversaries actually use which - given that 99.9% of the FBI's adversaries are not crypto experts - turns out to be whatever ordinary people are using automatically thanks to tech companies switching it on.

This is especially true because often people don't meticulously plan crimes out ahead of time: they either commit crimes of passion, or they make basic mistakes. So if you have to plan ahead and convince not only yourself, but all your accomplices, all to install some exotic and awkward to use piece of technology ... well, a lot of bad guys won't do it.

So. If the FBI succeeds in breaking the encryption used by Apple, Google, Microsoft, Twitter, Facebook and a few other big names, then they've got 99% of the guys they want.


GPG was just an example. Other, more widely used end-to-end open-source systems like Signal are the same: it's not exactly easy to insert a backdoor into a system that is a) open and b) completely distributed, with no central key authority.

Maybe it still misses the point, because the FBI doesn't actually care about hitting hard targets. If that is the case, that is pretty sad.

Average crimes can be solved with average tools; we shouldn't be authorising access to phones and other electronic intercepts for crimes that don't go beyond average.

You could argue the San Bernardino case was beyond average, and you would probably be right. But the perps knew that too; that is why they destroyed the phones after they were done. Chances are they took other measures as well, but no one will know, because they destroyed the devices.

To be clear, it's not that these laws wouldn't make their jobs easier - they almost certainly would. But they aren't needed, and implementing them would have zero effect on the genuinely hard targets they would, in theory, be useful for neutralising.


> Other end-to-end open-source systems are more widely used like Signal are the same, not exactly easy to insert a backdoor into a system that is a) open and b) completely distributed with no central key authority.

If Signal weren't available from any of the major app stores, such that it didn't work on iOS at all and didn't work on Android devices without turning on the intimidating option to allow non-Play-Store apps, how much usage do you think it would get compared to today?

Making real security hard to get would cause far fewer people to actually use it.


> they only care about the encryption their adversaries actually use which - given that 99.9% of the FBI's adversaries are not crypto experts - turns out to be whatever ordinary people are using automatically thanks to tech companies switching it on.

Of course, this is also the technology that 99.9% of U.S. federal staff use too. Probably more like 100% counting personal use of technology.

Across the board, enterprise security today is dependent on the security of consumer technology. Unfortunately, a lot of people in the federal government don't get this; they still think there is a difference between "secure federal technology" and what everyone else uses.

There's not even a Cabinet official charged with enterprise security for the federal government, so there is no one to even speak up. The FBI and intelligence chiefs are all very high-profile, and they are primarily concerned with access, so the federal conversation seems skewed. Which it is--it matches the skew of the federal mindset in general. The best technologists go to the breaking and entering teams. Meanwhile OPM loses 22 million accounts.


Yup.

By analogy, the lawyers can outlaw six-egg omelets with butter and orange flavor. So, then McDonald's won't be able to sell them, but I can still make them in my own kitchen.

The lawyers can outlaw strong encryption on products from Apple, Google, Microsoft, etc. and, then, crooks who use those products can more easily be caught, and that will amount to nearly all the common crooks. But I can still get some simple, open source C code for some simple command line RSA or PGP de/encryption and use it for secure communications with others who do the same. And serious people will, and likely do.

I.e., just get the open source code for RSA from Schneier's book or look at the open source code in Zimmermann's PGP. Or just read Schneier's book, write your own code, and make it open source for yourself and all the people you want to communicate with.

So, to send an encrypted message in a file, from a smartphone, tablet, laptop, desktop, etc., copy the file to an old computer, if only via diskette, running PC/DOS and never connected to the Internet. Run the command line C program for encryption. Get the output file, in just simple base 64. Then copy that file to the smartphone or whatever and send it, with no attempt at security. Done.

This way, it doesn't matter what Apple, Google, Microsoft, do/don't do since they are just moving base 64 gibberish that is perfectly safe even if printed in the NYT.

The command line programs? Easy enough for middle school children to use. Simple.

Math 1. Lawyers 0.

Now what is there to argue about?

So, all this stuff about the FBI is just the village idiot playing public pocket pool, right?


> Then copy that file to the smartphone or whatever and send it, with no attempt at security. Done.

They still know who you are communicating with. There are ways to communicate without disclosing with whom you are communicating.


If the worst-case scenario comes to pass and people get jailed for petty crimes based on evidence snooped from their personal electronic devices and social media accounts, they will realize that they actually do have something to hide and will look for alternatives to puppet companies. That's where open source and/or non-USA&Co originated software comes into play. See the role of Twitter and WhatsApp in recent mass protests.


Usage patterns can easily change for a variety of reasons. WhatsApp is hugely popular now, but tomorrow it might be something else not based in the US.

If they manage to write a law such that it bans strong encryption from the App Store, that will achieve their goal. But any other scenario is at best a temporary gain.


How many people can be bothered if it's not a turnkey solution?


Like I said it's not about the average person, it's about someone with something to hide. Those with something to hide will always go the extra distance.

The issue I take with the FBI approach is that it will have no effect on those that have stuff to hide but will destroy any semblance of privacy for those that don't.

Terrorists will use GPG, citizens will use their backdoored iPhone full disk encryption and everyone but the terrorists lose.


> it's not about the average person, it's about someone with something to hide

Keep in mind that the FBI would love it if everyone held that to be true. We all have something to hide, average person or not. No matter if you are a criminal, political activist, pervert, lawyer, priest, or just the baker down the street. Paraphrasing John Oliver: your banking statements, medical data, dick pics, private messages, dick pics, dick pics, and your secret diary are all things you most likely want to hide.

( https://www.youtube.com/watch?v=zsjZ2r9Ygzw )

The FBI would not mind a situation where only people with technological know-how use GnuPG, LUKS with dm-crypt, etc., and the rest use whatever came with their smartphone. Most criminals are not particularly tech-savvy, so if that group loses access to strong encryption by default, the FBI largely gains what it wants.

Of course there is no practical way to make the use of strong encryption by individual citizens around the globe illegal, and they know this. They may however succeed in outlawing strong disk encryption available by default on store-bought devices. So when a suspect uses an Android phone or iPhone, and the FBI has his or her device, they want to be able to access its contents without the suspect's consent. Ideally, they also want the most popular messaging platforms (like Facebook's WhatsApp) to have a backdoor available, so that the vendor can comply with warrants for such data.


That's a problem too. Right now people with something to hide don't stand out amidst a background of similar encryption. If they suddenly have to switch to something relatively more exotic, that alone would be a coup for people in signals intelligence, don't you think?


Yeah, definitely. This is currently the problem with Tor.

The people that need the protection Tor provides paint a target on their backs, because there isn't enough Tor usage for them to be inconspicuous. Which is sad, because for all of the bad usage of Tor, there are people that depend on it to preserve free speech, and any weakening of it could easily get them imprisoned or in many cases executed.


It doesn't have to look "exotic" at all: Sam just sends Joe a file in base 64. That is how all the multimedia content is sent in e-mail now. Until you dig into what Sam sent, it looks like just another JPG file sent in e-mail. But translate it from base 64 and give it to an image program, and you will discover that the file is not a JPG file. Or a PNG, BMP, MP3, WAV, EXE, etc. file either. But you just about have to translate it from base 64 to conclude this.


Oh. I misinterpreted what you were saying to mean "it doesn't matter because we have GPG."


>Always

I gave some sources in https://news.ycombinator.com/item?id=10582206 that might change your mind. Changing the default makes a difference.



Would it be possible to provide crypto as an open source "interface library" to commercial applications? So instead of the application doing the crypto (eg. Apple iOS) it would be farmed out to an optional library of the user's choosing.

Apple could make it easy for a user to install such a library and then say (truthfully) that the cryptographic functions of their OS are not in their hands, since that feature is handled by a third party open source maintainer.

At that point it would be an infinite game of whack-a-mole for the FBI to try to get backdoors in open source crypto interface libraries which could be maintained outside of the US.


A whack-a-mole game the FBI has no chance of winning.

How? Keep the de/encryption software simple, dirt simple: just open source C code, run as a command line program on, say, an old PC/DOS system with no hard disk. For encryption, put the file to be encrypted from, say, a smartphone onto a diskette, give the diskette to the PC, erase the original file from the smartphone, have the command line C code on the PC do the encryption and write the results as a base 64 file to a diskette, and, with no effort at all at encryption or security, let the smartphone read the diskette and send the file. Simple.

Just keep it simple, just dirt simple, really small, open source code.

The de/encryption has to be, what, just a few loops in C, in a few hundred lines of code? The rest is just dirt simple C file I/O, one byte at a time. So you very much do not want some 100,000 line app with the de/encryption buried deep inside somewhere. And you want the de/encryption run on some other device, maybe an old PC/DOS machine with no hard disk and no network connection. Or get a Raspberry Pi running some simple operating system, where you can be sure that no important data will be left on the computer.


Only terrorists.


Just those that have a lot to hide, such as the terrorists the FBI wanted to catch in the first place.


It has been explained to them multiple times in congressional hearings. If you have a long time, the recent House Judiciary Committee Hearing is worth the watch:

The Encryption Tightrope: Balancing Americans’ Security and Privacy https://www.youtube.com/watch?v=g1GgnbN9oNw

There was another one that I watched back in June, but I cannot find the video at the moment. Should have bookmarked it!

In the rietta.com article, I linked to this hearing, with Susan Landau's testimony starting at https://www.youtube.com/watch?v=g1GgnbN9oNw&t=3h35m50s. But there are more must-watch portions of the hearing.

I like GnuPG and I use it regularly with work as does my team. And some of our clients do too, but not nearly enough. But my mom and dad are never going to use it. They do both use iPhones though. So being able to protect them is important.


Sort of a bummer that lawmakers don't have a better understanding of encryption in general and what it protects. They'd condemn hackers breaking into phones/accounts and stealing important notes/pictures, but turn around and condemn the very technology preventing that from happening to _everybody_

Anybody here want to run for office and be a voice for tech rights?


It might help if one of them gets their personal information leaked because they used unencrypted technology in their private lives. I do suspect this won't have any significant influence since they think the government will prevent the escrow from being hacked.

Part of me hopes that there will be a significant data leak from whatever the NSA has stored. Maybe they'd realize that single point of failure sucks.


There has been a significant leak of NSA data - that was Snowden. There have been significant leaks of the personal data of most (all?) applicants for and holders of security clearances from OPM servers. So the government is incapable of setting up systems to secure itself. If backdoor access is required, it destroys the incentive for private industry to develop improved technology for privacy (is the argument that our technology is better going to sell when a back door is always available for access?). If better private technology were available, the government breaches would have been less likely, and more commercial breaches would be stopped.

We should be careful when the FBI is advocating a capability to help investigate events which occur vanishingly rarely (and for which they have many other tools and avenues) versus the integrity of infrastructure which affects a wide swath of public security. When one looks at the entire balance, requiring a backdoor has a high chance of locking us into a much less secure state of affairs than if everyone is free to pursue and apply the best privacy/encryption measures possible.


> It might help if one of them gets their personal information leaked because they used unencrypted technology in their private lives

This has already happened. An attacker going by the name Guccifer spent several years going around punking high ranking officials and ex-officials. Remember those goofy paintings of GW Bush's that hit the news a few years ago? Those were stolen and posted by Guccifer.


I'm not convinced that a lack of understanding isn't borne from a lack of giving a rat's ass to begin with. Why learn about something when your vote and influence are already bought and paid for?


> a lack of giving a rat's ass to begin with.

Agreed...

> your vote and influence are already bought and paid for?

Overly cynical. The standard whipping boy for 'buying politicians' is Big Business, and Apple certainly qualifies as that. In fact, this is an affront to essentially every big business in the world with IP to protect.


>In fact, this is an affront to essentially every big business in the world with IP to protect.

it really isn't, because many of the biggest businesses want a greater degree of population control, and having access to every individual's info is a means to that end.


It seems a bit of a reach to say that government is demanding the tools to build a surveillance state because of unnamed "big business" wanting population control. That seems to only happen in bad sci-fi and Internet comments.


And all of those contractors that the NSA and the FBI work with, they are just too poor to even think about lobbying for more work in the shadow of Apple's influence. /s

Big business is on both sides of this conflict.



Both Bernie and Hillary have been vague about their stance on having encryption backdoors, almost to the point of implicitly supporting encryption backdoors but not wanting to outright say it. Bernie's message about privacy isn't necessarily incompatible with encryption backdoors, at least not through the lens of political speak and all of its half-truths.

disclaimer: bernie supporter, and not a supporter of encryption backdoors


Encryption policy will ultimately be resolved by the legislature and not the president. The only presidential candidate who was on the record as explicitly against encryption backdoors was Rand Paul, but that ship has sailed:

http://www.washingtontimes.com/news/2015/nov/13/rand-paul-sa...

> “The head of the FBI came out with this recently. He says, ‘Oh, we’re going to ban encryption.’ And it’s like we want to build a backdoor into Facebook and a backdoor into Apple products,” the presidential hopeful said at the Yahoo Digital Democracy conference this week. “A backdoor means that the government can look at your stuff, look at your information, your conversations. … The problem is, is that the moment you build an opening — and I’m not an expert on coding or anything — but the moment you give a vulnerability to a code that someone can get into your source code, not only can the government, but so can your enemies, so can foreign governments.”


The issue is that the NSA, FBI, etc. are essentially controlled by the President. These groups have had, and will continue to have, extremely creative interpretations of the law, to the point of essentially ignoring it and doing whatever the fuck they want without repercussion. The legislature can pass whatever it wants, but three-letter orgs will continue to have secret courts, secret laws, and secret operations that operate outside established public law.


The first sentence is a very naive point of view. Most people who worked there argue that it is the other way round.


The directors for these agencies are appointed by the President. He hand picks them, and has the ability to fire them with adequate cause, of which there is recent precedent. I am not sure how you can argue they are not controlled by the President, unless you want to try some sort of secret-blackmail theory.

However, it isn't surprising that the FBI or similar agencies are full of egocentric people who believe they hold the power in the relationship. It seems to attract these types of people, really.


It's only a dichotomy if you see it from the angle where data is sacrosanct and it's beset on all sides by evil trying to do it in.

The better way to approach this issue, long term, is from a legal point of view, with an interim state where encryption holds us over. That is, the law decides who may or may not own or access a certain type of data, with penalties upon tort or criminality. And we develop civil protocols for data governance between people and between people and governments.

Like trademark. You could have it so trademark, i.e. authentication, is protected by mathematics, or you can have it protected legally.

Personally I don't believe the answer to data theft or surveillance is more mathematics in the form of encryption, but sensible laws regulating data, its use, and access, with penalties for transgressing. Obviously this would require international cooperation and would be a long way off, and in the interim we'd need encryption to protect against unauthorized access until we reach that state of data governance. But ultimately the answer is not "make everything a black hole".

We don't protect against thieves by building impenetrable houses, we rely on legal instruments to dissuade burglary.


Thieves can't traditionally steal your stuff on a mass scale without you ever knowing about it and then cost effectively use it to economically and psychologically manipulate entire populations.

Once someone has your data, you don't know what they're doing with it and neither does the government. Which means they have to be prevented from getting it in the first place, which means encryption and laws that encourage and facilitate encryption.


Then the problem is putting so much meaning into data. Why should my ID (or SSN) have so much value? Why should my medical records have so much value? Medical records have value mainly because they can lead to discrimination, so the solution is to remove the value of discriminating (in jobs, medical care costs, etc.) based on medical conditions.


> Why should my ID (or SSN) have so much value?

Because it existed before the invention of public key cryptography and is now permanently entrenched. If you think you can fix that, go do it and then make this argument after nobody is using SSNs anymore. Also, your argument for not deploying cryptography is "we should solve that problem cryptography would solve if it was more widely deployed"?

> Why should my medical records have so much value? Medical records have value mainly because it can lead to discrimination, so the solution to that is remove the value of discrimination (job, medical care costs, etc.) based on medical conditions.

You say "the solution" like all we have to do is snap our fingers and people will stop discriminating based on medical conditions even though doing so is highly profitable. The way the laws against that type of discrimination work is by preventing the discriminating party from obtaining that information.

Also, good luck passing or enforcing a law that says prospective mates can't discriminate against you based on your medical or mental health records. To say nothing of the outright violence that would result if the names of women who get abortions became known to the wrong people.


'Reason change from the world as it is, not from what you would have it be'

If a proposed system requires changing the world to make it feasible, well, that's probably not the best starting point.


> That is the law decides who may or may not own or access a certain type of data with penalties upon tort or criminality.

What if criminals are willing to break those laws? (That's kind of the definition of criminals, after all.) Who cares, you say, because data isn't sacrosanct? Well, some of the data we'd like to protect is financial, and criminals can use it to steal my money, so I care.

What if foreign governments are willing to break those (US) laws? Again, who cares, you ask? Well, if I'm a company facing foreign competition, and the foreign government is willing to do economic espionage to help their companies, then I care. And if I'm the US government or military, I definitely care.

What if the US government is willing to break those laws? It'll never happen, you say? Read some history. It's happened before, and it will again.

The law only protects me against people willing to obey the law. I also need protection against those unwilling to obey the law.


Because the government hasn't already discredited itself regarding spying on us?


Here is the source code for the Linux Kernel including dm-crypt: https://git.kernel.org/cgit/linux/kernel/git/torvalds/linux....

Here is the source code for cryptsetup: https://gitlab.com/cryptsetup/cryptsetup

Here is the source code for GnuPG: http://git.gnupg.org/cgi-bin/gitweb.cgi

I have these files on my hard drive.

There are probably cryptographers that can recite the RSA/AES algorithms from memory.

We have the Internet and general purpose computers.

The only way you're getting rid of encryption is destroying all of that. I don't know why people feel the need to resort to arguments about economic damage, civil rights, whatever.

This is the fucking _Internet_ we're talking about. This isn't some biscuit tin with a particular pattern that Uncle George really, really likes. It's the god damn Internet.

You want to destroy one of the most beautiful creations humanity has ever seen, in the name of what? Stopping a few marathons being bombed?

Is the entire government clinically insane? Would they turn the sky green if it gave them more power? Am I still living in reality?


What most people probably don't realize is even if the government was somehow able to successfully outlaw cryptography and eliminated it completely from the face of the earth, it would do absolutely nothing to prevent future marathon bombings. The two simply aren't related, and there is nothing on this phone that is going to magically help the FBI, NSA, or any other TLA find and stop terrorists from acting.


In the future, the few years following Snowden's revelations may be viewed as the golden years of strong cryptography: a time when service providers and application developers began taking these issues seriously.

We're moving into a new era now. All it may take is a single attack in the US to drive the legislative and judicial branches to roll back all the fantastic improvements we've seen over the past few years.


They can "roll back" for mass marketed products but not for some simple, open source software in standard C that someone runs on an old PC with PC/DOS, no hard disk, and no network connection. So, when you turn the power off, the PC forgets everything about the de/encryption. The only non-volatile storage is on diskette. If worried, then just burn those. With the encryption done, the output is just a base 64 file of gibberish safe to mail to the NYT, FBI, CIA, NSA, etc. And with no attempt at security, products from Apple, Google, and Microsoft can send such data with no problem.

The little command line programs on PC/DOS? Easy enough for middle school students to use; when that was state of the art computing, middle school students did use it.


And then they will outlaw the possession of said standard C software, as well as of computer hardware that does not comply with government-mandated eavesdropping. And then they will pass laws that force you to give up the encryption key to that Base64 data or face imprisonment. Even if the Base64 data is just a sequence of completely random bits - because who would keep sequences of random bits around, unless they're a terrorist trying to hide something?

I'm sorry, but I don't think pushing encryption down into the underworld is a viable solution to the problem. There's no limit to the bad laws that the government can pass. The only real long-term solution is to recognize encryption as a right, otherwise we'll only keep seeing these repeated attempts to outlaw it.


> And then they will pass laws that force you to give up the encryption key to that Base64 data or face imprisonment.

IANAL, but my understanding is that such a law would run into rock solid, granite hard, iron clad parts of a little issue called the US Constitution. E.g., if the cops ask you a question, then you don't have to answer. The person's lawyer can just tell the cops that "My client has no idea what that base 64 gibberish is."

For encryption as a recognized right, no, that's asking a bit much of the US political system.

BTW, for the person receiving the base 64 code (that's the way JPGs, etc. are sent in e-mail): first go through base 64 decoding, and then apply the receiver's private key to that to decode back to the secret message, e.g., where and when the boy and his girlfriend are going to meet and carve their initials on a tree.

Base 64 is in the internet standard for e-mail; there it is part of MIME, the Multipurpose Internet Mail Extensions. The idea of MIME is to permit sending pictures, audio, movies, etc.

So, in arithmetic, base 10 has digits 0-9, that is, 10 digits. Base 16 has, right, 16 digits, 0-9 and A-F. Base 2 has, you guessed it, 2 digits, 0-1. Well, presto, bingo, base 64 has 64 digits: A-Z, a-z, 0-9, '+', and '/', all simple, ordinary printable characters such as e-mail had been sending right along.

Well, with 6 bits, you can count from 0 to 63, that is, have 64 different patterns. So, given a stream of bits, you can replace each 6 of them with one of the base 64 digits. And there is a simple solution (padding with '=') for any few bits left over at the end. So, that is how to take any stream of bits and 'encode' it into just printable characters easy to send via e-mail.

A huge fraction of all Internet data is sent as base 64. So, base 64 data alone is nothing suspicious.


> IANAL, but my understanding is that such a law would run into rock solid, granite hard, iron clad parts of a little issue called the US Constitution. E.g., if the cops ask you a question, then you don't have to answer. The person's lawyer can just tell the cops that "My client has no idea what that base 64 gibberish is."

"And here our intelligence network shows proof that your client has talked about this base64 gibberish in the past with other people, so let's add perjury to your charges".

But your point is valid, you have a right to not incriminate yourself in the US. The case with Apple, however, is that a third party you've trusted is being asked to breach that trust. The 5th does not apply at all.

Not to worry, however, as long as you don't communicate with anyone, you're safe. The moment you do communicate with someone though, you'd have to put your trust in them. And then the FBI could demand, from them, the conversations you've had. And then the 5th has no value.


> IANAL, but my understanding is that such a law would run into rock solid, granite hard, iron clad parts of a little issue called the US Constitution. E.g., if the cops ask you a question, then you don't have to answer. The person's lawyer can just tell the cops that "My client has no idea what that base 64 gibberish is."

The US Constitution hasn't helped prevent the PATRIOT act, or the TSA's unreasonable search powers.

> A huge fraction of all Internet data is sent as base 64. So, base 64 data alone is nothing suspicious.

There's a difference between Base64 that decodes into a harmless cat picture, and Base64 that's apparently random. Unless we make it normal for everyone to have encrypted, random-looking data lying around, the few that choose to have it will be increasingly harassed by the government, even if they're not doing anything wrong.


If all US crypto is backdoored, the rest of the world and the criminals will use non-US crypto.

The FBI also wants to infiltrate communities of human rights activists. There are many reasons not to overly trust them.


Our project (actor.im) has already started to adopt Russian encryption.


Open source encryption should do the trick, if a backdoor gets added, just fork it.


The FBI can whine about this all they want. I hope that the lawmakers interpret it as the FBI spitting in their face, because that's exactly what's happening. We have had this debate already -- in the 1990s -- and this was the result:

"A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication." [0]

The FBI can lobby to change that, but that's not what they're doing. They're lobbying that OTHER laws (e.g. the All Writs Act) enable what they want. Frankly, that's absurd... how can a 1789 law overrule a 1994 law when you're talking about modern technology?

Ironically, the FBI is creating incentives for new tech startups to incorporate outside of the USA. If you're building a product which depends on reliable encryption in order to be valuable, why the fuck would you incorporate in the USA?!? You would alienate foreign customers who are suspicious of the US legal/surveillance apparatus. And you would be entering a murky legal landscape where it seems increasingly likely that, if your startup ever becomes big enough to be a target, the government will require some kind of key escrow or it will shut down your business or even jail you.

In the face of that much uncertainty, it seems like it would be asinine to incorporate in the USA.

Maybe there is space in the market for a business to commoditize offshore incorporation. Make setting up a Seychelles corporation as easy as setting up a US LLC. Build in as many legal protection mechanisms as possible, e.g. owning the corporation via a trust of which you are the sole executor.

[0] http://www.law.cornell.edu/uscode/47/usc_sec_47_00001002----...


Maybe there is space in the market for a business to commoditize offshore incorporation.

Like https://stripe.com/atlas except outside the US.


Scary that something so important will be determined by the perception of the vast majority of the public, who has no understanding of what's at stake.

To most people arguments like "it will help us find terrorists and pedophiles" and flawed analogies with doors and keys are much more appealing, only because they are easy to understand, while the opposite arguments sound philosophical, alien or carry less weight because they contradict what "the experts" (i.e. the FBI) claim.

But they shouldn't sound so: what's happening is wrong not just because of the practical fallacies of the pro-restriction arguments -on which most discussions currently focus- it is wrong because the only way for government monitoring to be effective, in the end, is to outright criminalise secure encryption for everyone. It should be blatantly obvious that the most dangerous of their claimed target group wouldn't be dissuaded by the inconvenience of using custom/non-friendly software/hardware, so any lesser measure would be simply useless.

And on that premise, how can many people not see how wrong it would be if one day we are called criminals for exchanging a truly private message with someone? In what words and with what simple examples can you make non-technical people see how bad this reality would be and how far it can stretch to things that they do care about? And that even if it didn't, it would still be fundamentally wrong...

Not a rhetorical question by the way, I've tried to participate in such discussions and failed miserably to be convincing -so any tips are appreciated.


So we've moved from the clipper chip to, prospectively in the near future, the Clapper chip. History is trying to repeat itself, but this time we have legal precedent.

https://en.wikipedia.org/wiki/Bernstein_v._United_States

https://blog.cr.yp.to/20160315-jefferson.html

All this sort of tactic will do is result in irreparable damage to the tech sector and the US economy. No murders, rapes, child abductions, terrorist plots to destroy buildings, or whatever other specters they summon to haunt us will be prevented.


> On October 15, 2003, almost nine years after Bernstein first brought the case, the judge dismissed it and asked Bernstein to come back when the government made a "concrete threat".

I guess the threat's technically not here yet, but it seems like it's right around the corner. djb, pack your bags for the Ninth Circuit.


Comey's argument makes sense at first. Why not have a trusted escrow provider keep keys safe, and also respond to court orders when necessary. It feels almost like a checks and balances kind of argument, the kind that Americans find persuasive with our three-branch government.

The problem is that we now know that the government has the goal of unlawful surveillance without oversight from courts, the legislature, or the public. There is essentially an ongoing "by any means necessary" attack on civil liberties.

Why should we think that the government is not planning to infiltrate the escrow services and preemptively capture all keys?

There has been a profound breach of trust (revealed by Snowden) and we must insist upon the rule of law and basic democratic transparency before we consent to any further risks.

I try not to be cynical but I am thinking that the trend we are on is leading to strong crypto being largely criminalized. I am hoping that our decentralized systems adapt to this threat and offer solutions that cannot be shut down (like Bitcoin and Ethereum).

Incidentally, if Apple seems likely to lose the battle over a back door, it ought to offer an Ethereum smart contract that will unlock one phone every day, require a key provided by each member of congress (with 100% consent required to unlock a device), and publish all unlock key requests on the Ethereum blockchain after a 30 day delay in case an investigation is in progress.

This protects against mass surveillance, but offers a very small back door with full transparency and no potential for large scale use (or abuse).
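The 100%-consent idea can be approximated without any blockchain at all: trivial n-of-n XOR secret sharing gives every holder an absolute veto, since any single missing share makes the key unrecoverable. A minimal sketch (illustrative only; the function names and the choice of XOR sharing are mine, not part of the proposal above):

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split `key` into n shares; ALL n are required to reconstruct it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        # XOR the key with every random share; the residue is the final share
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine_shares(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the key."""
    key = bytes(len(shares[0]))
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

unlock_key = secrets.token_bytes(32)
shares = split_key(unlock_key, 535)   # one share per member of Congress
assert combine_shares(shares) == unlock_key
```

Any 534 of the 535 shares reveal nothing about the key. A real escrow design would more likely use a threshold (k-of-n) scheme such as Shamir's, but n-of-n XOR is enough to show why 100% consent is the strongest possible veto.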


So what do you do when the "trusted" escrow provider gets hacked, just like OPM was, and countless US corporations who've had customer records and credit card numbers stolen?

What's the point of using encryption if you're going to put the keys in the hands of some unaccountable entity which is easily hacked? You might as well not use it at all then.


I didn't argue any of those things. I think you missed the subtlety of my comment.


> The problem is that we now know that the government has the goal of unlawful surveillance without oversight from courts, the legislature, or the public … There has been a profound breach of trust (revealed by Snowden) and we must insist upon the rule of law …

Nothing Snowden alleges is illegal or unconstitutional, despite his and his supporters' repeated assertions. 'I don't like it' does not imply 'it is illegal and unconstitutional.' Neither the law nor the constitution forbids all bad things (nor does either mandate all good things, but that's another issue).

You are correct, of course, that one major issue is that States cannot be trusted to respect their citizens' liberties. Another is that States cannot be trusted to take care of their own data: an escrowed key will shortly become a leaked key.

> Incidentally, if Apple seems likely to lose the battle over a back door, it ought to offer an Ethereum smart contract that will unlock one phone every day, require a key provided by each member of congress (with 100% consent required to unlock a device), and publish all unlock key requests on the Ethereum blockchain after a 30 day delay in case an investigation is in progress.

That doesn't make sense, since the Congress is the legislative branch and it is the executive which actually does things. What could make sense is a smart contract which is unlocked by the judiciary.

But we're a long, long way from enforced smart contracts which do things like handle succession of new members &c. I can't wait until we're there, someday, but it'll take a good long while.


> Nothing Snowden alleges is illegal or unconstitutional...

First, I did not claim it was unconstitutional or strictly illegal (since secret things can't be considered by normal courts or legislatures, everything is in a sense extralegal, which is itself a big problem).

There are protections against unlawful search and seizure. Seizure in this case is data capture, search is viewing by a human (with or without a warrant). There was not any law granting the NSA the power to do domestic surveillance unless there was some suspected behavior involving foreign nationals or international circuits.

But it's more relevant that our leaders repeatedly assured us that no such surveillance was going on. Blatant lies told to the public is as serious a breach of trust as an unconstitutional program. Thus I think the technical constitutionality (or even strict legality) of the program is more of a detail than the core issue.

My suggestion of giving keys to each member of congress is simply for Apple to force the issue into the most democratic institution, effectively giving any member of congress veto power against unlocking the data. While this doesn't fit the exact delegation of powers that we're used to for such things, it does address the terrorism/kiddie-porn argument that Obama recently made... in other words, it addresses the meat of the emotional argument that our leaders are making.

>But we're a long, long way from enforced smart contracts which do things like handle succession of new members &c. I can't wait until we're there, someday, but it'll take a good long while.

I agree, but I think it might be a good strategic move for Apple if things start to look bad for Apple offering an unfettered secure-hardware / strong encryption platform. Apple could then claim that it had offered enough of a back door to prevent any interim attack should one occur.

The government's strategy is to leverage public outrage about any attacks (large or small) to get back doors into everything. Apple does not want to be accused of stonewalling if an attack occurs and the government subsequently claims that decrypting one or two phones a few weeks ago could have prevented it.


It seems as though the tech community (myself included) uniformly agrees that the FBI's requests are unreasonable.

Is there someone with a sound technological understanding of encryption that thinks we should have some back door / key escrow / master key? I've seen that Fred Wilson and other USV partners seem to think the FBI's requests are reasonable, and usually I trust their analysis. But this whole thing just seems like such a bad idea.


The rest of the world will just do encryption in its own hardware/software. Good luck to US tech companies if this law is enforced.


This sounds like the equivalent to the TSA approved locks for luggage. Just decoration and no real protection at all.


The funny thing is that if you transport firearms you CAN'T use TSA locks. I find that very funny; it is as if they are saying we don't trust our own people.


Keep an eye on the would-be profiteers. Why would VCs like Fred Wilson land on the side of the FBI and parrot their position despite certainly knowing better? Because there are billions of dollars of government investment money being lined up to implement the wishful thinking key escrow and other back-door schemes. Even if back-doored encryption is doomed in the market, co-investors will be rewarded.


As naive as it may be, I had never considered this. I think Fred is a trustworthy enough guy that he may not consciously be thinking about profiteering, but I can't figure out any other reason a tech investor would support back doors.


Shrewd plays or profiteering from idiocy... call it what you like. Look at the people on this forum who know a great deal about security. Some like to portray themselves as thoughtfully open minded - that they are seeking a solution. They make respectful noises about legal traditions. They know it boils down to exhuming the corpse of key escrow and rebranding it. But they are in a position to profit.


The FBI director is hanging out on a very long limb and sawing away. Even if the US mandated that everyone use only unencrypted HTTP, everyone would just use encrypted services from other countries. If even one country on earth has one competent programmer implement one secure framework, everyone will use it. This is beating a dead horse long after it's been cremated.


All you have to do is look at Korean banking to see that Rube Goldberg machines with questionable security practices can easily become standard with the "correct" legal apparatus in place.


Schneier, Hal Abelson, and Adleman (the A in RSA) already wrote about key escrow almost two decades ago. From the text:

> All key-recovery systems require the existence of a highly sensitive and highly-available secret key or collection of keys that must be maintained in a secure manner over an extended time period. These systems must make decryption information quickly accessible to law enforcement agencies without notice to the key owners. These basic requirements make the problem of general key recovery difficult and expensive -- and potentially too insecure and too costly for many applications and many users.

> Attempts to force the widespread adoption of key-recovery encryption through export controls, import or domestic use regulations, or international standards should be considered in light of these factors. The public must carefully consider the costs and benefits of embracing government-access key recovery before imposing the new security risks and spending the huge investment required (potentially many billions of dollars, in direct and indirect costs) to deploy a global key recovery infrastructure.

https://www.schneier.com/cryptography/archives/1997/04/the_r...


If the FBI gets its way on this, quite simply the US will no longer deserve the technology industry it currently has.

I for one would follow companies moving abroad to do the right thing and avoid these shenanigans entirely.

On top of it being the right thing, it is also in Apple's economic interest to fight this since the US is but one market. They also happen to have enough cash to flat out threaten to move all affected products out of the US and have them built by a non-American subsidiary.


The problem with these slippery slope arguments is that they start treating unlocking a phone as equivalent to breaking network encryption, when technically they're not at all the same. Apparently the FBI wants you to think they're the same too? If so, don't let them get away with it.

Signing a software update with a private key you already have is using crypto as it was intended. We presume private keys can be kept secure with enough effort, or public key encryption doesn't work, https doesn't work, software updates don't work, game over. Any attempts to get private parties to turn over their private keys should be strongly resisted, but requiring them to sign something given a search warrant adds a procedural step that acts as a check on government power (they can verify that the search warrant is valid, minimize scope of the change, and fight it in court if necessary).

Key escrow is a whole different thing, where they require a whole new system to be designed to preserve keys that would normally be destroyed. It's hard to preserve information when the user wants to destroy it (they can block network traffic and destroy the phone), resulting in all sorts of bad effects on system design and new vulnerabilities.


Let's imagine they succeed. So all strong encryption will need to have the keys stored "safely" somewhere. Perhaps at some government-controlled server, because we all know they are super secure.

So, what happens once the keys eventually leak? When nation states AND terrorist organizations get the keys to unlock everyone's encryption?

If "think of terrorists" is the rhetoric here, what happens when THEY have access to our devices?


As much as I object in principle, it's an interesting technological puzzle. I've been wondering about a hardware-based solution:

* A fuse that when broken reduces the cost of cracking the security from 'impossible' to something an organization with large resources, such as the FBI, can do in a day.

* When the fuse is broken, a message is displayed to the user indicating it. At least, the device might not boot, tipping off the user that something is wrong.


It would meet these requirements (am I overlooking any important ones?):

* Nobody could mass-crack the devices. To crack it, you would need the device in your possession and a day of significant computing power.

* It would require a significant investment of resources, so it wouldn't be done for trivial issues.

* Users would know when their device has been cracked: It would have to be out of their possession for 24 hours and they would be notified.


The question is, could such a thing be implemented in a way that it couldn't be hacked (without great difficulty)?
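Back-of-the-envelope sizing for the "a day of significant computing power" requirement (the 10^12 guesses/second rate is an assumed figure for a large, well-funded cracking cluster, not from the proposal above):

```python
import math

GUESSES_PER_SEC = 1e12   # assumed: a large, well-funded cracking cluster
SECONDS_PER_DAY = 86_400

def effective_bits_for(crack_seconds: float, rate: float = GUESSES_PER_SEC) -> float:
    """Key size (in bits) that a full exhaustive search at `rate`
    guesses/second finishes in `crack_seconds` seconds."""
    return math.log2(rate * crack_seconds)

bits = effective_bits_for(SECONDS_PER_DAY)
print(f"{bits:.1f} effective bits")   # ~56.3 bits
```

So the broken fuse would have to reduce the key to roughly 56 effective bits for an FBI-scale attacker to finish in about a day, while an ordinary attacker at, say, 10^6 guesses/second would still need thousands of years. Whether hardware can enforce that gap tamper-proofly is exactly the open question.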


See my other comment in this thread for a solution


I think an interesting approach that could be taken by Apple is to concede to letting the FBI have a master key, so long as they hold an insurance policy that covers the damages in the case of a key leak, including but not limited to the potential damage to Apple's brand and market value, and the same damages to all of Apple's customers that relied on their security.

That would force the FBI to reconcile the costs of maintaining a multi-trillion dollar insurance policy with the expected value of a potential reduction in terrorism. When it comes to the US government, money seems to talk louder than anything else...seems like it could possibly work.


I, for one, would gladly enjoy having to pay for the insurance policy in the form of taxes out of my paycheck to cover the stupidity of having a master key system in place, not to mention when the premiums skyrocket after said key gets stolen and all of our iPhones get breached. /s


That's exactly the point. Apple agreeing to their demands in exchange for a reasonable insurance of the potential risks would force the FBI to reconcile their unreasonable demands with the public.

I would imagine that the biggest cheerleaders of the FBI's side in the general public are Republicans. Can you imagine them ever supporting this?


Great, a 10GB file filled with random data will be illegal in another country.


They're right to want that, and the public is right to refuse them.


No FBI. Hell, not that long ago I started the work to become an agent. I love law. I love bad guys being put away. But you have _ZERO_ need for an escrow on strong encryption. It is perfectly legitimate that citizens don't give up their secrets. This is why we have discovery, warrants, and the Fourth Amendment (however riddled with holes it now is). The government has no compelling interest here.


How is this even news anymore?

It is utterly and completely unsurprising that all the government agencies involved in law enforcement on all levels will fight a never-ending fight against anything that impedes their ability to do what they think their job is.

They don't care about privacy or any of that, and see restrictions to getting full access to all data as we see bugs in our systems that keep them from running correctly.


News doesn't have to be surprising to make headlines.


Nothing stops anyone from developing software that can be installed on mobile phones to encrypt this or that. This will be a type of Cold War between those with something to protect and those who want everything transparent.

I will choose to encrypt outside of what they OS does and to hell with every other idea.

Besides, all one has to do is encrypt with one-time pads, and escrowed keys become largely irrelevant.
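A one-time pad really is just XOR with a truly random, never-reused pad as long as the message; a minimal sketch (function names are mine):

```python
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR the message with a single-use random pad of equal length."""
    assert len(pad) == len(message), "pad must match message length"
    return bytes(m ^ p for m, p in zip(message, pad))

# Decryption is the identical XOR operation.
otp_decrypt = otp_encrypt

msg = b"meet at dawn"
pad = secrets.token_bytes(len(msg))   # must be truly random and NEVER reused
ct = otp_encrypt(msg, pad)
assert otp_decrypt(ct, pad) == msg
```

The caveat is that the pad *is* the key: distributing pads securely in advance is exactly the key-management problem, so keys don't become irrelevant so much as they move out of band.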


It's not clear to me where this article gets its idea of what the FBI wants. I haven't seen anything from the FBI or from Comey that indicates they want to place limits on future cryptographic systems.

It even seems like Comey is aware that the particular technique he is asking Apple to use will not apply to future phones made by Apple, and that the reality is that cryptography will soon reach the point where the FBI will not be able to rely on decrypting data as part of their investigative approach.

Yes, if Apple creates this exploit, then that exploit will potentially be available to other state actors and criminal enterprises. But only for iPhones older than the 5s. Fundamentally, as other commenters have pointed out, this is not creating a backdoor -- this is using an existing backdoor to install a bigger backdoor. It is, of course, possible that in the future even newer iPhones might find themselves vulnerable to unauthorized decryption, but really, the FBI's ask in this particular case really is narrow in scope, because it does only apply to older phones.

I don't see the slippery slope here that many people seem to think exists. The slope begins and ends with a phone released in 2013. All any person (criminal or otherwise) has to do is buy a newer iPhone, and then this whole discussion no longer applies.


The issue isn't a technical one as much as a legal one. If the FBI can get the government to interpret the All Writs Act the way they want (any private entity must do what the government says to aid a criminal investigation unless explicitly forbidden by law) to force Apple to write a custom version of iOS for them this time, the fear is that they can use the same law to install backdoors in ongoing versions. Legally there would be precedent for Apple having to give investigators access to other phones as well by means of making Apple write code to do so, which in the future could mean tracking location, turning on microphones and cameras, etc.


> the fear is that they can use the same law to install backdoors in ongoing versions

There is no reasonable interpretation of the All Writs Act (which regards subpoenas) that could be read to force Apple to preemptively make their OS insecure. If they include a backdoor in future versions of the iPhone, or if the FBI discovers a vulnerability, then it is entirely possible that they could use the same precedent to force them to open the backdoor for them, or even give them a metaphorical prybar for the backdoor.

But the point is that Apple is rapidly moving towards (and in their opinion, has already achieved) a hard stop -- they no longer possess the technical capability to break a locked iPhone after the 5s. For a specific case, and a specific subpoena, there is no work that Apple can do to comply with the subpoena.

And, like I said, it seems clear that Comey understands this, and is not asking Apple to weaken security in the future, and I see no indications that he is asking for that.


Exactly. The slope begins with a phone released in 2013, but it doesn't end, ever - unless we can set the precedent now that Apple doesn't have to do this kind of thing.


The slippery slope fear is a bigger than which particular phone models will be made technically insecure by what the FBI is asking here and now in this case. The larger fear is the precedent that will be set by allowing such a ruling to stand--the strict legal precedent for cases much like this one, but also the momentum it will give to large agencies in the cultural fronts of the war against encryption.


I do understand what you are saying and it is precisely because of Director Comey's public statements on the matter that I called him an expert statesman in the article, which he is. The legal precedent set in this case when Apple is forced to comply will invariably change how companies implement cryptography within their systems.


Many people want gun laws to change. For better or worse, there will still be millions of guns on the street. So if you don't want "bad guys" to use guns or encryption, it's pretty impractical to go back in time and prevent the technological and regulatory forces that provided the means for them to exist.


Washington Post published TSA Master Keys

https://www.schneier.com/blog/archives/2015/09/tsa_master_ke...


Ah, the twice-per-decade re-run of the Clipper Chip fiasco. I wasn't aware we were already due for one.

https://en.wikipedia.org/wiki/Clipper_chip


Is this equivalent to the FBI residing in your phone?

https://en.wikipedia.org/wiki/Quartering_Acts#Modern_relevan...


Watch the first video under 'further reading'. Many of those congressmen seem very biased and uneducated while deliberately misconstruing basic knowledge and law.

Edit: wanted to commend congressmen Mr. Issa, Ms. Lofgren and Mr. Johnson.


We need to have the FBI's entire internal records copied and held by a third party in the judicial branch, so that they can be examined under court order when necessary.


I remember the government making the same arguments when PGP came around.

All this has been ignored before and all this will be ignored again.


I look forward to seeing how the FBI plans to prevent people from all over the globe using PGP.


Yeah, I bet having to get a warrant to search houses makes their job harder, too.


Every local police department will be able to break strong encryption at their own discretion. This is not a conspiracy theory, this is an inevitability.

Let that sink in before we consider the validity of their proposal.

Their escapades will not be limited to criminal investigations, but whatever they want. There will be no oversight and access will be unlimited.



