The security of encryption is likewise only as strong as the security of its keys. The fewer things you have to secure, the easier it is to keep them secret. The “master key” concept in New York only served to create something of great value that people wanted to acquire, and it massively increased the risk once the key fell into the wrong hands. Obviously the same thing could happen with an encryption key, except it is worse: you don't even have to be in the same country as the key to acquire or use it.
> "This is a serious security breach," said Councilman Peter Vallone (D-Queens), who heads the Council's Public Safety Committee. "We know terrorists are planning to attack our subways, and the MTA and NYPD better find these magical morons quickly, and then make them disappear for a year in jail."
As if the only thing between terrorists and the subway platforms is that they can't afford a ticket, so they have to go through the staff gate.
* We can have strong encryption just for the good guys
* We can have master keys that only approved staff will use
* We can block all the bad things on the internet and it'll be like they don't exist
* If we have a back door into an encrypted device, only the good guys will use it
The list is so damn long, yet people who really should know better get it wrong with depressing regularity. Are most politicians really that stupid? What are their advisers advising FFS?
* If we reward schools for increasing student test scores, then we'll have better schools.
* If we fund a "war on drugs", we'll reduce the damage drugs do.
* If we enact rent controls and mandate the construction of below-market rate housing units, it'll help people afford housing.
This belief - that "having a goal and doing something that pattern-matches to helping" works - is incredibly dangerous in policy-makers. It's also incredibly difficult to fix, since the incentives for politicians are to make rationalizations that are convincing to voters, and that kind of reasoning is much easier to convey.
Though also, do note, if you apply systems thinking to legislators, establishing that they're "doing something that pattern-matches to helping" is exactly in line with their current incentives. You need much more direct oversight to give them an effective feedback loop for the consequences of their actions, rather than just the "optics" of the actions themselves.
Things like https://www.trudeaumetre.ca/, but done by an opposition-minded third party who won't give them an inch but whose output won't read like campaign ads, would be a good start. Ultimately, something like prediction markets on the upholding of each elected candidate's campaign promises during their tenure would be amazing, in the sense that sufficiently large bets would be a form of lobbying without lobbying.
The best ideas I have for fixing political structures are weird utopia-ish things that you can't get to from democracy. Things like government-by-prediction-market, which would distribute decision-making to people who think they have a better idea than everyone else and pay for performance.
To be fair, this does in fact help many people afford housing. It just doesn't fix the endemic issue.
Dense housing is affordable housing. When there's an empty apartment that you're moving into, where do you think the old resident moved to? There's only a few options - a new place got built that they moved into, the old resident died, the old resident moved out of the housing area, or the recursive option that cashes out into one of the previous ones. If you stop building units ("affordable housing" mandates) or arbitrarily keep low-income residents from making housing bids (rent-control), the only bids for housing are going to be from tech workers and market-rate housing gets ridiculous.
Fundamentally, the problem is that there's X people who want to live in the area, and only Y housing units. The price is going to go up until the market clears. Handing out political favors so that the people who vote for you don't feel this reality isn't solving the problem: only building more housing will.
I don't think this is the same problem you would address by rent control.
> Handing out political favors so that the people who vote for you don't feel this reality isn't solving the problem: only building more housing will.
And yet, there are many people living in rent-controlled housing who would disagree with you. The person and problems you're painting do not exist outside your head.
Of course not. Rent control is fundamentally a tool for getting political benefits at the expense of economic ones. It explicitly creates winners in the voting district that implements it at the cost of losers everywhere else. It's in the same moral class as dumping toxic waste into a river.
>And yet, there are many people living in rent-controlled housing who would disagree with you.
I'm not at all surprised that the beneficiaries of political favors support handing out those favors.
I'd love to hear the argument for this. Again, many people living in rent-controlled houses would disagree. Just because people are poor you cannot write them off as votes.
>Economists are virtually unanimous in concluding that rent controls are destructive. In a 1990 poll of 464 economists published in the May 1992 issue of the American Economic Review, 93 percent of U.S. respondents agreed, either completely or with provisos, that “a ceiling on rents reduces the quantity and quality of housing available.”
That's the distinction that drove me to make the political versus economic benefits distinction. Rent control as a whole makes things worse - there's less housing, it's of worse quality, and it drives up the cost of market rate housing. On the whole, it's a bad allocation of resources: we'd much rather have cities with more housing that's of higher quality (because people are more than willing to pay for it). The problem is that this sort of policy creates winners and losers, and the losers are often the ones who get to set local policy.
There are two solutions - ban rent control at a higher jurisdictional level (Washington State does this), or bribe current residents to lift rent control. The latter should make everyone better off - we pay for keeping the working class viable with the overall economic gains made by lifting rent controls.
Anyhow, my overall point is that we should hand out the who-benefits stuff without making society worse off as a whole, simply because the locals who happen to live in an area are able to hold overall policy hostage.
The economic gains from lifting rent control are going to go to slumlords, not get dispersed to everyone. Also, housing isn't an elastic-supply market: when rent goes up, it doesn't necessarily mean the quantity of housing will increase, since housing is often limited by zoning, drainage, and many things other than the profitability of rentals. And when you consider the destabilizing effect of people having to move every few years due to rental price fluctuations, it's not really a clear-cut benefit to "bribe current residents to lift rent control", even if that were an option.
>we should hand out the who-benefits stuff without making society worse off as a whole, simply because the locals who happen to live in an area are able to hold overall policy hostage
But this is how democratic communities work... local communities decide what is 'making society worse' where they live; as they should, since it obviously affects them the most.
Local optimization is not the best strategy, even according to those doing the local optimizing. Both participants in a Prisoner's Dilemma are better off if they hand their decision-making over to a third party, conditional on the other person also doing so. The Bay Area as a whole wants affordable housing; residents just don't want to make the Bay as a whole better off by making unfair sacrifices in their own neighborhood.
It's a classic coordination problem, caused in large part because decision-making is too local compared to the regional benefits of housing construction.
Suppose a terrorist did something with his shoe that he wasn't supposed to. Here is how it should be solved:
"Based on our research, the probability of dying because of the terrorist attack is XXX, which we can reduce by YYY using blah-blah, therefore we should allocate ZZZ funds in that area."
Would you vote for this guy? Well, most people won't even understand what he is talking about.
So here is what happens in practice:
"Okay, I don't know how to decrease national debt or prevent poverty, so let's find something else. Oh hey, terrorism! I know how to solve it. Forbid shoes at the airport. Sounds simple enough. So we can say 'we successfully fight terrorism'. And just in case people don't care enough, make this issue the most important one and add 'hey, think of the children' into the mix."
So the general population becomes scared of terrorism 'cause guys running for office told them they should be.
It's not politicians who are stupid, it's people who vote for them.
• We can have guns just for the good guys
• We can have guns that only approved staff will use
• We can block all the bad people from having guns and it'll be like guns don't exist!
• If we have guns, only the good guys will use them.
Seems like any dangerous technology can follow this mindset. :P
It actually is "we know bad guys already have guns, so better if good guys have them too".
Same with crypto, btw.
You actually can't assume that anyone is always going to be the "good guy", or that anyone else will always be a "bad guy".
And if you could program a smart gun to have a sense of morality, capable of judging between appropriate and inappropriate uses of lethal force, you no longer need a human to carry it around, do you?
If the bad guys can't have guns, good guys can't have them either, because sometimes they are exactly the same people in different circumstances. That's the big hole in good guy vs. bad guy reasoning.
Cops without guns:
Cop with gun:
Only one British police officer was killed in the line of duty last year - PC David Phillips, who was run over during a pursuit. In the same year, eight American officers were killed by vehicular assault, 36 were fatally shot, three died after being assaulted and two died of accidental gunshot wounds. In 2014, no British police officer was killed on duty.
Even accounting for the difference in population and the prevalence of firearms, there is a substantial disparity. Britain does have armed criminals, but our police are not routinely armed.
There were riots across Britain in 2011, precipitated in large part by the fatal shooting by police of Mark Duggan. Only one other person was shot by the police in 2011. In the same year, the FBI estimated that 400 Americans were killed by police officers, but no official count exists.
There is considerable interest amongst US police forces in learning from the British approach, as shown in this documentary:
You are not selling me.
They aren't stupid, they just don't share your interests.
More likely they know their audience (most voters) doesn't know enough about crime, terrorism, and encryption to see how bad the plans are.
Obviously, terrorists must always break all laws in the process of committing terrorism. They have to illegally park their cars, jaywalk to the location, commit some acts of public indecency on the way, litter at random, and pop their heads into a crowded theatre just to yell "fire!"
So, clearly the solution is to JAIL the users of the keys, instead of actually replacing this with a better system.
With a physical master key, all it takes is one possessor of a copy of the key agreeing, or being compromised, for someone to get access to what the key is protecting.
With digital data, such as an encryption key, an escrow system could be combined with a secret sharing system so that you need to gain access to multiple shares before you can get the underlying encryption key. With a sufficient number of shares required, distributed among a large enough number of independent escrow agents in different legal jurisdictions, the probability of someone getting your key illegitimately via the escrow system can be made arbitrarily small.
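A minimal sketch of the splitting idea (this is an n-of-n XOR split in Python, the simplest possible scheme where all shares are required; a real k-of-n threshold system would use something like Shamir's secret sharing, and all names here are made up for illustration):

```python
import os

def split_secret(secret: bytes, n: int) -> list[bytes]:
    """Split `secret` into n shares; ALL n are required to recover it.
    Each share alone is indistinguishable from random noise."""
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    last = secret
    for share in shares:
        # XOR the secret with every random share; the final share is
        # whatever makes the XOR of all shares equal the secret.
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def recover_secret(shares: list[bytes]) -> bytes:
    """XOR all shares back together to reconstruct the secret."""
    out = bytes(len(shares[0]))
    for share in shares:
        out = bytes(a ^ b for a, b in zip(out, share))
    return out

key = os.urandom(32)               # the escrowed encryption key
shares = split_secret(key, 5)      # one share per escrow agent
assert recover_secret(shares) == key
```

With the shares held by five escrow agents in five jurisdictions, an attacker (or an overreaching government) has to compromise all five before learning anything at all about the key.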
Even if we could somehow guarantee absolutely that the backdoor will only be used by the intended users, backdoors are still not acceptable. For much the same reason that it's not acceptable to put a camera in every home or car, even if the footage were "only" accessible by the government with a warrant.
I'm not sure the camera analogy will carry weight. It sounds hyperbolic to most people as it tries to equate actually being recorded in your private affairs to the possibility of materials being accessed when a warrant is issued.
Most people have no problem with law enforcement accessing evidence when a warrant is issued. The only way to win over the general population is to demonstrate that encryption is fundamentally different from simple access.
Except when this approaches zero security. The line is thin, and actual expertise is needed to find the sweet spot. I've seen entire corporations deploy apparently bulletproof security (think Google's data centres) but fail to use DNSSEC or to background-check their security guards.
Avoid weak links like these; they are bad for business.
Of course. These are upstanding FBI agents we're talking about. There's no way one would ever cheat or steal or blackmail anyone.
This has been tried, and works about as well as you might expect: https://theintercept.com/2015/09/17/tsa-doesnt-really-care-l...
The FBI simply wants the digital world to mirror the physical world and a 100% unbreakable physical lock is almost impossible to produce in the physical world. It is easy to see why they would want that to be true in the digital world.
[EDIT for added link]
What if the safe has a self-destruct mechanism for the contents in case of breach?
Modern encryption however is changing the game. It's no longer true that security is just a matter of perception. It's becoming a reality and that makes for an interesting and bold new world we're entering.
What did they do?
If you implement key escrow and it's public knowledge that encryption systems that implement it are useless then people that actually want to hide stuff will simply use GPG and other uncompromised systems.
The only thing this sort of system is good for is enhancing the reach of the surveillance apparatus. That is, spying on innocent people. As for why they want to do this.. no-one knows but it's awfully concerning.
The tech community tells itself that it won the first "crypto wars". You cannot win "wars" against governments in that sort of sense and the first crypto war was never actually won at all. I think in light of events in recent years we need to reinterpret the events of the 90's in a new light - the tech industry didn't win, rather, after realising how awful and worthless the software the cypherpunks produced really was, the government simply got bored of playing.
Nobody, and I mean nobody, gives one tiny shit about GPG. GPG is so bad, such truly unusable software, that terrorists would literally rather die or risk lifetime imprisonment than use it:
This is especially true because often people don't meticulously plan crimes out ahead of time: they either commit crimes of passion, or they make basic mistakes. So if you have to plan ahead and convince not only yourself, but all your accomplices, all to install some exotic and awkward to use piece of technology ... well, a lot of bad guys won't do it.
So. If the FBI succeeds in breaking the encryption used by Apple, Google, Microsoft, Twitter, Facebook and a few other big names, then they've got 99% of the guys they want.
Maybe it still misses the point because the FBI doesn't actually care about hitting hard targets.
If that is the case that is pretty sad.
Average crimes can be solved with average tools; we shouldn't be authorising access to phones or other electronic intercepts for crimes that don't go beyond average.
You could argue the San Bernardino case was beyond average, and you would probably be right. But the perps knew that too; that is why they destroyed the phones after they were done. Chances are they took other measures too, but no one will know, as they destroyed the devices.
To be clear, it's not that these laws wouldn't make their jobs easier - they almost certainly would. But they aren't needed, and implementing them would have zero effect on the actually hard targets they would, in theory, be useful for neutralising.
If Signal weren't available from any of the major app stores, such that it didn't work on iOS at all and didn't work on Android devices without turning on the intimidating option to allow non-Play-Store apps, how much usage do you think it would get compared to today?
Making real security hard to get would cause far fewer people to actually use it.
Of course, this is also the technology that 99.9% of U.S. federal staff use. Probably more like 100%, counting personal use of technology.
Across the board, enterprise security today is dependent on the security of consumer technology. Unfortunately, a lot of people in the federal government don't get this; they still think there is a difference between "secure federal technology" and what everyone else uses.
There's not even a Cabinet official charged with enterprise security for the federal government, so there is no one to even speak up. The FBI and intelligence chiefs are all very high-profile, and they are primarily concerned with access, so the federal conversation seems skewed. Which it is--it matches the skew of the federal mindset in general. The best technologists go to the breaking and entering teams. Meanwhile OPM loses 22 million accounts.
By analogy the lawyers can outlaw six egg omelets with butter and orange flavor. So, then McDonald's won't be able to sell them, but I can still make them in my own kitchen.
The lawyers can outlaw strong encryption on products from Apple, Google, Microsoft, etc. and, then, crooks who use those products can more easily be caught, and that will amount to nearly all the common crooks. But I can still get some simple, open source C code for some simple command line RSA or PGP de/encryption and use it for secure communications with others who do the same. And serious people will, and likely do.
I.e., just get the open source code for RSA from Schneier's book or look at the open source code in Zimmerman's PGP. Or just read Schneier's book and write your own code and make it open source for yourself and all people you want to communicate with.
So, to send an encrypted message in a file, from a smartphone, tablet, laptop, desktop, etc., copy the file to an old computer, if only via diskette, running PC/DOS and never connected to the Internet. Run the command line C program for encryption. Get the output file, in just simple base 64. Then copy that file to the smartphone or whatever and send it, with no attempt at security. Done.
This way, it doesn't matter what Apple, Google, Microsoft, do/don't do since they are just moving base 64 gibberish that is perfectly safe even if printed in the NYT.
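For illustration, here's a toy sketch of that pipeline, in Python rather than C: textbook RSA with the classic tiny demo parameters (p=61, q=53), encrypting byte by byte and then armoring the result as base 64 "gibberish". This is utterly insecure as written (real use needs 2048-bit keys and proper padding such as OAEP); the function names are mine, not from any real tool.

```python
import base64

# Toy textbook-RSA parameters, for illustration only -- made up,
# not secure. Real keys are 2048+ bits.
p, q = 61, 53
n = p * q        # modulus: 3233
e = 17           # public exponent
d = 2753         # private exponent: (e * d) == 1 mod lcm(p-1, q-1)

def encrypt_file(plaintext: bytes) -> str:
    # Encrypt each byte separately (toy only), pack each ciphertext
    # integer into 2 bytes, then emit printable base 64.
    cipher_ints = [pow(b, e, n) for b in plaintext]
    blob = b"".join(c.to_bytes(2, "big") for c in cipher_ints)
    return base64.b64encode(blob).decode("ascii")

def decrypt_file(armored: str) -> bytes:
    # Undo the base 64 armor, then apply the private exponent.
    blob = base64.b64decode(armored)
    cipher_ints = [int.from_bytes(blob[i:i + 2], "big")
                   for i in range(0, len(blob), 2)]
    return bytes(pow(c, d, n) for c in cipher_ints)

msg = b"meet at the tree"
armored = encrypt_file(msg)   # this is all the smartphone ever sees
assert decrypt_file(armored) == msg
```

The point of the sketch is the shape of the workflow, not the crypto: the networked device only ever handles the armored output, so it has nothing worth intercepting.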
The command line programs? Easy enough for middle school children to use. Simple.
Math 1. Lawyers 0.
Now what is there to argue about?
So, all this stuff about the FBI is just the village idiot playing public pocket pool, right?
They still know who you are communicating with. (Though there are ways to communicate without disclosing whom you are communicating with.)
If they manage to write a law such that it bans strong encryption from the App Store, that will achieve their goal. But any other scenario is at best a temporary gain.
The issue I take with the FBI approach is that it will have no effect on those who have something to hide but will destroy any semblance of privacy for those who don't.
Terrorists will use GPG, citizens will use their backdoored iPhone full disk encryption and everyone but the terrorists lose.
Keep in mind that the FBI would love it if everyone held that to be true. We all have something to hide, average person or not. No matter if you are a criminal, political activist, pervert, lawyer, priest, or just the baker down the street. Paraphrasing John Oliver: your banking statements, medical data, dick pics, private messages, dick pics, dick pics, and your secret diary are all things you most likely want to hide.
( https://www.youtube.com/watch?v=zsjZ2r9Ygzw )
The FBI would not mind a situation where only people with technological know-how use GnuPG, LUKS with dm-crypt, etc., and the rest whatever came with their smartphone. Most criminals are not particularly tech-savvy, so if that group loses access to strong encryption by default, they largely gain what they want.
Of course there is no practical way to make the use of strong encryption by individual citizens around the globe illegal, and they know this. They may however succeed in outlawing strong disk encryption available by default on store-bought devices. So when a suspect uses an Android or iPhone, and the FBI has his or her device, they want to be able to access its contents without the suspect's consent. Ideally, they also want the most popular messaging platforms (like Facebook's WhatsApp) to have a backdoor available, in order for the vendor to be able to comply with warrants for such data.
The people who need the protection Tor provides paint a target on their backs, because there isn't enough Tor usage for them to be inconspicuous. Which is sad, because for all of the bad uses of Tor, there are people who depend on it to preserve free speech, and any weakening of it could easily get them imprisoned or, in many cases, executed.
I gave some sources in https://news.ycombinator.com/item?id=10582206 that might change your mind. Changing the default makes a difference.
Apple could make it easy for a user to install such a library and then say (truthfully) that the cryptographic functions of their OS is not in their hands, since that feature is handled by a third party open source maintainer.
At that point it would be an infinite game of whack-a-mole for the FBI to try to get backdoors in open source crypto interface libraries which could be maintained outside of the US.
How? Keep the de/encryption software simple, dirt simple: just open source C code, run as a command line program on, say, an old PC/DOS system with no hard disk. For encryption, write the file to be encrypted from, say, a smartphone to a diskette, give the diskette to the PC, and erase the original file from the smartphone. Have the command line C code on the PC do the encryption and write the result as a base 64 file to a diskette; then, with no effort at all at encryption or security, let the smartphone read the diskette and send the file. Simple.
Just keep it simple, just dirt simple, really small, open source code.
The de/encryption has to be, what, just a few loops in C, in a few hundred lines of code? The rest is just dirt simple C file I/O, one byte at a time. So, you very much do not want some 100,000 line app with the de/encryption buried deep inside somewhere. And you want the de/encryption run on some other device, maybe an old PC/DOS machine with no hard disk and no network connection. Or get a Raspberry Pi running some simple operating system, where you can be sure no important data will be left on the computer.
The Encryption Tightrope: Balancing Americans’ Security and Privacy
There was another one that I watched back in June, but I cannot find the video at the moment. Should have bookmarked it!
In the rietta.com article, I linked to this hearing, starting with Susan Landau's testimony at https://www.youtube.com/watch?v=g1GgnbN9oNw&t=3h35m50s. But there are more must-watch portions of the hearing.
I like GnuPG and I use it regularly with work as does my team. And some of our clients do too, but not nearly enough. But my mom and dad are never going to use it. They do both use iPhones though. So being able to protect them is important.
Anybody here want to run for office and be a voice for tech rights?
Part of me hopes that there will be a significant data leak from whatever the NSA has stored. Maybe they'd realize that single point of failure sucks.
We should be careful when the FBI is advocating a capability to help investigate events that occur vanishingly rarely (and for which they have many other tools and avenues) versus the integrity of infrastructure that affects a wide swath of public security. When one looks at the entire balance, requiring a backdoor has a high chance of locking us into a much less secure state of affairs than if everyone is free to pursue and apply the best privacy/encryption measures possible.
This has already happened. An attacker going by the name Guccifer spent several years going around punking high ranking officials and ex-officials. Remember those goofy paintings of GW Bush's that hit the news a few years ago? Those were stolen and posted by Guccifer.
> your vote and influence are already bought and paid for?
Overly cynical. The standard whipping boy for 'buying politicians' is Big Business, and Apple certainly qualifies as that. In fact, this is an affront to essentially every big business in the world with IP to protect.
It really isn't, because many of the biggest businesses want a greater degree of population control, and having access to every individual's info is a means to that end.
Big business is on both sides of this conflict.
disclaimer: bernie supporter, and not a supporter of encryption backdoors
> “The head of the FBI came out with this recently. He says, ‘Oh, we’re going to ban encryption.’ And it’s like we want to build a backdoor into Facebook and a backdoor into Apple products,” the presidential hopeful said at the Yahoo Digital Democracy conference this week. “A backdoor means that the government can look at your stuff, look at your information, your conversations. … The problem is, is that the moment you build an opening — and I’m not an expert on coding or anything — but the moment you give a vulnerability to a code that someone can get into your source code, not only can the government, but so can your enemies, so can foreign governments.”
However, it isn't surprising that the FBI or similar agencies are full of egocentric people who believe they hold the power in the relationship. It seems to attract these types of people, really.
The better way to approach this issue, long term, is from a legal point of view, with an interim state where encryption holds us over. That is, the law decides who may or may not own or access a certain type of data, with penalties upon tort or criminality. And we develop civil protocols for data governance between people, and between people and governments.
Take trademark: you could have trademark, i.e. authentication, protected by mathematics, or you can have it protected legally.
Personally I don't believe the answer to data theft or surveillance is more mathematics in the form of encryption, but sensible laws regulating data, its use, and access, with penalties for transgressing. Obviously this would require international cooperation and would be a long way off, and in the interim we'd need encryption to protect against unauthorized access until we reach that state of data governance. But ultimately the answer is not "make everything a black hole".
We don't protect against thieves by building impenetrable houses, we rely on legal instruments to dissuade burglary.
Once someone has your data, you don't know what they're doing with it and neither does the government. Which means they have to be prevented from getting it in the first place, which means encryption and laws that encourage and facilitate encryption.
Because it existed before the invention of public key cryptography and is now permanently entrenched. If you think you can fix that, go do it and then make this argument after nobody is using SSNs anymore. Also, your argument for not deploying cryptography is "we should solve that problem cryptography would solve if it was more widely deployed"?
> Why should my medical records have so much value? Medical records have value mainly because it can lead to discrimination, so the solution to that is remove the value of discrimination (job, medical care costs, etc.) based on medical conditions.
You say "the solution" like all we have to do is snap our fingers and people will stop discriminating based on medical conditions even though doing so is highly profitable. The way the laws against that type of discrimination work is by preventing the discriminating party from obtaining that information.
Also, good luck passing or enforcing a law that says prospective mates can't discriminate against you based on your medical or mental health records. To say nothing of the outright violence that would result if the names of women who get abortions became known to the wrong people.
If a proposed system requires changing the world to make it feasible, well, that's probably not the best starting point.
What if criminals are willing to break those laws? (That's kind of the definition of criminals, after all.) Who cares, you say, because data isn't sacrosanct? Well, some of the data we'd like to protect is financial, and criminals can use it to steal my money, so I care.
What if foreign governments are willing to break those (US) laws? Again, who cares, you ask? Well, if I'm a company facing foreign competition, and the foreign government is willing to do economic espionage to help their companies, then I care. And if I'm the US government or military, I definitely care.
What if the US government is willing to break those laws? It'll never happen, you say? Read some history. It's happened before, and it will again.
The law only protects me against people willing to obey the law. I also need protection against those unwilling to obey the law.
Here is the source code for cryptsetup:
Here is the source code for GnuPG:
I have these files on my hard drive.
There are probably cryptographers who can recite the RSA/AES algorithms from memory.
We have the Internet and general purpose computers.
The only way you're getting rid of encryption is destroying all of that. I don't know why people feel the need to resort to arguments about economic damage, civil rights, whatever.
This is the fucking _Internet_ we're talking about. This isn't some biscuit tin with a particular pattern that Uncle George really, really likes. It's the god damn Internet.
You want to destroy one of the most beautiful creations humanity has ever seen, in the name of what? Stopping a few marathons being bombed?
Is the entire government clinically insane?
Would they turn the sky green if it gave them more power?
Am I still living in reality?
We're moving into a new era now. All it may take is a single attack in the US to drive the legislative and judicial branches to roll back all the fantastic improvements we've seen over the past few years.
The little command line programs on PC/DOS? Easy enough for middle school students to use; when that was state of the art computing, middle school students did use it.
I'm sorry, but I don't think pushing encryption down into the underworld is a viable solution to the problem. There's no limit to the bad laws that the government can pass. The only real long-term solution is to recognize encryption as a right, otherwise we'll only keep seeing these repeated attempts to outlaw it.
IANAL, but my understanding is that such a law would run into rock solid, granite hard, iron clad parts of a little issue called the US Constitution. E.g., if the cops ask you a question, you don't have to answer. The person's lawyer can just tell the cops, "My client has no idea what that base 64 gibberish is."
For encryption as a recognized right, no, that's asking a bit much of the US political system.
BTW, for the person receiving the base 64 code (that's the way JPGs, etc. are sent in e-mail), first decode the base 64 and then apply the receiver's private key to recover the secret message, e.g., where and when the boy and his girlfriend are going to meet and carve their initials on a tree.
Base 64 is part of the internet standard for e-mail called MIME, the Multipurpose Internet Mail Extensions. The idea of MIME is to permit sending pictures, audio, movies, etc.
So, in arithmetic, base 10 has digits 0-9, that is, 10 digits. Base 16 has, right, 16 digits, 0-9 and A-F. Base 2 has, you guessed it, 2 digits, 0-1. Well, presto, bingo, base 64 has 64 digits (A-Z, a-z, 0-9, plus '+' and '/'), all simple, ordinary printable characters such as e-mail had been sending right along.
Well, with 6 bits you can count from 0 to 63, that is, have 64 different patterns. So, given a stream of bits, you can replace each 6 of them with one of the base 64 digits. And there is a simple solution ('=' padding) for what to do with any few bits left over. So, that is how to take any stream of bits and 'encode' it into just printable characters that are easy to send via e-mail.
A huge fraction of all Internet data is sent as base 64. So, base 64 data alone is nothing suspicious.
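The round trip described above is a few lines of Python with the standard library (and worth noting: base 64 is just transport encoding, not secrecy):

```python
import base64

# Any stream of bytes -- text, a JPG, ciphertext -- can be carried as
# printable characters by regrouping its bits 6 at a time.
message = b"where and when to meet at the tree"

encoded = base64.b64encode(message)   # mail-safe printable characters
decoded = base64.b64decode(encoded)   # the exact original bytes back

assert decoded == message
# The 64 "digits" are A-Z, a-z, 0-9, '+' and '/', with '=' padding
# covering any leftover bits at the end of the stream.
```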
"And here our intelligence network shows proof that your client has talked about this base64 gibberish in the past with other people, so let's add perjury to your charges".
But your point is valid, you have a right to not incriminate yourself in the US. The case with Apple, however, is that a third party you've trusted is being asked to breach that trust. The 5th does not apply at all.
Not to worry, however, as long as you don't communicate with anyone, you're safe. The moment you do communicate with someone though, you'd have to put your trust in them. And then the FBI could demand, from them, the conversations you've had. And then the 5th has no value.
The US Constitution hasn't helped prevent the PATRIOT act, or the TSA's unreasonable search powers.
> A huge fraction of all Internet data is sent as base 64. So, base 64 data alone is nothing suspicious.
There's a difference between Base64 that decodes into a harmless cat picture, and Base64 that's apparently random. Unless we make it normal for everyone to have encrypted, random-looking data lying around, the few that choose to have it will be increasingly harassed by the government, even if they're not doing anything wrong.
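That difference is measurable, by the way. A rough sketch (the sample data and the use of Shannon entropy are just illustrative; real traffic analysis is more sophisticated):

```python
import base64
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: near 8.0 for uniformly random data, far lower for text."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = b"a harmless cat picture caption, repeated " * 50
random_blob = os.urandom(2000)   # stands in for well-encrypted data

# Both look like perfectly ordinary base64 on the wire...
for payload in (text, random_blob):
    wire = base64.b64encode(payload)
    # ...but the entropy of the decoded bytes gives the game away.
    print(f"{shannon_entropy(base64.b64decode(wire)):.2f} bits/byte")
```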
The FBI also wants to infiltrate communities of human rights activists. There are many reasons not to overly trust them.
"A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication." 
The FBI can lobby to change that, but that's not what they're doing. They're lobbying that OTHER laws (e.g. the All Writs Act) enable what they want. Frankly, that's absurd... how can a 1789 law overrule a 1994 law when you're talking about modern technology?
Ironically, the FBI is creating incentives for new tech startups to incorporate outside of the USA. If you're building a product which depends on reliable encryption in order to be valuable, why the fuck would you incorporate in the USA?!? You would alienate foreign customers who are suspicious of the US legal/surveillance apparatus. And you would be entering a murky legal landscape where it seems increasingly likely that, if your startup ever becomes big enough to be a target, the government will require some kind of key escrow or it will shut down your business or even jail you.
In the face of that much uncertainty, it seems like it would be asinine to incorporate in the USA.
Maybe there is space in the market for a business to commoditize offshore incorporation. Make setting up a Seychelles corporation as easy as setting up a US LLC. Build in as many legal protection mechanisms as possible, e.g. owning the corporation via a trust of which you are the sole executor.
Like https://stripe.com/atlas except outside the US.
To most people arguments like "it will help us find terrorists and pedophiles" and flawed analogies with doors and keys are much more appealing, only because they are easy to understand, while the opposite arguments sound philosophical, alien or carry less weight because they contradict what "the experts" (i.e. the FBI) claim.
But they shouldn't sound so: what's happening is wrong not just because of the practical fallacies of the pro-restrictions arguments (on which most discussions currently focus); it is wrong because the only way for government monitoring to be effective in the end is to outright criminalise secure encryption by everyone. It should be blatantly obvious that the most dangerous of their claimed target group wouldn't be dissuaded by the inconvenience of using custom/non-friendly software/hardware, so any lesser measure would be just useless.
And on that premise, how can many people not see how wrong it would be if one day we are called criminals for exchanging a truly private message with someone? In what words and with what simple examples can you make non-technical people see how bad this reality would be and how far it can stretch to things that they do care about? And that even if it didn't, it would still be fundamentally wrong...
Not a rhetorical question by the way, I've tried to participate in such discussions and failed miserably to be convincing -so any tips are appreciated.
All this sort of tactic will do is result in irreparable damage to the tech sector and the US economy. No murders, rapes, child abductions, terrorist plots to destroy buildings, or whatever other specters they summon to haunt us will be prevented.
I guess the threat's technically not here yet, but seems like it's right around the corner. djb, pack your bags for the ninth circuit.
The problem is that we now know that the government has the goal of unlawful surveillance without oversight from courts, the legislature, or the public. There is essentially an ongoing "by any means necessary" attack on civil liberties.
Why should we think that the government is not planning to infiltrate the escrow services and preemptively capture all keys?
There has been a profound breach of trust (revealed by Snowden) and we must insist upon the rule of law and basic democratic transparency before we consent to any further risks.
I try not to be cynical but I am thinking that the trend we are on is leading to strong crypto being largely criminalized. I am hoping that our decentralized systems adapt to this threat and offer solutions that cannot be shut down (like Bitcoin and Ethereum).
Incidentally, if Apple seems likely to lose the battle over a back door, it ought to offer an Ethereum smart contract that will unlock one phone every day, require a key provided by each member of congress (with 100% consent required to unlock a device), and publish all unlock key requests on the Ethereum blockchain after a 30 day delay in case an investigation is in progress.
This protects against mass surveillance, but offers a very small back door with full transparency and no potential for large scale use (or abuse).
What's the point of using encryption if you're going to put the keys in the hands of some unaccountable entity which is easily hacked? You might as well not use it at all then.
Nothing Snowden alleges is illegal or unconstitutional, despite his and his supporters' repeated assertions. 'I don't like it' does not imply 'it is illegal and unconstitutional.' Neither the law nor the constitution forbids all bad things (nor does either mandate all good things, but that's another issue).
You are correct, of course, that one major issue is that States cannot be trusted to respect their citizens' liberties. Another is that States cannot be trusted to take care of their own data: an escrowed key will shortly become a leaked key.
> Incidentally, if Apple seems likely to lose the battle over a back door, it ought to offer an Ethereum smart contract that will unlock one phone every day, require a key provided by each member of congress (with 100% consent required to unlock a device), and publish all unlock key requests on the Ethereum blockchain after a 30 day delay in case an investigation is in progress.
That doesn't make sense, since the Congress is the legislative branch and it is the executive which actually does things. What could make sense is a smart contract which is unlocked by the judiciary.
But we're a long, long way from enforced smart contracts which do things like handle succession of new members &c. I can't wait until we're there, someday, but it'll take a good long while.
First, I did not claim it was unconstitutional or strictly illegal (since secret things can't be considered by normal courts or legislatures, everything is in a sense extralegal, which is itself a big problem).
There are protections against unreasonable search and seizure. Seizure in this case is the data capture; search is the viewing by a human (with or without a warrant). There was no law granting the NSA the power to do domestic surveillance unless there was some suspected behavior involving foreign nationals or international circuits.
But it's more relevant that our leaders repeatedly assured us that no such surveillance was going on. Blatant lies told to the public are as serious a breach of trust as an unconstitutional program. Thus I think the technical constitutionality (or even strict legality) of the program is more of a detail than the core issue.
My suggestion of giving keys to each member of congress is simply for Apple to force the issue into the most democratic institution, effectively giving any member of congress veto power against unlocking the data. While this doesn't fit the exact delegation of powers that we're used to for such things, it does address the terrorism/kiddie-porn argument that Obama recently made... in other words, it addresses the meat of the emotional argument that our leaders are making.
>But we're a long, long way from enforced smart contracts which do things like handle succession of new members &c. I can't wait until we're there, someday, but it'll take a good long while.
I agree, but I think it might be a good strategic move for Apple if things start to look bad for Apple offering an unfettered secure-hardware / strong encryption platform. Apple could then claim that it had offered enough of a back door to prevent any interim attack should one occur.
The government's strategy is to leverage public outrage about any attacks (large or small) to get back doors into everything. Apple does not want to be accused of stonewalling if an attack occurs and the government subsequently claims that decrypting one or two phones a few weeks ago could have prevented it.
Is there someone with a sound technological understanding of encryption who thinks we should have some back door / key escrow / master key? I've seen that Fred Wilson and other USV partners seem to think the FBI's requests are reasonable, and usually I trust their analysis. But this whole thing just seems like such a bad idea.
> All key-recovery systems require the existence of a highly sensitive and highly-available secret key or collection of keys that must be maintained in a secure manner over an extended time period. These systems must make decryption information quickly accessible to law enforcement agencies without notice to the key owners. These basic requirements make the problem of general key recovery difficult and expensive -- and potentially too insecure and too costly for many applications and many users.
> Attempts to force the widespread adoption of key-recovery encryption through export controls, import or domestic use regulations, or international standards should be considered in light of these factors. The public must carefully consider the costs and benefits of embracing government-access key recovery before imposing the new security risks and spending the huge investment required (potentially many billions of dollars, in direct and indirect costs) to deploy a global key recovery infrastructure.
I for one would follow companies moving abroad to do the right thing and avoid these shenanigans entirely.
On top of it being the right thing, it is also in Apple's economic interest to fight this since the US is but one market. They also happen to have enough cash to flat out threaten to move all affected products out of the US and have them built by a non-American subsidiary.
Signing a software update with a private key you already have is using crypto as it was intended. We presume private keys can be kept secure with enough effort, or public key encryption doesn't work, https doesn't work, software updates don't work, game over. Any attempts to get private parties to turn over their private keys should be strongly resisted, but requiring them to sign something given a search warrant adds a procedural step that acts as a check on government power (they can verify that the search warrant is valid, minimize scope of the change, and fight it in court if necessary).
Key escrow is a whole different thing, where they require a whole new system to be designed to preserve keys that would normally be destroyed. It's hard to preserve information when the user wants to destroy it (they can block network traffic and destroy the phone), resulting in all sorts of bad effects on system design and new vulnerabilities.
So, what happens once the keys eventually leak? When nation states AND terrorist organizations get the keys to unlock everyone's encryption?
If "think of terrorists" is the rhetoric here, what happens when THEY have access to our devices?
* A fuse that when broken reduces the cost of cracking the security from 'impossible' to something an organization with large resources, such as the FBI, can do in a day.
* When the fuse is broken, a message is displayed to the user indicating it. At least, the device might not boot, tipping off the user that something is wrong.
It would meet these requirements (am I overlooking any important ones?):
* Nobody could mass-crack the devices. To crack it, you would need the device in your possession and a day of significant computing power.
* It would require a significant investment of resources, so it wouldn't be done for trivial issues.
* Users would know when their device has been cracked: It would have to be out of their possession for 24 hours and they would be notified.
The question is, could such a thing be implemented in a way that it couldn't be hacked (without great difficulty)?
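A back-of-envelope check on the "day of significant computing power" requirement above. Every number here is my own illustrative assumption, not a real benchmark:

```python
# After the fuse is blown, the residual keyspace should take an
# FBI-scale effort roughly a day to brute-force. How big can that
# keyspace be? (All figures below are illustrative assumptions.)

guesses_per_sec_per_node = 1_000    # key stretching keeps this low
nodes = 10_000                      # a large, well-funded cluster
seconds_per_day = 86_400

daily_capacity = guesses_per_sec_per_node * nodes * seconds_per_day

# Largest keyspace searchable in about a day on average
# (expected work is half the keyspace).
bits = 0
while 2 ** (bits + 1) / 2 <= daily_capacity:
    bits += 1

print(bits)  # residual key size, in bits, the blown fuse would expose
```

With these assumptions it comes out to a roughly 40-bit residual key; tuning the stretching work factor moves that number up or down.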
That would force the FBI to reconcile the costs of maintaining a multi-trillion dollar insurance policy with the expected value of a potential reduction in terrorism. When it comes to the US government, money seems to talk louder than anything else...seems like it could possibly work.
I would imagine that the biggest cheerleaders of the FBI's side in the general public are Republicans. Can you imagine them ever supporting this?
It is utterly and completely unsurprising that all the government agencies involved in law enforcement on all levels will fight a never-ending fight against anything that impedes their ability to do what they think their job is.
They don't care about privacy or any of that, and see restrictions to getting full access to all data as we see bugs in our systems that keep them from running correctly.
I will choose to encrypt outside of what they OS does and to hell with every other idea.
Besides, all one has to do is encrypt, use one-time pads and keys are largely irrelevant.
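For what it's worth, the one-time pad really is that simple to implement, though "keys are largely irrelevant" deserves a caveat: the pad is itself a key, must be as long as the message, pre-shared, and never reused. A minimal sketch:

```python
import os

def otp(data: bytes, pad: bytes) -> bytes:
    """XOR with a truly random pad: information-theoretically secure.
    XORing the ciphertext with the same pad recovers the plaintext."""
    assert len(pad) >= len(data), "pad must be at least message-length"
    return bytes(b ^ p for b, p in zip(data, pad))

message = b"meet at the usual place"
pad = os.urandom(len(message))      # must be shared out-of-band, used once

ciphertext = otp(message, pad)      # encrypt
assert otp(ciphertext, pad) == message  # decrypt
```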
It even seems like Comey is aware that the particular technique he is asking Apple to use will not apply to future phones made by Apple, and that the reality is that cryptography will soon reach the point where the FBI will not be able to rely on decrypting data as part of their investigative approach.
Yes, if Apple creates this exploit, then that exploit will potentially be available to other state actors and criminal enterprises. But only for iPhones older than the 5s. Fundamentally, as other commenters have pointed out, this is not creating a backdoor -- this is using an existing backdoor to install a bigger backdoor. It is, of course, possible that in the future even newer iPhones might find themselves vulnerable to unauthorized decryption, but the FBI's ask in this particular case really is narrow in scope, because it applies only to older phones.
I don't see the slippery slope here that many people seem to think exists. The slope begins and ends with a phone released in 2013. All any person (criminal or otherwise) has to do is buy a newer iPhone, and then this whole discussion no longer applies.
There is no reasonable interpretation of the All Writs Act (which regards subpoenas) that could be read to force Apple to preemptively make their OS insecure. If they include a backdoor in future versions of the iPhone, or if the FBI discovers a vulnerability, then it is entirely possible that they could use the same precedent to force them to open the backdoor for them, or even give them a metaphorical prybar for the backdoor.
But the point is that Apple is rapidly moving towards (and in their opinion, has already achieved) a hard stop -- they no longer possess the technical capability to break a locked iPhone after the 5s. For a specific case, and a specific subpoena, there is no work that Apple can do to comply with the subpoena.
And, like I said, it seems clear that Comey understands this, and is not asking Apple to weaken security in the future, and I see no indications that he is asking for that.
Edit: wanted to commend Representatives Issa, Lofgren, and Johnson.
All this has been ignored before and all this will be ignored again.
Let that sink in before we consider the validity of their proposal.
Their escapades will not be limited to criminal investigations, but whatever they want. There will be no oversight and access will be unlimited.