Backdooring stupid.crypt and forcing law-abiding people to use it just ensures that the big bad guys will use some other kind of encryption. All you've really accomplished is adding an extra charge of illegal encryption use, at the expense of security for every human.
This potentially creates all sorts of pathologies. Is it illegal now for me not to update an old computer? If your backdoors are implemented in hardware, is it illegal to use old computers?
When people are against gun control, a common thread is "make guns illegal and only criminals will have guns." This argument has merit, but if we DID amend out #2 and make guns illegal, over time firearm proliferation would decrease.
Not so with encryption. Other, more free countries will constantly be developing better security methodologies, and reproducing those methods is effectively free. "Fuck up encryption, then only bad guys will have encryption" is a much stronger argument, because it's emphatically true.
The ignorant hubris of this is massively disheartening.
Yeah. There's no distinction whatsoever between encryption with backdoors and no encryption at all. Imagine our current web with no encryption. Your logins are all effectively plaintext; your online shopping is effectively plaintext; your emails are all effectively plaintext. "Furiously stupid" is a good way to describe this whole proposition.
Hmm, then wouldn't some people just make their own firearms, just as you are describing with encryption, right?
Given blueprints (publicly available) or a template, accurate enough measurements, a lathe, and a mill, anyone can make a firearm or parts for one in their garage.
Is there reading involved? Yes. But any argument you make w.r.t. the futility of outlawing encryption is immediately portable to firearms manufacture.
Can you imagine asking every gun owner/computer owner to go to their local police station to surrender their guns/functional encryption?
That would be pretty spooky to me.
Not trying to make this a gun control debate, but for the longest time encryption was considered a munition, so it's not THAT much of a non sequitur.
And, I think what you're adding here is that I've got an error in my statement that both parties will happily build their own firearms/encryption because the physical gun is harder to distribute than a copy of software.
And I agree in principle with this, until I realize that broad distribution of an encryption mechanism is exactly what a bad-acting government would want... crack once and everyone is compromised.
So, no, I think I would argue that it's easier to distribute weapons than good, bespoke encryption.
And further, I would argue that if it is true for encryption, it is also true for firearms... that if they are outlawed, the power shifts to criminals as they will still use them.
The argument is a tautology, it can't be wrong! If gun ownership is a crime, then owning a gun makes you a criminal.
The tautology is compatible with the hypothesis that if guns were confiscated and illegal, eventually there would be a decrease in the amount of people getting shot. Probably an increase for a while as confiscation attempts resulted in agents getting in gun battles with people who don't want to surrender their property.
Whether the loss in life and liberty is worth the outcome is a matter of personal taste.
Even if that is true, "decrease" is not remotely equivalent to "eliminate".
The problem is that law-abiding citizens, and those who have their weapons forcibly taken by law enforcement, are left completely unable to defend themselves, while criminals are not completely unable to acquire firearms.
I keep seeing this "implausibility" of enforcing illegal encryption brought up, and I really think it's wishful thinking. If such encryption algorithms ever are made illegal in some manner, it will be trivial for the government to get the result they want.
It won't be about completely stopping people from using AES, nor will it be about imprisoning every person who continues to use it. What it will be about is turning "this target of our investigation is using illegal encryption" into an immediate cause for search/arrest warrant. And that will be more than enough for 95%+ of the purposes they're looking for.
It's stupid, sort of like a Fourth Amendment onion router.
Back doors are worse though - build a back door and it will be used, just not necessarily by the agency it was built for. There are a lot of groups with a lot of resources oriented around taking advantage of this, and few are legitimate. (and some are enemy nations).
There's a third problem: doing it in such a way that it can't be blocked from monitoring (see "Clipper chip" for more on that).
Also "technical solutions" makes it sound like the issue is in inventing the correct encryption scheme. Whereas in reality the issue exists because we have discovered (currently) unbreakable codes, and the invention of broken (backdoored) schemes does little to change that.
If we break all known forms of encryption, and find a reasonable proof that they are no longer possible, then I'll be more interested in this line of reasoning. And that's a pretty big if.
That basically means we have to entirely get rid of copyright, since all data (books, movies, software, corporate secrets, state secrets, etc) are just very large numbers.
Do we believe that there should be no restriction on the sharing of any data? I can see the appeal, but there are far reaching consequences if we say that.
What A.G. Barr is insinuating is to regulate algorithms.
Copyright is regulation of implementations.
For example, GPG is a software implementation of encryption algorithms. It has a copyright (used as the basis for its copyleft license). RSA, however, is an algorithm: a mathematical reality that can be described by copyrighted works, but never itself copyrighted.
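To make the distinction concrete, textbook RSA is small enough to sketch in a few lines. This toy version with tiny primes (illustrative only, utterly insecure) shows that the "algorithm" is nothing but modular arithmetic, which no copyright can attach to:

```python
# Toy RSA with tiny primes -- illustrative only, never use for real security.
p, q = 61, 53
n = p * q                 # modulus
phi = (p - 1) * (q - 1)   # Euler's totient of n
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)    # encrypt: msg^e mod n
plain = pow(cipher, d, n)  # decrypt: cipher^d mod n
assert plain == msg
```

Any implementation of this arithmetic, however it is written, computes the same mathematical function.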
A.G. Barr has expressed a desire to compel every American who implements that algorithm to do so incorrectly.
Words are just data. Are there illegal combinations of words to exchange? The law says, YES. Some speech is absolutely illegal, including making credible death threats, conspiring to break other laws, or disclosing certain state secrets to foreign powers.
Very few people argue that since words are easily available to everyone, that it is futile to make some combinations of words illegal.
Words uttered in a situational context that renders them of immediate harm are illegal. I can say "Fire!" in a theater while giving a lecture or putting on a show. I cannot knowingly claim the theater is on fire when it isn't in order to cause a panic.
Point is, it is not the Word or content that is illegal. It is the union of word and context that is illegal.
Subtle difference, but it's the only thing that keeps that type of law from getting absurd and out of hand very quickly.
The number is not illegal, it’s the number in conjunction with a situational context that is illegal.
We may disagree with the intent of the law, but the argument that we are making numbers illegal, or math illegal, is parallel to the argument that other laws make words illegal.
Ok maybe you could catch me attempting to decrypt it, and be like "gotcha, that was in fact a secret!" But I'd reckon it would be more effective to simply wait until you finish decrypting the data, and simply take it from you.
If there are going to be laws around this, it's sure to be very pathological, and scary.
If the context around the number is that it is stored in a .mkv file with a name that looks like a Disney property, or in a .key file attached to a program that uses such keys for some kind of encryption the government unwisely bans, well, the number suddenly has context around it that supports an argument about the number and the context together.
Same for words, really. Words about a threat to a government leader are probably fine in a text file that looks like a short story. Those same words in combination with a history of advocating violent revolution, &c. might make for a different argument.
In that sense, copyright = data, and encryption = functions.
All images are binary. All binary is just a number. We have made many such numbers illegal and even have software that will detect them and report you when you share the number with such number sharing services (dropbox, facebook, etc).
So making math illegal sounds entirely possible.
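The detection software mentioned above typically works by hash matching: compute a digest of the file and compare it against a list of known-bad digests (real systems like PhotoDNA use perceptual hashes, but the principle is the same). A minimal sketch, with a hypothetical blocklist:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known "illegal numbers".
# (This example entry is just the digest of the empty byte string.)
BLOCKLIST = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_flagged(data: bytes) -> bool:
    """Return True if the data's digest appears on the blocklist."""
    return hashlib.sha256(data).hexdigest() in BLOCKLIST

print(is_flagged(b""))       # matches the example entry
print(is_flagged(b"hello"))  # not on the list
```

Note that flipping a single bit of the file defeats exact-hash matching entirely, which is why this only catches people sharing unmodified copies of a known number.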
Math can be represented in a variety of ways, but the pattern being described is immutable.
What A.G. Barr is insinuating is not that we make implementations illegal, but that we make the use of algorithms categorically illegal.
The target here is not nerds able to pull code from GitHub or run open source or enterprise software. The target here is consumer stuff by companies like Apple and Google. The government doesn't want it to be easy to do end-to-end encryption.
For the average user, easy equals possible. The average user has neither the time nor the expertise to roll their own solution or run nerd tools. Look at how PGP/GPG's complexity and absolutely horrible UX (even for technical users!) has prevented e-mail encryption from ever taking off.
This reminds me of what a government guy told me about crypto export controls. Yes, they know that crypto export controls won't stop nerds using GitHub. What they want to do is to stop IBM, Google, Apple, Cisco, Juniper, etc. from selling ready-made polished crypto products to blacklisted countries.
In both cases I think the target is large corporations not individuals and the goal is to make crypto hard and keep it out of the hands of the average user or less-technical foreign organization.
That being said I still don't think it'll work. Just pointing out the thinking that's going on here.
The problem is that this either shows a stunning amount of ignorance or deliberate malice.
Let's just go back and consider that the government does not want the average user to have strong encryption. What is the play here? The average user is almost by definition not the bad guy, unless we consider the population at large to be criminals by default. Is the government trying to dragnet the entire population and keep everyone under the thumb for minor infractions? Because that's the only feasible target here. Barr can froth at the mouth, mad as the dickens, it won't prevent Bad Guys from using strong encryption. So his only feasible target is the (mostly) law abiding population.
The other point, preventing the likes of Google, IBM, Apple, et al. from selling devices with strong encryption to blacklisted countries, again shows either ignorance or malice. As parent wrote, encryption is just math. Are the government agencies so shockingly uninformed that they think that, in the absence of secure iDevices, North Korea will be forced to use backdoored technology?
The spread of physical goods can be controlled (to some degree), but the spread of information can at best be slowed down, but not stopped. Doubly so if there are already existing methods of secure communications that the government cannot efficiently crack.
The only conclusion I can come to is that they are well aware that they cannot catch any serious Bad Guy using mandated backdoors. Serious Bad Guys will use strong encryption anyway, they will cover their tracks and won't care what is legal or illegal (in the US). Furthermore, against targets like these, there are already time proven methods of infiltration, social engineering and good old fashioned bribery.
This only leaves the option of taking secure communications away from the population at large, perhaps because the government feels threatened from too many people being able to share ideas? I was never one for tinfoil hattery, so my hope is that I'm wrong.
If you have not handed your private keys over to anyone, they should be yours alone, but once you have uploaded your private keys to a corporate cloud server, you may have to accept that law enforcement will be able to get warrant access.
This won't solve the problem for law enforcement, but it will make it easier to catch lazy people while preserving the option for full security for those who want to control their own data.
By banning 'the masses' from using encrypted communications, it'll sort the haystack and everyone who continues to do so can be profiled, plus they're already involved in illegal behavior.
Worse, for secrets we actually care about (nuclear codes?) we must still research proper encryption schemes, since a backdoor is, as far as I've come to understand, fundamentally an admission of weakness in a security protocol.
> We are confident that there are technical solutions that will allow lawful access to encrypted data and communications by law enforcement without materially weakening the security provided by encryption. Such encryption regimes already exist. For example, providers design their products to allow access for software updates using centrally managed security keys. We know of no instance where encryption has been defeated by compromise of those provider-maintained keys. Providers have been able to protect them.
This quote from the article seems to contradict itself. First it claims "... without materially weakening the security provided by encryption" then goes on to state "We know of no instance where encryption has been defeated by compromise of those provider-maintained keys" implying that there is a possibility of this kind of breach.
This whole thing seems pretty plainly like an oligarch's attempt to spy on its people. Where is the liberty and freedom in this?
A lot of weight rests on those two words: "main problems". The main problems, for the government, are that criminal investigations are being impeded. By banning certain forms of encryption, they can criminally charge a suspect for merely refusing to decrypt data. And you can bet that the penalties will be stackable, allowing the government to use its discretion and perhaps charge someone with separate counts for each file he refuses (or is unable ...) to decrypt. IANAL, but I've also heard of the "foregone conclusion" doctrine, which somehow allows the Constitution to fly out the window and lets the government imprison someone indefinitely until they decrypt the files. So, sadly, this ban does solve the main problems, at considerable expense to citizens' liberties.
- Citizens would be allowed to encrypt, but they'd be required to keep a set of the keys used, or else they could risk prosecution.
- There could be a government cloud server where you "securely" upload whatever keys you use (or, realistically, probably outsourced to companies like Equifax, which would then charge you a fee to do so).
- Existing cloud providers would be required to detect when clients were using encryption-looking libraries/subroutines and store a copy of the keys in some registry.
- This could ultimately lead to "whitelist-only" software libraries, so that you cannot run anything on the cloud without building it with their dev environment, so they can be sure you're not secretly encrypting things.
- Going even further, this could lead to deep packet inspection that simply detects encrypted transactions and queries them against the gov key registry to "make sure" they are properly decryptable. Any failures to decrypt could trigger an investigation.
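Detecting "encryption-looking" data, as the last two points imagine, would in practice mean something like an entropy test: well-encrypted bytes look statistically random. A rough sketch of such a classifier, with an arbitrary threshold of my own choosing:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte, from 0.0 (constant) to 8.0 (uniform)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    # The threshold is an arbitrary illustration; well-compressed data
    # also scores near 8.0, which is one reason such detection is unreliable.
    return shannon_entropy(data) > threshold

print(looks_encrypted(b"hello world " * 100))   # English-like text: False
print(looks_encrypted(bytes(range(256)) * 16))  # uniform bytes, like good ciphertext: True
```

The false-positive problem is the point: ordinary compressed media (zip, jpeg, video) would trip the same detector, so "failure to decrypt" investigations would sweep up mountains of innocent traffic.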
You arrest me, scan my file system and find something named "plan.txt" which is just a bunch of gibberish... what do you do?
EDIT: I'll argue that the "main problem" is that as long as real encryption schemes exist, this is impractical to enforce.
> arrest me, scan my file system and find something named "plan.txt" which is just a bunch of gibberish... what do you do?
Well, start by scanning every executable binary on your system. If they find a custom-rolled program that doesn't mark its encrypted files with known headers (for contrast, openssl adds the prefix "Salted__" to any file it password-encrypts), they can allege that you're using a clandestine encryption scheme and that "plan.txt" is one of its files. So again, the burden of proof would be on you to explain what that file was for, which can come at tremendous legal cost.
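For reference, password-based `openssl enc` output begins with the 8-byte magic `Salted__` followed by the salt, so naive "known encryption" scanning really is as simple as this sketch:

```python
import os
import tempfile

def has_openssl_header(path: str) -> bool:
    """Check for OpenSSL's password-based file magic: b'Salted__' + 8-byte salt."""
    with open(path, "rb") as f:
        return f.read(8) == b"Salted__"

# Demo with a fake file that carries the header (no real openssl needed).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"Salted__" + os.urandom(8) + b"...ciphertext...")
    fake = f.name
print(has_openssl_header(fake))  # True
os.remove(fake)
```

Which is exactly why a custom tool that omits any recognizable header leaves investigators with nothing but "this file is high-entropy gibberish", and the burden-shifting described above.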
As someone who likes to be free to use the computers I own, this scares the shit out of me.
Encrypted data is information.
Encryption algorithms are math. Math can be expressed with data, but the immutable intangible reality that is being expressed is not information, nor property.
By your logic, the government can argue banning encrypted data, and encryption algorithm implementations.
The latter hits close to the mark of what A.G. Barr is insinuating. It would still be a significant step for a government, especially the U.S. government, to ban the implementation of specific algorithms. That would equate to banning the writing of specific mathematical formulae, which is equivalent to censoring speech.
The wikipedia article on the topic does have some decent links to specific examples from reputable sources and is probably a good place to start - https://en.wikipedia.org/wiki/Parallel_construction
They are just too lazy to do that. They want to go on fishing expeditions and are mad they can't.
Yes, it is incomprehensibly expensive for the government to do investigations on everyone. That's the point, and it's how things were before everyone had digital artifacts.
Encryption relies on a secret. It's like burying a treasure in a place only you know, and keeping the location a secret (e.g. in your head). Encryption just gives you a huge digital space where you can bury your treasure instead of a physical space where you can bury it.
Sure, people can just search everywhere for your pirate gold (brute-force attack), use advanced reasoning to narrow the search space, like "you lacked the means to 'bury' it in solid stone" (cryptanalysis), develop technology to speed up the search like ground-penetrating radar (e.g. GPUs, asic, special purpose programs) or try to coerce you to reveal the location (monkeywrench-to-knee passphrase cracking).
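To put numbers on the brute-force option: the "digital space" is big enough that searching it exhaustively is not a matter of patience. A back-of-the-envelope sketch, assuming a generous trillion guesses per second:

```python
# Back-of-the-envelope: exhaustive search of a 128-bit keyspace.
keys = 2 ** 128
guesses_per_second = 10 ** 12          # a generous trillion guesses/sec
seconds_per_year = 60 * 60 * 24 * 365

years = keys / (guesses_per_second * seconds_per_year)
print(f"{years:.2e} years")  # on the order of 1e19 years
```

That is roughly a billion times the age of the universe, which is why the coercion and cryptanalysis routes get all the attention in practice.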
What the government wants is for the maker of the shovel you used to bury your treasure not only to track where you took that shovel, but also to hand that information to the government without telling you the government got it.
No, I couldn't. The operative word there is not "shoot", it's "warrant." The fourth amendment explicitly makes an exception for warrants. If the government has a warrant then I am legally bound to hand over my keys. If I don't, they can put me in prison for that.
I have no idea how you could possibly reach this conclusion. My initial comment did not contain the word "warrant".
By "government overreach" I mostly meant spying on me without a warrant, e.g. the activities brought to light by Edward Snowden, and the common practice of seizing devices at the border.
From your original comment:
>The fourth amendment to the U.S. Constitution guarantees that the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and the ninth amendment to the Constitution guarantees that the people retain un-enumerated rights. I, as a citizen of the United States, maintain that one of those unenumerated rights is my right to employ technological defenses against government overreach.
I read two ways to interpret that:
- "Government overreach" is to include even searches authorized by a warrant in which case you are defending unrestricted use of encryption.
- "Government overreach" is to only include warrantless searches, in which case you are only defending using encryption in a manner in which a warrant can break the encryption.
Considering the second option is basically the government's position and the rest of your post seemed anti-government, I thought you were advocating for the first interpretation.
The second option is the government's ostensible position. But you seem to have forgotten the central point of my argument, which is that the government is not trustworthy. The government may say that it will only use its decryption keys when it has a warrant, but history shows that it cannot be trusted to keep its word on matters like this. The government does end-runs around Constitutional rights regularly. Therefore, the power to enforce the Constitution's constraints on government action cannot be entrusted to the government. It must remain with the people.
>The problem with this line of argument is that it is a general argument against government and not specific to this issue.
If your argument is that you can't trust the government, you can't trust the government regardless of whether they have a warrant or whether they are operating in the digital or physical world.
If they have a warrant, what exactly is it that you think I need to trust them about at that point?
Cryptographic back doors are totally different. It is not possible to build a back door that has auditability built into its basic physics the way warrants do. That's the thing that William Barr doesn't understand. His mindset is something like, "If we can send a man to the moon, surely we can make a way for law enforcement to break encryption that doesn't threaten people's rights." Well, no, we can't. Sending a man to the moon is merely difficult. A back door that only "the good guys" can use is actually impossible.
This entire write up stinks and I don't trust the government to implement this overreach in any sort of way which benefits the average American citizen. :(
If encryption is being used to hide "valuable criminal evidence", how is that different from someone hiding evidence by burying it somewhere or simply destroying it?
We don't detain random people and force them to give up locations of bodies they may or may not have buried, and we don't randomly search people's houses and possessions -- and we shouldn't be doing the same for encrypted data (and this includes requiring backdoors). If there is other evidence to believe a particular person committed a crime, then get a warrant that compels them to give up the location of the body or the encryption key. If they refuse, then depending on the other evidence used for the warrant, it might make sense to hold them in contempt.
In my mind, decrypting data to prove your innocence (in the face of other evidence) is vastly different than decrypting your data because law enforcement is on a fishing expedition (no other evidence).
This is a slippery slope.
That discussion has occurred several times. The government keeps talking after they hear the inevitable "no", at which point it's no longer a useful discussion.
There are no possible alternatives. Every possible alternative is equivalent to a backdoor. Any continuation of a discussion leads to "can we have a backdoor".
"Can we continue the discussion" amounts to "you haven't given us a backdoor yet".
There is a useful distinction to draw, though. There are two versions of "no". Some people use "no, that's not possible" (e.g. for technical reasons or because it'll break security properties), in which case the response either involves asking someone else or trying to legislate without knowledge. And some people use "no, we won't do that" (because it's working as designed and we're not looking to reduce security), in which case the responses involves anger and something roughly equivalent in content and tone to "why do you hate (insert country name here)".
Other useful variations on "no": "and what would you have us say when a country you don't like comes to our offices in their country and asks the same question". That one seems to produce slightly more thought, but ultimately an entitled response suggesting there should be some way to prefer their particular jurisdiction over all the others.
This doesn't matter. Our rights are not premised on the ultimate physical availability of any given piece of information. There's no "we can always break into the safe" provision of the 4th Amendment.
Fundamentally, the government does not have the right to any piece of your information. A warrant grants them the temporary right to employ certain techniques to try to get it.
Safe manufacturers make the strongest safes they can, and in parallel, the government develops their own capabilities to attack those safes to execute warrants.
The same thing is true for encryption. At its base theory, encryption is just math--but it is implemented in software, and software is imperfect. The government can, and does, attack devices to break encryption systems to get what it needs.
In fact, the Justice Department Inspector General found that the FBI did not go far enough in trying this, before it tried to sue Apple in 2016. And ultimately the FBI did get into that iPhone by breaking it.
This is untrue. The equivalent thing that anybody has been able to do for a thousand years is keep their written down secrets in an undisclosed location. If the police don't know where you keep them and you don't tell them, they have never been able to read them. Finding an anonymous storage unit among millions is no easier than guessing the user's password.
But you could put them under covert surveillance ahead of time to find the location, you say? You can do the same thing to get their password then.
Uh... what? There is plenty in the law concerning modern digital encryption. Ciphers have been around for thousands of years. If by "our laws" you mean the constitution, Benjamin Franklin apparently didn't think encryption was worth restricting during the constitutional convention, and that's not because he did not know about it.
> Its existence has potential to be a huge shift in how we enforce the law. Regardless of our views on encryption, we need to have a conversation about that shift. Refusing to have that discussion is likely a quicker path to things like government enforced backdoors than if we engaged with government and law enforcement on possible alternatives
You're acting like this is a new debate, but this is something DOJ has been on about for a long time. If the past is any guide we'll certainly "have a conversation" about it when the DOJ begins attempting to put people using or providing forms of encryption they don't like into prison, just like they tried to do 30 years ago.
The government has done an astounding job of showing they are untrustworthy with access to our personal information, or frankly even their own information as the OPM breach makes painfully apparent.
Cryptography is the one thing in the world that isn't easily defeated by the absurd amount of violence States are willing to commit in the interests of controlling society.
That's a feature, not a bug.
I for one am tired of using coercion to define society and we would do well to embrace anything that disempowers violence.
If you have a back door, it’s there for everybody not just the people it’s intended for. Additionally, there’s not going to be a way to force people to use the encryption that happens to have a backdoor.
It’s an algorithm. People who don’t obey the rules will just use a more secure method when they need to protect something.
This is why there’s no point to having the conversations except to explain it to people.
Also focusing on enforcement is making the perfect the enemy of the good. What percentage of communication in this country flows through either Apple, Google, Facebook, or Amazon? A solution that works for those 4 companies would be a huge step even if it wouldn't result in 100% coverage.
And just to be clear, I don't think the answer is necessarily backdoors in encryption. But I recognize that there is a problem and that we should be open to talk about ways to fix that problem.
>What percentage of communication in this country flows through either Apple, Google, Facebook, or Amazon?
You make it sound like data collection and snooping on innocent people is the whole point of it. If backdoors are required on communication that goes through Apple, Google, Facebook, and Amazon then nobody's going to use the communication on those services for illicit activity that the government would care about. They would use something else.
>But I recognize that there is a problem and that we should be open to talk about ways to fix that problem.
There is no talk to be had, because the entire idea is silly. If the US government can mandate backdoors then so will every other government. This would make everyone vulnerable.
You either protect everybody or you make the compliant vulnerable.
Not trying to be snarky here: I don't understand what this conversation looks like. What does it look like? What purpose does it serve, and what is its ultimate goal?
Also, the importance of being able to get away with crime should not be overlooked. It wasn't that long ago that being gay was illegal. And ICE is operating concentration camps this very second.
Should we now require all buildings to, by law, record audio conversations in case such conversations might one day be "needed" by law enforcement? Or perhaps there are other ways to perform targeted wiretaps?
It was the development of telephone communication, and warrants for wiretapping, that first made such private communications accessible to the government.
What number am I thinking of?
People have been hiding information and assets via buried treasure for millennia, with the key (location) only accessible in the brain of the one burying it.
Is it easier now? Sure. But there've always been physical analogues.
If you obtain a warrant to bypass that lock, then you have the right to compel me to hand over the keys. In this case, that metaphorical "key" would be my "private encryption key".
The point where this metaphor breaks is when I either refuse to provide that key, or have lost/destroyed it. On one hand, it's trivial for a physical lock to be bypassed, either by picking it or destroying it, thereby allowing you to "get inside" and (the end goal) "search". Of course, to "search" encrypted data does not involve "getting inside". It involves decryption.
The law can say anything it wants -- math is still math.
To be clear, I’m opposed to widespread access and would want a warrant at a minimum but honesty compels me to note that there are crimes which would be solved if someone used, say, SMS but not Signal and we should consciously accept that as the cost of not living in a surveillance state rather than pretending it’s not true.
Meanwhile the connected and savvy minority coddle pedophiles and grifters among their lot.
These are not really new concerns or ideas. The context has shifted from “meatspace” to “cyber space”. Generally the old ideas of trust and verify, avoid unenforceable, spurious, overreach still apply.
There’s an interesting parallel to the Pareto principle here, IMO. Society is pushing for more and more policing of the 80%ish and less on the 20%ish.
Wealth inequality, and civil rights inequality, filter into our tech contexts.
Too bad we largely focus on these things in our favored context rather than see it as the general political plight of the masses, as it really should be considered, IMO.
I agree that this is the real discussion, but I think that there is indeed a vibrant, worldwide discussion happening on this topic with more frequency and intensity than has ever been the case before.
The writing on the wall is unambiguous: the internet is an evolutionary force whose trajectory and destiny are to deprecate monopolistic government. This has already been shown convincingly with respect to censorship. It is increasingly obvious with respect to intellectual property and remix art. On the horizons are monetary policy and policing.
The humane and sane approach here is to get out of the way and let evolution run its course. Every time the state insists on pre-information age norms, it sounds to me like a whining adolescent, surprised that some of its childhood toys have broken.
The old model of investigative surveillance is broken - broken because of cameras which can reveal the conduct of (uniformed or undercover) state agents, broken because instantaneous worldwide communication moves much faster than bureaucracy, and yes, broken because cryptography.
That is just assuming the premise. Destruction of evidence is a crime, but destruction of private lawful communications is not. The FBI has no right to a married couple's sexting.
The usual case for destruction of evidence is one of two things: either they produce some emails where you're conspiring to destroy evidence, or they catch you in the act, seize the evidence you were destroying, and then use it to prove that what you were destroying was evidence.
Finding someone with a bucket full of confetti, or an encrypted drive but no key, isn't evidence of a crime, and it's unreasonable to put anybody in jail just because they shredded their old credit card statements or can't remember the password for an old device that has been sitting in a closet for three years.
IANAL or law enforcement, but I don’t see the problem with this system.
It's like finding footage showing that you drove into and out of a place where a murdered body was located during the same window in which the body went missing. That's circumstantial evidence that you might have moved it, and it might convince a judge to issue a warrant and have the police search your residence for evidence. But if they can't find anything, it's not reasonable to charge you with destruction of evidence for not producing the body, because they haven't proved beyond a reasonable doubt that you could have.
People forget passwords all the time. Sometimes the police find the phone of somebody else who left it in your car and you didn't even realize it was there, and now you think they planted it and they think you won't unlock it, and the person who knows their phone is missing would rather see you in jail than claim the phone and end up there themselves. Higher level paranoia security systems can make unused space indistinguishable from encrypted data, or send cover traffic when there is no real traffic, and there is no way to decrypt it because it's not actually encrypted data to begin with.
There is no way to prove you can't decrypt something, which means it's unreasonable to demand that somebody do it when they may not be able to.
Edit: this doesn’t help in cases like terrorism where the owner of the device has already been killed of course.
So no, some judge may try to hold you in contempt of court, and it may work for a while, but at some point, if you have a good lawyer, it will turn into a civil rights issue.
Also, you could plead the 5th as well.
NOTE: I am not a lawyer, and these are just my opinions on this matter.
Judges can certainly hold you in jail indefinitely on a contempt of court charge as well. Considering the nature of the charges this typically comes up in (see Francis Rawls), I wouldn't count on trying to make it a civil rights issue doing much in your favor.
Police can’t compel you to provide a combination to a lock (encryption key) to go on a fishing expedition, but if they KNOW the safe contains illegal contents then you can be held in contempt for not providing it (they already know it’s in there).
Without spending forever delving into the case record I can’t comment on whether they should really have a high enough certainty that the drive does contain CP, but the argument in this specific case does match our interpretation of the 5th amendment when it comes to physical locks.
This is a good argument WHY encryption is important, however. Backdooring crypto would allow law enforcement to fish through everything they want with wanton disregard for the fifth amendment, instead of needing to build up a suitable case for illegal acts being committed with standard investigatory techniques. This right here is how the system SHOULD work.
I’m guessing it’s because they are operating off the testimony of a witness (the defendant’s sister) claiming she was shown the alleged images, and since they weren’t on the unencrypted internal drive, they MUST be on the encrypted external drives. Combined with the knowledge that these files were purportedly downloaded via his internet connection, that’s enough for something, but all they have without the drives is hearsay — hence the compulsion to decrypt?
Personally, I think in this instance, given recent rulings that an individual cannot be identified by an IP address, an IP address plus a single witness isn’t enough to KNOW anything; otherwise I could wardrive around, download a bunch of CP on somebody’s connection, and say I saw them looking at it through a window or something.
All I know is that we do have precedent for this in the physical world, so it’s not a logical leap to require disclosure of cryptographic keys when we KNOW what they unlock.
It isn't hearsay in this case though. Hearsay is when Alice testifies that Bob told her he witnessed Charlie viewing child porn; generally it should not be taken as fact that Bob witnessed this.
It's not hearsay if Alice testifies that she witnessed Charlie viewing child porn, which seems to be the case here. This kind of testimony is direct evidence.
Then there is the circumstantial evidence: The prosecution can show these encrypted drives exist, belong to the defendant, that he knows how to decrypt them and refused, etc.
I just don't see how it's reasonable to claim that isn't good enough for a jury to decide who to believe. They should just try him, or let him go.
The sender uses the public key to encrypt the plaintext, and the receiver uses their private key to decipher the ciphertext, as usual. But, on being compelled, the receiver can also choose an arbitrary target plaintext, and efficiently compute a new private key that maps the ciphertext to the chosen target plaintext.
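A one-time pad is the simplest system with this property: since ciphertext = plaintext XOR key, a compelled receiver can hand over a forged key that "decrypts" the ciphertext to any target plaintext of the same length. This toy sketch illustrates just that forging step (the scheme described above achieves it with public-key cryptography, which is much harder):

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

real_plain = b"meet at the docks at midnight!!"
key        = os.urandom(len(real_plain))   # genuine one-time pad
ciphertext = xor(real_plain, key)

# Under compulsion, choose an innocuous target of the same length and
# derive a fake key that maps the *same* ciphertext onto it:
target_plain = b"grandma's apple pie recipe v2.0"
fake_key     = xor(ciphertext, target_plain)

assert xor(ciphertext, key)      == real_plain
assert xor(ciphertext, fake_key) == target_plain
```

Nothing in the ciphertext reveals which key is the real one, which is exactly why "compel the key" breaks down as an evidentiary tool here.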
It is designed for that, and more. In fact, it does not leave "anyone could fake the logs" as a theoretical point: they created a tool to do so, so that you do not need an expert witness to explain to a court that someone could doctor the logs; a tool exists for it, on purpose. They call it deniable authentication:
> Messages in a conversation do not have digital signatures, and after a conversation is complete, anyone is able to forge a message to appear to have come from one of the participants in the conversation, assuring that it is impossible to prove that a specific message came from a specific person. Within the conversation the recipient can be sure that a message is coming from the person they have identified.
When encryption has a backdoor, who knows when something has been decrypted?
You're gambling on the temperament of your judge if you do this.
Barr is specifically addressing cases where people refuse orders to decrypt their phones or messages and just go to jail instead. That's what he means by warrantless.
"Obviously, the Department would like to engage with the private sector in exploring solutions that will provide lawful access. While we remain open to a cooperative approach, the time to achieve that may be limited. Key countries, including important allies, have been moving toward legislative and regulatory solutions. I think it is prudent to anticipate that a major incident may well occur at any time that will galvanize public opinion on these issues. Whether we end up with legislation or not, the best course for everyone involved is to work soberly and in good faith together to craft appropriate solutions, rather than have outcomes dictated during a crisis. "
(In other words, let's not do with cybersecurity policy what we did with counter-terrorism policy in the weeks after 9/11.)
But... we do have a plan, which is to just not do it in spite of any crisis or whatever. He is misleadingly framing it here like we don't have the ability to backdoor encryption which has never been the problem.
He clearly states that what we need to be wary of is public opinion changing, which is basically saying we should get ready to compromise our standards in preparation for the day when reactionary desire overcomes our "sober" thinking of the present, or else fear the government coming in and doing it sloppily and by force.
Clearly that's irrational. We should resist it now and we should resist it then too, for just the same reasons we resist it now. There's no technology issue here, just an ethical/political one.
From my perspective, life would not have been meaningfully different either way if that phone had stayed locked. I also can't imagine some future scenario where it makes such a big difference. What kind of information is going to be on some laptop or smartphone that is so important it's worth compromising our general civil rights? There are almost always a hundred human errors around the crime already, so they can piece it all together without god mode on every electronic device. A smartphone is rarely an all-encompassing security mechanism for any big evil plot.
There is no 'backdoor' technology solution here that makes sense and they need to get used to it.
9/11 is an example of what he's proposing, not a counterexample.
And, by the way, we have had the debate--even, arguably, in the midst of a crisis. This was all over the news for weeks after the shooting in San Bernardino when the FBI told us it was vitally important to gain access to the perpetrator's phone. They didn't get their back door. Legislation was proposed that would have required it, but it was never adopted. (Though, in fairness, the FBI did supposedly get a private company to break the encryption. But this was only after a very long delay and after all the public debate had largely died down.)
Barr's point is that it's better to have that argument now, in a level-headed moment and with opportunities for all the relevant stakeholders to provide input, than it would be to have it in the middle of some dire emergency.
If you oppose back doors, I would think you would agree with him on this -- government tends to be delegated sweeping powers in emergencies, so it would be much harder to stop gov-friendly proposals like "back door all the things" in that kind of moment than it would be to stop them now.
"We must do something now, it's clear that we need backdoors because of X!" <- That's a much worse position to be in during a debate.
Why do you think that?
"Don't do anything until we have a big problem" is how we got the TSA, Homeland Security, and the Patriot Act.
But notice how the label of “terrorist” is uniquely applied to the ethnic “other”, and now consider the first real gun control legislation — the Mulford Act:
But yes, guns have been regulated in the past — when? After the Black Panther party brought open carry weapons to the state capitol during a protest.
> We think our tech sector has the ingenuity to develop effective ways to provide secure encryption while also providing secure legal access.
Yeah, maybe he can also claim the tech sector can achieve a perpetuum mobile. This just keeps coming back over and over again. He should get over the fact that it's impossible and move on to dealing with it. Next time he should consult actual security experts before producing the above nonsense.
Which is also bad and gross.
"Further, the process of transformation, even if it brings revolutionary change, is likely to be a long one, absent some catastrophic and catalyzing event – like a new Pearl Harbor."
A country is made of people. In some ways we of course act as "consumers", but that is not the beginning and end of what it means to be human. The government's needs are not endogenous; the government's justification for doing certain things is ultimately that people will be better off for it (otherwise it's simply "might makes right"). In addition, corporations, at the end of the day, get certain protections (and additional requirements as well) because they are machines to help people achieve various ends (e.g. providing goods, providing jobs, providing an opportunity to create wealth); they are not primary actors in themselves.
BTW, my observation is not a comment on the specific politics of the past few years; past AGs and FBI heads have given similar talks and will inherently desire to achieve their job's objectives with the minimum of barriers. This scary formulation just shows how the terms of discussion have shifted.
National security is a major trump card across parties and administrations, and will have to be responded to rather than ignored, as that's where the argument is coming from.
It's easy enough to explain that Russia has mathematicians, ISIS has mathematicians the same way they had chemical engineers for the oil fields, China/PLA has mathematicians, etc.
The same fear mongering that is allowing an anti-encryption argument to advance can be used to fear monger right back towards encryption and be based in truth: Russia and terrorists can access my chats.
For the pro-encryption crowd, we know this is actually feasible technically and the end result of backdoors. We just have to explain it on common ground, where the argument lives.
His Twitter feed is well worth a follow if you care about these issues.
So I find it very hard to believe the job of law enforcement is getting harder, not easier, just because we have some tiny scrap of privacy left.
Encryption is not an impediment to an investigation into ongoing activity: files eventually need to be decrypted to be used, there are side channels everywhere, and so on. Metadata and physical surveillance are enough to convict, or to put a person in a position where they could be convicted under some other law if there is no convincing explanation for why they were where they were.
Usually the point of mass surveillance is to retroactively look up a person of interest and blackmail them.
It would be trivial to distribute to terror cells a USB drive with several GB of a one-time pad, and that would be unbreakable even into the age of quantum computing if used properly.
This isn't at all about terrorism or the "really bad guys." It's 100% about accessing the average Joe Blow's communications.
If the government can’t crack strong encryption as-is, the problem is that strong encryption is deployed at scale.
Removing strong encryption at scale would have far more effect than what you’ve described.
This seems to be the key part. He doesn't believe technologists who claim both goals cannot be achieved at once; he claims they can.
The open source folks have worked around this before:
Software update systems have been successfully exploited to deliver malware:
> On a normal day, these servers push out routine updates—bug fixes, security patches, new features—to a piece of accounting software called M.E.Doc, which is more or less Ukraine’s equivalent of TurboTax or Quicken. It’s used by nearly anyone who files taxes or does business in the country. But for a moment in 2017, those machines served as ground zero for the most devastating cyberattack since the invention of the internet—an attack that began, at least, as an assault on one nation by another.
Presumably the software update systems for major operating systems, like for Android or iOS, are typically more heavily secured than M.E.Doc.
But they are also targets of limited value. To insert malware into iOS, you would need not only access to their software update system, you would need access to (and understanding of) their source code and build system, and access to their code signing key.
And even then, it's not clear that these software update systems are even capable of targeting patches down to the level of the phone of an individual person. There's no reason for it now. The central system really just needs to make the update available in its various OS flavors, and each client can request what it needs.
If we force these OS companies to create a targeted backdoor system, all the hard work will be done for the bad guys. They need only achieve access to the special "law enforcement access" system, they will have everything they need all ready to go.
Under these conditions, could Google or Apple keep out the bad guys with 100% success? I have great respect for these teams, but those are very long odds.
It's far safer, for them and for us, to just not build that functionality. This was the point that Apple so forcefully made when Jim Comey came after them to decrypt the San Bernardino iPhone.
EDIT to add: these companies operate in more than just the U.S. If they build a targeted backdoor system, you don't think other countries will demand access to that system as well? Look: Apple already compromised on iCloud hosting to maintain access to the Chinese market.
There is a reason against it now: it would make it impossible to do things like reproducible builds, or other security checks such as comparing the software being offered across devices to verify that none of them is being offered a compromised update before installing it.
It would also require prohibiting the transparency necessary to implement any of those checks independently; otherwise anyone could do so and then use that to detect the attack, regardless of whether the attackers are domestic and state-sponsored.
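The kind of cross-check described above can be as simple as comparing digests of the update each device is offered. This is a toy sketch with made-up payloads (a real check would also compare against an independently reproduced build):

```python
import hashlib

def digest(blob: bytes) -> str:
    """SHA-256 hex digest of an update payload."""
    return hashlib.sha256(blob).hexdigest()

# Updates as fetched by several independent devices (assumed data):
offered = {
    "device_a": b"update-v2.1-payload",
    "device_b": b"update-v2.1-payload",
    "device_c": b"update-v2.1-payload" + b"\x00TARGETED",  # tampered copy
}

digests = {name: digest(blob) for name, blob in offered.items()}
# Majority digest is the presumed-legitimate build:
consensus = max(set(digests.values()), key=list(digests.values()).count)
suspects = [name for name, d in digests.items() if d != consensus]
print(suspects)  # ['device_c']
```

A targeted-backdoor mandate would have to outlaw exactly this kind of comparison, since a targeted update is by definition a digest that disagrees with everyone else's.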
Does anyone really think China won’t immediately demand backdoors?
I think it's disgusting how supposed "democracies" have been trying to emulate China, both in terms of surveillance and censorship. UK is one of the worst offenders here -- sometimes they didn't even hide the fact they were using China as a role model.
There used to be a time when the U.S. government and other countries would condemn China for this sort of stuff.
If encryption is a weapon then I would think the Second Amendment applies, eh?
There can only be 2 solutions:
- enshrine a right to privacy. Individuals should have a way to communicate that is completely secure and free of eavesdropping, because they are presumed innocent until proven guilty. Likewise, enforcement agencies should be granted the same to do their work.
- adopt symmetric transparency. Individuals will then be allowed to follow the intimate communications of any leaders or enforcement agencies, with the same level of ease. So if you want me to have to file a FOI to get info about an official, an equally difficult/time-consuming process should exist the other way around. OR if you want an officer to be able to monitor any individual in real time, then I should be able to monitor any officer in real time.
That second case should be automatic anytime the "nothing to hide" argument is invoked.
> If one already has an effective level of security — say, by way of illustration, one that protects against 99 percent of foreseeable threats — is it reasonable to incur massive further costs to move slightly closer to optimality and attain a 99.5 percent level of protection even where the risk addressed is extremely remote?
> if the choice is between a world where we can achieve a 99 percent assurance against cyber threats to consumers, while still providing law enforcement 80 percent of the access it might seek; or a world, where we have boosted our cybersecurity to 99.5 percent but at a cost reducing law enforcements access to zero percent — the choice for society is clear.
One issue with all proposals around this is that risk = probability × impact. While the above speaks to the probability, the impact of malicious actors getting their hands on master keys would be instant access to any and all government-mandated communication channels, at the exact same access level warrants would afford.
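To make the risk = probability × impact point concrete, here is a toy expected-loss comparison. Every figure below is a made-up illustration, not an estimate:

```python
# Toy expected-loss comparison; risk = probability * impact.
p_breach          = 0.005   # assumed residual breach probability today
p_master_key_leak = 0.001   # assumed probability the escrow master key leaks
impact_ordinary   = 1e6     # assumed cost of an ordinary breach
impact_master_key = 1e12    # assumed cost of a master-key leak: every
                            # mandated channel becomes readable at once

risk_without_backdoor = p_breach * impact_ordinary
risk_with_backdoor    = risk_without_backdoor + p_master_key_leak * impact_master_key

# Even a 0.1% chance of the master key leaking dwarfs the baseline risk:
assert risk_with_backdoor > 100_000 * risk_without_backdoor
```

The percentages in the quoted argument only move the probability term; the master key adds a catastrophic impact term that the quote never prices in.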
While the Attorney General is right that, so far, most corporate master certificates have not been compromised, none of those had this price tag attached. And the impact would apply retroactively: for any present-day communication, we would be taking it on faith that no future master keys will ever be leaked.
I would not take that bet; and so far, neither did insurance companies.
I believe that the US should establish a court similar to the Foreign Intelligence Surveillance Court created under FISA. The government must make a case to a judge establishing probable cause, and if approved, a warrant can be issued to a third-party communications provider to disable encryption on suspected devices such that lawful interception (i.e. a wiretap) can be executed.
Warrants are subject to renewal every 90 days, and access to encrypted communications prior to the date of warrant approval, or not provided by the platform specified in the warrant, is prohibited (i.e., obtaining a warrant to disable and intercept WhatsApp does not mean you can disable and intercept Signal as well).
I believe this balances the interests of individuals, governments and communication providers evenly.
How often are the people making the arguments from the same political party? This problem extends to pretty much every court, as we currently have 3 branches being gamed by 2 political parties.
This will be one of the fracture lines that break the country.
It seems inevitable that backdoors would help a given nation's "enemies" more than the nation itself. Those "enemies" will get hold of the keys and can make use of them however they want, free of restrictions, unlike the given nation.
I don't see any way around that problem.
The spirit of this initiative in 2019 is likely more about stopping strong encryption at scale, which is certain to be a frustrating black hole for LEO and the IC.
Perhaps HN would do well to ask how to solve the problem from a technical perspective, given the requirements. This includes both how to build a better mousetrap (one that doesn’t have a “backdoor” or significantly weakens the encryption mechanism), and how to solve concerns about abuse of exceptional access.
There is a simple way to solve concerns about abuse of "exceptional access": Not to include any "exceptional access" mechanisms. Securely implementing a cryptosystem is a daunting task almost never achieved. Intentionally creating a human-controlled mechanism to access plaintext makes the problem much, much worse.
> There’s no discussion of how to build exceptional access encryption that solves the weakening issue, just that it “can’t be done”.
Please consider that there is fundamentally no way to solve concerns about exceptional access. "Exceptional access" means that there is necessarily a human attack vector: Those humans who control whatever mechanism exists to provide LEO access to plaintext. This necessarily weakens any cryptosystem. If those people are compromised, "exceptional access" will simply be "routine access". Further, because decryption of data emits no obvious signs of physical tampering, even citizens who trust that "exceptional access" is not being abused cannot verify that.
I actually appreciate the name of your 5 hour old account. You're correct. We are experiencing mass hysteria over cryptography. However, it is not security professionals who are hysterical: it's people like you, who apparently never met an argument against liberty that they didn't like.
Same point: figure out a technological and procedural solution to the human attack vector. If “security professionals” all agree on ideology or theory that it’s not possible and thus refuse to help solve the problem, then exceptional access solutions generally will be worse off for it. It’s independent of whether they actually are deployed.
There is no solution. If you build in your "exceptional access" exception, then the system is broken by design and no one will use it. That's the end of the discussion; there's nothing more to discuss. You can Rube Goldberg "solutions" all day long, but in the end you're just figuring out ways to deploy a broken system.
Again, simply arguing that “it can’t be done”, which is of course theoretically true if the goal is to have zero additional risk by introducing a third party, isn’t going to stop such systems from being deployed, it will simply reduce the quality of such solutions due to talent refusing to work on the problem.
An idea that comes to mind: third party can’t trivially decrypt the data (maybe it requires substantial computation to decrypt) thus reducing practicality of bulk decryption. Make the exceptional access truly exceptional.
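One hedged sketch of "expensive to decrypt" is deliberate key stretching: derive the working key by iterating a hash many times, so each decryption attempt costs real compute. This is a toy illustration of the brainstormed idea only (real systems use vetted KDFs such as scrypt or Argon2, and it does not address the trust problems discussed elsewhere in the thread):

```python
import hashlib

def stretch_key(seed: bytes, iterations: int) -> bytes:
    """Derive a key by repeated hashing; cost grows linearly with iterations."""
    k = seed
    for _ in range(iterations):
        k = hashlib.sha256(k).digest()
    return k

# Whoever holds the escrowed seed can re-derive the key, but only by
# paying the same computational cost each time; bulk decryption of
# millions of messages multiplies that cost accordingly.
key = stretch_key(b"escrowed-seed", 100_000)
assert key == stretch_key(b"escrowed-seed", 100_000)  # deterministic
assert len(key) == 32                                 # SHA-256 digest size
```

Note that this only raises the marginal cost per decryption; a government willing to spend on compute can still decrypt anything it wants, just more slowly.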
I agree that having a trivial way for governments to access encrypted comms at scale is bad; I don’t agree that governments should be completely locked out, without exception, of all comms deployed at scale by mega tech corporations.
There is no minimizing the risk. Your concept is broken. It does not -- and cannot -- provide security of any use. And I don't care what the government thinks about it.
Brainstorming ideas are meant to be thrown out. Attacking the idea respectfully is fine. It’s to help inspire other ideas.
Of course there’s a trade-off involved. But whether it’s two-party encryption or three-party encryption with exceptional access, none is perfectly secure anyway. There is major conflation of political ideologies with hardline technical viewpoints going on.
So then you recognize that "exceptional access" mechanisms necessarily weaken a cryptosystem, which are already notoriously difficult to implement securely. This brings us back to your OP, where you complain about people telling you the truth you already recognize, and make two entreaties for assistance from HN:
1. "Perhaps HN would do well to ask how to solve the problem from a technical perspective, given the requirements. This includes both how to build a better mousetrap (one that doesn’t have a “backdoor” or significantly weakens the encryption mechanism) [...]"
2. "[...] and how to solve concerns about abuse of exceptional access."
I understand now that you suffer from severe cognitive dissonance with respect to the first. You just acknowledged that the "weakening issue" with "exceptional access" cannot be solved, yet still argue that it can be solved, presumably with more effort from security professionals.
I already addressed the second: Concerns about abuse of "exceptional access" also cannot be solved, except by avoiding their inclusion in the first place.
Your idea is also a non-starter. Human political masters will set the work parameters, not users (otherwise: Who would choose anything but an infinite amount of work to decrypt their communications?). Users would have no way to verify the work required to decrypt as, again, they cannot verify that communications have or have not been "exceptionally accessed". The work parameters must be updated as technology improves, so there must be a way for human political masters to update work requirements (potentially reducing them). Nobody outside certain SCI or ECI compartments has any idea what kind of cryptanalytic power USG can bring to bear. Maybe, like Skipjack, the proof of work cryptography is subtly weaker than expected in a way that only they know. Maybe the USG will just start allocating $100B/year to routinely use "exceptional access". And certainly, after such a backdoor scheme is deployed, LEO and IC will howl that they cannot access enough plaintext to stop child molesting terrorist superpredators, and anyone who would just think of the children would support reducing or eliminating the burdensome computational obfuscation parameters. Once again: Any such "exceptional access" scheme necessarily reduces security by inserting a critical dependence on trust in humans that cannot be verified and whose compromise has Biblically enormous value to many groups.
> I agree that having a trivial way for governments to access encrypted comms at scale is bad; I don’t agree that governments should be completely locked out, without exception, of all comms deployed at scale by mega tech corporations.
If we agree on the first part, then we should agree on everything that I've written. "Exceptional access" schemes only make sense for unconstitutional dragnet surveillance purposes and are a severe threat to liberty. If a target is known, and is found to be using cryptanalytically impenetrable cryptography, targeted physical surveillance will defeat that cryptography every time. If some impenetrable communications happen between two non-targets, it doesn't matter that those communications cannot be read, because the government doesn't want to read those communications anyway--right? Of course, serious criminals and terrorists--the ones on whom collection is really important for security--are not going to use known-compromised cryptosystems when non-broken ones are already ubiquitous. Therefore this "exceptional access" is only useful on the average citizens; unless, that is, the government is doing dragnet surveillance and attempting to "winnow" out secure communications, something they can only do effectively if they attempt decryption of every "exceptional access-enabled" communication.
Finally, consider your request in the historical context. For the great deal of our history, communications have defaulted to being private (there were no microphones in Lincoln's log cabin) and inaccessible to government agents except through testimony (which cryptography does nothing to prevent). Now your claim is that the government must have the ability to access any communication. But why? Our government and society worked just fine without substantially all communications being recorded and accessible to the government. Such a large shift in the balance of power will, I fear, lead inevitably to tyranny.
The rest of the world disagrees with you.
Two-party crypto has two parties who could leak the data. Two-party with exceptional access has three. Current crypto is susceptible to brute force via Shor's algorithm and quantum computing.
The rest of the world absolutely does not agree with you. It’s just that a lot of people here live in a bubble.
Since as you say, "Current crypto is susceptible to brute force via shor’s and quantum", then there's no need for backdooring algorithms, since they're all already broken.
I mean, none of that is accurate, but given your argument you're asking for something you don't need because you already have it.
Any two can be used in concert to unlock the secret. You and the Org combine shards to access account. You or Org can be compelled by Govt. to reveal shard, through a warrant. The third shard is held at the DoJ, and also requires a warrant.