Hacker News
Attorney General William P. Barr Delivers Address Conference on Cyber Security (justice.gov)
128 points by bellinom 81 days ago | 219 comments

Modern encryption is really just math. Cryptography in consumer and off-the-shelf products (which Barr is targeting with his discussion) theoretically _could_ be modified so that the government could decrypt it. The two ways I can think of are (1) encryption "backdoors" -- fancy math known only to the government, which would require new ciphers -- or (2) key escrow. Both approaches have their shortcomings and I'm against both, but it's plausible that the government might try anyway. All that said, because encryption is just math, any individual or group could employ their own encryption by implementing any known existing cipher -- one without a "fancy math backdoor" -- and refuse to follow the "key escrow" guidelines. In these discussions about the government being able to decrypt stuff, are we, in effect, suggesting that certain math be made illegal? If that's really what's being proposed, I'd urge people to consider "illegal numbers" and how effective that's been. https://en.wikipedia.org/wiki/Illegal_number
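To make the "just math" point concrete, here's a toy sketch of encryption anyone can write from scratch: a one-time pad in pure standard-library Python. (Illustrative only; real systems use vetted ciphers like AES, but the principle that any individual can implement working encryption without anyone's permission is the same.)

```python
import secrets

def otp_encrypt(plaintext: bytes):
    # One-time pad: a random key as long as the message, XORed byte by byte.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

msg = b"meet at dawn"
ct, k = otp_encrypt(msg)
assert otp_decrypt(ct, k) == msg
```

A dozen lines, no special hardware, no licensed product: there is nothing here for a backdoor mandate to attach to unless the math itself is outlawed.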

Breaking encryption for the government is so furiously stupid it blows my mind every time it is suggested. Especially here, where people actually give the idea merit. It makes me miss old-school /., where everyone was on the same page. Your point illustrates a big part of why.

Backdooring stupid.crypt and forcing law-abiding people to use it just ensures that the big bad guys will use some other kind of encryption. All you've really accomplished is adding an extra charge of illegal encryption use, at the expense of security for every human.

This potentially creates all sorts of pathologies. Is it illegal now for me not to update an old computer? If your backdoors are implemented in hardware, is it illegal to use old computers?

When people are against gun control, a common thread is "make guns illegal and only criminals will have guns." This argument has merit, but if we DID repeal the Second Amendment and make guns illegal, over time firearm proliferation would decrease.

Not so with encryption. Other, more free countries will constantly be developing better security methodologies, and reproducing those methods is effectively free. "Fuck up encryption, then only bad guys will have encryption" is a much stronger argument, because it's emphatically true.

The ignorant hubris of this is massively disheartening.

> Breaking encryption for the government is so furiously stupid it blows my mind every time it is suggested.

Yeah. There's no distinction whatsoever between encryption with backdoors and no encryption at all. Imagine our current web with no encryption. Your logins are all effectively plaintext; your online shopping is effectively plaintext; your emails are all effectively plaintext. "Furiously stupid" is a good way to describe this whole proposition.

> This argument has merit, but if we DID repeal the Second Amendment and make guns illegal, over time firearm proliferation would decrease.

Hmm, then wouldn't some people just make their own firearms, just as you are describing with encryption, right?

Some people would, yes. Especially rudimentary single-shot weapons. However, it's much harder to make a reliable gun than it is to make reliable, strong encryption. There are designs available for both and there always will be, illegal or not. But making a gun is manufacturing, whereas using encryption just requires installing some software. Trivial.

I want to point out that manufacturing a gun is not "non-trivial".

Given blueprints (publicly available) or a template, accurate enough measurements, a lathe, and a mill, anyone can make a firearm or parts for one in their garage.

Is there reading involved? Yes. But any argument you make w.r.t. the futility of outlawing encryption is immediately portable to firearms manufacture.

I mean... manufacturing a working modern firearm in a garage is probably much more achievable for the general population than rolling out any kind of encryption software. Anyone with some basic hands-on competency can make a gun.

All you really need is a drill press and some basic tools. People made Sten guns in WWII and that's still a perfectly valid firearm design (fully automatic even) that requires almost no work to make.

Given that I have many, many crypto libraries on many, many devices, some of which are heavily modified, my chances of even being able to replace those with broken crypto libraries are basically zero. Many people are in a similar situation, so I don't understand how we could even comply with a law like that if we wanted to (which we don't). So yeah, not only is it trivial to retain unbroken crypto, it's nearly impossible to get rid of it.

Sure, and you'd be hard pressed to get a lot of people to give up firearms they already own. If you sent agents door to door, statistically some visits would end in fights to the death with people who weeks earlier were considered law-abiding.

Can you imagine asking every gun owner/computer owner to go to their local police station to surrender their guns/functional encryption?

That would be pretty spooky to me.

Not trying to make this a gun control debate, but for the longest time encryption was considered a munition, so it's not THAT much of a non sequitur.

Because guns are physical objects, distributing them is much, much harder than distributing encryption.

Ok, I believe we are in the middle of arguing OP's point: that pro-gun people are wrong when using the argument "only the criminals will own them," while pro-encryption people are right when using the same argument about encryption.

And I think what you're adding here is that there's an error in my claim that both parties will happily build their own firearms/encryption, because a physical gun is harder to distribute than a copy of software.

And I agree in principle with this, until I realize that broad distribution of a single encryption mechanism is exactly what a bad-acting government would want... crack it once and everyone is compromised.

So, no, I think I would argue that it's easier to distribute weapons than good, bespoke encryption.

And further, I would argue that if it is true for encryption, it is also true for firearms... that if they are outlawed, the power shifts to criminals as they will still use them.

My point wasn't that "pro-gun people" are wrong.

The argument is a tautology; it can't be wrong! If gun ownership is a crime, then owning a gun makes you a criminal.

The tautology is compatible with the hypothesis that if guns were confiscated and illegal, eventually fewer people would get shot. Probably after an increase for a while, as confiscation attempts resulted in agents getting into gun battles with people who didn't want to surrender their property.

Whether the loss in life and liberty is worth the outcome is a matter of personal taste.

Sure, the saying has broad appeal because the tautology of it is interesting. The actual debate, however, centers on whether laying down your weapons makes you vulnerable to those who hold onto theirs... and that was the lens I was looking through.

Many people do that already, perfectly legally. Certainly some percentage would choose not to follow laws banning them.

> When people are against gun control, a common thread is "make guns illegal and only criminals will have guns." This argument has merit, but if we DID repeal the Second Amendment and make guns illegal, over time firearm proliferation would decrease.

Even if that is true, "decrease" is not remotely equivalent to "eliminate".

The problem is that law-abiding citizens, the ones who have their weapons forcibly taken by law enforcement, are left completely unable to defend themselves, while criminals are not completely unable to acquire firearms.

>are we, in effect, suggesting that certain math be made illegal? If that's really what's being proposed, I'd urge people to consider "Illegal Numbers" and how effective that's been.

I keep seeing this "implausibility" of enforcing illegal encryption brought up, and I really think it's wishful thinking. If such encryption algorithms are ever made illegal in some manner, it will be trivial for the government to get the result they want.

It won't be about completely stopping people from using AES, nor will it be about imprisoning every person who continues to use it. What it will be about is turning "this target of our investigation is using illegal encryption" into immediate grounds for a search or arrest warrant. And that will be more than enough for 95%+ of the purposes they're looking for.

True, and this should frighten everyone. You'd be a suspected terrorist or criminal for using a VPN, or Tor, or any foreign service that doesn't use the gov-approved crypto. As long as you stayed out of the limelight and kept your head down, you'd be fine. But if anyone looked into your activity, it would be easy to determine that you weren't using gov-crypto. This is inherently authoritarian.

I can think of a few ways to make this a real pain for law enforcement. Say I use my crypto to encrypt a tunnel, then you use yours to encrypt a tunnel, and so on... Make an onion out of the cryptosystem and law enforcement has to get piles of warrants to cut through the various layers.

It's stupid, sort of like a Fourth Amendment onion router.
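The layering idea above can be sketched in a few lines. This is a hypothetical illustration using a toy XOR pad for each "tunnel" layer (real layered tunnels would use TLS or similar); the point is just that each layer needs its own key, and so would need its own warrant:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

def wrap_layers(msg: bytes, n_layers: int):
    """Apply n_layers of one-time-pad encryption; each layer gets its own key."""
    keys = []
    for _ in range(n_layers):
        key = secrets.token_bytes(len(msg))
        msg = xor(msg, key)
        keys.append(key)
    return msg, keys

def unwrap_layers(ct: bytes, keys) -> bytes:
    # Peel the layers in reverse; recovering the message requires every key.
    for key in reversed(keys):
        ct = xor(ct, key)
    return ct

ct, keys = wrap_layers(b"meet at dawn", 3)
assert unwrap_layers(ct, keys) == b"meet at dawn"
```

With any subset of the keys missing, the intermediate result is still indistinguishable from random bytes.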

Key escrow has a number of problems, not least enforcing that keys are valid and validated (something there's not a good history of, and international issues come up).

Back doors are worse, though: build a back door and it will be used, just not necessarily by the agency it was built for. There are a lot of groups with a lot of resources oriented around taking advantage of this, and few are legitimate (and some are enemy nations).

There's a third problem - doing it in such a way that it can't be blocked from monitoring. (see "clipper chip" for more on that).

> We are confident that there are technical solutions that will allow lawful access to encrypted data and communications by law enforcement without materially weakening the security provided by encryption.


Also, "technical solutions" makes it sound like the issue is inventing the correct encryption scheme. Whereas in reality the issue exists because we have discovered (currently) unbreakable codes, and inventing broken (backdoored) schemes does little to change that.

If we break all known forms of encryption, and find a reasonable proof that they are no longer possible, then I'll be more interested in this line of reasoning. And that's a pretty big if.

I totally get what you are saying, but it is quite the rabbit hole if we determine that 'we can't have any illegal numbers... everyone should be able to share any number with anyone else.'

That basically means we have to get rid of copyright entirely, since all data (books, movies, software, corporate secrets, state secrets, etc.) is just very large numbers.

Do we believe that there should be no restriction on the sharing of any data? I can see the appeal, but there are far reaching consequences if we say that.

Care to run down that rabbit hole? I happen to think copyright is a concept which is intrinsically broken with the advent of modern computing power and connectivity.

While I happen to agree with you, I think it's important to distinguish the two:

What A.G. Barr is insinuating is to regulate algorithms.

Copyright is regulation of implementations.

For example, GPG is a software implementation of encryption algorithms. It has a copyright (used as the basis for its copyleft license). RSA, however, is an algorithm: a mathematical reality that can be described by copyrighted works, but never itself copyrighted.

A.G. Barr has expressed a desire to compel every American who implements that algorithm to do so incorrectly.

We don’t need to use copyright as an example.

Words are just data. Are there illegal combinations of words to exchange? The law says, YES. Some speech is absolutely illegal, including making credible death threats, conspiring to break other laws, or disclosing certain state secrets to foreign powers.

Very few people argue that, since words are easily available to everyone, it is futile to make some combinations of words illegal.

Words are not illegal per se.

Words uttered in a situational context that renders them of immediate harm are illegal. I can say "Fire!" in a theater while giving a lecture or putting on a show. I cannot knowingly claim the theater is on fire when it isn't, to cause a panic.

Point is, it is not the word or content that is illegal. It is the union of word and context that is illegal.

Subtle difference, but it's the only thing that keeps that type of law from getting absurd and out of hand very quickly.

I agree with you, and make the same point about numbers.

The number is not illegal, it’s the number in conjunction with a situational context that is illegal.

We may disagree with the intent of the law, but the argument that we are making numbers illegal, or math illegal, is parallel to the argument that other laws make words illegal.

Ha, but when a number is uniformly "random" and the context is lost (it's just a bunch of bits floating around in storage), what argument is there?

Ok, maybe you could catch me attempting to decrypt it, and say "gotcha, that was in fact a secret!" But I'd reckon it would be more effective to simply wait until I finish decrypting the data, and then take it from me.

If there are going to be laws around this, it's sure to be very pathological, and scary.

If I have a random number without context, how is it illegal by itself? It isn’t.

If the context around the number is that it's stored in a .mkv file with a name that looks like a Disney property, or a .key file attached to a program that uses such things for some kind of encryption the government has unwisely banned, well, the number suddenly has context around it that supports an argument about the number and the context.

Same for words, really. Words about a threat to a government leader are probably fine in a text file that looks like a short story. Those same words in combination with a history of advocating violent revolution, &c. might make for a different argument.

We are talking about functions, not data.

In that sense, copyright = data, and encryption = functions.

A function can be described with data.

>In these discussions about the government being able to decrypt stuff, are we, in effect, suggesting that certain math be made illegal?

All images are binary. All binary is just a number. We have made many such numbers illegal, and we even have software that will detect them and report you when you share the number with number-sharing services (Dropbox, Facebook, etc.).

So making math illegal sounds entirely possible.
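This is roughly how such detection works: services compare a fingerprint (hash) of uploaded data against a database of known prohibited content. A minimal sketch, with a hypothetical blocklist (real systems use perceptual hashes like PhotoDNA rather than exact SHA-256 matches, so that re-encoded copies still match):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Exact-match fingerprint of a file's contents.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known prohibited "numbers".
BLOCKLIST = {fingerprint(b"some notorious number")}

def is_flagged(data: bytes) -> bool:
    return fingerprint(data) in BLOCKLIST

assert is_flagged(b"some notorious number")
assert not is_flagged(b"innocuous data")
```

Note the limitation, which cuts both ways in this thread: flipping a single bit of the file defeats an exact-match check entirely.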

You are talking about data, so following that logic, what would be made illegal would be implementations not algorithms.

Math can be represented in a variety of ways, but the pattern being described is immutable.

What A.G. Barr is insinuating is not that we make implementations illegal, but that we make the use of algorithms categorically illegal.

While I don't know of any, if someone had made an algorithm that could generate such illegal numbers, I suspect it would be considered illegal from the first day of its existence.

This is all true, but I think encryption backdoors are more possible than people think.

The target here is not nerds able to pull code from GitHub or run open source or enterprise software. The target here is consumer stuff by companies like Apple and Google. The government doesn't want it to be easy to do end-to-end encryption.

For the average user, easy equals possible. The average user has neither the time nor the expertise to roll their own solution or run nerd tools. Look at how PGP/GPG's complexity and absolutely horrible UX (even for technical users!) has prevented e-mail encryption from ever taking off.

This reminds me of what a government guy told me about crypto export controls. Yes, they know that crypto export controls won't stop nerds using GitHub. What they want to do is to stop IBM, Google, Apple, Cisco, Juniper, etc. from selling ready-made polished crypto products to blacklisted countries.

In both cases I think the target is large corporations not individuals and the goal is to make crypto hard and keep it out of the hands of the average user or less-technical foreign organization.

That being said I still don't think it'll work. Just pointing out the thinking that's going on here.

> That being said I still don't think it'll work. Just pointing out the thinking that's going on here.

The problem is that this either shows a stunning amount of ignorance or deliberate malice.

Let's just go back and consider that the government does not want the average user to have strong encryption. What is the play here? The average user is almost by definition not the bad guy, unless we consider the population at large to be criminals by default. Is the government trying to dragnet the entire population and keep everyone under its thumb for minor infractions? Because that's the only feasible target here. Barr can froth at the mouth, mad as the dickens; it won't prevent Bad Guys from using strong encryption. So his only feasible target is the (mostly) law-abiding population.

The other point, preventing the likes of Google, IBM, Apple, et al. from selling devices with strong encryption to blacklisted countries, again shows either ignorance or malice. As the parent wrote, encryption is just math. Are government agencies so shockingly uninformed that they think that, in the absence of secure iDevices, North Korea will be forced to use backdoored technology?

The spread of physical goods can be controlled (to some degree), but the spread of information can at best be slowed down, not stopped. Doubly so when there are already existing methods of secure communication that the government cannot efficiently crack.

The only conclusion I can come to is that they are well aware that they cannot catch any serious Bad Guy using mandated backdoors. Serious Bad Guys will use strong encryption anyway; they will cover their tracks and won't care what is legal or illegal (in the US). Furthermore, against targets like these, there are already time-proven methods of infiltration, social engineering, and good old-fashioned bribery.

This only leaves the option of taking secure communications away from the population at large, perhaps because the government feels threatened by too many people being able to share ideas? I was never one for tinfoil hattery, so my hope is that I'm wrong.

This idea represents the best possible compromise to the situation outlined here. I think we should all be crypto hardliners in the sense that we refuse to allow laws against certain kinds of math, but at the same time, we may have to compromise on government access to keys once they have been handed over to a third party.

If you have not handed your private keys over to anyone, they should be yours alone; but once you have uploaded your private keys to a corporate cloud server, you may have to accept that law enforcement will be able to get warrant access.

This won't solve the problem for law enforcement, but it will make it easier to catch lazy people while preserving the option for full security for those who want to control their own data.

The scenario you described accomplishes both goals.

By banning 'the masses' from using encrypted communications, it'll sort the haystack: everyone who continues to do so can be profiled, and they're already engaged in illegal behavior.

You better believe I would start streaming random data all over the internet just to be an asshole

Then, in the US, you have obstruction of justice and/or interference with police/peace/public officer.

I think it's more of a protest, or am I not allowed to email myself numbers?

You are, up until the point where it costs a law enforcement officer time to determine either that the numbers are intended to waste their time or that the numbers are hidden/unbackdoored encryption. Then you're GG SOL.

You literally cannot prove it either way: you can't prove it's not enciphered data, and you can't prove it's not random garbage. That's the point. Saying 'we could develop a safe backdoored system that will allow only lawful decryption in the event of emergency' is like saying 'we can launch probes made of candy to distant planets that will build cities and plant potatoes for us.' It's a fantasy. The keys will leak, and criminals will still blend into the crowd.
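The "can't prove it either way" point has a neat concrete form for one-time pads: any blob of random bytes "decrypts" to any plaintext of the same length under some key, so a suspect can always produce an innocuous key after the fact. A toy sketch (assuming an OTP-style XOR cipher):

```python
import secrets

def key_for(ciphertext: bytes, claimed_plaintext: bytes) -> bytes:
    # Under a one-time pad, the key k = c XOR p turns ciphertext c into
    # plaintext p. So for ANY claimed plaintext, a matching key exists.
    return bytes(c ^ p for c, p in zip(ciphertext, claimed_plaintext))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

blob = secrets.token_bytes(12)   # "random garbage" found on disk
alibi = b"grocery list"          # 12-byte innocuous cover story
k = key_for(blob, alibi)
assert decrypt(blob, k) == alibi  # the blob "was" a grocery list all along
```

Since every equal-length plaintext is equally consistent with the blob, the bytes alone cannot establish what, if anything, was encrypted.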

How so? Couldn't we come up with a way of disguising encrypted message streams so that they don't stand out? It would be more expensive, and with enough analysis they could probably detect them anyway, but it strikes me as an arms race.

For centuries, the law has recognized information as property. Encryption is just a transform of information. The government can argue that they're simply banning certain types of property - something they've done for at least 100 years, maybe longer.

Can you explain how this actually solves the main problems? I can see this form of encryption catching unsophisticated "bad hombres". Unsophisticated here meaning either ignorant of weaknesses in the technology they use, or aware but unable to improve upon it. The most motivated adversaries will make use of other schemes.

Worse, for secrets we actually care about (nuclear codes?) we must still research proper encryption schemes, since backdoors are, as far as I've come to understand, fundamentally admissions of weakness in a security protocol.

> We are confident that there are technical solutions that will allow lawful access to encrypted data and communications by law enforcement without materially weakening the security provided by encryption. Such encryption regimes already exist. For example, providers design their products to allow access for software updates using centrally managed security keys. We know of no instance where encryption has been defeated by compromise of those provider-maintained keys. Providers have been able to protect them.

This quote from the article seems to contradict itself. First it claims "... without materially weakening the security provided by encryption," then goes on to state "We know of no instance where encryption has been defeated by compromise of those provider-maintained keys," implying that this kind of breach is possible.

This whole thing seems pretty plainly like an oligarch's attempt to spy on its people. Where is the liberty and freedom in this?

> Can you explain how this actually solves the main problems?

A lot of weight rests on those two words: "main problems". The main problems, for the government, are that criminal investigations are being impeded. By banning certain forms of encryption, they can criminally charge a suspect for merely refusing to decrypt data. And you can bet that the penalties will be stackable, allowing the government to use its discretion, perhaps charging someone with a separate count for each file he refuses (or is unable ...) to decrypt. I'm not a lawyer, but I've also heard of the "foregone conclusion" doctrine, which somehow allows the Constitution to fly out the window and lets the government imprison someone indefinitely until they decrypt the files. So, sadly, this ban does solve the main problems, at considerable expense to citizens' liberties.

Conjecturing further:

- citizens would be allowed to encrypt, but they'd be required to keep a set of the keys used or else they could risk prosecution.

- There could be a government cloud server where you "securely" upload whatever keys you use (or, realistically, this would probably be outsourced to companies like Equifax, which would then charge you a fee),

- existing cloud providers would be required to detect when clients were using encryption-looking libraries/subroutines and store a copy of the keys into some registry.

- this could ultimately lead to "whitelist-only" software libraries, so that you cannot run anything on the cloud without building it with their dev environment so they can be sure you're not secretly encrypting things.

- going even further, this could lead to deep packet inspection that simply detects encrypted transactions and queries them against the gov key registry to "make sure" they are properly decryptable. Any failures to decrypt could trigger an investigation.
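That last conjecture assumes encrypted traffic can be detected at all. In practice, detectors rely on heuristics like byte-level entropy, since ciphertext looks uniformly random. A rough sketch (hypothetical; entropy alone cannot distinguish encryption from compression, which is one reason such enforcement would misfire):

```python
import math
import secrets
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: near 8.0 for encrypted/random data, much lower for text."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# English text has low byte entropy; random/encrypted bytes sit near the
# 8.0 maximum. A DPI box could flag flows above some threshold.
print(shannon_entropy(b"the quick brown fox " * 50))   # low
print(shannon_entropy(secrets.token_bytes(65536)))     # close to 8.0
```

Compressed archives, media files, and TLS all score high too, so "flag high-entropy data" inevitably sweeps in almost everything on a modern network.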

Ah yes, but then doesn't the problem boil down to proving that a random value is in fact an encrypted secret?

You arrest me, scan my file system and find something named "plan.txt" which is just a bunch of gibberish... what do you do?

EDIT: I'll argue that the "main problem" is that as long as real encryption schemes exist, this is impractical to enforce.

In theory, yes, that's a big part of the problem. In practice, however, once the government charges you, you're effectively guilty until proven innocent, because your court-appointed public defender is likely not going to be trained or equipped to provide a logical defense, much less hire an expert witness in computer forensics. Plus the government will approach you with a plea "deal": plead guilty to one charge of illegal encrypted data, pay $20k and serve 2 years' probation, or risk going to the slammer for decades on the stacked charges, at a maximum sentence of 3 years per file times the 10 files they were "unable to decrypt" on your system.

> arrest me, scan my file system and find something named "plan.txt" which is just a bunch of gibberish... what do you do?

Well, start by scanning every executable binary on your system. If they find a custom-rolled program that doesn't mark encrypted files with known headers (for contrast, openssl adds the prefix "Salted__" to any file it encrypts with a password), they can allege that you're using a clandestine encryption scheme and that "plan.txt" is one of its files. So again, the burden of proof would be on you to explain what that file was for, which can come at tremendous legal cost.
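For illustration, the header check described here is trivial to automate. A hypothetical scanner for OpenSSL's password-based output might look like this (the 8-byte `Salted__` magic is what `openssl enc` writes; the scanner itself is an invented example):

```python
import tempfile

# `openssl enc` writes an 8-byte magic header, b"Salted__", when encrypting
# with a password-derived key. A hypothetical forensic scan could flag files
# that carry, or suspiciously lack, such known headers.
OPENSSL_MAGIC = b"Salted__"

def looks_like_openssl_output(path: str) -> bool:
    with open(path, "rb") as f:
        return f.read(len(OPENSSL_MAGIC)) == OPENSSL_MAGIC

# Usage: write a sample file and check it.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(OPENSSL_MAGIC + b"\x00" * 32)
    sample_path = f.name

print(looks_like_openssl_output(sample_path))  # True
```

Which is exactly why the absence of any recognizable header proves nothing: a custom tool simply wouldn't write one.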

Exactly, thanks for spelling it all out. Back to your original point, then. Banning specific types of property is one thing, but this isn't that. It's banning all forms of property except whitelisted, acceptable ones. This seems extremely dubious, and unconstitutional (just guessing).

As someone who likes to be free to use the computers I own, this scares the shit out of me.

There is an important distinction here:

Encrypted data is information.

Encryption algorithms are math. Math can be expressed as data, but the immutable, intangible reality being expressed is neither information nor property.

By your logic, the government could argue for banning encrypted data and encryption algorithm implementations.

The latter hits close to the mark of what A.G. Barr is insinuating. It would still be a significant step for a government, especially the U.S. government, to ban the implementation of specific algorithms. That would equate to banning the writing of specific mathematical formulae, which is equivalent to censoring speech.

I agree, and I could have been more precise. I highly doubt we'll live in a world where the AES algorithm, or its source code, or even binaries are illegal. The crime will simply be that the government can show you were using information-hiding practices illegally, such as without an adequate key escrow system (for large-scale deployments) or by refusing to decrypt communications when asked to by law enforcement.

I guess that's one way to take over the tech industry.

He claims encryption is "warrant-proof," which is not true: a court can order someone to open the lock. They want the ability to dig through people's stuff without them knowing. That's what it's really about.

With how prevalent parallel construction has become it seems obvious that any encryption backdoor would be immediately exploited as soon as law enforcement had even the slightest justification to poke around. If they find something, you can be sure they'll come up with a legal way to arrest you for it. If they find nothing, oh well, better luck next time. Not like you'll ever know they were even there (probably).

Do you have a good reference for more info on parallel construction?

Stingray manuals and law enforcement license agreements are strongly worded to require parallel construction. Start googling from there and discover a whole world of tinfoil.

No, not really, as it's more something I've read about on and off over the years, in a variety of articles, and I am by no means an expert.

The Wikipedia article on the topic has some decent links to specific examples from reputable sources and is probably a good place to start: https://en.wikipedia.org/wiki/Parallel_construction

Look at news stories of unusually large narcotics busts attributed to a random LEO pulling over a car for a traffic violation.

Those cases suggest to many people that the end justifies the means.

They can issue a warrant against the user on the client side.

They are just too lazy to do that. They want to go on fishing expeditions and are mad they can't.

Yes, it is incomprehensibly expensive for the government to investigate everyone. That's the point, and that's how it was before everyone had digital artifacts.

Sure, but I can just refuse to decrypt my data. They can just break physical locks.

Encryption is nothing like a lock. We should stop using that analogy.

Encryption relies on a secret. It's like burying treasure in a place only you know and keeping the location a secret (e.g., in your head). Encryption just gives you a huge digital space to bury your treasure in, instead of a physical space.

Sure, people can just search everywhere for your pirate gold (brute-force attack), use advanced reasoning to narrow the search space, like "you lacked the means to 'bury' it in solid stone" (cryptanalysis), develop technology to speed up the search, like ground-penetrating radar (GPUs, ASICs, special-purpose programs), or try to coerce you into revealing the location (monkeywrench-to-knee passphrase cracking).

What the government wants is for the maker of the shovel you used to bury your treasure not only to track where you took that shovel, but also to tell the government that information without you knowing the government got it.

This is what a lot of people in our community seemingly refuse to recognize. For all intents and purposes, encryption is an unbreakable lock that can serve to perfectly hide valuable criminal evidence. Such a thing wasn't possible when our laws were written and has never before been possible in the physical world. Its existence has the potential to be a huge shift in how we enforce the law. Regardless of our views on encryption, we need to have a conversation about that shift. Refusing to have that discussion is likely a quicker path to things like government-enforced backdoors than engaging with government and law enforcement on possible alternatives.

Fine. Here's my contribution to the conversation: Mr Barr, your entreaties in this regard are based on the presumption that the government can be trusted. But our nation was founded on a mistrust of government, and your own actions demonstrate that the government cannot be trusted. Your own special counsel has issued a report that implicates the president in a felony (obstruction of justice) but you have failed to follow up in any way except to imply that there is "nothing to see here, move along." The fourth amendment to the U.S. Constitution guarantees that the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and the ninth amendment to the Constitution guarantees that the people retain un-enumerated rights. I, as a citizen of the United States, maintain that one of those unenumerated rights is my right to employ technological defenses against government overreach. Those are rights guaranteed to me by the Constitution. The right of the government to catch those it deems to be "the bad guys" is not.

The problem with this line of argument is that it is a general argument against government and not specific to this issue. You could use the exact same argument for why you shot a police officer who broke down your door after securing a warrant. It would quickly be dismissed in that instance, so it should carry little weight in the discussion of encryption. If you want the government to completely give up this line of thinking, you need a way to explain to them why a digital lock/encryption should be treated differently under the law than a physical lock.

> You could use the exact same argument for why you shot a police officer who broke down your door after securing a warrant.

No, I couldn't. The operative word there is not "shoot", it's "warrant." The fourth amendment explicitly makes an exception for warrants. If the government has a warrant then I am legally bound to hand over my keys. If I don't, they can put me in prison for that.

Your initial comment seemed to imply that a warrant to break the encryption was "government overreach". If not, I don't see how what you originally posted is an argument against ways around encryption. The question is whether the government should be able to access this information and not whether the government can be trusted with access to that information. If your argument is the latter, then you are arguing against warrants in general.

> Your initial comment seemed to imply that a warrant to break the encryption was "government overreach".

I have no idea how you could possibly reach this conclusion. My initial comment did not contain the word "warrant".

By "government overreach" I mostly meant spying on me without a warrant, e.g. the activities brought to light by Edward Snowden, and the common practice of seizing devices at the border.

Not including the word "warrant" is exactly why I thought your problem was with warrants specifically.

From your original comment:

>The fourth amendment to the U.S. Constitution guarantees that the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and the ninth amendment to the Constitution guarantees that the people retain un-enumerated rights. I, as a citizen of the United States, maintain that one of those unenumerated rights is my right to employ technological defenses against government overreach.

I read two ways to interpret that:

- "Government overreach" is to include even searches authorized by a warrant in which case you are defending unrestricted use of encryption.

- "Government overreach" is to only include warrantless searches in which cause you are only defending using encryption in a manner in which a warrant can break the encryption.

Considering the second option is basically the government's position and the rest of your post seemed anti-government, I thought you were advocating for the first interpretation.

> the second option is basically the government's position

The second option is the government's ostensible position. But you seem to have forgotten the central point of my argument, which is that the government is not trustworthy. The government may say it will only use its decryption keys when it has a warrant, but history shows that the government cannot be trusted to keep its word on matters like this. The government does end-runs around Constitutional rights regularly. Therefore, the power to enforce the Constitution's constraints on government action cannot be entrusted to the government. It must remain with the people.

Now we are just going in circles. This goes back to the first sentence of my response to you:

>The problem with this line of argument is that it is a general argument against government and not specific to this issue.

If your argument is that you can't trust the government, you can't trust the government regardless of whether they have a warrant or whether they are operating in the digital or physical world.

> you can't trust the government regardless of whether they have a warrant

If they have a warrant, what exactly is it that you think I need to trust them about at that point?

A warrant is a check and balance designed by one arm of the government to give another arm of the government oversight into the actions of a third arm of the government. If you don't trust the government, your trust in the entire system should logically fall apart.

Very different kind of trust. To serve a warrant at my home, agents of the government have to be physically present, and they have to give me a copy of the warrant printed on a sheet of paper. The physics of that situation provides auditability. If the warrant was not genuine, the people who served it would go to prison.

Cryptographic back doors are totally different. It is not possible to build a back door that has auditability built into its basic physics the way warrants do. That's the thing William Barr doesn't understand. His mindset is something like, "If we can send a man to the moon, surely we can make a way for law enforcement to break encryption that doesn't threaten people's rights." Well, no, we can't. Sending a man to the moon is merely difficult. A back door that only "the good guys" can use is actually impossible.

Sort of a side note, if police kick down the wrong door (this happened not long ago) they're not justified in pursuing murder charges in that case. If they served the right warrant at the right address then yes, that is justified.

This entire write up stinks and I don't trust the government to implement this overreach in any sort of way which benefits the average American citizen. :(

Cryptography has existed for over 3000 years [1], steganography [2] has been documented in use over 2000 years ago and it's possible it has been used much longer (the entire point is we wouldn't know).

If encryption is being used to hide "valuable criminal evidence", how is that different from someone hiding evidence by burying it somewhere or simply destroying it?

We don't detain random people and force them to give up locations of bodies they may or may not have buried, and we don't randomly search people's houses and possessions -- and we shouldn't be doing the same for encrypted data (and this includes requiring backdoors). If there is other evidence to believe a particular person committed a crime, then get a warrant that compels them to give up the location of the body or the encryption key. If they refuse, then depending on the other evidence used for the warrant it might make sense to hold them in contempt.

In my mind, decrypting data to prove your innocence (in the face of other evidence) is vastly different than decrypting your data because law enforcement is on a fishing expedition (no other evidence).

[1] https://en.wikipedia.org/wiki/History_of_cryptography#Antiqu...

[2] https://en.wikipedia.org/wiki/Steganography

This- encryption + deletion of the key is basically destroying evidence. Which is already illegal and our justice system already has to deal with. No special casing necessary.

Lots of people are bringing up these types of issues, so let me just address them generally. Defaults are important. Yes, it was always possible to build some elaborate booby trapped safe, bury the evidence in the middle of the Mojave Desert, or cook up some home made encryption algorithm to hide evidence. However that wasn't the default. It took elaborate planning and dedication that most people simply didn't have. For example if the average person jotted down an offhanded note, they probably did it in plain English on a regular piece of paper and left it on their desk at home. Now the same note would by default be encrypted on their phone and protected against warrants.

In the same way that private verbal communications are protected by the fifth amendment -- we cannot force people to testify against themselves about possibly incriminating things.

This is a slippery slope.

> Regardless of our views on encryption, we need to have a conversation about that shift. Refusing to have that discussion

That discussion has occurred several times. The government keeps talking after they hear the inevitable "no", at which point it's no longer a useful discussion.

There are no possible alternatives. Every possible alternative is equivalent to a backdoor. Any continuation of a discussion leads to "can we have a backdoor".

"Can we continue the discussion" amounts to "you haven't given us a backdoor yet".

There is a useful distinction to draw, though. There are two versions of "no". Some people use "no, that's not possible" (e.g. for technical reasons or because it'll break security properties), in which case the response either involves asking someone else or trying to legislate without knowledge. And some people use "no, we won't do that" (because it's working as designed and we're not looking to reduce security), in which case the response involves anger and something roughly equivalent in content and tone to "why do you hate (insert country name here)".

Other useful variations on "no": "and what would you have us say when a country you don't like comes to our offices in their country and asks the same question". That one seems to produce slightly more thought, but ultimately an entitled response suggesting there should be some way to prefer their particular jurisdiction over all the others.

> For all intents and purposes, encryption is an unbreakable lock that can serve to perfectly hide valuable criminal evidence.

This doesn't matter. Our rights are not premised on the ultimate physical availability of any given piece of information. There's no "we can always break into the safe" provision of the 4th Amendment.

Fundamentally, the government does not have the right to any piece of your information. A warrant grants them the temporary right to employ certain techniques to try to get it.

I'm going to reply to myself and also point out that we don't require safe makers to make "breakable" safes.

Safe manufacturers make the strongest safes they can, and in parallel, the government develops their own capabilities to attack those safes to execute warrants.

The same thing is true for encryption. At its base theory, encryption is just math--but it is implemented in software, and software is imperfect. The government can, and does, attack devices to break encryption systems to get what it needs.

In fact, the Justice Department Inspector General found that the FBI did not go far enough in trying this before it sought a court order against Apple in 2016. And ultimately the FBI did get into that iPhone by breaking into it.

> For all intents and purposes, encryption is an unbreakable lock that can serve to perfectly hide valuable criminal evidence. Such a thing wasn't possible when our laws were written and has never before been possible in the physical world. Its existence has the potential to be a huge shift in how we enforce the law.

This is untrue. The equivalent thing that anybody has been able to do for a thousand years is keep their written down secrets in an undisclosed location. If the police don't know where you keep them and you don't tell them, they have never been able to read them. Finding an anonymous storage unit among millions is no easier than guessing the user's password.

But you could put them under covert surveillance ahead of time to find the location, you say? You can do the same thing to get their password then.

> encryption is an unbreakable lock that can serve to perfectly hide valuable criminal evidence. Such a thing wasn't possible when our laws were written and has never before been possible in the physical world.

Uh... what? There is plenty in the law concerning modern digital encryption. Ciphers have been around for thousands of years. If by "our laws" you mean the constitution, Benjamin Franklin apparently didn't think encryption was worth restricting during the constitutional convention, and that's not because he did not know about it.

> Its existence has the potential to be a huge shift in how we enforce the law. Regardless of our views on encryption, we need to have a conversation about that shift. Refusing to have that discussion is likely a quicker path to things like government enforced backdoors than if we engaged with government and law enforcement on possible alternatives

You're acting like this is a new debate, but this is something DOJ has been on about for a long time. If the past is any guide we'll certainly "have a conversation" about it when the DOJ begins attempting to put people using or providing forms of encryption they don't like into prison, just like they tried to do 30 years ago.

You act as if the court has no option if someone won't decrypt something they hold the keys to. This is why contempt charges are a thing.

The government has done an astounding job of showing they are untrustworthy with access to our personal information, or frankly even their own information as the OPM breach makes painfully apparent.

I recognize this, and I embrace it.

Cryptography is the one thing in the world that isn't easily defeated by the absurd amount of violence States are willing to commit in the interests of controlling society.

That's a feature, not a bug.

I for one am tired of using coercion to define society and we would do well to embrace anything that disempowers violence.

Except it’s not a discussion worth having.

If you have a back door, it’s there for everybody not just the people it’s intended for. Additionally, there’s not going to be a way to force people to use the encryption that happens to have a backdoor.

It’s an algorithm. People who don’t obey the rules will just use a more secure method when they need to protect something.

This is why there’s no point to having the conversations except to explain it to people.

Do you realize how condescending it is when someone comes to you with a problem and your answer is that "is not a discussion worth having"? Warrants losing their power in the digital age is a problem and our community's refusal to recognize that just pushes the government down alternative routes to something like PRISM.

Also focusing on enforcement is making the perfect the enemy of the good. What percentage of communication in this country flows through either Apple, Google, Facebook, or Amazon? A solution that works for those 4 companies would be a huge step even if it wouldn't result in 100% coverage.

And just to be clear, I don't think the answer is necessarily backdoors in encryption. But I recognize that there is a problem and that we should be open to talk about ways to fix that problem.

Do you realize how PRISM is the very reason that the government cannot be trusted?

>What percentage of communication in this country flows through either Apple, Google, Facebook, or Amazon?

You make it sound like data collection and snooping on innocent people is the whole point of it. If backdoors are required on communication that goes through Apple, Google, Facebook, and Amazon, then nobody's going to use those services for illicit activity that the government would care about. They would use something else.

>But I recognize that there is a problem and that we should be open to talk about ways to fix that problem.

There is no talk to be had, because the entire idea is silly. If the US government can mandate backdoors then so will every other government. This would make everyone vulnerable.

Warrants aren’t losing their power, warrants are gaining power! Never before have you been able to issue a warrant to a phone company and get someone’s location in real time, to a tech company and get copies of all their sent mail perfectly preserved, or to their bank or credit agency and get a record of nearly every transaction they have ever made. The rate at which new information becomes available to law enforcement might be slowing down, but the total is still increasing year over year, even with encryption on the rise.

I do. And yet, there’s no other way to approach it.

You either protect everybody or you make the compliant vulnerable.

> Regardless of our views on encryption, we need to have a conversation about that shift.

Not trying to be snarky here: I don't understand what this conversation looks like. What does it look like? What purpose does it serve, what it's ultimate goal?

But we do have this discussion, but the authorities don't want it. We say "either we have secure communications, or not", they say "I don't believe you, let us have access".

While no physical lock is "unbreakable", there are lots of safes that are close enough to be the same thing to law enforcement. Those safes have existed for a long time.

Computer security is fundamentally different, it is not limited by space or time like physical security. My door locks have to protect me from my neighbors, but my crypto keys have to protect me from anyone with an internet connection.

Also, the importance of being able to get away with crime should not be overlooked. It wasn't that long ago that being gay was illegal. And ICE is operating concentration camps this very second.

Well no, here's another inaccessible bit of information: face-to-face conversations where there are no recording devices present. Hence the importance of testimony in a criminal investigation.

Should we now require all buildings to, by law, record audio conversations in case such conversations might one day be "needed" by law enforcement? Or perhaps there are other ways to perform targeted wiretaps?

You mean your Google home?

There's a reason I don't own one :-P

It's always been possible, and frequent. People have had innumerable conversations relevant to criminal intent, and refused to divulge information on them. The founders were certainly aware of that when the Fifth Amendment was added to the constitution.

It was the development of telephone communication, and warrants for wiretapping, that first made such private communications accessible to the government.

>For all intents and purposes, encryption is an unbreakable lock that can serve to perfectly hide valuable criminal evidence. Such a thing wasn't possible when our laws were written and has never before been possible in the physical world.

What number am I thinking of?

> and has never before been possible in the physical world

People have been hiding information and assets via buried treasure for millennia, with the key (location) only accessible in the brain of the one burying it.

Is it easier now? Sure. But there've always been physical analogues.

If we are going to continue with this metaphor of encryption being a lock...

If you obtain a warrant to bypass that lock, then you have the right to compel me to hand over the keys. In this case, that metaphorical "key" would be my "private encryption key".

The point where this metaphor breaks is when I either refuse to provide that key, or have lost/destroyed it. On one hand, it's trivial for a physical lock to be bypassed, either by picking it or destroying it, thereby allowing you to "get inside" and (the end goal) "search". Of course, to "search" encrypted data does not involve "getting inside". It involves decryption.

It's impossible to entertain this kind of absurdity -- there is no world in which you can effectively ban math.

The law can say anything it wants -- math is still math.

The problem is that there's no way for the government to actually enforce this 100% - sure, on the average person, but we have to assume that someone engaged in espionage / terrorism / etc. is going to take additional precautions. The threat doesn't go away by doing this.

Nothing assumes 100% perfection as a requirement. We might not like it, but if, say, popular services had a way to satisfy warrant requests, that would mean most cases could be satisfied with a warrant even if a small percentage required something else, just as the existence of very-hard-to-open safes doesn’t mean most criminals use them.

To be clear, I’m opposed to widespread access and would want a warrant at a minimum but honesty compels me to note that there are crimes which would be solved if someone used, say, SMS but not Signal and we should consciously accept that as the cost of not living in a surveillance state rather than pretending it’s not true.

Yeah you get the majority easy enough. They’re mostly law abiding anyway, so you conjure up as much petty crap to pin on them to justify the police state.

Meanwhile the connected and savvy minority coddle pedophiles and grifters among their lot.

These are not really new concerns or ideas. The context has shifted from “meatspace” to “cyber space”. Generally the old ideas of trust and verify, avoid unenforceable, spurious, overreach still apply.

There’s an interesting parallel to the Pareto principle here, IMO. Society is pushing for more and more policing of the 80%ish and less on the 20%ish.

Wealth inequality, and civil rights inequality, filter into our tech contexts.

Too bad we largely focus on these things in our favored context rather than see it as the general political plight of the masses, as it really should be considered, IMO.

> Regardless of our views on encryption, we need to have a conversation about that shift.

I agree that this is the real discussion, but I think that there is indeed a vibrant, worldwide discussion happening on this topic with more frequency and intensity than has ever been the case before.

The writing on the wall is unambiguous: the internet is an evolutionary force whose trajectory and destiny are to deprecate monopolistic government. This has already been shown convincingly with respect to censorship. It is increasingly obvious with respect to intellectual property and remix art. On the horizons are monetary policy and policing.

The humane and sane approach here is to get out of the way and let evolution run its course. Every time the state insists on pre-information age norms, it sounds to me like a whining adolescent, surprised that some of its childhood toys have broken.

The old model of investigative surveillance is broken - broken because of cameras which can reveal the conduct of (uniformed or undercover) state agents, broken because instantaneous worldwide communication moves much faster than bureaucracy, and yes, broken because cryptography.

Sure, you can also destroy evidence! This is already a crime we deal with. Encrypting and throwing away the key is deleting with more steps - in fact it is often an implementation detail of deletion in some software systems. It is already illegal, and already something our justice system deals with. No power grabs necessary.
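To make the "deleting with more steps" point concrete, here's a minimal sketch of crypto-shredding. The toy SHA-256 counter-mode keystream is my own illustration, not a vetted cipher (a real system would use an AEAD like AES-GCM); the point is only that destroying the key destroys the data:

```python
import os
import hashlib

def keystream_xor(key, data):
    # Toy stream cipher: XOR data against a SHA-256 counter-mode keystream.
    # Illustration only -- not a vetted construction.
    out = bytearray()
    for offset in range(0, len(data), 32):
        ks = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

secret = b"ledger entries the prosecution wants"
key = os.urandom(32)
ciphertext = keystream_xor(key, secret)

# While the key exists, the data is recoverable (XOR is its own inverse).
assert keystream_xor(key, ciphertext) == secret

key = None  # "crypto-shredding": erase the key and the plaintext is gone
# What remains is ciphertext indistinguishable from random noise,
# exactly as if the underlying document had been destroyed.
```

Deleting a 32-byte key is operationally identical to destroying however many gigabytes it protected, which is why some storage systems implement secure deletion exactly this way.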

> Sure, you can also destroy evidence! This is already a crime we deal with.

That is just assuming the premise. Destruction of evidence is a crime, but destruction of private lawful communications is not. The FBI has no right to a married couple's sexting.

The usual case for destruction of evidence is one of two things. Either they produce some emails where you're conspiring to destroy evidence, or that they catch you in the act, seize the evidence you were destroying, and then use it to prove that what you were destroying was evidence.

Finding someone with a bucket full of confetti or an encrypted drive but no key isn't evidence of a crime, and it's unreasonable to be able to put anybody in jail just because they shredded their old credit card statements or can't remember the password for an old device that has been in a closet for three years.

The fact that the FBI has no right to a couple sexting without a warrant is exactly why encryption is fine. When they present evidence to a judge that there is something they need in those conversations to prove a crime, and get a warrant, then it becomes evidence in a criminal investigation.

IANAL or law enforcement, but I don’t see the problem with this system.

The problem is that you may not be able to decrypt it.

It's like finding footage showing that you drove into and out of a place where a murdered body had been, during the same window in which the body went missing. That's circumstantial evidence that you might have moved it, and it might convince a judge to issue a warrant and have the police search your residence for evidence. But if they can't find anything, it's not reasonable to charge you with destruction of evidence for not producing the body, because they haven't proved beyond a reasonable doubt that you could have.

People forget passwords all the time. Sometimes the police find the phone of somebody else who left it in your car and you didn't even realize it was there, and now you think they planted it and they think you won't unlock it, and the person who knows their phone is missing would rather see you in jail than claim the phone and end up there themselves. More paranoid security systems can make unused space indistinguishable from encrypted data, or send cover traffic when there is no real traffic, and there is no way to decrypt it because it's not actually encrypted data to begin with.

There is no way to prove you can't decrypt something which means it's unreasonable to demand that somebody do it when they may not be able to.
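The "unused space indistinguishable from encrypted data" claim is easy to demonstrate: well-encrypted output and random fill both look like uniform noise. A stdlib-only sketch, using a toy SHA-256 counter-mode keystream as a stand-in for real ciphertext (my illustration, not a production cipher):

```python
import os
import math
import hashlib
from collections import Counter

def entropy_bits_per_byte(data):
    # Shannon entropy of the byte-value distribution; 8.0 is the maximum.
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# 128 KiB of "encrypted-looking" keystream vs. 128 KiB of random fill.
key = os.urandom(32)
keystream = b"".join(
    hashlib.sha256(key + i.to_bytes(8, "big")).digest() for i in range(4096)
)
random_fill = os.urandom(len(keystream))

# Both sit right at the 8 bits/byte ceiling: a forensic examiner
# cannot tell which blob holds a hidden volume and which is padding.
assert entropy_bits_per_byte(keystream) > 7.99
assert entropy_bits_per_byte(random_fill) > 7.99
```

This is the property deniable-volume designs like hidden containers rely on: there is no statistical test that separates "ciphertext with a forgotten key" from "bytes that were never anything at all."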

Planting evidence, losing keys, forgetting where you buried something or being falsely accused of burying something are not new. Dealing with these gray areas is the job of the judge and jury, and it isn’t a technical problem, or a new one.

Refusing to decrypt is itself a crime (contempt of court?), so the government can jail you for years that way.

Edit: this doesn’t help in cases like terrorism where the owner of the device has already been killed, of course.

I am not sure you can be compelled to remember something you forgot. And nobody can tell you if you remember or not, or if you will ever remember again.

So no, some judge may try to hold you in contempt of court, and it may work for a while, but at some point -- if you have a good lawyer -- it will turn into a civil rights issue.

Also, you could plead the 5th as well.

NOTE: I am not a lawyer, and these are just my opinions on this matter.

Courts have ruled that providing a password to decrypt something is not in itself testimony and isn't protected by the 5th amendment.

Judges can certainly hold you in jail indefinitely on a contempt of court charge as well. Considering the nature of the charges that this typically comes up in (see Francis Rawls), I wouldn't count on trying to make it a civil rights issue doing much in your favor.

Other courts have ruled the other way on that. It is not settled law.

But what if someone really does forget the key? They are just jailed forever?

It’s important to point out in this case the stated reason for the contempt charge is the “foregone conclusion” that there IS child porn on those encrypted drives, not the mere existence of encrypted drives with what could contain anything.

Police can’t compel you to provide a combination to a lock (encryption key) to go on a fishing expedition, but if they KNOW the safe contains illegal contents then you can be held in contempt for not providing it (they already know it’s in there).

Without spending forever delving into the case record I can’t comment on whether they should really have a high enough certainty that the drive does contain CP, but the argument in this specific case does match our interpretation of the 5th amendment when it comes to physical locks.

This is a good argument WHY encryption is important, however. Backdooring crypto would allow law enforcement to fish through everything they want with wanton disregard for the fifth amendment, instead of needing to build up a suitable case for illegal acts being committed with standard investigatory techniques. This right here is how the system SHOULD work.

If the prosecution has such convincing evidence that the drives contain the images they say, then why do they need to compel the defendant to do anything at all? If it's such a foregone conclusion, why not just go ahead and try him on the child porn charge?

Dunno, I’m just an armchair lawyer who watches too many Leonard French videos. Like I said in my original comment, I haven’t read the case record in detail, nor am I familiar enough with the Federal Rules of Criminal Procedure to know if there’s some evidentiary requirement they cannot meet without the contents of the drive or whatever.

I’m guessing it’s because they are operating off the testimony of a witness (the defendant’s sister) claiming she was shown the alleged images, and since the images weren’t on the unencrypted internal drive they MUST be on the encrypted external drives. That, combined with the knowledge that these files were purportedly downloaded via his internet connection, is enough for something, but all they have without the drives is hearsay, hence the compulsion to decrypt?

Personally I think in this instance, with recent rulings that an individual cannot be identified by an IP address and a single witness, there isn’t enough to KNOW anything, otherwise I could wardrive around, download a bunch of CP on somebody’s connection, and say I saw them looking at it through a window or something.

All I know is that we do have precedent for this in the physical world, so it’s not a logical leap to require disclosure of cryptographic keys when we KNOW what they unlock.

I just don't buy that argument. If the evidence is mere hearsay, then how could it be good enough for compelling him to testify against himself?

It isn't hearsay in this case though. Hearsay is when Alice testifies that Bob told her he witnessed Charlie viewing child porn. Generally it should not be considered as fact that Bob witnessed this.

It's not hearsay if Alice testifies that she witnessed Charlie viewing child porn, which seems to be the case here. This kind of testimony is direct evidence.

Then there is the circumstantial evidence: The prosecution can show these encrypted drives exist, belong to the defendant, that he knows how to decrypt them and refused, etc.

I just don't see how it's reasonable to claim that isn't good enough for a jury to decide who to believe. They should just try him, or let him go.

This situation also has an analogy in the physical world: if the owner of the key is dead or otherwise non-coercible, that's effectively the same as a physical document being destroyed.

I wonder whether there is any work on plausibly deniable public key cryptography.

The sender uses the public key to encrypt the plaintext, and the receiver uses their private key to decipher the ciphertext, as usual. But, on being compelled, the receiver can also choose an arbitrary target plaintext, and efficiently compute a new private key that maps the ciphertext to the chosen target plaintext.
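I'm not aware of a deployed public-key scheme with exactly that property, but in the symmetric setting the one-time pad has it trivially: given the ciphertext, you can derive a fake key that "decrypts" it to any chosen plaintext of the same length. A sketch (my own toy illustration, not a real scheme):

```python
import os

def otp_xor(key, data):
    # One-time pad: XOR each byte with the key (encryption and decryption
    # are the same operation).
    return bytes(k ^ b for k, b in zip(key, data))

msg = b"meet at the old dock at midnight"   # 32 bytes
key = os.urandom(len(msg))
ct = otp_xor(key, msg)

# Under compulsion, pick any innocuous plaintext of equal length and
# derive a fake key that maps the real ciphertext to it.
fake_pt = b"grocery list: eggs, milk, bread "  # also 32 bytes
fake_key = bytes(c ^ p for c, p in zip(ct, fake_pt))

assert otp_xor(fake_key, ct) == fake_pt  # the fake key "decrypts" plausibly
assert otp_xor(key, ct) == msg           # the real key still works
```

Nothing about the ciphertext distinguishes the real key from the fake one, which is exactly the deniability property the parent comment asks about; doing it efficiently in a public-key setting is the hard part.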

Take a look at this: https://en.wikipedia.org/wiki/Off-the-Record_Messaging

It is designed for that, and more. In fact, it does not stop at the theoretical "anyone could fake the logs": the authors created a tool to do exactly that, so you do not need an expert witness to explain to a court that someone could have doctored the logs; a tool exists for it, on purpose. They call it deniable authentication:

> Messages in a conversation do not have digital signatures, and after a conversation is complete, anyone is able to forge a message to appear to have come from one of the participants in the conversation, assuring that it is impossible to prove that a specific message came from a specific person. Within the conversation the recipient can be sure that a message is coming from the person they have identified.
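The core trick behind that property can be sketched with a shared MAC key (a simplification; real OTR also derives per-message keys and publishes expired ones so that anyone, not just the participants, can forge):

```python
import hmac
import hashlib

# Alice and Bob share the MAC key for this conversation (simplified model).
shared_key = b"session-mac-key-known-to-both-parties"

def tag(message):
    # HMAC authenticates a message to whoever holds shared_key.
    return hmac.new(shared_key, message, hashlib.sha256).digest()

# In-conversation: Alice sends a message with its tag, and Bob, holding
# the same key, verifies it really came from her.
msg = b"hello bob"
t = tag(msg)
assert hmac.compare_digest(t, tag(msg))

# After the fact: Bob (or anyone given the key) can compute a valid tag
# on ANY message, so a logged (message, tag) pair proves nothing to a
# third party -- either participant could have produced it.
forged = b"i confess to everything"
assert hmac.compare_digest(tag(forged), tag(forged))
```

Because verification and forgery use the same key, the transcript authenticates the sender only to the recipient during the conversation, never to a judge afterward.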

And when a physical lock is broken it's known by anyone who can observe the lock.

When a decryption has a backdoor, who knows when it's been decrypted?

It's increasingly difficult to do this without incurring contempt from our bootlicking judiciary.

You're gambling on the temperament of your judge if you do this.

A warrant doesn't decrypt a message, a person does.

Barr is specifically addressing cases where people refuse orders to decrypt their phones or messages and just go to jail instead. That's what he means by warrantless.

Did he basically just announce a false flag?

"Obviously, the Department would like to engage with the private sector in exploring solutions that will provide lawful access. While we remain open to a cooperative approach, the time to achieve that may be limited. Key countries, including important allies, have been moving toward legislative and regulatory solutions. I think it is prudent to anticipate that a major incident may well occur at any time that will galvanize public opinion on these issues. Whether we end up with legislation or not, the best course for everyone involved is to work soberly and in good faith together to craft appropriate solutions, rather than have outcomes dictated during a crisis. "

I'm no fan of Bill Barr, but I don't read this that way, no. It reads to me more like he's saying that from a planning perspective it's better to figure the worst thing that could happen and have a plan already developed that could handle that, rather than being caught by surprise and then having law and policy made in a mad, panicked rush.

(In other words, let's not do with cybersecurity policy what we did with counter-terrorism policy in the weeks after 9/11.)

> have a plan already developed that could handle that

But... we do have a plan, which is to just not do it, in spite of any crisis or whatever. He is misleadingly framing it here as if the issue were whether we have the ability to backdoor encryption, which has never been the problem.

He clearly states that what we need to be wary of is public opinion changing, which is basically like saying that we should just get ready to compromise our standards in preparation for the day when reactionary desire overcomes our "sober" thinking of the present, or else fear the government coming in and doing it sloppily and by force.

Clearly that's irrational. We should resist it now and we should resist it then too, for just the same reasons we resist it now. There's no technology issue here, just an ethical/political one.

San Bernardino also already happened. And it wasn't a big deal. They eventually got the phone cracked, and there was nothing of value on it. But that's beside the point; it's an example of the type of thing they are talking about.

From my perspective, life would not have been meaningfully different either way if that phone had stayed locked. I also can't imagine some future scenario where it makes such a big difference. What kind of information is going to be on some laptop or smartphone that is so important it's worth compromising our general civil rights? There are almost always a hundred human errors around the crime that let them piece it all together without godmode on every electronic device. A smartphone is rarely an all-encompassing security mechanism for any big evil plot.

There is no 'backdoor' technology solution here that makes sense and they need to get used to it.

Barr's argument is really just the same reasoning used for the Patriot Act. Something really bad could happen unless everyone gives up X rights.

Isn't that a bad example though? The counter-terrorism policy was already written before the event; it wasn't a hasty reaction.

The hasty reaction was the panic that led lawmakers to grab for the most sweeping policy option they could find. What looks like prudence and caution in normal times looks like half-measures and cowardice in the heat of an emergency. Those policy proposals would have continued gathering dust forever, had not 9/11 provided a political moment that transformed things that everyone had considered faults into something that looked like virtues.

When a bad event happens, whatever laws you have already, no matter how strict they are, are always seen to be not enough, because people buy into the idea that if they were enough then the bad thing would not have happened. Putting into place strict laws before an event does not stop over-reaction after it.

That's how you're supposed to do things, though. You want to write the policies before the event, when you can think things through slowly and carefully. There's nothing wrong with this part of what he has to say...only the other parts.

I'm saying 9/11 is an example of writing the policy ahead of time and waiting until you need it, NOT throwing a proposal together after a stimulus.

9/11 is an example of what he's proposing, not a counterexample.

This is how I interpreted it. It seems like a wise, level-headed approach.

Yes. But I think it's also a trick. He wants us to accept that we have not already had the debate. He and those who think like him will continue to use this line until they get an outcome they like.

And, by the way, we have had the debate--even, arguably, in the midst of a crisis. This was all over the news for weeks after the shooting in San Bernardino, when the FBI told us it was vitally important to gain access to the perpetrator's phone. They didn't get their back door. Legislation was proposed that would have required it, but it was never adopted. (Though, in fairness, the FBI did supposedly get a private company to break the encryption. But this was only after a very long delay, and after the public debate had largely died down.)

Wise? You think encryption back doors are wise?

Encryption back doors are not wise, no.

Barr's point is that it's better to have that argument now, in a level-headed moment and with opportunities for all the relevant stakeholders to provide input, than it would be to have it in the middle of some dire emergency.

If you oppose back doors, I would think you would agree with him on this -- government tends to be delegated sweeping powers in emergencies, so it would be much harder to stop gov-friendly proposals like "back door all the things" in that kind of moment than it would be to stop them now.

True. And even if we stop them now, we must be prepared to fight the battle again, because after a major incident, people will be screaming "See, we really need them! Give them to us so it won't happen again!" And we will have to explain, again, to people in panic, or people looking to exploit panic, that it still isn't a good idea, even after a major incident.

Agreed. It's better to have the conversation now because if we don't and the "event" does occur those who support encryption backdoors will use it as the basis of their argument in all possible manner.

"We must do something now, it's clear that we need backdoors because of X!" <- That's a much worse position to be in during a debate.

Sounds to me like he's trying to kick off a conversation about how to make it difficult for the bad guys to hide behind encryption and other security products. It's really not an easy problem, as there are very few bad guys we need to expose but many good people we need to protect, for reasons currently known and, more importantly, unknown. This is a hard dichotomy that will be interesting to solve: (1) give no hiding place to the bad guys, (2) make the good guys undiscoverable. It'd be sad to see a solution by decree (9/11-style), but I fear that's where we're headed so long as private companies are unwilling to find a workable solution. Peacetime is a delusion.

They can issue as many decrees as they like but they can't solve the problem with decrees any more than they can decree that water is dry. All they can decree is that it is illegal to use effective encryption, which would be, um, unfortunate.

You have a point. By "decree" what I meant was that in the event of a disaster and panic, the public will back any law that forces, say, Apple to give unfettered access to law enforcement. By then it's too late to engage in debates. The public would become interested and very unforgivingly side with law enforcement. They'd prioritize their safety over being able to send cat pictures securely. Similar to 9/11.

Breaking encryption would cause breaches orders of magnitude more catastrophic than encrypted communications between bad guys.

See it this way: we have to know what the bad guys are saying in order to be able to protect the public. The way I see it the US government (and governments around the world) will make this a non-negotiable objective. There's not a lot of pressure now because, as Barr said, the event that will turn the public against encryption hasn't arrived yet. If the parties involved don't find a solution in the meantime they'd be forced to weaken encryption for everyone when a catastrophe happens. The public is fickle. Our safety is paramount.

> we have to know what the bad guys are saying in order to be able to protect the public

Why do you think that?

Speaking of dichotomies, referring to people as "good guys" or "bad guys" is a peeve of mine. Obviously, society should do what it can to prevent people from committing crime or terrorism, even using lethal force when necessary. These "bad guys" don't imagine themselves as evil actors though. They may be wrong or misguided, but most of them are doing what they think is right. To put things in perspective, Martin Luther King was considered a criminal by segregationists and WW2 resistance fighters were considered terrorists by the Nazis. You're never going to win wars on crime or terror by sifting out bad people from good people.

We can't defer to people's judgement of themselves and their intentions. The law and the courts make an independent distinction between legal and illegal.

I might be missing some subtleties here, because it does read surprisingly reasonable. A "crisis" "galvanizing public opinion" is always a recipe for disaster. See e.g. US overreaction to 9/11, from which the whole world still suffers.

Yeah. I'm no fan of Barr, but "we should think about this stuff before a big event stirs public outrage" is very reasonable.

"Don't do anything until we have a big problem" is how we got the TSA, Homeland Security, and the Patriot Act.

"Be aware that the Reichstag could burn at any time."

It’s really quite incredible how these subversions of our privacy (even including blatant disregard for the constitution) are rarely questioned when it comes to preventing terrorism, but when it comes to mass shootings that have actually killed far more people in the US, those guns are sacred objects.

But notice how the label of “terrorist” is uniquely applied to the ethnic “other”, and now consider the first real gun control legislation — the Mulford Act:


It's not incredible at all if you understand that all the power in the world is ultimately derived from people with guns.

In case it wasn’t already blatantly obvious, my point is that terrorism in the US is fundamentally associated with “brown” people, and elicits the most immediate and direct attention from governments, whereas mass shootings are almost always committed by white men, and even the worst mass shootings against children elicit no real action.

But yes, guns have been regulated in the past — when? After the Black Panther party brought open carry weapons to the state capitol during a protest.

Different motives: mental illness vs. spiritual belief. And there are plenty of initiatives to ban guns after terrorist acts committed by whites. Not sure what your point is.

I just stated my point with perfect clarity. And no, gun control has been legislated almost entirely in response to perceived violence from communities of color.

The public opinion is clear - security should not be compromised, just because some want to have backdoors and can't get over the fact that it's a very bad idea.

> We think our tech sector has the ingenuity to develop effective ways to provide secure encryption while also providing secure legal access.

Yeah, maybe he can also claim the tech sector can achieve a perpetuum mobile. This just keeps coming back over and over again. He should get over the fact that it's impossible and move on to dealing with it. Next time he should consult actual security experts before producing the above nonsense.

More likely transparent opportunism, in my opinion.

Which is also bad and gross.

It's either a false flag or fear mongering, neither of which should be used to take away rights.

Both of which are bad in their own right. He really is an awful AG.

where did we see it before?

"Further, the process of transformation, even if it brings revolutionary change, is likely to be a long one, absent some catastrophic and catalyzing event – like a new Pearl Harbor."

[0] https://en.wikipedia.org/wiki/Project_for_the_New_American_C...

The most concerning part to me is that this speech now prioritizes the interests of individuals (whom he calls "consumers") below those of large corporations and governments.

A country is made of people. In some ways we of course act as "consumers", but that is not the beginning and end of what it means to be human. The government's needs are not endogenous; the government's justification for doing certain things is ultimately that people will be better off for it (otherwise it's simply "might makes right"). In addition, corporations, at the end of the day, get certain protections (and additional requirements as well) because they are machines to help people achieve various ends (e.g. providing goods, providing jobs, providing an opportunity to create wealth); they are not primary actors in themselves.

BTW my observation is not a comment on the specific politics of the past few years; past AGs and FBI heads have given similar talks and will inherently desire to achieve their job's objectives with the minimum of barriers. This scary formulation just shows how far the terms of discussion have shifted.

The discussion on this from the pro-encryption team has to move towards explaining it in terms of national security, as national security is the reasoning the anti-encryption group uses.

National security is a major trump card across parties and administration, and will have to be responded to versus ignored, as that's where the argument is coming from.

It's easy enough to explain that Russia has mathematicians, ISIS has mathematicians the same way they had chemical engineers for the oil fields, China/PLA has mathematicians, etc.

The same fear mongering that is allowing an anti-encryption argument to advance can be used to fear monger right back towards encryption and be based in truth: Russia and terrorists can access my chats.

For the pro-encryption crowd, we know this is actually feasible technically and the end result of backdoors. We just have to explain it on common ground, where the argument lives.

Matt Blaze spent yesterday discussing this on Twitter:


His Twitter feed is well worth a follow if you care about these issues.

You know how every cryptographer and security person feels when this comes up? Like the poor schmuck at NASA who signed up to explore space but instead has to spend their day explaining why the moon landing wasn't a hoax. Again and again.

Even without encryption back-doors, people are under unprecedented levels of surveillance, and it's only getting more pervasive.

So I find it very hard to believe the job of law-enforcement is getting harder, not easier, just because we have some tiny scrap of privacy left.

It is quite certain that law enforcement has more capabilities than they are willing to reveal in courts. Even if encryption backdoors were available, it is dubious they would routinely submit it in evidence.

Encryption is not an impediment to an investigation into ongoing activity: files need to be decrypted to be used, there are side channels everywhere, etc. Metadata and physical surveillance are enough to convict, or to put a person in a position where they could be convicted under some other law if there is no convincing explanation for why they were where they were.

Usually the point of mass surveillance is to retroactively look up a person of interest and blackmail them.

Strong encryption absolutely impedes investigations. And, what indication is there that the primary purpose of mass surveillance is blackmail?

The cat's out of the bag. All this would lead to is law-abiding citizens having their information at risk while criminals continue to use crypto without backdoors.

I wish laypeople hearing this stuff realized how easy it would be for the "bad guys" to use one time pads.

It would be trivial to have terror cells be distributed a USB with several GB of a OTP, and that would be unbreakable even into the age of quantum computing if used properly.
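To the parent's point, a one-time pad is also trivial to implement. A minimal sketch, assuming both parties hold an identical pad and consume it in lockstep (real use would also need message authentication and careful synchronization):

```python
import os

class PadUser:
    """Consume a shared pad strictly left to right; reusing pad bytes breaks OTP security."""
    def __init__(self, pad: bytes):
        self.pad = pad
        self.offset = 0

    def process(self, data: bytes) -> bytes:
        # XOR is its own inverse, so the same method encrypts and decrypts.
        chunk = self.pad[self.offset:self.offset + len(data)]
        if len(chunk) < len(data):
            raise RuntimeError("pad exhausted; distribute a new one")
        self.offset += len(data)
        return bytes(a ^ b for a, b in zip(chunk, data))

pad = os.urandom(1 << 20)   # in the scenario above: several GB on a USB stick
alice, bob = PadUser(pad), PadUser(pad)

ct = alice.process(b"attack at dawn")
assert bob.process(ct) == b"attack at dawn"
```

Used properly (truly random pad, never reused), this is information-theoretically unbreakable, which is the point: no backdoor mandate on commercial products touches it.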

This isn't at all about terrorism or the "really bad guys." It's 100% about accessing the average Joe Blow's communications.

Bad guys already do use such schemes and more.

If the government can’t crack strong encryption as-is, the problem is that strong encryption is deployed at scale.

Build a better exceptional access encryption system that solves human and technological shortcomings.

Removing strong encryption at scale would have far more effect than what you’ve described.

> The Department has made clear what we are seeking. We believe that when technology providers deploy encryption in their products, services, and platforms they need to maintain an appropriate mechanism for lawful access. This means a way for government entities, when they have appropriate legal authority, to access data securely, promptly, and in an intelligible format, whether it is stored on a device or in transmission. We do not seek to prescribe any particular solution. Our private-sector technology providers have immensely talented engineers who have built the very products and services that we are talking about. They are in the best position to determine what methods of lawful access work best for their technology. But there have been enough dogmatic pronouncements that lawful access simply cannot be done. It can be, and it must be.

This seems to be the key part. He doesn't believe technologists who claim both goals cannot be achieved at once; he claims they can.

Is "technologists" a new term for someone who has a basic understanding of how math and computers function?

Over and over:

* https://en.wikipedia.org/wiki/Crypto_Wars

The open source folks have worked around this before:

* https://wiki.debian.org/non-US

A specific claim of the AG, and one that I've seen relatively smart people assert before, is that software update systems could be adapted to insert these backdoors into individual phones, securely and reliably, upon receipt of a valid warrant.

Software update systems have been successfully exploited to deliver malware:

> On a normal day, these servers push out routine updates—bug fixes, security patches, new features—to a piece of accounting software called M.E.Doc, which is more or less Ukraine’s equivalent of TurboTax or Quicken. It’s used by nearly anyone who files taxes or does business in the country. But for a moment in 2017, those machines served as ground zero for the most devastating cyberattack since the invention of the internet—an attack that began, at least, as an assault on one nation by another.


Presumably the software update systems for major operating systems, like for Android or iOS, are typically more heavily secured than M.E.Doc.

But they are also targets of limited value. To insert malware into iOS, you would need not only access to their software update system, you would need access to (and understanding of) their source code and build system, and access to their code signing key.

And even then, it's not clear that these software update systems are even capable of targeting patches down to the level of the phone of an individual person. There's no reason for it now. The central system really just needs to make the update available in its various OS flavors, and each client can request what it needs.

If we force these OS companies to create a targeted backdoor system, all the hard work will be done for the bad guys. They need only achieve access to the special "law enforcement access" system, they will have everything they need all ready to go.

Under these conditions, could Google or Apple keep out the bad guys with 100% success? I have great respect for these teams, but those are very long odds.

It's far safer, for them and for us, to just not build that functionality. This was the point that Apple so forcefully made when Jim Comey came after them to decrypt the San Bernardino iPhone.

EDIT to add: these companies operate in more than just the U.S. If they build a targeted backdoor system, you don't think other countries will demand access to that system as well? Look: Apple already compromised on iCloud hosting to maintain access to the Chinese market.

> And even then, it's not clear that these software update systems are even capable of targeting patches down to the level of the phone of an individual person. There's no reason for it now.

There is reason against it now, because it makes it impossible to do things like reproducible builds or other security checks like comparing the software being offered to other devices to verify that none of them is being offered compromised updates before installing any of them.

It would also require prohibiting the transparency necessary to implement any of those checks independently, or anyone could do so and then use that to detect the attack regardless of whether or not the attackers are domestic state sponsored.
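To illustrate the cross-device comparison idea from the parent comments: an independent monitor could record a digest of the update artifact offered to each client for the same advertised version, and a targeted, backdoored update would stand out as the odd one out. A sketch (all device names and payloads are invented stand-ins):

```python
import hashlib

def digest(update_blob: bytes) -> str:
    return hashlib.sha256(update_blob).hexdigest()

# Hypothetical: digests of the artifact each client was offered for the
# same version string.
offers = {
    "device-a": digest(b"os-13.1-update"),
    "device-b": digest(b"os-13.1-update"),
    "monitor":  digest(b"os-13.1-update"),
    "target":   digest(b"os-13.1-update+implant"),   # tampered, targeted copy
}

# If everyone is served the same artifact, all digests agree; a targeted
# backdoor shows up as a digest disagreeing with the consensus.
values = list(offers.values())
consensus = max(set(values), key=values.count)
suspect = [dev for dev, d in offers.items() if d != consensus]
assert suspect == ["target"]
```

A lawful-access mandate that requires per-person targeted updates would have to defeat exactly this kind of check, which is why it cuts against reproducible builds and update transparency.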

If the US Government succeeds in requiring a backdoor then so will every other government.

Does anyone really think China won’t immediately demand backdoors?

It's even a problem with allies as well as adversaries; countries can potentially get around their own legal limitations by agreeing to spy on one another, and where needed to build a case, covering their tracks via parallel construction. Whether officially or unofficially, any backdoor keys would leak almost immediately.



Not just governments, but various mafias, several corporates, random masonic offshoots, and the occasional creative individual presumably have backdoors into all sorts of stuff. This isn't about creating backdoors; this is about being allowed to use them within the public legal framework.

I thought they already did, but in the layers they control, like Chinese apps and telecoms.

So, not that different from AT&T's infamous Room 641A (https://en.wikipedia.org/wiki/Room_641A).

Right, just with more people onboard and also owning the apps in user space.

I think you have it backwards: $OPPRESSIVE_REGIME is already doing it, but Land of the "Free" somehow needs it, because terrorism.

It's worse. They will use the exact same backdoors.

China already does require backdoors of both local Chinese firms and foreign firms with HQs in China, although the requirement may not be "technically official" (it can certainly be interpreted that way from their 2017 law, though).

I think it's disgusting how supposed "democracies" have been trying to emulate China, both in terms of surveillance and censorship. UK is one of the worst offenders here -- sometimes they didn't even hide the fact they were using China as a role model.

There used to be a time when the U.S. government and other countries would condemn China for this sort of stuff.

He talks about the Fourth Amendment, but not the Second.

If encryption is a weapon then I would think the Second Amendment applies, eh?

Ultimately, enforcing agencies want privacy for themselves and transparency for everyone else. At the scale and speed of digital services, this asymmetry can go down very fast towards an authoritarian path.

There can only be 2 solutions:

- enshrine a right to privacy. Individuals should have a way to communicate in a way that is completely secure and free of eavesdropping, because they are believed to be innocent until proven guilty. Likewise, enforcement agencies should be granted the same to do their work.

- adopt symmetric transparency. Individuals will then be allowed to follow the intimate communications of any leaders or enforcement agencies, with the same level of ease. So if you want me to have to file a FOI to get info about an official, an equally difficult/time-consuming process should exist the other way around. OR if you want an officer to be able to monitor any individual in real time, then I should be able to monitor any officer in real time.

That second case should be automatic anytime the "nothing to hide" argument is invoked.

The cost-benefit analysis is interesting:

> If one already has an effective level of security — say, by way of illustration, one that protects against 99 percent of foreseeable threats — is it reasonable to incur massive further costs to move slightly closer to optimality and attain a 99.5 percent level of protection even where the risk addressed is extremely remote?

> if the choice is between a world where we can achieve a 99 percent assurance against cyber threats to consumers, while still providing law enforcement 80 percent of the access it might seek; or a world, where we have boosted our cybersecurity to 99.5 percent but at a cost reducing law enforcements access to zero percent — the choice for society is clear.

One issue with all proposals around this is that risk = probability × impact. While the above speaks to the probability, the impact of malicious actors getting their hands on master keys would be instant access to any and all gov-mandated communication channels, with the exact same access level that warrants would afford.

While the Attorney General is right that so far most corporate master certificates have not been compromised, none of those had this price tag attached. And the impact would be retroactive: for any present-day communication, we'll be taking it on faith that no future master keys will ever leak.

I would not take that bet; and so far, neither did insurance companies.

In general, I agree with the government stance that "warrant proof" communication is not in the best interests of US citizens. I believe that there is some precedent and established law that can be built upon to provide a compromise that allows for encryption to remain a strong privacy tool for society but one that does not hinder the state from lawful access.

I believe that the US should establish a court similar to the Foreign Intelligence Surveillance Court created under the FISA Act. The government must make a case to a judge establishing probable cause, and if approved a warrant can be issued to a 3rd party communications provider to disable encryption on suspected devices such that lawful interception (i.e wiretap) can be executed.

Warrants are subject to renewal every 90 days and access to encrypted communications prior to the date of warrant approval and not provided by the platform specified in the warrant are prohibited (ie, obtaining a warrant to disable and intercept WhatsApp does not mean you can disable and intercept Signal as well).

I believe this balances the interests of individuals, governments and communication providers evenly.

The FISA Court isn't a legitimate court of law. Why would you want another one? There is no adversary there; it's one branch arguing to violate the Constitution while that same branch pretends to defend the target. It's the Star Chamber of technology. https://en.wikipedia.org/wiki/Star_Chamber

How often are the people making the arguments from the same political party? This problem extends to pretty much every court, as we currently have 3 branches being gamed by 2 political parties.

This will be one of the fracture lines that break the country.

Warrant proof communication is absolutely in the best interests of the citizens for exactly the same reason it's not in the best interest of the ruling government.

If a given nation requires back-doors or compromising encryption in any way....

It seems inevitable that it would help that given nation's "enemies" more than that given nation. Their "enemies" will get a hold of them, and they can make use of them however they want free of restrictions unlike the given nation.

I don't see anyway around that problem.

There’s no discussion of how to build exceptional access encryption that solves the weakening issue, just that it “can’t be done”.

The spirit of this initiative in 2019 is likely more about stopping strong encryption at scale, which is certain to be a frustrating black hole for LEO and the IC.

Perhaps HN would do well to ask how to solve the problem from a technical perspective, given the requirements. This includes both how to build a better mousetrap (one that doesn’t have a “backdoor” or significantly weakens the encryption mechanism), and how to solve concerns about abuse of exceptional access.

> This includes both how to build a better mousetrap (one that doesn’t have a “backdoor” or significantly weakens the encryption mechanism), and how to solve concerns about abuse of exceptional access.

There is a simple way to solve concerns about abuse of "exceptional access": Not to include any "exceptional access" mechanisms. Securely implementing a cryptosystem is a daunting task almost never achieved. Intentionally creating a human-controlled mechanism to access plaintext makes the problem much, much worse.

> There’s no discussion of how to build exceptional access encryption that solves the weakening issue, just that it “can’t be done”.

Please consider that there is fundamentally no way to solve concerns about exceptional access. "Exceptional access" means that there is necessarily a human attack vector: Those humans who control whatever mechanism exists to provide LEO access to plaintext. This necessarily weakens any cryptosystem. If those people are compromised, "exceptional access" will simply be "routine access". Further, because decryption of data emits no obvious signs of physical tampering, even citizens who trust that "exceptional access" is not being abused cannot verify that.

I actually appreciate the name of your 5 hour old account. You're correct. We are experiencing mass hysteria over cryptography. However, it is not security professionals who are hysterical: it's people like you, who apparently never met an argument against liberty that they didn't like.

Let’s leave politics and assumptions about me out of it, please.

Same point: figure out a technological and procedural solution to the human attack vector. If “security professionals” all agree on ideology or theory that it’s not possible and thus refuse to help solve the problem, then exceptional access solutions generally will be worse off for it. It’s independent of whether they actually are deployed.

You've missed the point.

There is no solution. If you build in your "exceptional access" exception, then the system is broken by design and no one will use it. That's the end of the discussion, there's nothing more to discuss. You can rube goldberg "solutions" all day long, but in the end you're just figuring out ways to deploy a broken system.

The government has a different idea of what constitutes “broken” in this case. Of course adding a third party introduces additional risks. Two parties versus three parties: All can access the clear info; neither scenario is without risk. The goal is to find a solution that minimizes the risks of providing exceptional access.

Again, simply arguing that “it can’t be done”, which is of course theoretically true if the goal is to have zero additional risk by introducing a third party, isn’t going to stop such systems from being deployed, it will simply reduce the quality of such solutions due to talent refusing to work on the problem.

An idea that comes to mind: the third party can't trivially decrypt the data (maybe it requires substantial computation to decrypt), reducing the practicality of bulk decryption. Make the exceptional access truly exceptional.
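For what it's worth, the closest established construction to this idea is Rivest-style time-lock puzzles (inherently sequential repeated squaring). A much cruder sketch of the same intent, using a hash chain to stretch an escrowed secret so that each individual decryption costs serial CPU time while bulk decryption scales linearly in cost (all names and parameters here are hypothetical, not a real escrow design):

```python
import hashlib

def slow_derive(escrowed_secret: bytes, iterations: int) -> bytes:
    """Hash-chain the escrowed secret; each step depends on the last,
    so the work cannot be parallelized within a single derivation."""
    h = escrowed_secret
    for _ in range(iterations):
        h = hashlib.sha256(h).digest()
    return h

# Hypothetical tuning: pick an iteration count so one derivation takes hours.
# Unlocking a single device stays feasible; decrypting millions does not.
key = slow_derive(b"escrowed-secret-for-device-123", 100_000)
assert len(key) == 32   # SHA-256 output, usable as a 256-bit key
```

Note the grandparent's objection still applies: hardware gets faster, so any fixed cost erodes over time, and the escrowed secret itself remains a single point of compromise.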

I agree that having a trivial way for governments to access encrypted comms at scale is bad; I don’t agree that governments should be completely locked out, without exception, of all comms deployed at scale by mega tech corporations.

You're describing a broken and unusable cryptosystem. What you believe requires "substantial computation" to break today requires a consumer GPU tomorrow.

There is no minimizing the risk. Your concept is broken. It does not -- and cannot -- provide security of any use. And I don't care what the government thinks about it.

It’s just a brainstorming idea. There would be a key as well; the idea would require both the key and substantial computational power for exceptional access.

Brainstorming ideas are meant to be thrown out there. Attacking the idea respectfully is fine; it helps inspire other ideas.

Of course there’s a trade-off involved. But whether it’s two-party encryption or three-party encryption with exceptional access, neither is perfectly secure anyway. There is major conflation of political ideologies with hardline technical viewpoints going on.

Pre-quantum algos have this shortcoming built in.

Word salad.

> [...] arguing that “it can’t be done”, which is of course theoretically true if the goal is to have zero additional risk by introducing a third party [...]

So then you recognize that "exceptional access" mechanisms, which are already notoriously difficult to implement securely, necessarily weaken a cryptosystem. This brings us back to your OP, where you complain about people telling you the truth you already recognize, and make two entreaties for assistance from HN:

1. "Perhaps HN would do well to ask how to solve the problem from a technical perspective, given the requirements. This includes both how to build a better mousetrap (one that doesn’t have a “backdoor” or significantly weakens the encryption mechanism) [...]"

2. "[...] and how to solve concerns about abuse of exceptional access."

I understand now that you suffer from severe cognitive dissonance with respect to the first. You just acknowledged that the "weakening issue" with "exceptional access" cannot be solved, yet still argue that it can be solved, presumably with more effort from security professionals.

I already addressed the second: Concerns about abuse of "exceptional access" also cannot be solved, except by avoiding their inclusion in the first place.

Your idea is also a non-starter. Human political masters will set the work parameters, not users (otherwise: Who would choose anything but an infinite amount of work to decrypt their communications?). Users would have no way to verify the work required to decrypt as, again, they cannot verify that communications have or have not been "exceptionally accessed". The work parameters must be updated as technology improves, so there must be a way for human political masters to update work requirements (potentially reducing them).

Nobody outside certain SCI or ECI compartments has any idea what kind of cryptanalytic power USG can bring to bear. Maybe, like Skipjack, the proof of work cryptography is subtly weaker than expected in a way that only they know. Maybe the USG will just start allocating $100B/year to routinely use "exceptional access". And certainly, after such a backdoor scheme is deployed, LEO and IC will howl that they cannot access enough plaintext to stop child molesting terrorist superpredators, and anyone who would just think of the children would support reducing or eliminating the burdensome computational obfuscation parameters.

Once again: Any such "exceptional access" scheme necessarily reduces security by inserting a critical dependence on trust in humans that cannot be verified and whose compromise has Biblically enormous value to many groups.

> I agree that having a trivial way for governments to access encrypted comms at scale is bad; I don’t agree that governments should be completely locked out, without exception, of all comms deployed at scale by mega tech corporations.

If we agree on the first part, then we should agree on everything that I've written. "Exceptional access" schemes only make sense for unconstitutional dragnet surveillance purposes and are a severe threat to liberty. If a target is known, and is found to be using cryptanalytically impenetrable cryptography, targeted physical surveillance will defeat that cryptography every time. If some impenetrable communications happen between two non-targets, it doesn't matter that those communications cannot be read, because the government doesn't want to read those communications anyway--right? Of course, serious criminals and terrorists--the ones on whom collection is really important for security--are not going to use known-compromised cryptosystems when non-broken ones are already ubiquitous. Therefore this "exceptional access" is only useful against average citizens; unless, that is, the government is doing dragnet surveillance and attempting to "winnow" out secure communications, something they can only do effectively if they attempt decryption of every "exceptional access-enabled" communication.

Finally, consider your request in the historical context. For a great deal of our history, communications have defaulted to being private (there were no microphones in Lincoln's log cabin) and inaccessible to government agents except through testimony (which cryptography does nothing to prevent). Now your claim is that the government must have the ability to access any communication. But why? Our government and society worked just fine without substantially all communications being recorded and accessible to the government. Such a large shift in the balance of power will, I fear, lead inevitably to tyranny.

It’s a trade-off. There’s no cognitive dissonance, just refusal to work towards better compromises. Lovely argument, though. To put it mildly, it’s clear you’re very passionate about this issue. Maybe my Hacker News throwaway should’ve been called cryptopassion.

There is no compromise. There is either security, or there is not. You want the not, because you prioritize government access to all communications over privacy.

The rest of the world disagrees with you.

No, not true. Crypto isn’t perfect as is, and involves levels of security.

Two-party crypto has two parties who could leak the data. Two-party with exceptional access has three. Current crypto is susceptible to brute force via Shor’s and quantum.

The rest of the world absolutely does not agree with you. It’s just that a lot of people here live in a bubble.

Current crypto is possibly subject to attack due to implementation defects. You're throwing out word salad on a subject you clearly don't understand -- but hey, let's take your statement as fact for a hypothetical second.

Since, as you say, "Current crypto is susceptible to brute force via Shor’s and quantum", then there's no need for backdooring algorithms, since they're all already broken.

I mean, none of that is accurate, but given your argument you're asking for something you don't need because you already have it.

I’d like to petition for the usage of all prime numbers to be restricted as well. Far too dangerous and capable of harm in my opinion. We need to secure ourselves against these threats!

There is absolutely no way for the DMCA to keep up with the growth that may flow under it! How anyone expects the AG to prosecute every illegal infringer or possessor of illegal numbers is beyond me. There is a good article at https://www.natlawreview.com/article/digital-millennium-copy...

Shared secret keys may work, with one key shard in the hands of the user themselves. That way a court order may compel the user to give up the key shard, but no govt. agency or other authority can unilaterally access the device.

Then the shard _is_ the secret key. The court has no more power to compel you to give it up than any other secret key. I'm not sure what you're proposing here.

Not true. The shard alone is insufficient to unlock the secret. But to your point, the scheme could be designed in an n-of-m fashion. The simplest scheme comprises three shards: 1) You, 2) Org, 3) Govt. (ideally DoJ).

Any two can be used in concert to unlock the secret. You and the Org combine shards to access the account. You or the Org can be compelled by the Govt. to reveal a shard through a warrant. The third shard is held at the DoJ, and also requires a warrant.
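A minimal sketch of that 2-of-3 arrangement using Shamir secret sharing; this is toy code for illustration (the field size and the user/org/govt labeling are assumptions), and a real deployment would use a vetted library and a CSPRNG:

```python
# 2-of-3 Shamir secret sharing: shares are points on a random degree-1
# polynomial f(x) = secret + a*x over a prime field; any two points
# recover f(0) = secret, while any single share reveals nothing.
import random

P = 2**127 - 1  # Mersenne prime used as the field modulus (toy-sized)

def split_2_of_3(secret: int) -> list[tuple[int, int]]:
    """Return three shards, e.g. for user (x=1), org (x=2), govt (x=3)."""
    a = random.randrange(1, P)  # real code would use the secrets module
    return [(x, (secret + a * x) % P) for x in (1, 2, 3)]

def recover(share1: tuple[int, int], share2: tuple[int, int]) -> int:
    """Lagrange interpolation at x = 0 from any two distinct shards."""
    (x1, y1), (x2, y2) = share1, share2
    inv = pow(x2 - x1, -1, P)   # modular inverse of (x2 - x1)
    a = ((y2 - y1) * inv) % P   # recovered slope
    return (y1 - a * x1) % P    # f(0) is the secret

shards = split_2_of_3(123456789)
assert recover(shards[0], shards[2]) == 123456789  # user + govt shards
```

Note the scheme itself only enforces "any two of three"; the warrant requirements described above are policy around the shards, not a property of the math.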

This is another clear demonstration that traditional government has become obsolete in the technological era. This is one reason that decentralized solutions such as distributed autonomous organizations are so interesting.


He's actively trying to pass such a law, in case this isn't obvious.
