Hacker News
White House weighs encryption crackdown (politico.com)
243 points by traderjane 6 months ago | 360 comments

There's quite a bit of fallout from banning end-to-end encryption. First, it means snooping on communication channels to ensure that they're either unencrypted or encrypted with the officially approved backdoored algorithm. That's the obvious first step; remember the Clipper Chip's LEAF field, and the LEAF-blower attack? Once some heinous crime is committed using data encrypted with another algorithm but carried inside the official one, you've got to start occasionally decrypting communication channels to ensure that a second layer of encryption isn't present. The logic of official encryption algorithms almost inevitably leads to all comms channels being checked for decryptability.

The second major side effect will be essentially making writing software into a completely corporate thing. Either the official algorithm is a secret (Remember the Clipper Chip and/or Skipjack?), or writing implementations becomes work that only a very trusted few are allowed to do (see inevitability of snooping above). Currently it's very difficult for medium to small companies to do the paperwork for security clearances. It's not going to be any easier to get programmers certified or bonded or whatever to work on official algorithm implementations. This means slower introduction of any innovations, and any buggy implementations will be in use for a long time before getting fixed.

Yeah, your comment makes the tacit (and correct) assumption that "end to end" doesn't mean anything in the phrase "ban end-to-end encryption".

"Banning end-to-end encryption" implies banning/backdooring all serious encryption, reviving the long lines of failed attempts but in the context where serious encryption has wide, wide, wide adoption.

Earlier plans were quickly shown to be entirely impractical, counter-productive, and corrosive. If you somehow enforced this, you could count on someone besides the NSA getting the key. You would have to agree among all the world's states on who gets the key, or else create a "great US firewall" and discard all the infrastructure large companies have spent billions creating. You'd create a situation where "only outlaws have encryption," etc.

What other plans can we come up with? Maybe put software engineers in camps along with the refugees?

Is there a steganographic encoding for binary data, sort of like Base64, that outputs full human-readable sentences?

There's so much bad prose on the internet that it seems feasible to make this hard to distinguish from human-written text. Plus you could have multiple codings for each value, allowing non-deterministic encoding but deterministic decoding.
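The idea can be sketched in a few lines. This is a toy with a four-symbol alphabet and a made-up word list; a real encoder would need a lexicon covering all 256 byte values plus a grammar to make the output read like sentences:

```python
import secrets

# Toy steganographic word encoding: each symbol value maps to several
# innocuous words, so encoding is non-deterministic (pick any synonym)
# while decoding stays deterministic (every word names exactly one value).
# The word lists are hypothetical stand-ins for a real lexicon.
SYNONYMS = {
    0: ["the", "a"],
    1: ["cat", "dog"],
    2: ["sat", "slept"],
    3: ["quietly", "happily"],
}
REVERSE = {w: v for v, ws in SYNONYMS.items() for w in ws}

def encode(values):
    return " ".join(secrets.choice(SYNONYMS[v]) for v in values)

def decode(text):
    return [REVERSE[w] for w in text.split()]

msg = [1, 2, 3, 0]
assert decode(encode(msg)) == msg  # round-trips despite random choices
```

Because encoding is randomized, the same payload produces many different cover texts, which makes fingerprinting a fixed codebook harder.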

How do you prove that the giant integer which represents this text is a random result of my thoughts rather than encoded information?
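That point can be made concrete: any text already is a giant integer, and nothing about the number itself says whether it encodes prose, ciphertext, or noise. A minimal illustration:

```python
# Any message is just a large integer in disguise; the round trip below
# loses nothing, and the integer alone reveals nothing about intent.
text = "random result of my thoughts"
n = int.from_bytes(text.encode(), "big")
recovered = n.to_bytes((n.bit_length() + 7) // 8, "big").decode()
assert recovered == text
```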

Well, the US has handled other difficult issues with super-vague laws, like the Computer Fraud and Abuse Act (CFAA) of 1986, which has been used at least as recently as 2011. With a weird law like the CFAA, a nation can also adopt the Chinese government's "one eye open, one eye shut" policy, where, unpredictably, the law is enforced very harshly, usually on someone totally unlikeable. This makes everyone else very wary of stepping over some boundary, never mind that nobody knows where that boundary is, so everyone approaches the law very conservatively. By applying such a law infrequently, and only to unlikeable defendants, the government makes it very difficult to challenge the law itself.

It is much easier to hide information masqueraded as image or video compression artifacts. Imagine how much you can hide in a single Youtube video.
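A common way to do that is least-significant-bit embedding: perturb the low bit of each sample so the change sits below the noise floor. A minimal sketch over raw bytes (a real tool would target pixel channels or DCT coefficients; the cover here is a made-up stand-in for pixel data):

```python
def hide(cover: bytes, secret_bits: list) -> bytes:
    # Overwrite the least significant bit of each cover byte; changes of
    # +/-1 per sample are indistinguishable from sensor/compression noise.
    return bytes((b & 0xFE) | bit for b, bit in zip(cover, secret_bits))

def reveal(stego: bytes, n: int) -> list:
    return [b & 1 for b in stego[:n]]

cover = bytes(range(50, 58))        # stand-in for 8 "pixel" bytes
bits = [1, 0, 1, 1, 0, 0, 1, 0]
assert reveal(hide(cover, bits), len(bits)) == bits
```

(Naive LSB embedding is detectable by statistical tests; hiding in lossy-compression artifacts, as suggested above, is harder to spot but also harder to implement.)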

Sure: take your binary, use some Markov-chain expansion to yield a human-seeming text, and then compress that with a standard efficient compressor. The result shouldn't be that much larger than the original compressed-then-encrypted binary. Most large text gets sent compressed around the net anyway.
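A degenerate version of that pipeline, with a two-family "Markov" vocabulary standing in for a real language model, shows the size claim roughly holds: the compressor squeezes the filler back out, leaving the output within a small multiple of the payload:

```python
import secrets
import zlib

# Hypothetical expansion: each payload bit selects a word family, and a
# synonym within the family is chosen by position for variety.
WORDS = [["so", "well"], ["then", "now"]]

payload = secrets.token_bytes(64)
bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
prose = " ".join(WORDS[b][i % 2] for i, b in enumerate(bits))
compressed = zlib.compress(prose.encode(), 9)

# Decoding only needs word-family membership, not the exact synonym.
BIT = {w: b for b, ws in enumerate(WORDS) for w in ws}
assert [BIT[w] for w in prose.split()] == bits

# The compressed "prose" stays within a small multiple of the raw payload.
assert len(compressed) < 10 * len(payload)
```

A real system would drive a proper language model with the payload bits, but the compress-after-expansion arithmetic is the same.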

Something like using a book cipher[1]? But a binary equivalent maybe?

[1] https://en.wikipedia.org/wiki/Book_cipher
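For reference, a book cipher in its simplest form just sends positions into a shared text; the "book" below is an arbitrary stand-in, and a binary equivalent would index into any shared file instead:

```python
# Toy book cipher: sender and receiver agree on a text; each word of the
# message is transmitted as its index into that text.
book = ("we hold these truths to be self evident "
        "that all men are created equal").split()

def encode(message: str) -> list:
    return [book.index(w) for w in message.split()]

def decode(indices: list) -> str:
    return " ".join(book[i] for i in indices)

assert decode(encode("all men are equal")) == "all men are equal"
```

The classical weakness is that the indices themselves look nothing like innocent traffic, which is why book ciphers pair naturally with steganographic encodings.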

This is precisely one of the reasons why OpenBSD was developed in Canada: no regulations barring the import/export of crypto.

In the end, it's impossible to police the use of strong encryption. Projects like WASTE and many others saw this coming back in the Clipper-chip and PGP-trouble days. What would end up happening is that people just fall back on one-time pads (OTPs). A well-designed OTP is very resistant to attack.
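For the record, the OTP construction is just XOR with a truly random, never-reused key at least as long as the message; under those conditions every equal-length plaintext is an equally likely explanation of the ciphertext. A minimal sketch:

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    # XOR with the pad; applying it twice with the same key decrypts,
    # because x ^ k ^ k == x.
    assert len(key) >= len(data), "pad must cover the whole message"
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"meet at dawn"
key = secrets.token_bytes(len(msg))   # truly random, used exactly once
ciphertext = otp(msg, key)
assert otp(ciphertext, key) == msg
```

The "well-designed" caveat is doing real work: key reuse or a non-random pad destroys the security entirely.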

This all reminds me of Neal Stephenson's seminal book, Cryptonomicon, one of the best IT fiction books ever written. If you have an interest in encryption, coding, data havens, currency, etc., this is the book for you. It's a veritable tome.

* Editing to say that despite Cryptonomicon being written in 1999, I still think it holds its own even now. Kind of a timeless classic as far as IT goes.

Remember that stable democracies always have the potential to become authoritarian dictatorships. History is littered with examples: France, Germany, Japan, Chile, Turkey and, to a lesser extent, the authoritarian strongmen who were elected in the US, India, Brazil, Venezuela and many more.

>France, Germany, Japan, Chile, Turkey

Not a single one of these was ever a stable democracy before becoming a dictatorship. Unless maybe you're referring to Vichy France, but that's more of an occupation.

Turkey wasn't stable prior to Erdogan?

There was a coup in 1980. And the list at https://en.m.wikipedia.org/wiki/Turkish_coup_d%27%C3%A9tat is pretty convincing evidence that calling Turkey a stable democracy would be a stretch.

It has a long repetitive history of violent transitions in power, including the last couple hundred years.

Turkish democracy is barely 100 years old. If we are looking at a timeline of ~200 years, even the US won't come across as particularly stable, what with the Civil War and multiple Presidential assassinations since 1819.

None of those assassinations led to a coup or revolution, though; just a peaceful transfer to the next person in line. That seems like a vote for stability.

Also, Rome and Greek city states.

Canada is part of Five Eyes, with close cooperation with the NSA. They probably won't bother anyone unless their software is a blocker they encounter when gathering intelligence, or maybe something with lots of press. Whatever happened with the Cryptocat dude, if true, might be an example:


Cryptonomicon -- 100% agreed. Further, it's an awesome story! Highly recommended.

My absolute favorite book of all time, Neal Stephenson is an international treasure.

Ditto on favorite. The man has the knack and the knowledge to put out some fantastic stories. I've often wondered why no one has picked up Cryptonomicon for a film. I think with the right directors and actors, the movie would be fantastic. I envision a full 3-hour film (it would almost have to be).

Neal Stephenson says it's because his books are too long.

Although, allegedly, Amazon is making Snow Crash into a miniseries, and again allegedly, Ron Howard is directing a Seveneves movie. Weird, that, because I didn't think Seveneves was "finished." All that just to set the stage for Elves, Dwarves, and Man, but no followup stories of Elves, Dwarves, and Man? Weird.

I think Seveneves could work as a movie, and actually be good. I don't know how they'd handle the story "transition" though.

Please split it in two. There's enough plot to go around in the present, no need to contaminate with the future stuff. A single post-credits scene as a teaser for the second film is acceptable.

Snow Crash would be the only other book that I could see as a film, although Reamde might do well, too.

Anathem [1] is another book which could be turned into a script for a movie or series. It has a host of colourful characters, scenery varying from something resembling a medieval abbey to space ships, a fairly linear plot line and enough action to keep a modern audience from reaching for their mobile comfort devices.

[1] https://en.wikipedia.org/wiki/Anathem

nah ;)

So, there have been major strides lately in popular, widely used, mass-market, easy-to-use messaging software greatly improving its messaging security.

The US government could all too easily reverse that trend.

And it matters a lot.

There is no country on the planet that stands for individual freedom more than the US; we were founded on the idea. The idea that Canada has stronger free speech protection is laughable. There is zero chance this idea goes anywhere. Look at the sources for the article, that should give some indication of its credibility.

There is no country on the planet that pays more blind, unthinking lip service to the concept of freedom, I'll give you that. But if you think the average US citizen has more freedom, by any objective international measure, than the average Canadian citizen, you may want to put down that Kool-Aid you've been drinking and reconsider, because you would find few knowledgeable people who would agree with your assertion.

18pfsmt 6 months ago [flagged]

Only by some absurd idea of freedom you are able to summon. We still have the freedom to be assholes. Sure, we lost battles against the totalitarians on several fronts, but we are still more free, in the sense that one would be free as the sole occupant of a Caribbean island.

People are allowed to freely associate in corporations and pursue their economic goals. Canada is smaller than California in both population and GDP, and most of your income comes from natural resources like copper, coal, oil, and gold.

Non-Americans can't stop visiting American websites. Hmm.

18pfsmt 6 months ago [flagged]

The negativity on this comment is amazing. These Euro ideas would die so fast. I will pay for your flights to Denver. Please come to CO.

uh the us sucks please fly me to Denver

> There is no country on the planet that stands for individual freedom more than the US; we were founded on the idea

There's a lot packed into that statement. Without switching into an off-topic discussion, I'd suggest that it's a lot like GPL vs BSD licenses - there's more than one way to define "freedom", and one can focus on results vs options, on groups vs individuals, etc.

>there's more than one way to define "freedom"

Only to people with an agenda. This garbage way of redefining words to suit your predetermined policy goals is ridiculous. My comment is heavily downvoted, but nobody has supplied an example of a more free country.

All tech companies are headquartered in the US for a reason. Stable private-property protection and individual freedom are most prominent in the US.

The collectivists lost the argument that had been ongoing since the 1920s when the USSR fell, and some folks want to have this argument again (even when they already lost).

How many times is Europe going to fine big tech for exercising their natural rights? We need at least one country with minimal rules in the world.

While what you say is true, OpenBSD is merely based in Canada by dint of Theo de Raadt residing there and being the project lead. Given that open source is more or less borderless, even if he were told to beg off developing crypto, the code is out there. Ditto every other open-source crypto project. The beauty of open source is that it can be inspected. Pandora's box and all that...

If people want to communicate using encryption, they will do so. Maybe not with today's algorithms and implementations, but they will continue on. OTPs, new algorithms, hiding stuff in plain sight, etc. I think steganography is too obvious and too easily discovered.

Canada has no power in the world. They are our surrogate. OBSD is in Canada because that's where de Raadt lives.

18pfsmt 6 months ago [flagged]

You people are such pansies. We could take you on all day long. I'm in Denver. Bring it. I will take you on the Colorado River, and if you say anti freedom stuff, I will drown you in the river. Try it out.

Come on, please keep HN free of this.


Sorry, man. I was drinking beers. Anti-freedom people make me a bit crazy. It's difficult for me to see so much hate on HN.

"Fifty-Four Forty or Fight!"

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

That's an interesting hypothetical for a law school exam, but I don't think anyone on the Supreme Court would find it very compelling, as it doesn't materially abridge any of those things. People not feeling comfortable saying things is materially different from actually restricting them from saying them.

EDIT: I wholeheartedly disagree with restricting E2E and believe in freedom of speech as strongly as one can believe in any explicitly enumerated right, I just don't agree the text of the First Amendment covers this, and it's hard for me to imagine any of SCOTUS coming up with an argument supporting it.

I would argue that freedom of speech includes the freedom to say things that look like absolute gibberish to the surveillance apparatus. Whether they contain a coded message is irrelevant.

You could argue that, but does US constitutional jurisprudence support that argument?

If US constitutional jurisprudence, in the eyes of our highest officials, supports the argument that corporate money spent politically is speech, then I hope they'd agree that a random-looking sequence of on and off states transmitted through some physical medium with the intention of one or more recipients gleaning meaning from the output is also a kind of speech. I'm definitely not a lawyer, though.

I don't understand how the two are related, can you clarify? What does corporate spending on political campaigns have to do with encryption?

I just mean if they use a broad definition of "speech", including a corporation spending money, I subjectively feel like that definition should also encompass sending text or symbols which appear to be random or encrypted. There is no relation between the two other than that I think if money is speech, pure information, in any form, should also be considered protected speech, unless there's a copyright on the information. Similarly, I think things like burning a flag, or sending someone a painting, also should be (and are?) considered speech.

Yep, my point when asking you to clarify was to see whether you were making this argument: "the Supreme Court will let basically anything crazy count as free speech, so why not this?" It seems like indeed you were.

But Citizens United doesn't rely on a broad interpretation of freedom of speech at all. It relies on a broad interpretation of freedom of association. Supporting various political views is "speech" by any possible understanding -- that's not broad at all. The main "innovation" of Citizens United is that when people form associations (like corporations) they retain certain rights, such as freedom of speech.

A better analogy would be if we already all agreed that individuals have the right to send encrypted messages, and we were discussing whether corporations retained the same rights. Then you'd have a point, bringing up Citizens United. But when discussing whether people have some underlying right at all, it doesn't seem relevant.

I love how, when you apply basic logic, the abuse taking place is shown for what it is.

Of course, people will point to X law or precedent and say "see, that justifies it." They don't see how that law should be invalid by the very logic the state created for itself.

That would be an excellent argument to make in the issue spotter! That said, restricting E2E at a vendor level probably wouldn't be seen as infringing at the individual level; i.e., restricting E2E software doesn't restrict the ability to send gibberish containing a coded message any more than restricting an ink that (somehow) doubled as a plastic explosive.

> restricting E2E at a vendor level probably wouldn't be seen as infringing at the individual level

Would the Citizens United case apply here? I'm thinking about corporate-personhood arguments.

Citizens United did not establish corporate personhood, and is entirely unrelated here.

It might be worthwhile reading Wikipedia on the "chilling effect." The Supreme Court already agreed with the GPP some 70 years ago now.


The term chilling effect has been in use in the United States since as early as 1950.[8] The United States Supreme Court first refers to the "chilling effect" in the context of the United States Constitution in Wieman v. Updegraff in 1952.[9]

The chilling effect was thoroughly covered in my Constitutional Law class during my year in law school. I understand that this is what any argument for First Amendment protection of E2E would ultimately rely on, but you generally need to actually measure, or "plainly observe," said effect. Since most people couldn't tell you the difference between end-to-end encryption and encryption in transit (I can only sort of do it myself), it's really difficult to argue that restricting E2E encryption would actually have a chilling effect.

If I were to print out an encrypted message and publish it in a book, would you agree that the government has a right to ban that book?

Agree. This is more of a Fourth Amendment "right of the people to be secure...against unreasonable searches and seizures" issue.

It would be like federally outlawing clothes to make sure no one in America has a gun. Also, and aside from the whole premise being unconstitutional, shallow, & generally kind of silly, enforcement would be just about impossible.

I think it was Schneier's blog that argued, or highlighted, that it's simultaneously a Fourth and Fifth Amendment issue. On the Fourth: they're trying to search through your digital property. At the same time, getting your password requires you to speak something that might get you charged with a crime. If it's encrypted messages, those are your speech as well, in digital form.

Different judges are looking at it in different ways. I say assume they can search it or hold you in contempt for the password if they intercept or find it. Then, you shift the goal post to them not being able to do that.

Coded messages were critical to the revolutionaries and early founders of the US, so it's odd that a constitutionalist would simply ignore that historical precedent, especially when interpreting the early amendments that were written at the time.

Maybe you'd be better off citing the Third Amendment. Judicial interpretation being that it implies a right to privacy.

So if I establish that my religion believes in end to end encryption and that in order to practice this religion, all communications shall be end to end encrypted and all services will be presented to the congregation using such means... I'm protected by the constitution?

No, that's not your sincerely held belief. But you're overthinking it. This is covered by freedom of speech.

>This is covered by freedom of speech

I think it's a bit presumptuous to go that far. You may believe that it is, but it's hardly an open and shut case, and I'm pretty sure the SC may have something to add to the discussion should it get that far.

I was making a clarification, but this is a more or less open forum where people share their opinions. Do I really need to put a disclaimer before every post? Also, the Supreme Court may very well weigh in and may very well be wrong on the law.

>but this is a more or less open forum where people share their opinions. Do I really need to put a disclaimer before every post?

Not sure I understand; of course it's fine to share your opinion. My opinion is that yours is far too simplistic and completely ignores the nuances that always appear in constitutional cases such as these.

>Also, the Supreme Court may very well weigh in and may very well be wrong on the law.

Sure... but I'd put more stock in their opinion vs your own.

How old does my religions book have to be to count as sincere? If I indoctrinate my children to believe this, does it count? Is there a certain head count where it becomes sincere?

Depends on the book and the context. Generally yes. Short answer no, long answer yes.

I urge you to search for "sincerely held religious beliefs" as there are an abundance of educational documents and case law related to this subject and this format will probably not do it justice.

How do you know it's not their sincerely held belief? Lots of people have concerns about the government listening in on their sermons and about dark forces changing holy messages to trick people into doing and believing the wrong things.

How do I know that the newly invented religious tenet that the OP doesn't practice doesn't spring from a sincerely held belief?

What basis do you have that the religious tenet is newly invented? Even if it were, what's the basis to uphold that something newly invented is not a sincerely held belief?

OP is clearly posing a hypothetical. And courts absolutely look to the amount of time and frequency with which one practices a tenet of their religion as evidence of sincerity. OP might be sincere, but a court would be unlikely to use that as evidence to justify, for example, a wrongful termination claim.

The Supreme Court just needs to decide it's not sincerely held.

It certainly is my sincerely held belief that E2E communication is necessary. If necessary, a schism on Pastafarianism can be created.

But one has freedom of speech without encryption. I'm having a hard time seeing how freedom of speech applies in this case?

I am from the UK, but my understanding is that you can't be made to incriminate yourself, which would likely be an issue here?

Freedom of speech includes, as a subset, the right to speak in encrypted speech.

The First Amendment, ostensibly AES-encrypted via https://www.gillmeister-software.com/online-tools/text/encry... using the password 'speech'

Indeed. You could take the analogy a step further in that if you add the government as an additional party in the conversation, it's compelled speech because they're forcing you to communicate differently.

The first amendment has been ruled to allow restrictions on the time, place, and manner of speech, as long as those restrictions are content neutral.

I suspect you could find reasoning in the First and Fourth Amendments that protects encryption. Freedom of speech would apply to encrypted or unencrypted communications, because you're free to speak in whatever format you choose. It doesn't say that encrypted speech is excluded, so it can only be assumed that it is included.

But you’re assuming that encrypted speech only contains its decrypted meaning or even that its decrypted version means anything. Maybe I want to speak the encrypted text. Maybe an encrypted message is what I want to communicate. Imagine an encryption/coding scheme where the cipher-text looks like normal speech but doesn’t mean what it seems like it means... hopefully you get the idea.

I still don't see how in this encryption scheme you could make the case it isn't allowed by the 1st amendment.

Because you wouldn't be free to speak encrypted messages?

Thank God for the constitution; I expect any reasonable court to toss Trump out on his ear if he tries this in about a minute.

Just like how the courts prevented mass surveillance and illegal authorization of war.

If the federal government only did those things specifically authorized to it in the Constitution, it would be about 1% of its present size, and nobody would care about national elections.

Correct. Hopefully, we can get closer to that.

People seem to be forgetting that cryptographic algorithms were export restricted for a VERY long time.

Trump certainly can't block Americans from selling cryptographic algorithms (or anything which implements those algorithms) if the consumer is also American.

But preventing the communication of those algorithms to foreign nationals is something that was done for several decades.

I think it's a terrible thing for him to try to do, but he would have some legal grounds to stand on, provided the restrictions are related to foreign nationals only.

If my memory serves correctly, export was restricted right up until Phil Zimmermann circumvented the rules by publishing a book containing the source code for PGP, and winning on the argument that the book was protected by his constitutional right to freedom of speech(?)

Unfortunately, you can't put the cat back in the bag. It's no good putting export restrictions on encryption when the tightest encryption we have at present is already available all round the world.

The law had a different purpose: to prevent other countries from obtaining good encryption algorithms. That law didn't forbid developing encryption software abroad and downloading it.

reasonable courts are no longer a given, or even a safe assumption

Trump has assigned his judges so you may be in for a surprise.

Trump has assigned judges? Source?

That is far different from him having assigned the judges that will hear this case. And if you're thinking that it will wind up at the Supreme Court (which is likely), well, Trump appointed 2 out of 9. So "Trump has assigned his judges" still seems like a mis-characterization of the situation.

Anyone that thinks the SCOTUS, as it is currently comprised, will allow for a ban on encryption has no idea what the legal doctrine of The Federalist Society is, nor what it means to be an American conservative. This is the most libertarian court we've had in decades.

Talking about judges as being "assigned" is both inaccurate and betrays an ignorance of the American system.


Are you calling me delusional?

They ruled states have the freedom to design their own voting districts. State sovereignty has always been a thing in the US.

I would encourage you to read the historical documents of the country where many of these constitutional arguments were had: The Federalist Papers

I think there is some confusion here regarding a right to free speech vs. the right to any speech. The courts routinely recognize the state's right to limit modes of speech, e.g. vandalism, yelling in a theater, etc. Isn't encrypted speech just another "mode"?

Encrypted speech isn't another mode, in my opinion. You can send encrypted messages in letters, spoken word, or spray paint; that is, encrypted speech is communicated through the same modes as any other speech. The content is what's different.

You need look no further than the very next right (the right of the people to keep and bear arms, shall not be infringed) to understand that these "rights" are rather squishy in practice.

I 100% agree freedom of speech protects encryption, plain as day; I'm justifiably skeptical that a court would see it that way.

If we allow encryption, how do we know that encrypted file doesn't contain AR-15 blueprints or school shooting plans? Think of the children! /s

I just got my AR-15 blueprints, and the gun I printed was 2x more powerful because they came encrypted.

> A ban on end-to-end-encryption would make it easier for law enforcement and intelligence agents to access suspects' data. But such a measure would also make it easier for hackers and spies to steal Americans' private data, by creating loopholes in encryption that are designed for the government but accessible to anyone who reverse-engineers them. Watering down encryption would also endanger people who rely on scrambled communications to hide from stalkers and abusive ex-spouses.

It does increase the chances that your communications will get out to third parties besides the government, but not for the reasons given in that paragraph.

The way you would actually implement this is not by putting a "loophole" in or "watering down" the encryption between the parties. You'd do it by adding the government as another party. E.g., two-party end-to-end encrypted messaging channels become three-party end-to-end encrypted chats; N-party channels become N+1-party channels.

To a third party, such as a stalker or abusive ex-spouse, who is trying to eavesdrop on your messages these N+1 party systems are as secure as the previous N party version.

The increased risk comes from the risk that the government won't be able to keep its copies of your messages secure after it securely receives them. Presumably they will go into a database somewhere, from which they will be made available to law enforcement and intelligence services. That means there will be a bureaucracy around operating and accessing the database, and given the number of jurisdictions involved and the likely frequency of access requests, my guess is that this would need to be a large bureaucracy. A large bureaucracy dealing with a large amount of sensitive data is just asking for trouble.
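The N-to-N+1 change described above can be sketched with a toy key-wrapping scheme. In real E2E systems the per-message key is wrapped under each recipient's public key; XOR wrapping with pre-shared secrets stands in for that here, and all names and key sizes are made up:

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# The message is encrypted once under a random message key; that key is
# then wrapped separately for every party. Mandating government access is
# just one more wrap; the cipher between the original parties is untouched.
recipients = {
    "alice": secrets.token_bytes(16),
    "bob": secrets.token_bytes(16),
    "gov": secrets.token_bytes(16),   # the mandated N+1th party
}

msg_key = secrets.token_bytes(16)
wrapped = {name: xor(msg_key, k) for name, k in recipients.items()}

# Every listed party, and only a listed party, recovers the message key.
for name, k in recipients.items():
    assert xor(wrapped[name], k) == msg_key
```

Which is exactly why the residual risk concentrates in the government's key store and message database rather than in the wire protocol.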

> The increased risk comes from the risk that the government won't be able to keep its copies of your messages secure after it securely receives them.

Or how about simply not wanting the government, a third party, to be able to scrutinize your private conversations?

Or we'll all end up making Gorky Park-esque dead drops under the cover of darkness.

>The way you would actually implement this is not by putting a "loophole" in or "watering down" the encryption between the parties. You'd do it by adding the government as another party. E.g., two party end-to-end encrypted messaging channels become three party end-to-end encrypted chats.

Adding the government as a passive third party to an E2E-encrypted messaging service is pretty hard to do technically.

... and it wouldn't necessarily give the government access to the messages. To read the traffic the government would need to recognize that a targeted party is communicating and have the right infrastructure in the right parts of the internet to intercept the traffic.

1. What happens if the government starts recording after the session has started? Do you build the crypto so that the government can join at any moment, or only during session establishment?

2. All this intercept equipment would be expensive and unreliable. Networks may use encrypted tunnels between POPs, or change which ASes the messages are routed through.

Using China as an example, the way this would probably work is that the company offering the messaging service would have all messages pass through its servers, read them in plaintext, and save them in a database (think Gmail). The government would be given API access to the database of all messages. Of course, these datasets would then also be shared for marketing purposes, employees would read or steal the messages, and hackers and foreign governments would also get access.

How are you going to enforce such a scheme? What would stop people from downloading software with real encryption? Also, how do you provide oversight to ensure the government isn't abusing the system to track, for example, political activists?

> How are you going to enforce such a scheme? What would stop people from downloading software with real encryption?

Do "anti-circumvention clauses" ring a bell? Or illegal numbers?

> Also, how do you provide oversight to ensure the government isn't abusing the system to track, for example, political activists?

There's something called "separation of powers" for that. It means the executive branch (the government or the cops) normally has to ask permission from the judicial branch (the judges) for that kind of thing.

Or, other possible problems:

the implementation is bungled because it involves 3 parties all not making mistakes instead of 2. Maybe 3+n vendors all being perfect.

Example: the keys the government holds are compromised because they, or a vendor they used, had poor opsec.

I recently finished "Information Doesn't Want to Be Free" and would highly recommend it as informed and well-reasoned commentary on topics like this. Cory Doctorow puts into words the vague, hand-wavy feelings I have when we start talking about encryption, copyright, and freedom of speech on the internet.

The ebook is free (with suggested donation) on his website.

To elaborate more and provide some substance: the existential threat posed by regulations like this is really hard to get across to the broader public. Collecting more good analogies for explaining why everyday people doing everyday things should be able to do so privately is important.

Maciej Ceglowski recently wrote about this: https://idlewords.com/2019/06/the_new_wilderness.htm - he calls it "ambient privacy."

> For the purposes of this essay, I’ll call it ‘ambient privacy’—the understanding that there is value in having our everyday interactions with one another remain outside the reach of monitoring, and that the small details of our daily lives should pass by unremembered. What we do at home, work, church, school, or in our leisure time does not belong in a permanent record. Not every conversation needs to be a deposition.

The article is presented more in terms of JS/app trackers, but I believe the analogy still applies.

Reminds me much of the philosopher Fredric Jameson's notion that modern culture basically exemplifies the end of temporality. There is much less forgetting, including forgetting of inconsequential minutiae, nowadays.

Absolutely. In the book I mentioned above he gives a similar analogy: most conversation is made up of little meaningless things, but those meaningless things are essential to the more momentous things that build upon them. It would be ridiculous to require that all conversations between individuals be conducted with a third-party observer, both for the logistical ramifications and because it would destroy the little things that are, essentially, the building blocks of society.

I also like the bathroom door analogy. Everyone knows what you are doing, but we still insist that there is a door.

Anyone who is in favor of this banning of encryption should have to hand over their unlocked mobile device to anyone who asks. The "I have nothing to hide" perspective doesn't hold up when people think about it more personally. These things very much come down to the question: who will watch the watchers?

"Regulations" is a terrible word, conflating and FUDing necessary protections with unfair restrictions. The deranged, all-or-nothing libertarian refrain "regulations and government are evil becus government" is mindlessly cargo-culted, bombastic, utopian drivel. Laws vary in harm, protection and/or benefit by degree and by circumstance... nuance FTW. Most of the time, corporate capture of government policy-making leads to inaction on major issues, corruption, military-industrial-complex self-dealing, and grotesque capitalist-class-leaning inequality.

I couldn't find a downloadable copy on his website. Here's an external link: https://archive.org/details/InformationDoesntWantToBeFree

If communications had to be decryptable, could I go to jail for emailing the output of /dev/urandom and then not being able to prove that it wasn't an encrypted message? Is it garbage, or is it clever encryption?

They'd put you in a cell with the 8-year old who was smart enough to understand one time pads and thus also had access to unbreakable encryption.
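For the unconvinced: the one-time pad that 8-year-old understands really is unbreakable, and it fits in a few lines. A toy sketch in Python (illustration only, not a production tool; the catch in practice is distributing and never reusing the pad):

```python
import os

def otp_encrypt(plaintext):
    # The pad must be truly random, as long as the message, and never reused.
    key = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key, ciphertext):
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"attack at dawn")
assert otp_decrypt(key, ct) == b"attack at dawn"
# Without the key, ct is uniformly random: every plaintext of the same
# length is equally likely, so no backdoor, warrant, or brute force helps.
```

That last property is exactly why "just ban strong encryption" is incoherent: the unbreakable version is a dozen lines of code and some dice.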

Yes, of course.

God forbid the authorities have to do some old-school actual investigation. The "we need to fight terror and child porn" line never seems to grow old.

They tried the "child porn" route up here in Canada and failed. I just posted about it before I saw this response https://news.ycombinator.com/item?id=20308373

Wasn't a big part of "old school actual investigation" looking at the communications of suspects?

Yes, by doing things like putting "bugs" in their homes or workplaces to listen in on them.

There's nothing stopping the authorities from doing this now; in fact, it's even easier with the level of miniaturization we have today.

Bugging people's homes without their consent seems like an equally egregious privacy violation.

That sounds like a job for Alexa!

That's why they have warrants.

No, investigation existed way before phones.

Only if they can find it.

I am not sure if this opinion is permitted here, but I would suggest that people just implement their own E2E and ignore any rules that suggest you must not do so.

Obviously a business cannot survive by this logic. Rather, implement systems that permit end users to send whatever text they wish, potentially including obfuscated text.

An example of businesses that have done this for decades would be the amateur radio manufacturers. It is illegal to make a ham radio that operates outside the ham bands. But... clip one diode, or hold down two buttons while powering on the radio, and presto: all frequencies unlocked, radio in "debug mode". Surely similar logic could be implemented by clever people here.

"One has not only a legal but a moral responsibility to obey just laws. Conversely, one has a moral responsibility to disobey unjust laws." --Martin Luther King, Jr.

They will have to pull encryption from my cold, dead hands. And I'll strongly consider making an 80%-lower, legally unregistered gun if things get any more insane, because stuff is going off a cliff 1930s-style.

Just do it. Even if things don't get any worse, gunsmithing still qualifies as a bona fide hardware-hacker hobby.

And everybody needs one of those, right? Yet another perfect excuse to spend a lot of money on CNC machines.

I'd print one, but my state AG made it illegal for people in my state to download the blueprints for one.

I've been tempted to get on a proxy, download the file, print out a few copies of the source in book form, and slip it in our local libraries.

3D printing a firearm isn't practical. They are dangerous and only fire a few times. You're better off with an 80%.

An AR-15 lower doesn't have that much strain

I don't understand what this would mean in practice.

"End-to-end encryption" isn't really some fancy new technology. It's just the combination of communication plus encryption. If you and I share a password, I encrypt some data, email you that data, and you decrypt it, that's end-to-end encryption.
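The shared-password scenario above really is the whole idea. A toy sketch in Python, using a hash-derived XOR keystream purely as a stand-in for a real vetted cipher (real systems use something like AES-GCM with a proper KDF):

```python
import hashlib

def keystream(password, length):
    # Stretch the shared password into a keystream.
    # Toy construction for illustration only -- not a vetted cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{password}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def encrypt(password, plaintext):
    return bytes(p ^ k for p, k in
                 zip(plaintext, keystream(password, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

ct = encrypt("our shared secret", b"meet at noon")
# ct can travel by email, chat, or carrier pigeon; whoever relays it
# sees only ciphertext. That is "end-to-end" -- nothing more exotic.
assert decrypt("our shared secret", ct) == b"meet at noon"
```

Which is the point: the "technology" being banned is the act of agreeing on a secret and doing arithmetic with it.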

So what would be banned? Would you not be allowed to email encrypted files to someone? That seems implausible. Would it be legal to email encrypted text files to your friends, but illegal to build software that automatically did both encryption plus email?

That seems pretty weird, making it legal to do a thing but making it illegal to build software that made that thing easier.

IMO this points more to having probable cause than to actually criminalizing encryption itself. If privacy, or hiding something, by itself constitutes probable cause, you can essentially justify any form of search.

Are we in China? It's always about terrorism, but the NSA/FBI/CIA/three-letter agencies fail to do anything when terrorism happens or is about to happen. They have allowed things to happen. This should fall under the 4A for those of us in America, imho.

Intel is a thankless job; if you keep people safe, nobody knows you are doing anything, but if something happens, you're the first to get blamed. Just because you're ignorant about what people working in various intelligence agencies are doing, doesn't mean they aren't doing anything. I worked in various intel fields for most of my adult life, and it gets tiring to hear that intel agencies don't do anything.

That isn't blaming the intel agencies for doing nothing; it's blaming them for abridging our freedoms in trade for purported safety.

Isn't this analogous to IT security policies in pretty much any company?

They get blamed for abridging worker productivity in trade for purported safety.

Contrary to what you may have been told, productivity and freedom are not the same things.

Is not the freedom to be productive one of the unwritten tenets of capitalism?

Can you think of no other freedoms?

it gets tiring hearing that you're violating the constitution, spying on girlfriends, or making insider trading tips using your military-grade equipment, but then someone calls you regarding a deranged Florida teenager with a rifle and nobody stops him.

Sounds similar to IT. If it works, nobody knows you are doing anything, but if something happens, you're the first to get blamed.

Intel seems to generate its own trouble. For every real terrorist there are ten autistic teenagers talked into it.

What concerns me is not when you and your peers are doing nothing, but when you are doing something.

How would you even prove or disprove if something a 3 letter agency did stopped a future attack?

What if it stopped the next 9/11? What if it didn’t do anything at all?

How are you so certain of anything? I’m skeptical of everything. Read, verify, read counter arguments, formulate an opinion.

The liquid airline bombers were stopped [0].

[0]: https://en.wikipedia.org/wiki/2006_transatlantic_aircraft_pl...

According to my cursory reading of your link, that plot was discovered by the UK police, not any of the intel agencies.

Maybe the intel agencies should publicize their accomplishments more if they want a better reputation. 10+ years after some terrorist plot is foiled should be enough time for it to be safe to let the public know.

There is also a 2A argument if encryption algorithms are classified as munitions.

Which, I believe, is how the previous long-term ban on exporting of crypto was justified (ITAR: https://en.wikipedia.org/wiki/International_Traffic_in_Arms_...)

Yes, quite a lot actually. While federal level treatment isn't at China level, you can receive exactly similar treatment just from any local jurisdiction.

Selected people get pretty horrific federal treatment, and there are poor checks on the selection process.

What would this mean for personal web servers running HTTPS? Being forced to use a specific (NSA-breakable) algorithm? Being forced to obtain a TLS cert from a government-controlled source?

I would be really interested in the legal format that they put such a restriction in.

My guess, based on recent history:

It will be an executive order that requires companies with a communications product and over 500k users to implement lawful intercept protocols for communications between any two users.

Most likely companies will repurpose their GDPR tools to provide standardized exports to authorities with legal intercept requests. The regulation will be written in such a way that it precludes end to end encryption as an option, rather than forbidding it specifically. Failure to comply with lawful intercept requests will result in a fine with high interest.

What about foreign companies from countries like China or Russia? How will they make them comply?

An order on what authority?

couldn't they just enforce the existing encryption export regulations to basically break the whole industry?

Export regulations don't require breaking E2E privacy for same-country citizens; and notable E2E cryptographic software is developed outside the US, so US restrictions do not apply to it. So I don't really follow this line of reasoning.

but any software you make that uses e2e dependencies would be subject to the same export regulations, right?

I mean, not even just e2e, just the normal ubiquitous encryption we all use is a "gift" from relaxed export restrictions along side non-enforcement. Some app stores still ask you if you obtained the export license or qualify for an exemption.

The US government could just start enforcing that, fund the agency to enforce it.

It doesn't matter what it enables for citizens; no organization operating in the US would offer it, because doing so would break their global business.

No, I believe that is basically incorrect.

I’m glad we changed each other’s minds

The success of intelligence and code breaking during WW II apparently convinced the elites that a small elite intelligence and criminal investigation force could effectively control terrorism, criminality and excessive dissent in society. Thus, the emphasis on electronic surveillance for national security and law enforcement.

It doesn't actually work very well, since terrorists can hide, gangs can organize, drug cartels can flourish, and society is becoming more disorderly.

Weak or broken encryption privacy in order to continue a failed approach is not the answer. More physical security, more local law enforcement, and more local human intelligence to prevent terrorism and crime is a better alternative. It is better to be physically protected from bad actors than to have a technologically sophisticated process for apprehending perpetrators.

Constitutional considerations aside (since all implementations seem to run up against one or two amendments), the most likely route is probably to force compliance by tech workers/companies when presented with a warrant (see: the law rushed through late last year in Australia).

One basic implementation is that after some closed-door conversations, your specific instance of WhatsApp receives an update which either compromises your keys or performs some background E2E of your plaintext conversations (as stored on your device) with whichever government/law enforcement agency made the request.

Rather than adding the government as a third party to all communications at once, this is a nice easy first step for them to take.

So I'll be honest here: I'm not a STEM person... but how exactly do you "crack down" on something that's essentially just really great math?

What, if anything, is the government hoping to do to stop people from using GPG/PGP? I mean, I guess you could force companies in the US to not ship phones encrypted by default, but they could just sneak a menu asking if you want it encrypted before the first power on.

How does the cat go back in the bag after 2015? Snowden has basically put the word "encryption" in the mouths and ears of every American. We know the government spies on us. We know it uses this information to hold show trials of people it just doesn't like.

How do you crack down on math?

> i guess you could force companies in the US to not ship phones encrypted by default, but they could just sneak a menu asking if you want it encrypted before the first power on.

Free software can violate the law because it can be anonymous. Corporations have addresses, and employees, and owners. They're legal entities and can only break the law when the government allows it.

Our phones are built and sold by corporations, running software installed by corporations, and we download apps from corporate-controlled app stores. Do you think Google or Apple can just say "no, we're going to violate this law" and keep operating that way freely?

You can't stop a hacker from encrypting, because we can root our devices and install our own software, but encryption is only used by more than a couple percent of the population because it's easy. If you can't encrypt your communications without rooting your device, 99% of the population won't encrypt.

> How do you crack down on math?

Funny story: back in the 90s, whenever you downloaded a web browser, you had to promise that you lived in the US if you wanted it to support the state-of-the-art encryption, because encryption--math--was legally considered a weapon so you couldn't "export" it.

You crack down on math by arresting people using the math. It's pretty straightforward. Remember the illegal prime number that was DeCSS?

> How do you crack down on math?

Same way you crack down on anything else. Round up people and throw them in prison if they break the law.

Sure, you cannot stop people from using encryption, but if it's a criminal offense to incorporate it into a consumer application, then you could easily go after the company, its executives, developers, or even users. The first three of those are already enough to have a chilling effect on the rest of the industry.

It won't stop foreign developers in countries that don't extradite people to the US, like China, Iran, North Korea or Russia. You would have to find an excuse to cut those countries off from the Internet or implement a border firewall.

In Russia, we have a law that requires companies that own messenger apps to retain the contents of messages and provide them on request, or to provide the information necessary to decrypt them (encryption keys). There is no liability for end users who use encryption, and for companies the penalty is either a large fine or being added to a registry of blocked URLs. So such a law is difficult to enforce in practice.

The most basic way is to just change "innocent until proven guilty" to "guilty until proven innocent" when encryption is involved. Yes, this is counter to the Constitution (like several other things the government does at the highest level), but it can be achieved de facto by a law that makes it illegal not to surrender encryption keys when asked. So if the cops serve a warrant and find files in your possession that you cannot decrypt, they just charge you with violating the encryption law and make sure the penalties are greater than or equal to those of whatever crime they were originally intending to prosecute.

They already cracked down on copyright, even though any picture/video/audio is just a number (a series of bits).

Law is all about interpreting and enforcing ambiguous rules.

They've cracked down on numbers.


They won't be able to stop it completely, but they will be able to stop 99% of people from using it.

You fall in one of two groups here:

1. The NSA probably has the ability, one way or another, to break E2E encryption. This is a symbolic move attempting to soothe the masses into thinking their communication is still secure.

2. This is vast government overreach trying to stifle personal privacy. Such a bill is idiotic not only because it runs counter to the US Constitution, but also because enforcing such a bill would be virtually impossible.

Even if the NSA could break E2E encryption (a very big if), that doesn't automatically imply that local, county, and state level law enforcement agencies would be able to.

I would love to see the Tim Cook vs. current-admin war that would happen, though. After the FBI vs. Apple fight over hard device encryption, an iMessage encryption battle would be epic, and I wonder just how far they would take it if they got hard-lined under the current conservative "Silicon Valley is trying to censor us" rhetoric.

If Apple were somehow forced to drop e2e encryption I would honestly expect Tim Cook to step down and run for president. He has an incredibly hardcore political side.

Just to take a moment and remind everybody that your encryption is pointless anyway because of ME[1] and its ilk.

[1] https://en.wikipedia.org/wiki/Intel_Management_Engine

The boat has sailed on this. Strong encryption is everywhere, the algorithms are widely known and understood, and anyone who wants encryption will have it. Back when encryption was considered a weapon in the US, that did absolutely nothing to stop everyone else in the world from having it; I don't see it working any better in reverse.

As far as law enforcement goes, it reminds me of a favorite Futurama bit - Some cops can read minds... Some cops can see the past... And some cops get help from angels... But there's still one cop with no special abilities whatsoever. To solve this crime he'll have to FIND OUT.


I can't wait for the mass hacks that would come from this, with millions of people's payment and other information getting snooped. People would stop using the internet for serious things again, and after enough of those hacks our industry would die.

Not this shit again.

* Honest Government Ad | Anti Encryption Law - YouTube || https://www.youtube.com/watch?v=eW-OMR-iWOE

It is interesting that Russia already has a similar law (IM apps are required to retain message content and provide it on request; they are required to ask for a phone number at registration; websites with more than a certain number of visitors must do the same). Luckily, some foreign companies like Telegram openly ignore it, and decentralized messaging apps like Tox are unable to comply with the law.

It would be difficult to enforce such a law without implementing network filters.

The article mentions "tech companies." But isn't SSL end to end encrypted? And doesn't nearly every business that makes transactions online use SSL?

TLS (fka SSL) is not really what we mean when we talk about end-to-end encryption; it's client-server encryption. Law enforcement would usually not have a problem with that, as they can present a court order to the operator of the server to secretly siphon off the decrypted data at their end and send it to the government.

End-to-end encryption is where the service provider has no ability to decrypt any of the data that passes through their servers. Only the endpoints can do that, which (usually) can't be tapped without the endpoints' knowledge.
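The distinction is easy to see in a toy model (hash-derived XOR standing in for real crypto; the key labels are illustrative, not any real protocol):

```python
import hashlib

def xor(data, key):
    return bytes(d ^ k for d, k in zip(data, key))

def derive(secret, n):
    # Toy key derivation from a shared secret -- illustration only.
    out, i = b"", 0
    while len(out) < n:
        out += hashlib.sha256(f"{secret}:{i}".encode()).digest()
        i += 1
    return out[:n]

msg = b"hello bob"

# Client-server (TLS-style): Alice shares a key with the SERVER.
# The server decrypts, so a court order against the operator works.
server_key = derive("alice<->server", len(msg))
seen_by_server_cs = xor(xor(msg, server_key), server_key)

# End-to-end: Alice shares a key only with BOB; the server relays
# opaque bytes it has no way to decrypt.
e2e_key = derive("alice<->bob", len(msg))
seen_by_server_e2e = xor(msg, e2e_key)

assert seen_by_server_cs == msg                  # operator can comply with a warrant
assert seen_by_server_e2e != msg                 # operator has nothing useful to hand over
assert xor(seen_by_server_e2e, e2e_key) == msg   # only Bob recovers the message
```

Same wire, same server, same math; the only difference is who holds the key, which is why "ban end-to-end" really means "force the operator into the key path."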

I assume they'd come up with a new encryption standard where you use an additional (government) public key to encrypt with, allowing the government to decrypt it later?

Or maybe they'll only enforce the rules on "chat applications" or things the general public uses?

I guess they're talking about this right now.
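That "additional government public key" idea is essentially key escrow, roughly what the Clipper Chip's LEAF field did. A toy sketch of the structure, with random symmetric keys standing in for the real public-key wrapping (all names here are illustrative):

```python
import os

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

# Per-message session key that encrypts the actual content.
session_key = os.urandom(32)

# Toy "key wrapping": the session key is wrapped once for the
# recipient and once for the escrow authority.
recipient_key = os.urandom(32)
escrow_key = os.urandom(32)  # held by the government (hypothetical)

header = {
    "for_recipient": xor(session_key, recipient_key),
    "for_escrow":    xor(session_key, escrow_key),
}

# Either party recovers the same session key from its own slot:
assert xor(header["for_recipient"], recipient_key) == session_key
assert xor(header["for_escrow"], escrow_key) == session_key
```

The structural weaknesses are also visible here: the escrow key is a single catastrophic point of failure, and (as the LEAF-blower attack showed for Clipper) nothing stops a client from filling the escrow slot with garbage.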

I’m sure you’re right. And I guess I’m saying something obvious when I say they haven’t thought this through. So they’re going to get every bank, every e-commerce website, every server and every client to upgrade their SSL packages? That’s daunting. (that’s ignoring the challenge of never leaking the government’s golden key)

I sincerely recommend reading the book Crypto if this topic interests you. This isn’t the first time we’ve been here, and it won’t be the last, but Crypto does an excellent job going through some of the history and the absurdity of the idea, both from a technical and political perspective.


A Canadian prosecutor tried to force someone this year to provide a password to decrypt the contents of their cell phone via the court system. She failed [0].

[0] https://www.canlii.org/en/on/oncj/doc/2019/2019oncj54/2019on...

Those who do not learn from history are doomed to repeat it.

Strong encryption was illegal for years. It didn't stop people having strong encryption. If they make it illegal for American companies to make software with unbreakable encryption then that will leave the door open for companies in other countries to fill the gap. The best algorithms are all public so it's not like it would be that difficult.

That's what happens when un-educated people legislate about stuff they don't have any clue of. Same in Europe with the Article 13 copyright reform.

Problem is, I do not see any reasonable way to get around this, as a large chunk of the voter base is old, uneducated, tech-illiterate people... and politicians tend to cater to whatever they want (which is mostly law-and-order crap such as this).

> That's what happens when un-educated people legislate about stuff they don't have any clue of

This is becoming an increasing problem as legislators get older and technology becomes more important to society.

Not going to happen. We've been through this before.

What I do think we might see is some kind of corporate incentive (secretive or not so secretive) to effectively push normal (non-computery) people into using non-E2E services. Google is already doing a fantastic job of this.

It's difficult to see how any restrictions on encryption can be deemed constitutional, especially by a conservative majority court. In my view the 1st and 4th amendments deem encryption constitutional, "corporate personhood" reaffirms this.

> the Secret Service regularly run into encryption roadblocks during their investigations

Maybe it's time we change the language and stop letting government officials label individual rights which are working as intended as "roadblocks" into government investigations.

> experts generally agree that Congress is unlikely to pass a bill requiring warrant-compatible encryption

Boo hoo. That's the system working as intended. Do your job correctly and stop making an enemy out of the People and you'll find requiring a warrant isn't a "roadblock".

Agreed. There is no other context where this type of argument works. If I have a right to privacy, that's obviously dangerous because it means I can privately do bad things without easily being seen or caught. That was always the trade off with giving citizens rights and liberty. You can't simply eliminate our rights just because they are a barrier between law enforcement and public welfare. That decision was already made long ago and the choice was to take the risk because the alternative, where those in power can violate the privacy of anyone they choose, turned out to be a pretty terrible thing. Those barriers are a feature, not a bug.

This is not something that is often said out loud, but most people would not want to live in a society where it was impossible to commit a crime and get away with it.

Lawbreaking is one of the means that people have to fight against an unjust system.

Think of all the people who support the Drug War and its effects on certain demographics, while simultaneously going above the speed limit and drinking and driving every chance they get.

I don't think most people have the innate sense of wanting it to be possible for everyone to commit crimes; I think it's much more selfish in that they just want themselves and their associates to get away with committing crimes.

> Think of all the people who support the Drug War and its effects on certain demographics

Who are those people, exactly?

I run in mostly right-wing circles (online especially), and the vast majority of people on the right seem to be opposed to the "War on Drugs" on the grounds of individual liberty. I can't imagine many on the left are in support of it - am I wrong?

I don't think it's nearly as popular as it might seem.

While the war on drugs is in its waning days, there is an (aging) segment on the right that worships law enforcement and is generally in support of more laws and heavy-handed police tactics.

I think the OP is calling some of these people out as hypocrites who demand "tough on crime" policies that disproportionately affect poorer communities, while they themselves break laws.

That aside, I find the hypocrisy elsewhere, namely the people who want small government but who want everyone to pay billions for mass incarceration (and raise taxes for more police) as a result of these policies.

Talk to any middle-aged suburbanite who is scared of their own shadow. They'll give up anything to stop the boogeymen of drugs, terrorists, pedophiles, etc.

It is less popular than it used to be (especially among young Trump supporters), but support for harsh incarceration policies based on drugs is pretty much exclusively a right-wing thing. If it didn't have substantial support from the right it really wouldn't be a thing any longer. I will say, to Trump's credit, he has been pretty decent on this issue compared to his conservative predecessors.

This is an instance of where our single political axis falls down.

You might run in right-wing libertarian circles, but I'd speculate that if your experience was based on the religious right you may perceive higher levels of support.

I do agree with you though, it's lost a lot of momentum over the years and now opinion seems to be that it's clung onto by law enforcement and the lobbyists of the companies who supply them because it allows for their continued militarisation and supports a significant portion of their budgets.

Not only that, but the possibility of committing a crime and getting away with it is the only way we could ever change the laws.

Sometimes things are illegal, and society changes and decides they shouldn't be illegal (interracial marriage, homosexuality, alcohol, marijuana, etc)... if people couldn't 'dabble' in the illegal thing at all, there would be no way for people to learn enough to decide that the laws are wrong and need to be changed.

I ask people that same question all the time, and then I think about China, and how someday, if we don't start pushing back harder against this insanity, we may not have a choice about what we want.

Julian Assange is being prosecuted for allegedly aiding in hacking a foreign adversary he believed to be committing war crimes, and lo, they were indeed.

What am I supposed to do if I ever come across a cache of information revealing the literal Holocaust 2.0 happening in China?

If the Holocaust had been discovered by a US citizen who broke intercepted German communications, would we have allowed Germany to prosecute that civilian for hacking their infrastructure?

I doubt that. I am pretty sure that most people commit crimes every day and that they’d be annoyed if they were enforced consistently (traffic laws like jaywalking and speeding come to mind).

I think you may have misread the parent, they're saying that people LIKE the fact they're not enforced.

Oops, I missed the “not”. Thanks!

Then maybe it is better to change the rules, e.g. limit speed and make roads curved and narrow so that you cannot drive fast?

We're living in a world where practically everyone is a criminal under some interpretation of the law. So law enforcement is arbitrary and only done discriminately.

The ability to break the law, and get away with it is not just vital for the functioning of society, but for social progress to keep advancing.

Gay rights would never have been won if it were possible for law enforcement to jail all gay people for existing (sodomy laws, etc.), for example.

Selective enforcement isn't a good thing... It's a symptom of a bad thing.

"Show me the man and I’ll show you the crime" - Lavrentiy Beria

Absolutely, government tends towards ever increasing authoritarianism and unjust criminalization of the people.

So the right to privacy (and therefore the ability to break the law) is one of the most important checks against government.

As someone who lives in an area full of narrow winding roads, let me assure you that doesn’t make a difference to speeding.

>This is not something that is often said out loud, but most people would not want to live in a society where it was impossible to commit a crime and get away with it.

It's said out loud all the time (or maybe my filter bubble is less authoritarian). The problem is that politicians seem to be using the cesspools of Twitter, Reddit, etc. as their primary data feed.

We have to let the government install thought reading chips into our brains, because the government regularly runs into the roadblock of people knowing things but not saying them during their investigations

People with clearances have to undergo regular lie detector tests. I wouldn't be surprised if a loyalty chip is the next step on that path.

I'm not convinced a very dark future is preventable

No they don’t. Certain types of clearances for DoD or the Intelligence Community may require polygraph along with the background investigation process, but the vast majority of cleared US government or contractor personnel do not have to undergo polygraph testing – they only go through routine reinvestigations at the 5- or 10-year marks following favorable adjudication for their initial clearance.

Ancestor post said "lie detector". A polygraph test would be a separate thing entirely. I'm not entirely certain why you thought to link them.

The lie detector in the regular process is when the federal employee investigates whether the answers on your questionnaire match up with public records and in-person interviews.

The polygraph is security theater intended to intimidate the subject and possibly reveal previously undisclosed issues by provoking a stress response. That's why they keep using it.

That’s true, although the requirements for different clearance levels are drastically different, and most people don’t refer to the standard background (re-)investigation process as a “lie detector,” even if the investigators are in fact attempting to determine your honesty in addition to evaluating other signals about your behavior and potential ability to be influenced or manipulated.

Most of the time, comments about “lie detectors” are a reference to polygraph tests, which only apply to an extremely minor percentage of the overall cleared workforce; I just wanted to point that out, that it’s not quite as bleak as implied by the parent.

I was tongue-in-cheek trying to break the implied association between polygraph and "lie detector".

The lie detector in a polygraph test is always the human running it, and they're about as fallible and unreliable as anybody else, with respect to determining honesty. They could just chuck the machines in the trash and call it a "veracity interrogation", but selling the machines and training the people to use them is a better money sink, and gives more ass-cover when someone invariably deceives the investigators. "Trained to beat the machine" sounds better on paper than "really good liar".

Security theater needs its props.

As far as I know, only those working in secure compartmentalized facilities and with high-value assets ever get polygraphs.

If you hold a key and wait by a coded terminal in a nuclear missile silo, you get one. If you reduce and analyze anti-ballistic missile test telemetry, you don't. If you write systems code for submarines, you might get one. If you write route-planning software for in-flight refueling tankers, you don't. My guess is that it ultimately depends on how much Country X would probably pay you to borrow or copy your access. If it's above $Y, they do a little more to scare you into being a good little guardian of the nation, and hope you're not another Snowden.

They just have way too much need for cleared personnel to spend enough to actually make certain, for everybody. Doing it correctly always costs more, in time and money. Why do it right when you can make it look like you did it right, and get paid the same?

Quote from a friend who works at such a job: "I just want to drink beer and get my dick sucked"

I really don't think there is much hope. He just thinks everything works out in the end when in reality it is people fighting tooth and nail and giving up their lives to fight for this shit.

fMRI data taken from direct brain scans is admissible in Indian courts to determine whether a suspect is lying in high-profile cases. India also has the largest biometric database (outside of the NSA).

Wow - that's the first time I've heard that actually being operationalized.

Who's read The Truth Machine? https://en.wikipedia.org/wiki/The_Truth_Machine

Has anyone actually studied whether this works? Can you refuse to take an fMRI?

"Warrant-compatible encryption" requires a warrant, hence the name. The concern is that law enforcement can't decrypt things even if they have a warrant, so the encryption provides more privacy than is provided for by laws and constitutional rights.

See https://www.eff.org/deeplinks/2016/04/burr-feinstein-proposa... for the previous bill.
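For concreteness, "warrant-compatible encryption" proposals are usually key escrow, roughly in the spirit of the Clipper Chip's LEAF field. Here's a toy sketch (hypothetical names, repeating-key XOR standing in for a real cipher, not a real scheme): every message carries a field wrapping the session key under a government-held escrow key, so a warrant plus the escrow key decrypts the traffic without any brute force.

```python
import secrets
from itertools import cycle

def xor(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR stands in for a real cipher in this toy sketch.
    return bytes(a ^ b for a, b in zip(data, cycle(key)))

# Escrow key held by the "escrow agent" (the government, in Clipper terms).
ESCROW_KEY = secrets.token_bytes(16)

def encrypt_with_escrow(plaintext: bytes) -> tuple[bytes, bytes]:
    session_key = secrets.token_bytes(16)
    ciphertext = xor(plaintext, session_key)
    # LEAF-style field: the session key wrapped under the escrow key,
    # attached to every message.
    leaf = xor(session_key, ESCROW_KEY)
    return ciphertext, leaf

def warrant_decrypt(ciphertext: bytes, leaf: bytes) -> bytes:
    # With a warrant, the escrow agent unwraps the session key;
    # no brute force is ever needed.
    session_key = xor(leaf, ESCROW_KEY)
    return xor(ciphertext, session_key)

ct, leaf = encrypt_with_escrow(b"attack at dawn")
assert warrant_decrypt(ct, leaf) == b"attack at dawn"
```

The structural weakness is the same one the Clipper Chip had: a client that forges or strips the leaf field defeats the escrow while still interoperating, which is why enforcement tends toward inspecting all traffic for unsanctioned encryption.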

> The concern is that law enforcement can't decrypt things even if they have a warrant,

The concern is that law enforcement apparently can't understand that "Warrant-compatible encryption" is an oxymoron.

If encryption is a munition (if privacy is a weapon) then the 2nd Amendment is in play, eh?

(edit: WWII, Turing, Bombe, Enigma.)

Definitely, after all encryption is a crucial tool for citizens to protect themselves from oppressive governments, so it fits the spirit of the law even better than it fits its letter.

They should stop expecting all of their job to be easy as a Google search, get off their rears, and do real detective work. If it isn't worth real-world work, then it's probably not worth eroding citizen rights over.

> "Warrant-compatible encryption" requires a warrant

It sure does, just like my front door requires a key :)

I'm not sure what point you are making, but the entire argument in favor of these laws is to make encryption work more like front doors. In order to open the door the police require a warrant, but once they have that there are no technical issues which make it hard to get the door open; they can just call a locksmith.

What's your experience working in the IC or with LE?

Warrants don't work, that's the problem. There have been many instances now where a judge issued a legal warrant that was impossible to enforce, because of encryption. Should that be the case? It's not a simple question.

Just because the ability to hide information has reached unprecedented levels does not suddenly make the principles hard to grasp. Warrants have never been a guarantee of getting evidence. They are only permission to look.

Finding the answer to law enforcement being able to make these warrants useful while also maintaining rights is complex. Seeing a terrible solution as terrible is simple.

Is it unprecedented for a warrant to fail? People used to burn paper correspondence all the time.

Ok, so warrants fail. There are cases where people are held in contempt of court, essentially for life. Is that a good resolution to warrants being broadly, commonly enforceable? Those people aren't convicted of the crime they were accused of; in some cases they may have even forgotten the encryption key.

Indefinite contempt charges are why forcing someone to hand over a password, encryption key, submit to biometrics, etc. needs to be banned.

The current state of affairs is a clear violation of one of the US’s founding principles: “innocent until proven guilty” (and also the 5th amendment).

Remember, the Bill of Rights was tacked on after the Constitution. The Constitution establishes what powers the Government could and couldn't have, purely in its relationship with the constituent states; the Bill of Rights filters out what it doesn't. And in its vagueness, the Constitution claimed quite a lot.

The ability, power, and right to investigate crimes is certainly reserved to both the states and the Fed. Government in the Constitution.

The ability to not be forced to participate in one's own prosecution is one of those rights the Constitution guarantees.

The fact that modern life typically entails recording and documenting almost every single thing you do during the day, and yet law enforcement complains about “going dark”? Utter nonsense.

People have never been more widely tracked, or more widely accountable. The absolute last thing we have is a privacy epidemic.

The government has the right to investigate crimes. It does not have a specific right to have defendants keep records of what they did wrong, or to be provided that evidence, outside the context of business regulations which require specific paperwork to be maintained.

And by the way, the Constitution specifically enumerates what powers the Federal government has. All else is reserved to the States, or to the People.

Which is why we weren't supposed to need a Bill of Rights in the first place. The Constitution doesn't enumerate Speech or Assembly as something the government can control, therefore it can't. However, the Framers didn't all agree on listing a bunch of rights just in case: the worry was that it might make people think that if something isn't listed, then maybe it's not a right.

It's strange to me that this isn't somehow covered under the 5th amendment. How can you be compelled to provide something (say, an encryption key) that might incriminate you?

The ability to hide things has reached unprecedented levels?

Yes, it should be the case. No one should be forced to testify. That's violating free speech. Giving up information that only exists in your head (location of keys, or a passphrase) is testifying.

And the entire concept is backed by State violence. People are only compelled to testify under threat of violence in some form. This form used to be open torture, but we've found ways of outsourcing the torture to other criminals in order to wash the State's hands clean of blood.

Sometimes we need to catch a bomber. If you ask me, that problem is only getting worse and "sometimes" may turn into "often". However, we can't risk a journalist or political activist being caught in the jaws of the system while attempting to expose the State's secret crimes. And less severely, the State doesn't have the right to know what your shopping list was last week.

The entire crux of this argument revolves around the fact that at one time it was easier to do this stuff thanks to wiretapping laws and banned encryption, but now it's harder. However, the wild west of the '70s through the '90s isn't the baseline for sensible policies when it comes to digital intrusion into your life by the State without reasonable cause.

I agree, forcing people to testify to provide their keys is probably where our system will get stuck on this for good. That's a bad line to cross, and I don't think we'll cross it if we remain a lawful country.

I say this below, though: when people refuse to serve access to a lawful warrant, there are now multiple cases where people will be held in contempt of court essentially indefinitely, likely for life. The court is certain that the information necessary to convict them exists, but is being vexatiously withheld by the suspect.

I don't really think that's a good end. I think holding people in contempt that long is a system failure. What do we do about that? Just release people when they hide evidence that they can still have full access to once they're released?

I think it's been easier to do this stuff since the founding of the government, but our government does have certain powers, lawfully enacted by the wills of all of the states for the betterment of society. If some of those powers can't reasonably be enforced, and other, less lawful, more coercive powers arise in their absence, I think everyone loses.

We have to adapt our laws to reality; not the other way around. The reality is that even compelling me to produce a hard drive is backed by State violence.

So we must accept that a certain amount of State violence is necessary to keep the peace. Think of a schoolyard bully going unchecked; it takes someone standing up to them and enforcing the threat of future violence in order to end the actual occurrence of violence.

But there is a clear legal framework here:

I don't have to produce any materials to the State without a warrant acquired by due process (and not parallel construction). This is my protection from unwarranted search and seizure as outlined by the US Bill of Rights.

However, I don't have to tell you where it is. I can't prevent you from taking it, but I have the right to not have to testify. Because the enforcement for my testimony comes from State violence, and we learned from the Dark Ages what happens when we allow the government the power to extract information with force.

It would be quite ironic if, in our quest for not "going dark", we enter the Digital Dark Ages.

Perhaps we need to address the definition and scope of lawful warrants in this case. Not a lawyer (clearly), but there is a big gap between requesting data that may be gibberish bytes when encrypted ("here's the hard drive, officer!") and being provided that data, vs. requesting data that also requires decryption using something only known to the person being served.

This seems like a pretty clear distinction to me that could be argued in court. But, as I said... not a lawyer.

That isn't the failure of warrants, our rights, or any other aspect of the legal process. That's a failure of law enforcement. They can't break encryption, sure, but they also can't read the private thoughts of individuals - yet somehow they've figured out ways to get the information they need most of the time.

A great solution in many of these cases would be to work alongside companies like Apple to implement more products like Face ID. With Face ID, they can take a few pictures of a person, 3D model their face, open the phone. It's an easy thing with the right resources. It's a technological equivalent of interrogation. The same can be done in other contexts, they just need better hackers, better investigators, and more resources on their side.

What you posit with interrogated Face ID is a method of hacking encryption. You've provided a backdoor, that is the person's face. Previously the suspect's phone was encrypted, now the encryption is broken against the suspect's intent. A warrant was used to break the encryption.

Do you see what I'm getting at? You came up with warrant-breakable encryption, but didn't call it that. This is why it's not an easy question-- the solutions are nuanced, and some are a lot uglier than others.

Maybe? There's a huge distinction between warrant-breakable encryption as a default (because it's easier) and enforcing that nobody can encrypt data unless it's encrypted using biometric data (or other information that is available outside the head of a user)

Furthermore, there was a false sense of security to begin with. You shouldn't legally be able to say the data is safe when it isn't safe from anyone with a couple of photographs.

> With Face ID, they can take a few pictures of a person, 3D model their face, open the phone.

Note that Face ID is intended to be resistant to this kind of attack.

Sure, but the only way to build resistance to it is to expect more detail in the scan and they can't do that without making the system less convenient for people who grow a beard, wear sunglasses, etc. It can't go pore by pore on the person's face anyway, the infrared scanning process where it projects dots onto the face just isn't that advanced. It might be resistant, but it's hardly a challenge to someone with government levels of resources.

Again, as far as I am aware, Face ID is intended to be secure against concerted effort to break it. I have not heard of anyone coming up with a serious attack against it, though I'm curious to hear if you have.

The question isn’t if warrants should work but whether or not anyone is able to have secure communication. This runs directly counter to all of the privacy law stuff being enacted and discussed. It means government officials and agencies get less or insecure communication too. It means clandestine intelligence agencies don’t get to mask their communication in a sea of other encrypted communications.

People are acting as if unbreakable encryption and secure computing is a given. It isn’t. It is extraordinarily difficult and in nearly all use cases not foolproof.

The warrant excuse is fairly ludicrous anyways. We now have vast information pools of where people have been and what they were doing. We have DNA evidence. Law enforcement has never been easier in the history of human civilization.

Of course it should be the case. There are plenty of ways to make information hard to recover, and if we're going to ban encryption I think we should take a hard look at fire as well.

Fire is irreversible. Once the original is destroyed, it can't be read or written to by the burner or anyone else. After evidence is destroyed, it only exists in (human) memory.

Encryption is reversible. The information encrypted can be retrieved, changed, shared, etc. It proves itself potentially continually useful to the person who would hide evidence.

Encryption is irreversible if I throw away the key, just as burning paper is irreversible once I light it on fire.
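The "throw away the key" claim can be made precise. A minimal sketch with a one-time pad (hypothetical message, Python stdlib only): with the key in hand, decryption is trivial; once the key is destroyed, every same-length plaintext is equally consistent with the ciphertext, so the data is gone as surely as ashes.

```python
import secrets

# Toy one-time pad; "meet at dawn" is a hypothetical message.
plaintext = b"meet at dawn"
key = secrets.token_bytes(len(plaintext))  # random key as long as the message

ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))

# With the key, encryption is reversible:
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == plaintext

# Throw away the key: for a one-time pad the ciphertext is now unrecoverable
# in an information-theoretic sense -- no amount of computation helps.
del key
```

(Real systems use AES rather than a one-time pad, where key destruction leaves recovery computationally rather than information-theoretically infeasible, but the practical effect is the same.)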

Ok, so a suspect tells the judge under oath that they encrypted the evidence, and the key is gone, as if it had been burned.

Just as in burning with a fire, that suspect can be charged with willful destruction of evidence or obstruction of justice, contingent on mens rea.

The judge may be unlikely to believe that the data is irretrievably encrypted, just as they may be unlikely to believe all records were burned.

A warrant is nothing more than an authorization to conduct a search and/or arrest. It is not a guarantee of finding what you’re looking for. Warrants work fine.

I think it is a simple question. There have also been cases where a judge issued a legal warrant that was impossible to enforce because the information was physically destroyed.

Should it be illegal to burn a piece of paper?

The FBI and our police forces have been complaining about the 'going dark' problem for years. But in reality, it is far easier today to spy on or collect data than it has ever been before. If the argument was that our data collection was substantially lessening, maybe we could have a discussion about that.

But data collection is substantially expanding, and the FBI is essentially saying, "yeah, but it's not expanding fast enough." The FBI is phrasing this as a complicated question because they're saying it's a choice between the status quo and "the FBI can't listen in on anything." But in practice, if you look at the direction surveillance capabilities are heading, the real choice is between the status quo and "the FBI can listen in on everything."

And that's a really easy question to answer. Of course we shouldn't give up all of our freedoms just so the police can always access everything they want, all the time. Of course people should be able to exercise personal freedom even if it occasionally means a police investigation is hampered.

The FBI wants to phrase this as a choice between encryption and anarchy, but given the direction government surveillance is headed, encryption is the middle ground position. What the FBI wants is the ability to access anything they can get a warrant for, no matter what, for any reason.

And I guarantee once they got that, the next "complicated question" would be whether or not warrants were hindering their investigations too much.

I addressed the fire metaphor below. Destruction is irreversible, encryption is reversible, in short. But to answer your question, yes, it can be illegal to destroy evidence.

I agree with you, this is a problem. What I advocate for to anyone who'll listen is to encourage more use of warrants, and sharply curtail the use of subpoenas. Subpoenas allow for bulk collection, and much much much more violation of the 4th Amendment happens under bulk subpoenas than it does under warrants.

Ignoring that encryption is also irreversible if you throw away or forget your key, is the distinction between fire and encryption that, "we could get at the encrypted data, so therefore it's different and we should be able to"?

That doesn't make sense to me. If your argument is that every warrant should be executable, no matter what, then both fire and encryption block warrants.

If anything, fire is worse, because at least encryption is reversible. If you burn something, we can never get it back -- that makes fire way more risky and dangerous to law enforcement investigations, so I would expect it to be much more highly regulated.

Yes, that's what's different.

The canonical example here is child pornography. Consumption and distribution is a crime. If a person destroys it before the state sees it, well, that may be destruction of evidence, but the state will never know. If the person can have full access and go back to consumption and distribution, then justice can't be done, and most people agree it should.

I don't think that any warrant should be executable. I'm saying this problem is complex, people are coming up to ways to address it, and they're not the utopic visions of personal privacy-- they involve unconvicted detention, coercion, etc. Miscarriages of justice.

> and they're not the utopic visions of personal privacy

This is sort of going back to what I was saying in my original post though. Encryption isn't a utopic vision of privacy, it is the middle ground. It's not harder today to catch child pornographers than it was before the Internet existed -- it's easier. The fact that we have encryption is being balanced out by the fact that we also have video surveillance, and remote exploits on Internet connected devices, and more payment tracking.

The FBI is not looking for a sensible, middle ground between privacy and bringing people to justice -- the argument they're making is, "if even one criminal escapes, that's too many."

Again, if we were in the opposite situation, if encryption did really mean that we couldn't catch criminals any more, then I'd be more inclined to say this was a complex problem. But the 'going dark' problem is largely a myth; with the advent of the Internet, it has never been easier to monitor citizens or access their data.

This is only a difficult question if it is genuinely a question between anarchy and encryption, but we're not even close to that being the choice we have to make.

You may not be of the opinion that every warrant should be executable no matter what, but when the question is put into the context of modern surveillance, it seems obvious to me that this is the FBI's stance.

I address this upthread, but when you talk about how it's easier to bring people to justice in the world today, frequently that ease results from over-broad subpoena or bulk-collection powers. I think that warrants give much more freedom to us as individuals, including the freedom to be private. What I would like is to see the government systematically encouraged at every step to use warrants on criminality, rather than subpoenas for potential criminality. So in that world, I think warrants need to be more useful.

> The canonical example here is child pornography.

This is always the example used to justify anything unreasonable to suspects. For me it's an automatic argument loser, like Godwin's Law.

I'm not using it to suggest unreasonable actions against suspects; I'm using that example as a clear case where the data is, itself, incriminating to possess or transact.

In this case you can replace it pretty easily with any sort of material that’s illegal to possess, such as intellectual property that you do not own the rights to.

I don't think this is helping your argument, especially on HN. "We need to end privacy to preserve Warner Brothers and Disney profit margins"?

(If I have a DRM encrypted file that I don't have the keys for, am I now committing two offences? Are law enforcement entitled to a free copy of the HDCP keys?)

That's not the point: s/DRM/any sort of "illegal" content/g. Note that I am not discussing (or providing my position on) whether it should be possible to own "illegal" content, I'm just clarifying txru's argument.

> most people agree it should

Mob mentality doesn't make a very good government, especially when manipulation of information comes so easily. No matter the crime, we can't start the basis of infringing upon an individual's rights on "well most of us on the island agree we should do this". William Golding illustrates this well in Lord of the Flies.

Is this not similar to hiding child pornography and not disclosing where you hid it, which I believe would be protected under the 5th amendment?

I don't think the fire metaphor is right exactly because encryption is reversible as you say. A better metaphor would be hiding information and only you knowing its location. Knowing a password to decrypt information is functionally similar to a situation in which you have hidden documents and only you know where.

In the case of hidden documents, warrants/subpoenas are used to compel their production. Certain actions (like holding someone in contempt) can also make cooperation more likely (though not guaranteed).

Why do similar procedures not suffice for encryption?
