Hacker News
Senate encryption bill draft mandates 'technical assistance' (thehill.com)
389 points by chlodwig on April 8, 2016 | 209 comments

This is not just government access. For a judge to be able to order your computer unlocked means that people at Microsoft, at Lenovo (or whoever you use), and at your ISP all have the keys to unlock your data, even if you used an “encryption” feature. You have to trust every company, everywhere, to never have a security incident involving those keys.

I posted my full thoughts after the first reading through this at https://rietta.com/blog/2016/04/08/feinstein-burr-encryption....

Specifically, any U.S. company would be required to maintain the ability, through unspecified means, to retrieve the plain text from any data “made unintelligible by a feature, product, or service owned, controlled, created, or provided by the [company].” And the company would then be required to turn over such data in real time “concurrently with its transmission” or “expeditiously, if stored by the [company] or on a device.”

It's chilling.

It's not just chilling, it's insane.

A crypto-secured digital communication should be analogous to sending an untampered-with letter, or to having a private conversation in a secure location.

Just because the medium of communication changes doesn't mean we should adopt wildly regressive policies.

By proposing that all this 'cryptographically secured'[0] communication can be accessed at a later date, they're basically saying that the contents of all snail-mail letters should be photocopied, and all private conversations should be recorded.

[0] If there is a possibility of 'technical assistance,' then it is not cryptographically secure.

> A crypto-secured digital communication should be analogous to sending an untampered with letter, or having a private conversation in a secure location.

I don't think anything in the bill requires a provider to keep around copies of information they don't already keep. They need to be able to provide access contemporaneously with transmission (i.e. basically a wiretap), or if they have stored it or can access it on the device.

If you receive a sealed letter and the police get a warrant, they can search your drawer and open that letter. If you have a private conversation with someone, a prosecutor can subpoena that person and force them to testify about what you told them.

I'm amenable to the argument that the law shouldn't stand in the way of progress; it's possible to communicate much more securely now than it ever was in the past and the law shouldn't stand in the way of people taking advantage of those advances. But the bill isn't "regressive"--it's attempting to maintain powers the police already have with traditional means of communication.

> I don't think anything in the bill requires a provider to keep around copies of information they don't already keep.

Take password hashes: if this were to become law, you'd have to store passwords in clear text or encrypt them with a key you control. You frequently see that sort of practice turn up in massive security breaches. Homomorphic databases are out too; you can't provide a service without being able to deliver plaintext to the state. So no decentralized security. You'd be forced to choose, during the handshake, between clear text or a CA model that offers no protection from state-level attackers, which is no coincidence.
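The password-hash point can be made concrete with a toy sketch using only Python's standard library (function names here are illustrative, not from any real service): a properly stored credential is a salted one-way hash, so even the service holding it cannot produce the original password on demand.

```python
import hashlib
import os
import secrets

def store_credential(password: str) -> tuple[bytes, bytes]:
    """Store only a random salt and a slow one-way hash, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from a candidate password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return secrets.compare_digest(candidate, digest)

salt, digest = store_credential("correct horse battery staple")
assert verify("correct horse battery staple", salt, digest)
assert not verify("wrong guess", salt, digest)
# There is no function that maps (salt, digest) back to the password.
# Complying with a "produce the plaintext" order would force the service
# to stop hashing and store passwords recoverably instead.
```

The point is structural: the service can check a password without being able to produce it, and a mandate to produce it removes exactly that property.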

> But the bill isn't "regressive"--it's attempting to maintain...

Device manufacturers are covered. So even if it is unrelated to an ongoing service, a manufacturer is still required to render assistance in compromising the product. That is regressive. And I've only mentioned the primary impact; this thing would make the manufacture of end-user-serviced devices economically impossible.

Unless we had a less horrible alternative... what if manufacturers were offered the alternative of installing a key escrow chip? I could swear that I've heard that idea proposed before.


I forgot to include software developers, so let your imagination run over that. That BitLocker feature offered by Microsoft? That is covered.

I'm referring to the analogy to "photocopying snail-mail letters." I don't read anything in the bill to require companies to keep a history of the communications. It contemplates situations where the government has obtained the encrypted communications by some other means.

> So even if it is unrelated to an ongoing service, it is still required to render assistance in compromising the product. That is regressive.

"Regressive" means stepping backward from the status quo. The government can already compel your bank to drill your safe deposit box; it can subpoena your accountant to spill the beans on the complex financial transactions he arranged for you; etc. Compelling assistance from tech companies might be a bad idea, but it's an extension of the status quo.

> I don't read anything in the bill to require companies to keep a history of the communications.

That isn't what I said or what I addressed. You said "...keep around copies of information they don't already keep." That isn't true, for the reasons I previously explained. It isn't restricted to "communications"; it covers any stored data. If you get it in plain text, then you are required to render it on demand in plain text. You are not allowed to use forward secrecy as a feature of your product.

> "Regressive" means stepping backward from the status quo.

I know what the word means, you've misused it. The USG making it illegal to render information unusable to them for some theoretical future investigation is not the status quo. That logic would make manufacture of paper shredders illegal.

Can the government presently compel a bank not to offer super-secure highly drill resistant safe deposit boxes?

Signal by Open Whisper Systems is intended to work in such a way that a subpoena for someone's chat logs would be useless because there's nothing useful to subpoena. They cannot access the keys necessary to decrypt.

Assume for the sake of argument that they've achieved their goal from a technical standpoint. As a result they are unable to comply with subsection (a); they cannot provide the unencrypted information, and no amount of technical assistance on their behalf would allow them to do so.
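A toy sketch of why such a design has nothing to hand over, assuming a Diffie-Hellman-style key agreement (the prime, generator, and XOR "cipher" here are illustrative only, nothing like Signal's actual protocol): the relay only ever sees public values and ciphertext, neither of which lets it decrypt.

```python
import hashlib
from secrets import randbelow

# Toy Diffie-Hellman parameters -- NOT secure, for illustration only.
P = 2**127 - 1  # a Mersenne prime
G = 3

alice_secret = randbelow(P - 2) + 2
bob_secret = randbelow(P - 2) + 2
alice_pub = pow(G, alice_secret, P)   # these two values are all
bob_pub = pow(G, bob_secret, P)       # the relay server ever sees

# Each side derives the same shared key from the OTHER side's public value.
alice_key = hashlib.sha256(str(pow(bob_pub, alice_secret, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_pub, bob_secret, P)).encode()).digest()
assert alice_key == bob_key

msg = b"meet at noon"
ciphertext = bytes(m ^ k for m, k in zip(msg, alice_key))

# The server forwards (alice_pub, bob_pub, ciphertext). None of those
# reveal the shared key, so a subpoena to the server yields nothing
# the server itself can decrypt.
plaintext = bytes(c ^ k for c, k in zip(ciphertext, bob_key))
assert plaintext == msg
```

The subpoena problem in subsection (a) follows directly: the provider holds the ciphertext at most transiently and never holds the key, so "technical assistance" cannot conjure the plaintext.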

Section 3. (c) "License Distributors" would appear to prohibit Apple from making the app available in its app store.

"Technical assistance" is very vague.

If the FBI can legally compel Apple to produce a backdoored firmware for their phones, they can also compel Open Whisper Systems to produce a backdoored version of the Signal app to intercept all plaintext on the client side. Since almost everybody has automatic updates turned on, they won't even need to find a way to push the rogue version to the target's device.

Every service that relies on a vendor-provided app is open to this kind of "technical assistance". Ditto for any webapp that relies on vendor-provided JavaScript crypto libraries. The vendor can "technically assist" themselves at any time to gain surveillance capabilities they previously did not have.

>I'm referring to the analogy to "photocopying snail-mail letters."

The analogy is that anybody who intercepts and stores encrypted communication data has a kind of Schrödinger's-cat photocopied letter.

If at some later date they acquire the encryption key, then they have a photocopied letter never intended for them.

If they don't acquire the key, they don't have a letter.

(Or perhaps they still do 'have a letter' they just can't read it, that's kind of a metaphysical question)
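For a one-time pad, the "metaphysical question" has a crisp answer, sketched below with stdlib Python (the messages are made up): without the key, the ciphertext is consistent with every same-length letter, so the interceptor provably does not "have" any particular one.

```python
import os

letter = b"attack at dawn"
key = os.urandom(len(letter))  # one-time pad: random key, same length
ciphertext = bytes(m ^ k for m, k in zip(letter, key))

# With the key, the interceptor has the photocopied letter:
assert bytes(c ^ k for c, k in zip(ciphertext, key)) == letter

# Without it, ANY same-length letter is equally plausible: for each
# candidate plaintext there exists a key producing this exact ciphertext.
decoy = b"attack at dusk"
fake_key = bytes(c ^ m for c, m in zip(ciphertext, decoy))
assert bytes(c ^ k for c, k in zip(ciphertext, fake_key)) == decoy
```

Practical ciphers aren't one-time pads, but the intuition carries over: the stored ciphertext only becomes "a letter" when the key arrives.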

> it's attempting to maintain powers the police already have with traditional means of communication.

I recently watched the first season of The X-Files, circa 1993. Someone kidnaps Scully. Mulder traces the kidnapper's call to Scully's cell phone, which as a cell phone was then considered impossible to locate. The protagonists repeatedly bought airline tickets with cash under assumed names to avoid government surveillance. Mulder is at one point punished by being assigned to transcribe many hours of audio surveillance.

That was the state of the art not so long ago. There was no CALEA. Surveillance was an extremely labor intensive process. Mass surveillance was unfathomable. There was no after-the-fact record of the contents of realtime communications.

You either had a warrant ahead of time and planted a bug in the suspect's house and paid for many overtime hours for agents to sit in a van listening to the suspect talk, or everything the suspect said was gone as soon as they closed their mouth.

And there is still nothing that stops them from doing exactly that. Except now the bug is smaller and less expensive and they don't need a van to be in range of the transmitter and there is software to do the audio transcription and other software that will convert an audio recording of key presses into the text of the key presses and on and on.

People could encrypt 100% of everything in the world and the police would still be a thousand light years in front of where they were back then.

If this "maintenance of traditional powers" argument is to mean anything then it has to go both ways. It can't be that technologies that make privacy invasion easier are allowed and technologies that make it harder are prohibited. And it seems like it would make a lot more sense to let both citizens and police-with-a-warrant use whatever technology each can devise rather than hampering both to the detriment of everyone in an attempt to artificially maintain some largely subjective measure of balance.

The way the bill would accomplish that goal is the problem. A warrant is a court order allowing police to work within a very narrow scope - a single tapped line or residence to be searched. It doesn't force the phone company to build a machine that can tap every phone call in the city at once. If that's what wiretapping entailed, bad guys would have stolen a lot of mass-surveillance boxes by now.

This is broad-scope. It would compel encryption providers to maintain a master key to decrypt all traffic running through them. A single security incident with one of those providers would be enough to compromise the data of every customer. Communications providers to large American businesses with foreign competition would immediately become incredibly high value targets for bad actors.
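The single-point-of-failure argument can be sketched in a few lines, using a toy hash-based stream cipher (nothing here is a real escrow protocol; all names are illustrative): every customer's session key is wrapped under one master escrow key, so one leak decrypts everything at once.

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Expand a key into n pseudorandom bytes via counter-mode hashing."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, msg: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(msg, keystream(key, len(msg))))

decrypt = encrypt  # XOR stream cipher: same operation both ways

ESCROW_MASTER = os.urandom(32)  # the mandated "master key"

def wrap_session(msg: bytes) -> tuple[bytes, bytes]:
    """Encrypt a message under a fresh key, then escrow that key."""
    session_key = os.urandom(32)
    return encrypt(session_key, msg), encrypt(ESCROW_MASTER, session_key)

secrets_list = [b"customer A trade secret", b"customer B payroll", b"customer C merger plan"]
records = [wrap_session(m) for m in secrets_list]

# One theft of ESCROW_MASTER decrypts every customer's data at once.
stolen = ESCROW_MASTER
for (ct, wrapped), original in zip(records, secrets_list):
    session_key = decrypt(stolen, wrapped)
    assert decrypt(session_key, ct) == original
```

Without escrow, an attacker must compromise each session key individually; with it, the master key is a single high-value target whose loss is total.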

I also gather that "technical assistance" means "backdoor".

If that's the case, then it's ridiculous and it appears nobody in the U.S. Government has learned from history. When they did this with export controls on encryption the only people overseas who had access to proper encryption were those who were outside the law, and those who developed encryption in countries that don't have restrictions.

Of course, I may not be following what is meant by "technical assistance". However, if the term doesn't mean implementing a backdoor, and unbreakable encryption is implemented, then it won't matter how much technical assistance U.S. companies render; nothing will have been achieved except wasting a lot of company dollars on a futile effort!

Laws don't matter so much because it's mostly a technical matter.

1. Surveillance tech advancement will guarantee that in the future there will be even more of it, cheaper and more accessible through search and machine learning analysis; even if you don't use the internet, your face is still recorded on surveillance cameras, your financial transactions reported, and so on. Even if your state does not do internal surveillance, various companies and foreign states will.

2. The nature of social networks makes it very, very expensive to keep communication private, and in some cases impossible. Even if you use Tor, you still can't post on your regular FB page, or connect to any of your online accounts, without disclosing your identity.

3. They still can't stop people from communicating in secret, especially with a combination of steganography and cryptography, but that will become increasingly difficult.

4. People will increasingly self-censor, with devastating effects on the balance between the political class and the rest of society.

So it's a defiant march of technology; nothing can stop it at this point. Privacy is dead. It died around the year 2000 with the advent of digital cameras, mobile phones, and huge data centers.

I think part of the solution will require that we create new social standards around privacy and tolerance for the various indiscretions we will inevitably commit. Otherwise, it will be worse than the Inquisition: everyone watching you, all your actions judged by the mob or the powerful, blackmail material on everyone.

> If you receive a sealed letter and the police get a warrant, they can search your drawer and open that letter. If you have a private conversation with someone, a prosecutor can subpoena that person and force them to testify about what you told them.

If you are a bank with a lock box, are you required to have a key to a safe that a customer installs inside the lock box to store their valuables? Of course, a court can order you to give what you have but how can they order you to give them something you don't have?

The bill only imposes obligations to the extent the service provider offers the feature that makes information unintelligible. So it would not apply to encryption the customer applies themselves that isn't provided by the service.

And yes, courts regularly order banks to drill into their safe deposit boxes.

> ...to the extent the service provider offers the feature that makes information unintelligible...

Hardware manufacturers as well. Say goodbye to BIOS passwords, hard drive smart locks, and SSD encryption that permits instant formats and is the only way to ensure that bad blocks don't leak your data to eBay buyers.

So this is pretty much a ban on encryption. Would gpg4win be required to keep a copy of the private key of everyone who uses their software to generate keys? Have they thought this through?

How do you feel about encryption? Do you think it is a good and worthwhile thing for the modern world?

If multi-billion dollar companies were mandated to spend their development efforts on ineffective, insecure cryptographic systems, do you think that would be a positive or negative thing in terms of the availability and proliferation of the kind of effective and secure cryptographic systems that 'customers use themselves'?

If you receive a sealed letter and the police get a warrant, they can search your drawer and open that letter.

Also, if you have a letter, you can dispose of it as you see fit, and the information therein vanishes into the aether. This agency is removed when we move into the realm of bits over digital networks, where information can be copied and retained without one's authorization.

The agency to dispose of the letter as you see fit, at a time of your choosing, is in effect restored through the use of strong cryptography.

If you have a private conversation with someone, a prosecutor can subpoena that person and force them to testify about what you told them.

Having an unauthorized recording of an event, and asking someone to recount an event are very different things.

attempting to maintain powers

It's not about the 'police maintaining powers they already have' because the situations are different. If the situations were the same, the powers would be the same, and this conversation wouldn't exist.

The situations may seem similar to you, because we make sense of the world through analogy, but there are important differences.

The essentially instant and infinite nature of bits and digital data allows for all those 'warrants' and 'subpoenas' to be 'served' before they have even seen a judicial system. I.e. governments can suck up all the 'locked' data into giant 'pre-crime' databases. Or, require that communication networks retain data, when it's not already happening as a matter of course.

If governments are then given the key to that locked data they essentially have a time machine to attempt to dredge up any evidence, or whatever they want from people's personal effects.

There's nothing similar about that to anything. Nothing like that has ever existed in the history of the world.

So we have a new situation, and with it a new choice to make. This choice requires we ask ourselves, Do we want to lean towards tyranny or lean towards liberty?

Terms like 'progress' are relative and subjective, but still useful when there is mutual understanding. What is being progressed towards? A freer society with less potential for government overreach and tyrannical actions.

So in that case, yes this bill is regressive.

And, that's just one aspect of what makes this bill objectionable, and frankly, retarded.

Anyone with an inkling of how the modern economy works and its reliance on cryptography knows that this bill is ridiculous.

The simple matter is that you can't have modern strong cryptography with 'backdoors,' or whatever doors. It doesn't exist, it never will, the two are mutually exclusive. And further, the cryptography genie is not going back in the bottle, that is just as inconceivable. So it would be best for everyone if governments educated themselves on these matters and stopped wasting everyone's time and money on this.

I think you're conflating the very different situations of the government requiring companies to keep around communications data and the government requiring companies to be able to decrypt data the government has obtained by other means. I find the bill objectionable, don't get me wrong, but I don't read anything in there that precludes a service provider that doesn't store a history of users' communications.

>government requiring companies to keep around communications data

You seem strangely narrowly focused on that. Who really cares about that? That is a point of minor interest.

The big issue at hand is this bill is attempting to coin some Orwellian doublespeak in mandating that there exist some secure cryptographic communication channels, with 'backdoors.'

That's a logical impossibility. That can't exist, it is either one or the other.

>but I don't read anything in there that precludes a service provider that doesn't store a history of users' communications.

Precludes from what?

But once again the data storage thing is a red herring, who cares, governments are not lacking in collected data. The fact that some of that data is now the form of secret codes bothers some in the government, but rendering the secret code making tools ineffective is not going to make the world a better place.

Also, I should add that I noticed how after I took the points you raised, rendered them moot, and indeed turned them into points of support for my continued coherent argument, you countered with this semi-parseable red herring deflection.

Quite clearly, you have lawyerly skills, but that was a dirty trick.

It would be nice if you attempted to address the issue head on. Or, concede.

I think we should be appreciative that there are people willing to debate the other side of this issue, or at least to try to lend some clarity to that other side, rather than being petulant when those people don't agree with us.

As important as these discussions tend to seem to us at the time, they are in fact awfully boring HN threads, because for the most part they're just nerds competing to see who can agree harder.

Feinstein and Burr have forgotten (or maybe never knew) that their role on the Intelligence Committee is to provide oversight of the NSA and CIA, not to act as their PR department. It's as if they think their role is to do the intelligence agencies' bidding, not the other way around. That's what's insane, and it has already gone too far.

There is no intelligence oversight in the U.S. The intelligence agencies seem to be running the Senate, or at least the Senate Intelligence Committee.

When I wrote to Feinstein in the past, I received a form letter telling me that her mind was already made up and that she knew what was in my best interest. She didn't even have the courtesy to lie and say that she would consider my opinion. I'm in her district, and I lost a lot of hope at that moment.

Any chance you'd be willing to post the form reply?

In fairness, there are 40 million people in her 'district', so your chances of an individualized reply and/or influencing her opinion on any controversial matter would have to be quite low.

The point of my comment was not that I received a form letter but that its tone was condescending.

She's a senator. She has a state, not a district.

There is a long history of tampering with the mail. It wasn't merely the preserve of the Stasi et al. It's one of the reasons most countries sought to retain government monopoly until comparatively recently.


> and all private conversations should be recorded.

When they realize people can talk to each other in private, they will surely outlaw private space.

Something is broken here. Feinstein is so off base and so unaware of the implications of her proposal that she's put forward a proposal that basically breaks the entire internet. (Over time, all data security would degrade.) Especially considering that she has a history of doing this with legislation involving technical matters. Dunning-Kruger with a literally global downside. (Knee-jerk, emotionally motivated legislation that's horribly broken.)

I'm convinced Feinstein's office doesn't understand how bad this proposal is for American businesses. It would force business communications vendors to maintain a way to decrypt customers' trade secrets. Those vendors would become incredibly high-value targets overnight.

She just passed a bill to legally protect American companies' trade secrets (https://twitter.com/SenFeinstein/status/717387073804767233), so it's not like she doesn't care about intellectual property. I think if she understood the implications, she wouldn't back this.

Give her a call or write a letter, especially if you're in her constituency: http://www.feinstein.senate.gov/public/index.cfm/state-offic...

Those vendors would become incredibly high-value targets overnight.

Almost literally overnight. I suspect that someone would be trying to break into those vendors' networks in anticipation of the law's implementation.

The other bill might be related to this one.

Lawmakers have a tendency to expect that when they pass a bill to protect something, it will in fact be protected. Since they are at the top of the "security by policy" game, they don't really think about the value of "security by design".

So she passes a bill to protect trade secrets just before she proposes another bill that might put trade secrets in jeopardy. "Don't worry, they're still protected by this other bill!"

Since they are at the top of the "security by policy" game, they don't really think about the value of "security by design".

So she passes a bill to protect trade secrets just before she proposes another bill that might put trade secrets in jeopardy. "Don't worry, they're still protected by this other bill!"

Which would just mean open season for governments with state-sponsored hacking for industrial espionage. This includes China and the US.


What's even worse is that she might be aware of what she is doing.

>Blum's wife, Senator Dianne Feinstein, has received scrutiny due to her husband's government contracts and extensive business dealings with China and her past votes on trade issues with the country. Blum has denied any wrongdoing, however.[4] Critics have argued that business contracts with the US government awarded to a company (Perini) controlled by Blum may raise a potential conflict-of-interest issue with the voting and policy activities of his wife.[5] URS Corp, which Blum had a substantial stake in, bought EG&G, a leading provider of technical services and management to the U.S. military, from The Carlyle Group in 2002; EG&G subsequently won a $600m defense contract.[1]


Why isn't there a mandatory recusal protocol for government officials?

I'm no fan of Feinstein, but that sort of thing is the most brainless sort of smear attempt. There's no evidence that Feinstein had anything to do with the awarding of that contract.

It wasn't really meant as a smear, I can see that interpretation though.

I agree it would be quite sad for a woman of such advanced age to sell out her country for a few dollars. Any age of course, but at that age it would be extra curious.

As I said, it wasn't a smear. Just a presentation of the facts and asking a question.

There are plenty of people in Congress and yet we have the member whose husband has an interest in a government contractor that provides technical services, drafting legislation that involves the mandating of technical services.

If members of congress don't have the dignity and integrity to recuse themselves in such situations where there are obviously avenues of inquiry even if everything is above board, but really how could it be, people's biases are subtle...

So the question is, if members of congress, and other governmental bodies can't recognize these potential biases in themselves and do the right thing and recuse themselves from conflicting matters, why aren't systems in place to make sure they recuse themselves?

Yeah, it's enough to get people to be aware of the cluelessness at face value. I'm not sure I even want to contemplate the level of cynical deviousness implied by the harm being intentional. Or, as is often said, never attribute to malice what can be explained by cluelessness.

This law is toxic to every part of California that works with technology. Can we primary her?

This should have happened years ago. Maybe this time Silicon Valley will be pissed off enough to marshal the resources necessary to kick her out of her seat. Her views on encryption are dangerous enough, but when they're held by someone with her seniority and committee power, it's much worse.

But she's incredibly popular in California. Beating her is an uphill battle. And most people aren't willing to make encryption a primary issue in how they vote.

>This law is toxic to every part of California that works with technology.

The scope is really much broader than that.

Does your non-"tech company" company use an online payroll system? That would be negatively affected.

Do you buy food at a grocery store? Guess what, most of them transmit encrypted data on the internet!

etc. etc.

The worst part of all is that Feinstein supposedly represents California. She's creating a bill which directly and unequivocally attacks business interests in her home state.

If tech companies want to be serious about security, they should be putting substantial money into getting this pariah kicked out of office.

>and so unaware of the implications of her proposal

I suggest you check your premises.

It fits the pattern of her misguided anti-gun laws. I think the implications don't matter to her, so long as the legislation resonates emotionally with uninformed voters.

(EDIT: To clarify: from a rational perspective, some of her sponsored anti-gun laws primarily have the effect of alienating and galvanizing the opposition with nonsensical and ineffective laws. From a rational anti-gun perspective, they have done more harm than good.)

You can look at the legislature like a biological ecosystem subject to evolution. The natural selection force that acts on it is election ("natural election"?). Legislators who do not get elected do not survive in the ecosystem.

That means the direct evolutionary pressure is for legislators who focus on re-election above all else. Any actors who have different priorities are quickly outcompeted and kicked out of the gene pool.

You could look at insulation like it's made of whipped cream, but that doesn't make it a good pie topping.

In other words, government officials are not in fact bound by the rules of natural selection whatsoever, and they can live perfectly comfortable, happy lives without getting reelected.

What's evolving due to natural selection in the GP comment is not Dianne Feinstein nor any other member of Homo sapiens. What's evolving are patterns of behavior of legislators and elected officials: Dawkins-style memes as received wisdom on how to be a Congressperson or Senator.


> For a judge to be able to order your computer unlocked, means people at Microsoft, at Lenovo (or whoever you use), and at your ISP all have the keys to unlock your data even if you used an “encryption” feature. You have to trust every company, everywhere to never have a security incident involving the keys.

I've only read the thing once and so may have overlooked it, but I didn't see anything that implies that companies will have to keep keys. For encrypted data, they have to provide technical assistance in decrypting the data, but I didn't see anything that would make assisting with a brute force attack not qualify as technical assistance.

There was something in there about having to respond expeditiously (or something like that), but I think one could make a good argument that brute force is responding expeditiously as long as there is not a known faster way for that particular device.
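The brute-force reading only makes sense when the keyspace is tiny, which a stdlib sketch makes obvious (the PIN, message, and checksum scheme are made up for illustration): a 4-digit PIN falls instantly to exhaustive search, while a random 256-bit key never would.

```python
import hashlib

def derive_key(pin: str) -> bytes:
    return hashlib.sha256(pin.encode()).digest()

def seal(pin: str, msg: bytes) -> tuple[bytes, bytes]:
    """Toy encryption: XOR under a PIN-derived key, plus a tag to verify guesses."""
    key = derive_key(pin)
    ct = bytes(m ^ k for m, k in zip(msg, key))
    tag = hashlib.sha256(key + msg).digest()[:8]
    return ct, tag

ct, tag = seal("7321", b"meet at the dock")

# "Technical assistance" via brute force: a 4-digit PIN has only
# 10,000 candidates, so exhaustive search finishes immediately.
found = None
for guess in (f"{i:04d}" for i in range(10_000)):
    key = derive_key(guess)
    msg = bytes(c ^ k for c, k in zip(ct, key))
    if hashlib.sha256(key + msg).digest()[:8] == tag:
        found = (guess, msg)
        break
assert found == ("7321", b"meet at the dock")
# A random 256-bit key has ~1.2e77 candidates; no amount of "expeditious"
# assistance makes that search finish, which is where the argument breaks.
```

So whether brute force counts as "responding expeditiously" depends entirely on how the key was derived, which the bill never addresses.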

> I didn't see anything that implies that companies will have to keep keys.

That is the intent and the logical consequence; obviously they can't say "backdoor". If plain text goes into your device, software, or service, and is stored, then you'd be required to provide the plain text on demand. Your product isn't a SHA-2 algorithm that just happens to have a massive cloud backup service behind it; your product is a secure remote backup service. The folks who think this is a good idea are going to be the same ones nailing your ass to the wall for breaking their law. They won't entertain the excuse; they'll say that your choice to implement a non-reversible feature demonstrates premeditation, a cyber booby-trap.

Did you miss the part where they have to be able to decrypt conversations in real-time?

The militaries and security agencies of other countries are not going to like this. I could see this being a boon to the Russian or Chinese semiconductor industries. They could market some sort of secure USB device that would contain a password input keyboard and do all the encryption for the host computer. They don't have to have the latest fab technology, just something that's not crackable by western agencies. They have fabs for 90nm right now in Russia, which is about 12 years behind US production, so it's not unthinkable.

any U.S company

Your response begs the solution.

Presumably, FOSS is not considered a "covered entity" under this bill. Arch or Gentoo Linux, a private mail server, an XMPP server with OTR, etc., is a simple circumvention.

A covered entity includes "any person who provides ... a method to facilitate a communication or the processing or storage of data." FOSS encryption libraries could be considered a method of processing data.

If signed into law, this could compel maintainers to design libraries with gov't decryption in mind.

Given that courts have ruled open-source crypto code to be constitutionally protected speech, it might be difficult to make that stick.


The case in question (Bernstein v. United States) is fascinating: https://epic.org/crypto/export_controls/bernstein_decision_9...

Would protected speech mean you could use non-compliant code to provide services, or just that you could publish it?

I don't know, but the bill doesn't appear to cover end users. Cloud-based services would probably have to deal with it, but not people running software on their own computers.

How could this be enforced, assuming the library developers are scattered throughout the world?

Worst case scenario, all US developers drop off the project, or contribute anonymously.

Just decryption. It isn't government-only, just government-mandated; ANYONE can use it.

Agreed. Those familiar with encryption understand that a single security incident resulting in the theft of such a backdoor key is all it would take. The attack would cascade to any company that implemented the library.
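A toy sketch of the single point of failure (everything here is invented for illustration; the "cipher" is a throwaway SHA-256 XOR stream, not anything a real product would or should use). Each message gets a fresh session key, but a copy of every session key is wrapped under one escrow key, so whoever steals that one key recovers everything:

```python
import hashlib
from os import urandom

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher (SHA-256 in counter mode), for illustration
    # only -- NOT a real cipher.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

ESCROW_KEY = urandom(32)  # the single "golden key" the bill implies

def encrypt_message(plaintext: bytes):
    session_key = urandom(32)                          # fresh key per message
    ciphertext = keystream_xor(session_key, plaintext)
    wrapped = keystream_xor(ESCROW_KEY, session_key)   # escrowed copy
    return ciphertext, wrapped

def escrow_decrypt(ciphertext: bytes, wrapped: bytes, escrow_key: bytes):
    # Anyone holding the escrow key can unwrap any session key,
    # and therefore read any message ever sent.
    session_key = keystream_xor(escrow_key, wrapped)
    return keystream_xor(session_key, ciphertext)

ct, wk = encrypt_message(b"meet at noon")
print(escrow_decrypt(ct, wk, ESCROW_KEY))  # b'meet at noon'
```

The per-message keys buy you nothing once the escrow key leaks, which is exactly the cascade described above.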

It would seem that no US person could package or distribute such open source crypto without becoming a covered entity.

Red Hat would be.

Consider the amount of funding RH and Sun Microsystems have put into the Linux desktop. Imagine if they were now required to provide technical assistance to people attempting to surveil through the tools they've invented.

"A covered entity that receives a court order for information or data shall provide such information or data in an intelligible format; or provide such technical assistance as is necessary to obtain such information or data in an intelligible format or achieve the purpose of the court order."

Okay, so I get that the main section of the bill requires companies which provide encryption to decrypt data upon receiving a court order demanding access to it. If the company is capable of decrypting its customers' data, that's a scary thought.

But it's followed by two very conflicting subsections: (b) nothing in this act may be construed to require or prohibit any specific design or operating system to be adopted by any covered entity, but (c) a provider of remote computing services to the public shall ensure such products are capable of complying with a court order to decrypt.

It seems like this could be read as "you must build backdoors", but also as "we're not telling you how to architect your product." These two sections are completely at odds.

By the way, Scribd sucks. Have a downloadable PDF: https://josephhall.org/f0eabaa89b8ee38577bf7d0fd50ddf0d58ecd... (it's still just non-OCR'd images though :<)

Roughly paraphrased: "We don't care what kind of lock you build, so long as you build one that you can always unlock for us. We're not mandating any particular design, but you have to unlock it on demand."

That's not really a lock, then, is it?

To be fair, it's exactly a lock, well understood and easy to circumvent (like the keys to your house).

And just like the keys to my house, it requires an actual live physical human being to transport itself through meatspace at a non-relativistic speed in order to perform said circumvention, right? Oh, it doesn't? Hmmm.

This is why I'm hoping we can ultimately just let go of all these analogies for encryption. Analogies are training wheels, and if we leave them on for too long, we risk propagating the idea that all bikes have four wheels.

All models are wrong. Some models are useful.

Analogies are models.

Analogies are pretty much precisely how we convey novel ideas. The problem is finding one which doesn't melt when pushed too hard.

I keep having to explain to my father that deleting data from storage doesn't speed up computers. Nor does leaking it. His response when I mentioned that the Panama Papers leaks were 2.6 terabytes of data was "well, that must have made their systems a lot faster" (because they were no longer bogged down by having 2.6 TB of data on them). That's wrong at least two ways: data-on-disk doesn't slow down computers, and leaking data (making a copy) doesn't remove it from the source.

And this is a reasonably intelligent human being. Who's used computers for many decades.

What do you mean "deleting data from storage doesn't speed up computers?" Deleting old files certainly makes file searches faster.

The context was of speeding other computer operations by removing data.

With a sufficiently robust indexing system, file search time is independent of corpus size. E.g., Google isn't scanning the Internet each time you submit a search query. Time to build an index increases, but not searches against it.
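A minimal sketch of the idea (my own toy example, nothing from any real search engine): an inverted index maps each word to the set of files containing it, so indexing cost grows with the corpus but each lookup is a single dictionary access.

```python
from collections import defaultdict

class InvertedIndex:
    """Toy inverted index: word -> set of file names containing it."""

    def __init__(self):
        self.postings = defaultdict(set)

    def add(self, filename, text):
        # Building the index costs time proportional to corpus size...
        for word in text.lower().split():
            self.postings[word].add(filename)

    def search(self, word):
        # ...but each lookup is one O(1) hash-table access, regardless
        # of how many documents were indexed.
        return self.postings.get(word.lower(), set())

idx = InvertedIndex()
idx.add("notes.txt", "encryption keeps data private")
idx.add("todo.txt", "buy milk and eggs")
print(idx.search("encryption"))  # {'notes.txt'}
```

Real search engines layer much more on top (ranking, phrase queries, compression), but the lookup-cost argument is the same.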

You are technically correct, but the comment is irrelevant to the discussion.

It's the crypto equivalent of the TSA-approved locks.

And just like TSA locks, the bittings will leak, rendering the lock useless to all but the laziest adversaries.

What do you mean 'will leak'? You can find pictures of the master keys online today.

I think the TSA locks are a great example of why backdoors are completely idiotic.

That was exactly my point. The TSA lock bittings are available, and the backdoor keys will be available shortly after they're instituted.

Think about the threat model for a lock on a suitcase. The adversary must have your bag in their possession. It's the perfect example of something that's only to keep honest people honest. Also, they make sure that your bag stays closed.

Does anyone actually use a suitcase that, even when locked with a padlock, can't be opened with a pen?

Unless the lock is secured to something other than the zipper, you can resecure it invisibly too.

It's a TSA-approved lock for encryption instead of luggage.

And just like with TSA locks, there will be leaks and compromised systems that only hurt the general public.

It worked for the TSA.

And as a result, the TSA destroyed a hot sauce boxed set I had in my luggage.

I learned my lesson; now I just buy new clothes/toiletries/etc. at my destination rather than take anything at all with me.

Another option is to start flying with something that qualifies as a 'firearm' in your luggage. Such requires actual proper locks that only you can open. There was a talk on this at DEFCON 17 https://www.youtube.com/watch?v=KfqtYfaILHw

Obviously, this only works in the US. Other countries will have different laws on the ownership and transport of firearms.

Only in the USA does taking a gun with you make the airport easier.

Pro tip: pack the gun at home.

The other complications surrounding that would make this option more trouble than it's worth for me. I'll stick with the no-luggage route!

It seems consistent with statements by Clinton and other government officials, "Maybe the back door isn't the right door, and I understand what Apple and others are saying about that. I just think there's got to be a way, and I would hope that our tech companies would work with government to figure that out." They create the requirement and tech companies have to find the least bad way to comply.

I wonder what that means for software like Google Chrome. Chrome would need to send the parameters of every SSL session to Google on the chance that the FBI requires Google to decrypt it.

Think what it does for all the credit card companies. The TLS stream also contains those nice back-of-the-card verification numbers that card processors aren't allowed to store.

Subsection (c) says it must be possible to comply with subsection (a). That subsection requires companies either to decrypt or to help decrypt. Technically, you don't have to successfully decrypt so long as you can help try. That's almost certainly not what they meant to write, but that's what the draft says. It's a badly drafted bill for more than technical reasons.

> Technically, you don't have to successfully decrypt so long as you can help try.

Which raises the question of whether a vendor could build a system they cannot themselves break into, and then "help" by providing the documentation showing how they've built the system (which they should post publicly anyway).

Still a bad bill for numerous reasons, of course.

> a provider of remote computing services to the public shall ensure such products are capable of complying with a court order to decrypt.

I'm not well-versed in these things at all, but I wonder what implications this would have on VM providers like Amazon (EC2) or Google (GCE)? Does this mean they somehow have to magically ensure that you didn't install sshd on your Linux VM? Surely that can't be the intention (?), but the broad language is scary.

And here's a Tesseract+corrected transcript: https://lin.noblejury.com/burr.txt

You slightly misquoted (b) - it says that nothing in this act may be construed to authorize any government officer to require or prohibit any specific design or operating system. I think it's meant to avoid a situation where regulators would have lists of specific software that is required or prohibited, as opposed to the government suing companies on a case by case basis if their software didn't comply with (c).

It appears to be based on a clause from CALEA, which refers to "any law enforcement agency or officer" - see (b) (1) in:


I almost want this to happen, simply because it only applies to "remote computing services".

It would provide a major incentive to move anything security-relevant back to the endpoints where it should be in the first place.

This bill covers anyone who does things with data or makes something that touches data.

If you make paper shredders, you are covered under this bill.

If you handle data, you need to make sure things like your fire suppression system doesn't damage it.

The bill does not care how the data is rendered unreadable to law enforcement. It only cares that you provide them with a readable copy if it was rendered unreadable by anything even remotely in your control.

"We don't have the first notion how cryptography works," said the politician-senators in a joint statement, "but we're hoping the voters will see past the blatant trampling of civil liberties to think we're doing something to be tough on the trrrrists."

> "We don't have the first notion how cryptography works," said the politician-senators in a joint statement

This is dangerously naive. Technically-minded people often make the serious mistake of assuming politics follows any kind of logic. It doesn't; politics is often self-contradictory.

These people know how crypto works, at least at a high level. Many of these players knew 20 years ago in the first crypto war. They just don't care; rules that are hard to implement or require interpretation can be selectively enforced.

This is a power play against technology companies. If they want to survive, tech companies need to start banding together and fighting this in the court of public opinion. As half of those companies are involved in advertising, this shouldn't be hard.

The near future will reveal who is willing to fight for their own future and who is a collaborator.

> These people know how crypto works, at least at a high level.

I respectfully disagree.

I believe they think they know how crypto works, and that they may even have excellent advisors who know exactly how crypto works. But I believe that they cling to the notion that the underlying math can somehow be compromised with a back door (that only the government knows), and yet still remain secure.

I absolutely understand that this is a power-play, but the senators would frame it as technology companies being "uncompromising," implying that there is some sort of gray area where the math is concerned. This suggests either they are ignorant, or they're lying to gain position, or both.

Given that the FBI is claiming they can be trusted with secrets (escrow keys, privileged source, etc.) while apparently having their private networks play host to a variety of hackers, unnoticed for years--would you rather believe your leadership is lying or ignorant?

What they do know is that this would break tech companies. Tech companies with a lot of cash, but comparatively very little spending on political fundraising. Nudge nudge.

Many politicians don't know how to pump their own gas let alone know how encryption works.

I know most of the tech community likes to joke about this but I think there's something serious to consider:

How can we educate the wider public that breaking crypto makes everything less secure. Most of the arguments I've seen in the wider context seem to split down tech/non-tech lines with this being the key point that's not understood.

I think one easy way would be explaining the NSA's other job—ensuring COMSEC for the government, military, and "essential" industrial players. Requiring "breakable" crypto everywhere would mean 'the terrorists win' in a very literal sense—the same sense in which the breaking of Enigma was a key part of Germany's loss in WWII.

But, of course, the government won't have to use breakable crypto; only the citizenry will be beholden to that requirement. So instead, it will merely be those 'essential' industrial players, now constantly finding their foreign rivals to have the upper hand in trade negotiations and to seemingly predict their moves.

I think that's the key here: broken crypto, on a large scale, doesn't just mean a panopticon; it also means whoever's using it will inevitably suffer large-scale economic stagnation. Standardizing on breakable crypto is equivalent to making laws with large loopholes to allow industrial espionage into the country.

If you want to scare people into defending crypto, point out that it's the shield on the arm of the negotiators who sign things like the TPP agreement. Small business owners—even if they themselves have no trade secrets to protect—will be directly affected by whether crypto can protect the secrets of their industry from their foreign equivalents. (Your local {sports clothing, christmas ornament, office furniture} industry recently take a nosedive? Maybe someone was cornered into a bad deal because all your local supply-chain logistics were plain on the table to China, but theirs were a mystery to us.) And small business owners are a pretty good voting bloc.

Your argument is excellent and in every way correct.

It's also too subtle and abstract to ever compete with "Apple is helping Terrorists hide stuff!"

Did "Apple is helping terrorists!" really got traction in the US?

From far away, it didn't look like it did.

It got plenty of traction among key demographics. It's a good populist talking point and Trump brought it up a number of times. The kernel of truth in the messages is: whatever the right policy happens to be, it's not Apple's call to make. The elected government should decide the balance to strike between protecting communications and keeping us safe from terrorists.

Well, for one, reaching the wider public is hard. Unless, that is, Google, Apple, and others take our side. Which is plausible.

Then, after reaching the public, there must be a compelling, relatable story. Apple did this a bit when replying to the FBI's request:

-- Weak encryption means more identity theft. This is a good one. Everyone knows someone who lost a phone, a wallet and then thieves emptied bank accounts, caused grief, time wasted and pain for years to come. This works and is great.

Others I can think of:

-- Point to large scale data breaches, even Government's own OPM office was hacked by Chinese. Having back-doors in everything means more such large scale failures.

-- Take their own rhetoric and spin it around. Look Snowden stole all those documents right under the nose of the NSA. Point to their own incompetence and say imagine future such incidents with a perpetrator walking away with "master keys". Remember to repeat phrases like "it is not a matter of if but a matter of when". That was used by talking head security experts on TV post 9/11, this one works great. I like it.

-- Point to physical analogies where a master key has been lost. Wasn't there a case with the NYFD or some place like that, where all these places were mandated to have a master key, and it was copied and used for criminal activity?

-- Find something with terrorists. Nobody will ever defend terrorists. Any case where Chinese terrorists hacked Google through a backdoor created to handle government's access to data.

-- Everyone loves freedom of speech and hates when worthy dissidents in countries we don't like are martyred. Point out how strong encryption has saved lives and provided the only channel for communication freedom fighters.

The list goes on... but that's the gist basically

> Point to physical analogies where a master key has been lost. Wasn't there a case with the NYFD or some place like that, where all these places were mandated to have a master key, and it was copied and used for criminal activity?

I would point out that these "keys" are actually computer files that act like digital keys (obviously), and the bad guy will instantly make 100,000 plus copies of the key and distribute it to his network of bad guys (bot net) and begin breaking into hundreds of thousands of houses every hour.

Also, what happened to all the talk about cybersecurity? It seems like the TV media was interested in cybersecurity for awhile, but I don't hear about it anymore.

Conservative "think tanks" are super good at reframing concepts in a favorable way, referring to estate taxes as "death taxes" for example. Maybe we need a crypto "think tank" that can help, but it doesn't seem like there are many politicians who support cryptography.

The civil liberties crowd scored a good point in calling the UK's version the "snooper's charter."

Play to different fears, e.g.: "Anything that makes it easier for the government to track criminals also makes it easier for stalkers to track their victims."

And for criminals to target your bank account.

I've thought about this quite a bit, and I think the tech community is going about this wrong. The public doesn't like being told something can't be done, and we can't explain why it can't, because the explanation is too complicated.

Even though we know it's impossible, we should disregard that and lay out common-sense tenets that such a system would require, even if it can't feasibly be built. We could then base our arguments on those tenets, and those are public fights I think could be won, because they involve things non-technical people can understand.

For instance, one tenet could be that any key escrow system must be open source. We can't base it on keeping the code secret, because then if the code is ever stolen or leaked, the whole system is compromised.

If you can win those arguments, and it just so happens that such a system can't be built due to the laws of mathematics, you then fall back to arguing over which tenets to break, and ideally breaking any of them would be unpalatable.

I think you are grossly overestimating the general public's ability to consume and understand a technically complicated argument no matter how well-reasoned.


During a discussion about privacy, the next time someone says "they have nothing to hide" then ask them for their social security number and mother's maiden name.

This quip misses the underlying reasoning, though. I have nothing to hide from entities I trust, like my bank or the government. Heck, my bank and the government already know my SSN. You, on the other hand, I don't trust.

What's to stop someone you don't trust getting into government or banking after those institutions gain extensive access to your information? (Maybe the safeguards are stronger where you live than where I live, but are they strong enough and will they remain in place?)

Those institutions already have extensive access to my information. I trust my bank with my life savings; I trust the government with my retirement and old age medical care. I can certainly trust them with my text messages. Sure there are vulnerabilities, but that's what insurance is for.

My last comment was sloppy and I agree with much of your reply. However, the governments that serve their populations well do so in virtue of their structures and office-holders, both of which change over time.

My concern is that any increase in already substantial state powers poses a twofold threat. Firstly, it makes future increases more difficult to oppose. Secondly, it risks falling into the hands of irresponsible future office holders.

I suggest that you wouldn't be able to trust your government with your text messages if you fell into one of the following three categories: freedom fighter, wrongly suspected of crime due to unchallengeable secret evidence and targeted for state-imposed restrictions without explanation, actual criminal (possibly under a homophobic, misogynistic or racist law).

I'm not sure about the banks, but maybe some of the above applies to them, too.

The New Zealand parliament has identified five "institutionalised checks on executive power". (They are a codified constitution, an elected head of state, a bicameral legislature, devolved powers and proportional representation.[0]) Only two exist in my country (the United Kingdom), neither of which is available to me, because one of our legislative chambers is unelected and (like the majority of the population) I don't live under the jurisdiction of a devolved legislature.

I don't understand your last sentence. By "insurance", do you primarily mean checks and balances? If so, which ones matter to you?

[0] See Table 2 on this page http://www.parliament.nz/en-nz/parl-support/research-papers/...

Yes, exactly. Rather, you should ask someone whether they close the curtains before they have sex with their spouse. Or, more to the point, ask them whether they should be required, when having any house built, to have a government-issued sex spycam installed in their ceiling. Because if bedrooms are safe and the government can't spy on them at will, terrorists will plot in bedrooms.

Usually those doofuses are referring to the govt though. The govt knows your social security number already.

I really like this. I hadn't thought of doing that - will try next time it comes up!

I'll take a stab: it's like sending mail in transparent envelopes.

I think that might be too subtle. I'd go with "It's like mailing a check in a transparent envelope with the amount and pay-to fields erasable and written in with a dry erase marker."

Even more so--"It's like forcing everything sent by mail to be sent in a transparent envelope and then publishing in the news every route that mailmen take and the easiest points where to intercept them."

Laws dictating how to (or how not to) encrypt things make the news really quickly, so it even takes all the fun out of security research.

More like: The government wants a master key that will unlock all the doors on your home, but they super promise to be real careful not to let anyone else use it.

But that's just it, and this is why techies often sound unreasonable: there absolutely is a well-defined system whereby the government can get into your home. And they do promise to be careful with that power. And they also promise to help protect you against other people using that power. To most people, this whole debate sounds like we're saying everyone should have easy access to impossible-to-open vaults of arbitrary size, which law enforcement can't access under any circumstances. In the physical world, hardly anyone (including me) would be in favor of such a thing. It's not hard to imagine abuses of these vaults that we wouldn't want to tolerate. If it's different in the digital world (and I tend to believe it is), the burden is on us to explain why.

Imagine the police response time to a black neighborhood where gun violence has just been reported. Now imagine the police don't have guns, cars, or any knowledge of the law because there is no black and white law. Now imagine you live in that black neighborhood. That black neighborhood is called the internet. And the government is trying to outlaw locking your door.

Another: it's like requiring safe manufacturers to have a master combination to unlock any safe.

(or, closer to what the article states: it's like requiring safe manufacturers to open any safe the government gives them, or at least require the manufacturer to provide technical assistance in breaking into their own safe)

Many people I've talked to think that's perfectly reasonable.

Phil Zimmermann used postcards as the analogy in his Why I Wrote PGP essay in 1991:


Teach them not to trust the state.

Doesn't EC-DRBG show us that you can create secure backdoors? Meaning ones open only to their creators?

They know how it works. They want the kinds that work well to be made illegal for civilian use.

AES was a Belgian design, SHA-3 a Belgian and Italian design. Imagine their surprise when the next generation of algorithms has import restrictions against the US, to protect the integrity of the algorithm.

The funniest part is that Feinstein is a California senator. Know who San Francisco's congressional rep is? Pelosi, another champion of freedom... Voting doesn't work; it's a waste of time. Our corrupt, illegitimate government will continue to violate every ethic and change every law to increase its power. Anyone who defends such monsters is no better than a monster themselves. Again, the US government kills with impunity, tortures with impunity, commits war crimes like the recent hospital bombing, and has the highest incarceration rate in the world... The list goes on and on, and yet people plug their ears and bury their heads in the sand. We live in a police state; just accept the truth.


The community can down vote but can't respond to the truth.

Problem is that she's a Democrat and a liberal on social issues. Is SF going to vote for a right wing social conservative candidate just because Feinstein is bad on crypto?

Same problem exists in "red" states, where conservative politicians can get away with all kinds of corruption and madness because who's going to vote for a baby eating liberal? Basically if your state is very Democrat or very Republican you have single party rule and your politicians are dictators.

The culture war must die if we ever want any meaningful progress on any other kind of issue.

The culture war is a non-issue because California allows two members of the same party to win the nomination and challenge each other in the general election.

There was 15% voter turnout for the 2012 primary in which Feinstein (49.5%) and Emken (12.7%) were nominated. In other words Emken got on the ballot to challenge Feinstein with only 1.9% of voter support.

To run a successful campaign to challenge incumbent Democrats with pro-liberty Democrats in the general, you possibly only need to get ~ 2% voter turnout for your candidates.

This is pretty feasible to do if California residents actually cared.


False dichotomy.

The point is the tech scene can't even get their representatives to represent them in their core issues. Belief in elections mattering is akin to belief in the tooth fairy or Santa Claus.

If Feinstein's the problem here, then it is irresponsible to throw up your hands and declare the failure of democracy. Instead, vote her out of office and replace her with someone who understands the implications of weakening encryption.

Your vote alone doesn't matter, you say? Do what you can to educate those around you on the issues. Make them think twice before re-electing someone with such a bad record.

Or maybe it's because the tech scene is full of libertarian-leaning individuals who think not-voting and not being involved in the full political process sends a message to politicians. Or think that politics is "stupid" and it would work so much better if everyone was just as rational as an engineer or programmer.

For the last 40 years, Americans have been bombarded with the idea that voting doesn't matter. That "both" sides are just the same. That all politicians are crooked and corrupt. These are all lies.

Our voting levels are abysmal, especially at the local levels where politicians such as Feinstein get their start. Low voting levels mean only a small and vocal minority of people actually have to be convinced to vote for a person, or at least not to vote for another person. Running for election continues to cost more and more money, which perpetually gives money an advantage.

Meanwhile, many places like Arizona continue to make voting as difficult as they legally can. Most other places continue to avoid making it easy to be an informed voter. My favorite approach to voting is still Oregon's: vote-by-mail, with a wonderfully produced voter pamphlet to help inform voters.

Elections can have huge impacts. It just requires people to ACTUALLY VOTE. Most of these politicians represent the people who do vote. Or at least strive to not work against things they value. Look at how powerful the Tea Party style Republicans are right now.

I won't say money isn't a problem in elections. It is. I won't say popular vote doesn't have its downsides. It does. Elections in America need to be reformed to allow more nuances than two party platforms and to help ensure everybody can take a part in the process.

It's perverse to look at elections and declare them a fantasy. They do matter, and they have changed so much since the Civil Rights Act reshaped the country. Conservatives figured out how to use a minority of very motivated voters to craft a political movement that has changed everything about this country. For good or ill will depend on your politics, but it would be the height of ignorance to pretend it didn't happen.

SHOUTING doesn't make it true.

Elections do not matter, not for anything important. The civil rights act happened because people were willing to go to jail over what was right, the same goes for the draft.

Gerrymandering has made most elections farcical, and money does the rest.

Running as an independent is a way to get exactly nothing passed as both parties, which really are just one two-headed Hydra, will shut out outsiders.

And this only covers elected officials, most laws and regulations are made by life long bureaucrats.

It's amazing, the myopia of people. This bill is but one of a thousand examples of our corrupt, illegitimate government. Again, the government does not care for you or your beliefs; it cares for power and money. It will pass secret trade deals, it will pass secret laws in secret courts, it will torture and imprison those that oppose it.

Does anyone have an honest response? America is a corrupt police state run by an unaccountable elite.

Amazing how so many people refuse to see the truth that is so obvious.

Even in San Francisco the tech folks are a small fraction of the population, even if they have an outsized proportion of the money. What would be "corrupt and illegitimate" would be if tech companies got their way because of their money, despite being in the minority in terms of political views.

They aren't really in the minority. There's a huge amount of overlap between tech folk and the majority in California. The main difference is encryption, and even there, that's largely a matter of education to overcome apathy and misunderstandings. Convince people that encryption backdoors aren't "reasonable," and that they're actively detrimental to their own safety and interests, and you can change that.

In any case, our entire system of government was built around the idea of protecting against a tyranny of the majority. For entirely valid reasons, a majority of the people might support a disastrous policy. Ideally, our system is able to protect against the implementation of those policies. Doing so doesn't make your efforts corrupt and illegitimate.

Money is always going to be involved in politics. There's no getting around it, and anyone who argues that you can "undo it" is deluding themselves. You can, however, push for policies that increase accountability and make sure that funding takes place in the light of day. But that's an entirely separate, complex issue.

I just wrote my senators. I encourage you to do the same.

Talk here is fine, but it's more useful to let folks on the hill know what's up.

I think the bill is over-the-top bad; they are obviously going for a compromise of some kind. By appearing to reach a middle ground they can paint the pro-crypto camps as being uncooperative.

Of course, there's no middle ground with math.

If the commercial market has to build insecure devices, these same senators, the FBI, CIA, and NSA will all eventually have to move their own work to these things, or stick with 10- or 20-year-old devices.

The US government may be able to spend billions on specialized devices but eventually they are going to be left behind. Sounds like Assange's goal behind Wikileaks. Eventually, they can keep no secrets.

I think the CIA-NSA is smart enough to figure this out. It's not going to fly.

This! If the HN community won't take a few moments to write their representatives, then we have no excuse and can't complain. Writing mine now...

Maybe the right answer isn't fighting. Maybe the right answer is just getting a few friends together, creating a town, appoint someone judge, someone else prosecutor and someone else sheriff.

There was a report of someone speeding. Use the awesome government powers to subpoena cell phone connectivity, credit card transactions, or whatever to identify the speeder.

Police hold auctions for seized stuff; presumably you could just auction that data off to recover the costs (and then some).

Then repeat, over and over.

If you're feeling really enthusiastic, start targeting aws's DOD stuff. [1]

[1] https://aws.amazon.com/compliance/dod/

Politicians aren't idiots. Neither are they Machiavellian schemers. They are just in a situation where supporting bills like this is all upside and no downside.

If this passes and five years later computer security is swiss cheese, politicians know that most people will blame the computer companies and vendors not the government.

If this fails due to tech company opposition, they will get to avoid any criticism for not preventing the next terrorist attack. Instead they will simply find one encrypted iPhone and blame Apple.

Crypto == Math. Full stop.

All thoughts on civil liberties aside, it's sad that these officials don't even realize that what they're asking for is (or will hopefully shortly be) impossible for companies to do.

Then the real fighting will start. I'm sure Congress sees this proposal as a middle-ground between DOJ desires and freedom/privacy, but once it dawns on them that things like "no knowledge" encryption are really what it says on the tin, they will feel backed into a corner, with the only visible choices being something asinine like, criminalize encryption or ... give up?

Maybe that's something for this community to start thinking about, if we want to have a constructive say in this whole mess. Come up with a way to frame the "Give up" option in a better light, so Congress can point to it and tell the DOJ "look we did X for you", while still not outlawing encryption.

Classic door in the face technique.[1] Ask the impossible to get concessions you would otherwise be refused.


I disagree with the conclusion that the DOJ needs to be offered anything. It's like giving a rude, angry, loud customer a discount to appease them. You're just training them to be more rude, angry, and loud in the future.

They should lose as much face as possible. That's the only way to shut them up. If they look completely incompetent, they risk losing funding, because nobody likes wasteful government spending on incompetent agencies.

I 1000% agree with you. But my fear is they've got the $$$ to wage this war of attrition longer than all the rest of us combined.

It's worse than that: they've got your money.

It's very clear to me that civil liberties are at the core of this discussion. And while I think that crypto->math->speech is clear, others may not.

Let's keep in mind that our legislators are responding to a genuine fear in the electorate. It's our job(s) as technophiles to educate the electorate and legislature as to why this isn't the right move.

This is where SCOTUS really pays off as a check against Congress. They can be surprisingly practical (sometimes). This blog article by djb [1] really makes a good case for why encryption is speech (and the fact that it has utility is orthogonal). Recall that Bernstein v US was a driver for real progress in the government's understanding and treatment of encryption technology.

[1] https://blog.cr.yp.to/20160315-jefferson.html

> And while I think that crypto->math->speech is clear

While I agree with this, I don't think the most important part of crypto == math is the connection to speech. The most important part is that math doesn't compromise. You can't make math kinda work...it either does or it doesn't. You can't make math work for some people and not others. Math is absolute, so this debate is absolute. The result is either secure applications or fundamentally insecure applications.

This is where the government's "we've got to find a compromise" rhetoric rings hollow, naive or disingenuous, depending on how much credit you're willing to give them when it comes to technical acumen. Compromise, when it comes to encryption, means compromised. When encryption is compromised, that encryption is fundamentally broken and dangerous. Crypto == math means there can be no compromise with the government's agenda. And it's not because we don't want to be reasonable, it's because math doesn't care about court orders, the law, the government's authority or good intentions.

> This is where the government's "we've got to find a compromise" rhetoric rings hollow, naive or disingenuous, depending on how much credit you're willing to give them when it comes to technical acumen.

The government is not one uniform thing. I think there's elements of the government who feel like the compromise is appropriate and don't realize that Key Escrow is the only possible compromise and don't realize why it's already been reviewed and rejected. They might mistakenly think that the objections are somehow needlessly idealized or dogmatic. I don't think they're naive at all. They're just mistaken. On the other side, there's other elements of government who are totally aware and willing to exploit the fear of the electorate and legislators in order to achieve their goals. That group is definitely disingenuous.

To be fair, most of the loud political objections they deal with can be considered "needlessly idealized or dogmatic." That's largely the norm, and it's become even more so as the louder minorities within each party have been able to leverage new tech to increase their respective voices. This applies to almost every issue that they have to deal with, and every advocacy group argues that their issue is special and unique.

So is it any surprise that a lot of our legislators (at least those who aren't leading the charge here) are grouping arguments against backdoors into the same category based on their past experiences? They don't have the background or theoretical grounding to realize that, this time, the realities of mathematics really do make this issue unique.

It's a mess, with no clear path in sight. In addition to lobbying and supporting new candidates, any united effort to defend against these intrusions is going to have to include a massive education program meant to inform legislators on the dangers of what they're proposing. And it'll have to be tailored to how legislators actually think about policy. The arguments that work amongst the tech world aren't going to work in Congress. Not because they're idiots or anything like that, but because radically different backgrounds and experiences can affect how information is processed.

It's not impossible, it just really compromises security. For example all legal encryption could use a master key, or both hardware and software could be subverted to make truly secure processes impossible. These things are a Bad Thing but perfectly possible, they just preclude anything being reasonably secure.

The drafters of the bill clearly don't fully understand or care about those issues though, and if they did they'd probably see it as an acceptable compromise to security in order to enable government to see all communications.

If this passed, I would appreciate a determined stance to make the government's technology experience difficult. No more smartphones. Google and Bing refusing traffic to government IP blocks. No more Windows updates past whatever Microsoft has already agreed to by contract.

Obviously there's no financial incentive for any of this.

Ever talked to someone who works with technology for the government? The government's technology experience is already difficult.

The government already has e-mail inboxes with a nearly-unusable UX, messages that pass through so many unreliable contracted-out systems that they might not get delivered for several days, hideously obsolete "smart"phones... and yet this pain isn't felt by the people making the laws, just by their staff.

We're talking about a place where the Blackberry is advanced technology.

Your suggestions would involve companies dropping their government contracts out of spite, for an effect that the government would barely notice.

No way this passes... it blatantly violates the First, Fourth, and Fifth Amendments.

How can we stop this...Vote! Vote the people that support this out of office.

....also...If they want to make everything unsecure...hack the congress members bank accounts, private information, Ashley Madison account, every email and text they send, every phone call they make...every password they've used on any site ever. Make them wish they never made such a stupid law. Maybe then they will learn how important encryption is.

Ah yes, the good old "let's ban math" bill.

Not the first time: https://en.wikipedia.org/wiki/Indiana_Pi_Bill

Unless they mandate backdoors outright, it seems like this will just force companies to make sure there's no possible way for them to access the data.

If your company ever has any way to view the data, the government will find a way to force you to give it to them.

AIUI this bill doesn't allow for that possibility; it says that companies must always provide the data, which implies that companies can never put the data in a situation where they can't access it in the future.

It is my sincerest hope that this bill goes into the filing cabinet with Dihydrogen Monoxide [0] and legislating the value of pi [1]. I appreciate the challenges that encryption creates for law enforcement, but it has nothing to do with placing people "above" the law; rather it is a simple truth that you cannot create a system that is both designed to be secure and circumvented by authorities (and no one else). It is asinine that we must continue to have this debate.

[0] http://www.nbcnews.com/id/4534017/ns/technology_and_science-...

[1] https://en.wikipedia.org/wiki/Indiana_Pi_Bill

"We’re still in the process of soliciting input from stakeholders and hope to have final language ready soon." Who are stakeholders in this context?

Presumably just law enforcement (FBI, DoJ, etc), since they're clearly not listening to either technical experts or software/hardware providers.

Ah, demonstrating yet again how insane it is that we hand so much power to people who know so little about the domains they control (and worse, seem to do little to change their lack of understanding). Will bills like this spur people into action to change how the government operates? I suspect at the very least, this will pointlessly cost technology companies millions as they desperately try to counteract the bill.

What would we do if this passed? I suppose I would winnow and chaff the hell out of my data, sending literally 42 gigabytes of garbage bytes for every one-kilobyte file. That way, if somebody wants to decrypt it, they will at least have to sift through tons and tons of garbage to do so.
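For what it's worth, burying real data in garbage resembles Rivest's "chaffing and winnowing" idea: real packets carry a valid MAC under a shared secret, decoys carry random tags, and only the keyholder can separate wheat from chaff. A toy sketch in Python (the packet format, sizes, and names are made up for illustration; Rivest's actual scheme authenticates individual bits, not whole packets):

```python
import hashlib
import hmac
import os
import random

KEY = os.urandom(32)  # secret shared by sender and receiver only

def tag(data: bytes) -> bytes:
    """MAC under the shared key; only keyholders can compute or verify it."""
    return hmac.new(KEY, data, hashlib.sha256).digest()

def send(message: bytes, chaff_count: int = 4):
    """Emit the real packet plus chaff packets carrying random tags."""
    packets = [(message, tag(message))]
    for _ in range(chaff_count):
        packets.append((os.urandom(len(message)), os.urandom(32)))
    random.shuffle(packets)  # interleave wheat and chaff
    return packets

def winnow(packets):
    """Keep only packets whose tag verifies under the shared key."""
    return [data for data, t in packets if hmac.compare_digest(tag(data), t)]

# An eavesdropper without KEY sees five equally plausible packets;
# the receiver recovers exactly the real one.
assert winnow(send(b"meet at noon")) == [b"meet at noon"]
```

Notably, no "encryption" happens at all here, which was Rivest's point: confidentiality emerges from authentication alone.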

The way I read the text (quickly, and as a layman), I don't see anything that mandates back doors or encourages weak encryption. So to me it looks mostly fine, or even just a formalization of the status quo?

It seems to say that if a company can read their customers' data in plain text, then that plaintext should also be provided to authorities upon request/warrant.

So the bottom line is: use strong crypto and build products with strong crypto and ensure only the end users hold the keys to the data. Then all the aid you can give authorities is "no can do" - and that's still acceptable? I'm kind of fine with that.

Do bills like this one pass? This reads like childish foot-stamping.

Anyway, if there's a product out there which doesn't provide this capability, no amount of bill passage is going to make that product work like some congressperson wants.

As far as I am concerned, this bill is unconstitutional.

That's an uphill battle, and a long one, and one that you need to have deep pockets to win.

Could someone help decode this?

"covered entities must provide responsive, intelligible information or data, or appropriate technical assistance to a government pursuant to a court order."

I assume that this includes secret FISA orders, but what's with the language about "a government"? Could this be construed to include non-American governments? (i.e. Five Eyes?) Or is that a term of art?

Doesn't the 13th amendment to the constitution outlaw slavery?

> Neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction.

So unless you've been duly convicted of a crime, they can't compel you to work?

Seems to make this whole thing sound unconstitutional.

YC staff: since YC is a huge source of economic and technical innovation, have you considered talking to the USG about security?

I can see from reading the comments that I remain in the minority in thinking that such a law would actually bring benefits in terms of security. It would promote FOSS approaches which, being grounded in free speech, would be beyond the reach of this law, and which, being open, would in any case be more secure.

If you're in California, here is a form to email Dianne Feinstein your thoughts on this bill. https://www.feinstein.senate.gov/public/index.cfm/e-mail-me

Ignorance is everywhere, especially concerning computer security and especially among the representatives and senators in the US Congress.

We must combat ignorance by educating the ignorant, and that includes elected officials. Many here at HN have knowledge and high level of awareness of the importance of privacy and security, and I'm sure there are more than a few genuine experts tuning in. In any case most of us probably have salient information and experience to share, and it certainly needs to be shared.

If you are a US citizen by all means write to your Representative and Senators. Voters' opinions do matter, particularly in an election year.

This afternoon, I sent messages to all 3 members of my congressional delegation explaining how bad an idea the Burr-Feinstein bill is. I urge you to do the same.

If the goal is to stop terrorism and the illegal drug trade, this is the epitome of idiocy. End-to-end encryption isn't trivial to implement, but there are already many open source projects that implement it. If this becomes the norm, then professional criminals will simply adopt open source tools, third-party programs, and products offered by companies outside the US to ensure their privacy. So this will weaken the security of all honest people while doing basically nothing to achieve any of the goals set forth in this legislation. Of course, if the goal is to further control and subjugate the citizens of this country for the benefit of its corrupt government, then by all means, it's a brilliant piece of legislation.

Now that the Supremes have ruled that the feds can force you to buy a product (health insurance), it's not such a large step to force people to perform work on their behalf.

One of the major gripes of the American revolution was the forced quartering of soldiers in people's homes.

So this hands the entire security software market to overseas companies...?

To which country? The UK, France, Australia, NZ, and Eastern Europe would love to just follow the US in this stupidity. German politicians are too clueless to even remotely understand security. The few countries that would not introduce a similar law would end up on some blacklist.

Maybe Iceland could capture the entire infosec industry.

How does this apply to open source? Eg, if two people used PGP, their email providers couldn't decrypt their communications. Would the PGP authors/maintainers get in trouble for being unable to help? Does this effectively make it illegal to create end-to-end encryption tools for users?

If this manages to become law, how would this impact open source crypto? Which company would hold the responsibility of compliance? The project sponsors, code repo hosts, or other?

That is the real question. My guess is that there will soon be a push to make open source crypto somehow illegal (if we're following the logic at this point) similar to the way that open source 3D printed firearms are treated.

It will be handled that way until the process makes its way through the courts.

"made unintelligible by a feature, product, or service owned, controlled, created, or provided by the [company]."

So no more hash functions, I guess; better store those passwords in clear text. Also, encryption is the key to any form of distributed system where you have misbehaving actors... you know, like the internet.
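Password hashing is a good concrete example of data deliberately "made unintelligible": the service stores only a salted one-way hash and can verify a login without ever being able to produce the plaintext for anyone, court order or not. A minimal stdlib sketch (the iteration count is an illustrative value, not a recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    """Derive a salted, slow, one-way hash; the plaintext is never stored."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive from the attempt and compare in constant time."""
    candidate = hash_password(password, salt)[1]
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify("correct horse battery staple", salt, digest)
assert not verify("wrong guess", salt, digest)
```

There is no "technical assistance" that inverts `pbkdf2_hmac`; the best anyone, including the operator, can do is guess passwords and re-derive.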

It's rather disheartening that Californians haven't managed to organise serious resistance to Feinstein.

This is why none of my business, and none of my personal data (outside browsing with PrivacyBadger enabled), goes to companies residing in the United States.

(written on an rk3288-based ARM laptop from China, over a VPN from Iceland)

I mean... you literally can't legislate crypto, can you?

Could someone explain what this would look like, in a practical sense? Would self-signed keys become illegal, and all PKI would have to have a "government" parent key of some kind?

> If the company cannot meet this standard, it must offer “technical assistance as is necessary to obtain such information or data,” according to the draft.

Based on this it appears not. Here's how I imagine it going.

1. There's something the government wants that you've stored on some SaaS website like Dropbox.

2. But you've encrypted all of your files with your GPG key so Dropbox gives the gov't access to the files and then tells them that's all they can do.

3. Now the government needs your key so they raid your house and take your say, Dell laptop.

4. Now you use FDE and Secure Boot with a custom key so your device is locked tight.

5. The gov't then goes to Dell and demands that they use any exploit they know to unlock your device.

6. So long as you've chosen a good strong passphrase Dell will do their best but ultimately tell the gov't that there's nothing they can do.
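The scenario above turns on client-side encryption: if the key never leaves the user, the provider holds only ciphertext and has nothing intelligible to hand over. A deliberately toy one-time-pad demo of that property (a real setup would use GPG or an authenticated cipher; the filename here is illustrative):

```python
import os

def encrypt(plaintext: bytes):
    """XOR with a random pad the same length as the message (one-time pad)."""
    key = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR is its own inverse, so decryption is the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

secret = b"contents of tax-returns-2016.pdf"
key, blob = encrypt(secret)
# The provider stores only `blob`; without `key` it is statistically
# indistinguishable from random noise, so a subpoena yields nothing useful.
assert decrypt(key, blob) == secret
```

The provider's "technical assistance" here honestly tops out at handing over the noise.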

They used to legislate crypto, so there's no reason they couldn't now. Back in the day, creating (and "exporting") an SSL library without a license was illegal.

If you're talking TLS, whoever holds the private key would have to use it to decrypt either stored or transmitted data and turn that data over to the feds. If you're talking PGP or something, then either the makers of PGP would have to put in a backdoor, or stop making it.

To implement this all the government would need is to send an NSL to anyone who creates, owns or operates any tool with encryption technology, or anyone who operates a network or service that encryption is going over.

More to the point, how would this and the 5th amendment interact if it was the end user that created a pass-phrase protected key which no one else knew?

The plaintext version of the draft bill (cleaned up):


Does anyone know if foreign companies that operate within the United States would be subject to these laws?

If not, then perhaps a lot of big US tech companies will move offshore?


So a publisher that publishes textbooks on crypto has to provide a back door to the government?

Well, it was only a matter of time before we regressed back into a nation with legalized slavery.

No, absolutely not. I'm tired of this stupid conversation. The answer is always no.

We need an NRA for crypto to destroy any and all politicians and bureaucrats who would threaten crypto safety.

    So, as we set out this year to defeat the divisive forces that would take freedom away, I want to say those fighting words for everyone within the sound of my voice to hear and to heed, and especially for you, Mr. <insert politician here>: 'From my cold, dead hands!'
    — Crypto's Charlton Heston

Encryption which can be decrypted by a third party is not encryption but encoding.
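To make the distinction concrete: an encoding like Base64 is reversible by anyone, with no secret at all, which is effectively what key-escrowed "encryption" amounts to from the user's point of view. A two-line illustration:

```python
import base64

# Encoding: any third party can invert it; there is no key to withhold.
encoded = base64.b64encode(b"attack at dawn")
assert base64.b64decode(encoded) == b"attack at dawn"

# If a third party also holds a copy of your decryption key, your
# "encryption" offers that party no more resistance than the line above.
```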

Reading through the bill, it's apparent that this is actually a pretty poor attempt at circumventing the rise of encryption. I have my suspicions as to why.

The bill mandates technical assistance, yes, but the obvious argument is that no amount of technical assistance can accomplish that which is impossible. If the data can't be accessed, it can't be accessed. Interpreting this as barring companies from implementing those features would require an unduly expansive reading of the text, to say nothing of issues with Sect. 3(b).

Sect. 3(c) seems to contradict that, but 18 USC § 2711 defines a "remote computing service" as "the provision to the public of computer storage or processing services by means of an electronic communications system" [0]. That would seem to include iCloud services, for instance, but not the iPhone itself. For reference, 18 USC Chapter 121 covers the storage of electronic communications and access to transactional records.

It's possible that this is part of a long game. Breaking the encryption debate down into smaller pieces lets them control the narrative a bit more and defuse some of the strongest arguments against backdoors and weakening encryption. They can point to this bill a few months down the road and say "look, nothing bad happened. The evil hackers those San Francisco techies were complaining about never stole your identity." It would make the opposition appear more reactionary, and give them more time to muddy the water further.

To the average person, the bill seems entirely reasonable. Who could object to companies giving assistance when they have to comply with a court order? In that sense, it's a perfect starting point.

But most likely, this is a trial balloon meant to help them refine their arguments and position before they get around to a concerted push. That seems likely given that the Senate Intelligence Committee doesn't really have jurisdiction over this issue [1][2]. If they can pass it, great for them. If not, they'll come back around for another go in a bit. They've got the time, and for various reasons, they've chosen this hill to fight upon.

If they want to stop these efforts before they eventually stumble into a "success," the tech industry has to gear up for a lobbying and PR war with a degree of cynicism that's unheard of in Silicon Valley. Simply reacting to the government's efforts isn't going to be enough.

Groups like the RIAA and MPAA, even when they've failed to actually implement their policy proposals, have had remarkable successes in manipulating Congress to go along with their plans. At least until they backfire. Like the NRA with the second amendment, encryption will have to be the primary focus. Everything else has to be secondary. Supporting backdoors and the weakening of encryption has to be transformed into a toxic issue for politicians. Hammer home the potential consequences to every retiree and everyone else who doesn't like the idea of their identity being stolen and their assets being spread amongst the criminal element.

With this debate, there's really no room for optimism or taking the "high road." There's no high road in sight, and success is the only metric that matters here.

0. https://www.law.cornell.edu/uscode/text/18/2711

1. https://www.eff.org/deeplinks/2016/04/burr-feinstein-proposa...

2. http://www.intelligence.senate.gov/about

The bill begins:

It is the sense of Congress that--

    no person or entity is above the law;
    economic growth, prosperity, security, stability, and liberty require
    adherence to the rule of law;

Confused here, I thought that the process between Apple and the FBI was taking place in the courts and not on some kind of aquatic platform in international waters.

There's a real, reasonable fear at the heart of this legislation.

If encryption becomes widespread and providers/individuals start using it correctly, then it will greatly hinder law enforcement's ability to gather physical evidence for certain types of crimes.

At the end of the day this is just another situation where we have to weigh the positive of greater freedom against the negative of the impunity this freedom may provide to those breaking laws that we all support.

I don't know what the answer is, but acting like anyone who supports this legislation is just after more control is immature.

Encryption is already widespread; WhatsApp alone just enabled end-to-end encryption for a billion users. Full disk encryption has been around for 20 years.

There's nothing anyone can do about this, you cannot hinder encryption, it is just pure math.

You can ban encryption all you want, it will never ever stop terrorism in any way. All that will do is harm the good people that have legitimate reasons to protect their secrets from everyone, not the bad people.

Criminals will always have encryption, regardless of any laws, so your dichotomy isn't really correct.

Encryption does not create impunity for anyone. Crimes are still crimes, and plenty of criminals were caught, prosecuted and convicted before mass surveillance.

I'm sorry -- "if encryption becomes widespread and providers/individuals start using it correctly" is what everyone technical agrees should happen. The spread should be as wide and as comprehensive as possible.

To suggest otherwise is to say that security isn't important at all.

Or could you concretely suggest some areas where communications shouldn't be encrypted? Offer up some sacrificial lambs.

It is really sad that you likely do not even understand how this relates to your last comment on HN about government-planted child porn.

>I don't know what the answer is

What's the question?

>acting like anyone who supports this legislation is just after more control is immature.

It's mostly driven by fear based emotional responses rather than logic, the terrorists have successfully hijacked their amygdalas[1], a classic social engineering technique.

When a target cannot be directly and effectively neutralized, scare tactics are used to make the target compromise themselves.

"As for us, we behave like a herd of deer. When they flee from the huntsman's feathers in affright, which way do they turn? What haven of safety do they make for? Why, they rush upon the nets! And thus they perish by confounding what they should fear with that wherein no danger lies. . . . Not death or pain is to be feared, but the fear of death or pain. Well said the poet therefore:—Death has no terror; only a Death of shame!"


