DOJ plans to strike against encryption while the Techlash iron is hot (stanford.edu)
444 points by erwan 32 days ago | 331 comments



Why does this say that the DOJ has been pushing for this since 2016?

They have been pushing for some variation on this since basically forever. I first became aware of it back under Clinton, with the Clipper chip (https://en.wikipedia.org/wiki/Clipper_chip). And the debate has been essentially the same since.

Law enforcement wants to be able to break security, and promises that their super secret, super safe system will provide everyone else protection from evildoers while letting law enforcement find the bad guys. Cryptographers have maintained that when you create a back door, it is a question of time until it is found and publicized.

And the back door doesn't even have to be found to be abused. Because it will be made available to law enforcement and the courts. Which are surprisingly easy for third parties to subvert. And which are happy to build programs for themselves that break the rules that they are supposed to follow. (Snowden anyone?)

Success has gone to both sides. But on balance, the cryptographers have been right.


Old movies from the 60s/70s sometimes have scenes where the characters are worried about calls being recorded or otherwise having no privacy.

The typical response is “What is this, Russia?” or something similar.

Privacy is really important. I will always err on the side of privacy even if that means not everyone bad is caught.


> I will always err on the side of privacy even if that means not everyone bad is caught.

Reminds me of something my dad always told me, "I'd rather see 10 guilty people walk free, than have 1 innocent person in jail."


Blackstone’s ratio is where that principle originally comes from, at least in modern law: https://en.wikipedia.org/wiki/Blackstone%27s_ratio


I'm not a historian, but I don't think your dad came up with that first...


They never said he did. Just that he said it.


The Conversation (1974)

Three Days of the Condor (1975)

Marathon Man (1976)

Sneakers perhaps.

Anyhow, I think you're missing the context of what happened in the 60's and 70's that stoked paranoia in the culture and in fiction.

1. The Citizens' Commission to Investigate the FBI (CCTIFBI)

Only because some morally-flexible activists broke into an FBI field office did they uncover COINTELPRO. Quoting Chomsky: "40% [of FBI work activities] were devoted to political surveillance and the like".

https://www.theguardian.com/world/2014/jan/07/fbi-office-bre...

https://en.wikipedia.org/wiki/Citizens%27_Commission_to_Inve...

Furthermore, in modern times, there are:

- warrantless mass surveillance (NSA)

- license plate readers (LPRs)

- cellphone tower/position tracking

- Ring and other IoT devices recording evidence of technical crimes to cloud providers. All it takes is law enforcement getting access for something else and digging around the data with automated tools to round up bodies to prosecute. For example, a small child escaping from a bath without clothes in front of an always-recording security webcam technically counts as child pornography.

- No-name webcams and other IoT devices being hacked or manufactured with backdoors, to be used for spying, malware injection, packet sniffing, and who knows what else.

- As revealed by Snowden, various secret watch lists containing the names of over 2 million Americans.


Freedom isn't free. I think people just ought to realize that. People have this idea that free countries are more efficient, which is probably true in the long run, but it's a cop-out to just call it a day there.

Once I was in Belarus. It's a heavily authoritarian country, make no mistake. When they had protests against their rigged elections, they traced the phones of everyone who went, then brought them in for questioning and asked them what they were doing there. The president later said: "We have rigged the latest election. 93.5 per cent have voted for Lukashenko. But they say it is not a European outcome. We have made it 86 per cent."

But it does work. I was at a restaurant there. Some girls are chatting, and they walk away to go get something, and just leave their bags there unattended. Why? Because it's an authoritarian country - people aren't going to steal stuff there.

This was in the capital of one of the poorest countries in Europe, but it was still one of the safest places I've ever been.

EDIT: I don't mean to praise authoritarian countries, on the contrary. We should support human rights even if it is less efficient, because the object of governance is not to strip people of their freedoms for pennies on the dollar.


>But it does work. I was at a restaurant there. Some girls are chatting, and they walk away to go get something, and just leave their bags there unattended. Why? Because it's an authoritarian country - people aren't going to steal stuff there.

Chances are high that you could do the same in most European countries. Just like in the Soviet Union, thieves exist in Belarus. People's perception might be that they're safer, but that doesn't mean they actually are. Just like during Soviet times, people felt safer, but crime still happened. It just wasn't talked about, because it makes the state look bad.


> Just like during Soviet times, people felt safer, but crime still happened. It just wasn't talked about, because it makes the state look bad.

Poor example. The anarchy that attended the collapse of the Soviet system brought about a dramatic increase in crime. And while reported petty theft was arguably a reporting problem, the dramatic rise in homicides was not.

For the record my mother-in-law lived in St Petersburg at the time. And her anecdotal reports were that crime that she and friends experienced rose dramatically.


I don't recommend leaving bags unattended in public anywhere these days, they're likely to call in the bomb squad.


Just to add to your point: in old-time mafia-controlled villages in the old country, anecdotally you did not have to lock your car, but few would argue that an average person in that village was 'safe'.

It almost reminds me of some of my parents' peers yearning for the good ole days of communism, cuz the riffraff would not dare talk back to a cop.

edit: It is hard to argue with that kind of sentiment. It is technically right.


I actually lived in Belarus for most of my life. I would never leave a bag in a restaurant unless I had a clear view of it while ordering something. (I would not do it here in the USA either.) Also, even in the capital of Belarus, some neighborhoods are still quite dangerous to walk around at night.


Would Japan qualify as authoritarian? Yet it is safe to leave things out without watching them, except for umbrellas. There are even places in the US where it is safe enough. I doubt these places are more authoritarian but instead allow such social safety because of other reasons. Authoritarianism can, to some extent, replace those reasons if they are lost, but it replaces them with some sort of mutated social structure that I find has far more problems than the original once you look past the surface.


> Yet it is safe to leave things out without watching them, except for umbrellas.

People have no qualms about stealing umbrellas?


Umbrellas frequently get stolen via the self-justification of "I need it, I'll just borrow it" followed by laziness about returning.


In most of the US, you can safely leave most things without watching them without much risk of theft.

I mean, if you leave something that's a strong temptation (like a big pile of cash) it'll eventually disappear.

Yeah, if you go into one of the inner cities, your stuff will get stolen, but otherwise, generally nah.


> This was in the capital of one of the poorest countries in Europe, but it was still one of the safest places I've ever been.

Obviously if you put all people in jail they will be very safe there. But then you have fewer people working and producing useful stuff, and more people rotting in jail.

The best example I give of this trade-off between freedom and safety is that of women in Saudi Arabia. They have the least amount of sexual violence in the world.


Well of course they do; it's not rape there as long as you marry the 12-year-old first.


In Sharia law, you need 4 witnesses to prove fornication. Bringing a charge without the evidence subjects the one who brought it to 80 lashes.

Whether or not this applies to rape is a question of debate. However in Pakistan it certainly has been interpreted to so apply. Which means that it is very hard for a woman to prove that she was raped.

See https://en.wikipedia.org/wiki/Hudud#Requirements_for_convict... for verification.


It can suffice for the rapist to marry the girl afterward (not sure of the country and specifics).


> > This was in the capital of one of the poorest countries in Europe, but it was still one of the safest places I've ever been.

> Obviously if you put all people in jail they will be very safe there. But then you have less people working and producing more useful stuff than rotting in jail.

Friendly reminder that the US has the highest incarceration rate of all countries.

> The best example I give of this trade off between freedom and safety is that of women in saudi arabia. They have the least amount of sexual violence in the world.

Is this according to their official statistics? How do they know they're accurate and comparable? For instance, is rape within marriage considered a crime and regularly prosecuted there?


The US has the highest incarceration rate of countries from which you can trust the statistics.

That is a different thing.


This is plainly false. With no legislation protecting women from marital rape, and with punishment for women who report rape, nothing indicates that Saudi Arabia actually has "the least amount of sexual violence in the world". It's a laughable statement at best.

https://en.m.wikipedia.org/wiki/Rape_in_Saudi_Arabia


I think that is the point that OP was making.


One time I was at a coffee shop in Chicago and I saw a woman leave her laptop unattended while she went to the bathroom with no ill effects.


You'd better not have left your stuff unattended in authoritarian communist Romania... I get what you're implying, but an authoritarian regime doesn't always imply harsh and effective enforcement of the law for petty crimes. Sometimes it does, but the two aren't necessarily correlated, although one probably requires the other.


For me at least this is the thing that will push me to leave and renounce citizenship. I am sure I represent an insignificant minority to them, but I can't live somewhere without my own security on my data and cryptocurrency. Over the last few years I moved basically all my assets into Bitcoin and have left the legacy banking systems behind. With a few keystrokes they can lock you totally out of all your funds and you can do nothing about it except complain. This is totally unacceptable to me. I still pay taxes and have a salary job of course. But I don't trust the banks at all.


I don't mean this to sound snippy, but where would you go? Most non-US countries I'm aware of have even worse restrictions around speech, encryption, and so forth.


My expat friends and family have mostly set up shop in Germany. It seems like Germany has pretty strong encryption support[1]. Doesn't hurt that they also bought homes there at around 1% interest and get no less than 6 weeks PTO apiece.

[1]:https://carnegieendowment.org/2019/05/30/encryption-debate-i...


New Zealand has stronger free speech protection than the US and no encryption issues. More freedom of the press and even ‘free’ healthcare for citizens or permanent residents make it a solid contender.


Are you talking about the same New Zealand that literally has a government position called Chief Censor? The same New Zealand that tried to censor sites after the attacks? The same New Zealand that then went on to push for weakening free speech protections online?[0] New Zealand is at the bottom of the list for me when I consider western countries that protect free speech.

[0] https://www.scoop.co.nz/stories/PO1905/S00277/new-zealand-pm...


https://www.computerworld.com/article/3468958/the-uk-and-nz-...

>New Zealand’s Telecommunications (Interception Capability and Security) Act was introduced in 2013.

>On receipt of a warrant from a surveillance agency, a network operator – defined as a public telecommunications network or a telecommunications service – must “decrypt a telecommunication” on their network or service, but only if it has “provided that encryption”.


All countries owned by “her majesty” are anti-encryption. Look up the Queen of New Zealand - many know, but I’m sure some will be surprised.

Sadly, that same world leader is the strongest anti-encryption and censorship advocate.


> stronger free speech protection than the US

You do realize this is the same country which made it punishable by up to 10 years' prison to have a specific terrorist's manifesto on your computer?

https://apnews.com/162e85e9418240d3ae3650c8f59caf56


I have a US and UK passport. I would probably end up in Singapore or a Nordic country? I always figured I would evaluate my options if that time ever came.


Don't bother with the Nordic countries if you're trying to avoid surveillance. They're surveillance shitholes, and it's only getting worse.

Denmark has been illegally requiring all telcos to monitor everything for years, and nothing has changed. No one is willing to change anything or impose any consequence for it. It's even gotten people falsely imprisoned due to complete reliance on location data from mobile phones.

Sweden is not much better, and just passed legislation allowing law enforcement to hack people's phones.

I'm almost certain that Norway, Finland and Iceland are no better.


Singapore is a democracy in name only and is its own budding surveillance nightmare.


Your best bet is countries where people still remember what it's like living under an oppressive regime.


Someplace where the government is too weak to care. Most countries with a GDP/capita under $20k qualify.


If the NSA can't keep its employees from abusing their spying tools to spy on their neighbors then what chance does law enforcement have?


I'm starting to get the feeling that a lot of these jobs, both tech and government, are low-key treating non-public data access as a perk.

For context, I interviewed with a company that had police videos stored on their systems and was using them for entertainment, to the point of showing me one at the beginning of the interview for grins.


This was verbally joked about as a perk while interviewing at Facebook in 2008. Specifically: any engineer could view anyone's profile. I think that's been changed though.


This is possibly true at smaller companies, but this kind of thing does not fly at larger tech companies.


I've worked on both sides of that; I think you'd be surprised. I also suspect that it's worse at the TLAs and any organization with access to FBI, license plate, and various criminal databases. Obviously credit card databases and the like are off limits, as their use is outright fraud, but stuff like traffic cameras, imaging systems, etc. is another story.


I don't know about the US, but many European countries have access logging in police and health databases, which is checked for unwarranted snooping. With actual convictions taking place when someone gets too interested in, say, the behind-the-scenes data of the latest celebrity news.


Many times these are implemented so that IT isn't monitored, and the staff who are monitored can face selective enforcement. This allows people to be fired for abusing their access when in reality they are being fired for some other reason that isn't nearly as PR-friendly to state.


> This allows people to be fired for abusing their access when in reality they are being fired for some other reason that isn't nearly as PR-friendly to state.

TBH this sounds a bit like a conspiracy theory. And for what it's worth, at least in Finland getting caught snooping isn't cause for firing people; the convictions have been fines.


The idea that employers will fire you for justifiable reasons when the reality is that they are firing you for unjustifiable reasons is far more than simple conspiracy theory. There are numerous cases of people winning discrimination lawsuits because such firings weren't covered up well enough and lawyers were able to prove the actual reasoning. I do not find it unreasonable to think that many more cases weren't caught because the individual either didn't have the funds to fight back or the tracks were covered well enough that the jury wasn't convinced.


Yes, I find it plausible. But you specifically said that this monitoring often leads to baseless firings. Do you have proof there's a link - are there multiple cases where this was proven? If not, you don't have grounds for claiming that it's often happening in the context of data snooping. That would be just useless speculation about what could happen.


Not true. While it has been pretty rare for anyone to get fired over snooping on other people's data, there have been several cases where it has been the basis for firing the employee. The health care sector especially has been sensitive about snooping; see for example [1], article in Finnish.

[1] https://yle.fi/uutiset/3-7460193


Good if the more serious cases have bigger consequences. I suppose the fines have been for single offences, whereas this linked case was pretty extensive snooping.


Twitter employees have in the past bragged in a bar about their ability to look at people's DMs and were recorded doing so.


I find it hilarious that RSA Security campaigned against the Clipper chip (https://en.wikipedia.org/wiki/Clipper_chip#Backlash) when they were subsequently found to also be compromised.

Seems like open source and peer review are the only way to ensure that crypto is not being hijacked.


Just because they have been trying forever doesn't mean they aren't going for another big push right now. Attorney General William Barr is spearheading an active campaign on this; it was headline news here a few weeks ago. Don't be complacent.


> And the back door doesn't even have to be found to be abused.

The government itself is likely to abuse such back doors. LOVEINT is a documented practice.

Meanwhile governments throughout the world are mandating the use of Signal for communication. Common folks are not special enough to justify strong encryption but it's totally valid for government use. Helps hide their corrupti-- I mean state secrets.


> Cryptographers have maintained that when you create a back door, it is a question of time until it is found and publicized.

Why would it not be possible to create a system that required several manual and offline steps in order to break the encryption?

For example (and perhaps similar to offline cold storage of bitcoin) why couldn't a system be designed whereby 3 or more people in geographically diverse areas were in a position to agree that a request for information was legitimate (by court order) and thereby produce what is needed to unlock certain information? So one person would not have the key or access.

After all right now you have a case where a single person (the owner) is able to unlock information. The feeling is a back door can be hacked. What if it's not a back door though?


Shamir secret sharing allows for that on paper.
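
For the curious, here is a toy 3-of-5 split in Python (an illustration only; a real deployment would use a vetted library, not this):

    # Toy Shamir secret sharing over a prime field: any k of n shares
    # reconstruct the secret; fewer than k reveal nothing about it.
    import random

    PRIME = 2**127 - 1  # field modulus, large enough for a 16-byte secret

    def split(secret, n=5, k=3):
        rng = random.SystemRandom()
        coeffs = [secret] + [rng.randrange(PRIME) for _ in range(k - 1)]
        f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        # Lagrange interpolation evaluated at x = 0 recovers the secret.
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    shares = split(123456789)
    assert reconstruct(shares[:3]) == 123456789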

The problem is that to be useful for law enforcement, any local police department has to be able to go to any local judge and get a warrant and then get access.

There are approximately 30,000 state judges with fairly high turnover in that list. If you can compromise one, or successfully get yourself added to that list, you can then get access to whatever you want. That's way too many people to trust.

This is not a hypothetical weakness. I personally know someone whose physical location was compromised through a court order obtained by bribing a judge. How many cases are there where similar access was gained but the victim doesn't know how it happened? And the better the access that you can get, the more incentive there is to get it. (There is no shortage of reasons why a motivated party would want such access.)


This is basically what happens when law enforcement uses a search warrant to get access to user data from a tech company.

While this process does have weaknesses, it is still the difference between a legal process overseen by the courts and one based on espionage where agents do whatever they want without oversight.

Note that strong network encryption is essential for ensuring that they have to get a warrant.

I don't think anyone has come up with a better system than judicial oversight that still allows law enforcement to do their job?


> that still allows law enforcement to do their job?

Do libraries not keeping records of who has read what books not allow law enforcement to do their job?

Do prohibitions against arbitrary searches not allow law enforcement to do their job?

Does having to make a plausible argument in front of a judge not allow law enforcement to do their job?

Do individuals not having a number tattooed on their forehead not allow law enforcement to do their job?

Law enforcement continually frames any perceived difficulty as if it prevents them from doing their job, which is utterly disingenuous. Law enforcement's "job" is to operate within the constraints of a free society composed of innocent people, in spite of the extra work from not being able to just shoot the bad guy like in the movies. When they don't respect this, they're merely another criminal element.


While I agree with your point, if you're going to critique law enforcement this only goes halfway:

In the US, the idea of the publicly funded law enforcement officer can trace its lineage back to antebellum slave catchers.

Much like they now (and forever) frame any perceived difficulty as preventing them from doing their job, they've successfully re-framed their role in society as protecting and serving (people) instead of protecting and serving (property rights), despite that obviously not being the case.


How is that a good thing though? American police are not very popular, to put it bluntly. Even among the Americans I know who support their efforts, they still do not have any affinity towards individual police officers.

Compare this to the situation in Europe and Asia (or at least the civilized parts like Japan), where it's perfectly acceptable and normal to ask a policeman for directions. And where their role is indeed to protect and serve people (not property).

"Police have the role of protecting and serving people" really does not feel like such a controversial viewpoint to me.


You're being too binary about this. Yes, we should be skeptical of law enforcement. Judicial oversight is essential. But we should still care about police effectiveness at catching law breakers. Why have laws if they can't be enforced?


It is you and law enforcement who are being binary. Plenty of laws are currently being enforced in the presence of strong encryption. The sky hasn't fallen.

In fact, these high profile cases the FBI et al attempt to use as wedges generally involve someone who has already been caught with plenty of evidence, but the authoritarian impulse still demands more.

As far as laws that can only be enforced/observed by reading people's private thoughts and communications, then yes indeed, why have those at all?


If we're agreed that there is no deep principle at stake and there are shades of grey, it seems that the difference between end-to-end encryption and link-level encryption (as cloud providers typically do) is a matter of practicality?

Law enforcement agencies send legal requests to Google and Facebook all the time, and the sky hasn't fallen.

Actually, for all X where X exists, the sky hasn't fallen :)


That's fallacious generalizing, though. "All sorts of things have been allowed, so allowing this other thing can't hurt." Yes, yes it can.

I think we can agree that, say, putting a chip in everyone's heads that transmits their thoughts to the government would indeed, as you put it, cause the sky to fall. I'm not trying to suggest that outlawing encryption is the exact same thing, but I do believe it would cause much more injustice than it would prevent.


There is a deep principle at stake for sure - individual privacy in thoughts and communications.


> This is basically what happens when law enforcement uses a search warrant to get access to user data from a tech company.

Which is another reason why consolidating everyone's data into a few centralized locations is also problematic.

> While this process does have weaknesses, it is still the difference between a legal process overseen by the courts and one based on espionage where agents do whatever they want without oversight.

But that's not what we're talking about here. The question isn't whether the police should need to get a warrant, it's whether the government should be able to prohibit technology that preserves privacy because some criminals might use it alongside all the law abiding citizens.

And encryption doesn't "prevent law enforcement from doing their jobs" -- that's just a trope. What it does is make their jobs more expensive. Even if they can't just get a copy of all your communications from a megacorp by filing some papers, they can still get a warrant and then plant bugs or guess your password or plant bugs that allow them to observe you entering your password etc. It's not impossible, it just takes more resources to do it -- which prevents it from happening at a massive scale.

That's a feature, not a bug. It still lets them solve murders, because murders are serious and uncommon and can justify the expense of a real investigation. It may make it inexpedient to spend those resources to catch every last hooker and pothead, but so what? Sometimes it's not worth the candle. If you think it really is, give them more money instead of giving everyone else less privacy. But sometimes it just isn't. Sometimes it costs more to solve a crime than to not solve it.

Meanwhile (this is the feature) it doesn't make it too easy for them to identify all the people in group X and give the list to Joe McCarthy or round them all up and put them in internment camps. Law enforcement should have some friction, because when it happens too fast at too large a scale, history shows this to be Bad.


> whether the government should be able to prohibit technology

There is no principle that says governments (as representatives of the people) can't regulate what technologies people can have. Sometimes we decide that yes, the government should do this. Consider how the FCC regulates electronic devices to prevent radio interference. Enforcement is lax and as a hobbyist you aren't likely to get caught, but devices aren't commonly found in retail stores that don't have FCC approval.

There are many practical issues, of course, including making sure there is a balance of power and that law enforcement powers aren't abused. I agree with that part.


Part of it is how we scope law enforcement's job. Thanks to technology there's tremendous growth in the amount of information about people. Traditionally law enforcement needs to find evidence that a person may have committed a crime, not necessarily prove a crime. Limited availability of data makes it less likely that law enforcement finds evidence on innocent people.

My take is that even with a warrant, law enforcement's reach into private data should be limited for a free society.


A computer is like an extension of your mind. For centuries, law enforcement has been unable to search your mind, by court order or otherwise. I see no reason to change that.


I've heard this argument a few times, and I don't think it passes legal muster. A computer is much more analogous to a fancy paper notebook than a part of your brain.


With how some people use smartphones, you really do start to wonder. Just try to get someone to leave their smartphone at home when they're going outside - they look at you like you're crazy.


I don't disagree with you, but I would be absolutely shocked if that matters one bit to a court.


And you trust the courts not to rubber stamp police requests?


For that matter, do you trust the courts not to misuse that access themselves? Or sell government keys for hundreds of millions plus free emigration for their entire family to wherever they want? Because I have a feeling the going rate would be on that order of magnitude.


Not entirely, but the judicial system's power to oversee law enforcement is an essential underpinning of the rule of law. Without that, laws don't have any power anymore and the executive does whatever it wants.

I guess if you're preparing for the apocalypse you don't count on that, though?


The judicial branch and law enforcement are two sides of the same coin. The executive branch (FBI, police, etc) statistically target the poor and minorities when choosing what laws to enforce and the judicial branch sentences them more harshly.

The Supreme Court just ruled that you can’t sue law enforcement for civil rights violations - including killing people unjustly.


> 3 or more people in geographically diverse areas

Sure, a system could be designed where the “master key that unlocks everything” is distributed - that makes the problem of the attacker who wants to get his hands on that key slightly harder, because now he has to compromise three systems instead of one, but that doesn’t change the fundamental risk, which is that he can do that in the first place. Remember, you’re talking about one piece (or three pieces) of information which can be used to decode every single secret in the United States - this isn’t limited by technical feasibility, this is the explicit end goal that you’re asking for. If Russian hackers got a hold of it undetected, they could decrypt everything for a very long time. Even if it were revealed that it were compromised, everything that was encrypted using the old key would have to be re-encrypted somehow, and the old copies destroyed somehow.


I would think the government itself would be exempt from the master key. They would most likely secure their own communications without having these master keys out of the control of the parties communicating. "Good for thee but not for me."


Maybe something like this?

Judge, Jury & Encryptioner: Exceptional Access with a Fixed Social Cost

https://arxiv.org/abs/1912.05620

> We present Judge, Jury and Encryptioner (JJE) an exceptional access scheme for unlocking devices that does not give unilateral power to any single authority and places final approval to unlock in the hands of peer devices. Our scheme, JJE, distributes maintenance of the protocol across a network of "custodians" such as courts, government agencies, civil rights watchdogs and academic institutions. Unlock requests, however, can only be approved by a randomly selected set of unlock delegates, consisting of other peer devices that must be physically located to gain access. This requires that law enforcement expend both human and monetary resources and pay a "fixed social cost" in order to find and request the participation of law abiding citizens in the unlock process.


Another one that the GP poster may find interesting. This is from Stefan Savage at UCSD, and published in ACM CCS in 2018.

https://cseweb.ucsd.edu/~savage/papers/lawful.pdf

Lawful Device Access without Mass Surveillance Risk: A Technical Design Discussion

> This paper proposes a systems-oriented design for supporting court-ordered data access to “locked” devices with system-encrypted storage, while explicitly resisting large-scale surveillance use. We describe a design that focuses entirely on passcode self-escrow (i.e., storing a copy of the user passcode into a write-only component on the device) and thus does not require any changes to underlying cryptographic algorithms. Further, by predicating any lawful access on extended-duration physical seizure, we foreclose mass-surveillance use cases while still supporting reasonable investigatory interests. Moreover, by couching per-device authorization protocols with the device manufacturer, this design avoids creating new trusted authorities or organizations while providing particularity (i.e., no “master keys” exist). Finally, by providing a concrete description of one such approach, we hope to encourage further technical consideration of the possibilities and limitations of trade-offs in this design space.
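
To make the self-escrow idea concrete, here is a rough sketch of my reading of the design (not the paper's actual code; it assumes the third-party Python `cryptography` package):

    # Sketch: the device encrypts the user passcode to the manufacturer's
    # public key and writes it into a write-only hardware component.
    # Unlocking later requires both physical possession of the device and
    # the manufacturer's cooperation, so there is no single "master key".
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    mfr_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    mfr_public = mfr_private.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # On the device, at passcode set/change time:
    escrow_blob = mfr_public.encrypt(b"user-passcode-1234", oaep)
    # escrow_blob goes into write-only storage; the device can never read it back.

    # Later, with the seized device in hand and a court order:
    assert mfr_private.decrypt(escrow_blob, oaep) == b"user-passcode-1234"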


Each of those people would have to do some sort of authentication. Each of those mechanisms is just as subject to accidental disclosure, hacking, etc as any other.

Make a door, it won't stay secret. And you can't assure that there won't be bad actors involved in any number of parties that have to agree.

The whole backdoor system, once it exists, is open to subversion and/or misuse. The only way to not have the problem is not create the door in the first place.


> Why would it not be possible to create a system that required several manual and offline steps

No matter what "offline" and "manual" steps you propose, they can be made online and automatic after a one-time compromise of the system. And if a single system is protecting the secrets of literally everyone, the incentive for compromising that system is unlimited. There's no attack too impractical when the stakes are so high.


Apart from what others said, there is also the following problem. People won't like this and will start encrypting traffic with non-compromised algorithms. Given that properly encrypted traffic appears random, how would you enforce the requirement that everyone uses the state-sanctioned, compromised algorithm? In order to check and enforce, you'd have to turn this into an online, warrantless, dragnet-style system, the very thing you were trying to avoid. A contradiction.

It's not feasible to ban people from using their own encryption unless you plan on severely restricting their freedom.


Law enforcement likes to talk about big, scary threats like terrorism, because those make the public more likely to accept their position.

The article below argues that the real use case for breaking encryption is to catch everyday criminals, not to go after shadowy Bond villains. Would the public still go for it, if they looked at it this way? Probably not...

The Encryption Debate Isn't About Stopping Terrorists, It's About Solving Crime https://www.lawfareblog.com/encryption-debate-isnt-about-sto...

Edit: And most everyday criminals are not technologically savvy. Half of them probably have a hard time using Telegram safely.


> It's not feasible to ban people from using their own encryption unless you plan on severely restricting their freedom.

In practice even the tech types grow weary of using and maintaining truly secure solutions. So if they outlaw backdoor-less solutions, then companies won't support them. And without commercial support the options will dwindle.

And even donation supported projects like Truecrypt will fold under pressure.

Crypto isn't a one-person job.


Once we all agree that there should be a key, the government then focuses on just getting the "ease" of obtaining the key. You gotta fight them at every step with an irrational zeal to prevent this slippery slope.

Example: None of us cares about midget porn. If we all agree that banning midget porn will not have an impact on anyone, the government then quickly moves on to octopus porn. Now, on a matter of principle, you will find your position weaker and weaker.


You can safeguard the process the government needs to go through to get approval to decrypt things all you want. That doesn't stop bad actors separate from the government from finding and taking advantages of the vulnerabilities introduced by enabling the government to forcibly decrypt things.

Can't fix technology problems with people processes.


You act as if the government isn’t the “bad actor”.


The point I'm trying to make is that even if we did trust the government and had full faith that they'd never abuse this power, whatever technical mechanism they'd use to 'rightfully' decrypt things would surely be targeted for misuse by people who are explicitly and exclusively bad actors.



Does anything prevent the "offline" bits of your system from working online? Does anything prevent the information there from being compromised? Not "we keep it super duuper safe, we promise", but actual cryptographic guarantees?

I think the history of crypto exchange hacks should be of interest here.


No, it's not possible to create such a system without fatally compromising safety.

Take Bitcoin cold storage: People use offline storage and specialized hardware wallets to store their keys more safely. Now imagine there were some privileged keys that would allow the holder to just take anyone and everyone's bitcoins. It doesn't matter how geographically distributed it is, how elaborate the key ceremonies and oversight is. The system could never be trusted if such a backdoor existed.


> Why would it not be possible to create a system that required several manual and offline steps in order to break the encryption?

When done correctly and with everyone's knowledge and consent, this is a form of key escrow. But there's some simple math at play here - the more unlock options there are, the more points of vulnerability and failure. https://haveibeenpwned.com/ lists 8 breaches of my data and I'm aware of more they don't list - the industry can't even keep my data under lock and key as it is, let alone with key escrow added as one more thing they can fuck up.

> For example (and perhaps similar to offline cold storage of bitcoin) why couldn't a system be designed whereby 3 or more people in geographically diverse areas were in a position to agree that a request for information was legitimate (by court order) and thereby produce what is needed to unlock certain information? So one person would not have the key or access.

Bitcoin is a great example, in a way - there are several stories out there of exchanges fucking up and losing access to their coins or having their coins stolen, even when they were using cold storage. All 3 people will likely be duped by the same fraudulent (or just overreaching) LEO email, in part because of the all too human tendency for each of them to assume that "one of the other two would have caught it if it was fishy, right?"

> After all right now you have a case where a single person (the owner) is able to unlock information. The feeling is a back door can be hacked. What if it's not a back door though?

The label you use doesn't matter - the more people who can unlock things, the more points of failure you have, and the more likely things will end up hacked. The industry struggles enough to secure things even when it takes a hard line approach and espouses end-to-end encryption. Defending against abuse by insiders/employees usually means going through a bunch of trouble to lock your own employees out of customer data - or making it unavailable to the company in the first place - not giving them more tools to access it.

Governments already have more than enough tools to completely pwn my privacy if they actually need to. Convince a judge to sign a sneak and peek warrant, have law enforcement covertly install a hardware keylogger, and I'd wager you'll pwn most people - even if they're tech savvy. But that requires pesky things like individualized suspicion, "checks and balances", actual work, and involving multiple people (your landlord or a locksmith for the keys or entry access, your security company to silence the alarms, a judge to sign off on the warrant and a LEO to execute it), so it's harder to run a long-term secret, warrantless, suspicionless mass surveillance dragnet as a rogue employee or agency. But not impossible, as certain three letter agencies have shown.


Yeap. Lazy cops' leadership would rather trade away everyone's rights to make their jobs a little bit easier. TBH, we already exist in an inverted totalitarian regime that risks becoming worse than an Orwellian one like China's if the DOJ gets its way. I do not consent.


> Law enforcement wants to be able to break security, and promises that their super secret, super safe system will provide everyone else protection from evildoers while letting law enforcement find the bad guys

Wasn't this how Google, Adobe and several other tech companies had a major security breach about 5-7 years ago? They provided the DOJ backdoor access.


> They provided the DOJ backdoor access.

If you're thinking of PRISM, no, at least not in the voluntary, intentional sense of the word "provided". Many of the major tech companies had non-public backbone fiber, and links across that fiber were unencrypted. The NSA tapped this dark fiber to read unencrypted traffic. This famously hit Google, which subsequently moved to encrypt all internal traffic, even traffic that would never leave its networks.


You are mistaken. PRISM is specifically a program that "collects stored internet communications based on demands made to internet companies" [em. mine]. NSA wiretapping the non-public links of Google et al. was not PRISM (I'm not even sure that the name of that program was ever disclosed).


Not as mistaken as it appears.

The NSA placed key people into companies like Google who had security clearances that specifically forbade them from saying exactly what they were doing to higher ups. They then created systems for extracting data in an automated way based on requests from the NSA. All that executives knew was that they were doing something important for complying with law enforcement requests.

The CEOs of these companies learned about the existence of the back doors from public reporting based on Snowden's revelations. When they first heard, they issued public denials that were, as far as they knew, truthful. Their subsequent actions upon finding out that they were wrong strongly suggest that they wouldn't have approved the programs had they known what was happening.


[citation needed]

I see this claim is made in the Wikipedia article [https://en.wikipedia.org/wiki/PRISM_(surveillance_program)]: "PRISM collects stored internet communications based on demands made to internet companies such as Google LLC under Section 702 of the FISA Amendments Act of 2008 to turn over any data that match court-approved search terms.[6]" but the cited source [https://www.washingtonpost.com/world/national-security/nsa-i...] does not support this assertion; on the contrary it explains "[NSA] has secretly broken into [...] Yahoo and Google". It certainly doesn't make the stronger claim you seem to suggest here that PRISM is limited to FISA requests.


Wouldn’t tapping dark fibre be a bit useless? Or were the links tapped while dark waiting for the target to start using them?


(Context: the original comment mentioned dark fiber, but I changed it to be more precise to avoid exactly this confusion.) Dark fiber is a bit of a misnomer and term of art, IIUC. The way I've had it explained to me, dark fiber is not necessarily dark. It's just not on the public internet. When you lease dark fiber, you presumably do so to put your own signal on it, making it no longer dark. But folks still call it "leasing dark fiber" and still refer to the fiber, once lit, as "dark fiber." I'm not a networking expert so someone please correct me if I'm wrong.


You’ve hit the nail on the head. It’s a “technically wrong” term that conveys enough meaning: unlit fibre you buy and light up yourself (usually).


It's an election year, might as well try to link all this to Trump! (Although, as you've noted, this has been happening for quite a while across all parties.)


He didn't start it, but he isn't stopping it either. In fact, he's making it worse.


> Why does this say that the DOJ has been pushing for this since 2016?

Because that's when Trump came into power and nothing bad ever happened under the Obama admin.


The current administration came into power in January 2017.


The author gives that date because, as is mentioned in the article, that is when Apple famously declined to help the FBI backdoor its own system, which happened before Trump was elected.


There seems to be some kind of intentional ignorance of just how many programs people blame Trump for have been there since Obama or even earlier.


A President (and the priorities of his administration) can make bad programs much worse.

Take marijuana enforcement as an inverse example. The federal law around marijuana hasn't changed, and it's as illegal in 2020 to use, grow, and sell as it was in 2004. But the Obama administration made clear to the DEA and FBI that pursuit and enforcement of that law in states that had passed their own laws to legalize it would be a career-limiting move, which gave the states breathing room to experiment (and allowed hard evidence to gather that allowing medicinal and recreational use wouldn't cause societal collapse or, really, almost any negative outcomes).

This aspect of the intersection of law and executive enforcement is why the Trump administration is correctly condemned for worsening the situation regarding immigration in the US. Most of the laws the administration uses have been on the books since before 2016. This administration chooses the harshest and cruelest ways to interpret and execute on those laws.


While Obama might have been nice to some states in some contexts that passed some laws, he was as aggressive on weed as the previous administration, if not more so.

https://www.rollingstone.com/politics/politics-news/obamas-w...

https://swampland.time.com/2012/05/03/what-is-president-obam...

This was a big deal back during his term which I feel like people are already forgetting.


That didn't really change enforcement elsewhere and choosing Biden for his VP, who is basically the legal architect of the entire War on Drugs, sent a huge signal.

The Obama administration did exactly nothing for prison reform, which coincidentally is probably going to be a big part of Trump's legacy (he's done more on this than any other President), and nothing for situations where marijuana possession suddenly makes other things criminal (like owning a firearm _and_ marijuana).

For as poor as Trump's record with women is, it's pretty shameful for everyone else that he's the president that made sure tampons are widely available in women's prisons.


Yep. These are examples of the effects that a President can have on policy I was referring to.


> A President (and the priorities of his administration) can make bad programs much worse.

Ok? So that should counteract the behavior.

> This aspect of the intersection of law and executive enforcement is why the Trump administration is correctly condemned for worsening the situation regarding immigration in the US.

There is an element of "outrage" culture that seeks to ascribe new fault for the long-established pursuit of a "federally safe" cryptography to this administration, as a proxy for other wrongs. That is not relevant to the issue of original fault.


A small anecdote.

A few years ago in an undergrad business class, we were having some discussion and the topic of encryption came up during one of my presentations. A student asked a question related to the ethics of encryption (I don't recall exactly what), and I was clearly confused by the question.

To clear up confusion, the professor asked those who thought encryption was "bad" to raise their hand, and at least 60% of the class raised their hands.

It was pretty jarring to me, and it makes me pessimistic about the outcome of a DOJ campaign to demonize and regulate encryption.


I think we in the tech community tend to vastly under-estimate the threat of legal restrictions on encryption. When the public gets scared, they look to governments to "do something", whether that something is really a smart thing or not.

If we're unlucky and we get caught unprepared, we run the risk of getting stuck with a backdoor or "exceptional access" mechanism that provides little or no technical safeguards against massive and nearly unlimited government overreach.

IMO this makes it our responsibility to figure out how we might design such a system that does have strong protections against misuse. Of course that is a very difficult thing to do. Doesn't mean we shouldn't try.


I think you'll run into a chicken-and-egg problem here. Without bona fide strong encryption, those strong protections against misuse probably can't exist.


> Without bona fide strong encryption, those strong protections against misuse probably can't exist.

Agreed. I actually did some work on this topic that requires having things like strong ciphers and hash functions, and protocols that provide perfect forward secrecy.

Our approach was to take strong constructions and turn them into a kind of proof-of-work. So you can recover the key, but to do so you have to expend a huge amount of electricity (ie, money).

The idea was partly just to call the bluff of the government types who claim that "exceptional access" is so critical to our safety. If they're not willing to spend the money, then we should not be willing to compromise our security.

NOTE: I'm not suggesting that we should actually deploy this kind of thing, as long as we have a choice in the matter. Just that we should be prepared in case we need it.

Anyway, the paper is here:

C.V. Wright and M. Varia. Crypto Crumple Zones: Enabling Limited Access without Mass Surveillance. In IEEE EuroS&P, 2018. https://web.cecs.pdx.edu/~cvwright/papers/crumplezones.pdf
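
The gist, as a toy sketch (my own simplification, not the paper's actual construction, which also adds a second per-deployment "abrasion" puzzle): derive each key from a secret with deliberately limited entropy, so anyone can recover it, but only by paying for roughly 2^WORK_BITS hash computations.

    # Toy "crumpled" key: recoverable, but only by exhaustive search.
    import os, hashlib

    WORK_BITS = 20  # tiny for demo; real costs would be set in the millions of dollars

    def crumple(salt):
        secret = int.from_bytes(os.urandom(4), "big") % (1 << WORK_BITS)
        return hashlib.sha256(salt + secret.to_bytes(4, "big")).digest()

    def recover(salt, key_check):
        # The "payment" is CPU time, i.e. electricity.
        for guess in range(1 << WORK_BITS):
            key = hashlib.sha256(salt + guess.to_bytes(4, "big")).digest()
            if hashlib.sha256(key).digest() == key_check:
                return key
        raise ValueError("not found")

    salt = os.urandom(16)
    key = crumple(salt)
    assert recover(salt, hashlib.sha256(key).digest()) == key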


It sounds like a majority of the students had no idea what encryption was, and because the authority figure (the professor) asked them whether or not it was bad, they just went with it? I'm having trouble understanding why people would say mathematical functions are bad.


> why people would say mathematical functions are bad.

People don't think 'encryption' is a mathematical function. They think it's like a radar detector in your car. "Why would you have it unless you intend on speeding?" [goes their thought process]

"Oh I use this frequently and things I do every day would be totally broken without it? Really?!" [is the realization we hope dialogue could bring]


Then again, given speeding is so widely accepted, radar detectors are legal to use in 49 out of 50 states. It would be nice if encryption was viewed the same way.


No, we absolutely don't want strong encryption viewed as "illegal but accepted", we want it viewed as "absolute necessity for day-to-day life and personal safety, unacceptable to ever weaken".


I think it comes down to popular vote versus government policy. While things like speeding and marijuana are illegal, popular votes have made these things legal (or quasi legal) in their respective jurisdictions. For example, votes to remove speed cameras or make them illegal, or votes to decriminalize marijuana.

But you are correct in that we really should make it completely legal so that the government doesn't have an excuse to target individuals for engaging in a practice that is generally widely accepted.


Isn't (relatively) strong encryption legal in every one of your 50 states and deployed to more than 10^10 devices globally?


> I'm having trouble understanding why people would say mathematical functions are bad.

Encryption is more than just mathematical functions, it's the application of mathematical functions.

And the application is what matters, because it would be easy to say a nuclear weapon is just physics or that a bio-weapon is just chemistry.

Clearly, how you use basic principles matters.


It isn't like nuclear weapons, though, because every single computer has everything you need to do full encryption. Nuclear weapons require materials, manufacturing, etc.

Nuclear weapons require both the knowledge and the physical world parts.... encryption just requires the knowledge, which is impossible to put back in the box.


Every house has the basic materials for a bomb, and many houses have the basic materials for a big bomb (especially if you use fertilizer). The difference is understanding how to use it.

Every computer is capable of building software that can bring society to its knees, but most people don't know how to do it. Software is just algorithms, after all.

The problem is that law enforcement pushes the bad uses of something without weighing them against the good. If fertilizer can be used to make bombs, should we ban fertilizer? No, that's throwing the baby out with the bathwater. Unfortunately, many voters don't fully understand what the baby is WRT encryption; they just see child porn, human trafficking, etc.


Encryption allows data to be locked away from the government including law enforcement and prosecutors in a way that was nearly impossible for the average citizen a few decades ago. Warrants can't break encryption like they could doors or locks. As much of life moves to the digital world and becomes encrypted, that can be a drastic change in how the justice system works. Pro-encryption people need to keep this in mind when talking to anti-encryption folks. Criminals being able to hide evidence is a serious concern for the average person.


> Criminals being able to hide evidence is a serious concern for the average person.

Suspects. Alleged criminals. Until convicted, they're not criminals. The government does not have an unlimited right to collect all information; they have a limited, judicially controlled right to try to collect information.

One key detail: strong encryption does not prevent investigations from targeted collection of information, such as through physical surveillance, bugs, etc. There are many tools available to law enforcement, those tools are just less convenient. Framing it that way helps: in order to make investigations more convenient, the DOJ wants to prevent anyone from using secure communications.


>Suspects. Alleged criminals. Until convicted, they're not criminals.

And even after they are convicted, they and their associates are still protected by encryption. Encryption protects everyone equally. That is viewed as a positive by pro-encryption people and a negative by anti-encryption people.

>The government does not have an unlimited right to collect all information; they have a limited, judicially controlled right to try to collect information.

This is the point of warrants. The problem of massive warrantless collection of data by the government is a bigger and separate issue. I personally think encryption is a valuable tool to address the symptom of that problem, but it does nothing to actually fix the problem of a corrupt and/or authoritarian government. One of my personal annoyances with the tech community is that we have a tendency to try to work around any governmental problems as if that solves the issue, while making no attempt at fixing the root cause.


> One of my personal annoyances with the tech community is that we have a tendency to try to work around any governmental problems as if that solves the issue while making no attempt at fixing the root cause.

Agreed. We need to produce technical solutions, and effect policy solutions, and get a lot better at PR and rhetoric.


The only way your argument makes any sense is if you can somehow show that conviction rates have gone down since encryption became widespread. I think you'll have a hard time doing that, because people will usually find some other way to hide information, like just destroying evidence. Are you gonna go after shredders next? They're hiding criminals' secrets too!


> Encryption allows data to be locked away from the government including law enforcement and prosecutors in a way that was nearly impossible for the average citizen a few decades ago.

I know some people do think this, but the opposite is actually true, because of the popular shift from voice to computerized text and images for almost all communications.

A few decades ago, criminals spoke on the phone to coordinate actions. If the police were aware of this in advance, they could tap the phone line and listen in to that conversation. However if they found out about it after the fact, there was no record of the conversation for law enforcement to go get. The sands of time hide information extremely well.

Today, we use mobile devices, and prefer text and image-based communications--email, SMS, WhatsApp, iMessage, Facebook, Twitter, etc. etc. Unlike voice conversations of the past, these default to creating a permanent record of the conversation, which police can go get after the fact. Encryption is an attempt to re-establish the privacy and security of these conversations that existed back when they would have been a voice call over analog telephony.

The idea of "going dark"--promoted by law enforcement to fight encryption--is wrong. What has actually happened is that INCREDIBLE BRIGHTNESS of information has developed over the past few decades as all our computerized communications now default to living forever.

Encryption is a tool for trying to re-establish the level of "light" (information availability) that our culture and laws came to expect over hundreds of years.


I always have trouble imagining examples of this kind of evidence that criminals were previously utterly unable to hide but can now hide easily using encryption. It seems any kind of evidence could have just been shredded and burned in times past, or hidden in an obscure physical safe that no one knows about. If anything, the digital age has made the location of evidence more obvious and has even produced entirely new kinds of evidence (traffic and connection logs, emails, etc).

It seems to me that law enforcement is simply pining for the fjords of a very short time span in human history when ample digital information co-existed with the lack of widely available cryptographic systems.


Warrants might not be able to break encryption, but refusing to provide the passcode can land you in jail indefinitely for contempt. You could potentially serve decades.

The fear from law enforcement is absurd. They've never had the ability to read every single letter that was sent, but now that we've moved online they want to read every email? They claim they are losing access, but the truth is they never had that access in the first place. The only difference is that before, maybe they could secretly read your letters before delivering them, whereas now they need to get your password. That only worked at a small scale anyway, and I'm sure the NSA can crack passwords at those same small scales (e.g. some of the high-profile iPhone cracking we've seen).


> in a way that was nearly impossible for the average citizen a few decades ago

Ehh, maybe if you are talking about speed and scale. The average citizen could always learn how to use a one-time pad.


Speed, scale, and by default with no initial sharing of keys are super important distinctions.


Why can't the government serve warrants to give up their private keys?

I don't really see the State's argument against encryption except in the case that they want to break encrypted information without due process.


Well, 5th amendment protections, and inability to determine whether a person even remembers the key, means that serving a warrant doesn't always work either.

But then, neither did breaking down the door and ransacking the obvious physical target pre-internet days always work. And everyone understood a suspect had no obligation to tell the feds where he hid the goods.


> I'm having trouble understanding why people would say mathematical functions are bad.

That's such a naïve way of putting it. People don't care about the mathematics behind it. They think it's "bad" because they think it enables terrorism and the spread of child sexual abuse imagery or whatever politicians have led them to believe.


Let's be clear - encryption can absolutely enable those things.

That is not all it does, but pretending it does not is stupid.


The internet enables all of those things too. Before that, telephones and fax machines did.

Hospitals enable the spread of disease by clustering all the sick together with a constant flow of healthy people (visitors, doctors, etc). They also enable the development of superbugs.

What's your point? You're confusing incidental usage with enablement. The lack of these good things wouldn't prevent the occurrence of the evils listed, and making the conversation one-sided by focusing on how much something exacerbates the evil, rather than on the good it does and the reason it exists, is incredibly disingenuous (which is clearly the intent of most of the people on a soapbox about these evils).


I'm having trouble understanding why people would say mathematical functions are bad.

Because the vast majority of people don't know that encryption is mathematical formulas. They only know what they see from Hollywood, which is that it's a thing that bad guys use.


I hope those people access their bank over port 80.


They do, on their open wifi network.


Should have followed up with: Do you think having a lock on your front door is bad?


I think locks are a bad analogy. Going online without encryption is like living in a house without walls or windowshades. It's less about physical access and more about people being able to see what you're doing. Even if you're not doing anything "wrong", the lack of privacy still feels uncomfortable.


Eh. Locks can be picked and doors can be rammed. Layered encryption is really hard to break and one time pads are impossible.


So locks are only good because they don't actually work? And encryption is only good if it doesn't work too?

That logic kind of makes sense. However, locks at least delay intruders and make it harder for someone to go undetected if they break into your house. The intruder is either at your door for a little while, makes noise trying to break in, etc.

If the police bust into my house there will probably be witnesses and I will know about it. If they steal my data, how is anyone going to know? This gives them way too much power.


They already have a great deal of power. Technological progress gives us all more power.


So in an ethics discussion the professor simply asked "is X bad?" That doesn't seem like a very enlightened ethics discussion.

How certain are you he said "bad"? If he had instead asked "is encryption problematic?", the outcome could be interpreted quite differently, because there are problems with encryption. Especially in an ethics discussion, there are definitely pros and cons to encryption. (FWIW, the pros outweigh the cons by far, IMHO.)

I'm not trying to say your memory is wrong, I'm just asking how certain you are about what was asked exactly. Changing one synonym for another can make a big difference.


Asking if it's 'bad' is an excellent starting point for an ethics discussion. It reveals pre-existing biases and emotional responses to a topic, which must be taken into account as a pre-requisite to any rational discussion. You cannot get dispassionate without first examining the passions.


Asking about a nuanced subject in broad and blunt terms can be pedagogically useful because you hear what people really think, unfiltered ... and maybe they hear it too, priming them to realize that they've been underthinking the topic. From there you can get into nuance, which is easier when you know where you currently stand.

I have no idea what tack this prof took with the question, but I think you've drawn a hasty conclusion here.


I didn't intend to conclude anything. I was just asking if he remembers exactly what the professor asked, and saying that the exact phrasing of the question changes my interpretation of the results. I do not expect him to remember, nor do I think there is anything wrong with his comment, but I wanted to ask.


Per GP this was in a business class likely taught by a business professor, not an ethicist or someone similarly specializing in such discussions. It's not surprising that they'd pose the question to the group in such a blunt fashion. As a CS major, I had to take an engineering and a CS ethics course and it was similarly problematic. The courses were taught by an engineering or CS professor (respectively) and ethics discussions were not their specialty.


>That doesn't seem like a very enlightened ethics discussion.

That is an apt summary of my college ethics discussions. Even in cases where the professor was doing their best to encourage discussion, the majority of the class couldn't think beyond first order effects and hypotheticals would quickly get emotional and devolve into character attacks.


Yeah. I was shocked at how many -CS MAJORS- were pro-DRM.


What does being a CS major have to do with being pro-DRM? Isn't it mainly an economic/philosophical standpoint on copyright?


I believe it's something more to do with the OP's (mistaken) idea that CS majors are somehow more likely to be "hackers" or people who exchange pirated software or wares -- when really there's only a handful of people who know how to crack the DRM in the first place (which requires some degree of technical know-how) and then those people distribute the pirated goods to others (whose technical literacy may be far less, even if it's still enough to distribute the pirated material on down the line).

It's a bit like thinking all painters are in favor of graffiti, or all musicians are okay with "sampling" and remix culture.


Actually, it comes from the author's belief that CS majors should know: A. that DRM at best delays cracking and never prevents it; B. that it ends up hurting legitimate users, not pirates; and C. that the more effective it is at A, the more likely it is to cause B. That was already the case when I was in college. I asked some of the students to give me an instance where a piece of desirable software was not available illegally within a week of its release date. They were unable to provide one. (In fairness, there have since been a few cases of less desirable releases that went longer; nothing that had the attention of all the major cracking groups.)

I also asked if they were aware of some of the issues at the time (such as SecuROM's huge problems, the installation of rootkits by some schemes, the inability to keep using software when moving computers, etc). They were not, despite having explicitly chosen the topic to do research on for the class. They went into the topic with the expectation that DRM is simply a good thing, and even when researching it, they only sought out sources that reinforced that bias. That shocked me, given that they should be technically literate enough to know the issues and the greater nuances involved.


An interesting follow-up question would have been how many people think privacy is bad. That leads to a discussion of encryption as a form of privacy boundary...


What was the class about? What sort of majors were the attendees?


Maybe the professor should have asked whether drugs are bad. (and yet this is still a poor analogy)


Similar but different anecdote. I installed i2p the other day to see what had changed/improved over the last year, and mentioned to my 12yo son that I had done this. He asked what i2p was, and I said it was the dark web. He immediately thought he knew what this is, and asked "aren't you worried about getting in trouble because it's illegal?"


> the “techlash” by Congress and the public “in the wake of myriad privacy scandals” and the 2016 election

This just makes my head explode. Because tech companies tend to be poor at privacy, let's use that logic to make it so the government can invade your privacy anytime they want?


This is a world where (at least it seems to me) the same people are against net neutrality legislation but want the government to regulate Facebook. By the time you get into the nuances of why the tech companies support encryption in some cases, you've lost the PR game.


It's basic enemy-of-my-enemy logic. It's not always correct, but it's correct often enough to remain a powerful heuristic. It would be interesting to know if anybody has thought more deeply about it, and whether there are specific strategies of persuasion that work better in such situations. I'd intuitively think it might be beneficial to acknowledge the cognitive dissonance people might be experiencing when arguing this matter.


Yeah, "techlash" is a weird term, but it's as good as any I've heard for the phenomenon. I have seen plenty of these amorphous attacks on Facebook in particular here on HN, where you get some amount of "too much sharing", some amount of "it's just like drugs, man", some amount of "too much privacy", and so forth, all cobbled together into some sort of complaint.

A lot of smart people seemed to get on board that train without thinking much.


Maybe the pitch to voters is: "Your privacy is already toast, your information is being used for ads, why not let us use it for counterterrorism?"


Questions to the public should be phrased: "Do you want Chinese-style surveillance to be advanced in the United States?"


+1. This is something tech bros don't seem to get, but politicians understand very well: the majority is driven by emotions and has limited cognitive capacity, but they vote, and thus arguments to win their votes must be trivial, emotionally charged ideas. A politician says "encryption is a tool of criminals!", and those who start arguing on the rational plane have already lost; instead, the answer should be "lack of encryption enables Chinese-style totalitarian communism!" No need to explain the details; just push their "scary communism" button and let the public contemplate the "crime vs communism" question.


Agreed. I wish I did not have to agree, but framing has proven to be very important.


People do want that, though they might not admit to it. I think this would backfire. In general Western democracy is inconvenient for most people today.


Isn't the techlash for the complete opposite reasons? The fact that too many people have too much of our data? Why would people (outside of effective propaganda, which would be a factor even without the techlash) support something that makes their problems worse?


Exactly. But for the people that want that data, for the people who will pay for it, FUD is a great political tool, and this issue is too subtle for most people to get. I've tried to explain to my family how handing over logs of our private behavior to a few companies is creating economic/political inequality that may forever destroy the norms of fairness & competitiveness that we enjoy, but they just don't get it...

For example, a friend worked at a proprietary hedge fund of a major US bank that was looking into the private accounts of the bank's customers to drive trading decisions. Other investors can't compete against that; stock market investment is no longer fair. And you've broken a fundamental component of the economy: a fair & transparent investment system.


In fact, user data usage and sharing is a multi-faceted issue. The most obvious stances include (but are not limited to):

  * Pro-privacy advocates claim that big tech companies share their user data with too many irrelevant entities.
  * Pro-competition advocates claim that big tech companies monopolize the use of their user data.
  * Pro-regulation advocates claim that big tech companies refuse to share their user data with accountable government entities.

Each of those arguments has some valid points, but they also conflict with one another to some degree.


It's almost like people think their personal data is zero-sum - if we give it to the government then companies won't have it! Just like if you download a car, someone else loses that car.

Makes me think of the Israeli voter rolls hack[1].

[1] https://www.cnn.com/2020/02/11/tech/israel-voters-data-expos...


Huge portions of the population (Congress included) are technically illiterate. They don't know any better.


So terrorists will use one-time pads and other strong encryption and everyone else will have their information exposed on a massive scale when the backdoors inevitably are exploited.


I look forward to the day when one time pads are the norm for general encryption.

A scifi novel I read, "A Deepness in the Sky", described how the pads themselves were a valuable item of trade. I don't think it's farfetched to imagine purchasing OTP data to use with internet browsing, the way we buy yubikeys to use with passwords. It would be a far simpler encryption scheme than those we currently use, and that simplicity would make it easier to catch bugs like Heartbleed in the future.

How hard would it be to extend the SSL/TLS protocol to allow the use of OTPs, where available?


One-time pads are perfect, except for the rather crucial problem of key exchange. You have to meet your intended recipient IRL at some point to exchange pads, and I don't fancy meeting up with every sysadmin whose server I want to access some HTML files from.


That's the point of my second paragraph. The pads would be an item of trade. You buy an 'OTP yubikey' with 2GB of otp data for each of the top 50 websites, plug it in, and you're good for a month. The website spends 10k on a petabyte of otp data.
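
A toy sketch of the bookkeeping such a scheme would need (all names here are invented for illustration); the one hard rule is that no pad byte may ever be handed out twice:

  class PadStore:
      """Hands out fresh one-time-pad bytes; any reuse destroys all security."""

      def __init__(self, pad: bytes):
          self.pad = pad
          self.offset = 0  # both endpoints must keep this in sync

      def take(self, n: int) -> bytes:
          if self.offset + n > len(self.pad):
              raise RuntimeError("pad exhausted -- buy more")
          chunk = self.pad[self.offset:self.offset + n]
          self.offset += n
          return chunk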


So you can only securely access the top 50 websites, and need to go in person to buy the key.

That sounds like a beyond ridiculous system. We can do better.


You'd have to have an OTP of the same length as the data you want to send/receive.

So your mobile phone, for example, would need local storage the same size as all the data you ever expect it to send/receive.


Is that farfetched? It's trivial if you have a 5GB / month plan.


OTPs are impractical to use for average people, while their use is entirely tractable (and widespread) for covert operations.


Everything technological is impractical to use for average people until people like us go out and build a practical solution.

OTP is not a very complicated scheme. All you need is a good source of entropy, a place to store a big fat array of it all, some XOR operations, and a safe way to hand the codebook to your trusted parties (e.g. phone-to-phone transfer options).
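
A minimal sketch of the core operation, in stdlib Python, assuming the pad is truly random, exchanged securely, and never reused:

  import secrets

  def otp_xor(data: bytes, pad: bytes) -> bytes:
      # XOR is its own inverse, so the same function encrypts and decrypts.
      assert len(pad) >= len(data), "pad must be at least as long as the message"
      return bytes(d ^ p for d, p in zip(data, pad))

  pad = secrets.token_bytes(32)                 # shared one-time pad
  ct = otp_xor(b"attack at dawn", pad)          # encrypt
  assert otp_xor(ct, pad) == b"attack at dawn"  # decrypt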


You don't even need XOR, just modular addition (XOR is basically addition modulo 2). The Wikipedia article has an example doing it by hand with letters (using addition modulo 26).
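
Roughly, in code (this should match the well-known HELLO/XMCKL example, if I've transcribed it correctly; uppercase A-Z only):

  def otp_letters(msg: str, pad: str, decrypt: bool = False) -> str:
      # Add (or, when decrypting, subtract) the pad letter from each
      # message letter, mod 26.
      sign = -1 if decrypt else 1
      return "".join(
          chr((ord(m) - 65 + sign * (ord(p) - 65)) % 26 + 65)
          for m, p in zip(msg, pad)
      )

  ct = otp_letters("HELLO", "XMCKL")  # -> "EQNVZ"
  assert otp_letters(ct, "XMCKL", decrypt=True) == "HELLO"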


One time pads are not secure by modern cryptographic standards.

Elaboration: https://news.ycombinator.com/item?id=6008695


Huh?

First off, citing yourself isn't an elaboration. Second, if you are arguing that one-time pads don't work / won't work in modern times, that goes against our entire understanding of certain bits of cryptography.

You'll need to cite some real sources first.


I referenced my own comment because I didn't feel like rephrasing what I had previously written, nor just spamming a copy of it. A logical argument doesn't rely on a "source":

> A one-time pad XORed with a message doesn't provide integrity, and therefore can't reliably secure an arbitrary protocol

The point is that sure, one-time pads "work" the way they're described. But their properties don't actually fulfill what we require from modern cryptosystems. Note the sibling comments talking about augmenting them with half-baked authentication schemes, which are all in the realm of computational-complexity cryptography, and no longer "mathematically unbreakable!!1!"
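
A toy illustration of that integrity gap, assuming an attacker who knows or guesses the plaintext: the pad hides the message perfectly, yet the ciphertext can be rewritten bit for bit without detection:

  import secrets

  def xor(a: bytes, b: bytes) -> bytes:
      return bytes(x ^ y for x, y in zip(a, b))

  pad = secrets.token_bytes(16)
  ct = xor(b"PAY $0000100.00", pad)  # sender encrypts

  # The attacker never sees the pad, but XORs in (old plaintext ^ new plaintext):
  forged = xor(ct, xor(b"PAY $0000100.00", b"PAY $9999999.99"))

  assert xor(forged, pad) == b"PAY $9999999.99"  # recipient decrypts the forgery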


If integrity is an issue, you can just add a Poly1305/GHASH tag to the message. They are not encryption algorithms, so it is unlikely that they would be banned, and, used with one-time keys just like the OTP, they are provably secure. In addition, they are not difficult to implement (or even execute by hand).
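
For what it's worth, the core of Poly1305 really is small. A compact, unoptimized Python sketch of the tag computation as described in RFC 8439 (purely illustrative: not constant-time, and each 32-byte key must be used only once, just like the pad):

  def poly1305_tag(key: bytes, msg: bytes) -> bytes:
      # key = 16-byte r (clamped) || 16-byte s; arithmetic is mod 2^130 - 5
      r = int.from_bytes(key[:16], "little") & 0x0ffffffc0ffffffc0ffffffc0fffffff
      s = int.from_bytes(key[16:32], "little")
      p = (1 << 130) - 5
      acc = 0
      for i in range(0, len(msg), 16):
          block = msg[i:i + 16] + b"\x01"  # append the high-order marker byte
          acc = (acc + int.from_bytes(block, "little")) * r % p
      return ((acc + s) % (1 << 128)).to_bytes(16, "little")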


No encryption algorithms would be "banned" by the proposed law. Rather, corporate service providers would be compelled to act as bona fide MITMs, regardless of what primitive(s) they use.

Once users are willing to do something beyond downloading an app from a corporate app store, all secure encryption would be back on the table.

Of course we can foresee a true ban on encryption some years out, but at that point XOR has the exact same signature as AES. Steganography is the corresponding approach for that attack.


So the concern then is that the message might be tampered with on the wire?

Is there some hypothetical reason we can't just append the SHA256 of the message to the message before encrypting it? It should be impossible for an attacker to alter any message bits undetected with this scheme.


Your scheme fails for replay attacks, and allows modification of messages with low entropy. I'd say easy fixes are to add a counter and a random nonce to each message, but then there is probably something else I am missing. In general there's no "just" in cryptography, which is why cryptosystems are formally defined and then analyzed whole.
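
To make the known-plaintext/low-entropy failure concrete: under an OTP, the encrypted hash is exactly as malleable as the message, so an attacker who knows the plaintext can swap in a new message and its matching hash. A hypothetical sketch of the "SHA256 appended, then OTP" scheme proposed above:

  import hashlib, secrets

  def xor(a: bytes, b: bytes) -> bytes:
      return bytes(x ^ y for x, y in zip(a, b))

  def seal(msg: bytes, pad: bytes) -> bytes:
      return xor(msg + hashlib.sha256(msg).digest(), pad)  # msg || SHA256(msg), then OTP

  old, new = b"send 10 coins", b"send 99 coins"
  pad = secrets.token_bytes(len(old) + 32)
  ct = seal(old, pad)

  # Known-plaintext attacker replaces both the message and its hash:
  forged = xor(ct, xor(old + hashlib.sha256(old).digest(),
                       new + hashlib.sha256(new).digest()))

  plain = xor(forged, pad)
  assert plain[:-32] == new
  assert hashlib.sha256(plain[:-32]).digest() == plain[-32:]  # "integrity" check passes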


OTPs are not a replacement for our current encryption systems, but I love them as an example of how encryption isn't going to go away.

A children's book would be a great demonstration of this, and should be protected speech (let's hope). Teach kids how to use OTPs in an illustrated book. Maybe the story ends with 5-year-old Jimmy getting hauled away by the feds for doing some illegal math.


I was talking to a layperson about encryption and privacy, and interestingly they were very much against both. They compared encryption to wearing a mask in public and said that if people don't want to be noticed (w.r.t. privacy and encryption) they shouldn't be "participating" (it was unclear what they meant by this).

Just goes to show you how the average person thinks about these things. I have to admit I wouldn't like it if people wore masks in public in a way that meant I couldn't recall their faces, but I don't think that's necessarily the same as encryption, though I guess I see the comparison.


Ask them if they're OK with being told to put TSA locks on their house and car, to prevent crime. Anyone can buy a key online but at least we're all safer. /s


Encryption isn't like using a mask: we can still see who is communicating with whom, we just can't see what they're saying. Banning encryption is like banning envelopes.


Ah, but the government already has a backdoor to envelopes, because they can get a warrant to open the envelopes. This analogy only strengthens their position.


Not really. Strong encryption is like having everything in envelopes that only you can open. Weak encryption is like everything traveling without an envelope at all, because you don't know who has compromised the keys, and so, just as with mail traveling without an envelope, you have no idea who is seeing and meddling with it.

There is no "only the government can remove the envelope, if they get permission". If we had the capability to build that, we wouldn't be having this argument; the problem is that people think it exists and that it's tech company intransigence that is preventing it from being used.


> There is no "only the government can remove the envelope, if they get permission".

Okay, but now the case you're trying to explain is "there are deep technological challenges, and a wealth of historical precedent makes me skeptical that we can correctly design or implement a solution that allows only the government or the intended recipient to decrypt the message, instead of only the intended recipient".

To people who think "technology is magic".

And on the other side of the PR battle, the government is saying "Nah, we got this. And if you try and stop us, we're powerless against pedophiles and terrorists."

Adding enough nuance to make your position technically accurate, also makes it abstract enough to be politically useless.


Ask them if they'd be ok with the mailman reading all their ingoing and outgoing mail. If they say "Yes", at least they're being consistent.


Dangerous analogy to offer in an argument. The easy reply: "The Government can get a warrant to read my mail today. All I'm asking is for the same capability online, so they can get warrants to read pedophile and terrorist messages"


And isn't that a valid point to make?


Yeah. But I disagree with the larger position it argues for. Like all "real world" analogies applied to digital concepts, there's a sort of impedance mismatch.

In the real world, resources are naturally constrained. It's usually impossible to read everyone's mail in real time and retroactively pull up the contents of a letter sent 3 years ago. This limitation vanishes with online communications. Encrypted messages can be stored indefinitely and later decrypted.

The super-safe backdoor we build today could very easily be used by a tyrannical regime a decade from now to get dirt on everyone. We can dream up all sorts of technical solutions that allow for a backdoor but make it really hard to abuse, but at bottom, those solutions rely on the government obeying its own laws.


Well, at first sight yes, but a) they would gather everything instead of just actual targets, and b) their access could not be properly protected from unauthorized third parties.


The easy counter: "Except encryption works on maths; a mathematical "warrant" is more like a skeleton key that applies to every single letter ever sent and every safe ever made and every house ever built – and not one of those fancy keys that's hard to forge, because it's data; once you've seen one, you can just copy it and use it without anybody ever knowing. What happens if [terrorists] see one of these "warrants"?"

And to pre-empt: "And no, we haven't discovered the maths to make it not work like that. Nothing we know about comes even close, and such mathematics might not even exist."

A slightly more abstract counter:

> The government can get a warrant to read your letters. The law can regulate that, and there's no way they can get a warrant to read everybody's letters; only suspects will have their letters read. But these are computers. A computer could just go through, unlocking and reading everybody's letters, and looking for — look, imagine a future, slightly more evil government decides that puppies are illegal. A computer could look through everyone's letters for anybody who'd ever been pro-puppy, and then fine them for puppy-supporting, and then anybody who continues to support puppies would be put in prison. I know that puppies are a stupid example, but imagine that the government really wants to start a war, or something, and you disagree, and the election's coming up, and you can't talk about maybe not starting that war.

> Okay, so you trust this government, and all future governments, not to do that? Do you trust every single person who's ever seen a warrant? Say you want to get a job, but you can't, because you maybe don't want to save the pandas while children are starving in [impoverished nation] and you said so in a message to a partner three years back and your employer does a quick background check with some company somewhere and it comes back "[person's name] does not like pandas". It's your dream job, and now they don't want to hire you because the HR person is a really big fan of pandas. I'm picking innocuous examples, but come on, you can think of more significant ones. (And those background check companies already exist – they literally Twitter-stalk people; do you really think they'd ignore private conversations if they had access to them?)

> Do you trust anybody who's ever seen a warrant with your bank details? Sure, those skeleton keys will be kept fairly secure, but it only takes one person to make a mistake and then every single message that's ever touched a computer, past, present and future, is public.

> Or they could just look at the lock and figure out what shape of skeleton key would fit it. That's how all of the other backdoors the government put in computer systems like this were spotted. And once you've spotted it, you can just decrypt everything almost instantly, because you're just getting the computers to do maths and computers are fast at maths.


And I can encrypt my postal mail. Humanity has been encrypting things for maybe 2,600 years.

https://en.wikipedia.org/wiki/History_of_cryptography


I think that you might be disappointed with the response. Postcards are a thing for example.


I am free to put the postcard in an envelope if I feel the message is private or confidential; terrible example.


What is strange is that the same people (IME) believe the 2nd Amendment is required to overthrow the government. It's amazing how subservient they can be when required.


I guess it comes down to who you are more afraid of: terrorists and pedophiles, or law enforcement with unlimited power and resources to spy on you and imprison you? Of course it's a false choice anyway; if encryption is legislated away, the bad guys will very easily continue to encrypt their communications. After all, you can communicate with someone by shooting a wall in a game of Halo if you want to.


We tried to respond thoughtfully to each of the strange arguments made by the DoJ regarding the need for encryption backdoors to protect children:

https://blog.nucypher.com/todays-kids-need-end-to-end-encryp...

It's difficult at this point to think that the DoJ is arguing in good faith.


> It's difficult at this point to think that the DoJ is arguing in good faith.

I think people in the tech industry, and especially people working on security or privacy, have known this for over a decade. Unfortunately, the public doesn't pay enough attention to the issue. And why would they? They have enough things to worry about, why pay attention to one of the things that is currently working?

It's unfortunate that we're not that good at PR.


You first, DOJ. These folks want back doors so they can read everyone's traffic, but once you put a back door in an encryption standard, it affects everyone.


My favorite interview over the last few weeks was on NPR, when a guy in law enforcement said that police don't even use that type of encryption (the kind normal people use), as if he forgot that his police radio, payroll, camera footage, and information storage are all most certainly protected via encryption.


I found this exchange amusing as well, for a slightly different reason. The gist of his point was that "who needs this level of encryption, other than maybe the military?".

Substitute guns instead of encryption and see how that fits.


Interesting to read in the context of the fact that public sentiment surveys suggest there is no "techlash" at all.


I agree. It kind of feels like the media is pushing a "techlash" narrative where there is none.


Overly large companies wield disproportionately large amounts of influence in our lives and probably should be regulated more. But that disproportionate influence doesn't mean "tech" is the root cause.

There shouldn't be a backlash against tech companies specifically, but rather against companies that are overly large and use and abuse their power and position to benefit shareholders in ways that are severely detrimental to the common good: limiting competition, influencing and manipulating legislation, ensuring people have no other options and then raising prices exponentially, selling your data to the highest bidder, etc.


Can you provide a pointer to such surveys? In my head the popularity of articles bashing tech companies is out of proportion to people's patronage of those same companies, but numbers would be nice.


This is genuinely threatening to too many interests in the US; it will not pass. Much smoke, but no heat, just like the last few times it's been proposed. They will not get a master key to everything, lol; nobody trusts the DOJ like this. The people in power have dirt to hide too, just like you and me.


This will just push people to open source applications and peer to peer networking. Basically, devolve back to the early days of the internet with regard to person to person communications.

How is the DoJ going to force Signal or even Telegram to add a back door?


Well, during the Cold War the feds declared encryption to be "Auxiliary Military Equipment," listed on the USML [1]. It was illegal -- prison time illegal -- to ship software which could encrypt communications outside the USA up until 1992 [2].

I'm old enough to remember the tail end of this and it was absolutely absurd, yet still very real. I remember Zimmermann being investigated by the feds [3]. Zimmermann was smart to tie publishing PGP to the First Amendment. Something like Signal in those days may well have resulted in charges and a conviction.

Never underestimate the potential for folly when it comes to government regulation.

[1] https://en.wikipedia.org/wiki/United_States_Munitions_List

[2] https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...

[3] https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Criminal_i...


I wonder if encryption could be tied to the Second Amendment as well. I'd be interested in the legal theory discussing that.

Many of the arguments I've seen here defending encryption are basically the same as those defending guns.


I really would not be surprised if it surfaces that Telegram has a backdoor for Russian special services. After Anton Rosenberg's posts [1] and Durov's (very unconvincing) responses, it is clear that Durov lies a lot about how Telegram operates, and it is quite likely that all the 'blocking' in Russia is just a show to boost credibility.

It also should be noted that while Telegram markets itself as the safest and most protected messenger, its regular chats are in fact not end-to-end encrypted at all and are far less secure than WhatsApp's. It's actually baffling: he bashes WhatsApp with such audacity [2], saying in one sentence that 'if your phone dies, you can't access your messages on another device', and then continues preaching how cool it is to use encryption. BUT IF ONE USES END-TO-END ENCRYPTION, ONE CAN'T ACCESS MESSAGES ON ANOTHER DEVICE, BY DEFINITION!!!!!

It never ceases to amaze me that people bought into this. A comfortable-to-use app and a really good promise of security turn out to beat real, proper security. ¯\_(ツ)_/¯

[1]: https://medium.com/@anton.rozenberg/friendship-betrayal-clai...

[2]: https://youtu.be/kVZN9QbtFgs?t=106


> BUT IF ONE USES END-TO-END ENCRYPTION, ONE CAN'T ACCESS MESSAGES ON ANOTHER DEVICE, BY DEFINITION!!!!!

You can encrypt each message separately for each device, that's still end-to-end encryption.
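
A sketch of that fan-out, using PyNaCl's SealedBox purely as a stand-in for whatever primitive a real messenger would use:

  from nacl.public import PrivateKey, SealedBox

  # Each of the recipient's devices has its own keypair.
  devices = {name: PrivateKey.generate() for name in ("phone", "laptop")}

  # The sender encrypts the message once per registered device public key.
  msg = b"hello"
  cts = {name: SealedBox(key.public_key).encrypt(msg)
         for name, key in devices.items()}

  # Each device decrypts its own copy with its own private key.
  assert all(SealedBox(devices[n]).decrypt(ct) == msg for n, ct in cts.items())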


Yeah, and load it on a completely new device by providing only login and password, too?? (Like Telegram does)


Ah, no. I missed the part about it being a completely new device. This obviously only works for devices that were already authenticated when the message was first sent.


That's the point: Durov bashes WhatsApp for 'inconvenience' and a supposed lack of privacy, but in reality that inconvenience is the result of the real presence of privacy. In contrast, Telegram's convenience is the result of a real lack of privacy, because to work this way it has to have a way to decrypt user messages.

Of course, they tell tales about how deeply it is all encrypted on Telegram's servers, and how only a select few trusted engineers have keys, but those are just tall tales.


> How is the DoJ going to force Signal or even Telegram to add a back door?

Hypothetically: they could easily force the Apple and Google stores to stop carrying it, or even to prevent its installation using the same mechanisms used to scan for malware. Do not underestimate the means by which use of real encryption could be made inconvenient.


A method I've seen tossed around is that they'd just add a "ghost" user as a silent participant to all conversations.

I'd like to think this is difficult with Signal's model, though.


They'll force device manufacturers to give access to keys locally. China did it with Apple's iCloud backups by just nationalizing the servers.


Encryption keeps the government out of things it shouldn't be snooping on, and also keeps other governments out of those things. If key escrow is mandated, it will be cracked to high heaven by parties unfriendly to the US within a year or two, tops, mark my words. Then what?


Shouldn't the US government be pushing good encryption? I wonder what sort of world we would be living in if the NSA had spent 1/2 as much time over the last 10 years trying to protect Americans from hacking as they do trying to spy on Americans.


Exactly. I can't believe I'm supposed to pay taxes for this. The government is supposed to be for the people, not to control the people. I don't remember ever having voted on or agreed to being spied on for no reason.


Privacy is extremely important, especially since even our democratic governments cannot always be trusted. At least, that is the impression I get when reading the interview with the UN Special Rapporteur on Torture concerning the Assange case: https://www.republik.ch/2020/01/31/nils-melzer-about-wikilea...


MITM-as-a-business has been nothing but a slow motion train wreck destroying individual liberty. FAANG may bicker with the DOJ/NSA (and Ma Bell) about who is in control of all the surveillance data, but none of them are fighting for we the people. They're all just jockeying over who gets to rule.

The sane response to corporate totalitarianism is most certainly not government totalitarianism. Sadly with how the two political salesteams frame a false division merely over different flavors of authoritarianism, this has a good chance of working.

As always, the true answer is trustable software running under the control of users ourselves. Unfortunately, we will have to see how bad things get before most people are driven away from all of these centralized attractive nuisances.


Very true. Nobody cares for the individual citizen. It’s just a power struggle by large players on the backs of the little guy.


Are there any concrete proposals on the table that can be looked at?

This feels to me like one of the typical debates where people are shooting at each other but nobody understands what they really are talking about.


I think you're mistaken.

Virtually any proposal the government might offer will be a key escrow scheme. We already know it's bad and why it's bad, because it's been debated for decades. I don't think there are any novel solutions left that should revitalize the argument.

The problem isn't that there aren't enough oversight mechanisms or checks to prevent abuse. The problem is that the design has unfixable security defects.


You say “might offer”. Have they offered anything?


"Offer" is perhaps an overly positive term. The DOJ has issued a number of speeches, letters, etc. insisting that tech companies must build a backdoor to let the government decrypt messages as required.


I know, but I have never heard any details, especially about how they want to keep the backdoor secret. Cracking the backdoor would be such a high-value target that a lot of people would spend insane amounts of money and energy on it.


Do not under any circumstances let the discussion move from "whether" to "how". Reject the premise of the question. Such a system cannot be built; the requirements are broken. Treat it as though someone asked you to build a system that solves the halting problem, or factors products of large primes in linear time: the correct direct response is patient explanation of impossibility, and the correct indirect response is good PR from real security experts who understand how to do good PR.


I don't think it's right to reject something outright. You should always give people the opportunity to show what they have, then take a look at it and decide whether it's good. A strong indicator of a scam is that they won't show anything when asked for details. Then things should get rejected.


I'm not suggesting a refusal to listen to the question. I'm observing that it's dangerous to accept the premise of a question that may move the goalposts, without first noticing the implicit premise or differing definition, and explaining why the premise or definition is faulty. Accepting the faulty premise moves the line to "so you just need to come up with the right technology", which then moves to "what's the most secure system you can build while still having a backdoor". The right answer to that is not "here's what we could hypothetically build", it's "by definition we cannot build a secure system with a backdoor, and also here are all the additional problems that a backdoor would introduce".

By all means, fully listen to proposals, in order to understand the misconceptions and better address them. Or, in some cases, it's useful to listen to proposals to anticipate upcoming security threats and build a more robust security model. For instance, the potential threat of compromising CAs led to the invention of Certificate Transparency, which almost completely eliminates that possibility.

We also need to get a lot better at rhetoric and PR.


"so you just need to come up with the right technology"

That line is not OK. If somebody wants something and you say you are not able to do it then it should be on them to prove you wrong.

Reminds me of work BS. You say “it can’t be done in the timeframe”. VP response: “you are just not working hard enough”. It’s a nice way to manipulate the conversation.

Technologists definitely should get better at messaging and arguing with bad faith actors.


The obvious explanation for why we've heard no details is that they don't plan to work especially hard at it. Some DOJ messaging sounds very much like they don't believe it's a real problem[1] - companies can keep their centrally-managed signing keys secret, they say, so why couldn't we keep centrally-managed backdoor keys secret?

[1] https://www.justice.gov/opa/speech/attorney-general-william-...


The article specifically cites the EARN IT Act, which the same author has written about in more detail: https://cyberlaw.stanford.edu/blog/2020/01/earn-it-act-how-b...

