They have been pushing for some variation on this since basically forever. I first became aware of it back under Clinton with the Clipper chip (https://en.wikipedia.org/wiki/Clipper_chip). And the debate has been essentially the same ever since.
Law enforcement wants to be able to break security, and promises that their super secret, super safe system will provide everyone else protection from evildoers while letting law enforcement find the bad guys. Cryptographers have maintained that when you create a back door, it is a question of time until it is found and publicized.
And the back door doesn't even have to be found to be abused. Because it will be made available to law enforcement and the courts. Which are surprisingly easy for third parties to subvert. And which are happy to build programs for themselves that break the rules that they are supposed to follow. (Snowden anyone?)
Success has gone to both sides. But on the balance, the cryptographers have been right.
The typical response is “What is this, Russia?” or something similar.
Privacy is really important. I will always err on the side of privacy, even if that means not everyone bad is caught.
Reminds me of something my dad always told me, "I'd rather see 10 guilty people walk free, than have 1 innocent person in jail."
Three Days of the Condor (1975)
Marathon Man (1976)
Anyhow, I think you're missing the context of what happened in the 60's and 70's that stoked paranoia in the culture and in fiction.
It was only because some morally-flexible activists broke into an FBI field office that COINTELPRO was uncovered. Quoting Chomsky: "40% [of FBI work activities] were devoted to political surveillance and the like".
Furthermore, in modern times, there is:
- warrantless mass surveillance (NSA)
- license plate readers (LPRs)
- cellphone tower/position tracking
- Ring and other IoT devices recording evidence of technical crimes to cloud providers. All it takes is law enforcement getting access for something else and digging around in the data with automated tools to round up bodies to prosecute. For example, a small child escaping from a bath without clothes in front of an always-recording security webcam is considered child pornography.
- No-name webcams and other IoT devices being hacked, or backdoored at the factory, to be used for spying, malware injection, packet sniffing and who knows what else.
- As revealed by Snowden, various secret watch lists containing the names of over 2 million Americans.
Once I was in Belarus. It's a heavily authoritarian country, make no mistake. When they had protests against their rigged elections, they traced the phones of everyone who went, then brought them in for questioning and asked them what they were doing there. The president later said: "We have rigged the latest election. 93.5 per cent have voted for Lukashenko. But they say it is not a European outcome. We have made it 86 per cent."
But it does work. I was at a restaurant there. Some girls are chatting, and they walk away to go get something, and just leave their bags there unattended. Why? Because it's an authoritarian country - people aren't going to steal stuff there.
This was in the capital of one of the poorest countries in Europe, but it was still one of the safest places I've ever been.
EDIT: I don't mean to praise authoritarian countries, on the contrary. We should support human rights even if it is less efficient, because the object of governance is not to strip people of their freedoms for pennies on the dollar.
Chances are high that you could do the same in most European countries. Just like in the Soviet Union, thieves exist in Belarus. People's perception might be that they're safer, but that doesn't mean they actually are. Just like during Soviet times, people felt safer, but crime still happened. It just wasn't talked about, because it makes the state look bad.
Poor example. The anarchy that attended the collapse of the Soviet system brought about a dramatic increase in crime. And while reported petty theft was arguably a reporting problem, the dramatic rise in homicides was not.
For the record my mother-in-law lived in St Petersburg at the time. And her anecdotal reports were that crime that she and friends experienced rose dramatically.
It almost reminds me of some of my parents' peers yearning for the good ole days of communism, cuz riffraff would not dare to talk back to a cop.
edit: It is hard to argue with that kind of sentiment. It is technically right.
People have no qualms about stealing umbrellas?
I mean, if you leave something that's a strong temptation (like a big pile of cash) it'll eventually disappear.
Yeah, if you go into one of the inner cities, your stuff will get stolen, but otherwise, generally nah.
Obviously if you put all people in jail they will be very safe there. But then you have fewer people working and producing useful stuff, and more rotting in jail.
The best example I can give of this trade-off between freedom and safety is that of women in Saudi Arabia. They have the least amount of sexual violence in the world.
Whether or not this applies to rape is a question of debate. However in Pakistan it certainly has been interpreted to so apply. Which means that it is very hard for a woman to prove that she was raped.
See https://en.wikipedia.org/wiki/Hudud#Requirements_for_convict... for verification.
> Obviously if you put all people in jail they will be very safe there. But then you have fewer people working and producing useful stuff, and more rotting in jail.
Friendly reminder that the US has the highest incarceration rate of all countries.
> The best example I can give of this trade-off between freedom and safety is that of women in Saudi Arabia. They have the least amount of sexual violence in the world.
Is this according to their official statistics? How do they know they're accurate and comparable? For instance, is rape within marriage considered a crime and regularly prosecuted there?
That is a different thing.
>New Zealand’s Telecommunications (Interception Capability and Security) Act was introduced in 2013.
>On receipt of a warrant from a surveillance agency, a network operator – defined as a public telecommunications network or a telecommunications service – must “decrypt a telecommunication” on their network or service, but only if it has “provided that encryption”.
Sadly, that same world leader is the strongest advocate for anti-encryption measures and censorship.
You do realize this is the same country which made it punishable by up to 10 years' prison to have a specific terrorist's manifesto on your computer?
Denmark has been illegally requiring all telcos to monitor everything for years, and nothing has changed. No one is willing to change anything or impose any consequence for it. It's even gotten people falsely imprisoned due to complete reliance on location data from mobile phones.
Sweden is not much better, and just passed legislation allowing law enforcement to hack people's phones.
I'm almost certain that Norway, Finland and Iceland are no better.
For context, I interviewed with a company that had police videos stored on its systems and used them for entertainment, to the point of showing me one at the beginning of the interview for grins.
TBH sounds a bit like a conspiracy theory. And for what it's worth, at least in Finland getting caught snooping isn't cause for firing; the convictions have been fines.
Seems like open-source and peer review is the only way to ensure that crypto is not being hijacked.
The government itself is likely to abuse such back doors. LOVEINT is a documented practice.
Meanwhile governments throughout the world are mandating the use of Signal for communication. Common folks are not special enough to justify strong encryption but it's totally valid for government use. Helps hide their corrupti-- I mean state secrets.
Why would it not be possible to create a system that required several manual and offline steps in order to break the encryption?
For example (and perhaps similar to offline cold storage of bitcoin) why couldn't a system be designed whereby 3 or more people in geographically diverse areas were in a position to agree that a request for information was legitimate (by court order) and thereby produce what is needed to unlock certain information? So one person would not have the key or access.
After all right now you have a case where a single person (the owner) is able to unlock information. The feeling is a back door can be hacked. What if it's not a back door though?
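The threshold idea proposed above is well-studied cryptography. Here is a minimal, illustrative sketch using Shamir secret sharing, where any 3 of 5 custodians can reconstruct a key but 2 or fewer learn nothing (a toy implementation for clarity, not a vetted library; the objections in the replies are about who holds the shares, not the math):

```python
import random

# Shamir secret sharing over GF(p): a (k, n) threshold scheme.
# Any k of the n shares reconstruct the secret; fewer reveal nothing.
P = 2**127 - 1  # a Mersenne prime large enough for a 16-byte secret

def split(secret: int, k: int, n: int):
    # Random polynomial of degree k-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = random.randrange(P)
shares = split(key, k=3, n=5)
assert combine(shares[:3]) == key  # any 3 custodians suffice
assert combine(shares[2:]) == key  # a different 3 also work
```

Note the sketch only shows that the cryptographic machinery exists; it says nothing about whether the custodians, or the process that selects them, can be trusted.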
The problem is that to be useful for law enforcement, any local police department has to be able to go to any local judge and get a warrant and then get access.
There are approximately 30,000 state judges, and that list has fairly high turnover. If you can compromise one, or successfully get yourself added to the list, you can then get access to whatever you want. That's way too many people to trust.
This is not a hypothetical weakness. I personally know someone whose physical location was compromised through a court order obtained by bribing a judge. How many cases are there where similar access was gained but the victim doesn't know how it happened? And the better the access that you can get, the more incentive there is to get it. (There is no shortage of reasons why a motivated party would want such access.)
While this process does have weaknesses, it is still the difference between a legal process overseen by the courts and one based on espionage where agents do whatever they want without oversight.
Note that strong network encryption is essential for ensuring that they have to get a warrant.
I don't think anyone has come up with a better system than judicial oversight that still allows law enforcement to do their job.
Do libraries not keeping records of who has read what books not allow law enforcement to do their job?
Do prohibitions against arbitrary searches not allow law enforcement to do their job?
Does having to make a plausible argument in front of a judge not allow law enforcement to do their job?
Do individuals not having a number tattooed on their forehead not allow law enforcement to do their job?
Law enforcement continually frames any perceived difficulty as if it prevents them from doing their job, which is utterly disingenuous. Law enforcement's "job" is to operate within the constraints of a free society composed of innocent people, in spite of the extra work from not being able to just shoot the bad guy like in the movies. When they don't respect this, they're merely another criminal element.
In the US, the idea of the publicly funded law enforcement officer can trace its lineage back to antebellum slave catchers.
Much like they now (and forever) frame any perceived difficulty as preventing them from doing their job, they've successfully re-framed their role in society as protecting and serving (people) instead of protecting and serving (property rights), despite that obviously not being the case.
Compare this to the situation in Europe and Asia (or at least the civilized parts like Japan), where it's perfectly acceptable and normal to ask a policeman for directions. And where their role is indeed to protect and serve people (not property).
"Police have the role of protecting and serving people" really does not feel like such a controversial viewpoint to me.
In fact, these high profile cases the FBI et al attempt to use as wedges generally involve someone who has already been caught with plenty of evidence, but the authoritarian impulse still demands more.
As far as laws that can only be enforced/observed by reading people's private thoughts and communications, then yes indeed, why have those at all?
Law enforcement agencies send legal requests to Google and Facebook all the time, and the sky hasn't fallen.
Actually, for all X where X exists, the sky hasn't fallen :)
I think we can agree that, say, putting a chip in everyone's heads that transmits their thoughts to the government would indeed, as you put it, cause the sky to fall. I'm not trying to suggest that outlawing encryption is the exact same thing, but I do believe it would cause much more injustice than it would prevent.
Which is another reason why consolidating everyone's data into a few centralized locations is also problematic.
> While this process does have weaknesses, it is still the difference between a legal process overseen by the courts and one based on espionage where agents do whatever they want without oversight.
But that's not what we're talking about here. The question isn't whether the police should need to get a warrant, it's whether the government should be able to prohibit technology that preserves privacy because some criminals might use it alongside all the law abiding citizens.
And encryption doesn't "prevent law enforcement from doing their jobs" -- that's just a trope. What it does is make their jobs more expensive. Even if they can't just get a copy of all your communications from a megacorp by filing some papers, they can still get a warrant and then plant bugs or guess your password or plant bugs that allow them to observe you entering your password etc. It's not impossible, it just takes more resources to do it -- which prevents it from happening at a massive scale.
That's a feature, not a bug. It still lets them solve murders, because murders are serious and uncommon and can justify the expense of a real investigation. It may make it inexpedient to spend those resources to catch every last hooker and pothead, but so what? Sometimes it's not worth the candle. If you think it really is, give them more money instead of giving everyone else less privacy. But sometimes it just isn't. Sometimes it costs more to solve a crime than to not solve it.
Meanwhile (this is the feature) it doesn't make it too easy for them to identify all the people in group X and give the list to Joe McCarthy or round them all up and put them in internment camps. Law enforcement should have some friction, because when it happens too fast at too large a scale, history shows this to be Bad.
There is no principle that says governments (as representatives of the people) can't regulate what technologies people can have. Sometimes we decide that yes, the government should do this. Consider how the FCC regulates electronic devices to prevent radio interference. Enforcement is lax and as a hobbyist you aren't likely to get caught, but devices aren't commonly found in retail stores that don't have FCC approval.
There are many practical issues, of course, including making sure there is a balance of power and that law enforcement powers aren't abused. I agree with that part.
My take is that even with a warrant, law enforcement's reach into private data should be limited for a free society.
I guess if you're preparing for the apocalypse you don't count on that, though?
The Supreme Court just ruled that you can’t sue law enforcement for civil rights violation - including killing people unjustly.
Sure, a system could be designed where the “master key that unlocks everything” is distributed - that makes the problem of the attacker who wants to get his hands on that key slightly harder, because now he has to compromise three systems instead of one, but that doesn’t change the fundamental risk, which is that he can do that in the first place. Remember, you’re talking about one piece (or three pieces) of information which can be used to decode every single secret in the United States - this isn’t limited by technical feasibility, this is the explicit end goal that you’re asking for. If Russian hackers got a hold of it undetected, they could decrypt everything for a very long time. Even if it were revealed that it were compromised, everything that was encrypted using the old key would have to be re-encrypted somehow, and the old copies destroyed somehow.
Judge, Jury & Encryptioner: Exceptional Access with a Fixed Social Cost
> We present Judge, Jury and Encryptioner (JJE) an exceptional access scheme for unlocking devices that does not give unilateral power to any single authority and places final approval to unlock in the hands of peer devices. Our scheme, JJE, distributes maintenance of the protocol across a network of "custodians" such as courts, government agencies, civil rights watchdogs and academic institutions. Unlock requests, however, can only be approved by a randomly selected set of unlock delegates, consisting of other peer devices that must be physically located to gain access. This requires that law enforcement expend both human and monetary resources and pay a "fixed social cost" in order to find and request the participation of law abiding citizens in the unlock process.
Lawful Device Access without Mass Surveillance Risk: A Technical Design Discussion
> This paper proposes a systems-oriented design for supporting court-ordered data access to “locked” devices with system-encrypted storage, while explicitly resisting large-scale surveillance use. We describe a design that focuses entirely on passcode self-escrow (i.e., storing a copy of the user passcode into a write-only component on the device) and thus does not require any changes to underlying cryptographic algorithms. Further, by predicating any lawful access on extended-duration physical seizure, we foreclose mass-surveillance use cases while still supporting reasonable investigatory interests. Moreover, by couching per-device authorization protocols with the device manufacturer, this design avoids creating new trusted authorities or organizations while providing particularity (i.e., no “master keys” exist). Finally, by providing a concrete description of one such approach, we hope to encourage further technical consideration of the possibilities and limitations of trade-offs in this design space.
Make a door, it won't stay secret. And you can't assure that there won't be bad actors involved in any number of parties that have to agree.
The whole backdoor system, once it exists, is open to subversion and/or misuse. The only way to not have the problem is not create the door in the first place.
No matter what "offline" and "manual" steps you propose, they can be made online and automatic after a one-time compromise of the system. And if a single system is protecting the secrets of literally everyone, the incentive for compromising that system is unlimited. There's no attack too impractical when the stakes are so high.
It's not feasible to ban people from using their own encryption unless you plan on severely restricting their freedom.
The article below argues that the real use case for breaking encryption is to catch everyday criminals, not to go after shadowy Bond villains. Would the public still go for it, if they looked at it this way? Probably not...
The Encryption Debate Isn't About Stopping Terrorists, It's About Solving Crime
Edit: And most everyday criminals are not technologically savvy. Half of them probably have a hard time using Telegram safely.
In practice, even the tech types grow weary of using and maintaining truly secure solutions. So if backdoor-less solutions are outlawed, companies won't support them. And without commercial support the options will dwindle.
And even donation-supported projects like TrueCrypt will fold under pressure.
Crypto isn't a one-person job
Example: None of us cares about midget porn. If we all agree that banning midget porn will not have an impact on anyone, the government then quickly moves on to octopus porn. Now, on a matter of principle, you will find your position weaker and weaker.
Can't fix technology problems with people processes.
And this from 2015:
See also https://cdt.org/insights/the-nsas-split-key-encryption-propo...
I think the history of crypto exchange hacks should be of interest here.
Take Bitcoin cold storage: people use offline storage and specialized hardware wallets to store their keys more safely. Now imagine there were some privileged keys that would allow the holder to just take anyone and everyone's bitcoins. It doesn't matter how geographically distributed the system is, or how elaborate the key ceremonies and oversight are. The system could never be trusted if such a backdoor existed.
When done correctly and with everyone's knowledge and consent, this is a form of key escrow. But there's some simple math at play here - the more unlock options there are, the more points of vulnerability and failure. https://haveibeenpwned.com/ lists 8 breaches of my data and I'm aware of more they don't list - the industry can't even keep my data under lock and key without adding more points of failure via key escrow as another thing they can fuck up.
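That "more unlock options, more points of failure" arithmetic can be made concrete with a toy model (my own illustrative assumption, not a claim about any real escrow system): if each party able to unlock the data is independently compromised with some small probability, adding escrow holders only ever increases total exposure.

```python
# Toy model: probability that at least one key-holding party is
# compromised, assuming each is independently compromised with
# probability p. More holders can only increase the total risk.
def p_any_compromise(p: float, holders: int) -> float:
    return 1 - (1 - p) ** holders

# One holder (just the owner) vs. owner plus escrow infrastructure.
print(p_any_compromise(0.01, 1))  # ≈ 0.01
print(p_any_compromise(0.01, 4))  # ≈ 0.039, roughly 4x the exposure
```

The independence assumption is generous to escrow, too; in practice escrow holders share software, procedures, and attackers, so correlated failure makes the real picture worse.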
> For example (and perhaps similar to offline cold storage of bitcoin) why couldn't a system be designed whereby 3 or more people in geographically diverse areas were in a position to agree that a request for information was legitimate (by court order) and thereby produce what is needed to unlock certain information? So one person would not have the key or access.
Bitcoin is a great example, in a way - there are several stories out there of exchanges fucking up and losing access to their coins or having their coins stolen, even when they were using cold storage. All 3 people will likely be duped by the same fraudulent (or just overreaching) LEO email, in part because of the all-too-human tendency for each of them to assume that "one of the other two would have caught it if it was fishy, right?"
> After all right now you have a case where a single person (the owner) is able to unlock information. The feeling is a back door can be hacked. What if it's not a back door though?
The label you use doesn't matter - the more people who can unlock things, the more points of failure you have, and the more likely things will end up hacked. The industry struggles enough to secure things even when it takes a hard line approach and espouses end-to-end encryption. Defending against abuse by insiders/employees usually means going through a bunch of trouble to lock your own employees out of customer data - or making it unavailable to the company in the first place - not giving them more tools to access it.
Governments already have more than enough tools to completely pwn my privacy if they actually need to. Convince a judge to sign a sneak-and-peek warrant, have law enforcement covertly install a hardware keylogger, and I'd wager you'll pwn most people - even if they're tech savvy. But that requires pesky things like individualized suspicion, "checks and balances", actual work, and involving multiple people (your landlord or a locksmith for the keys or entry access, your security company to silence the alarms, a judge to sign off on the warrant and an LEO to execute it), so it's harder to run a long-term secret warrantless suspicionless mass surveillance dragnet as a rogue employee or agency. But not impossible, as certain three-letter agencies have shown.
Wasn't this how Google, Adobe and several other tech companies had a major security breach about 5-7 years ago? They provided the DOJ backdoor access.
If you're thinking of PRISM, no, at least not in the voluntary, intentional sense of the word "provided". Many of the major tech companies had non-public backbone fiber, and links across that fiber were unencrypted. The NSA tapped this dark fiber to read unencrypted traffic. This famously hit Google, which subsequently moved to encrypt all internal traffic, even traffic that would never leave its networks.
The NSA placed key people into companies like Google who had security clearances that specifically forbade them from saying exactly what they were doing to higher ups. They then created systems for extracting data in an automated way based on requests from the NSA. All that executives knew was that they were doing something important for complying with law enforcement requests.
The CEOs of these companies learned about the existence of the back doors from public reporting based on Snowden's revelations. When they first heard, they issued public denials that were, as far as they knew, truthful. Their subsequent actions upon finding out that they were wrong strongly suggest that they wouldn't have approved the programs had they known what was happening.
I see this claim is made in the Wikipedia article [https://en.wikipedia.org/wiki/PRISM_(surveillance_program)]: "PRISM collects stored internet communications based on demands made to internet companies such as Google LLC under Section 702 of the FISA Amendments Act of 2008 to turn over any data that match court-approved search terms." but the cited source [https://www.washingtonpost.com/world/national-security/nsa-i...] does not support this assertion; on the contrary it explains "[NSA] has secretly broken into [...] Yahoo and Google". It certainly doesn't make the stronger claim you seem to suggest here that PRISM is limited to FISA requests.
Because that's when Trump came into power and nothing bad ever happened under the Obama admin.
Take marijuana enforcement as an inverse example. The federal law around marijuana hasn't changed, and it's as illegal in 2020 to use, grow, and sell as it was in 2004. But the Obama administration made clear to the DEA and FBI that pursuit and enforcement of that law in states that had passed their own laws to legalize it would be a career-limiting move, which gave the states breathing room to experiment (and allowed hard evidence to gather that allowing medicinal and recreational use wouldn't cause societal collapse or, really, almost any negative outcomes).
This aspect of the intersection of law and executive enforcement is why the Trump administration is correctly condemned for worsening the situation regarding immigration in the US. Most of the laws the administration uses have been on the books since before 2016. This administration chooses the harshest and cruelest ways to interpret and execute on those laws.
This was a big deal back during his term which I feel like people are already forgetting.
The Obama administration did exactly nothing for prison reform, which coincidentally is probably going to be a big part of Trump's legacy (he's done more on this than any other President), and nothing for situations where marijuana possession suddenly makes other things criminal (like owning a firearm _and_ marijuana).
For as poor as Trump's record with women is, it's pretty shameful for everyone else that he's the president that made sure tampons are widely available in women's prisons.
Ok? So that should counteract the behavior.
> This aspect of the intersection of law and executive enforcement is why the Trump administration is correctly condemned for worsening the situation regarding immigration in the US.
There is an element of "outrage" culture that seeks to ascribe new fault to this administration for the long-established pursuit of "federally safe" cryptography, as a proxy for other wrongs. This is not relevant to the issue of original fault.
A few years ago in an undergrad business class, we were having some discussion and the topic of encryption came up during one of my presentations. A student asked a question related to the ethics of encryption (I don't recall exactly what), and I was clearly confused by the question.
To clear up confusion, the professor asked those who thought encryption was "bad" to raise their hand, and at least 60% of the class raised their hands.
It was pretty jarring to me, and it makes me pessimistic about the outcome of a DOJ campaign to demonize and regulate encryption.
If we're unlucky and we get caught unprepared, we run the risk of getting stuck with a backdoor or "exceptional access" mechanism that provides little or no technical safeguards against massive and nearly unlimited government overreach.
IMO this makes it our responsibility to figure out how we might design such a system that does have strong protections against misuse. Of course that is a very difficult thing to do. Doesn't mean we shouldn't try.
Agreed. I actually did some work on this topic that requires having things like strong ciphers and hash functions, and protocols that provide perfect forward secrecy.
Our approach was to take strong constructions and turn them into a kind of proof-of-work. So you can recover the key, but to do so you have to expend a huge amount of electricity (i.e., money).
The idea was partly just to call the bluff of the government types who claim that "exceptional access" is so critical to our safety. If they're not willing to spend the money, then we should not be willing to compromise our security.
NOTE: I'm not suggesting that we should actually deploy this kind of thing, as long as we have a choice in the matter. Just that we should be prepared in case we need it.
Anyway, the paper is here:
C.V. Wright and M. Varia. Crypto Crumple Zones: Enabling Limited Access without Mass Surveillance. In IEEE EuroS&P, 2018.
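To make the proof-of-work flavor of that idea concrete, here is a toy illustration under my own assumptions (it is not the construction from the Wright and Varia paper): derive the encryption key from a secret with deliberately bounded entropy, so recovery is always possible but only by paying for an exhaustive search.

```python
import hashlib
import os

# Toy "crumple zone": the key is derived from a secret with
# deliberately bounded entropy. Anyone can recover it, but only by
# brute force; scale BITS up and recovery costs real money.
BITS = 20  # ~1M guesses here; a real system would pick far more

def derive(secret: int) -> bytes:
    return hashlib.sha256(secret.to_bytes(8, "big")).digest()

secret = int.from_bytes(os.urandom(8), "big") % (1 << BITS)
key = derive(secret)

# "Exceptional access": exhaustive search over the bounded space.
recovered = next(s for s in range(1 << BITS) if derive(s) == key)
assert recovered == secret
```

At 20 bits the search is trivial; the point of the real scheme is to tune that cost high enough that decryption is affordable only for crimes worth the expense, which forecloses dragnet use.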
People don't think 'encryption' is a mathematical function. They think it's like a radar detector in your car. "Why would you have it unless you intend on speeding?" [goes their thought process]
"Oh I use this frequently and things I do every day would be totally broken without it? Really?!" [is the realization we hope dialogue could bring]
But you are correct in that we really should make it completely legal so that the government doesn't have an excuse to target individuals for engaging in a practice that is generally widely accepted.
Encryption is more than just mathematical functions, it's the application of mathematical functions.
And the application is what matters, because it would be easy to say a Nuclear weapon is just physics or that a bio-weapon is just chemistry.
Clearly, how you use basic principles matters.
Nuclear weapons require both the knowledge and the physical-world parts; encryption just requires the knowledge, which is impossible to put back in the box.
Every computer is capable of building software that can bring society to its knees, but most people don't know how to do it. Software is just algorithms, after all.
The problem is that law enforcement pushes the bad uses of something without weighing them against the good. If fertilizer can be used to make bombs, should we ban fertilizer? No, that's throwing the baby out with the bathwater. Unfortunately, many voters don't fully understand what the baby is WRT encryption; they just see child porn, human trafficking, etc.
Suspects. Alleged criminals. Until convicted, they're not criminals. The government does not have an unlimited right to collect all information; they have a limited, judicially controlled right to try to collect information.
One key detail: strong encryption does not prevent investigations from targeted collection of information, such as through physical surveillance, bugs, etc. There are many tools available to law enforcement, those tools are just less convenient. Framing it that way helps: in order to make investigations more convenient, the DOJ wants to prevent anyone from using secure communications.
And even after they are convicted, they and their associates are still protected by encryption. Encryption protects everyone equally. That is viewed as a positive by pro-encryption people and a negative by anti-encryption people.
>The government does not have an unlimited right to collect all information; they have a limited, judicially controlled right to try to collect information.
This is the point of warrants. The problem of massive warrantless collection of data by the government is a bigger and separate issue. I personally think encryption is a valuable tool to address the symptom of that problem, but it does nothing to actually fix the problem of a corrupt and/or authoritarian government. One of my personal annoyances with the tech community is that we have a tendency to try to work around any governmental problem as if that solves the issue, while making no attempt at fixing the root cause.
Agreed. We need to produce technical solutions, and effect policy solutions, and get a lot better at PR and rhetoric.
I know some people do think this, but the opposite is actually true, because of the popular shift from voice to computerized text and images for almost all communications.
A few decades ago, criminals spoke on the phone to coordinate actions. If the police were aware of this in advance, they could tap the phone line and listen in to that conversation. However if they found out about it after the fact, there was no record of the conversation for law enforcement to go get. The sands of time hide information extremely well.
Today, we use mobile devices, and prefer text and image-based communications--email, SMS, WhatsApp, iMessage, Facebook, Twitter, etc. etc. Unlike voice conversations of the past, these default to creating a permanent record of the conversation, which police can go get after the fact. Encryption is an attempt to re-establish the privacy and security of these conversations that existed back when they would have been a voice call over analog telephony.
The idea of "going dark"--promoted by law enforcement to fight encryption--is wrong. What has actually happened is that INCREDIBLE BRIGHTNESS of information has developed over the past few decades as all our computerized communications now default to living forever.
Encryption is a tool for trying to re-establish the level of "light" (information availability) that our culture and laws came to expect over hundreds of years.
It seems to me that law enforcement is simply pining for the fjords of a very short time span in human history when ample digital information co-existed with the lack of widely available cryptographic systems.
The fear from law enforcement is absurd. They've never had the ability to read every single letter that was sent, but now that we've moved online they want to read every email? They claim they are losing access, but the truth is they never had access in the first place. The only difference is that before, maybe they could secretly read your letters before delivering them, whereas now they need to get your password. That only worked at a small scale, though, and I'm sure the NSA can crack passwords at those same small scales (e.g. some of the high-profile iPhone cracking we've seen).
Ehh, maybe if you are talking about speed and scale. The average citizen could always learn how to use a one-time pad.
I don't really see the State's argument against encryption except in the case that they want to break encrypted information without due process.
But then, neither did breaking down the door and ransacking the obvious physical target pre-internet days always work. And everyone understood a suspect had no obligation to tell the feds where he hid the goods.
That's such a naïve way of putting it. People don't care about the mathematics behind it. They think it's "bad" because they think it enables terrorism and the spread of child sexual abuse imagery or whatever politicians have led them to believe.
That is not all encryption does, but pretending it doesn't enable those things is stupid.
Hospitals enable the spread of disease by clustering all the sick together with a constant flow of healthy people (visitors, doctors, etc). They also enable the development of superbugs.
What's your point? You're confusing incidental usage with enablement. The lack of these good things wouldn't prevent the occurrence of the evils listed, and making the conversation one-sided about how much something exacerbates the evil, without focusing on the good and the reason it exists, is incredibly disingenuous (which is clearly the intent of most of the people on a soapbox about the evils).
Because the vast majority of people don't know that encryption is mathematical formulas. They only know what they see from Hollywood, which is that it's a thing that bad guys use.
That logic kind of makes sense. However, locks at least delay intruders and make it harder for someone to go undetected if they break into your house. The intruder is either at your door for a little while, makes noise trying to break in, etc.
If the police bust into my house there will probably be witnesses and I will know about it. If they steal my data, how is anyone going to know? This gives them way too much power.
How certain are you he said "bad"? If he had instead asked "is encryption problematic?", the outcome could be interpreted much differently, because there are problems with encryption. Especially in an ethics discussion, there are definitely pros and cons to encryption. (FWIW, the pros outweigh the cons by far, IMHO.)
I'm not trying to say your memory is wrong, I'm just asking how certain you are about what was asked exactly. Swapping one word for a synonym can make a big difference.
I have no idea what tack this prof took with the question, but I think you've drawn a hasty conclusion here.
That is an apt summary of my college ethics discussions. Even in cases where the professor was doing their best to encourage discussion, the majority of the class couldn't think beyond first order effects and hypotheticals would quickly get emotional and devolve into character attacks.
It's a bit like thinking all painters are in favor of graffiti, or all musicians are okay with "sampling" and remix culture.
This just makes my head explode. Because tech companies tend to be poor at privacy, let's use that logic to make it so the government can invade your privacy anytime they want?
A lot of smart people seemed to get on board that train without thinking much.
For example, a friend worked at a proprietary hedge fund of a major US bank that was looking into the private accounts of their customers to drive trading decisions. Other investors can't compete against that, stock market investment is no longer fair. And you've broken a fundamental component of the economy, a fair & transparent investment system.
* Pro-privacy claims that big techs share their user data with too many irrelevant entities.
* Pro-competition claims that big techs monopolize uses of their user data.
* Pro-regulation claims that big techs don't share their user data with accountable government entities.
Makes me think of the Israeli voter rolls hack.
A sci-fi novel I read, "A Deepness in the Sky", described how the pads themselves were a valuable item of trade. I don't think it's far-fetched to imagine purchasing OTP data to use with internet browsing, the way we buy YubiKeys to use with passwords. It would be a far simpler encryption scheme than those we currently use, and that simplicity would make it easier to catch bugs like Heartbleed in the future.
How hard would it be to extend the ssl protocol to allow the use of OTP pads, where available?
That sounds like a beyond ridiculous system. We can do better.
So your mobile phone for example would need to have local storage the same size as all the data you ever expected it to send/receive.
OTP is not a very complicated scheme. All you need is a good source of entropy, a place to store a big fat array of it all, some XOR operations, and a safe way to hand the codebook to your trusted parties (e.g. phone-to-phone transfer options).
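A minimal sketch of that scheme in Python (hypothetical helper names; entropy from the standard `secrets` CSPRNG, with the caveat that each slice of pad must be used exactly once):

```python
import secrets

def generate_pad(length: int) -> bytes:
    """Generate one-time pad material from a cryptographically secure RNG."""
    return secrets.token_bytes(length)

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    """XOR a message against a never-reused slice of the pad."""
    if len(pad) < len(data):
        raise ValueError("pad exhausted: OTP needs at least as much pad as message")
    return bytes(d ^ p for d, p in zip(data, pad))

pad = generate_pad(64)
msg = b"meet at the usual place"
ct = xor_bytes(msg, pad)           # encrypt
pt = xor_bytes(ct, pad)            # decrypt: XOR is its own inverse
assert pt == msg
```

The hard part, as the comment says, is not the math but the codebook logistics: distributing the pad securely and guaranteeing no byte is ever reused.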
First off, citing yourself isn't an elaboration. Second, if you are arguing that one-time pads don't work / won't work in modern times, that goes against our entire understanding of certain bits of cryptography.
You'll need to cite some real sources first.
> A one-time pad XORed with a message doesn't provide integrity, and therefore can't reliably secure an arbitrary protocol
The point is that sure, one-time pads "work" the way they're described. But their properties don't actually fulfill what we require from modern cryptosystems. Note the sibling comments talking about augmenting them with half-baked authentication schemes, which are all in the realm of computational-complexity cryptography, and no longer "mathematically unbreakable!!1!"
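The integrity gap is easy to demonstrate: a bare OTP is malleable, so an attacker who knows (or guesses) part of the plaintext can rewrite it by flipping ciphertext bits, with no knowledge of the pad. A sketch (message contents are mine):

```python
import secrets

msg = b"pay alice 100"
pad = secrets.token_bytes(len(msg))
ct = bytes(m ^ p for m, p in zip(msg, pad))

# The attacker knows the message format and wants "900" instead of "100".
# XORing position 10 of the ciphertext with ('1' ^ '9') rewrites that
# plaintext byte -- perfect secrecy, zero integrity.
tampered = bytearray(ct)
tampered[10] ^= ord("1") ^ ord("9")

decrypted = bytes(c ^ p for c, p in zip(tampered, pad))
assert decrypted == b"pay alice 900"
```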
Once we're past the point of users doing something beyond downloading an app from a corporate app store, all secure encryption would be back on the table.
Of course we can foresee a true ban on encryption some years out, but at that point XOR has the exact same signature as AES. Steganography is the corresponding approach for that attack.
Is there some hypothetical reason we can't just append the SHA256 of the message to the message before encrypting it? It should be impossible for an attacker to alter any message bits undetected with this scheme.
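The proposal is easy to sketch (hypothetical function names below). A blind bit-flip does get caught, because the attacker can't predict how the hash changes. But this is not a real MAC: an attacker who knows the entire plaintext can XOR in a chosen message-plus-hash pair wholesale, which is why the standard answer is a keyed construction like HMAC rather than a bare hash.

```python
import hashlib
import secrets

DIGEST_LEN = 32  # SHA-256 output size

def otp_encrypt_with_digest(msg: bytes, pad: bytes) -> bytes:
    """Append SHA-256(msg) to msg, then XOR the whole thing with the pad."""
    body = msg + hashlib.sha256(msg).digest()
    if len(pad) < len(body):
        raise ValueError("pad too short")
    return bytes(b ^ p for b, p in zip(body, pad))

def otp_decrypt_with_digest(ct: bytes, pad: bytes) -> bytes:
    """Decrypt and reject any message whose appended digest doesn't verify."""
    body = bytes(c ^ p for c, p in zip(ct, pad))
    msg, digest = body[:-DIGEST_LEN], body[-DIGEST_LEN:]
    if hashlib.sha256(msg).digest() != digest:
        raise ValueError("integrity check failed")
    return msg
```

Note that verifying now depends on SHA-256's collision resistance, i.e. computational-complexity assumptions, so the combined scheme loses the OTP's information-theoretic guarantee.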
A children's book would be a great demonstration of this, and should be protected speech (let's hope). Teach kids how to use OTPs in an illustrated book. Maybe the story ends with 5 year old Jimmy getting hauled away by the feds for doing some illegal math.
Just goes to show you how the average person thinks about these things. I have to admit I wouldn't like it if people wore masks in public in a way where I couldn't recall their faces. I don't think that's necessarily the same as encryption, but I guess I see the comparison.
There is no "only the government can remove the envelope, if they get permission". If we had the capability for that, we wouldn't be having the argument; the problem is people think this exists and it's technical company intransigence that is preventing it from being used.
Okay, but now the case you're trying to explain is "there are deep technological challenges and a wealth of historical precedent make me skeptical that we can correctly design or implement a solution that allows only the government or the intended recipient to decrypt the message, instead of only the intended recipient".
To people who think "technology is magic".
And on the other side of the PR battle, the government is saying "Nah, we got this. And if you try and stop us, we're powerless against pedophiles and terrorists."
Adding enough nuance to make your position technically accurate also makes it abstract enough to be politically useless.
In the real world, resources are naturally constrained. It's usually impossible to read everyone's mail in real time and retroactively pull up the contents of a letter sent 3 years ago. This limitation vanishes with online communications. Encrypted messages can be stored indefinitely and later decrypted.
The super safe backdoor we build today could very easily be used by a tyrannical regime a decade from now to get dirt on everyone. We can dream up all sorts of technical solutions that allow for a backdoor, but make it really hard to abuse, but at the bottom, those solutions rely on the government obeying their own law.
And to pre-empt: "And no, we haven't discovered the maths to make it not work like that. Nothing we know about comes even close, and such mathematics might not even exist."
A slightly more abstract counter:
> The government can get a warrant to read your letters. The law can regulate that, and there's no way they can get a warrant to read everybody's letters; only suspects will have their letters read. But these are computers. A computer could just go through, unlocking and reading everybody's letters, and looking for — look, imagine a future, slightly more evil government decides that puppies are illegal. A computer could look through everyone's letters for anybody who'd ever been pro-puppy, and then fine them for puppy-supporting, and then anybody who continues to support puppies would be put in prison. I know that puppies are a stupid example, but imagine that the government really wants to start a war, or something, and you disagree, and the election's coming up, and you can't talk about maybe not starting that war.
> Okay, so you trust this government, and all future governments, not to do that? Do you trust every single person who's ever seen a warrant? Say you want to get a job, but you can't, because you maybe don't want to save the pandas while children are starving in [impoverished nation] and you said so in a message to a partner three years back and your employer does a quick background check with some company somewhere and it comes back "[person's name] does not like pandas". It's your dream job, and now they don't want to hire you because the HR person is a really big fan of pandas. I'm picking innocuous examples, but come on, you can think of more significant ones. (And those background check companies already exist – they literally Twitter-stalk people; do you really think they'd ignore private conversations if they had access to them?)
> Do you trust anybody who's ever seen a warrant with your bank details? Sure, those skeleton keys will be kept fairly secure, but it only takes one person to make a mistake and then every single message that's ever touched a computer, past, present and future, is public.
> Or they could just look at the lock and figure out what shape of skeleton key would fit it. That's how all of the other backdoors the government put in computer systems like this were spotted. And once you've spotted it, you can just decrypt everything almost instantly, because you're just getting the computers to do maths and computers are fast at maths.
It's difficult at this point to think that the DoJ is arguing in good faith.
I think people in the tech industry, and especially people working on security or privacy, have known this for over a decade. Unfortunately, the public doesn't pay enough attention to the issue. And why would they? They have enough things to worry about, why pay attention to one of the things that is currently working?
It's unfortunate that we're not that good at PR.
Substitute guns instead of encryption and see how that fits.
There shouldn't be a backlash against tech companies specifically, but really against companies that are overly large and use and abuse their power and position in ways that benefit shareholders but are severely detrimental to the common good. Such as limiting competition, influencing and manipulating legislation, ensuring people have no other options and then raising prices exponentially, selling your data to the highest bidder, etc.
How is the DoJ going to force Signal or even Telegram to add a back door?
I'm old enough to remember the tail end of this and it was absolutely absurd, yet still very real. I remember Zimmermann being investigated by the feds. Zimmermann was smart to tie publishing PGP to the First Amendment. Something like Signal in those days may well have resulted in charges and a conviction.
Never underestimate the potential for folly when it comes to government regulation.
Many of the arguments I've seen here defending encryption are basically the same as those defending guns.
It also should be noted that while Telegram markets itself as the safest and most protected messenger, its regular chats are, in fact, not encrypted at all and are way less secure than WhatsApp's. It's actually baffling: he bashes WhatsApp with such audacity, saying in one sentence that 'if your phone dies, you can't access your messages on another device', and then continues preaching how cool it is to use encryption. BUT IF ONE USES END-TO-END ENCRYPTION, ONE CAN'T ACCESS MESSAGES ON ANOTHER DEVICE, BY DEFINITION!!!!!
It never ceases to amaze me that people bought into this. A comfortable to use app and a really good promise of security turns out to be better than real proper security. ¯\_(ツ)_/¯
You can encrypt each message separately for each device, that's still end-to-end encryption.
Of course, they tell tales on how deeply it is all encrypted on T's servers, and that only select few trusted engineers have keys, but those are just tall tales.
Hypothetically: they could easily force the Apple and Google stores to stop carrying it, or even to prevent its installation using the same mechanisms used to scan for malware. Do not underestimate the means by which use of real encryption could be made inconvenient.
I'd like to think this is difficult with Signal's model, though.
The sane response to corporate totalitarianism is most certainly not government totalitarianism. Sadly, with how the two political sales teams frame a false division merely over different flavors of authoritarianism, this has a good chance of working.
As always, the true answer is trustable software running under the control of users ourselves. Unfortunately, we will have to see how bad things get before most people are driven away from all of these centralized attractive nuisances.
This feels to me like one of the typical debates where people are shooting at each other but nobody understands what they really are talking about.
Virtually any proposal that the government might offer will be a key escrow. We already know it's bad and why it's bad, because it's been debated for decades. I don't think there are any more novel solutions that should revitalize the argument.
The problem isn't that there aren't enough oversight mechanisms or checks to prevent abuse. The problem is that the design has unfixable security defects.
By all means, fully listen to proposals, in order to understand the misconceptions and better address them. Or, in some cases, it's useful to listen to proposals to anticipate upcoming security threats and build a more robust security model. For instance, the potential threat of compromising CAs led to the invention of Certificate Transparency, which almost completely eliminates that possibility.
We also need to get a lot better at rhetoric and PR.
That line is not OK. If somebody wants something and you say you are not able to do it then it should be on them to prove you wrong.
Reminds me of work BS. You say “it can’t be done in the timeframe”. VP response: “you are just not working hard enough”. It’s a nice way to manipulate the conversation.
Technologists definitely should get better at messaging and arguing with bad faith actors.