
DOJ: Strong encryption that we don’t have access to is “unreasonable” - nimbius
https://arstechnica.com/tech-policy/2017/11/doj-strong-encryption-that-we-dont-have-access-to-is-unreasonable/
======
arca_vorago
This subject always reminds me of a talk I heard Eben Moglen give, where he
said:

"In 1995, there was a debate at Harvard Law School – four of us discussing the
future of public key encryption and its control. I was on the side, I suppose,
of freedom. It’s where I try to be. With me at that debate was a man called
Daniel Weitzner who now works in the White House making Internet policy for
the Obama administration.

On the other side was the then Deputy Attorney General of the United States
and a lawyer in private practice named Stewart Baker who had been chief
counsel to the National Security Agency, our listeners, and who was then in
private life helping businesses to deal with the listeners. He then became,
later on, the deputy for policy planning in the Department of Homeland
Security in the United States and has much to do with what happened in our
network after 2001.

At any rate, the four of us spent two pleasant hours debating the right to
encrypt and at the end there was a little dinner party at the Harvard faculty
club, and at the end, after all the food had been taken away and just the port
and the walnuts were left on the table, Stuart said, “All right, among us now
that we are all in private, just us girls, I’ll let our hair down.”

He didn’t have much hair even then, but he let it down.

“We are not going to prosecute your client, Mr. Zimmermann," he said. “Public
key encryption will become available. We fought a long, losing battle against
it, but it was just a delaying tactic.” And then he looked around the room and
he said, ”But nobody cares about anonymity, do they?"

And a cold chill went up my spine and I thought, all right, Stuart, and now I
know you’re going to spend the next twenty years trying to eliminate anonymity
in human society and I am going to try to stop you and we’ll see how it goes.

And it’s going badly."

[https://www.softwarefreedom.org/events/2012/Moglen-
rePublica...](https://www.softwarefreedom.org/events/2012/Moglen-rePublica-
Berlin/transcript.html)

~~~
phkahler
It's odd, because people's desire for anonymity is preventing their use of good
encryption. A solid online identity and a reliable way to get data to you
without third parties would be a good foundation for exchanging keys and
encrypting data between parties. I'm talking fixed IP addresses as a starting
point. We have a whole infrastructure to make it easy to interact with public
web sites (DNS, etc.), but it's very hard to find and send a packet to your
friend. Any effort to change this would be seen by a lot of people as an
attack on anonymity.

I don't want make-believe anonymity; I want to know where data is going and
where it's coming from. Once I have that, I can encrypt for privacy, and web
sites can strip identifying information if they want to provide an anonymous
forum.

~~~
justinjlynn
Aside from widely broadcast one-time-pad encrypted/authenticated messages
(which provide anonymity for the receiver, not the sender -- though that can
be bootstrapped), deterministic rotating asymmetric key pairs (say, a non-
compromised-curve ECC scheme using key families), coupled with a relatively-
high non-deterministic-latency mixnet delivery system using onion/garlic
routing between intermix peers, go a long way towards achieving data security,
authentication and metadata anonymity for all involved parties. Discovery and
routing in such a system is a very complicated problem -- and one yet to be
solved; though generally, if one is going to such lengths, one has a
rendezvous system already in place.
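
Since the one-time pad anchors this threat model, a minimal sketch of the
primitive itself may help (function names are mine; the scheme's entire
security rests on the pad being truly random, at least as long as the message,
and never reused):

```python
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR the message against a one-time pad of at least equal length."""
    if len(pad) < len(message):
        raise ValueError("pad must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(message, pad))

otp_decrypt = otp_encrypt  # XOR is its own inverse

pad = secrets.token_bytes(32)   # truly random, used once, never reused
ciphertext = otp_encrypt(b"meet at dawn", pad)
assert otp_decrypt(ciphertext, pad) == b"meet at dawn"
```

The hard part, as the comment notes, is not this XOR but distributing the pad
ahead of time over a channel the listeners can't see.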

The primary problem is that it's expensive for users -- expensive in terms of
time, cost, and complexity -- which means usage is low. Consequently, it also
makes people targets simply for using the system, as the number of those using
the system is relatively small -- something one can't really get away from
easily unless one makes it look like one isn't using the technology. To do
that, one then has to go through the effort of creating cover traffic -- and
creating _consistently good_ cover traffic (good enough to fool a human
analyst, because one is one of thousands, not one of millions or thousands of
millions) is immensely difficult, and techniques change over time with local
conditions, so it's _hard_ to automate. Life gets really, really difficult
when survival depends not only on keeping people out but also on keeping them
from knowing anything of interest is there in the first place!

Don't forget about stylometry and inadvertent signatures; never communicate in
real time, avoid absolutely everything but plain text data if at all possible,
write _very_ plainly (say, using only the first few thousand most common words
in your language) and use stylometry-defeating tools (for example, Anonymouth --
though I haven't audited it; others have, but it's still just a thin layer to
apply to other work) to prevent others from creating signatures based on the
words you use and how you use them. _NEVER_ forget that binary data you send
might contain metadata fields to give you away (version numbers, encoding
settings -- all seemingly innocuous but possibly unique to you!). Sending
images? Make absolutely certain your camera doesn't have sensor glitches that
can create a signature. ( see
[https://www.schneier.com/blog/archives/2006/04/digital_camer...](https://www.schneier.com/blog/archives/2006/04/digital_cameras.html)
) Don't forget your surroundings either ( NSFW Language/Topic (4Chan helps
track down targets) but incredibly illustrative -
[https://i.imgur.com/nLCklgZ.jpg](https://i.imgur.com/nLCklgZ.jpg) ). Sending
scans of documents? Most new printers print identifying patterns using
steganographic techniques. ( see
[https://www.eff.org/issues/printers](https://www.eff.org/issues/printers) ).
Stop and meet someone and both of you brought your mobiles? The fact that your
phones traveled together, and where you went, is recorded (CO-TRAVELLER; see
[https://www.washingtonpost.com/world/national-
security/nsa-t...](https://www.washingtonpost.com/world/national-security/nsa-
tracking-cellphone-locations-worldwide-snowden-documents-
show/2013/12/04/5492873a-5cf2-11e3-bc56-c6ca94801fac_story.html) ). The world
is _INCREDIBLY hostile_ to anonymity seekers.

Relying on third parties to strip data isn't a workable anonymity solution,
because you can't trust them to do so correctly, or at all. On top of that,
with pervasive internet monitoring (which, thanks to Snowden, we know is
real), the mere fact that you've communicated with someone, or with a site
storing your data, is recorded somewhere it cannot be wiped by any party
authorised to participate in your conversation. Generally, if you're not
anonymous to the person you're communicating with -- anonymous, that is, until
you choose to identify yourself, during communications whose contents may
later be repudiated (say, an olm/axolotl ratchet) by publishing the private
key once it has expired and is no longer good for future authentication --
then you won't be anonymous to any party, period.

It's a trust-no-one, verify-everything type of situation; people don't deal
well with that. Pervasive encryption is only the _first and easiest_ step. If
you want true (or even just reasonable) anonymity, things get very expensive,
very quickly.

------
c3534l
The government has demonstrated that they will abuse every power given to
them, and even those that weren't. I would not entrust every aspect of my
personal information to the very same organizations that indefinitely detain
people, including American citizens, without access to a lawyer while
committing acts of torture; that said the Patriot Act could never be used for
domestic surveillance; that lied about being unable to unlock the phone of the
last guy they tried this with in order to get the law changed; and that
continue to engage in parallel construction, torture by proxy, and
extraordinary rendition. And now they're saying that we should trust them to
stop those bad guys once again. And the most chilling aspect of this request,
despite its inherent absurdity, unenforceability, and threat to freedom and
privacy, is that they have a very good chance of winning that power.

~~~
ben_w
> The government has demonstrated that they will abuse every power given to
> them, and even those that weren't

I think this mixes up what is true and what people (myself included) wish was
true.

Governments don’t have power given to them. Their default state is God-Kings
ruling on personal whims.

Governments have power _taken from them_, either by corporations, or by
religions, or by other governments -- sometimes these groups even call
themselves “the people” -- but the restrictions are not stable equilibria;
they are constantly fought against on all sides.

~~~
mmjaa
Governments always gain their power from the governed, and nowhere else. This
fact is always resisted by those who know it because it a) makes the people
responsible for the actions of their government and b) makes the government
responsible for their people.

For as long as we ignore this fact, we'll get corrupt governments. Alas, it's
also a key reason that governments are corrupt - a government is only as
ethical as the people it governs.

------
unabst
The argument here is extremely simple.

Encryption is the only way to secure information. This is true for criminals
and non-criminals alike. To deny encryption is to deny security to everyone.

Presuming it's the criminals who will look to exploit these vulnerabilities,
denying security is making every non-criminal susceptible to attack.

So the only question that needs to be answered is this. Do we want to protect
our citizens? The only answer is yes. The only solution is encryption.

The problem with outlawing strong encryption is that it already exists. Even
if strong encryption were made illegal, criminals would still be the ones
securing their data despite the law. Denying citizens the right to protect
themselves just puts them all at risk. It's disarmament.

Under the law, all the police should need is a warrant. It's not even an
exception to any rule.

~~~
bradknowles
But how do you design a strong encryption algorithm that can be trivially
unlocked once a warrant is provided?

Answer: you can’t.

~~~
saas_co_de
> Answer: you can’t.

No, it is incredibly simple. You have a master key and the government holds
this key on a secure audited system which can only be used to unlock a device
once a court order is granted.

The government's security for the master key will certainly be much better
than the average user's password security so this will not decrease the
average user's security in the least.

You would also make the master keys expire regularly (maybe daily) so as long
as a user updates their phone they will get updated with the new keys to
protect against a leaked key.
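
The proposal above has the shape of envelope encryption with a second wrap
under an escrow key. Below is a toy sketch of that structure (all names are
mine, XOR stands in for a real cipher; this illustrates the data flow, not any
actual proposal's design): the data key is wrapped once for the user and once
for the escrow holder, so either key recovers the data.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy stream 'cipher': XOR against a repeating key. NOT secure; illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The device generates a random data-encryption key (DEK) and encrypts with it.
dek = secrets.token_bytes(16)
ciphertext = xor(b"user data", dek)

# Envelope step: the DEK is stored twice, wrapped under the user's key and
# under the government's master (escrow) key on the "secure audited system".
user_key = secrets.token_bytes(16)
escrow_key = secrets.token_bytes(16)
wrapped_for_user = xor(dek, user_key)
wrapped_for_escrow = xor(dek, escrow_key)

# Either key holder can unwrap the DEK and read the data...
assert xor(ciphertext, xor(wrapped_for_user, user_key)) == b"user data"
assert xor(ciphertext, xor(wrapped_for_escrow, escrow_key)) == b"user data"
# ...which is also why one leaked escrow key opens every device wrapped with it.
```

Note that rotating the escrow key daily, as suggested, only narrows the
window: every wrapped DEK produced while a given escrow key was live stays
readable to whoever obtains that key.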

~~~
Sacho
> You would also make the master keys expire regularly (maybe daily) so as
> long as a user updates their phone they will get updated with the new keys
> to protect against a leaked key.

What would the logistics of this be? Would the government need to store all
master keys to be able to decrypt an old message? How would you know you're
using the right key to decrypt a message? What happens if all the old keys
leak?

What about foreign communications? You can't compel foreign actors to encrypt
with your algorithm. What if I'm storing foreign data which is encrypted with
illegal algorithms, is that going to be illegal? If so, then goodbye hosting
services in the US. If not, how are you going to differentiate between foreign
data and local data?

What about the transition period? What do you do with legacy encryption? What
about people who haven't received the newly updated government-sanctioned
encryption yet? What about old devices that can't run your encryption
algorithm, closed systems, etc?

I don't think it's as incredibly simple as you put it.

~~~
saas_co_de
We are talking about different things. I was talking about allowing access to
encrypted data on devices, which is the main issue that law enforcement has
been complaining about. You seem to be talking about a backdoor for all crypto
everywhere, which is very different.

------
djweitzner
Brings back memories. I'm the Daniel Weitzner mentioned by Eben Moglen. The
debate was actually billed as being about the future of privacy in the
digital age. The moderator was Arthur Miller, a truly distinguished law
professor at Harvard who wrote about privacy in the 1970s. Despite the broad
focus on privacy in the title, the discussion ended up being all about
encryption technology and policy. After 45 minutes or so of arguing about
encryption, key escrow and the Clipper Chip, Arthur said in his trademark
stentorian voice, "This was supposed to be a debate about privacy and all I've
heard about is ENCRYPTION!" Despite this, we continued to talk about nothing
but cryptography, as if its availability or lack thereof was the only question
that mattered for privacy.

The Dept of Justice official referred to was Deputy Attorney General Jamie
Gorelick, who is now a partner at a big DC law firm WilmerHale, where she
represents Jared Kushner and Ivanka Trump.

See our paper, Keys under doormats: mandating insecurity by requiring
government access to all data and communications, for more.
[https://doi.org/10.1093/cybsec/tyv009](https://doi.org/10.1093/cybsec/tyv009)

------
gumby
It's one thing to say "reasonable minds can disagree"; I can empathize with
the FBI's position, though I think they are fundamentally wrong.

However two things are striking about this speech, and similar recent (over
the past few years) ones in the US and UK at least:

1 - Rosenstein is no dummy, so he must be perfectly aware of the doublespeak
in his statement that they don't want to make things easier for criminals, yet
companies must provide accommodation for alleged non-criminals. The pre-GWOT
NSA took information assurance seriously and, at least in some cases, made
encryption stronger for everyone (consider the DES S-boxes) even (perhaps) at
the expense of the NSA's SIGINT efforts. I don't know if the Information
Assurance Directorate is even staffed any more.

2 - The propaganda is at its most flagrant with the Sutherland Springs
shooter: Apple reportedly _offered its help_ but the FBI ignored that until
the 48 hours had passed to lock the phone, and _then_ excoriated Apple.
Meretricious malpractice, as far as I am concerned.

Neither of which makes me in any way supportive of the FBI's position, no
matter what actual merit might lie in it.

------
brian-armstrong
Here's my biggest complaint with this debate - people are confusing literal
with metaphorical. They make the analogy of the unbreakable safe. Encryption
isn't that. You can still recover the physical phone and all of the storage
chips on it.

That the patterns of bits in the chips make up some unrecognizable utterance
is seemingly immaterial. I could write gibberish in my journal at home if I
wanted to, and I think we'd all agree it would be ridiculous for the FBI to
run around screaming about unbreakable ink.

~~~
Spivak
The other side of this is that enshrining encryption as something that police
can't compel you to help with just creates a huge loophole for hiding
incriminating documents. You can go to jail for destroying evidence, why would
encrypting the data and refusing to provide the password or deleting the key
be any different?

~~~
JoshTriplett
> You can go to jail for destroying evidence, why would encrypting the data
> and refusing to provide the password or deleting the key be any different?

Specifically encrypting incriminating data _after_ you have evidence of a
crime in an effort to cover it up _should_ be treated as the equivalent of
shredding documents. (Assuming, of course, they can prove it, just as they
have to prove that you had the documents in question prior to destroying them
in order to prosecute you for destroying evidence.)

That's not in any way the same thing as having your data encrypted and
refusing to decrypt it.

~~~
Spivak
I don't really think the timeline is meaningful in this case. Having a rule
where people cannot be made to decrypt files is just legalizing document
shredding with an extra step.

To avoid cases where people legitimately forgot their passwords just assume
that the police have video evidence of you unlocking the files just before you
were arrested. You know the passphrase and the police could prove it beyond
reasonable doubt in court.

You just start with your files encrypted with a strong passphrase and refuse
to provide it when you get caught. This is different than routine shredding
because the moment when they become inaccessible is when you refuse, not the
moment you encrypted them.

If they were instead physical documents buried somewhere hidden where the
police could not possibly find them without your help the court still has the
ability to hold you in contempt if you don't produce them. What makes the
secret knowledge of their location any different than the secret knowledge of
the password?

~~~
AlexCoventry
Encryption of the file should be treated as a separate step from deletion of
the plain text. The latter is destruction of evidence in the case of a crime.

~~~
ohtwenty
Are you arguing you should legally have to keep a plain-text copy of anything
you encrypt? I mean, I get your train of thought, but that seems to be the
conclusion.

~~~
AlexCoventry
No, only that destruction of evidence could pertain to deleting the plaintext
of crime-related documents.

------
bo1024
It's hard to know where to even begin in arguing against this.

There's the freedom/privacy argument, but I guess this is debatable depending
on if you view computer files as an extension of your ideas/knowledge, or an
extension of your physical possessions.

Someone brought up the entire "risk of overreach and abuse" argument.

There's also the likelihood of any tools the government has being leaked and
used by bad actors (as we have seen too much recently).

Oh, and the "it's technologically impossible" argument, which should be the
only one you need -- but they refuse to hear that. (Are there some supposed
experts who are telling the DOJ this can be done?)

~~~
molszanski
Consider these two points as a start:

\- They solved crimes before iPhones. Encryption is not a roadblock

\- In many countries guns are illegal. Yet criminals do own them. If
encryption becomes illegal, criminals would still use it

~~~
draugadrotten
> \- In many countries guns are illegal. Yet criminals do own them. If
> encryption becomes illegal, criminals would still use it

Making guns illegal doesn't stop criminals from using them, but it does make
it possible to jail someone _only_ because they had a gun.

Outlawing encryption won't stop criminals from using encryption (just look at
China), but it does make it possible to jail dissidents _only_ because they
were using encryption.

Surveillance would be easier if encryption were illegal, but surveillance
would also be easier if everyone were obligated to wear a GPS tracker. "Making
surveillance easier" is not sufficient to argue that something should be
implemented. There need to be checks and balances, and when only the
government can keep secrets (and it will), there are none left.

~~~
molszanski
Very good point!

------
nickwanninger
Gotta love the fact that the EU seems to think the exact opposite

[https://www.theguardian.com/technology/2017/jun/19/eu-
outlaw...](https://www.theguardian.com/technology/2017/jun/19/eu-outlaw-
backdoors-new-data-privacy-proposals-uk-government-encrypted-communications-
whatsapp)

~~~
Freak_NL
It's not as simple as that. In the EU there are plenty of political actors
(like the director of the Dutch intelligence service, to name just one) that
are quite vocal about abolishing strong end-to-end encryption, just as there
are political actors in the US that wish to grant citizens the freedom to use
strong encryption unencumbered.

~~~
sametmax
Which is funny when you think they invented Enigma in the first place.

~~~
Freak_NL
The Enigma machine was invented by the Germans, not the Dutch.

~~~
sametmax
Sorry, the Deutsche Mark being the German currency before the euro, I still
confuse them in my head.

------
gxs
When I was in college, there was a retired guy that would come in to tutor
students as a way to stay busy.

He was your stereotypical brainiac type dude - quiet, lanky, glasses, soft
spoken, and razor sharp. He must have been in his 50s, but you would think he
had just completed upper division math, chemistry, and physics "last semester"
with perfect grades to boot.

He told me a story about how once while in his Masters program, one of his
colleagues figured out how to do some cool stuff with unenriched uranium.

Almost like out of a movie, he said, the government stepped in and made sure
he did not publish his research.

I wonder if we'll see that type of stuff happen with cryptography or if it's
already happening. I wouldn't be surprised if these theatrics were just to
maintain the illusion that they don't have access to stuff.

~~~
AstralStorm
And then the Chinese figure it out for a third of the price and have a
monopoly on nuclear reactors.

Remember the time when organisations like DARPA actually promoted public
research of this magnitude? Or when DoD was making certain research public?

------
olliej
I wish this could be hammered into the thick heads of congress: there is
secure, and there is insecure. There is not a gradient.

~~~
SomeStupidPoint
That's... just not true.

And that kind of misrepresentation just weakens the arguments for strong
encryption, because intelligent people will see them as pretty transparent
misrepresentations. Have you considered that's why the arguments for strong
encryption aren't going well -- that we're not actually engaging with
intelligent people trying to understand the issue, we're chanting trite,
shallow inaccuracies?

I mean -- "there is not a gradient"? ...what do you call changing key size?

Ed:

I'd like the people downvoting to explain how changing the keysize isn't a
gradient of security. (Hint: You can't, because it _is_.)

~~~
olliej
I had another comment, but in response to your Ed: comment:

key size is not a measure of security. It is a measure of how /long/ we intend
the key to be secure.

More explicitly: key size does not exist on the gradient of protocol security.
We know how long a key takes to break given current technology and algorithms.
We choose a key size to render the time to break infeasible against our
prediction of the state of the art some amount of time in the future. If
there's a gradient, the gradient isn't "how secure it is", it's "how long it
will remain secure".

Hence any policy that endeavours to control the "strength" of encryption
through controls over key length is /necessarily/ requiring an insecure key
size.

It can be put this simply: how small must the key be to allow it to be "good
enough" for the DoJ? Would they accept five continuous years on 10,000 GPUs?
Noting, of course, that in 18-24 months that key size will only require 2.5
years, then 1.25 years, then 7 months, then 3 months...

Of course I'm sure 5 years and millions of dollars will be "unreasonable", so
it would need to take less time, and cost less.
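
The decay schedule in that last paragraph is straightforward to compute. A
sketch (helper names are mine; the doubling interval is the commenter's rough
18-24 month figure, not an established constant):

```python
def years_to_exhaust(key_bits: int, guesses_per_second: float) -> float:
    """Expected years to brute-force a key: on average, half the keyspace."""
    seconds = 2 ** (key_bits - 1) / guesses_per_second
    return seconds / (365.25 * 24 * 3600)

def years_after_doublings(initial_years: float, doublings: int) -> float:
    """Remaining time-to-break after n doublings of attacker capability."""
    return initial_years / 2 ** doublings

# A 64-bit key already falls in under a year to a 10^12 guesses/sec attacker.
assert years_to_exhaust(64, 1e12) < 1

# A key sized to take ~5 years today, on an 18-month doubling cadence:
assert years_after_doublings(5.0, 1) == 2.5    # ~18 months from now
assert years_after_doublings(5.0, 2) == 1.25   # ~3 years from now
```

This is why a mandated "breakable in 5 years" key size is really a mandate for
a key that is cheap to break well within the lifetime of the data it protects.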

~~~
SomeStupidPoint
Sincere question: how do you define "how secure it is" except "how long it
will remain secure (under attack)"?

Edit:

You're also completely eliding that security is probabilistic -- they might
just guess our key on the first try. We can only discuss it as the expected
amount of computation to figure out our key on average. That expected amount
has a gradient along keysize.

~~~
olliej
A protocol is secure if, and only if, the fastest attack is an attack on the
key itself. All of the recent crypto breaks (those not caused by prior key-size
restrictions required by government agencies) have been protocol flaws, i.e.
flaws in the protocol allowed you to derive the key without having to explore
the entire key space.

Any way of deriving the plaintext other than attacking the key itself would
mean the protocol was insecure.

That said, I am coming to agree with you: in terms of trying to explain to
people who don't write crypto code, saying key size is a gradient of security
is probably the most sensible approach.

I still disagree with you on the actual statement :D

------
api
This has been the refrain for 35 years. Open source strong encryption is
everywhere now. The horse has left the barn, farm, county, state, and is
currently swimming the Pacific.

Police have other ways to fight crime. Eventually we may be forced to deploy
the ultimate weapon, namely correcting the social, economic, psychological,
and neurological factors that breed it in the first place.

------
rbcgerard
Perhaps the DOJ should take the first step - implement a proof of concept and
run all their computers, servers, and communications on it, with a trusted
third party holding the keys, say the house judiciary committee.

------
nathan_long
I understand law enforcement's frustration. However, this is quite simple.

Encryption relies on keys. Either the government has everyone's keys, or they
don't.

Any stockpile of encryption keys would give access to millions of people's and
businesses' data. _It would be a hacking target of inestimable value_,
targeted by criminal organizations and foreign governments using every
technique imaginable.

_It would be stolen._ Period. And every citizen and business would suffer
disastrous consequences.

We can't say this enough in this debate: _making everyone's keys accessible to
one entity means they absolutely will be stolen._ Whether we trust a
government entity's motives isn't even relevant. They do not and cannot have
perfect security, and that should end the debate.

If anyone doubts that the keys would be stolen, please see:

\-
[https://en.wikipedia.org/wiki/Office_of_Personnel_Management...](https://en.wikipedia.org/wiki/Office_of_Personnel_Management_data_breach)
\- [https://motherboard.vice.com/en_us/article/qkjkxv/fbi-
flash-...](https://motherboard.vice.com/en_us/article/qkjkxv/fbi-flash-alert-
hacking-group-has-had-access-to-us-govt-files-for-years)

~~~
bonestamp2
> It would be a hacking target of inestimable value, targeted by criminal
> organizations and foreign governments using every technique imaginable. It
> would be stolen. Period.

And we really don't have to look any further than the Equifax hack to prove
this. If they think that was bad, and it's clear that nearly all of them do,
then imagine it was more than just identity data... what if it was all your
actual data? If politicians knew that all of their email, their internet
history and all of their other secrets would get out if this happens, I wonder
if they'd change their minds.

~~~
op00to
Oh - politicians won't have _their_ data subject to the backdoors, only the
rest of us.

------
rayiner
I can’t overstate how important it is for Google, Apple, and Microsoft to keep
hammering at this. It’s working. The DOJ is running into this a lot now (e.g.
it can’t get into a drug dealer’s iPhone to see who he’s messaging).
Eventually, they’ll see that the ship has sailed and encryption is just
something they have to deal with.

~~~
cvwright
That's an incredibly optimistic take on this problem. Google, Microsoft, EFF,
et al need to win _every single battle_ or they lose the whole war.

The DOJ just needs to find one sympathetic test case, or one sufficiently
horrible incident to get the laws changed. Like the Patriot Act in the US, or
the new French surveillance law that passed after the Bataclan attacks.

It's the classic asymmetry problem that makes security so hard in general.
Only now, the party with all the money and power and time is _also_ the one
who only needs to win once.

------
rubyfan
What did the police do before there was the internet or phones?

~~~
gooseus
I imagine they spent a lot of time solving crimes, asking questions like:

"Where did they keep all their papers and correspondence?"

"Can somebody come break into this safe we have a warrant for?"

What did the cops do before X was invented?

They didn't worry about X being used to commit or cover up criminal activities
while continuing to try to do their job of keeping communities either safe or
oppressed, depending on how well they related to them.

I support strong crypto, but I think implying detectives and the DoJ are just
too lazy or dumb or whatever to deal with this problem is a little unfair.

Apologies if that wasn't the implication.

~~~
yorwba
Except in most circumstances, there wouldn't have been any papers, because
nobody was recording all their conversations. Nowadays, everything is
happening digitally and is stored indefinitely by default; and law enforcement
feels like they deserve access to all that.

The attempts at preventing ubiquitous encryption don't seem to be focused on
crimes where there would have been a paper trail if people still used paper;
they are focusing on reconstructing the last year or so of someone who is
either dead or uncooperative, all in hopes of finding _something_ they can
use.

If they were targeting organizations with a bureaucracy for some crime, I
think they could just demand access to all documents and get them? If I
remember correctly, in the Levandowski case Google's lawyers got access to a
huge trove of Uber's internal emails. If a private company can get that
access, it should be possible for law enforcement as well, no?

------
PenguinCoder
The wording of this brings up a worrisome point. What encryption methods does
the DOJ currently have access to? Why are they complaining about needing
access to this encryption now? Is it because other, previous encryption
methods are known to be broken, or because they already have access to that
data?

~~~
MichaelGG
Probably because so many communications are moving to encrypted by default,
HTTPS everywhere and WhatsApp for instance (as Brazil has found out). If the
big players decide to switch to E2E then governments would need to get them to
change their products. Better to head them off before it is too late.

~~~
0xfeba
This is exactly it. The Going Dark Problem:
[https://www.fbi.gov/services/operational-technology/going-
da...](https://www.fbi.gov/services/operational-technology/going-dark)

------
phkahler
You can't legislate reality. You can't legislate physics, or the value of pi,
or whether or not it's technically possible to create nuclear weapons. The
fact is that strong public key encryption is possible with a few lines of
code. No matter how hard they jump up and down, the universe is not going to
put that genie back in the bottle because someone writes a law or an executive
order.
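
"A few lines of code" is nearly literal for textbook RSA. The sketch below
uses tiny primes and no padding, so it is hopelessly insecure as written (real
deployments need 2048-bit moduli and proper padding), but the whole
keygen/encrypt/decrypt cycle fits in a handful of lines:

```python
# Textbook RSA with tiny primes: an insecure toy, but the math is complete.
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: modular inverse (Python 3.8+)

message = 42                 # any integer smaller than n
ciphertext = pow(message, e, n)
assert pow(ciphertext, d, n) == message
```

Scaling the primes up to real key sizes changes the security margin, not the
shape of the code, which is the point: the knowledge cannot be recalled.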

~~~
marcoperaza
I tend to think that encryption should not be regulated, but I don't agree
with your analysis either; it's too dismissive of a much more nuanced reality.

You can very easily make it so all consumer devices ship with software that
only uses encryption that the government has escrowed keys for. You can
require app stores to have the same requirement. The government requires all
sorts of things of people who manufacture products. Of course, they can't
regulate the 3d printer or CNC mill in your garage, but that doesn't make
regulating mass-produced products a futile effort.

Yes, some people will get around it. They will use open-source software that
they download and install themselves on devices that allow sideloading apps.
But no one in the government expects perfection out of this. They expect that
most criminals, most terrorists even, are not so sophisticated, not so
careful, that they will avoid being ensnared.

Remember that the situation they are trying to avoid is one where EVERYONE'S
texts and phone calls are by-default hopelessly inaccessible to the
government, even with a warrant (or in the case of foreign targets, even with
the most sophisticated HUMINT and SIGINT).

------
CyberDildonics
I think people not having access to the details of their government is
unreasonable.

------
DanielBMarkham
_" law enforcement equities"_

What an odd phrase.

We have a criminal justice system in this country that is adversarial and _is
tilted in favor of the accused_. That's because our founders realized the
immense power of the state could easily overrun any person it wanted to unless
there were strict and tight guards on what they could do.

These lawyers, who presumably should know much more about all of this than I
do, continue to make cases that strike me as "There are bad people! Because
they are really bad, we need to change the game to give us more power."

But there have always been bad people. There always will be. There is no
stopping that fact. It is part of being human.

I wonder if these people realize that even if they continue to get their way,
the only thing they'll end up doing is moving the really bad people from the
private sector to the government. I get the feeling they slept through a large
part of world history.

I continue to hear arguments that sound reasonable. I continue to hear
wonderfully-intricate arguments. What I've yet to hear is any of these yahoos
recognize exactly what kinds of trade-offs they're pitching. I get the feeling
I'm watching very poor workmen, focused on the tiny job in front of them
instead of the ramifications of that job. I don't think we need to argue that
many of these people are wrong as much as we need to argue that many of these
people are incompetent. It doesn't bode well for the future.

------
nokcha
I'd argue that any backdoored encryption (which renders the plaintext
accessible to the government and to any other entity with access to the
backdoor key) is inherently irresponsible. It introduces a single point of
failure that will be routinely exposed in the course of ordinary criminal
investigations. If US technology companies rely on this encryption to protect
their trade secrets, then it's only a matter of time before China finds a way
to exfiltrate the backdoor key.

------
tim333
From a practical point of view, I'm not sure there are any criminal
activities that have succeeded because strong encryption was available. With
things like the 9/11 attacks, it didn't seem to make any difference. On the
other hand, encryption is very useful for financial transactions and the
like. It seems the benefits considerably outweigh the problems.

------
alexandercrohde
If angels were to govern men, neither external nor internal controls on
government would be necessary. In framing a government which is to be
administered by men over men, the great difficulty lies in this: you must
first enable the government to control the governed; and in the next place
oblige it to control itself.

\-- James Madison

------
jarym
Governments are the next on the Silicon Valley ‘disrupt’ list.

The music industry had to be dragged kicking and screaming away from their
physical distribution model.

Governments will have to be dragged similarly until they accept encrypted
content is something they have the same right of access to as private
thoughts.

------
hinkley
The reasonable man adapts himself to the world: the unreasonable one persists
in trying to adapt the world to himself. Therefore all progress depends on the
unreasonable man.

\- George Bernard Shaw

------
hprotagonist
encryption that the DOJ has access to, isn't.

------
cprayingmantis
I'm under the impression that encryption should be covered by the second
amendment as an 'arm'. Just like guns, encryption is there to protect us from
malicious parties. If only we could get the NRA or some other more ethical gun
club to endorse it as such.

------
tempodox
"Every situation we can't cheat our way out of is unfair."

------
oceanghost
These complaints have always felt like disinformation to me. I assume they
have larger capabilities than they let on; it's just that they can't prosecute
without revealing that.

------
Dowwie
The moment that the US government stops complaining about encryption is when
you will know that it has found a way through it.

~~~
dx034
That would be too transparent. Also, if they really crack popular encryption,
only a few people will know, to avoid any leaks. Maybe they'd complain even more
to give people the feeling that they cannot crack that particular method.

------
bonestamp2
Me: Strong encryption that they do have access to is "unreasonable".

Thankfully they work for us right?

------
hguhghuff
Is this crazy or not?

And if it's crazy, what's the best way to argue against it?

------
tzs
If they want to convince me to accept some kind of back door in my encryption,
they have to propose a system where it can be shown that it cannot be abused
by bad actors within my government, and where there are clearly stated public
rules about when it can be used.

It _is_ possible to design such a system, where the probability of abuse is
arbitrarily low [1], but I have a hard time imagining the current DOJ
proposing such a thing.

[1] key escrow with access controlled by a multilevel secret sharing system
that requires consensus among a diverse international group of shareholders to
release the key from escrow. The shareholder group is chosen so that it
includes a mix of public and private entities in a variety of jurisdictions,
including anonymous shareholders, so that no entity can acquire enough power
or influence to force a key to be revealed.
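The threshold scheme the footnote describes can be sketched with Shamir secret sharing over a prime field (my illustration, not the commenter's design): any k of n shareholders can jointly reconstruct the escrowed key, while fewer than k learn nothing. Parameters here are illustrative.

```python
# Shamir secret sharing sketch: split an escrowed key into n shares
# so that any k of them reconstruct it. Toy parameters, not a
# production escrow design.
import random

P = 2**127 - 1  # a Mersenne prime defining the field

def make_shares(secret, k, n):
    # Random polynomial of degree k-1 with constant term = secret;
    # each share is a point (x, f(x)) on the polynomial.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = random.randrange(P)            # the escrowed key
shares = make_shares(key, k=3, n=5)  # any 3 of 5 shareholders suffice
assert reconstruct(shares[:3]) == key
assert reconstruct(shares[2:5]) == key
```

Raising k and spreading the n shareholders across jurisdictions is the "proper choice of parameters" at issue; the math says nothing about whether the shareholders themselves can be coerced.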

~~~
kobeya
> key escrow with access controlled by a multilevel secret sharing system that
> requires consensus among a diverse international group of shareholders to
> release the key from escrow. The shareholder group is chosen so that it
> includes a mix of public and private entities in a variety of jurisdictions,
> including anonymous shareholders, so that no entity can acquire enough power
> or influence to force a key to be revealed.

Intelligence agencies will just steal the keys, and then individual actors
leak them onto the black market.

~~~
tzs
The probability of that can be made arbitrarily low by proper choice of
parameters for the secret sharing system, at least against realistic threats
over realistic timeframes.

~~~
heylook
What are some examples of such parameters?

~~~
justinjlynn
That's a good question. We should probably be cooperating to find out instead
of asking it to silence ideological opponents. It might be useful to know for
transient key secure multiparty computation.

