
The battle between Washington and Silicon Valley over encryption - jonbaer
http://passcode.csmonitor.com/cryptowars
======
adwf
Not mentioned in the article is how to deal with the encroachment of backdoors
worldwide. If we allow the US to have a backdoor, we've accepted the principle
of backdoors in general. What's stopping the UK from asking for its own set of
keys? Still OK? They're allies, after all.

Where does it get uncomfortable for people? France? India? Israel? Saudi
Arabia? China?

Once the principle is there for it to be done technologically, businesses will
be faced with the choice of allowing multiple backdoors or not doing business
in those countries at all.

And if you do accept all these backdoors, now you've got dozens of potential
security holes. Not to mention you have absolutely no guarantee that any of
these countries are genuinely going to decrypt just for "national security".
Giving their native companies access to your corporate data would be a huge
boon.

~~~
colin_jack
That is actually mentioned in the article.

------
georgebarnett
It seems to me that the inevitable response to back doors will be similar to
what has happened following the NSA leaks - sigint, sysadmins and devs will
update their threat model, excluding the known defective components from the
system. In trying to gain deeper access, I believe the government will only
lock themselves out further.

Either that or it's a noisy ruse. Make lots of noise about how you're going to
backdoor things. Have a great debate and declare it a dead idea, then do it
secretly.

~~~
marktangotango
This is a really good article, balanced and well informed. Is this typical for
the csmonitor?

>> Have a great debate and declare it a dead idea, then do it secretly.

Or, putting on my tinfoil hat, they already have means to break arbitrary
encryption and they just want back doors (public or in secret) to provide
plausible deniability of that fact. An important aspect of having an
exceptional capability is not revealing that you have it.

~~~
surge
csmonitor puts out surprisingly good, well-researched articles. Compared to
other online outlets, their pieces aren't rushed to press; they're in-depth,
well-researched work that you simply don't find on many other sites (except
maybe Vice or The Atlantic on occasion).

------
mindslight
Rekindling this government desire was an easily foreseeable result of
centralization. "Web 2.0" companies are inherently surveillance companies, and
their motives are directly aligned with government. They are putting on a good
show to avoid alienating their target base. But even if it were in their
interests to adopt end-to-end encryption, they are in no position to stand up
to the government the way a bunch of distributed software users would be.

We won the war in the 90's, just to have to fight it all over again because
lazy people adopted webcrapps.

------
joesmo
Anyone that buys the argument that we need back doors for encryption to stop
crime is an idiot. Crimes were solved before cell phones even existed, proving
that backdoors into software are not needed to solve crimes. I'm sure the
administration knows better, but they also know how weak most Americans' minds
are when it comes to understanding the consequences of national policies. The
fact that this is a debate is ridiculous, but not unexpected in the US.

Damn straight it will hurt tech companies, as it should. They should go out of
business until they move or their government stops fucking them over, ideally
the latter, and the government should be held accountable. We're not talking
about mom 'n pop shops here, we're talking about some of the richest companies
on earth. They can lobby for the government to subsidize their losses.
They can lobby directly against any legislation proposed. If they're
unsuccessful, they pretty much have only themselves to blame, after all their
money is what moves politicians, not the other way around.

~~~
rayiner
> Anyone that buys the argument that we need back doors for encryption to stop
> crime is an idiot. Crimes were solved before cell phones even existed,
> proving that backdoors into software are not needed to solve crimes.

I don't support requiring back-doors, but it's pretty ironic that you call
people idiots then immediately follow that with a totally fallacious
assertion. Law enforcement didn't have back-doors into cell phones before
those existed, but criminals also didn't have secure ways to communicate over
distance in real time. We can't conclude anything from the fact that crimes
were still solved before cell phones existed.

The other way to look at it is to ask: out of the universe of ways for people
to communicate, what can the government get at with a warrant? Historically,
the answer was: nearly everything. For most of the last century, there was no
way to communicate over distance that the government couldn't intercept. The
only real-time way to communicate over distance was phone, and they've always
had a "back door" to the PSTN through wire taps. Think about it: why do mob
bosses in old movies take the risk of meeting at shady restaurants, instead of
just dialing into a conference number? Because their phones are all tapped!

Law enforcement's position is that back doors just maintain that long-standing
status quo. It may be a bad idea because security should outweigh the needs of
law enforcement, but it's not an irrational argument.

~~~
Zigurd
> _but it's not an irrational argument_

Actually, irrationality is at the heart of it. As with global warming denial,
the argument _for_ back doors is a denialist argument against all evidence and
against all expert advice.

Moreover, the pro-back-door argument isn't trying to defend a law-enforcement
status quo; it's trying to defend a program of illegal mass surveillance,
because that is where pervasive strong encryption would have the largest
impact. Fuck these people and the horse they rode in on.

~~~
jerf
"Actually, irrationality is at the heart of it. As with global warming denial,
the argument _for_ back doors is a denialist argument against all evidence and
against all expert advice."

It would be a lot easier if that were true, but it's not.

Examine all the reasons why backdoors in encryption are bad. Then ask
yourself: which of those bother the government that wants to do surveillance?

None of them.

When the adversary pretty much wants everybody to conduct all their business
in plain text, and if that's bad for them, too bad so sad, pointing out that
we can't very well _partially_ weaken encryption is a non-argument to them.

The only thing that slightly bothers them is that ultimately, government power
is significantly based upon American businesses being successful (they provide
the tax base, directly and indirectly), and if the encryption weakens American
businesses it therefore weakens their own power. That's why they're even
slightly listening, and why their line to date has been variants of "Can't you
just try harder?" From their point of view, the only thing that has to happen
is that customers of American businesses need to be _convinced_ that they are
secure; real security is to them a fully separable concern only loosely
related to perceptions.

I personally would observe that over time, perception of security and reality
of security would tend to synchronize, but one thing you must understand about
government or it will forever be mysterious to you is that it is full of
people who are used to _creating_ perceptions, wholesale, with varying and
often small relationships to the truth. Watch the next high-level policy
debate that happens on CNN and Fox, and yes, I'm deliberately picking mass
media because it's actually where it is clearest to see. Note how the debate
is almost _never_ about facts, real facts, provable and disprovable facts. It
is about manufacturing perceptions, and the other side trying to manufacture a
perception that fits into people's heads more easily and therefore wins out.
If you try to argue that actually weakening security will directly, via facts,
create a perception of weak security, you are almost literally saying words
that they cannot even represent in their brains.

Perceptions don't come from reality for them... perceptions are _created_, by
acts of will and clever spin. When we technologists say that people will
perceive our stuff as insecure and not buy it, the solution that leaps to
their mind is to use their usual tools to create the necessary perception,
then weaken the security under that cover, and to them, we sound like children
who believe that reality somehow creates perceptions independently and
reliably in people, which to them is snicker-worthy childish naivety.

And worst of all, they're being rational. _It works_. All the time. Even here
on HN, in a lot of the other perennial debates. It's very difficult to
rationally argue them out of this position because rather a lot of the facts
support their view.

A disturbing number of them accept the perceptions created as a whole as
reality even though they are witnesses to or authors of the creation of
perceptions themselves, so a disturbing number of these people on one level or
another essentially believe they are _creating reality_. (I find this
observation an incredibly important aspect of understanding how the very
highest level of politicians operate, as after decades in this world, most of
them fit this.) It's basically that observation about how when you read your
own field of expertise in the paper you see how crap the reporting is but
still believe the other bits of the paper, writ large. Others realize the
process consciously to one extent or another and are simply cynically
exploiting it.

Now, I believe the technologists are still correct, because for the most part,
security purchasers at the high end are fairly rationally evaluating products,
and real weaknesses will create perceptions of weakness, and probably little
to nothing the government perception-writers can do will change that. But it's
hard to explain to the government how this is one of those rare places where the
engineering and the math are just too hard and too rigorous, the connection to
reality too direct and too strong, for _their_ toolset to be of much use. They
encounter that situation so rarely they don't really believe it exists.

~~~
Zigurd
> _But it's hard to explain to the government how this is one of those rare
> places where the engineering and the math are just too hard and too
> rigorous, the connection to reality too direct and too strong, for their
> toolset to be of much use. They encounter that situation so rarely they
> don't really believe it exists._

While we're at "it's even worse than that...": Many of the more-thoughtful
proponents of back doors who might have a grasp of the impossibility of doing
what they say they want are die-hard Hobbesians who think that encryption
(or 3D printing, or cryptocurrency, or CRISPR, or whatever novelty the
government can't easily and tightly control) means the end of their beloved
Leviathan and a world of crowdsourced assassinations by bio-engineered insects
carrying 3D-printed titanium machine guns. It's a vision they'll make
frightening to the masses by saying "there will be more CP." They think the
world will spin out of control without pervasive surveillance.

------
rm_-rf_slash
My problem with intentional vulnerabilities is this: if we (Americans)
supposedly have the best and baddest intelligence agencies in the world, isn't
asking for a key under the doormat akin to a magician asking the crowd to
close their eyes while they get their next trick ready?

If the FBI and NSA really need backdoors to do their job, then it makes me
feel less secure...and less trusting.

------
bigmofo
Every OS gets software updates to patch various security issues every few
weeks or so. There is absolutely no reason why the NSA or FBI can't get a
warrant saying that the person they want to tap gets government spyware
delivered via an update.

Also, I have to assume that every OS has security vulnerabilities that are not
known to the public. If so, those can be used.

Also, I remember reading some story about something like Intel IPMI being used
to read arbitrary memory remotely regardless of OS installed.

I think that there are far too many security holes for an agency with tons of
employees and money to not be able to exploit. How many mathematicians and
cryptographers does the NSA employ? I think that there is only one full time
person working on GNU Privacy Guard.

I don't really see any information stored on a phone as safe. I would think
that the manufacturer, the OS writer, and the ISP can probably patch the
phone, and a patch can come with government spyware.

I have no doubt that if they want to target somebody, they can do it.

I think this is just a bunch of publicity to make the general public think
that these agencies are weak and impotent. Also, consider the fact that it has been
reported that the government uses parallel construction to hide where the
evidence really originated...

------
anonymousDan
Intel SGX could be this backdoor. Just have Intel hand over the key for each
device to the Government and job done.

~~~
mike_hearn
Only for programs that use SGX remote attestation. Which is at the present
time zero programs. Additionally, it's not clear why you'd need it to do local
encryption of your own data.

~~~
anonymousDan
Huh? It's not even available yet, so of course there aren't any programs using
SGX remote attestation. Regarding your second point, one of the main use cases
is likely to be the cloud computing market where you want to do remote
computation over sensitive data (requiring it to be decrypted inside an
enclave). If an attacker knows the machine's built-in key, then they can read
any encrypted pages. Or maybe I'm misunderstanding your comment?
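
To make that threat concrete, here's a toy Python sketch (not real SGX; the key-derivation scheme and the XOR "cipher" are purely illustrative stand-ins) of why everything an enclave seals reduces to the secrecy of the device's root key:

```python
import hashlib
import hmac

def derive_sealing_key(device_root_key: bytes, enclave_id: bytes) -> bytes:
    # Each enclave's sealing key is derived from the device's root key,
    # so whoever holds the root key can re-derive every sealing key.
    return hmac.new(device_root_key, enclave_id, hashlib.sha256).digest()

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy symmetric "cipher" for illustration only -- NOT real cryptography.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# The enclave seals sensitive data under its derived key.
root_key = b"secret-fused-into-the-cpu"
sealed = xor_stream(derive_sealing_key(root_key, b"enclave-A"), b"sensitive data")

# An attacker holding a handed-over copy of the root key re-derives the
# same sealing key and recovers the plaintext.
recovered = xor_stream(derive_sealing_key(root_key, b"enclave-A"), sealed)
assert recovered == b"sensitive data"
```

The point is structural, not cryptographic: no matter how strong the cipher, a per-device root key shared with a third party is a master key to everything derived from it.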

------
omouse
So it's like the early 90s where they want to weaken encryption and make
industrial espionage easier?

------
erikpukinskis
We're in a weird time. I just read the writeup about moot's challenges
moderating 4chan. But we will soon have an unmoderatable "crypto 4chan" built
on blockchain technology that mechanically cannot be censored. Ethereum,
Maidsafe, et al are building the infrastructure right now.

Government officials are concerned about this direction, as they should be. We
will soon start seeing crypto-murder software that cannot be censored. Imagine
Bitcoin, except instead of mining, you get paid to cache weaponized drones in
remote locations so that they can do the blockchain's bidding.

These are not science fiction fantasies, the infrastructure for this is
planned to be completed within the next few years. We can laugh at congress
for thinking there is some obvious way for them to backdoor PGP, but they have
good reasons to be concerned.

You might argue that this is a separate issue from the simple cryptography
issue that congress is debating. But in my mind they are parts of the emerging
phenomenon of information tools which are impervious to physical violence. For
the state, which derives its power from physical violence, the entire movement
is an existential threat.

I'm actually more concerned about this issue than the whole "AI superthreat".
The kinds of tools the AIs would use to harm us will already be in the hands
of bad people, long before the AIs wake up.

I am in no way suggesting these technologies are bad. People are already dying
because they don't have access to food, shelter, and healthcare, and I believe
crypto-technologies will save far more lives that way, by providing universal
logistics support to all humans, than they will facilitate deaths.

But so far IT has been sufficiently insulated from the physical world that
technology feels somewhat safe... A long weekend in Vegas with fun games and a
lot of money to be made. No one is really dying because pets.com went awry.
But that is changing. Robotics combined with blockchain technologies enable
information systems that are very much "of the world" and yet are not subject
to the same mechanics of control as humans are, due to them not needing food
or physical safety. That's not a small change, it's a huge one.

Sometimes I wonder if this jump won't set us temporarily back towards a more
totalitarian state. The obvious solution is to ban robotics and crypto, with a
censorship regime like China has. I don't know how many "bad robots" or "bad
blockchains" would need to exist for that scenario to trigger with wide public
support.

I honestly spend most of my time trying to solve problems with these
technologies so I haven't spent much time thinking about harm mitigation
strategies. But I don't think it's silly to have that conversation. And we
should help congress to have the right conversation, not mock them for failing
to understand a problem that, frankly, none of us really understands.

------
exabrial
Yawn. This is really overhyped.

