
The Myth of Consumer-Grade Security - jaden
https://www.schneier.com/blog/archives/2019/08/the_myth_of_con.html
======
dredmorbius
In the 1960s, the ecological notion that "there is no 'away' to throw things,"
first stated by Barry Commoner, finally became widespread with regard to pollution.

In the 2010s, the informational notion that "there is no 'other' informatics
ecosystem" on which security, privacy, or surveillance practices and
principles apply is slowly dawning.

~~~
derefr
> "there is no 'other' informatics ecosystem" on which security, privacy, or
> surveillance practices and principles apply

SIPRNet?

(I would argue that the fact that governments try to have their own air-gapped
packet-switched networks for secure communications is a large part of the
reason that governments don't invest much in making the regular Internet
secure.)

~~~
dredmorbius
See my longer self-follow-up:

[https://news.ycombinator.com/item?id=20823820](https://news.ycombinator.com/item?id=20823820)

And, corroborating, from Wikipedia:

 _SIPRNet was one of the networks accessed by Bradley Manning, convicted of
leaking the video used in WikiLeaks' "Collateral Murder" release,[6] as well
as the source of the US diplomatic cables published by WikiLeaks in November
2010.[7]_

[https://en.wikipedia.org/wiki/SIPRNet](https://en.wikipedia.org/wiki/SIPRNet)

~~~
bradknowles
In 1995, I convinced the program manager who was responsible for SIPRnet that
they should not use HOSTS.TXT files and random IP addresses that they just
pulled out of their ass.

Ultimately, the argument that convinced them was that they wanted to connect
it to the “real” Internet once the work was completed on the multi-level
secure gateway that the NSA was developing.

I convinced them that there would be no way for them to communicate with the
real owners of those IP addresses, and that they would need to use the DNS to
communicate with the hosts and domains on the other side of that gateway.
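To make concrete why a flat host table with squatted addresses couldn't survive contact with the real Internet, here is a toy sketch (my own illustration; all names and addresses are made up): the local file wins unconditionally, so it silently shadows whoever legitimately holds the address on the other side of the gateway.

```python
# Toy illustration: a flat HOSTS.TXT-style table silently shadows
# addresses that are really owned by someone else once two networks
# are connected. Every name and address here is invented.

hosts_txt = {
    "ops.example.mil": "4.4.4.4",    # address picked "out of thin air"
}

# Addresses legitimately registered on the outside network
real_registry = {
    "4.4.4.4": "some-other-org.example.com",
}

def resolve(name):
    """The local hosts file wins unconditionally -- there is no authority."""
    return hosts_txt.get(name)

addr = resolve("ops.example.mil")
if addr in real_registry:
    # Once the gateway is up, traffic for the real registrant is unreachable.
    print(f"conflict: {addr} is registered to {real_registry[addr]}")
```

DNS solves this by making name-to-address mappings delegated and authoritative rather than a private local file, which is the whole point of the anecdote.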

It was a major face-palm moment for me, but I was glad that I was the DISA.net
Technical POC at the time, that my boss (the DISA.net Admin POC) trusted me,
and that I had built a good reputation by helping them get the first CERT
inside of DoD up and running within a week on ASSIST.mil, back when there was
just the one NIC for the entire Internet and root zone updates happened only
once a week.

Thank $DEITY for that MLS gateway the NSA was developing, because otherwise
SIPRnet would probably still be using HOSTS.TXT files and random IP addresses
pulled out of their ass.

This was also the event that convinced me I needed to get out of DISA quick,
because I couldn’t keep saving the entire agency from making seriously brain-
damaged decisions like the one I saved them from.

Sigh....

~~~
dredmorbius
So: the point being that despite having a secure network, the immediate
impetus was to connect it to the insecure one?

And that without IP and network isolation, HOSTS.TXT was an utterly
meaningless obfuscation / network-isolation mechanism (which was apparently
not ... apparent ... to them)?

The only bureaucracy more SNAFUd than government and military is the private
sector. You just don't hear as much about it, except through lawsuits and
leaks -- no congressional investigations or constituent concerns.
Representation has some advantages.

~~~
GhettoMaestro
SIPRNet (the secure version) has a parallel network for unclassified traffic
(e.g. surfing foxnews / open-source intelligence / checking your gmail).

If I had to take a wild guess, an executive at some point said "why do I NEED
to switch between these two terminals? Can't it all be secure on just one of
them? I just want to be able to reply to outside emails on the SIPRNet."

And thus it happened, or probably something like that. Never underestimate the
"executive factor".

~~~
dredmorbius
Thanks. Much as I'd suspected.

And EF is very QED in DJT.

------
nostrademons
For a while the President was tweeting from a consumer-grade Android (!)
phone, likely made in China, in the Oval Office. It makes me wonder how many
groups pwned that phone and have access to critical national security
information.

~~~
duxup
I used to work for a company that serviced a very secure site (military).

The rule was NOTHING electronic went in that wasn't accounted for, and NOTHING
ever left. We sent people onsite with what were effectively disposable laptops
that were single use items and never left the location.

Showing up at the gate with anything extra was said to be a very bad idea. It
never happened so I don't know what would happen.

You drove to the site in a car with only what you needed in it. Your license,
keys, the equipment you were scheduled to bring. Smartphones weren't ultra
common yet, but were absolutely forbidden. The car itself was searched too
even though it was in a parking lot far from anything sensitive. You were
warned that anything suspicious would not be in the car when you came back
(as with the extra-stuff rule, I don't know of anyone that happened to, since
nobody was foolish enough to drive out with anything but a clean rental car).

I honestly think that is the only security that makes sense. That should be
the status quo for some areas of the White House too, IMO, at least as far as
meetings and the like.

Maybe one day we'll get physical switches that power off cameras and mics, but
it's hard to trust anything until that day ... and maybe not even then.

~~~
nostrademons
I have friends that work in defense. Their experience is basically like yours
- physically and electronically isolated clean rooms where you do the actual
work. Nothing goes in. Nothing comes out.

Also, one of the Snowden disclosures was that apparently the NSA can power on
the cameras and mics of Android phones _even when the phone is off_. I don't
know exactly how it works, but it does, probably based off the residual
current drawn from the battery and the circuitry that handles the power
button. Smartphones are basically never safe, although I've heard iPhones are
significantly safer than Android.

~~~
kuzimoto
> Also, one of the Snowden disclosures was that apparently the NSA can power
> on the cameras and mics of Android phones even when the phone is off. I
> don't know exactly how it works, but it does, probably based off the
> residual current drawn from the battery and the circuitry that handles the
> power button. Smartphones are basically never safe, although I've heard
> iPhones are significantly safer than Android.

I mean, even if you could turn on the mic/camera, I would think you'd still
need to save the data to storage for later retrieval, which pretty much
requires the OS to be running.

Otherwise, maybe, you could somehow transmit the data but that would require
the ability to communicate with the device via Bluetooth/WiFi and for the data
from the camera to be passed to the wireless interface.

Unless a device was designed with that sort of functionality, I'm not sure how
the NSA could just turn that on.

~~~
amiga-workbench
They would likely compromise the device via its baseband, and patch the
operating system to give the illusion of a complete shutdown when the user
attempts to power it off.

~~~
Analemma_
"The baseband has DMA access and so can get root at any time" is a myth that
won't seem to go away no matter how many times security professionals refute
it. On iPhones and all but maybe the cheapest Android phones, the baseband is
exposed to the CPU as a USB device, with all the usual memory protections. You
cannot patch the device over the baseband (unless there's an existing
vulnerability in the USB stack or something).

~~~
wahern
USB device drivers are for sure bullet-proof, just like the baseband
controllers are bullet-proof....

Jumping from one exploit target to another in a chain is pretty much how _all_
modern exploits work. To say that because they would need to find an exploit
in the USB driver (or some other hardware or software interface), not only
states the obvious, but more importantly misses the point: it's far more
plausible than most engineers intuitively believe.

The depth of modern exploit chains is incredible, and while the conceptual
difficulty has gone up, the pace hasn't seemed to abate -- clear evidence that
our intuition about "too difficult", and about the elasticity of the exploit
supply and demand curves, is woefully inadequate for accurately gauging risk.

Part of the problem is that the complexity of the systems grows at the same
time as improvements in security and correctness. Sometimes it grows faster,
sometimes slower, but it's dynamic.

------
jammygit
> Through the mid-1990s, there was a difference between military-grade
> encryption and consumer-grade encryption. Laws regulated encryption as a
> munition and limited what could legally be exported only to key lengths that
> were easily breakable. That changed with the rise of Internet commerce,
> because the needs of commercial applications more closely mirrored the needs
> of the military.

The case seems to be that the government and military have almost no special
product offerings, so they use consumer tech; therefore, weakening consumer
tech weakens the government and military. This is not a robust argument imo.

The stronger argument is that a whole economy would spring up, filling office
building after office building with full-time hackers trying to dox,
blackmail, MITM, or steal from every non-banking, non-crypto-approved
communication in the world. The internet would eventually just die off as a
communications platform as the public completely lost trust in it (though not
politicians -- they would be approved to use the secure channels and would not
understand the issue)
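To make the article's "easily breakable" point concrete, here is a toy sketch of my own (not from the article): with a short enough key, exhaustive search is trivial. Export-grade crypto of the era was limited to 40-bit keys; this uses a 16-bit XOR "cipher" so the search finishes instantly.

```python
# Toy sketch: a short key can simply be enumerated. This is a 16-bit
# XOR "cipher", not a real export-era algorithm, kept tiny for speed.

def xor_encrypt(data: bytes, key: int) -> bytes:
    key_bytes = key.to_bytes(2, "big")          # 16-bit key = 2 bytes
    return bytes(b ^ key_bytes[i % 2] for i, b in enumerate(data))

KEY_BITS = 16
secret_key = 0xBEEF                             # the attacker doesn't know this
ciphertext = xor_encrypt(b"ATTACK AT DAWN", secret_key)

# Known-plaintext attack: try every possible key, check against a crib.
recovered = None
for candidate in range(2 ** KEY_BITS):
    if xor_encrypt(ciphertext, candidate)[:7] == b"ATTACK ":
        recovered = candidate
        break

print(f"recovered key: {recovered:#x}")         # -> 0xbeef
```

A 40-bit keyspace is 2^24 (about 16 million) times larger than this one, which is still small enough that 1990s academic clusters brute-forced it routinely.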

edit: typo & wording fix

------
ttsda
I like the general argument the article makes -- that encryption available to
consumers is the same as, and as important as, military encryption -- but I've
got to disagree with military electronics no longer being the bleeding edge.

Especially in areas such as RF, optics and positioning, the military still has
access to stuff the general market can only dream of.

~~~
jascii
Do you have any examples of that? In my experience the military tends to be
extremely conservative and prefers well-proven designs. For example: the
RCA 1802, a processor launched in 1972, is still being manufactured mostly
because of its use in military applications (the guidance system of the
Tomahawk cruise missile, among others).

~~~
wcunning
Gorgon Stare springs immediately to mind. I certainly can't afford a drone
that can stay in the air for days carrying a 30 lb multi-million-megapixel
wide-area camera [0].

[0][https://en.wikipedia.org/wiki/Gorgon_Stare](https://en.wikipedia.org/wiki/Gorgon_Stare)

~~~
Retric
It’s not a single multi-million-megapixel camera. “ARGUS is essentially 368
five-megapixel smartphone cameras clustered together.” That’s a 1,840-megapixel
(1.84-gigapixel) camera, which is not that extreme and in line with several of
these images:
[https://petapixel.com/tag/gigapixel/](https://petapixel.com/tag/gigapixel/)
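The arithmetic behind that figure is a straightforward sanity check:

```python
# Back-of-the-envelope check of the ARGUS numbers quoted above.
cameras = 368
megapixels_each = 5
total_megapixels = cameras * megapixels_each
print(total_megapixels)            # 1840 megapixels
print(total_megapixels / 1000)     # 1.84 gigapixels
```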

A lot of confusion comes from it being a combination of several cameras: ones
that capture wide angles and infrared, plus a separate camera that can focus
on areas of interest within the field of view. So if the entire image were at
maximum resolution you would get into insane territory, but that’s not how it
works.

------
notinpersia
See this, too. Happy to share code/implementation details.

[https://docs.google.com/presentation/d/1f2k6fsIkDmIS1WyJAT0l...](https://docs.google.com/presentation/d/1f2k6fsIkDmIS1WyJAT0lXQmDuHIPeo9GDKfP1FY2rVc)

------
m3kw9
Consumer-grade security is meant to impede them just long enough for the cops
to arrive.
------
RcouF1uZ4gsC
Why can't we have both security and court-ordered access? What would be the
problems if we had 5 HSMs, airgapped and located at secure facilities?
Encrypted applications (such as WhatsApp, Apple Messenger, etc.) would be
required to submit/transmit escrow keys encrypted with the public keys of
those 5 HSMs. After a valid court order, a law enforcement official has to
physically go to one of the secure locations with those encrypted escrow keys,
which will then be decrypted by the HSM. This way, everyone can have secure
communication, while still allowing legitimate law enforcement searches when
ordered by a judge.
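A minimal sketch of the flow being proposed (my own toy illustration using textbook RSA with tiny hardcoded primes; nothing here is remotely secure, and all numbers are made up):

```python
# Toy escrow sketch: each message key is wrapped under every HSM's
# public key, so any one of the five facilities can recover it after
# a court order. Textbook RSA with tiny primes -- illustration only.

def make_hsm_keypair(p, q, e=17):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)              # modular inverse (Python 3.8+)
    return (e, n), (d, n)            # (public key, private key)

# Five "HSMs", each with its own keypair (primes chosen so gcd(e, phi) == 1)
primes = [(1009, 1013), (1087, 1091), (1031, 1033), (1039, 1049), (1051, 1061)]
hsm_keys = [make_hsm_keypair(p, q) for p, q in primes]

message_key = 424242                 # session key protecting a conversation

# The app escrows the key: one wrapped copy per HSM public key
escrow = [pow(message_key, e, n) for (e, n), _priv in hsm_keys]

# Later, law enforcement takes wrapped copy i to facility i's HSM
d, n = hsm_keys[2][1]
recovered = pow(escrow[2], d, n)
print(recovered)                     # 424242
```

Even the toy makes the structural trade-off visible: the wrapped copies in `escrow` are decryptable by a party other than the intended recipient, so the whole scheme's security reduces to that of five facilities and their court-order process.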

~~~
akersten
No, I would not call that secure communication, because it can be accessed by
someone other than the intended recipient. The number of hoops they have to
jump through is irrelevant.

It is not possible to have communication that is secure by definition and
facilitate a backdoor at the same time. The two concepts are strictly mutually
exclusive.

In the United States you have the amazing freedom to communicate in any
language you desire (including one unintelligible to a would-be snoop), and
the State cannot force you to translate your communications just because they
want to hear them. We should not be so eager to give up that freedom in our
digital lives.

~~~
rubinelli
I'm surprised that Americans aren't pushing to make private and secure
communication a constitutional right. Will it make it harder for law
enforcement to go after "bad guys"? Yes. Just as the second, fourth, and fifth
amendments do today.

~~~
gizmo686
The US classifies encryption as a munition. I am genuinely surprised there
hasn't been a push to get it covered under the second amendment.

~~~
bitwize
Crypto is munitions for export purposes, but I doubt you could get a ruling
that it qualifies for 2A protection. The law is like that: it can classify a
thing as one thing for one purpose and another thing for another purpose. If
they wanted to ban crypto, they could classify it as a Schedule I drug, and it
would make about as much sense as classifying it as a munition.

