
DAG Rosenstein’s “Responsible Encryption” Demand Is Bad - DiabloD3
https://www.eff.org/deeplinks/2017/10/deputy-attorney-general-rosensteins-responsible-encryption-demand-bad-and-he
======
liquidise
This is a well-written piece that takes a step toward clearly discussing tech
security issues in layman's terms, which is an important step.

The failure to adequately describe the security landscape to those outside the
technology sector is the greatest failure of technologists in the last 5-10
years. The NSA revelations were lost on America as a whole, muddied by
confusing technologies and by what many considered to be treasonous means.

In this regard, Snowden was the whistleblower technologists needed, not the
one America needed. The person or company to bear that standard has yet to be
identified.

I would say our job as technologists is two-fold: first, we must continue to
press our organizations for privacy and the security of our users. The e2e
encryption focus is a great example of this.

Just as important is our ability to communicate security problems in a manner
accessible to the common voter. Without their understanding, and passionate
support, privacy in the tech sector will be relegated to the "anarchists" and
criminals. Tor suffers this treatment and e2e encryption is already being
painted in a similar light. It is essential that pieces like this continue to
be written.

Collectively, we need to learn to distill this complex issue into language
everyone empathizes with as much as technologists do.

~~~
sametmax
The reality is much simpler: security issues in computing don't affect the
layman's life enough for people to feel concerned about them.

A few people are conned, or have their identity stolen. But the vast majority
have just had a slow Windows XP machine and a few toolbars attached to their
web browser.

They couldn't care less.

As for bigger issues, like government spying, it's exactly the same. People
can't see any link between it and their personal lives, and have no interest
in projecting what could happen in the future, as they have no will to
participate in building the system they live in. Caring for society is a
niche feeling.

The only thing that would make people react would be a massive and violent
catastrophe with strong symbolic value. And remember that Equifax didn't cut
it: the outrage barely trickled outside of geek circles. So it needs to be
bigger, and simpler to understand.

------
dane-pgp
> Examples include the central management of security keys and operating
> system updates; the scanning of content, like your e-mails, for advertising
> purposes;

Now that Google, an advertising company, has given up scanning the contents of
emails for advertising purposes ([https://blog.google/products/gmail/g-suite-
gains-traction-in...](https://blog.google/products/gmail/g-suite-gains-
traction-in-the-enterprise-g-suites-gmail-and-consumer-gmail-to-more-closely-
align/)), the necessity/reasonableness of this practice is highly
questionable, but I suppose "for spam-filtering purposes" would be a clever
alternative argument there. (Even with encrypted emails, though, there might
be enough unencrypted metadata for accurate spam detection.)

Perhaps the real lesson for us here is the danger of centrally managed
operating system updates. The fact that Rosenstein is already aware of this
attack vector, and thinks so highly of it, suggests that addressing it should
be a priority. One technology that would be helpful here is Binary
Transparency:
[https://wiki.mozilla.org/Security/Binary_Transparency](https://wiki.mozilla.org/Security/Binary_Transparency)
so a user can at least be sure that they are being given the same updates as
everyone else; and if researchers can audit the updates, they can hopefully
find "if userId = 123456" payloads. You'd have to trust the OS not to support
"secret updates" that happen without your consent, but ideally this too would
be addressed by an initial audit, or a TOFU policy.
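
The Binary Transparency idea above can be sketched roughly like this (a
minimal illustration, not Mozilla's actual protocol; the in-memory hash set
standing in for a Merkle-tree-backed public log, and all names here, are my
own assumptions):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Digest of an update binary, as published in the log."""
    return hashlib.sha256(data).hexdigest()

def update_is_published(update_blob: bytes, public_log: set) -> bool:
    """Accept an update only if its digest appears in the shared,
    publicly auditable log -- i.e. we are being served the same
    bytes as everyone else, not a targeted payload."""
    return sha256_hex(update_blob) in public_log

# In practice the log would be fetched from, and verified against,
# an append-only transparency service; a plain set stands in here.
log = {sha256_hex(b"os-update-v1.2.3")}

print(update_is_published(b"os-update-v1.2.3", log))  # genuine update: True
print(update_is_published(b"targeted-payload", log))  # "secret update": False
```

The point of the check is that a vendor can no longer serve one user a
different binary without that binary's digest becoming publicly visible to
auditors.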

------
sneak
This framing worked for "responsible disclosure", attempting, and mostly
succeeding, to paint the completely responsible practice of "full disclosure"
as implicitly irresponsible.

Here’s hoping it doesn’t work for the DoJ.

~~~
nl
What exactly do you mean by "full disclosure" here? If you mean disclosure of
zero days with no warning, then yes, generally responsible disclosure is more
desirable. I'm not aware of any security professional who disagrees with this
stance.

~~~
sneak
I am a security professional who thinks full and immediate disclosure is
always responsible and professional and desirable.

~~~
sneak
Example: Today, I found out that on the close order of a thousand people at
Yubikey, Infineon, Microsoft, Google, Lenovo, and Fujitsu knew that my
Yubikey’s RSA keys were weak—IN FEBRUARY.

I found out today.

Two legs good. Four legs bad. Post that shit to the mailing list. There are
far more good people than bad people. Stop covering for the fact that huge
corporations are risk-averse and slow to make changes; that’s baking bias into
the process.

~~~
nl
I don't think anyone would (or should!) argue that a 9-month delay is
responsible disclosure. If they do, then they are wrong.

But dumping zero days isn't any better.

This 2003 paper from SANS[1] discusses the Bugtraq policy of full disclosure
after 14 days, but says most recommend 30 days. I think that is a reasonable
expectation.

[1] [https://www.sans.org/reading-
room/whitepapers/threats/define...](https://www.sans.org/reading-
room/whitepapers/threats/define-responsible-disclosure-932)

~~~
sneak
All of those companies argued exactly that for the last 9 months, while they
sat on the Yubikey key vulnerability.

------
yuhong
I should mention that I suspect that the more successful FBI investigations
are, the more funding from government debt they receive. Which is why the
debt-based economy is flawed.

