
“End-to-End incompatible with Chrome Update functionality” - hodgesmr
https://code.google.com/p/end-to-end/issues/detail?id=9
======
opendais
This is the most likely attack vector for any automatically updating program
on your computer, not just End-to-End.

What would stop a maliciously updated Chrome from recording all of your
keystrokes in the browser [all of your passwords and passphrases], as well as
where you were at the time?

Etc.

This is one of those things where if you think the Government is going to
silence you for being a dissenting voice and/or steal your info because of
it...you grab an open source project, you compile it, you use that. You don't
grab closed source software that automagically updates.

I don't think it's reasonable to expect Google to protect the user from -every-
potential attack vector.

~~~
simias
I think the reason many people (including myself) have a certain reluctance
toward this Google-made crypto-in-the-browser is that nowadays I feel the #1
reason to use such an extension would be to protect ourselves from
eavesdropping by Google or the government. So basically you're trusting Google
to give you the tools to escape Google's scrutiny, knowing full well that if
they really want to access your messages they can easily backdoor your crypto
extension.

So what's the point? To me it's just homeopathic crypto.

~~~
raldi
You have access to all the code and can read it and compile it yourself, and
then turn off automatic updating.

And if you're not tech-savvy enough to do that, you can hire someone else to.

And if even _that's_ not enough to please you, then tell me, _what_ could
possibly satisfy you?

~~~
simias
Use a third party tool instead of putting all your trust into a single point
of failure. There are plenty to choose from.

What you're proposing is pretty complex and time-consuming; you can't expect
the average user of this extension to do that (I know I wouldn't).

Or alternatively, decide that you're fine with trusting Google with whatever
info is in your emails because you believe they have too much to lose if
they're caught eavesdropping on some random Joe like me. That's what I do. And
if tomorrow I have a mail sensitive enough that I don't want to risk Google or
some government intercepting it, I certainly won't send it from my web browser
using Google's own extension.

~~~
raldi
How does using a third-party tool solve anything? Its code probably won't have
as many skilled eyeballs reviewing it.

~~~
simias
Maybe, maybe not; GnuPG has been around for a long time already, for instance.

But that's not the point anyway. The point is that Google can very easily
update your browser and the extension without you being aware of it. You can
turn updates off, of course, but who does that? Running an outdated browser is
not exactly the best security measure...

------
willvarfar
This is true if you have any program or OS upgrade going, whether manual or
automatic.

"Ignore" is just a linkbait title :(

~~~
dang
Yes, and it was also badly editorialized. We've changed the title to be
(approximately) that of the bug report.

Submitters: the HN guidelines ask you not to change titles except when they
are linkbait or misleading. Changing them to be _more_ linkbaity and
misleading is right out!

Edit: Oops, I forgot: the submitted title was "Google ignores the simplest and
most likely attack vector with End-to-End".

------
antmldr
This isn't a bug / complaint / observation of a vulnerability of End-to-End
per se; you could argue Microsoft could be NSL'd to do the same to a user's
operating system.

To counter this you'd need a secure, distributed way to release updates in
Chrome. I don't think that's quite in scope of what this project is trying to
accomplish.

~~~
AnthonyMouse
> To counter this you'd need a secure, distributed way to release updates in
> Chrome. I don't think that's quite in scope of what this project is trying
> to accomplish.

It's not like that would require a large amount of novel code. Use BitTorrent
+ sign the update, problem solved.
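The "sign the update" half of that can be sketched simply. This is a minimal illustration, not Chrome's actual mechanism: a pinned SHA-256 digest stands in for a real public-key signature check (e.g. Ed25519), and the function name is made up for the example.

```python
import hashlib
import hmac

def verify_update(update_bytes: bytes, pinned_sha256: str) -> bool:
    """Refuse to install an update whose digest doesn't match the value
    published out-of-band. A real deployment would verify a public-key
    signature instead of a pinned hash, but the gate is the same:
    verify before install, regardless of which peer delivered the bytes."""
    digest = hashlib.sha256(update_bytes).hexdigest()
    # Constant-time comparison to avoid leaking how much of the digest matched.
    return hmac.compare_digest(digest, pinned_sha256)
```

Because the check depends only on the bytes and the pinned value, the update itself can arrive over BitTorrent or any untrusted mirror, which is what makes the P2P distribution idea workable.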

------
DanBlake
Maybe the solution is to make the automatic downloads of Chrome anonymous and
build the system in such a way that changing it would not be possible.

Basically, change the Chrome automatic updater to not send any identifying
information when it requests an update. That way, you can be sure that Google
couldn't target just 'you' with an update.

Then, you just need to rely on the fact that people would be watching the
chromium code for any changes which would negate the above anonymity.

The real challenge would be for Google to develop a way in which they could
not still identify people from their other data (IP, cookies, etc.) when they
were requesting an update.

Maybe have a third party host/store Chrome update binaries? Something like
Amazon S3, which would not share data with Google.

~~~
AnthonyMouse
> Maybe have a third party host/store chrome update binaries? Something like
> amazon S3 or something which would not data share with Google.

Obvious problem there is that whoever is putting pressure on Google could put
the same pressure on Amazon. Solution is to use P2P. If your device has
uploaded the new code to two other devices (and so on) _before_ installing it,
targeting specific devices becomes much more difficult.

~~~
danielweber
If a hostile party can get code onto your computer, it doesn't matter much
whether the hostile code is living inside chrome.exe or some other program.

If I am quietly targeting you to get malware onto your machine and capture
your key, trying to sabotage Google Updater, even in its current form, is the
wrong way to do it.

~~~
AnthonyMouse
If you want to be secure against an attack you have to close all the attack
vectors. If whatever the "right way" is supposed to be is patched but this
method remains exploitable then you're not finished.

~~~
danielweber
Trying to secure a self-updating program that pulls updates from a source it
is explicitly designed to trust, when you consider that source hostile, is not
"closing an attack vector." It's "impossible."

If you think Google's update mechanism is compromised, you need to turn off
automatic updates, and then only update via Chrome's offline installer after
verifying checksums. Of course, the question is "compare against what?" If
Google Chrome is actively hostile, it isn't going to be turned up by Joe
Random decompiling every single version and inspecting all the routines. You
are slightly better off if you build it yourself, but now you have opened a
turtles-all-the-way-down rat's nest of Trusting Trust and your entire
toolchain. And who are you paying to review Google's source code for
accidental memory leakage each time a new version comes out?

If Google is hostile against you, you shouldn't even be using their browser in
any way.

~~~
AnthonyMouse
> If Google is hostile against you, you shouldn't even be using their browser
> in any way.

The problem with this statement is that it generalizes. Unless your solution
is to not use any browser, you have to trust somebody. The traditional way to
solve this problem is checks and balances. You don't trust _just_ Google, you
make them show their work. Because if the source code is sitting there on the
web for everybody to see, and anyone can verify that the binary being
distributed to everyone matches the one they get when they compile it
themselves, inserting a backdoor is a lot more dangerous because of the risk
that you get caught and ruin your reputation.

But for any of that to be possible, the code running on your computer has to
be the same as the code running on everybody else's computer. The fact that
somebody could inspect the code that isn't running on your computer does you
no good.
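The "same code on everybody's computer" check above can be sketched in a few lines. This is a toy illustration, assuming peers are willing to publish the SHA-256 digest of the binary they were actually served; the function name and the digest-exchange mechanism are hypothetical, not part of any real update system.

```python
import hashlib
from collections import Counter

def digest_of(binary: bytes) -> str:
    """SHA-256 fingerprint of a downloaded binary."""
    return hashlib.sha256(binary).hexdigest()

def flag_targeted_peers(peer_digests: dict) -> list:
    """Given the digest each peer computed over the binary it received,
    return the peers whose copy differs from the majority. A backdoor
    pushed to one user shows up as a digest that disagrees with
    everyone else's."""
    consensus, _ = Counter(peer_digests.values()).most_common(1)[0]
    return sorted(p for p, d in peer_digests.items() if d != consensus)
```

The point of the sketch is the comparison itself: it only works if the code running on your machine is supposed to be identical to the code on everyone else's, which is exactly the property the comment argues for.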

------
Zigurd
It is a hard problem, and it's a positive sign that Google acknowledges the
problem. Providing end-to-end security also goes against the trend of
expanding the non-open parts of their Android app suite.

In addition to the reported bug, this plugin is handing cleartext back to
Google-controlled code. Web apps and good security are still miles apart.

But this is still a significant change from a year ago, when we heard
internet portal CEOs kvetching about the NSA and not even mentioning
end-to-end encryption.

There is still a VERY long way to go before this counts as democratizing end-
to-end security. Any portal that has real time communication tools and a
social graph could also provide tools for automating Web-of-trust and key
exchange.

All journeys etc.

------
rlx0x
I think it's a moot point. If the government is the attacker, trying to spy
on you specifically, there is absolutely NOTHING you can do to prevent that
from happening.

The US government has (literally) secret laws that grant it the right to go to
another country and kill someone without due process or trial, or any kind of
repercussion if they 'accidentally' kill innocent bystanders.

It's sort of laughable to talk about end-to-end encryption and possible NSLs
when you really think about it.

~~~
fulafel
Here you make the mistake of assuming security is binary. Making attacks more
expensive and more likely to be exposed is an effective deterrent.

There are many things that people in the US government would like to do but
cannot afford.

------
stcredzero
You know, if we had DRM infrastructure we could actually trust, this wouldn't
be a problem. Granted, having DRM we can trust may well be itself an
insurmountable problem.

~~~
AnthonyMouse
Your comment doesn't parse due to an ambiguity. Who is _we_?

If "we" are supposed to be all of us as owners of computing devices, I'm not
sure how DRM is supposed to help. Maybe the ambiguity is in what you think DRM
is.

If "we" is "they" as the overlords who manufacture our stuff, having a DRM
infrastructure _they_ can trust would be a primary source of the problem,
because it would consequently be one that _we_ can't.

~~~
stcredzero
_If "we" are supposed to be all of us as owners of computing devices, I'm not
sure how DRM is supposed to help. Maybe the ambiguity is in what you think DRM
is._

This is exactly the knee-jerk populist fluff we need people to actually _think
through_. If _we_ was everyone, then we could have agents that use our private
key to sign and encrypt things on our behalf, but be a bit more confident that
it wouldn't fall into the wrong hands. (Actually, we'd best spawn agents with
a time-limited signed ephemeral private key.)

"We as in the overlords" is so predominant as a model, people have forgotten
to think about the implications of such tools for themselves. Most DRM
"discussion" gets reduced to "two legs baaad, four legs goood."

~~~
AnthonyMouse
> If we was everyone, then we could have agents that use our private key to
> sign and encrypt things on our behalf, but be a bit more confident that it
> wouldn't fall into the wrong hands. (Actually, we'd best spawn agents with a
> time-limited signed ephemeral private key.)

What are we supposed to gain from "agents" here? You have your private key,
nobody else has it, you sign things with it. If somebody else has it then it
isn't _your_ private key.

For DRM to even pretend to work for any purpose that anyone thinks of when
people say "DRM" (i.e. preventing the copying of bits given to untrusted
parties), it has to be a centralized trust model. Some party that "everyone"
can trust has to certify that the device or software you're giving the secret
bits to is not going to divulge them. The problem is that there is nobody that
everybody can trust. Trust is not a global hierarchy, it's a relativistic
relationship between peers.

If (as it seems) you're trying to achieve something separate that may work
using a distributed trust model, why are you calling it DRM?

~~~
stcredzero
_If somebody else has it then it isn't your..._

...data. This is how things work _right now._

 _For DRM to even pretend to work for any purpose that anyone thinks of when
people say "DRM" (i.e. preventing the copying of bits given to untrusted
parties), it has to be a centralized trust model._

That is what we have _now._

 _If (as it seems) you're trying to achieve something separate that may work
using a distributed trust model, why are you calling it DRM?_

How is the acronym "Digital Rights Management" inappropriate for this, without
bringing up fluffy sentiment? Actually, I would prefer "Trusted Execution."

~~~
AnthonyMouse
I don't see how "this is what we have now" is a refutation of anything. The
status quo is not optimal.

> How is the acronym "Digital Rights Management" inappropriate for this,
> without bringing up fluffy sentiment?

That's like asking why the acronym "Structured Query Language" is
inappropriate for a new language for making queries to search engines. The
problem is that it already means something else.

~~~
stcredzero
_I don't see how "this is what we have now" is a refutation of anything. The
status quo is not optimal._

My point exactly, and it doesn't have to be that way. Also, why does it have
to be a refutation? I've met a few great minds in my time. They are the ones
who create a small cloud or tree of meanings from what they hear or read, and
try to make them all work out of curiosity.

 _The problem is that it already means something else._

"Means" in the sense that people feel a certain way about it, or means in the
sense of the definitions of the words? That's exactly what I mean in terms of
fluffy sentiment. Back to great minds. Great minds try to refute things by
trying as hard as they can to make them work, even challenging their own
assumptions to do so.

Also, aren't you a member of TX/RX?

~~~
AnthonyMouse
> My point exactly, and it doesn't have to be that way.

OK, that still doesn't mean it's a good idea to say that some useful new
combination of PGP, TLS and AES-NI is "DRM."

> That's exactly what I mean in terms of fluffy sentiment.

It's not "fluffy sentiment" to believe that words mean what other people
understand them to mean. If you want to start a new nation-wide left-leaning
political party, calling it the National Socialist Party is a Bad Idea,
regardless of whether it satisfies each of the characteristics of being
national, socialist, and a political party.

> Also, aren't you a member of TX/RX?

Why, is there someone there impersonating me?

~~~
stcredzero
_OK, that still doesn't mean it's a good idea to say that some useful new
combination of PGP, TLS and AES-NI is "DRM."_

You do realize that DRM started out basically as trusted execution, and that
the other layers of meaning you're piling on are just politics?

 _It's not "fluffy sentiment" to believe that words mean what other people
understand them to mean._

Okay then. Global Warming is a debate, vaccines are harmful, and people freeze
instantly when they step into vacuum.

 _If you want to start a new nation-wide left-leaning political party, calling
it the National Socialist Party is a Bad Idea, regardless of whether it
satisfies each of the characteristics of being national, socialist, and a
political party._

Yes, but if you have the intellectual wherewithal to discuss nationalism and
socialism, I expect you to follow suit in the discussion, not to knee-jerk
about Nazism.

 _> Also, aren't you a member of TX/RX?_

 _Why, is there someone there impersonating me?_

Oh, so you're not _that_ Anthony.

~~~
AnthonyMouse
> Okay then. Global Warming is a debate, vaccines are harmful, and people
> freeze instantly when they step into vacuum, then.

Those aren't issues of word meaning. The issue there is not a disagreement
about what Global Warming is or what a vaccine is.

Here, I'll demonstrate by solving the problem at hand and distinguishing "DRM"
from the solution. The problem is that you have a piece of software that needs
to update itself periodically, but the software has access to sensitive data
(private keys) and the updates are sufficiently frequent that verifying each
of them is unmanageable. Let's sketch out a solution. The thing you need is to
separate the sensitive bits from the other bits, so you can update the other
bits without worrying about your private key.

One possible solution is for computers to be sold with a TPM that comes from
the factory with a private key in it, and have the TPM as the thing that does
the sensitive bits so that unless the TPM is broken the browser plugin can't
get a copy of your private key. This solution is _terrible_. If the private
key is there from the factory then you don't know that the manufacturer
doesn't have a copy of it, or that it wasn't generated with Dual_EC_DRBG, or
that the TPM is doing what it's supposed to be doing and doesn't have an
intentional backdoor or accidental security vulnerability. If you can't get
the key out even as root then you can't copy it for when your hardware dies or
becomes obsolete. And if there is a security flaw in the TPM then you can't
fix it, you have to throw away the device (with the key) and buy a new one.
The only reason to put any keys there from the factory and incur all of these
disadvantages is to try to secure the device _against the user_. There is no
benefit to the user of doing it that way.

Let's try a different solution. You run the browser and its plugins in a
virtualization sandbox. The sandbox has an API for signing messages. The PGP
plugin can request signing and provide a copy of the message to be signed. The
OS or hypervisor then removes the sandboxed code from the CPU and runs a
small, privileged application the purpose of which is solely to display the
message to be signed, get your approval, and produce a signed message for the
sandbox if you approve. It does nothing else, so it very rarely needs to be
updated, but it can be, and when it is the code is short and simple and the
changes are small, making updates much easier to validate. The privileged code
is secure against browsers and plugins but not against the owner of the
device, so you can backup the keys or transfer them to another device at your
pleasure, and you generated the keys to begin with so you know where they came
from.
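The second design can be sketched as a tiny "signing broker." This is a toy model of the idea only, with invented names; HMAC-SHA256 stands in for a real public-key signature (e.g. Ed25519), and the trusted-UI prompt is reduced to a callback.

```python
import hashlib
import hmac

class SigningBroker:
    """Sketch of the privileged signing agent described above. The key
    lives only here; sandboxed code submits a message and gets a
    signature back only if the user approves what is displayed."""

    def __init__(self, key: bytes, ask_user):
        self._key = key            # never handed to sandboxed callers
        self._ask_user = ask_user  # trusted-UI prompt: show message, get yes/no

    def sign(self, message: bytes):
        if not self._ask_user(message):
            return None  # user declined; the sandbox learns nothing about the key
        # HMAC-SHA256 as a stand-in for a real public-key signature.
        return hmac.new(self._key, message, hashlib.sha256).digest()
```

The design point survives the simplification: the browser plugin can be updated weekly without risk to the key, because the only code that ever touches the key is this small, rarely-changing broker, and the key is still exportable by the device's owner.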

These are two very different solutions and it is important to distinguish one
from the other. Whatever five researchers thought DRM meant a decade ago is no
longer important. Today it has a certain connotation which is decidedly
negative. If you have something _beneficial_, calling it DRM only serves to
sully its reputation by association with the user-hostile things that have
historically been called DRM.

------
raldi
It doesn't seem like they're ignoring it. They just don't have a quick fix for
this hard problem.

If you've got an easy answer, please post it on the bug!

------
danielweber
What's the precedent for a company ever being required to ship backdoored
products to their customers by government legal order, NSL'd or otherwise?

"The FBI once lobbied for the government to give them that power!" or "look at
these service companies that had evidence in their possession they were
required to turn over!" are non-answers.

~~~
hodgesmr
[http://en.wikipedia.org/wiki/RSA_BSAFE](http://en.wikipedia.org/wiki/RSA_BSAFE)

~~~
vezzy-fnord
An older example, one that attracted a lot of attention in the 90s but was
soon forgotten and is now largely a historical curiosity, even though it
should have made people aware of the pitfalls of unsupervised intelligence
agencies well before Snowden:
[https://en.wikipedia.org/wiki/Clipper_chip](https://en.wikipedia.org/wiki/Clipper_chip)

~~~
danielweber
Who was forced to use the Clipper chip?

As your link points out, all the government could do was try to encourage
manufacturers to go along, but even in the crypto wars of the 90's they
couldn't make anyone get on board.

 _EDIT_ Did _anyone_ use the Clipper chip in _any_ shipping product, ever?
Note that a positive answer to this still wouldn't answer the question of a
company being forced to do it by government order.

------
McDiesel
Since when does the Google store update an extension not installed through the
Google store?

~~~
malka
Well, probably as soon as the NSA requests it.

------
comboy
It's open source and you don't have to install it through the google store.

------
Foxboron
Couldn't this be countered by writing your own script that checks the hash of
the files X times a day?
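The suggested script is only a few lines. This is a minimal sketch with made-up function names, assuming you run it from cron and trust the machine the checker runs on (the reply below points out why that assumption is the weak spot).

```python
import hashlib
import pathlib

def snapshot(paths):
    """Record a SHA-256 baseline for each extension file."""
    return {p: hashlib.sha256(pathlib.Path(p).read_bytes()).hexdigest()
            for p in paths}

def changed_files(baseline, paths):
    """Re-hash the files and report any that no longer match the
    baseline, e.g. after a silent extension update."""
    current = snapshot(paths)
    return [p for p in paths if current.get(p) != baseline.get(p)]
```

A cron job would call `snapshot` once after a manual review, persist the result, and then run `changed_files` a few times a day, alerting on any non-empty report.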

~~~
danielweber
Once you assume the browser you are using for secrets is actively hostile,
it's a pretty Sisyphean task to try to guarantee anything.

