
Side-channel attacking browsers through CSS3 features - drchiu
https://www.evonide.com/side-channel-attacking-browsers-through-css3-features/
======
rememberlenny
Equally worth surfacing:

    
    
      Habalov reported the bug to Google and Mozilla engineers,
      who fixed the issue in Chrome 63 and Firefox 60.
    
      "The bug was addressed by vectorizing the blend mode 
      computations," Habalov said. Safari's implementation 
      of CSS3 mix-blend-mode was not affected as the blend 
      mode operations were already vectorized.

~~~
astura
Block quotes are unreadable on mobile, so here's the unformatted text:

Habalov reported the bug to Google and Mozilla engineers, who fixed the issue
in Chrome 63 and Firefox 60.

"The bug was addressed by vectorizing the blend mode computations," Habalov
said. Safari's implementation of CSS3 mix-blend-mode was not affected as the
blend mode operations were already vectorized.

~~~
delaaxe
I agree and it's annoying. Is there an easy way to fix that?

~~~
aepiepaey
When posting a quote, don't indent the text. Just prepend '> ' instead, like
this:

> I agree and it's annoying. Is there an easy way to fix that?

Readable on mobile, and makes clear what is/isn't a quote.

------
jameal
There is a similar vulnerability with `mix-blend-mode` that enables a script
to check whether a user has visited any given URL. It has existed since as
early as 2016 and is still a problem today in Chrome 67. See:
[https://lcamtuf.blogspot.com/2016/08/css-mix-blend-mode-is-b...](https://lcamtuf.blogspot.com/2016/08/css-mix-blend-mode-is-bad-for-keeping.html)

And here is a demo:
[http://lcamtuf.coredump.cx/whack/](http://lcamtuf.coredump.cx/whack/)

It has limitations which make it slightly impractical to check against a large
number of sites but it's still surprising to see this hasn't been fixed.
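The general shape of lcamtuf's technique, as a rough sketch (class names and layer counts here are illustrative, not the demo's actual code): the `:visited` state changes the input pixel, and a stack of blend-mode layers makes render time depend on it.

```css
/* Illustrative sketch of the :visited + mix-blend-mode probe.
   The visited state flips the link's color, which feeds a stack of
   blend layers whose render cost depends on the underlying pixel. */
a.probe         { color: #000000; }
a.probe:visited { color: #ffffff; }

.blend-stack > div {
  mix-blend-mode: saturation; /* each layer adds pixel-dependent work */
}
```

Script then times repaints (e.g. via requestAnimationFrame) and infers visited state from slow vs. fast frames.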

~~~
jacquesm
That's very broken. It misses lots of websites I did visit and shows me
websites I did not visit as though I did.

~~~
Bjartr
Make sure to click the visible mole, not elsewhere on the page

~~~
jradd
Doesn't matter. This attack has not worked since 2016.

~~~
KenanSulayman
Works in Firefox Nightly 62.0a1 very reliably

------
michaelbuckbee
This goes beyond just identification to actual data leakage, but it ties into
a larger point about the GDPR and other regulations that we've been discussing
ad nauseam here, namely questions like: "Why don't the GDPR regs just tell me
what I need to do?"

The GDPR being deliberately vague is a response to issues like these. It
serves neither the interests of users nor of the regulation to say "Get
consent before setting cookies and only cookies", only to have sites and
services turn around and use methods like this, or:

BrowserFingerprinting -
[https://panopticlick.eff.org/](https://panopticlick.eff.org/)

Super/Ever Cookies -
[https://github.com/samyk/evercookie](https://github.com/samyk/evercookie)

HSTS identification -
[https://www.owasp.org/index.php/HTTP_Strict_Transport_Securi...](https://www.owasp.org/index.php/HTTP_Strict_Transport_Security_Cheat_Sheet)

or whatever the new sneaky identification via browser features hack happens to
be.

~~~
cryptonector
The whole "tell users we're using cookies" thing is idiotic -- every site uses
cookies, and that's fine. The trouble lies not in the site's use of cookies,
but in the use of cookies by third-party content providers used by the site.

It's the FB, Twitter, and other "like" and "share" buttons that are the
problem.

~~~
michaelbuckbee
This is the subtlety I'm getting at: what the GDPR says is that you (as a
controller operating a website) need to say how you are tracking people.

Because the minute you throw down a technical rule, it's trivial to route
around it. Case in point: say you allowed only first-party cookies (as defined
by cross-domain browser behavior and cookie setting). I could set up a CNAME
for analytics.my-domain.com that points to FB's ad retargeting servers, and
they could include in the actual cookie data my id, IP, or whatever else
they'd need to look up a user's information across domains.
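That CNAME trick in zone-file form, for concreteness (all hostnames here are hypothetical):

```
; Hypothetical zone fragment: a first-party-looking hostname that
; actually resolves to a third-party tracker, routing around any
; "first-party cookies only" rule.
analytics.my-domain.com.   300   IN   CNAME   retargeting.ad-network.example.
```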

Same with IPs: every GDPR discussion devolves into someone saying, "Well,
every site uses IPs! We have them in our logs! It's stupid to say they're
personal data." That's how I thought of them too, until I got a glimpse into
some of the ad networks/bidding systems, where IPs are treated as more or less
a currency for targeting, demographics, etc.

With the GDPR it's not the data, it's what's done with the data.

~~~
cryptonector
I mean, there's nothing wrong with statutes and regulations being worded
generally as long as they are not vague. Getting too prescriptive in this
particular context would be counterproductive as the technology issues change
over time.

What the GDPR should say is things like "do not allow third-parties to track
your users" or "warn users about third-party tracking". That would be
exceedingly clear and actionable.

Now, "do not allow third-parties to track your users" might require some
additional provisions regarding implementation via contractual clauses. For
example, if site A uses resources from site B and has a contract with B
stipulating non-tracking of A's users, is that good enough? What if B is
outside the EU? And so on. But aside from such side issues, "do not allow
third-parties to track your users" is trivial to implement: a) only embed
resources from third parties that agree not to track your users, b) do not
embed resources from any other third parties.

"Warn users about third-party tracking" is even easier to implement: if you
embed resources from third parties, you must warn.

> With the GDPR it's not the data, it's what's done with the data.

Good! You can use contracts to manage this with all the third-parties you deal
with.

Now, what about links to [non-embedded] external resources? We must not kill
the web.

~~~
michaelbuckbee
Ok, so you're describing (IMO) exactly what's in the text of the GDPR.

Here is a "plain English" translation of the GDPR that makes much of this more
evident:

[https://blog.varonis.com/gdpr-requirements-list-in-plain-eng...](https://blog.varonis.com/gdpr-requirements-list-in-plain-english/)

If you dig into it, I think you'll find it lines up nicely with what you are
suggesting.

------
recursive
The gist of it is that there's a timing attack on fancy stacks of blending
modes. Calculating a final pixel color takes different amounts of time for
different underlying pixels. So javascript can "scan" and OCR a page, or an
iframe in that page.
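A minimal sketch of the measurement side of such a scan (function names, the threshold, and the sample count are made up for illustration; the real exploit stacks many blend layers over each target pixel):

```javascript
// Sketch: per-frame render time depends on the pixel under the blend
// stack, so slow frames imply one color and fast frames the other.
// The classifier below is pure and testable; the sampling half needs a
// browser and is shown in comments.

// Decide a pixel's "bit" from sampled frame times (median vs. threshold).
function classifyPixel(frameTimesMs, thresholdMs) {
  const sorted = [...frameTimesMs].sort((a, b) => a - b);
  const median = sorted[Math.floor(sorted.length / 2)];
  return median > thresholdMs ? 1 : 0; // 1 = "expensive" pixel
}

// In a browser, sampling would look roughly like:
//   function sample(n, done, times = []) {
//     const t0 = performance.now();
//     requestAnimationFrame(() => {
//       times.push(performance.now() - t0);
//       n > 1 ? sample(n - 1, done, times) : done(times);
//     });
//   }
//   sample(30, times => console.log(classifyPixel(times, 8)));
```

Repeating this while moving the blend stack one pixel at a time yields the "scan" that can then be OCR'd.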

~~~
pishpash
All doors are open for side-channel leaks. Having programs run on your machine
is like having someone watch your computer. The issue is how to control the
exfiltration of that data from your computer; that's what the sandbox should
focus on.

~~~
_yosefk
Not sure why this is being downvoted. I admit that I find it hard to see how
to control the exfiltration of data once it's been figured out by a badly
sandboxed program. But the idea that systematically addressing side channel
attacks in a sandbox is really hard seems very valid. I mean, the exploit
described in TFA is quite the argument in favor of this point.

------
jameslk
It's surprising to me that anything is allowed to overlap iframes at all. This
seems to be a recurring vector for security issues, such as fake buttons
placed over the Facebook "Like" button, etc.

~~~
mbell
That would break a lot of things. For example, you couldn't use a modal on a
site that had a Facebook like button on it.

Clickjacking is generally avoided with the `X-Frame-Options` header or the CSP
frame-ancestors directive.
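For reference, those two mechanisms as response headers (the allowed origin is illustrative):

```http
X-Frame-Options: SAMEORIGIN
Content-Security-Policy: frame-ancestors 'self' https://trusted.example
```

Pages that should never be framed would send `X-Frame-Options: DENY` / `frame-ancestors 'none'` instead.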

~~~
makecheck
Are you sure? Maybe the modal couldn't cover the Like button; it might be
weird if the user could still click Like while your modal is up, but it
wouldn't _break_ your page if this were the case.

~~~
mbell
I don't understand what you are proposing. If the modal can't cover the like
button, how would you implement a modal? How would you implement the standard
background fade when a modal is open (which is a div covering the entire
page)? How would you propose implementing this change such that it doesn't
require millions of websites to change their code?

~~~
makecheck
A full background fade is an aesthetic thing at best, and it would be
disingenuous to fully fade a page if some parts are not in fact blocked (i.e.
remain clickable). So I think it would require no changes, it would just look
different.

Having said that, I see no reason to support Like buttons OR modals and I
would be fine with millions of sites being forced never to use them. We know
Facebook Like buttons are over-engineered trackers whose domains are best
blocked at your router. And when modals are not being abused for things users
don’t want, the remaining “good” uses for modals would still be better
implemented as less-intrusive and more-asynchronous things (display a modeless
background message with buttons, for example).

~~~
J5892
It seems to me that you've never worked at a company with a marketing team.

------
zethraeus
Alternative interpretation: Iframes are so overpowered and such an edge case
for the browser's security models that they cause constant issues with the
rest of the reasonable browser spec.

~~~
unilynx
Iframes will never go away.

But we need a way to tell a browser "if you embed this page into an iframe, it
needs to be the topmost content, no transform/translate/visibility or
anything". Social/login iframe would set that flag, and it would prevent the
clickjacks and attacks like this.

~~~
user5994461
All websites are supposed to set the X-Frame-Options: DENY header to block
being framed. It's a solved problem.

~~~
empyrical
That doesn't solve the issue of clickjacking attempts on pages meant to be in
iframes (FB like buttons are in iframes)

------
Groxx
Worth noting that this general category of render-timing-based pixel-reading
has been around for a while. e.g.
[https://www.contextis.com/media/downloads/Pixel_Perfect_Timi...](https://www.contextis.com/media/downloads/Pixel_Perfect_Timing_Attacks_with_HTML5_Whitepaper.pdf)
is from 2013, uses svg instead, and I could swear I've seen a CSS one from a
few years ago.

I really suspect there's only one fix, and it seems reasonable to me: don't
allow sites to place content over iframes, period. Allowing it opens up all
kinds of exploits like this and various flavors of clickjacking, all basically
unpreventable since there's no limit to the techniques possible.

------
lambda
I don't think this can really be blamed on CSS. Yes, a new CSS feature allowed
a timing attack that could extract information, but these kinds of issues keep
coming up time and again, which means it's not this particular CSS feature
that should be blamed.

The issue is the ability to interact in any way with cross site resources
which contain any kind of potentially sensitive information. This leads to
things like clickjacking attacks, CSRF, information leaks like this, and so
on.

The things that could be done to fix it:

1. Don't allow any kind of cross-site embedding (yeah, this isn't going to
happen).

2. Treat any kind of cross-site embedding like private browsing mode; don't
ever send any credentials along with it.

3. Don't allow the embedding site to interact in any way with embedded
content. Treat it like an entirely separate, opaque layer above everything
else, not subject to layering anything over it.

Of course, I don't think any of these are actually going to happen, because
they'd break too much. But otherwise, it's going to be a game of whack-a-mole
with information leaks, new kinds of clickjacking and CSRF, and so on.

~~~
blattimwind
> 3. Don't allow the embedding site to interact in any way with embedded
> content. Treat it like an entirely separate, opaque layer above everything
> else, not subject to layering anything over it

I'm actually surprised to hear that this is _not_ the fix here. Instead, they
optimized the rendering code, which might not preclude more sophisticated
attacks on that side channel.

~~~
mattnewton
As noted elsewhere, this would cause sites with modals or menus to appear
broken when the menus or modals are stuck behind the iframe. Yes, I know,
don't make sites with iframes and lightboxes or whatever, but people do, and
the option they chose won't break those sites.

------
mcav
Building a powerful sandbox that maintains anonymity is a difficult problem.
The statement that CSS is overpowered is clickbait for a _different_ argument.

------
A_No_Name_Mouse
"Besides Habalov, another researcher named Max May independently discovered
and reported this issue to Google in March 2017."

I wonder why they are only fixing it now, when they didn't do anything for
over a year.

~~~
ThePadawan
The disclosure is being _published_ now. From the timeline at the bottom of
the article, it becomes clear that they only ignored the issue for 2 months.

That is still more than zero delay, but a bit more reasonable.

~~~
A_No_Name_Mouse
How is that?

> 2017-12-06 Fixed with Chrome version 63.0
>
> 2018-05-15 Fixed with Firefox Quantum version 60.0

That's 9 months for Chrome and more than a year for FF Quantum?

~~~
andrewaylett
But also "2017-11-26 Reported the vulnerability to Mozilla’s VRP" because "due
to some misunderstandings on our side, reporting the vulnerability to Mozilla
was delayed".

So I think "more than a year" is a little unfair to Mozilla.

------
dang
Url changed from [https://www.bleepingcomputer.com/news/security/css-is-so-ove...](https://www.bleepingcomputer.com/news/security/css-is-so-overpowered-it-can-deanonymize-facebook-users/), which points to this.

------
barbegal
This is really a timing side-channel attack. Such attacks have been known
about for a long time, but this one allowed you to leak information about the
color of a single pixel in your target page. I wouldn't say this is due to CSS
being overpowered; it is much more due to the way iframes can be configured to
show a single pixel, and the way rendering times for your page leak into
JavaScript. If rendering is not done in constant time, the vulnerability
occurs, and CSS in this case allowed you to trigger non-constant-time
rendering. The fix was to make the rendering constant time again.

------
tripzilch
This line particularly caught my eye:

> It was quite surprising for us to find out that the blend mode layers were
> able to interact with cross-origin iframes in the first place so we
> investigated this further.

Because a few years ago I was wondering the same thing. Someone figured out a
side-channel timing attack using SVG filters, exploiting differing execution
times of code paths in the Erode filter (probably also in the Dilate filter,
but they only needed one). IIRC, they could even apply it to an iframe with a
view-source: URL of another domain (probably FB again), whose scrollbars they
could control to scroll into view the session key that showed up in the HTML
source somewhere. This raised a couple of questions for me ...

Why can we put a view-source: URL in an iframe at all? Why can we control the
scrolled position of content in a cross-domain iframe? And why are SVG filters
capable of processing cross-domain content, instead of just seeing a black
square or something?

AFAIK they fixed it by changing the filters, making sure all code-paths were
of equal length. Which is part of the right solution because side-channel
attacks can come from anywhere (it's kind of their thing). But making the
whole system more robust by just not allowing _operations_ on cross-domain
content would do a whole lot for plugging many of these bugs.

I'm kinda surprised there haven't been a lot more of these. Could be I've
missed them though.

------
lucideer
This is a really interesting vulnerability, and really interesting research,
but the clickbait title-with-an-agenda is really terrible. Can we change it?

------
2bitencryption
> Habalov says that depending on the time needed to render the entire stack of
> DIVs, an attacker can determine the color of that pixel shown on the user's
> screen.

I wonder how this is the case?

Anyway, lots of interesting vulnerabilities of late that utilize the
measurable time of computations, and use it to reconstruct data. I think that
type of "lossy attack" is so cool and creative.

~~~
recursive
A fancy darkening algorithm on #000 can be short-circuited. I don't know if
that's precisely what's going on here, but it's a general example.
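A toy illustration of that class of bug (not Chrome's actual code, and the "darken" here is a made-up stand-in): a blend routine that short-circuits on some inputs runs in data-dependent time, while the "vectorized" form the article describes does the same arithmetic for every pixel.

```javascript
// Toy model of a data-dependent blend: the early-out makes runtime
// depend on the pixel value, which is exactly what a timing attack reads.
function darkenBranchy(a, b) {
  if (a === 0) return 0;   // short-circuit: black input skips the work (fast path)
  return Math.min(a, b);   // slow path does the real computation
}

// Branchless version: identical result for all inputs, same work every
// time, analogous to the vectorized fix described in the article.
function darkenBranchless(a, b) {
  return Math.min(a, b);
}
```

The two functions agree on every input; only the timing profile differs, which is why the fix could change the implementation without changing rendered output.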

------
maxk42
This isn't the first time a color-exfiltration vulnerability has been found in
CSS. Disappointing that we're revisiting this same issue.

~~~
bsimpson
I believe this is also why there's not a CSS equivalent of Adobe's Pixel
Bender shader system.

They're worried that someone could write a shader that would detect sensitive
info (like a credit card number) and jank the render thread in an observable
way to leak the number to a script running in the parent window.

~~~
Bjartr
Actually, CSS shaders are a thing, but can't access DOM content pixels for
exactly that reason.

[http://alteredqualia.com/css-shaders/article/](http://alteredqualia.com/css-shaders/article/)

------
mirimir
Expecting browsers to prevent cross-site exfiltration and tracking is pretty
iffy. When it really matters, even preventing that at machine level is pretty
iffy.

Compartmentalization among multiple machines is better. Maybe sandboxing aka
light virtualization can be adequate. But I'd rather go with at least full
virtualization. And when it really matters, I use multiple hosts, with network
isolation.

------
test6554
I am fine with locking down cross-origin iframes, but one major annoyance is
the inability to detect whether the iframe has a scrollbar and if so, adjust
the size until it doesn't have one. I really wish that there was a "height:
auto" property for iframes that automatically adjusted the height so that it
didn't need a scrollbar.
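A cross-origin frame can't be measured directly, but if you control the framed page too, the usual workaround is a postMessage handshake (the message shape and origins below are made up for illustration):

```javascript
// Child frame: report its own content height to the embedding page.
// Pure helper so the message shape is easy to test outside a browser.
function resizeMessage(height) {
  return JSON.stringify({ type: "frame-resize", height });
}

// In the framed page (browser only):
//   parent.postMessage(
//     resizeMessage(document.documentElement.scrollHeight),
//     "https://embedder.example");

// In the parent page: grow the iframe so no scrollbar is needed.
//   window.addEventListener("message", e => {
//     if (e.origin !== "https://framed.example") return;
//     const msg = JSON.parse(e.data);
//     if (msg.type === "frame-resize") {
//       document.querySelector("iframe").style.height = msg.height + "px";
//     }
//   });
```

This only works with the framed site's cooperation, which is exactly the point: the parent never reads anything out of the frame itself.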

~~~
unilynx
Once upon a time the specs planned a <iframe seamless>, but unfortunately it
was never widely implemented. Probably because it also tried to do things like
letting the outer CSS leak into the seamless frame.

------
throwaway5752
Short of something drastic like Lynx, is there a good dumbed-down browser from
a reputable source? It feels like there's no good check on site complexity.
The security risk, but also the absolutely embarrassing resource consumption,
is almost too much to bear.

~~~
astrobe_
You don't realize it, but by saying this you are rejecting the modern Web
(insert horror screams here). So maybe try the old passive web: Gopher.

~~~
throwaway5752
Hah, maybe a middle ground vs gopher. The GDPR mitigation from NPR
([https://text.npr.org/](https://text.npr.org/)) was refreshing. The amount of
effort for display capabilities above and beyond moving around plain old text
+ some markup for layout has been totally out of proportion to the end-user
value. I kind of wonder about things like how much memory/CPU it takes to run
Slack (just on my laptop) vs. the total NASA needed to design and manage the
Apollo 11 mission. Anyway, that's OT enough. It's just that this is such a
ridiculous vector that it's irritating.

~~~
roywiggins
text.npr.org is older than the GDPR

[https://news.ycombinator.com/item?id=15342758](https://news.ycombinator.com/item?id=15342758)

~~~
throwaway5752
Thank you! I didn't know that.

------
fareesh
Now that is crafty.

------
anchpop
Is there a mirror? Site's down for me

~~~
recursive
[https://webcache.googleusercontent.com/search?q=cache:FoYSf9...](https://webcache.googleusercontent.com/search?q=cache:FoYSf9fZz9MJ:https://www.bleepingcomputer.com/news/security/css-is-so-overpowered-it-can-deanonymize-facebook-users/+&cd=1&hl=en&ct=clnk&gl=us)

