
Fantastic Timers: High-Resolution Microarchitectural Attacks in JS (2017) [pdf] - jhatax
https://gruss.cc/files/fantastictimers.pdf
======
userbinator
It's apparently a pretty controversial position to have now, but I'll say it
again:

Leave JS off by default. Whitelist sites that absolutely need it. Running
completely untrusted foreign code on your hardware has been, is, and likely
will continue to be a source of security problems.

As a bonus, you get to enjoy much faster-loading, nearly ad-free and cruft-
free pages on almost every site. Now couldn't be a better time to try it out.

~~~
madez
I'm astonished at how fast websites are without JavaScript. I just turned it
on today to place an order on eBay and was immediately annoyed by how slow
and sluggish it was. Why aren't there programs for buying stuff on the internet?

There are some other minor pain points though. Why can't I collapse comments
here on HN without JavaScript?

There are more examples of minor issues that seem to be completely solvable
by a statically defined markup language. Mozilla, where are you when we need
you?

Allowing JavaScript only on certain sites is not a good solution. First, it
leads to unfair conditions and thus centralization. Second, the appification
of the web continues, along with its consequence of ever-increasing dependency
on software that is outside the control of the users.

~~~
pdkl95
> Why can't I collapse comments

Features like this could easily be a declarative feature in HTML, e.g.
something like:

    <div class="comment_wrapper">
        <action type="show,hide" for="foo">
            <img ...> <!-- or whatever -->
        </action>
        <div id="foo" class="comment"> ... </div>
    </div>

Unfortunately, development of new HTML features like this mostly stopped when
a major browser developer decided to try to fight with Microsoft for the
desktop/mobile markets by changing webpages into software.

~~~
odammit
You don’t even need that. You can work some magic with CSS to hide/show and
even rotate/flip elements using inputs.

    input:checked + div { display: none; }
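
A fleshed-out sketch of that checkbox hack (the markup and class names are
purely illustrative):

    <style>
      .toggle { display: none; }                    /* hide the real checkbox */
      .toggle:checked ~ .comment { display: none; } /* collapse when checked */
    </style>
    <label for="c1">[-] collapse</label>
    <input type="checkbox" id="c1" class="toggle">
    <div class="comment">comment text ...</div>

Clicking the label toggles the hidden checkbox, so the comment collapses and
expands without a line of JS.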

Some people don’t like that either though. :/

------
benjaminjackman
Woah!

Summarizing:

...

Van Goethem et al. exploited more accurate in-browser timing to obtain
information even from within other websites, such as contact lists or previous
inputs.

...

Oren et al. recently demonstrated that cache side-channel attacks can also be
performed in browsers. Their attack uses the performance.now method to obtain
a timestamp whose resolution is in the range of nanoseconds. It allows spying
on user activities but also building a covert channel with a process running
on the system. Gruss et al. and Bosman et al. demonstrated Rowhammer attacks
in JavaScript, leveraging the same timing interface. In response, the W3C and
browser vendors have changed the performance.now method to a resolution of 5
µs. The timestamps in the Tor browser are even more coarse-grained, at 100 ms.

In both cases, this successfully stops side-channel attacks by withholding
necessary information from an adversary. In this paper, we demonstrate that
reducing the resolution of timing information or even removing these
interfaces is completely insufficient as an attack mitigation.

...

Our key contributions are:

– We performed a comprehensive evaluation of known and new mechanisms to
obtain timestamps. We compared methods on the major browsers on Windows, Linux
and Mac OS X, as well as on Tor browser.

– Our new timing methods increase the resolution of official methods by 3 to 4
orders of magnitude on all browsers, and by 8 (!!) orders of magnitude on Tor
browser. _Our evaluation therefore shows that reducing the resolution of timer
interfaces does not mitigate any attack._

– We demonstrate the first DRAM-based side channel in JavaScript to exfiltrate
data from a highly restricted execution environment inside a VM with no
network interfaces.

– Our results underline that quick-fix mitigations are dangerous, as they can
establish a false sense of security.
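
To make the measurement primitive concrete, here is a sketch of my own (not
code from the paper): the core of a cache-timing measurement in JS is tiny,
which is why timer resolution is the whole game.

    // Illustrative sketch only: timing a single memory access.
    // A cached access takes a few ns and an uncached one on the order of
    // 100 ns; real attacks repeat the access or build finer clocks (as
    // the paper does), since even unclamped performance.now offers only
    // microsecond precision.
    const buf = new Uint8Array(64 * 1024 * 1024);
    const i = 1234;                  // hypothetical target offset
    const t0 = performance.now();
    const x = buf[i];                // the access being timed
    const t1 = performance.now();
    console.log(t1 - t0, x);         // log x so the access isn't optimized away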

~~~
pdkl95
> reducing the resolution of timer interfaces does not mitigate any attack.

> quick-fix mitigations are dangerous, as they can establish a false sense of
> security.

This demonstrates - again - the danger of treating security as default-permit.
This blacklist-style thinking is very common, but it _guarantees_
eventual failure because _you cannot enumerate badness_ [1].

Limiting the granularity of performance.now _assumes_ that providing any
timing information at all is safe. It's the same basic misunderstanding of
what it means to design for security that I hear way too often whenever a new
security issue is being discussed: someone always asks "Is this an actual
problem in the wild, or whining about some hypothetical that isn't a 'real
threat'?" So what? Future threats are not limited to only the attacks we know
about today.

As I said in a recent comment[2], the thing that nobody really wants to talk
about is that it isn't possible to know the behavior of programs in a Turing-
complete language without running them (Rice's theorem). Declarative documents
in HTML+CSS were safe, but trying to run potentially malicious Turing-complete
programs safely is a _provably_ futile endeavor.

[1] http://www.ranum.com/security/computer_security/editorials/dumb/

[2] https://news.ycombinator.com/item?id=15708099

------
xigency
Why high-precision timing was ever introduced into browsers is beyond me. I
don't see why timers should be more accurate than millisecond precision, or why
the timer should return different values when polled more than once in the
same execution context (in an event-based language).

Edit:

> Unlike other timing data available to JavaScript (for example Date.now), the
> timestamps returned by Performance.now() are not limited to one-millisecond
> resolution. Instead, they represent times as floating-point numbers with up
> to microsecond precision.

What a terrible idea. If this kind of profiling is needed, it would be better
off as a feature of the browser developer tools, not a built-in function.

None of these examples are compelling to me either:
https://w3c.github.io/hr-time/#introduction

HTML5 audio has its own standards, as does drawing. In fact, a high-precision
timer is the worst solution to any of those problems.
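
For reference, the granularity a page actually gets is easy to probe with
something like this (numbers vary by browser and version):

    // Rough probe of performance.now() granularity: spin until the
    // returned timestamp changes and report the step size in ms.
    function timerStep() {
      const t0 = performance.now();
      let t1 = t0;
      while (t1 === t0) t1 = performance.now();
      return t1 - t0;  // ~0.001 ms unclamped; 0.005 ms after the 5 µs clamp
    }
    console.log(timerStep() + " ms");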

~~~
niftich
It was introduced because "_web developers are building more sophisticated
applications where application performance is increasingly important.
Developers need the ability to assess and understand the performance
characteristics of their applications using well-defined interoperable
measures_" [1].

Browser-engine makers, like WebKit, were itching [2] to implement this, so
they participated early on.

The proceedings of this W3C working group were conducted in the open; their
mailing list archive is public [3].

For a cute and oblivious early message, see this post [4] from July 2010.
Luckily it was eventually followed up by the first wave of enumerated privacy
concerns [5] in October 2010.

[1] https://lists.w3.org/Archives/Public/public-web-perf/2010Jun/att-0002/webperf.html

[2] https://lists.w3.org/Archives/Public/public-web-perf/2010Jul/0012.html

[3] https://lists.w3.org/Archives/Public/public-web-perf/

[4] https://lists.w3.org/Archives/Public/public-web-perf/2010Jul/0009.html

[5] https://lists.w3.org/Archives/Public/public-web-perf/2010Oct/0027.html

------
feelin_googley
Someone once suggested the risk in opening email attachments was analogous to
the risk in running executable code (e.g., JavaScript) from websites. Each can
expose the user to a multitude of vulnerabilities. The argument against this
comparison at the time was that JavaScript from websites, unlike email
attachments, could be run in a "sandbox". Even if the code could not be
trusted, it was "safe" because it was "isolated".

Today, many users do not run JavaScript because all too often it exposes them
to excessive advertising and resource usage. Now, after researchers defeated
KASLR with JavaScript, and with Meltdown and Spectre being implemented in
JavaScript, there are additional benefits to not running third-party
JavaScript.

------
nostoc
That was extremely quick. Did they have this paper up their sleeve, expecting
browsers to implement this kind of mitigation?

~~~
jhatax
OP here. I didn’t write the paper, but had goose bumps (in a scared, WTF kinda
way) after reading through it. Thanks for the comments so far...

I got the link to this paper from Luke Wagner’s post [1] on Mozilla’s Security
blog regarding Spectre, and the mitigations being implemented in Firefox. I had
a strong sense of foreboding after reading both the post and the linked PDF for
a few reasons:

1. The overall tone of the post is pretty dour because a veritable Pandora’s
box has been opened by these findings.

2. The team doesn’t really know how to fix the issues surfaced without
performance penalties.

3. Not every attack vector is known to the research and browser development
community!

4. Firefox (and other browser vendors) strongly believes that the web as a
platform holds a great deal of promise. For this to become a reality, browsers
need to perform close to native speeds. High-fidelity timers are deemed
pivotal to the performance equation even though they risk the overall security
of the system. This is why Luke and team are going to invest heavily in fixing
these issues as they are reported.

This last point is key, and it surfaces the inherent Catch-22 faced by browser
vendors. They need to balance performance and security considerations for the
web platform to succeed. And right now, security is being sacrificed at the
altar of performance.

This brings me to the question that I have been noodling over all week long:
Is the notion of the “web as a platform” an anachronism? Can we ditch this
idea because native apps have won?

More insightful and influential folks in the HN community should chime in here
with their perspectives.

1. https://blog.mozilla.org/security/2018/01/03/mitigations-landing-new-class-timing-attack/

Edit: light wordsmithing

~~~
pjmlp
Personally, I am betting on the native side.

My web development projects were a pleasure back when it was all about
HTML/CSS instead of trying to duplicate an OS.

------
ezoe
I was wondering how I could use SharedArrayBuffer to implement a high-
resolution timer. It simply consists of a dedicated thread that keeps
incrementing a counter in SharedArrayBuffer memory; reading that counter value
from another thread then gives you an effective high-resolution timer.

It can achieve 2 ns resolution.
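
A minimal sketch of that construction (the worker source is inlined via a Blob
to keep it self-contained, and Atomics are used so the optimizer can't elide
the loop; assumes SharedArrayBuffer is enabled):

    // A makeshift high-resolution clock: a dedicated worker increments
    // a shared counter as fast as it can, and any other thread reads
    // the counter as a timestamp.
    const sab = new SharedArrayBuffer(4);
    const counter = new Uint32Array(sab);

    const src = `onmessage = e => {
      const c = new Uint32Array(e.data);
      for (;;) Atomics.add(c, 0, 1);   // spin forever, one "tick" per add
    };`;
    const worker = new Worker(
      URL.createObjectURL(new Blob([src], { type: "application/javascript" })));
    worker.postMessage(sab);

    // Deltas between two reads are proportional to elapsed wall time.
    const now = () => Atomics.load(counter, 0);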

------
cscheid
So this paper appears to have been published on 2017-12-23 (according to
https://link.springer.com/chapter/10.1007/978-3-319-70972-7_13), which was
ahead of the Spectre and Meltdown disclosures. Truly unfortunate timing for
the browsers implementing mitigations, huh.

It seems that one of the covert channels described here has been fixed
(SharedArrayBuffer), but many of the others have not. The passive reading of
data from timing-based side effects (like the CSS animations described in the
paper) seems particularly hard to avoid in general. HTML5 video will have the
same vulnerability, I bet.

------
Analemma_
I had a depressing feeling this was coming.

Are cache timing attacks just inherently impossible to stop? It’s sure
starting to look that way with the last couple years of security research.
Seems like every mitigation that gets thrown up is knocked down immediately.

~~~
benjaminjackman
It may be that the only true solution is to replace / re-engineer CPUs with
these vulnerabilities expressly in mind, and guarded against.

Until then (and that will probably take a while, obviously) it's probably going
to be a cat-and-mouse game. I would expect a lot of patching. Future
discoveries probably won't have the luxury of a 6-month coordination/fixing
window.

On top of that, this is drawing a lot of attention to a particular area, which
will likely have a snowball effect: as more and more exploits are found, more
and more people will search for them.

Finally, if the mitigation strategy is 'don't let a JavaScript application
accurately know how much time something takes', consider that for 25 years
this has not been a serious fundamental design consideration (quite the
opposite, to the point that a super-accurate timer was given away freely in
the form of `performance.now`). It seems like this is one big messy ball of
yarn to untangle.

~~~
DannyBee
"It may be that the only true solution is to replace / re-engineer CPUs with
these vulnerabilities expressly in mind, and guarded against. "

No such way that anyone knows about exists without huge performance loss.
Without changing existing software/programming/etc paradigms, it is, IMHO,
quite unlikely such a way exists.

------
fpoling
It is interesting that all the security measures that the Tor developers
implemented gave them exactly nothing. The researchers were able to get the
same 2 ns resolution as with stock Firefox.

I wonder if the measures in FF to counter Spectre will fare any better.

------
dboreham
I was wondering when we'd see a paper like this since I saw the post on
Firefox reducing timer resolution. My immediate thought was "there have to be
other ways to get good-enough timers to mount these attacks". Sure enough...

------
jnwatson
Nice timing. This follows up well on my previous HN comment that timers are
everywhere. Pretty much any operation that has a small, predictable duration
can be used as the basis for timing something else.
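
As a toy illustration of that principle (this is the generic clock-
interpolation idea, not any specific attack from the paper):

    // Between two ticks of a coarse clock, count iterations of a cheap,
    // roughly constant-time operation; the count becomes a unit of time
    // finer than the clock's own resolution.
    function ticksPerStep() {
      const edge = performance.now();
      while (performance.now() === edge) {}   // wait for a fresh tick
      const t0 = performance.now();
      let n = 0;
      while (performance.now() === t0) n++;   // iterations per clock step
      return n;   // each iteration lasts ~step/n ms, far below the clamp
    }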

------
jhallenworld
I'm pretty sure a timer on a webserver is going to be good enough to determine
whether an item is cached in the client or not. The time difference between
cached and not-cached is huge.

