Show HN: detecting cache latency inside a Web browser (maratyszcza.github.io)
107 points by Marat_Dukhan 11 months ago | 47 comments



Author here. I made this demo and a related matrix-matrix multiplication demo [1] back in 2015 for Robert van de Geijn's Linear Algebra: Foundations to Frontiers MOOC class [2]. In light of the Spectre attack and recent browser changes to reduce timer precision, I remembered this project and decided to check whether it still works now, 3 years later. Surprisingly, it still works well!

The source code is available on GitHub [3].

[1] https://maratyszcza.github.io/laff-demos/dgemm.html

[2] https://www.edx.org/course/linear-algebra-foundations-fronti...

[3] https://github.com/Maratyszcza/laff-demos
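For readers curious about the mechanics, here is a minimal sketch of the general technique (not the demo's actual source, which is linked above): chase a randomly shuffled pointer cycle through a buffer, so hardware prefetchers can't hide cache misses, and time the accesses.

```javascript
// Minimal sketch (not the demo's actual source): time a random pointer
// chase through a buffer, so hardware prefetchers can't hide misses.
function measureLatency(sizeBytes, iterations) {
  const count = sizeBytes / 4;            // 4-byte elements
  const buf = new Uint32Array(count);
  // Fisher-Yates shuffle to get a random visiting order...
  const order = Array.from({ length: count }, (_, i) => i);
  for (let i = count - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [order[i], order[j]] = [order[j], order[i]];
  }
  // ...then link the buffer into one random cycle: each slot holds the
  // index of the next slot to visit.
  for (let i = 0; i < count; i++) {
    buf[order[i]] = order[(i + 1) % count];
  }
  let idx = 0;
  const start = performance.now();        // global in browsers and Node >= 16
  for (let i = 0; i < iterations; i++) idx = buf[idx];
  const ms = performance.now() - start;
  return { nsPerAccess: (ms * 1e6) / iterations, sink: idx };
}

// A buffer that fits in L1 vs. one that spills past L2/L3:
console.log(measureLatency(16 * 1024, 1e6).nsPerAccess);
console.log(measureLatency(8 * 1024 * 1024, 1e6).nsPerAccess);
```

Sweeping sizeBytes from a few KB to tens of MB and plotting nsPerAccess reproduces the characteristic staircase: flat while the buffer fits in a cache level, rising at each capacity boundary.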


Could you make the website viewable without JavaScript?

Edit: I think the downvotes are unjustified. For clarification, if that wasn't clear from context: I don't expect to get the JavaScript test results from my computer while viewing the website without JavaScript. Demanding that would obviously be nonsense. Rather, I assume there is information on that website that is interesting to read even without running JavaScript personally. Or is using JavaScript now a requirement for learning about JavaScript?


There's only a graph of the results, so you are not missing any other content.


Thanks for the info.


I guess you'd have had more luck if you'd asked "can you make some examples accessible for those of us who don't run JS?".


I didn't know what to expect to see. I vaguely assumed there would be some text and data, which is why I formulated my question carelessly. Still, I think the difference is small and my question wasn't extraordinary.


I use uMatrix, this site just shows up as a white page until I allow it to load some JS from a third party. Once I allow the JS, I see a graph that gets built without any explanation of what that graph means.


Is there a particular reason why you can't enable JS yourself? Metered connection? Low-end machine?


I consider it an unacceptable form of code deployment. It's unsafe in the computing sense, and it leads to an ecosystem where users are less and less in control of the software they use.


Metered wouldn't really matter, would it? Scripts are still downloaded, right? (I've never had JS off, because the web.)


Metered could matter: it's very easy to block downloading of any non-inlined JS (e.g. with uBlock).


On my pretty outdated Core 2 Duo Q9300: https://i.imgur.com/HQXm5FU.png

The CPU is advertised as 6 MB L2 cache[0] but it has "L1 = 4 x 32 KB 8-way set associative data caches and L2 = 2 x 3 MB 12-way set associative caches (each L2 cache is shared between 2 cores)"[1].

You can clearly see the constant-time access to L1 cache up to 32 KB. Then access time grows (linearly?) up to 2 MB, where the data still fits in one of the L2 caches, and from there it follows a different (log-like?) curve.

This is really neat.

[0] https://ark.intel.com/products/33922/Intel-Core2-Quad-Proces...

[1] http://www.cpu-world.com/CPUs/Core_2/Intel-Core%202%20Quad%2...


Somewhat related: using the cache latency to estimate the CPU cache size. https://fromwhenceitca.me/cache_size/cache_size.html


So, who thought it was a great idea again to use a mechanism intended for document transfer to run unsigned, unaudited code on just about every client in the world?


That's because companies want to keep making money by using data users would never hand over otherwise, and by keeping control over the software people depend on, after it became clear that free software had won on users' computers.

The most thoughtless part is that people now use these technologies even for their private projects. HTML must be developed further, or some alternative designed.

We should just say no to the "modern web".


To be fair, the problem is not specific to web browsers. It's about running code in general.

I'm starting to realize that running someone else's code means trusting the developer. No matter the amount of sandboxing or the layers of abstraction, eventually their code will be able to run as root via exploits that are yet to be discovered.


I would argue it is very much related to web browsers. Sure, your torrent engine or audio player could be vulnerable as well, but how likely is that? If it were, I'd say the developers of that audio player had seriously fucked up by essentially allowing a third party to run arbitrary code on your PC, something an audio player hardly needs. The most important question would be: what do you have installed? And sure, it could potentially be all kinds of malware, but then chances are you're fucked anyway, because installing software, ironically, requires much more trust than opening a web page.

But in reality you (and billions of people around the world) have this thing called a web browser with JS enabled, where, for the end user, opening a simple text document is basically indistinguishable from running a BTC miner. All these people are fucked right now and have been left out to hang. All they can do is ignore the news and hope it will just go away.

It did seem to me a viable solution some 10-15 years ago, in the time of Web 2.0. Now, after mobile apps have spread around the world, I've reconsidered. Having an app is just better, even if it's just an HN reader. Even better would be a single app for some common "news aggregator protocol" that fetches content from HN, reddit, etc. But not only is that not the case; I've heard people say that this attitude is "killing the web". Well, if so, fuck you, the web should be killed. It should never have existed in the form we currently have.


Sure, but until the advent of JS you at least knew whose code you were going to run. Now you are more or less required to run untrusted code just to get through daily life.


And web platform people remain convinced that HTTPS transport encryption is sufficient to protect everyone, even though desktop app, OS, and bootloader people have been doing code-signing for something like two decades.


People wanted animated buttons.


A way to achieve that in declarative HTML or an equivalent sane technology could be worked out and implemented, if it hasn't been already.

JavaScript is used for many small things where you don't need the expressive power that JavaScript has.

@Mozilla, are you reading? You were supposed to protect us, a star of hope in a stormy night.


Yeah, they did exactly that. But it was too late. And we really lost the battle when browser vendors (one of whom has a vested interest in advertisements and tracking) took over the HTML standard from the W3C.


There is no "too late" for sane and open standards.

We shouldn't give in to the market when the market is wrong.


The market is definitely wrong, but peeing against the wind sets you up for a really sad outcome if you're not pure in the Stallman-sense.


I’m confused too. Being “pure” somehow keeps your leg from getting wet?


If you're pure, you don't care about getting soaked. You do it because you believe it's the right thing to do, even if people mock you and you don't make big money.


> sad outcome if you're not pure in the Stallman-sense.

What do you mean?


Sad outcome: realizing that 99% of the people in our field don't care about freedom or privacy, and are willing to actively work against these ideas as long as it is beneficial for their yearly bonus or their next valuation round.


The original sin is JITing JavaScript.


Brendan Eich?


This is similar to how I detected total CPU cores in 2013 to implement navigator.hardwareConcurrency: https://eligrey.com/blog/cpu-core-estimation-with-javascript...


i5-4690K (L1 4x32KB; L2 4x256KB; L3 6MB): https://i.imgur.com/FUs1isW.png

LG v30 (L2 2MB): https://i.imgur.com/q5R3bLY.png


And with the latest iOS 11.2.2, this (admittedly cool) piece of code no longer works.

Still works on the latest macOS (10.13.2) though.


Hmm... I just tried on an iPhone 7 with iOS 11.2.2, and it still works, though it takes very long to start.


On my OnePlus X Android phone: https://imgur.com/a/C1hb7


On my i7 8700k https://imgur.com/a/D71v1


Thinkpad T42p with Pentium M 1.8GHz, 2MB L2 cache

https://imgur.com/agxy8e7


I'd like to see some text explaining what I'm looking at.

Based on the title, it's something to do with cache latency, but that doesn't really help me. What exactly is it measuring? How does it perform this measurement? What are the limitations of this technique? What are the wider implications of this? What uses does it have, both potentially nefarious ones and potentially beneficial ones?


Is there a plan to collect this data voluntarily? It would be interesting to see the different percentiles.


No, it is a static web page, and all code runs only locally in your browser.


Is that RAM only, or also L1 etc. caches?


It's characterising memory accesses, so it will show the effect of all the caches in the system.

You would typically see larger latencies when the caches are not being effective.

Multiple humps will indicate the presence of multi-level caches.


Okay, so how can we deactivate asm.js and WebAssembly? (In light of Meltdown and Spectre.)

The config in Chrome is broken: WebAssembly can no longer be deactivated via chrome://flags/#enable-webassembly. Even with the flag set to "disabled", it's still active.


Asm.js is not necessary; a simple JavaScript interpreter is enough. This demo used to work before most browsers implemented optimizers for asm.js.


Mozilla Foundation Security Advisory 2018-01 Speculative execution side-channel attack ("Spectre")

https://www.mozilla.org/en-US/security/advisories/mfsa2018-0...


You need to disable `SharedArrayBuffer` and `performance.now`, which Firefox already did.
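On the timer side, browsers didn't remove performance.now() so much as coarsen it. A toy illustration of quantizing a timestamp (not any browser's actual implementation, which also adds jitter on top of the coarser tick):

```javascript
// Toy illustration of timer coarsening, not any browser's actual
// implementation (real mitigations also add jitter): clamp a
// high-resolution timestamp down to a coarser tick.
function coarsen(timestampMs, resolutionMs) {
  return Math.floor(timestampMs / resolutionMs) * resolutionMs;
}

console.log(coarsen(123.456, 2)); // → 122
console.log(coarsen(7.9, 1));     // → 7
```

With, say, a 2 ms tick, the sub-microsecond resolution that cache-timing distinctions rely on is gone, which is why it's notable that this demo still resolves the cache hierarchy.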


chrome://flags/#enable-site-per-process is a mitigation that at least raises the bar a little. :/



