The source code is available on GitHub.
The CPU is advertised as having 6 MB of L2 cache, but in fact it has "L1 = 4 x 32 KB 8-way set associative data caches and L2 = 2 x 3 MB 12-way set associative caches (each L2 cache is shared between 2 cores)".
You can clearly see the constant-time access to the L1 cache up to 32 KB. Access time then grows (linearly?) up to 2 MB, where the working set still fits in one of the L2 caches, and then follows a different (log-like?) curve from there on.
This is really neat.
The most thoughtless part is that people now use these technologies even for their private projects. HTML must be developed further, or some alternative designed.
We should just say no to the "modern web".
I'm starting to realize that running someone else's code means trusting the developer. No matter the amount of sandboxing and the layers of abstraction, eventually their code will be able to run as root using exploits that are yet to be discovered.
But in reality you (and billions of people around the world) have this thing called a web browser with JS enabled, where opening a simple text document is basically indistinguishable from running a BTC miner for the end user. All these people are fucked right now and have been left to hang. All they can do is ignore the news and hope it will just go away.
It did seem to me a viable solution some 10-15 years ago, in the Web 2.0 era. Now, after mobile apps have spread around the world, I've reconsidered. Having an app is just better, even if it's just an HN reader. Better still would be a single app for some common "news-aggregator protocol" that fetches content from HN, reddit, etc. But not only is that not the case, I've heard people say that this attitude is "killing the web". Well, if so, fuck you, the web should be killed. It should never have existed at all in the form we currently have.
@Mozilla, are you reading? You were supposed to protect us, a star of hope in a stormy night.
We shouldn't give in to the market when the market is wrong.
What do you mean?
LG v30 (L2 2MB): https://i.imgur.com/q5R3bLY.png
Still works on the latest macOS (10.13.2), though.
Based on the title, it's something to do with cache latency, but that doesn't really help me. What exactly is it measuring? How does it perform this measurement? What are the limitations of this technique? What are the wider implications of this? What uses does it have, both potentially nefarious ones and potentially beneficial ones?
You would typically see larger latencies when the caches are not being effective.
Multiple humps indicate the presence of multiple cache levels.
The config in Chrome is broken: WebAssembly can no longer be deactivated via chrome://flags/#enable-webassembly . Setting it to "disabled" leaves it active anyway.