This will take several minutes, so if you see weird incongruities or disappearances, hold your fire.
Edit: Ok, I've done as much of this as I'm going to do. If you notice anything wrong, can you let us know at email@example.com so we can fix it?
* https://zombieloadattack.com/ , on Hacker News as https://news.ycombinator.com/item?id=19911341 (The technical paper is hidden inside a collapsed part of the page and is at https://www.cyberus-technology.de/posts/2019-05-14-zombieloa... .)
* https://mdsattacks.com/ , on Hacker News at https://news.ycombinator.com/item?id=19911277
* Google's announcement about ChromeOS at https://sites.google.com/a/chromium.org/dev/chromium-os/mds-... , on Hacker News at https://news.ycombinator.com/item?id=19911406
(Several Hacker News discussions have since been merged here, and were then re-split.)
(This comment was merged from a duplicate discussion.)
Computer makers? Wouldn't that be OS makers? They are patching their OS to prevent leaking...
Almost like saying that the "software maker" John Deere will fix their latest-model harvester.
How many days are you willing to wait for your computer to boot?
GPUs aren't "better" than CPUs, they're different. Between the two, CPUs probably make better GPUs than GPUs make CPUs, but it's a tough call; neither is very good at being the other!
Huge, huge swathes of your normal boot process would be trying to run on a single GPU execution unit, since there would be no parallelization available, and your GPU is terrible at out-of-order dispatch (i.e., it basically can't do it, last I knew), so all the optimization we've spent the last 50 years putting into our CPUs won't be firing. You'd basically be trying to run your computer on something in the range of 100MHz down to, for all I know, single-digit-MHz-equivalent of your current Intel or AMD CPU (after all the penalties from not using any prediction, not having the proper caches, thrashing like hell in your GPU's memory caches, and all the other effects... I'm not even sure I'm willing to promise you'll never hit code with KHz-equivalent performance; you might just get that NES performance!).
Your modern GPU-based computer trying to boot Windows or Linux is gonna craaaaaaawwwwwwllllll. Can it be done? With the right work, yeah, probably, but you're not going to enjoy it, or be willing to use it. CPUs are terrible GPUs, but GPUs are terrible CPUs.
How well a computer could run if it was optimized for the GPU is an open question, but I guarantee you that if in some bizarre parallel universe everything was running on GPU-like hardware, but in 2010 suddenly people figured out CPUs and next year the Core Duos were available, people would be flipping out over how awesomely they perform and would be rushing to rewrite huge swathes of code in these new-fangled "in-order execution units".
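A minimal CUDA sketch of the effect being described, assuming nothing beyond standard CUDA: the same total arithmetic run as one long dependency chain on a single GPU thread (the shape of most boot/OS code) versus split into independent chunks across many threads. The kernel names, op counts, and thread counts here are my own illustrative choices, not anything from the thread.

  // serial_vs_parallel.cu -- build with: nvcc -O2 serial_vs_parallel.cu
  #include <cstdio>
  #include <cuda_runtime.h>

  // Serial case: one thread walks a long dependency chain, so none of the
  // GPU's thousands of execution lanes can help, and there is no
  // out-of-order machinery to hide the latency of each step.
  __global__ void serial_chain(float *out, int n) {
      float x = 1.0f;
      for (int i = 0; i < n; ++i) {
          x = x * 1.000001f + 1.0f;   // each step depends on the previous one
      }
      *out = x;
  }

  // Parallel case: the same total number of operations, but split into
  // independent chunks the GPU can schedule across many threads at once.
  __global__ void parallel_chunks(float *out, int per_thread) {
      float x = 1.0f;
      for (int i = 0; i < per_thread; ++i) {
          x = x * 1.000001f + 1.0f;
      }
      out[blockIdx.x * blockDim.x + threadIdx.x] = x;
  }

  int main() {
      const int total_ops = 1 << 26;   // ~67M dependent steps (arbitrary choice)
      const int threads   = 1 << 16;   // 65536 threads for the parallel case
      float *d_out;
      cudaMalloc(&d_out, threads * sizeof(float));

      cudaEvent_t start, stop;
      cudaEventCreate(&start);
      cudaEventCreate(&stop);
      float ms;

      // 1 block, 1 thread: the "GPU pretending to be a CPU" case.
      cudaEventRecord(start);
      serial_chain<<<1, 1>>>(d_out, total_ops);
      cudaEventRecord(stop);
      cudaEventSynchronize(stop);
      cudaEventElapsedTime(&ms, start, stop);
      printf("1 thread, %d dependent ops: %.1f ms\n", total_ops, ms);

      // Same op count spread over 65536 threads: the workload GPUs are built for.
      cudaEventRecord(start);
      parallel_chunks<<<threads / 256, 256>>>(d_out, total_ops / threads);
      cudaEventRecord(stop);
      cudaEventSynchronize(stop);
      cudaEventElapsedTime(&ms, start, stop);
      printf("%d threads, same total ops: %.1f ms\n", threads, ms);

      cudaFree(d_out);
      cudaEventDestroy(start);
      cudaEventDestroy(stop);
      return 0;
  }

The exact numbers will vary by card; the point is only that the single-thread launch leaves all but one lane idle and exposes every instruction's latency, which is roughly the situation a boot process would put a GPU in.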
The examples you give are gaming machines, so the programs on the CPU, especially on older machines, are built around video interrupts to facilitate smooth graphics. That's a programming choice made on the CPU side.
Meanwhile, GPUs are designed to run thousands of threads in parallel, and are orders of magnitude slower than a CPU when running only a handful of threads.