Ask HN: Delay buying new devices due to Meltdown/Spectre CPU bug?
114 points by m3nu 9 months ago | 59 comments
Is it advisable to put off any device purchases until this issue is fixed in a new generation of CPUs? So in 1-2 years?

Unless you are building a server, it does not matter. Your GPU handles many heavy desktop workloads such as gaming. If you are building servers, or a machine for CPU-intensive tasks such as compiling Chromium every hour, then you might want to look at AMD. The major performance hits are caused by the Meltdown patches, which do not affect AMD. And the number of cores AMD offers certainly helps in these workloads.

Fixing these bugs at the CPU level will require architectural changes, and such sweeping changes won't come for years. Intel has not made such an overhaul in years, and AMD just made one.

> Unless you are building a server, it does not matter

The average HN user is not the typical desktop/laptop user. I bet we use virtual machines, compilers, technologies like NVMe, and databases (Postgres/MySQL) more often than the typical desktop/laptop user.

Yes, we do. But I doubt anyone is using them for production workloads. Development workloads don't continuously push the CPU to 100%, or even 70%, for long periods of time. Development workloads don't make 1,000 req/sec against Postgres. No one has 10 VMs running at a time on a desktop.

These performance issues will mainly affect servers and cloud providers, plus some edge cases on the desktop. Everyone else can just apply the patches and get back to work.

I think parallel compilation is going to be hit the hardest. With VS2017 (the latest MSVC, C++17), a moderate C++ project (few translation units, only about 100 files, but heavy on constexpr and TMP) with multi-core compilation enabled, on an E5-2660, I measured a 10% penalty in clean-and-rebuild time (originally a minute). I haven't tested with IncrediBuild, though my parallel webpack production build shows the same outcome: 30 seconds to 36 seconds. Neither of which really matters, I'd say. I'm using Windows 10 with KB4056892.

You are probably right although C++ compilation can use a fair bit of processor, especially with parallel compilation (e.g. Qt jom). Not sure how frequent system calls are though.

C/C++ compilations typically read and write a zillion tiny files, and even if those happen entirely in memory buffers, they'll still make a hell of a lot of system calls.
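To put a rough shape on that, here is a toy benchmark (Python rather than an actual compiler, with hypothetical file names) contrasting reading many tiny files with reading the same bytes from one large file. Every small file costs at least an open/read/close syscall trio, which is exactly the kind of kernel crossing the Meltdown patches make more expensive.

```python
import os
import tempfile
import time

def make_files(root, count, size):
    """Create `count` tiny files of `size` bytes each (stand-ins for headers)."""
    for i in range(count):
        with open(os.path.join(root, f"hdr_{i}.h"), "wb") as f:
            f.write(b"x" * size)

def read_many_small(root, count):
    """One open/read/close syscall trio per file."""
    total = 0
    for i in range(count):
        with open(os.path.join(root, f"hdr_{i}.h"), "rb") as f:
            total += len(f.read())
    return total

with tempfile.TemporaryDirectory() as root:
    count, size = 1000, 512
    make_files(root, count, size)

    # Same total bytes, but in a single file: three syscalls instead of ~3000.
    big = os.path.join(root, "big.bin")
    with open(big, "wb") as f:
        f.write(b"x" * (count * size))

    t0 = time.perf_counter()
    small_total = read_many_small(root, count)
    t_small = time.perf_counter() - t0

    t0 = time.perf_counter()
    with open(big, "rb") as f:
        big_total = len(f.read())
    t_big = time.perf_counter() - t0

    assert small_total == big_total == count * size
    print(f"{count} small files: {t_small:.4f}s, one big file: {t_big:.4f}s")
```

On a typical system the many-small-files path is already several times slower, and the gap widens once every syscall pays the page-table-switch tax.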

> The average HN user is not the typical desktop/laptop user.

Maybe so, maybe not. Personally, I have a hunch that most HN users use the same setup as the typical desktop/laptop user.

They might have the same MacBook Pro, but it might not be a dedicated email/YouTube machine.

Maybe, maybe not on their personal computer. Their families' machines are a different story.

Well, I am not.

Also, I do not host critical production machines, just some web and database servers. I wonder how the "I host some stuff on the internet" guy is affected by all this.

Quite. If Intel, AMD, and ARM just slam in fixes now and release as soon as they can, assuming that's even possible, there's a significant chance they'll introduce other security problems. Gaining a high degree of confidence that hasn't happened will most likely require extensive design iteration, verification, and testing. So, yeah, years, unfortunately.

> Your GPU handles a lot of desktop heavy workloads such as gaming.

Depends on the game, though. While true in most cases, one of my favourites - Kerbal Space Program - is heavily CPU-bound, since it is mostly a real-time physics simulation.

There are certainly exceptions. But most games aren't like Dwarf Fortress.

Even some games you might not expect can be in that category. Guild Wars 2, for example, consistently uses the vast majority of my CPU.

Also, Valve's Source engine is known for being relatively CPU-bound (compared to other engines).

Looks like Intel is already out with a patch for Meltdown: https://www.theverge.com/2018/1/4/16850776/intel-meltdown-sp...

A microcode update, which simply enables some previously disabled features that allow you to work around the problem (e.g. by clearing the branch prediction buffer). The hardware is still most definitely vulnerable to both Meltdown and Spectre, and will require software workarounds to hide the problems, which in turn will lower performance.
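On Linux (4.15 and later) you can see what the kernel thinks about your CPU via sysfs. A small sketch - the directory path is the real kernel interface, but the classifier below is a simplification, since kernels emit a few more string variants:

```python
import os

# Linux 4.15+ exposes per-vulnerability status files here.
VULN_DIR = "/sys/devices/system/cpu/vulnerabilities"

def parse_status(text):
    """Classify a kernel vulnerability string (simplified)."""
    text = text.strip()
    if text.startswith("Not affected"):
        return "not affected"
    if "Mitigation:" in text:
        return "mitigated"
    return "vulnerable"

def report():
    """Map each vulnerability name (e.g. 'meltdown') to its status."""
    if not os.path.isdir(VULN_DIR):  # older kernel or non-Linux system
        return {}
    return {name: parse_status(open(os.path.join(VULN_DIR, name)).read())
            for name in os.listdir(VULN_DIR)}

# Deterministic examples of strings the kernel actually emits:
assert parse_status("Mitigation: PTI") == "mitigated"
assert parse_status("Vulnerable") == "vulnerable"
assert parse_status("Not affected") == "not affected"
print(report())
```

A machine with the microcode update but no kernel patches would still report "Vulnerable" here, which is the point of the comment above: the microcode alone fixes nothing.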

There's no point in waiting. Neither Intel nor AMD will significantly adjust their roadmaps due to these issues simply because they don't realistically have the flexibility to do so.

Evaluate your threat model in the context of Meltdown/Spectre and opt in or out of the mitigations accordingly. There are relatively few cases where the workload is both significantly affected by the mitigations and vulnerable to these attacks (Xen paravirtualization would be the prototypical example). Personally, I opt out of page table isolation, KASLR and any retpoline-style mitigations on my desktop systems and compute servers for performance reasons. Make sure you understand the implications of these choices if you go that way, though.
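For reference, opting out on Linux is done with kernel boot parameters. A sketch of the relevant GRUB configuration - flag names vary somewhat by kernel version, so check your kernel's admin-guide documentation before relying on these:

```shell
# /etc/default/grub - opt out of the Meltdown/Spectre kernel mitigations.
# pti=off disables kernel page-table isolation (the Meltdown mitigation);
# spectre_v2=off disables the Spectre variant 2 mitigations.
GRUB_CMDLINE_LINUX_DEFAULT="quiet pti=off spectre_v2=off"

# Then regenerate the GRUB config and reboot:
#   sudo update-grub
```

Only do this on machines that never run untrusted code, as the comment above says.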

Do you have actual facts to back up your first paragraph? If it's just an informed opinion, mine is: if you can, you should definitely wait before buying new server parts.

You underestimate how long it takes to "patch" hardware. Having an opinion does not make it an informed one.

Read the following thread: https://twitter.com/securelyfitz/status/949370010652196864

"Come 2019 and 2020, other products in the pipeline will have more involved fixes that again improve performance over the software and quick fixes. The solution everyone wants is a full fix with no performance impact. I can't imagine that coming any sooner than 2021"

I don't have any facts that aren't already public - the official as well as the leaked roadmaps of the CPU manufacturers, and general knowledge of how the industry works. What I'm wondering is: what, specifically, do you think is so worth waiting for at the moment?

The Meltdown and Spectre attacks require code execution on your local machine. You can avoid both the Meltdown and Spectre attacks by not downloading and running untrusted software.

The JavaScript attack vector for Spectre will be patched by browser vendors.

If you operate safe computing practices it is unlikely you will be hit by either the Meltdown or Spectre attacks.

> The JavaScript attack vector for Spectre will be patched by browser vendors

It's not just limited to browsers, though. Content may be injected via other sources. It's difficult to get an exhaustive list of such applications, but here is an example with iMessage: https://threatpost.com/inside-the-latest-apple-imessage-bug/...

iMessage uses the WebKit rendering engine to run JavaScript. The WebKit engine will be patched by its vendor, Apple.

Ugh.. when Apple introduced apps and stickers into iMessage, I knew it could be a vector for attacks.

Too bad Apple doesn't give users the option to have only "text-only" messages.

It does; it’s called using SMS.

Unless I'm missing something, Spectre isn't really a JS-based vulnerability. As far as I know, any code that runs directly on a computer with one of the affected CPUs could be used to perform the Spectre attack. Feel free to correct me if I'm wrong.

There are two components: the branch prediction needs to be trained (to get it to speculatively execute the right instructions on context switch to the victim process) and precise timing needs to be available (to exfiltrate data using differences in what got cached). Without executing code, it's difficult to see how the two components could be triggered.
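The exfiltration half can be illustrated with a toy model of the cache-timing covert channel (a simulation, not an exploit: real attacks use `clflush`/`rdtsc` on actual hardware, and the branch-predictor training half cannot be modeled in high-level code at all):

```python
HIT_NS, MISS_NS = 40, 300  # illustrative latencies, not measured

class ToyCache:
    """A cache as a set of resident lines, with simulated access latency."""
    def __init__(self):
        self.lines = set()
    def flush(self):
        self.lines.clear()
    def access(self, line):
        """Return the simulated latency; loads the line as a side effect."""
        latency = HIT_NS if line in self.lines else MISS_NS
        self.lines.add(line)
        return latency

def transmit(cache, secret_byte, probe):
    """The speculatively executed victim code touches probe[secret],
    leaving a footprint in the cache even though its results are discarded."""
    cache.access(probe[secret_byte])

def receive(cache, probe):
    """The attacker times every probe line; the one fast (cached) access
    reveals the secret byte."""
    times = [cache.access(line) for line in probe]
    return min(range(256), key=lambda i: times[i])

cache = ToyCache()
probe = [f"probe_line_{i}" for i in range(256)]  # one cache line per value
cache.flush()                 # FLUSH: evict all probe lines
transmit(cache, 0x42, probe)  # speculation leaves its footprint
assert receive(cache, probe) == 0x42  # RELOAD: timing recovers the byte
```

This also shows why coarse timers and cache-flushing mitigations each attack a different link in the chain.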

Loading complex file formats is structurally similar to interpreting code; the output of the interpretation is some data structure, rather than some side effect, but otherwise it's very similar. There may be some loaders that are controllable and configurable enough to be programmed by data - perhaps something in the video decoding space that isn't offloaded to dedicated hardware or GPU - but I think it would be tough to find. JS or a downloadable game (which already lets people in the front door) of some kind are the best vectors.

Sure, JS was the most obvious attack vector, but my point was that it's not limited to JS. Any code that runs natively on an affected computer can exploit it. The only advantage JS has is that it's easier to get targets to run malicious code through a web browser. You seem to be in agreement with me on this.

Maybe an OS-wide blocker for malicious JS?

No thanks, just thinking about running 'Norton Anti-JS' (et al) on a server is enough to induce nausea.

In other words, blocking 'malicious' code is reactive, not proactive. Reactive solutions by definition end up solving yesterday's problems while being oblivious to today's - viz. the liquids restriction in hand luggage.

Isn’t the performance loss due to KPTI still a big factor for consideration? The security risk might not be huge but you’re still losing at least some performance due to added overhead on nearly every OS.

I suppose you could disable it on Linux, but at that point it seems like you’re going against the tide.
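Since KPTI's cost is concentrated at kernel entry and exit, a quick way to gauge your exposure is a syscall microbenchmark. A sketch - note that the timer overhead is included in the figure, and on some libc versions `getpid()` may be cached rather than a real syscall:

```python
import os
import time

def bench_syscalls(n=200_000):
    """Time n cheap syscalls; os.getpid() is about the cheapest available."""
    t0 = time.perf_counter()
    for _ in range(n):
        os.getpid()
    elapsed = time.perf_counter() - t0
    return elapsed / n * 1e9  # nanoseconds per call

ns_per_call = bench_syscalls()
print(f"~{ns_per_call:.0f} ns per getpid() call")
```

Running this with and without `pti=off` shows the per-syscall tax directly; workloads dominated by user-space computation barely cross this boundary and therefore see little of the slowdown.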

Yep, until your "trusted code" (e.g. Apache, PHP,...) has a remote code execution vulnerability.

This is such dangerous advice... you should really only disable these things if:

a) your computer is not connected to the internet in any way

b) it is in a trusted environment

I wonder how long it will take to get the fixes to all electron/cef based applications.

I still bought. Even if future systems address these issues at a lower level, that is at least one generation away, probably two. Designing a new CPU is a multi-year beast because of the complexity, and with the next generation happening "soon", I doubt a redesign is feasible for them. So you're looking at, I'd guess, a year or so before any movement.

Delay, or upgrade by purchasing used hardware. I often buy my gear 3-4 years old on eBay; it lets me upgrade without the sticker shock. This assumes you don't need the latest and greatest. I do this with laptops and server gear.

This is similar to purchasing used cars: you can steadily upgrade with much less capital, so each purchase carries much less risk. It's also much better for the environment.

Buying old or used hardware from 3-4 years ago does not protect you from Spectre/Meltdown.

I never said it does. Buying new hardware today does not protect you from Spectre/Meltdown either; the new hardware shipping for at least the next year (and the stuff currently on the shelf) does not protect you.

I suggested delaying a new purchase until the CPUs are fixed (who knows how long that will take) or upgrade by purchasing used to get a bump in performance regardless of the hardware flaw (which will still require software mitigation and thus have performance degradation).

Most hardware worth using right now has the hardware flaw so you might as well upgrade (for performance) by purchasing used.

At scale or for personal use?

If at scale, you probably don’t have an option because you need it now.

If for personal use, we don’t know what the fix would look like, or whether there would be a perf hit. If you can hold off for a few years sure do it, but it’s not a huge part of the equation now.

How long will this device last you? If less than 2 years buy now.

Honestly, there are very few reasons to wait. Everyone has this problem. The only thing I might wait for is clarification of the attack surface of Spectre, since that can't be patched, but it seems difficult to pull off in most cases.

I'll refrain from buying an iMac Pro (which was on my roadmap) until the situation is fully clarified.

I mean, how frustrated would one feel if it turned out that one spent that amount of money on a computer that should be replaced ASAP?

I would wait a month or so to get all information about this topic and evaluate it. As of now, it seems that AMD has an immediate fix that does not impact performance while Intel's immediate fix has a large performance impact. I do not understand why most people want to stick with Intel...

I just got an 8700K last week. I was thinking about returning it and going with AMD instead. The truth is, I do not do many tasks on my home computer where the CPU is going to be at >50% at all times.

The other truth is that Intel CPUs are just faster than AMD CPUs (with a few exceptions now, but for general use that still applies).

I'd say if you are talking about a large quantity purchase, you should maybe wait or look for alternatives. Other than that it should not make much difference for us "normal folks."

Another truth is that there is / was so much misinformation out there.

Finally, I wonder if there are going to be any optimizations to these fixes in the future?

The cynic in me is amazed how people complain about a loss of 0-30% in performance on the desktop (depending on the workload), but are completely happy to replace native applications by Electron applications and/or web applications. Of course, these may be disjoint sets of people.

I think when it comes to most desktop workloads, the security issues are far more interesting and serious. And while Meltdown only affects Intel CPUs, Spectre affects all (?) out-of-order CPUs, and there may be more, and more effective, attacks in the future. So every fast, modern CPU is pretty much in the same boat, and it will take some time for the dust to settle.

So, I would just stick with the Intel CPU. You can vote with your wallet once one vendor decides to axe ME/PSP and/or properly mitigates Spectre-type attacks ;).

The problem, I think, is a psychological one.

No one likes to have stuff taken away. People were also annoyed at how "badly" their VW drove after the diesel fixes. Even if you will probably never hit the limits where you'd feel the difference, just the thought that from now on your CPU could be slower than it was yesterday is annoying enough.

But I agree with you. It is the wrong crowd that complains the loudest!

> The cynic in me is amazed how people complain about a loss of 0-30% in performance on the desktop (depending on the workload), but are completely happy to replace native applications by Electron applications and/or web applications. Of course, these may be disjoint sets of people

I think this is largely due to faulty bottleneck analysis. I'm not hugely slowed down overall if switching to a chat application is 50ms slower, even if I notice it. If the tests I need to run at various stages of the development process are slowed down by 30%, however, that might actually be noticeable in my development pace: I'm more likely to context switch to something else and get out of flow.

Thank you for this comment, it is very insightful.

It's also interesting how perception of these bottlenecks differs. I wouldn't care much if tests are running 30% slower, I run them in the background from my editor anyway. I usually tidy up stuff in the meanwhile or git stage some lines (which needs to be done anyway). However, if an editor has a higher latency than I am used to, it drives me crazy.

> It's also interesting how perception of these bottlenecks differs. I wouldn't care much if tests are running 30% slower, I run them in the background from my editor anyway. I usually tidy up stuff in the meanwhile or git stage some lines (which needs to be done anyway).

Heh. I do that too, but I'm anxious enough that I rerun the tests after finalizing the commit. I'll still use the time the tests take to do another read through the commit, but especially when pushing to many supported branches at the same time, the tests still take longer (I work on pg, which has 5 years of supported back branches...).

> However, if an editor has a higher latency than I am used to, it drives me crazy.

Oh yeah, but I wouldn't use a web-based editor, ever ;). The point I was trying to make was less about something as central as an editor - where I spend a lot of my time, fighting for first place with terminals - and more about secondary, non-critical-path stuff like IM.

If your CPU takes a performance hit, it will affect everything, including the non-native applications.

As a developer, I'm not so sure about that: I don't use any web or non-native program by choice. In fact, I'm even stricter than that: I can tell at a distance whether a program was written using QML instead of QtWidgets (due to lag and poor behavior), and I avoid the former just as I avoid non-native apps.

In fact, unless I'm restricted to a single program I don't have any alternatives to, I'll generally even select programs written in a language I understand and/or prefer.

I have several colleagues that use slightly different metrics, but still pay very close attention to performance and details of the tools they use. Especially if that's something they use for hours over the course of a day.

But why did you get a new CPU if you don't need the horsepower? Not saying this is you, but I don't understand people who have to have the fastest processor and then rationalize away these losses in power with "I don't use my CPU much anyway."

That is not what I meant. My new CPU is much faster than the old one. I was hitting the limits at certain times. And now that I finally can afford upgrading, I did. But this was all before we knew about Meltdown / Spectre...

Of course I am annoyed by the fact that I will lose some power. But I do not have much of a choice now, do I?

Either way, I am not utilizing my CPU the way servers with many users do. These servers will feel this impact the most.

I personally don't care if I lose 2 fps or if my compilation takes 20 seconds longer after all these fixes. But if you budget, say, 5,000 USD per month for servers and everything was running fine, and now it turns out you need to add another 3,000 USD a month, that's where the impact is felt the most - not with normal users like us, whose compilations might just take a bit longer.

Also, for some people it is just a hobby to always have the latest and best :)

I was putting serious consideration into an iMac Pro but this has at the least made me stop to consider the options.

I write and run a ton of multithreaded data processing scripts and really feel this is going to be a world of hurt for me.

My last fully specced iMac has lasted me eight years, so I might just wait it out and perhaps pick up something far cheaper in the meantime.

I mean, if you’re building a PC from scratch, you could just factor in some overclocking to make up for some of the performance loss, I’d imagine. In the future we probably just won’t see the advertised speeds of chips increase much IF the performance impact is as big as people are saying.

I would put off hardware purchases, but for a different reason - crypto miners. As I was researching parts to buy for a new rig, I realized that just a few weeks ago, some GPU models were about half the price they currently are. Apparently, something recently happened with Bitcoin and/or other cryptocurrencies that caused a spike in mining interest.

Buy the machine you need when you need it. If you don't need a new machine right now then yes delay that purchase.

If you want Intel and can wait, I would wait. They should fix the Meltdown bug, and that will improve performance significantly in some edge cases. I think they'll release such a processor in 2018. Spectre won't be fixed for a long time, so it should not affect purchase decisions.

2018 sounds a bit early. I'd say we'll be lucky to see engineering samples in 2019, generic availability in 2020.

If you upgrade your web browser, firmware (if not an Apple or Microsoft device), and OS, you should be more than fine. There's not much else you can do now anyway, so just upgrade devices based on what features you want in a new device.
