
Best CPUs for Workstations: 2017 - bem94
https://www.anandtech.com/show/11891/best-cpus-for-workstations-2017
======
jonbaer
"Some workstation users will need ECC memory, and up to 512GB of it. When
memory has an error rate of 1 error per GB per year, using 512GB ensures
almost two bit errors per day: something that a 60-day simulation would find
catastrophic."
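The quoted figures can be sanity-checked with quick arithmetic (a sketch using only the numbers in the quote; the 1-error-per-GB-per-year rate is the article's assumption, not a measurement):

```python
# Expected bit errors, per the article's assumed rate of
# 1 error per GB of RAM per year.
errors_per_gb_per_year = 1
ram_gb = 512

errors_per_year = errors_per_gb_per_year * ram_gb  # 512 expected errors/year
errors_per_day = errors_per_year / 365             # ~1.4 expected errors/day

# Over a 60-day simulation run:
errors_per_run = errors_per_day * 60               # ~84 expected errors

print(f"{errors_per_day:.2f}/day, ~{errors_per_run:.0f} per 60-day run")
```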

~~~
keldaris
Just to offer a partial counterpoint to the popular "ECC is a must for real
work" sentiment, I've run many 60-day simulations where a few bit errors would
be completely irrelevant. Many physics simulations include internal checks
(that must be there regardless of ECC, for algorithmic reasons) that would
catch virtually all such issues. While a bit error could certainly crash the
simulation, long running simulations can be resumed from checkpoints with
minimal loss of time and the probability of a crash is still very low for most
real workloads.
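The checkpoint-and-resume pattern described above can be sketched roughly as follows (the filenames, step counts, and toy "physics" are all illustrative, not from any real simulation code):

```python
import os
import pickle

CHECKPOINT = "sim_state.pkl"  # hypothetical checkpoint file

def load_or_init():
    """Resume from the last checkpoint if one exists, else start fresh."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            return pickle.load(f)
    return {"step": 0, "state": 0.0}

def save(sim):
    # Write to a temp file and rename, so a crash mid-write
    # can't corrupt the previous checkpoint.
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(sim, f)
    os.replace(tmp, CHECKPOINT)

sim = load_or_init()
while sim["step"] < 1000:       # stand-in for the real simulation loop
    sim["state"] += 0.001       # one step of "physics"
    sim["step"] += 1
    if sim["step"] % 100 == 0:  # checkpoint every 100 steps
        save(sim)
```

If a bit flip crashes the process mid-run, restarting it loses at most the work since the last checkpoint.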

In truth, there are plenty of good reasons, both for data integrity and
security, to have ECC. However, there are real costs to ECC too - 10-30%
immediate performance overhead (depending on your application) and much more
than that for memory-bound workloads (due to lower frequencies and subtiming
issues), not to mention hardware cost efficiency. Instead of oscillating
between "ECC for everything" and "ECC is useless" people should rationally
compare the costs and benefits for their specific workload and choose
accordingly.

~~~
dna_polymerase
Thank you! That sentiment on HN about how "Everything should have ECC" really
grinds my gears. There is no need for ECC in your iPhone or on your MacBook.
Your average emoji-laden WhatsApp conversation with your friends does not
require bit error correction. People who really need ECC do exactly what you
just mentioned: Evaluate, and often enough they will find little to gain from
ECC.

~~~
jmh530
But when you need ECC, you really need it. My FreeNAS server has like 32GB of
ECC memory because it runs ZFS, which basically doesn't work unless you have
ECC memory. However, neither my desktop computer nor my laptop needs it. When
you need ECC, use it, but if you don't need it, then no biggie.

~~~
rightos
Why does ZFS "really need" ECC? I have a few friends running freenas on just
old machines without ECC, no issues so far a few years in.

~~~
milcron
Traditional filesystems will let files just sit on the hard drive, untouched.

ZFS is a lot more active... checksumming and caching mean that important
information is spending time in RAM. It's not good if that information gets
corrupted.

~~~
rightos
Doesn't ZFS perform checksumming and error correction on that data to make
sure it _doesn't_ get corrupted by memory or disk issues? I thought that was a
main reason to use it.

~~~
milcron
Well right. It protects against silent bitrot on the hard drives. But it does
this by caching and checksumming _in RAM_, so now you need to defend against
bitflips in RAM.

It's not that ZFS without ECC is completely unsafe... it's just not as safe as
it could be. ECC RAM becomes the next biggest concern once you've addressed
on-disk bitrot.
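One way to see why this matters: in a simplified ZFS-style write path, the checksum is computed over whatever is sitting in the RAM buffer, so a bit flipped before checksumming gets stored as "valid" data. A toy illustration (hashlib stands in for ZFS's actual checksum machinery):

```python
import hashlib

data = bytearray(b"important file contents")

# Simplified write path: checksum the in-RAM buffer, then store both.
good_sum = hashlib.sha256(data).hexdigest()

# Simulate a bit flip in RAM *before* the checksum is taken.
corrupted = bytearray(data)
corrupted[0] ^= 0x01  # flip the lowest bit of the first byte
bad_sum = hashlib.sha256(corrupted).hexdigest()

# The checksum of the corrupted buffer is perfectly self-consistent:
# a later scrub would verify the bad data as "correct".
print(good_sum != bad_sum)  # True: the buffers no longer match
```

The filesystem can only detect corruption that happens *after* the checksum was computed; ECC covers the window before it.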

------
SeanDav
OT:

I am in the market for an upgrade to my home PC. I use it for Compiling and
Gaming.

I am finding the sheer variety of CPUs and their weird naming conventions
utterly confusing. Then combine that with the almost-as-confusing choice of
motherboards and the bit-better-but-not-much choice of RAM and I am completely
lost.

Any tips out there for finding a path through this maze, how do I upgrade my
PC without requiring an advanced degree in Intel/AMD marketing speak?

 _EDIT: Added use of PC._

~~~
nfriedly
[http://www.logicalincrements.com/](http://www.logicalincrements.com/) is a
handy site that recommends PC builds that give you a good bang-for-buck at
various price points. On a given horizontal line, the CPU, motherboard, and
RAM will all be compatible.

The site is tilted for gaming, but GPUs are mostly interchangeable - so if you
want more compute power, you can drop to a lower GPU and spend the money on
the CPU instead.

~~~
nfriedly
Oh, and the #1 upgrade you can make for everyday use is an SSD. I was recently
impressed at just how fast an old Core 2 Duo system was once I installed an
SSD. It won't do much for gaming (aside from improving load times), but it
will make the rest of the things you do on the computer more pleasant.

------
todd8
Remember that one-bit errors aren't necessarily small errors. A single one bit
change when working with 64 bit floating point can result in a very large
difference in value.

0b0111111111101111111111111111111111111111111111111111111111111111 ==
1.7976931348623157e+308

0b0011111111101111111111111111111111111111111111111111111111111111 ==
0.9999999999999999
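The two bit patterns above differ only in bit 62 (the most significant exponent bit), which can be checked with Python's struct module:

```python
import struct

def bits_to_double(b):
    """Reinterpret a 64-bit integer as an IEEE-754 double."""
    return struct.unpack("<d", struct.pack("<Q", b))[0]

x = 0b0111111111101111111111111111111111111111111111111111111111111111
y = 0b0011111111101111111111111111111111111111111111111111111111111111

print(bits_to_double(x))  # 1.7976931348623157e+308
print(bits_to_double(y))  # 0.9999999999999999
print(x ^ y == 1 << 62)   # True: the values differ in exactly one bit
```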

~~~
euroclydon
How often does a program load an FP with a value at the absolute extreme of
the range, though? I mean, if my value is there, I probably already have a
problem.

~~~
todd8
Yes, but the point is that flipping a single bit can change a value
dramatically. Take 64-bit ints, for example: programs are likely using a
limited range of values, and flipping a bit in, say, the middle of the int
will change the value by plus or minus 4294967296. Not many programs will
handle that gracefully.
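The plus-or-minus 4294967296 figure is just 2^32; flipping bit 32 of a 64-bit integer adds or removes exactly that amount:

```python
value = 1000                           # a typical "small" 64-bit int value
flipped_up = value ^ (1 << 32)         # bit 32 was clear: flipping adds 2**32
flipped_down = flipped_up ^ (1 << 32)  # bit 32 was set: flipping subtracts it

print(flipped_up - value)  # 4294967296
print(flipped_down)        # 1000: a second flip restores the original
```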

------
linsomniac
Why does the image on this article show rackmount equipment while talking
about workstations? I find their selections to be, at least for my work, much
more server-oriented than workstation.

Do a lot of people really run >512GB RAM in their workstations rather than
running a "thin" workstation and running simulations on servers or EC2?

I just built a new workstation a couple months ago: i7 7700 (not the K), 64GB
RAM, one of those closed loop CPU water coolers... As a workstation, I wanted
it quiet, and performance has been great. Long running jobs I run on our
dev/stg cluster (4 machines, 512GB total RAM, 48 total cores).

~~~
microcolonel
I second that, in most cases, 64GiB is enough for workstation clients.

I can think of some reasons to have twice, four times, or eight times as much
though. You can almost never have enough block cache, after all. Local testing
databases come to mind. Also, tracing/profiling Elixir programs takes up
enormous amounts of memory. My colleagues at my last company did not have
enough RAM to generate meaningful profiles of our system, 64GiB was just
barely enough to get a signal out of it. Rather than spending a month working
on the profiler, it was nice to be able to get some data out of it as-is.

~~~
glenneroo
Depends on what you're doing. With video or 3D work, 64GB can often be a
bottleneck. With After Effects and Maya rendering we've gone over our 96GB RAM
limit using dual Xeon 6-core (i.e. 24 logical) on multiple occasions,
especially rendering 4K+ scenes, forcing us to set limits on RAM/thread usage.
I want to upgrade them to 128GB but getting bigger ECC RAM sticks in Austria
(and EU in general) was (and still is) difficult and stupidly expensive.

------
j_s
A discussion when Ryzen was released had quite a bit of info on PC builds:

Ryzen is for Programmers |
[https://news.ycombinator.com/item?id=14243350](https://news.ycombinator.com/item?id=14243350)
(May 2017, 272 comments)

~~~
dabockster
Why did I have to scroll down so far for a Ryzen shoutout? Building with Ryzen
has been one of my best investments in years.

------
qaq
Well it's nice to finally have serious competition in x86 CPU market.

~~~
rrdharan
... again.

[https://en.m.wikipedia.org/wiki/Opteron](https://en.m.wikipedia.org/wiki/Opteron)

[https://en.m.wikipedia.org/wiki/Athlon](https://en.m.wikipedia.org/wiki/Athlon)

[https://en.m.wikipedia.org/wiki/AMD_K6](https://en.m.wikipedia.org/wiki/AMD_K6)

~~~
qaq
Good point, but it has been a while.

------
gsnedders
I wonder how the EPYC 7401P would compare with the 1950X in terms of
performance/price? It has 24 cores v. 16, but with a lower clock speed, and is
$51 more. If you can actually use all the cores, I suspect better?
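A rough cores-times-clock comparison (the clock speeds are assumptions pulled from public spec sheets, and this crude metric ignores IPC, boost behavior, and memory bandwidth):

```python
# Naive throughput proxy: cores × base clock (core-GHz).
# Assumed specs: EPYC 7401P at 2.0 GHz base, 1950X at 3.4 GHz base,
# priced per the parent comment ($51 over the 1950X's ~$999).
cores_7401p, ghz_7401p, price_7401p = 24, 2.0, 999 + 51
cores_1950x, ghz_1950x, price_1950x = 16, 3.4, 999

perf_7401p = cores_7401p * ghz_7401p  # 48.0 core-GHz
perf_1950x = cores_1950x * ghz_1950x  # 54.4 core-GHz

print(f"7401P: {perf_7401p / price_7401p:.4f} core-GHz/$")  # ~0.0457
print(f"1950X: {perf_1950x / price_1950x:.4f} core-GHz/$")  # ~0.0545
```

By base clock alone the 1950X actually comes out ahead per dollar; the 7401P only wins if it can sustain a higher all-core boost (reportedly around 2.8 GHz, which would put it at 67.2 core-GHz) and the workload really scales across all 24 cores.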

~~~
philjohn
How much more are the mobos, though?

~~~
infogulch
And do the motherboards support the features you would expect from a consumer
board? They often have only a couple of USB ports and video outputs, with very
few extras.

------
hsivonen
The new AMD CPUs look otherwise good, but it seems they still don't run rr.
Giving up rr is not a reasonable option if the purpose of the workstation is
developing software in the kind of language that gdb can make sense of (C,
C++, Rust).

------
sandworm101
A workstation will be used or at least turned on 24/7. The one I just sat
down at sure is. So when calculating costs one should include daily power
consumption (power used + cooling needs). Some of these new intel chips run
very hot.

------
parski
Doesn't Coffee Lake release today? I'm upgrading for sure.

~~~
ghostbrainalpha
It did.

Discussion here:
[https://news.ycombinator.com/item?id=15408850](https://news.ycombinator.com/item?id=15408850)

------
theyregreat
If money were no object, Xeon Platinum 8180m ($13k retail).

~~~
KekDemaga
28 cores, sure, but they run at a base frequency of 2.50GHz. For $1469 you can
have the i9-7940X: 14 cores at 3.1GHz, probably more suitable for a
workstation.

~~~
SSLy
No ECC support

~~~
KekDemaga
I'd go for the Threadripper 1950X if that is a requirement: 16 cores, 3.4
GHz, ECC support, all for under $1000. Intel doesn't really have anything that
meets those requirements at a reasonable price.

~~~
really_operator
I thought he said if money was no object...

~~~
mtgx
Then his choice is $13,000 vs $1,000.

Money may not be an object, but choosing the first option seems a little
stupid. Even rich people don't throw away money like that, because if they
did, they probably wouldn't have gotten rich in the first place.

------
xelxebar
Somewhat off topic, but what about open hardware platforms that are really
friendly to hacking around on the hardware?

Is this just a thing of dreams?

~~~
nbathum
Can you explain more what you mean by open hardware, or expand on what kind of
hacking around are you looking to do?

~~~
xelxebar
Thanks for asking!

I don't have any precise requests, but I would like to be able to pretty much
hook up an oscilloscope or logic analyzer anywhere and in principle be able to
understand what's going on.

I'm looking to get into hardware by way of tinkering and my mental model is of
open software. Currently, any time I have a burning question about one of my
tools, I can just open a man page as well as start digging into the source. I
think it would be cool to do the spiritual equivalent with hardware.

For example, I am currently digging into the bootup process--everything after
CPU POST until a fully ready userspace. However, the details between pressing
the power button on the power supply and CPU POST are completely
opaque to me, and some are completely opaque even in principle on my current
system. I'd like to have the ability to fully grok the electrical
underpinnings of what's going on or at least as close as possible.

------
ptero
Somewhat off-topic, but I need to replace my ancient workstation and would
love a recommendation. I prefer to max cores and do not care much for clock
speeds (I work mostly on algorithm development and can parallelize experiments
relatively easily but often run multiple unrelated tests). 10-15k budget.

I was thinking of dual-CPU Z840; are there better options?

~~~
dsr_
You can get 64 cores, 128 threads on a 2P EPYC7551 with 256GB of ECC RAM in a
rackmount server with redundant power supplies for that much money. e.g.
SiliconMechanics.com

~~~
ptero
Thank you for the suggestion, I will look at SiliconMechanics. I need a tower,
not rack-mounted, but I see capable systems with 24-28 cores there. The site
seems to be in disarray, at least for high-performance workstations (the two
rightmost options of the four, which I would assume to be the most capable,
are showing up as discontinued when I click on them, etc.).

------
dragontamer
Just in time for Intel to release the 6c/12t Coffee Lake i7, and make all of
these comparisons (slightly) out of date.

------
siliconunit
I was one step from getting a Threadripper, but then I opted for a 1800X. I
will invest more on the GPU side once there is hardware that can properly
drive a 4K/8K VR headset eh :) I wonder, though, if there's a way to restore
disabled cores on an Epyc chip, as according to AMD they are all born with 32
cores...

------
Dowwie
Hasn't cloud computing reached the point where local workstations like this
aren't required?

~~~
dagw
First of all, no one has really solved the problem of running AutoCAD etc. in
the cloud. Secondly, per-core performance is generally pretty poor on cloud
machines compared to high-end workstations, and for many workloads 16 really
fast cores beat 64 slower cores.

Finally, cloud computing is still really expensive compared to dedicated
hardware. A halfway decent EC2 'workstation' like the p2.xlarge is over $5000
a year (plus storage) if you pay for the whole year up front and that 'only'
gets you 4 cores and 60 GB of RAM (and a pretty good GPU). $5000 will buy you
a really nice workstation with much better specs, and you get to keep it at
the end of the year.

~~~
rjsw
Onshape provides CAD in the cloud, does AutoCAD need a lot of local CPU power
?

~~~
xemdetia
Yes, most engineering applications end up doing all sorts of simulation that
can be summed up as 'run as much math as fast as possible.' It's not just
routing traces on a PCB; there can be heat or stress simulations on a part
built with constructive geometry, and a lot of other things that let an
engineer simulate/try something interactively and not simply 'draw a part.'
This is where good workstations can definitely make a difference. An example
product that is definitely going to need the local beef:
[https://www.autodesk.com/products/cfd/overview](https://www.autodesk.com/products/cfd/overview)

------
latch
$999 for the best choice seems really high to me. I understand that they're
trying to generalize a broad range of applications, but still.

They mention the Ryzen 5 is better bang for the buck, but dismiss it for
having "low overall performance." I guess I'm wrong in thinking that, for 98%
of people, a $250 CPU would still be wasteful.

~~~
mrich
But this is about workstations where CPU bound workloads are running, like
rendering or compilation.

Most people will either buy a laptop or desktop (if their smartphone is not
sufficient) and not spend nearly as much money, they only need to run a
browser and a couple of other small programs.

~~~
dredmorbius
A browser is not a small program. Video can still be demanding (watching, or
chat).

The workloads are fairly _standardised_ and non-diverse. Not necessarily
_small_.

------
intrasight
Since this article is about "workstations", it should have only considered
ECC-enabled systems. Since it didn't (it hardly mentioned ECC, in fact), it's
a useless article IMHO.

------
dis-sys
anandtech.com used to be an okay source, but it is getting worse pretty
quickly, and nonsense articles like this are helping Anandtech get there
fast. The Threadripper 1950X is not the overall best workstation processor;
it is more like a toy for kids into gaming/overclocking.

The 1950x is slow for multithreaded applications; it has a low Cinebench
score of ~3,000, you pay some serious $ for a fancy motherboard full of LED
lights, and then you don't have officially verified ECC support.

As a comparison, you get the same multithreaded speed at a much lower system
cost if you just buy second-hand dual 2696v2 processors with real ECC
support. If you need real processing power in a single box and have a tight
budget, there is a flood of E5-2696v4 processors on the market at $950-1,100
each. For a much nicer budget, you can always go dual/quad Intel 8180.

~~~
Kayou
Nice one, you compare second-hand prices to brand-new prices. Anandtech's
goal is to guide you in buying chips brand new, not hunting for them on eBay.
Also, what they say is that the 1950x is the best compromise between number
of PCIe lanes, processing power, maximum memory, etc. If you didn't notice,
performance is not their only criterion.

~~~
dis-sys
I listed dual/quad 8180 as a good option. I don't think you can get second
hand 8180 on ebay.

For the 1950x, detailed explanations were provided in my post:

1. Too slow: the Cinebench score is ~3k, which you can get from 4-5 year old
processors already declared EOL.

2. No official ECC support.

I can list more, actually: it is a single-socket system with 8 DIMMs only. As
explained, it is a good platform for kids to game/overclock.

~~~
Kayou
Sure, but as you say yourself, the 8180 isn't the same budget at all, so I
excluded it from your price comparison. Also, it's not a competitor to the
1950x. I guess in the end it all depends on what you need to do. Your need
seems to be maximum performance, why go for the best "compromise" CPU and then
criticize their choice? Clearly you should go for either of the 3 recommended
CPU for performance. (which are also better for PCIe lanes and memory, but
have a poor performance price ratio).

~~~
dis-sys
if it is about budget, or performance per $, surely you'd be buying Xeon from
ebay. you don't lose anything as the likelihood of having a dead processor 18
months from now is pretty much 0, no warranty required.

if it is about PCIe lanes, you should at least be buying dual-socket systems,
which give you more PCIe lanes than the 1950x.

if it is about memory, well, 1950x is limited to 8 DIMMs.

