
Ryzen Threadripper 1950X vs Core i9-7960X - rbanffy
https://www.pcworld.com/article/3228244/components-processors/amd-vs-intel-we-test-ryzen-threadripper-1950x-against-core-i9-7960x.html
======
mizzack
That pricing comment is almost laughable and makes me wonder how much Intel
paid for this review.

Framing the pricing in the context of a $9,000 PC is a blatant attempt to
dilute the price discrepancy. In the context of a common ~$2,500-$3,000 rig,
$700 matters a lot. I'd be curious to see how many people buy these "dream"
$9,000 PCs. I'd wager it's not many.

~~~
rovek
I don't see this as Intel-biased; in the "The bang for buck stops here"
section the author very clearly points out that the AMD CPU is vastly more
cost-effective for likely scenarios.

Though saying it's a mobo and a decent SSD somewhat understates what those
builders would likely spend the $700 on.

~~~
geezerjay
> I don't see this as Intel biased,

The article was written to sell the idea that Intel's offering beats AMD's in
every aspect the writer deems relevant, and then he bends over backwards to
come up with any excuse to ignore the $700 price difference between Intel's
i9 rig and AMD's Ryzen Threadripper rig, and the fact that that price
difference buys only marginal performance gains.

I would also add that it's questionable that the author, for no stated
reason, sets up Intel's rig with two SSDs in RAID 0 while leaving AMD's rig
with two separate drives.

All in all, it looks like a far-fetched and somewhat desperate attempt to
find a particular Intel offering that can show some performance gain over the
AMD Ryzen Threadripper, committing so many methodological errors along the
way to reach those marginal gains that a good portion of the text is
dedicated to excuses for why none of it matters.

------
losvedir
Sort of offtopic, but what is with those charts? Is there some rhyme or reason
to the ordering that I was missing? Sometimes the AMD was above the Intel and
sometimes vice versa. Sometimes the better result was above the worse and
sometimes vice versa. I had to squint at the "longer/shorter bars indicate
higher performance" captions on every chart to make heads or tails of it...

~~~
bluehazed
Understandably, different benchmarks have different measures of success
(scores vs. time in seconds, etc.), but I agree this could be displayed
better.

Perhaps simply outline or mark the better performer on each chart with a
colour or other visual indicator.

~~~
jo909
You can always normalize it to a "the longer bar is better" measurement. For
example instead of "Kernel Compile Time" you graph "Kernel Compiles / hour".
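The normalization jo909 describes is a one-liner; here's a minimal Python sketch (the CPU names are from the thread, but the timings are invented purely for illustration):

```python
# Convert a "lower is better" timing into a "higher is better" rate,
# so every chart can use the same "longer bar wins" convention.
# Timings below are invented for illustration, not from the review.
timings_seconds = {
    "Threadripper 1950X": 1200,  # hypothetical kernel compile time
    "Core i9-7960X": 1100,
}

SECONDS_PER_HOUR = 3600

# Kernel compiles per hour = 3600 / compile time in seconds.
compiles_per_hour = {
    cpu: SECONDS_PER_HOUR / t for cpu, t in timings_seconds.items()
}

for cpu, rate in sorted(compiles_per_hour.items(), key=lambda kv: -kv[1]):
    print(f"{cpu}: {rate:.2f} kernel compiles / hour")
```

With every metric expressed as a rate, the faster chip always gets the longer bar.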

------
TazeTSchnitzel
Something interesting a lot of reviews don't consider is that Threadripper,
since it's a multi-chip module with each chip having its own memory
controller, has a non-unified memory architecture (NUMA), where half the
memory will have higher latency because it has to go to the other chip. Some
software is written with this in mind, some isn't. Consequently, Threadripper
has two modes: NUMA mode (OS sees two NUMA zones and different latencies) and
UMA mode (single NUMA zone, single average latency). It may be worth trying
both when benchmarking to see what difference it makes. I think most reviewers
aside from AnandTech have overlooked this since NUMA hasn't been a concern on
the desktop before.
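To make the trade-off concrete, here's a toy Python model of the two modes; the latency figures are invented placeholders, not measurements of Threadripper:

```python
# Toy model of Threadripper's two memory modes.
# Latencies are invented for illustration, not measured values.
NEAR_NS = 80   # hypothetical latency to the local die's memory
FAR_NS = 140   # hypothetical latency across the die-to-die link

def uma_latency():
    # UMA mode interleaves accesses across both dies, so every
    # access sees roughly the average of near and far latency.
    return (NEAR_NS + FAR_NS) / 2

def numa_latency(fraction_local):
    # NUMA mode exposes both zones; NUMA-aware software keeps most
    # allocations on the local node (fraction_local close to 1.0),
    # while unaware software lands around 50/50.
    return fraction_local * NEAR_NS + (1 - fraction_local) * FAR_NS

print(f"UMA mode:                {uma_latency():.0f} ns average")
print(f"NUMA-aware (90% local):  {numa_latency(0.9):.0f} ns average")
print(f"NUMA-unaware (50% local): {numa_latency(0.5):.0f} ns average")
```

The point for benchmarking: NUMA-aware workloads can beat the UMA average, while unaware ones do no better, which is why testing both modes matters.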

~~~
AlexB138
> has a non-unified memory architecture (NUMA)

Small, pedantic point of fact: NUMA stands for Non-Uniform Memory Access. You
could, of course, say Non-Uniform Memory Architecture in this context as well.

~~~
TazeTSchnitzel
…oh, huh, okay. Point taken.

------
nspattak
I hate to be the one to say it, but this looks very one-sided to me:

Is a 1-20% performance benefit really "It gives you great performance at
light-duty applications and generally can't be touched by the Threadripper
1950X in heavy-duty applications, either"?

------
mtgx
tl;dr As you'd expect, AMD's chip is a no-brainer on bang/buck.

I wish more reviewers would compare chips at various price levels, because I
think that's how most people think about their purchase.

If you want to build a $1500 gaming rig, you're going to look at the whole
system cost, and you'll designate a certain price for the CPU. You'll want to
buy "the best CPU for that money".

Similarly, if a medium-sized development company has $10,000 to upgrade
_some_ of its systems, does it buy 6 Intel CPUs or 10 Ryzen CPUs with that
money?
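That fixed-budget framing reduces to simple arithmetic; a Python sketch, using the chips' approximate launch list prices ($1,699 and $999) but invented per-CPU benchmark scores:

```python
# Toy fleet-upgrade comparison: fixed budget, per-CPU price, and a
# hypothetical per-CPU benchmark score (scores are invented).
BUDGET = 10_000

options = {
    "Core i9-7960X": {"price": 1699, "score": 100},
    "Threadripper 1950X": {"price": 999, "score": 90},
}

aggregate = {}
for name, o in options.items():
    units = BUDGET // o["price"]       # whole CPUs the budget buys
    aggregate[name] = units * o["score"]  # total fleet throughput
    print(f"{name}: {units} CPUs, aggregate score {aggregate[name]}")
```

Even with a per-chip performance deficit, the cheaper part can win on aggregate throughput per budget, which is exactly the comparison the "top of the line vs. top of the line" framing hides.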

Comparing product _brands_ like "Intel's top of the line" vs. "AMD's top of
the line" is not very useful, especially when there's a 2x price difference
between them while the performance delta is nowhere near that big.

This should be even more the case for data center purchases, where total cost
of ownership is king.

~~~
fpoling
For data centers the cost is dominated by electricity, not CPU price, so they
look for performance per Watt. This was not addressed in the article.

~~~
jhasse
> so they look for performance per Watt. This was not addressed in the
> article.

They did test power consumption, though:
[https://images.idgesg.net/images/article/2017/09/fnw_showdow...](https://images.idgesg.net/images/article/2017/09/fnw_showdown_power_scaling-100736973-orig.jpg)

~~~
fpoling
That plot shows the power consumption during one rather synthetic benchmark.
For a meaningful conclusion about performance per watt, you need the power
consumption measured during each particular benchmark, so the results can be
compared. This is not addressed.
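The metric fpoling is asking for is just score divided by power draw during that same run; a Python sketch with invented placeholder numbers (not figures from the review):

```python
# Performance per watt for a single benchmark: you need the score
# and the power draw measured *during that same benchmark*.
# All numbers below are invented for illustration.
results = {
    "Threadripper 1950X": {"score": 3050, "watts": 290},
    "Core i9-7960X": {"score": 3200, "watts": 330},
}

perf_per_watt = {
    cpu: r["score"] / r["watts"] for cpu, r in results.items()
}

for cpu, ppw in perf_per_watt.items():
    print(f"{cpu}: {ppw:.2f} points per watt")
```

A chip can win the raw score and still lose on efficiency, which is why a single synthetic power number can't answer the data-center question.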

