
AMD Ryzen price and release date revealed - kungfudoi
http://www.pcworld.com/article/3171161/components-processors/amds-ryzen-launches-march-2-outperforming-intels-core-i7-at-a-fraction-of-the-price.html
======
mrb
First (amateur) third-party benchmarks confirm a ~$350 Ryzen beats $1000 Intel
processors in CPU Mark, 3DMark Fire Strike Physics, and Cinebench:
[https://www.chiphell.com/thread-1706915-1-1.html](https://www.chiphell.com/thread-1706915-1-1.html)

AMD had said for years their Zen goal was a 40% IPC gain over their previous
microarchitecture, but they ended up with a 52% gain:
[http://www.anandtech.com/show/11143/amd-launch-ryzen-52-more...](http://www.anandtech.com/show/11143/amd-launch-ryzen-52-more-ipc-eight-cores-for-under-330-preorder-today-on-sale-march-2nd)

Today's launch event by AMD's CEO:
[https://www.youtube.com/watch?v=1v44wWAOHn8](https://www.youtube.com/watch?v=1v44wWAOHn8)

~~~
Asooka
I'm a bit dubious. If they have such an amazing part why is it being sold for
so cheap? Why not 80% of the price of the comparable Intel part? Selling it at
50% or even less raises some heavy alarm bells for me.

~~~
masklinn
> If they have such an amazing part why is it being sold for so cheap?

Because they get crushed in single-threaded performance, so they have to pit
their 8-core parts against Intel's 4-core ones.

The 1700X will be competing against the $350 7700K more than the $1100 6900K
(which, incidentally, is a Broadwell-E part, not a Skylake part, despite the
6 prefix).

~~~
dstaley
The linked benchmarks put the $350 1700X about on par with the $1100 6900K in
both single-threaded and multi-threaded performance. Both of those processors
are 8-core/16-thread chips. Also, the 1700X has a 95W TDP compared to the
6900K's 140W.

~~~
masklinn
Again, the 6900K is a two-year-old µarch, and not one that was considered a
great release in the first place. The 1700X will not be competing against the
6900K because nobody gives a rat's ass about the 6900K; the 1700X will be
competing against the 7700K and its ilk.

~~~
mrb
The IPC difference between Broadwell (6900K) and Kaby Lake (7700K) isn't that
big: 5-10%.

The _only_ redeeming quality of the 7700K is that Intel was able to raise the
turbo freq to 4.5 GHz, while the 1700X only boosts to 3.8 GHz. So yeah single-
threaded perf will be higher with Intel.

~~~
edvinbesic
Isn't the IPC difference basically 0, with the 7 series only having slightly
higher frequencies due to the optimization cycle? So clock for clock, 6- and
7-series CPUs should yield the same performance.

I should add that the above is the main reason the Kaby Lake release is so
lackluster. There is very little reason to upgrade your desktop to a 7-series
chip (laptops get power optimizations and 4K playback support).

It's also the main reason why AMD might look appealing: lack of a worthwhile
upgrade from Intel, while AMD delivers twice the cores/threads for not a great
deal more than a 7600K.

~~~
masklinn
> Isn't the IPC difference basically 0 and the 7 series only has slightly
> higher frequencies due to the optimization cycle? So clock for clock 6 and 7
> series CPU should yield the same performance.

You're talking about Skylake 6-series, the 6900K is Broadwell-E not Skylake
despite being 6-branded.

------
tracker1
I just have to say, I really hope that this is real. AMD has burned a lot of
trust a few times in overzealously stating performance marks in their prior
generations of CPUs. I ran AMD for several years until the Core-2 came out
from Intel (currently very happy with an i7-4790K).

If these comparisons are real, my next build may be AMD again.

~~~
Cyph0n
Seems like you skipped over the FX series (Bulldozer, I believe). I got an
FX-8320 a few years back, and my brother got an FX-6300. Both are capable
processors and are MUCH cheaper than their counterparts (i5 and i3,
respectively).

Heck, if I had built a PC in the past year, I would have gone with an FX-8350
just because it's so damned cheap yet very performant. Once Zen is out, it's
really a no-brainer for me.

~~~
RussianCow
The biggest problem with the FX series, IMO, was the deceptive marketing. The
FX-8000 CPUs were marketed as being 8-core, when in reality every pair of
cores shared an FPU and cache, so real-world performance was much closer to
that of a quad-core for many workloads. Add on top of that the abysmal IPC
(compared to Intel's offerings) and high power consumption, and there was very
little reason to get an FX-8350 or similar over an i5.

(Yes, today the FX CPUs are much cheaper, but when they came out they were
barely so. I know because I got an FX-8350 as soon as it came out and
regretted it.)

~~~
mrb
I have to disagree with that. Most workloads (H.264 decoding/encoding, gaming,
typical desktop software, etc) are ALU-bound not FPU-bound. So the FX series
is generally beating Intel on the perf/$ metric, just because it is so darn
inexpensive. In my experience a $100 FX CPU is as good as a $200 Intel CPU
when doing H.264 encoding:

[https://news.ycombinator.com/item?id=13172972](https://news.ycombinator.com/item?id=13172972)

(However, the FX series' perf/watt is worse than Intel's, I'll give you that.)
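The perf/$ metric being argued here is just throughput divided by price; a toy sketch of the comparison (the fps and price figures below are hypothetical placeholders for illustration, not benchmark results):

```python
# Toy perf-per-dollar comparison for an encoding workload.
# All figures are hypothetical placeholders, not measurements.
def perf_per_dollar(encode_fps, price_usd):
    """H.264 encoding frames per second per dollar of CPU cost."""
    return encode_fps / price_usd

cpus = {
    "cheap 8-thread part":  {"fps": 40.0, "price": 100.0},
    "midrange 4-core part": {"fps": 45.0, "price": 200.0},
}

for name, c in cpus.items():
    print(f"{name}: {perf_per_dollar(c['fps'], c['price']):.3f} fps/$")
```

With numbers like these, the cheaper part wins on perf/$ even while losing the absolute benchmark, which is the shape of the argument above.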

~~~
RussianCow
Video encoding/decoding is definitely a niche for which the FX series had
great value when it came out, so I agree with you on that. However, for gaming
and most "typical desktop software", having higher single-core performance is
much more important than having more cores, and the FX series was a huge
letdown there. Then you factor in the extra energy you're using, and all of a
sudden there are very few cases in which it makes sense to buy an AMD CPU over
an Intel one.

(They're much cheaper now, so it might make a lot more sense to buy one today,
but I'm sticking with my Intel CPU until AMD can prove themselves.)

------
youdontknowtho
Really looking forward to this, and I'm glad that AMD's management no longer
seems to be on the pipe.

Even if the real world perf is close to the $1K Intel chips it will be a win.
It's going to force price cuts from Intel and hopefully spark some competition
again.

~~~
slantyyz
It would be interesting if Apple used these chips in upcoming Mac desktops.

I'm guessing that adopting these lower priced chips without lowering prices
would have a negligible impact on sales for Apple if the performance is as
good as AMD claims.

~~~
edko
It would be nice. However, would they be able to still keep Thunderbolt?

~~~
sp332
The standard chipsets do have support for USB 3.1 Gen 2. But mixing in support
for DisplayPort would require either a custom chipset (which Apple may be able
to do, but it might not be as cheap) or waiting until a Zen-based APU comes out.

~~~
samcat116
USB 3.1 Gen 2 plus display is not the same as Thunderbolt. TB is basically
straight PCIe. You can't just combine data and video and call it Thunderbolt.
Plus, 3.1 Gen 2 is only half the speed of TB3.

------
adamnemecek
I can't think of a more spectacular comeback than AMD in the last year or so.
They are positioned pretty well for the inevitable convergence of CPU and GPU.
That CUDA crosscompiler was such a brilliant move.

~~~
mtanski
It's already in the works.
[https://www.overclock3d.net/news/cpu_mainboard/amd_reveals_a...](https://www.overclock3d.net/news/cpu_mainboard/amd_reveals_a_exascale_mega_apu_in_a_new_academic_paper/1)

CPU, GPU, and HBM all on one package. That would be a game changer for building
analytics databases. There are a bunch of GPU-based databases out now, but a
lot of the benchmark numbers they give you rely on having the data cached in
GPU RAM. Real-world benchmarks that require moving data over PCIe are less
generous.
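Rough envelope math for that PCIe penalty (the bandwidth figures below are assumptions: ~12 GB/s effective for PCIe 3.0 x16, and a placeholder in-memory scan rate for the GPU):

```python
# Back-of-envelope: time to ship a working set to the GPU over PCIe
# versus scanning it once it's already resident. Both rates are assumptions.
PCIE_GBPS = 12.0       # effective PCIe 3.0 x16 throughput (assumed)
GPU_SCAN_GBPS = 300.0  # in-GPU-memory scan rate (assumed)

def query_time_s(dataset_gb, resident):
    """Seconds for one full scan, optionally paying the PCIe transfer first."""
    transfer = 0.0 if resident else dataset_gb / PCIE_GBPS
    return transfer + dataset_gb / GPU_SCAN_GBPS

for gb in (8, 64):
    hot = query_time_s(gb, resident=True)
    cold = query_time_s(gb, resident=False)
    print(f"{gb} GB: resident {hot:.3f}s, over PCIe {cold:.3f}s")
```

Under these assumptions the transfer dominates by over an order of magnitude, which is why on-package HBM changes the picture for data that doesn't fit in GPU RAM.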

------
nimos
I think the low TDPs have to be the scariest thing for Intel. Really
interested to see what they can do in the 15-25W range for notebooks and their
server stuff.

~~~
rasz_pl
I have to wonder if AMD's console win for the current generation gave them the
badly needed R&D funding to reach a decent TDP.

~~~
redtuesday
How so? The R&D budget is even lower than before AMD got the console deals.

[https://ycharts.com/companies/AMD/r_and_d_expense](https://ycharts.com/companies/AMD/r_and_d_expense)

~~~
rasz_pl
It's still better than nothing. AMD management is dysfunctional; they treat
engineering as a cost and were cutting it badly during the K9-K10 era.

------
Symmetry
I'm optimistically looking forward to the independent benchmarks on March 2nd.

~~~
kogepathic
The NDA on benchmarks is supposedly set to expire on February 28. [0]

Not sure why this PC World article is coming up now. There have already been
many stories on Ryzen pricing, and the release date of March 2 has been known
for several days now... [1]

Product launch video [2]

[0] [http://wccftech.com/amd-ryzen-reviews-live-february-28th-tes...](http://wccftech.com/amd-ryzen-reviews-live-february-28th-testers-receiving-samples/)

[1]
[https://www.reddit.com/r/hardware/comments/5urylu/when_does_...](https://www.reddit.com/r/hardware/comments/5urylu/when_does_amd_ryzen_benchmark_nda_lift/)

[2]
[https://www.youtube.com/watch?v=1v44wWAOHn8](https://www.youtube.com/watch?v=1v44wWAOHn8)

~~~
bryanlarsen
It was known, but not official until now. Also, preorders are now live.

[0] [http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&I...](http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=2248241)

~~~
hermiti
Thanks

~~~
hermiti
Just pre-ordered mine...:D (newegg)

~~~
loeg
Huh, I don't see them on Newegg yet.

~~~
hermiti
Do a search for ryzen, they are there.

~~~
loeg
Thanks, I was trying to find them browsing by category.

------
Asdfbla
If AMD makes a comeback, how did they do it? Was there not enough competition,
so Intel got complacent? Or did Intel just hit the limits, so it's no longer
possible to keep the lead and prevent AMD from catching up?

~~~
abandonliberty
They actually have a history of this. They did it with Athlon too.

~~~
rasz_pl
AMD has a history of having dysfunctional R&D:

386DX - copied Intel's design verbatim (1:1 microcode). Got sued; won only
because Intel had multiple-source agreements with IBM.

K6 - purchased NexGen. Designed 100% by outsiders.

K7 - once again hired somebody else's engineers, this time almost the whole
DEC Alpha design team.

K8 - the DEC people had some more steam left in them.

K9-Zen period - no more companies to take over and claim credit for their
designs, plus terrible management bleeding engineering talent while burning
money on stupid ideas like SeaMicro = in-house garbage with 50% of Intel's IPC.

Zen - finally managed to rehire competent ex-DEC engineer Jim Keller after his
successful career at Apple.

------
toxican
I'm far too broke to shell out cash for a new Mobo and CPU right now, but I'm
excited to see what this does to Intel's prices for older CPUs. I'd love to
upgrade my i5-2400 soon-ish.

~~~
icefo
Unless you are regularly doing CPU-intensive tasks you can probably stick with
it a few more years. Since I bought an SSD I've noticed that most of the time
the storage, not the CPU, is the bottleneck.

Until my DVD drive fried my desktop motherboard I was using a Core 2 Quad and
it was still fast enough. No lag whatsoever. Now I'm using a laptop with a
first-gen i5 and I feel no need to upgrade it (I'm using it as a desktop; the
battery life does suck).

I understand why PC sales drop every year. You just don't have to buy a new
one unless yours dies.

~~~
toxican
In terms of general use, yeah definitely no problem with the CPU at all. It's
zippy enough. But I also do a lot of CPU-intensive gaming (Cities: Skylines w/
tons of mods, RimWorld, Banished, etc.).

------
mtgx
Jim Keller deserves a frigging statue in front of the AMD HQ, if he doesn't
have one there already. I'm not even kidding. His efforts shouldn't be easily
forgotten, and it could serve to inspire new generations of AMD engineers.

Beyond the product quality itself, I think AMD has had a pretty smart launch
strategy by releasing the CPU chips first to show that it can beat "Intel's
best".

But they really need to start focusing on notebooks ASAP. That's where they
can steal most of the market from Intel, especially now that Intel is showing
signs of (slowly) abandoning the notebook market by prioritizing Xeons over
notebook chips for its new node generations.

AMD should prioritize notebook chips either next year or the one after that,
at the latest. They should be making the notebook chips first, before the PC
ones. They need that market share and awareness in the consumer market.

In terms of how they should compete against Intel in the notebook market, I
would do it like this (at the very least - AMD could do it even better, if it
can):

vs Celeron: 2 Ryzen cores with SMT disabled

vs Pentium: 2 Ryzen cores with SMT enabled

vs Core i3: 4 Ryzen cores with SMT disabled. Or keep SMT and lower clock
speeds, as Intel did it. This may help further push consumers as well as
developers towards using "more cores/threads".

vs Core i5 (dual core): 4 cores with SMT enabled

vs Core i5 (quad core/no HT): 4 cores with SMT enabled + higher clocks and
better pricing. Maybe even 6 cores with SMT, if AMD goes the 6-core route. I
honestly don't even know why Intel decided to make "Core i5" a quad-core chip
as well, and its Core i7 a dual-core chip as well. It's so damn confusing, but
maybe that was the goal. For differentiation's sake, it may be better for AMD
to have a 6-core at this level, or maybe even an 8-core with SMT disabled: the
same thing as Intel, but with twice the physical cores. I don't know why, but
for some reason 6-core chips don't attract me much. They feel "incomplete".

vs Core i7 (quad core/HT): 8 cores with SMT enabled

The guiding principle for this strategy should be "twice the cores or threads
with competitive/better single-thread performance, and competitive/better
pricing."

In a way it would be the inverse of the PC strategy where they maintain the
number of cores but cut the price in half. This would mainly focus on doubling
the number of cores (because notebooks come with so few in the first place),
while maintaining similar or better pricing.

The only ones that don't really fit well in this strategy are the Celeron and
Pentium competitors, and that's because a dual-core Ryzen, even at low clock
speeds, should destroy Intel's Atom-based Celeron and Pentium. We could be
looking at at least a +50% performance difference, and that's what AMD should
strive for there as well. AMD should show Intel what a mistake it made when it
tried to sell overpriced smartphone chips at laptop chip prices.

~~~
cat199
From what I recall of their recent marketing material, it seemed like they are
looking to corner the market on 64-bit ARM servers, and they also have some
hybrid amd64/ARM chip plans to help people who want to take advantage of ARM
in datacenters migrate. I'm no expert, and this is based on a recollection of
marketing materials from a few months back, but 64-bit ARM could be a huge
thing in large-scale light-duty computing (e.g. web/cloud datacenters) because
of the cooling and horizontal scale-out, and also in mobile/IoT/industrial, so
this isn't quite as crazy as it sounds.

~~~
bryanlarsen
AMD K12[1] is their full-custom ARM64 core. Originally scheduled for 2016, it
was delayed to focus on Zen, probably the right move.

It's still scheduled for 2017, but there hasn't been much noise about it for a
long time so I suspect it's been delayed, as they're still focusing on Zen &
Vega.

Zen & K12 probably share a lot of elements, so if Zen is awesome, K12 probably
will be too.

1:
[https://en.wikipedia.org/wiki/AMD_K12](https://en.wikipedia.org/wiki/AMD_K12)

------
arca_vorago
This is really exciting, as I have been waiting to build a new system with the
new AMD gear. Lots of people also don't hear about it, but I am super excited
to see the new server-class CPUs. I built a quad Opteron 6380 system and have
been in love with them ever since, but they weren't perfect and had some
issues I hope are fixed with this new line.

------
crudbug
Interesting to see multi-core wars starting here.

Intel did not do much as the market leader; 8 years back I could already buy a
4-core machine. Waiting to see how AMD does on server parts: 32 cores? 64
cores? POWER9 does 24 cores / 96 threads.

------
walrus01
I was highly skeptical when the first info about Ryzen came out. This is
looking really promising versus the $300 to $400 price range Kaby Lake i7
(7700, 7700K) CPUs, and when comparing the 1700/1700X to the $200-250 price
range i5-7600 Kaby Lake.

------
laura2013
every time i look at AMD's stock price from last year to today i really wish
i'd been wiser with my money. they've performed excellently and it's showing;
this will likely be huge.

------
gnipgnip
I wish people would benchmark GEMM performance for all of us math folks.

~~~
gcp
Ah, but what should they benchmark GEMM with?

MKL? I don't think so; it takes the worst codepath if the chip isn't
GenuineIntel. OpenBLAS? Doesn't have Ryzen support, and doesn't even recognize
some AMD cores. ACML? Hasn't been updated since 2013 or so.

This is a serious question. What are you supposed to do if you need a GEMM
kernel for Ryzen? I sure hope AMD puts out updated ACML libraries _real soon_.
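Whichever library ends up being used, the measurement itself is simple. A minimal DGEMM throughput probe looks something like this (a sketch: NumPy just dispatches to whatever BLAS it was built against, MKL, OpenBLAS, or otherwise, so this measures the library you actually have installed):

```python
# Best-of-N GFLOP/s for an n x n double-precision matrix multiply,
# using whatever BLAS backs NumPy on this machine.
import time
import numpy as np

def gemm_gflops(n=2048, repeats=5):
    rng = np.random.default_rng(0)
    a = rng.standard_normal((n, n))
    b = rng.standard_normal((n, n))
    a @ b  # warm-up: thread-pool spin-up, page faults, etc.
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        a @ b
        best = min(best, time.perf_counter() - t0)
    return (2 * n ** 3) / best / 1e9  # ~2n^3 flops per n x n GEMM

if __name__ == "__main__":
    print(f"DGEMM: {gemm_gflops():.1f} GFLOP/s")
```

Taking the best of several runs filters out scheduler noise; the interesting part is comparing the same script across MKL and OpenBLAS builds on the same chip.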

~~~
philipkglass
I'd like to see benchmarks with OpenBLAS. Unlike ACML it's under active
development, unlike MKL it won't deliberately screw AMD performance, and it
offers "pretty good" performance across every environment I've tried it in.
Good enough that it's not worth paying for MKL, not worth going through
ATLAS's self-tuning routine, not worth changing my build scripts to use vecLib
under OS X. If OpenBLAS currently runs poorly on Ryzen I hope Ryzen will get
some development love, because I kind of hate using ATLAS and at this point it
would take a major advantage to tempt me back away from open source
components.

~~~
gcp
OpenBLAS performance is atrocious in 32-bit mode because it doesn't properly
support AVX with the halved register file. Not the most common configuration,
but MKL handles it fine (on Intel chips, obviously).

That said I agree it makes more sense for AMD to contribute to OpenBLAS than
anything else.

~~~
philipkglass
Interesting -- what are the use cases for single precision BLAS on CPU? All
the scientific software I use requires double precision and for tasks that do
well with single precision, I would have thought that GPGPU would now be the
go-to solution.

~~~
gcp
Not if you're shipping software to consumers. (Also, I actually meant 32-bit
as in the OS, not the floating point precision)

------
walshemj
Good, now Intel has some competition.

