
AMD Announces Ryzen “Zen 3” and Radeon “RDNA2” Presentations for October - rbanffy
https://www.anandtech.com/show/16077/amd-announces-ryzen-zen-3-and-radeon-rdna2-presentations-for-october-a-new-journey-begins
======
dang
This is an announcement of an announcement. Those are off topic here. There's
no harm in waiting.

[https://hn.algolia.com/?query=%22announcement%20of%20an%20an...](https://hn.algolia.com/?query=%22announcement%20of%20an%20announcement%22&dateRange=all&page=0&prefix=true&sort=byDate&type=comment)

[https://hn.algolia.com/?query=%22no%20harm%20in%20waiting%22...](https://hn.algolia.com/?query=%22no%20harm%20in%20waiting%22%20by:dang&dateRange=all&page=0&prefix=false&sort=byDate&type=comment)

~~~
username90
This announcement of an announcement is very important for people looking to
buy a GPU now.

~~~
dang
Fair point, but we have to go for the global optimum, which is having the most
interesting front page. That means sacrificing local optima, so you may be
right that there isn't _no_ harm in waiting. Most of the people you mention
(at least the ones who read HN) are probably googling for GPU information,
though, so I doubt there will be too many GPU purchase disasters.

The trouble with "announcement of an announcement" posts isn't just that
they're unsubstantive and lead to generic discussions
([https://hn.algolia.com/?query=generic%20discussion%20by:dang...](https://hn.algolia.com/?query=generic%20discussion%20by:dang&dateRange=all&page=0&prefix=true&sort=byDate&type=comment)).
It's also that when the real announcement is made, we end up with a duplicate
thread.

I suppose this follows from the competing interests of the publishers and this
forum. Their interest is to copy the same information to as many places as
possible; our interest is to deduplicate it.

------
the_duke
With Intel stuck in process hell for a while longer, I primarily hope AMD can
gain ground on Nvidia. Nvidia is pushing a lot of innovation and gaining market
dominance, both in the ML/GPGPU space and in gaming (with ray tracing,
upscaling, etc.).

We need some competition to keep prices low and innovation high over the medium
to long term.

~~~
reitzensteinm
Chips are in the pipeline for many years, so products being released today are
probably on the tail end of being impacted by AMD cutting GPU budgets to the
bone to focus on Zen. I wouldn't expect miracles.

Given AMD's market cap and financial success, I bet they're working on
competitive designs now. There's no money in being an also-ran in this space.

~~~
DisjointedHunt
The information available around Big Navi (the GPUs expected to launch next
month) puts them at a +50% performance jump over the last generation, likely
able to compete with the Nvidia flagship 3080:
[https://www.tomshardware.com/news/amd-big_navi-rdna2-all-we-know](https://www.tomshardware.com/news/amd-big_navi-rdna2-all-we-know)

I'd really urge anyone wanting to opine on these issues to watch the official
analyst day videos and read through the releases AMD has put out so far in
terms of architecture and other tech decisions they've made:
[https://www.youtube.com/watch?v=LMNwLVJQzGs&t=10064s](https://www.youtube.com/watch?v=LMNwLVJQzGs&t=10064s)

~~~
mbell
> The information available around Big Navi (The GPUs expected to launch next
> month) put them at a +50% performance jump from the last generation and
> likely able to compete with the Nvidia flagship 3080

I guess it depends on what benchmark you care about but for AMD to compete
with a 3080, which is 70-100% faster than a 2080, which is 10-20% faster than
AMD's current flagship, they need to do a lot better than +50%.
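Treating those quoted ranges as multipliers makes the gap concrete (a rough sketch; the percentages are just the benchmark spreads claimed above, not measured figures):

```python
# Normalize AMD's current flagship to 1.0x and stack the quoted ranges.
amd = 1.0
rtx2080_lo, rtx2080_hi = amd * 1.10, amd * 1.20                # 2080: 10-20% faster
rtx3080_lo, rtx3080_hi = rtx2080_lo * 1.70, rtx2080_hi * 2.00  # 3080: 70-100% faster

# Matching a 3080 therefore means roughly +87% to +140% over AMD's
# current card, well beyond the rumored +50%.
print(round(rtx3080_lo, 2), round(rtx3080_hi, 2))  # 1.87 2.4
```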

~~~
pbalau
Isn't the Nvidia flagship the RTX 3090? The overclockers uk website has the
most expensive 3090 (Asus one) at ~1600gbp. On paper this should destroy
everything else available atm.

~~~
DisjointedHunt
In the kitchen keynote, their CEO positioned the 3080 as the flagship card and
the 3090 as the Titan replacement (higher VRAM for things like large model
runs).

------
PedroBatista
Rumor has it AMD has a competitive GPU but will not pursue the high-end market,
mostly because of their limited TSMC capacity: they can make much more money
from a CPU wafer than a GPU one (due to die size and yield).

But I really hope I'm wrong and they kick Nvidia's ass, since Nvidia has become
such intolerably arrogant, anti-competitive pricks over the last 10+ years.

~~~
technofiend
The last time I owned an AMD video card was more than a decade ago. At the
time their video card drivers needed work and could blue screen your box.
There was a lot of back and forth on the internet about which driver version
to load to get the most stable experience.

Fast forward to 2020 and the perception of AMD's video driver quality seems
to be in a similar place:
[https://www.pcgamesn.com/amd/radeon-big-navi-driver-issues-fix](https://www.pcgamesn.com/amd/radeon-big-navi-driver-issues-fix).
This may just be confirmation bias, since there are also articles refuting
that AMD still has driver issues.

On the gripping hand, AMD themselves apparently acknowledge they have issues
and are trying to address them:
[https://www.tomshardware.com/news/amd-drivers-update-download-adrenalin-2020](https://www.tomshardware.com/news/amd-drivers-update-download-adrenalin-2020)

~~~
derefr
> Fast forward to 2020 and the perception about AMD's video driver quality
> seems to be in a similar place

Maybe for Windows, but the same cannot be said of Linux or macOS, where AMD's
choice to open-source their drivers has led to these OSes having great
support; while Nvidia, having to support their own closed-source drivers,
seemingly hasn't had the bandwidth to get their drivers for these OSes up to
par.

~~~
packetlost
This. I've already decided my next GPU will be AMD regardless of whether it's
competitive at the top end in terms of performance. I am so sick of Nvidia's
Linux driver bullshit that I cannot wait to drop them. I don't game that much
anymore anyway.

~~~
shmerl
AMD is great for gaming on Linux, especially with projects like radv and ACO.

~~~
packetlost
I'm not super interested in Linux gaming, I just want my desktop to be usable.

~~~
jdmichal
Honest question: If you're not interested in gaming, why even get a discrete
GPU and not an integrated one? Are you interested in using it for other
purposes?

~~~
packetlost
I dual boot Windows for gaming. I work and code on Linux, except my monitor
setup + nvidia graphics card results in a virtually unusable desktop.
Integrated graphics can't run multiple 4k screens very well, if at all.

~~~
shmerl
I do all my gaming on Linux and AMD works very well for it.

------
dtx1
As a Linux-only user (and gamer), Nvidia is a plain no-go for me, so I just
hope that RDNA2 is not entirely disappointing like RDNA1 was. For us Linux
gamers, working VFIO on a consumer card would be a dream, of course, since
Nvidia is locking that to their workstation cards; maybe AMD will just throw it
in as a goody.

Zen 3 is probably going to kick butt, especially in any non-gaming task.
Perhaps AMD will inch closer to Intel in gaming loads as well, but I don't care
too much.

~~~
lreeves
I have had perfectly working VFIO setups with both the 1070 and 1080 Ti from
Nvidia. There are many comprehensive guides, and working around driver error
Code 43 is quite trivial.
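For reference, the usual Code 43 workaround is to hide the hypervisor from the guest driver in the libvirt domain XML, along these lines (a sketch; the `vendor_id` value is an arbitrary 12-character string, nothing Nvidia-specific):

```xml
<features>
  <hyperv>
    <!-- Report a non-default vendor so the driver doesn't detect KVM -->
    <vendor_id state='on' value='0123456789ab'/>
  </hyperv>
  <kvm>
    <!-- Hide the KVM CPUID signature from the guest -->
    <hidden state='on'/>
  </kvm>
</features>
```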

~~~
dtx1
That's nice, but an error that's designed to fuck me as a customer is a good
reason not to buy from that company, and the general state of Nvidia drivers
on Linux makes me not want to choose them for a new build. Nonetheless, I'm
glad to hear you got it working; it sure is a fun thing to do.

~~~
calcifer
> the general state of nvidia drivers on linux

... is great? They have been well supported for decades, have working drivers
for every GPU on release day, and even 10+ year old cards get updated drivers
for new kernels / X servers, whereas AMD takes months or years to have usable
drivers in a stable kernel.

~~~
dtx1
Is that so? I always hear: stay away from Nvidia on Linux, especially but not
limited to when using Wayland. Is that truly not the case?

~~~
calcifer
Well, my experience [1] has been nearly perfect, certainly much better than
what I've had with AMD. Nvidia has been pretty much plug&play for me for a
long time, with very few issues.

For Wayland yes, Nvidia is using a different approach than AMD/Intel so most
desktop environments don't support the Nvidia way, resulting in a poor
experience. That said, Wayland desktops as a whole are still quite some time
away from reaching feature parity with X, so I'm not too worried just yet.

[1] For reference, I've been using Linux (and only Linux) on the desktop for
~18 years.

------
allenrb
If the rumors are true about increased frequency _and_ IPC for Zen3, we should
be in for quite a ride. Zen4 and a shrink to 5nm next year ought to produce
some further benefits.

I’m too old to “hate” Intel but the improvements we are seeing from both sides
these last couple of years make clear how bad the stagnation had gotten. Gotta
have competition!

------
ziml77
I wonder if Zen 3 will finally let AMD pass Intel in single threaded
performance. I thought they had already, but just a few days ago I checked
benchmarks and saw that my 8086K still beats a 3800XT.

Though maybe I'm stuck in the past a bit thinking that single threaded
performance is even still important for games.

~~~
StillBored
It's not just games: Firefox with uBlock Origin and a dark-mode plugin is
noticeably faster on fast single-threaded machines.

~~~
ziml77
I was going to add "and typical PC tasks" but ended up taking it out because
everyone's version of typical on a PC is different.

------
jonpurdy
It was already a dire situation running a Mac with eGPU or a Hackintosh even
before the RTX 30-series announcement, and much more dire after that.

The likelihood of Apple allowing Nvidia to produce Mac drivers again is close
to zero. It'd be nice for AMD to be able to compete with GPUs again.

~~~
andreasley
Apple Silicon will contain an Apple-designed GPU, so it might soon be
irrelevant for Mac users what AMD has to offer.

~~~
littlecranky67
I highly suspect Apple will still ship (and support) AMD GPUs, maybe for the
iMac Pro and Mac Pro. I doubt that Apple Silicon GPUs will be close in
performance to AMD's/Nvidia's flagships. And I assume the market share of the
iMac/Mac Pro is just not big enough for Apple to invest in competing with
AMD/Nvidia in the high-end segment.

~~~
pier25
OTOH Apple also needs dedicated GPUs for some of their MBPs and the bigger
iMacs.

I think they will end up ditching AMD to have 100% E2E control on all their
products.

~~~
srtjstjsj
Apple's MBP drivers have been broken since 2016 and are getting worse: they
lock the GPU at high frequency and thereby thermally throttle the CPU even at
idle. I don't know if that means Apple Silicon will be better or if they are
just incompetent at it.

------
kushalpandya
I feel RDNA 2, at the very best, would prove to be something in between what
Zen+ and Zen 2 were against Intel: while it may not outperform Nvidia at the
highest end, it could end up providing overall better value, leading people to
buy it over Nvidia.

It is worth noting that, unlike Intel, Nvidia isn't sitting still when it
comes to GPU innovation, so that's a harder hill for AMD to climb. Fortunately,
AMD now has its cash cows running with Ryzen, EPYC and consoles, so they can
allocate more R&D budget to GPUs.

------
pier25
Very excited about this. I wasn't really a fan of AMD, but I built a machine
with a Ryzen 3700X last year and I've been super impressed.

Personally I don't really care if RDNA2 is not as powerful as the Ampere GPUs
as long as the price is competitive. I want to update my 1070 just for playing
Cyberpunk 2077 when it comes out, but since I don't play that much anymore it
doesn't make sense to go all-in on the GPU.

------
swalsh
I recently bought a Ryzen 3900X. I hope Linux eventually supports these
processors better. There have been some serious stabilization issues that I
eventually tracked down to my processor, and they required me to spend a
significant amount of time researching. Things seem to just work with Intel
processors. I'm hoping that as popularity increases, support for these
processors will improve.

~~~
garblegarble
>serious stabilization issues

Care to go into more detail? I've just installed a server with a 3700X and the
only issues I've experienced have been that I need a custom kernel module in
order to get CPU temperature and power data (zenpower)
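For what it's worth, recent mainline kernels expose Zen temperatures through the in-tree k10temp driver, so zenpower may mainly matter for the extra power/current readings. A hedged sketch for checking what a kernel already exposes via the hwmon sysfs interface (`zen_temps` is a hypothetical helper; on machines without either driver it simply finds nothing):

```python
from pathlib import Path

def zen_temps():
    """Collect Zen die temperatures (deg C) from hwmon, if any driver exposes them."""
    readings = {}
    # k10temp is the in-tree driver, zenpower the out-of-tree alternative.
    for hw in Path("/sys/class/hwmon").glob("hwmon*"):
        try:
            name = (hw / "name").read_text().strip()
            if name in ("k10temp", "zenpower"):
                # temp1_input reports millidegrees Celsius
                readings[name] = int((hw / "temp1_input").read_text()) / 1000.0
        except OSError:
            continue
    return readings

print(zen_temps() or "no Zen hwmon sensor found (module not loaded?)")
```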

~~~
shrimp_emoji
>I need a custom kernel module in order to get CPU temperature and power data
(zenpower)

Is this on an old Ubuntu version or something? This was only true in the early
days of these CPUs, which was last year. Temps have been working for a long
time in the Arch world, but maybe it's different in point release hell.

~~~
garblegarble
>Is this on an old Ubuntu version or something?

I don't know, although I'm running the latest Ubuntu (20.04.1), with kernel
5.4.0-45-generic

------
jonplackett
At the beginning of the year I thought AMD were going to have an amazing year.
I think they’ll probably continue beating Intel but the new nVidia cards look
really good, and very good value.

I hope AMD still have an ace up their sleeve.

~~~
akerro
> I think they’ll probably continue beating Intel but the new nVidia cards
> look really good, and very good value.

AMD has much lower power consumption; that was my main motivator for choosing
Ryzen and an AMD Radeon for my dev station.

~~~
zerocrates
Really? The general sense about AMD has almost always been the reverse, both
as compared to Intel and Nvidia in those respective spaces.

~~~
mastax
5 years of process stagnation (Intel) will do that. Samsung's 8nm process used
in Nvidia's cards is not as good as TSMC's 7nm, and that shows in their power
consumption. Historically, Nvidia's architectural efficiency has been much
higher, so it's not a given that RDNA2 will be more efficient even with the
process advantage.

------
SloopJon
Would an AMD event like this usually be an announcement about future products,
a paper launch, or could one expect to buy one of these processors soon? I'm
looking to build a new system around an RTX 30-something. I would even
consider a pre-built system if I can't get my hands on retail parts.

~~~
brundolf
Same here; I decided about a month ago that I wanted to upgrade, and that I
wanted to go AMD this time, but then I found out Zen 3 was right around the
corner so I've been anxiously awaiting its release so I can move forward

------
dzonga
I'm in the market for an APU. Should I wait for these new processors or just
bite the bullet on a 3400G? I'm not a serious gamer, and hardly play games at
all, but an alternative to the Xbox One S I have would be nice. The games I
would like to play (e.g. strategy and tactics) are not available on console.

~~~
neogodless
The AMD naming conventions are a little confusing, so it's not obvious, but
the 3400G is based on Zen+. They have announced Zen 2 APUs [0], but they are
not available directly - just in pre-built systems. That may change at some
point. These are a huge improvement over Zen+, but I wouldn't wait around for
retail availability of Zen 3 APUs!

[0] [https://www.amd.com/en/products/apu/amd-ryzen-7-4700g](https://www.amd.com/en/products/apu/amd-ryzen-7-4700g)

------
jagger27
I'm pretty excited to see what they'll offer in the 10- to 16-core range for
Zen 3 and if they'll hit the magical 5GHz mark.

Anyone else remember the 10GHz promised for NetBurst? It's nice to see the
industry slowly creeping up to that number.

------
Severian
Besides processors, AMD knows how to design a good hype train.

~~~
srtjstjsj
They are masters of the hype cycle. The hype bicycle, nowadays.

~~~
Miraste
Including the most important part of the AMD hype bicycle, backpedaling.

------
nsxwolf
I’m worried that the choice of not going first with the RDNA2 presentation is
because they know it’s going to be disappointing and not competitive with RTX
3000.

~~~
blymphony
It's not out of the question to save the best part of a presentation for last.

~~~
tw04
Sure - but waiting to announce (and hopefully release) for over a month after
your competitor is SHIPPING is a sure way to lose sales. At the very least
they need to leak potential pricing and performance in the meantime.

~~~
ses1984
Is October 8th really a month after Nvidia is shipping?

I'm not sure how many units Nvidia is really going to move before then...

~~~
tw04
Per the article you're commenting on... RDNA2 is being announced October 28th.
NVidia is SHIPPING RTX 3k on the 17th.

[https://www.techradar.com/news/rtx-3080](https://www.techradar.com/news/rtx-3080)

So yes, October 28th is more than a month after September 17th. Maybe read the
article next time before you downvote.

~~~
ses1984
I didn't down vote.

Sorry I misread the article.

Anyway I am still highly, highly skeptical Nvidia is going to move a lot of
units so close to launch. The first batch is going to sell out in moments and
from then on they will only be available way above retail price. Happens every
time...

------
d0mdo0ss
I commented earlier, but figured I'd add some more:

I'm rooting for AMD and am also looking to replace a 2012 MBP with an AMD
laptop. The de facto cards for ML work seem to be Nvidia (e.g. their RTX line,
[https://timdettmers.com/2020/09/07/which-gpu-for-deep-learning/#TLDR_advice](https://timdettmers.com/2020/09/07/which-gpu-for-deep-learning/#TLDR_advice))

Does anyone know if PyTorch/scikit might offer more support for AMD cards?

~~~
t-vi
PyTorch is increasing support for AMD cards. It has had CI for a long time,
has been buildable all this year without too many caveats, and I would expect
nightlies to be advertised sometime soon; maybe the next stable release will
even include it.
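As a hedged sketch of how one might check which GPU backend a given PyTorch build targets (ROCm builds ship a `torch.version.hip` string while CUDA builds leave it as `None`; `torch_backend` is a hypothetical helper, not a PyTorch API):

```python
def torch_backend():
    """Best-effort guess at the GPU backend this PyTorch build targets."""
    try:
        import torch
    except ImportError:
        return "unavailable"
    # ROCm wheels set torch.version.hip; CUDA wheels set torch.version.cuda.
    if getattr(torch.version, "hip", None):
        return "rocm"
    if getattr(torch.version, "cuda", None):
        return "cuda"
    return "cpu"

print(torch_backend())
```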

------
srtjstjsj
The elephant in the room is (non-free) drivers/compilers.

You can lose 90% of theoretical performance due to code quality and
(especially in a laptop) thermals.

~~~
derision
AFAIK AMD is more open than the alternatives

------
MrStonedOne
Unless they are going to announce that they will stop letting motherboards
lock your cpus, I don't see why I should be excited.

~~~
newsclues
Why wouldn’t you buy a Ryzen because some Epyc chips get locked?

~~~
MrStonedOne
I can't support AMD as a company as long as they are doing that on any of
their products.

~~~
bildung
So what kind of CPUs do you buy, with both Intel and AMD out of the game?

~~~
MrStonedOne
I'm mad at Intel for what they did in the past, and I'm mad at AMD for what
they are currently doing. Which do you think will win out?

~~~
bildung
As Intel's Boot Guard is essentially the same feature, and not a thing of the
past, my guess was neither of the two. I wasn't trying to be snarky; I'm
really interested in alternatives.

~~~
MrStonedOne
I can't find anything on Google suggesting that Intel Boot Guard means
plugging a CPU into a motherboard binds that CPU to that motherboard such
that it no longer works in another motherboard.

