
AMD’s third shoe drops at CES 2020: 7nm Zen 2 mobile CPUs - vo2maxer
https://arstechnica.com/gadgets/2020/01/amds-third-shoe-finally-drops-at-ces-2020-7nm-zen-2-mobile-cpus/
======
nickcw
Your standard disclaimer about nm when referring to modern chips:

> Most recently, due to various marketing and discrepancies among foundries,
> the number itself has lost the exact meaning it once held. Recent technology
> nodes such as 22 nm, 16 nm, 14 nm, and 10 nm refer purely to a specific
> generation of chips made in a particular technology. It does not correspond
> to any gate length or half pitch. Nevertheless, the name convention has
> stuck and it's what the leading foundries call their nodes. Since around
> 2017 node names have been entirely overtaken by marketing with some leading-
> edge foundries using node names ambiguously to represent slightly modified
> processes. Additionally, the size, density, and performance of the
> transistors among foundries no longer matches between foundries. For
> example, Intel's 10 nm is comparable to foundries' 7 nm while Intel's 7 nm
> is comparable to foundries' 5 nm.

[https://en.wikichip.org/wiki/technology_node](https://en.wikichip.org/wiki/technology_node)

So 7nm does not mean the chip has 7nm features, it means that the marketing
team called it "7nm".

~~~
derefr
This closely matches what happened to processor naming. It used to be that
any two (CISC) CPUs with the same frequency were about on par with one
another, so we just called processors by their frequency. As soon as
processors started adding things like SSE, though, that went out the window,
since now “one cycle” could do arbitrary amounts of work, and also consume
arbitrary amounts of electricity. So now we instead group processors by
manufacturer and model, and compare them by “generation”, which ends up
being a loose count of each time a vendor has done a large redesign that
enabled efficiency gains beyond “just” a process-node shrink.
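
To make that concrete: with SSE, a single instruction operates on four
floats at once, so “same frequency” stopped implying “same work per cycle”.
A minimal sketch in C (standard SSE intrinsics, x86 only):

    #include <stdio.h>
    #include <xmmintrin.h>  /* SSE intrinsics */

    int main(void) {
        float a[4] = {1, 2, 3, 4};
        float b[4] = {10, 20, 30, 40};
        float out[4];

        /* _mm_add_ps compiles to a single addps instruction:
           four float additions in one shot. */
        _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));

        printf("%.0f %.0f %.0f %.0f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }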

So: is there any analogous new convention for naming process nodes, now that
you can’t just refer to them by their size? If there’s a dropdown select box
on some form internal to e.g. Apple, where an engineer specifies what process-
node _of_ what fab they’d like to use to print a new chip—and said list has
had some thought put into making it intuitively unambiguous—then what would
the items in that list look like?
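
(The closest thing I can point to today is the designators the fabs
themselves publish -- an illustrative, not exhaustive, sample:

    TSMC:            N7, N7+, N6, N5
    Samsung:         8LPP, 7LPP
    GlobalFoundries: 12LP
    Intel:           14nm++, 10nm

though most of these still carry the marketing "nm" baggage.)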

~~~
neodymiumphish
Yep, but now we refer to chips by their cost instead.

AMD Threadripper 3990X: $3990

~~~
warrenm
That's only true when it's first released

------
bitL
Now please get some proper high-end laptops (HiDPI, 32GB, TB3) and a NUC clone
that can be used as a SteamBox.

~~~
a012
It may take forever for a vendor to make a NUC-like mini PC based on AMD
CPUs. I assume there's a chicken-and-egg problem here: the market is small
and the Intel NUC is already well established.

~~~
aeyes
ZOTAC has been selling AMD-based mini PCs for a while; I don't see why they
wouldn't make one based on this new CPU:

[https://www.zotac.com/us/product/mini_pcs/all?processor=AMD](https://www.zotac.com/us/product/mini_pcs/all?processor=AMD)

~~~
bitL
Zotac has a better Ryzen-based computer in their Magnus line with a proper GPU
though...

------
Roritharr
I really hope they find a way to include a compatible Thunderbolt 3
experience. Using a Thunderbolt 3 dock has become a must for me at this
point; I'd hate to go back to multiple cables and less than dual 4K@60.

~~~
jakamau
I share that sentiment and I can't wait for USB4 to show up in consumer
products.

~~~
Roritharr
I wonder if USB4 devices will actually be fully compatible with current
Thunderbolt 3 devices, or if it's a "technically yes, but not really"
situation.

------
jacek
I have been waiting for mobile 7nm Zen 2 to get a new laptop. However, it
seems most of the newly introduced laptops with Zen 2 are gimped compared to
their Intel counterparts. Some AMD laptops from Lenovo lack the high-res
display options and 16GB memory options that the Intel versions have. The
Acer Swift 3 with Intel comes with a high-res 3:2 display; the AMD version
comes with a 16:9 FHD one. I hope the next-generation ThinkPad line will be
more interesting.

~~~
opencl
They announced these chips two days ago; there's still plenty of time for
nicer laptops to be announced. Hopefully the MateBook D gets updated with
these at least.

~~~
jacek
You are right, we need a bit of patience. Reviews of last-generation
ThinkPads with AMD have been very positive. I hope the next generation will
be even better. I would love to see one with a 16:10 screen, but that's
rather unlikely.

~~~
bjoli
Let's make it 3:2! I make extensive use of either 2 windows side by side, or
multiple windows in Emacs. I never have horizontal space issues.

~~~
SmellyGeekBoy
I play a lot of retro games, so why not 4:3!?

~~~
y4mi
I don't remember 4:3 being frequent; wasn't almost everything 5:4?

~~~
fhars
No, you remember wrong: 5:4 was extremely uncommon. 1280x1024 is the only
5:4 resolution I have ever seen in the wild, compared to 320x240, 640x480,
800x600 and 1024x768, which are all 4:3.
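
A quick sanity check in C, reducing each of those resolutions by its GCD:

    #include <stdio.h>

    /* greatest common divisor, for reducing width:height to lowest terms */
    static int gcd(int a, int b) { return b ? gcd(b, a % b) : a; }

    int main(void) {
        int res[][2] = {{320,240}, {640,480}, {800,600}, {1024,768},
                        {1280,1024}};
        for (int i = 0; i < 5; i++) {
            int g = gcd(res[i][0], res[i][1]);
            /* prints 4:3 for everything except 1280x1024, which is 5:4 */
            printf("%dx%d -> %d:%d\n", res[i][0], res[i][1],
                   res[i][0] / g, res[i][1] / g);
        }
        return 0;
    }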

(Does anyone else fondly remember the short moment in time when all laptops
and all projectors had 1024x768 as their native resolution and you could
expect presentations to just work?)

------
bloody-crow
Meantime, Apple is stuck with Intel for CPUs and AMD for GPUs — the worst of
both worlds.

~~~
dijit
I would not say “stuck”. They have good reason for telling Nvidia to fuck
off. Maybe you remember some time ago when Nvidia graphics cards didn’t meet
the spec that they gave to Apple, and thus overheated and de-soldered
themselves from MacBook Pros. This was not helped by the fact that Nvidia
insisted on integrating itself into the north bridge of the motherboard,
meaning a card failure was fatal in the weirdest of ways.

Or the other time when Nvidia didn’t ship their graphics chip with support
for the old-school Cinema Display. So the machines sat in warehouses for 18
months while Nvidia produced the part, and only then could Apple start
integrating.

Then Nvidia tried to sue Samsung and Qualcomm when the iPhone came out
(because Apple used GPUs from these companies), claiming patent infringement.

Then in 2016 Apple said “no” to Nvidia on putting their GPUs into their
laptops due to performance-per-watt concerns. And apparently that was not a
healthy conversation.

Nvidia is a bully, and Apple is a big player. I will not shed a tear over 10%
performance in predominantly game workloads.

Intel, on the other hand, has an edge with Thunderbolt right now, but that’s
an open standard and Apple could produce an AMD machine with Thunderbolt
now. I would suspect that they’ll live with the lesser performance until ARM
is possible.

~~~
ksec
> Then Nvidia tried to sue Samsung and Qualcomm when the iPhone came out
> (because Apple used GPUs from these companies), claiming patent
> infringement.

Apple has used PowerVR in the iPhone since day one. Even the recent
so-called self-designed GPUs are pretty much 80% PowerVR, with all the
PowerVR proprietary and patented features.

------
blackhaz
I'll probably be downvoted, and sorry for the off-topic, but I miss those
days when you could approximate computing power by looking at the
processor's model number: 386SX-33, 386DX-40, 486DX2-66, and so on. Right
now it's all becoming gibberish to me: 1075, 4800U, 3950, 6600U. I don't
even remember what CPU I have. All the romance is gone!

~~~
blihp
I suspect that's been deliberate, at least on Intel's part, since their
year-over-year changes haven't been compelling for the better part of a
decade. Their model numbers signal 'new' without really quantifying
'improved', since that hasn't been a very good story for a while now.

~~~
ajross
That's just fundamental. In the golden age of VLSI scaling, from 1985 to
2003[1] or so, we really were seeing that doubling of transistor density with
every 1.5-2 year generation, and the shrinking transistors were getting faster
more or less linearly with size, leading to a quadratic improvement over time.
Those days are almost two decades in the past, and they aren't coming back,
ever.
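
Back-of-the-envelope, under idealized Dennard scaling (the classic ~0.7x
linear shrink per node is a rule of thumb, not a measured figure):

    #include <stdio.h>

    int main(void) {
        double shrink  = 0.7;                      /* linear dimension per node */
        double density = 1.0 / (shrink * shrink);  /* ~2.0x transistors per area */
        double speed   = 1.0 / shrink;             /* ~1.4x switching speed */
        printf("density %.1fx, speed %.1fx, per-die throughput ~%.1fx\n",
               density, speed, density * speed);   /* ~2.9x per generation */
        return 0;
    }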

We happen to be seeing a bump right now due to AMD migrating across two
process nodes at once (and Intel having fallen a little behind), but that's
just an instantaneous thing. The days of the 486 are still gone.

[1] My own dates. I'm bounding this roughly by the point where EDA tools
caught up with process improvements (allowing straightforward die shrinks and
scaling of logic) and the end of "free" frequency scaling due to thermal and
power limits. Someone else probably has a different definition. Fight me.

------
AtlasBarfed
> ...also handily nosed past Intel's most recent full-on gaming CPU, the
> i7-9700K, on both content creation and physics engine benchmarks, despite
> being a mobile form factor with under half the TDP.

...wow

------
awill
It's super annoying that AMD's 3000 desktop line is 7nm Zen 2, while their
3000 laptop line is 12nm Zen+.

It's confusing and dishonest. Multiple times I've jumped at almost getting a
Ryzen 3000 laptop, only to remember that it's not really Ryzen 3000.

~~~
jlei523
It's only confusing to a small minority of buyers.

Most people who buy these laptops know nothing about Zen 2, Zen 1, or Zen+.

------
cabaalis
> Intel focused on AI acceleration—but AMD went unapologetically hard on
> gaming.

I don't think you can go wrong focusing on gaming. In my opinion, that's how
Microsoft succeeded so well: the network effects push all of computing
forward.

~~~
Havoc
Stadia etc. could still ruin it for AMD. You no longer need local power to
game.

~~~
kllrnohj
Stadia doesn't run on fairy dust. It still uses CPUs & GPUs, and the more
people use Stadia the more it will need. And since Stadia uses AMD GPUs, I
don't think AMD would be that sad about selling more super-high-margin
enterprise GPUs to Google.

And of course you still need a local client to play it, which might as well
be an AMD-powered ultrabook. At which point Stadia just resulted in AMD
selling up to 3 products instead of just 1 (GPU & CPU in the server + APU in
the laptop).

------
heyflyguy
The article mentions it, but I remain consistently frustrated by my marriage
to NVIDIA at this point. As a heavy user of image processing, image
mosaicking, TensorFlow, et al., I should invest in NVIDIA stock.

~~~
gowld
Warren Buffett recommended investing in the companies who make products you
can't not use.

~~~
wolfgke
> Warren Buffett recommended investing in the companies who make products
> you can't not use.

Let's put it this way: My taste for products that I can't not use is rather
different from the majority of society.

~~~
WillPostForFood
Warren Buffett's recommendation may be particularly relevant to himself
because of his aw-shucks Main Street tastes.

------
agumonkey
Side effect of these: SFF, NUC, fanless desktops too.

------
ChuckNorris89
I put off upgrading my 3-year-old laptop last year, waiting for the 7nm Zen
and 7nm Nvidia parts. So far it looks like it was worth the wait. Now it's
Nvidia's turn.

~~~
XCSme
But next year even newer/better CPUs and GPUs will appear, right? Why not
wait until next year?

~~~
pdimitar
Valid question. My answer would be: because even though AMD made things quite
interesting during 2019, I don't foresee them being able to go much further
than what they did now.

Example: PCIe 4.0 has _insane_ bandwidth allowances. I don't see any SSDs
_ever_ going beyond PCIe 4.0 bandwidth. Other iterative improvements, like a
few dozen MHz more in CPUs, a few dozen more cores on the GPU, or a few
hundred MHz more in RAM, can be accommodated without their potential being
wasted on the new chipsets -- although I am not an expert, it does seem that
way after several reads of their parameters.
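
For scale, a back-of-the-envelope ceiling for an x4 NVMe link (both
generations use 128b/130b line coding; this ignores protocol overhead):

    #include <stdio.h>

    int main(void) {
        /* raw GT/s per lane -> GB/s per lane after 128b/130b line coding */
        double gen3_lane = 8.0  * (128.0 / 130.0) / 8.0;  /* ~0.985 GB/s */
        double gen4_lane = 16.0 * (128.0 / 130.0) / 8.0;  /* ~1.969 GB/s */
        printf("PCIe 3.0 x4: ~%.2f GB/s\n", 4 * gen3_lane);  /* ~3.94 */
        printf("PCIe 4.0 x4: ~%.2f GB/s\n", 4 * gen4_lane);  /* ~7.88 */
        return 0;
    }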

And even if next-gen stuff with bigger improvements is coming, I don't feel we
can go much higher before these things hit big diminishing returns in
noticeable performance in all but very specialised programs. Currently I am
using a workstation with a 10-core 4.0GHz CPU / 64GB 2666MHz DDR4 ECC / NVMe
SSD
at ~2.8 GB/s. I seriously cannot make _any_ programming software even utilise
that SSD to full capacity -- only Postgres manages to saturate it to 50% of
its potential when doing `pg_restore`.

Granted I haven't run any deep learning stuff and I don't intend to. I am
focusing on everyday and professional-but-not-strongly-specialised software.
And there, I feel, stopping at motherboards with PCIe 4.0 and their
appropriate AMD CPUs and uber-fast SSDs and the fastest RAM you can find
today... will be more than enough for like 5 years.

All IMO of course.

~~~
ksec
There is another data point.

Zen 4 will be PCIe 5.0 and DDR5. Unless Zen 4 is a gigantic leap forward in
performance and not just an iteration, the cost of jumping to those
technologies will be quite high.

So Zen 3, PCIe 4.0, and DDR4 will be the sweet spot for desktop for a while.

(I do wish I were wrong and they drive down the cost of PCIe 5.0 and DDR5
faster. But history has shown those tend to take 2-3 years to pick up the
pace.)

~~~
pdimitar
I have the same feeling. A lot of people will stay on Zen 3 / PCIe 4.0 / DDR4
for years and PCIe 5.0 will be for min-maxers.

Hell, even the current high-end PCIe 3.0 setups like mine seem to be quite
future-proof.

~~~
ksec
Well, of course, that is from a PC perspective, and I am stuck on Mac...

~~~
pdimitar
I'm "stuck" on an iMac Pro. :D

It's an amazing machine in every way.

------
mikece
Sorry if this is off-topic but how would this chip compare to an ARM design
that could handle 16 threads? Would thermal dissipation, power consumption,
and performance be roughly the same or would they be significantly different?
I want to be excited about this chip but thought that this was supposed to be
the year we all get ARM-powered laptops/Chromebooks.

~~~
toast0
There are several popular ARM cores around these days. You would need to
pick a specific processor family, at least.

That said, I have seen plenty of ARM-powered Chromebooks and I don't think
any of them had more than four cores or were known for high performance.
It's not impossible to make a high-performance ARM Chromebook, but it's not
the market people are building for. High-performance x86 Chromebooks happen
because once you've built for a dual-core mainstream x86 laptop processor,
you can also easily solder in a higher-performance processor with the same
footprint.

------
dghughes
AMD should market it as "7en".

~~~
kijin
That looks disturbingly reminiscent of a certain David Fincher movie.

~~~
blinkingled
Could signify an end to INTC's most recent 7 MDS sins.

