
AMD Ryzen 4000 Mobile APUs - neogodless
https://www.anandtech.com/show/15324/amd-ryzen-4000-mobile-apus-7nm-8core-on-both-15w-and-45w-coming-q1
======
m0zg
And laptop manufacturers will now proceed to stuff these into bargain-bin
laptops with shitty 1080p displays. To this day, desktop/workstation Ryzen CPUs
are not treated as a premium product in spite of their superior performance.

I wish Apple would adopt this at least for some products. It'd give AMD more
credibility, and Apple would be able to negotiate pricing more easily with
Intel.

~~~
me551ah
I'm curious about how many people actually have laptops with resolutions
higher than 1080p and use them without display scaling.

People buy a high resolution laptop and turn scaling up to 200%, effectively
nullifying the benefits of a high resolution display.

~~~
toyg
I'm sorry but you don't know what you're talking about.

Using a hi-dpi display at native res on a 15'' screen is a recipe to kill your
eyes, whereas scaling to 200% gives you butter-smooth text that relaxes them.
_That_ is the benefit.

When I was young, I was all for keeping text small and stuffing as much info
on the screen as I could. Nowadays, I just struggle to work on any native-res
screen: unscaled low-dpi looks like crap, unscaled hi-dpi is tiring. My world
changed with the 2012 retina MBP and I simply cannot go back.

~~~
teekert
There is scaling that just maps every logical pixel to a 2x2 pixel area on the
screen, but there is also scaling that increases the effective resolution. I
think me551ah was talking about the former kind, which also seems useless to me
(just buy a 1080p display). The second kind, though, makes everything nicer.

~~~
deno
Fractional scaling is stupid. 150% × 4K gives you the workspace of 1440p, so
just buy a 1440p screen; 150% on 4K means a misaligned pixel grid. If you need
more real estate _and_ want HiDPI, then you need a 5K display, not 4K. 5K is
1440p@2x.

Fractional scaling should be hidden behind seven “Are you sure? What you’re
doing is stupid.” pop-ups. Instead, Windows 10 actually makes 150% the default.
Boggles my mind.
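The grid mismatch is just arithmetic. A quick sketch in plain Python (a toy illustration of the mapping, not any OS's actual compositor logic):

```python
def logical_resolution(phys_w, phys_h, scale):
    """Workspace you get when the OS scales the UI by `scale` on a physical panel."""
    return phys_w / scale, phys_h / scale

# 200% on a 4K panel: each logical pixel is an exact 2x2 block of
# physical pixels, so everything lands on pixel boundaries.
print(logical_resolution(3840, 2160, 2.0))  # (1920.0, 1080.0)

# 150% on the same panel gives the workspace of a native 1440p screen...
print(logical_resolution(3840, 2160, 1.5))  # (2560.0, 1440.0)

# ...but each logical pixel spans 1.5 physical pixels: logical coordinates
# map to fractional physical positions, so the grids only align every
# 2 logical (3 physical) pixels and everything in between must be resampled.
print([x * 1.5 for x in range(4)])  # [0.0, 1.5, 3.0, 4.5]
```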

Here’s a good article series on this topic by Elementary OS dev:

[https://medium.com/elementaryos/what-is-hidpi-and-why-
does-i...](https://medium.com/elementaryos/what-is-hidpi-and-why-does-it-
matter-b024eabea20d)

[https://medium.com/elementaryos/top-3-misconceptions-
about-h...](https://medium.com/elementaryos/top-3-misconceptions-about-
hidpi-f5ef493d7bf8)

------
e12e
Well, I hope this works out for AMD. Currently they apparently can't compete
on power efficiency, at least judging by one of the more interesting reviews
from 2019:

"The Microsoft Surface Laptop 3 Showdown: AMD's Ryzen Picasso vs. Intel's Ice
Lake"

[https://www.anandtech.com/show/15213/the-microsoft-
surface-l...](https://www.anandtech.com/show/15213/the-microsoft-surface-
laptop-3-showdown-amd-picasso-vs-intel-ice-lake)

Ed: looks like we'll see more intel/amd head-to-head designs, eg:
[https://www.anandtech.com/show/15305/acer-swift-3-either-
wit...](https://www.anandtech.com/show/15305/acer-swift-3-either-with-
core-i71065g7-or-ryzen-7-4700u-the-laptop-market-just-blew-wide-open)

~~~
basilgohar
Not that I want to apologize for poor performance, but I remember feeling let
down by that review because some obvious differences between the platforms
that should have fed into the conclusions were not highlighted, not the least
of which was the drastically different memory used between the two platforms.

The article was billed as "let's see the difference between AMD and Intel" but
there were significant platform differences that made it not quite apples-to-
apples.

------
MikusR
Any idea if these support full HW acceleration of VP9 (Youtube)?

~~~
wmf
The previous gen has it, so yes.

~~~
The_rationalist
And about AV1?

~~~
tambre
Very unlikely. They'd be the first to ship consumer PC parts with AV1 decode
support. But the next generation almost certainly will. Same goes for other
vendors, for Nvidia post-Ampere, for Intel post-Icelake/Tigerlake/whatever the
next is nowadays, etc.

Otherwise, there are AV1 decode IPs available, including a SoC or two. Plus
some very recently announced set-top boxes and TVs. So you'll definitely be
seeing some hardware with AV1 support shipping this year.

~~~
pkulak
Can't wait. I'm really hoping AV1 is the be-all end-all and we can all stop
moving to new codecs.

~~~
The_rationalist
AV1 is inferior to h265, and the successor to h265 should come soon, rekt
AV1, and no longer have big royalty issues.

BTW, the hardware cost of not using h265 far outweighs the royalty cost. It
has always been pure economic nonsense for Google (YouTube) to exclude h265.

~~~
pkulak
Could you elaborate on the weaknesses of AV1? I hadn't heard that. Is it
because of the wavelet-based i-frames that h265 uses? Or a bunch of little
things?

~~~
ksec
I think he _may_ be referring to AV1's economic weakness, not the actual
quality itself. As in the cost to bring quality to this level while
disregarding encoder and decoder complexity. Kostya has a rant about it here
[1]. He was the guy who made RealVideo work across all platforms.

Personally I am giving the industry the benefit of the doubt and one last
chance on VVC / H.266.

[1] [https://codecs.multimedia.cx/2018/12/why-i-am-sceptical-
abou...](https://codecs.multimedia.cx/2018/12/why-i-am-sceptical-about-av1/)

~~~
tambre
> As in the cost to bring quality to this level while disregarding encoder and
> decoder complexity.

AV1 is amazing for YouTube, Netflix and torrent-scale videos currently.

I can't find the reference for this, but early on the codec developers and the
big companies had meetings. The big companies' answer for how much of an
encode-time increase would be acceptable was 100x to 1000x over the
then-current standard. Hence the design.

Of course the encode-time problem will be solved for regular users too once
AV1 encode ASICs for consumer hardware enter the market in 3–5 years. There
are already a few solutions offering cloud FPGA encoding alongside beefy
servers. If streaming bandwidth costs are a significant issue for you, then
you can easily afford that.

------
therealmarv
We need NUCs from AMD !

~~~
qilo
The DeskMini A300 by ASRock. Slightly larger than a NUC, but it can house a
full-power 65W desktop CPU.

~~~
therealmarv
This one uses desktop CPUs (nothing against them) and unfortunately not the
newest Ryzen versions.

~~~
magicalhippo
It supports the 3400G, which is the most recent with integrated graphics.

[https://www.asrock.com/nettop/AMD/DeskMini%20A300%20Series/#...](https://www.asrock.com/nettop/AMD/DeskMini%20A300%20Series/#CPU)

~~~
therealmarv
Ah, thanks for that information. I'm not super familiar with AMD's product
line and somehow thought that all Zen 2 parts have integrated graphics. So I
guess we can expect some more powerful G-line APUs in the future too...

~~~
magicalhippo
Yeah, hoping to see something like that soon; it will be pretty awesome, I think.

------
darksaints
Anybody know of any companies planning to ship these in a NUC-style form
factor?

~~~
alimbada
ASRock did the DeskMini A300 last year which was quite popular. I'm hoping
they will refresh it with an updated motherboard for the new APUs coming this
year.

~~~
llampx
It was the only one, I believe. I'm still on the lookout for a good mini PC
solution that doesn't have compromises like a soldered CPU or requiring
SODIMMs, etc.

------
est
I'm looking forward to building a mini HTPC with this. Hope it can handle 8K HDR
encode/decode well.

------
rurban
I got the previous 3000-series Ryzen in my new cheap Lenovo, and it kills all
my big Intel machines in all benchmarks. I cannot use it for benchmarking, as
it drops frequencies from 4.3 to 1.5 GHz as it likes (or does temporary
freezes), but for testing and dev the AMD works wonders.

------
jwildeboer
Since when are CPUs called APU?

~~~
bob1029
APU = CPU + GPU in a single package or die.

------
Out_of_Characte
These APUs won't fix today's problems in laptops. Idle power consumption is
spread across all the components together, and bigger batteries just cost
more. I'm still waiting for more efficient SSDs. There's no point in an
efficient SoC if you still need a large drive and a bright display.

And of note: a 1.8 GHz base frequency is on the low end of the
performance/watt curve we've seen from their other products. Maybe AMD expects
most workloads not to use all 8 cores properly and lets the boost algorithm
max the cores out?

Also, where's PCIe 4? My guess is they're waiting a cycle on purpose due to
power constraints.

~~~
kllrnohj
> And of note; an 1800 base freq is on the low end of the performance/watt
> curve we've seen from their other products. Maybe AMD expects most workloads
> to not use all 8 cores properly and let the boost algo max the cores out?

The 1.8GHz base freq is just to hit the desired 15W TDP. The approach is: pick
a TDP, say 15W, then adjust the base frequency for the core count to hit it.
That's why the 8C ends up at a 1.8GHz base while the lower-end 4C has a 2.9GHz
base. Then let turbo be the thing everyone actually uses on a daily basis,
because base frequency is irrelevant; it's not actually an input into anything
the CPU does, for either AMD or Intel. The CPU monitors its power draw to stay
within a power budget and a temperature budget. What speed it ends up running
at then depends not just on how many cores are in use but also on what type of
instructions they're running. It's a fully dynamic system these days, making
single-number specs useless.
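That budgeting can be sketched with a toy model (the cubic power law and the constant `k` are rough illustration values I made up, not AMD's actual tuning; real parts are binned on measured voltage/frequency curves):

```python
# Toy model: per-core power grows roughly with f^3 (dynamic power ~ f * V^2,
# and voltage rises roughly linearly with frequency). k is an assumed
# constant chosen so 8 cores near 1.8 GHz land around a 15 W budget.
def base_freq_for_tdp(cores, tdp_w, k=0.32):
    """Highest sustained all-core frequency (GHz) that fits the power budget."""
    return (tdp_w / (cores * k)) ** (1 / 3)

# Same 15 W TDP, different core counts: fewer cores leave headroom
# for a higher guaranteed base clock, as with the 8C vs 4C parts.
print(round(base_freq_for_tdp(8, 15), 2))  # ~1.8
print(round(base_freq_for_tdp(4, 15), 2))  # ~2.27
```

Turbo then works the other way around: with a fixed budget, the chip opportunistically raises frequency whenever measured power and temperature allow.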

~~~
jotm
Man, I hope the TDP is adjustable, as well as the voltage, like many Intel
chips. Would be great to see what these chips can do.

~~~
kllrnohj
Given this is a laptop it doesn't really make sense to adjust the TDP even if
you can. You're going to be limited by the laptop's cooling solution, which is
going to be at best sufficient for factory settings but nothing more. More
commonly it's actually not _quite_ sufficient for factory settings, leading to
thermal throttling over sustained loads.

~~~
jotm
Oh, I am well aware. Laptop coolers are barely good enough, without exception.
It's fascinating, really: it's like they want their systems to run close to
throttling temperatures so they last just a tad longer than their warranty.

Doesn't bother me. I've got 2 hands and 2 eyes; anything can be modified with
those. Except encrypted BIOSes, for now :D

