AMD’s Entire Ryzen 9 3950X 16 Core CPU Inventory Sold Out in Japan (wccftech.com)
525 points by ekoutanov 5 days ago | 294 comments





I think there was a link last Friday showing Ryzen holding about 6 of the top-seller spots on Amazon.

https://www.amazon.com/Best-Sellers-Computers-Accessories-Co...

It looks like today the top 10 are all Ryzen. This is crazy.


Rumour is a corporate client wasn't given access to as many units as they wanted, so decided to buy retail.

Still considerably cheaper than Intel...

The fact that Intel can sell their CPUs at a higher price than AMD means that supply is the bottleneck.

Intel has been having supply issues all year, to the point that it has had, and will continue to have, an effect on their bottom line.

https://www.techspot.com/amp/news/82415-over-year-later-inte...

https://www.cnbc.com/2019/11/27/intel-chip-shortage-pulls-do...


That is the problem: AMD is not taking full advantage of Intel's misstep. In that regard, I think AMD's sales and marketing department is nowhere near Intel's.

I think it has to do with AMD being way too conservative with their estimates and not booking enough 7nm capacity.

Now we know AWS is using that 7nm capacity as well, and given Amazon has shown they intend to have all their SaaS running on their own ARM CPUs, I think their orders are going to be big.

Edit: One thing I forgot: it could also be GF's problem, where they can't keep up with the I/O die.


As I understand it, Intel significantly cut their prices due to third-generation Ryzen.

They cut prices, but that was not what I was saying.

They don't have to undercut AMD to sell CPUs if they can deliver more volume. They may cut prices by a total of $3 billion, but they start from such a high profit level ($21 billion) that they can outmuscle AMD financially and stay profitable.


You can only play financial trickery with obsoleted hardware for so long, and Intel's entire product stack is dead in the water except for low-cost consumer-class CPUs with the lowest realtime latencies in the world (and those are mostly used for gaming).

At no price point are they competitive with AMD's offerings, and they have kind of cursed themselves with their legacy of inflated, unchanging prices: they would have to tear apart a decade of profit margins and consumer confidence to discount their product stack into staying relevant. C-levels at Intel must still be dissatisfied with this outcome.

> They don't have to undercut AMD to sell CPUs if they can deliver more volume.

Intel is having supply issues, so they have a volume issue as well. https://www.techspot.com/news/82415-over-year-later-intel-cp...


But who isn't going to wait for their Ryzen when Intel still costs more and is riddled with performance issues?

Everyone who wants to sell product with processors. Data center upgrades etc.

The retail market is insignificantly small. AMD can win it, but it does not matter in the Intel vs. AMD fight.


>Everyone who wants to sell product with processors. Data center upgrades etc.

Not when Speculative Execution mitigations cut performance literally in half in some cases.
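On Linux you can check which of those mitigations the kernel has actually applied before benchmarking; a quick sketch using the standard sysfs interface (available on kernels 4.15 and newer):

```shell
# Each file names one vulnerability class (spectre_v1, spectre_v2, mds,
# l1tf, ...) and reports whether a mitigation is currently active.
grep . /sys/devices/system/cpu/vulnerabilities/*
```

Comparing a workload's throughput with and without `mitigations=off` on the kernel command line (recent kernels) shows the actual cost for that specific workload.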


Retail market is where the developers are. Once you win developer mindshare, it turns into a trickle-up exercise and before you know it a single developer workstation turns into a datacenter.

Citation needed.

There are only two companies capable of fabricating the most advanced processor technologies: TSMC and Intel. The disaster of 10nm at Intel has meant that all of its product lineup has had to compete internally for 14nm fab space, greatly constraining supply there. Meanwhile, Global Foundries' decision to abandon chasing smaller nodes means that fabrication for practically everybody else has to compete for space at TSMC, so smartphones, GPUs, CPUs, TPUs, etc. are all vying for the same fabrication capacity. Furthermore, it is extremely expensive (roughly $10 billion) to build a new fab, so there are few companies that could even afford to throw money at TSMC to expand production.

Put more succinctly: there is literally not enough semiconductor fabrication supply today to meet demand, and the supply is very inelastic.


You forget Samsung.

Yes, but Samsung’s latest nodes have faced delays, so they are slightly behind TSMC. I believe NVIDIA was hurt by this.

And yet they still are ahead of Intel.

I believe that AMD relies on another company to produce its CPUs, which may limit their supply. If they can't produce enough and there is still demand, then there will be no choice but to go with Intel. This is some guessing on my part, since I don't have any economics background.

I don't think it is purely a TSMC supply issue. To put it bluntly: AMD isn't used to selling this many CPUs. Zen 1/+ brought them parity with Intel performance-wise (something AMD had been lacking for over a decade). It's only with Zen 2 / the 3000 series that AMD has had both core count and single-threaded performance. Before, you had to choose between lots of slower cores or fewer, faster cores.

I do wonder if the 3900X/3950X shortage is at least partially manufactured (they're so hard to get, they must be good, right?). Intel halved their prices pre-launch, which is something that has never happened before, but they still can't match AMD on price/performance.

AMD has both good CPUs and GPUs (something that is very rare!). Being an owner of an RX 5700, I'm surprised how low the power usage is. If they put a cut-down RX 5000 on a low-power CPU, they could make some decent laptops that don't need dedicated GPUs (memory speed is a bottleneck, especially on a laptop with a single memory channel). That could pose a problem for Intel. I'm actually a little surprised it hasn't been more of a priority for AMD; all Intel can do to compete right now is literally give their CPUs/APUs away (better to make a loss than lose a customer to AMD). Next year will be a good time to buy a laptop. It looks as though Zen 3 will launch first on laptops, and that may pose a problem for Intel...


Honestly, I feel that Intel could have the better laptop chips if they get 10nm in a better state. And I say this after building a 3600X +RX 5700 XT desktop this year.

Intel's R&D budget is several times larger than AMD's revenue, so they'll always be able to squeeze a little more out of their current chips.

Could just be a timing issue though. Intel might abandon their current 10nm leaving AMD as the default winner by mid/late 2020.

Disclaimer: I'm long on AMD and wishing I bought more 5 years ago.


> Intel's R&D budget is several times larger than AMD's revenue so they'll always be able to squeeze a little more out of their current chips.

But Intel has to compete with TSMC. Since TSMC is making Apple, AMD, NVidia, and Qualcomm chips, TSMC's R&D budget seems to have won out this generation.


Agree.

TSMC amortizes its R&D/CapEx/operating costs over Apple, Qualcomm, NVidia, Broadcom, Xilinx, etc.

Apple and Qualcomm paid for most of the heavy 7nm R&D/CapEx long before AMD's 7nm production commitments. AMD basically got the 7nm production advantage without the risk of fab investments that Intel carries. It was good that AMD's board had the foresight to divest its fabs into GlobalFoundries a few years back.

Intel has to amortize its fab costs over Altera and x86 CPUs. The Altera part is probably trivial. Intel's huge 10nm R&D and CapEx investment depends entirely on one x86 product line to recover. When they executed well, as in the 15 years before 2017, it was a big advantage. With any execution miss, the fab R&D/CapEx cost becomes a liability.

"Only the Paranoid Survive." Over the last few years, Intel was definitely not paranoid enough.


The Radeon Pro 5000 cards on the new 16" MBPs seem to have gotten some good reviews.

Is it still TSMC?

I do remember hosting on consumer CPUs was a thing a decade ago, but it then declined rapidly after n-core server chips took over because they provided better per-core pricing.

Now I believe things may go the other way around, as consumer chips are gaining a lot in core count.


More so considering Ryzen supports ECC. https://www.hetzner.com/dedicated-rootserver/ax51-nvme
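If you rent such a box, one way to sanity-check that ECC is actually active (rather than the DIMMs merely supporting it) is the Linux EDAC subsystem; a sketch, assuming the appropriate EDAC driver for the platform is loaded:

```shell
# EDAC exposes one mcN directory per memory controller that has ECC
# reporting enabled; corrected-error counts live in ce_count.
if ls /sys/devices/system/edac/mc/mc* >/dev/null 2>&1; then
    echo "ECC reporting active; corrected error counts:"
    grep . /sys/devices/system/edac/mc/mc*/ce_count
else
    echo "no EDAC memory controllers found (ECC off, unsupported, or driver not loaded)"
fi
```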

that's interesting :)

The DIY retail market for processors is extremely tiny. Very high margin, but insignificant.

AMD's challenge is getting enough processors out into the enterprise market. They need to grow their market share now, while they have the technical advantage. Intel can sell their inferior CPUs at a higher price because they can deliver large volumes and satisfy demand.

Revenue comparison 2018:

    Intel  $70.8 billion
    AMD     $6.5 billion

AMD is fabless, so they are in competition with other GF and TSMC customers for manufacturing capacity: Apple, Nvidia, Qualcomm, and many others.

Indeed. I'm trying to find a suitable vendor here in Germany (as a small software business), but all I can find when I look for Ryzen-powered PCs are small local dealers, usually with very mixed reviews and questionable support. I always got Dell, and while I would not want to rely on their support for any software issues, at least whenever we had trouble with hardware I could get it replaced easily and quickly. Those smaller dealers don't even have decent bundled support options, and no guarantees similar to what I can get from the "big guys". The reviews accordingly show a lot of variation and randomness: basically, all those who have no issues with their purchase give five stars, but those needing support, even for issues clearly caused by the vendor, have a good chance of ending up with a nightmare support story.

So while I still want a Ryzen PC, I also want five years of 24-hour hardware support, and I feel more confident getting that from Dell rather than from some small local dealer, for whom any support issue is a big deal for their own bottom line and whose employees are under more pressure than those working for a big vendor. The big ones are still "Intel only", apart from >$2,000 gaming PCs. I usually buy ca. $1,000 PCs for developers: minimal graphics card, video playback only (hardware-accelerated decoding is nice to have, but too much card costs too much energy for no reason, and even idle power draw is much higher; a GTX 1050 Ti running on motherboard power alone is perfect, as an example), no gaming ever, but good components and large RAM and SSD.

PS: Any Germans here who found a solution for needs such as mine?


Depending on how many you need, you should consider building them yourself. I just ordered a new Zen2-based build from alternate.de in parts and I think that's the way to go.

That's far too much detail for me. When I look at the "PC configurator" I feel like I'm drowning in details. I would need a week to research all those components. CPU is easy (Ryzen 5 3600 seems appropriate), but next comes the motherboard - and I'm already lost. Never mind the 100 different cases and the list goes on and on. I'm also not so sure that whatever components I select really work flawlessly together.

I run a PC recommender that is basically made for you: parts that fit together, with an optimizing algorithm selecting an optimal configuration for your price point. I changed this configuration a bit manually given your statements: https://www.pc-kombo.com/share/mX6w2675

This will fit together and is stronger than with a GTX 1050 Ti. If you really want a GPU powered only by the motherboard, step down to a GTX 1050, or look at the (rather overpriced) GTX 1650 (though it depends on the model there). My mail is in the profile if you have questions.


Damn... I looked for a PC configurator for several hours two weeks ago and eventually settled for a 3800X. Your site managed to get me a 3900X in my 1200 eur budget (with 64 GB of RAM). Oh well, I will buy another PC in the next six months. Definitely bookmarked!

  Internal Server Error

Maybe you hit it during the upgrade there. Works now, as far as I can see.

I have put together many PCs myself, and honestly many parts aren't that important.

Power supply, motherboard, SSD and RAM will "just work", given you buy a recent product from a known brand. The +/- 5% of whatever will not make a difference to most people.

For the mobo, think of the features you must have (built-in wireless?), and that will direct you.


Oh please don't do built-in wifi. It's ok to have the adapter inside, but you should put the antennas outside. Include them in the design, if you wish.

https://www.heise.de/ct/artikel/Der-optimale-PC-2020-4564302...

To clarify, these are vetted parts lists from a leading German computer magazine, complete with optional upgrades, videos to aid with the build process, and even detailed BIOS settings.


"AMD is fabless"

Which also means they can jump to whatever the current hot stuff is in the fab world, letting the big fabs (which are few in number, but still >1) compete on technical merits.

Intel, on the other hand, has a corporate need to use whatever fabs they've invested in, and to be anchored by whatever the limitations of their fabs are. They aren't strictly limited by that (they could just get TSMC to make some stuff for them), but they're Intel, so they'll just make 10+++++++


Don't you mean 14++++++? They still can barely get 10nm working in laptops, much less desktop or servers. By the time they do, AMD will be far ahead of them process-wise.

They don't jump. They usually make long-term deals for each process and design their microarchitecture for that process. AMD moved all 7nm CPU and GPU production to TSMC and is tied to them for the duration of 7nm.

The question is whether they can get all the capacity they need at a price they can afford. It seems like high-margin customers like Apple and Nvidia are always served first because they can pay more.


Intel, Samsung and TSMC are the only three companies that are committed to EUV. Intel is out, which leaves Samsung and TSMC. So I guess the >1 = 2.

The I/O die is rumored to be GlobalFoundries.

AMD's Ryzen 9 3900X and 3950X are a chiplet design: 2x TSMC 7nm dies tied to 1x GloFo 14nm I/O die. See the picture: https://images.anandtech.com/doci/14605/Ryzen9_3800X_Hand_57...

In effect, AMD spends very little money and die-area on expensive 7nm process, while leveraging their GloFo 14nm contracts to cheaply make I/O and memory controllers.

----------

The 7nm chiplet is a single design: mass produced for EPYC, Threadripper, and Ryzen. The 14nm I/O die is what differentiates between EPYC, Threadripper, and Ryzen.

The I/O die can have 2x memory controllers (Ryzen), 4x memory controllers (Threadripper), or 8x memory controllers (EPYC)... supporting 2x dies, 4x dies, or 8x dies as appropriate.


> The I/O die is rumored to be GlobalFoundries

It's not really "rumored" so much as it actually is. The I/O dies are built at GlobalFoundries. Epyc's larger I/O die is on GF's 14nm, and the consumer Ryzen I/O dies are on GF's 12nm. https://www.anandtech.com/show/14525/amd-zen-2-microarchitec...


> AMD is fabless, so they are in competition with other GF and TSMC customers for manufacturing capacity.

That also means that they don't have all the fab expenses that Intel does.


The fact that Intel sells an inferior product at a higher price means that AMD has a supply bottleneck.

I'm almost certain that AMD takes every sale from Intel when they can deliver. They have the superior product but not enough capacity to reap all the benefits of their success.


> The fact that Intel sells an inferior product at a higher price means that AMD has a supply bottleneck.

Only if Intel is also _perceived_ as the inferior product. Remember that the value to the customer goes beyond raw benchmarking specs; it also includes brand reputation and marketing.

At the level of truly huge purchases, "value" also encompasses a real bilateral relationship between the company and the vendor. A cloud provider might be tempted to stay with Intel over AMD now if by doing so they preserve guaranteed access or preferential pricing on the next set of server chips. To crack that hypothetical relationship, AMD would have to both dominate now and be perceived to dominate for the medium-term future.

It's only the retail market that can be so fickle as to consider a CPU a one-off purchase.


Enterprise customers don't care about brands. Retail customers are insignificant.

> Enterprise customers don't care about brands.

Sure they do. That's where the saying "nobody ever got fired for buying IBM" comes from. If anything enterprise is more brand sensitive and less price sensitive than retail customers.

Not to mention there is an entire ecosystem surrounding the CPU parts themselves: the chipsets, boards, laptop availability. And there is vendor lock-in to the particular out-of-band management technology. Getting better benchmark scores and saving a few hundred per CPU is only part of a much larger matrix for making a purchasing decision.


Enterprise customers may care about brands at the level of Dell or Apple.

Data center customers care about specs and prices only. They study the products they buy.


Due to the need to reduce complexity and have fewer vendors to deal with, most companies use a limited number of suppliers for datacenter equipment as well, like Dell or HPE or whatever.

Data center customers care about support, vendor relationships, keeping complexity down, politics, reliability (= aversion to new products), lots of things. If anything the custom PC market is more focused around specs and prices solely.


Yes they absolutely do, and AMD vs Intel is more than a matter of branding; it's fundamentally switching vendors. If something goes wrong, it's on the head of the person who pushed for the change. There are also some things you can't do seamlessly between two servers that don't have processors from the same brand.

Enterprise if anything has more inertia than say the custom PC market which will turn on a dime to chase the best deals.


> The fact that Intel sells an inferior product at a higher price means that AMD has a supply bottleneck

Except they don't? There's a reason Intel's flagship HEDT part, the i9-10980XE, is half the price of the part it's replacing despite almost nothing changing about the chip itself. And at the same time AMD is raising the price of its HEDT parts.

And it's not just the HEDT parts: Intel also cut prices on a bunch of Xeon chips a month or two ago, and on their consumer stack as well.


> The fact that Intel sells an inferior product at a higher price means that AMD has a supply bottleneck.

...or that Intel controls supply to retail vendors by other means. If a vendor is bound to keep putting Intel products on their stores to comply with supply agreements made in the past, for example.


I hope AMD doesn't slow down on the performance side with their 3rd generation of Zen.

I want to get a new laptop next year and I hope it comes with a good mobile Ryzen processor with Thunderbolt support.


AMD's mobile and APU line is generally about a year behind desktop, so expect to see Zen 2 cores in mobile in mid-2020, numbered as Ryzen 4xxx.

Zen 2 combined with an integrated GPU that is much better than Intel's is likely going to make a very nice dent in market share, unless Intel pulls a rabbit out of their hat.


There are no rabbits in the hat until late 2021, when 7nm comes to market. The 10nm process is completely broken. Watch how the majority of the roadmap is still on 14nm into 2021; there will never be a desktop 10nm, for example. The Rocket Lake desktop chips in 2021, still on 14nm, will drop down to eight cores to cram the new GPU architecture into desktop chips at a truly ridiculous 125W. The Ryzen 9 3900 today does 65W with 12 cores, and you can easily squeeze a 1650 into the remaining 60W.

> There are no rabbits in the hat until late 2021 when 7nm comes to market. The 10nm process is completely broken.

10nm products have "shipped" and put up a solid fight: https://www.anandtech.com/show/15092/the-dell-xps-13-7390-2i... It shows good performance and strong efficiency.

10nm desktop & server parts seem entirely MIA and dead. But it seems that Intel is able to at least struggle out small laptop parts on the process.


Now, TDP doesn't really reflect power draw and can't be compared well across brands. But a 2x difference in TDP is a lot more than the difference in how AMD and Intel calculate it.

Especially since Intel normally has a bigger spread between declared TDP and actual TDP.

> There are no rabbits in the hat until late 2021 when 7nm comes to market. The 10nm process is completely broken. Watch how the majority of the roadmap is still on 14nm into 2021, there never will be a desktop 10nm for example.

From what I heard from someone close to the Intel process crowd, Intel will simply slap a 7nm marketing designation on their current 10nm process when they finally get it going.

They will then change design rules to deliver enough density change, comparable to a node shrink without any change in the process.


Design rules don't change because marketing says so. It is reasonable to begin early processor development in a new process with conservative design rules (in order to be sure that everything works and yields are acceptable) and rework some components, pushing the envelope a little more, if tests allow it; but such improvements are going to be small "without any change in the process". Maybe the same specifications on a slightly smaller die to reduce costs or a slightly higher clock or lower power SKU. Or nothing at all because the tooling costs for marginally improved revised processors aren't justified.

Well, from that roadmap I see a 6-core Comet Lake U (high-end mobile) set for Q2 2020; we don't know how many cores AMD is planning for Zen 2 mobile. If AMD sticks to 4 and Intel has 6, that's going to swing some buyers. The (leaked) Intel roadmaps tend to have more specifics than the AMD ones, though.

It's still 14nm; even the quad-core KBL-R chips were a minor miracle, and they only brought a 25-30% improvement over the dual-core KBL.

Just about every 6 core mobile CPU laptop struggles with thermals. 7nm could conceivably upend that.

For as long as the ultrabook design remains popular (and it shows no signs of waning), the high-end CPU options will be riding the edge of their thermal limits during sustained use. OEMs aren't going to suddenly start over-building their cooling solutions just to help out marginally with a niche use case. As long as we don't get back to the problems from a decade ago with dying mobile GPUs, there's not really anything wrong with having CPUs that boost up to the thermal limits of the system form factor.

If someone comes out with a laptop CPU that can't boost up to those thermal limits, it means the chip's undersized and that vendor will probably need a different microarchitecture for the desktop or server markets.


> OEMs aren't going to suddenly start over-building their cooling solutions just to help out marginally with a niche use case.

A friend at a laptop engineering company has worked on this exact problem recently. Chinese OEMs are all trying to squeeze 35-45W chips into small chassis now.

To my big surprise, doing so in even thin bezel 13 inch models is not that big of a deal actually. Big OEMs simply were never bothered enough to try that before.


I think my point is that a lot of laptops have been held back (throttled) by the thermal solutions not being able to cope with the heat.

More efficient design will give better perf.


You have to take into consideration what kind of workloads lead to throttling. Laptops are usually not used for the kind of tasks that keep a CPU fully loaded for several minutes or hours at a time. People who do use laptops in that manner are a tiny fraction of the market, and when they experience throttling that does not have any bearing on whether the cooling system of an ultrabook is adequate for the kinds of more typical workloads it is actually designed for.

There have been some ultrabook-style designs that offered inadequate cooling even for fairly normal use cases, but that's a separate issue. Mainstream laptops will be designed around mainstream workloads, and heavier workloads will push them to their limits. Better cooling doesn't come free, and if it doesn't benefit mainstream workloads it's unreasonable to expect mainstream laptops to put more emphasis on cooling capabilities.


> Laptops are usually not used for the kind of tasks that keep a CPU fully loaded for several minutes or hours at a time.

Web developers work hard to change this. Browsing without an adblocker and with Javascript enabled is often enough.


Right. I get the sense that most developers browse with adblock on, but the average user experience is for their computer to be effectively running Prime 95 during regular web browsing.

Sorry, but if I buy a six core laptop, I'm not going to be in the casual notepad user category.

In many laptops, thanks to bad thermals I'd be better off with a 4 core where the thermals can keep up. That's where the 7nm stuff could really bring advantages.

I've been able to load up my desktop six-core plenty using e.g. Docker and a bunch of microservices. It has a fairly decent 360mm AIO water cooler, so it stays pinned at max perf. I had a bad cooler before, though, and it really impacted perf and stability.


The point still stands that most people buying these machines are generally not running them at 100% CPU usage for extended periods of time; their usage is much more bursty, with short periods at full power separated by longer periods of idling or low power. This gives the CPU plenty of time to cool down in between bursts.

If OEMs optimize for that use case, I suspect that a more efficient CPU will simply mean that they cut even more corners on the cooling, not that the thermals will actually be significantly better.


It's sad that this is the norm. Aluminum is cheap, a bigger heatsink in a regular laptop (not Ultrabook) should cost what, a dollar more? Yet laptops are never designed for full load for hours. Just "mainstream" use. Even business "workstations" have the same problem. I've had to do hardware mods or undervolting on all laptops. WTF.

If only the industry stopped for one second pursuing angstrom-thin laptop designs in favor of more thermally efficient ones. Laptops could have their CPU and chipset facing downward, in contact with a bottom cover made entirely of aluminium, then use a second aluminium upper shell with small thick fins facing outward: when closed, it works as a sturdy cover to protect the lid carrying the screen, but when opened it could be removed and attached to the lower one to increase thermal exchange with the environment. People obsessed with the thinnest hardware wouldn't touch it with a 20-meter pole, but those in need of serious performance and mobility would probably find it interesting.

So, have the bottom cover be the heatsink? That actually sounds brilliant.

HP thinned the ZBook series by turning everything upside down and having the bottom be just a dumb panel instead of the main frame.

Sadly they, once again, used a standard, barely capable heatsink. It will run for days loaded, but it will go over 90 degrees and even throttle, which is unacceptable imo.


"So, have the bottom cover be the heatsink?"

Yes, the upper cover, with the fins up, should be designed to mate perfectly with the lower one when mounted with the fins facing down. Cutouts should be arranged so that the bottom cover's rubber feet wouldn't prevent perfect surface contact. It would become a fairly large heatsink whose size and combined thickness would likely be enough to counteract the small fin size and the absence of fans. Battery/disk/memory covers on the bottom side would remain accessible by removing the additional cover.


"You have to take into consideration what kind of workloads lead to throttling."

I'm driving four displays at work with Windows 10 (two 21.5" 1080p monitors, my laptop flipped open, and an iPad Pro 12.9 connected via USB-C running Duet Display) and my idle desktop CPU utilization hovers around 15-20%. Having Outlook and Chrome open gets it into the mid 30s. This is a four-core i7 Dell Latitude 7490 with 16GB memory and an NVMe drive that I was given in May 2019.

Yes, all the OS/applications I'm using are resource hogs, but I'm not even doing software development; this is all business analyst work. Seeing that the general trend of applications/OSes will continue to be resource hogs, let's hope that a six-core thermal chassis design for 14" ultrabooks is figured out in the next two or three years.


Throttling is what happens when your CPU stays at 100% for a long time. Perhaps you could describe the part of your workload that exhibits that behavior, rather than describe a workload that is obviously not causing thermal throttling?

Throttling is what happens if your CPU overheats, which does not imply it's running at 100% for a long time.

We're talking about the current state of the real market here, not abstract hypotheticals. Laptops overheating at 30% CPU usage is not a widespread issue in the real world; to a first approximation, the only way to get an ultrabook's CPU to thermally throttle is to keep at least one of its cores completely busy so that the processor stays in its boost state long enough to pump out serious thermal energy. Bursty workloads give the CPU too many opportunities to cool off.

"Mainstream laptops will be designed around mainstream workloads"

Games are a mainstream workload that will cause most laptops to throttle.

"Better cooling doesn't come free"

You could make the heatsink in a Macbook out of Pure Silver, and it would barely move the price.

Meanwhile aluminium costs $1.7 per kilo.


A Comet Lake 6-core already exists, and I bought one to review: the i7-10710U.

I think AMD is still (slightly) behind in power consumption, especially when the CPU is idling. Some tests suggest that this is heavily dependent on the motherboard's chipset, though. But power consumption is probably very important for the industry and for notebook devices.

That said, AMD becoming a worthy competitor on the market again is awesome.


I recently bought a new laptop and actually wanted AMD, but none of the laptops I was interested in were available with one. It appears that high-end laptops are still ruled by Intel. Do AMDs run too hot for mobile or something?

It's not about heat; more probably vendors have a contract with Intel that makes Intel parts more expensive if they are not bought exclusively (it will be worded differently, because Intel already got sued for this practice).

I recently bought a Thinkpad E495 with a Ryzen 3700u and thermals are just fine. The surface barely heats up under sustained load. The E series is more budget, though. But the T495 (14") or X395 are of higher build quality (and price) and also have Ryzen 3x00u CPUs.


Typing this on my x395 (with 3700u) and can confirm these are really well made laptops.

Linux support is excellent. OpenBSD is good but doesn't support this generation of wireless cards yet, which is quite inconvenient; next release, perhaps.


How is graphics performance / functionality?

I've only ever really dealt with nVidia and Intel graphics, and highly prefer Intel due to tearing fixes.

I'm not a gamer, but do need at least basic hardware acceleration for X rendering etc.


Integrated Vega support fully landed in June 2018 (kernel 4.17, Mesa 18 for 3D); as long as you have these or newer, plus the firmware for your card in the system, you'll get full acceleration.

I've been running Debian Testing on a 2400G since 04/2018, when I had to add the firmware and compile the kernel with config changes to enable support, but it was already in the kernel tree nonetheless. It has worked out of the box since 07/2018, well before the release happened.


At least on Arch, X doesn't even start 4.19 (linux-lts package), but works fine on 5.2+ (linux package); probably also in some kernels in between, but I only bought the laptop recently.

Likely specific support for 3700u was introduced at some point between these two kernels.

AIUI 5.4 is meant to become the next LTS, so it won't be an issue going forward.


AMD added support for Raven 2nd gen APUs in 4.20.

That might be it! Thanks.

Sweet, does that mean that there's no binary drivers (eg nvidia-drivers vs nouveau)?

Have you had any issues you suspect may be driver / gfx related?


There's the amdpro drivers, which nobody uses; they're partially open source, as I understand it.

The point is that AMD releases detailed documentation for every GPU they sell, and thus it's very easy for Mesa to support their hardware; AMD themselves also contribute code to Mesa, of course. This is in contrast to NVIDIA, which even uses encryption and signatures to thwart nouveau efforts. I avoid NVIDIA entirely for this reason.

I haven't had issues so far. Life has been good, same as with the vega 64 on my workstation.


AFAIK There is no AMDPRO drivers for Ryzen APUs.

GPU Accelerated graphics (2d, 3d and video codecs) on both Linux (5.2+, probably earlier) and Openbsd (6.6+), with open source drivers (kernel DRM, userspace mesa3d and xf86-video-amdgpu).

If you get tearing, try:

    xrandr --output eDP --set TearFree on
to enable the anti-tearing workaround. This shouldn't be necessary on a modern composited desktop, but I do need it with a simpler i3 setup to avoid tearing in YouTube videos. mpv seems not to need it either.
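For a persistent fix, the same option can go in an Xorg config drop-in instead of running xrandr each session. A minimal sketch, assuming you're on the xf86-video-amdgpu DDX driver (the "TearFree" option is specific to it); the filename is just the conventional drop-in path:

    # /etc/X11/xorg.conf.d/20-amdgpu.conf
    Section "Device"
        Identifier "AMD Graphics"
        Driver     "amdgpu"
        Option     "TearFree" "true"
    EndSection

This applies the workaround to all outputs at startup, rather than one output via xrandr.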

Thanks, that might be helpful given I run OpenBox / qTile :)!

Interesting. I haven't looked much at the E495 and T495, but the X1 Extreme, P1 and P73 were not available at all with Ryzen CPUs.

Probably exclusivity agreements by intel that last for many years.

My guess is that laptops probably have a much tighter integration with the whole system, and so it takes more effort to change the CPU. It might take a few years of consistent performance from AMD to convince the laptop manufacturers to invest in the engineering to do that integration with a new CPU.

> I recently bought a new laptop and actually wanted AMD, but none of the laptops I was inteested in were available with one.

The new Thinkpad E595 looks decentish and ships with Ryzen, as opposed to the previous model (E590) which shipped with Intel.


But is this only the E line? That's Thinkpad's budget line, isn't it? I've been looking mostly at the T, X and P lines.

T495, T495S, X395 are AMD laptops. I would guess the P line don't have an AMD option because AMD doesn't have a high end laptop CPU line like Intel does.

Isn't Thunderbolt getting replaced with USB4, after Intel opened it up?

I'm not super familiar with the usb spec but it was my understanding that thunderbolt was an optional feature that usb devices could support.

Not current generation ones. USB4 is supposed to combine the two.

USB4 is essentially Thunderbolt 3. Since TB3 also defines how a USB 1-3 connection can be pushed over the wire, that's what most devices can fall back to.

So you are telling me a usb microphone needs to implement hdmi and displayport if it wants to be compliant? Or are these devices expected to stay on lower usb versions? What if the device needs some of the features from usb4 but it makes no sense to have thunderbolt features?

> So you are telling me a usb microphone needs to implement hdmi and displayport if it wants to be compliant? Or are these devices expected to stay on lower usb versions? What if the device needs some of the features from usb4 but it makes no sense to have thunderbolt features?

All USB Type-C connectors have a pair of USB 2.0 wires in them. You can still be a USB 2.0 device and talk on those USB 2.0 pairs on USB3 (and presumably USB4) just fine with just a little bit of care (get your resistors correct on the CC lines, for example).

If you want some features of USB3 or USB4, then things get a little more complicated. For example, if you want to be able to draw slightly higher power (900mA or 1.5A), then you need to have some active circuitry on the CC lines (One of these is about 70 cents: https://www.ti.com/product/TUSB320HAI) and you have to respond properly when the system tells you to draw less power even though you don't need to use the full-blown USB3 communication pairs.

If, however, you want USB3 or USB4 speed or very high power, then you need a full-blown controller chip and you incur all the grief that demands. Of course, if you actually need a couple Gbps, you're in the realm of doing serious signal integrity analysis anyway, and you're probably not going to balk at the $3-$5 required for a true controller chip to handle it all.


The only new feature in USB4 is Thunderbolt, so yes, if a device doesn't need Thunderbolt features it should stay on an older version. Of course, older USB versions will end up rebranded as something like "USB 4.0 1x1 High Speed".

USB4 will implement DisplayPort and PCIe if I understand correctly (in addition to actual USB), like Thunderbolt does.

https://en.wikipedia.org/wiki/USB#USB4

Obviously, not every device needs to use every feature. I'm talking about the port on the computer.


AMD now reminds me of their Athlon/K7 glory days

I drove 2 hours to get my 3900X. I was worried it would be too much horsepower but it spends a lot of its time >80% utilization.

These cores are great for debugging threading issues. I would trade it for a 3950X if I could find one.


I bought a 3900X also. I, too, would like a 3950X, but the 3900X is so fantastic it's hard to be too upset. I also have 4 Threadripper systems and I just could not get work done without them. My build times for compiling software would be atrocious without them.

I’m still on an i5 4670k and 16 gigs of ram. I built this system in 2013. I secured a day one 3950x and it arrived this weekend - the rest of the components are getting here throughout next week - and the speed up of compiles and ability to run multiple VMs at once for testing distributed apps is what I’m most excited about! It’s basically a workstation and a gaming rig in one. Gaming is fun but man I can’t wait for some faster feedback loops when I’m developing.

I was able to get rid of multiple computers and convert from physical to virtual and save on power bills. Once your build is complete you will be incredibly happy. AMD is killing it with these processors. I've never been happier with a tech product line. It just blows me away how much I'm able to accomplish on a shoestring budget now.

I replaced a i5 2500k + 8gb with a 2700X and 32gb last year so I could run multiple VMs + have the odd game here and there. You're in for a treat :D

What's the latest on VM tech? Is it still using Virtualbox + spinning up multiple VMs?

Now I wanna change from 3700x to 3950x... :(

I went from an old second-generation i7 to a 3700x; it's awesome :)

I also put in 64 GB of memory, so I can keep all the tabs open (insert "EVERYONE" meme gif from Leon) while I develop.

At the moment I don't feel the need to upgrade. My biggest problem is that my CPU fan tends to throttle up and down a lot; I can't find the correct fan curve.


There will be a BIOS option entitled "Fan Smoothing" or similar. On my ASUS board, it gives a range of times over which to average CPU temp rather than use the instantaneous value for fan control. I have mine set to 7 seconds I think, and it completely stopped the fan revving up/down issue.

That's what gets me...the desktop ones are all AM4...when my 3600 seems a little long in the tooth, a Processor + RAM upgrade is actually reasonable.

There's also the 4xxx series on the same socket, due sometime next year. The talk for Zen 3 is a 15% IPC uplift and 10% higher clocks. I'm sure the 3900x will carry you until then.

Is the 3950x compatible with the same x570 mobos/ram as 3900x, ie drop in replacement?

Yes, it's the same CPU with better chiplets in it.

> I also have 4 Threadripper systems

Can you tell what you're using them for ?


Worker pool for compile jobs and simulations. The cost to run this locally and with local storage (over 10g) is cheaper than to do it in the cloud.

I'm curious as to why these processors make it easier to debug threading issues. Is it just due to the sheer number of cores or something? I'm not sure why that would help either.

More cores _actually_ running simultaneously means a higher chance of encountering sporadic race conditions. It also makes it easier to measure (and thus work to improve) high thread-count scaling.
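As an illustration (not from the thread): a minimal sketch of the kind of lost-update race that surfaces more readily with real parallelism. The unsafe version does a non-atomic read-modify-write on a shared counter; the locked version is always correct.

```python
import threading

N_THREADS = 8
ITERS = 100_000
counter = 0
lock = threading.Lock()

def unsafe_inc():
    global counter
    for _ in range(ITERS):
        tmp = counter      # read-modify-write is not atomic:
        counter = tmp + 1  # another thread can update counter in between

def safe_inc():
    global counter
    for _ in range(ITERS):
        with lock:         # lock makes the increment atomic
            counter += 1

def run(target):
    """Reset the counter, run N_THREADS copies of target, return the total."""
    global counter
    counter = 0
    threads = [threading.Thread(target=target) for _ in range(N_THREADS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print(run(safe_inc))    # always N_THREADS * ITERS = 800000
print(run(unsafe_inc))  # often less: updates are lost to the race
```

The more threads that genuinely run in parallel, the more interleavings get exercised, so the unsafe version's bug shows up sooner, which is the debugging benefit the comment describes.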

I wish my home server performed more tasks, so I can justify an upgrade from my i7. Unfortunately even as it stands today it’s mostly idle despite me doing quite a bit with it.

Besides, it’s really annoying I need to upgrade from DDR3.


I’ve been sticking with Ivy Bridge-EN chips for my lab at home because used DDR3 RDIMMs are dirt cheap, not to mention the chips and systems themselves. In a few years, once used Naples gear hits the market, I may consider upgrading, but I'm also quite fond of the low power draw my 2450L v2's have.

Have you checked power consumption? Might be an excuse to switch!

I bit the bullet and upgraded from a 4670k + DDR3 to an X570 mobo + R7 3700X. Costs ~$500, assuming you can keep your GPU, PSU, case, fans, etc. and you don't want to go all-out RGB on everything.

Among the PC phenomena I’ve witnessed in my life, LEDs on computers and every other digital thing is one of the most annoying.

Haha, I think it can be tastefully done. Of course, the PC shouldn't look like a Unicorn puking rainbows, but I'm a fan of having a glass panel that shows off components with some subtle single / dual color lighting.

So yeah, I made some questionable technical decisions purely for aesthetics but yeah form is kind of important too.


It's a nice bonus if something happens to have it. My AM4 motherboard has an RGB LED header, and the Ryzen 1700's included cooler has an LED ring on it, so it gently illuminates my PC build with my favourite colour.

> the PC shouldn't look like a Unicorn puking rainbows

i have 0 RGB on or in my case.

your comment just made me change my view on RGBs. I.Want.Them.

My next built is going to puke rainbows! its gonna have a builtin display for a unicorn which moves according to the load - walking on <10, running on >60. its gonna be amazing!


Before LEDs, remember CCFLs? They were horrid.

It’s truly bizarre. I think a lot of it comes from people being influenced by YouTube, and RGB lights being one of those things that really only exist to appear on video.

To be fair, some of us were sticking case windows and cold cathode lights into our gaming PCs 20 years ago so we could look uber-1337 at LAN parties.

(I am now firmly in the "RGB is dumb" camp like the other grumpy old people. Oh well.)


I think the most annoying part of this fad is that if you want to build a gaming PC it's very hard to find good components (especially cases and motherboards) that _don't_ have LEDs. It wouldn't bother me much if the same products were available with and without all the bling.

The RGB LEDs exist to be customised, so you can turn them off if they annoy you.

> Besides, it’s really annoying I need to upgrade from DDR3.

Do you? You can use an older AM4 motherboard that uses DDR3 with Ryzen 3000, no?


AM4 was never DDR3. DDR4 predates Ryzen by about 2 years. There was no overlap with Ryzen & DDR3.

Ah, sorry, I was getting confused with PCIe 4!

I’m still on haswell i5 for my home server. Still pretty powerful.

I'm running a Ryzen 5 2600X in my new pc. 12 threads running at 4 GHz for 150 tax included is just too good to pass up.

I have a Ryzen 3700x on my new PC. First PC I ever built by hand. Mostly used for music production. 10 instances of Serum in Ableton with 50+ tracks and the thing barely crosses 15% CPU utilization

MicroCenter had the Ryzen 7 2700x for $129 and the Asus ROG Strix B450 Gaming board for $79 today. I picked that up and some G.Skill 3600 DDR4 for $59.

They were killing it today.


It feels like it’s cheaper to buy a Ryzen and a new motherboard than to get an equivalent Intel chip at the $200 price level.

I’ve been wanting to upgrade my Intel based CPU but these prices have me thinking about just going with AMD itself.


Damn those are fantastic prices. Wish we had a micro center around here...

I miss Atlanta.

do you have the link to the b450 gaming board and the ram?

It was a Black Friday special.

I'm not surprised. The Ryzen line is the best line of consumer CPUs I've seen since Sandy Bridge.

why isn't it priced higher then?

AMD is still gaining back mindshare. Yes, they’re not optimizing for profit right now, because they’re also gaining something valuable - entrenched market position. Intel’s dominance is incredibly sticky, and their name brand alone is worth 10s of billions. By blowing away Intel at every price point by a wide margin (and with Threadripper halo SKUs with no Intel competition), AMD is flexing and showing they are winners, and Intel are losers. People want to feel like winners when they buy products.

The last Intel security bugs made them lose a lot of the reputation they had, at least amongst system administrators, though.

Now is a crucial time for AMD to strike back.


This and their Minix backdoor are the reasons why I wanted an AMD for my laptop, but I was sad I couldn't find a high-end Thinkpad with AMD processor. I hope that will change next year.

Why call it minix backdoor? It puts some blame on minix, yet minix is blameless here.

Minix is an excellent FOSS microkernel multiserver OS with a focus on reliability and fault tolerance.

The way Intel is using it just isn't nice.


I'm not blaming it on Minix, but it does make use of Minix. I'm mentioning it to identify what I'm talking about. I believe the official name is Intel Management Engine. Or maybe that's just part of it.

Intel ME is the name I usually see, and also the name referenced by the tools that remove/disable it when possible.

If you're talking about the Management Engine, doesn't AMD have basically the same thing just called PSP?

Possibly. The Intel version is better publicised. I have no idea what PSP can do, but Intel's IME makes it possible to remotely completely override anything about a PC, which can be convenient for sysadmins for large organisations, but hasn't been disabled for consumer products.

I have no idea whether the same is true for PSP.


AMD supports KVM redirection, too, via a standard called DASH. You can see examples at https://community.amd.com/community/devgurus/dmtf-dash/blog. From the standard body's description: "DASH provides support for the redirection of KVM (Keyboard, Video and Mouse) and text consoles, as well as USB and media, and supports the management of software updates, BIOS (Basic Input Output System), batteries, NIC (Network Interface Card), MAC and IP addresses, as well as DNS and DHCP configuration. DASH specifications also address operating system status, opaque data management, and more." https://www.dmtf.org/standards/dash

Intel's AMT is also an implementation of DASH.


The extensive research on the ME I actually consider a pro for Intel, since I know more about what it does and how to disable it. The PSP is still more of a black box.

The remote management features in Intel ME require a vPro capable chipset.

To be useful to you, usually yes.

To be a convenient security hole, AFAIK no.

Any quotes on ME being safe on non-vPro?


It's still a security risk – code is running in the ME that can be exploited locally.

Without vPro or with remote management and the network stack turned off there's a much smaller (probably close to zero) remote attack surface. With a vPro-capable chipset that has remote management enabled, the ME has its own IP address, plenty of potentially unsafe services, an insecure-by-default provisioning mechanism and much more.


T495 (Zen+), A485 (Zen), X395 (Zen+)

The 5 at the end denotes an AMD product.

I'd hold out for after CES though. That's when Zen2 is coming to laptops. With thinkpad lag I'd say August of next year.


AMD has the same "Secure Enclave".

AMD PSP; "AMD Platform Security Processor".

:(


Not with the Directors of those system administrators though. Nobody got fired for buying Intel is a thing. I had to fight to get our recent server purchases to be AMD.

What were the arguments you were getting against buying AMD?

Heh, after your words I'm thinking that AMD is doing to Intel what Nvidia did in the GPU segment.

Depends on whether it was sold out due to manufacturing capacity or just a local logistical issue.

If manufacturing capacity, then it should've been priced higher.

If local logistics, then this data point has no bearing on whether it's priced correctly or not.


What if it's actually priced below cost and supply is deliberately restricted? Making the top bins of a part "marketing only" products has been done in the past, and fits the evidence here...

The binning on 3950x chips is nothing extraordinary; that's why the chiplet model is so powerful: you don't need to have 16 perfect cores on one die to sell the CPU.

Then why are there so few of them? You can't buy these things anywhere, despite being priced at a volume level. Something doesn't add up.

Well, the 3950X just launched, so there's a demand spike that's going to follow from that. But that aside these are sharing parts with the Epyc series. If AMD is getting volume orders for Epyc that is definitely going to take priority.

The allocation order is going to go enterprise first, system builders second, retail boxes a distant 3rd.


The Epyc devices are lower binnings though! That's exactly my point. They don't seem to have the 3.5+ GHz parts available in a reasonable quantity. When Intel was doing this with "1 GHz" Pentium 3's way back when, they got crucified for a paper launch.

Are you sure they're lower binnings, or just equally good chips being run more slowly for reduced power consumption? The power budget per core on Epyc is quite a bit lower.

The 3700X and 3800X, both of which have higher base frequencies than the 3950X and also use 8-core chiplets, have been in stock near continuously since launch.

That's not how binning works. If you don't have enough of your top bin to supply the demand at your price point, you're not selling it at a loss, merely leaving money on the table.

Supply isn't deliberately restricted, it's just you can't ramp up production of your top bins without also making more of everything else.

If it were a different die entirely this might make more sense. But it's not.


I'd guess the top line products will be more available later, as they tune their process to increase yields.

The 3950X is the most expensive consumer platform CPU on the market, and AMD's HEDT lineup has a starting price higher than Intel's flagship HEDT part. So how much higher priced should it be?

generally to gain back market share

Your ability to gain market share ends when supply ends.

If the demand is exceeding AMDs manufacturing capacity, then they should have priced higher or need to increase manufacturing capacity in order to actualize the potential market share to be gained.


> If the demand is exceeding AMDs manufacturing capacity, then they should have priced higher or need to increase manufacturing capacity in order to actualize the potential market share to be gained.

That's the standard micro economics answer. You're not wrong.

However, I don't think you're right either. The pricing of consumer products is a black art because people don't necessarily behave like economics textbooks say they should.

Some possibilities:

1. This is a short-term supply hiccup. The goodwill of keeping the price the same over the medium term is worth more than the money AMD would make by raising and then lowering the price of their chips.

2. They are constrained in their ability to raise their prices by Intel, which is broadly thought of as a superior product (leaving out the technical analysis of the current generation).

3. They want to create the perception of scarcity.


They've been binning these chips for months maybe even a year at this point. The binned CCDs used in the 3950X are also used in Epyc. I'm sure server has a higher priority over DIY.

From what I've read/heard - AMD are selling every chip they can make.

So whatever the highest test the chip passes, it's sold as that.

This is apparently why nobody is having much luck overclocking beyond the on-box specs; AMD's just very good at ensuring chips go where they're required.


> If the demand is exceeding AMDs manufacturing capacity, then they should have priced higher

If they did that the news would have been AMD is expensive which is not much of a news-story. Whereas now the news is that AMD is sold out because it is so good.


Except if you want to be known as the cheaper option and not just think about short-term profits, even if it takes a little delay to actually get the supply needed to satisfy the demand.

Then you have consumers feeling ripped off when the next batch arrives. As was said above, they could be trying to build brand loyalty.

What? Increase price to lower demand so they have a lower market share than they do now so they can have a higher market share later?

If they're supply-limited, they can increase price all the way until demand falls to meet supply. Selling widgets priced below the equilibrium point achieves nothing for capturing market share, it just loses potential revenue.

I.e. if they can only manufacture 1000 widgets, but 10,000 people want to buy one at the given price point, the gained market share is still only 1000 individuals. The number of people who want one but can't get one due to lack of supply is not market share.

Temporary under-pricing might be a valid strategy if AMD is planning to ramp up manufacturing capacity, but comments further down in this thread suggest they lack the fabs to do that.


You're completely forgetting about the future loss of sales when someone only looks at AMD CPUs once. The buyer will switch to Intel immediately and then when prices are finally down again won't be shopping around for prices. The impression that AMD is overpriced remains.

They could have priced higher and still dominated intel price/perf ratio wise. The supply/demand equilibrium point is somewhere higher than the current price and lower than intel break-even price.

The only way they'd set the impression of overpricing is if they actually overpriced. They did the opposite.

Whether it was a mistake in pricing or some calculated second-order strategy, we'll never know. What we do know is they're leaving cash on the table whether intentionally or not.

Temporarily (keyword) underpricing to hurt competitors is not unheard of but I don't see how it will hurt intel unless it significantly shifts mindshare for the next ~2 years and allows AMD to get a real foothold in the server market.


They have basically one opportunity to set the expected price point and as it turns out, they prefer to get all the positivity and hype on their side, than perfectly optimize to marginal demand.

I am not sure it is a bad choice.


Selling out of inventory isn't a bad move if you are seeking to raise capital or debt to increase production capability, e.g. "we're profitable and selling out at this price point, and arguably would continue to do so as we ramp up production." Especially if looking down the road long term. In the short term it generates buzz (thus this very post here on Hacker News), which isn't necessarily negative unless it is a prolonged inventory shortage.

It could be that this is the long-term efficient price after demand from early adopters goes down. Or maybe they underestimated demand altogether and it's too late to raise prices now.

I think this is a fair question to ask, I don’t know why you are being downvoted. It bothers me that hackernews penalizes curiosity.

The new threadrippers are on par or more expensive than Intel's solutions if judged by performance/cost per core. Especially if ridiculously priced motherboards are considered.

The way your sentence reads is that Sandy Bridge is better. I suspect you mean that the advancement with this generation is as significant as any since sandy bridge. Can you clarify?

IMO it was clear to me that they mean in terms of value.

I'm pretty sure it meant better in terms of a step change. :) Not outright better than Sandy Bridge.

Not surprising given how little supply there seems to have been. Even relatively major outlets like PC Perspective didn't get a review sample.

That and the Japanese PC-building scene has become quite tiny nowadays.

As in, it's shrunk? Why is that? As far as I know, the US and European markets have never been bigger than right now.

An increasing majority of the average person's everyday computing needs have consolidated into smartphones, perhaps even more so than in other countries. This ate into the markets of both home consoles and gaming PCs. (Though, since Japan was already a console-centric country, the latter weren't really big to begin with.)

Question: I can drop Ubuntu, Debian, Mint, Arch, etc onto a Core series whatever and get class-leading KMS/DRM graphics acceleration the moment the kernel is running, long before X11/Wayland start. So bootup is reasonably seamless, no flickering etc. Basic 3D works. WebGL is getting there.

And thanks to DRM, I can play around with alternative environments that talk directly to the kernel without X being involved.

So... if I get an AMD APU... how will the hacker/tinkerer experience compare?


AMD GPUs should work better than Intel ones with open drivers lately. In particular, Intel has serious tearing issues with Xorg in all kinds of nontrivial configurations, and I've also seen weird glitching issues with compositors after many days of uptime that require compositing resets. I haven't had a solidly good Intel Xorg experience in 5 years (on Ivy Bridge). Functional yes, but not good. Bugs go years without getting debugged, and it's a shame the tearing issues have gone on for so long. The xf86-video-intel driver is deprecated/unmaintained, but the modesetting driver isn't quite up to par because it can't include GPU-specific knowledge.

AMD supports all the same DRM/KMS interfaces, has better performance than Intel with open drivers, and also offers proprietary userspace drivers (GL/Vulkan side only, they work with the open source kernel and modesetting side, no weird binary compat issues to worry about) in case you prefer those, but they are absolutely not required.

I'm seriously looking forward to replacing my IVB laptop with a Ryzen once AMD releases 6+ core APUs. Maybe next year.


My Ryzen 3400G APU worked perfectly on a fresh Manjaro installation using the mainline AMD driver.

I moved to a GTX 1660 yesterday and immediately hit a bug where KDE isn’t able to detect display refresh rate when using the closed source Nvidia driver.


The amdgpu mainline driver works well in my experience.

For perspective: Tsukumo "honten" had 20, Tsukumo eX had 35, Tsukumo Nagoya had 15. 3960x and 3970x had scarcer availability.

I also just saw on Twitter that some of the shops have new 3950x available.


Haven’t heard of Tsukumo before - do you recommend them over buying on Amazon?

They have actual shops in Japan, contrary to Amazon. For some reason the 3970X is (slightly) more expensive on Amazon Japan than the Japanese MSRP. I haven't checked the other models, but I wouldn't be surprised if that were the case. I'm not too familiar with the Japanese market (only from researching for the 3970X in the past few days), so I wouldn't recommend anything in particular, but it did show up during my research. PC Koubou, too.

Okay, that’s helpful - thanks for your take.

I picked up a Ryzen 7 2700X for $159 on Black Friday. Unbelievable cost:power ratio

Sold out on Newegg and Amazon too, as far as I can tell. Looks like AMD did not quite expect this kind of uptake on what is now their "top of the line but reasonably priced" CPU.

The 3950X sold out in seconds on Amazon and Newegg, I was watching all morning and I've read the reports from the people who were able to purchase one.

Intel stock is at an all-time high. Short? Or does the market not think AMD will cut into their profits...

- EPYCs market share is around 3%

- Intel makes in a few weeks as much as AMD in the whole year

- AMD doesn't own fabs and can't saturate market (as seen with 39XXx shortages)


> AMD doesn't own fabs and can't saturate market (as seen with 39XXx shortages)

Intel owns fabs and has been having shortages for most of the year. AMD spinning out fabs seems to have been a good choice for them, as GlobalFoundries, their spun-out fab, ran into problems with process shrinking and AMD was able to move Zen to another fab that got their shrink to work.


They spun it off to avoid bankruptcy. Let's not pretend it was done for agile reasons. Intel's issues are unrelated, they simply ignored negative internal feedback which is a management problem.

Sure, they spun off fabs to keep themselves afloat. It seems to have worked, and now they're able to take advantage of the agility.

I'm not saying Intel should spin out their fabs, I'm just saying not owning your own fabs has some advantages sometimes. Owning your own fabs also has some advantages sometimes; it's certainly been a historic advantage for Intel.


Everything is a management problem one way or another.

Yep. AMD is a threat, but not a disaster for Intel.

Also, AMD is going to have to keep up the momentum for a while before they can make inroads in the enterprise market.


TSMC has 6 month lead times on 7nm, and you best believe Apple gets the bulk of their fab capacity. AMD could have the best CPUs on earth. They’re not worth anything if you can’t actually buy them.

No significant improvement in sight for Intel until 2021, while both TSMC and AMD yields and capacity on 7nm are going to ramp (not to mention Zen 3 and 7nm+ in summer 2020).

Not to mention, it's hard to get the top SKUs from both AMD and Intel, but something that costs 85% of Intel consumer parts and has 110% of the performance is readily available.


Long on TSMC then?

They’re up 50% over the past year. Might be priced in though, it’s a low margin capital intensive business.

What do you mean low margin? They're making 45% gross / 30% net, that's pretty respectable I would think.

Blackberry's stock price went up 10x in the year after the first iPhone was released.

Seems like everything is at an all time high

Maybe people expect that growth in TAM will outweigh loss from decrease in market share? Intel also has areas other than CPUs which might supplement CPU profits.

AMD has always been the stock market's whipping boy with shorts. Yet its stock price still continues to grow.

I'm betting Intel will just dust off tech they have already developed but didn't need to put into products due to lack of competition, and will be back on top soon.

I doubt they have it. They're on the third or fourth iteration of their 14nm tech now. If they really did have something up their sleeves, now would be a great time to use it.

I got an email from Amazon saying the 3950X was back in stock, 25 mins later, it was all gone when I actually checked.

AMD will need to provide some retail guidance for the holiday season, maybe widespread availability won't happen until Jan/Feb..


As a segue: is this a good time to buy AMD stock?

Leaving aside the GPU market, if you believe AMD is going to take and sustain a sizable share of the CPU market in the next 2 years (in particular server, mobile and high-end desktop), then the answer is yes.

If Intel manages to keep AMD market share below ~18% in the next couple of years, then probably not.

The value of the stock revolves around how much money AMD is able to generate, and the biggest variables here are volume of sales and margins on those sales. AMD's variable costs are probably higher than Intel's since they outsource their manufacturing to TSMC, but their fixed costs are dwarfed by Intel's.

                  Intel     AMD
      Market Cap  $252B     $44B
      Revenue     $19.2B    $2B
      Earnings Q3 $6B       $0.12B

Setting aside the fact that AMD's numbers include GPU sales, notice how AMD's earnings-to-revenue ratio is very small compared to Intel's. Also notice that Intel's market cap is almost 6 times AMD's (again, AMD has a GPU business too).

So the key is really: do you believe that AMD will be able to break the ~18% market share (in particular in high-margin segments)? My back-of-the-envelope calculations indicate that at around 20% market share AMD should be able to generate earnings close to a billion dollars per quarter, making AMD a buy. If they cannot break and sustain a market share close to that, my answer would be no.
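For what it's worth, here's a sketch of that kind of back-of-the-envelope math. The quarterly TAM and net margin figures below are made-up round numbers for illustration, not reported financials:

```python
# Rough back-of-envelope: AMD's quarterly earnings at a given x86 market share.
# Both tam_quarterly and net_margin are assumed values, not reported figures.
def quarterly_earnings(market_share, tam_quarterly=16.0, net_margin=0.25):
    """market_share: fraction of the x86 CPU market captured.
    tam_quarterly: total addressable market per quarter, in $B (assumed).
    net_margin: net margin on that revenue (assumed)."""
    revenue = tam_quarterly * market_share
    return revenue * net_margin

print(quarterly_earnings(0.20))  # ≈ 0.8, i.e. ~$0.8B/quarter at 20% share
```

Obviously the answer is only as good as the TAM and margin assumptions you plug in.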

Disclaimer: I have AMD shares and I don't plan to buy or sell given current pricing.


It seems odd to mention that AMD has a GPU business without mentioning that Intel has a substantial amount of non-CPU businesses (indeed, their PC business, which includes stuff like NICs and similar, makes up less than half their revenue).

Intel's Client Computing Group – 55% of 2016 revenues

Intel's Data Center Group – 29% of 2016 revenues

Source: Wikipedia

That is 84% of their business.


A lot of people have been betting big on AMD since before the release of Zen 2, so it may be overpriced. It's anyone's guess; not every investor will agree. Some holding it may just say yes because they want a bubble to sell out of. Look at price-to-earnings and all that, and consider that sentiment is high right now.

Their stock has increased 2000% since 2016; not sure if there is any value left to extract.


no. the buy point has passed. but there are 8 stocks in the fabless semiconductor space that have better fundamentals than AMD. for example: IPHI, AVGO, MPWR.
