It looks like today the top 10 are all Ryzen. This is crazy.
I think it has to do with AMD being way too conservative with their estimates and not booking enough 7nm capacity.
Now we know AWS is using that 7nm capacity as well, and given Amazon has shown they intend to have all their SaaS running on their ARM CPUs, I think their orders are going to be big.
Edit: One thing I forgot is that it could also be GF's problem, where they can't keep up with the I/O die.
They don't have to undercut AMD to sell CPUs if they can deliver more volume. They may cut prices on a total of $3 billion worth, but they start from such high profit levels ($21 billion) that they can outmuscle AMD financially and stay profitable.
At no price point are they competitive with AMD's offerings, and they kind of cursed themselves with their legacy of inflated, unchanging prices: they'd have to tear apart a decade of profit margins and consumer confidence to price their product stack into staying relevant. C-levels at Intel must still be dissatisfied with this outcome.
Intel is having supply issues, so they have a volume problem as well.
The retail market is insignificantly small. AMD can win it, but it doesn't matter in the Intel vs. AMD fight.
Not when Speculative Execution mitigations cut performance literally in half in some cases.
Put more succinctly: there is literally not enough semiconductor fabrication supply today to meet demand, and the supply is very inelastic.
I do wonder if the 3900X/3950X shortage is at least partially manufactured (they're so hard to get, they must be good, right?). Intel halved their prices pre-launch, which is something that has never happened before, but they still can't match AMD on price/performance.
AMD has both good CPUs and good GPUs (something that is very rare!). As an owner of an RX 5700, I'm surprised how low the power usage is. If they put a cut-down RX 5000 on a low-power CPU, they could make some decent laptops that don't need dedicated GPUs (memory speed is a bottleneck, especially on a laptop with a single memory channel). That could pose a problem for Intel. I'm actually a little surprised it hasn't been more of a priority for AMD; all Intel can do to compete right now is literally give their CPUs/APUs away (better to take a loss than lose a customer to AMD). Next year will be a good time to buy a laptop. It looks as though Zen 3 will launch first on laptops, and that may pose a problem for Intel...
Intel's R&D budget is several times larger than AMD's revenue, so they'll always be able to squeeze a little more out of their current chips.
Could just be a timing issue though. Intel might abandon their current 10nm leaving AMD as the default winner by mid/late 2020.
Disclaimer: I'm long on AMD and wishing I bought more 5 years ago.
But Intel has to compete with TSMC. Since TSMC is making Apple, AMD, Nvidia, and Qualcomm chips, TSMC's R&D budget seems to have won out this generation.
TSMC amortizes its R&D/CapEx/operating costs over Apple, Qualcomm, Nvidia, Broadcom, Xilinx, etc.
Apple and Qualcomm paid for most of the heavy 7nm R&D/CapEx long before AMD's 7nm production commitments. AMD basically got the 7nm production advantage without taking on the risk of fab investments like Intel does. It was good that AMD's board had the foresight to divest its fabs into GlobalFoundries a few years back.
Intel has to amortize its fab costs over Altera and its x86 CPUs, and the Altera part is probably trivial. Intel's huge 10nm investment in R&D and CapEx depends entirely on one x86 product line to recover. When they executed well, as in the ~15 years before 2017, it was a big advantage. On any execution miss, the fab R&D/CapEx cost becomes a liability.
"Only the Paranoid Survive" - in the last few years, Intel was definitely not paranoid enough.
Now I believe things may go the other way around, as consumer chips are gaining a lot in core count.
AMD's challenge is getting enough processors out into the enterprise market. They need to grow their market share now, while they have the technical advantage. Intel can sell their inferior CPUs at higher prices because they can deliver large volumes and satisfy demand.
Revenue comparison 2018:
Intel $70.8 billion
AMD $6.5 billion
So while I still want a Ryzen PC, I also want five years of 24-hour hardware support, and I feel more confident getting that from Dell than from some small local dealer, for whom any support issue is a big deal for their own bottom line and whose employees are under more pressure than those working for a big vendor. The big vendors are still "Intel only", apart from >$2,000 gaming PCs. I usually buy ca. $1,000 PCs for development: minimal graphics card, videos only (hardware-accelerated decoding is nice to have, but too big a card wastes energy for no reason, and even the base power draw outside gaming is much higher - a GTX 1050 Ti running off motherboard power alone is perfect, as an example), no gaming ever, but good components and large RAM and SSD.
PS: Any Germans here who found a solution for needs such as mine?
This will fit together and is stronger than with a GTX 1050 Ti. If you really want a GPU powered only by the motherboard, step down to a GTX 1050, or look at the (rather overpriced) GTX 1650 (though it depends on the model there). My email is in my profile if you have questions.
The power supply, motherboard, SSD and RAM will "just work", given you buy a recent product from a known brand. The +/- 5% of whatever will not make a difference to most people.
For the mobo, think of features you must have (built-in wireless?) and that will direct you.
To clarify, these are vetted parts lists from a leading German computer magazine, complete with optional upgrades, videos to aid with the build process, and even detailed BIOS settings.
Which also means they can jump to whatever the current hot stuff is in the fab world, letting the big fabs (which are few in number, but still >1) compete on technical merits.
Whereas Intel has a corporate need to use whatever fabs they've invested in, and to be anchored by whatever the limitations of their fabs are. They aren't really limited by that -- they could just get TSMC to make some stuff for them or something -- but they're Intel, so they'll just make 10+++++++.
The question is if they can get all the capacity they need for price they can afford. It seems like high margin customers like Apple and Nvidia are always served first because they can pay more.
AMD's Ryzen 9 3900X and 3950X are a chiplet design: 2x TSMC 7nm dies tied to 1x GloFo 14nm I/O die. See the picture: https://images.anandtech.com/doci/14605/Ryzen9_3800X_Hand_57...
In effect, AMD spends very little money and die-area on expensive 7nm process, while leveraging their GloFo 14nm contracts to cheaply make I/O and memory controllers.
The 7nm chiplet is a single design: mass produced for EPYC, Threadripper, and Ryzen. The 14nm I/O die is what differentiates between EPYC, Threadripper, and Ryzen.
The I/O die can have 2x memory controllers (Ryzen), 4x memory controllers (Threadripper), or 8x memory controllers (EPYC)... supporting 2x dies, 4x dies, or 8x dies as appropriate.
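That scaling can be sketched as a simple lookup. This is just the comment's figures restated as a toy model, not a full SKU table (Threadripper counts here follow the "4x dies" claim above):

```python
# Toy sketch of Zen 2 chiplet scaling as described above: one shared
# 7nm CCD design, with the 14nm I/O die differentiating product lines.
# Counts are the comment's figures, not an exhaustive SKU list.
CONFIGS = {
    "Ryzen":        {"memory_channels": 2, "max_ccds": 2},
    "Threadripper": {"memory_channels": 4, "max_ccds": 4},
    "EPYC":         {"memory_channels": 8, "max_ccds": 8},
}

def max_cores(line: str, cores_per_ccd: int = 8) -> int:
    """Upper bound on core count for a product line in this sketch."""
    return CONFIGS[line]["max_ccds"] * cores_per_ccd

for line in CONFIGS:
    print(line, max_cores(line), "cores max")
```

With 8 cores per CCD this reproduces the familiar ceilings: 16 for Ryzen and 64 for EPYC.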
It's not really "rumored" so much as confirmed. The I/O dies are built at GlobalFoundries. The EPYC's larger I/O die is on GF's 14nm, and the consumer Ryzen I/O dies are on GF's 12nm. https://www.anandtech.com/show/14525/amd-zen-2-microarchitec...
That also means that they don't have all the fab expenses that Intel does.
I'm almost certain that AMD takes every sale from Intel when they can deliver. They have a superior product but not enough capacity to reap all the benefits of their success.
Only if Intel is also _perceived_ as the inferior product. Remember that the value to the customer goes beyond raw benchmarking specs; it also includes brand reputation and marketing.
At the level of truly huge purchases, "value" also encompasses a real bilateral relationship between the company and the vendor. A cloud provider might be tempted to stay with Intel over AMD now if by doing so they preserve guaranteed access/preferential pricing for the next set of server chips. To crack that hypothetical relationship, AMD would have to both dominate now and be perceived to dominate for the medium-term future.
It's only the retail market that can be so fickle as to consider a CPU a one-off purchase.
Sure they do. That's where the saying "nobody ever got fired for buying IBM" comes from. If anything enterprise is more brand sensitive and less price sensitive than retail customers.
Not to mention there is the entire ecosystem surrounding the CPU parts themselves: the chipsets, the boards, laptop availability. Not to mention things like vendor lock-in to a particular out-of-band management technology. Getting better benchmark scores and saving a few hundred dollars per CPU is only part of a much larger matrix for making a purchasing decision.
Data center customers care about specs and prices only. They study the products they buy.
Data center customers care about support, vendor relationships, keeping complexity down, politics, reliability (= aversion to new products), lots of things. If anything the custom PC market is more focused around specs and prices solely.
Enterprise, if anything, has more inertia than, say, the custom PC market, which will turn on a dime to chase the best deals.
Except they don't? There's a reason Intel's flagship HEDT part, the i9-10980XE, is half the price of the part it's replacing despite almost nothing changing about the chip itself. And at the same time, AMD is raising the prices of its HEDT parts.
And it's not just the HEDT parts: Intel also cut prices on a bunch of Xeon chips a month or two ago, along with their consumer stack.
...or Intel controls supply to retail vendors by other means. If a vendor is bound to keep putting Intel products in their stores to comply with supply agreements made in the past, for example.
I want to get a new laptop next year and I hope it comes with a good mobile Ryzen processor with Thunderbolt support.
Zen 2 combined with an integrated GPU much better than Intel's is likely going to make a very nice dent in market share, unless Intel pulls a rabbit out of their hat.
10nm products have "shipped" and put up a solid fight: https://www.anandtech.com/show/15092/the-dell-xps-13-7390-2i... It shows good performance and strong efficiency.
10nm desktop & server parts seem entirely MIA and dead. But it seems that Intel is able to at least struggle out small laptop parts on the process.
From what I heard from someone close to Intel's process crowd, Intel will simply slap a 7nm marketing designation on their current 10nm process when they finally get it going.
They will then change the design rules to deliver enough of a density change to be comparable to a node shrink, without any change in the process.
If someone comes out with a laptop CPU that can't boost up to those thermal limits, it means the chip's undersized and that vendor will probably need a different microarchitecture for the desktop or server markets.
A friend from a laptop engineering company has worked on this exact problem recently. Chinese OEMs are all trying to squeeze 35-45w chips into small chassis now.
To my big surprise, doing so even in thin-bezel 13-inch models is not that big of a deal, actually. The big OEMs simply never bothered to try before.
More efficient design will give better perf.
There have been some ultrabook-style designs that offered inadequate cooling even for fairly normal use cases, but that's a separate issue. Mainstream laptops will be designed around mainstream workloads, and heavier workloads will push them to their limits. Better cooling doesn't come free, and if it doesn't benefit mainstream workloads it's unreasonable to expect mainstream laptops to put more emphasis on cooling capabilities.
In many laptops, thanks to bad thermals, I'd be better off with a 4-core where the cooling can keep up. That's where the 7nm stuff could really bring advantages.
I've been able to load up my six-core desktop plenty using e.g. Docker and a bunch of microservices. It has a fairly decent 360mm AIO water cooler, so it stays pinned at max perf. I had a bad cooler before, though, and it really impacted performance and stability.
If OEMs optimize for that use case, I suspect that a more efficient CPU will simply mean that they cut even more corners on the cooling, not that the thermals will actually be significantly better.
HP thinned the ZBook series by turning everything upside down and having the bottom be just a dumb panel instead of the main frame.
Sadly they, once again, used a standard, barely capable heatsink. It will run loaded for days, but it will go over 90 degrees and even throttle, which is unacceptable imo.
Yes, the upper cover, with the fins facing up, should be designed to mate perfectly with the lower one when mounted with the fins facing down. Cutouts should be arranged so that the bottom cover's rubber feet wouldn't prevent perfect surface contact. It would become a fairly large heatsink whose size and combined thickness would likely be enough to counteract the small fin size and the absence of fans.
Battery/disk/memory covers on the bottom side would be accessible by removing the additional cover.
I'm driving four displays at work with Windows 10 (two 21.5" 1080p monitors, my laptop flipped open, and an iPad Pro 12.9 connected via USB-C running Duet Display) and my idle desktop CPU utilization hovers around 15-20%. Having Outlook and Chrome open gets it into the mid 30s. This is a four-core i7 Dell Latitude 7490 with 16GB memory and an NVMe drive that I was given in May 2019.
Yes, all the OS/applications I'm using are resource hogs, but I'm not even doing software development - this is all business analyst work. Seeing that the general trend of applications/OSes will continue to be resource hogs, let's hope that a six-core thermal chassis design for 14" ultrabooks is figured out in the next two or three years.
Games are a mainstream workload that will cause most laptops to throttle.
"Better cooling doesn't come free"
You could make the heatsink in a MacBook out of pure silver, and it would barely move the price.
Meanwhile aluminium costs $1.7 per kilo.
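A quick sanity check of that claim, under rough assumed figures (silver at ~$550/kg and a ~30 g laptop heatsink are my assumptions, not quoted numbers; the aluminium price is the one given above):

```python
# Back-of-envelope cost comparison for a thin laptop heatsink.
# Assumed figures: silver ~$550/kg (roughly late-2019 spot),
# aluminium ~$1.7/kg, heatsink mass ~30 g of metal.
SILVER_USD_PER_KG = 550.0
ALU_USD_PER_KG = 1.7
HEATSINK_MASS_KG = 0.030

silver_cost = SILVER_USD_PER_KG * HEATSINK_MASS_KG  # raw metal cost in silver
alu_cost = ALU_USD_PER_KG * HEATSINK_MASS_KG        # raw metal cost in aluminium
print(f"silver: ${silver_cost:.2f}, aluminium: ${alu_cost:.2f}")
```

Even in solid silver, the raw metal comes to under $20, which is indeed small relative to a MacBook's price (though machining and supply-chain costs would add more than the metal itself).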
That said, AMD becoming a worthy competitor on the market again is awesome.
I recently bought a Thinkpad E495 with a Ryzen 3700u and thermals are just fine. The surface barely heats up under sustained load. The E series is more budget, though. But the T495 (14") or X395 are of higher build quality (and price) and also have Ryzen 3x00u CPUs.
Linux support is excellent. Openbsd is good but doesn't support this generation of wireless cards yet, which is quite inconvenient; next version perhaps.
I've only ever really dealt with Nvidia and Intel graphics, and highly prefer Intel due to the tearing fixes.
I'm not a gamer, but do need at least basic hardware acceleration for X rendering etc.
I've been running Debian Testing on a 2400G since 04/2018. Back then I had to add firmware and compile the kernel with config changes to enable support, but it was already in the kernel tree nonetheless. It has worked out of the box since 07/2018, well before the release happened.
Likely specific support for 3700u was introduced at some point between these two kernels.
AIUI 5.4 is meant to become the next LTS, so it won't be an issue going forward.
Have you had any issues you suspect may be driver / gfx related?
The point is that AMD releases detailed documentation for every GPU they sell, and thus it's very easy for Mesa to support their hardware; AMD themselves also contribute code to Mesa, of course. This is in contrast to NVIDIA, which even uses encryption and signatures to thwart nouveau's efforts. I avoid NVIDIA entirely for this reason.
I haven't had issues so far. Life has been good, same as with the vega 64 on my workstation.
If you get tearing, try:
xrandr --output eDP --set TearFree on
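If the xrandr setting helps but doesn't persist across sessions, the same TearFree option can be made persistent (assuming the amdgpu Xorg driver) with a config snippet, e.g. in something like /etc/X11/xorg.conf.d/20-amdgpu.conf:

```
Section "Device"
    Identifier "AMD Graphics"
    Driver "amdgpu"
    Option "TearFree" "true"
EndSection
```

The Identifier string is arbitrary; the Driver and Option lines are what matter.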
The new Thinkpad E595 looks decentish and ships with Ryzen, as opposed to the previous model (E590) which shipped with Intel.
All USB Type-C connectors have a pair of USB 2.0 wires in them. You can still be a USB 2.0 device and talk on those USB 2.0 pairs on USB3 (and presumably USB4) just fine with just a little bit of care (get your resistors correct on the CC lines, for example).
If you want some features of USB3 or USB4, then things get a little more complicated. For example, if you want to be able to draw slightly higher power (900mA or 1.5A), then you need to have some active circuitry on the CC lines (One of these is about 70 cents: https://www.ti.com/product/TUSB320HAI) and you have to respond properly when the system tells you to draw less power even though you don't need to use the full-blown USB3 communication pairs.
If, however, you want USB3 or USB4 speed or very high power, then you need a full-blown controller chip and you incur all the grief that demands. Of course, if you actually need a couple Gbps, you're in the realm of doing serious signal integrity analysis anyway, and you're probably not going to balk at the $3-$5 required for a true controller chip to handle it all.
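For the simpler current-advertisement case described above, a sink only reads the voltage the host's CC pull-up (Rp) produces against the sink's own 5.1k pull-down (Rd). A rough sketch of that divider, simplified from the Type-C spec (exact detection thresholds and tolerances omitted):

```python
# Simplified model of USB Type-C current advertisement over the CC line.
# The host exposes a pull-up Rp to ~5 V; the sink exposes Rd = 5.1k to
# ground; the sink infers the advertised current from the divider voltage.
RD = 5_100  # sink pull-down, ohms (per the Type-C spec)

def cc_voltage(rp_ohms: float, vbus: float = 5.0) -> float:
    """CC-line voltage produced by the Rp/Rd divider."""
    return vbus * RD / (rp_ohms + RD)

ADVERTISEMENTS = {  # host Rp (ohms, pull-up to 5 V) -> advertised current
    56_000: "default USB power",
    22_000: "1.5 A",
    10_000: "3.0 A",
}

for rp, label in ADVERTISEMENTS.items():
    print(f"Rp={rp:>6} ohm -> CC ~{cc_voltage(rp):.2f} V -> {label}")
```

The three Rp values give well-separated CC voltages (~0.42 V, ~0.94 V, ~1.69 V in this idealized model), which is what cheap detection chips like the TUSB320 mentioned above classify for you.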
Obviously, not every device needs to use every feature. I'm talking about the port on the computer.
These cores are great for debugging threading issues. I would trade it for a 3950X if I could find one.
I also put in 64 GB of memory, so I can keep all the tabs open (insert "EVERYONE" meme gif from Leon) while I develop.
At the moment I don't feel the need to upgrade. My biggest problem is that my CPU fan tends to throttle up and down a lot; I can't find the correct fan curve.
Can you tell what you're using them for?
Besides, it’s really annoying I need to upgrade from DDR3.
So yeah, I made some questionable technical decisions purely for aesthetics but yeah form is kind of important too.
i have 0 RGB on or in my case.
your comment just made me change my view on RGBs. I.Want.Them.
My next build is going to puke rainbows! It's gonna have a built-in display with a unicorn that moves according to the load - walking at <10, running at >60. It's gonna be amazing!
(I am now firmly in the "RGB is dumb" camp like the other grumpy old people. Oh well.)
Do you? You can use an older AM4 motherboard that uses DDR3 with Ryzen 3000, no?
They were killing it today.
I've been wanting to upgrade my Intel-based CPU, but these prices have me thinking about just going with AMD instead.
Now is a crucial time for AMD to strike back.
Minix is an excellent FOSS microkernel multiserver OS with a focus on reliability and fault tolerance.
The way Intel is using it just isn't nice.
I have no idea whether the same is true for PSP.
Intel's AMT is also an implementation of DASH.
To be a convenient security hole, AFAIK no.
Any quotes on ME being safe on non-vPro?
Without vPro or with remote management and the network stack turned off there's a much smaller (probably close to zero) remote attack surface. With a vPro-capable chipset that has remote management enabled, the ME has its own IP address, plenty of potentially unsafe services, an insecure-by-default provisioning mechanism and much more.
The 5 at the end denotes an AMD product.
I'd hold out for after CES though. That's when Zen2 is coming to laptops. With thinkpad lag I'd say August of next year.
AMD PSP; "AMD Platform Security Processor".
If manufacturing capacity, then it should've been priced higher.
If local logistics, then this data point has no bearing on whether it's priced correctly or not.
The allocation order is going to go enterprise first, system builders second, retail boxes a distant 3rd.
Supply isn't deliberately restricted, it's just you can't ramp up production of your top bins without also making more of everything else.
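A toy model of why that is: bins are slices of one quality distribution on the same die, so the only way to get more top-bin parts is to make more dies overall. (The thresholds and fractions here are made up purely for illustration.)

```python
import random

# Toy binning model: every die gets a random "quality" score and is sold
# as the highest bin it qualifies for. The top-bin fraction is fixed by
# the process, so more top-bin parts implies more wafers of everything.
random.seed(0)

BINS = [(0.95, "3950X-grade"), (0.80, "3900X-grade"), (0.0, "lower SKUs")]

def bin_die(quality: float) -> str:
    """Sell the die as the highest bin it qualifies for."""
    for threshold, name in BINS:
        if quality >= threshold:
            return name
    return BINS[-1][1]  # unreachable: last threshold is 0.0

counts = {}
for _ in range(100_000):
    name = bin_die(random.random())
    counts[name] = counts.get(name, 0) + 1

print(counts)  # top bin is only ~5% of output in this toy model
```

In this sketch, doubling the supply of "3950X-grade" parts means doubling total production, flooding the market with the lower SKUs as a side effect.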
If it were a different die entirely this might make more sense. But it's not.
If demand is exceeding AMD's manufacturing capacity, then they should have priced higher, or they need to increase manufacturing capacity, in order to actualize the potential market share to be gained.
That's the standard micro economics answer. You're not wrong.
However, I don't think you're right either. The pricing of consumer products is a black art because people don't necessarily behave like economics textbooks say they should.
1. This is a short-term supply hiccup. The goodwill of keeping the price the same over the medium term is worth more than the money AMD would make by raising and then lowering the price of their chips.
2. They are constrained in their ability to raise prices by Intel, which is broadly thought of as the superior product (leaving aside the technical analysis of the current generation).
3. They want to create the perception of scarcity.
So whatever the highest test the chip passes, it's sold as that.
This is apparently why nobody is having much luck overclocking beyond the on-box specs - AMD's just very good at ensuring chips go into the bins where they're required.
If they did that, the news would have been "AMD is expensive", which is not much of a news story. Whereas now the news is that AMD is sold out because it is so good.
I.e. if they can only manufacture 1000 widgets, but 10,000 people want to buy one at the given price point, the gained market share is still only 1000 individuals. The number of people who want one but can't get one due to lack of supply is not market share.
Temporary under-pricing might be a valid strategy if AMD is planning to ramp up manufacturing capacity, but comments further down in this thread suggest they lack the fabs to do that.
The only way they'd set the impression of overpricing is if they actually overpriced. They did the opposite.
Whether it was a mistake in pricing or some calculated second-order strategy, we'll never know. What we do know is they're leaving cash on the table whether intentionally or not.
Temporarily (keyword) underpricing to hurt competitors is not unheard of, but I don't see how it will hurt Intel unless it significantly shifts mindshare over the next ~2 years and allows AMD to get a real foothold in the server market.
I am not sure it is a bad choice.
And thanks to DRM, I can play around with alternative environments that talk directly to the kernel without X being involved.
So... if I get an AMD APU... how will the hacker/tinkerer experience compare?
AMD supports all the same DRM/KMS interfaces, has better performance than Intel with open drivers, and also offers proprietary userspace drivers (GL/Vulkan side only, they work with the open source kernel and modesetting side, no weird binary compat issues to worry about) in case you prefer those, but they are absolutely not required.
I'm seriously looking forward to replacing my IVB laptop with a Ryzen once AMD releases 6+ core APUs. Maybe next year.
I moved to a GTX 1660 yesterday and immediately hit a bug where KDE isn’t able to detect display refresh rate when using the closed source Nvidia driver.
I also just saw on Twitter that some of the shops have new 3950x available.
- Intel makes in a few weeks as much as AMD does in a whole year
- AMD doesn't own fabs and can't saturate the market (as seen with the 39XXX shortages)
Intel owns fabs and has been having shortages for most of the year. AMD spinning out its fabs seems to have been a good choice for them, as GlobalFoundries, their spun-out fab, ran into problems with process shrinking, and AMD was able to move Zen to another fab that got its shrink to work.
I'm not saying Intel should spin out their fabs, I'm just saying not owning your own fabs has some advantages sometimes. Owning your own fabs also has some advantages sometimes; it's certainly been a historic advantage for Intel.
Also, AMD is going to have to keep up the momentum for a while before they can make inroads in the enterprise market.
Not to mention, it's hard to get the top SKUs from both AMD and Intel, but something that costs 85% as much as Intel's consumer parts and has 110% of the performance is readily available.
AMD will need to provide some retail guidance for the holiday season; maybe widespread availability won't happen until Jan/Feb.
If Intel manages to keep AMD market share below ~18% in the next couple of years, then probably not.
The value of the stock revolves around how much money AMD is able to generate, and the biggest variables here are volume of sales and margins on those sales. AMD's variable costs are probably higher than Intel's since they outsource their manufacturing to TSMC, but their fixed costs are dwarfed by Intel's.
Q3 figures, Intel vs. AMD:
Revenue: Intel $19.2B, AMD $2B
Earnings: Intel $6B, AMD $0.12B
Leaving aside the fact that AMD's numbers include GPU sales, notice how AMD's earnings-to-revenue ratio is very small compared to Intel's. Also notice that Intel's market cap is almost 7 times AMD's (again, AMD has a GPU business too).
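The ratio being pointed at, computed from the figures quoted above:

```python
# Earnings-to-revenue ratios implied by the quoted Q3 figures ($B).
intel_rev, intel_earn = 19.2, 6.0
amd_rev, amd_earn = 2.0, 0.12

intel_margin = intel_earn / intel_rev  # ~31%
amd_margin = amd_earn / amd_rev        # 6%
print(f"Intel: {intel_margin:.0%}, AMD: {amd_margin:.0%}")
```

Roughly 31% for Intel versus 6% for AMD, which is the gap the market-share argument below hinges on.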
So the key question is really: do you believe AMD will be able to break the ~18% market share (in particular in high-margin segments)? My back-of-the-envelope calculations indicate that at around 20% market share, AMD should be able to generate earnings close to a billion dollars per quarter, making AMD a buy. If they cannot reach and sustain a market share close to that, my answer would be no.
Disclaimer: I have AMD shares and I don't plan to buy or sell at current prices.
Intel's Data Center Group – 29% of 2016 revenues
That is 84% of their business.