Hacker News
AMD Reports Q1 2021 Earnings (anandtech.com)
158 points by neogodless on April 27, 2021 | 118 comments



Consumer CPU, APU, server CPU, and semi-custom (consoles) are all doing great, as expected. What's missing is the GPU big picture. I don't see evidence of AMD's discrete GPUs winning any market share; that segment seems to be dominated by Nvidia.

AMD only has two more years to enjoy the advantage from Intel's multiple missteps. After that it is going to be tough. And that is excluding the threat from ARM.


I think you're underestimating the trouble Intel is in. They are being absolutely stomped on by both AMD and Apple in terms of performance/efficiency, and if it weren't for such high demand, it would show up heavily in their sales. Also, AMD has begun R&D on ARM processors, whereas ARM is nowhere to be found on Intel's roadmap.


That would have been true if Pat Gelsinger hadn't become CEO. Worth pointing out that Intel's R&D didn't stop: all their CPU uArch designs were done, only to be let down by the 10nm fiasco. That single misstep cost Intel years of setbacks. (My calculation is somewhere around four years: they had a two-year lead at the time, so now they are about two years behind.)

As some have pointed out, TSMC is the real competitor, not really Apple or AMD. So I was surprised how quickly Pat managed to open up the foundry with industry-standard tools and ecosystem. I have been saying this since 2012, even before the original Intel Custom Foundry was announced (and I was constantly attacked and abused by Intel fans online for it). But to actually see Intel announcing it is a completely different thing, especially knowing their strong inertia. In hindsight, it probably took that beating from 10nm before they could really go all in with Custom Foundry. ("All in" as in with industry tools; they are still an IDM, not pure-play.)

And how the tide has turned. All of a sudden, leading-edge foundry capacity has become a national-security concern, and Intel should reap all the benefits from it.

It was Andy Grove who first moved Intel away from DRAM in the late 1980s, and maybe his disciple will be the one who starts the move away from x86 as the core business.


Say Intel really does take only two years to turn x86 around. They are a big company with lots of resources and talent, so even though it seems difficult to me, they surely have a chance.

Meanwhile, AMD gets a head start on ARM and leaves Intel in the dust on that front. I don't see Intel winning one way or another. I think it will take many years for them to potentially come back as no. 1 in the industry.


I didn't say they will lead in two years. But they have a much stronger position going forward and will be able to compete (compared to slowly dying before Pat came back). Right now they aren't really "competing", more like holding AMD back. Look at AMD's entrance into the cloud/DC market: they are barely at 10% by even the most optimistic standard. If you take out the console sales numbers, the DC market for AMD isn't as rosy as many want people to presume. (I would have wanted them to be close to 20% by now, although Mark Papermaster has shown their ambition for 30%+, which is the only good sign I would take.) Also a reminder that AMD isn't at the forefront of the leading-edge node; they tend to wait for the mainstream ramp, while Apple is the one paying the premium. (Although TSMC's sudden 5/3nm acceleration seems to suggest a change in strategy; we will see.)


Jim Keller was in part hired to work on K12, an ARM CPU that AMD was developing at the same time as Zen.


Clearly you aren't really plugged in to what's really going on. Intel has had multiple missteps, with both 10nm and 7nm years late and repeatedly pushed back. To say TSMC is the real competitor is really silly; AMD's Zen arch is giving Intel all kinds of problems, and AMD has a solid roadmap... I really don't wanna rehash it.


>Clearly you aren't really plugged-in to what's really going on.

Not sure if this is a newcomer or a new account, but you may want to explain yourself, as there is nothing in your previous comment to "rehash".

If the past 10 years are any record, I am way more plugged into what's going on than 99.99% of the internet.

And if you don't think TSMC is the competitor, that tells others more about you than about me.


Intel's Tiles sound like an engineering nightmare. There's no way they're going to compete with AMD with a "roadmap" like the one they've shown so far.


Intel isn't being stomped on by AMD and Apple. The stomper is TSMC (and Samsung to a lesser extent). Without TSMC, M1 and Zen 3 don't exist. (Quite literally in the case of Zen 3, as you still can't buy one!) It's all about the fabs.


I don't know anything about other regions but Zen3 Desktop CPUs are available in multiple stores in Germany and Europe.


How expensive is it in Germany? Russian retailers also had a decent supply, but the 5950X cost ~$1050 in the winter, ~$990 now. Which is a little bit more than the usual regional premium but honestly not too bad. GPU prices are much much worse. Over $1200 for a 6700XT – yes the one with a seven. 2.5 times the US MSRP :/



In Sweden the 5950X costs the equivalent to US$ 1060 right now.


You have been able to buy Zen 3 laptops for a good while now. Not sure about desktop CPUs though.

But that's mostly because of Apple reserving TSMC's entire 5nm line, so AMD is still on 7nm (although their parts look extremely promising even on an older node).


Apple did reportedly reserve all the initial 5nm capacity but I don't think we can say that's the reason AMD are still on 7nm. The early state of any node better lends itself to lower power, small SoCs. AMD may have been offered some capacity and turned it down because they prefer a more mature process for their larger chips with high-speed IO. Given the shortages now maybe with hindsight they regret that - who knows.


Zen 3 laptops haven't even been properly announced and are definitely not shipping. Even the Zen 2 models are in very short supply. Desktop Zen 3 is what is shipping and quantities are very limited as demand is very high.


You can order Asus Zen3 laptops such as ROG Zephyrus on Amazon today.


I stand corrected; it seems the 35W chips have indeed shipped. I've been waiting for the 15W parts to reach the Lenovo range, and the part they'll use was only announced a month ago. Even within the 5000 generation, some parts are Zen 2. Desktop has been out much longer across the whole range, but availability seems quite poor.


I don't know about where you are, but supply has caught up with demand for Zen 3 in the UK.


Good point. Intel's primary weakness at the moment is definitely that they are failing to advance their own fabrication process.


Intel's runway got extended a lot by the current silicon crisis. There isn't enough output for people to completely stop buying 14nm chips, which Intel can produce with high profitability.


But is it a runway for taking off, or for landing in a less crashy manner? ;-)


AMD has made ARM chips before (2016) for servers. https://www.amd.com/en/amd-opteron-a1100

Never seen one in the wild, though.


Also remember that AMD had Jim Keller back in the day, working toward an integrated ARM/x86 CPU. They have already done a lot of ARM R&D; it's not something recent!


Unfortunately, the software tooling gap between Nvidia and AMD has widened significantly for GPGPU. AMD has also abandoned the enthusiast video card market with its lack of ROCm support. Nvidia showcased some seriously cool tech at GTC 2021 this year, like sped-up algorithms for bioinformatics and new CPU/GPU combinations for autonomous driving. I would guess that Nvidia's immense R&D spending is what's driving this gap.


It's really disappointing that the GPGPU space isn't adopting Vulkan all that fast. These "compute specific" things such as OpenCL/CUDA/ROCm should not be necessary and should die. Just use Vulkan for fuck's sake.


Libraries like cublas and cudnn help the GPU scheduler assign tasks to specialized blocks on the graphics card. Vulkan would slow this innovation down severely. Unless they make Vulkan extensions? I’m not too sure how that would work.

Personally I haven’t used CUDA, but if anybody could comment on its ease of use compared to Vulkan that would be great.


> help the GPU scheduler assign tasks to specialized blocks

Source? I can't find anything for '"cublas" scheduler'…


The GEMM instructions are dispatched to the Tensor Cores by the scheduler. The CUTLASS library is NVIDIA's open-source collection of CUDA GEMM templates, roughly the open analogue of cuBLAS:

https://github.com/NVIDIA/cutlass


Seems like you can leverage those via VK_NV_cooperative_matrix…


This is the kind of thing where the whole Xilinx acquisition is gonna make a play.

Bespoke accelerator FPGA solutions are the future and will probably scale a lot longer than CPU/GPU combinations.


I really doubt that. FPGA is really power and silicon intensive. A more traditional "many 'fixed function' processor" a la GPU will beat it for any algorithm which at least semi-successfully maps to the provided operations. And most math is matrix multiplication anyway, if you squint your eyes and hold your tongue at the right angle ;)

(I use "fixed function" loosely here. It's not a fixed-function T&L pipeline; it's meant in the "the circuit is fixed but can execute arbitrary code" sense.)
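The "most math is matrix multiplication" quip is why GPU-style fixed circuits map so broadly: many operations can be lowered to a GEMM. As a toy illustration (a minimal NumPy sketch, not any vendor's implementation), even a 1-D convolution is just a matrix multiply after an "im2col"-style rearrangement:

```python
import numpy as np

def conv1d_direct(x, w):
    """Direct 1-D valid convolution (cross-correlation, ML convention)."""
    n, k = len(x), len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(n - k + 1)])

def conv1d_as_gemm(x, w):
    """Same result, lowered to a single matrix multiply (im2col)."""
    n, k = len(x), len(w)
    # Each row of the im2col matrix is one sliding window of the input.
    cols = np.array([x[i:i + k] for i in range(n - k + 1)])
    return cols @ w  # one GEMV/GEMM call does all the work

x = np.arange(8, dtype=float)
w = np.array([1.0, -2.0, 1.0])  # second-difference filter
assert np.allclose(conv1d_direct(x, w), conv1d_as_gemm(x, w))
```

The same trick (with a 2-D im2col) is how many frameworks feed convolutions to GEMM hardware.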


You can doubt it all you want, but this is where Xilinx Alveo is coming into play.

This isn't my space as I operate on low-latency networking side.

> A more traditional "many 'fixed function' processor" a la GPU will beat it for any algorithm which at least semi-successfully maps to the provided operations. And most math is matrix multiplication anyway, if you squint your eyes and hold your tongue at the right angle ;)

I don't think it is as simple as that. Certainly not for low-latency scenarios, or for HPC efficiency, where the gains come from moving as much computation as possible onto accelerator cards and off the CPU.

The cost of moving things back and forth from the CPU can be high (data especially). We are already seeing dedicated solutions from Google and Graphcore that beat the pants off the traditional GPU/CPU approach.


How does FPGA fabric solve the bandwidth bottleneck, exactly? What solves it is putting the compute power closer to the CPU (or memory). But you can do that with an FPGA, a GPU, a CPU matrix, etc. Compared to an FPGA, the others will have much more compute power (especially for HPC) in the same die size/heat envelope.
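Whether an accelerator is bandwidth- or compute-bound can be estimated with a back-of-the-envelope roofline calculation. The peak numbers below are hypothetical, not any real card's specs:

```python
# Roofline sketch: is a GEMM compute- or bandwidth-bound on a given device?

def gemm_arithmetic_intensity(m, n, k, bytes_per_elem=4):
    flops = 2 * m * n * k                                    # multiply-accumulates
    bytes_moved = bytes_per_elem * (m * k + k * n + m * n)   # A, B, C touched once
    return flops / bytes_moved

peak_flops = 20e12          # 20 TFLOP/s (hypothetical device)
peak_bw = 500e9             # 500 GB/s  (hypothetical device)
ridge = peak_flops / peak_bw  # intensity needed to escape the bandwidth wall

for size in (64, 1024):
    ai = gemm_arithmetic_intensity(size, size, size)
    bound = "compute" if ai > ridge else "bandwidth"
    print(f"{size}x{size} GEMM: {ai:.1f} flops/byte -> {bound}-bound")
```

Small problems sit below the ridge point and are limited by data movement no matter how much fabric or how many ALUs you throw at them, which is the parent's point about locality mattering more than the substrate.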


I think the idea is to bring the compute power closer to the storage or NIC devices. At least that is the impression I am getting from their smartNIC/smartssd device.

In this case, video streams in from the NIC, with encoding/decoding occurring directly on the NIC.

Same to a degree with the bespoke SmartSSD: if you have some complex encryption, encoding, decoding, or data-driven application, then the processing happens directly on the storage device via the FPGA.


It will get worse, since Nvidia lifted some IOMMU limitations in their consumer cards: https://nvidia.custhelp.com/app/answers/detail/a_id/5173/~/g...

This is a big deal, especially as AI tech has advanced.


I agree that the state of AMD's dGPUs still leaves a lot to be desired, but they've been making some progress at least. On the consumer side, RDNA2 is raster, transistor, and power (albeit on a more efficient node) competitive with Ampere, which is a big step up from previous generations. Thanks to having performance-competitive top-end/halo cards, AMD GPUs have gotten a consumer mindshare win that they haven't had in years.

RDNA3 should be even more exciting, with another big rumored perf/efficiency gain and their first MCM implementation. It will be interesting to see not just RDNA3 vs Lovelace, but also how Intel's DG2 changes the dynamic, since budget and mid-range competition has been completely non-existent lately.

Everyone's selling all the chips/boards they can make atm, largely due to cryptomining and other supply chain issues, so I guess AMD gets a mulligan until next gen (especially considering that for AMD right now, every GPU made means giving up 10X the profit on enterprise CPU cores due to shared die-capacity constraints).

The other thing worth noting on the consumer side is that Zen 4 should also bring Navi 2 and DDR5 to their APUs. AMD's current APUs can already handle most eSports/low-end games at 1080p [1] but this next iGPU bump has the potential to displace the low-end AIB/dGPU market completely. When looking outside of the AIB numbers, AMD (and Intel) continue to increase unit shipments/gain market share at Nvidia's expense. [2]

As for GPU compute, there's no question that Nvidia continues to dominate, and it feels like, if anything, AMD is falling further behind. While I have some hopes for CDNA/CDNA2 and ROCm continuing to mature, the ongoing lack of support for even basic stuff like desktop OpenCL, or for any RDNA cards, means that AMD is moving backwards as far as ecosystem support goes. I really can't think of a better way to stunt adoption than to have no ML/compute support for the majority of their products. ¯\_(ツ)_/¯

On the process side, assuming Intel suffers no more delays on 7nm (a big assumption), it will still be half a step and half a year behind TSMC's N3 process in 2023 (beyond that, I suppose it'll be an open question of who gets their GAAFET working). AMD's CPU architecture team has a pretty solid track record at this point though, and I don't think Zen 4 or Zen 5 will be at any particular disadvantage vs Granite Rapids.

Obviously, AMD has lots of competition, but as long as they keep executing, I think they have a pretty bright future (also, we'll have to see how Xilinx plays out).

[1] https://static.techspot.com/images2/news/bigimage/2021/04/20...

[2] https://www.jonpeddie.com/press-releases/gpu-shipments-soar-...


Thank you; I don't follow the GPU front closely. I will read up on some of the information provided.


> On the consumer side, RDNA2 is raster, transistor, and power (albeit on a more efficient node) competitive with Ampere, which is a big step up from previous generations. Thanks to having performance-competitive top-end/halo cards, AMD GPUs have gotten a consumer mindshare win that they haven't had in years.

TBH, the only reason I'd pay $800 for a card today is to play games with ray tracing on at an acceptable resolution. This is only possible with Nvidia RTX + DLSS.

I wouldn't pay $800 for an AMD card that gets me similar rasterization performance but none of the "modern" (3-year-old already...) features I care about.

Right now, _any_ GFX card is better than having no GFX. Both NVIDIA and AMD are sold out.

I can't tell how real the AMD mindshare is. We could probably only know in a market with enough supply of both.


I can't think of a worse time to be a PC enthusiast. Everything is so hard to come by, and if you can find parts, prices are insane.


I bought an RX580 in december for $140 on ebay.

Currently, one in the condition I have goes for $350.

My GPU purchase is out-earning Bitcoin.

This is dumb.


Just a bad time for gaming enthusiasts in general. Try buying a latest-gen console in Europe. I gave up and am now playing the Rome: Total War port for iPad.


Yup. My gaming PC is still rocking a 3700X and an RX580 because finding stock for GPUs at close to MSRP for the high-ish or even mid range cards is near impossible. :(

Edit: and I’m in Germany which means import fees, and other things that make things a bit more expensive.


A 3700x? That’s less than two years old. We’re also in a year+ long pandemic.


I was able to buy it a while ago but I paid a bunch too much. It’s the GPU that I’m hurting to upgrade.


I'm basically in the same situation as you - same specs, based in central Europe - and the GPU situation is dire. PC component stores here are completely wiped out of GPUs. The RX580 is gonna be fine-ish for me for the next wee while, but I'd like an upgrade if possible.


I'm in the same situation (even with the same CPU and GPU)


My gaming PC is on a fourth-generation Intel with an RTX 2080. Still enough to do >100fps at 1440p in the games I play.


I was looking at even a 1080ti and they’re > 500 euros here in Germany! It’s ridiculous. I mean it’s an amazing GPU but it’s 2 generations old now.


In Austria even 1660s are >400 Euros. Try that for insane.

The 1080 Ti was one of Nvidia's "mistakes": it was ahead of its time in performance and VRAM size, which is one reason its owners could afford to skip the overpriced RTX 20-series and even the RTX 30-series. That's why it held its value so well, even before the pandemic shortages.


I could sell my 1060 6GB for roughly the same price I bought it for back in 2016.


True. It really is a special card. The good thing is, at least for me, I don't need a new GPU; I'd just like one for higher fidelity in games.


This situation has convinced me to buy a completely new system when prices come down again and use the old one as a server/backup.


Yeah, if you've had a "ship of Theseus" PC for a while, it seems like a complete rebuild every 5 years is usually reasonable to prevent any one component from bottlenecking you too much.


Worst time so far. The problems will only increase, because the PC will become a small niche item for enthusiasts, not a common item.



Yup. PCs will mostly move into 1970s hobbyist territory but nowadays you can't build 5nm chips in your garage.

Well, not all of them, since a ton of professionals use them, but some niches like PC gaming will feel a bit like that very soon, compared to mobile gaming.

https://www.statista.com/statistics/292751/mobile-gaming-rev...

(and the trend is for mobile games to become much bigger much faster than desktop games)


PC gaming isn't niche. In the past few years it has felt more mainstream than it ever has before. You underestimate that there's literally a generation that has grown up on Twitch and YouTube, where people predominantly play games on PC.

Sure mobile gaming is massive. But considering the current growth of PC gaming it's hardly something that negates the existence of PC gaming even if it's smaller in comparison.


I've been playing PC games since 1998 and I've played them exclusively until 2018.

The heyday of PC games from a market-share perspective was in the past (probably around 2005?). At the time there were just PCs and consoles, and I think they were reasonably close.

Now desktop games are less than a third and they will probably be around one fifth or less within a decade.

That doesn't mean that they will be bad, the overall PC game market value will grow, it's just that the mobile game market will grow much faster and become much, much bigger over time.


Mobile gaming is surely huge and growing. But I don't see how it has much bearing on the health of PC gaming. It's not a zero-sum game. I can buy that PC and Console gaming are in competition but the gaming experience between mobile and PC/Console is vastly different.


Game streaming will become more common, so people won't buy PCs but rent games to play on the cloud.


Ah yes the 5nm chip hobby scene from the 70s :)

I suspect the hobby scene will be like what I see from my friend who hacks keyboards: special, mod-able components. There are advantages when a product is designed on hobbyist merits, and I think that niche will remain a bastion of enthusiasm for longer.


Also, there are potential problems with x86 PCs not being able to match the SSD-to-GPU streaming speeds of custom-architecture consoles.


That's not an x86 bottleneck as Microsoft is bringing the tech to PC[1], and best part, it even works on Gen 3 SSDs so you don't have to burn $ on top of the line Gen 4 drives.

[1] https://www.guru3d.com/news-story/microsoft-to-share-more-de...


Look at slide 5 — they are still using system memory instead of reading directly to the GPU, although an MS rep did say that direct streaming is being worked on [1]. Also, we still don't know details on Nvidia's proprietary RTX IO. So I wouldn't call that solved. [1] https://imgur.com/a/fHnLLQo


It's the same story every generation. There may be some things the consoles are better at right now but give it 2 years and we'll be complaining how the consoles are holding things back again.


You underestimate PC gaming.


You can buy a pre-built gaming PC with a Nvidia 3080 today and have it delivered by the end of the week. The shortage exists exclusively on the retail side, OEMs are awash with both graphics cards and CPUs.


This is not true. Every major prebuilt company has significant delays on 3080s and 3090s. I just checked.

ibuypower: 6 weeks

nzxt: 4 weeks

originpc: 4 weeks

maingear: 12 weeks

alienware: 5 weeks


I just went this route to build a ML workstation. I was able to get a prebuilt with a 3090 for ~30% of a comparable build on https://lambdalabs.com/.

Now I just need to wait until next year to get the second 3090. I considered waiting for the new A4000s to launch to see if they'd have better inventory but figured 3090s have additional value as a gaming gpu.

I'd be curious how professional ML folks are faring and what they're using -- it's not too late to cancel my build.


Just got this delivered yesterday:

https://www.novatech.co.uk/savedbuild/ccf0fcec-685d-4f4b-8be...

New to ML but looks similar enough to some of the builds on your site that I’m slightly less scared that I made the wrong choice!


Dope, 7 days is a pretty nice turnaround. I went with a 'custom' prebuilt so I could plan for adding additional cards once they're well supplied. You might run into trouble with the PSU, case clearance and mobo throughput if you plan on adding another 3-slot 3090 - those are all sub-1k though so you can always swap once you need to!


AWS GPU compute is pretty cheap if you don't need their newest beefiest rigs.


I prototyped on a Colab notebook and have been using AWS GPUs for training. Having to jump through hoops for quota, and the general anxiety of trying to optimize costs, led me to just build a machine with a one-time upfront cost. As a sole proprietor, I can write off the capex against future revenue anyway and depreciate it.


Alienware's published lead time should be taken with a large grain of salt. My RTX 3090 / Ryzen 5600X build shipped just 12 days after I ordered it, a full 3 weeks faster than the estimate I was given at the time I ordered.


I think you forgot to check the actual majors Dell, HP/Omen, and Lenovo/Legion but maybe they're just as bad.


Alienware = dell for the top end stuff like 3090s.

And to that end: https://i.imgur.com/oXgofp7.png


We just bought 20 Dell Precision workstations with 3080s in them, and they were delivered fine, with no question about availability for more.

(In Sweden)


Sure, but now you are stuck with un-upgradeable FCLGA3647 systems.


That’s what I did, a lot of the smaller pre-build retailers are out of stock but HP/Dell/etc. still have the high end cards in reasonably priced builds.


I've been trying (unsuccessfully) to give AMD my money for a 5900 (in the US), so maybe I'll be lucky enough to contribute to their upcoming record Q2 sales


Dip your toe into the dark side and try out Distill Web Monitor, and you should be able to find one in a week or so.


And since you're on hn, you can probably figure out streetmerchant


As someone who's used neither: why is streetmerchant better than Distill Web Monitor?


It's free and open-source.


Me too. Every day I'm like, can I buy a 5950X yet? No? Dang. Please take my money. No not you, scalpers!


I've seen the 5950X more frequently on NewEgg's shuffles...

It doesn't quite hit my sweet spot for single-core peak during multicore system use (you know, Minecraft and other old single-player games). The 5900X looks like a good mix of single-core, multi-core, and budget value.

Which is also why I want a 6800 XT GPU to go with it: 16 GB of VRAM to throw at games, or at just way too damned many open tabs.


Genuinely curious, do GPUs help with multiple browser tabs? I was assuming it's the RAM's job.


That's a bit like asking if GPUs help with running multiple applications, or how long a rope is; it really depends on what's in those tabs. WebGL, some graphical rendering, the Canvas API, and more are powered by the GPU in a modern browser, but websites like HN won't have any need for it.


Maybe if all those tabs have videos playing?


Not even then, unless you are using a browser from 2015.

Newer ones are very good at putting background tabs to sleep.


I was in the same boat and finally caved and bought one from antonline.com. They have them in stock but they're bundled with a $300 monitor (total price of $1,178.98, which includes tax and shipping). I resold the monitor for $275, so my losses weren't too bad.

Anyway, if anyone's been waiting for a 5950X for 6 months like me and doesn't want to pay a scalper, this might be something to consider. Bundling sucks and I hate that so many retailers are basically forcing you to buy random stuff you don't need. Almost as bad as the scalpers. Almost.


Can somebody explain why scalpers are a thing here? Why aren't the original vendors just raising prices to what the scalpers are charging, i.e. the market clearing price? If it's what's going to happen regardless then better that they get the money than some crummy middle man.


The companies expect the community to remember, and they want to keep selling to it in the future. Gamers are very price sensitive and would resent them for it.
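The parent question can be sketched with toy numbers: when the sticker price sits below the market-clearing price, the gap per unit is exactly the margin a scalper can capture. The demand curve and figures below are made up purely for illustration:

```python
# Toy model of why a below-market MSRP creates scalper margin.

def units_demanded(price):
    """Hypothetical linear demand: fewer buyers at higher prices."""
    return max(0, 1000 - price)

supply = 300   # units actually available this quarter (hypothetical)
msrp = 550     # sticker price set by the vendor (hypothetical)

# Clearing price: the price at which demand just equals supply.
clearing_price = 1000 - supply            # invert the demand curve: 700

excess_demand = units_demanded(msrp) - supply   # buyers left empty-handed at MSRP
scalper_margin = clearing_price - msrp          # per-unit profit for a reseller

print(f"excess demand at MSRP: {excess_demand} units")
print(f"scalper margin: ${scalper_margin} per unit")
```

In this toy setup the vendor leaves $150 per unit on the table, which is the reply's point: they accept that cost to avoid being remembered as the one who charged $700.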


I think the current batches are mostly reserved for pre-builts at big vendors (dell, hp, lenovo).

Had no problem ordering machines with fairly new CPU from HP with 4-5 days delivery. GPU is a whole different story...


I wanted a 5900x but settled for a 5950x bundled with a useless one year of Microsoft 365 for a $100 premium over MSRP.


I managed to score a PS5 but still can't get a 5950X.


After refreshing Amazon's Ryzen page for a month trying to get a 5950X, I finally bit the bullet and paid $200 extra to buy one from a scalper. Even though I paid more than the list price, I think the value of this CPU is insane compared to the competition. IMO AMD massively underpriced their product.


I agree the value is higher than expected, but I do think that AMD is doing a surprisingly good job of balancing current sales against past and future inertia. Obviously, per the article, they're doing great in terms of sales, but should they be doing better?

The standard, for more than a decade, has been to buy Intel. If AMD offered 2x the performance of Intel and consumers were perfectly rational, AMD could list their products at 1.99x the price and everyone would buy only AMD. Consumers are not rational.

If instead they list it at an insane value, like 1.5x the price with 2x the performance, people will change the default choice to AMD. Currently, recommending Intel means you're an out of touch dinosaur, using obsolete knowledge... Haven't you read the benchmarks? What will you recommend next, 15k RPM SAS spinning rust for storage and the icc compiler? (Or maybe you're making a pragmatic decision based on stock levels, but that's a bit nuanced). Cultural and emotional ideas like that are likely to last even after Intel catches up.

It's honestly probably a good thing for AMD's cultural forecast that there's scarcity and scalping. That will have a more lasting memory; people wouldn't remember getting an AMD 2x processor in the 2020s at only 1.99x the price of Intel, but they'll remember buying processors from shady scalpers at $200 over list because it's just that good.


A clear sign of market failure. There’s a reason scalpers can charge the prices they do.


Interesting that despite the shortages, AMD isn't really pushing up its margin. It only increased by one percentage point over a year ago and was flat against the previous quarter.

> notebook revenue is up once again, setting a sixth consecutive record for AMD

Great to see notebook manufacturers finally shipping large numbers of AMD machines.


I have some money on AMD, as it seems to be a fan favourite. When I ask why, I hear "price points". It is possibly a good way to remain in mindshare.


Yeah, it also helps that the current alternative to Apple M1 is only AMD CPUs, and I gladly paid for a new-gen AMD notebook...


> Meanwhile AMD’s Enterprise, Embedded, and Semi-Custom segment has seen an explosion of growth of its own over the past year, thanks to the launch of AMD’s third-gen EPYC (Milan) processors, as well as the 9th generation consoles.

Surprised to see enterprise switching over this quickly from Intel to AMD. How much infrastructure needs to get swapped out when making this switch?


Big companies usually refresh all their hw on a 3-5 year cycle, as older hw is too expensive to maintain (too much power use for the performance, plus increased repair costs).

So in cases like this, it's usually not swapping out "good" hw, just ordering replacements from a different vendor when a batch of hw reaches its planned EOL.


"Enterprises" do indeed dump hardware after 3-5 years, but not because it stopped being TCO-positive. It's just because they aren't sophisticated enough to own a computer without a warranty. Organizations who buy their computers by the megaton know that these machines are TCO-positive for 10+ years. There's a reason you can still get a Sandy Bridge in GCP or a Haswell in EC2.


It depends how hot you run them. If they idle 99% of the time, then yeah, the warranty is the main issue. But if you burn your CPUs all day, power usage becomes a significant expense.

I guess you can get SB/HW from cloud vendors for few reasons:

1. Intel server CPUs are usually a generation behind consumer ones, so they are a little younger than one may think

2. They are used for workloads that require little of CPU power overall

3. They may have important customers that certified their software for those specific CPU, so they need to keep them in stock

I don’t work at Google/Amazon, so I’m just speculating.


> 1. Intel server CPUs are usually a generation behind consumer ones, so they are a little younger than one may think

Yes but big customers get significantly earlier access than the general public. See how Intel shipped 110k+ units pre-launch: https://www.anandtech.com/show/16539/intel-ice-lake-xeon-sca...


> I don’t work at Google/Amazon, so I’m just speculating.

Since we're speculating, let me throw out another possibility: you aren't actually running on hardware that old.

As long as the instruction set is compatible, the hypervisor can freely lie about what CPU you're running on, and GCP obfuscates the CPU model in their instances.


It's reached the point where, for example, a Dell AMD server is 99% identical to a Dell Intel server (but cheaper and faster of course) so the only real barrier to switching is psychological.


If you can't find a decent AMD graphics card or a 5000-series processor: I scored a decent OEM 4600G PC for $400 at Costco.


Man, glad I bought mine in Q4 '20; I know it's a real pain to get HW now. I went overboard with a 5950X and a 3090, but it should last me a long time.


Since this article is about the stock, I hope I can share an update on my AMD position, though I have found HN has been very unfriendly when it comes to anything related to stocks.

AMD has been one of my best investments. I remember being on HN toward the end of 2019 and being downvoted hard for touting the stock. Back then I had 1200 shares at a cost basis of about $17. Today I have 1500 shares and a position worth about $130k after hours, roughly a 474% return in less than 2 years. This isn't even my only position that has turned out so well; you may have heard of some others in my past posts. Just look for the ones that are very gray.

The moral of the story is do your due diligence and don’t be afraid to invest boldly, early and often.


You’re getting downvoted possibly because “look at my juicy unrealized gains” doesn’t contribute materially to the discussion


Probably HN wonders if you think we're the IRS, declaring every capital gain like that :)

It would be more material if you had some insight into how to invest in AMD looking forward. Saying "do DD on all companies and then make a bold early play" isn't super actionable. What do you think about AMD's prospects going forward?


My recommendation for AMD is a buy.


Similar story here. I have 1000 shares in my taxable account, average cost is just under $13. Another 400 shares in my IRA, average cost of $36.

AMD has done well in terms of absolute dollar gains. Percentage wise, I have others that have done better, like MU, FB, PYPL.



