Intel slashes prices of Xeon 6 CPUs by up to $5,340 (tomshardware.com)
40 points by ksec 53 days ago | 42 comments



Makes sense - a few weeks after Intel released Granite Rapids (their latest server chips), AMD released Epyc Turin with 40% better multi-core performance for the top-end chips (Intel 6980 vs. Epyc 9755, both 128 cores) - and not only that, but AMD was much cheaper. This price drop brings Intel 6980 from $17k to $12k, exactly matching the corresponding AMD 9755 price. Without the price reduction, Intel’s price per performance would be completely uncompetitive.
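As a rough back-of-envelope using only the approximate figures above (40% faster multi-core for the 9755, 6980 cut from ~$17k to ~$12k, 9755 at ~$12k - not exact list prices), the performance-per-dollar gap looks like this:

    # Back-of-envelope perf-per-dollar, normalizing the Xeon 6980's multi-core
    # score to 1.0 and the Epyc 9755's to 1.4; prices are the rough figures
    # quoted above, not official list prices.
    parts = {
        "Xeon 6980 (old price)": (1.0, 17_000),
        "Xeon 6980 (new price)": (1.0, 12_000),
        "Epyc 9755":             (1.4, 12_000),
    }
    for name, (perf, price_usd) in parts.items():
        print(f"{name}: {perf / (price_usd / 1000):.3f} perf per $1k")
    # Prints roughly 0.059, 0.083 and 0.117 - the cut narrows the gap a lot,
    # but at these numbers the 9755 still has about a 40% perf-per-dollar edge.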

Curious if anyone has had success building dual-socket GNR or Turin servers at home - motherboard availability seems limited for retail customers. Also curious if anyone still sees Intel outperformance in any context, e.g. relating to AVX-512 or MKL workloads.


> Also curious if anyone still sees Intel outperformance in any context, e.g. relating to AVX-512 or MKL workloads.

Taking advantage of the CPU architecture still goes a long way. Yes, AMD is generally faster in SPEC and general-purpose loads (and I've rooted for AMD since forever), but when you write your application for a target microarchitecture, the results can be completely different.

We have some scientific applications where newer versions run slower than previous versions on older hardware, but the situation completely changes if you compile both on period-correct hardware for the newer version.

ASM/Hand tuning is both a blessing and a curse.


It's crazy that someone would have a Turin setup at home now, as the TCO would cost more than a car, and you'd also have to deal with power supply and cooling.


Not a very good car. :)

Some of us get crazy with our homelabs. The love of technology is enough of a motivator.

It's cold in my office today. I'm considering that I could use more cores, crunch some ML stuff locally instead of in the cloud, and solve two problems at once. The thermostat for the heater is allllll the way downstairs after all :D


256 cores = much more fun than a car!


Well, I'll say it depends, even if the car is out of the picture.

256 cores can be 256 levels of misery depending on where and how you install it. :)


The article's subtitle ("They are still more expensive than AMD's competing EPYC, though.") seems to directly contradict the article. Perhaps I missed something in it, but a few times the article discusses how "Intel's Xeon 6 CPUs are now cheaper than AMD's latest EPYC 'Genoa' processors both in absolute numbers and in terms of per-core pricing" and "Intel's Xeon 6900P-series processors are now cheaper than AMD's EPYC 9600-series CPUs in per-core pricing."

Is the subtitle simply wrong? The only way I can make sense of it is to suppose it refers to the price if you actually attempt to acquire a Xeon as opposed to the MSRP (if that is even the right term in this space).


You're correct - my understanding is that AMD has now released Epyc Turin; Genoa is a 2022 product.


Ah, I see. Intel's price reduction leaves the Xeon 6 prices lower than the AMD Genoa (2022) prices, but higher than the AMD Turin (2024) prices. Strange that the article doesn't mention Turin at all.


Anyone here homelabbing with a Xeon CPU in their server? What kind of secondhand models are best worth it for homelab in terms of price vs performance and power consumption?


You should look at 2-socket mid-range models of any generation you'd like to use. Higher-end models are generally specialized CPUs for certain domains (HPC, banking, etc.), so they tend to run hotter.

If you're not going to go full-throttle all day, every day, having SMP is good. For a homelab environment, I think the 65W-95W band is ideal if you want to keep noise and heat output manageable.

While it brings some performance hit, enabling frequency scaling (SpeedStep in older parlance) is beneficial. Using a 2U server will also keep noise lower, since you won't be using counter-rotating 10K RPM 40-50mm fans.
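From the OS side, that's just the cpufreq governor; a minimal sketch, assuming Linux's sysfs cpufreq interface, root privileges, and that your driver exposes these governor names:

    # Minimal sketch: set every core's cpufreq governor (assumes Linux sysfs
    # cpufreq and root; available governor names depend on the driver/kernel).
    import glob

    GOVERNOR = "powersave"  # or "ondemand"/"schedutil", whichever the driver offers

    for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor"):
        with open(path, "w") as f:
            f.write(GOVERNOR)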

If you want especially quiet servers when they're not under load, look at Dell PowerEdge. They run like sleeping cats when idle / lightly loaded, but scream like an agitated parrot if you heat them up.

Lastly, a good way to maximize performance without too much power consumption is using an "optimal" RAM configuration which utilizes all channels with homogeneous RAM modules.
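To put rough numbers on the channel point (DDR4-3200 is picked purely as an example; swap in whatever your platform actually runs):

    # Back-of-envelope peak memory bandwidth per populated channel count.
    # Example figures: DDR4-3200 = 3200 MT/s x 8 bytes per transfer = 25.6 GB/s per channel.
    transfer_rate_mts = 3200
    per_channel_gbs = transfer_rate_mts * 8 / 1000

    for channels in (2, 6, 8):
        print(f"{channels} channels populated: ~{channels * per_channel_gbs:.1f} GB/s peak")
    # 2 -> ~51.2 GB/s, 6 -> ~153.6 GB/s, 8 -> ~204.8 GB/s: leaving channels
    # empty leaves a lot of bandwidth (and often performance) on the table.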


All two-socket systems have horrid idle power consumption, though.


I think newer systems are better since they support desktop-like deeper sleep states as well. Maybe you can optimize further with a high-hysteresis ondemand governor and by core-pinning long-running daemons to a single socket, keeping the other one asleep most of the time.
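For the pinning part, numactl or systemd's CPUAffinity= do this declaratively; as a quick illustrative sketch in Python (the PID and the core-to-socket mapping below are made up):

    # Sketch: pin a long-running daemon to socket 0's cores so socket 1 can idle.
    # The PID and the assumption that cores 0-15 live on socket 0 are placeholders.
    import os

    DAEMON_PID = 1234                  # hypothetical PID of the daemon
    SOCKET0_CORES = set(range(0, 16))  # check lscpu/NUMA topology for the real mapping

    os.sched_setaffinity(DAEMON_PID, SOCKET0_CORES)   # Linux-only
    print(os.sched_getaffinity(DAEMON_PID))           # verify the new affinity mask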

We do the exact opposite since we're going "full throttle all day, every day", so I can't give you numbers on that front.


I have a system with a Xeon 8173M (Skylake-generation, 28 cores + HT) with 256GB in a desktop ATX case. I bought the CPU and RAM second hand, and I would guess I spent €600 between them. I bought a new Supermicro X11SPL-F motherboard which cost about €500. Add in PSU, SSD, the case, etc. and it was under €1,500 for a considerable amount of computing power which doesn't make a lot of noise and has reasonable power consumption for my needs.


I got a bunch of HP DL360 Gen8 and an oooold cheesegrater Mac Pro 4.1. These things are a dime a dozen because it all is/was enterprise gear - your best bet is to check out IT resellers, they live and breathe on quickly passing through inventory.


I recently got a PowerEdge R730XD with 24 drive bays, two Xeon E5s (a mid-range model, I can't recall which now) and 256GB of RDIMM memory installed for less than a new, passively cooled N100.

I need to get disks, though. That was not part of the deal.

However I won't be installing that thing at home. I'm not that mad, yet.


> However I won't be installing that thing at home. I'm not that mad, yet.

I put my farm under the roof, in a properly enclosed rack. No noise and I'll also get some fire alarms to cut the power feed to the rack in case something goes bonkers up there.

(Any recommendations on CO2 extinguisher systems welcome)


It's great that you have space for that. I don't have that much space to do that. Another reason I don't yearn for a homelab is that I play with a lot of hardware already.

I have no idea on CO2 extinguishers, sorry.


> I won't be installing that thing at home. I'm not that mad, yet.

What do you do instead? Colocation rental in a DC?


That one will live in a university's data center, for a joint project I'll be doing with my professor.

It'll be a project-centric server. Not a general purpose one. For smaller tasks, N100 and RaspberryPi/OrangePi 5 series are more than sufficient.


I've been running a Xeon-D server/NAS for several years. Nice low power consumption. Was pretty cheap brand new integrated into a Supermicro server board.


I have a pair of now fairly old Xeon D-1541 which are not the fire-breathing xeons people imagine, but they're robust, sip watts, and have some really full feature sets that have kept them around in my rack for a long long time.

I got them as they were the cheapest path to 128GB support (in ~2014) and they're still useful today. I run one NAS/VM/home automation box and one Jupyter box with an nvidia GPU in it. They never complain. Neither does my power bill.


I have an Apple Mac Pro Late 2013 (aka trashcan) with an Intel Xeon 8 core cpu.


Could be a $5,340 discount across the board ;-)

One can dream...


That's crazy. Why doesn't this happen in the GPU space? Is there some silent agreement between AMD and Nvidia not to compete on price and just push each other's prices up?


Supply and demand. Intel isn't happy with how many of these CPUs it's been able to sell, while Nvidia cannot produce enough GPUs to meet demand.


I was talking about AMD's GPUs. These aren't as performant or in as high demand as Nvidia's. Not by a long shot.

Nvidia has 9x(!) the discrete GPU market share of AMD. Nine times! AMD cutting prices would help them increase their share.


Streaming multiprocessors. That's the reason. Nvidia has an extremely complex SM architecture that CUDA is based on, while AMD/Apple focus on simpler ones because they thought it was more power efficient (oops, it's less power efficient: https://browser.geekbench.com/opencl-benchmarks)

This is such a surface-level factor in the discussion that I don't even know why you'd be offering your advice on Nvidia if you can't tell why people own their hardware in the first place. We've been having this GPGPU discussion for like 12 years at this point... people insisting on their opinions without understanding what Nvidia does is how we got in this mess in the first place.


You're talking about a market where the ASP for AMD is like 450 bucks and NV has 10x the market share at a 50-60% higher ASP. Then you look at Intel Arc, which has an ASP somewhere in the 200-buck range. I don't think these market shares are primarily dictated by economics and the commodified properties of GPUs (compute power etc.), but rather by different things, like software.

In any case, AMD is clearly focusing on data center applications and is starting to make some inroads there.


AMD would rather use their fab capacity to sell $20,000 MI300Xs than $450 GPUs.


Exactly.


AMD cutting prices won't affect the biggest thing: CUDA is the software basis of machine learning, and it creates an effective hardware-manufacturer monopoly. You have to buy Nvidia to get a good CUDA experience.


Intel is selling their dedicated Arc GPUs on razor-thin margins to try to compete. I have one friend who got one for a Plex server, which lets him support multiple transcoding streams with ease. Another friend just replaced their broken RTX 2080 with one since it was cheap and they don't feel a need for cutting-edge graphics.


If both are selling ~all the GPUs they can manufacture, at high prices, then no incriminating communication (let alone an agreement) is needed for both to keep raking in the profits.


Maybe it's related to all the people buying GPUs for AI?


AMD stopped competing with Nvidia in pricing a decade ago.

Even when they had a better product (more than a decade ago), Nvidia was still outselling AMD - think of the Nvidia Fermi and Tesla generations - and at higher margins.

Since then it's been a case of "what's the point?", and AMD has stopped bothering.


Entirely possible given that the CEO of AMD is the niece of the Nvidia CEO


>niece

They are first cousins (once removed).


Removed from what?


> Removed means a different generation. When cousins are in different generations than each other, we say they're removed. "Removed" is like “grand” and “great,” but with cousins.

https://support.ancestry.co.uk/s/article/Understanding-Kinsh...


It means the child of a first cousin (or, in the other direction, a parent's first cousin). I'm a native speaker of English and I find the phrase confusing also.


Wait, what? Over here a 24GB RX 7900 XTX costs 900€ while the RTX 4090 starts at 2500€.



