A cursory look gives you a ~$3500 price tag for a gaming PC with a 3090 [1], vs. at least $4k for a Mac Studio with an M1 Ultra. Roughly the same ballpark, but I wouldn't call the M1 Ultra more affordable given those numbers.
> A cursory look gives you a ~$3500 price tag for a gaming PC with a 3090
That $3500 is for a DIY build. So, sure, you can always save on labor and hassle, but prebuilt 3090 rigs commonly cost over $4k. And if you don't want to buy from Amazon because of their notorious history of mixing components from different suppliers and reselling used returns, oof, good luck even getting one.
Not to mention if you build your own PC you can upgrade the parts as and when, unlike with the new Mac where you'll eventually just be replacing the whole thing.
I believed that until I realized I couldn't individually upgrade my CPU or RAM, because I have a mobo with an LGA1150 socket that only supports DDR3 (and it's only 6 years old).
So eventually you still have to "replace everything" to upgrade a PC.
You were unlucky to buy DDR3 near its end of life then (like someone buying DDR4 now), but you could still upgrade stuff like your GPU or drives independently. My first SSD (a 240GB Samsung 840) is still in service after 9 years, with its SMART metrics indicating only 50% of its expected lifetime cycles have been used, for example.
You could also put a 4790K, 16GB of DDR3, and a modern GPU in that system to get a perfectly functional gaming rig that will do most titles at 1080p high. Though admittedly we've passed the point where that's financially sensible vs. upgrading to a 12400 or something, as both Devil's Canyon CPUs and DDR3 are climbing back up in price as supplies diminish.
> I believed that until I realized I couldn't individually upgrade my CPU or RAM, because I have a mobo with an LGA1150 socket that only supports DDR3 (and it's only 6 years old).
DDR4 was released in 2014, which would suggest you purchased your mobo two full years after DDR3 was already deemed legacy technology and being phased out.
Also LGA1150 was succeeded by LGA1151 in 2015, which means you bought your mobo one full year after it was already legacy hardware.
Yes, they entered the market around those years, but what does that change? DDR3 and LGA1150 were not deemed "legacy" the day DDR4 and LGA1151 motherboards entered the market. They were 2-3x the price, and DDR3 dominated RAM sales until at least 2017. In fact, the reason DDR4 took so long to take over the market was incompatibility with existing hardware and the higher cost to upgrade. [1] I didn't go out of my way to buy "legacy hardware", because it wasn't legacy at the time.
Point being, PC building makes it easier to replace and repair individual components, but in time, upgrading to a newer generation means spending over 50% of the original cost on a motherboard, CPU, PSU, and RAM. Not too different from dropping $3K on a new Mac.
> Yes, they entered the market around those years, but what does that change?
It means the hardware was purchased after it started to be discontinued.
It's hardly a reasonable take, and makes little sense, to complain that you can't upgrade hardware that was already being discontinued before you bought it.
> DDR3 and LGA1150 were not deemed "legacy" the day DDR4 and LGA1151 motherboards entered the market.
I googled for LGA1150 before I posted the message, and one of the first search results is a Linus Tech Tips forum post dating back to 2015 asking whether LGA1150 was already dead.
I think you are forgetting the context of my replies. I'm not saying it's unreasonable to have to upgrade discontinued hardware, even if you have to do it all at once. My take is that it's not too different from having to replace a Mac when the new generation comes in (which is usually every ~5 years for Apple, not too far from my own system's lifetime). Being able to upgrade individual parts through generations is a pipe dream.
Also, we must have a different interpretation of "discontinued", because DDR3 and LGA1150 were still being produced and sold, and still dominated sales, for a long time after I bought that system. At the time (and for the next 1-2 years), consumer DDR4 was a luxury component that almost no existing hardware supported.
You can still buy DDR3 new for not that much? 16GB is about $50 from numerous brands on Amazon at the moment. I bought some for an old laptop a couple months ago.
To do CPU upgrades you eventually have to replace the motherboard, but you can keep using your GPU/storage/other parts. Sometimes that also means a RAM upgrade, but it's still better than the literal nothing of modern Macs.
Since the context here is using these machines for work, a mid-level engineer will easily cost an extra $1000* in his own time to put that together :)
EDIT: I'm quite confident this is not at all an exaggeration, unless you have put together PCs for a living. $100/h (total employment cost, not just salary), 1-2 hours of actual build & setup, and 8 more hours of speccing out parts, buying, taking delivery, installing stuff, and messing around with Windows/Linux. (I've probably spent 40+ hours in the past couple of years just fixing stuff on my Windows gaming PC. At least one of those was spent looking for a wired keyboard so I could boot it up the first time; I ended up having a friend drive over with his :D)
To be fair here, there is more to it than just assembly.
You have to spec out the parts, ensuring compatibility. Manage multiple orders and deliveries. Assemble it. Install drivers/configuration specific packages.
All of these things are easier today than ten or twenty years ago, but assign it to a random mid-level engineer and I'd set my project-management gamble at half a day for the busiest, most focused engineers least likely to take the time to fuss over specs, or one day for the majority.
Of course, to get to $1000 for that they'd still have to be on $230k to $460k.
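For what it's worth, those salary figures seem to come out of assuming roughly 230 working days a year; a quick back-of-the-envelope sketch (the half-day/full-day split is from the estimate above, the 230-day year is my assumption):

    # What yearly cost makes the lost build time add up to $1000?
    # Assumption: ~230 working days per year.
    def required_yearly_cost(build_cost_usd, days_spent, working_days_per_year=230):
        cost_per_day = build_cost_usd / days_spent
        return cost_per_day * working_days_per_year

    print(required_yearly_cost(1000, days_spent=1.0))  # 230000.0 -> a full day at ~$230k
    print(required_yearly_cost(1000, days_spent=0.5))  # 460000.0 -> a half day at ~$460k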
Given that the last time I put together a PC computer was 2006, it'd probably take me DAYS to spec out a machine because of all the rabbit holes I'd be exploring, esp with all the advances in computer tech.
PC Part Picker will do the heavy lifting for you. There are also management tools that will let you install software bundles easily, so there's no real extra time investment.
Just knowing about services like PC Part Picker and the management tools you mention requires time and expertise that people generally do not have before they build a computer, so "no real extra time investment" may only be true for someone who can amortize those upfront costs across many builds.
In my case I have built a couple of PCs before, but it was so long ago that I'd have to re-learn which retailers are trustworthy, what the new connection standards are these days, etc. It's just not worth it to me to spend a dozen hours learning, speccing, ordering, assembling, installing, configuring, etc. to save a few hundred bucks.
A senior engineer in the Bay can easily pull down $400k/year in total comp, which is $200/hour. The rule of thumb I've always heard is that a fully-loaded engineer costs roughly 2x their comp in taxes/insurance/facilities/etc.
When someone costs the company north of $3k/day, it's cheaper all round to just plonk a brand new $6k MacBook Pro on their desk if they have a hardware issue.
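Rough math behind that, assuming ~2,000 paid hours and ~250 working days a year (my assumptions, not anyone's official HR numbers):

    # Fully-loaded cost sketch for a $400k/year engineer.
    # Assumptions: 2,000 work hours/year, 250 working days/year, 2x loading factor.
    total_comp = 400_000
    hourly = total_comp / 2_000           # 200.0 -> $200/hour
    loaded_daily = total_comp * 2 / 250   # 3200.0 -> "north of $3k/day"
    print(hourly, loaded_daily)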
The 3090 claims are overstated. There are multiple competitors in that space, and all of them need the TDP.
Performance per watt? I could see that being disrupted, but an iGPU in 2022 will be orders of magnitude less powerful than a dGPU, if wattage is ignored.
They are still a year+ ahead of the 3090 on process node. The Max was about equivalent to a 2080, so 2x Max does line up with a 3090. A big difference is no ray-tracing hardware, which takes up a lot of die space. On the same process node and with no ray-tracing hardware, Nvidia would come in at far less die space (the 3090 is 628.4 mm², the M1 Ultra is 850 mm²).
If Nvidia were on the same node and increased die area to match the M1 (ignoring the CPU portion of the die), they could then run at a lower clock with more compute units and probably close most of the TDP gap.
An iGPU isn't necessarily slower if the system RAM is fast, and the M1 family was one of the first consumer CPU lines to move to LPDDR5. The 3090 has 936.2 GB/s with GDDR6X; the M1 Ultra, with LPDDR5 memory controllers on both dies, gets 800 GB/s.
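Those bandwidth numbers fall straight out of bus width times per-pin data rate. A quick sketch, assuming the commonly cited 384-bit GDDR6X bus at 19.5 Gbps/pin for the 3090 and a 1024-bit LPDDR5-6400 bus for the M1 Ultra (treat those widths/rates as my assumptions):

    # Peak bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8
    def peak_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
        return bus_width_bits * gbps_per_pin / 8

    print(peak_bandwidth_gb_s(384, 19.5))  # 936.0 -> the 3090's ~936 GB/s
    print(peak_bandwidth_gb_s(1024, 6.4))  # 819.2 -> in line with Apple's quoted 800 GB/s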
Having had my M1 MB Pro from work freeze and stutter, I'm just not buying it. Your theory is great, but I never once expected this BS in practice.
For the record: I was the first M1 recipient (a temporary 16GB MB, due to stock issues). I needed an Intel MBP because Rosetta ain't all that. I opted for, and was upgraded to, the 32GB M1 MBP. I chose the M1 over Intel because it was unbelievably faster for the form factor. My original comment does not concern laptops; my PC is orders of magnitude more powerful.
TDP is physics. You all might perceive Apple as perfect, infallible, and oh so lovely, but physics is physics.
I use AMD, not NVIDIA. And "what if" is irrelevant. It's like intentionally neutering Zen 2 by comparing it against Intel single-core performance (as was done all the time). The reality is absolute, not relative: what matters to the user is effective performance, not performance per watt of TDP. And my network/GPU/audio drops out on both my 16GB M1 MB and 32GB M1 MBP under load.
Seriously not buying that "Apple can do nothing wrong" bias.
Take those #FF6600-colored glasses off. The M1 has an unbeatable value proposition in a pretty wide market, but Apple couldn't be further from making a universally good machine.
I'm not being snarky, but I don't believe Mac people would know how to build a PC, given their history of non-modifiable hardware and no way to repair it.
And the cheapest Mac Studio with the M1 Ultra is A$6000, so yes...
20-Core CPU
48-Core GPU
32-Core Neural Engine
64GB unified memory
1TB SSD storage
Front: Two Thunderbolt 4 ports, one SDXC card slot
Back: Four Thunderbolt 4 ports, two USB-A ports, one HDMI port, one 10Gb Ethernet port, one 3.5-mm headphone jack
Here in the UK it's not a big deal. Subscribe to Discord alerts for FE series drops; in the last 3090 FE drop in February the cards were in stock for a full day, at RRP (£1399). I got a 3080 drop at RRP this way too (£649).
But even ignoring the FE series, the prices have already crashed massively, you can get a 3080 AIB for less than £1000, and 3090s frequently appear around £1500-1600.
I can right now (ok, in the morning actually) walk into the computer store across the road here and buy a 3090 off the shelf for €2299-2499 (different makes and models). Those are in stock and physically on the shelf. Same for lesser cards of the same series, or AMD RX 6000.
I'm seeing 3080s, in stock in stores I might consider buying from, sub-1800 AUD. It is heading back towards RRP (still about 50% over I guess). 3090s are twice that, yep.
You also need to compare the right CPU. The M1 Ultra's CPU is the equivalent of the fastest Threadripper, which costs $3990.
So a PC with similar performance would be $7500.
Not the top-of-the-line Threadripper (which goes up to 64c/128t), but probably similar to the 5950X (16c/32t), which costs around $1000.
But you're comparing apples to oranges, because the real advantage of the M1 chips is the unified memory: almost no CPU-GPU communication overhead, and the GPU can use ginormous amounts of memory.
I can't put a 3090 into a 3.5-liter case. Even a 10-liter case is really pushing it. And that's before mentioning power savings: the 3090's real-world power draw when in use is something like 4x as high.
Unless you're running a farm of these, the power cost difference is going to be largely unnoticeable. Even in a country with very expensive power, you're talking a ~$0.10/hour premium to run a 3090 at full bore. And that's assuming the M1 Ultra manages to achieve the same performance as the 3090, which is going to be extremely workload-dependent going off the existing M1 GPU results.
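For a sense of scale (my assumptions: the 3090 draws on the order of 350 W more at full load than the M1 Ultra's GPU, and power costs $0.30/kWh, which is already on the expensive end):

    # Hourly power-cost premium = extra draw (kW) * electricity price ($/kWh).
    extra_draw_watts = 350      # assumed extra full-load draw of the 3090
    price_per_kwh = 0.30        # assumed "expensive country" electricity price
    print(extra_draw_watts / 1000 * price_per_kwh)  # 0.105 -> roughly $0.10/hour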
[1] https://techguided.com/best-rtx-3090-gaming-pc/