I would love to be able to socket one of these suckers in a Linux desktop for a reasonable price. I'm loving seeing competition in this field, but I don't like using MacOS and don't want to pay the Apple hardware tax.
I run desktop linux at home, and agree with the concerns/frustrations about running MacOS but I've never understood the complaints about Apple's pricing.
The products do come at a bit of a premium, but, in my experience, it's well earned and is a premium experience as well. There have been occasional periods of sub-par hardware, but generally (and especially current generation) Apple's hardware is extremely well designed and durable.
On top of that, I don't get the software engineer aversion to paying for quality tools. How many hours a year do you spend on a computer? How many hours over the lifetime of the computer will you spend? Even if devs weren't generally in top income brackets it would still be easily justifiable to spend several thousand dollars on tools that you were happy with.
Again, I completely understand not wanting to go with Apple in order to have much finer grained control over your system. But if Apple is selling what you're buying, then the pricing is more than fair imho.
With my cheesegrater Mac, Apple wanted $3,000 for 160GB of memory (to go from 32 to 192). OWC sells the same memory (same spec, same manufacturer) now for $620, though I believe I paid about $1,000.
They also wanted another $3,000 for an 8TB SSD. Similarly, I was able to buy a 4xM.2 PCI card, and populate it with 4 x 2TB SSDs for under $1,500. Furthermore my PCI setup is faster than the Apple system drive (7GBps versus 5.5).
Overcharging on those kinds of bumps is standard industry practice, though. I just checked Lenovo, and they want $224 to bump an X1 from 16GB to 32. And that’s a sale price, the normal price is $340 (which seems largely theoretical since thinkpads are eternally on some kind of sale, but that’s neither here nor there).
Apple-to-apple (heh) comparison is tricky in this case since I don’t see LPDDR5 laptop chips for sale on their own, and this is a soldered-on-laptop chip so DIY after buying isn’t feasible, but an entire 32GB DDR5 desktop set can be had for about $100, so even if the low-power laptop form-factor commanded a premium, that’s still pretty egregious.
Dell is similar: for a desktop they want $200 to bump from 16GB to 32GB DDR5.
Apple might be a bit more daring with those prices than some PC manufacturers, but it’s hardly a behavior unique to them.
The difference is that you can buy the upgraded parts from a third party. This has always been difficult with Apple. RAM was consistently soldered in, but at least it was possible for an advanced user to upgrade it. NVMe drives had needless proprietary connections. I upgraded my old Intel Air: with a $20 adapter, an EVO NVMe drive was larger and faster than anything Apple would offer, at a cheaper price.
The pricing may be standard (a bit higher if we're honest), but the bigger issue is how far Apple goes to make you pay that premium. It borders on malicious. It's also not great for waste, considering upgrading is near impossible.
> The difference is that you can buy the upgraded parts from a third party. This has always been difficult with Apple.
Not "always". This is an artifact of the MacBook Air modern "appliance" era of Apple (and laptops / computing in general) where thin and light bodied laptops and compactness are the highest priorities, prized above self-service, upgradability, etc.
PowerBooks, by contrast, were amazingly and easily upgradable. You didn't even have to open up the case to do a memory upgrade — the keyboard just popped right up.
Two decades ago, the Framework laptop would have never needed to exist. And for the most part, likewise with Hackintoshes.
Sure, that's a story from 1983, but it is in direct conflict with the facts. I have owned several PowerBooks, MacBooks, and MacBook Pros, and up until fairly recently they were all upgradeable with respect to RAM and drives.
I don't think the linked discussion about the earliest Macs in the early '80s not having extra peripheral slots is saying the same thing as consumers being able to upgrade memory and storage.
To the best of my knowledge / memory / experience, all of the Mac models introduced prior to the Air in 2008 had upgradable memory and storage.
Was it really bullshit? I dare say that if you look at home computing platforms available in the early 1980s, most had no first-party support for third-party memory upgrades.
Laptops, at least, increasingly have all-soldered RAM across many brands. SSDs are mostly still M.2, I think, so at least those are upgradable. Mac Pros were upgradeable until this most recent iteration, which switched to their ARM chips. The other companies also use lower-spec components than you could buy for less yourself, same as Apple.
If anything, the rest of the industry has been moving in Apple’s direction with these kinds of limitations. They’re not more evil than their competition, they’re just ahead of the curve.
I agree Apple takes it further with the gouging than others. We should absolutely complain about it and criticize them for gouging! My criticism of the “Apple Tax” complaint is that 1. Nobody ever complains nearly as vociferously about other manufacturers doing similar stuff, and 2. The particular way people complain about Apple’s behavior is always phrased to kind of imply it’s unique and not just a more egregious form of a common behavior.
> the rest of the industry has been moving in Apple’s direction with these kinds of limitations
That's the "evil". Your next sentence doesn't justify the former. Apple is the one pushing the bounds. Others see Apple can get away with it, so they follow. There's good reason to believe that if Apple didn't push this then we wouldn't be in this state (right to repair is another piece of evidence related to Apple too).
I don’t think soldering is an “evil”. Sockets suck in a hundred different ways, and we only tolerate them for the stuff we need to upgrade or replace. The demand for upgrading / replacing CPUs and memory has been dropping, steadily. I think that socketable CPUs have always been a dodgy proposition, even at the best of times—when I want to replace the CPU, it’s probably got a new socket, or I’m upgrading from DDR3 to DDR4 or something like that. Upgrading RAM is nice but having tons of memory bandwidth is much, much nicer.
Sockets mean that you don't have to guess inventory levels so carefully. Apple presumably has to build boards that will be in low demand, like 92GB RAM with 4TB NVMe, separately from the 8TB and 1TB variants.
I find it genuinely astonishing that the contemporary reason for soldering memory onto the motherboard hasn't been raised already. Yes, the reason Apple moved to soldered memory for their MacBook Air line is likely some combination of physical packaging and fleet reliability. But with the latest CPUs, those reasons are now secondary.
The primary reason is now performance. Don't believe me? Look at how RAM is arranged on a GPU or in a game console. The RAM chips are located as close to the CPU/GPU as physically possible, limited only by thermal constraints.
You will never see DIMM slots on an NVIDIA card because the compromises would be untenable. Longer traces mean lower performance, while sockets would make the cards thicker, with poorer thermals. The reasons why NVIDIA cards are the way they are correlate with why Apple's latest computers are the way they are.
What Apple did with packaging RAM so close to the M1/M2 chip is a big part of the Apple Silicon performance story.
> Yes, the reason Apple moved to soldered memory for their MacBook Air line is likely some combination of physical packaging and fleet reliability.
I love how you've pushed your opinion as fact here, spinning the positives and conveniently omitting the clear anti-consumer advantage that soldering RAM achieved: non-serviceability meant shorter lifetimes, i.e. more replacement sales (and hence more e-waste, but who cares about externalized costs, right?) and more money minted during the original sale (since people couldn't buy and upgrade with cheaper components, they were forced to pay Apple prices). At the time it was implemented, there was no real technical advantage.
Performance benefits have now come along as a bonus, a happy side-effect. But the main mover has always been (and always will be, with Apple) pure and distilled greed.
At the time Apple did this, they had already been soldering RAM onto the motherboards of iPhones because phones needed the smaller envelope. At the time, it was probably more about saving money on manufacturing by not having extra, moving parts, and making it simpler to automate the fabrication. Also, even back then, this would have an impact on thermals and size, which could easily be the only justification needed.
>Look at how RAM is arranged on a GPU or in a game console.
Are those RAM chips on a CPU carrier in a PoP configuration, or are they fanned out on the PCB, allowing a technician to replace individual components?
Part of the reason for Apple's move to RAM on the SoC carrier was killing the third-party market. Now it's impossible to repair/upgrade the RAM. Similar with the T1/T2 proprietary PCIe NAND in a RAID configuration.
Not just manufacturers: third parties are able to repair normal computers, even with soldered RAM. PoP RAM is so difficult that nobody does that, and Apple's NAND is a special, unobtainable, proprietary kind with a PCIe bus and locked firmware.
My critique is not “this behavior is acceptable”, and I certainly didn’t describe it as justified. My critique is “the industry is full of bad actors but everyone mostly yells about one particular bad actor”, which I think is unproductive.
I understand that, and for the most part I actually agree. I think in those situations it is less important to point fingers at specific actors than at the structure and system. Because you're right that it isn't about individuals in those cases, but systemic problems.
But there are two things I need to say:
1) In a system of bad actors, it is still useful to point fingers at and discuss the largest actor (or a small subset) doing this. Because
2) Certain actors are driving forces which others follow. This is Apple. They even brag about having this position. They even have the credentials to do so! They earned that right. They took away the headphone jack. They got us all on earbuds. They got us all on smartphones in the first place. They made podcasts a thing. They aren't the cause of everything, but they are one of the drivers, and definitely the most powerful one. It would be naive to be under the impression that Apple doesn't have significant influence on the entire market. They've done things that resulted in significant backlash and lots of consumer complaints, and those things have now become the status quo (e.g. the headphone jack). They were the only major company to solder RAM for decades, and people complained about it for just as long. They have weight, they know it, and they know how to use it. That weight allows them to make markets that don't even exist.
Because of this, I will point fingers. My complaint with your comment is that you characterize Apple as just another player being pulled along by forces outside their control. But let's be real, Apple was the first $1T company (late 2018), the first $2T company (late 2020), AND the first $3T company (Jan 2022). No one else even comes close. Microsoft is #2 market cap, but has a 12% gap, then Google with a 43% gap. There are only 4 trillion dollar companies (Amazon being the next) and realistically only one builds laptops (MSFT isn't as hardware focused). Apple sets the tone. Yes, the system sucks and is problematic, but that doesn't mean we should ignore the one dictating that system.
People complain about the largest and biggest company because it usually defines how the industry follows. I think more worrying is you and other people fanboying/defending with whataboutism instead of just accepting that Apple is doing bad shit, and so are others; but this thread is about an Apple product, so of course we will complain about Apple here. Do you know of any other company or brand whose new product is talked about this much on Hacker News? That in itself shows Apple's clout.
Soldered-in memory is a key part of why Apple hardware is lower power, smaller, and higher memory bandwidth than its x86-64 counterparts. Sure, 2 DIMMs on a Mac mini or MBA wouldn't hugely change the form factor. However, 4 DIMMs, 8 DIMMs, or 16 DIMMs would. Unlike the AMD and Intel product lines, Apple does actually provide 2x, 4x, and 8x the memory bandwidth with basically no compromise on power usage, physical size, or noise.
Sure, AMD Epyc, Intel Xeon, Threadripper Pro and similar do improve memory bandwidth, but at a substantial penalty in size and power.
Chinese hackers are constantly chipping away at Apple proprietary BS, and at least when it comes to storage we have many options now https://www.youtube.com/watch?v=yR7m4aUxHcM "replace EVERY DEAD SSD for M1 Max, M1 Pro, M1 & T2 Mac, T1 Mac, BONUS:M1 Ultra (FOR DUDES IN DENIAL)"
Does the unified memory model affect how you see this at all? Those 160GB of ram would be available to your GPU, which would not be true on a PC. For some people, that might make a big difference.
It means that, but tbh that's more and better than 128GB of RAM and 64GB of VRAM (good luck going past 24GB without breaking the bank).
Today AM5 boards max out at 128GB, so to beat that it seems that you must go and shop at places where you must request a quote to help them know how much you can be charged.
> The products do come at a bit of a premium, but, in my experience, it's well earned and is a premium experience as well.
As a long time Linux and macOS user, I don't agree. Even though macOS is more polished than your average Linux desktop environment, it's really very hard to ignore the unjustified markup. Nowadays I can buy a miniPC with a Ryzen 5 and 32GB of RAM for around 400€, but the cheapest Mac mini nowadays sells for over 700€ and comes with 8GB of RAM and an absurd 256GB SSD. Moreover, a Mac mini tops out at 24GB of RAM, and for that you need to pay an additional 460€ for your weak 256GB HD box.
> I think to compare prices you need something with adequate GPU performance (which the M2 has).
I feel your comment reads too much like fanboy excuses. The CPU is not the only or even main requirement. I personally want to max out on RAM and HD. I can buy a mini PC with 32GB of RAM and a 500GB NVMe for 400€. With a Mac, I need to spend almost twice that to only get 25% of the RAM and 50% of that HD space.
This was the norm since Apple shipped Intel core i5 Mac minis.
There is no way around this. Apple price gouges their kit. It's irrelevant how you feel they fare in artificial benchmarks.
The M2 Mac mini's RAM is integrated into the SoC package, which has some advantages (good memory bandwidth, no copying between CPU and GPU RAM) and disadvantages (expensive, non-upgradable DRAM tiers.) Internal flash storage is basically non-upgradable as well (though you can easily plug in external thunderbolt m.2 storage.)
It also doesn't currently run Windows natively, nor does it support eGPUs.
I'm not sure any Mac mini model was ever much of a competitor to cheaper PCs, but mini PCs have gotten a lot better over time, probably inspired somewhat by the Mac mini, while the mini has followed in the footsteps of other Mac models by adopting Apple Silicon and unified memory.
The mini is a perfectly decent Apple Silicon Mac, and compares favorably with the older intel Mac minis in terms of performance, but I'd spring for 16GB of RAM (at least) for my use cases.
I don't see the point of your comment. It matters little that you underline design differences if, in the end, you can get a cheaper mini PC that's upgradeable and ships with more memory, while you can't do anything about your Mac mini other than scrap it and buy a more expensive model.
> The mini is a perfectly decent Apple Silicon Mac
That's all fine and dandy if you artificially limit comparisons to Apple's product line.
Once you step out of that artificial constraint, you get a wealth of mini PCs which have a smaller form factor, are cheaper, have more RAM and HD, are upgradeable and maintainable, and in some cases have more computational power as a whole.
Both of these companies are huge in the mini PC space. Most people have never heard of them because all they do is mini PCs.
Minisforum's latest 7940HS line is better than the M2. More powerful, fairly close on power efficiency, better GPU, cheaper, and without all the nonsense that comes with buying Macs. Their fully juiced model is $800 (and doesn't lock you into a model that milks cash from you like a sow).
It isn't. As the Reddit link explains, the only benchmarks it wins in are synthetics.
You can play Red Dead Redemption 2 at 1080p at over 60fps. You can produce all the synthetic benchmarks in the world, but this is as powerful as a console. This is the most powerful iGPU out there; it is about as powerful as a 1060.
There is also the fact that the AMD CPU is newer, has a 50% higher TDP, and is built on a smaller node, but is far from providing 50% better performance than the M2 [0]
On the other hand, I don't own a Mini, nor I'm on the market for a mini PC, but I don't feel like dropping $800 for a prebuilt PC unless they provide stellar support and warranty, and chinese OEMs aren't really known for that.
I'm not sure what Beelink is supposed to be mistaken for. I only know them for their micro-PCs and I'm not familiar with another brandname that it is supposed to remind me of.
I hadn't heard of Minisforum though. But the same goes there - not sure what it is supposed to be mistaken for.
I just priced the Ryzen Lenovo ultra small form factor, which is smaller than a Mac Mini and only slightly larger than a Playstation 2 slim, and it was £500 rather than £400 but other than that those numbers didn't seem far off the mark.
The Mini comes with an internal PSU, though. All these mini PCs come with external PSUs, some hilariously large at over half the size of the PC itself.
This is true about it being an external PSU, but it is nowhere near 1/2 the size of the computer.
I have one because I was able to get a passively cooled i5 second-hand for only £100, which is great for a Plex server. The PSU is more like 1/8th, maybe smaller.
I happen to own a Lenovo Thinkcentre. The PSU is perhaps 1/3 the volume of the computer itself, which I think is crazy for a computer with a mobile chip inside.
I know that some Intel NUCs have monstrous PSUs [0], which I think should constitute false advertising regarding the actual size of the computers.
Fair enough, if I had got that Intel one I would probably have a similar opinion. The power brick I have is about the size of the small Lenovo travel power supplies, maybe about 25% of that Intel one, at a guess without seeing it in person.
It's definitely smaller than 1/3 the volume of the unit, unless I have an incredibly small form factor rather than an ultra small form factor unit? But I don't think so. It is very small.
>Even though macOS is more polished than your average Linux desktop environment, it's really very hard to ignore the unjustified markup.
It's justified if people pay it.
>Nowadays I can buy a miniPC with a Ryzen5 and 32GB of RAM for around 400€, but the cheapest Mac mini nowadays sells for over 700€ and comes with 8GB of RAM and an absurd 256GB SSD.
So what's the problem?
If you like running Linux on your Ryzen5 miniPC with 32GB of RAM for 400€, you're more than free to do that. Apple's not stopping you.
I'll gladly pay Apple processor prices for an Apple processor, but I'm not going to pay Apple PC prices for an Apple PC when all I really want is the CPU.
I guess Apple doesn't want to be Intel. It is like the old Kellogg's adverts: "we don't make cereal for anyone else", alluding to those brands that sell the same cereal at a marked-up price and also resell it to a supermarket brand to sell at a lower price in a plain box.
Except if you want 32gb ram and 2tb disk space, they charge well in excess of what anyone else (for the most part) is able to charge for similar upgrades.
> The original point of this thread is that they'd like to be able to get an M2 PC like that.
Not really. The point is that Apple's products are overpriced, and in particular the Apple Mini underperforms and simply isn't competitive when compared to today's alternatives. I repeat, a mini PC with 8GB of RAM and a 256GB HD on the market for over 700€ simply doesn't compete with miniPCs with Ryzen or Intel i7 or even i5 that ship with 500GB HDs and at least 16GB of RAM which can cost 200 or 300€ less.
Neither of them really have HDs, rather SSDs, but more importantly the memory numbers are not directly comparable because macOS has a memory compressor. Which matters a very large amount for many workloads.
The markup is because I can go to apple.com, pick the RAM and storage I want, and not really have to figure out anything about flavors or distros, knowing that it comes with the best chip on the market. This is leaving out that it just works with my phone and tablet.
The markup is because Apple can command it based on a long history of quality and building a tech luxury brand. It's solely about keeping margins high and the brand status high.
Even if they could make the same money by lowering prices (and increasing volume), it'd be a terrible idea based on how consumer behavior and status seeking actually works. If the quality were the same and the product were far cheaper, consumers wouldn't want it as much. The absolute worst place to be in any market tends to be the middle, you go to the middle to die. High margins provide a margin for error in business, it's invaluable.
It's the combination of OS and price. When Apple is very often double or more the price for competitive products, charges exorbitant prices for modest upgrades (currently a $200 difference between 8 and 16GB of RAM, and over $400 for a 1TB SSD when a good 1TB M.2 goes for about $55), the OS is annoying and restrictive, and my choices for replacement OSes are all very experimental compared to non-Apple hardware, why would I spend significantly more for a significantly worse experience?
And I'm happy to pay for quality tools, but I'm not spending $20 for copy-paste (edit: hyperbole. Every time I find some missing simple functionality that is available for free and easy on Linux, the Apple equivalent is available on the app store for a significant charge. Some games are simply more expensive on iOS than elsewhere because the developers know Apple users will pay more). Apple is still largely in the "luxury designer product with a consumer base of 10% historical savvy users who prefer it and 90% rich kids who like the logo and don't mind flushing money down the drain for a brand".
> When Apple is very often double or more the price for competitive products
This is a complete lie. I've used PCs for 30 years and I'm not an Apple fanboy, but when you actually look at the market, there are only a very few products that can compete with the MacBooks to begin with, and they're always in the same price range or more expensive.
> charges exorbitant prices for modest upgrades (currently $200 difference between 8 and 16 GB RAM.
This is true, but it's also true in the PC world. And PC laptops increasingly come with soldered RAM, so you don't even have the advantage of upgrading it yourself anymore.
And looking at the Framework laptop shop right now, I see a 16GB DDR4 DIMM for $60, and you can throw in a standard M.2 storage module. Maybe Framework is just singularly better, but that's still a reason to not buy Apple for me.
This is just being pedantic. "Mac vs PC" is just a widely known way to distinguish between Macs and non-Macs.
> And looking at the Framework laptop shop right now, I see a 16GB DDR4 DIMM for $60, and you can throw in a standard M.2 storage module. Maybe Framework is just singularly better, but that's still a reason to not buy Apple for me.
OK, now instead of just looking at the number of GB of RAM you get per price, show me the quality and efficiency of each. I'm no hardware nerd/genius, but from what I understand, Apple Silicon was outperforming even what Apple CLAIMED in their keynotes, and that caused everyone to scramble because they couldn't sit on their thumbs anymore. (Cough Intel)
So yeah, you may get same numbers of RAM, but that RAM isn't equal.
Apple marketing made "Mac vs PC" to try to differentiate themselves, and I refuse to adopt their marketing terms as if they are facts. Most people who work with computers will call a Mac a Mac, but won't use the "PC" category as if it doesn't include Apple PCs.
Apple Silicon is really good for what it does. Apple does not create its RAM and the RAM is indeed equal for a significant markup.
Yeah, and you could argue that ARM and RISC-V Windows and Linux wouldn't be PCs, but I don't really see the value in using the terminology for such a dated use. It's not really useful anymore since none of the same software is compatible across different OSes anymore.
"Mac vs PC" was way after Windows software was already not compatible with IBM PCs, so by that point, the implication was just "PC == x86", and Apple was already transitioning to Intel as well. "PC" being tied to a specific Intel processor is confusing. It's all very bizarre and mostly marketing.
It's very much of a piece with Apple's infamous "what's a computer" iPad ad. Apple have always been trying to position their products as existing in some sort of technological alternate universe ("Think Different"). They go out of their way to avoid applying conventional terminology; you will never see them call AirPods "earbuds", for instance. This extends even down to the banal - their just-announced AR goggles are fastened to the user's head by way of a "Head Band", because heaven forfend such advanced technology be sullied with anything so prosaic as a "strap".
This relentless self-othering has a dark side; apart from being good marketing, it also conveniently excuses behavior that might otherwise be regarded as unacceptable - much like how rebranding small LTE-enabled computers as "mobile" reset user expectations across the board for things like admin control, advertising in the UI, the worth of software, and other pesky social mores inconvenient to the money-extracting classes.
Pedantic perhaps, but not a terrible thing to draw attention to.
> And looking at the Framework laptop shop right now, I see a 16GB DDR4 DIMM for $60, and you can throw in a standard M.2 storage module
To be fair to Apple, theirs is LPDDR5 rather than DDR4, and it’s built into the SOC so it’s a different product.
You can currently only buy a system with 16gb of DDR5 from framework as part of a preorder bundle, not separately, and that’s a c£400 bump which makes the system much more expensive than the mac.
To be fair to framework, they are going to release a separate ddr5 module according to their store which will be cheaper than the Mac option, but if you are buying right now (which is what counts) then Mac is cheaper.
All my laptops including recent ones have upgradeable RAM and SSDs. I usually buy laptop with smallest amount of RAM and SSD and then upgrade those myself. Way cheaper.
> Every time I find some missing simple functionality that is available for free and easy on Linux,
So basically you'll exploit the charitable developer who spent countless hours developing the software rather than pay them a fair price for their work. This is why people develop for Apple platforms first - because the people who buy Apple will actually pay them for their time and labour.
That's an insultingly uncharitable read, and is loaded with some pretty unfair assumptions.
I am a programmer, and I contribute plenty of FOSS code. Very often, I find a solution with some issues and submit PRs. I'm not arrogant enough to do a couple hours of work and charge $20 per download for it, and I'm not a useful enough idiot to work for free for Apple, so I guess that bars me from doing the same on MacOS (even though I have to work with it for work). I guess if some of my code is general enough, some well-meaning Apple FOSS users can port it over.
It's interesting to me how much comradery and work for the general community is done in the open for and among Linux and BSD users with only the expectation that others will do the same for them, but many Apple users I've run into are like you, treating the simple desire to make things better for people with absolute derision and disgust. I guess if you aren't maximizing profit, why do anything at all, right?
> I'm not arrogant enough to do a couple hours of work and charge $20 per download for it,
How do you know it took whoever "a couple of hours of work"? How much time and effort do they have to put in to maintaining the software? And how much training and work did it take to get them to the point of being able to make the program in the first place?
> Ever heard the story of Picasso and the napkin? Legend has it that Picasso was at a Paris market when an admirer approached and asked if he could do a quick sketch on a paper napkin for her. Picasso politely agreed, promptly created a drawing, and handed back the napkin — but not before asking for a million Francs.
> The lady was shocked: “How can you ask for so much? It took you five minutes to draw this!” “No”, Picasso replied, “It took me 40 years to draw this in five minutes.”
It's not about maximising profit, it's about people getting paid for their time and work.
Mostly by looking at the level of functionality and comparing to other software, cross referenced against my career as a programmer.
Somehow, I constantly come across lifelong programmers who insist on working for free. They get software for free, give their software away for free, and very often have encompassing philosophies of software freedom (and often freedom of information and data in general). I find it very sad that the idea of mutual support and love of software and art without money changing hands is regularly met with such resistance from people who haven't experienced the joy of being in a community that doesn't constantly look to extract cash from their own.
I know what exploitation is, but it's not the group of programmers working for the good of one another.
Food, shelter and electricity cost money. Unless these holier-than-thou programmers inherited wealth, they're going to have to do something to acquire money to pay for these necessities for keeping alive. If not, technology evolves at a crawl because everyone who codes can only do it as a part time hobby around their actual job. Which would go some way to explaining why half the open source software in the world such as Gimp is absolutely dreadful in comparison to the private offering.
Money is the best system we've come up with for the exchange of value of labour across industries, but you are welcome to go visit a farmer and attempt to agree on how many lines of code are equivalent to five parsnips and a dozen eggs.
I've been organizing a large-ish volunteer-run gaming convention for 20+ years. One thing I've noticed is that while people with stable jobs (such as developers) are often willing to contribute their expertise for free, freelancers (such as artists) often expect to get paid.
The reason is quite simple. A stable job is much like inherited wealth. Because your income is reasonably guaranteed, you can live your life without trying to turn every opportunity into a business transaction.
As some of us are professional event organizers, we often compare our convention to professionally run events. In some aspects, we are really amateurish, as we are just a bunch of volunteers doing things for cheap. In other aspects, we are better than professionally run events. We can choose to do the right thing without having to worry about business models and profitability.
Sometimes money is what gets boring but necessary things done. And sometimes money is the reason why we can't have nice things.
Why is charging $20 per download arrogant? I'd presume you wouldn't do it for any price, which means you're effectively charging $(infinity) per download.
And you've got the audacity to complain that some people try to make a living while providing a presumably valuable service/app for their users. "Treating the simple desire to make things better for people with absolute derision and disgust" eh?
I have refused to develop any software for Apple's OS or hardware for about 30 years now, because Apple has always routinely stomped on anyone who disagreed with their corporate ambitions - clone makers, reverse engineers who figured out how to use part of their libraries without permission, open source coders trying to write services compatible with Apple's stuff, and others.
Basically the company is hostile to anyone threatening their control, they're the champion of closed source and proprietary much more than IBM ever was.
> Apple's hardware is extremely well designed and durable.
The thing is, it often isn't. I mean, there are many vendors way worse than Apple. But it often just seems/looks/feels to be extremely well designed and durable. If you ask someone with expertise, though, they can often point out dozens of "corners cut" which are not well designed and hurt durability, e.g. to save some mm of height, some cents of production cost, some very small amount of weight, or (likely but not provable) for intentional fragility to enforce their ecosystem. IMHO it's very clear that reliability and stability are secondary and look and feel are the main priority.
That doesn't mean that they don't have some very good engineers giving their best to make things durable, but when other constraints matter more it can be hard.
But it puts the reasonableness of their sometimes very excessive premium prices in question.
Especially if you learn how they squeeze their supply chain in ethically extremely questionable ways while hypocritically pretending to care about issues like child labor (if you squeeze your assembly line provider so hard that they can only make a profit by using child labor and disgusting working conditions, then no, you do not care about child labor, no matter how many times you force them to sign a contract where they agree not to use child labor and to have proper working conditions).
I'm sure a laptop manufacturing expert would notice all the corners cut. But if I, as someone who uses these computers for ~8 hours a day, every day, for more than 5 years per machine, don't notice them, then do those cut corners really matter?
That said, I agree that Apple’s strategy of maximising shareholder returns by squeezing their supply chain to just under breaking point is disgusting and that they are ethically culpable for the inevitable results. They’re probably not alone in this, but they have the most market power and probably push it the furthest. This is definitely an instance where you can hate the player and the game.
Louis Rossmann gives lots of examples on his YouTube channel. For example, in one case Apple appears to have cut some corners when designing the power supply for the SSD, so that when it fails it dumps 13 volts into the 2.5 volt SSD, which permanently damages it: https://youtu.be/7cNg_ifibCQ?t=309
In other videos he shows how Apple puts display data lines right next to high-voltage lines instead of separating them with a ground line, which means you’ll fry the board if you get a tiny bit of moisture in that area.
So you will notice these cut corners when your device breaks and the repair costs almost as much as the original device.
The problem with the kind of corners Apple tends to cut is that they don't matter until they do (i.e. just for a small set of users), in which case you often run into fun things like needing a whole motherboard replacement or similar, which tends to mean getting a new laptop, hoping you did a data backup properly, and Apple repair doing everything they can to push off any blame (i.e. cost).
I mean, it's not worse than other laptops, definitely not.
But for the price a "developer laptop" costs I would expect more (i.e. the _lowest possible spec_ of M2 14" which allows 64GiB of RAM + 2TB SSD == 4619€). Mainly I would expect better service when it comes to repair, especially if it's needed due to Apple's not-so-perfect design. I mean, that price is two and a half Framework (nearly 14") laptops with a maxed-out Ryzen CPU and 64GiB of RAM + 2TB SSD. For most (but not all) use cases this Framework laptop will perform comparably. And sure, the Framework laptop isn't as good as Apple's, but it's not "just half as good" (assuming the missing 0.5 Framework laptop is for a hypothetical expensive OS license), and Framework isn't even "cheap" for what you get ... the balance is just not right.
I mean, if they did other things to fill the gap, like very good flexible support, actually ethical supply chain handling and/or similar, it would be a reasonable price I would love to pay (well, and put Linux on it, but implicitly paying for a hypothetical OS license is my issue).
> Apple's hardware is extremely well designed and durable.
It's well designed, yes, but it's certainly not durable. The entire device will have to get scrapped and replaced if even the slightest thing goes wrong with any of it.
They did a full motherboard and keyboard replacement of my 2016 MacBook last year for no cost even though it was wwaaaayyyy out of warranty because the fault was a known issue.
Oh yeah, I remember they did a free out of warranty replacement on one of my MacBooks a bit back, but there was a potential battery explosion issue on that one.
I've been pretty anti-Apple for a long time. Look me up, I've been using the same handle forever, and I'm sure I've talked shit about it publicly over the decades. I've always lamented their proprietary connectors and formats, the premium they charge, and perhaps worst of all, their seemingly mindless fans. I've been using Microsoft as my daily driver OS since moving away from a TI-99/4A.
When I changed jobs in late 2020, I got to keep the 2015 Macbook Pro that I used for work. I'd been using a Mac since the company switched to Ruby on Rails, while continuing to run a Windows machine for things like gaming, video editing, Photoshop, etc.
By 2021, the battery on this 2015 Macbook Pro was noticeably swelling. I decided to try my luck, and took it to the Apple Store in Philadelphia, where they looked up the serial number, collected the machine, and a few days later, notified me that my battery had been replaced, free of charge. Because of the design of the MBP, this meant replacing the keyboard and all that surrounded it. Physically, it was like a new machine. This is on a machine that I hadn't even bought.
My understanding is that it's kind of a crap shoot. For known issues like the 2015 battery, there are typically windows of time when they'll fix the problem for free. I was well past that window. I was actually willing to pay for a battery replacement - until the debut of M1 architecture, the 2015 15" MBP was the best of the before-times - so I was pleasantly surprised that I got so much refurbishment for just showing up to the store.
They'll do free replacements if it's a known defect, and especially if it's widely experienced. Even Nintendo does that if they sniff a potential class action and want to avoid it.
Usually only after a very large public spat about how they are not being responsible for the problem. They have in the past denied problems that were widespread (board warp, keyboard, etc.) until either lawsuits or extreme push-back from the community - usually when they release a new product with those mistakes removed and it's impossible to deny them any more.
Painting Apple out to be a company that "does the right thing" without being forced to do so isn't accurate.
Even without that.
I won’t go into details, but they went out of their way quite a few times, without me having to pay a dime, and no defect whatsoever was involved.
The next revision of the MBPr didn't have such luck on the design/durability side (I owned both, and heartily share your praise of the 2016 version).
iPad had similar ill fated designs. I went to replace the battery of an iPad Pro and was offered a new one for the price of the battery, as a standard procedure (it's glued to the motherboard under all the other components, so no in-store replacement)
I believe my parents are still chugging along with my ~2010 MBP after a couple of home-done battery swaps. Obviously not the fastest laptop in the world but never had any problems with the hardware.
Completely unsupported by Apple. Doesn't get OS updates or security updates. No apps get updates for it anymore through the App Store.
Citing the use of a 2010 computer, when it's an Apple computer, is actually one of the worst arguments you can make in this context. Virtually any other computer running any other OS is more supported.
> Completely unsupported by Apple. Doesn't get OS updates or security updates.
Sure, but the original point was that the hardware was "certainly not durable". Every one of my Mac laptops (except for the 12" 2012 MacBook, which is my local emergency machine) since ~2010 is still in active use by family. There's nothing wrong with the hardware.
I totally get your point about security updates, but maybe, making some generalizations here, it is not a terrible thing that the machine OP's parents use doesn't get unexpected surprises - no unexpected family tech support calls. Us technology folks love the new and the shiny, but predictability is at a premium for the older generation (again generalizing - there are seniors who can run rings around the average developer with their tech skills, but that's not the norm).
My father has been doing embedded software development since the early 80s. Thankfully the only tech support messages I get are when I break things like their email.
People get upset because they don't have a "cheap" product offering.
With Dell there's the Inspirons
With HP there's the Pavilion
With Lenovo there's IdeaPads and Yogas and stuff.
With Apple, their cheapest laptop is comparable to a business laptop (Elitebook, Latitude), and their most expensive laptops are comparable to the "creator" laptops like Dell XPS or HP ZBook -- except the Apple computers have better build quality than those.
But people only look at the spec sheet and claim it's overpriced.
Apple are definitely guilty of gouging the price of upgrades, but that's just business sense; if you need the upgrade then you have to weigh how much it's worth to you - it's just that other manufacturers do not do that.
Apple's whole trick to gaining performance is to trade away the ability to upgrade RAM because having the RAM/GPU/CPU all exist within the same chip improves the physical locality which decreases latency.
This is ultimately the same trick that consoles use in their APUs. At minimum it's a great way to bring power down and enable HSA/Fusion/M1 style unified-memory shenanigans, and in the long term it's going to be the only way forward with rising dGPU costs in the low-end market.
Personally, I don't care about them having a cheap offering. I hate them because the entire way they work is to lock you into their ecosystem. It's like Microsoft used to be, only much worse because at least back in the bad-ol' days of the 2000s, MS didn't sell hardware, and didn't have an "app store" either.
With Dell, by contrast, I can buy an Inspiron or a Latitude or a workstation, then wipe out the pre-loaded Windows junk and install the Linux distro of my choice, and it works fine. Then if I don't like the included SSD size, I can easily pop it open and install a bigger Samsung one that I bought on Amazon. And having a Dell laptop doesn't pressure me in any way to own a Dell cellphone (which doesn't exist anyway), or vice versa.
The only part of Apple's ecosystem that has ever held a central place in my life are macs. Since 2013 I've used Macbook Pros as my primary development machine and arbitrary android devices for my phone. I did also buy an iPad 3 which I still have, but it's never been very useful beyond being a browsing or reading device. I don't really buy any claims of being locked in, I use macs because they're preferable, and tend to keep them for a long time. Windows is a trashheap, don't feel like bothering with Linux.
I haven't upgraded my intel macbook pro from 2019 just because the ram and ssd upgrade are irritatingly expensive, but eventually I'm sure it'll be tolerable somehow, either because my current one has become wildly too slow or because the cost comes down.
I have no reason to buy an iPhone, so I don't, what's the problem?
Yeah I agree. I’m all in on Linux but my wife still runs macOS. I still have an iPhone but you’ll pry Linux and i3 from my cold dead hands. That being said the apple hardware is some of the best. They’ve earned the right to recoup and then some by charging a premium for things even as silly as ram and storage imo.
Software and such has taken a bit of a dive of late imo but it’s still a viable platform for devs to sell apps and services into. So shrug.
> Then there's the use of child and/or slave labor.
It doesn't make it any better, but let's keep in mind that this is true for ALL current laptop brands unfortunately. They all come from the same factories.
If anything, I trust Apple more to audit their supply chain for human rights violations than I would trust e.g. Dell or Lenovo (and I use those brands exclusively myself).
I trust Apple to behave like a for-profit business.
I've seen Apple make changes when it's benefited them (because their image matters to their customers).
Then why do they seem to lobby in favor of slave labor? [0]
At least for me it's difficult to come up with a reason why Apple might do this aside from knowingly profiting from it.
It's not meant to imply that it's exactly the same thing as a tax. It's meant to suggest that e.g. Apple hardware is significantly higher margin than alternatives, an affordance allowed due to the reputation conveyed by the brand.
Like all analogies and metaphors, it does not imply equality, so showing how the two things are actually different isn't very interesting or constructive unless the statement being made hinges on wrongly comparing aspects that don't line up (doesn't seem to.)
> an affordance allowed due to the reputation conveyed by the brand
Calling it a tax implies there's no added value, which is entirely false. If their products are more expensive for a similarly capable device from another manufacturer (that's a big if), it's because Apple's products have a reputation for being higher quality. Many people will always be willing to pay a lot more for a little more quality, in the least.
I'm not even trying to argue that it's necessarily true, I'm just saying that arguing about semantics would simply be showing ignorance of what a brand tax is. One can argue whether the prices are actually justified by some tangible value or not; it's neither here nor there.
That said, the evidence that Apple has difficult-to-justify prices is definitely present in their product line. Love it or hate it, it's extremely difficult to justify a $1,000 monitor stand; even if it's actually not a result of high margins, it's really hard to see how they can add so much value into the concept of an articulating monitor stand that it would be worth such a price tag. Or perhaps the $700 caster kit for the Mac Pro (a price that hasn't changed: I looked!) They may be the nicest casters ever made, but if it's an option you happen to want, the price from Apple is $700. They don't offer a cheaper option, so that's what you get with Apple. You may want it anyways, because well, it's just so nice and you don't care. That's fine, but that's definitely what most people call the "Apple Tax".
It is extremely difficult in the same sense that I'd find it difficult to justify a million-dollar home or a 100K car with custom finishes and pleasing interiors. Are they really needed for basic dwelling and transport from point A to B? Nope. However, any reasonable person would set me right by pointing out that I am in the market for a 400K home and a 30K car.
But when it comes to tech products, people have this notion that the pricing should be what they have calculated is correct, and that any reasonable person would agree with it.
Legally, this isn't true. You're obligated to pay "use tax" (same as sales tax) to your state of residence when you bring it home if you aren't an Oregon resident, for most states. (Of course, almost no one does...)
If you live outside the US, however, you can probably get away with this as the total value is likely under the customs exemption threshold, or if you use the device for a bit before you get on the plane, you can claim it's "used".
Yes of course, I wasn’t encouraging anyone to evade taxes. However, if you are an Oregonian or Alaskan, you are good to go. Not sure how it works if you are from BC, maybe it’s covered under your tax free import allowance.
As a Washingtonian, I would always list and pay use taxes on my purchases from the Portland Apple Store cough if I ever shopped there before.
You're making a silly point. There are not legal substitutes for paying taxes (inb4 loopholes for wealthy and corporations), while there are plenty of more affordable substitutes for Apple products.
If there is no competition (substitute, for given use cases and/or customers) and customers are willing to pay for it, then that is pretty much the definition of its being justified.
A sales tax of any sort can be avoided by not buying the product. I assumed nobody would misinterpret the common colloquial term "Apple tax" (meaning "Apple's much higher than typical profit margin"), which I didn't invent and has been used widely for a very long time.
$3,999.00 is the entry price for the M2 Ultra. The M1 Ultra is supported by Asahi Linux and I have to say the memory bandwidth makes things like running llama.cpp way faster than Intel CPUs. The rust GPU driver is also a joy to use.
Compared to the price of two very good Nvidia cards and the PC to host them, I think the price is reasonable for running Linux on high-end Apple hardware.
Another thing is that the power consumption is under 20W when idle and 100W when compiling the Linux kernel. I think you make up the money just in electricity after a few years.
I was doing a lot of playing around in Stable Diffusion when it first dropped, with scripts generating a lot - some stuff I left overnight, 12+ hour runs. My M1 Pro MacBook averages about 60-70 watts doing SD inference, or around that I think. The inference speed is about 0.9 iterations/s (1.1 s/it). Later I had time and got SD running on a workstation of mine - an HP Z840 with dual Xeons and an RTX 3090. The 3090 eats 350 watts alone, and the rest of the system sucks another 250 watts or so, for a total of 600W. The generation speed (same Euler sampler) is about 9 iterations/s.
So, 10x faster than my Macbook, nearly exactly. While using 10x the power. Utilizing larger batching with the extra 3090 vram can actually result in even more throughput.
But I find it interesting that it winds up being nearly the same in images-per-$.
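To make that "nearly the same images-per-$" claim concrete, here's a rough back-of-the-envelope sketch in Python using the wattage and speed figures above; the electricity price is my own assumption for illustration, not a figure from the thread.

    # Back-of-the-envelope: iterations per dollar of electricity for the two
    # setups described above. The $/kWh rate is an assumption for illustration.
    PRICE_PER_KWH = 0.30  # assumed electricity price, $/kWh

    systems = {
        "M1 Pro MacBook": {"watts": 65, "iters_per_sec": 0.9},
        "Dual-Xeon + RTX 3090": {"watts": 600, "iters_per_sec": 9.0},
    }

    for name, s in systems.items():
        iters_per_hour = s["iters_per_sec"] * 3600
        cost_per_hour = s["watts"] / 1000 * PRICE_PER_KWH
        print(f"{name}: {iters_per_hour / cost_per_hour:,.0f} iterations per $")

    # M1 Pro: ~166,000 it/$; 3090 box: ~180,000 it/$ -- the same ballpark,
    # which is why the images-per-$ comes out nearly identical.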
> Compared to the price of Two very good nvidia cards and the PC to host it I think the price is reasonable
Apple’s marketing about how they compare to nVidia GPUs has been exaggerated, to put it lightly.
A pair of high end nVidia GPUs will be substantially faster than an M2 Ultra. They don’t really compare.
> I think you make up the money just in electricity after a few years.
You might want to check those numbers. At least where I live, I'd have to burn almost a thousand watts constantly at idle, 24/7, for several years to start making up the electricity premium.
The power savings is great, but it’s not going to justify spending thousands of dollars more. I love the quietness of my M1/M2 machines, but I’m under no illusions that I’m saving money by using them.
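For anyone who wants to sanity-check that, here's a sketch of the break-even math. All three inputs are assumptions I picked for illustration, not real figures from either machine or my utility bill:

    # Break-even sketch: how long the power savings take to repay a price premium.
    PREMIUM_USD = 2000        # assumed extra purchase cost of the Mac
    RATE_USD_PER_KWH = 0.15   # assumed electricity price
    POWER_DELTA_W = 100       # assumed average extra draw of the PC alternative

    kwh_per_year = POWER_DELTA_W / 1000 * 24 * 365        # ~876 kWh
    savings_per_year = kwh_per_year * RATE_USD_PER_KWH    # ~$131
    print(f"Years to recoup ${PREMIUM_USD}: {PREMIUM_USD / savings_per_year:.0f}")
    # ~15 years at these numbers, so the savings are real but not decisive.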
Apple's hardware tax funds their R&D, which is why Apple develops stuff like the M2 ultra when the Intel and Linux OEMs sit around waiting for Intel or Qualcomm to shit out some useless chip that's undifferentiated.
You probably can have a better AMD setup. Just look at things they brag about [1] - our newest 24-core machine is UP TO 4x faster than 8-core Intel from I don't know how many years ago. I have no idea about actual performance but it doesn't look great, especially given that they are running software for which these chips were specifically optimized.
I currently use macos, the software is tolerable, but when I need some serious computation power the server is running Linux. You can have more cores, you can have more memory, what else do you need?
If you go with an AMD Zen 4 Epyc server unit you still only get 12 "CPU memory channels"; with the M2 Ultra (or M1 Ultra) you get the equivalent of 16 "CPU memory channels".
That doesn't matter for many applications, but for some it does.
And then it's a server unit you are buying, not a prosumer desktop (with a Ryzen CPU you get 2 channels, a Threadripper 4 channels, and 8 with a Threadripper Pro).
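For the curious, the channel-equivalence math works out roughly like this. The DDR5-6400 speed grade is my own assumption for the comparison; the 800 GB/s number is Apple's quoted figure for the M2 Ultra:

    # Per-channel DDR5 bandwidth = data rate (MT/s) * 8 bytes per transfer.
    per_channel_gbs = 6400 * 8 / 1000  # ~51.2 GB/s per DDR5-6400 channel

    systems = {
        "Ryzen (2 ch)": 2 * per_channel_gbs,
        "Threadripper Pro (8 ch)": 8 * per_channel_gbs,
        "Epyc Zen 4 (12 ch)": 12 * per_channel_gbs,
        "M2 Ultra (800 GB/s)": 800,
    }

    for name, gbs in systems.items():
        print(f"{name}: {gbs:.0f} GB/s, ~{gbs / per_channel_gbs:.1f} channel equivalents")
    # The M2 Ultra lands at ~15.6, i.e. the "equivalent of 16 channels" above.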
Why do you believe that the "Apple hardware tax" here is being applied to some pieces of aluminium, rather than to their extremely-high-CapEx investment into a new SoC?
I would bet money that when the Mac Pro logic board becomes available as a component through the repair program, that a replacement of it will be 90-95% of the cost of replacing the full machine.
> Why do you believe that the "Apple hardware tax" here is being applied to some pieces of aluminium, rather than to their extremely-high-CapEx investment into a new SoC?
Sometimes it does seem to be applied to pieces of aluminum, like the $999 stand for the Pro Display XDR.
But your point is a good one that many people seem to forget: a lot of the tax is to recoup their massive investment in developing their own Apple Silicon SoCs as well as the OS and other software.
My understanding is that the profits are accumulated in a cash/low-risk investment reserve, which serves several purposes: hedging against the ~100B of debt they apparently hold, ensuring smooth operation in a downturn, and also enabling the acquisition of valuable companies or talent, such as P.A. Semi in 2008.
Let's put it this way: it would be a lot harder for Apple to do what it wants if it didn't have a stupendous amount of cash to fund it. That money pile means that they can shoulder the risks of their own plans, and the fact that they just pile on more and more profit every year just means these plans are largely successful.
You perfectly articulated my reaction to all of the complaints. It seems like people believe companies should aim for as close to zero profit margin as possible, as if margin were immoral waste akin to greed, and not fuel for the business.
It's not that it is immoral, just that it indicates management lacks the imagination for what to do with it, which can be problematic if it is a competitive industry.
Apple hardware is genuinely better in so many ways large and small: case, keyboard, hinges, touchpad, etc. I wish I could pay the Apple "tax" and without having to use MacOS (RIP Bootcamp).
I think it already supports installation on the base M2 chip, and they are putting in support for the Pro/Max/Ultra in the kernel in just this latest release, with install support for those machines coming soon.
There's nothing impressive about Apple's chips in a desktop environment.
If you've got a high end current generation Intel or AMD workstation processor plus an RTX 4090, you've got a more powerful and probably cheaper system than Apple's desktops that have the M2 Ultra (which start at $4000).
Performance per watt is where Apple Silicon shines, so the only thing a Windows/Linux user should have envy over is the MacBook Air and MacBook Pro. I would personally stay away from Apple's desktops even though I like macOS well enough.
Given that it's a prosumer product, the price isn't even that absurd (I mean, look at the cost of Threadripper PRO workstations).
The main issue I would have with buying such a system is repairability.
For one, the questionable anti-repair design, e.g. what they did with hard drives on their last desktop systems. I mean, wtf, you mean I can't just easily replace a worn drive? That is a must-have, basic, essential feature for commercial usage at larger scale.
for the other that fact that RAM can not be replaced at all, which is most times not an issue for desktop systems on normal consumer use, but is an issue for a workstation system which might run 24/7 for most days on high utilization
How so? Apple says the CPU is 20% faster than the M1 Ultra. So now it should be about as fast as a 13th-gen i9 or a comparable AMD chip. Power consumption is still very impressive and performance is as expected, so yes, it is good, but "nothing like it" sounds like much, much more.
People tend to claim their hardware is generally overpriced. I do think that's more of a myth when examined more closely. However their RAM and storage upgrades are clearly absurdly expensive as we can easily compare them to retail numbers.
I like having a real package manager, good native containerization and virtualization, being able to develop on the same OS that the deployment target is, and a large handful of things that are simply more pleasant on Linux (like the assumption of a case-sensitive filesystem, better job scheduling and daemon handling, more scriptable system utilities, and simply more transparency and control of my machine front to back). The DE situation is crappy, but I'd expect this place to recognize that there's more to an OS than the DE, and that developers might take a subpar DE for a better development experience as a whole.
Except that puffery specifically covers claims that are not objectively verifiable, for example "the world's best computer ever!". The claims in the Apple presser are clearly verifiable. The language is just... embellished.
Yeah, it's called "Apple speak". I'm not even joking. For whatever reason, it's been Apple's thing for probably at least a decade and a half. Yes they say it at every release.
Don't ask me why, but it's Apple's thing, and I'm not aware of any other major company that does it in quite the same way. You can find plenty of parodies of it on YouTube if you want.
This was probably the only thing they said that got me riled up. Of course your new one is multiple times faster than the very old and very outdated previous one.
Comparing old-when-they-came-out-and-that-was-still-a-while-ago Intel chips to current-gen MX chips is borderline unethical.
Absolutely unethical. And also highly effective. Together with the placebo effect, the walled garden, sunk cost, etc., it is almost impossible to convince some otherwise very intelligent people that the number crunching, if that is the primary workload, is a fraction of what other systems offer.
- The AMD Ryzen 9 7945HX, released Q1 2023, has 2x the performance of the 12-core M2 Max, also released in Q1 2023. Throw in a dedicated GPU, and certain tasks that would take 1 hour will have you waiting until the next day on an M2.
That isn't to say that one is better value than the other; all that stuff is subjective and depends on needs, blablabla. Just that the original claim of "3x faster than what's on the market" was such a misleading statement when they were comparing it to their old, antiquated Intel laptop CPU offerings.
“Up to” is legalese, I think. If you say “x% faster” and there is anything that isn’t x% faster on the new hardware, you may be taken to court and lose.
But I think they make a subtle dig at many other consumer products.
I have seen too many examples, where the "latest" product is inferior to previous-gen.
EG:
- A few car models I am familiar with had adaptive cruise control a few years ago, standard. Now, you need some expensive package. Definitely not "our best model xyz ever"
- Consumer appliances like refrigerators and washing machines just get more expensive, and more complicated, and crappier.
- A key. A friggin door key. How can you make a "next-gen" key that is worse than a piece of metal? Hello, every "digital" key fob. They SUCK! Slow, cumbersome, annoying sounds. Did I say slow? No, many keys are not "the best ever key".
- Ever update Microsoft Windows to a new version? To this day, I am scarred by the fear that an "update" to a device will be a regression.
So yea, when Apple says "our best ever", it may sound obvious and trite.
But other companies can't say this for _so many of their product lines_...
Definitely thinnest. Still terrible. I don’t recall if they even attempted “best ever” with that keyboard. I suppose perhaps they would have said that about the laptops that included these keyboards …
My point is “thinnest” can be true without being “best ever.”
I wouldn't call it "subtle". The only time I watched an Apple event I was shocked to find how accurate all the jokes were and how Apple really didn't miss any opportunity to boast about how many millimeters and percent something was thinner than before.
> I have seen too many examples, where the "latest" product is inferior to previous-gen.
You mean like when they removed the headphone jack? Or the MagSafe laptop charging port, before adding it back years later? Yes, Apple has made many improvements, but they still mess up a lot, and their software has just as many bugs (I've used Mac OS for less than a year and have already discovered three easily reproducible bugs/crashes in preinstalled applications).
> Of course you aren't going to release product which is going to be slower, right?
Sure you can: you could release a product which is a tad slower but much cheaper than the flagship. Typically some cores are binned off in the higher-end chip to make a lower-performance part, and you could launch that lower-end chip later, separately.
It was also normal to launch a lower-performance chip that has less idle or total power draw. The M1 and M2 series are fundamentally that: there are better-performing chips (per core or in total) than the M1 and M2, but none at their power draw.
One can use truths to bullshit; that's what media, marketing, corporate do. In this case, phrases such as 'further than ever', 'the most powerful ever made', 'the largest and most .. ever created' are hedged. One has to add many missing ceteris paribus clauses (which cetera is paria?) to make something out of these lines. Media, corporate people don't lie, but they use truths to bullshit.
At the risk of sounding snarky... is this a joke? It's advertising. Advertising is pretty much always filled with over-the-top language. If they could legally say that this device might cure cancer and make you rich, they would!
So there's an M2 Ultra, M2 Max, and M2 Pro. I hate this naming scheme. It is totally unclear how you're supposed to order those in terms of performance.
Max kinda makes sense compared to the other level, since it's the full-spec part, and the others are lower-binned derivatives. Since Ultra is really two Max-grade chips welded together, seems like they should have done something like M2 Duo ... ah.
I think it's more comparable to GPU naming. "40XX" gives me the generation, and the "XX" part (e.g. "4060", "3080 Ti", etc.) gives me the relative performance within that group. Except instead of something informative, Apple just goes with an arbitrary word that means "the best."
which makes the name redundantly uninformative! So why not make the differentiating technical spec part of the name, or do something similar like M2 XXXXX, where XXXXX is a number you can actually compare?
The average person does not care and learns the hierarchy directly from the marketing materials, not from incomprehensible (to laypeople) technical specifications.
I think it makes sense if you think about it. “Max” is pretty objective, it means the highest limit of something, so that’s the most powerful. “Ultra” isn’t very clear, but we know it’s lower than Max and higher than “not ultra”, so it’s in the middle. Since both “max” and “ultra” imply something not just better but larger, we can conclude that the plain M2 is the version targeted at more power efficient devices. Finally, Pro stands out as the only modifier clearly unrelated to computing performance, so it must have to do with “professional” features, like ECC RAM and MDM.
Can't tell if this is sarcastic or not. First, it's just wrong, as the M2 Ultra is more powerful than the M2 Max. I could easily justify these adjectives being in any order.
Every time I think of Ultra, I think of MK Ultra - very dangerous. Max, well, he is my nephew - nice kid, not so dangerous. Pro - I am a pro, pro must be amazing. So I say, Ultra > Pro > Max. There, I solved it.
This worry you have is only a worry if we assume that the M2 Ultra is unmatched by AMD and Intel. In reality, Apple doesn't make the fastest desktop chips money can buy.
Unbiased benchmarks need to be compared. Apple's press release doesn't compare performance to x86 with quantitative benchmarks and that is intentional. Don't just believe Apple when they tell us that this is the greatest thing since sliced bread.
For one thing, both Nvidia and AMD kick the pants out of Apple's graphics solution. Nothing comes close to touching the RTX 4090.
High memory workloads are completely off the table with Apple systems now. The previous Intel Mac Pro could handle more memory (1.5TB compared to 192GB), and it was modular.
Apple Silicon only really shines in performance per watt, but once you need a higher level of performance you reach a point where Apple has no product to compete. In a desktop environment, the advantages of Apple Silicon get whittled away.
They actually just today released an update [1] with fixes for sleep:
"We now have a cpuidle driver, which significantly lowers idle power consumption by enabling deep CPU sleep. You should also get better battery runtime both idle and during sleep, especially on M1 Pro/Max machines.
Thanks to the cpuidle driver, s2idle now works properly, which should fix timekeeping issues causing journald to crash."
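If you want to check whether your install picked that up, a quick way (a minimal Python sketch; it assumes a Linux kernel exposing the standard /sys/power interface) is to read the advertised suspend modes:

    # Minimal sketch: print the suspend modes the running kernel advertises.
    # On a kernel with working s2idle you should see it listed, with the
    # currently selected mode in brackets, e.g. "[s2idle] deep".
    with open("/sys/power/mem_sleep") as f:
        print(f.read().strip())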
The Lenovo T14s with an AMD CPU and Intel WiFi sleeps perfectly every time, and the CPU scheduler seems to behave right too, since I normally get a full day's battery life.
I went into the BIOS and enabled S3 sleep though, since it's sold as a Windows machine and Microsoft is pushing the new hybrid sleep stuff really hard.
I sometimes feel the same. Then I have to deal with Homebrew, Docker or macOS equivalents of coreutils, and I'm happy to go back to my Linux-powered Thinkpad, thankyouverymuch.
Install the GNU coreutils and all the other much more featureful GNU variants, put them first in your path, done. I've been rocking this setup for years.
Docker on Mac does really truly suck; namespaces in Linux are truly the killer app of the kernel. Maybe Apple will adopt a port of runj one day.
You're listening too much to the biased Apple marketing, which BSes you with comparisons to years-old Intel CPUs locked into poorly cooled chassis in their laptops (they compared it to the Intel MacBook Air, ffs).
The new AMD and Intel CPUs are fine. There's a reason Apple never dares to mention them in their marketing shows - they always ignore things that are competitive.
Your FOMO reaction is exactly the emotional reaction their ads are meant to trigger - they're dialed in to manipulate you into feeling bad about not owning their products.
Intel (and AMD) reportedly have M2 Pro-like quad channel, GPU heavy laptop CPUs in the pipe.
Those will be very cool!
Technically they both have datacenter focused, M2 Ultra esque APUs too (the MI300 and Falcon Shores). Intel delayed and then canceled the CPU part due to a lack of demand (it is now just a big GPU). The MI300 was HPC only, but AMD is trying to spin it as a more general AI product now.
I got a base-spec MacBook Pro 16 inch with the M2 Pro chip from work last week. Before that, I was on my Linux workstation with an AMD Ryzen 3950x, a desktop CPU released 3.5 years ago.
The compile time for a debug build of LLVM and MLIR is basically the same on those machines, the desktop machine only winning by a little bit. Roughly 10 minutes. Yes, of course desktop vs. laptop is unfair, but it's also a 3.5 year old chip.
And if you really want all-out performance, there are still AMD ThreadRippers, which should easily beat an M2 Ultra in almost all multi-threaded workloads.
(Details on the benchmark run: Building LLVM from source using precompiled LLVM 15 downloaded from the official website, debug build, only clang;mlir;lld projects, using mold/sold linker on Linux/Mac respectively, ninja build tool).
How about using Qualcomm machines with Linux? Microsoft is pushing for Windows on ARM. Any ideas on how much support there is for Linux, considering these chips are similar to the ones running Android?
Qualcomm's processors aren't really comparable to the Apple Silicon ones. They haven't made anything like an M2 Pro/Ultra.
I'd also say that Microsoft isn't "pushing" for Windows on ARM. Rather, they're adding support for Windows on ARM with very little weight behind it.
This is where things get into a chicken-or-egg problem. There's no reason to buy a Qualcomm/Windows ARM laptop given that an Intel laptop will perform a lot better, have better compatibility, and not have to emulate x86/64 for all the apps that developers won't port to ARM. Given that there's little reason for someone to buy an ARM laptop and only 1 of those reasons is under Qualcomm's control (making a better laptop processor), there's little reason for Qualcomm to make a better laptop processor. Given all that, over the medium-term (say, 2-5 years), there's little reason for Microsoft to devote a lot of effort to Windows on ARM.
When Apple introduced M1, they gave users a chip that was way better than what Intel was offering. It was night and day. Everyone knew that there wouldn't be Intel Macs in a few years so we all bought ARM Macs and developers ported things to ARM (as users knew they would). By contrast, no one in the Windows world is betting that ARM is the future of Windows - not users when buying, not developers when compiling, and not chip makers when making processors. Without commitment, there's little chance for success and no one is willing to truly commit.
If you want Linux on ARM, you can go out and buy a Pinebook. The problem is that you aren't getting a flagship CPU. You're getting mediocre 2016 hardware. Even if you got a Qualcomm 8cx Gen 3 meant for laptops, it's half the speed of an M2 (regular, not Pro or Ultra) and slower than what Intel will be selling you.
The problem is that the options for non-Apple ARM laptops are just poor. Even if you could get the best Qualcomm has to offer, you wouldn't buy it because you could get a better Intel machine - and not have to deal with compatibility issues and emulation. Sure, Windows on ARM can run x86/64 code, but there are always things that don't quite work or work very slowly or chew through RAM or battery because you're on Windows/ARM and no one is expecting that.
You can run Linux on ARM today, but there just aren't ARM machines that are compelling to use that aren't coming from Apple (save for some stuff like the Raspberry Pi).
> For example, M2 Ultra can train massive machine learning workloads in a single system that the most powerful discrete GPU can’t even process.
It's interesting to see that Apple is explicitly targeting the M2 Ultra as an alternative to GPUs for machine learning.
I think it is great for consumers for Nvidia to have a competitor in this space. If you wanted to do ML, you really didn't have much of an option other than Nvidia, which allowed Nvidia to do things like disable certain features in consumer GPUs even though the hardware was capable and limit RAM. Having Apple as a competitor in the consumer space will force them to up their game.
That unified memory architecture makes them well suited in many cases - that's the one bit current x86 desktop platforms don't have a good answer for, where the system design generally keeps GPU and system memory as entirely separate technologies.
This is why fanless M2 MacBook Airs are regularly loading models that $5k PCs might choke on - you can get 24GB of unified memory, while almost all high-end consumer GPUs top out at 16GB.
The 4080 is still a 16GB part, and still the 2nd-fastest part Nvidia sells at the consumer level (it's a $1,200 part). The 3090 (for however long it remains on sale) and the 4090 are outliers.
The obvious counterpoint is an M2 MacBook in the 14/16-inch size, where, wallet permitting, you can have up to 96GB of unified memory too...
I simply chose the Air in my example earlier because it's incredible you can have that level of performance in what is ostensibly an entry-level, passively cooled machine.
Well at least in the U.S. a lot of used 3090s are out for less than $800 on eBay which is a steal for 24GB VRAM. The 4090 is about $1600 and it's still a consumer-level card; gamers are buying it more often than the 4080. For ML workloads, it's actually a pretty good deal considering how expensive Nvidia's higher-end, non-consumer offerings are. Lambda Labs did some benchmarking: https://lambdalabs.com/blog/nvidia-rtx-4090-vs-rtx-3090-deep...
But yeah the M2 MacBooks are incredible for local LLMs for their price. Nvidia doesn't have any consumer-level priced accelerators with that much memory.
Yeah, but the M2's effective speed for ML is closer to Intel CPUs than it is to Nvidia GPUs. The correct comparison here is how fast you can train on an Apple M2 vs. an Intel CPU with equivalent RAM.
> It's interesting to see that Apple is explicitly targeting the M2 Ultra as an alternative to GPUs for machine learning.
For many purposes the limiting factor on local LLMs is the amount of VRAM. Nvidia cards with 64GB+ are insanely expensive (unless you're a funded startup). The ability of Apple Silicon GPUs to use system RAM for the GPU is a game-changer.
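To make the unified-memory point concrete, here's a minimal sketch assuming PyTorch's MPS backend; the layer count and sizes are just illustrative, not a benchmark:

    # Minimal sketch: on Apple Silicon, PyTorch's "mps" backend allocates GPU
    # tensors from the same unified pool as system RAM, so a model bigger than
    # a discrete card's VRAM can still live on the GPU. Sizes are illustrative.
    import torch

    device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

    # ~34 GB of fp16 weights: beyond a 24 GB 3090/4090, within reach of a
    # 64-192 GB unified-memory machine.
    layers = [torch.nn.Linear(8192, 8192).half().to(device) for _ in range(256)]

    x = torch.randn(1, 8192, dtype=torch.float16, device=device)
    with torch.no_grad():
        for layer in layers:
            x = layer(x)
    print(x.shape, x.device)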
While I agree it's expensive, it's not quite apples to apples. GPU memory is very, very expensive. An A100 80GB costs $10k MSRP (> $15k on eBay). An RTX 4090 only has 24GB of VRAM at $1,600. Even if you do dual 3090s + NVLink for $1,600, you only get up to 48GB, nowhere near the up-to-192GB the GPU can access on Apple Silicon. The upgrade from 64GB to 192GB of RAM is $1,600. That's still expensive, of course, but 128GB of 800GB/s memory that your GPU has access to for $1,600 is actually not bad value. It's not like just sticking an extra 128GB of RAM into your computer: DDR5-6400 dual channel will give you up to 102 GB/s, which is nowhere near the M2 Ultra or GDDR6X.
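A quick back-of-the-envelope check of those bandwidth numbers (the 800 GB/s figure is Apple's quoted M2 Ultra spec):

    # DDR5-6400 moves 6400 MT/s over a 64-bit (8-byte) channel.
    dual_channel_ddr5 = 6400e6 * 8 * 2 / 1e9   # ~102 GB/s
    m2_ultra = 800.0                           # GB/s, Apple's quoted figure
    print(f"DDR5-6400 dual channel: {dual_channel_ddr5:.1f} GB/s")
    print(f"M2 Ultra is roughly {m2_ultra / dual_channel_ddr5:.0f}x that")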
The memory bandwidth is still a bit lower than Nvidia's best cards, and it doesn't have the equivalent of Tensor Cores. If they wanted they could compete, but it's clearly not their desire. They build consumer end products.
The neural engine on all recent Apple silicon (and A## devices) has "tensor" cores for matrix calculations (note: Apple abstracts all of this behind coreml so there is some conflation between the ANE and AMX instructions/hardware). The M2 Ultra offers 31.6 trillion ops per second with fp16, for instance, which actually bests an A100.
The software support is terrible, of course, which is the biggest limitation, but Apple clearly wants to be in that realm as well.
The neural engine has severe limitations at the moment. I tried using it for BERT about a year ago and kept crashing its API because of "out of memory" issues. The theoretical TOPs you mention also don't necessarily translate into usable TOPs because of memory bandwidth and caches. This is why for example the comparison of the M1 Max with a RTX 3090 was completely off.
I certainly can't speak to your specific uses or issues, but I mean we've really moved the goalposts from the prior claim that it didn't have tensor (e.g. matrix) functionality.
My daily work life includes a lot of model running on Apple hardware (Apple Silicon and A1# chips with the neural engine) using CoreML, often Pytorch models converted using coremltools. The performance of the Apple chips is spectacular if the intrinsics are supported (things obviously get dicier if there are currently unsupported ops). I mean, the memory bandwidth of the M2 Ultra is within spitting distance of the GDDR6X 4090.
People aren't going to be replacing H100 arrays with Apple Silicon and even as a fan I use nvidia hardware for training and convert the models to CoreML after the fact, but Apple clearly isn't just satisfied being some toy. They are continually climbing up that vine.
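For anyone unfamiliar with that workflow, the convert step looks roughly like this - a minimal sketch with coremltools, where the tiny model is just a stand-in for a real one:

    # Minimal sketch of the train-elsewhere, convert-to-Core-ML workflow:
    # trace a PyTorch model, then hand it to coremltools so Core ML can run
    # it on the CPU, GPU, or Neural Engine. The tiny model is a stand-in.
    import torch
    import coremltools as ct

    model = torch.nn.Sequential(
        torch.nn.Linear(128, 256),
        torch.nn.ReLU(),
        torch.nn.Linear(256, 10),
    ).eval()

    example = torch.randn(1, 128)
    traced = torch.jit.trace(model, example)

    mlmodel = ct.convert(
        traced,
        convert_to="mlprogram",
        inputs=[ct.TensorType(name="features", shape=example.shape)],
        compute_units=ct.ComputeUnit.ALL,  # let Core ML pick CPU / GPU / ANE
    )
    mlmodel.save("classifier.mlpackage")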
Yes, you are correct in that the ANE does have the equivalent of tensor cores and that I didn’t mention that. I just don’t expect it to be usable beyond inference because the number of compute units will not work for batches in medium/large/huge networks. That’s obviously by design! The ANE silicon size is tiny compared to the GPU area. I wouldn’t be actually surprised if Apple strategically only invests in using their GPU for LLM (1B+ params) work.
Note that if you are currently using CoreML for LLMs all the work is done in the GPU.
Regarding Tensor cores, it does have them as part of the 32 core Neural Engine. Apple considers AI/ML a consumer feature, all the way down to the iPhone hardware. At the same time, this isn't a data-center supercluster. It's still just a mid sized workstation.
There is a difference. We train with large batch sizes these days. The ANE silicon is tiny and can't do the large matrix multiplications for big LLMs, with or without a batch size higher than 1. Meaning it cannot saturate the RAM bandwidth, and you're better off using the much bigger GPU on the Apple die.
Nvidia's DGX GH200 supercomputer links 256 Grace Hopper CPU+GPU chips, each with 576 GB of unified memory, into a single 144-terabyte GPU address space.
Besides Nvidia cards usually being faster and having higher memory bandwidth, Ada cards also have FP8 cores. I'm not sure how well Apple's M-series chips handle low/mixed-precision tensors, but I wouldn't be surprised if Nvidia cards perform better with them.
Interesting that the chip still doesn't seem to support hardware acceleration for AV1 despite the fact that Apple is a member of the Alliance for Open Media consortium which developed the format.
Not to say for sure that Apple Silicon chips support hardware AV1 decode, but I would speculate that recent generations of their SoCs may have hardware support sitting behind a software limitation, namely the lack of APIs.
Under /System/Library/Video/Plug-Ins/AppleVideoDecoder.bundle there's an Info.plist; inside there are references to AV1, and below that is VTIsHardwareAccelerated = true.
AVD refers to Apple Silicon HW decoder. Side note, GVA means Intel, VCP means AMD.
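If you want to poke at that yourself, something like this minimal Python sketch will dump anything AV1-related; it assumes the Info.plist sits at the usual Contents/Info.plist path inside the bundle and that the keys are named as above:

    # Minimal sketch: load the decoder bundle's Info.plist and print any
    # entries whose key path or value mentions AV1. Path and key names are
    # taken from the comment above; treat them as assumptions.
    import plistlib
    from pathlib import Path

    plist_path = Path(
        "/System/Library/Video/Plug-Ins/AppleVideoDecoder.bundle"
    ) / "Contents" / "Info.plist"

    with plist_path.open("rb") as f:
        info = plistlib.load(f)

    def walk(node, trail=""):
        if isinstance(node, dict):
            for key, value in node.items():
                walk(value, f"{trail}/{key}")
        elif isinstance(node, list):
            for i, value in enumerate(node):
                walk(value, f"{trail}[{i}]")
        elif "AV1" in trail or "AV1" in str(node):
            print(trail, "=", node)

    walk(info)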
I guess it will be like the situation with VP9. It was available at first via a special entitlement to the YouTube iOS app [1] (com.apple.developer.coremedia.allow-alternate-video-decoder-selection) and later opened to all apps.
Apple seems to be taking a wait-and-see approach with AV1. Sure, they're an AOM member, but lots of companies are members of bodies that they aren't enthusiastic about. I'm not saying that's the case with Apple. I'm just pointing out that membership doesn't mean enthusiastic support.
Microsoft withheld AVIF (AV1 image format) support from Edge despite the fact that Chrome/Chromium support it and Microsoft said it was due to licensing issues (https://toot.cafe/@slightlyoff/109899372183448386). It is finally getting support, but clearly there was something Microsoft's lawyers were worried about. It's possible that Apple has some similar concerns, despite the fact that AVIF is supported for Safari. Maybe Apple thinks that some portions of AV1 that are only in video are problematic.
It's even possible that Apple has hardware support for AV1 in their M2 chips and just has it locked away until their lawyers green-light it.
Companies and their lawyers are going to see a lot more claims than we do about this stuff. We know Microsoft was concerned about AVIF, but we have no idea what their licensing concerns were. It might just remain a mystery to us for a while.
1) Some small sites, like YouTube, are adopting it. However, it is a chicken-and-egg problem that usually has to start from the hardware side because, contrary to software, support can't be added later. That's why those organizations usually include companies like Intel, which recently delivered AV1 with their GPUs.
2) None, really. The increased use of bandwidth + CPU time is not a blocker per se. In this case you don't need hardware support, but it sure as heck would be nice to use those CPU cycles for something else.
What's even better for the environment is not buying a new computer unless it's actually necessary. My refurbished M1 Air 8GB still works great - I can comfortably run a Debian VM with Vivaldi & VS Code (doing .NET development), and when I need a GPU for a few hours a month I just use a cloud instance.
Good perf per watt means more than just good battery life. It also means the laptop's fans won't be screaming like the machine is being burned alive when it's pushed, that the chassis won't get uncomfortably hot, and that performance doesn't fall off a cliff when you're unplugged.
In short, it's about balance, which I think is just as important as raw performance for most people.
The M2 Ultra will definitely crack the top 20, but it's no surprise a 24-core option is beaten by the 64-core competition, which tops that chart. Usually their performance claims are about previously available x86 Mac workstations, not x86 in general. You could even go beyond that chart and look at dual-socket AMD to get 192 cores; you just won't be able to run a supported macOS on it.
Their more traditional "vs the rest of the industry" performance claims on laptops aim towards the single thread performance of M2. While maybe not a chart topper a year later, it's quite a far cry from the 2.5x difference you've found there. In the end though, if all you want is the fastest, loudest, most plugged in laptop you can just go out and buy a laptop chassis which accepts desktop CPUs from the likes of Clevo et al.
I don't get paid to run benchmarks, so I couldn't care less what the high scores are. My M2 laptop is 3x as fast as my previous laptop while being completely silent, making it very enjoyable to work with.
Not OP, but at compiling code (Rust). Some anecdata:
time cargo test
# old machine (2018 16" Macbook Pro 2.2 GHz 6-Core i7)
cargo test 2077.73s user 221.03s system 596% cpu 6:25.67 total
# linux dev machine (AMD Ryzen 5 3550H - 4-Core 2.1Ghz, using mold linker)
real 4m48.771s
user 25m0.121s
system 2m9.922s
# new machine (2023 14" Macbook Pro M2 Pro - 10 cores 6perf/4efficiency)
cargo test 389.91s user 39.56s system 478% cpu 1:29.66 total
This is what the industry calls 2.5D technology. Intel has EMIB and TSMC has CoWoS. It’s one way the industry has been able to keep Moore’s Law limping along with chiplets.
It is like AMD's chiplets, but all on one actual die. The chips are produced as one piece of silicon; chips with a bad 'half' get cut in half to become an M2 Max and a lower-binned M2 Max.
The Apple method here of using one die allows for much higher bandwidth than Chiplets.
I believe the Pentium 4D counts more than the Pentium 2, but the Pentium Pro does count, and it was part of why it was so expensive (comparatively). The slot-mounted CPU modules were essentially a way to avoid multi-chip modules or large, easily broken (thus low-yield) dies, and ran on a normal PCB.
I have over 2 million Marriott points. On their online store, I can get a Macbook Pro with M2 with a 1TB HD. Or I can get an iPad Pro with a 2TB HD. I know this pain, as a cheapskate with too many points from work travel :(
That isn't really a good example of transistors well spent. You can build a 32-bit RISC-V or MIPS CPU in the same process node that is 2-6x faster than that 8086 using the same number of transistors.
Today, I had an interesting encounter after work when I brought my Framework laptop to the store. I had a chance to closely compare it with the current MacBook Pro line and wanted to share some thoughts.
Firstly, I was not particularly impressed with the MacBook Pro cases and haptics. The build quality seems to have deteriorated compared to previous iterations. Granted, this could be a matter of personal preference, but it seems to me that Apple's relentless pursuit of thinness and lightness has come at the expense of the premium, robust feel of their older MacBook models.
As for the haptic feedback, it felt a bit off. Earlier MacBooks had a satisfying clickiness and responsiveness, but that didn't seem to be the case with the current model I tested. I suppose this could be subjective, and perhaps some may prefer the current haptic approach.
On a positive note, the brightness of the screen was impressive - it certainly stood out. Apple has always been at the forefront of display technology and it's evident they are maintaining their standards in that regard.
Comparing with other laptops in the store, I couldn't help but appreciate the Framework's design and user-focused approach even more. Other laptops felt plasticky and cheap in comparison, with subpar build quality and aesthetic.
Despite the MacBook's stellar screen, it's hard to beat the Framework when it comes to customization, repairability, and the ethos of the company. It's a refreshing change in an industry that seems to be moving towards sealed units that discourage user modification and repairs.
The old MacBooks were in a different league altogether, and it's disheartening to see how things have evolved. I would love to see a return to a focus on robust build quality and user experience beyond just raw specifications and form factor.
> relentless pursuit of thinness and lightness has come at the expense of the premium, robust feel of their older MacBook models.
The 14/16" are significantly thicker and heavier than previous generations, both in actual measurements and in appearance (as they don't taper at the edge like they have since the 2012) I truly cannot imagine looking at that and thinking any part of the design was imagined with the words "thinness and lightness" in the mind.
"Apple's relentless pursuit of thinness and lightness"
That ended in the previous generation.
The 12" MacBook and previous-gen (final-gen Intel) MacBook pros were the most beautiful and the most thin/light MacBooks
With the new MB Pros, Apple compromised much more in favor of functionality and quality. I think they are almost flawless – just much uglier than the previous gen!
I am not sure about the Airs – and maybe that's your primary comparison.
Having sold my Framework to buy a 14" MBP - I don't understand how you can claim that the build quality isn't "premium and robust", especially compared to the Framework which felt extremely flimsy.
> As for the haptic feedback, it felt a bit off. Earlier MacBooks had a satisfying clickiness and responsiveness, but that didn't seem to be the case with the current model I tested. I suppose this could be subjective, and perhaps some may prefer the current haptic approach.
I think there was a change in Apple's trackpads around 2015-2017. I'm going off memory, but prior to that, the haptic feedback was more like a motor, and after it was something more like a speaker. I noticed it because I had a 2012 Macbook and bought a Magic Trackpad, and the click had a noticeable difference.
I find it laughable that they advertise "gaming performance" every year and fail to mention that they've actively kept gaming out of the Apple ecosystem for essentially forever (or at least as long as I can remember).
I would gladly pay Apple $30,000, like I will get financing and see this as a new car payment, for a machine where all chips are full open source and documented and detachable and auditable.
Ok why is it so expensive? Is it because there is no competition or because they have poor yields on that chip?
I mean, after all, this is just a phone in a fancy big case. They are not too constrained by size or cooling. The extra connections on the board add how much, $50?
I'd think the beefiest version wouldn't cost more than $2000. It doesn't have a screen like a laptop etc.
It’s not the answer you’re looking for, I suspect, but it’s the price that it is because Apple’s marketing team has run a bunch of market research and this is the price they judge will maximise their overall profit.
It's less complex as you don't have the size constraints and more freedom regarding heat and power, everything else is very much the same in principle.
I don't believe it costs more than producing a phone. The only substantial extra cost may come from poor chip yields.
By the same logic, the Nvidia H100 is just a phone and should cost a few hundred dollars (H100 is priced at $40,000 per unit and has fewer transistors than M2 Ultra)
Not all product pricing is determined by the cost to produce the product.
Not all product pricing is intuitive to consumers.
Products are frequently sold at a loss as well (google "loss leader") for various reasons - usually to ease adoption of a platform (Microsoft's Xbox 360 comes to mind, though my memory of it being a loss-leader might be apocryphal).
Whether or not you believe it costs more or less to produce a chip than to produce a phone isn't relevant. The price of phones isn't determined solely by the cost to produce them. There are many more factors involved: supply chain agreements, labor contracts, marketing, regulatory costs, tooling, distribution, market factors, etc. - and those are just some of the things that influence a product's pricing. The factors considered when setting a chip's price share those same concerns, and probably more.