The author notes that TensorFlow on the M1 isn't usable and hasn't been this whole time, but then uses this as one of the reasons the M2 needed to be returned? Did he not realize the same would be true on the M2 as on the M1? If he needed fast TensorFlow for his workflow and bought an M2 Mac expecting better, I'm not sure what to say. (Also, who is using TF? Every hot thing recently uses PyTorch, and while Metal support in PyTorch has had its struggles, at least people are getting things working.)
I don't disagree with the main conclusion: it is a lot of money, and for most people it doesn't make sense. If I were spending $2600 I'd be looking at places that do certified refurb M1 Ultra Mac Studios with at least 64GB of unified memory, which I think will go a long way toward future-proofing a machine for the next few years.
It reads oddly: it's written as if by someone who understands their technical workflow and does benchmarking, while simultaneously falling for essentially the same marketing hype Apple used for the previous chip. You can't have it both ways.
It is true that Apple pushes the ANE very hard in its marketing, considering how few apps actually use it.
He doesn't say he wasn't using it on this machine. In fact elsewhere he mentions he was.
He just says his workflow is not bound to macOS (like it would be if you would do iOS development in Xcode for example) so he's got other options too.
I'm in the same place. I also moved from a Mac to a NUC with FreeBSD as my daily driver. It was cheaper than a mid-range Mac mini, and that's with 64GB of RAM and a 1TB Samsung 980 Pro NVMe SSD in it, whereas the more expensive mini wouldn't even have half that.
Additionally, I have much, much more customization of my user experience with KDE, so I can really tailor the interface to my workflow. This was a bigger reason to switch than the hardware price. The ability to max out the RAM for cheap wasn't 100% necessary either, but it's nice not having to worry about it later, and it allows me to run VMs whenever I need to.
> Despite having a version of it in Windows, there’s a good reason why people preferred using Photoshop / Illustrator on Macs. First and foremost, Photoshop was developed on the Macs first. Color is more accurate on the mac out of the box than on windows. macOS is more stable than Windows. These reasons make Macs the preferred platform for graphic illustrators around the world.
wut?
Color is more accurate on a Mac only if you don't pay attention to the display you purchase with, or attach to, your device. Yes, Photoshop was first developed on the Mac... 33 years ago.
I get it if you like macOS, but the marketing behind it has done a really good job of painting it as if it's the only option out there. It's insane.
I'm not a photographer, nor a creative expert, but my understanding is that colour work is one place where macOS actually excels, fundamentally, at the operating system level.
It's able to handle colour profiles correctly on a per-app basis, rather than per display.
I only know a bit about color profiling, but I don't understand why you'd want a color profile on a per-app basis. I can see it at the device level: a printer would have one profile, a camera a differently calibrated one, and multiple monitors would each have their own profile to make sure they're uniform.
I mean, if they have that feature and it's used for something big I'm not privy to, cool, but otherwise it's something that most large OSes handle.
Again, only going on offhand knowledge. But colour is one of those things that seems straightforward to begin with, and soon becomes insanely complex as you start peeling the layers back XD.
Different files, applications, and media can be encoded in different colour spaces, as can the displays, printers, and dyeing processes that take them as inputs, and the conversion between them isn't always straightforward.
For most consumption activities this doesn't matter, but if you work in colour and your job is to make sure that signature 'Pepsi Blue' is consistent across print media, a television advert, merchandise, and the product itself, finer treatment of colour in the OS becomes necessary.
To reiterate: not a colour expert, but I just thought I'd politely explain why macOS is so highly regarded for this kind of work; it isn't just marketing. Though no doubt there is plenty of that too.
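To make the "different colour spaces" point concrete, here's a minimal sketch of a profile-to-profile conversion using Pillow's ImageCms bindings (the image name and the .icc path are placeholders I made up; Pillow must be built with LittleCMS support):

```python
import io
from PIL import Image, ImageCms

im = Image.open("photo.jpg")  # placeholder image

# Use the ICC profile embedded in the file if there is one; otherwise
# assume the pixels are sRGB, the usual web/consumer default.
if "icc_profile" in im.info:
    src = ImageCms.ImageCmsProfile(io.BytesIO(im.info["icc_profile"]))
else:
    src = ImageCms.createProfile("sRGB")

# Convert into a destination profile. The path below is hypothetical;
# in practice you'd point it at the profile for your calibrated display
# or your print process.
dst = ImageCms.getOpenProfile("/Library/ColorSync/Profiles/display.icc")
converted = ImageCms.profileToProfile(im, src, dst)
converted.save("photo_converted.jpg")
```

Even this simple case involves a rendering intent (perceptual vs. colorimetric) that decides how out-of-gamut colours get squeezed, which is exactly the kind of thing that makes end-to-end colour consistency hard.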
As someone involved in both: most people don't work in commercial audio production or photography. And even for audio and photography, Windows has similar or even the exact same applications available, with perfectly fine performance.
Ableton, Kontakt, Lightroom, Premiere, and Resolve all work fine on "not macOS".
As for color accuracy, I have a plenty expensive monitor (bought on Black Friday, of course; never pay full price for expensive hardware) and a Spyder calibrator, because factory calibration is a nicety that stops being true a month into using the thing. Although of course the Mac mini doesn't even come with a display, so it can't even get points there =D
But then later on he suggests he's using macOS Ventura, which is confusing. Also, "rendering" is doing a lot of unspecified heavy lifting here — I'm curious whether he's comparing apples to apples, so to speak.
Oh, I do remember this happening a long time ago, but I haven't had that issue in about 6 months.
Although in my case it might have been a specific Language Server Protocol (LSP) package I was using that was consuming 100%+ of my CPU (they fixed it).
I used to have an issue where my Bluetooth headphones would annoyingly hiccup every minute or so while listening to music.
I use Sublime Text 4 on an M2 Mini and it's mostly perfect. However, a few days ago, it exhibited a strange pattern of behavior where it would freeze every 2 seconds. No beachball, just froze and unfroze repeatedly. I didn't experience this with any other application. I'm all USB - no bluetooth. Looking back, the only thing I changed that day was I turned off my 3rd monitor from the DisplayLink Manager. I have 3 monitors connected (Thunderbolt Display, HDMI, and DisplayLink). I think there's a high chance it's related because I frequently experience a problem where if I disable the 3rd monitor, ST4's editing surface will rotate 90° if I move the window to the top corner of my 2nd monitor. It's a strange sight.
I've since kept the 3rd monitor on and haven't had any problems. So, if anyone has this problem, try fiddling with your DisplayLink connection: perhaps try disconnecting the adapter completely and/or try turning it off/on from the manager.
I too have had 0 issues with Sublime Text on my 14in M1 MacBook Pro... I've been using the machine since last March and never noticed anything like this.
Oddly, this makes me think maybe I should look into the M1 Mac mini. I'm mainly a management type doing email, but I use a lot of screen space. I typically have 3 different Chrome windows (one for each Gmail account: work, research, and personal). I'm currently using an M1 MacBook Pro, which easily handles 2 XDR monitors. Would a mini handle 2 XDR monitors? I suspect so. I'd probably need the Pro to get the extra Thunderbolt ports, but that's OK. $800. Hmmmmm
I'm not sure that you would be happy with the dual monitor performance - it was one of the few complaints people had about the first M1 chips. Does your MBP have an M1 or an M1 Pro?
I've got one and really enjoy it, but I can't get it to output to a 1440p monitor. Two different 4K monitors - no problem.
My workflow isn't nearly as heavy as the OP's (mostly DevOps stuff; some Kubernetes clusters here, some toy Go apps there), but it does a great job at these tasks while being super quiet and fast about it.
Our other Mac Mini sits in our living room and is used for Zoom game nights. We play Jackbox on it.
The Intel Mac mini that it replaced would really struggle with Jackbox despite having "fast" chips. On the M1 mini, the fans don't even turn on, even while sharing it over Zoom with other people's video on.
This might not be the right place to ask: I was given an absolutely enormous curved monitor by my employer (it’s what they had on hand), and I would like to move to something more moderate in size. I am also mostly doing management type work, eye fatigue is a concern, and I’d be pushing it from an M1 MBP. Any favorite moderate sized monitors that won’t break the bank?
This is clickbait. I can run Stable Diffusion (and PyTorch, and so on) on my M1 Pro just fine, although you do have to know what you are doing.
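For what it's worth, the "knowing what you are doing" part is mostly just targeting the Metal (MPS) backend explicitly and falling back gracefully. A minimal sketch, assuming PyTorch 1.12+ (the toy model is mine, not anything from the article):

```python
import torch

# Use Apple's Metal Performance Shaders backend when it's available;
# otherwise fall back to the CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# Placeholder model and input, just to show tensors living on the GPU.
model = torch.nn.Linear(512, 512).to(device)
x = torch.randn(64, 512, device=device)

with torch.no_grad():
    y = model(x)
print(y.device)  # mps:0 on Apple Silicon, cpu otherwise
```

For ops the MPS backend doesn't implement yet, setting the environment variable PYTORCH_ENABLE_MPS_FALLBACK=1 before launch lets them run on the CPU instead of erroring out.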
Although it is true that an Intel machine with a dedicated GPU is just "easier" and can be assembled for far less (I did just that a month ago: https://taoofmac.com/space/blog/2023/02/18/1845), none of the other criticisms are valid; they seem to stem from a lack of understanding of how to actually use the machine and what it excels at (I intend to get one myself, despite having built a CUDA box, because it excels at audio and video editing).
The only thing that is worth taking away is that Apple kneecaps the entry-level M2 Pro configuration with only 512GB of storage (it should start at 1TB).
Classic case of “holding it wrong” and doing it for the clicks/viewers/controversy.
About Mac storage: buy a large, high-quality NVMe drive and a 40Gbps Thunderbolt 3 enclosure for it, and install your system there. Performance is very acceptable (~50% of the internal bandwidth; sorry, no idea on latency), BUT you can simply take it and boot it on any other M1/M2 Mac if something happens.
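If you want to sanity-check the "~50% of internal bandwidth" claim on your own enclosure, here's a rough sequential-write sketch (the volume name is a made-up example; a real benchmark would also measure reads and control for caching):

```python
import os
import time

# Path on the external volume; adjust to wherever your enclosure mounts.
PATH = "/Volumes/ExternalNVMe/throughput_test.bin"
CHUNK = os.urandom(64 * 1024 ** 2)   # 64 MiB of incompressible data
ITERATIONS = 64                      # 64 x 64 MiB = 4 GiB total

start = time.perf_counter()
with open(PATH, "wb", buffering=0) as f:
    for _ in range(ITERATIONS):
        f.write(CHUNK)
    os.fsync(f.fileno())             # make sure it actually hit the disk
elapsed = time.perf_counter() - start

total_bytes = len(CHUNK) * ITERATIONS
print(f"sequential write: {total_bytes / elapsed / 1e9:.2f} GB/s")
os.remove(PATH)
```

Run it against both the internal drive and the enclosure and compare the two numbers.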
I was trying this and that, and what worked for me was having an external PCIe drive (Thunderbolt + NVMe). Booting on another Mac then involved a series of reboots into recovery to pick that drive (no idea what is going on there) every time the drive was connected to a different Mac, but on the whole, I do not remember having trouble.
MacBooks have pretty bad GPU performance per dollar compared to Nvidia, but this post is just clickbait.
"Why advertise some super AI chip if it doesn’t do AI stuff? Who knows." => Neural engine is fp16/int8 limited so it wouldnt be good enough for training anyway but its pretty good for inference with CoreML in a lot of cases. Also training anything outside Nvidia ecosystem is very bad cause software was written with CUDA first and everything else second (AMD for example have similar story to Metal backend).
As someone who has worked on deep learning for the past 6 years, I can easily say CUDA won't be replaced anytime soon, at least for training; almost every piece of software out there was written for either CUDA or CPU.
And for local model development you don't need your rig to be powerful, as training won't be done on your local machine anyway, and MacBooks are good enough for that (a lot better than Windows, for sure).
How is it clickbait? He bought an M2 Mac based on the advertising and reports from other users. He did some pretty in-depth research into whether the machine would work for his workflow. He decided it didn't, because of a lack of performance for the price.
It's also just a weird nonsense comparison. You buy a GPU from Nvidia and you just get a GPU. You buy a Mac from Apple and you get a whole computer that includes a GPU.
To make a meaningful cost comparison we need to include an equivalent computer in which to put Nvidia's GPU.
Then we get into what workloads you need the GPU for.
I kinda quit reading when real-world tasks and Geekbench compute scores were conflated. Please list some actual task times and chart them when that's what you're discussing.
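For reference, "actual task times" can be as simple as timing the real workload a few times and reporting the median; a sketch (the ffmpeg command is just a placeholder, substitute whatever task you actually care about):

```python
import statistics
import subprocess
import time

# Placeholder real-world task: an H.264 encode of a local file.
CMD = ["ffmpeg", "-y", "-i", "input.mov", "-c:v", "libx264", "out.mp4"]

runs = []
for _ in range(5):
    start = time.perf_counter()
    subprocess.run(CMD, check=True, capture_output=True)
    runs.append(time.perf_counter() - start)

print(f"median {statistics.median(runs):.1f}s over {len(runs)} runs")
```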
Also, it was unclear: did the software have issues with the M1, or only the M2 Pro? I have no idea what OBS is.
OBS is widely used video recording and live streaming software. If you're looking at a Youtube video or a Twitch stream where multiple streams of video are overlapped with one another, e.g. the commentator is in the bottom corner while the commented content covers the rest of the screen, you're probably looking at OBS.
OBS has had issues on my M1 MacBook, but nothing related to performance. It's mostly that the OS doesn't allow OBS to pick up desktop sound. There's an app that lets you capture desktop sound by routing all of the audio through it, but it requires you to disable some security features on macOS, which I'm not happy about. If using OBS were part of my job, then yeah, I'd disable them 100%. But ultimately I just found it easier to go back to Windows for gaming. It's amazing how far macOS has come... I just wish it were better for gaming. I can't wait for the day I never have to use Windows again, although it may never come.
Probably referencing OBS Studio, a.k.a. Open Broadcaster Software (https://obsproject.com/), very popular for Twitch streaming, or streaming in general (my mom uses it to live-stream church).
TL;DR: The author is returning it because it's expensive, they had some bugs with the apps they were running, and they didn't get the performance they were expecting (unoptimised apps seem to be the culprit).
The bugs mentioned, sporadic crashes across a range of apps, seem odd. Maybe a faulty unit or a bad OS install? I’ve occasionally had Macs come “fresh out of the box” with buggy OS installs.
Calling these "some bugs" is quite the downplay. Any machine with the issues mentioned (whether it's a mac mini or a puget sound monster pc) is an immediate return. And if you drank the mac koolaid, that would also be the moment you sober up and go "hold up, I know I've used multiple generations of mac mini but can I actually get better performance by spending less?".
As mentioned, unless you absolutely need macOS (which tends not to be true for the vast majority of users), the "why don't I just put a 4080 in a PC" route is a pretty reasonable path to go down once you realize that doing the work you need to do is more important than the machine you do it on.
I agree that there's something else going on with the "ghost in the machine" stuff, and I wish he would've pursued that further. I'm also not sure why he imagined that the Mac mini's integrated GPU and AI hardware were going to beat $1,200 of dedicated hardware and CUDA-optimized software.
I had a friend who worked for an airline many years ago (pre-9/11) and told me lots of stories.
One thing he told me sticks in my mind though.
The airline would periodically have problems - cancelled flights, bumped passengers, etc. He said this was called "disservice", and it was handled OK by most normal people. But one situation was different. First-class customers, who would routinely pay 10x the price for a ticket, did not handle bad service well. Disservice a first-class customer and they would be MAD.
I strongly considered the 4090; at $500 more you get a lot more performance (if you can find one). However, it mostly came down to this:
1. I'd need a new power supply as well (the card needs 4 PCIe power connectors)
2. The power usage is drastically more (I care about this)
3. More heat and noise
4. I don't need the extra performance that much.
So yeah, from a value perspective the 4090 is a better bet for most people, but after careful consideration I went for the 4080. It's been fantastic so far.
Once you return a $2600 computer, you realize you have $2600 in your pocket. And now you're going to actually do some bang for your buck math to make sure you're wasting money _again_
The best value for money, though, is money. If the 4080 does what this guy needs, even if the 4090 is better, that's still hundreds of dollars saved. The bar to clear is "is it fast enough for what I need it to do?"; after that, it's a luxury spend.
Thank you. I see this all the time, where theoretical performance/W or performance/$ purchases are made as if you own a data center. Performance unused is very expensive indeed.
Sorry, but your reply is really funny, even if it's a jab at me.
I outlined the reasons for the 4080, knowing full well it has drastically less performance than the 4090. I balanced power usage, heat, and noise against how much performance I need.
Doesn't make your comment any less hilarious though.
There was no jab, just a typo. That was supposed to say "you're not wasting" money. But you only get an [edit] link for so long before your comment's permanent.
Lots and lots of YouTube reviewers give their opinions and various benchmark-oriented stats around this. Gamers Nexus is pretty good in my view (though their videos tend to take a long time to not say a whole lot).
Two and a half years ago, I put together a PC for 1500 euros that matched the performance of a 10k Mac Pro (at the time).
It replaced a MacBook Pro (with Touch Bar, no less) that cost me $4500. Its battery swelled on me about two years after purchase, shortly after the standard warranty expired.
Agreed! This issue sounds like a case of a bad machine. I have about 50 tabs open in Chrome on my Mac Studio and have consistently had far too many open for over six months without issue.
Some weird glitches, but a lot of these seem to boil down to "You don't need a maxed out Mini M2 Pro if you already have a high end Mini M1 from two years ago". Yeah, I can believe that.
He might have been using an OBS build compiled for Intel, which struggles a lot and has weird bugs. OBS only recently started officially shipping Apple Silicon builds; those perform way better.
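An easy way to tell which kind of build you have is to check the architecture slices in the app's binary. A small sketch (the app path is a guess at a default install; lipo ships with Apple's command line tools):

```python
import subprocess

# Path to the app's main executable; adjust if OBS lives elsewhere.
APP_BINARY = "/Applications/OBS.app/Contents/MacOS/OBS"

archs = subprocess.run(
    ["lipo", "-archs", APP_BINARY],
    capture_output=True, text=True, check=True,
).stdout.split()

if "arm64" in archs:
    print("ships a native Apple Silicon slice:", archs)
else:
    print("Intel-only build, runs under Rosetta 2:", archs)
```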
Almost all software you use is third-party software. If it doesn't work on your hardware, it doesn't matter to you as an end user where the problem lies; you can't make use of the hardware.
Hell, the Adobe suite is still a bit lacking on Apple Silicon, and entire industries are built on the marriage of Adobe and Apple products. I'm not going to say that Apple Silicon sucks or that it isn't the better choice in a lot of workflows, but shifting the blame to third-party developers is incredibly short-sighted. The platform is in its infancy.
I didn't say that? I was saying that any platform without good third party support could affect sales of that platform. Apple decided to take on this large architecture shift, and they decided to do it all at once instead of piecemeal; e.g. nobody asked Apple to make their own GPUs, graphics APIs, etc. These types of things create a lot of friction and put a lot of work on third-party devs.
The argument you're trying to make, that Apple is somehow responsible for bugs in third-party software because "any platform without good third party support could affect sales of that platform," is unsupportable and utterly bereft of logic. You're suggesting the software bugs OP experienced are going to cut Apple's sales, but Apple has a myriad of professional third-party developer support: 30M registered developers, including all of the world's largest developers, like Microsoft, Google, Oracle, Adobe, Block, Intuit and VMware, and the list of major developers goes on and on. Bugs in the OSS software OP is stumbling over aren't going to affect sales of Macintosh in any way whatsoever, though they may, and should, affect how many users that application has.
You and the OP apparently want to blame Apple for bugs in third party software, which is insane because all responsibility for bugs always and only lies with the developer and no one else. Apple has absolutely no control over nor has any ability to address bugs in third party software. Developers should do their jobs, but users should also report bugs if they expect them to be solved. Returning a computer due to buggy third party software is like returning a new truck because the engine knocks when the tank is filled with vegetable oil. The complaints by OP are absurd on their face.
You completely missed the point. Apple has no control over third-party software. Blaming the machine for bugs in third-party software is beyond absurd. To get bugs fixed, one must submit bug reports to the developer. Blaming hardware manufacturers won't accomplish anything.
When you put out new hardware and a new operating system and multiple pieces of software don't work, that's hardly the developer's fault. Apple should be addressing these issues, or falling back to Rosetta more, until the software developers can catch up.
Especially when you're charging premium prices for a machine that "just works".
> When you put out new hardware and a new operating system and multiple pieces of software don't work, that's hardly the developer's fault.
Again, software bugs are only ever the developer's responsibility. It is not possible, in any universe, for a hardware manufacturer and OS vendor to fix a developer's bugs. Apple is responsible for their hardware working and their OS working; these work fine, and continue to work fine, even if an application crashes. Any application that chokes on a Mac and/or macOS, or on any hardware or any OS, can only be remedied by the developer of that application, namely, by fixing their bugs.
Microsoft puts a lot of work into backward compatibility, even for games [0].
I'm not saying app devs aren't to blame, but Apple changed the ISA under their feet. Rosetta mostly works, but not always. Who pays devs to fix bugs that were probably caused by Apple's decisions?
Even Linus Torvalds says "we do not break userspace" [1], to the point of not fixing "bugs" because things are the way they are.
Apple just doesn't care and never has; it's their way or the highway, and you pay a lot for it. It just must work, no questions asked.
Guilty as charged. I stated that because multiple variables are at play: a new OS plus a new chipset. I don't know which is the culprit for most of these problems.
Let's assume it is the new OS and new chipset: the culprit is still the buggy software. Developers need to fix bugs in their software, and they are the only ones who can. When the first ARM processor was released and x86 code would not work on it, the problem wasn't with the processor or the OS; it is always with the application code. Though x86 code is not compatible with an ARM processor, and thus would never work as-is, there is still nothing wrong with the CPU or OS, technically making x86 code remarkably buggy on ARM CPUs.
Those all use accelerated graphics, either directly (Sublime, Premiere, OBS) or indirectly through an embedded browser. That suggests his Mac has a graphics issue, hardware or driver.
Exactly. Imagine if you had the resources to create your own CPU and SoC, sold it as a product, and then said "stupid software developers" when nearly every application wouldn't work properly on it.
Because it accomplishes nothing. Apple has no control over third party software. Software developers fix their bugs, not computer manufacturers, which is indeed a silly expectation.
It accomplishes letting people know the product is not ready for users.
> over third party software.
Obviously Apple has a huge amount of control over third party software. But that's not the point: even if they had no control, the product isn't ready.
Software that was not adequately tested, or tested at all, on a new system before release does not speak to whether the system is ready. If it's the software that isn't working right, maybe, just maybe, the problem is in its code. Usually, bugs work like this: experience the bug, duplicate the bug, report the bug, identify the bug, understand the bug, fix the bug, release a bug fix update. It never works like this: experience the bug, blame and return the hardware. That solves nothing, because it didn't actually address the bug, which is still there.
Damn. Has anyone else had the same VS Code lags and crashes? These machines have great reviews from other sources, but if my editor is dying it's not going to work out.
My daily driver is a MacBook Air M1 with 16GB of RAM, and my backup is an 8GB Mac mini; both are hooked up to 5K displays.
I live in VS Code, a plethora of browser addons and tabs (with auto-suspend extensions after 30 min of idle), some Teams and Google Meet sessions (video piped through OBS), and some Terraform every once in a while. There's no discernible performance difference between the two machines 99% of the time, although the remaining 1% is a swapping party on my Mac mini, because 8GB of RAM is certainly not enough.
Not really. I got an M2 Pro Mac mini, just the base model. I'm using VS Code, Xcode, iOS and Android emulators, the Affinity suite, a bunch of other stuff, and some casual games too. Can't complain; everything is very smooth and fast all around. Best of all, the fans are very quiet and rarely even turn on at all.
Thanks. Not complaining about moderation, but it's very strange being downmodded into dust for asking "is anyone else aside from the reviewer experiencing this?"
I haven't had any issues on my Mac Studio with VS Code, Chrome, or any other apps. Has your machine always behaved that way? Could it be something you installed that isn't playing nicely with VS Code?
Great product review, I appreciate the thorough performance testing. In a world where everyone is rushing products and APIs to get on the hype train, it’s good to see some sanity from our community.