Apple unveils M3, M3 Pro, and M3 Max (apple.com)
1035 points by ehPReth on Oct 31, 2023 | 1126 comments



Related ongoing threads:

Apple unveils the new MacBook Pro featuring the M3 family of chips - https://news.ycombinator.com/item?id=38078065

Apple supercharges 24‑inch iMac with new M3 chip - https://news.ycombinator.com/item?id=38078068


A few things I noticed, as I'm seeing the variety of SKUs becoming more complex.

- Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

- Just like the low-spec M3 14" has one fewer Thunderbolt port, it also doesn't officially support Thunderbolt 4 (like M1/M2 before it)

- The M3 Pro loses the option for an 8TB SSD. Likely because it was a low volume part for that spec.

- The M3 Pro actually has more E-cores than the Max (6 vs 4). Interesting to see them take this away on a higher-specced part; seems like Intel wouldn't do this


I'm wondering now if Apple tracks this sort of stuff in their released machines. I know my iPad always asks me if I want to send stats to Apple (and I always say no). So let's say that enough people do; then do they have a good idea of how often all the performance cores are used? Max memory B/W consumed? Stuff like that. Back when I was at Intel there were always interesting tradeoffs between available silicon/thermal/margin resources and what made it into the chip. Of course Intel didn't have any way (at that time) to collect statistics, so it was always "... but I think we should ...", without a lot of data.


Apple shares anonymous usage data by default on all their operating systems and users are asked again on every major update.

Given that there have never been any public incidents about it, and given what we know about similar defaults, I would be surprised if Apple is getting less than a 95% opt-in rate.

But I suspect at high-end they only really care about the performance of a few dozen professional apps e.g. Logic or Final Cut. And at the low-end it's likely just efficiency.


> Given that there have never been any public incidents about it, and given what we know about similar defaults, I would be surprised if Apple is getting less than a 95% opt-in rate.

A 95% opt-in rate is INSANELY high for any type of usage-stat opt-in; anything above 50% is usually outstanding.

What is known about "similar defaults"?


Apple enjoys a level of consumer trust that essentially no other technology business, and almost no other business at all, enjoys. Whether that's justified or not is a matter of opinion.


It seems like the comment above is describing opt-out, and that it pesters you to opt back in if you opt out.


That's not how it works. You get asked the question again on every update, regardless of what you chose the last time.

So there are people who were opted-in that change their minds. My friends and family opt-in rate is <50%. And most of them are non-technical.


It honestly doesn’t matter. We’re talking about hundreds of millions of devices sending data in either case. A hundred million more provides no additional value.


...unless there's a correlation between opt-in choice and usage patterns.


That’s the trade off. You don’t opt-in, then you don’t get customized stuff. Shouldn’t be surprised if Apple doesn’t optimize for your usage.


Major updates are infrequent, maybe once a year if you always update, so it's not pestering you. And the UI makes it very easy to skip, unlike some designs.


Unless there is a flurry of network vulnerability updates; then a bespoke fork in the road is set for them.


Security/minor updates don't prompt for this AFAIK


It’s a step in a setup wizard. Whilst it’s explicitly asked, and far from dark pattern territory, it’s designed in such a way that I wouldn’t be surprised by a 95% opt-in rate.


I would be VERY surprised.

To someone with experience in that area of UX, a 95% opt-IN rate is ridiculously high.

A 95% consent-rate would already be hard to achieve as opt-OUT.

For opt-in a 95% rate would require both attention AND consent from 95% of the audience at this stage in the setup wizard.

I highly doubt that it can achieve 95% attention, let alone 95% consent.


But it's not quite opt-in or opt-out in this case. The user is required to opt for something. Apple literally has 100% attention, because otherwise the user can't move past the screen.


I was actually more genuinely interested to learn about the "similar defaults" mentioned in the OP; the 95% comment was just a side note about a huge overestimation of how easily consent is achieved.

> But it's not quite opt-in or opt-out in this case. The user is required to opt for something. Apple literally has 100% attention, because otherwise the user can't move past the screen.

Thing is, you don't even have 100% of the users' attention in this case. The user wants to use the device, you're just standing in the way.

The scenario is this: You force the user to take a decision between option A and B. Regardless of his decision he will achieve his immediately desired outcome (move to the next screen / use the device).

Getting 95% to vote for 'A' would require some quite aggressive dark patterns, to the point that option 'B' would need to be almost invisible and actively discouraged.

Even if the UI were a pre-checked check-box and the user just had to select "Next" to continue (i.e. opt-out), your rate of consent would not be 95%. As mentioned, anything beyond 50% is already outstanding.

Or, let's rephrase: if Apple had a 95% opt-in rate, they wouldn't bother chasing consent again on every software update.


Another way of putting it: an option for a $100 iTunes gift card, no strings attached, probably wouldn't hit 95%.


I do agree it's probably not 95%. But 60% wouldn't surprise me.


Expect something in the ballpark of 20-25%, and that already assumes that Apple's above-average brand-reputation translates into above-average consent on data sharing with them.


To add to this, it's not like a mailing list, either. Marketing opt-in is lower because it's annoying. A lot of people don't want emails.

Anonymized stats from your machine? Most normal people (who don't use computers like we do) do not care and just click the most affirmative option so that they can move forward.


This is a deeply misguided opinion about 'normal' people. To normal people, 'anonymous' is a lie.

My dad can't tell apart Windows and Linux, but he makes sure to uncheck any kind of data collection, tracking, and clicks 'reject all' on every cookie warning


Yeah, I don't think allowing telemetry etc is really a matter of technical literacy, and is more a matter of social trust. High-trusting people will see no problem, low-trusting people will say "no way!". I'd imagine this varies widely but with definite trends based on social/economic class.


> To normal people, 'anonymous' is a lie.

Normal people don't even give a second of thought to this. My partner knows the difference between windows and Mac, and is perfectly content to browse the internet without an ad blocker and to read in between all the cookie dialogs. The only time she clicks on one is when it's required to proceed, and she'll click whichever one is the most obvious button.


I think that was kind of the OP's point. "Pro" users are significantly more likely to opt out in this scenario, unless they are not Pro users but just want the Pro machine for conspicuous consumption, making a much more dramatic swing in the usage data that is collected.


The word Pro in the product name really doesn't separate consumers as well as you might think.

Every college kid has a MacBook Pro, yet they are by definition not pros.


It’s more like 15% opt in. I know because it controls dev access to analytics on their apps.


Wait telemetry is opt-out?

And I've never heard people complain?

Genuinely surprised as it seems to be quite a commonly controversial thing amongst devs.


It's not exactly 'opt-out', they ask you on first boot or after major upgrades, and you either select "Share with Apple" or "Don't Share with Apple". It's just that the "Share" button is coloured blue so looks more default since it's more prominent (at least on iOS, I think it's basically the same on macOS).

It's not like it's enabled by default and you have to know to go and find the setting to turn it off or anything..


It’s opt-out, but it’s not enabled silently. It’s a pre-ticked checkbox on a screen you have to review when you first set up the machine (and when you do a major version OS upgrade).

IMO that’s quite different to something that’s just silently on by default, and requires you to set an environment variable or run a command to opt out.


On a phone there is no box at all. It's two options to select. The opt-in is highlighted, but there is no "next" button -- you have to select an option.


I don't think it's pre-checked, is it? I thought it was Yes/No buttons


No the default action is to do nothing (ie do not install the OS). You have to actively consent or reject.


Yeah, that's kind of surprising, given that Apple is often hailed as a privacy champion.


It’s not really opt-out or opt-in: it’s an explicit, informed choice you have to make when you start up your Mac after first purchase or major upgrade.


Well, Apple generally has so much info about your every step people stopped caring a long time ago.


I think you are talking about Google, not Apple.


No, both of them actually. Don't trust them too much.

This calls out some soft spots that were exposed during the Hong Kong riots: https://www.youtube.com/watch?v=nQ9LR8homt4


Not the OP, but I am not watching a random YouTube video from a random guy to help you prove your point. I can confidently link you some of these that “prove” that the Earth is flat.


It's not a default because users must choose yes or no. So there basically is no default.


> asks me if I want to send stats to Apple (and I always say no)

so you like them enough to pay them thousands for the premium product, but not enough to tell them how much CPU you use?


I have no idea what information they’re collecting on me, and it seems very few people do (given that nobody was able to answer the above question).

Could be “how much CPU does this user use?” but could also be “when prompted with a notification that a user’s iCloud backup storage is low, how long did they hesitate on the prompt before dismissing? How can we increase their odds of upgrading?”

Also, my willingness to provide information does not correlate to how much I “like” a company’s products. If I buy a sandwich from a deli, and they ask for my email for their newsletter or something, I won’t give it. That doesn’t mean I don’t like their company or their sandwich. Could be the best sandwich in the world, they don’t need my email.


In addition to the reduced memory bandwidth, the M3 pro also loses 2 performance cores for only 2 more efficiency cores.

M2 pro: 8 performance cores + 4 efficiency cores.

M3 pro: 6 performance cores + 6 efficiency cores.

Not a great trade... I'm not sure the M3 pro can be considered an upgrade


Depends. Is it faster? Then it's an upgrade.

Has the CPU industry really managed to pull off its attempt at a bs coup that more cores always === better?

I thought we'd learned our lesson with the silly MHz myth already?


I guess we'll have to wait for benchmarks but I did find this interesting:

Apple's PR release for M2 pro: "up to 20 percent greater performance over M1 Pro"

Apple's announcement for M3 pro: "up to 20 percent faster than M1 Pro" (they didn't bother to compare it to M2 pro)


Sure, that's the title, but at least in this PR they immediately show a graph with a comparison to both.

Presumably it makes more marketing sense to compare to the M1 family up front because most people that bought an M2 last year are probably not going to be upgrading to M3. They are speaking to the people most likely to upgrade.


fwiw, I can't remember the last time I saw a company go back more than a generation in their own comparison. Apple is saying as much by what they're not saying here. M2->M3 may not be a compelling upgrade story.


The vast majority of Mac users go years between upgrades. For any other vendor it might seem weird to show several comparisons going back multiple generations (M1 and x86), but for the macOS ecosystem it makes perfect sense since only a very tiny slice of M2 users will be upgrading.


and what makes you think windows users update their devices every single generation?


Windows has distinct groups: the people who buy whatever costs $700 at Costco every 10 years / when it breaks don’t care but there’s also a vocal enthusiast community who do upgrade frequently. That group gets more attention since it’s a profitable niche and gaming generates a lot of revenue.


I used to buy a $700 Windows laptop every 18 months in the 2000s. Then I got fed up with them just falling apart and switched to Macbooks. My 2013 purchase is still alive and being used by the kids.


In the 2000s, I went through a wide variety of PC laptops (Lenovo, Toshiba, Dell, Alienware, Sony, etc.) all within the range of $1200-$6500 and they all died within 3 years (except for the cheapest one which was a Lenovo with Linux). Some died within a year.

When my first Macbook lasted for more than 3 or 4 years I was surprised that I was upgrading before it died. I went through many upgrades with almost zero issues (one HDD failure, one battery failure). I still have a 2012 Macbook Pro that I've since installed Linux on.

When I bought the first touchbar Macbook (late 2015?) I spent around $6k maxing out the options, and I was surprised at how totally trash it was. Hardware QC issues were shocking: particles under the screen from manufacturing, keys stuck within the first hour of usage, external monitor issues, touchbar issues...

I haven't bought a laptop since.


Did you buy a macbook for $700? That was a pretty low price back then which meant you were buying devices made to a price. Buying a Macbook is one solution, another would have been to spend more money on a higher quality Wintel system.


No, it was around $1100 IIRC, maybe as much as $1300.


Right, so when you spend twice as much you wind up with a better device. I think this might be only tangentially related to the fact that it was an Apple product; rather, you weren't purchasing the cheapest available device.


Ten years ago Apple was by far the highest quality laptop manufacturer. There was essentially no other option back in the early 2010s. Even now laptops with a "retina" display are not always easy to find for other manufacturers. In retrospect, that was probably the killer feature which induced me to switch.


Yeah, the quality of PC laptops has improved but that really just means you can get closer to equivalent quality at equivalent pricing. I've heard people claim to have saved a ton but every single time I used one there was some noticeable quality decrease, which I find kind of refreshing as a reminder that the market does actually work pretty well.


Did you treat the MB differently because you paid more? If so, that may have yielded longer life in addition to quality design, etc.


Not really. The difference in build quality was night and day; metal vs. plastic, keyboard that doesn't flex, etc.


Windows users buy whatever, from so many brands, that it doesn't matter how often they upgrade; they're likely not to upgrade from the same vendor anyway, so a comparison to that vendor's older generations wouldn't be meaningful in the first place.


> and what makes you think windows users update their devices every single generation?

They don't, but the difference is that Windows users generally don't know or care about processor generations. In contrast, it's common for Mac users to know they have an "old" Intel-based Pro, an M1 Air, etc., and to use that knowledge to help determine when it might be time to upgrade.

You can test this by asking Windows users what CPU they have. For the few who know and who have an Intel CPU, you can ask what their Brand Modifier¹ (i3/i5/i7) is. If they know that, you can ask what the 5-digit number following the Brand Modifier is — the first two digits are the Generation Indicator¹. I'd be surprised if more than 0.01% of Windows users know this.

¹ Intel's name
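As a rough illustration of the naming scheme described above (a sketch only, in Python; it assumes plain "iN-XXXX"/"iN-XXXXX" model strings and deliberately ignores newer variants like "i7-1165G7" where the digit count no longer lines up with the generation):

    import re

    def intel_core_generation(model: str):
        # Parse e.g. "i7-8550U" or "i5-10210U": the digits after the dash start
        # with the Generation Indicator -- two digits for 5-digit model numbers
        # (10th gen and later), one digit for the older 4-digit numbers.
        m = re.match(r"i[3579]-(\d{4,5})", model)
        if not m:
            return None
        digits = m.group(1)
        return int(digits[:2]) if len(digits) == 5 else int(digits[0])

    print(intel_core_generation("i7-8550U"))   # 8  (8th gen)
    print(intel_core_generation("i5-10210U"))  # 10 (10th gen)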


Intel's CPU naming strategy used to drive me nuts when trying to talk to anyone at work who knew "just enough to be dangerous." Why is X so slow on this machine, it's got a [6 year old, dual core] i5! It runs fine on my laptop and that's only a [1 year old, quad-core] i3!


> it's common for Mac users to know they have an "old" Intel-based Pro, an M1 Air, etc., and to use that knowledge to help determine when it might be time to upgrade.

Not at all. I've worked with FANG developers with brand new M1 MBPs who had no idea what 'M1' meant until something broke.


like everything you said could apply to nvidia gpus as well


man, that's a whole lot of mental gymnastics to justify scummy benchmark practices from apple.


How are they scummy? The M3 vs. M2 performance improvements they showed looked pretty modest.

My interpretation while watching the event is that this is a company persuading x86 holdouts to upgrade to Apple Silicon, and maybe some M1 users as well.


It’s absolutely not, and that’s fine. The video has statements that the machines are made to "last for years" and that they want to save natural resources by making long-lasting machines.

I’m currently at 4 to 5 years on laptops and 3 to 4 years on phones, and even then I hand them over to kids/friends/family who get a bit more use out of them.


> they want to save natural resources by making long-lasting machines.

Apple always comes from a position of strength. Again, they're saying as much as they're not saying.

Also, if they really cared about long lasting machines: slotted ram and flash please, thanks!


Huh. So they used to do this, but looking at the M series chips it seems like the architecture assumes the CPU-GPU-RAM are all on the same chip and hooked into each other, which enables zero copy. Someone more well versed in hardware could explain if this is even possible.

Expandable internal storage would be nice, yeah. But I get the sealed, very tightly packed chassis they’re going for.


> get the sealed, very tightly packed chassis they’re going for

The Dell XPS 17 is only 0.1 inch thicker yet has fully replaceable RAM and 2(!) m2 slots. I’m pretty sure what Apple is going for is maximizing profit margins over anything else..


I have an XPS 15. And while I liked that I could bring my own SSD and RAM, the build quality is nowhere near a Macbook Pro... like not even in the same galaxy. I had to have it serviced multiple times within the first few weeks. It had to be sent to Texas, and when it returned, one WiFi antenna wouldn't plug into the card, and the light on the front was permanently broken. I could have demanded Dell fix it - and I'd have been even more weeks without my main work laptop. So, by pure numbers/specs? Sure. By real world quality, no way would I favor Dell.


The issue is often comparing apples (heh) to oranges.

I understand the desire for slotted RAM, but the major limiting factor for nearly 10 years was CPU support for more than 16G of RAM. I had 16G of RAM in 2011 and it was only in 2019 that Intel's 9th Gen laptop CPUs started supporting more.

The Dell XPS 17 itself has so many issues that if it were a Macbook people would be up in arms, including not having reliable suspend and memory issues causing BSODs -- the reliability of these devices, at least when it comes to memory, might actually be worse and cause a shorter lifespan than if the memory had been soldered.

Of course it always feels good to buy an underspecced machine and upgrade it a year later, which is what we're trading off.

But it's interesting that we don't seem to have taken issue with BGA CPU mounts in laptops but we did for memory, I think this might be because Apple was one of the first to do it - and we feel a certain way when Apple limits us but not when other companies do.


There’s a lot of flat-out wrong information in this post. For one, even the low-power (U-series) Intel laptop CPUs have supported 32GB+ of memory since at least the 6th generation[1]. Many machines based on these CPUs unofficially support more than that. I have a Thinkpad with an i7-8550u and 64GB of DDR4, and it runs great.

On top of that, the higher-power laptop SKUs have supported 64GB or more since that time as well.

Secondly, it’s silly to claim that having RAM slots somehow makes a computer inherently more unstable. Typically these types of issues are the result of the manufacturer of the machine having bugs in the BIOS/EFI implementation, which are exacerbated by certain brands/types of memory. If you don’t want to mess around with figuring that stuff out, most manufacturers publish a list of officially-tested RAM modules which are not always the cheapest in absolute terms, but are always night-and-day cheaper than Apple’s ridiculous memory pricing.

[1] https://www.intel.com/content/www/us/en/products/sku/88190/i...


Sorry, you're entirely mistaken; there was no business laptop that you could reasonably buy with more than 16G of RAM. I know because I had to buy a high end workstation laptop (Dell Precision 5520 FWIW) because no other laptop supported more than 16G of RAM in a thin chassis.

No Dell Latitude, Elitebook, Thinkpad X/T-series or even Fujitsu lifebook supported a CPU that was permitting greater than 16GiB of memory.

I know this because it was something I was looking at intently at the time and was very happy when the restrictions were lifted for commercially viable laptop SKUs.

Citing that something exists presupposes availability and functionality. No sane person is going to be rocking into the room with a Precision 7520 and calling it portable. The thing could be used as a weapon and not much else if you had no power source for more than 2hrs.

Also, socketed anything definitely decreases material reliability. I ship desktop PCs internationally pretty often and the movement of shipping unseats components quite easily even with good packing.

I'm talking as if I'm against socketed components; I'm not. But don't pretend there are no downsides and infinite upgradeability as an upside; that's disingenuous. In my experience there are some minor reliability issues (the XPS 17 being an exceptional case, and one I was using to illustrate that we sometimes cherry-pick what one manufacturer is doing with the belief that there were no trade-offs to get there) and some limitations on the hardware side that limit your upgrade potential beyond whether anything is soldered.


> Sorry, you're entirely mistaken, there is no business laptop that you could reasonably buy with more than 16G of RAM.

> No Dell Latitude, Elitebook, Thinkpad X/T-series or even Fujitsu lifebook supported a CPU that was permitting greater than 16GiB of memory.

Here are the Lenovo PSRef specs for the Thinkpad T470, which clearly states 32GB as the officially-supported maximum, using a 6th or 7th gen CPU:

https://psref.lenovo.com/syspool/Sys/PDF/ThinkPad/ThinkPad_T...

This is not a behemoth of a laptop; I'm writing this on a T480 right now, which supports 32GB officially and 64GB unofficially, and it weighs 4lbs with the high-capacity battery (the same as the T470).

I can't tell if you're trolling or what, but if you're serious, you clearly didn't look hard enough.

Edit: since you mentioned Latitudes, Elitebooks, and Fujitsu lifebooks:

- Dell Latitude 7480 (6th gen CPUs) officially supports 32GB: https://www.dell.com/support/manuals/en-us/latitude-14-7480-...

- HP Elitebook 840 G3 (6th gen CPUs) officially supports 32GB: https://support.hp.com/us-en/document/c05259054

- For Lifebooks, I couldn't find an older one that supported 32GB, but this U937 uses 7th gen CPUs, and has 4GB soldered and one DIMM slot which supports up to 16GB. This is a total of 20GB, again, breaking the 16GB barrier: https://www.fujitsu.com/tw/Images/ds-LIFEBOOK%20U937.pdf

I believe these are all 14"-class laptops that weigh under 4 pounds.


One more thought: you might be getting confused here with the LPDDR3 limitation, which was a legit thing that existed until the timeframe you're thinking of.

Any laptop which used LPDDR3 (soldered) typically maxed out at 16GB, but as far as I'm aware, this was due to capacity limitations of the RAM chips, not anything to do with the CPUs. For example, the Lenovo X1 Carbon had a 16GB upper limit for a while due to this. I believe the 15" MacBook Pro had the same limitation until moving to DDR4. But this is entirely the result of a design decision on the part of the laptop manufacturer, not the CPU, and as I've shown there were plenty of laptops out there in the ~2014-2016 timeframe which supported 32GB or more.


Intel actually has this documented all on one page: https://www.intel.com/content/www/us/en/support/articles/000...

DDR4 support was introduced with the 6th gen Core (except Core m) in 2016, LPDDR4 support didn't show up until (half of) the 10th gen lineup in 2019. It's just another aspect of their post-Skylake disaster, wherein they kept shipping the same stuff under new names for years on end before finally getting 10nm usable enough for some laptop processors, then a few years later getting it working well enough for desktop processors. In the meantime, they spent years not even trying to design a new memory PHY for the 14nm process that actually worked.


Yeah, this link is helpful, but IMHO doesn’t actually call out the specific problem I was referring to, which is that only laptops that used LPDDR3 had the 16GB limitation. If the laptop used regular DDR3, or DDR4, it could handle 32/64GB. The table lumps everything together per processor model/generation.


They haven't made slotted RAM or storage on their Macbooks since 2012 (the retina Macbooks removed the slotted RAM afaik). It might save on thickness, but I'm not buying the slim chassis argument being the only reason, since they happily made their devices thicker for the M series CPUs.


> It might save on thickness, but I'm not buying the slim chassis argument being the only reason

Soldered memory allows higher bus frequency much, much easier. From a high frequency perspective, the slots are a nightmare.


It's not soldered. It used to be, but ever since the M1, it's in-CPU. The ram is actually part of the CPU die.

Needless to say it has batshit insane implications for memory bandwidth.

I've got an M1, and the load time for apps is absolutely fucking insane by comparison to my iMac; there's at least one AAA game whose loading time dropped from about 5 minutes on my quad-core intel, to 5 seconds on my mac studio.

There's just a shitload of text-processing and compiling going on any time a large game gets launched. It's been incredibly good for compiling C++ and Node apps, as well.


the ram is not on die, and 5 min to 5 sec is obviously due to other things, if legit


Sounds like the iMac had spinning hard disks rather than SSD storage.


Yup. I’ve been looking at the Framework laptop, and it’s barely any thicker than the current MacBook Pro.


I have no excuse for flash, but memory can't really be slotted anymore since SODIMM is crap. High hopes for CAMM making its way into every other machine in 2024!


Given that there is a legally mandated 2-year warranty period at least in Europe, I would be surprised if any laptops weren’t made to “last for years”.

The problem with Apple, however, is that their hardware will long outlive their software support. So if they really want to save natural resources by making long-lasting machines, they should put much more effort into sustained software support.


Yes my MacBook Pro 2010 is still going strong.

But, drivers are only available for win 7 and macOS High Sierra was the last supported version.

Luckily Linux still works great.


> i cant remember the last time i saw a company go back more than a generation in their own comparison

Apple likes doing that quite frequently while dumping their "up to X% better" stats on you for minutes.


Nvidia did it when they released the RTX 3080 / 3090 because the RTX 2000 series was kind of a dud upgrade from GTX 1060 and 1080 Ti


Apple always games comparisons like this for their conferences though. The Intel era was even worse for this iirc.


In the Intel era there wasn't much to game; they were using the same chips as all the PC competitors. The PowerPC era, on the other hand…


The majority of MacBooks out there are still intel based. This presentation was mostly aimed at them & M1 owners.


Is it a problem, though? The vast majority of people skip generation and for them the relevant reference point is what they have, which is going to be hardware from a couple of generations ago. M2 -> M3 does not have to be compelling: the people with M3 devices are a tiny fraction of the market anyway.

I find it interesting how people respond to this. On one side, it’s marketing so it should be taken critically. OTOH, if they stress the improvements over the last generation, people say they create artificial demand and things about sheeple; if they compare to generations before people say that it’s deceptive and that they lost their edge. It seems that some vocal people are going to complain regardless.


Given how strongly they emphasised the performance over the Intel base - who have now had their machines for 4 years and are likely to replace them soon (and may be wondering whether they stay with Apple or switch over to PCs) - it is pretty obvious that they also want to target that demographic specifically.


That’s not what it says. Actual quote:

> The 12-core CPU design has six performance cores and six efficiency cores, offering single-threaded performance that is up to 30 percent faster than M1 Pro.


Ok, so then the M3 pro is up to 1.3/1.2=~8% faster than the M2 pro? I can see why they wouldn't use that for marketing.


Depends who they are marketing to I think is the point. If the biggest group of potential buyers are not M2 users, then it makes sense not to market to them directly with these stats.

I've got an M1 Max 64GB and I'm not even tempted to upgrade yet, maybe they'll still be comparing to M1 when the M10 comes out though.


I'm also far from replacing my M1. But if someone from an older generation of Intel Macs considers upgrading the marketing is off as well.


I was referring to the graphic they showed during the announcement that verbatim said the CPU was "up to 20% faster than M1 Pro".

https://images.macrumors.com/t/wMtonfH5PZT9yjQhYNv0uHbpIlM=/...


Plausibly they thought the market is saturated with M1s and targeted this to entice M1 users to switch.


> Depends. Is it faster?

The devil tends to be in the details. More precisely, in the benchmark details. I think Apple provided none other than the marketing blurb. In the meantime, embarrassingly parallel applications do benefit from having more performant cores.


Heh, I recall seeing many posts arguing against benchmarks when all Macs equipped with an M2/8GB/256GB SSD scored much, much lower than the M1/8GB/256GB SSD. People said the synthetic benchmarks were not representative of real world use and you'd never notice the difference. 'Twas a battle of the optimists, pessimists, and realists. In reality, 'twas just Apple cutting costs in their newer product.


> Heh, I recall seeing many posts arguing against benchmarks (...)

It's one thing to argue that some real-world data might not be representative all on itself.

It's an entirely different thing to present no proof at all, and just claim "trust me, bro" on marketing brochures.


oh absolutely, I can't wait to see the benchmarks. Per the (non-numerical data) benchmarks in the video tho - it is faster. So... until other evidence presents itself, that's what we have to go on.


> Has the CPU industry really managed to pull off it's attempt at a bs coup that more cores always === better?

I thought this at first then I realized the cost-performance benefit gained from adding more cores often outweighs just improving the performance of single cores. Even in gaming. I think this is what led AMD to create their Ryzen 9 line of CPUs with 12 cores in 2019.

That being said, I abhor the deceptive marketing which says 50% more performance when in reality, it's at most 50% more performance specifically on perfectly parallel tasks which is not the general performance that the consumer expects.



Few game devs bother optimizing games to take advantage of multiple cores


I find that frustrating with how Intel markets its desktop CPUs. Often I find performance enhancements come from directly turning off the efficiency cores...


Faster than what? M1 Pro? Just barely.


Reference should be M2 pro


I suspect it's about equal or perhaps even slower.


Based on what? The event video says it's faster.


M2 Pro was about 20-25% faster than M1 Pro, M3 Pro quotes a similar number. It has faster cores but a weaker distribution of them. Seems like a wash, but we'll see exactly how close when benchmarks are out.


2.5x is "just barely"? lol k.


> 2.5x is "just barely"? lol k.

That's only rendering speed, and M3 Max vs M1 Max (not Pro). M3 Pro is only 30 percent faster:

> The 12-core CPU design has six performance cores and six efficiency cores, offering single-threaded performance that is up to 30 percent faster than M1 Pro.


20%


Let me re-write your post with the opposite view. Both are unconvincing.

<< Depends. Is it faster? Then it's an upgrade. Has the CPU industry really managed to pull off its attempt at a bs coup that more MHz always === better?

I thought we'd learned our lesson with the silly cores myth already? >>


I think you're misreading the comment you're replying to. Both "more cores is always better" and "more MHz is always better" are myths.


Yup, exactly what I was saying.


Yes, but the number of cores in similar CPUs does provide a good comparison. For example, with the base M2 Pro at 6 P cores and the base M3 Pro at 5 P cores, one would want ~20% faster cores to compensate for the lack of one core in parallel processing scenarios where things scale well. I don't think M3 brings that. I am waiting to see tests to understand what the new M3s are better for (probably battery life).


That's... the same view, just applied to a different metric. Both would be correct.

Your reading comprehension needs work, no wonder you're unconvinced when you don't even understand what is being said.


That makes less sense because the MHz marketing came before the core count marketing.

I agree with GP that we should rely on real measures like "is it faster", but maybe the goal of exchanging performance cores for efficiency was to decrease power consumption, not be faster?


Probably a balance of both tbh, as it appears to be both faster AND around the same performance per watt.


The new efficiency cores are 30% faster than M2, and the performance ones 20% faster, so lets do the math:

    M2: 8 + 4

    M3: 6*1.2 + 6*1.3 =
        7.2 + 7.8
That’s nearly double the M2’s efficiency cores, a little less on the performance ones.

They do say the system overall is up to 65% faster, and has lower power consumption at the same performance level.


You're not considering the difference in performance between the p and e cores. The math should be something more like:

  M2 pro = 8*3 + 4 =28 (the *3 representing that the performance cores contribute ~3x more to total system performance than the efficiency cores)

  M3 pro = 6*3*1.15 + 6*1.3 =28 (apple claims 15% more performance for the p cores not 20%)
> They do say the system overall is up to 65% faster, and has lower power consumption at the same performance level.

They don't claim either of those things. They claim the performance is 20% faster than the M1 pro. Interestingly, they made that exact same claim when they announced the M2 pro.

Energy efficiency might be better, but I'm skeptical till I see tests. I suspect at least some of the performance gains on the p+e cores are driven by running at higher clock rates and less efficiently. That may end up being more significant to total energy consumption than the change in the mix of p/e cores. To put it another way, they have more e cores, but their new e cores may be less efficient due to higher clock speeds. Total energy efficiency could go down. We'll just have to wait and see but given that apple isn't claiming an increase in battery life for the M3 pro products compared to their M2 pro counterparts, I don't think we should expect an improvement.
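For what it's worth, the same back-of-the-envelope arithmetic as a tiny Python script (a sketch only: the 3x P-vs-E weighting and the 15%/30% per-core uplifts are the assumptions from the comments above, not measured numbers):

    P_WEIGHT = 3.0                    # assumed throughput of a P core relative to an E core
    P_UPLIFT, E_UPLIFT = 1.15, 1.30   # assumed M3-over-M2 per-core speedups

    def estimate(p_cores, e_cores, p_scale=1.0, e_scale=1.0):
        # Crude aggregate throughput in "M2-era E-core units".
        return p_cores * P_WEIGHT * p_scale + e_cores * e_scale

    m2_pro = estimate(8, 4)                      # 8P + 4E -> 28.0
    m3_pro = estimate(6, 6, P_UPLIFT, E_UPLIFT)  # 6P + 6E -> ~28.5
    print(m2_pro, round(m3_pro, 1))              # roughly a wash, as argued above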


If you wanted to be even more accurate, you'd also have to take into account that most tasks are executed on the E cores, so having more of those, or faster, will have a much greater impact than any improvement on the P cores. It's impossible to estimate the impact like this - which is why Apple's performance claims[1] are based on real-world tests using common software for different workloads.

In summary, there is supposedly improvement in all areas so the reduced P core count doesn't seem to be a downgrade in any form as the OP suggested.

[1] https://www.apple.com/nl/macbook-pro/


I wouldn't trust Apple's marketing on that if it's where you got those numbers from


E cores are ~30% faster and P about 15%. So the question would be how much the Es assist when Ps are maxed on each chip. In any other situation, more/better E cores should outperform and extend battery. I’m not saying that means you should want to spend the money.


I love Apple's E cores. It just sucks that the M3 pro gains so few given the reduction in P cores.

Apple's E cores take up ~1/4 the die space of their P core. If the M3 pro lost 2 performance cores but gained 4-8 efficiency cores it'd be a much more reasonable trade.


I’m sure the difference is GPU.


I’d like to see that. Good point about die space.


Could you not resolve these questions with benchmarking?


Depends on what you consider an upgrade. As M3 cores perform better than M2 cores, I expect the M3 configuration to perform similar to the M2 one, even though it trades performance cores for efficiency cores. Apple apparently believes that its users value improved efficiency for longer lasting battery more than further improved performance.


Functionally, how does this impact observed performance on heavy loads like code compilation or video manipulation? I suspect it's not much, and these are the low/mid-tier priced machines we are talking about.

If you bought a $2k M2 machine and traded it for a $2k M3 machine, you may gain better battery life with no concessions, except for benchmark measurements (that don't affect your daily work).


These are not low/mid tier machines when talking about "consumer-grade".


Yeah.

$2K-3K is what my 3090/7800x3D sff desktop cost (depending on whether you include the price of the TV/peripherals I already own).


Within the MacBook Pro lineup, they are objectively the low and mid-grade pricing tiers.


Indeed, but that's a bit of an oxymoron as any Macbook Pro is not a "low/mid-tier priced machine"


We all know what is meant by “low/mid-tier”. This is pointless pedantry. Next someone is going to come by with the throwaway comment complaint about how OpenAI isn’t “open”.


Fair enough, I was just arguing even Mac users might not have the cash or the patience to commit to another machine.

We've seen the same with Nvidia's GPUs going from the 10 to 20 series. If people don't perceive higher gains without compromises, they won't buy it.


Then why do they come with (low end) consumer level storage and memory capacity?


Different people have different needs. I certainly need a MacBook Pro for my work, but I use next to no storage. I’ve never purchased beyond the minimum storage for an Apple computer. I did however up the processor on my current MacBook Pro.

Minimum 8GB RAM is more universally egregious but I’m not going to sit here and justify my own exception whilst discounting the possibility that 8GB works for others.


The cost for adding an extra 8GB would be insignificant for Apple, though. The only reason they don’t is to upsell higher tier models


It would make them less money. /thread

To be fair, while 8GB is annoying, I bought the M1 MacBook Air when it came out and it's remarkably resilient. I've only had it freeze a few times due to too little RAM.

I've also been using many different programs. I just have to be a tad mindful about closing tabs (especially Google tabs) and programs.


This makes going Mac Mini M2 Pro over iMac M3 feel real compelling. The respective prices of these models are in fact the same, so if you happen to have a good monitor already... (also the iMac M3 curiously doesn’t even have a Pro option.)


> Just like the low-spec M3 14" has one fewer Thunderbolt port, it also doesn't officially support Thunderbolt 4 (like M1/M2 before it)

I believe this is due to the TB4 spec requiring support for two external displays on a single port. The base spec M series SoCs only support one external display.

I’d expect the ports to work identically to a TB4 port in all other aspects.


I really, really wish they would fix this silly display scan-out limitation. I gave them a (frustrating) pass on the M1 given it was the first evolution from iPhone/iPad where it wouldn't have mattered. But seems silly to have made it all the way to the M3. Wanting to dock onto two displays even at the low end doesn't seem like such a niche use-case.

I'm sure there is some kind of technical explanation but both Intel and NVIDIA seemed to manage 3+ scanouts even on low end parts for a long time.


The technical explanation is that on the base M1/M2 SoC there is one Thunderbolt bus that supports 2 display outputs.

On the MacBook Air one output is connected to the internal display leaving one output for an external display.

(The Mac Mini that uses the same SoC is limited to 2 external displays for the same reason)

To support more displays they would have to add support for a second Thunderbolt bus to the base SoC.


Is this an actual hardware issue though? One issue is that macOS has never supported DisplayPort MST (Multi-Stream Transport) EVER as far as I can tell. MST allows multiple display streams to be natively sent over a single connection for docks or daisy-chaining monitors. Back on Intel Macs, if you had a dock with 2 displays or daisy-chained 2 together you would get mirrored displays. With the exact same Mac and displays in Boot Camp, MST would work perfectly. 1x display per Thunderbolt 4 port is the worst!


You can get multiple displays from a single port, the hubs are just expensive.


You can't do it with a base model M chip. It's not supported on the Mac unless you go with DisplayLink, and DisplayLink has weird issues on the Mac, like no HDCP support and screen recording being enabled, that make it a really bad experience.


There's no reason a whole Thunderbolt bus is needed for every two displays. It's just Apple's decision to build their GPU that way.

And to not support industry standard NVIDIA GPUs on ARM Macs, too. One GPU typically supports 5 outputs over as little bandwidth as PCIe x1.


Not with Nvidia, no; they are 4 displays, and always have been. The NVS810 8x display card is using two GM107 GPUs.

AMD is 6 displays. You see this rarely on consumer boards but the ASRock 5700 XT Taichi for some inexplicable reason did expose all six -- with four DisplayPorts to boot, too. I do not think there have been 4-DisplayPort or six-output consumer cards since.


Even with fewer ports you can use DisplayPort MST hubs to break out 3 displays from one. (But not on a Mac, even Intel; they never added driver support. Works in Windows Boot Camp though.)


There are a couple of 900-, 10-, 20-, and 30-series NVIDIA cards with 5 outputs. The 700-series and below had up to 4. IIUC it's more like up to (x px, y px) max with up to N independent clocks without external adapters, or something along those lines.


Just because there are X outputs on a GPU doesn't mean it will work with all of them at the same time.


I was doing 5 for no reason from a GTX 970 at one point. They just work. But for some reason (segmentation?) NVIDIA brochure pages sometimes disagree with or contradict the products in the market.


Right, but why can't you disable the internal display to run 2 external displays? That wouldn't be an unreasonable compromise, but it seems not to be possible.


M1/M2 only has 1 native HDMI pixel pipe in any form, I think? Apple uses the HDMI PHY to drive the screen on tablets, and the screen on laptops. Base-tier M1/M2 also only have a single displayport pixel pipe, and Pro/Max get +1/+2 respectively.

The base-tier chips are designed as high-volume tablet chips first and foremost, with ultramobility crossover capability.

Using DisplayLink or certain kinds of thunderbolt multimonitor are possible while running outside the pixel pipe or running multiple monitors on a single pixel pipe (this is not MST which is multiple pixel pipes on a single stream). But yeah it's ugly especially on a base-tier processor with this eating cycles/dumping heat. You're running the hardware encoder at least.

Discord had this weird error if you tried to enable the screen/audio capture, it tries to launch something and fails and the solution is you need to manually install "airfoil" because it's an audio capture module that discord licensed. you don't have to fully install it but the audio driver is the part that discord uses and that goes first (has to be allowed as a kext, ie non-secure mode). theoretically a kernel-level capture like that could be a ton faster than userland, I think that's the on-label use of airfoil.


Allow the user to turn off the internal display in favor of 2 external displays. That would be a usable docked configuration.


you are right, but apple won't do this.


independent repair technician demo video to mux MBA internal and external display?


>I'm sure there is some kind of technical explanation

I'm sure it's a marketing explanation: they make bigger margins on more expensive machines, and they need some feature differentiators to nudge people to move up. 2 extra displays is a poweruser/pro feature.

They make their own silicon, it's not like they're shy about designing hardware, if they wanted to stuff features into the lower end books they easily could.


> Wanting to dock onto two displays even at the low end doesn't seem like such a niche use-case.

I mean, it almost certainly is? I would guess a majority of the low-end SKUs are rarely if ever attached to even one external display. Two would be rarer still.


At a ~recent work place the entire floor of developers had (Intel) MacBook Pros with dual 24" monitors.

Some of them were trying out Apple Silicon replacements, though I'm not aware of what they did monitor wise. Probably used the excuse to buy a single large ultrawide each instead, though I don't actually know. ;)


Which workplaces are these that buy low-end laptops for their employees but shell out for dual monitor workstations?


Is a $1,599 laptop a low-end laptop? An M3 Macbook Pro 14" that costs $1,599 can only drive a single external monitor according to the spec. A $1,000 Dell XPS 13 can drive 4 monitors via a single Thunderbolt 4 dock that also charges the laptop!

Honestly, I'm an accountant and everyone in my office uses 2-3 monitors with $1,200 business ultrabook.


I think this use case is probably not the majority.


So? Intel doesn't seem to have any issues supporting it regardless of that.


I am not sure how anybody can compare Intel and Apple. Apple is a vertically integrated system that has a CPU component, with a proven track record of making the right decisions. Intel is a CPU vendor with shrinking market share. As I pointed out, this use case is probably not that important because it represents a very small user segment.


External displays can be used for multiple generations of laptop hardware. Unlike CPUs, displays are not improving dramatically each year.

MacBook Air is a world-leading form factor for travel, it's not "low-end".

MBA with extra storage/RAM can exceed revenue of base MBP.


We’re still talking the low end of this product line. If you’re buying two monitors for your employees, I’m not sure you’re skimping on the cost between an M3 and an M3 Pro.


As stated, it's not about cost.

The travel form factor of MBA is not available for MBP, for any price.


What's Apple's high-end laptop product line?


> low-end laptops

Heh, that's not how I would describe MacBook Pros. ;)


I work at Motorola and we get M1 airs unless you specifically request a Linux laptop. I wouldn't call it low end though. Low end is an Intel i3.


> low-end laptops

you're saying they're low-end because Intel? if you've got your macbook connected to two monitors, you're not very concerned about battery performance.

So isn't Intel silicon competitive speedwise? I thought the M[0-4]s were OK but sort of hypey as to being better in all regards.


I have worked on plenty of i5/i7 Windows/Linux laptops before, and a MacBook Air M1 with 16GB of RAM is miles better in everything. Nothing like them.

And even if you do not care about battery, you still care about throttling.


Honestly anyone who calls them hypey hasn’t actually used them and spends too much time arguing about geekbench on forums.

Real world, the M series chips are by far the best I’ve ever used as a software engineer and it’s not even close.


Not a chance. Moving from an Intel MacBook Pro to an Apple Silicon MacBook Pro was absolutely revolutionary for me and my very pedestrian ‘interpreted language in Docker web developer’ workloads.

I’d seriously consider not taking a job if they were still on Intel MacBooks. I appreciate that an arch switch isn’t a piece of cake for many many workloads, and it isn’t just a sign of employers cheaping out. But for me it’s just been such a significant improvement.


More like cheap out on monitors such that devs want two crappy monitors instead of one crappy monitor


What dev shop gives their engineers base model machines?


Doesn't need to be a dev shop. Go into any standard office and most productivity office workers will be running dual monitors now.

But with the general power of the base model Apple Silicon I don't think most dev shops really need the higher end models, honestly.


Where are you getting that impression from the parent post? Maybe they were on a 2, 3, or 4 year upgrade cycle and still had a bunch of Intel MBPs when Apple Silicon hit the market. That'd be extremely typical.

What dev shop immediately buys all developers the newest and shiniest thing as soon as its released without trialing it first?


We stuck with Intel MBPs for awhile because people needed machines, but the scientific computing infrastructure for Apple silicon took more than a little bit to get going.


Yeah, they were running Intel Macbook Pros because that's what everyone was used to, and also because production ran on x86_64 architecture.

At least at the time, things worked a bit easier having the entire pipeline (dev -> prod) use a single architecture.


Yeah, that was my experience. The early M1 adopters at my previous company definitely ran into some growing pains with package availability, etc.

(Overall the transition was super smooth, but it wasn't instant or without bumps)


Huh? He was talking about dual monitor situations being a problem.

If the company bought Pro or Max chips and not base models, it wouldn’t be a problem.


Intel has supported three external displays on integrated graphics since Ivy Bridge in 2012.


I’m not sure what that has to do with it being a niche use-case or not.


Niche or not, being more than a decade behind the competition is gauche.


On one somewhat niche feature, on the lowest SKU in that particular product lineup.

I can pick areas where Apple is beating Intel. Different products have different feature matrices, news at 11.


They also don’t show any signs of catching up to the Raspberry Pi’s on GPIO capabilities.


They did with https://ark.intel.com/content/www/us/en/ark/products/series/... but sadly seem to have killed off that product line.


That was Intel, not Apple.

It does seem like a shame, though—Intel’s IOT department seems to try lots of things, but not get many hits.


Apple does not compete on checkboxes. If they deemed it necessary to remove, there's a reason. Not saying I agree, just that's how they operate. If there isn't a need to support 3 displays then they won't, regardless of whether the "competition" did it years prior.


> there’s a reason. Not saying I agree, just that’s how they operate.

Almost always it’s maximizing profit margins rather than anything else.


>there’s a reason

they operate 100% on profitability, not what's technically feasible. They are extremely focused on making money. Yes, there is a reason after all.


Exactly my point. It’s technically feasible to do many things. Apple will do what Apple does. Try to upsell you into the higher tier hardware.


If that were true Apple would have stopped bragging about battery life.


The longer battery life is genuinely useful to a wide range of people in a way that being able to connect 38 external monitors is not.

I recently went on a 5-day trip and forgot to bring the charger for my M2. The first day I thought I'd have to rush around and find one. By the fourth day I still had 8% and then finally realized I could charge it via USB-C instead of magsafe.


> connect 38 external monitors

Just 2 would be enough. Which seems like a basic feature their competitors are capable of supporting at very low cost.

They in fact are competing on checkboxes; specifically, they are probably using this limitation to upsell their more expensive models.


Can you not connect 2 monitors on a Mac?


Not on those with a non-pro M chip.


Even if you use one of those Thunderbolt/USB-C expansion dongles?


Correct.


It has nothing to do with niche use-case or not. This is a regression compared to their own Intel Macbooks.


Well, the number with two screens would be zero, because you can't do it. That doesn't mean people don't want to do it just because 0% of the laptops do it; they're just unable to.


It’s a bit funny though that their competitors don’t seem to have any issues supporting this on pretty much all of their products.


Display pipelines are expensive and take area.


Easy to say but hard to prove. How much more expensive would an MBP be if they supported it? How many fewer units would they shift?

Those are harder questions to answer. We could assume Apple crunched the numbers. Or perhaps they just stuck to the status quo.

Only an insider or decision maker (maybe that’s you) knows.


The CEO is a supply chain guy. They've been optimizing their profit margins ruthlessly since he took the helm. I don't think any savings are too small, particularly if comparatively few users are affected and it motivates upselling.

I think it's weird though how far people go to defend Apple. It's objectively making (some) users worse off. Apple clearly doesn't care and the people defending them also clearly don't. But the affected users do care and "but money" isn't really a good excuse for them. It also doesn't solve their problem of not being able to use two external monitors anymore without spending significantly more money.


I think their assumption is that if you’re the kind of pro that needs that many monitors, you’ll upgrade to the better chips they sell.

But it’s a frustrating limitation and remains one of the only areas their old intel based laptops were better at.


For the past 3 years, including with the latest laptops, "better chip" means a 14" M* Pro starting at $1,999. The $1,299 M1/M2 or $1,599 Macbook Pro does not support that, while you can find support for dual external displays on $600 Windows laptops, or on Intel Macbooks since at least 2012. By any standard this is an embarrassment and a regression.


Having 2 monitors isn’t even that ‘pro’ these days. I see receptionists with three sometimes.


An assumption they are so unsure about, that they kind of force that decision on their users.


It’s a money thing. Apple wants to upsell. The production cost would be negligible, but now you have to buy the next level of the product.


I mean they are physical things and you can look at how big they are. But sure the rest of how that factors into cost and sales is harder to figure out, yes.


Unless you’re Intel?


It's because they don't want to put a Thunderbolt controller on the right side of the computer


Is this a change to the spec, or did they skirt around that previously, because I didn't think they supported more than one screen per port on the M1/2?


I'm running an M1 Max with two Thunderbolt docks, and each drives 2 4k displays, runs great, although it's kinda overkill. But it does require the docks; you can't connect directly.


> seems like Intel wouldn't do this

Wouldn’t do what? Intel has more E-cores than P-cores on most of their range, especially at the higher end. On Raptor Lake S, for example, the i9s all have 16 E and 8 P, the i7s have 8:8, and only the lower end of the i5s (below the 13500) has more P than E cores. The i3s have no E cores.

The story is somewhat similar on mobile (H and HX): a minority of SKUs have more P than E, and none at all in the P and U series.

In fact that was one of the things which surprised me when Intel started releasing asymmetric SMT: they seemed to bank heavily on E cores, whereas mobile chips and Apple had mostly been 1:1 or biased towards P cores.


I think you confirmed what you were replying to. Intel makes the numbers get bigger as you go up, regardless of whether that makes the most sense.


Oh yeah I misread the comment.

Although that’s not quite true either. E.g. on Raptor Lake H, the upper i5 (13600H) has 8 E-cores while the low-range i7 (13620H) has 4, but the i7 has 6 P-cores versus 4. The base frequencies also get lower as you move from i5 to i7. And you have fewer GPU EUs (80 -> 64).


Well, when your P is still quite E, I guess it’s a different equation :).


The SKUs are becoming more complex because they are probably learning why Intel/AMD have so many SKUs. Making complex chips at scale results in a range of less-than-ideal chips. This drives the need to segment and bin chips into different SKUs to reduce losses, rather than trying to sell one SKU and throwing away the anomalies.


> Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

Is this because they are not populating all the memory channels, or just using lesser memory ICs?

If it's the former... that's annoying. It just makes their products worse for the sake of artificial segmentation and very little cost savings.


The new M3 14" MBP seems to be a red herring - why does it even exist? Why not just refresh the MBA instead?

An obvious rule of thumb is for "Pro"-branded laptops to only use "Pro"-branded chips. It's what they follow for the iPhone lineup, but I suppose iPad Pros also use non-Pro chips. Just seems like a very confusing SKU to create, and definitely something Steve wouldn't approve of.


It replaces the 13-inch MacBook Pro with M2. Apple always has a “pro” MacBook at that price point and it is one of the better-selling MacBooks, because not all “pro” users have a need for CPU grunt. A lawyer, for example, probably wants a “pro” class of hardware but doesn’t need more than an 8 GB M1. You could argue they should get a MacBook Air, but this 14-inch MacBook Pro is effectively that but with a better screen and more ports, which is exactly what that kind of buyer needs.


I personally struggle with the 14". It feels too small to be productive on, at least for coding. Anyone else experience this?

And yet the MBA's screen, by comparison, is serviceable and nice but nothing outstanding. That's what makes the case for the MBP 14 (when the 16 is just too large and bulky).


I find it to be the perfect size actually. Easily in a backpack and is light, and can use it on the couch, etc. comfortably. I’d never buy a 16” laptop.


Absolutely love my 14” M2 pro and use it daily for coding. Perfect size/weight for the backpack, and endless battery at the local coffee shop.


The old 15” was like the perfect dimensions. It practically had the footprint of the present 14”, maybe even smaller. Apple made a big deal about how their new chips run so cool, yet they made the pro laptops as fat as they were in 2012 again so clearly thermals were an issue.


Aren't the new 16" laptops the same dimensions as the old 15" ones? I thought the 16" display was simply because they were able to shrink the bezels on the display enough to get an extra inch on the diagonal. Other than the rounding on the edges, my M2 16" Pro feels about the same size as my old Intel 15" one.


> I personally struggle with the 14". It feels too small to be productive on, at least for coding. Anyone else experience this?

Absolutely not... I've been working on 13"/14" machines for 10 years and never _felt_ that way. I get that this is personal ;)


I find the 14" perfect, but I also find a tiling window manager (universally) vital.


I feel there is an obvious appeal to the MacBook Pro 14"/16" with M3. It has a good display, lots of battery life, and plenty of performance.

I'm more confused about the "M3 Pro" variant. Its performance either seems to be overkill or not enough. A more sensible lineup to me would be:

M3 - 2 thunderbolt ports, dual monitor support, memory up to 8-24gb (2x4, 2x6, 2x8, 2x12, 2x16). In the MacBook Pro, always comes equipped with second tier upgrades.

M3 Max - 3 thunderbolt ports, quad monitor support, 32-128gb (8x4, 8x6, 8x8, 8x12, 8x16).

Then again this wouldn't let Apple upsell people on basic functionality like dual monitor support so they'll never do this.


About the M3 Pro, I’ve heard a theory that it’s most likely due to lower yields at TSMC and the M2 Pro and Max being too similar.

Now it’s clear: if you really need perf, you get an M3 Max.


The most popular Macbook Pro?

Look, I'm a 16" guy myself, I even carried one of the 17" cafeteria trays back in the day… but it's clearly the sweet spot for _most_ people.


It was pretty hard to saturate the memory bandwidth on the M2 on the CPU side (not sure about the GPU).


The GPU can saturate it for sure.

Llama.cpp is a pretty extreme CPU RAM bus saturator, but I dunno how close it gets (and it's kind of irrelevant because why wouldn't you use a Metal backend).


Well, Metal can only allocate a smaller portion of “VRAM” to the GPU — about 70% or so, see: https://developer.apple.com/videos/play/tech-talks/10580

If you want to run larger models, then CPU inference is your only choice.
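
For reference, that ceiling is queryable at runtime; a minimal Swift sketch that just prints what Metal reports for the default device (the ~70-75% figure is an observed rule of thumb, not documented as a fixed fraction):

  import Metal

  // Minimal sketch: print how much memory Metal recommends for the GPU
  // working set on this machine (typically around 70-75% of unified
  // memory on Apple Silicon; the exact fraction is up to the OS).
  if let device = MTLCreateSystemDefaultDevice() {
      let gib = Double(device.recommendedMaxWorkingSetSize) / 1_073_741_824.0
      print("\(device.name): GPU working set limit ~\(gib) GiB")
  }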


Aren't these things supposed to have cores dedicated to ml?


You’re thinking of the neural engine. I’m not sure that llama.cpp makes use of this. They’d have to turn it into a CoreML model to do so.
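
For what it's worth, even with a converted model the Neural Engine is opt-in on the caller's side. A minimal Swift sketch, assuming a hypothetical compiled model called "LLM.mlmodelc":

  import CoreML
  import Foundation

  // Minimal sketch: ask Core ML to prefer CPU + Neural Engine (macOS 13+).
  // "LLM.mlmodelc" is a hypothetical compiled model; Core ML still falls
  // back to the CPU for layers the ANE can't run.
  let config = MLModelConfiguration()
  config.computeUnits = .cpuAndNeuralEngine
  let url = URL(fileURLWithPath: "LLM.mlmodelc")
  let model = try? MLModel(contentsOf: url, configuration: config)
  print(model == nil ? "failed to load" : "model loaded")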


They are not as fast as the GPU (but much lower power).

Also, not many implementations can even use it.


> The M3 Pro loses the option for an 8TB SSD. Likely because it was a low volume part for that spec.

That's not super surprising to me. Apple loves to charge stupid prices for storage and memory. Maybe it's worth it for lots of people to have the convenience of built-in storage at the lower capacities, but I have to imagine that most people who would want 8TB of SSD would rather just get an external solution for... much less.


Yeah, I can imagine that’s an incredibly niche setup. Maybe if you were editing on the go or similar, but even then, TB drives seem like a more pragmatic choice.


I think what Apple is pushing for is computing efficiency. It still gets faster but with much less power. Focusing on performance solely would be the wrong way to evaluate these chips. https://www.tomsguide.com/news/apple-m3-chip


I think it's a bit more nuanced than that.

There's a reason they didn't just stick an A series chip in their laptops and call it a day - they want more performance even if it comes at the cost of efficiency. It's probably better to say that Apple is pushing performance within a relatively restricted power envelope.

Just to illustrate my point - if m3 had exactly the same performance as m1, but with 1/2 the power draw, I don't think many people would have been happy even if it would have been an amazing increase in computing efficiency.


This drives me crazy. Apple plays the market like Nintendo. Pick something that no one cares about, do it better than anyone else, and make a big deal about it.

I dream of a world where instead of a marketing company becoming a top 3 tech company, a tech company would have. Wonder what they would have done for their laptop...

Or maybe this is just an inevitable outcome of capitalism/human biology where a veblen goods company will become a top player in a market.

(I have my own Google and M$ complaints too)


So Apple is the most successful company because they prioritize things that no one cares about?

I dunno, if there were a marketing company that could design the likes of the M-series chips along with the mobile versions, and develop a full technology stack (from programming language and compiler, to custom chips, through to shipping whole devices at unimaginable scale), it would make me wonder what the technology companies were doing.

What other “tech” company really compares from a hardware perspective? Samsung? Dell? AMD? Love them or hate them, there’s no denying that Apple has serious technical chops. One day people will only hate Apple for reasonable things, today’s not that day apparently.


Apple develops its own OS. Apple develops its own development stack, frameworks, etc. Apple develops its own CPU/GPU architecture. Apple develops its own battery architecture. Apple develops its own tooling to manufacture a lot of their products. Apple develops its own tooling to dispose of their products.

There are very few companies that have as much first party Tech in their products from start to finish.

I think Apple underprioritizes advanced functionality, but if they’re not a tech company then it’s hard to see what is.


It's probably fairer to say "Apple builds products focused on things I don't care about."

Obviously, other people care.


Who knows... maybe they will be like Google (which I consider a tech/engineering driven org) and they'll throw away (good) products all the time just "because"?

I think Apple plays the "niche" very well, not only regarding marketing, but also from a tech point of view.


What a weird take. Literally every "tech" company is chasing Apple's silicon but you are trying to claim that they're not a tech company. Let me guess, iPhones and iPads aren't tech either, right?


no they aren't lol


Considering that Apple has been significantly more innovative (tech-wise) than pretty much all of their competitors, I’m not quite sure what this says about them.


marketing


Not really. Or rather, not only. The only two things I hate about Apple’s hardware are the lack of repairability and the price gouging for memory/storage upgrades. Otherwise they are objectively miles ahead of their competition.

Of course I have no idea why I am taking the effort to respond to an edgy single-word comment…


> Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

Just contrasting this with the recent TR4 announcements from AMD: apparently their PRO variants top out (theoretically at least) at around 325GB/s (non-Pro versions are half of this), so from that perspective alone the M3 Max might be better?

I always have the naive assumption here that keeping the beast (i.e. the CPU) fed with data matters much more for overall performance than just clock rates etc.


The 2x USB/Thunderbolt ports are on the same side. :(


Unfortunately, I don't see Apple doing anything here but price discrimination, mostly of the "number goes up" variety.


The max is probably only going to be in desktops, so better to use the die area for other things than E cores


Wish that you could get the 16-core CPU with the smaller Max GPU, but alas I just ordered one anyway.


One thing that left me a bit confused was the comparison against Intel Macs. Although I am still using an Intel 16-inch MacBook, I really wanted to see how the M3 fared against the M2, not Intel and M1. I think it's no surprise the M3 exceeds the Intel Core i7-9750H in basically all of Apple's own benchmarking. My real question, which will probably be answered next week, is how it compares to the generation right before it.

My work laptop is a 14-inch MacBook Pro and I've been impressed with the battery life considering all of the containers that run (on Kubernetes!) as part of my dev workflow. I guess Apple deciding to compare Intel and M1 was to try to convince existing MacBook users to upgrade to the latest generation CPUs.


I imagine the amount of people upgrading from an M2 Mac will be close to 0, while there's a lot of Intel stragglers out there.


Intel straggler here. And I'll remain one until Linux runs flawlessly on the newer generation. I have an older ThinkPad as my main workhorse and a now quite old MacBook Air for on the road; its battery has been shot for a long time but it still works quite well when plugged in, and I'm too much of a miser to have it replaced. But if there is a way to run a full Linux distro on the newer hardware then I'll probably buy one. But I'm under no illusion that Apple makes their hardware to cater to me, and that's perfectly fine.


Check out asahilinux.org - I've had Arch Linux ARM running as my main OS on an M1 Air for a year now.


I recommend sponsoring Hector Martin here: https://github.com/sponsors/marcan

Yes, he's continuing to work on the long tail of drivers needed for Linux on Apple silicon. Looks like lately he's working on speakersafetyd, a daemon that will monitor audio levels and keep the builtin speakers from being damaged: https://github.com/marcan?tab=overview&from=2023-09-01&to=20...


You shouldn't link to Asahi Linux sites or individuals here - they (Asahi) don't like that.


How is anyone supposed to know this etiquette rule that you claim? As far as I know people are free to link to whatever they like subject to the laws of relevant jurisdictions.

For my part I would say that you shouldn't police other people and tell them what to do in public, or if you do, soften it slightly.


You're not supposed to know it, unless someone tells you - that's what I'm doing.

I completely agree that "don't link to my site" is a ridiculous request - I'm not saying that I approve of it. But it's what they want - and the Asahi project has numerous problems with it anyway, so avoiding drawing attention from HN has some benefits to everyone.

My apologies if it sounded like I was policing - that was not my intent.


If you think it's a ridiculous request to not link to them, then why did you say I shouldn't link to them in the first place?

I reject the notion that I shouldn't link to someone in good faith, regardless of whether they like it.


I said that you shouldn't because Marcan is a toxic and abrasive individual, and so not linking to him or the Asahi Linux project is beneficial.

> I reject the notion that I shouldn't link to someone in good faith, regardless of whether they like it.

In general, I completely agree!


What problems?


Read for context: https://news.ycombinator.com/item?id=33793049 Creator of Asahi Linux didn’t like the way some HN commenters talked about the project and/or some livestreams in that and other threads, and started trying to block incoming links with a HN referer.


They're a small team, they can only handle/reverse-engineer the M1 as of now.

edit: looks like they are making bigger strides than I thought on M2 as well https://github.com/AsahiLinux/docs/wiki/Feature-Support#m2-p...


How many hours of battery life do you typically get when running Linux on the M1?


I'm still holding out for HDMI output support so I can use it with the dock at work. But otherwise the project looks really good.


Replace it yourself using iFixit instructions and parts. Just screws and plugs, easy peasy.


was going to say this as well. The battery replacement process is pretty easy.


> its battery has been shot for a long time but it still works quite well when plugged in, and I'm too much of a miser to have it replaced

Consider a battery pack? I have a Bluetti K2 and get far more hours running heavy workloads than people report on M1. It's never bothered me carrying it around, though some might care I guess.

My only regret is that it's slightly over the amount that is legal on planes in most jurisdictions.


Guessing using two smaller capacity battery packs instead isn't workable?


The watt-hour limit on planes is a total amount, not per individual device, so I guess you could split it up amongst people if you wanted to.

Gotten so much mileage out of this laptop battery, especially on long-distance trains, but it's a no-go on planes by being so slightly over the limit.


I'm curious. I run Ubuntu in a VM Fusion VM on my (Intel) Mac. Would that not work on a M-series Mac?


Absolutely! VMware Fusion supports Apple Silicon (as does other VM software, like Parallels).


I don't know if this is at all helpful, but I have Ubuntu running nicely inside Orbstack on a Mac Studio. Obviously it's all running inside MacOS though, but it works.


Intel straggler here! When the M1 hit I was certain I would get an M-series Apple in a year or two, once they ironed out the transition. It was just leaps and bounds ahead of others and the pricing was great.

Now I think AMD and Intel have caught up enough, Apple didn't keep the momentum, and they do standard Apple price discrimination - so I think I'll take a gamble on the Framework AMD version. I really like the idea behind a self-repairable, upgradeable device and want to support them. If I was able to work on my overheating Apple Intel i9 PoS for the past few years I'll be better off with anything, so I might as well support the stuff I like.


I have a 12th gen Framework 13", 13" M1 Air, and a 15" M2 Air.

I use the Framework laptop for work because I need to use Linux.

The Framework laptop is mediocre, just like pretty much all PC laptops. The hinges are awful: if you pick up the laptop upright, about 50% of the time the screen falls flat to 180 degrees.

The trackpad is arse in Linux.

If you're lucky you can probably get 5 hours battery life, but on a realistic workload you're looking at 2-3 hours.

The keyboard is pretty nice, but I wish ctrl/fn were swapped like on Apple's and that it had the inverted mini-T arrow keys (or at least I wish someone would make a swappable keyboard for the Framework).

The speakers are bloody awful.

Display/Webcam/Mic are fine.

I would like more ports over modular ports, but I appreciate the design that went into the modular ports.

Speaking of modular ports, sometimes they abruptly stop working and require removing and reseating.

All these small nits really add up and it just feels like a mediocre experience. It is my work laptop, but I try my best to avoid using it in favor of my PC with WSL2 or either Air laptop, though I also try my best not to mix work and personal.

Both the 13" M1 Air and 15" M2 Air are just amazing compared to the Framework, and I suspect PC laptops in general. They have their drawbacks, price (gouging in some ways), less ports, can't drive dual displays, but their trackpad, finish, speakers, etc. are just amazing. I personally prefer MacOS to Linux for a desktop experience as well.


Thing is I use integrated keyboard/trackpad maybe a few times a week in conference rooms, same for battery - 5 hours is plenty for presentations and meetings.

I want a portable workstation that I can occasionally use as a laptop, so build quality and laptop stuff isn't that big of a deal to me. I'm always using a screen via USBC + dedicated keyboard and mouse. Performance and noise are a factor - I'm hoping that AMD versions deliver on that.

I'm leaning towards Framework because if my current MBP dies I can't do anything about it, since it's been out of warranty for years. And eventually upgrading it with a next-gen CPU without having to change storage/RAM, etc. sounds nice.


This is where I am. I have several PCs, a 15" Intel MBP, a 16" M2 Max MBP, an M2 MBA, and a 14" M1 Pro MBP. If I didn't have to use the PCs, I wouldn't because of how good the Macs have been for me. My team and I do a combination of web/app development, media production, and 2D/3D graphics. While we won't be replacing our PC towers anytime soon, our laptops are going to be Macs going forward with the exception of testing on the PCs we have. I can run Autodesk Fusion 360 and Blender on my MBA well enough that it makes no sense for us to force ourselves onto the more mediocre experience.


> Now I think AMD and Intel caught up enough

Interesting. I don't really like Apple, mainly because of how they handle the app store vendor lock in stuff, and I hate macOS. But I use an M2 MBP purely because I can't find any other laptop that has the same fast performance, long battery life, quiet fan noise, no heat.

Can you recommend an AMD/Intel or anything that comes close? I'd switch in a heartbeat. The closest that comes to mind is the ThinkPad X13s.


I just bought a Thinkpad T14S Gen3 after evaluating a bunch of notebooks.

Compared to my Macbook Air the Mac excels in a couple of areas which Thinkpad is lacking in. Like ambient light sensor, port quality (ie. how recessed the USB-C port is and how much strain it can take), audio output quality (Macs have powered headphone jacks for high impedance headphones) and of course speaker quality which on Macs is second to none.

The ThinkPad by comparison has poor-quality speakers and no ambient light sensor, so it doesn't auto-adjust the screen or keyboard backlight. Its USB-C ports are also not very strong, so any strain or wiggle will cause them to disconnect - they certainly feel very fragile.

I hear the Thinkpad Z13/Z16 is more comparable to a Macbook but again it doesn't have little details like an ambient light sensor which seems an odd omission in a luxury laptop and price wise it's practically the same.

That said, the new AMD Ryzen 7840HS and 7940HS chips are pretty competitive with an equivalent Apple M2:

https://browser.geekbench.com/processors/amd-ryzen-9-7940hs

https://browser.geekbench.com/macs/mac14-15


And GeekBench is a pretty bad benchmark.

There's also the 7945HX which is a 16 core CPU, but only comes with big dedicated GPUs, sadly.


> I don't really like Apple, mainly because of how they handle the app store vendor lock in stuff, and I hate macOS. But I use an M2 MBP purely because I can't find any other laptop that has the same fast performance, long battery life, quiet fan noise, no heat.

Buying their products for the same reason everyone else does is "liking Apple".


You can like or use a product without liking the company making it or its practices


True. eg "For my last project, I used JavaScript, in anger."


Your only relationship to the company is via the goods and services it sells to you. If you buy their products, you like the company and their practices. The product is the culmination of all of their practices.


"I don't like being a galley slave but if I stop rowing, they whip me"

"Rowing for the same reason everyone else rows is 'liking slavery'".

Counter-proof by reductio ad absurdum.


So you admit of being a prisoner in an ecosystem built to ruthlessly exploit you...?


I own zero Apple laptops, desktops, watches or subscriptions, and one low spec. 2016 iPhone bought second hand. That's hardly being a prisoner or being ruthlessly exploited, is it?

If you hate Disney's Frozen because you've seen it so many times, but you watch it with your daughter because she likes Frozen, does that mean you like Frozen because you're doing the same thing that people who like it are doing? No; people's motivation for the same thing can be different. People can do thing they disapprove of, for reasons other than approving of thing.


But as far as the market is concerned, if one has a choice and exercises it, they liked the product better than the alternatives. Nobody on the Apple galley was ever really forced to be there.


Even ten seconds thought should show you the problems with what you are saying. Even if you take it on aggregate, a million people buying Apple over Dell means Apple is doing something to attract money - it still doesn't necessarily mean people like Apple more, it could be that Apple are lobbying congress to require Apple for schools, or Apple offering employers a massive discount, or Dell having supply chain issues so the vote is really about availability not technical preference, or etc.

As far as the market is concerned, if prisoners like rotting food more than starving then they "like (and approve of) rotting food" in general. People understand more than just pricing signals and can see the problems with that reasoning. At least, I can. In a similar way that the playground challenge "would you rather (horrible-thing A) or (horrible-thing B)?" is hilarious because whichever you pick, it means you actually like horrible-thing, haha ha! This reasoning is "the market is a good judge of character and if it has judged you, then you must have that character". Another option is "the market works on very few signals, most of them monetary, and that makes it a poor judge of character and motivation and other higher human concepts".

Another comparison is the web adverts which reason "if you bought a lawnmower then you will want to see adverts for lawnmowers because we have strong evidence that you are interested in lawnmowers" - as we warn about markets, past performance is no guarantee of future interest in lawnmowers. It could mean you are a lawnmower enthusiast, but it could mean you are disinterested in mowers and but have a lawn and an obligation to mow it, it could mean you choose whichever is in stock at the closest store which is no vote about the product and only a vote about your local store's stock levels. It doesn't mean that if you bought a lawnmower instead of a private jet that's because you like FloorGoo lawnmower F6240h PowerMow+ with leaf-eviscerator extension more, only that you can't afford a private jet. It could mean you ran over your neighbour's lawnmower with your truck and are buying a replacement of whatever model they had. It could mean your kid wants to earn money by mowing lawns over summer and you wanted to support them by funding the mower, or that you got a job at a lawn care firm and need to buy the tools they use for yourself, it could mean you took advice on which mower to buy from a friend and don't have any personal knowledge of mowers, or that a shop had a sale, or that a shop had an attractive smooth talking lawnmower salesperson, or that you had a breakdown and need to finish mowing before the rain due at the end of the day and had no time to do any research or like/dislike anything, or that you are supporting a mow-for-charity drive for a group going to mow churchyards and you are just buying within the budget that the volunteers donated, or that you aren't technical and misunderstood that you need an Apple lawnmower for your bank app because they advertised and told you that you did because companies employ highly skilled professional manipulators working full time to frame and distort the truth and your access to information in their favour...

You might also argue that if I want a silent laptop with long battery life that is a "proper *nix" and not a limited WinRT/ChromeOS/Android then I don't have a choice and am therefore not exercising a proper informed choice so the signal from it is misleading.


I use my laptop as a portable workstation so I rarely use built in keyboard, touchpad, battery, etc. From what I see AMD 7840 CPUs deliver similar performance to M2.

I think I'll get a Framework 16 or 13 with AMD in a few months; I'll make sure Linux drivers are in order before ordering.


Wasn't AMD pretty much right behind in single thread perf and a bit behind in peak perf TPU, but always ahead in multi-core perf options?

I'm very happy with my Ryzen 7 6800U.

Matte screen? Check.

Lots of full perf cores? Check.

USB A and C? Check.

Really good battery even though I'm plugged in all the time? Check.

X86-64? Check.


The big thing I like about the M's is how quiet they are. I hate fan whirr in my old age.


I'm one of them. For work, come and grab my 2019 16in from my cold dead hands - it may not have the battery lifetime of my private 2022 M2 MBA, but at least I don't have to fight weird issues with x86 Docker images (especially anything involving a JDK runtime, i.e. Tomcat, tends to act up). And no, converting these images to ARM isn't an option, the source image doesn't come from us, and we need to reproduce the exact software environment to reproduce bugs.

And for me as a Samsung phone user, I'm pretty annoyed that I have to drag out a win10 machine every time I want to update the firmware of my phone because Odin is only available on Windows and UTM can't use Rosetta to emulate a Windows VM at any acceptable speed or stability.


2019 16"cher here too. These are great machines just eclipsed by all the Apple silicon hype. Yes they suck comparing to performance, power use and heat to Apple silicon, they're still amazing x86 laptops compared to whatever other x86 machines are out there.


I have 1 of these for work and 1 for home. I'm waiting 5 years from release, so about 2024-2025. Computers got really good around 2017, and the only reason to upgrade is the heat and fan. I use remote VMs anyways, so chrome is really my limiter.


So an occasional firmware upgrade is holding you back? If you install the guest drivers, windows is superfast under UTM.

The docker thing is nonsense if you're using orbstack for example.


> So an occasional firmware upgrade is holding you back? If you install the guest drivers, windows is superfast under UTM.

The guest drivers don't work under Win7.


I think the point is more that "is it worth buying this new m3 or should I buy a secondhand m2 for half the price?"


more like "a hundred bucks less". The M1's go for maybe half... if you're lucky. Most are more.


Because there's a newer model, which is now true for the M2 ones too?


M1s took a hit because multiple processors came out each gen. Remember: the M1 was superseded by both the Pro & the Max (and the Ultra, but that’s not in a MacBook).

By the time the M2 came around people knew what was up. So you might be able to find a bog-standard M1 that low (they do show up) due to this… but it’s rare.


I don't think you're going to find any M2s for half the price. They hold their value pretty solidly.


They won't if the M3 turns out to be a significant upgrade.


That’s honestly just not how Apple hardware resale value has ever worked. The new M3 hardware could double the performance of the previous generation and they would only drop in price by maybe a few hundred.


One big outlier: the recent Intel Macs have dropped far more in value than expected.


And that outlier is the last /significant/ difference in performance.

M1->M2 wasn't that big, generally seen as a small incremental improvement at best, and the last few Intel updates barely moved the needle between them.


I see ones like mine on eBay for more than 50% of what I paid for it, and it's four years old.


Or should I buy a brand-new M2 Mac Mini for half the price of the M3 iMac if I already own a monitor? Or an M2 Pro one for the same price, where the M3 iMac doesn’t even have a counterpart?


I’m an Intel strangler because it’s the only way I can have Mac and Windows on the same machine. Am I the only person that really needs this? Parallels has trash performance.


Hilarious typo — I picture someone throttling an Intel machine (which sadly throttles you right back).


Ha I just noticed it!


I bought a cheap Windows crapbook just to run the two Windows-only apps that I need (terrible apps for alarm systems and photovoltaic inverters).

Apple Silicon laptops are so much better than Intel-based machines, there is really no comparison.


There are PV inverters with Windows desktop apps instead of a mobile or web app? Can I ask which one?

(Just curious; I worked in the industry on the web app side)


I got a NUC for whenever I need windows and just RDP in it. It mostly collects dust.


I miss Boot Camp too. (Only really used it for gaming). But the M2 is so nice that I jumped ship despite knowing I'd no longer have Windows.

For the rare times I need it, I sometimes use a cloud VM (like Shadow.tech if you need a good GPU, otherwise any Windows VM I guess).

Also, for any GPU-intensive apps (not sure if CNC counts), it might be worth giving Game Porting Toolkit (the other GPT) and Whisky (a GUI on top of that) a shot? Might run better than Parallels.


You can run ARM Windows under Parallels, which is good enough for me.


Just throwing this out there: ARM Windows uses Microsoft's own ARM to x86 translation layer (like their Rosetta), which works surprisingly well for run-of-the-mill business apps and the like. 3D apps do take a big hit though.

https://learn.microsoft.com/en-us/windows/arm/apps-on-arm-x8...


Ah just updated my comment. I really dislike how slow Parallels is.


For gaming sure... but anything else... hypervisors are very good these days.


I need it for windows only CNC software. It’s pretty terrible for that purpose.


Would that software perhaps work under WINE/Crossover?


Funny, I never notice that.


I'm still running Windows on my Mac mini. I used to use Bootcamp but don't do so much Windows stuff now so can get away with VMWare. After June next year I can drop Windows completely so will likely upgrade then.

I'm not sure how VMWare compares to Parallels but performance isn't as good as Bootcamp. I wouldn't want to use it all day long.


Parallels is overpriced rubbish with stupid limitations. I’m switching to VMware as of right now.


Parallels used to be front-runner with better support for new macOS features. vmWare used to lag behind with weird issues. This is why I switched from vmWare Fusion to Parallels Desktop a few years back. What are these limitations you speak of? And what makes it rubbish?


8 GB max of RAM is a software limitation not present in the more expensive tier. I suspect it is not a technical limitation, just greed.


Probably the biggest complaint is that Fusion is now free (for personal use) while Parallels is a subscription.

Also, you can get other free VM managers on Macs now, since macOS ships with a hypervisor and a VM framework.
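
For anyone wondering what that built-in VM framework looks like from code, here's a minimal, untested Swift sketch against Apple's Virtualization framework; the kernel path and sizing are placeholders:

  import Foundation
  import Virtualization

  // Minimal sketch of Apple's built-in Virtualization framework
  // (macOS 12+, requires the com.apple.security.virtualization
  // entitlement). "/path/to/vmlinux" is a placeholder.
  let boot = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/vmlinux"))
  boot.commandLine = "console=hvc0"

  let config = VZVirtualMachineConfiguration()
  config.bootLoader = boot
  config.cpuCount = 4
  config.memorySize = 4 * 1024 * 1024 * 1024  // 4 GiB
  config.serialPorts = [VZVirtioConsoleDeviceSerialPortConfiguration()]

  try config.validate()
  let vm = VZVirtualMachine(configuration: config)
  vm.start { result in print(result) }  // a real program would keep a run loop alive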


have you tried using VMware Fusion at all?


my intel died last week. had to get an m2 pro and THIS drops today. kinda salty that my intel couldn't have held out till the M3 was available


Apple has a 14 day return policy. Don't need a reason.

https://www.apple.com/shop/help/returns_refund


This is what returns are for! It is definitely worth the hassle for a better chip


I bought a top-spec Intel Mac a month before the M1 came out. Just sold that computer for 1/3 of its original price :\


I don't recall if the M1 was a secret/surprise (I doubt it).

Before buying any Apple hardware, it's worth looking up a couple of sites that suggest when the next release cycle will be for a given item and what specs are expected in the next release. The one I use is https://buyersguide.macrumors.com/


M1 was not a surprise.

After years of waiting for a non-butterfly keyboard, they released a working Intel based MBP, but almost immediately made it worthless by announcing the migration to Arm. They even "rented" prototypes using A12Z chips for developers to use to prepare their apps nearly a year in advance. Hardly a stealth endeavour.


1/3 is pretty good for a used computer, TBH.

I usually get like 20% or less... sometimes they're hard to even give away.


Fair, this one was in literally perfect condition though. No scratches / damage and a brand new battery.


It's like buying a car. The moment your Mac walks out of the store, the ghost of Steve Jobs comes and personally curses it to 2/3 of its value that very instant.


Except with cars, that loss is against the sticker price. So if you get a good enough deal on your new car, it can in fact lose nothing when you drive it away (I got close to 1/3 off a new car in 2018, using a web-based buying agent, in the UK).

It's very hard to get anything like that much off a Mac, since Apple appears to have pretty tight control over prices, even as charged by third parties.


“Last week” is well within the 2-week return window.


unless you went second hand, you should still be covered with the 30 day return policy no?


wow it's 14! definitely need to hurry up :)


Intel straggler here. My laptop just turned 5 years old. It says the battery needs replacing, which I’ve been putting off for a year. The thing is, I have 32 GB of RAM, which is quite good IMO, so I’ll probably use it for at least two more years.


You're just hurting yourself. Even the M1 Pro is like 3x faster than that Intel laptop. Plus so many tiny things like sleep/wake, power management, heat.. I really can't stress enough what a tectonic change the M1 was over Intel.


Depends on what they're doing with the machine. For casual use, Intel is perfectly fine and you do get some x86 benefits too. I have an M1 Pro for work and a personal M2 Mac mini, and I'm typing this on a 16" Intel MBP; I watched the presentation live and am still not convinced I need to upgrade my casual-browsing and light-programming machine this year. The battery is at 500 cycles and 84% health, and still lasts 4 hours.


Exactly! I have a 2012 MBP and a 2015 MBP that are still kicking and work perfectly for my kids playing Roblox or Minecraft and any web browsing we want to do with a proper keyboard.


It really, really was. I’d been used to barely-noticeable performance gains for a decade or more. The responsiveness under load of the M1s made them feel like a decade of performance improvements overnight.


This was especially true with the Apple/Intel thermal envelope. I don't know of any other manufacturer that purposefully ran their CPUs to the point of throttling just to keep fan noise down, and in such thin machines with wafer-thin heatsinks for design aesthetics. I think they made the last few generations of Intel chips they used worse than they really were (to be fair, Intel did have poor TDP until the recent 12th gen).


I've got an M1 mini already, so it's good to have the Intel MBP around for occasional x86-only stuff. Built-in USB-A is also convenient.


But how will I ever survive without my touchbar?


Maybe an aftermarket keyboard? I know you're not serious, but some aftermarket keyboards have little OLEDs inside the keys. Also, the Apple Vision Pro headset might somehow provide you with that one-extra-awkward-step-to-accomplish-a-task interface that it seems like you might be humorously craving.


For years now, Apple's biggest competitor is Apple from five years ago. One of their biggest threats is that sales flatline because people are still using their perfectly good laptops and phones from several years ago. And that's a hard problem to deal with when you advertise your products as high-end goods that will last a long time.


They have solved this though, by periodically introducing incompatible macOS updates: eventually you can no longer install the current macOS, then after a while you can't install the latest versions of applications, and you start to feel a stronger need to upgrade...


Six year OS support, plus security updates past that.


Not as long as Windows support but still good enough.


This shouldn't be a problem, and the answer is to refresh devices less frequently.


The whole company is sized to a specific revenue stream, and this revenue stream requires a specific sales volume. Apple could get off Mr. Bones' wild ride, but this would require remaking how the whole company works:

- less frequent laptop releases mean lower sales (do you want this 2020 Mac or this 2023 Dell?)

- lower sales mean lower revenue, lower revenue means lower costs

- so now Apple has to spend less on R&D and at the same time convince its board of directors that lower sales don't mean Apple is failing


It objectively is a problem for Apple as a corporation though, as they are expected by their shareholders to continue to grow and increase profits year over year.


Only in a modern capitalist society etc etc. One thing Apple has diversified into in the past few years is its non-hardware offering; iCloud earns them billions and Apple TV is churning out massive productions, a 3.5-hour Scorsese flick in cinemas for example.


Why would they do that? It wouldn't solve the problem @rueeeeru stated. They need to remain competitive and you can't do that by sitting on your backside for four or five years between products.


I have family still using the 2017 Macbook Airs, happily. A year ago I replaced the battery on two of them (from OWC which still offers high quality ones). Every year I ask them if they want to upgrade to a new model Macbook and they refuse. Meanwhile I have other family on Chromebooks that they've had to replace twice already because they are too slow, or the hinges on the screen have worn out, or the charging port got too annoyingly loose.


The comparison with M2 is not very useful at this point. Throughout the presentation, the gains were something like 15-20% in some areas.

Even doing their own chips, Apple cannot achieve a revolutionary performance gain year over year; that's why between two consecutive generations we'll be seeing gains like this for a long time. And since their idea is for people to be using Macs for many years, it makes sense to compare against older generations to try and persuade people to do the upgrade.


Not just that, but if you're not running ARM images then it's running a VM for your docker images to boot. It's just mind-blowing how many containers I have running and my laptop is cool to the touch, and at full battery.


You’re running a VM either way on a Mac. Docker without a VM is only available on Linux.

EDIT I get your intent though because it runs better when the images match the host.


Yep! But at least it can leverage hyperkit and vastly reduce overhead when running images of matching architecture.


hyperkit does not run on ARM Macs

  $ brew install hyperkit
  hyperkit: The x86_64 architecture is required for this software.
  Error: hyperkit: An unsatisfied requirement failed this build.


Oh I didn't realize, thanks for pointing that out. I guess they'll have to rely on the new virtualization framework going forward.


MacOS can run MacOS containers natively, but I understand that's not much help for most people.

https://macoscontainers.org/


Yeah, but that also requires disabling SIP, so you still might want to run it in a VM.


Aw. I wish there was a way to isolate corpoware crap like Citrix into its own little jail. Kinda like {tool,distro}box on Linux.


That’s interesting. It’s only at version 0.0.1 though, but it seems like a possible drop-in replacement.


Without replacements for network, IPC, PID and cgroup namespaces it's not even close. How does Windows Server do it? By using parts of Hyper-V.


And on Windows while running Windows containers.


Before anyone else gets confused...Windows containers are only for Windows applications. If you have a Linux environment in your Docker image, then it's running on a VM. https://learn.microsoft.com/en-us/virtualization/windowscont...


Of course, what is there to be confused about?

It says it clearly in the name: Windows containers.


This isn't as much the case anymore. There are typically arm64 variants available for the more popular images, and anything you build locally is native, unless you're trying to build x86 software.

Redis, Memcached, PostgreSQL, MySQL- it's all native arm64 images now.


Even using ARM images you still need a VM because the MacOS kernel is not a Linux kernel, and containers rely on features of that kernel.


That only really helps if you also want to deploy on arm64. Developing on one architecture and deploying on another would kind of destroy one of the advantages of containers.


> if you're not running ARM images


The point was that there is rarely a reason to not run ARM images, since they’re widely available.

If I’m running M3 images on my Ryzen, performance is gonna be horrible, but why would I?


Today? Sure. Thus my statement, which came with a caveat about not running ARM images, the implication being that it's a rare thing today.


Do note that cool to the touch is a bad thing if the internals are otherwise hot


If the all-aluminum chassis remains cool to the touch despite boiling internals, that's impressive in its own right :-)
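
You can actually sanity-check that case in software; a minimal Swift sketch of macOS's coarse thermal-pressure signal, which stays at .nominal when the internals aren't under pressure however the chassis feels:

  import Foundation

  // Minimal sketch: macOS exposes a coarse, system-wide thermal signal.
  switch ProcessInfo.processInfo.thermalState {
  case .nominal:  print("nominal - no thermal pressure")
  case .fair:     print("fair - starting to warm up")
  case .serious:  print("serious - throttling likely")
  case .critical: print("critical - heavy throttling")
  @unknown default: print("unknown")
  }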


I was more surprised not to see any AI-specific benchmarks—sure, the most popular open-source models are from avowed competitors, but there should have been a way to define a re-training task that would be relevant for the wave of ML programmers.


They didn't improve the neural engine. M3's NPU is half the speed of the A17 Pro in iPhone 15 Pro.


I had the same thought. It seems like the improvements there weren't worth talking about.

AFAICT, the biggest benefit is just the unified memory model at this point.


It won’t excel at this. It is a mobile GPU and won’t be able to put up remotely similar numbers to a desktop GPU with massive amounts of power and cooling.


There were comparisons to the M1 and M2 in the presentation, and indeed the linked website.


The M3 chips are 30% faster than the M2 chips for efficiency cores and 15% faster for performance cores. The overall performance is still impressive and there is no alternative to M*s when compared with performance per watt. But others have caught up, for example https://www.amd.com/en/products/apu/amd-ryzen-9-7940hs is very good.


I note that if those performance numbers are correct it'll still be a lot slower than Intel's Core i9-13980HX laptop CPU. The M2 Max was between 50% and 80% of the speed of the 13980HX on most benchmarks. A 15%-30% uplift will get it closer to the Intel part but still not reach it.


Yeah, but the problem is that 13980HX has 24 cores, 32 threads and incredible 157W Turbo Power. That means it will be really slow when on battery. And when connected to the mains, it'll be as loud as an airplane taking off.

The perf per watt claims still apply.


Good point.

However, when they’re that close in performance, I think power usage should be taken into account. It’s a laptop CPU after all. Maybe I’d rather have 80% of the performance at 4x the battery life


Did you account for the M3 Max having four extra performance cores (12 versus 8 for the M2 Max)?


> But others have caught up

You mean others now got to use TSMC's 4nm process? These new M3 chips are probably on the 3nm process, so Apple is still a generation ahead here.

It looks like Apple's aim is to always stay a generation ahead of its competition. I wonder how long they can keep that up since they aren't running their own fabs.


Oryon seems to have caught up in performance despite being on N4 (which is just 5nm++), but it's an ARM design from the guys who made M1 in the first place.

M3 seems like it generally underwhelms. A17 had like a 3% increase in IPC. They didn't discuss battery much and I suspect that's because ramping the clockspeeds to over 4GHz isn't so good for the battery benchmarks.

The most worrying part is the transistor count. M3 Max gets quite a bit higher transistor count, but M3 is only 37B transistors while M1 was 33.7B. Apple and/or N3 absolutely suck here.

Looks like I might be keeping my M1 system for yet another upgrade cycle.


the backstory here is that TSMC N3 is a trainwreck, it's now separated into two different nodes, N3B (for bad) and N3E (enhanced). N3E gets the promised step that was originally for N3, but it only enters volume production next year. Supposedly it will actually bring costs down (and yields up) because this is where some additional steps go EUV. Both TSMC and Samsung have been fucking around with their marketing around nodes to try and say they're first in volume 4nm production, but both are having problems with the final bosses of FINFET at 3nm and after this both TSMC and Samsung do GAAFET and solve a different set of problems. Past that lies... nothing. Hyper-NA seems dead.

In the meantime, M3 is on N3B despite very low yields, which surely applied design pressure to keep size down, and the power gains are not as good, and also the density is worse than promised. Apple also surely feels pressure to keep prices high (bait-tier M3 base option with 8GB lol) and honestly they probably are going to be tough to justify on a performance/efficiency basis compared to very fierce ARMv8 competition (we are now testing the thesis there's no difference lol). Apple still has advantages but man do they take you to the cleaners for the result, a loaded apple laptop is obscene. I chose to go for an older loaded M1 Max instead of waiting for M3, because I could actually get a nice laptop that wouldn't impose limits on a prosumer etc. 1TB is all anyone can afford still and that's really silly.

(SSD prices in particular are absolutely inexcusable lol. Mandate a M.2 NVMe 3.0/4.0/+ port please, EU, it's time. Don't care how it works, slot it into the side or whatever if you want, it can be single-sided 2230 or 2242 if you want (or caddy-loading, the icy dock standard lol), but it's time.)

https://global.icydock.com/products-c5-s48-i0.html

https://global.icydock.com/vancheerfile/images/mb873mp-b_v2/...

I also wonder if losing a bunch of the PA Semi team to whatever startup (it may have been nuvia or tenstorrent lol) may also have hurt apple's velocity on A16/A17. There were a lot of apple silicons before M1, after all. But certainly TSMC is a bunch of the problem here.

I think they'll hustle to refresh it and do M3+ as a fast follow in 6-12 months with N3E, the cost economics are very favorable to jump as soon as there's the volume. That doesn't mean MBP gets refreshed immediately though, they'll ramp on the phones (iphone 16/16 pro, etc) and base-tier M3+ or whatever first.


>the backstory here is that TSMC N3 is a trainwreck, it's now separated into two different nodes, N3B (for bad) and N3E (enhanced).

Without N3B, there will never be N3E.

It was the same with N7. And none of this is new for a new node generation.


Apple signed a deal with TSMC to purchase nearly all of their available 3nm chips for the next year. Ethical or not, they positioned themselves to ensure almost no competitors could develop on 3nm until they did first, since no fab on earth has the scale of TSMC. They could ride this plan for years if it pays off for them.


not only is outbidding the competition not unethical (as a sibling notes), apple actually is very involved in the early node work etc. a lot of this work is literally done for apple, it is "exclusive games" in the "this game would not have been made without the sponsorship" sense. this literally would not have been brought to market on the same timelines if Tim Apple wasn't signing a couple billion dollars a year to TSMC right upfront.

Apple pays lavishly to support TSMC's early node research, and they get their say in what happens in the R&D process, and very early insight into the node and their say on how it would work for them as they do their rollout. TSMC gets carried through the research phases much faster than their competitors can do, and it's led them to be on an absolute tear starting with 7nm. And they absolutely cannot fill the same level of demand with the same level of R&D funding from any of their competitors.

It's been a healthy, productive long-term partnership, TSMC is maybe the only supplier Apple can't boss around and Apple is certainly a client that is always too big to fire. Doesn't mean every Apple product is good (and TSMC can still flub, and their competitors are catching up a little bit) but Apple can move whatever they need to lol, they are masters of supply chain management. They can cover TSMC's mistakes if needed, and they have insight into exactly what is happening as the node is developed and how they need to maneuver their product stack around to exploit it.

Engineers study designs, CEOs study logistics. Also true of NVIDIA btw lol, they are very logistics-oriented because they make up such a large marketshare. How many companies on the planet are ordering big bulk runs of GDDR? Well, if we are ordering 20% of the planet's GDDR on a fixed timetable then maybe we can get a custom version, micron, right? (9 months later, GDDR5X/6X is born lol)

It is an interesting contrast to Intel - this is almost the same kind of synergistic relationship as intel's own fab and IP side have historically had together. Did intel fail because they had a tight fab-design coupling, or did they fail because they had a rotted internal culture and then the fab slipped a bunch?


How is Apple buying all the production slots for a process in any way unethical? It's not like they're buying them and then burying the chips in a landfill. They paid TSMC's asking price for production slots. AMD or Intel could have bought those same slots but didn't. TSMC has limited capacity at 3nm, it was up for sale, Apple bought it. Where's the ethics question?


I’ve seen arguments for ‘business ethics’ before from the losing side. It’s sometimes part of a media campaign. It’s likely cheaper than legal action.


>How is Apple buying all the production slots for a process in any way unethical?

This has been the case on HN for more than 5 years. Intel's fabs used to sell their industry-best node only to Intel themselves, and Intel charged a premium for those newer CPUs. I guess that is unethical too.


How is Google buying search defaults on all iPhones in any way unethical? It's not like they're buying them and then burying the searches in a landfill. They paid Apple's asking price for search defaults. Microsoft or Brave could have bought those same search defaults but didn't. Apple has a limited amount of search defaults on iPhones, it was up for sale, Google bought it. Where's the ethics question?


Not running their own fabs makes it easier, right? They presumably pay a premium for being the first to use a generation, but not anywhere near as much as the full cost of developing it, since TSMC can sell that capacity to everybody else when they are done with it.


They are literally on the 3nm process. They say so in the video.


OK, so no "probably": they are on 3nm. Anyway, my point that they are a generation ahead still stands.


> But others have caught up

Then you proceed to link to one that... hasn't? (it's good, yes, but it's not caught up at all)


I still know a number of my colleagues who haven't jumped onto the Apple Silicon machines despite our 2019 Intel machines being out of warranty (and thus eligible for replacement) for two years now


I'm still using a 2015 macbook for my personal daily driver. I have to plug it in regularly, but otherwise it works just fine for everything except video editing.


I thought the same, but ended up upgrading to an M1 Air due to a hardware failure.

Sure it feels 100x faster. But more than that it’s DEAD SILENT (no fan in the machine!) and completely cool.

The temperature/noise, if I had experienced it myself first, would have probably gotten me to upgrade.

I use a 2019 Intel MBP at work. It’s much faster than my old 2015 too, but with the additional heat and noise I didn’t really want one.

I would have taken the noise/heat of the M1 + 2018 or 2019 performance. Instead I got heat/noise of the M1 and far better performance, for a fraction of what my 2015 cost (unadjusted) new.

Amazing upgrade.


I got an M2 Pro Mini a couple of weeks ago to replace my 2018 i7 MBP, and while it is obviously snappier and you can feel it's more powerful without running benchmarks, the main difference is that it is silent. I only got it warm to the touch rendering video with DaVinci, when the CPU temperature quickly went to 75ºC.


Are you able to replace the battery on those? I have done battery replacements on 2017 model Macbook Airs to great success. I get the batteries from OWC.


Any reason for not upgrading?


> Any reason for not upgrading?

Why is this sort of question always framed as if people have to provide a justification for not buying a new model, as if spending money on the new shiny without any reason is normal or desirable?

We live in a day and age where hardware bought a decade ago still packs enough punch to run most of today's software without any hiccup. Why would anyone waste their cash to replace something that works without having any compelling reason?


Some people have experienced downsides you might want to know about.

For example as you passed 2016-2019, not only did you have the butterfly keyboard mess but each generation reportedly got hotter and thus louder.

My 2015 was quieter/cooler than my 2019.

So if you’re happy it may turn out that even though the newer machine has better performance it feels like a downgrade for other reasons.

(I don’t think that’s the case here)


2019 16" no longer used the butterfly keyboard, they have a physical Esc key and something Apple called "magic keyboard", really a scissor based mechanism that's really pleasant to type on and doesn't stop working with a tiny bit of dust under the cap.

https://www.macrumors.com/guide/butterfly-keyboard-vs-scisso...


The 2019 MBP mostly fixed the issues introduced in 2016, but the lack of USB-A and HDMI ports was enough for me to pass on it. Also, the 2015 one is prettier.


Ah I miss the lit up Apple logo.


Yeah I would definitely miss that, as well as the shape.


The new Magic Keyboard, or new scissor keyboard, has a key travel of 1mm; I wouldn't describe it as really pleasant.


I'm rocking a 2017 MacBook Air and it's great; really the only downside is that Apple stopped supporting it with macOS releases past Monterey. Almost all of my past and present Apple hardware has long outlasted its software support, which is my biggest complaint with the company.


It would depend on the workload. On my old Intel MacBook, I was looking at an hour or so to build, and it could only complete one build on the battery, if that. Testing took a similarly absurd amount of time.

The M1 dropped both times by something in the region of 30 minutes, could do multiple rounds on a single charge, and didn't make a tonne of noise while doing so.

The amount of time savings you get from the improved CPU perf is quantifiable, and you can assign a monetary amount to that time.

Now if your use case is not performance (cpu, battery, etc) limited then of course there's no reason to upgrade, ever really, but that would apply to any laptop or pc not just Macs.


Because the productivity gains from upgrading can be quite large relative to the cost of upgrading, especially when you factor in the average salary in this community.

I spend 8-10 hours on my Macbook every day. The amount of time I've saved / productivity I've gained by things just running faster and by being more mobile (much longer battery life) is huge compared to the $2000 price tag.

Frugality is good but there are some things in life (depending on your personal circumstances) where it does in fact make sense to upgrade for clear benefits.
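
To put rough numbers on that argument, here's a quick back-of-the-envelope sketch; every input is a made-up placeholder, not a claim about anyone's actual salary or time saved:

    # Break-even estimate for a laptop upgrade. All inputs are illustrative
    # assumptions; plug in your own numbers.
    hourly_rate = 75.0          # assumed value of an hour of work, USD
    minutes_saved_per_day = 10  # assumed time saved by faster builds/tests/etc.
    working_days_per_year = 220
    laptop_price = 2000.0
    value_per_year = hourly_rate * (minutes_saved_per_day / 60) * working_days_per_year
    print(f"estimated value recovered per year: ${value_per_year:,.0f}")  # ~$2,750
    print(f"payback period: {laptop_price / value_per_year:.1f} years")   # ~0.7 years

The point isn't the specific numbers; it's that the calculation flips quickly depending on whether the machine is actually a bottleneck, which is exactly what the rest of this subthread argues about.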


Really depends on what you're doing.

I spend 8-10 hours a day coding on my 2018 MBP (web apps - Postgres and Rails or Python) but almost none of that is really CPU-bound in my case. The meat of my work, the actual coding and iteration, is not limited by the aging CPU.

The one thing that's painfully slow is rebuilding Docker images, but we don't do that too often. Less than once per week.

I actually am upgrading soon, but it is not going to make an amazing difference for me in terms of productivity in my current work.


> Because the productivity gains from upgrading can be quite large relative to the cost of upgrading, esp when you factor in the average salary in this community.

You see, this is simply not true. At all. By far.

I have a cheap Intel laptop released 8-10 years ago. It shipped with 8GB of RAM and 4 cores. I bought it on a clearance sale for around $500. I use it still to this day to work on webapps, including launching half a dozen services with Docker Compose. The only time I experience any type of slowdown is when I launch IntelliJ.

I also have new kit, including a M2 MacBook.

There is absolutely nothing I can do with my M2 laptop that I cannot do well with my cheap old Intel laptop. Nothing. The only issue I have with my old laptop is battery life, and that's just because I don't bother replacing it.

Please do point out a single concrete example of "productivity gains" that I would get by spending $2k on a new laptop.


> There is absolutely nothing I can do with my M2 laptop that I cannot do well with my cheap old Intel laptop. Nothing.

[…]

> The only time I experience any type of slowdown is when I launch IntelliJ.

I can't tell if you're a serially dishonest interlocutor, or whether your fetish for making very emphatic generalisations with lots of intensifiers makes you seem like one, but once again this is very weak reasoning. You have yourself pointed out something you cannot do with your Intel laptop which you could with an upgrade.

> Please do point out a single concrete example of "productivity gains" that I would get by spending $2k on a new laptop.

You can run IntelliJ smoothly and have no battery life issues. (Literally from your own post… it's just so sad to see this utter lack of self awareness.)


The only downside would be struggling with 8GB, which should be upgradable as well. A 10-year-old laptop would have a CD/DVD tray; that can be replaced by an SSD for 4TB of goodness (SATA, but still good enough).

My spouse has a 12-year-old laptop that has had pretty much everything (but the soldered GPU) upgraded: CPU, memory, HDD->SSD, CD bay->SSD, WiFi (to support 5GHz), keyboard (replaced), fan & heatsink, battery (replaced, might rebuild one w/ LG's 18650 MJ1). Unfortunately pre-Sandy Bridge memory is capped at 8GB, so it shows its age. Still an amazing thing.


I love responses like this: we should think hard first about why NOT to upgrade, instead of the reverse.


Why do you have an M2 Macbook?


It is an honest question.

For some background on my thinking: it is because today's announcement really focused on Intel users. My semi-educated guess is that Apple did a whole bunch of user studies and realized there are a lot of people out there, like the OP, who haven't upgraded yet, hence the focus. As a result, I'm genuinely curious why this person hasn't upgraded.

And for my own personal experience, the upgrade/switch from intel to m*, is night and day better ergonomics as a developer. It isn't just some shiny new toy or a waste in cash. For the same reason professional mechanics in F1 don't use shitty tools to work on their cars. Or tour de france racers aren't using 30lbs Huffy bikes.

TLDR: I don't give a f'ck if you don't happen to upgrade, that's your choice. I'm just curious about why.


    For the same reason professional mechanics in F1 
    don't use shitty tools to work on their cars.
They also probably don't buy new wrenches every time new wrenches are released, if their current wrenches are completely sufficient and not holding them back in any way.


> if their current wrenches are completely sufficient

That's a fantastically entirely subjective opinion.


> That's a fantastically entirely subjective opinion.

That's the point: objectively, there is absolutely no concrete reason that justifies replacing a MacBook bought in the past 3 or 4 years with the M3 ones. None at all.

In fact, it boggles the mind how anyone could justify replacing any MacBook pro with a M3 one by claiming "pros don't use shitty tools", as if MacBook Pros packing an Intel core 7/M1/M2 suddenly became shitty laptops just because Apple released a new one.


> That's the point: objectively, there is absolutely no concrete reason that justifies replacing a MacBook bought in the past 3 or 4 years with the M3 ones. None at all.

Again, what you mean to say is that _you_ cannot think of a reason that would make _you_ upgrade from a 4 year old MacBook to a new M3 one.

> objectively

Do you understand that what you say is literally, definitionally, subjective? It's one thing to make primitive and clumsy generalisations, but quite another to be confusing subjectivity and objectivity.

> it boggles the mind

Starting to believe there isn't a lot of mind to boggle here…

> how anyone could justify replacing any MacBook pro with a M3 one by claiming "pros don't use shitty tools"

I haven't noticed anyone making this argument, but I know many people who upgrade their tools -- whether computers or otherwise -- to the latest and greatest whenever they can, because working faster and more efficiently is a concrete benefit, and it really would take an inestimable moron to, say, argue that late Intel-era MacBooks can do the same things that M-series MacBooks can.


    I haven't noticed anyone making this argument
Yeah, you haven't read this thread.

Not that you missed anything of value. A previous poster, latchkey, quite literally made that argument:

    "the upgrade/switch from intel to m*, is night and day 
    better ergonomics as a developer. It isn't just some 
    shiny new toy or a waste in cash. For the same reason 
    professional mechanics in F1 don't use shitty tools 
    to work on their cars. Or tour de france racers aren't 
    using 30lbs Huffy bikes"
As to this assertion:

    it really would take an inestimable moron to, say, 
    argue that late Intel-era MacBooks can do the same
    things that M-series MacBooks can. 
In terms of raw performance and power efficiency, obviously the Apple Silicon laptops trounce the Intel-based Mac laptops.

But if you spend some time learning about our industry you'll realize that not all development workflows are identical, and not all have the same bottlenecks, and for many tasks an Intel-powered Mac is not a bottleneck. Surely you can understand that, or aspire to understand that.

I would certainly agree with a more generalized and reality-based version of what you and the other poster seem to be attempting to say: If your current hardware is bottlenecking you in any way, you should most definitely address that if at all possible. A hardware upgrade that unbottlenecks you and improves your developer ergonomics will almost certainly pay for itself in the long run. That is sane and profitable advice and something I've always done.


Thanks, I had missed that. It contains the phrase "don't use shitty tools", but I'll leave it to you to decide whether OP honestly recapitulated the same argument in their passing reference. The two seem somewhat different to me.

> As to this laughable claim […]

This is a response to a specific point which rewmie has made several times. They seem to genuinely believe there is literally no difference between M-series and Intel chips:

> There is absolutely nothing I can do with my M2 laptop that I cannot do well with my cheap old Intel laptop. Nothing.

> there is absolutely no concrete reason that justifies replacing a MacBook bought in the past 3 or 4 years with the M3 ones. None at all.

> it boggles the mind how anyone could justify replacing any MacBook pro with a M3 one by claiming "pros don't use shitty tools", as if MacBook Pros packing an Intel core 7/M1/M2 suddenly became shitty laptops just because Apple released a new one

I likely disagree with your position, and believe you have made some bad faith arguments, but you're at least compos mentis.

> But if you spend some time learning about our industry

Whoops.

> you'll realize that not all development workflows are identical, and not all have the same bottlenecks, and for many tasks an Intel-powered Mac is not a bottleneck. Surely you can understand that, or aspire to understand that.

Would you mind restating what you believe my argument to be? Because this reads as a patronising non-sequitur to me, and I'm sure you're not intending for it to land that way.

(If you are pushed for time, I'll do it: nearly everyone spending thousands of dollars to upgrade their computer has what they consider to be a good reason for doing so, whether that reason be boosting their self-esteem by having the latest toy, or a mild performance boost in their day-to-day work. You may not find their interpretation of "a good reason" to be persuasive, but there are likely to be many areas of your personal spending which they would see as imprudent or rooted in tenuous reasons. This thread is full of people incapable of understanding the reasons others have for upgrading and making emphatic sweeping statements. Everyone is different. News at 11.)


Replying to this one since I think we reached max nesting. Regarding as to why somebody might not be in a hurry to upgrade a 2015 Mac to an M2:

https://www.cpu-monkey.com/en/compare_cpu-apple_m2_8_gpu-vs-...

To put it in fully objective terms, a lot of development tasks (for many people) are still dominated by single-core performance.

The M2 has roughly 2x the single-core performance, which is going to be absolutely awesome if you're spending a lot of time waiting for the CPU. But if that's not really a bottleneck, and the things you do are already completing at a speed that doesn't disrupt your flow state or otherwise consume significant amounts of your day, then the gain is marginal.

I'm working (on my 2018 MBP) on some Python software that does science stuff. The single core perf delta between my CPU and the M2 is even smaller for a lot of tasks, more like 50% instead of 100%. And I'm not doing anything that would really benefit from more than 6 cores.
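
A quick Amdahl's-law-style sketch of the same point, with the CPU-bound fraction of the day as the assumption doing all the work (the fractions and speedups below are purely illustrative):

    # How much of an 8-hour day a single-core speedup actually buys back.
    # Both the CPU-bound fractions and the speedups are illustrative assumptions.
    def hours_saved(cpu_bound_fraction, speedup, workday_hours=8.0):
        cpu_hours = workday_hours * cpu_bound_fraction
        return cpu_hours - cpu_hours / speedup
    for frac in (0.05, 0.20, 0.50):
        for speedup in (1.5, 2.0):
            print(f"{frac:>4.0%} CPU-bound, {speedup}x faster -> "
                  f"{hours_saved(frac, speedup) * 60:.0f} min/day saved")

If you're in the 5% bucket the upgrade buys back minutes per day; at 50% CPU-bound it's more than an hour, which is roughly the disagreement running through this subthread.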

I'm currently planning an upgrade, but it's just not a pressing need as $2K-$3K is a significant investment for me at the moment.

    I can't imagine an F1 mechanic not taking an interest 
    in the latest marginally improved wrench
F1 teams have mandated cost caps. I'm not entirely sure if that includes tooling, but even if not, budgets are not infinite and there is a time cost required to research and acquire new tools. Time and money spent getting wrenches are time and money not spent elsewhere. So I would think there is a constant pressure (like in any business) to identify real bottlenecks, not just spend unlimited amounts of money on increased capabilities that may or may not have any bearing on actual performance. Presumably this is why a developer might choose a regular M2 or M3, but not necessarily the maxed-out M3 Max with 192GB of RAM and 8TB SSD for $10,000 or whatever (I know I'm exaggerating). Yes it's more performance, no it won't matter for many workloads.


There is no daylight between us on any of these points.

My position is not that there aren't good reasons to have not upgraded from a 2015 Mac, or that I'm having trouble imagining what they are, but rather that it's a reasonable question to ask of someone in this specific forum.

> F1 teams have mandated cost caps…

We're not really arguing the point here. OP was not trying to pass an exam about the specific details of how F1 teams operate.


> OP was not trying to pass an exam about the specific details of how F1 teams operate.

They were extending the analogy to make a counter point. That's a good faith thing to do.


Very true and sorry if it seemed that I was criticizing them for doing it.

I believe that OP's intention was to suggest that professionals who work in fields with high demands for performance, like professionals who are passionate about their work, are likely to invest in new and improved ways of accomplishing their work, even if it's only a marginal gain. (e.g. the marginal gain of moving from Spotlight to Raycast was worth it for me).

Discussing cost controls in Formula 1 moves us further away (IMO) from that universal truth they were trying to cast a light on.


     "There is absolutely nothing I can do with my M2 
     laptop that I cannot do well with my cheap old Intel laptop"
Well, I took that one in good faith and interpreted it to mean that the old Intel laptop was perfectly adequate for their personal needs.

The alternative interpretation, that they believed there was no objective difference in capability between Intel and Apple Silicon laptops, was so absurd I couldn't imagine anybody expressing it or believing it. I think I made the correct interpretation but it was definitely an extrapolation on my part and definitely fits the HN guideline of "assume best intentions."

To be clear, the Apple Silicon laptops certainly trounce the Intel MBPs and I think most developers will find them well worth the upgrade for most things -- I just didn't like the assertion that anybody still using an Intel Mac was equivalent to somebody riding the Tour de France in a Huffy.


> I took that one in good faith

I tried to, but found it hard given that OP also challenged people to provide "concrete reasons" to upgrade, and said things like "there is absolutely no reason". Everything OP says indicates to me that they actually meant this as evidence for their generalisation.

> The alternative interpretation, that they believed there was no objective difference in capability between Intel and Apple Silicon laptops, was so absurd I couldn't imagine anybody expressing it or believing it.

I agree it's a head scratcher… and yet here it is, before our very eyes, time and time again. I even recapitulated the argument in more reasonable terms ("I think what you meant to say is…"), but they seem resolute in their belief that there are no reasons to upgrade from a "late 2010s" MacBook to a new one.

> I just didn't like the assertion that anybody still using an Intel Mac was equivalent to somebody riding the Tour de France in a Huffy.

Heh, yeah that gave me pause too. I actually think that the example of the F1 mechanic slices the other way entirely: I can't imagine an F1 mechanic not taking an interest in the latest marginally improved wrench, given the narrow margins by which they succeed or fail in competition against other teams, and other mechanics.

You are right that many Intel machines are still highly capable. One could buy an Intel Mac Pro until earlier this year, for example.

But the trigger for Mr/Mrs/Mx "No difference between Intel Macs and the M-series" was another commenter benignly asking someone why they hadn't upgraded ("Any reason for not upgrading?") from a 2015 Intel MacBook Pro.

I said this elsewhere, but it seems like a fair question to ask someone on a computer/programming forum, particularly when the machine in question is close to EOL and has been blown away by a new technology. Don't get me wrong, if this was someone using a 2006 Core Duo in 2012, I'd think it was much of a muchness, but the M-series does change things somewhat.


Sorry you're being downvoted for pointing out specious arguments.


You... don't think that mechanics on a racing team are qualified to know if their current wrenches are sufficient?


I suspect that OP thinks, as I did, that you've constructed an inane straw man.


User latchkey, the one you're agreeing with, is the one who very literally claimed that a developer using an Intel laptop is quite equivalent to an F1 mechanic using "shitty tools" or racing the Tour de France in a 30lb Huffy.


I'm not agreeing with latchkey's statement about developers and "shitty tools". I'm agreeing with them that when you say this…

> if their current wrenches are completely sufficient

… you are not making an honest argument, because it is entirely subjective as to whether their current wrenches are "completely sufficient".

The dog I have in this fight is not upgrade cycles or Intel vs. M1, it's "argue the fucking point without descending into high school rhetoric and logical fallacies".


A wrench has a finite set of objective qualities. Grip, length, strength, weight and maybe some special-case properties like being non-magnetic or spark resistant.

It's surprising to me that you think that cutting-edge racing mechanics don't have objective criteria for these things and that it's all some sort of subjective dark art. But it's a bad analogy to begin with and it's not my analogy.


I totally agree it's a bad analogy, and whether I succeed in defending it or you succeed in knocking it down, it doesn't really help us understand each other in greater fidelity.

The one thing I do admire about the person who offered it is that they are at least trying to persuade by offering different lenses through which to interpret their perspective, instead of repeatedly shouting THERE IS NO GOOD REASON TO UPGRADE LMAO.


> The one thing I do admire about the person who offered it is that they are at least trying to persuade by offering different lenses through which to interpret their perspective, instead of repeatedly shouting THERE IS NO GOOD REASON TO UPGRADE LMAO.

That nails it on the head.


This isn't Huffy bikes vs F1 racers. Unless your workload is heavily CPU bound. And even then it's probably more like a 20yo F1 car vs a new one.

We also live on a finite planet. And the energy savings for many desk jockeys are unlikely to be worth it for a few decades more, if one considers the literal tons of material and energy that go into manufacturing.


> And even then it's probably more like a 20yo F1 car vs a new one.

This thread is literally about the decision to buy an M3-based MacBook Pro to replace M2/M1/Intel MacBook Pros. We're talking about hardware launched in the past 4/3/2/1 years.

That's hardly "20yo" anything.

Also, you failed to provide any concrete, objective reason to buy a M3. None at all. Is it that hard to put together any argument to justify the move?


> That's hardly "20yo" anything.

My car comparison was trying to propose an alternative metaphor since comparing a top-of-the-line racing car to a child's bicycle struck me as absurdly out of proportion. Cars are generally maintained and kept in service longer than computers, so I picked 20y out of thin air.

> Also, you failed to provide any concrete, objective reason to buy a M3. None at all. Is it that hard to put together any argument to justify the move?

My point is that for most people there is no justification to move. Unless one has a device beyond repair, one so old its software cannot be kept up to date, or the very rare need for the latest performance, stick with what you have.


That's totally not my experience at all.


I've been holding off on upgrading some older Intel Mac minis I have while waiting for the memory situation to improve, but so far it hasn't.

Ideally, I'd consolidate these older systems into one new Mac mini, or even a Mac Studio.

I'd like at least 64 GB of memory, at a reasonable price.

The latest Mac mini maxes out at only 32 GB of RAM, if I'm remembering it right.

I think the latest low-end Mac Studio could be upgraded to 64 GB of memory, but the last time I priced it, this upgrade cost more than I'd been expecting. It also put the overall cost above what I'd prefer to pay.

While I'd like to keep using a Mac, it's looking more and more like I'd be better off just building a PC, where I could likely get comparable enough processing performance, but far more memory (and storage) at a lower cost.


> My semi-educated guess is that Apple did a whole bunch of user studies and realized there are a lot of people out there

Apple has telemetry from macOS. So they knew exactly what percentage of users are still on Intel Macs.

And it's low-hanging fruit to go after them then try and convince existing Windows users.


Telemetry doesn't answer the important "why" question.


> better ergonomics

The newer 16 inch MacBook Pros are half a pound heavier than the Intel one.


I'm weird, I sit on the floor on a cushion with my back against a wall. I have a folding table over my lap that the laptop sits on. The keyboard actually works unlike my old Intel ones with the crappy butterfly. I hardly travel these days, but throwing it in a backpack isn't the end of the world.

That said, I was actually thinking of ergonomics in terms of development performance. The thing is so fast that commands complete faster than I can deal with them. My IDE can keep up with me. I can run a ton of apps and it doesn't slow down or glitch. It doesn't get nearly as hot and there is rarely fan noise. The screen is higher quality. The speakers sound better. MagSafe is back! The button for my fingerprint works very well. No more stupid Touch Bar. Function keys!

I could keep going...


The honest reason is that there is no practical reason to upgrade. The computer works, and despite the various FUD you might read, the attack surface for external attacks is quite small for personal computers.

That said, if anyone would like to send me $4000, I will absolutely upgrade to a new 14" Macbook in a heartbeat.


Because hardware vendors don't provide security updates forever, and they refuse to open-source enough of their code that other people can do it for them.


> Why is this sort of question always framed as […]

OP didn't frame it any way at all as far as I can tell, but either way it seems like an entirely reasonable question to ask of someone on a forum which is largely comprised of computer and programming enthusiasts who has not upgraded their daily driver for nearly a decade.

> as if spending money on the new shiny without any reason is normal or desirable?

Every single person who spends their money on "the new shiny" has a reason. You may not find the reason edifying, but that's irrelevant to your stated argument.

> Why would anyone waste their cash to replace something that works without having any compelling reason?

As you were doubtless aware when you specifically constructed a straw man argument predicated on an entirely false premise and laden with your own subjective judgements about "waste" and things that "work" and "compelling" reasons, nobody does this.

I suspect what you really mean is that you believe people upgrade their machines without what _you_ consider to be a good reason. You think people are too quick to upgrade when their machine isn't the very latest, or when it's got a dent, or when it's slowing down a little.

If you'd written what you really believe -- that people should not upgrade as rapidly as they do -- you'd probably have pulled on the thread for a further 0.02s and realised that everyone has different values and priorities, and you likely "waste money" in others' eyes across multiple line items of your annual budget. So it's terrific luck, really, that the internet's various competing interpretations of a "compelling reason" can't stop you from spending your money however you'd like.


> (..) it seems like an entirely reasonable question to ask of someone on a forum which is largely comprised of computer and programming enthusiasts who has not upgraded their daily driver for nearly a decade.

Are professionals expected to mindlessly throw money around at the new shiny without having absolutely no compelling reason to do so?

I think my post was rather straight-forward: people buy things only when they feel there is a clear upside to it. If you made that purchase 2 or 3 or 4 years ago, you need a very good reason to just throw it away and buy a new replacement. You need to at least make a valid case for it, otherwise you are just wasting your hard-earned money for nothing at all.

> Every single person who spends their money on "the new shiny" has a reason.

Why did OP frame the question as if no reason was needed to buy, and as if instead people had to justify not spending their money on the new shiny? Why is being new and shiny such a strong rationale that the onus of justification falls on not buying?

These are simple questions. In fact, all it would take is provide a single compelling reason why it would be a good idea to waste money on a M3 Macbook Pro when you already own a M2/M1 Macbook Pro, or even a late 2010s Macbook Pro. Hell, why on earth would you even waste money on a M3 Macbook Pro if you already have a M2 Macbook Air?

If you cannot answer this question, why would it be anything than absolutely foolish to pretend that people should justify not buying a M3?


> Are professionals expected to mindlessly throw money around at the new shiny without having absolutely no compelling reason to do so?

Once again you're loading an incredibly tawdry straw man argument here with your own inane value judgements. The only difference is that this time you've undermined your argument with a typo: it's otherwise as self-evidently vacuous as your original comment.

Just look at this epistemological nightmare you enumerated with apparent sincerity:

> why on earth would you even waste money on a M3 Macbook Pro if you already have a M2 Macbook Air? If you cannot answer this question, why would it be anything than absolutely foolish to pretend that people should justify not buying a M3?

Putting aside haplography (I guess if your argument is just begging the question a dozen times it gets hard to write coherently), it seems that you're literally incapable of considering that other people have fundamentally different values and priorities to you.

Read this sentence you wrote:

> In fact, all it would take is provide a single compelling reason why it would be a good idea to waste money on a M3 Macbook Pro when you already own a M2/M1 Macbook Pro, or even a late 2010s Macbook Pro

It is axiomatic that there can be no "compelling reason why it would be a good idea [sic]" to "waste" money on an M3 MacBook Pro. It's a waste of money, so there cannot be a good reason. What you presumably intend to write is: "I cannot think of a single compelling reason for a person to upgrade to an M3 MacBook Pro if they already own an M2, M1, or late-2010s MacBook Pro."

And that's it. You can't think of a reason. People in this thread have given you both examples of reasons to upgrade, and clear-eyed explanations of why your inability to suspend your disbelief in this area is not the incisive general argument you think it is.

Much of the work I personally do will be made significantly faster by upgrading from the M1 to the M3 Max, which I will upgrade to. I upgraded to the M1 from an Intel Core i9.

You might think that this is a compelling reason -- wanting one's work to be faster and more efficient. You might not. It doesn't matter. It's a good enough reason for me to upgrade, and that's the rub. Everyone has a reason to upgrade, you just disagree with how compelling those reasons are. And again, the great news for everyone else is that your handwringing serves only to make you seem enormously judgemental and narrow-minded. You remain free to spend your money as you wish.


They mentioned their coworker and most companies have upgrade policies. Just fill in a form every x years and you get a shiny new laptop (and depending on where you work, you’ll get to keep your old one as a gift)


> They mentioned their coworker and most companies have upgrade policies.

Upgrade policies aren't driven by new requirements or performance improvements. Some companies have mandatory hardware replacement policies which mostly serve to allow their tech support staff to standardize on a small number of devices. Getting an M3 MacBook Pro replacement just because your employer doesn't want to maintain an Intel MacBook Pro is hardly indicative that an M3 is worth spending money on, let alone worth replacing an M2 or even M1 MacBook Pro.


Because capitalism, that’s why. Capitalists have convinced people it is a moral imperative to continue to spend constantly.


The mode of production doesn't affect the fact that you have to do production. If nobody's continually demanding laptops from the laptop maker, they will stop making laptops.

https://en.wikipedia.org/wiki/Paradox_of_thrift


If people don't need new laptops because their current ones already do everything they need, then reducing laptop production is good. Fewer resources and less pollution spent on things people don't need.


Not if it meant there are none when they do need a replacement.

Similarly, buying cheap used cars only works because someone else bought them new.


> sir this is a wendy's


I am sure that 16yo girl buying an iPhone is thoughtfully postulating about the juxtaposition of morality and capitalism.

And not because it's shiny, fun and lets her socialise with her friends.


> thoughtfully postulating about the juxtaposition of morality and capitalism

I don't know how you managed to read the comment you responded to as suggesting that.


Not OP, but

The new Magic Keyboard (or Scissor 2.0) just doesn't suit me. 1.0mm of key travel is so much worse than the 1.3/1.5mm of the old scissors on my 2015 MBP.

I actually don't need the seamless, ultra-large trackpad, which gets false positives from time to time. That was never the case with a sanely sized trackpad.

My workload is memory-limited and rarely CPU-limited. Upgrading wouldn't bring a lot of benefits unless I get more memory, and the memory upgrade is expensive.

Did I mention keyboard or trackpad?

I just had a battery swap on this MBP earlier this year; hopefully it will last another 4-5 years, or until I can't update Safari anymore. Although I guess I could still use Firefox.


>it works just fine for everything except video editing

would be my guess


Fiscal prudence?


To me, "personal daily driver" sounds like where you'd do online banking. A MacBook from 2015 can't run any OS newer than Big Sur, which is EOL right about now. And it sounds really imprudent to do online banking from an insecure device.


It should still be able to run an up to date web browser though, right?

If one is that concerned about someone exploiting an OS level security flaw to exfiltrate their online banking credentials (wildly unlikely), they should just be doing that stuff in a VM or similarly isolated environment anyways.


> It should still be able to run an up to date web browser though, right?

For a while, yes, but the browser being up-to-date doesn't make an EOL OS safe to expose to the Internet.

> If one is that concerned about someone exploiting an OS level security flaw to exfiltrate their online banking credentials (wildly unlikely), they should just be doing that stuff in a VM or similarly isolated environment anyways.

Just doing sensitive stuff in a VM isn't good protection at all, since a malicious host can trivially compromise the guest.


> A MacBook from 2015 can't run any OS newer than Big Sur

It can, Ubuntu runs just fine on it


You're right, I should have been more precise. But you still won't get security updates to firmware anymore that way.


If this is your personal threat model, I commend you on an exciting life well-lived that appears to entail sophisticated personal protection of the GPG keys and Bitcoin you need to run your business empire securely.


It can run the latest macOS with the OpenCore Legacy Patcher project.


My 2015 MBP is supported in macOS Monterey.


The "Pro" makes a difference there. The Air and Pro from 2015 both got Monterey, but the regular MacBook from the same year didn't.


Many US bank websites have so few features I'm not even sure what hacking mine could get someone. They can transfer from my checking to my savings account?


I assure you, I take security quite seriously. The version of MacOS I'm using is nowhere near the top security risk.


The problem from another angle: I wouldn't trust anything made in the last decade for my airgap box.


x86-64 docker containers?

https://github.com/docker/roadmap/issues/384 is still open. :(


I don't know exactly why this bug would still be open, but you can use x86-64 images on an ARM64 Mac:

https://docs.docker.com/desktop/release-notes/#4250

I have been using it for a few months (in beta) and it works great!


Ah, just out of beta a few days ago. I'll try it out!


I think this is squarely aimed at people who are holding onto their Intel-based Macs with an iron grip. There are always people out there that don't ever want to move to another architecture. I saw it going from Motorola 68030s to PowerPC. I saw people not wanting to upgrade from PowerPC to Intel. Now we're still seeing the people who don't want to migrate to Apple Silicon. They may have legit reasons and what-not. But time is ticking.

So I think it's mostly aimed at the Intel hold-overs.


Intel ride or die here. My 2019 i9 mbp is trucking along still - and this time of year the heat helps keep the room hospitable.

Was looking towards M3 for a big leap, but apart from heat and power (I use my MBP plugged in 95% of the time) there still isn't that compelling a reason to deal with some of the issues (thunderbolt / multiple displays) for my use case.

At 4 grand (sterling!) for a comparable spec to my Intel MBP, I just can't bring myself to take the plunge.


Nobody has to upgrade. m68030 Macs still worked after the PowerPC transition. I used my Mac mini G4 for years after the Intel transition, and still use both an m68030 Mac and that G4 mini with NetBSD now.

Currently I'm running Sonoma on a 2010 MacBook Pro. I'd love an ARM-based Mac, but can't afford one yet. I'd have to disagree about the idea that "time is ticking"...


True. But I was just offering a possible reason why Apple was hitting that comparison so hard in the presentation.


In case you are not already using orbstack, that might give you even more battery life (not affiliated, just a fan).


A container that is idle and not serving any requests really only has memory as its footprint. There is no reason that having tens or hundreds of processes not occupying CPU and I/O time would affect battery life in a significant manner.


Bro and where do you think the containers and scheduler run? There is a whole linux underneath running all the time


Only one kernel and scheduler for all the containers, that doesn't have a lot to do if most of the processes are idle.

And I am not your bro.


Your whole argument is that containers will not occupy CPU or I/O, which is not true: you're running a full-fledged VM, not just a kernel.

And presumably the person running k8s is probably not running it just to have it idle. They are technically knowledgeable enough to be aware of how many resources they consume, and to be excited by such a comparison.


A Linux VM running only Kubernetes/Docker with most workloads idling doesn't use a lot of resources except memory, probably less than the typical open browser tab full of unoptimized JS.

When you have Kubernetes on your laptop, it's there to test your code alongside a set of other microservices functionally representative of a prod deployment. That doesn't mean your containers will have much load apart from your own occasional testing.


It shows the comparison right on the chart; what else do you need? But you should get the M3 most of the time if you want to use it for the longest period.


Out of curiosity, how much RAM do you have on your MacBook to run containers?


How much RAM do you want to give the containers on your MacBook?

I'm being facetious, but it's an unanswerable question.


However much you set on the slider, so not quite so unanswerable.


I mean the person I replied to asked the unanswerable question. How can we say how much memory they need in their computer 'for containers' without knowing where they want to set those sliders (and how many of them there are), and then it's not really worth asking, or it would be a question about runtime overhead or something.


I feel very disappointed that the laptops lack decent I/O. They spend all this time talking about how advanced the GPU is, but it's a $1,600 laptop that can only power one external display, or a $2,000 laptop that can only power two displays?

Plus they are nerfing the desktops by only offering the all-in-ones with m2, so you can walk into a store and want to buy any other desktop now you are paying a huge premium for outdated tech :(

I was really hoping for FaceID and better display support, but I don't feel there was any compelling reason to upgrade anything. If you preferred Windows before, you probably still prefer Windows. If you have an M processor there is no compelling reason to upgrade. If you were thinking about getting a Studio or a mini, you should probably just wait until they update the processor.


> I feel very disappointed that the laptops lack decent I/O. They spend all this time talking about how advanced the GPU is, but it's a $1,600 laptop that can only power one external display, or a $2,000 laptop that can only power two displays?

yep. powering 2+ screens on a mac is not a straightforward process. for the prices Apple is charging on their "pro" machines, it should be simple to plug in 2 or even more screens to get work done.


Genuinely curious: what is your workflow that requires more than 2 external monitors? Pro video?


Amateur video and full-stack software dev were the times I really appreciated having 3 monitors, running tons of things at once that all interact with each other.

Nowadays with my big corp SWE job, things outside my control make my workflow a lot slower, and I don't benefit from using more than just 1 screen. I work on multiple things at once, but they're totally separate, so I tab between them on my laptop.


One screen for intellij, one screen for tests, one screen for browser or email, for instance


A lot of people use three screens. I don’t, I use one, but people do.


> Plus they are nerfing the desktops by only offering the all-in-ones with m2

I'm confused by this statement. The iMacs leapfrogged M2. The new models shipping next month will have regular M3 chips, replacing M1 models.


sorry, it's a typo and it won't let me edit it now. I meant that the only desktop they upgraded to M3 is the iMac, and all the other desktops are thousands of dollars with the old processor... so the Mac Studio for $2,000, which I'd need to buy to power all my monitors, is now using an old, outdated processor.


You either have a very specific workload that justifies replacing $2000+ devices each year with zero waiting margin...

...or you are vastly exaggerating by calling an 8-month-old device an old, outdated processor and by claiming you can't wait 4 more months for the update.


Can I wait? Sure. But this is the whole problem with single-source computing: we suddenly have to wait for the processor to be delivered in the form factor we would like until a single manufacturer chooses to give it to us, and they could choose not to (the Intel Mac mini was not regularly updated), or we have to spend thousands of dollars to purchase something that might be outdated in four months.

Plus, the entire reason I have to spend so much money is that they didn't bother to put a few dollars into the I/O parts that would allow it to support more video outputs, which even their low-end devices could easily drive, and they don't support third-party drivers well enough that we can use DisplayLink.


DisplayLink works just fine with an M2 MacBook; what was your experience with it?


I wouldn’t consider the M2 outdated. My studio is running the M2 Max and it takes a lot to make it sweat. Would the M3 be better? Sure. In some ways. Is the M2 bad? Not at all. This is the best computer I’ve ever used by a wide margin, and I expect to continue using it for a long time.

I would have hesitated to get an M2 with the M3 announced, but having used one now… I don’t care very much. It’s a bit like updating your iPhone every year. It’s like 5% better and in the scheme of things, it’s not worth worrying about.


The single external display driver can drive more than one display at resolutions of 4k or lower. You can daisy-chain two 4k monitors on a Mac that has only one external display driver when you use the DisplayLink adapter (it's a driver). It somehow internally reuses the same display output for both displays.

The USB-C port can drive one 6k monitor or two 4k monitors.

The HDMI port on the M2 Pro and newer can drive one 8k display.


Any experience with the DisplayLink driver? I always read that it's a performance hog and comes with all kinds of issues. I'm running an M1 Air, and having daisy-chaining work properly would save me from switching to an M1 Pro + Thunderbolt dock. I don't want to give up my single USB-C cable solution.


DisplayLink really doesn't work well on a Mac. Even on Intel there was a period where it flat out stopped working, and currently, when you connect it, it enables screen recording, which doesn't allow you to use certain HDCP functions.

It's like Apple re-enabled DisplayLink, but only begrudgingly, after not allowing it for a few generations of the OS.


Thunderbolt docks too — with a single cable they can drive two 4k displays plus provide power to the laptop.


They really can't on plain (not Pro/Max) M1/M2/M3 chips. No matter what, these machines will only support one external display unless you go the (IMO kludgy) DisplayLink route. It's a hardware limitation and no matter what Thunderbolt dock you have it will not do it.


[flagged]


Would you walk into a church and start telling people God doesn't exist? It's an Apple thread!


I really appreciate this comment. LOL


An interesting thing about the M2 Pro and M3 Pro is that they shifted away from being mostly performance cores. The M1 Pro was either 6+2 P+E or 8+2 P+E. The M2 Pro was either 6+4 or 8+4. The new M3 Pro is either 5+6 or 6+6.

Apple has been shifting away from performance cores with each generation. M1 Pro: 75-80% P-cores; M2 Pro: 60-67% P-cores; M3 Pro: 45-50% P-cores.

This shows up when you look at Geekbench results. A 10-core M2 Pro (6+4) gets 12,100 while a 10-core M1 Pro (8+2) gets 12,202. The 12-core M2 Pro (8+4) gets 14,221. That's roughly a 17.5% increase from having 20% more cores. In some ways, this feels like an odd result. Adding two additional M2 P-cores gets Apple a comparatively small gain over the 10-core M2 Pro (less than the average per-core performance). However, adding two efficiency cores to the 8+2 M1 Pro gives a similar 16.5% boost over that M1 Pro.
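
Recomputing the deltas from those quoted scores (the scores are as reported above; only the ratios are derived here):

    # Multi-core Geekbench scores quoted above.
    m1_pro_10 = 12202   # M1 Pro, 8P+2E
    m2_pro_10 = 12100   # M2 Pro, 6P+4E
    m2_pro_12 = 14221   # M2 Pro, 8P+4E
    print(f"12-core M2 Pro vs 10-core M2 Pro: {m2_pro_12 / m2_pro_10 - 1:+.1%}")  # ~+17.5%
    print(f"12-core M2 Pro vs 10-core M1 Pro: {m2_pro_12 / m1_pro_10 - 1:+.1%}")  # ~+16.5%
    print(f"10-core M2 Pro vs 10-core M1 Pro: {m2_pro_10 / m1_pro_10 - 1:+.1%}")  # ~-0.8%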

If I had to guess, maybe it's thermal throttling when running the benchmark. If the additional P-cores can't truly be P-cores under 100% load, then their impact shows up less. Likewise, if the E-cores can match P-core performance under heavy thermal load, then it could show up as being roughly equivalent in the benchmark.

I wonder if real-world scenarios end up differing from benchmarks in a meaningful way around this. For example, core-pinning can be useful for warm caches and in a real-world scenario you might have a process the OS tries to pin to P-core-1 that has spikes in utilization while another process is pinned to P-core-2 with similar spikes. So both get the performance of a P-core and warm caches while the thermal load isn't that high so the P-cores are still at their peak performance (unlike a benchmark that's trying to use all cores as much as possible at the same time).

Maybe this is a business decision more than something based around how the chip performs. The big selling point of the M1/M2 Max was graphics (maybe the extra RAM as well). You could get the same CPU in the Pro or Max. Now the M3 Pro is a 5+6 or 6+6 CPU while the M3 Max is 10+4. 67-100% more performance cores becomes a selling point for the M3 Max even for those who might not care about graphics as much.


I think your last paragraph sounds about right. The Max is a compromise on battery life for those who want the absolute most possible performance, but I think Apple is seeing a much wider audience get the Pro chip. Anecdotally, whenever I see a tech person mention their Apple silicon device, it's almost always a Pro chip. I'm sure only a small fraction of that user base is hitting all cores with a full load on a regular basis, but a majority are certainly enjoying a killer battery life.


I'll be buying the Max this year, upgrading from an M1 Pro. The Max actually started making sense, because there are more performance cores and memory bandwidth is bigger. I don't much care about GPU: I wish Fusion 360 ran faster, but Autodesk doesn't seem to care much about Mac performance. What I do care about is CPU for Clojure development work, and for that the new Max actually makes sense.

In the M1 generation there was no benefit from going from Pro to the Max.


I chose the M1 Max over the M1 Pro only to get 32GB RAM. I couldn’t care less about the GPU (which makes it annoying to have to pay for it).


I have an M1 Pro with 32GB


It seems like Apple is learning from Intel here.

Apple made their E-cores a lot faster the past couple generations. Despite being a tiny fraction of the size, they get nearly 50% of the performance (yay diminishing returns) while using around 10x less power.

If your task only scales to 1-4 cores, you need those cores to be fast, but generally speaking, if you can scale more than that, you can probably scale to a LOT of cores. From this perspective, 6-8 cores allow a couple of those lightly-threaded applications at the same time which is usually all the more you'll be running.

After that, putting 4-6 E-cores+cache in the space of just one P-core+cache means you're still using half the power, but getting a lot more performance in those very scalable applications.

AMD is headed this same direction with Zen 4c and 5c chiplets where you'll have one performance chiplet and multiple efficiency chiplets.

The area and power wins are just too big to ignore.
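
As a toy illustration, treating the figures above as assumptions (E-core roughly 50% of a P-core's throughput at roughly 10x less power, with 4-6 E-cores fitting in one P-core's area):

    # Toy model of throughput and power for E-core clusters occupying the die
    # area of a single P-core. All numbers are the rough assumptions stated
    # above, not measurements.
    E_PERF, E_POWER = 0.5, 0.1   # relative to one P-core = 1.0 perf, 1.0 power
    for e_cores in (4, 6):
        perf = e_cores * E_PERF
        power = e_cores * E_POWER
        print(f"{e_cores} E-cores in one P-core's area: "
              f"{perf:.1f}x the throughput at {power:.1f}x the power")

With those assumptions you get 2-3x the throughput for roughly half the power in the same area, which is the trade-off being described.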


> maybe it's thermal throttling

The GPU is a very large fraction of the thermal output these SoCs are engineered to handle, so there really shouldn't be any throttling if you're only loading the CPU.


It's the same package though, so if the GPU gets hot then the CPU thermally throttles as well.


Yeah but it's not very often that you will push both CPU and GPU to the max.


Geekbench does though.


Significantly easier to get benefits from scaling GPU cores than CPU cores.


What's with the 36 GB option? The other memory configs (16 GB, 64 GB) are still clean powers of two. Size suggests that they're using ECC-capable memory but using the extra width intended to support ECC for data... but why would this only apply to a single size? Part availability?

EDIT: Digging into this a bit, it's not (one or more) 72-bit wide busses with 2^32 words as I'd expect as a gray-beard, it's (probably) six 32-bit wide busses with 6 GiB per bus; and this use of 1.5 * 2^N deep memories has become relatively common with the use of IC stacking, with 12 power-of-two sized ICs stacked in a single package (instead of the more "comfortable" 8-high or 16-high stacks of the same ICs giving powers of two).


They've been doing more multiples of 6/12 lately; I suppose it's related to the available chip sizes for the LPDDR5X.

M2 and M3 go up to 24GB

M3 Pro is available in 18GB and 36GB

M3 Max is available in 36GB, 48GB, 64GB, 96GB, and 128GB


Why do I need to select the most expensive M3 Max version (the one with 16 cores) to get 48 or 64 gigs of RAM? The M3 Max with 14 cores only allows 36 or 96 gigs. Just let me choose 14 cores and 64 gigs.


In previous gens they had different die sizes with different physical numbers of memory controllers and connections. Guessing it's similar here.

Rumor is Apple bought all of TSMC's initial 3nm production. They are probably yield limited on full featured chips.


Most likely. Let's not kid ourselves, there would be more players sharing those if they felt it was worth it.


This kind of blows my mind.

I'm running the lowest spec m1 with 8 gigs of ram for my daily driver.

I frequently have tens of chrome browser tabs open across multiple profiles, multiple chrome debuggers, at least two or three VS code instances, multiple high memory nodejs processes, video conferencing, screen sharing and building code all at the same time - and the thing runs smoothly without a hiccup. And does it on battery! For a full work day, at least.

Weighs less than a pixie's fart.

I can't even imagine the capabilities the new m3 max has with that kind of memory and power available.

It's going to pinch, but I guess I'll be finding out in a few weeks.


What's your swap used and memory pressure in Activity Monitor? Regarding Chromium browsers, I have noticed that they tend to unload tabs more aggressively these days.


I have the same one and it's nigh unusable. The memory is constantly maxed out and it freezes frequently while swapping. I can't use it. It's in a drawer somewhere.


That's so wild - I wonder what the difference is?

My m1 is so good it seriously makes me wonder if I really need to upgrade.

What kind of workload are you running on yours?


You are in the target market of people they can really squeeze, since you want something a little bit special. Their memory pricing has always been absurd. Even on an older MacBook it was often +$200 for something silly like a 16GB SODIMM that costs $50 at the store. I still think their build quality, screen quality, and OS are pretty much unmatched.


It’s worse. The configurator is labeled with a relative price upgrade but an absolute RAM size. The actual deal is +$200 for +8 GB RAM which is outrageous.


Sure, but they aren’t charging you anywhere close to cost. It’s painful for a company that size to add an option like this. Plus they know they can add a chunk on because if you’re in the market for that option, they know how much you want it.


The same discussion about cost/value comes up every single time.

Often, people will search for a random part that fulfills the same function, sort by cheapest, and then let their indignation run wild, completely ignoring the difference in form factor, other properties, or even quality.

That said, it’s no secret that Apple adds a healthy margin and an “inconvenience” tax.

In Apple’s ideal world, all people would purchase a handful of mass-produced configurations. This saves them in manufacturing costs, assembly costs, and logistical costs.

Apple also spent an ungodly amount on engineering to “make more with less”.

In the long run, this saves them money on lower-capacity components, especially given the quality and ancillary properties of the parts they purchase. This is why, spec for spec, their iPhones and Macs look underpowered compared to competitors while performing the same if not better.

So, from their perspective, it’s “fair” to upcharge the “spec peepers” and professionals who really need it. The latter is generally less price-sensitive.


As you've stated, people will make comparisons between the cheapest bottom-rung slot-in RAM and what's on offer from Apple's UMA. It's naive.

But even when one could just pop in your own RAM, many would still buy Apple's upgrade, which sounds insane on the surface. But there are actually pretty sound reasons for it:

1. They're buying the Mac hardware because it works. That's one of the core motivations for spending the extra to begin with.*

2. Adding ram is the gateway to unexpected crashes, such as errors that only pop up during heavy use, higher temperatures and the like. It doesn't have to be cheap either.

3. But if one is going for those cheaper components, one gets what they pay for: "mislabeled"(online fraud schemes), counterfeit, bottom-rung binned components sold as legit are par for the course.

--

* A lot of brands say their products work, when they simply don't. Here's some of my own examples:

I'm now onto my 3rd brand of mesh wifi routers - why? Because what's out there is garbage, even when you pay a lot for it. It's clear that there is insufficient QA on many top brands, and they simply won't acknowledge that the product they've sold you doesn't perform.

I purchased one of the pricey 5k LG screens that were a total lemon; at this level of expense you don't expect schoolboy errors in hardware. LG handled it poorly, both in their return and exchange options, as well as in customer support and patches.


I 100% agree with your comment down to your anecdotes on mesh routers and TVs.

Specifically, when it comes to routers, it seems that manufacturers only ensure that the basic routing works adequately and that all the bells and whistles exist solely for marketing reasons.

This is so bad that their CS is trained to have you disable said marketed bells and whistles during troubleshooting, only to conclude that everything is fine as long as the basic routing functionality works, and that anything that doesn't work beyond that is a "you" problem.


Wild guess, but if the base chip has 3 memory channels and the higher-binned chip has 4 channels (quick arithmetic check after the list):

3x6GB = 18GB

3x12GB = 36GB

3x32GB = 96GB

4x12GB = 48GB

4x16GB = 64GB

4x32GB = 128GB
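
Multiplying those guesses out against the configurations Apple lists upthread (the channel counts are the speculation; the arithmetic is just a check):

    # Speculative check: guessed channels x GB-per-package vs offered configs.
    offered = {18, 36, 48, 64, 96, 128}   # M3 Pro / M3 Max options listed upthread (GB)
    guesses = [(3, 6), (3, 12), (3, 32), (4, 12), (4, 16), (4, 32)]
    for channels, gb_per_package in guesses:
        total = channels * gb_per_package
        note = "offered" if total in offered else "not offered"
        print(f"{channels} x {gb_per_package} GB = {total:3d} GB ({note})")

Every product lands on one of the offered sizes, so the guess is at least internally consistent.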


But the bandwidth goes 100 GB/s, 150 GB/s, 300 GB/s, 400 GB/s. Given that neither the CPU nor the GPU alone can saturate the bandwidth (assuming M3 is the same as M1 and M2), perhaps the memory controller or the chip layout is to blame.


Right, the two M3 Max SKUs being discussed above are the ones with 300GB/s and 400GB/s memory bandwidth.

That sounds like 3 memory controllers vs 4 memory controllers, likely due to chip binning. TSMC's 3nm process supposedly still has relatively low yields.
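
The bandwidth figures also line up with simple bus-width arithmetic, if one assumes the M3 family keeps the same LPDDR5-6400 as the M2 family (the data rate is my assumption; the bus widths below are inferred from it, not published by Apple):

    # bandwidth (GB/s) ~= transfer rate (MT/s) * bus width (bits) / 8 / 1000
    DATA_RATE_MT_S = 6400   # assumed LPDDR5-6400, as on M2
    for name, bus_width_bits in [("M3", 128), ("M3 Pro", 192),
                                 ("binned M3 Max", 384), ("full M3 Max", 512)]:
        gbs = DATA_RATE_MT_S * bus_width_bits / 8 / 1000
        print(f"{name:>14}: {bus_width_bits}-bit bus -> {gbs:.1f} GB/s")
    # ~102.4 / 153.6 / 307.2 / 409.6, i.e. the marketed 100 / 150 / 300 / 400 GB/s.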


Just get the 96GB option? It is only around $100 more for 1.5x the memory, and 1.5x the memory is worth it when you are interested in high-memory applications anyway.

What is your use case, LLMs? Other AI models? Multiple large VMs? It's always worth it to max the memory; in this case, going from 64 to 96.


They don't just differ by cores, they have different memory bandwidth - presumably because they have different width memory busses into the SoC.

Their designs, being primarily mobile-focused, are rather hard-coded to specific memory configurations.


Because you'll pay if you're in the market for that much memory.


Maybe it's either triple-channel or bank switching? The Mac Pro 5,1 was triple-channel too.


It also appears they've dropped the memory bandwidth from 200GB/s on the M1/M2 Pro to 150MB/s on the M3 Pro, and you have to upgrade to the tippy top M3 Max chip to get the full 400MB/s bandwidth experienced on the M1/M2 Max chips.


(Just pointing out that all these numbers are in GB/s)


I remember getting one of those shiny aluminum MacBooks in 2009. Then MacBooks turned plastic and the aluminum ones became MacBook Pros. Is this a thing Apple does? I haven't been an Apple customer for many years now.


Apple hasn’t had any plastic devices for some time now.


I'm referring to the 2010 polycarbonate one here: https://everymac.com/systems/apple/macbook/specs/macbook-cor...

Not sure why my initial comment is -4 votes right now. What am I remembering wrong?


What you described as a transition to plastic for the MacBook was actually the final iteration of a fairly long line of plastic MacBooks, most of which were a pretty similar design to the pre-Intel iBook.

"MacBook" (not Pro) was plastic from the start, and introduced the same year as the MacBook Pro (always aluminum): 2006. The plastic MacBook was discontinued after the 2010 model. There was a one-off aluminum 13" MacBook in 2008, but it coexisted with the plastic MacBook and was not replaced by a plastic model—instead, the 13" model was promoted up to the MacBook Pro product line, which previously had been just 15" and 17".

After several years of not having a plain MacBook, Apple did the 12" aluminum MacBook from 2015 to 2017. It was basically what they wished the MacBook Air could be, but was too expensive and too thermally constrained to truly replace the MacBook Air of the time. It was killed when they finally upgraded the MacBook Air to have a Retina display.


That solves the mystery of the one-off aluminum "regular" MacBook that I bought. Thanks for this, excellent.


Nothing, it's a perfectly fine question, just maybe off topic.


The plastic MacBooks were only made 2006–2010.


I bought an aluminum one in January 2009 if my memory serves me right. It was a nice machine; wasn't it the first aluminum MacBook?

I don't think it was the Pro version (if it even existed). I certainly didn't have money to buy anything but the cheapest model.


It has nothing to do with ECC. They are using LPDDR5X memory, and if you follow Android phones these numbers shouldn't be surprising.


I prefer that tbh. If I'm frequently saturating 16GB of memory in the previous gen and it is time for an upgrade, I probably don't need literally double the memory, but 24GB would be nice.


There's also an 18GB M3 Pro config of the MacBook Pro. Very unusual.


One and a half 12-chip?

Or a 12 plus a binned 8?


64 GB chips that failed testing? That's one way to increase yield.

Clive Sinclair did that trick back in the 80s.


Doesn't look like it. Memory ICs are tested before packaging, and then are packaged in stacks. While I'm sure there's some loss in packaging, probably not enough to justify this level of binning; and most of the failures that would happen at this stage would potentially interfere with signal integrity for the whole stack.

Instead, it looks like after the 8-high stacks that I knew about, manufacturers went to 12-high stacks. There are 16-high stacks available too, but looks like there's a lot more backside thinning needed to keep the same Z height so cost goes up disproportionately and 12-highs are a reasonably sweet spot for capacity per dollar.


There are non power of two sized DRAM dies in production these days; it's not purely a matter of stacking (and stacking doesn't make it any easier to have something in between one die per channel and two dies per channel). You can get 24GB single-rank and 48GB dual-rank DDR5 UDIMMs compared to a year ago when the options were 16GB and 32GB; no stacking at all, just going from 16Gbit to 24Gbit per die. But LPDDR has been doing this for longer due to demand in the smartphone market.


Intel and AMD do that now with processors.


Apple did binning back with the original M1 (the 7-core or 8-core GPU models).


Uncle Clive.


It looks like these models won't be as useful for LLM inference, which is heavily memory bandwidth constrained. The MacBook Pro page shows M3 at 100GB/s, 150GB/s, and 300GB/s vs M2 at 200GB/s and 400GB/s. 400GB/s is available for M3 if you opt for the high-GPU config, but it's interesting to see it go down across all of these models.


The presentation mentioned dynamic GPU caching: that seems like something transformer models would like.


Could be, but I'd like to hear more information about what it actually entails.

My gut feeling is that it's kind of like Z compression, but using the large amount of privileged software (basically a whole RTOS) they run on the GPU to dynamically allocate pages so that "VRAM" allocations don't require giant arenas.

If that's the case, I'm not sure that ML will benefit. Most ML models are pretty good about actually touching everything they allocate, in which case lazy allocations won't help you much and may actually hurt startup latency.


In addition to what mono said, llama.cpp allocates everything up front with "--mlock"

Llama.cpp (and MLC) have to read all the model weights from RAM for every token. Batching aside, there's no way around that.


Mlock is an optional parameter: github.com/ggerganov/llama.cpp/tree/master/examples/main#mlock
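
If you'd rather poke at it from Python, here's a minimal sketch using the llama-cpp-python bindings (assuming they're installed; the GGUF path is just a placeholder), where use_mlock maps to the same behavior as the CLI's --mlock:

    # Minimal llama-cpp-python sketch; the model path is a placeholder.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/llama-2-7b.Q4_K_M.gguf",  # placeholder GGUF file
        n_gpu_layers=-1,   # offload all layers to Metal on Apple Silicon
        use_mlock=True,    # pin the weights in RAM, like --mlock on the CLI
    )
    out = llm("Q: Why does memory bandwidth matter for inference? A:", max_tokens=64)
    print(out["choices"][0]["text"])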


I don't understand why they don't update the entire lineup whenever a new chip comes out, especially since they're really only swapping the chip in the MacBook Pro and iMac here.

On a separate note, does anyone have any insight into how unified memory compares to VRAM in the context of machine learning performance? Considering an H100 with 80GB costs like $30k, a maxed out MacBook Pro with 128GB unified memory for $5k is interesting. Is it remotely comparable or compelling for large models, considering realistically most prosumers are capped at the 24GB Nvidia cards for anything resembling a reasonable budget?


I expect it's a combination:

1. Engineering bandwidth (it's work to upgrade a lineup! They probably need at least a year)

2. Manufacturing bandwidth - they probably spent the first half of the year manufacturing the 3nm iPhone processor while researching the MBP processor. I expect it's difficult to ramp manufacturing on many chips simultaneously

3. Consider the demand side too. If my parents are going to buy a new laptop and a new phone this year, I think they would be more likely to do that 6 months apart. Similarly, I expect keeping a cadence of announcing products a, b, c in this quarter and d, e, f in that quarter helps keep Apple in the news (for good/exciting things)

Wrt machine learning: I can't wait for the results to come out for this once we can get our hands on it


Also cost.

Apple introduced the 15 inch MacBook Air on June 6th.

If they upgraded it to the M3 right now that machine would have only lasted 4 months and 4 days.

Sure they could do it, but how many sales are they really losing because they haven’t? People buying the Air are not exactly looking for the absolute best in technology. They’re probably much more likely to be price sensitive, and putting a brand new M3 in would likely make that worse.

So if you consider that you have to make new boxes and manuals and new hardware and test it and change the lines and everything else… what are the chances you would come out positive after such a short amount of time?

It would make some more sense for the 13 inch air which is about 18 months old, but again if you don’t have to maybe you keep selling the M2 for another six months.


> 2. Manufacturing bandwidth - they probably spent the first half of the year manufacturing the 3nm iphone processor while researching the mbp processor. I expect it's difficult to ramp manufacturing on many chips simultaneously

Sort of. It's that the whole concept of a "ramp" implies that manufacturing things (not just chips!) starts out expensive and risky and gets cheaper and more reliable over time as people work out the kinks in the various processes.

The M2 is mature. TSMC can make them reliably and in high quantity at low marginal price. Why would you *not* sell a product like that if there's a market for it?


Speculating, but I'd wager it's to clear out existing inventory by phasing out the old chips on other SKUs over time. E.g., you start with the expensive Macbook Pro models when you have fewer M3 chips in production, since those will sell slowly, and then you clear out your remaining M2 inventory in the Air lines while waiting for a "refresh" when M3 production is more ramped up.

I have no idea what Apple's sales are for the MacBooks or whether Airs sell more than Pros, etc., but they're clearly making a profit this way to be this consistent in their approach.


Besides wanting to clear out existing M1/M2 capacity, I don't think there's enough M3 stock right now.

I forget how many fabs TSMC has that have 3-nanometer capability, but it's gotta be low. Apple's supposedly got _all_ of TSMC's capacity[0], but I doubt it's enough.

[0]: Apple is saving “billions” on chips thanks to unique deal with TSMC | https://news.ycombinator.com/item?id=37040722


They just updated their MacBook Air and Mac Studios and Mac Pros like 6 months ago? If they updated right now, the previous owners would feel utterly screwed over.


Not even. If you ordered a non-baseline M2 Ultra Studio at launch, you likely didn't receive it until the first week of August. So we're a week shy of three months. So I'm glad for not being screwed, I think. Although who is to say the M3 Max chips aren't just being sat on...


This is a perspective that comes from stagnation. Tech is progressing again, finally. This is normal for tech, and a good thing.


Right but why did they just update them then? Why not do it earlier or wait for the M3?


Yup, I bought an M2 MBP about 3 weeks ago.


Feels like it's moved super fast. I just bought an M1 Studio earlier this year.


Almost four years now, for M1. I get the impression that people are used to Intel’s stagnation that we were stuck with for some time.


Only 3, unless you're counting the developer boxes? M1 machines were released in November 2020. However, M1 Pro and M1 Max were released only 2 years ago. Things are definitely picking up now...


And more importantly, these improvements are mostly thanks to TSMC's progress. You could already predict the M3 in late 2022 when TSMC announced that they were starting mass production of their 3nm process. We knew that it would be a huge improvement based on the specs we had back then and roughly how long it takes to ramp up.

https://pr.tsmc.com/english/news/2986


The M2 Ultra tops out at 800GB/s unified memory bandwidth. A 4090, on the other hand, has 1,008 GB/s. On the PC side, dual-channel DDR5-6400 offers a bandwidth of 102 GB/s.


They're not really directly comparable due to the architectural differences, right? For pure rendering workflows I'm sure the 4090 is faster, but for anything that hits the CPU ...


Yes, the tile-based architecture of their "GPU" is undeniably less performant, even compared to lesser GPUs.


Nvidia has been tile-based since Maxwell.


Yes, I should've been more specific. Mobile GPUs, like those often employed in ARM-based designs, use a technique called "Tile-Based Deferred Rendering" (TBDR). This involves not just splitting the frame into tiles but also delaying the actual shading operations. This two-step approach minimizes overdraw but can create bottlenecks in scenes with intricate geometry or advanced lighting, shadows, and reflections.

Nvidia's approach doesn't have this.


How is the deferred rendering in TBDR different from the deferred shading and deferred lighting that have almost completely supplanted forward rendering in commercial games and engines?


Why exactly is tile-based a bad idea? Most rendering engines are tile-aware already.


See my response to the sibling post


Bandwidth means much less if those numbers aren't crunched faster.


I guess the question is: in terms of performance per watt, how valuable is the "unified" aspect?


Depends on if you're talking about training performance or inference performance. It's probably a decent deal for inference on such large models, but I doubt it's anywhere near competitive for training. In the 'consumer' space there's also the A6000 with 48GB for ~$4k.


It's probably due to a combination of the processor manufacturing rate and the quarterly sales boost they get throughout the year.


It makes a world of sense if you think of it from a logistics point of view, especially considering Apple sells insane volumes of everything they sell.

It costs less to manufacture the current-generation chip and the last-generation chip concurrently, and it allows them to push the newest-generation chip out the door faster without running out of supply or having to open up excessive production lines.

>H100 vs apple silicon

While it's only one spec, the H100 has over 2TB/s of memory bandwidth and Apple silicon caps out at 400GB/s. The H100 in general beats the pants off Apple silicon in performance and has far more support. These parts aren't really in the same league, but Apple Silicon has become a popular hobbyist choice for LLM inferencing.

Also see: the Jetson AGX Orin 64GB, Nvidia RTX A6000 48GB, AMD Radeon Pro W7900 48GB, or two NVLinked 24GB 3090s.


Regarding updating the lineup, I would guess that it is completely intentional to keep the carrot just out of reach across the lineup. Each new chip release rotates the product categories in just the right way to keep the consumers consuming.


Right? I've just set up my office and for now I'm using my MBP hooked up to a couple of Studio Displays - which I adore for software development. I was planning on picking up a Mac Mini or Mac Studio so that I could still have the MBP spare.

Maybe not. It’s a bit of a pain having to unplug and replug everything in, not to mention having windows and such shuffle around each time.


My gut tells me the Macbook Air sales have been cannibalizing Macbook pro sales.

For the M1 and M2 generations, the airs released in the fall and the pros a few months later in the spring. The airs have gotten so good that many people that would have gotten the more expensive (and more lucrative) pros got airs instead.


> Additionally, support for up to 128GB of memory unlocks workflows previously not possible on a laptop, such as AI developers working with even larger transformer models with billions of parameters.

The presentation explicitly highlighted this very randomly, without showing how AI development actually works.

I know Apple has put more effort into Apple Silicon support for PyTorch but is it there yet?


I don't think it is? But I would love to be wrong. Cheap mobile Nvidia cards outperform regular M2 chips on small learning tasks.

M3 Max might be a contender, but Metal is far from CUDA yet? A laptop M3 Max looks like it can compete with a Nvidia 3070.

It's an unfair comparison as Nvidia is a huge desktop heat/energy hog and Apple M chips are really efficient. Let's see how desktop M3 chips fare.


Consumer nVidia cards are constrained by their RAM which is inconvenient considering the sizes of popular LLMs. An M2 Ultra basically gives you 100+ GB VRAM which is nice.


Although PyTorch is in Python, some ML features in SoTA libraries are coded against CUDA and complain if you don't have a CUDA device.

A lot of things work out of the box with PyTorch on MPS, but not everything, and it can be frustrating to figure out what works and what doesn't while you have a bunch of other moving parts shifting under your feet due to how fast the whole thing is moving.
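
As a minimal sketch of what the MPS-or-CPU dance looks like in PyTorch (the layer and shapes are just placeholders):

    import torch

    # Use Apple's Metal Performance Shaders backend when available, else fall back to CPU
    device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

    model = torch.nn.Linear(128, 64).to(device)   # placeholder model
    x = torch.randn(32, 128, device=device)
    y = model(x)                                  # runs on the Apple GPU when MPS is available
    print(y.device)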

That said, llama.cpp (aka "ggml") on M1/M2 Macs is very robust. If you just want to run inference (as opposed to training) of models and you're comfortable with a Mac, there's really no reason to go out and buy an nVidia card.

FWIW, some numbers I found on reddit: https://www.reddit.com/r/LocalLLaMA/comments/16o4ka8/running...


I'm pretty sure pytorch supports it OOTB. In my experience it works even better than ROCM


Are the cool kids using GGML?


no, GGUF


Also that.


FWIW, the main developer of GGML apparently runs an M2 Ultra himself. Apple Silicon is a first-class citizen on GGML and runs fast.

See eg. https://www.reddit.com/r/LocalLLaMA/comments/16o4ka8/running...


It's quite impressive that the event was filmed on iPhone 15 Pro[1], although it involved professional lighting and various equipment, which is not typical for the average user.

With the ability to capture in Log, it's possible that the next iPhone release might be filmed using the very phone that's being unveiled.

[1] Source: https://www.youtube.com/live/ctkW3V0Mh-k?t=30m02s


Really love the pace of Apple's innovation. Nowadays I'm biased towards battery and weight (MacBook Air) for portability, and the other line of notebooks gets converted into desktops via TB, precisely because of weight and battery life. I feel the difference in battery use from one to the other. The Air seems like it has "infinite battery", even though the others use Mx processors too.

Also waiting for a great Linux port battery/weight wise.


i’ve had the 16gb ram macbook air for 3 years now and it’s still an incredible machine considering lightness, battery, and passive cooling.


Ah! Thanks! I forgot to mention passive cooling. I am old enough to have felt the heat of a new notebook. This is when "mechanical physics" tends to zero.

I would add that my MacBook Air received a smoothie of strawberry, mango, and orange over its keyboard and also fell a few times from more than a meter.

BTW, I don't consider myself an Apple fan but really appreciate different form factors and real innovations. Hope others follow. I think using several operating systems opens your mind.


I was seriously considering an m3 pro for the space black color until I confirmed it wasn't passively cooled. An extra monitor would be nice at work but the air is just too great as a laptop.


Interesting that the Myst remake is being used specifically as an example, with how the original was created on Macs using HyperCard.


Yeah, I thought that was a little poetic :)

Especially since (and this just occurred to me), in a roundabout way the original may have been the first Mac game to ever use raytracing (in its pre-rendering)


> Myst

I was just coming here to ask about the Myst reference, since I played that 25 (?) years ago and wondered why they mentioned it. So there's a remake?


That one, and Sketchup really confused me.


Where’s the SketchUp reference?


14 minute mark


I think the M series laptops were the first full manifestation of the laptop idea.

Instant sleep/resume, great input devices and monitor, cpu power at will without overheating or loud fan, super fast shock resistant storage, 20+ hours of battery life, no battery drain when off.

It took us 30 years but we finally have it.

Now let’s hope WIntel can also catch up soon.


It's kind of surprising that Intel (or I guess x86 in general) still has basic issues with sleep, laptops waking up in your bag and that kind of thing. The forced modern standby thing they're doing has only made it worse.


My Intel NUC was unresponsive and overheating when I came home from work today. It's the second one I have. The first was exchanged because it would overheat and become unresponsive. It doesn't have anything running while asleep except background services and Firefox.

There are several support notes about sleep problems and overheating, so I know I'm not alone.

No combination of BIOS or OS settings should make it possible to overheat while asleep---especially not the defaults! (Notably, overheating occurs equally in Windows or any of the 5 Linux distros I've installed in the past two years.) It's particularly disappointing when the computer is made by the processor company. There's no excuse---no missing documentation or inability to test side effects. Expecting peerless hardware support was THE reason I went with Intel instead of any other small-form-factor PC.

Honestly, at this point, if I want Windows, I'll probably run Parallels on a Mac. I hate Mac. That's where I stand with Intel and Microsoft.


Yeah, overheating at all is just poor hardware design: not enough cooling.


It’s kind of strange their benchmark for “Machine Learning Programmers” is “Simulation of Dynamical Systems in MATLAB.” Seems like they could have capitalized better by using a generative ai inference benchmark.


iPhone 15 Pro NPU can do 35 TOPS. M3 can only do 18 TOPS. That's why they didn't focus on ML in the video.


It's a big question mark with 35 TOPS, because it's extremely limited and hard to achieve.


Maybe they know M3 is not up for that kind of task.


Possibly this, but also consider Apple is playing major catch up with generative AI. I could see avoiding mention just to keep it a bit further from the mind of reviewers.


Could be, but people are already using M2 Ultra for LLM stuff and like it. Seems like Apple would double down on that.


I agree the machines have been good to run various ML models that aren’t running on other PCs in their category. (Though, shrinking the resource requirements seems like a never ending exercise.)

Apple likes to wait until it can announce something with a moat around it. In this case, more compute isn't enough, I don't think.


These are still going to be used in laptops with soldered flash chips for the SSD. Apple's laptops aren't meant to be used by people who expect to repair their laptop when the storage fails. Many people don't have access to replacement chips and services to have the flash chips found on the board replaced. It's as if their hardware is meant to be e-waste.

The RAM is still not ECC. They focus so much on the shallow marketing without shipping anything which makes a difference.


My Windows desktop has generated significantly more e-waste per year than the Macs I have owned. Full stop. I've upgraded SSDs from 500gb to 1TB to now 2TB, upgraded graphics cards, and replaced a massive broken aluminum and copper heatsink. All in the last two years.

I've had the same MacBook, in a similar timeframe. It gets just as much use. Its still rock-solid.

Your concern is a hypothetical one. The inability to upgrade has, in a VERY REAL sense, meant for me: I overbuy specs upfront, so I don't have to upgrade (and thus, generate e-waste). It also experientially means that the machines are more reliable. The least wasteful machine is one that doesn't have to be upgraded; not one that can be and needs to be. The 500gb of storage I have in my MacBook sometimes feels limiting; but the cost of upgrading (a whole new machine) has stopped me from actually going out and doing it; and thus less e-waste is generated.

Eventually I will brainstorm what to do with this machine once it's outlived its useful life as a laptop. I'm thinking, server rack. We've got Asahi, and it's got Thunderbolt, so wiring up super fast storage drives is a cinch.


Sounds like your own problem. Apparently you like to replace and tweak things. Most people buy a good computer and are done with it, and if they ever upgrade something, the frequency is much lower than yours. Don't blame your problem on others.


But, that's my point? If most people just buy a good computer and are done with it, which I agree to be a correct statement, then why does upgradeability matter from the perspective of global e-waste generation? Outsizing the impact of upgradeability, by literal orders of magnitude, is Reliability and Longevity; buying less in the first place.

The problem that I have is: memory modules and SSDs aren't exactly reliable, and need to be replaced. One of those SSD upgrades was a Want; the other was a Need. Hypothetically, Apple should be subject to the same problem; and they are, to some degree, but what we're talking about are degrees. If their machines need to be repaired X% less often, but those repairs do require the replacement of an entire logic board; there's a value of X where they are generating less e-waste per user than modular solutions.

Sure, that's a lot of `if`s; but everything I've seen suggests they're on the right side of that equation. Especially considering their repair loop is closed, and they're obviously very proud of their considerable recycling capabilities.


That's not hypothetical at all. You might want to look into issues people have encountered with the apple SSDs and how much swapping is involved on the lower specced units.

Regardless, I choose my non-Apple hardware how you choose your Apple hardware. I buy something which won't have to be upgraded or replaced for a long time.

Being able to repair hardware matters. It's fine if it doesn't matter to you and to others.


Why are your old parts getting wasted? You can sell or repurpose them.


Sure; and you can also sell and repurpose your old Macs. That's independent of the problem everyone seems to yell about concerning Apple and e-waste; the ground reality that I've observed, and I think most people would agree with, is that I see a helluva lot more old iPhones and old Macs in peoples' day-to-day computing rotation than I do old Androids and old Windows PCs. Apple spent half the M3 announcement talking about performance in comparison to Intel Macs, because they know the statistics; there's tons of those things still kicking.


> It's as if their hardware is meant to be e-waste.

But the chassis is made from 100% recycled aluminum!~

On a serious note, Apple has a simple and working strategy, to which they're 100% committed: repair only in authorized service points, otherwise recycle. Making devices user-repairable adds costs and compromises on specs; you can't put a number on the disks being replaceable as easily as you can put a number on size, IO bandwidth, or number of write cycles, and numbers is what looks good on benchmarks.

You also have to consider Apple's scale: they have literally billions of deployed devices (everything from AirPods to Mac Pros) that they need to support, so what could work for a different vendor won't necessarily work for them.

(I don't necessarily agree with that strategy from end-user POV, merely pointing out the context.)


I've looked at their hardware. I would've bought one if it weren't for the soldered-down flash chips. I've seen plenty of bad SSDs.


The point is, they don't want people to fix or upgrade their hardware, and they will fight to defend this strategy. Give me one reason why they should do otherwise, from their perspective. If they have things glued in, the equipment has a shorter life and customers need to purchase a new one sooner. And since customers can't replace components themselves, they have to pay exorbitant prices for 16 GB RAM or 1 TB storage that is cheaper than ever.

These are nice machines, though, and I buy one for building iOS apps every couple of years - but usually 1-2 generations later so that the pricing is more reasonable.


This puts the dot on the i quite nicely. It's all about the sales.


The average error rate in non-ECC consumer DRAM is on the order of one in 100 million, which is not insignificant.


Strange that they mostly compare performance to M1, not M2. Probably means it's not as much faster than M2 as they would like.


It's in the linked release: 15% faster than M2 for P-cores and 30% for E-cores, the latter of which is misleading, because it has 2 more of them, so they're not necessarily that much faster.

For me, still on M1, it's good news: one more generation I can skip without feeling too much FOMO. Graphics improvements are irrelevant since I'm not gaming on them, and the CPU difference is fine to skip at just <40% overall.


Exactly. I imagine most of the benefit of the new 3nm process is related to power draw. I'll bet the memory is the same stuff as the prior generation though, given those specs haven't seemed to change at all.


Don't forget that there are no M2 iMacs


They compared it to both, and most people upgrading will have an M1 so it's the more apt comparison (and of course, a bigger number).


What's the deal with "dynamic caching"? I wish they would have talked about it more. It sounds like they're reducing memory allocations by only allocating what is actually needed.


You got it. Instead of letting software developers allocate a static cache with overhead that eats a little into the unified memory, the Mac gets to decide how to dynamically allocate and release cache for graphics. They said it would be "transparent to developers," but not really sure what that means exactly.


I take "transparent to developers" as something developers don't have to worry about. It's done automatically.


It seemed like Metal was probably measuring it under the covers and adjusting things as opposed to asking the programmer to figure it out.


They said it was done in hardware, so not Metal alone.


From what I understand this is about assigning GPU resources (such as register files and other on-core memory) to shaders. Imagine you have a complex shader that has two paths, a frequently taken fast one that needs X bytes of on-chip memory to function and a rarely taken slow path that needs Y bytes (Y > X). Usually this stuff is statically partitioned, so you have to provision Y bytes for the slow path to run this shader. With dynamic partitioning, the shader will only allocate Y bytes if the slow path is hit, which frees the resources to load up more concurrent shader programs and improve the shader occupancy.

This stuff is only really relevant if you are dealing with complex uber-shaders and recursive function invocations, both of which are fairly common in raytracing pipelines.
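
As a toy illustration of the occupancy argument (all numbers are made up, purely to show the ratio):

    ON_CHIP_BUDGET = 64 * 1024   # hypothetical per-core on-chip memory, 64 KB
    X = 2 * 1024                 # fast path needs 2 KB per shader invocation
    Y = 8 * 1024                 # rare slow path needs 8 KB

    static_occupancy  = ON_CHIP_BUDGET // Y   # must reserve the worst case: 8 concurrent invocations
    dynamic_occupancy = ON_CHIP_BUDGET // X   # allocate on demand: up to 32 when the slow path isn't hit
    print(static_occupancy, dynamic_occupancy)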


in tinygrad, the llama2 model with Metal runtime produces 1k kernels. this means we have to compile them all, leading to both startup and runtime costs from repeated compilations and buffer bindings. someone suggested using one megakernel to call the rest. could dynamic partitioning help here?


Short answer: yes.

Long answer:

If the hypothesis of the parent comment is correct, then the following should be true: joining shader logic paths or parts with very different resource usage, notably register usage, should no longer cause occupancy issues, which on traditional architectures could sometimes be resolved by artificially splitting the logic into multiple different kernels with the sole purpose of making it possible for low-resource-usage portions of the logic to run with higher occupancy.

If that's the case, it is of course beneficial, not only because such splitting burns development hours (and can be potentially error-prone), but also because such splitting introduces overhead of its own, for reasons ranging from having to repeat some of the calculations in more than one separate kernel (instead of reusing the results), through having to store and subsequently reload intermediate values to communicate between parts, to having to pay the overhead of launching and synchronizing additional kernels.

It stands to reason, however, this is not going to help in those cases when it's not the occupancy that's the problem, but rather sparse SIMD/wavefront utilization of ALU resources is: in cases when the control flow is sub-wavefront divergent, but splitting the code into multiple kernel launches allows for compactification of SIMDs/wavefronts.

Furthermore, the joined shader might still fall a little bit short of expectations anyway because joining code together not always results in the compiler successfully identifying and eliminating redundant or repeated calculations, and/or allocating resources better due to having access to the entire program at once. It's usually the case, but occasionally the opposite happens - sometimes joined code results in compiler's optimizations taking a different path and producing code that is actually worse than it would have been if different parts of the code were compiled separately. The risk of that increases if the frequency with which the shader execution takes specific paths in the program is unusual (statistically not typical where statistics are taken over the whole world of programs across space and time :-) ) and the compiler ends up mispredicting which paths are low- and high-probability.


They kept mentioning performance relative to intel Macs which makes me think there is a large cohort of people sticking to their x86 rigs due to compatibility. Being able to run an x86 linux or windows VM is still a requirement for me.


It's only been 3 years since the M1, most people aren't buying new computers that frequently.

Windows for ARM runs well on Apple Silicon, and has its own translation layer for x86 software. It should be fine unless you need specific x86-only drivers.


I got a new Mac at work maybe 6 to 9 months after the M1 generation came out.

I would’ve loved one, but some of the tools I needed did not work without real Intel hardware yet and there were no workarounds.

Today there are. But because of that I had to get a new Intel machine. And that’s going to be my machine until it reaches the standard replacement cycle. So I’ll still have it for a few more years probably.

By the same token a family member bought a new (to them, refurb) iMac maybe six months before the M1 iMac came out. Again I think that would’ve been a better computer, but it didn’t exist. And the old computer was on its last legs and needed replacing.

That computer does not get heavy use and will last a long time. It will probably get used until Apple stops updating Intel OSes and it starts becoming a real problem for the user.


I'm still using mine. Parallels can only MDM-enroll the macOS VM on the Intel device. Apple Silicon doesn't support the feature of changing the serial number. So for testing MDM profiles I have to use it or carry two devices around.


I'm also still running an i9. I can't justify buying a new one since my 2019 model is still running strong!


I can't relate.

I would probably have upgraded my i9 as soon as the M1 Pros were announced if it hadn't been my employer's property.

I had constant thermal throttling from the instant I booted the damn thing. Worst laptop I ever had.


Mine runs warm a lot. I was looking through M1s on eBay today, I might pull the trigger. I just have a hard time spending 4 grand on a new laptop when the one I have does everything I need it to.

I had an M1 at work for a while, it was amazing and I loved it.


2019 i7 here and I’m still very happy with it. I can’t justify the purchase.


Who knows but my guess is that the x86 requirement is probably for a minority of users.

Lots of people are probably sticking with Intel Macs simply because the average user doesn't care about performance and will keep using a computer until it dies.


https://mac.getutm.app runs just fine with x86 linux vms. I'm running a few ubuntu ones as we speak. It's painless and quite reasonably speedy.


And still starting at 8GB of RAM… Disappointed.


and yet at this point pretty much expected. So obnoxious and odious - the reality is that for any C/C++/Rust or similar compilation you need a minimum of 1GB per compilation core for compilation alone, before you consider the rest of the OS; on an 8-core machine that's already 8GB just for the compiler :-/


You're saying people shouldn't be able to buy 8GB RAM Macs, which is crazy. I had to check how much memory my laptop has, and it's 8GB and I've never noticed.

I guess you're just complaining about what you get for the lowest price.


The lowest price is still a very high price that justifies the low end being 16. This is not a baseless complaint.

And no, it's almost 2024; people should not be able to buy 8GB RAM computers when the RAM is soldered in and the machine costs $1k+.


For the price and a supposedly "Pro" model, having 8GB of non-replaceable RAM is ludicrous.


Seriously... My 16GB MacBook Pro M1 work machine runs out of memory constantly. I had an 8GB MacBook Air and it was useless; it would freeze constantly and have crackly audio. I found that 24GB RAM is really the minimum that is usable on M-series when doing any type of basic work.

Likely a result of so many Electron-based apps which are memory hungry, but if the memory is supposedly "really fast" then I would expect better performance. Never had the issue with Intel.

But with a nice bit of memory, the performance is really nice.


I'm on 16GB which seems plenty - running many JetBrains IDEs at the same time, browsers, VMs, etc. I know people with the 8GB version and they are fine too. Could be that your machine is faulty somehow and that it's not the RAM but something else.


You are the only person in the thread saying this. One of my laptops is still an M1 Air 16GB. It's still the fastest computer I have ever owned outside of my MBP; I really only use the MBP for AI training, everything else still runs great on the Air.


> 24GB ram is really the minimum that is usable on M-series when doing any type of basic work

M1 Air with 16 GB RAM. I have RubyMine (JetBrains IDE), an application server running Rails, MariaDB, MongoDB, Redis, Slack, Safari and Apple Music running, and DxO PhotoLab 7 loaded on the non-work desktop space and clicking through some photos I took yesterday. Memory pressure green, laptop chilling at 30 °C.


I'm using Unity (uber wasteful) and Rider (Java), and an M1 Air with 8GB is fine, no slowdown whatsoever* (compared to my desktop with 32GB on Windows). RAM is not the bottleneck on *NIX OSes, unless you do something that specifically requires a lot of RAM (e.g. machine learning).

*compiling is slower, but the desktop has a much faster CPU


You might want to get that checked out

I've got an 8GB Mac mini, a 16GB iMac and a 64GB MBP, all on Apple Silicon. In 99% of cases I can't tell a difference in responsiveness or use, and my main use is app development in Xcode and high-resolution photo and video editing.

Even when actively monitoring things I mainly see a difference in memory compression, not so much in swap.

As for Intel, before Apple Silicon I had the most tricked-out Intel MBP, and I'd rather chop my left arm off and be stuck in a bunker for a year with just the 8GB Mac mini than have to use that Intel MBP I had. Unless perhaps the bunker has no heating, in which case I might have to reconsider.


A contrasting data point: I have an 8gb M2 and haven’t noticed any problem yet that made me wish I had more memory. I’m not using it for work, though.


Not sure what workflow you are on but I have a few IDEs, simulators, electron apps, etc. on my 16gb m1 pro. I haven't ever had memory issues.


> With the M3 family of chips, hardware-accelerated ray tracing comes to the Mac for the first time. Ray tracing models the properties of light as it interacts with a scene, allowing apps to create extremely realistic and physically accurate images. This, along with the new graphics architecture, allows pro apps to deliver up to 2.5x the speed of the M1 family of chips.

Does this mean we can finally play Cyberpunk 2077 on a laptop? Is it going to be anywhere near as powerful as desktop Nvidia cards?


The performance is not that great, but you can already play Cyberpunk 2077 on an M2 Pro laptop using the Game Porting Toolkit https://www.youtube.com/watch?v=sPJpkRmsceU Keep in mind, it has to do the x64 -> ARM64 translation and translate DirectX into Metal. I would expect a native version to work better.


Pretty cool but that isn't enjoyable. Those are minimum graphics settings with heavy upscaling and it still barely manages 40FPS with extreme lag spikes in combat (and slowdowns?).

Also having played the game, I can tell you the scenes chosen there (pretty much all of Act 1) are some of the best optimized/least demanding areas. Especially the DLC is going to be unplayable with that performance.

Certainly looks like with just a bit more hardware power and development it could be getting somewhere though.


It's impressive, but there's still a ways to go. Impressive nonetheless.


> Does this mean we can finally play Cyberpunk 2077 on a laptop?

My 2020 14" RTX 2060 laptop played it just fine, with decent settings (and copious DLSS). At 4K!

In the state I played it (earlier this year), it was great looking even without raytracing.


No. It's nowhere near as powerful as desktop Nvidia cards.

It's about as powerful as a PS5.


It could theoretically be as powerful as a PS5 but not nearly as performant. Game studios are not spending any time optimizing for Apple Silicon, while they dump tons of dev hours into making sure a game runs smoothly on the console.

Sony uses its custom graphics API (GNM and GNMX) and custom shader language and x86 platform. Xbox uses DirectX. Apple went in on its own route with Metal. That widens the gap.


Game studios release games for the iPhone. Which means they do spend time optimizing for Apple Silicon and dump tons of dev hours to make sure things run smoothly.


Good point. But I still think those would be completely different games, with different teams doing each, for most games.

I can think of very few examples of games being cross-ported to all platforms, like Fortnite or Genshin Impact.

To the point of the PS5 comparison, we're talking about very different games than people are thinking of. Maybe Apple needs to go the usual route of buying a studio and releasing games, but I don't think Apple likes or respects gaming enough to do that with good results.


I'd be impressed if it was as powerful as a ps5.


The PS5 gets around 10.28 teraflops, while the M1 Max, from two years ago, was getting about 10.4.


Interesting how the ps5 can run much more intensive graphic applications than the mac


Do you mean at max raytracing settings? Alienware laptops should run Cyberpunk at 60, just not maxed out.


Interesting how the ps5 can run much more intensive graphic applications than the mac


Replied on the wrong thread, sorry.


I'd prefer if there'd be higher RAM limits for each variant:

* ~~Ultra~~ Max probably has an adequate limit with "up to" 128GB

* Pro would be nicer to have "up to" (say) 64GB, rather than 'only' 36GB

* Base perhaps "up to" 32GB, rather than 24GB

There are some situations (running a VM or three with different OS(es)) where there really isn't a substitute for more bits.


Max is the one with the 128GB limit. We'll see what the limit is for the M3 Ultra, possibly 256GB.


IMO, if I'm doing that heavy of a workload, I'd rather SSH to a server instead of trying to do it on hardware optimized to work on a battery.


It’s also nice to just run it locally. No need to move your code and data. Debugging is a breeze. And you don’t need to pay for a server and a laptop.

But naturally, if you want to run on 128 cores and 4TB of ram, then you can’t do that in a laptop anyway.


I would rather avoid swapping between machines because it's unnecessary complexity when laptops are already so fast.


It doesn't have to be "heavy": just running a browser in a different OS still needs a fair amount of RAM (even if the 1-2 assigned vCPUs/cores are mostly idle).


24GB or 36GB can run a browser in a VM.


I just hope they stop selling 8GB variants entirely. All those do is make job security for tech support people.


Completely disagree. Happily run Intellij with various containers doing dev work on my 8gb m1.


Yes, you're a tech person. I mean when non-tech people run them and try to use them for Adobe Suite and a browser at the same time, etc.


They didn’t.

Not surprised. But I agree with you. Especially for anything with the word “Pro” on it.


What's the point of re-parroting Apple PR and including "the most advanced X" for each new generation of something Apple?

I get that they must say it, but for the rest of us it's pretty obvious that they would say it; it would make no sense to develop something less advanced than the stuff you already have.

You might make different trade-offs, but you surely would not do whatever the equivalent of de-growth would be in computing.


Apple launched M2 MBPs only in January. I wonder if Qualcomm's Snapdragon X Elite (benchmarking now, but not in products until mid-2024) drove some of the timing on this launch: https://www.windowscentral.com/hardware/laptops/qualcomm-bri...


From what I've heard from folks connected with Apple, the M3 was designed some time ago and TSMC delays on 3nm are why the M2 happened at all. So Apple is releasing the M3 later than they'd have liked, and I expect we will see the M4 refresh faster than we would have otherwise.


ARM on Windows is going to be irrelevant for at least a decade, likely forever.

There is a long-standing culture with the Mac that (a) backwards compatibility is minimal, (b) developers need to always keep their code current and (c) users should expect constant turnover in the platform. None of that exists on Windows. And when we last saw Windows on ARM the outcomes were that this wasn't going to change.

Which means it will never be a transition, but rather a fragmentation of the Windows ecosystem into two equal parts, making it far less compelling for everyone.


The reports of emulation and native apps on Windows seem pretty compelling to me, and the battery life improvements are substantial. For people who live in browsers (most people), it's not that hard.

The more compelling reason for fragmentation is that AMD and Intel will stay with x86 as long as they can, and high prices from Qualcomm will keep people from shifting over en masse.


Other way around, I suspect. Qualcomm knew this event was coming and wanted to front-run Apple's announcement.

Especially as the Macs ship much sooner...


M2 was late rather than M3 being early.


Oryon seems like it's going to be very competitive with the very underwhelming M3 design.


It's shiny, but I think I'll stick to my upgradable RAM and two M.2 slots + an RTX 3070 Ti and i7-12700H for a fraction of the price.

Something at Apple broke when they switched to their own silicon outside iPad and iPhone. They turned around the slow march toward parity on price:performance/RAM/storage and went screaming in the other direction. I really wanted to switch to mac when I was shopping for a new system earlier this year, but the prices were and remain ridiculous.


People have never bought Apple because of the hardware. They buy it because of the software, the aesthetics, the label, the support (It's easier to find an Apple repair shop than repair shops for any other brand anywhere in the world), and the integration with Apple's services. Comparing spec to spec is missing the point.

Also while this is niche, at the high end (M3 max + 128gb) you end up with a laptop that can literally just do things no Nvidia laptop can do right now due to the gobs of memory directly addressable by the GPU. I'm not saying 99.9% of people are going to take advantage of it, but it is a thing.


> People have never bought Apple because of the hardware.

Actually, a lot of people do. Here's one famous example [1], and I also know people in my network who love their MacBooks for the hardware.

[1]: https://arstechnica.com/gadgets/2022/08/linus-torvalds-uses-...


> People have never bought Apple because of the hardware

Hardware was the primary reason for me buying an M2 MBP. I've been primarily a Linux user for years, but I bought a MBP after my ThinkPad T14s just died and became completely unresponsive in the middle of a work day. The screen is gorgeous, the trackpad is great, the keyboard is decent, but the battery life is incredible. My ThinkPad was advertised to have a 12 hour battery life, but I'd be lucky to get 8. I can go nearly two days without charging my MBP.

I don't love macOS though. I had to install several third party apps to get basics like window snapping and a middle click, but I'm mostly okay since I just swap between Emacs and Chrome. Once Asahi gets DP over USB-C support though, I'll probably switch.


I tried to find a laptop with the specification you just mentioned that could last 20 hours on battery. Didn’t.


I'm personally always near a wall socket. My laptop is mostly a desktop. When I'm out and need to use the battery, it's only for a few hours. Do people really need >8h battery life?


It’s a huge nice-to-have. Moving from a decently beefy laptop to a MacBook has been night and day for me in terms of (a) not worrying about windows smart sleep (or whatever it’s called) waking my laptop so it’s warm and dead when I go to use it a few hours later and (b) not worrying about my laptop being dead when I open it after the weekend.

(a) should have been fixed years ago but still randomly happens on my partner's laptop. (b) is a pain to deal with when you just want to get in to work. No longer having wall wart proximity anxiety every time I go to use my MacBook is really nice.

I’ll move back when laptops catch up though since I miss dual booting.


> (a) not worrying about windows smart sleep (or whatever it’s called) waking my laptop so it’s warm and dead when I go to use it a few hours later

MacBooks can run into the same problem if you have AirPods. I don't know WHAT the exact logic is, but the number of times I've used my AirPods at night, with the last pairing being to my phone and the AirPods then connecting to my work laptop instead, causing a series of blares from MS Teams, Outlook and a ton of other notifications, people on Teams seeing me as "active" despite it being the middle of the night... annoying as fuck.

I don't get why it seems to be impossible to tell the MacBook to not wake up from any kind of Bluetooth event. I get that this is the default because Apple wants trackpads, keyboards and mice to wake up a laptop... but I'd REALLY love to be able to turn that off because I don't use either of these three.


For years I am a happy user of Kill Bluetooth On Sleep (KBOS) https://github.com/alb12-la/KBOS


Long battery life isn’t strictly necessary no, but it’s nice. A lot nicer than one might expect. It can make the difference between needing to bring a charger or not, plus it’s one less mental background daemon to keep running (thinking about how much life is left, where the nearest outlet is, etc).

Arguably the bigger impact of long battery life is how it’s no longer necessary to throttle the system when it’s unplugged to not burn through battery in an instant. The performance of plugged and unplugged is identical.


At work, I'm in meetings much of the time, and I was lucky to get 4 hours out of my old MBP; if I didn't charge it over lunch, it wouldn't last the day (there are chargers in meeting rooms, but only a few, so I can't always count on getting one).

At home, I work in a few different seats (one of them is a desk with a big monitor, but for emails/reading, I prefer to sit elsewhere, including on the back deck in nice weather), all but one does have an outlet, but it sure is nice to be able to get up and move around without constantly plugging/unplugging the laptop. Magsafe does make that easier though, only takes a second to plug in and no big deal if you stand up without unplugging, it doesn't pull the laptop out of your hands or the charger out of the wall.


I find the extra battery life is nice, but it is really just a side effect of the machines not getting hot. That is an amazing feature if you ever have to actually use it on your lap. It also means that it stays silent the vast majority of the time.


I think this is a great advantage, regardless of battery life, like you said.


I’ve never taken my charger to work with me. It was an odd experience the first week. I’ll never go back to plugs.


I never do either, but that's because I have USB-C docks at work.


A Thunderbolt dock for a one-cable setup for everything is amazing. Power, video, peripherals, network, etc all with one little cable.

I never wanna go back. I know some PCs can do it too. It's just so much nicer than having to plug in two or five or eight cables. It doesn't seem like a big thing if you only plug in and unplug once a day, but it is.


There are a lot of things people don’t need on average. But when you do need it, or you have a particular use case, well, then you need it.

These comparisons are always based on raw performance numbers or price. Which isn’t useful.


What's the point of having a laptop, even at home, if you can't be away from a wall socket without worrying about battery life?


This is a desktop replacement due to no room for a desktop. It's always plugged in. I have a nice iPhone for writing and making music on the go.


With a 3070? Unless you have a backpack with a battery, that's completely unreasonable to expect. External battery packs that plug into USB-C are actually pretty small and affordable, and easily 3x the battery capacity.


Yep. But, me and everyone I know keeps their laptops plugged in most of the time at work or at home, so not everyone needs all day battery life.


I didn't care much about the battery life but not having to be tethered to the nearest wall wart is unexpectedly convenient even when working from home. I also appreciate the laptop not being a hot air blower.


I use laptop at my desk at home or my desk at the office where the power cable is within reach so it's always plugged in.

What use would be 20h of battery life for me?

I don't do the whole coding in the coffee shop or on the beach lifestyle, and even if I did, 8h of battery life is plenty for that since I can't spend 20h in the coffee shop.

There's probably a bunch of workers who are always on the road like contractors for whom this is a benefit.


I'm not on the road, I just tend to move around even when at home/in the office. Lower power consumption also means the laptop is silent and barely warm, and a tiny 30W charger is perfectly sufficient. Unrelated, but another thing about the Macbook Pro is how unexpectedly good its mics, speakers, and camera are. I used to have quite an AV setup but I can't complain about the built-ins nowadays.


>I'm not on the road, I just tend to move around even when at home/in the office.

I don't. I only work at my desk as that's where I have the best ergonomics: a height-adjustable desk, chair, keyboard, mouse and a large monitor on an adjustable arm that give me the healthy posture I want for long sessions of deep work and focus.

I can't understand how people can get work done hunched over their tiny laptop on the couch or kitchen table while getting a crooked neck/back, when that's what desks are meant to prevent. That's just not me, nor anyone I personally know.


What job do you have that requires you to work non stop for 20 hours away from a wall socket? I understand the need for autonomy but there are diminishing returns. It's ok to start looking at other decision factors past a certain point.


People don't only use their laptops for work. People also don't all sit at a single desk throughout the day.

Not to mention that you can cut most laptops' rated battery lives in half or even worse if you're doing anything moderately demanding, and that those demanding workloads often cause them to throttle pretty badly when on battery power, so it's not a simple question of "X hour battery life".

A high-spec laptop that's actually autonomous for daily use (from weight to screen quality to battery life) and does not turn into a radiator or a turbine when it's being put through its paces should be a reasonable enough value proposition for anyone to understand even if they personally don't want it; I don't understand why people are always so eager to snidely bring up devices that don't meet those criteria whenever Macbooks are mentioned.


20 hours isn't full CPU usage, of course. The last Intel laptop I had lasted a couple hours. My M2 gets me through the workday.


That's true, however I realized I personally don't actually need > 6hr battery life in a laptop, like, ever. There is always an outlet near me, and the charger & cable are small enough to easily carry around. If I am on a trip, well the same charger can be used to charge my phone and other devices, so I am not packing more than I would have otherwise. Why do I need to worry about battery life?

P.S. both you and I know that the 20 hour number is not real.


Would be interested in a link to a model if you have one handy!


That could be any gaming laptop from any manufacturer; those are the only ones that still ship with upgradable RAM and dual M.2 slots.


Finding one with a decent screen for anything other than gaming is the trick. I linked it up there, but this is IPS with 100% sRGB, more brightness than most people need, 16:10 (vertical space is nice!), 165Hz, and matte! Most gaming laptops are...not that. Maybe 50% sRGB on a dim glossy TN screen.


>Most gaming laptops are...not that.

Then you haven't been looking close enough. Cheapo gaming laptops yes, but Lenovo Legion, Asus G14 plus most other more premium models from other manufacturers have very good screens.


Lenovo Thinkpads? You can absolutely have two m2 slots with a good screen and a solid graphics card. https://psref.lenovo.com/Product/ThinkPad/ThinkPad_P1_Gen_6


I think any midrange gaming laptop in 2023 has that plus much better response times compared to apple laptops


Workstation laptops from Lenovo and Dell have upgradeable RAM and have been offering 128 GB RAM for something like 6 years.


https://www.bhphotovideo.com/c/product/1746503-REG/lenovo_82...

B&H doesn't stock it anymore, but it looks like other places still sell it with updated specs.


You know they announced a price cut right?


The low end "MacBook Pro" without a "pro" chip and only 8 gigs of RAM cannot seriously be considered a "pro" machine. That is a marketing gimmick.


You only need to bump the RAM to 16 GB to get a nice, quiet, efficient dev machine, similar to an Air, but with the significantly better screen, speakers and longer battery life.


I've reconsidered and there is some truth to this. I have a 14" M1 Pro and it's the best screen I've seen on a laptop! I honestly don't need the "Pro" chip for what I use it for.


The price cut is the new starting model with just the M3 as a new SKU. The old comparable SKUs are about the same.


It's arguably also a price increase over the outgoing 13" M2 MBP model it replaces which was $1300.


It doesn't really replace the 13" MBP. That was a weird in-between relic with the old screen, a TouchBar and a fan that the CPU didn't really need. Essentially an Air with a bunch of semi-useless upgrades. Both the actual 13" Air and the real 14" Pro were significantly better, more focused machines.


It does because it's the low-end MBP with the base M chip. The 14" M3 has a fan too.


No they didn't, look closely. I was also fooled at first.


Apple Magic Keyboard, Mouse and Trackpad are still Lightning, I cannot believe it


The regular M3 chip is still software limited to a single external display...


I can't believe all their external monitors are still 60hz...


Love the improvement on the CPUs, but I was hoping for bigger iMacs and smaller Macbooks. I work at home on a 27" iMac, and on the road on a 12", 2 pound Macbook. We are years into the M generation of chips, and iMacs are still stuck at 24", and laptops are minimum 3 pounds and 13".


I feel similar to you, but I don't think we are going to see another high end iMac. Apple wants us to get Studios.


I wonder if it didn’t sell well enough. I know it was extremely popular with a lot of us tech people, but the numbers may not have been there.


I purchased a fully maxed out M2 Max MacBook Pro when it was announced like 6 months ago. I knew it was going to be outdated eventually, but within 6 months, wow. Did not expect that.


i mean, if the machine does the job you purchased it for, is it really outdated? Unless the job it is doing is to be the best and latest, as a status symbol...


Yeah, I still have an M1 Mini and it does everything I ask of it pretty handily. The most intense work being rendering edited photos out of Lightroom while watching videos and web browsing around the same time.


Oh I'm super happy with it - it has handled everything I've thrown at it and then some and I'm not planning on upgrading anytime soon. But with the M2 Pros having such a short shelf life, I do wonder what that means for long-term support.


Given Apple’s history and California law, I wouldn’t worry that your M2 will have a short lifespan.


I doubt it means anything. The 3rd gen iPad was superseded within 6 months, but it still had 7.5 years of support.


[flagged]


Please don't link to Asahi Linux on HN - Asahi doesn't like it.



Lessens the resale value at the very least.


I use a maxed-out M1 Max and have 0 complaints. There was never a task where I felt the memory or CPU performance was not enough


Same here, battery lasts for a long while and it never gets hot, which is a big deal for me as I use it on my lap for long periods of time.

When I bought it, I knew it was going to be a laptop that I'll keep for at least 5 years. So far, it still feels like new.


I have the same M1 Max as a personal machine and just got a maxed out M2 Max for work. The M2 Max does feel faster through daily use but that could also be related to the new install?


I periodically reinstall mine and keep the battery healthy


You can feel slightly less bad. M2 Max was announced in January, 10 months ago.


I have an m1 max and I'm more than happy. Until local LLMs work really well on these chips I don't see a reason to upgrade


depends on how you define "outdated" i guess


I love Apple, but they wasted a 30-minute midnight event (in the UK) to announce new Mac chips and a new colour of Mac. This could have been a press release plus hands-on tech reviews. The filming was obviously impressive, but the event itself didn't justify 30 minutes, and I don't think they'll do it this way again.


It was also showcasing the iPhone 15 Pro camera, as the entire event was filmed with that device


I have a 16GB M1 Pro and it's a great machine. I want to take the plunge on an M3 Max. It feels like a computer that should last 4-5 years, much like my M1. However I keep an ugly ass full-AMD Asus ROG laptop running Linux for gaming.

I want to wait for the dust to settle on the Proton-like compatibility layer feature. I know Crossover exists but past experiences lead me to believe it is not as compatible as Proton. Back in 2021 I tested various programs that worked well on Proton but not so well in Crossover. Hopefully the Metal enhancements fix it.


Hopefully, but at least from my (non-professional) perspective, Apple makes it difficult to make software for macOS

- no support for cross-platform gfx APIs (OpenGL, Vulkan)

- requires physical Mac to publish, no support for emulation


Ya I'm on an M1 Max with 64G of ram, and it's still great. I know the M2 and M3 now (for Max) have more memory bandwidth, which is attractive, but otherwise...meh.


On a related note, and on a personal level, my heart goes out to Johny Srouji and the Apple team in Herzliya. War is horrible, and I hope every reservist called up comes back to do magic at Apple.


It’s strange that Vision Pro suddenly no longer has the current chip. (But it’s possible that they announced it with the M2 and started working on putting in the M3 as soon as it became available internally.)


I think AVP will have the M3. All of the new features scream AVP.


Interesting that the whole presentation was shot on an iPhone. I've been wondering when they'd make that leap.


What I really miss in my work issued M1 Pro isn't more power, it's the keyboard in those 2010s models.


I agree. And the longevity of them specifically, for me. My M1 (MBP and MBA) keyboards shine after a few weeks of activity, whereas there's no shine on older MBPs with a decade of use – it reeks of quality degradation.


100% this. My work M2 MBP is barely a few months old, but the keyboard already looks very worn out :(

OT: I love my M1 MBA and really hate the new style which seems to cut into my wrists and arms :(


110%. I still use my MacBook Pro from 2011 and when using brand new Macs, I can't help but miss my home MacBook Pro.


Can we use more than one external display yet?


Seemingly no. So far it looks like (someone please correct me if I’m wrong):

- M3: 1 external display

- M3 Pro: 2 external displays

- M3 Max: 4 external displays

This has to be a product differentiation decision at this point, right? Is there any serious technical limitation against more displays in the base chip, that Apple has not been able to solve in three generations?


It’s frustrating. I wish they would just charge an extra $X for the privilege of using multiple external monitors. I like the form factor of the MBA as is… if I need to compromise on that, I might as well choose a different machine.


Same, the M2 MBA is all the power I need and I vastly prefer the smaller size, but the monitor issue is killer.

I finally bit the bullet and bought a DisplayLink dock, which gets the job done, but just seems like a silly compromise when the Intel MBA could support 2 external monitors natively.


I hear the DisplayLink solution is very half-baked and has many issues. I'm going to skip mac laptops for as long as they don't ship a Macbook Air with dual external display support -- I am doing fine with Windows laptops, and they really have improved in performance and efficiency over the last few years.


> I wish they would just charge an extra $X for the privilege of using multiple external monitors.

They do, you have to pay more for an M3 Pro or M3 Max.


The person you replied to wants the nicer form factor of the MBA with the ability to use multiple external displays.

That combo isn't currently offered, and judging from today's M3 specs the next MBA won't likely offer it either.

I assume the iMac has the same limitation, which is a bit of a downer for some uses. It would be nice to drive a second monitor and a wall-mounted TV or projector from an iMac in a sleek home setup.


For context, you were able to do 2 external displays with the MacBook Air from 2018-2020.

...and almost any Windows laptop at the same price point.


My bad, I missed that specification. I would agree with them, it would be nice if you could pay more for a MacBook Air with multiple external display support.


Mac Mini with the M2 Pro can drive 3 monitors, Macbook Pro with the M2 Pro only 2.


M2 Pro can always drive 3. The MBP just has one display built-in, so can drive 2 more.


Unfortunately tho, iirc, clamshell mode does not get you the use of an additional external display.


Of course. Have been since M1 Max. My MacBook Pro M1 Max powers two external displays.


With a MBP, sure. I don’t want a machine that large however. I have both a 14” Pro and a 15” Air; the Air feels significantly smaller than its “smaller” counterpart…


I'm sitting here on an M1 Max with two external screens ..


Careful when sitting on computers, they might break!


I so wish these beautiful beasts ran all the games for windows like Proton / Steam Deck :(


"No matter what your passions are in work or in life, there's a Macbook Pro that's perfect for you."

For the vast majority of Apple customers, Macbook Airs are plenty powerful. They're also lighter weight, and significantly cheaper.


M1 Max 16 GB struggles for my use cases, ram is way too low for how much I paid for it.

Then I look at the new lineup and they are starting at 8GB still? And please don't BS me with "it's enough for most users", because this is a stupid argument; I'll tell you what, so is my 2013 MacBook!!!

An 8GB upgrade then costs me 300 euros? This is crazy; 8GB of DDR5 memory at Apple volumes is a $15 expense at best.

The fact that Apple keeps gouging software engineers and professionals who will always take the upgraded model (as my employer and virtually everyone I know does) makes me say yet again: those models can sit on the shelf for me.

I'm not opening my wallet for such greed.


Yeah that's crazy for their Pro line to even offer that option. Only the lightest users of MBAs should be buying a new machine with 8GB RAM.


Looks like the latest MBA is only 0.1 lbs less than the equivalent MBP at 3.4 lbs. Feels like they used to be lighter.


How are you determining "equivalent"? I don't agree with your conclusion at all.

My M2 MBA is 2.7lbs. The 14" MBP starts at 3.4lbs. That is a very noticeable 26% increase.

0.44 inches thick versus 0.61 inches thick is also noticeable.

I have a 14" MBP for work, so I have plenty of first hand experience with both devices.

The screen sizes of the relevant laptops are as follows:

    MBA (13in): 13.6 inches
    MBP (14in): 14.2 inches (+0.6in vs 13in MBA)
    MBA (15in): 15.3 inches (+1.1in vs 14in MBP)
The 13-inch and 14-inch are significantly closer in size than the 14-inch and 15-inch.

To add an additional spec for clarity, by my math the relative screen areas are as follows:

    MBA (13in):  84.36 square inches
    MBP (14in):  91.63 square inches (+8.6% vs 13in MBA)
    MBA (15in): 106.24 square inches (+16%  vs 14in MBP)
Why would the "equivalent" be the 15-inch MBA? Even though I obviously believe the 13-inch MBA is a closer match (as the specs above clearly show), the 15-inch MBA does still manage to be thinner and (barely) lighter than the 14-inch MBP, even while offering a significantly larger screen, which is impressive to be sure.
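
If anyone wants to sanity-check those areas, here is a minimal sketch that derives them from the marketing diagonal and the aspect ratio implied by each panel's resolution (the resolutions are the ones Apple lists, also quoted in a sibling comment; small differences from my figures above come from rounding):

    # Area from diagonal d and aspect ratio a = width/height:
    # height = d / sqrt(1 + a^2), width = a * height, area = width * height
    from math import sqrt

    panels = {
        "MBA 13in": (13.6, 2560, 1664),
        "MBP 14in": (14.2, 3024, 1964),
        "MBA 15in": (15.3, 2880, 1864),
    }

    for name, (diag, px_w, px_h) in panels.items():
        aspect = px_w / px_h
        height = diag / sqrt(1 + aspect ** 2)
        width = aspect * height
        print(f"{name}: {width * height:.1f} sq in")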


The resolutions and brightness are also relevant:

    MBA (13in): 2560 by 1664 at 500 nits
    MBP (14in): 3024 by 1964 at 1000 nits
    MBA (15in): 2880 by 1864 at 500 nits
I also would compare the 14" MBP with the 15" MBA: closer weight, closer display. If price wasn't a factor, the 14" MBP wins for me.


They also used to feel a lot thinner, with the taper at the front edge.

I don't like my M2 MacBook Air nearly as much as the M1, in terms of the physical feel and dimensions.


I had four or five gens of MBA over the years and generally agree. The new form factor is just not the same. It is much more like how I'd envision the design of some modern "Macbook" proper.

But the Air brand has been so successful, I think they just called it that knowing it would help move units.

That said, I use an MBA M2 as my main machine and it's so powerful and has such fantastic battery life that it is still the best Mac I've had since the 2015 MBA.


I use an MBA M2 15in 24GB as my main and I agree.

I really enjoy the wedge shape of the original MBA that was maintained until the 2020 MBA M1, they feel better to my wrists than the 13 in MBA M2.

The 15in MBA M2 however feels to me like a 2012 15in MBP (I had one of those as well). Except for the relative lack of ports which didn’t bother me too much so far.


I miss the 27" iMac. It was (is) such a neat mac and it was even a reasonable deal considering it came with a nice screen.


I was hoping—if not expecting—to see a refresh of the 27” iMac; I was ready to buy one! It disappointed me they simply upgraded the current 24”.

The new 24” are lovely, but I will miss the real estate (still using a “Retina 5K, 27-inch, Late 2015” one).


The best feature to me is really the battery life and the quietness of the laptop. That's a game changer for me. I can work from anywhere comfortable, without burning my palms or legs and in silence.

I remember working on a PC laptop years ago and as soon as you pressed F5 the fans came on cranking for a couple of seconds, it was farcical.


There are Windows laptops with fantastic cooling setups. My old G14 can run fanless, and they've improved the cooling even more in subsequent generations.

The thing Macs really excel at atm is long, sustained workloads on battery... Other than gaming.


OK they got me, time to upgrade off my i9 MacBook Pro.


>Rendering speeds are now up to 2.5x faster than on the M1 family of chips.1 The CPU performance cores and efficiency cores are 30 percent and 50 percent faster than those in M1, respectively, and the Neural Engine is 60 percent faster than the Neural Engine in the M1 family of chips.

WTH, was Apple sandbagging on the M1?


5nm vs 3nm and three years of progress. Intel's e-cores have been seeing more generation over generation uplift compared to their p-cores so nothing about this strikes me as anything unexpected.

When the M1 came out, details about the M3 were already well known, and it was clear this was going to be a pretty big generation for performance improvements. I think the surprising thing was that the M2 got pretty close to the M3 in situations which were not as thermally constrained.


thx, good info.


These are the new 3nm chips from TSMC, right?


They didn't say, but since their A17 is 3nm this is almost guaranteed to be 3nm. The 3nm process is the big improvement, the M1 and M2 were TSMC 5nm and 4nm which was basically the same thing.

The performance improvements here are roughly what you would expect if you knew the specs of TSMC 3nm chips.


They DID say it, right at the beginning. All the M3 chips are 3nm.


The A17 Pro benchmarks are extremely disappointing, so I don't think 3nm is actually doing much.


Significantly faster absolute single core CPU performance than a 12900K is pretty good and not disappointing at all. I find the improvements in A17 Pro substantial.


Is this still ARMv8?

I think ARMv9 SKUs from others are just around the corner, and that's going to be a pain point for Apple.


https://developer.arm.com/documentation/102378/0201/Armv8-x-...

Apple implements all armv8.5 features and some 8.6 features.

The 9.0 competitors are at the same level as 8.5.

Cortex-X4 is at 9.2 which is not yet in any SoC.

The only difference between armv8.5 and armv9.0 is one single extension that is required on 9.0, namely SVE2 (scalable vector extensions 2).
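
If you want to see what your own Apple Silicon Mac reports, here's a rough sketch (assuming the ARM feature flags are exposed under the hw.optional.arm sysctl namespace, as on recent macOS releases; SVE/SVE2 simply won't show up on chips that lack them):

    import subprocess

    # Ask the macOS kernel which ARM ISA features it reports for this CPU;
    # each line looks like "hw.optional.arm.FEAT_DotProd: 1".
    out = subprocess.run(
        ["sysctl", "hw.optional.arm"],
        capture_output=True, text=True, check=True,
    ).stdout

    features = [
        line.split(":")[0].split(".")[-1]
        for line in out.splitlines()
        if line.rstrip().endswith("1")
    ]
    print(sorted(features))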


SVE2 is the big one.


As someone who prefers the MacBook Pro feature set (Pro Motion, namely), but doesn't need the best processor, I'd be fine with the 14 inch Pro, M3.

But that model doesn't come in Space Black. Well played, you son of a bitch. Well played.

They've mastered the art of the pricing ladder and it's fairly aggravating.


Well, you mention you mainly care about the feature set. What kind of feature is body color that would affect functionality? In fact I feel any kind of scratch would show worse on Space Black as compared to gray or silver.


I’m sure I remember Apple using Myst to sell Performa Macs in 1995. Some things never change.


So they removed the 13-inch MacBook Pro and replaced it with the 14-inch as the base model. Many people will see the $500 off. Good selling strategy.

Btw, I didn't see the comparison with the M2 chip. Could anyone give me a reference for that? Thank you.


They had multiple slides for M1/M2/M3 in the video. M3 is around 15% faster than M2 in most benchmarks (according to Apple).


Thank you.


I just felt the continued existence of the 13" MacBook Pro was confusing because it was very arguably worse than the 13" MacBook Air even without considering the price.


> And, a new media engine now includes support for AV1 decode, providing more efficient and high-quality video experiences from streaming services.

It's taken years but AV1 has ended up everywhere.


Sadly, only a handful of GPUs out there have hardware AV1 encoding. If I read the product page correctly, the new M3 architecture still cannot encode AV1 in hardware.
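
In practice that means AV1 encodes on a Mac still go through a software encoder, while H.264/HEVC can use the media engine via VideoToolbox. A rough sketch of the difference with ffmpeg (assuming an ffmpeg build with libaom and VideoToolbox support; the filenames are placeholders):

    import subprocess

    # Hardware HEVC encode via the media engine (VideoToolbox):
    # fast and power-efficient on Apple Silicon.
    subprocess.run([
        "ffmpeg", "-i", "input.mov",
        "-c:v", "hevc_videotoolbox", "-b:v", "8M",
        "out_hevc.mp4",
    ], check=True)

    # AV1 has no hardware encode path on M1/M2/M3, so it falls back to a
    # software encoder such as libaom (slow, CPU-bound).
    subprocess.run([
        "ffmpeg", "-i", "input.mov",
        "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0",
        "out_av1.mkv",
    ], check=True)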


Decode everywhere is still great.

Encode for h264 and HEVC was slow too. It basically never even happened for VP9 outside of phones, AFAIK.


Which is likely fine. As a codec it's weighted towards distribution, not live encoding.


Apple's biggest problem may be how good the M1 laptops were. I have a 14" M1 Max and there's no task in my workflow that's slow enough that I feel any real need to upgrade.


It's frustrating that they don't refresh the whole line at once. I'm in the market for a new mac mini but now I feel forced to hold off until it gets its refresh.


I got a new M2 Pro 16" just a few weeks ago. I was pissed when I discovered that a new M3 Pro was coming out. Reading all the comments here, I feel a bit better.


You missed out on black though.


Yes, the black is the "killer feature" of those Macs, I'm thinking about upgrading from my M1 Air solely for this reason :)


"Graphics-intensive games like Myst have incredibly realistic lighting, shadows, and reflections, thanks to the next-generation GPU of M3"

This is the funniest thing I have ever seen in an Apple press release. Myst came out in 1993, and the "graphics-intensive" version of Myst that they are demonstrating is clearly realMyst which came out in February 2014, just a few months shy of a decade ago.

Meanwhile, my wife is happily playing Baldur's Gate 3 on her M1 Mac. Could they really not get the sign-off from any other studio but Cyan on this? No shade against Myst, it is one of the best games ever produced, but this is not how you do a graphics demo...

----

EDIT: Just now learning that there is yet another Myst remake from 2021 made in Unreal. It turns out I have played this version, but I did it on a Meta Quest 2, so I thought I was just playing a VR port of realMyst because it looked equally bad.


The realMyst from 2014 was actually realMyst: Masterpiece Edition, the Unity engine remake of the original realMyst from 2000 that used Cyan's own 3D engine (later used in Uru and Myst 5).

So Myst has to be one of the most-remade and ported games in the industry: five major iterations just on PC.


"the most advanced chips for a personal computer", says Apple. This is not an objective statement.


Name the more advanced chip then... Intel and AMD fall short.


Wouldn't it make more sense for Apple to define and back up what they mean by "the most advanced"? Also, Apple's products never hold up to the specs they quote in their marketing material anyway. I'm not sure why you are taking their own post as fact, when Apple is one of the most blatant over-marketing companies around.

And even if it were true, it doesn't matter, because Apple is obsessed with placing their chips in the most thermally constrained form factors possible.


Apple routinely sandbags their performance in marketing slides. The actual performance is usually 1-2% better. Probably because they're the most visible company in the world and don't want to get sued.

AMD/Intel meanwhile will cherry pick benchmarks and way overpromise.


> Apple's products never hold up to the specs they quote in their marketing material any way

For example?


Like another commenter said, what is the definition of "Advanced" in this context


What's the budget?


Well, if you consider the entire package rather than the CPU alone, I can see it. Still a misleading claim, but if you stretch the definitions of “advanced” and “chip” you can do it.


What is the objective definition of "advanced"?


There isn't one, that's the point.


> This is not an objective statement.

Rick and Morty did it before.


I have found some deals on M1 Max Macbook Pros with 10-core CPU, 32-core GPU, 64GB RAM and 2TB storage. Does it make sense to go with those or with base M3 Pro with 11-core CPU, 14-core GPU, 18GB RAM 512GB storage which is the same price?


FWIW I dev on an M1 Max MacBook Pro and it’s the best computer I’ve ever owned.

Maybe the best thing about these M3 chips is that they’re making those older machines more affordable.


Fwiw, I have a 16" M2 Pro and had a close call with losing it to water damage, so I picked up an M2 Air and it is FASTER than the Pro for most of my jobs (Electron/Xcode dev). The Air is so much lighter that I much prefer it to the Pro. The Pro now lives at my desk via TB, and the Air in my backpack for pretty much everything else. Weirdly, when I look at the GeekBench browser I see that the M2 Pro should be faster, but at the time I took screenshots showing the Pro @ 2.7 GHz and the Air @ 3.5 GHz... and I just checked again right now (at my desk) and the Pro is running at 3.29 GHz, so the dynamic CPU scaling must have had an effect. Yet the actual work doesn't lie, and everything feels super snappy on the Air. YMMV, of course.


No, keep the M1 Max. M1 Max has bigger RAM & better GPU compared to M3 Pro.


"The industry’s first 3-nanometer chips for a personal computer debut a next-generation GPU architecture and deliver dramatic performance improvements, a faster CPU and Neural Engine, and support for more unified memory"

This reads horribly in my opinion


That's Apple's hyping as usual. Apparently the improvements are not that "dramatic" when you look at numbers, at least not any more than M1->M2.


I'm not interested in the content, it's the readability of the sentence


Finally time to upgrade my 2017 i7. For over 1 year the battery meter has said "Service Recommended" and in the last 2-3 months multiple key caps are literally falling off and regularly get jammed while I try to type.

I figure the 16" with M3 Max and 64 GB is a good machine that will last me 5-6 years but I'm not sure if I should pony up for the 2TB storage upgrade.

I just pray this revision won't have some unexpected problems. I'm actually glad it seems to only be a spec bump so I won't be taking on any new tech. Hopefully any issues have been shaken out from the M1 and M2 releases.


It's harder than ever to be excited to spend money on an upgrade, despite the overall machine being better than ever. RAM and SSD upgrades are comically expensive in Canada, the former maybe being slightly more justifiable. It's $500 for the RAM upgrade and $750 to upgrade from 512GB to 2TB, so really I'd be paying something like $1000 on just basic storage. It's kind of insulting.


Yes, totally agree, which is why I'm waffling over the upgrade. From 1TB to 2TB is $500 CAD extra. It is just so insanely out of line with the actual cost of the memory. I'm looking at NVMe SSDs on Amazon and a decent-looking 2TB drive is in the range of $150 CAD! I don't care how high quality whatever it is they are using is ... that is ridiculous markup. And since everything is soldered nowadays it isn't practical to buy and install one yourself.

My current HD is 500 GB and I have ~140 GB free. But I've also turned on the iCloud offloading feature, stored larger media files like videos on an external SSD, and a few times gone through folders removing files to clean up space.

I suppose if I want to play around with ML models then I should just bite the bullet and get more space. Fair play to Apple knowing how to squeeze every last cent out of their wannabe pro users.


I priced out a 14" and chose the minimum options that enable 64gb, and it comes to $4974 before tax with 512gb of storage. 512 to 1TB is that aforementioned $250 extra, or the $750 extra for 2TB, bringing it to nearly six grand before tax. Going from 64gb to 128gb adds an additional $1000.

A lot of that price comes from the required upgrade of the base model cpu to the best cpu.

I'll give them some credit and say that a very mid-tier upgrade from my 16GB Intel MBP 13" to the now-baseline 18GB 14" wouldn't be excruciatingly expensive and would probably be a great machine, but RAM is being eaten up harder and faster than ever. At minimum, as a non-ML (unemployed) software dev, 36GB is the minimum I'd expect to be practical over the next 5 years.


Well, that's annoying. I just bought an M2 Max 16" MBP in mid September.

I wouldn't have expected a new revision in the same year. I also swear apple used to have a policy for this situation, but I can't find it now.


Always check this page before purchasing Apple hw https://buyersguide.macrumors.com/


I mean, I did.

https://web.archive.org/web/20230912190332/https://buyersgui...

Neutral looked pretty good to me.


Maximizing margins for the M3 Pro which makes up a large majority of MBP sales. The transistor count reduction adds up over thousands of 3nm wafers. They are probably getting twice as many M3 Pro chips per wafer over M2 Pro. Nudging power users to the M3 Max doesn't hurt either.

Personally, I don't mind trading 2 P-cores for 2 E-cores but the memory nerf annoys me, and I suspect the cache has been nerfed as well.


I mind it to the point where I cancelled my order (upgrading from an M1 Pro). I'd do the mid-range Max but I imagine the battery life would suffer too much on the 14".

It's been pointed out that the marketing for the M3 Pro is the same as it was for the M2 Pro earlier this year - up to 20% faster compute than the M1 Pro. As is the wireless web battery life.

Sure this will be variable based on the workload, but on average, this is a barely there upgrade.

The M4 will likely flip things back due to the yield issue, and I suspect we'll see compute improve by 40%+ there. I also suspect we'll see that upgrade sooner rather than later (next summer?) as Apple likely wants to get off these chips ASAP.


Don't blame you on the battery life; we'll have to see if the tiny M3 Pro brings efficiency benefits without a reduction in real-world performance.

I'm fond of the E-cores, and my main gripe with my M1 Max is the 2 E-cores being constantly pegged and 2 P-cores always active to compensate, which likely means the first P-core cluster can't sleep. It would be fun to see how 6 E-cores would handle daily tasks but I'm not spending $3k to find out. I only paid $2k for a used full-size M1 Max a little over a year ago so I'll likely skip this gen. I get ~6-8 hours of constant use btw.


After seeing the benchmarks I ended up getting a 16" unbinned Max... previously had a 16" M1 Pro as a work machine and think I can be fine dealing with the bulk as my one and only even though I'm regularly on the move.

I don't think the e-cores were an actual engineering decision, but a way to reduce transistor count to improve yields without reducing performance.

Agreed about 2 being too few. I see the 2 on my M1 Pro regularly pegged as well, but I imagine a 1/4 ratio to be ideal.


I have a 16" M1 Pro for work and it's heavy. Wish I had requested the 14" in hindsight.


I’ve been wanting to upgrade my 2018 Air but I’m not sure if I’m ready to take the plunge on this new architecture and what I may not be able to do with it… the Intel Macs keep getting cheaper, and more tempting… has anyone found they couldn’t do something important or even fun on the new M chips? Clearly you can’t run Windows as a virtual machine, which I may seriously want to do for Visual Studio if I get a much more capable Mac.


Shouldn't the M1 bar here be at the middle? https://www.apple.com/newsroom/images/2023/10/Apple-unveils-...


No, that would be 100% if it was


That looks impressive, but why do they show animations at such a low fps as a demonstration of performance?

Surely they should play it at 60 or more fps...


Not sure exactly, but the whole thing was "shot on iPhone" so maybe related


The M3 should not be in the MacBook Pro line. It only supports one external display; should this laptop really be called 'pro'?


What does the number of external displays have to do with the laptop being "pro", whatever that means? My Air drives a single 43" screen at home, I literally can't fit more screens on my desk. At the office anyone can grab a screen (they're nice, 4K and around 27-30" I think), some people use them, some don't (e.g. people with 16" MacBooks).


Apple should develop Framework motherboards. Also, their ridiculous upselling practices are in full swing again.


I really hope Apple doesn’t do a yearly cadence with new chips. From all the comments I’m reading here there’s a lot of confusion and misunderstanding about why the new chips seem to underperform or have lackluster performance. Maybe Apple should just do every other year for chip upgrades.


Wonder if the unified memory (up to 128GB) makes it much better for machine learning / AI things? On a Windows laptop you get at most an Nvidia 4090 with 18GB of RAM for the GPU.

Am I missing something? I wish I had more insight into Apple architecture support from the AI frameworks out there.
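
For what it's worth, on the framework side PyTorch exposes the GPU through its MPS (Metal Performance Shaders) backend, and whatever it allocates comes out of the same unified memory pool the CPU uses. A minimal sketch to check and use it (assuming a reasonably recent PyTorch install):

    import torch

    # MPS is PyTorch's Metal GPU backend on Apple Silicon.
    if torch.backends.mps.is_available():
        device = torch.device("mps")
    else:
        device = torch.device("cpu")

    # Tensors on "mps" live in unified memory, so the GPU can address far
    # more than a typical discrete laptop card's dedicated VRAM.
    x = torch.randn(4096, 4096, device=device)
    y = x @ x.t()
    print(device, y.shape)

Whether that translates into good training/inference speed is a separate question, but memory capacity is clearly where these chips have the edge over laptop GPUs.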


> Graphics-intensive games like Myst have incredibly realistic lighting, shadows, and reflections, thanks to the next-generation GPU of M3.

I thought this was funny considering the original Myst can run on a modern toaster. I'm guessing they're referring to a new Myst re-release?


Yeah, it's pretty pathetic they keep bringing up Myst. I definitely highlighted it as a laughing point when summarizing it to others.


My favorite part was when Tim Apple teased the new M3 parmesan cheese grater.

https://i.postimg.cc/TYpx7SVf/Screen-Shot-2023-10-30-at-8-48...


On the power consumption vs performance ratios, every chart showed the M3 cranking at higher overall power usage than the M1 when under load.

One of my favorite things about my M1 is the battery life. I wonder how the actual battery life will compare for the different market segments.


You can always turn on low power mode. I do this for extra long days.


Seems like Apple's software is lagging far behind their hardware. To this day I can't reliably use FaceTime for a call. Often it's ringing on my Mac but the other user gets nothing. Maybe take some time to fix your software as well, Apple?


I'm waiting for the reviews before making a decision -- unfortunately I have a corporate M1 MBP and can't use a personal laptop for work.

As usual Apple will charge an arm and a leg for the 32GB/1TB config -- and I don't really need the graphics.


I want to see low level instructions baked in to allow reliable, performant and efficient real-time emulation of win/x86 software, especially games. As in, I don't care about what chip is inside, I want my game library to just work.


Just wait for the M3 Ultra, which in past generations is literally just two Maxes. Imagine 256 GB of unified RAM (oh, and I guess the 32-core CPU and 80-core GPU, though those aren't as impressive). I'd really love to know how much that is going to cost.


The Ultras don't make a lot of sense for most professionals when, at that price point, a Threadripper and a 4090 would still eat it for lunch.


While that may be true, it's not really an apples-to-apples comparison.

The M2 Mac Studio right now at $2400 for the 64GB model is cheaper than almost anything new if you want a 64GB unified pool of GPU memory, other than the $2000 Nvidia 64GB Jetson AGX Orin, which is much slower.

A threadripper based system is physically freaking large, and loud, and incredibly vram constrained. It of course is going to absolutely destroy the Mac Studio in most situations because it's better than the Mac Studio in most metrics, but honestly I'm very excited by the prospect of future Macs with gobs of memory, although I'm concerned at how Apple isn't generationally improving memory bandwidth.


I have a ~7 year old 27" iMac. It wasn't that expensive at the time and the screen is still way better than a lot of what is on the market today. Now, it is only available in a 24" version.

Is time moving backward?


I guess the high-end iMacs and iMac Pros (remember the iMac Pro? 10 core Xeons upgradable to 512 GB of RAM!) just never sold very well, so they canned the idea of selling iMacs to pro users and instead chose to focus on the lower-end market that was previously served by the 21.5".

So it seems to me like the iMac has very much been repositioned as a stylish, basic desktop computer for the home and general office tasks now. No M3 Pro/Max, no bigger monitors, limited I/O, etc. I personally don't really like it because I have labs full of decked-out 27" iMacs that are now kind of left without a clear upgrade path, but I understand what they're doing.


Have you seen the new iMac in person? The performance leap from that is insane. The 4.5K Retina screen also fits as much if not more than your 27-inch.


No, a 4.5K display doesn’t fit more than the 5K display on the 27” iMac. The performance boost is nice but the display is still better.

It looks like the market shifts mean they’re assuming people who want bigger displays will buy a separate 5-8K display and connect it to a Mac Mini, which is probably correct. That used to cost more when the third-party market was so limited but that’s improved considerably.


I don't understand why they didn't just keep the 27" screen. It isn't like 27" is considered large by today's standards.


How did Apple become so successful making high-end CPU chips on their own so quickly?


Bring in people with expertise, buy outside expertise where you can, focus on the product you actually want instead of the product that will appeal to every possible use case for every possible customer that the chip marketing team can imagine, don't sweat where you can put the margin line between chip and system integrator, because they are both you.

I suspect any $2,600,000,000,000 company could pull it off, if they moved first.


Apple acquired PA Semi in 2008 then Intrinsity in 2010. They have been using their own ARM cores since 2012 with the A6 SoC

https://en.wikipedia.org/wiki/P.A._Semi

https://en.wikipedia.org/wiki/Intrinsity

https://en.wikipedia.org/wiki/Apple_A6


They've been designing the A-series chips in iPhone and iPad since the A4 in 2010.


A4 was still using ARM cores from Samsung. A6 was the first SoC with Apple cores.


20 years ago, Palo Alto Semiconductor was founded, and 15 years ago, Apple bought them in order to start building their own CPUs.


The iPhone chip was seriously competitive with laptops 2-3 years before the M1, it's just that due to the software environment few people noticed what was going on.

The real question is why Apple left it so long. They clearly wanted the first generation to be a clear success, but they could have probably pulled this off faster had they wanted to.


It wasn’t that quickly if you consider the years of iPhone chips leading up to it.


They've been doing this for years. It was just mobile only at first.


So quickly 10 years in the making.


Not quickly. People have been predicting this since the PA Semi purchase.


They first spent 10 years making their own mobile chips.


Probably billions of dollars in R&D and extremely clear executive guidance?


What is "quickly" for you?

They don't "make" them, they design it.


I don't think we're comparing apples to apples here (excuse the pun). These chips have something like 10% of the instructions that a typical x86 chip does. Once the big CPU players start producing the same kind of chips, I fully expect Apple's power-to-performance advantage to drop significantly, if not be overtaken by the likes of AMD, etc.


> I greatly expect Apple's power to performance advantage to drop significantly, if not be overtaken by the likes of AMD, etc

Nope. Apple and AMD both use TSMC for manufacturing. It's all made by the same factory. AMD does not have the advantage there. Apple buys the most capacity on the most advanced process nodes since they place much bigger orders (Apple also has 10x more cash than AMD).


Is it me, or does the "space black" not look that dark? I have a Macbook Air in "midnight black" or whatever it's called, and it seems darker than this hue, at least from watching the video.


Black MacBook

Some folks are going to lose their marbles that Apple released a black MacBook Pro.


> The CPU performance cores and efficiency cores are 30 percent and 50 percent faster than those in M1

...or 15% and 30% faster than M2 according to their graph further down, so it's basically another M2-level upgrade.


Why do some of the chips (m3 pro only?) have 18g and 36g models, vs more traditional 16g and 32g? Can't find any explanation for it, just a lot of reposting of the specs in various sites.


See this thread https://news.ycombinator.com/item?id=38078282 seems to be based on available LPDDR5X chip sizes?


Also curious about this.


> Graphics-intensive games like Myst have incredibly realistic lighting, shadows, and reflections, thanks to the next-generation GPU of M3.

I'm just out of words for this.


I think it's quite clever actually, because although it's sort of a laughable technical feat compared to other options for gaming, many of the people who both have the cash to spend on the higher-end mac AND who don't pursue any more modern games on other platforms probably would have been original Myst players


Selling a tick for a tock.


> "Games like Myst have incredibly realistic lighting, shadows, and reflections."

So we've finally come full circle. Next step, bring the old CEO back in an advising capacity.


Just as Qualcomm shows its chips are getting better, Apple pounces! The layout looks very different from the M2; it must have been delayed. Does anyone know if it's using HBM3?


Nah, it’s LPDDR5X


All the videos on that page are so slow and choppy on my 2019 MacBook Pro 16" in Safari.

If not intentional, their marketing should do it every time they want customers to upgrade.


Stockholm Syndrome?


The new M3 base model, along with only supporting one external display also loses a Thunderbolt port. A new shell / chassis just to remove a port. Courage!


The M3 base model replaces a computer with two thunderbolt ports. It now has MagSafe, 2 thunderbolt ports, HDMI, and SDXC.

How is that a downgrade?


Other than the fact that I can’t upgrade the OS and increasingly vendors only support M-series chips there is nothing wrong with my Intel MacBook Pro.


If you are not running latest, your system might not be fully patched: https://arstechnica.com/gadgets/2022/10/apple-clarifies-secu...

So there are downsides to not upgrading.


Oh I know - my complaint is that the hardware is good to go, but I'm forced to buy new hardware. The obsolescence is built-in.


Finally, a meaningful amount of RAM in the lowest option (which is the only one most shops usually have in stock) for the Pro - 36GB.


What an underwhelming event especially since the M3 is worse than the M2 in some aspects.

I was hoping for some Mac Pro news or an iMac Pro or other such things.


Not buying an 8GB laptop in 2024, and I have zero intentions of paying the 300 euros markup to double it.

Those macbooks can stay on the shelf for me.


tl;dr: Much better than the M1; we don't want to compare it to the M2 because that highlights it's only an incremental upgrade.


Letting Apple lead things in the market has gotten us to a very ugly space in personal computing. These are prohibitively expensive devices; for 95% of the use cases, a Chromebook would be better suited.

All those intel machines that were so impressive in 2019 are running like crap now. All going according to plan, I'm sure.

I manage a fleet of about 200 of these things. They are easy to manage, and oh-so-nice to look at, but boy, they are a huge waste of money.

These things should only be used by engineers and creatives. Yawn.


Right, that is what the "Pro" moniker in Macbook Pro means


Somehow Apple never has supply issues, meanwhile the good half of Lenovo’s ryzen 7040 series is not available in North America.


Which M3 configuration is the best price/performance option for an average developer? Is there an obvious best choice?


Literally every single Apple Silicon MacBook with 16 GB of RAM can easily run your IDE, one or two VMs and/or some containers, your messaging app of choice and your browser. That's more than enough for most dev jobs out there. If your job actually requires more RAM or CPU grunt, you'll probably know. SoC, RAM and storage upgrades are kind of expensive from Apple, so the base models are always the most cost effective.


Not that I can see. I spec’d this:

    SoC: 11-core CPU, 14-core GPU, 16-core NE
    Memory: 18 GB
    Disk Space: 1 TB SSD
    Power: 70 Watt
    Price: $2199

I personally would go for the 36GB Ram on this, bump the processor to the 12/18 core set and go for 2TB drive but the price skyrockets with these additions.

FWIW, MBA M2 fully loaded is a very sound laptop.


For an average dev, I'd honestly skip the M3 and go buy a 15" MBA with 16GB RAM.


"Games like Myst have incredibly realistic lighting, shadows, and reflections"

Am I missing something here? Is it 1994?


There's an updated remake of Myst with a proper 3d engine and free movement etc. Even so that remake is a few years old now. It's a really odd choice when talking about cutting edge graphics.

Maybe Myst was the first game to get an update to adopt the new APIs?


Would be fun to see pgbench for M3 Max mbp


In the same way that Wikipedians rush to update the pages of recently dead celebrities, I'm surprised there isn't a similar rush to update pages like https://en.wikipedia.org/wiki/Apple_silicon around events like this.


We do. I was just at my day job. If someone else doesn't get to it this evening, I'll add something.


Will Maccy be as fast as dmenu on my old Thinkpad is what I need to know as a M1 Pro user


Can i just have fucking 3-4 monitors support on the bottom line pro model. ffs apple


So what about that Cinema4D rendering performance that showed 2x vs the M2 Max?

Was that GPU or CPU?


I suspect most of the side by side comparisons, including this one, were GPU.


I hear the complaints about incremental updates and not enough reason to upgrade. My opinion is that it doesn't need any updates as it's nearly perfect the way it is. What it needs is a 3 year release cycle instead of yearly. Less e-waste and more visible spec change.


No, people just need to learn self control. You don’t need a new laptop every year or every 2 years.

My current MBP is nearly 8 years old, cost $4,500, and is finally in need of replacement. I just purchased a $4,300 M3. Why should I have to purchase a 3-year-old M1?


You wouldn't. You would purchase a 0 year old M3.


The M3 Max looks like its top-tier GPU is a 40-series rival; am I missing something?


It's not a 40 series rival without CUDA. Probably great at inference though.


driver and developer support


Why do you say that? I'm sure its good but why do you specifically call out the 40s?


Also, how's pytorch?


I wonder if by the M10 we'll have ChatGPT-like AI built into the chip.


And still no 27 inch iMacs. Strange decision, that’s a great size.


It seems that the 13” macbook pro (m1, non air) is gone for good?


And with it the touchbar I guess?


Good riddance. No matter how much they pushed it was an utter failure that should have never left their research lab.


Am I the only one who is frustrated with the Mac monitor options these days? I don't know anyone who uses a 27" desktop monitor, much less a 24". What year is it? And $5k for a 32" is insulting.


The “correct” retina resolutions are 4k at 24", 5k at 27" and 6k at 32".

Apple won't make a display that doesn't align to that, since macOS doesn't handle anything else perfectly.


the new iMac is 4.5k at 24"


What size monitor do you believe to be most common? Anecdotally it seems the 27" monitor is finally overtaking the 24" monitors, but those seem to still be standard at more corporate companies.

I wish Apple had succeeded in getting 27" monitors to be 5k resolution across the board -- but that ship seems to have long since sailed.


In my opinion, the biggest problem with Apple’s external displays is their 60 Hz refresh rate. That’s half of what their own iPhone (!) and MacBook pro models support, and is a far cry from the 240 Hz (albeit at lower resolutions) displays that are starting to pop up from other manufacturers.


You can use non-Apple monitors with a Mac.

Apple's monitors are only worth getting if you're incredibly picky about very specific things.


So among all the Mac now, what's the best buy?


But it's not a PC, it's a mac.


What should I buy? Honestly it’s all confusing as hell. 8gb ram seems like too little but increasing it is $$$$.


You mean a Mac :P


A Mac can also be called a personal computer if i'm not mistaken


In the 2000s, Apple ran a big campaign with the phrase "I'm a Mac - I'm a PC" to highlight the difference.


Yeah but then it's not the "most advanced chip" anymore.


character limit :P


Something tells me it will get shortened to "Apple unveils M3" soon


Can they drive two monitors this time?


The 14" and 16" Macbook Pros have always been able to drive 2+ displays. The new ones updated today continue this feature. Hope this helps.


Not at all. Versions of the MacBook Pro with the M1 and M2 base chip can only drive one external monitor.

And a $600 upgrade for the privilege of having a second external monitor is one hell of a step.


You're talking about the 13" model (now retired). As the previous poster said, the models that have been upgraded here have always been able to drive two displays.


The new base Macbook Pro M3 cannot drive two external displays. If you get all nit-picky you can say it drives two displays, as long as one of them is the internal display.


Ah, I see. I think this corresponds to the price drop, right? You can now get a 14" M3 MacBook Pro without a 'Pro' chip, whereas previously that price point was occupied by the 13" model. So essentially, for a given price point, your external display options have not changed much.


There is no "now", this was the case for the M1 and M2 MacBook Pro with the base tier of CPU.


But that was the 13 inch model, no? I think all the 14 inch models had 'Pro' or 'Max' M1s and M2s. Now that the 13 inch model has gone, there are some cheaper 14" models that also have the base chip, and hence can only drive one external display.


Base 14" M3 MBP comes with 8 GB RAM at $1599 USD. That's pretty stingy.


That's what happens when there's no competition. Come up with an equivalent product with more RAM at the same price and I'll buy it.

Sadly (or perhaps gladly, depending on your perspective), there just is no equivalent to a macbook for many people.


Yup, it seems that the "Apple tax" has gone up quite a bit since Apple shifted toward soldered RAM and storage for their products. You get excellent Apple Silicon chips with their performance and energy efficiency, as well as the most polished desktop OS (macOS), but it comes at a steep price, especially at non-base configurations. There seems to be plenty of deals for base-config Macs at places like Amazon and Costco, but if you want 24GB or 32GB of RAM, it seems that no deals can be found; you must pay full price for a custom configuration from Apple.

I was a long-time Mac user. I remember when my 2006 MacBook cost less than comparable PC laptops. My last Mac purchase was a refurbished 2013 Mac Pro I purchased from Apple in 2017; since 2021 I've switched to PCs. While I love the configuration choices and expandability of PCs, the operating system situation leaves much to be desired, in my opinion. The shenanigans Microsoft continues to pull with Windows as well as the Sisyphean development cycle of the Linux desktop ecosystem (systemd, Wayland, etc.) makes me peek at Apple whenever there's an announcement, but the steep prices for a Mac configuration that fits my needs are too much for me, and so I remain a PC user.


Definitely the main thing preventing me from moving on from Mac is how bad Windows and Linux is.

Around 6 months ago I actually bought a beefy desktop PC to play around with, dual booting both Windows and Linux, and they were just terrible. The default keyboard shortcuts on Windows are a joke (not to mention the lack of an additional modifier key, which is present on the Mac, which makes shortcuts more difficult than they need to be).

Linux, on the other hand, was so limited. People then argue that Linux is customizable, but I spent forever trying to customise it and half the time the modifications literally don't work at all. I assume because there are so many distros and the ecosystem is so fragmented that something is almost guaranteed to break.


What is superior about Mac shortcuts? They seem to be just as inconvenient and to require a rebind on both systems (though given how relatively easy it is to rebind the basics, it's strange that it's a barrier to switching).


Having an additional dedicated modifier key is night and day in terms of creating shortcuts.

I don't see how they're equivalent. F2 is vastly inferior to pressing the Enter key for renaming a file. Alt + F4 to close a window as opposed to Cmd + W.

Can you name shortcuts that are more convenient on Windows than on Mac?


> Can you name shortcuts that are more convenient on Windows than on Mac?

That wasn't my argument

> They seem to be just as inconvenient and requiring a rebind on both systems

But they exist: Screenshot with PrtScn (or even Windows key + Shift + S) is better than the 3/4 Mac ones

> F2 is vastly inferior to pressing the enter key to renaming a file.

It's actually close to objectively better (though not a good one):

- Enter is a side-ways-stretch pinky key, so one of the worst keys out there (Backspace is worse since it's diagonal stretch)

- F2 is a middle-finger vertical key, so more convenient

Besides, this blocks opening files with Enter in the dumb Mac file manager, so add that to the downsides of superior Mac shortcuts


Fair enough. I guess ultimately it comes down to what you're used to and what you prefer.

I think more than anything, it's difficult to discredit something that you don't actively use and aren't completely familiar with. Like I'm just saying Windows is shit because it's not Mac, but that's disingenuous because I don't actively use Windows as my main OS. But hey, if you like Windows, use Windows. If you like Mac, use Mac.


I am sure if you look closely at Windows laptops you can find plenty of competition. Of course the screen and the speakers are not as good and battery life is shorter, but if you can look past that (and I find most people outside HN don't really care) you can go pretty far with a premium Windows laptop (XPS/ThinkPad/HP EliteBook).


And yet _still_ 8GB of RAM as the baseline. It's probably as profitable as it is evil / misleading.


8gb is pretty bad. I have an M2 Pro with 16gb, & that limit is regularly pushed, & that without regularly pushing things hard, resource-wise (then again, I've seen plenty of people on both PCs & Macs shut down whatever applications they weren't actively using when they switched between them. If testing shows that to be really common, I could see 8gb being plenty for the majority of users. Crazy user patterns, to be sure, but to each their own). If I get an M3, it will have 36 (or whatever Minis will have available) just for the headroom.


Calling it evil is a bit of a stretch.


I'll admit "evil" is a stretch but I was thinking in terms of the amount of e-waste this causes due to insufficient RAM hurting the longevity of a $~2k+ computer

e.g. https://www.globaldata.com/data-insights/technology-media-an...


Is it? It's pretty evil to overcharge and prey on a customer base who doesn't know any better, but would benefit from a more usable starting point, oh say 16...

Tim's a finance guy, they are kinda evil by default.


America is truly a blessed nation. Their richest evil people simply make the most powerful consumer appliances available for a reasonable price. Their good people must be truly saints.


Who? As far as I can tell every American is a psychopathic capitalist murderer ready to drive their SUV over the next sand dune/forest/oil field/grandma's house, you name it. Bought and sold in 1775, boy.


Yeah, Apple customers tend to also just "trust Apple" and get whatever the recommended product is for their price point. Half the reason I find this troubling is e-waste; the other half is I'm sick of having to try to explain RAM to my MIL and ask how many tabs she keeps open when she asks which computer she should buy. If they had a better baseline memory (16GB) or a simple inexpensive upgrade that's the second option in the lineup instead of the higher storage cost, it'd be a simple, one-sentence "get that one".

Even having to upgrade the RAM is a hassle when price comparing etc. They know this and it's - by design - misleading to the type of affluent less-tech-friendly users who prefer Apple products.


No one should have to care how many tabs they have open, and since browsers kill/suspend background tabs these days, they probably don't have to care about it either.


Except all of the browsers are shit at it. I know someone with so many tabs open it just shows a :)


It's a matter of perspective of course. I think if you claim that this makes Apple evil then you would also need to argue that most of capitalism is evil. Aren't the vast majority corporations just trying to make more money? I would disagree with that.


> I think if you claim that this makes Apple evil then you would also need to argue that most of capitalism is evil.

I mean, surely a mixed bag, and I still participate in it regardless, but pretty much, yeah.

https://www.reuters.com/article/us-usa-mining-children-trfn/...

https://www.culawreview.org/journal/child-labor-and-the-huma...


Exactly :)

Im just some punk trying to have a good time with some spicy comments on HN. It's ok to both hate and exist within capitalism. It requires a certain level of pain (masochism) to survive that existence, or dare to enjoy it, while also raging against it.


Hyperbole like this is necessary for people to justify the extreme negative emotions they feel about a computer they’ll never buy having specifications that they disapprove of.


apple silicon doesn't need ram /half the people on HN


Hey don't be stingy yourself, Tim's people said "it's the perfect MacBook for students and aspiring musicians.."!

Seriously, you can't make up this stuff..


What do you mean, have you set foot on a college campus this decade? It's a sea of macbooks. A $1500 sticker price isn't stopping people.


I proudly use an M1 Mac Mini but the fact that they're so full of themselves constantly makes me laugh(?)


What's even worse are the upgrade prices: $200 to go to 16GB and $400 to go to 24GB. Anything more than that requires upgrading to the M3 Pro model, which has a base price of $1999 and a base 18GB of RAM. Upgrading to 36GB costs an additional $400.

I waited for today's announcement because I'm in the market for a laptop with 32GB of RAM. Today's announcement looked really tempting! However, $2399 + AppleCare + sales tax for the 36GB M3 Pro MacBook Pro is too much for me, and so it looks like it's either a refurbished M2, a Framework laptop, or a ThinkPad for me.


Here in the EU, the price I get in the local Apple Store is 2029€ = $2153.


Is that with or without VAT? The US always gives prices without VAT, since different states have different rates.


With VAT, I think most EU countries mandate advertised prices to include VAT.

Still, US sales tax is low. EU is >= ~20% almost everywhere.


Sales tax is usually ~7% or more, so $2,029 * 1.07 = $2,171.


You will need to convince your politicians to lower import duties and taxes.

And that $1599 USD is without taxes, so not the whole US gets this price. It was always worth it for me to travel to Oregon for zero sales tax.


Here are some example US->EU markups for other laptops in the same (EU) price range, sold directly by their manufacturers to Ireland (list price to list price, so tax excluded in the US, tax included in the EU):

- Macbook Pro (M3, 8GB RAM, 512 GB Storage): $1549 -> €2049 ($2179) = 40% Markup

- Dell XPS 15 (i7-13700H, RTX 4050, 16GB RAM, 512GB Storage): $1865 -> €1989 ($2110) = 13% Markup

- Framework 16 (Prebuilt Performance Pro - Ryzen 7840HS, 16GB RAM, 512GB Storage): $1739 -> €1907 ($2023) = 16% markup

Apple's EU markup is an outlier here.
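For anyone who wants to sanity-check those numbers, here's a minimal Swift sketch of the markup math; the EUR→USD rate of 1.06 is an assumption back-derived from the dollar figures above, not an official rate.

    // US list prices exclude tax; EU (Ireland) list prices include VAT.
    // usdPerEur = 1.06 is assumed, roughly matching the dollar figures above.
    let usdPerEur = 1.06

    struct Listing {
        let name: String
        let usListUSD: Double
        let euListEUR: Double
    }

    let laptops = [
        Listing(name: "MacBook Pro (M3, 8GB/512GB)", usListUSD: 1549, euListEUR: 2049),
        Listing(name: "Dell XPS 15 (i7-13700H)",     usListUSD: 1865, euListEUR: 1989),
        Listing(name: "Framework 16 (Ryzen 7840HS)", usListUSD: 1739, euListEUR: 1907),
    ]

    for l in laptops {
        let euInUSD = l.euListEUR * usdPerEur
        let markup  = (euInUSD / l.usListUSD - 1) * 100
        print("\(l.name): ~\(Int(markup.rounded()))% markup")  // ~40%, ~13%, ~16%
    }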


Even without the taxes, it costs ~$200 more in the EU. Admittedly, the mandatory 2-year warranty might play a role.


This, plus the risk of the EUR/USD exchange rate moving the wrong way over the next year.


It usually comes with a 2-year warranty, which adds to the cost.


I think you mean, convince politicians to allow for the ridiculous tax-excluded display pricing that’s prolific in the US.


Why is it ridiculous to allow for tax-excluded display of pricing? You couldn't run a TV or newspaper ad with tax-included pricing in the US since there are so many different taxing jurisdictions here (more than 13,000, as of early 2023).

Perhaps what's ridiculous is that we have so many different taxing jurisdictions.


Well, the average VAT in Europe is about twice as high as the average sales tax in the US, if not more.

So I'd rather have to do some basic math but save 10%. (Of course, different tax rates only partially explain the price difference; it's still significantly more expensive in Europe even without VAT.)


> ridiculous tax-excluded display pricing

Why is that ridiculous? I don't pay sales tax, what else would you have them advertise?


I think this is brilliant: let the people know how much of the price actually goes to the company, and how much to the government. Helps people make an informed decision on who's to blame for high prices of everything.


US/Canada prices are shown without VAT/sales taxes.


Especially when a 13" M2 MBA with 16GB RAM and 512GB SSD comes in at $1499.


It also supports exactly 1 external display.


Literally the only reason I can't use my work MacBook for any work that isn't just testing Safari support. I was even ready to tolerate the terrible RAM quantity given it's so friggin' insanely fast at JavaScript.


:(

I think the base MBP from 2015 had 8GB.


Given how much money Apple has, this is simple profiteering. It's something governments should look at and come down hard on Apple for, given this behaviour is inflationary.


I agree, and I think it's going to happen eventually; at least something is. It already looks like customers have gotten a bit wiser and stopped buying the most egregiously priced models. They are running their computer division into the ground with this outrageous financial-mastermind strategy. They put up a good marketing front for their proprietary technology, but there's no serious advantage to their stuff anymore. Most prosumer, performance-focused customers are switching or thinking about it.

You just need patience; enjoy the fall. In a few years nobody will care about developing for their platform; it will be mostly empty shells, but at least they're pretty, I guess. Patience, my friend, karma will do its thing eventually...


It's actually enough for light work, but if you want more RAM you can get it, just pony up.


Am I the only one shaking my head at the way they kept comparing the M3 chips against the M1, skipping the M2?

Just like the last product release, which spent more time huffing their own farts (the Apple Watch being their first carbon-neutral product), there's no real innovation or anything here to garner my interest.


I think it makes sense, in that the people who bought an M2 are unlikely to be in the market for an M3. The people looking to buy an M3 are likely upgrading from either an M1 or an older Intel-based MBP.


A few years ago, I tried to figure out the battery life of the latest iPhone. The comparison page would only say "2 more hours than iPhone X", "4 more hours than iPhone 8", etc. Eventually I worked my way backwards to the iPhone 6 where the battery life was just listed as "—".


https://www.apple.com/iphone-15/specs/

Scroll down to "Power and Battery", about halfway down the page.


There's the obvious, that they want to say higher numbers.

But also, I bought a 16" MacBook Pro M1 Max when it was announced. The M2 Max replacement was announced only 15 months later and didn't seem like a big enough improvement to justify buying. Maybe they know there are a lot of M1 owners with my mindset, and we're who they're pitching.


Not real innovation (compared with the Windows world), but having support for hardware raytracing now is definitely nice for games.

However, on the information page they then show Myst. It doesn't look spectacular, to be honest, and probably doesn't make use of hardware raytracing anyway. A short video of a fast-paced shooter with hardware ray-tracing support would be more impressive.

I guess it will take some time until games adopt the new Metal graphics APIs for hardware raytracing.
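If you want to poke at this yourself, Metal already exposes a support flag; here's a minimal Swift sketch, assuming macOS 11 or later where MTLDevice.supportsRaytracing is available. Note the flag reports API-level ray-tracing support (it's true on M1/M2 as well, via GPU compute), not whether the GPU has the dedicated ray-tracing hardware that M3 introduces.

    import Metal

    // Query the default GPU and ask whether Metal's ray-tracing API is supported.
    // supportsRaytracing covers acceleration structures / intersection queries;
    // it does not by itself tell you the GPU has dedicated RT cores.
    if let device = MTLCreateSystemDefaultDevice() {
        print("GPU: \(device.name)")
        print("Metal ray tracing supported: \(device.supportsRaytracing)")
    } else {
        print("No Metal device found")
    }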


Myst supports ray-traced reflections. I agree that it's not a particularly good show piece though.


Way more people bought into the M1 than the M2, so they're a bigger customer base to sell to. But didn't I see something about the M3 being 2x faster than the M2? I'm sure that's some BS about it being 2x faster in one specific benchmark, but it's surely still a good generational improvement.

Edit: I love seeing what HN will downvote. Absolutely hilarious.



The focus seemed to be particularly on people still on Intel MacBooks and selling them on upgrading. I don't think they would usually reference a product that outdated at this point, but they brought it up several times, highlighting "11x faster".


The mainstream marketing for the iPhone 15 only highlights that it's made from titanium... I'm sure there is some innovation in the manufacturing, and I do love my 15, but the pitch is devoid of any innovation.


The M2 was a tiny upgrade over the M1 since the 5nm-to-4nm step was tiny; 4nm was basically 5nm. The M3 is built on TSMC's 3nm process, which is a big improvement over both their 5nm and 4nm processes.


I guess they are seeing too many M1s in the wild; you're supposed to upgrade to a new Apple device and create e-waste.


E-waste? Apple devices retain their value really well. If you don’t hand them down, you sell them.


Are they not supposed to make comparisons? I don't understand what alternative you would like.


I assume the question is why there are comparisons to M1 instead of only having M2 comparisons.


I understood they wanted a comparison to the M2, the most recent version. Most people care about "what changed since the last version"; it's disingenuous to say "X times faster!" than a model outdated by a year.


They compare to the M1 because that's what their target audience is upgrading from, not the M2.


No one does any heavy graphics work on a Mac anymore, especially for their showcases of rendering... it's all Nvidia-GPU-based for creation and rendered on farms anyway. Same with Resolve.

Apple is so dumb.


Definitely not no one. Nvidia and the likes of Redshift democratised heavy 3D rendering. Here we are now with Apple using Redshift in their promotional pitch. There are plenty of cloud options for ‘heavy graphics’ but for the act of creating with feedback loops it seems to suffice.


If it doesn't run Windows, is it really a PC?


PCs and Windows are not synonymous.

PCs didn't start running Windows until they were nearly 10 years old, and then Windows was just an option. Shortly after Windows became popular, Linux emerged too and some people were running Linux on their PCs instead. There have been many other OSes on PCs over the years.

I've owned and used many PCs, yet hardly ever needed to use Windows. Even when I developed software that ran in Windows, including games, I used Linux to do the development :)


So MacBook Air chads win this round.


I'm looking for advice: I was planning to purchase the 14" M2 Pro MacBook, but the M3 Pro MacBook Pro has just been released. I'm uncertain which one to choose and would appreciate help deciding.


This is cool, where can I buy the chips and read their datasheets?


"unveil" !== "give you datasheets and provide components"

e.g. https://www.space.com/spacex-drone-ship-a-shortfall-of-gravi...


This question is in bad faith, and you know it. You know Apple doesn't sell chips.


I don't think the datasheets are available but you can buy them at https://www.apple.com/shop/buy-mac


I just want the chip, not the computer.


I think the chip is the computer


That's the problem.


What problem are you trying to solve?


The problem of using an Nvidia GPU, maybe.


It's really not, just a made-up one on your side.


It's even better, you can buy the whole machine, with a great monitor, connectors, a keyboard, and so on!


I want to build a cluster and run it in a data center.


Then wait for the M3 Mac Mini and get a fleet of those instead.


And let my sysadmin type their personal Apple ID password on all of them?


It would be ridiculous to not use MDM in this use case.


And I want a free pony, but nobody obliges!


Pretty underwhelming, a bit like the A16 release. They basically upped the frequency in the A16, and that's how they got their single-core performance improvement; not much of an IPC increase to speak of. Looks like it will be the same story here, given the performance numbers Apple gave. The A16 also came with a power consumption increase, so the 3nm process from TSMC is disappointing as well.


Isn't this the typical tick-tock move? Don't change too many things at once: they're moving to a new process, so probably not many changes in the architecture, just an increase in frequency at the same power consumption, enabled by the better process.


Now it's more like tick-tock-tock, with the M1, M2, and M3.


> Apple unveils M3, M3 Pro, and M3 Max, the most advanced chips for a PC

Nice! When can I order these most advanced PC chips for my next build? Do they have any kind of upgrade path for existing AMD/Intel users? I don't mind starting over too! Exciting times.


>When can I order these most advanced PC chips for my next build?

If you have several billion dollars to convince them to sell it to you as a standalone chip, very soon!


I'd be very interested in a standalone ATX board with M3, that's the only way I'm going to buy an Apple desktop.



