Apple Unveils M2 (apple.com)
602 points by yottabyte47 on June 6, 2022 | 749 comments



It's very interesting to think the processor releases were holding back the Mac for so long. They are on the cusp of turning Macs into annual upgrades - with meaningful performance upgrades tied to the OS, just like they've done for a decade with the A series / iOS releases.

This is firing on all cylinders. The organizational structure and performance behind this is a marvel.


> It's very interesting to think the processor releases were holding back the Mac for so long.

Apple also made a huge leap forward with their cooling designs. Making the laptop thicker and investing in proper cooling design made a huge difference.

My M1 laptop is significantly quieter than my old Intel laptop at the same power consumption level. It’s not even close.

Apple’s last generation chassis and cooling solution were relatively terrible, which makes the new M1 feel even more impressive by comparison.


My M1 isn't even full of cat hair like my x86 MBP, because it never needs to run its fan.

The x86 MBP was stuck in a vicious cycle of getting overheated, running the fan all the time at high speed, sucking up enormous amounts of cat hair, and jamming up the motherboard and cooling vents: the hotter it got, the faster the fans ran, the more cat hair it sucked up, the worse the cooling and more snuggly warm the insides got, the more attractive it was for the cat to sleep next to.


That should be the Wikipedia definition of a positive feedback loop


Sounds pretty negative to me!


Not for the cat!


100% agree. The last generation of MacBook Pros with Intel processors was virtually unusable for Zoom calls or any processing-intensive tasks.


This is downvoted, but my Intel MacBook Pro (maxed-out specs) from 2015 really was dying on Zoom / MS Teams calls. Probably says a lot about those two apps, but this was one of the main reasons I upgraded to an M1 Mac during corona.


Same, and same spec machine (2015 15" maxed out everything).

I did manage to make it handle Zoom just fine by opening it up and blowing it out with an air duster, which I've been doing about every 6 months since 2018 or so. The interior is a magnet for dust in a way that my Dell work laptop just isn't (even though that thing is a PoS in many other ways, most of which I suspect are down to Windows rather than the hardware).

After a periodic dusting it runs a whole lot better - no lag or dropped frames - even when the fans come on. It's still a nightmare for fan noise compared with my 2011 17" MBP (again, maxed retail specs, but then replaced internal drive with SSD - which made it a new machine - and upgraded to 16GB RAM because you can replace at least some stuff in machines of this vintage), which then unfortunately died due to the GPU desoldering issue (I've already "fixed" it once by reflowing the solder, but this only held for a couple of weeks before it died again).

I think I'm going to wait for M2 MBPs and then splurge - hopefully get 7 years out of the new machine as well.


I'd forgotten, because I've had an M1 Pro for a while now, but Slack video calls on my last gen, maxed out Intel MBP could heat the thing enough to burn through to the earth's core.


Yeah, same, that's why I resorted to water-cooling :) https://forums.macrumors.com/threads/so-i-water-cooled-my-ma...

It's pretty much silent now, even with the CPU governor pegged to max.


Thank you! I've been looking for a water-cooled laptop solution. I've tried the fan based laptop coolers and they're basically a waste of time (plus they add to fan noise). This is what I wanted. Nobody sells it, but I don't mind a bit of fabbing.


I had to rub a plastic bag of frozen peas on the bottom of my overheating x86 MBP to make an important zoom meeting one time. It made the peas inedible.


The only thing my 2015 pro had trouble with was Teams. No problem with Zoom for me.

But Teams was a nightmare of dropped sound and video in both directions.


I have had luck running Teams in the Safari web app instead of the native (ahem Electron) app. Cuts the CPU usage down to 25% instead of 75%.


Interesting, as an engineer, I wonder what explains that.


My guess is that the Electron app is using a generic video codec that they got working reliably across Windows/Mac/Linux, but Safari is forcing a more efficient one that's well-optimized for Mac only.


Bizarre how my 2012 MBP 15" Retina (16GB RAM, SSD) is still usable and maybe preferable to models 3 or 4 years newer.


It's almost like different people have different expectations from their computer. (Different workloads, different environmental conditions, etc.)


Which os & version are you running?


Catalina (10.15.7)


Yes, mine too. I actually set up a large desk fan pointed at the keyboard to help dissipate the heat. Got so hot I would burn the tips of my fingers if I let them rest on the keyboard.

It was not a well designed laptop.


I got a late-model MBP 16" i9. Zoom, Teams video confs, backups, and anything else even mildly intensive would ramp it up to impressively loud levels, enough for people in the next room to ask what the noise was.


I swear my 16" i9 could eat the battery in an hour of a Zoom call. Even doing just CPU-heavy golang code, I could out-run the stock power brick and drain the battery.


Totally agree. I have an M1 MBP for personal use, and a 16 inch i9 for my work laptop. The M1 is silent and not too bothered by zoom. Last week, in a pinch, I had to use my 16" MBP for a zoom call, and it didn't last much past an hour before it was complaining about needing to be plugged in to power. And it felt like it might melt the desktop. Really drove home how much better the M1 is.


I had a laptop that did that (got people noticing), I went to IT a few times to tell them that something was wrong with my laptop, as colleagues' were a bit quieter.

I wish that I'd have been more firm with getting a replacement. I now have tinnitus after a year of a really loud laptop. I'm in my early 30s. I can't even tolerate my desktop with relatively quiet fans.

One day in the office, I shutdown my laptop for the day after the air conditioners turned off at 5pm. There were about 6 people left in the room, everyone noticed that "something had just turned off".

Be careful that your laptop doesn't damage your hearing.


> I now have tinnitus after a year of a really loud laptop.

Hmmmmm....

Does your laptop sound like:

       - A Dyson vacuum ?
       - A jet engine ?
       - A Harley Davidson exhaust ?
       - The sort of sounds you hear in the datacentre hot aisle ?
No ?

Well, I didn't think so either.

Not a cat's chance in hell your laptop caused hearing damage.

More likely you had the laptop speakers too loud, or you had headphones plugged in which were set too loud.


Yes but what if he did that to drown out his laptop fan?


> Yes but what if he did that to drown out his laptop fan?

Nonsense.

The "rule of thumb" is that you are at risk of hearing damage in an environment where you cannot talk comfortably at normal voice level with somebody at arms length away.

I have owned many laptops in my time (including Macbooks). I have frequently pushed them hard (e.g. compiling code) to the point where the fans run for extended periods of time.

Not once have I had difficulty conversing in a relaxed, normal fashion with somebody at arm's length or even further away in the room. I've never had to raise my voice.

It's simple: if you don't believe me, get a calibrated decibel meter and take a reading in a controlled manner in a controlled environment. Frankly I'm willing to bet you hard money that the measured sound level with the laptop fans running at full whack will not be anywhere near hearing-damage level.


I don't think you understood my question.

You ended your comment with:

> More likely you had the laptop speakers too loud, or you had headphones plugged in which were set too loud.

To which I suggested that maybe he had very loud headphones because he was annoyed by his laptop fan and wanted to drown out the noise. Which, I concede, is probably not the best life decision, but I can imagine someone doing it.


hmm, while I do believe that your laptop is pretty loud, I seriously doubt that it could damage your hearing. Ear damage occurs over 90-100 decibels, which is akin to leaning close to a loud vacuum and far louder than any laptop.


More likely you developed your tinnitus from the stress of your laptop being so loud. There's no way it could physically harm your hearing.


afaik tinnitus is not necessarily damage to the (inner) ear, but could also be psychological or physical damage to the brain (blood flow). Have a doc check that out.


come on now... damage your ear from a laptop fan? can't even damage your ear from the crap speakers laptops come with let alone a fan...


It's interesting that Apple's "premium line" at the time couldn't handle Zoom, whereas my mid-level Lenovo (Intel Core i7-8665U) has no issues with it. Feels like poor design, not an architecture issue.


This was true for me.

pmset -g thermlog was showing the CPU speed limit dropping below 30% with only Zoom running.

I swore I'd never buy a MacBook again. Here I am with a new 14" MBP... I have Zoom, 10 IDE windows, Slack, Spotify, Firefox with tons of tabs, and Docker with ARM images, and the thing barely spins its fans up to inaudible speeds. The only way I can make them relatively loud is by running heavy x86 Docker images.
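For anyone who wants to reproduce that check, a rough Python sketch (it just shells out to pmset; the parsing assumes the "CPU_Speed_Limit = NN" lines in the thermlog output, which may vary by macOS version):

    import re
    import subprocess

    # Ask macOS's power manager for its thermal log (the same data I was eyeballing).
    out = subprocess.run(["pmset", "-g", "thermlog"],
                         capture_output=True, text=True).stdout

    # Pull out every recorded CPU speed limit; 100 means no throttling.
    limits = [int(m) for m in re.findall(r"CPU_Speed_Limit\s*=\s*(\d+)", out)]
    if limits:
        print(f"most recent CPU speed limit: {limits[-1]}% (100% = unthrottled)")
    else:
        print("no throttling events recorded")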


This hasn't been my experience. How many CPU cores we talking here? Haven't really run into Zoom perf problems since running it on a dual-core 2015 i5 MBP.

Since 2019, I have been using Zoom multiple times per day on an early 2019 and a late 2019 Intel-based MacBook. Both are 6-core Intel i7 based. One with 16GB, one with 32GB.

Roughly speaking, Zoom eats a single CPU core or so. This is far from ideal but the machine remains plenty usable. No insane fan noise, etc. No problems doing pairing sessions, etc.

This is not a defense of Apple, Zoom, or Intel for that matter. Not really a fan of the latter two myself. It's just interesting that your experience is "virtually unusable" while mine is "fine" -- I'm curious what the difference is.


I honestly wonder if the parent plugged their MacBook into power on the left? I've found my Meet conferences are unbearable when my MBP is plugged into the wrong side and then overheats and throttles the CPU.


Zoom kills the battery on my (non Mac) laptop as well, why the heck does a simple video conference app have this kind of power draw?


Software video encode/decode is extremely expensive.

It's actually a serious issue for sales these days; we have a compute-expensive product that can't be demo'd effectively over Zoom.


It does depend a lot on the software, and I don't get why Zoom is falling behind here. I'm dealing with this problem a lot on a recent Intel MBP and in my experience:

Hot jet engine: Meet on Firefox, Slack video, Zoom.

Works well: Meet on Chrome, AWS Chime, Pop.

It's partly on the MBP for being the odd one out, but also on the software, since we have counterexamples of good performance.


I do my Zoom calls happily with my 2013 MBP 15". But I haven't upgraded macOS for 3 years, partially because Big Sur dropped support for these MacBooks.

Might it be that more recent versions of macOS are much slower for Zoom? I've always thought they tune macOS for bigger L2 caches and it keeps paging, but you're telling me recent hardware is slower too.


Last generation?

Without too much unkindness, MacBooks have been like this since the 1990s.

Many people I know wouldn't have considered them for one second before the M1.


Companies always solve their own problems first. Apple uses WebEx, which is probably the most piggy of the videoconferencing solutions that I use in terms of cpu and memory.


This is really funny considering how hyped people were about their MacBook Pros during this era.


Funny, I remember we all complained about this issue ceaselessly.


I remember hearing people complain about their keyboards, not that they couldn't video conference.


With COVID lockdowns, a huge population of people suddenly started video conferencing with Zoom/Teams/etc. If pre-COVID you only conferenced occasionally with those apps, or used FaceTime or some other conferencing app that made better use of hardware encode/decode, the keyboard was likely your biggest problem.


Might be it, I definitely wasn't hearing about it from people as much post-COVID.


In just about any HN thread on Apple’s Intel-powered laptops, you’ll find plenty of actual users complaining about excessive fan noise and getting heat-soaked under load - compilation and video calls being a big one - with the problem being greater for the higher-powered i7 and particularly i9 machines. We were definitely complaining.


I can believe in the conspiracy theory that i9 models were released purely to show how bad Intel processors were and to make Apple Silicon look better.


Haha, I probably wouldn’t go that far, but yeah. I had a 15” i9 and they definitely weren’t worth the premium. Doesn’t help that it had what was probably the worst Pro chassis in the past decade or so.


Apple reused the exact same Intel chassis for their previous generation M1 MacBook Air and Pro 13”.


Sans the "contactless" cooling I'm assuming?


Sans the fan actually [0]. Yes, the cooling system got worse for the M1 Air.

The new slightly thicker Pros are great, but their chassis are not a significant factor in how they perform relative to the old Intel models.

The new M2 Air is the thinnest an Air has ever been. I bet it will perform excellently as well.

[0]: https://www.ifixit.com/News/46884/m1-macbook-teardowns-somet...


While the last Intel-based Air and the M1 Air share essentially the same chassis, there is a huge difference in thermal management. In both cases the design is explicitly unable to cool the entire TDP of the SoC in question and intentionally causes thermal throttling (the reasoning behind that has to do with limiting the outside temperature of the whole device), but the overall thermal solution is better in the M1 MBA: it's done by just bolting a bunch of thermal mass to the SoC, whereas there was no space to do that in the x86 MBA, which necessitated the weird design with an actively cooled but intentionally ineffective heatsink.


> Making the laptop thicker and investing in proper cooling design made a huge difference.

Indeed. Turns out there is some utility in function over form. God I hated Ive's bizarre need to shave with the damn thing.


I suspect Apple was being aggressive on cooling designs based on inaccurate TDP information from Intel. Intel just made the power/thermal footprint configurable - so when you got the final silicon, there was a decision between lower performance than promised, or higher TDP than desired.

Multiply this by adding the chipsets, other IO controllers, and discrete onboard GPUs.

I suspect Apple are being overly conservative now, partially because these chip designs are new, partially because no internal team wants to be where fingers point if the design is inadequate, and partially because a SoP design gives them a bit more space/flexibility to do so.


> My M1 laptop is significantly quieter than my old Intel laptop at the same power consumption level. It’s not even close.

Work gave me an M1 MBP and I have yet to hear it make a noise; it's amazing.


Yeah, they didn't even put the fan next to the heatsink on their later Intel laptops. I've seen some speculation that it might have been an attempt to make M1 look better by comparison, since the Intel chips weren't able to hit anywhere near the performance levels the same chip got on better-cooled laptops.

Then again, Apple has a long history of sacrificing cooling performance for form factor.


Yes and no? My m1 MacBook Air doesn't have a fan and performs better than my Intel MBP in many respects.


On the software side things have gotten so much worse though. You have the Swift compiler, which uses heuristics that make it unreliable (a compiler that times out!), and even Apple's most basic software seems to crash constantly. If you have a budget Intel MacBook the compile times are horrendous. Just yesterday the Contacts app on macOS crashed for me.

I think Rosetta + Swift + SwiftUI might have been a bit too much change for the developers at Apple, and the users are now paying the price.

I hope they focus on some principles behind stable software next.


I made the mistake of opening Maps on Monterey a couple versions back. Instant kernel panic. (For fun, just tested in 12.4 and it works again!)

Most of my problems these days are with their bluetooth drivers, though. Anything with a bluetooth connection randomly reconnects throughout the day. Most annoying with the mouse and magic trackpad. (And while I appreciate suggestions, future posters, it still happens after a clean install / nvram clear, so it's unlikely to be configuration related).


> …even Apple's most basic software seems to crash constantly. Just yesterday the Contacts app on macOS crashed for me.

This is also anecdotal with a sample size of one, but I can't recall experiencing Apple app stability issues in recent Monterey releases on an M1 Air. It may be worth digging into crash logs. https://macreports.com/crash-reports-how-to-use-them-to-trou...


"The organizational structure and performance is a marvel for this."

Totally agree, plus Johny Srouji, who seems like a modern-day wizard.


LMAO, yeah, an 8 GB / 256 GB laptop for $1200 is "killing it"; they've definitely been duping consumers from day 1.


Go find an x86 laptop that gets the same performance and battery life in a similar form factor for less than $1200


Agreed. The M1 drove me to become a Mac user/customer. I have been buying $600 used Dell professional laptops for ages... laughing at the stupid Mac people. The performance/efficiency/quality/value of the M1 laptops exceeds any Intel laptop I'm aware of. Match the specs the best you can--the Intel fan is going to howl all day and you'd better have a charger ready--and the Mac is cheaper. Anyway, Windows sucks your dignity out and sends it back to Microsoft in a telemetry stream.


Funny you say that, we were just talking about it:

https://www.scss.tcd.ie/doug.leith/apple_google.pdf

"iOS sends the MAC addresses of nearby devices, e.g. other handsets and the home gateway, to Apple together with their GPS location. Users have no opt out from this and currently there are few, if any, realistic options for preventing this data sharing."

If you think Apple is somehow not abusing their position of power, that's just foolish. Apple is definitely one of the more evil (if not the most) companies out there.

Now they are getting into cars, tracking your speed, your location, how you drive your car. And you are worried about your Windows sending telemetry about what you click on the Start menu, lol? Really, that's your concern?


Amazon is telling their workers to stay in warehouses during tornadoes, Google is firing their AI ethics people for complaining about the ethics of Google's use of AI, Instagram is running a service that their own data shows is damaging to teenagers' mental health, and Facebook is actively participating in the disintegration of western democracy by recommending hate groups to people because it drives engagement, and you're saying Apple is the most evil?

[citation needed]


While these are all legitimate and valid concerns, NONE of those companies have religious, cult-like fans defending every decision the way Apple does. Apple has been caught doing some very BAD anti-consumer/anti-privacy deals and using slave labor, and there's a whole cadre of people (not unlike yourself) defending these decisions as somehow being flawless. Beyond that, these fans bury all these discoveries at every opportunity, and continue hyping Apple and gushing over their products, even undeservedly at times. This will prove to be very damaging in the future, and this is why in my mind it's the most evil company. With FB/Instagram, at least people realize and acknowledge it and don't try to defend/bury it... which leads me to believe at least something will be done about it... eventually.


The only one showing “religious cult-like fan” behaviour here is you. You’re beside yourself all over this thread nay-saying and arguing with people about a damn CPU that no one is going to make you use. You’re working yourself up into a lunatic frenzy.


If you’re looking to avoid telemetry, Mac OS is not the alternative.


I've got these abacuses here for sale... very robust and safe with complete privacy.


You really have to ask? Why? Do you live in an Apple world where Google search doesn't exist because Apple hasn't invented search yet? Also, it's really $1500, because no one wants a laptop in an 8/256 configuration, no one; that config is just there for money-extraction purposes.

HP Spectre Laptop Computer 13.5" WUXGA+ Touch Screen Intel Core i7 16 GB memory;

Count the number of ways it's better and cheaper.

https://www.walmart.com/ip/HP-Spectre-Laptop-Computer-13-5-W...


That machine is barely cheaper, has a much worse display, gets half the battery life, comes pre-loaded with bloatware, and you have to contend with HP's legendarily poor build quality.

This is not a comparable machine. It may win in a drag race, but there is way more to a laptop than FLOPS.

And I don't even use a Mac. I primarily run Windows and Linux. I'm not some brainwashed cultist like you think.

EDIT: And the CPU is still considerably slower than the M1[1], so it doesn't even win the hypothetical drag race.

1: https://www.notebookcheck.net/i7-1195G7-vs-M1-vs-M1-Max_1318...


Anecdata here, but I owned an HP Spectre 360 laptop a few years ago and I would never buy another HP laptop again. It cost almost as much as a Macbook, and within ~2 years of gentle use, mostly around the house, it blew a speaker, the webcam developed a purple tint, the rubber strips on the base peeled off, and when I tried to open the case to check for a second M.2 slot, one of the Torx screws stripped and remains stuck.


Every HP I've used has needed repairs within two years of ownership. My first personal laptop, bought right after high school 15 years ago, had a flaky keyboard connector.

The next HP I ran into was provided by my employer. Within 9 months the battery started bulging and the trackpad stopped working. They replaced the battery, which turned into yet another spicy pillow after another 9 months.

Eventually, my employer provided me a newer HP, one of their $2,700 mobile workstations. This is my current work machine. It has had keyboard connector problems almost identical to those I had 15 years ago. But even without those problems, I'm shocked HP charges so much money for this machine. Despite the metal chassis, it feels flimsy. The 1080p screen has reasonable pixel density (not quite "retina", but serviceable). However, the quality of the panel itself is atrocious. It looks like the cheapest IPS panels you could buy in 2011; horrible color reproduction, awful contrast, atrocious panel uniformity, backlight bleed out the ass. Even the viewing angle is mediocre by IPS standards.

I will never willingly spend money on anything from HP.


M1 Air (especially) has been having screen issues too, including tinting; I've personally witnessed it. I think HP did become really bad around 2012, but they've really picked up their game lately. I've seen some great HP laptops, but you don't need to go with HP. There are even cheaper laptops that are slightly less powerful but still great for what they are capable of.


A few years ago I bought my wife a 15" Spectre x360 as a present (and then upgraded it to 32 GB of RAM once she fell in love with it after a day or two). I went with the Ryzen model, because of how awesome their mobile processors are.

The screen is "nice enough", the battery life is probably fine but she uses it 99% plugged in, the keyboard seems nice enough but it has a numpad so everything is off-centre, etc. It plays the games she wants and runs the app she wants, and it was like $1100, the build quality is good, and it's way faster than anything else we had in the house at the time (with the possible exception of our iPhones).

The only real complaint I have is that it seems as though, by and large, HP and the retailer (CostCo) seem to not really know that this model exists, so when I search for information I have to find the closest model number I can and hope that it's basically the same laptop. Also, spare parts are ridiculous; I paid $50 plus shipping on eBay to a third party from the UK for replacement rubber feet, but it was only one foot. No other official HP parts resellers actually had the part in stock, even though it was only two years old and there were multiple identical (ish?) models.

At least when I want parts for a two (or eight) year old Macbook I can actually find them from someone sufficiently reputable on the same continent as I am.


>i7-1195G7

That's an undesirable generation that lacks efficiency cores and gets barely two thirds the performance of the lowest-specced M1, and has 13h of standby time compared to Apple's 18h of video playback. Not remotely comparable, and wildly overpriced comparatively.

https://www.cpubenchmark.net/compare/Intel-i7-1195G7-vs-Appl...


I'm sorry, what?

i7-1195G7 => 1605; Apple M1 => 1729

2/3s? Are you being a little liberal with your interpretation? I'd say it's a wash and pretty much the same thing.


>i7-1195G7 => 1605 ; Apple M1 => 1729

Where are you getting those numbers?

The link I gave has Intel at 10843 multithreaded and M1 at 14653. That's actually closer to 3/4, now that I do the math, but it's also close to half the TDP (and everyone knows Intel lowballs their TDPs). And once again, this is a comparison between Apple's lowest-binned chips (which currently sell in a laptop $400 cheaper than the one you linked) and one of Intel's highest bins (which, even though it's old, is still 6 months newer than the M1)


This laptop is literal dogshit lol


It's not exactly an apples to apples comparison, but if you must:

- 8GB on the Air is not the same as 8GB on the HP; macOS + Apple silicon's memory efficiency means it can do more with less.

- The new M2 Air has a better screen by resolution, size, and by brightness (400 vs 500 nits). I wasn't able to find colour gamut specs for the HP Spectre.

- By all metrics battery wise the M2 Air handily beats the Spectre.

- The Spectre comes with bloatware like McAfee LiveSafe and other HP software that aren't essential.

- In every benchmark (real world and simulated) the Intel® Core™ i7-1195G7 loses to the M1, so it'd be reasonable to assume it'd lose to the M2 as well. I mean, it's a 10nm process versus a 5nm, so not at all surprising. In some benches the difference is nearly 2x in favour of the M1, which is astounding. Again, we're not even comparing the i7-1195G7 to the M2.

- The wireless chips are both 802.11ax, so they're on equal footing there.

- The M2 Air has a higher resolution webcam.

- The Spectre does not have fast charging.

- The Spectre has a dual mic array compared to the M2 Air's triple array. We haven't gotten real-world tests of the M2 Air's video conferencing capabilities yet, but if we use the M1 Air's as a baseline then it's safe to say it'll be better than the Spectre's.

Last year I had a friend who used the M1 Air as a stopgap (from his old Intel MBP) until the 2021 Pros came out, and it performed amazingly under a heavy load. I'm talking about running Photoshop + Illustrator simultaneously along with VS Code and a node server while having dozens of tabs open, all on the 8GB of RAM. I'm positive the Spectre could do the same, but the thing is his never made a peep of noise, because it literally couldn't: it didn't have any fans.

Hardware is more than its basal specs; it's about how holistically said specs integrate with one another to create a unified device.

To be frank, comparing a now three-year-old machine whose original MSRP was $1600 USD (now on sale for $1400) to a 2022 $1500 machine is a bit silly. Even more so when you look closer at the details.

Sources:

https://www.apple.com/macbook-air-m2/specs/

https://www.walmart.com/ip/HP-Spectre-Laptop-Computer-13-5-W...


There are plenty of good choices

"2x in favour of the M1 which is astounding" False, Intel bests M1 in single core. and pretty close in multi core, i12 beats it and M2 easily. See: https://www.newegg.com/titanium-blue-msi-ge-series-raider-ge... Though pricey, but it beats M1 Max too, so. This laptop completely destroys every SINGLE SPEC of M1 Air, MBP and MBP Max; except perhaps battery life.

The M2 Air has a webcam? Considering the M1 didn't even have one, LOL, that was... fast.

Fast charging? OK? So what? It's not like I'm gonna sit there and watch it charge. Gimmick, also fast charging wastes/ruins your battery much sooner, I'd rather slow charge 95% of the time.

Bloatware that can easily be uninstalled or opted out from at purchase time? I would consider all the shit APPLE installs BLOATWARE too. Case in point: why do I need the Health app (that I need a subscription for), the Apple TV app (I don't have it), Apple HomeKit? Hell, I can't even uninstall some of these apps.

Resolution is a tiny bit better; if you want better res, there are higher-than-Air resolutions easily available. It's not exactly revolutionary, and some people don't like higher res on a small screen. Example: https://www.walmart.com/ip/seort/955928442/

You are just grasping at straws and it's boring to respond. You were running 2 apps simultaneously? Wow, I run like 50 now. There was nothing really interesting with the M2.


6.39 lbs, huge screen but really low PPI. surely that thing's more of a portable desktop than a laptop?


> macOS + Apple silicon's memory efficiency means it can do more with less.

Source? I agree for other points.


You're aware that Apple silicon is currently much faster than Intel?


* per-watt. Intel is faster overall.


In similar size/weight laptop class?


What would dropping the ball look like at this point in the game? Mediocre annual upgrades?


I would expect that even mediocre annual upgrades wouldn't actually be a short-term business failure, as the vast bulk of sales are probably 3+ year upgrades.

It would be a marketing failure though, and sustained would translate into a longer-term business failure.


[flagged]


Ease up bronco. You can buy a https://frame.work/ if modular upgrades are important to you.


Apple's whole thing is simplicity/"just works". Modular designs are pretty much the opposite of that. They've been proven right too. The vast majority of people don't want to slot in their own parts.


> They've been proven right too. The vast majority of people don't want to slot in their own parts.

How have they been “proven right”? Correlation is not causation. Just because they’re highly successful and they ship integrated components doesn’t mean the two are related.


The reason why their ecosystem "just works" is because it's a tightly controlled, vertically integrated prison. The second you allow choice and a multitude of hardware and software configurations is when you start seeing complexity for users as well as instability.


I'm not saying they should, just disagreeing with the idea that it's a proven thing. Apple likely has many reasons for not selling a user-configurable Mac, but I don't believe "users don't want it" is in that list of reasons.

For example, customers definitely want lower prices, and a modular/user-upgradable Mac would clearly offer that (e.g. upgrading the stock hard drive with something much bigger, faster, and cheaper). So to say that Apple doesn't do it because users don't want it is simply wrong.


Not sure if you’re aware that Apple doesn’t use stock hard drives, they use raw Flash and implement their own controller. Part of the aforementioned vertical optimization of performance (and security in this case).

Anyway, yes, this argument always comes down to whether enough people want a larger, slower, less reliable, more configurable, more upgradeable device, and it’s hard to know for sure without trying the expensive experiment. At least we got some ports back!


Apple users above all want something that “just works”. The more flexibility and options that Apple offers, the less likely that will remain true. If you want choice, Apple is not the way to go.


What has been proven is that consumers don't care about customization too much.

If they want it, they don't want it that much.


How has that been "proven"? If your argument is that people are still buying Macs despite the lack of customization, well that's ignoring the state of the market. Most consumers only have two choices (Mac or Windows), and oftentimes it comes down to familiarity with one OS over the other, rather than any specific features or merits of the actual product.

If Apple released a "customizable" Mac, or one that could be upgraded like a standard PC, do you believe that would sell poorly? I highly doubt it.


Modular laptops currently exist, and have existed in multiple incarnations over their entire existence. They always flop in the mass market.

People in the tech community want modularity and upgradability. But you must remember that tech forums are tiny bubbles. The mass market overwhelmingly DGAF.

If you want to be successful at selling laptops to a tiny market, it is possible. You will have a company that looks like Framework or System76.


This thread isn't very useful since my original comment wasn't clear enough, and it seems like people are misinterpreting my point, and then making assumptions about those interpretations.

* I'm not saying Apple should make a "modular laptop", just disagreeing with the idea that users wouldn't buy one.

* I'm also not talking about Framework-style modularity, but "simple" things like swapping out the hard drive and RAM, two things you can't do on Macs or Macbooks anymore.

Even if the average consumer wouldn't ever think to swap out those components themselves, the modularity would directly lead to longer lifespans for devices, lower repair costs, and (significantly) less e-waste. There are many regular non-tech people who would want those benefits.


I'm following. Modular designs are not without tradeoffs. Using removable memory modules requires different types of memory (i.e. not LPDDRx), which have different performance and power characteristics. It requires changes to packaging and the overall design to facilitate that modularity. It changes their ability to share subcomponent designs with other products in their offerings. It introduces opportunities for commercial third-party modifications, which can be of varying quality. It requires a larger BOM. And it changes the mechanics of product tiering.

All of these attributes influence factors that the mass market does consider when buying laptops.


All true, but what I'm saying is that nothing has been "proven". The market has not decided that this is what it wants, Apple has. They have a pure monopoly on the market for Mac-compatible hardware, so whatever they do is immune to the regular market forces which would normally require them to make different product decisions.

So again, whatever Apple manages to sell does not really reflect the reality of the market, since they 100% control that particular market. To say that consumers don't want something because Apple says so, is just wrong. Maybe Apple is right (and this isn't just a decision that's beneficial to their already-obscene margins), but there's no way to know. Hence, it is not "proven" in any meaningful way.


Apple's laptops compete in the same market for laptops that others do. I definitely doubt that Apple's customers prioritize modularity more than Linux or Windows users. Very likely the opposite. Yet, the most popular devices of any OS are moving towards non-modular designs.


There has been tons of research in this. Of consumer machines capable of being upgraded, only a low single digit percentage of them ever are.

Factor in issues with driver support and updates for arbitrary hardware, the weight premium for upgradability, compromises in chassis rigidity and resilience to add removable panels and it’s just hugely wasteful and compromises quality for the vast majority of owners.


Haha, I don't know about these studies but I was just browsing and randomly saw this. Checking in that I definitely upgrade my machines all the time.


Can you cite any of this research? I'm not doubting your claims, I'm legitimately curious.


It's proven through consumer choice of laptops over desktops. The proof is in the sales figures. Consumers choose a less customizable product over one that is customizable, indicating logically that some features are more important than customizability.

Which continues to be in line with what I said. Consumers don't care about it as much. It is proof. Please refute this claim or acknowledge you are wrong. Thanks.


Now I know you're just trolling me


Sort of. But I mean I said it because it's definitively true.

The troll part is because it's also definitively obvious, and I know everyone knows about this, including you. And I can easily form a statement that can't be challenged at all. You're still welcome to prove my statements wrong even though it's impossible.

Obviously you're just not thinking in that direction and you're sort of heavily biased towards modularity and other niche nerd features.


It's not univariate, but if the software outweighs hardware modularity then that's useful information.


Counterpoint: M2 is a tock release, not a tick. This is pretty much the same behavior that ended up getting Intel shot in the foot: leaning too heavily on marginal performance increases from process enhancement will bite you in the butt when your competitors get their tech on your node. Intel and AMD are both going to be on 5nm in 2023; Apple should be pretty worried if this is what they're fighting with. 3nm isn't even getting taped out until 2024 AFAIK, so this really does feel like Apple's "Skylake" moment, as far as desktop CPU architecture is concerned.


Spend a few minutes talking with someone who designs hardware. Often, when they release something, there are TONs of incremental improvements that they just didn't have time to get to before the product shipped.

I remember having a conversation with someone who worked on hard drives in the 80s. He got so ^%$^$# excited telling me about all the improvements he worked on between generations; they were mostly things like tighter calibrations, and refinements.

Point being: Don't knock releases like this.


Indeed, that "18%" improvement is almost certainly made up of very many 1% improvements (or more likely, many improvements that each only makes difference in some scenarios).


that "18%" improvement is almost certainly made up of very many 1% improvements

My guess is 18 of them.


I'd guess 17, but I guess it depends on whether they compound. :)
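(For what it's worth, the arithmetic behind the quip - purely illustrative, not a claim about how Apple actually measured the 18%: seventeen compounding 1% gains already land at roughly 18%, while eighteen non-compounding ones give exactly 18%.)

    # Back-of-the-envelope check of the compounding joke (illustrative only)
    additive = 1 + 18 * 0.01    # eighteen 1% gains that don't compound
    compounded = 1.01 ** 17     # seventeen 1% gains that do compound
    print(f"18 x 1%, additive:    {additive:.3f}x")    # -> 1.180x
    print(f"17 x 1%, compounding: {compounded:.3f}x")  # -> ~1.184x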


Ah, the old synergy effect, a corporate executive's best friend.


This comment sounds like the exact same hopium I heard about Intel in 2016. I get what you're saying, and there are definitely smaller changes packed in here, but my point is that Apple's performance crown is looking mighty easy to usurp right about now. Hell, they even showed a graph with the i7-1280P beating the M2's single-core performance by ~20% at WWDC today; they know they're on notice.


M2 isn’t intended to have high performance. It’s intended to have a high perf-power ratio, as they said.

Often it gets the performance too, but that’s kind of like a lucky break.


Still, M1 Pro/Max/Ultra single-core performance is basically the same as the M1's. That's not a good sign for the M2 Pro/whatever's single-core performance. It's fine for the M2 itself.


And why do I care about perf-power ratio if my laptop is plugged in? WHY? I can understand if it switched to a high perf-power ratio when I'm not plugged in, but why would I need that when I'm plugged in? Seems like a made-up, just-because metric.


Otherwise it would throttle instantly because your MBAir has a passive cooling system.

Also, it would make you infertile and it’d drain the battery faster than the charger can recharge it.


Because lots of power makes a system hot. And a hot system is a slow system, even if the battery is full.

https://www.novatech.co.uk/blog/cpu-and-thermal-throttling/


Nothing you said here fails to apply to the M1, which also overheats under high load and gets throttled, and some of them don't even have a fan.

https://medium.com/illumination/how-to-solve-overheating-iss....

A lot of these benchmarks run for several minutes and test the system as it overheats, simulating real conditions, and 12th-gen Intel beats the M1/M2. So this doesn't change anything or bring any new information.


This is the first sentence of that article, “MacBook rarely shows overheating issues.”

I can say that’s definitely not the case of the Intel laptop I’m currently using.


There's more to comparing computers than just CPU benchmarks. Benchmarks can be manipulated. Raw CPU only matters when your workload involves lots of CPU work. (And, if it does, maybe a desktop is a better choice.)

But, more importantly, it's like comparing two cars based on horsepower when in practice both will get you to the same place, on the same road, at the same time.


That 20% better perf was at 4x the power though?


Apple doesn't really have to worry about competition too much for a long time; even if Intel and AMD reach comparable performance-per-watt for laptops, Apple isn't going to be using their chips anytime soon. And people using MacBooks aren't likely to switch to Windows/Linux machines, even if there is a 20% better CPU.

But really we've only had 2.5 release cycles so far. Not much to go by.


I might be a very small minority but I'd personally switch back to Linux in an instant if there was something comparable to the M series hardware


Asahi Linux has most things working, except the accelerated GPU. Just in the last week or so a triangle was rendered, so progress is being made.

Sadly, while Apple has 128-bit-wide memory on the low end (66-100 GB/sec on the Mac mini and MBA), 256-bit (200 GB/sec), 512-bit (400 GB/sec), and 1024-bit (800 GB/sec), nothing wider than 128 bits looks to be coming to standard laptops or desktops in the non-Apple world. I don't really count HEDT chips like the Threadripper Pro, since they are very expensive, very limited, and burn many hundreds of watts.
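For anyone wondering where those round numbers come from, a rough back-of-the-envelope in Python (theoretical peak = bus width in bytes times transfer rate; the LPDDR4X/LPDDR5 speeds below are assumptions pulled from public spec sheets, not Apple-confirmed figures):

    # Theoretical peak bandwidth = (bus width / 8 bytes) * transfers per second.
    # Memory types and speeds are assumptions from public specs, not Apple docs.
    configs = {
        "M1 (128-bit LPDDR4X-4266)":        (128, 4.266e9),
        "M2 (128-bit LPDDR5-6400)":         (128, 6.4e9),
        "M1 Pro (256-bit LPDDR5-6400)":     (256, 6.4e9),
        "M1 Max (512-bit LPDDR5-6400)":     (512, 6.4e9),
        "M1 Ultra (1024-bit LPDDR5-6400)":  (1024, 6.4e9),
    }
    for name, (bits, rate) in configs.items():
        print(f"{name}: ~{bits / 8 * rate / 1e9:.0f} GB/s")

Rounding those gives roughly the 68, 100, 200, 400, and 800 GB/s figures above.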


Asahi Linux is, for me, the most interesting thing going on in Linux these days.


Agreed. I'd likely have a Studio today if the GPU port worked, but unless it comes out RSN I'll likely wait for the M2-based Studio to come out. The chip has other magic inside as well. I'm hoping Linux continues to implement support for the MatMul instruction (not just vector multiplies), the 16-trillion-op neural engine, the various video encode/decode accelerators, etc. I've heard vague references to compressed swap to help make the most of limited RAM (the M1 limit was 16GB).


What package manager/distro will Asahi Linux be using/basing off of?


It's based on Arch, but there are already people who ported the necessary changes to Nix and something else I can't recall. The idea, though, is that once things get upstreamed, any distro will work.


Perfect, looking forward to when it's stable.


Massive memory bandwidth is for the GPU, so it doesn't matter unless you utilize the iGPU.


I disagree. AnandTech tested the M1 Max (409GB/sec bandwidth), and the CPUs managed 224GB/sec with the faster cores, and a total of 243GB/sec with fast and slow cores.

Sure, that's quite a bit less than 409GB/sec, but there are other major parts of the chip like the GPU, video codec acceleration, and neural engine.

Seems pretty unique and useful to me, not everything fits in cache.


It's true that the CPU cores can consume the massive available bandwidth in AT's test, but that doesn't mean such bandwidth is used in real CPU workloads. I don't dispute that memory bandwidth is meaningful for the total SoC, including the iGPU/ML/video parts.


Asahi Linux is running well enough to be usable for lots of things.

However, no screen brightness control, no sound or YouTube currently. And there's some page size weirdness that means Chrome/Electron (?) is not usable, so no VS Code.

If you can live with that, it's so nice having Linux on the M1 hardware.


YouTube is the acid2 test of the modern web. I had a laptop that didn't hardware accelerate youtube, and it ended up as a headless server.


I'm considering getting an M2 Air even though I very much don't like OSX.


Despite the exciting news (and I'll definitely upgrade my Intel MBP to either an M1 or M2 in the future), I doubt my personal workflow will change: Linux for development, Mac for productivity and maybe some (lightweight) development, Windows for gaming and gaming only. While Apple probably wants to steal existing user base from Windows, on productivity uses maybe, I don't think they're planning to drag people from Linux on development workloads.


M2 has a media engine, which was previously reserved for the higher end versions.

Considering that the M1 was already overkill for most tasks, I think it is a meaningful "tock" upgrade.

Besides, I kind of expect the trend toward ASICs to continue. Instead of ever more extreme lithography, it feels more appropriate to have computation-specific developments.


AMD recently teased a ~15% single threaded performance increase for Zen4 when they will move from TSMC 7nm to TSMC 5nm.

Apple just teased ~18% CPU increase while staying on TSMC 5nm.

Sounds like they are doing just fine.


AMD stated >15%, not ~15%, and this was the worst-case, conservative number, as they have clarified.

The increased core frequency alone brings in around that number already.

https://overclock3d.net/news/cpu_mainboard/amd_calls_its_zen...


That 18% is multicore


My understanding is that AMD coasts behind Apple's process progress at TSMC, seeing as Apple invests the bulk of resources into TSMC's cutting-edge nodes. If Apple hits a wall, well, so does everyone else depending on the fruits of their R&D.

Also, if Intel can keep up this time, and if TSMC/Apple slouches, I wonder if there will be Apple silicon and Intel x86-64/RISC Macs/Apple devices available simultaneously?


My understanding is that 3nm for Apple is coming in 2023.

https://www.slashgear.com/833760/apples-3nm-processor-is-abo...


Coming in 2022 for the iPhone 14 Pro line.


Only because it’s a COVID year.


Why does the 13" Pro have a 720p camera and the Air have a 1080p? Seriously, why is Apple still putting in 720p cameras? This has to be a mistake, right?

Edit: Also the pro is missing the magsafe charger. Are they phasing out the 13" pro?

Pro: https://www.apple.com/macbook-pro-13/specs/

Air: https://www.apple.com/macbook-air-m2/specs/


To clarify, you’re talking about the 13” Pro only. But yeah, total mystery to me why they’re still selling it. It made sense in 2020 on the cusp of the Intel/Silicon transition, but now that we already have the redesigned 14”, don’t really understand what they’re doing here.

Who would buy one of these? The Touch Bar is an evolutionary dead-end, and the design of the new 14” and 16” Pros seemed specifically targeted at addressing the well-known shortfalls of this previous generation.


I have the M1 13” Pro, honestly I love it.

13” is the perfect size of laptop IMO, the battery life is insane, the performance is great, the screen is good enough for me (I don’t care about higher refresh rates etc)

If the M2 is just more of the same but faster, yes please. Why fix something that isn’t broken?


I might go for it. If it weren't for the keyboard and the odd battery/performance issues with my late 2016 MBP I'd be happy with it. When I got it I thought it was too thin, but I got used to it and now the new 14" feels bulky. I loved MagSafe, but it's not a must-have. I love the 4 USB-C ports, I don't need any other connectors, and I actually enjoy the Touch Bar.

I'd have to test the keyboard though, the new pro keyboard is awesome so maybe I don't want to miss out on that.


I have a 2020 post-butterfly 13" and one of the new ones from work and I don't think I could tell them apart based on keyboard feel alone. I'm very happy with the 2020 keyboard at any rate, beats the pre-butterfly ones people keep raving about. No idea if they made any major changes after that, but I don't feel much of a difference.


Simple answer: price.

13in pro starts at $1299. The newer 14in starts at $1999.


Why purchase a 13in pro over the m2 chip air?


Probably the biggest reason is the fan.

Allows for increased sustained performance as benchmarks have shown with M1.


That’s a fair point. And I guess Apple wouldn’t keep selling them if people weren’t buying.


Maybe they do because of it? There's a chance they have some stock of touchbars, old chassis and displays left and therefore are continuing the model series.


They always keep selling not-fully-refreshed versions alongside the new redone machines. It’s confusing to look at head to head.


The 13” MBP is $1,299 and has better performance and better battery life than the MBA. The 14” is too expensive at $1,999, has inferior battery life, and weighs more.

If I had to buy a laptop today I would get the 13”. The Touch Bar is no issue to me as I use it docked 75% of the time.


It probably has a little to do with marketing. Buyers will instinctively compare the M2 air with the M2 pro and the air will look much better.

“A 1080p webcam, thinner chassis, MagSafe and $100 cheaper. What a steal”


> To clarify, you’re talking about the 13” Pro only.

Yes. The 13" Pro is currently the only pro model with a M2 processor. I updated my comment for added clarity.


I bought M1 13” because:

- size

- Touch Bar (last chance to buy mbp that does not have useless-to-me Fn keys)


That Pro is the old design. It's just Apple's usual thing of keeping an old design of a product around to serve some kind of gap they see in their market - it creates a confusing product lineup because this means you have:

New Macbook Air: New design & M2 chip

New Macbook Pro 13": Old design & M2 chip

Macbook Pro 14" & 16": New design & M1 Pro/Max chips


The new M2 13" pro is in exactly the same unit as the old Intel 13" pro I guess, just with a different processor?

It's also the only "new" M1 or M2 machine with touchbar (seems to have been dropped elsewhere) and without the new magsafe.

It seems to be kind of a mistake, and a mistake to buy it.


The MBP 13 is an old form factor, so it'll definitely be phased out. The 14 and 16 are the models getting the real love; those have the 1080p webcam IIRC.

But the phase-out period is very long in recent Apple products (which is probably a good thing especially for enterprise context.)


I think the 13" Pro is just to is linger until the 14" can drop in price enough to fill it's spot.

The 13" Pro is now also the only Mac with a TouchBar, which is just strange.


Even the pricing makes me think that this is just a phase out. I'm willing to bet this is the last generation of 13" Pros. Just seems like the 14" is replacing it. As far as I can tell the Air is better than the 13" in almost every way, except maybe the bigger battery and active cooling. You can pay the $100 difference to upgrade the chip to be identical, which is odd for Apple. Usually there is still a minor premium.


They probably haven't phased out the old manufacturing pipeline, so they "have" to keep making these (and make money off them) before it's gone entirely in the future.


Tbh it's weird they even bothered refreshing it, instead of going ahead and phasing it out. It's the only one left with the Touch Bar, with no MagSafe, etc. The Air's screen is now bigger than the 13" Pro's. And there's also the 14-inch MBP. I just don't see the point of it now.


I recently bought an M1 Air and I miss the magsafe charger. I really don't understand why they try to move away from it. A reliable working solution to a problem.


I imagine the existence of the 13" Pro using the old hardware design means the 16" M2 Air is at least another year out, and at some point, it will simply cease to exist.

So, why update it? Redesign it instead.


remember 14" is more like a 13" with a narrower bezel.


The 14" is a tank compared to the 13"


I have a new M1 14" MBP, and an old 2015(!) Intel 13" MBP.

They are pretty much exactly the same height/width, if you put one on top of the other they line up close to perfectly. The 14" just has less bezel.

The M1 14" is thicker and heavier though.


Meh. Same thickness, 8mm wider, 9mm deeper. I'll grant that 200g is a noticeable increase.


All I want is a laptop with similar build quality, battery life, and Ubuntu.


A major factor in achieving excellent battery life coupled with relatively good performance & responsiveness, on Apple products, is the OS. That holds for both Macs and iDevices.


There are several Android phones that compete with the iPhone 13 Pro performance-wise, while running a much less optimized OS.

Likewise in the desktop world, the single-core performance of the M1 is on par with top-of-the-line chips, the M2 will be the same, and Intel and AMD work fine on Linux.

Apple's decision to keep a locked box has nothing to do with performance; it's solely about keeping people in the ecosystem for revenue.


I'm aware benchmarks don't tell the whole story, but Geekbench shows the iPhone 13 Pro[0] significantly outperforming the Samsung Galaxy S22 Ultra[1] - about 60% faster multicore (presumably thermally limited?) and ~80% faster singlecore.

[0]: https://browser.geekbench.com/ios_devices/iphone-13-pro [1]: https://browser.geekbench.com/android_devices/samsung-sm-s90...


The "coupled with" was important. They hit a spot on the performance/responsiveness/power-use graph that others don't, and a lot of that's due to software, not just hardware. It's easy to get incredible battery life if you accept poor performance, or to have great performance by sacrificing battery life, but Apple does the extra work to achieve both. Kinda like how BeOS used to feel way smoother than Windows or (GUI) Linux even when running on far worse hardware.


Yes, although I think a lot of the issue is generic hardware drivers that don't necessarily configure things correctly, and it looks like the Asahi Linux people are paying a lot of attention to power management, so it may end up having decent battery life too.


So you're saying that BSD has better power management than Linux? Would love to see some real research and analysis on this.


It's not a BSD vs. Linux thing. It's the entire software stack including user space, e.g. how much is the CPU woken up by silly background tasks doing useless things. That thing can be fixed regardless of the underlying kernel if people care enough. The total investment in desktop Linux is tiny compared to what Apple puts into macOS.


The vast majority of power management is less about background services while active.

It's mostly about reducing idle power consumption, sleep mode, and closed-lid hibernation.

Most battery loss in mobile devices can be attributed to two things.

The screen, and how much is lost when the device is idle in your pocket/backpack.


My claim is that macOS and iOS have far better power management than Linux or Windows, or Android, respectively. Possibly that's directly due to their BSD heritage, but I doubt it.


Apple's kernel isn't merely one of the BSD derivatives. And of course their userland is mostly proprietary.


No, macOS does its own power management - the base kernel may be BSD, but macOS has vastly more work in it than simply shipping a basic BSD system.


The base kernel is not really BSD.

https://wiki.freebsd.org/Myths#FreeBSD_is_Just_macOS_Without...

>Darwin - which consists of the XNU kernel, IOkit (a driver model), and POSIX compatibility via a BSD compatibility layer - makes up part of macOS (as well as iOS, tvOS, and others) includes a few subsystems (such as the VFS, process model, and network implementation) from (older versions of) FreeBSD, but is mostly an independent implementation. The similarities in the userland, however, make it much easier to port macOS code to FreeBSD than any other system - partially because a lot of command-line utilities were imported along with the BSD bits from FreeBSD. For example, both libdispatch (Grand Central Dispatch in Apple's marketing) and libc++ were written for macOS and worked on FreeBSD before any other OS.


It’s pretty BSD. Sure Mach is under there, but that doesn’t mean it “is” Mach. There’s a lot more original work that doesn’t belong to either ancestor.


I like the XPS 13 line myself. The 93xx line lasted all night when I forgot to shut the lid one evening (9+ hours) with Fedora (after running `powertop` and doing its tweaks).

But I'm also a "weirdo" who doesn't like Apple-made hardware (the trackpad, touchbar, keyboard, mouse, etc. are all inferior IMO), so maybe you're looking for something different there.


I'm currently on my first and last XPS, a 9570. I really wanted to love it, but there's just too much about it that is a complete disaster, particularly around power management, wake behaviour, and thermals.

I know I haven't exactly babied it, what with it having been plugged in in my home office for basically two straight years during the pandemic, but the battery is shot at this point— getting barely an hour of life. Often it'll be supposedly sleeping in lid-shut mode but be cooking itself for no reason. Then it'll wake up and immediately go into a power-panic shutdown, only to assert that the battery is full after all when it reboots connected to juice. And now the HDMI port is also toast (verified under multiple OSes to be a hardware issue).

Maybe I just got a bad year, but this is supposed to be Dell's premium machine and I don't think I can justify giving them another chance after this. It's just nowhere near reliable enough to be used on the road, and not performant enough to be a true desktop replacement. So I don't know who is using this machine and for what.


Same issue with power here on my 9570! In fact I just checked after one month of plugged-in non-use, turning it on resulted in dead battery in < 3 minutes. What's crazy is that if I let it charge now, I will get 3 solid hours of battery life. The battery health indicates 2/4, it's definitely not a fully spent battery.

The other thing that has driven me crazy is the wide-gamut 4K screen. The text is nice and crisp, but good luck with sRGB content. Basically all content for non Adobe professionals is fubar'd. I eventually made the situation tolerable by finding an old version of Dell PremierColor and using its sRGB profile (newer versions of the software on Microsoft Store quit working on the 9570 and the color profiles do nothing - imagine that). Still get terrible color banding and crushed blacks, and I'm not even sure investing in a colorimeter would fix it.

Overall I've come to the same conclusion as you. In 2018 this was a $2000 machine (Costco price below retail), and I also can't justify giving Dell another chance.


My XPS 15 was horrible under Windows, but it was fantastic running Linux. It would somehow get 2x the battery, and ran a lot faster... and once Proton hit, it wasn't half bad on gaming.


I had a 9520 for about 5 years. I had two issues: a broken left hinge (the 4K touch screen was too heavy for that part; I believe the XPS 15 actually didn't suffer from the same issue), and swelling batteries, which caused various knock-on issues. I also experienced the same battery issue with the Dell Latitude I got from work. It seemed to be a Dell thing for laptops left plugged in constantly. I never had to deal with overheating issues despite having a Xeon (although it did get quite hot).

It's a shame because otherwise I really liked the laptop. Gorgeous screen, good trackpad and keyboard, and a perfect size imo.


That's so funny, because I'm in the same boat with my XPS 9570. I bought it for personal use because I had an amazing experience with a 9370 developer edition at my last job. I figured it would be the same, but that was extremely naive. I got the i5-8300H SKU with an empty 2.5" SSD bay, 53Wh battery, and a 1TB drive for $800 and change with the intention of plugging it into my GTX 1070 eGPU at home. I figured that this would be like, a power-sipping config.

NOPE! It wakes itself from lid-shut sleep and cooks itself whenever I put it in a bag, it will crash and reboot if the Thunderbolt cable gets jostled, and I get maybe 90 minutes on battery life after weeks of optimization script fiddling. I wanted to like it so badly because at the time, the processor was a steal. Matching the old 7820-HQ's performance with a bottom-tier i5 8300H felt like magic, but nothing else lived up to the hype.


> around power management, wake behaviour, and thermals.

Can't speak for wake behaviour because I've never encountered issues on my 15" 9570, but you can actually make it much, much better with a bit of work.

You need to work around the Plundervolt mitigations and enable undervolting (there's a tutorial in some reddit thread on how to do this), then undervolt the CPU a bit; this way it almost never throttles. As for thermals, just changing the paste to Kryonaut and adding a few stacked thermal pads so they contact the chassis in a few places, according to some tutorials I found, made insane improvements. I play Valorant on mine and I went from almost constant throttling to no issues at all, even during longer sessions. It's also almost never running the fans at higher speeds during normal work.

But I agree, my next machine will be an MBP with an Mx chip; I don't want to put so much work into something that costs a small fortune just so it works properly.


I've had the same experience with my XPS 9560. For the past few years I've had "TPM device is not detected"[1] issues. I've tried all the different solutions and nothing's fixed it. Dell has never addressed it with a BIOS update.

I don't want to go back to OS X, but it's hard to find good build quality and a high-res screen.

Also, my battery was terrible after a couple of years. I had it plugged in as well. I bought a replacement this year and it was easy enough to switch out. Hopefully I can just keep this going and use it as an RDP machine.

[1] https://www.reddit.com/r/Dell/comments/onvh42/xps_9560_tpm_d...


I had the same issue; for me, updating the TPM firmware solved it (but the process is convoluted).

https://superuser.com/questions/1668861/alert-tpm-device-is-...

My guess is that Dell didn't update the TPM firmware because it breaks some keys and so can't be done automatically, and a BIOS update at some point then screws up the handshake with the older TPM firmware unless the laptop is fully powered off.

But I'm not a huge fan of dell hardware.


I think the real garbage part of buying an XPS from Dell for professional developer work on Linux is that they will not provide any support for the device, particularly when you report driver, sleep, or other software-level issues with the OEM-installed Linux they ostensibly provide.


Yeah my 15” xps was pretty bad too. Coworker and I both had this issue where one key press would result in the same letter being typed twice. Coworker sent his machine back several times but never got that fixed. I just used an external keyboard most of the time. Besides that, my machine’s battery ultimately puffed up and made the trackpad impossible to click.


> supposedly sleeping in lid-shut mode but be cooking itself for no reason

Same. It fires up full fans and all in the middle of the night.

It’s a nice form factor, but much like me growing tired of stuff like “android slowdown syndrome” and me getting an iPhone last fall … I feel like I’m being pushed to try a Mac laptop for a cycle….

I just don’t want to deal with this stuff anymore.


> with it having been plugged in in my home office for basically two straight years during the pandemic

Yeah, my second seems to have succumbed to that as well. There's a setting in the BIOS that says "I plan on keeping this plugged in all the time" and it'll do better battery management that way.


My MBP automatically enables that after being plugged in for a while, by setting the max charge threshold to 80% or something. The MBP warned me in the battery monitor, I think. Nice, because I wasn't even thinking of such things.


That's the kind of thing that Linux tends to miss out on. But I don't feel that selling my computing experience to Cupertino is worth such things either.


I'm thinking that since your battery has passed its cycle limit, maybe the increased internal resistance is what's causing all the heat issues. Maybe change the battery?


> But I'm also a "weirdo" who doesn't like Apple-made hardware

Please, it’s not that weird of an opinion.

> (the trackpad […] are all inferior IMO)

Oh.

But I think I agree when it comes to anything like drag-and-drop.


My XPS 13 9360 randomly turns on when the lid is closed and burns through the battery. There was an official document at some point saying that if you put it in your backpack in sleep mode, it voids the warranty.

I really like the laptop otherwise, but battery/power management is utter crap on it, both on Windows and on Linux.


I've noticed this too, but it seems Bluetooth related. There's a report filed against the Linux kernel, but no progress as yet.

See this thread: https://news.ycombinator.com/item?id=31521995


What are you after in a trackpad? I think this is the best feature of the laptops.


I am rough with mine (my first uses were on some IBM tank from the late 80s/early 90s when I was young, and a Lenovo T61 in college). Spurious clicks and taps happen all the time when I end up using the things. I have spent minutes trying to get an Apple trackpad to perform a drag-and-drop and instead had it do everything else, from extra clicks to zooming to running out of space when I reach the edge before getting where I'm going. The fact that you can SSH in and use a more exacting interface is the best feature of the things IMO, but sometimes the UI is just the only way to get something done.

I also liked the matte texture of the ThinkPad, but I think that era is over (the XPS isn't glassy like Apple's at least, but still lacks texture).


But all of that crap is optional. The first thing a macOS user should do on a new machine is go to system preferences > Trackpad and turn that stuff and "Force Touch" off.


That doesn't help when I'm using someone else's Macbook.


Have you tried 3 finger drag? So much nicer than needing to apply pressure.


Doesn't that swipe between desktops/workspaces/whatever? On that note, I have no idea how anyone is supposed to discover these gestures. I had the same problem when I had an iPhone for a few months (long story): I became afraid to swipe anything because I never knew what anything would do, and the lag on the thing meant that some widget could show up under my finger without my noticing (something I really dislike about reflowing and progressively loaded websites too). The floating dot thing was also way more invasive than a button.

FWIW, I have animation time set to 0 on my Android phone to avoid these kinds of behaviors but given that the primary interaction was through them on Apple, it was unavoidable.


By default, yeah; you need to disable that gesture (or set it to 4 fingers) and then enable 3 finger drag in accessibility settings under trackpad. Aside from 3 finger drag, though, these are all in the normal trackpad settings; it's pretty clear about what does what and all of it can be disabled.

This used to be a regular (non-accessibility) setting, not sure why they hid it away a few versions ago.


In system preferences there are videos demonstrating all trackpad gestures in the trackpad settings.

I think most issues people have with using the trackpad are due to the fact that the default is that you have to click the trackpad like a mouse button in order to perform a "click." Most people coming from Windows want to use tap to click and I think that is what leads to confusing results.


Fwiw, I have a nearly-maxed-out brand-new M1 MBP and can't stand macOS, so it's collecting dust while my similarly specced one-year-old XPS i9 with PopOS sees 12h of daily use.

Of course an m1's battery life is better. Everything else sucks in comparison imo but I admit this is very subjective.


Not weird, or at least if you are I'm with you. OS; eh, ok, I'm used to it. Hardware, nope. I run my work Mac in clamshell and use none of it. Screen is fine I guess, but I'm old and most of it's lost on my shitty eyesight anyway.


I've been eyeing the XPS line of notebooks for many years, but comments like the ones in sibling threads are holding me back from acting on it.


Just got an XPS 15 with an i9-12900HK running Ubuntu 22.04 (5.18 kernel installed after the fact). Still working the kinks out, but the machine won't reliably wake when the lid is opened. It's about 10% faster than my M1 Pro, but with much worse battery life. If I didn't still need an AMD64 machine for some things, there'd be no reason to put up with the quirks of a Linux laptop IMO.


I really think that's what the Framework will be in a couple of years. I use one now and it's pretty great, but does have a few (totally acceptable to me) rough edges that they're working on.


As a Framework owner, I'd say there's maybe two orders of magnitude difference in build quality between a Mac and a Framework.

The trackpad and display suck for a laptop in 2022. The display has issues with calibration and resolution (understandable given where they're at right now, but still it's trash compared to an XPS or MB). The trackpad has mechanical issues that cause it to wiggle; it's been reported on the forum (and personally to staff a few times) but they don't seem to have a decent way to fix it.

It's been a nightmare calibrating the touchpad to my liking; on my XPS and MacBook it has always "just worked" (even on Linux!).


This depends on what one thinks of as build quality. If it is a certain way things look and feels that evoke emotions of quality and power, then yes, Apple's build quality is better.

But if you appreciate that things are engineered in a way that makes repairability and upgradeability easy, to make your life better and have less hassle, perhaps the Framework has better build quality.


Exactly how do you quantify "two orders of magnitude of build quality"? What units are you using for this analysis?


I think "maybe" is an important part of the text you quoted, and the rest of the comment goes into decent detail regarding the specific areas where the experience fell short of their expectations.

Did you stop reading after the first paragraph?


How does one distinguish a build quality that is 100x better from a build quality that is only 10x better?

I think the point is that it seems silly to say "two orders of magnitude" when there's no quantitative metric. Is it any more informative than "way better"?


It's a figure of speech


I don’t mean to discredit Framework in the slightest but they are still leagues away from Apple when it comes to build quality.

The entire industry struggles to match Apple’s fit and finish, it will be an uphill battle for a hardware based startup. I do hope they succeed.


Apple runs an entire supply chain; for a small company it's fantastic to be half as good.

bonus: it’s a niche I can’t imagine Apple going after any time soon!


Yes true, I was probably too optimistic with "a couple of years."


Yep, I have a Framework laptop. I really hope Sand Hill Road doesn't kill it. They will almost definitely need another round or two to hit profitability.


Framework is the closest, I believe. Personally I buy used ThinkPads. Huge bang for the buck and they are pretty much bulletproof. That being said, my daily driver is a 2013 MacBook Air running Arch now.


For hardware specs and upgradeability, yes. I have one. I love that I paid market price for a third-party 4TB SSD and 64GB of RAM, and not Apple's 4x market price.

For build quality I think they're still behind Asus, Lenovo, and Samsung.


Yes, I think so too now. But they don't ship to where I live, and their battery life is reportedly 3-4 times shorter than the Air's. I understand that the processor is the biggest cause of it. (No AMD option for the Framework is also a very big minus in my eyes.)


I am on a 2015 Macbook Pro also running Arch/KDE and it is a pretty solid daily driver.


My X1 Yoga from 5 years ago still answers all the above, IMHO. I replaced the battery (myself) about a year ago. Build quality is great, and I think you can even get it preinstalled with Ubuntu. If you're not into the touchscreen-yoga thing, the X1 Carbon is pretty similar.


I have the new X1 Yoga, and I like it better than any other Windows laptop I've ever owned, but battery life is just trash. I'd be lucky to get 3 hours out of the thing.


What about the very same laptop running native Debian?

How to install: https://git.zerfleddert.de/cgi-bin/gitweb.cgi/m1-debian/


I think you should be hopeful for Aeshi Linux.


Asahi Linux. And it is awesome, the future is here. Just waiting for the GPU driver and ... suspend... and brightness and sound.


Yeah. As soon as it's got feature parity I am getting an M1 Mac to run Linux with i3. And no, don't mention Magnet, it's not the same.


You could try Yabai [1], but it's a bspwm clone, not an i3 clone. Still not as zippy as the TWMs on Linux, but a pretty decent experience overall. There's also Amethyst [2], which kind of replicates xmonad. In most ways they're worse than using the Linux ones, though at least you retain the full features of Aqua and Quartz, while trying to integrate i3 or other TWMs with a proper DE on Linux remains a hassle.

[1] https://github.com/koekeishiya/yabai

[2] https://ianyh.com/amethyst/


I am familiar with both projects. And both are doing amazing work, working around the limitations of macOS to bring some sensibility to the OS's window manager. But yabai, from what I remember, requires you to disable SIP to get it to fully work, and on top of that I have cooled on my zealotry for macOS over the years. I used to be a huge fanboy. But now I am a bit creeped out by all the ways Apple is trying to make it hard to leave the ecosystem, by how interconnected everything is. Also, I love and support the brew project but … it's not as fast or as robust as a proper package manager. (Maybe nix could come into play here.) But ultimately I am most keen on using proper Linux for the container tech, the customizability, the freedom, and the fact that containers run without the need for a VM. That being said, I hear arm64 VMs run super well on the M1s and … I dunno, maybe I will revisit the Apple side of things. I just wish I could get the amazing Apple hardware and M1 or M2 performance and just run my own OS on it. The Asahi folks are helping with that.


Do you mean Asahi?


Yes, fixed it.


FYI- If you meant to edit your post it's not showing up as fixed for me at least.


The really interesting question: how much would you be willing to pay for it?


I'm pretty price insensitive, but nothing on the market comes close to what I'd like:

- One week suspend, resume under linux (reliable).

- Keyboard and trackpad centered under display, and as good as best of class from 10 years ago.

- 4K / hidpi display

- no/minimal fan, cool running

- 12+ hour "typical" battery life; at least 4 when running slack and zoom (and maybe compilation jobs)

- as fast as a 10-year-old midrange desktop (e.g. i7-2700)

- don't care about video acceleration, but video out must reliably work.

- No dual GPU switchover garbage.

- not Intel (the last N Intel machines I have used have had severe chipset/CPU issues)

- ability to not run systemd in a supported config.

All the laptops I have found fail on multiple of these points. My Pinebook Pro meets as many of them as most high-end laptops do (so, not all that many), but at least it was cheap and worked out of the box.

Still waiting for a "real" laptop to replace it, but everything I've seen has glaring fatal flaws.


How about replacing with a "real" computer? I stopped accepting these compromises around 2004, and since then have built my own computers from discrete components.


As semiconductor technology advances, increasing miniaturization and integration become normal and expected. I imagine that in the 70s and early 80s, some people thought that the only "real" computers were minicomputers such as those made by DEC, or even mainframes. Highly integrated computers such as the Apple Silicon Macs are the logical continuation of the trend that began with the first microcomputers.

Having said that, the desktop form factor has its place. My next PC will be a desktop. But it will also be a Windows PC. I wish there was a Windows PC that had the same level of integration, and performance per watt, as the M1 Mac mini.


"integration" and "performance per watt" are way overrated IMHO. My desktop is on 24/7 with multiple VMs running. I walk up to it and begin working - no startup or setup. Electricity cost is probably $20/month.


honestly happy to pay through the nose for thinkpad build quality. it's a tax deduction and going to last me 5+ years.

hopefully get some ryzen 6000 series laptop options this year. the thinkpad z13 looks promising, but will have to wait for reviews to start rolling in.


The same as what it currently costs, personally.


I find this most interesting. I feel Apple went to Intel chips originally to be able to offer users the best of both worlds: both Windows and OS X. Naturally Bootcamp came along to bring that functionality. Now that Apple is back to a RISC chip, what is stopping them from offering a RISC version of Bootcamp? Windows runs on ARM chips, as do many distributions of Linux, so why not officially offer the ability to run alternate OSes?


Apple originally went to Intel because IBM was unable to deliver a Power chip with any semblance of power efficiency. IIRC Windows compatibility took some time before it was available.

Apple has said that Windows availability on the M series processors is up to Microsoft. Bootcamp is no longer needed, you can now install alternative OSes on the same drive and boot into whichever you want. At least that's the way Asahi works.

Apple's hypervisor API is the way to go in any case. If companies use that Apple will take care of all necessary drivers. Doing things on bare metal will require a lot of work and Apple won't help with that.


I think Apple was motivated to switch to Intel because of the stagnated development of PowerPC. You may be thinking further back, like in the 90s, regarding Macs that could dual-boot into PC operating systems.


Would a Ubuntu arm64 VM running under macOS be worthwhile?


I work on an Ubuntu VM running in Parallels on a Mac M1 to target a Raspberry Pi.

It’s the very best such setup possible.


I am waiting for HP Dev One. https://hpdevone.com/


My honest advice: Don't get HP. Except for the highest end ZBooks, maybe.

They all have severe drawbacks in some way. They even used the Elitebook brand to make cheap shit, and now no one likes it. Gamers wanted the Omen; it turned out to be shit.

I bet the thermals are horrible, the keyboard sucks and/or will break in a year or two, the battery life will be bad, the BIOS updates will cause problems with no way to revert, the drives may suddenly fail and Linux still won't work properly on them :D


Maybe consider a ThinkPad P1 Gen 4.

$$$ but can be had for 40% off.

A good spec with the nice screen and discrete graphics is < $3k.

Upsides: Great for Ubuntu, everything just works. Screen is beautiful. Keyboard is brilliant.

Downsides: Can get hot. Battery life sucks. Still 11th gen - Tiger Lake.


I wish I could get something like this from System76.


meh monitor

previous gen cpu

only 16gig

pretty underwhelming to be honest


A 1000-nit, non-glossy laptop display. Which company makes a better one? 2 SODIMM slots, so 64 GB RAM.


1000 nits is great, but 1920x1080 is underwhelming. Most of the ThinkPad range is advertised as available with higher res (but other than the X1 Carbon it seems unavailable).

> 2 SODIMM, so 64 GB RAM

sweet :D

this looks more interesting https://wccftech.com/hp-unleashes-zhan-x-a-14-inch-laptop-up...


Choose two features between: 'Similar build quality', 'battery life', 'Ubuntu'.


I’ve found that Ubuntu is the problem variable when it comes to battery life. I’ve never gotten it to behave comparably to MacOS or Windows on any hardware.


This. I hate MacOS with a passion.


Surely it would be better to save one's passionate hatred for things that actually rise to that level, such as things that hurt people who can't just choose an alternative. I prefer to think of mere differences between operating systems as neither good nor bad, just different, and use whatever is most practical in any given circumstance, while trying not to get emotionally worked up about its downsides.


Everything about it or the developer experience? I use a VM for my dev work and then run the rest (which pretty much means "a browser", these days) in MacOS. Which works perfectly.


Not the original poster, but for me, everything. I'm the sort of user who likes to customize everything, and Mac (and other apple products) makes it hard to customize anything.

And I'm the kind of developer who will make pull requests to fix bugs that bother me, so everything being proprietary rather than open source is also a big pain point.


Nothing wrong with that, and I have no point other than that I'm fascinated by your approach to computing. I customize my iTerm2 theme, tweak Vim a bit, increase the cursor size and I'm done. What I want is to be able to get a new laptop, log in to my password manager and be working within 15 minutes.


I'm a long-time Linux user who recently got a Mac, and I found it much more customizable than I ever thought. At least compared to iOS it is seriously just a very old "Linux" kernel with a better integrated user space.


Almost everything about it. It's pretty unusable for me all-around and just gets in the way of almost all my engineering.

I use Ubuntu for everything, and customize almost everything.

An Ubuntu VM on a Mac isn't what I want. I want full access to all capabilities of the system, and no hassles with using all my GPU, RAM, and direct access to all disks and hardware interfaces. If I'm just going to sit in Ubuntu all day with VM resources maxed out, it might as well be the host OS, not the guest OS.


[flagged]


I personally don’t understand the whole concept of ‘hating’ someone else’s personal choice of OS.

I mean if the government mandated MacOS maybe you’d have a mature point instead of an infantile rant.


Someone expresses general displeasure with macOS -> some random Mac user has to come and say "It works for me, your opinion is thus misplaced". Every. Time. I don't know what's more infantile.


That's a lot of anger you're packing around there for an operating system lol


Yeah, that was my 1984-like daily minute of hate, except that Goldstein is now an OS. Better than hating people, I guess? Also, other than "smugfaced", I believe I stayed pretty factual in my description. Still, gotta fulfill my obligations to the Party and all that, lest they start suspecting me...


Microsoft Windows Millennium Edition™


I do remember seething at hourly BSoD with Win98, now that you mention it. Like that one video where the guy starts attacking the computer.


Hoping for Asahi Linux soon!


I got an M1 Macbook Pro, and yeah, the processor is fast and cool and all that, but the most absolutely wonderful thing about it by far is that there's no touchbar!


I really, really, really want to like the touchbar, but it just isn't useful to me. I wish they'd put some serious effort into improving it.


Agreed. There were minor use cases where it was delightful to use (able to swipe and adjust brightness or volume). But overall it was just useless.

You weren't even able to skip to the next song like you can with the old function keys. Play and skip-to-next-song are probably my most used buttons, one of which never existed as a standalone button on the Touch Bar.


You can/could modify the touchbar to add a skip button easily; see Keyboard settings.


I saw an interesting idea floating around a while back: Put in a screen instead of the trackpad. I can see it being useful for drawing as well as picking emojis and so on, but it couldn’t be always available like the touch bar can.


I don't mind the touchbar - I mind that it _replaced_ the F-keys! I want proper function keys! If I get some sort of status display _in addition_ to the keys, I'm fine with that.


Am I the only one who just ignored the touchbar and didn't take it as a personal affront?


Your finger never slipped and hit the "play" button by mistake, blasting the last song you played at max volume directly into your ear. Happened dozens of times in the first year of having a touchbar (work laptop, not my decision), super jarring every time. Decades of having a physical "play" function button hooked up to do the same thing and it didn't happen once.


I can't say this has ever happened, but I have fairly skinny fingers. I can see how it would though.


Same. I had to remove all of the volume and media buttons and Siri from my Touch Bar prior to upgrading to M1. Interesting idea, poor execution, and no followup.


My work machine has one, and I use it to display status information, give me a proper Rub Out key, and for soft keys to insert certain Unicode and macros that I need occasionally, but not often enough to remember.

I think it would have been better accepted if Apple had made it more customizable right out of the box, instead of relying on half-baked solutions from tinkerers to program it.

In my opinion, it should also have been in addition to the function keys, not a replacement.


The problem I have with the touchbar isn't the existence of it; it's that apparently my keyboard posture is terrible and I float fingers up there periodically. Because there's no force requirement for activation, I would keep triggering buttons. I could turn it off, but there are a few of the command/F buttons I use regularly, and so I'm stuck with it.

Hence, it's a nuisance. Not something I go insane about like some do, but it is a very definite day-to-day annoyance.


> apparently my keyboard posture is terrible and I float fingers up there periodically

Same. The short time I was exposed to the touchbar it felt like I was constantly being berated for my keyboard posture. Apparently the "at rest" position of my left hand leaves my middle/ring fingers hovering over the escape key (I had the earlier model that didn't have a physical escape key).

Not to mention losing access to the physical f-keys decimated my custom hotkey usage for certain software.


For me it’s my little fingers :D


The internet disproportionally reflects the most extreme views.

I think it's mildly disappointing and I won't miss it.


Oh yeah I won't miss it either but some people seem to hold rather extreme emotion over a touchbar :) . That said it is easy to be hyperbolic on the internet.


False dichotomy; I alternate between ignoring it AND being annoyed by it


I've had a touchbar on my work MacBook for ~3 years and can't say I care at all. I use the touchbar controls without issue; the non-physical escape key is irrelevant to me (actually I find press-and-hold on the virtual escape key to be in some ways more pleasant). I use the touchbar zoom controls pretty regularly to leave meetings, I use the touchbar screen lock button probably 8x per day, and it's completely fine.


Genuinely curious, how did you avoid using it?

I'd imagine changing brightness or sound volume + mute/unmute only with on-screen controls and/or keyboard shortcuts could work, but that's a lot of shortcuts to remember for stuff that is only done a few times a day.


If it was an addition it would have been fine. Instead it replaced the escape key and all the function keys.


Personally quite like the touchbar. It's not great, but I like it more than function keys. In particular, I like that when I'm screen recording, I can use the touchbar to stop the recording. It's a trivial little thing, but it's cool.


I'd be less annoyed by accidentally touching the touchbar all the time if it were positioned at the bottom of the screen instead of at the top of the keyboard.

Hey, Apple: why not just lose the touch bar and make the whole screen taller and touch sensitive instead, huh?


I'd be fine with the bottom of the screen being touch sensitive or something, specifically where it wouldn't align with the keyboard, but they can't even get their anti-glare coating to not erode permanently when you get some finger grease on it. Not optimistic that a touchscreen would be done very well on MacBooks.


Same here, and I can't remember accidentally touching it. Even if I did, I did the same with F-keys as well (F1 in AutoCAD, ugh - on a regular keyboard as well).


Still only supports one external display. Also, they increased the prices across the board.

Quite happy with my M1 Pro, a beast and a hell of a purchase.


This is disappointing. It has 2 Thunderbolt 4 ports but can only drive one external display. So unnecessary, this would be the perfect machine for my home and work setup, but I have 2 external displays in both cases.


They are 100% doing this intentionally. They want to drive power users toward spending more and they know that many of us will…


Really disappointing considering they now support up to a 6k external display. Yet you can't do 2x 1080p, or 2x 4k.

I think Apple knows a lot of customers care about this and want it to be a barrier getting them into a pro machine. The cheapest laptop they sell with multi-external-monitor support is $1k more than their cheapest laptop overall ($2k vs $1k).


Wait, you can't extend the display to a second screen?


Yes, but only one external display plus the built-in display.


There was some dock from a 3rd party vendor that let you do more screens, but I can't remember which one...


There are a few. They are able to do this by using something called Display Stream Compression. While it may be fine for some, a lot of us would prefer not to have a diminished experience with a compressed stream.


Display Stream Compression (DSC) is fine. It is not a "diminished experience". DSC is visually lossless.

Instead, those docks use a technology called DisplayLink which has nothing to do with DSC. DisplayLink means that external monitors are basically "software" displays that are tremendously slower and often very limited in resolutions and frame rates. Having any DisplayLink display connected also breaks HDCP and can cause other problems.


The relevant standard is proprietary, but Wikipedia quotes it, confirming that "visually lossless" is marketing lies:

https://en.wikipedia.org/wiki/DisplayPort#Display_Stream_Com...


"Marketing lies" is unnecessarily inflammatory. I googled before posting to see if I could find anyone legitimately complaining about DSC, and it really seemed like pretty much everyone was happy with it.

There are always people like "audiophiles" who claim to be able to distinguish impossibly small differences, and there is perhaps a very small number of people with exceptional hearing who actually do... but 320kbps compressed audio is "audibly lossless" for most of the population. The exact same thing applies here, by all appearances. I'm sure there are mp3 test cases where the compression does something terrible, just like with DSC... that just isn't what people actually encounter day to day.

I can't see the second study linked, which is on IEEE, but if you look at the first one, Figure 4 shows that DSC was "visually lossless" in almost all test cases. Let me quote one thing from that study:

> As described above, the HDR content was selected to challenge the codecs, in spite of this both DSC 1.2a and VDC-M performed very well. This finding is consistent with previous series of experiments using SDR images.

So, this testing was done with samples that would challenge the codecs... and they still did great. It doesn't appear to be "marketing lies" at all. It appears to be a genuine attempt to describe a technology that enables new capabilities while dealing with the imperfect limitation in bandwidth of the available hardware.

Do you have some terrible personal experience with DSC to share? Did you do a blind test so that you weren't aware of whether DSC was enabled or not when making your judgments? Are you aware that almost all non-OLED monitors (especially high refresh rate) always have artifacts around motion, even without DSC?

I haven't personally had a chance to test out DSC other than perhaps some short experiences, which is why I based my initial comment on googling what other people experienced and how Wikipedia describes it. You pointed me to a study which seems to confirm that DSC is perfectly fine.


>in almost all test cases

Common sense suggests that "visually lossless" means no detectable difference by the naked eye ever, not in "almost all test cases". MP3 is a very old codec, and it's possible that there are still some "killer samples" that can be ABXed by skilled listeners with good equipment even when encoded by a modern version of LAME. A better example of something that could reasonably be called "audibly lossless" might be something like Opus at 160kbps, for which I've seen no evidence of any successful ABX. But even that is usually called "transparent", not "audibly lossless", so not only is "visually lossless" a lie, the name itself is propaganda.


> Common sense suggests that "visually lossless" means no detectable difference by the naked eye ever, not in "almost all test cases".

Common sense suggests no such thing. When you buy a bottle of “water”, it actually has a bunch of stuff in it that isn’t water. How dare they?! When someone says “I’ll be there in 15 minutes”, it is highly unlikely that they will show up in exactly 900 seconds. Such liars! Why are you even meeting them? When people say airplanes are “safe”, you might angrily point out how many people have died, not realizing that “safe” is relative to other things and not an absolute in that context. This is common across basically everything in life. “There are no absolutes.” If you think common sense is to automatically assume every statement that even looks remotely absolute is intended to be taken absolutely… that is not common. Short statements will come off as absolute, when they are just intended to be taken as approximate, but even absolutes are usually meant to be taken as slightly less than absolute.

“Visually lossless” is a description of the by far most common experience with DSC. They’re not describing it as truly lossless, so you know there is some loss occurring. It is natural to assume that in extraordinary circumstances, that loss might be noticeable side by side… but you don’t have a side by side when using a monitor most of the time, so the very lossy human vision system will happily ignore small imperfections.

> so not only is "visually lossless" a lie, the name itself is propaganda.

Your whole comment shows that you don’t understand how communication works. It is “visually lossless” as far as people are concerned. The study shows that! This is not at all what propaganda looks like.

When Apple labeled their iPhone screen a “retina screen” because people would no longer notice the pixels, I suppose you called that a “lie” as well because you could lean in really close or use a microscope? The retina display density achieved its stated goal.

There is literally no point in continuing this discussion when you take such an absolutist position and refuse to consider what alternative communications would look like. How about “99.9% visually lossless”? That would be even more confusing to people.

Communicating complicated concepts succinctly is a lossy process. Language is lossy. As they say, “all models are wrong, but some are useful.”


I understand perfectly that "visually lossless" was chosen to emotionally manipulate people by triggering positive associations with the word "lossless", despite not actually being lossless, or even transparent. Language is lossy, but that does not excuse corporations twisting it further in their attempts to exploit you.


Who is being exploited in this situation? I've never heard of anyone successfully selling a product because DSC is "visually lossless". It's a short explanation for a complicated technology, on the rare occasion that anyone googles DSC to try to understand what it is.

Your continued use of inflammatory and frankly incorrect language isn't helping your case. If this is "exploitative" marketing language that is "lying", you should file a case with some consumer protection body. You have also failed to demonstrate how you would communicate the overwhelming effectiveness of DSC that the study showed.


"High fidelity lossy" would be an honest name.


Sure, that name would have been fine too.

I don’t think “honesty” is an issue at play here either way, as I have discussed in great detail (and with many examples that you surely have encountered), given how people (unfortunately?) communicate in the real world.

If the study had shown something substantially different (or if people online were frequently having bad experiences), I would totally have been onboard with calling it marketing overreach and lies.


DSC doesn't solve the hardware limitation of only being able to drive a single external display on the M1, that's a hardware thing that cannot be changed. You have confused it with DisplayLink, which is basically another graphics card, hence why it "solves" this problem, but the experience is worse because it's CPU-intensive/software rendered.


Good catch. Definitely meant DisplayLink.


I bought and followed the online tutorials about using the DisplayLink docks and whatever else I purchased from Amazon and I couldn't get it to work with 2 external monitors. It isn't straightforward.


Does the 13" MBP support multiple displays?

Sorry- I'm horrible at reading Apple Specs and inferring the capabilities


just the one external screen (two screens total including internal).

https://www.apple.com/macbook-pro-13/specs/

People have gotten around it by connecting additional screens using DisplayLink adapters.


Awesome thanks for the assist! That page makes it clear, I guess I'm actually just horrible at sifting through the marketing to find the spec page :)


DisplayLink is alright-ish for light office work or coding, but not much else.


Can you use two external screens if you disable the internal screen? That's what I do now with a ThinkPad.


No sadly you can't


My 13" 2014 MBP supports 2 mDP + 1 HDMI = total 3 external displays.

Running an external display at 4K@60Hz is possible but not straightforward; it requires patching the Core Graphics framework, or using a 3rd-party boot loader. Newer models do not have this limitation afaik.


Intel != Apple Silicon


Wait, really? I was using 2 external displays alongside the built-in display on my m1 just a few days ago. Or is it a limitation only with m1 MB Airs?


> on my m1 just a few days ago

M1 != M1 Pro/Max/Ultra.

If you have an M1 Pro or M1 Max or M1 Ultra, that is not "[your] m1".

Each chip has significantly different capabilities in a number of aspects. As far as display support goes,

M1 = 1 external display[0]

M1 Pro = 2 external displays

M1 Max = 4 external displays (3 USB-C + 1 HDMI)[1]

[0]: the exception is the M1 Mac Mini, which doesn't have an internal display, so it can use two external displays.

[1]: once again, the desktop version without a built-in monitor can support one additional monitor, so the Mac Studio with M1 Max can support 5 displays.


Is there a technical reason that the M1 only supports a single external monitor (optimized intended experience), or is just market segmentation?


Every GPU on the market supports a limited number of monitors. There are fixed-function (not programmable in a traditional sense) blocks of silicon that are used to support each monitor.

M1’s GPU came equipped to only support the internal monitor and one external monitor… a very slim configuration, but that’s likely influenced by its smartphone processor ancestry. Smartphones don’t need to power a bunch of displays.

The larger M1 chips have bigger GPUs with more of those fixed function blocks.

It isn’t artificial market segmentation at a software level, but it is certainly market segmentation at a hardware level, and something they knew would happen when they designed these chips.

In the end, they were pretty spot on about the market segments. Most people want/need external display support… but one external display is plenty for most people. People who need more are likely to also want more in general, and the higher end options satisfy that.

It still would have been nice for them to upgrade things for M2.


> People who need more are likely to also want more in general, and the higher end options satisfy that.

I disagree. The topic comes up repeatedly whenever Apple Silicon is discussed. It’s my impression that for quite a lot of us the base M1 or M2 would be everything we wish for from a pure performance perspective. Yet the limited display output options are the only thing that force us towards the Pro and higher tiers.

It seems like a deliberate limitation and I don’t like this form of product segmentation.


Thank you for the thoughtful response.

I agree that they were pretty spot on with the market segmentation. I’m one of the folks who doesn’t need more than a single external monitor, and I consider myself a power user when it comes to resource consumption. I just wish the cost of ram would come down, holy moly.


Got it, I thought they were saying it was a limitation of the chip not the specific laptop they had. Thanks for the clarification!


It is a limitation of the chip. The M1 chip and the M1 Pro chip are not the same chip.

The laptop itself has nothing to do with it. If they decided to put an M1 Pro chip into the MacBook Air, it would be able to have 2 external displays.


alright thanks


M1 Ultra = Every display known to man.


Apple probably could support 10 displays off of M1 Ultra, but I guess they decided to leave some displays for the rest of us.


The 14 and 16 inch MacBook Pros support multiple external screens up to 6K. https://www.apple.com/macbook-pro-14-and-16/specs/


That's the case for the M1 Pro and M1 Max. The regular M1 only supports a single external display.


The M2 as well, unfortunately.


Only supports one external display, as opposed to the 14"/16" machines that can do maybe 3 or 4?


They did? Seems to be the same price as M1 MacBook Pro.


"M2 takes the industry-leading performance per watt of M1 even further with an 18 percent faster CPU, a 35 percent more powerful GPU, and a 40 percent faster Neural Engine"

18% faster at the same performance per watt is a nice increase. Interesting to see if this will ever make it to their desktop computers.


> 18% faster at the same performance per watt is a nice increase.

18% faster at the same wattage, which means 18% higher perf/watt.
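
To spell that out, a trivial Python sketch using made-up unit values just to illustrate the relationship:

    def perf_per_watt(performance, watts):
        return performance / watts

    # If performance rises 18% while power draw stays flat,
    # performance-per-watt rises by the same 18%.
    baseline = perf_per_watt(100, 20)          # arbitrary units
    improved = perf_per_watt(100 * 1.18, 20)   # 18% faster, same wattage
    print(improved / baseline - 1)             # 0.18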


Have they confirmed anything about power?


From the presentation, taken from Anandtech's liveblog:

https://images.anandtech.com/doci/17429/34312453.jpg

Same power budget.


Ah, completely missed that slide. Thanks!


Since they never made any specific power claims of the M1, why would you expect them to make any such statements about the M2? You'll have to wait for 3rd party reviews and analysis for that.


GPU wattage seems to be increasing, but that appears to be linearly correlated with the number of cores they're adding. Still a bit of an "Intel Moment" all things considered, but not as bad as it could have been.


The 18% is for multithreaded workloads. Have they said anything about single core perf yet?


No, they didn't reveal single-core performance.

It is commonly assumed that the M1 shares its high-performance Firestorm cores with the A14.

The Geekbench improvement for the A15 over the A14 is about 18% multithreaded and about 10% single-threaded.

It is also very likely that the M2 uses the same Avalanche cores as the A15. So I would suspect that this translates to a roughly 10% increase in single-threaded performance between M2 and M1 as well.

Incidentally, the A15 runs at around 3.2 GHz vs 3 GHz for the A14, so the majority of the speed-up between A14 and A15 comes directly from the increased clock frequency. The M1 runs at 3.2 GHz.
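
A quick back-of-the-envelope in Python, using only the rough figures quoted above, shows how much of that ~10% single-thread gain the clock bump alone accounts for:

    # Rough numbers from this comment, not official specs.
    a14_clock, a15_clock = 3.0, 3.2          # GHz
    single_thread_uplift = 0.10              # ~10% A14 -> A15 (Geekbench)

    clock_uplift = a15_clock / a14_clock - 1                          # ~6.7%
    ipc_uplift = (1 + single_thread_uplift) / (1 + clock_uplift) - 1  # ~3.1%

    print(f"from clock: {clock_uplift:.1%}, per-clock: {ipc_uplift:.1%}")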


I didn't see anything, but inferring from what I can find, it's mostly improvements to the slow cores (Icestorm -> Blizzard). The fast cores (Firestorm -> Avalanche) seem like a very small difference.


I'm not surprised they are keeping the M1 Air alive. That thing is a great price/performance combo in a very light and portable wrapper. It's been $850-$900 multiple times at Costco, Bestbuy or Microcenter.

Also, it's the first computer where getting the upgraded storage (512GB or 1TB) really helps with the low RAM, because the tightly integrated SSD makes swap fast.


The price increases are a kick in the shins though. 20% more for 18% more performance.

The M1 generation looks to be a bit of an "introductory offer" to get people looking at Apple who otherwise wouldn't have... once they have established their mindshare as a performance leader worth considering over x86, they can raise prices back up.


Heh, sure, if that were it. What about the larger and better display? MagSafe? GPU perf (35%)? Better battery life? 50% more memory bandwidth? More ports (2 TB + power)?

M2 starting at $1200 looks pretty nice to me. Avoids many of the corners cut on the competition like: plastic chassis, tiny trackpad, poor fans that get noisier in the first year, poor Intel iGPU, poor battery life, etc.


I agree. The Air used to be the laptop for everyone. Now it has entered into MBP territory in terms of pricing and performance.

What will happen with the M3? Will the base Air start at $1500?

Will the M2 Air drop its price to $999 when the M3 is released?

I'm not saying the M2 Air is not worth its price compared to x86 laptops, but it's ridiculous that the cheapest Apple laptop is way overkill for its intended audience. Even the M1 is already overkill for users that typically spend the majority of their time in a browser or using Office.


The original Macbook Air when introduced had an MSRP of $1800.

They are still selling the M1 Air at $1000.


Yeah that was at launch, but a couple of years later it went down to less than $1000. I bought one new around 2017 for about $800.

For many years it was one of the most popular laptops ever. Popular as in admired and famous, but also popular with ordinary buyers.


The original Air was placed in a completely different segment, against the tiny executive Vaios and the like. People on a budget (like university student me) got the white plastic Macbook back then.


The original MacBook Air was introduced to be extra light; it only later shifted to being the entry-level device, as there used to be the base MacBook for that.


The base price is still fine for the product, but the RAM/SSD upgrade fees are the painful part.


I feel like most of the markup comes from the new industrial design. The previous M1 MBA was essentially a 2018 Retina MBA with an upgraded chip. When the 2018 Retina MBA was introduced, it was also $1,199.


The price/performance for the M1 air is impossible to beat. Anyone who just needs a good laptop at a reasonable(ish) price can grab that and be happy for years.


I have the M1 Air and it is incredible to have such a powerful machine that doesn't blast an annoying fan at all times, and is still thin, light weight, and with a very high resolution display and incredible battery life. I travel frequently so it's the perfect laptop for me. And the best part is there's NO TOUCHBAR. I will absolutely be trading this in for the M2.


They were so careful not to compare it to the M1 Pro!


Why would they? This is the base M2.


The M2 will outperform the M1 Pro variants on single-core perf, since single-core performance is the same across all M1 chips apart from memory bandwidth.


It's hard to imagine needing this honestly. I'm typing this on an M1 air, and even on this chip it's so snappy and quick even on things like larger compilation jobs.

I'm not on the super-power-user end, but imo the price/performance for the air, as well as the form factor seems to be a sweet spot.


I always see people say things like this, but I have an almost fully specced out m1 max and the system grinds to a halt for a few seconds when I do basic tasks like join a zoom meeting from my web browser. Like the mouse will lag, if I’m listening to music it will start to skip or freeze, etc.


I have an M1 Pro (16GB) and don't have these issues. Might be worth checking your system / taking it to Apple for a check.

Only time I got any similar issues were when I was hitting the ram pretty hard, but MUCH better than my i9 32 Gb with the same problem on memory (FUCK YOU DOCKER!)


One thing that came to mind, which may be relevant to you or not, do you have the Whatsapp desktop client installed/running? A few months ago, an update seems to have screwed up something that made my Mac much slower and unresponsive when it was open (my gf and my ex had the same issues with WA, the moment they closed WA it was snappy again).


I do have it installed, but it usually isn't running. I'll uninstall it and see if that helps though. Thanks!


Hmm sounds like maybe it's a bug or some specific software issue.

On the M1 air, I sometimes notice that the battery is draining a little bit quickly, and will check and see that I have like 10 Safari windows open and minimized, with over a hundred tabs in total, but I hadn't noticed any slowdown or performance issues.


Bluetooth devices having trouble?

I suggest you try wired before blaming the CPU.


This happens regardless of the peripherals. I usually have a wired keyboard and a wireless (with a usb dongle) mouse connected through a Caldigit TB4 hub. It is possible it might be some weird macOS/chrome/zoom bug and not necessarily an issue with the CPU itself though.


i'll echo the others with my experience that i have no problems too. i do see this in m1 with high memory pressure though. docker is always suspect. also long-lingering web tabs.


Single threaded performance is very useful for gaming, but I agree that the M1 is so fast that anything faster is just a bonus.


Which games can you even play on a mac?


Lots! On my base model Mac Mini ($700, M1, 8GB RAM, 256GB SSD) I've played League of Legends, Factorio, Stellaris, and Civilization 6. All of them have native offerings for macOS, and the experience isn't too different from Windows.

It's not a great choice for gamers, but for people that only play games casually/occasionally it's a surprisingly good choice.


I myself played Ratchet & Clank 3 on PCSX2 on my Macbook Air M1.


How does it run? I was looking for this, but no benchmarks online.


I enjoyed playing Stellaris on my MacBook M1 pro on a transcontinental flight last night. 4x games work well for passing the time!


Minecraft with shaders and a high-res resource pack.


There's always MUDs!


I was pricing it out; to get 512GB and 16GB of memory it's $1700. You might as well save $25 a week until September and buy the M2 Pro that comes stock with more memory and storage when they inevitably update the 14-inch and sell it for $2000. Better yet, wait 6 months and buy the refurbished M2 Pro for $1700...


> You might as well save $25 a week until September

The best tool is the one you have, not the one that doesn't exist. In September you could then wait for the M2 Max coming in June. Then in June, wait for the M3 in September.

I can't imagine telling a client, "I'll get that video to you around Christmas. I'm waiting for another version of a computer that just came out to come out."

When it's raining, you want an umbrella, not to wait in the rain until someone builds a cafe to hide in.


If you are in the business of returning a video to a client by a deadline, chances are you aren't working on it with this computer placed at the bottom of the lineup with two ports.


Buy $1700 worth of Apple stock and hold it for 12 months instead.


Wish I could tell my young adult self to do that in the past. If I'd bought Apple stock instead of a new mac in '93 and kept it, I'd be worth about $3M


can anyone knowledgeable please oblige us with a comparison?


M2 will be faster in single core, M1 Pro will be faster in multi core.

Taking Apple's 20% claim at face value:

Geekbench Single Core - M1, M1 Pro: 1700, M2: 2040

Geekbench Multi Core - M1: 7700, M1 Pro 8-core: 9000, M2: 9200, M1 Pro 10-core: 12400

Of course we'll see M2 Pro/Max sooner or later, which will presumably match M2 on single core just like the previous gen.
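
For what it's worth, those estimates are just the quoted M1 baselines scaled by the claimed uplift; a throwaway sketch (approximate Geekbench figures from this thread, not measurements):

    m1_single, m1_multi = 1700, 7700        # approximate M1 Geekbench scores
    m1_pro_10core_multi = 12400
    claimed_uplift = 0.20                   # taking Apple's claim at face value

    m2_single_est = m1_single * (1 + claimed_uplift)   # ~2040
    m2_multi_est = m1_multi * (1 + claimed_uplift)     # ~9240, close to the 9200 above

    # The 10-core M1 Pro should still lead comfortably in multi-core.
    print(m2_single_est, m2_multi_est, m2_multi_est < m1_pro_10core_multi)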


Apple's 20% was for multi-core.


The M1 and M2 have the same number of cores. I suppose the fabric could have been improved for the M2.


Multi core uplift will be a bit more than single core, because the “little” cores improved more than the big ones. 18% multi core uplift, but only 12-15% single core. Weak cores might have increased by 25% or so but they’re still much slower than the big cores so don’t affect “single core” performance.


>Taking Apple's 20% claim at face value:

Which you shouldn't. They are, once again, using performance per watt. Nothing guarantees that it even runs at the same wattage.


The CPU cores of the M1 and M1 Pro are exactly the same (just with different mixes, and different boosting behaviour on the E cores).

If the M2 has 18% higher perf/watt and keeps the same TDP / power draw, that's 18% higher performance (it remains to be seen whether that holds across the board for the E-cores and P-cores). That would put it about 25% below the 6+2 M1 Pro, and about 70% below the 8+2 M1 Pro, at least looking purely at the CPU.


14" MBP gets the M1 Pro/Max, not the base-tier M1.

The M1 Pro is a 6+2 configuration, so it will have a little bit of an edge in core configuration, but Apple claims 18% faster on this generation, which might cancel out that edge a little bit (I'm guessing slightly slower still, but close). The M1 Pro does have a 40% larger GPU, but the M2 is 35% faster (again, taking Apple at face value), so it should again be similar-ish in GPU performance, very slightly slower (135/140 = 96% as fast).

The big difference is still that M1 Pro gets support for much larger memory, and the 14" has a much better port configuration, the 13" is basically still the same old chassis with USB-C and the touch bar, just updated with a newer processor.


M1 Pro is either 6+2 or 8+2 depending on which model you select.


M1 Pro is 35B transistors, M2 is 20B, fyi.


Ouch, that MacBook Pro 13" chassis makes me hurt. The Air actually has a better body; that's disappointing. The display size, MagSafe and function keys are all better on the Air, but it doesn't have a fan for sustained performance :'(


As far as I can tell, there is almost no reason for the 13” MBP to exist.


Especially with the single external display limitation. It's basically just a MacBook Air with fans.


Wait till software catches up to the point where the fan needs to turn on.


I can only imagine there is a yet-to-be-announced replacement in the works. Perhaps the 16" M2, or the return of a "MacBook" line. Seems like this will stop existing at some point.


Lots of people like computers that have "pro" in the name…


> Ouch, that MacBook Pro 13" chassis makes me hurt.

It is just the previous MacBook Pro 13" with the M1 replaced with an M2. The Air is better in all regards if I read the specs correctly.


And the touchbar limps on


Confounding why this is continuing. Is there that much remaining inventory?


That would be the only reason I can imagine.


That’s what the MBP 14/16 are for. Unless you’re editing video though, it’s unlikely you’ll notice a difference.


I can't help but wonder what Microsoft's answer to Apple Silicon will be going forward. They don't really make hardware, but selling Windows laptops gets harder and harder the further Apple gets ahead. It seems inevitable that there needs to be some ARM-based Windows laptops to compete in perf/watt to the M1/M2 but I don't know what company can provide an ARM chip that competes with Apple at this point.


Their answer is this: https://arstechnica.com/gadgets/2022/05/microsoft-will-boost...

They will need to get developers up to speed porting their apps to ARM before they are even in a position to re-boot their Windows ARM strategy.

But this is a multi-year journey which is likely to give Intel/AMD time to produce something more competitive.


Even if that happens, I'm not sure how much appeal an ARM version of Windows actually has. Right now, the only things keeping me on Windows are Visual Studio and my huge library of legacy software (ahem, video games). Microsoft's x86 emulation on ARM is downright atrocious. A native ARM version of Visual Studio could keep me productive, but I'm not about to spend money on a new computer that runs all my favorite old games noticeably worse than my current machine.

If I buy an ARM machine any time in the next 5 years, it will almost certainly run macOS or Linux, with Windows relegated to an x86 box that I use for gaming.


I've got my eyes on the ARM64 build of full Visual Studio coming in "the next few weeks".

https://visualstudiomagazine.com/articles/2022/05/27/news-ro...

I've been using VS 2022 on "Windows 11 for ARM" inside Parallels Desktop on my M1 Max MBP, and it's just barely usable as VS 2022 is 64-bit. JetBrains Rider is pretty good on macOS, and "VS 2022 for Mac" is coming along now, but full VS would be nice.


Microsoft has made three separate attempts at Windows on ARM (Windows RT, Windows 10 for ARM, Windows 10 S). Whether because each attempt produced a more locked-down platform than standard Windows, or because people prefer compatibility with their existing software over battery life, or because non-M1 ARM chips were not competitive with Intel/AMD even before emulation overhead, none of these attempts took off.


The full Windows 10 has run on ARM64 since 2017. See here for all architectures of Windows. [1]

And the deal is that current ARM processors have higher IPC than even the latest Intel and AMD processors and are much more diverse. The biggest ARM CPUs have 128 cores and higher multi-threaded performance than any CPU from Intel/AMD, and a Cortex-X2 has higher IPC than any Intel/AMD core.

1: https://en.wikipedia.org/wiki/List_of_Microsoft_Windows_vers...


True, at that point Linux becomes very competitive.


How is the Linux on Arm situation currently? I mean specifically for desktop Linux?


Pretty much every distribution and software package is available in ARM64. You will not miss anything.


I think the popularity of the Raspberry Pi has sorted out most desktop use-cases.


Asahi Linux has a port for the M1. Accelerated 3D isn't supported yet, but it recently hit a milestone of a working rendered triangle.

So not yet, but it seems pretty close. Marcan has a Patreon if you want to support it.


Is it actually supported to run Asahi linux on an M-series mac? Or is it kind of a sketchy jailbreak type situation?


Yes, fully supported. No jailbreak/security attack required. In fact Apple even fixed bugs in the bootloader that Asahi Linux found. They seem to be intentionally enabling other OSes.


Awesome! But I understand the bootloader is still locked down on iPad, even though it's the same SOC, correct?


Sadly, yes I believe the iPads are locked down and would require a jailbreak.


As far as I can tell, the ARM hardware linked is vastly inferior to the current M1 hardware, let alone M2.

> But this is a multi-year journey which is likely to give Intel/AMD time to produce something more competitive.

And during this time Apple is going to release M2 Pro/Max, M3, etc. I just have a hard time seeing how Intel/AMD catchup in the laptop space.


Well, fabs used to be hugely important: not only did each generation halve in linear size (4x the transistors per area), but each shrink was a big win on clock speed and power use. These shrinks happened often, roughly every 18 months.

These days the shrinks are smaller, i.e. 5nm -> 4nm -> 3nm, but each gen lasts longer, and provides very modest improvements in power and clock speed. They are also coming out in ever slower release cycles.

So now the competition has more time to catch up, and less of a disadvantage if they are a process behind. TSMC is currently leading; Apple, Nvidia, AMD, and others are bidding for the latest/greatest, while Samsung and Intel try to close the gap with their fabs.

Apple has an advantage of doing several generations in phones/tablets before bringing out the M1. Additionally they have an architecture license, so they do custom cores, not just what ARM is offering. This allowed them to tune their designs, use engineers from various companies they acquired to tune their chips, and get rid of the cruft, like 32 bit compatibility.

With all that said, I expect Apple's perf/watt advantage to decrease over time. What does seem somewhat unique is that they have built, in a relatively small, power-efficient, and inexpensive package (compared to similar functionality), 128-, 256-, 512-, and 1024-bit-wide memory interfaces. Sure, you could build a dual-socket Epyc with 16 DIMMs that likely burns north of hundreds of watts and takes at least 1 rack unit, or you could buy an MBP M1 Max. To match the M1 Ultra you'd have to switch to some exotic CPUs that use HBM, sold by companies that typically send 3-6 salespeople in suits before revealing their prices.


It was hard to imagine Intel catching up to AMD pre C2D.


And AMD to intel pre Ryzen.

It takes a good 5 years of doing everything right though.


Too bad 2012 was a turbulent period for Microsoft; Windows 8 for phone was a concrete seed for a unified development environment with a unified API.

A series of strategic and communication mistakes kinda wasted the shot, and by the time they finally fixed the desktop side of the experience it was too little, too late.


Ooh, I want one of these. I've long wished that Microsoft would release its own small-form-factor desktop to compete with the Mac mini.


I've been running an Odyssey from Seeed Studio for a while now, as an in-house dev server running SQL Server and IIS. It's the form factor you want, and it's been flawless (even though it's underpowered for what I'm doing with it.)

https://www.seeedstudio.com/catalogsearch/result/?q=win10


I've long wished that Microsoft would port Windows on ARM to the Raspberry Pi!



Which is different from Windows on ARM


Why? Windows seems like a terrible option for minimal hardware?


Why? Unofficial ports work OK. Besides, imagine how efficient Windows on ARM software would be if it were developed on an RPi.


Because using poor/slow tools means we end up with better end results? In my experience using poor tools ends up with higher costs. I don't see how higher costs are going to result in better software in the end.


Does Windows not have a Rosetta equivalent??


For anyone who saw this: a few other comments say that they do, but obviously I don't have details.

I was very confused that they wouldn't have it; it just seemed bizarre to ever ship Windows on ARM without it.


> selling Windows laptops gets harder and harder the further Apple gets ahead

That's Apple's marketing, but the 12th Gen P chips are perfectly capable of keeping up with Apple on the performance side, and AMD is likely to be able to compete on the power consumption side as well. Yes, x86 is likely to never match ARM on battery life, but I believe they can be close enough for it not to be an issue.


It would be nice if they could get close in terms of battery. Every Windows laptop I've ever had, including my current one I use for work, has had a battery life of around 3 or 4 hours. Compared to my MacBook Air, which can easily seem to go over 24 hours.


I have an older work laptop but a nearly brand new (comparatively) battery that's about a year, maybe 18 months old. If I'm actually using my laptop, it will last maybe an hour or two. I might get it to last 3-4 if I just have email up with the screen dimmed.

Not sure if it's Windows, too much work surveillance-ware, or just HP being garbage, but my M1 Mac will last twice as long as I've ever needed it to without getting plugged in. Even my older Intel MBP lasts most of the day.


I am honestly a bit shocked at the low numbers you and parent are reporting. I have not used Windows in a long time but Linux is generally assumed to be less power efficient on laptops and yet I have no problem getting 7-9 hours out of my year old ZenBook 13 OLED.

That's still nowhere near what you get on a MacBook Air but perfectly usable imo.


My experience with Linux, Ubuntu and Fedora on HP EliteBook and Dell XPS13 respectively is similar to the Windows numbers posted above.

My 2014 MBP13 still gets better life than my daily driver XPS13 9370.


That's exactly the problem though. One OEM can match Apple in performance, and the other in power, but none in both power and performance.



My point was AMD's Ryzen 7000 is likely to be able to do both really soon, while Intel's 12th gen can match/even exceed it performance wise now.


You forgot about price. There is a lot of low-end work that doesn't even need an Air, for $500.


Microsoft is a company full of bureaucrats who don't care about their product. Apple is going to take over the computing world here shortly.


Please talk to anyone outside of the start-up/tech world and ask them about the technology they use. A majority don't give a toss about M1 or M2 or ARM vs. x86 or anything else that seems to get so many in the tech world so excited. They care about Excel, they care about backwards compatibility, they care about centralized management.

Apple _may_ take over the consumer space but this will be more due to the shift from desktop/laptop computing to phones and tablets than anything with the M* series of processors.


Consumers do care about things like battery life. I imagine most consumers would prefer to stick with what they know (windows), but as the battery life/performance gap grows, people will be more likely to make the switch.


> Consumers do care about things like battery life.

For laptops, less than you'd think. A huge chunk of people buy laptops so they can work at the dinner table and then put their computer away easily when it's time for the family dinner.

Source: I spent about five years selling laptops to people. That was a while ago, but I don't think much changed here. If anything, things changed the other way (battery life is even less important for laptops than it was) since a lot of people also have a smartphone or tablet.

And as battery life gets longer, there are diminishing returns as well. The difference between 1 hour and 4 hours is huge. The difference between 4 and 8 hours is pretty large. After that? Less so.

In my experience noise and heat (or rather, lack thereof) are more important, although also not hugely so for a lot of people, just more so than battery life.


Consumers buy €400 17” monstrosities with numpads and run them plugged in.


To be honest I'd kill for a revival of the MacBook Pro 17".

Heck I'd love portable 21" and even 24" models.


I agree about the CPU architecture, people don't give a shit.

However, I think MS/Intel will start losing also corporate space. With the staffing problems, companies are looking for ways to score cheap points, and I'm starting to see "free choice of a laptop, including MacBook" as one of the benefits even in some big corps.


Thankfully we don't need to trust in meaningless anecdotes about what those in the 'real world' do or don't know.

The facts on the ground are that Apple's Mac sales are rapidly growing and in the last quarter half of all Mac buyers were new. That clearly indicates that something new to the Mac platform is attracting users.

So whether they know specifically about the M1 or not, they do know that Macs have better characteristics than in previous years, and the M1 is responsible for that.

And given that the M1 has been heavily advertised in all Mac marketing, logically at least some proportion of users do know about it and do see it as a key differentiator.


I’m working for a big bank. They now offer Mac workstations to be able to hire the best devs.

I would have never expected a Unix workstation in such a corporate setup when I started 15 years ago.


It's a gimmick. The best devs don't use macOS, lol. There's probably some cohort of frontend devs that try to look "cool." But any dev doesn't fall for that fluff. FFS, Apple just announced memory swapping as a feature on their iPad, a feature that's literally been around since the 1970s. That's laughable and sad; any dev worth their salt would know this.


I agree that the world basically runs on excel, but given that, the world cares about excel’s performance. Especially as spreadsheets are only getting bigger.

And then there’s this: https://support.microsoft.com/en-us/office/use-office-for-ma...


Office for Mac is a non-starter for power users due to the lack of Alt-key accelerators. There are countless other missing features, but that alone is enough to never make the switch


Interesting that you state there are missing features.

I remember a speech given by one of the leaders of Mac development at Microsoft saying that new features are tested out on the Mac first, and if they work out, they're brought into the Windows version.

Is that no longer the case?


The problem is that Alt-key accelerators are encouraged by Windows across the entire OS and it has been the case since the very first version of Excel (it really predates Excel)

I love the approach of testing new features on the Mac first, but it isn't sufficient since the Mac version was never brought to 100% parity with the Windows version, which means some of the preexisting features would forever be missing from the Mac.


No chance, unfortunately. Windows 11 has pop-up "notifications" that are basically ads all over. Unnecessarily hard to clean up / customize in a business setting.

If we could deploy Apple products in a business environment we would in a heartbeat. But Microsoft just is better here currently on a lot of fronts - the last time I chased my tail here it didn't pay off.

If Apple wants to compete for the business market I think they should! We need first-class user account management that INTEGRATES with other stuff (i.e., Google email, etc.). Right now you can federate from Active Directory to almost anything (SonicWall/VPN for remote users, WiFi for on-prem user devices, vSphere for VM management, etc.). If you sync to Google you can then use Google one-click sign-ons everywhere on the web SaaS side.

We then need office running perfectly.

Then we'd probably do our legacy apps on some VMs and chrome for SAAS apps.

We also need to be able to run macOS virtually. We have remote users who talk to on-prem VMs, which separates their personal and work stuff; we can lock down and monitor the on-prem VMs, and they can watch Netflix with no worries using their home machine. How does this work with Apple? It's easy with Windows.

I think there would be some demand from smaller companies to make the switch if there were a solution that allowed what folks are looking for: migration to the cloud as offices go virtual, with controlled "desktops" delivered to users, while still allowing in-office / warehouse / factory deployments.


My cut n' paste pet peeve example of why macOS seems like a "toy" and not for serious business use:

The file save dialog box has this unbelievable limit of 38 viewable characters! I regularly have to deal with 50+ character naming conventions where the first 38 characters are the same among many files. It is a huge hassle of cursor navigation that is so unnecessary as I am looking at all this unused real estate in the dialog box.


I agree that this particular aspect of the dialog box is bad. But if something as minor as this keeps you off an entire platform, it sounds like making excuses.

I save ~50 - ~100 character filenames all the time. I even cut, copy, and paste bits of them in that little box. It doesn't feel like a big deal to me.

But yeah, it's the little things like this that belie Apple's reputation for attention to detail.


> If we could deploy Apple products in a business environment we would in a heartbeat.

I'm not gonna pretend to know about how IT works in business, but most employees at big tech companies do all their work on Macs, so it's certainly possible in some cases.


Is Active Directory still LDAP compliant? Embraced and extended or compliant?

OpenLDAP should be able to get you most of the way there. Stuff like CIFS allows for mountable shares, and roaming profiles are easily handled by LDAP login and a mounted /home.

Oh wait, then you could use actual FOSS systems. Sorry, I forgot that this was about Apple. OK, so they can license AD, giving M$ a bone in the process.
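To make that concrete, a minimal hypothetical sketch of the "LDAP login plus mounted /home" setup on a Linux client, using sssd for authentication and autofs for the home share; every hostname, base DN, and share path here is a placeholder:

    # /etc/sssd/sssd.conf -- LDAP users and authentication
    [sssd]
    services = nss, pam
    domains = example

    [domain/example]
    id_provider = ldap
    auth_provider = ldap
    ldap_uri = ldap://ldap.example.com
    ldap_search_base = dc=example,dc=com

    # /etc/auto.master -- hand /home over to autofs
    /home  /etc/auto.home

    # /etc/auto.home -- mount each user's CIFS share on demand ('&' expands to the username)
    *  -fstype=cifs,rw,sec=krb5  ://fileserver.example.com/home/&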


I actually used to do this. Samba on Mac used to be great, so you could do a good hybrid setup. And once you had Samba working, your Linux users could more or less jump in if they could self-support.

I think Samba went to GPLv3 and updates for it on the Mac seemed to stop entirely cold, which killed this as the easy integration glue. Does anyone remember the details? This great integration point went away and basically you end up tilting at windmills.


You realise that something like 99% of all LDAP authentications in the world go through Active Directory, right?

This is like someone screaming that Linux is a toy because it’s not really UNIX unlike SCO.


I have worked on big companies that do pretty much all of this on Mac. I agree that it might be harder to do than on Windows, because there is so much industry know-how on the MS side. But there is no real technical barrier for this to happen.


Provide a service similar to Active Directory? Absolutely, that's what is needed from Apple, Red Hat, Canonical, etc.

Depend in any way on a Google Account for anything critical? That's something I oppose with all my will.


In a business context a google account requirement would be fine. Microsoft is basically going there to get folks to move AD into the Azure cloud. We're feeling a ton of a pressure towards that, and entitlements for Office etc are being delivered that way (so you end up with a mini AD instance in cloud already).


Virtualizing macOS for enterprise is a good point. I wonder if IBM/Apple want to do this.


Microsoft cares about their B2B products. While end users complain about Windows being a bloated mess, corporations still see no better alternative platform for deploying and managing a fleet of thousands (or tens of thousands) of machines.


Just a few weeks ago I replaced my Surface Laptop 3 with a M1 MacBook and couldn't agree more regarding hardware. I can't speak for any xbox branded stuff, but any MS-branded computer I've ever owned has been trash. Microsoft might be terrible at this hardware business, but they do have a powerful presence in the developer & business community.

I still feel like Microsoft is the strongest software company on earth. Consider that not even the confines of this M1 MacBook prevent me from being able to compile & run my .NET apps without modification. Apple's hypothetical hegemony does not cross over in the same way.

Until Apple can get me to look at their Xcode offerings and think "wow fuck visual studio, GitHub, et. al.", I do not think their takeover of the computing world will begin.


I have to disagree on this. Microsoft has gone out of their way to support their legacy software on older systems, and it's a huge reason companies in the IT and IoT sector have stayed with them all these years.


It's worse than that. They're busy turning off their users with dark patterns, terrible UX, ads and spam in the OS, and endless amounts of unnecessary telemetry.


I just had to hack/patch windows 11 in order to bring back "never combine taskbar windows" functionality which existed in windows 10. I am strongly considering switching over at this point. Removal of "never combine" is such a productivity kill that it baffles me how this thing rolled out at all. Who took over the wheel over at Microsoft and who left, that made this major breaking change take place?


Oof - I understand your gripe completely, Win 11 is downright perplexing with some of this stuff, but if you're someone who wants the "Never combine..." option, you'll probably hate MacOS dock, the way window and app switching works, lack of any options there, and general "We know better than our users" mentality all over the place...


"Grass is always greener" effect. If you think there isn't weird UI shit on MacOS...


Thanks for the heads up. I'm never upgrading if I can help it...


And yet, I and many of my friends will keep using Windows because the third-party Windows screen readers are better than macOS's VoiceOver in many ways. I have no doubt that other users have their own favorite (edit: or essential) third-party tools that keep them on Windows.


It's a false dichotomy. You could say the same about either company. Those are the inevitable consequences of proprietary software and vendor lock-in

(my original comment was some rhetorical question, I edited it to be more direct and less passive-agressive)


Microsoft.

They had some promising years but I always sensed a struggle in the wheelhouse.

Now they are back to forcing Edge on people, ads on login screen and in the Start menu are their new inventions and their store is almost as broken as ever and most importantly hard earned trust flew out the window in the process.


Apple forces safari on users in iOS, has icloud ads and integrations built into the OS, and sells devices with locked down bootloaders/filesystems that don't let you sideload your own programs.

Who is the bigger threat here? The real threat to user freedom is the tribalism of picking the "lesser evil" when there are workable non-evil solutions like linux.


It doesn't force Safari. Chrome is absolutely allowed to create a browser and track users and monetize them on iOS. They just have to use the same rendering engine.

I'm not Apple's greatest fan (see my latest comment), but there is a major difference between iCloud or OneDrive being pre-installed, both of which are OK with me, and Candy Crush showing up in the Start menu on my work laptop or some stupid game altering my login screen, again on my work laptop.

And yes, I too am a Linux user.

Why choose between various dumb and evil options if nice is available? (I know, some people get as mad at font problems and alignment on Linux as I get on microlagging on Windows and boneheaded CMD-TAB on Mac, but each to their own.)


No, you can't. There are no dark patterns, ads, or spam in macOS. The worst you could say is that it has "terrible UX." I would then respond: compared to what?

In my view, the only desktop-grade OS I prefer over the modern Mac is MacOS 9. It was much easier to use and understand from top to bottom. On the other hand, it lacked a lot of features I've come to take for granted (pre-emptive multitasking, multithreading, protected memory, support for modern hardware, gestures, etc).

I do really miss the spatial Finder though.


There are plenty of dark patterns in macOS. For example, macOS will trick users into thinking that the apps they want to use are either broken or malicious if developers didn't pay Apple $100 a year and Notarize apps. macOS has increasingly become a platform to sell iCloud subscriptions, as well.
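For context, this is roughly the workflow behind that $100/year (a sketch only; the identity string, keychain profile name, and app paths are placeholders):

    # Sign with a paid Developer ID certificate, then notarize and staple the ticket
    codesign --force --options runtime \
        --sign "Developer ID Application: Example Corp (TEAMID1234)" MyApp.app
    ditto -c -k --keepParent MyApp.app MyApp.zip
    xcrun notarytool submit MyApp.zip --keychain-profile "notary-profile" --wait
    xcrun stapler staple MyApp.app

Skip those steps and Gatekeeper shows users a warning along the lines of "can't be opened because the developer cannot be verified."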


No "dark" patterns he says. Even after all the revelations, iFads just keep mindlessly worship Crapple. When in reality:

https://www.scss.tcd.ie/doug.leith/apple_google.pdf

"iOS sends the MAC addresses of nearby devices, e.g. other handsets and the home gateway, to Apple together with their GPS location. Users have no opt out from this and currently there are few, if any, realistic options for preventing this data sharing."

Power corrupts and when one company wields too much of it, shit will hit the fan.


I think the argument here is that Apple is better in this area than Microsoft or Google, not that Apple is great.

Better than Microsoft or Google on privacy is not a high bar.


I just gave you a matter-of-fact statement showing where Apple sends private information to their servers, and your response is going to be that "Apple is better." Wow, you are an Apple sycophant. As George Carlin said: "Imagine a person of average intelligence, and then know that there are even dumber people than that." Why would someone be so compelled to defend a corporation, even when it's clearly doing so many things wrong? Slave labor... check. Monopolistic practices... check. Shady deals... check.


Please purchase iCloud Subscription to backup this comment.


Yes Apple does push iCloud a bit but it seems fairly simple to opt out and after you do it stops bugging you, or at least that has been my experience.


I'm not even sure which company they're accusing of having those faults.


Not to mention, up until a few years ago, most PCs did not come with TPMs, so they can't run Windows 11. And Windows 10 won't get security patches after 2025.

I built my computer in 2017, and it's still very capable of running modern games, and in three years it will still be perfectly fine. But I won't be able to run Windows 11 unless I do weird hacks and workarounds, or try to source a TPM that works with my motherboard.


Yes, Apple would have a difficult job displacing MS, but it seems that MS is set on helping them. I mean, who doesn't want ads on their work computer? /s


The theory some people have is that Qualcomm's acquisition of Nuvia last year was their attempt to get their hands on some desktop-class CPU cores (Nuvia was originally aiming for the server market), and Microsoft has largely partnered with Qualcomm on all their previous offerings. So that might be their saving grace if they can actually materialize something in the next year.

But I agree. Apple is pulling ahead a decent amount here and likely will stay in that leading position for a while, like they did in the phone space, and that makes all the competitors that much less appealing.


Nuvia by all accounts has an excellent team. IIRC, Qualcomm has redirected their efforts to laptop SoCs.


Yeah, I have no doubt they can make an excellent core based on what I saw; it's just that there's a limited timeline before your competitors make their move, and Apple is very much moving right now. Hopefully they'll have something released within the next ~6-10 months.


I see Microsoft as a SaaS company now, heavy on cloud and Azure. Apple is still a products company.


Supposedly Qualcomm will have an M1-class laptop chip ready at the end of 2023.

That timeframe does not inspire much confidence in me, seeing as it is three whole years after M1-based products first hit store shelves.


A quick Google search indicates Windows still holds a 74% market share. Apple has a long way to go before they are really crunching on Windows in the general market. Hardware superiority does not guarantee success; for many people, what they are already comfortable with is fine.


Windows may still dominate, but 75% is far below the 90% it was 10 years ago, while MacOS has nearly doubled in the same timeframe.

As someone who can remember this never changing, that's a pretty steep slope...


Oh it is far from nothing to be certain. And as Netflix's recent loss of subscribers and subsequent drop in stock price showed nothing is forever. But I'd still need to see the drop continue for a bit longer before I full on expect Microsoft to be in trouble.

Mind you, I would like to see them follow in Apple's footsteps on the train the M1 is creating. It certainly makes it FEEL like there is more runway down this path than Intel's, with caveats for potential hardware vulnerabilities like Spectre that simply haven't been found yet on Apple silicon to inhibit optimizations.


I think at this point Windows market share could go to 0, and while it'd hurt, with Office 365, Azure, Xbox, etc., I think Microsoft is sufficiently diversified to survive that.


I assume what you refer to is this

https://www.statista.com/statistics/218089/global-market-sha...

If you look at the trend, macOS is gaining worldwide market share while Windows is steadily dropping.

And macOS share in USA has already reached close to 25%.


Right, but for Apple to gain share in the rest of the world like they have in the US, they need to drop prices, and that'd hurt their margins, including in the US if enough people buy cheaper foreign Macs (which is definitely a thing; in my Western European country there are a couple of somewhat popular retailers selling imported US laptops which are cheaper than local SKUs).


It's behind a paywall for me.


> Windows still holds a 74% market share

Note that is for desktop PCs: many people don’t own a PC/laptop so the market share is far far lower than that, especially outside of rich countries. Microsoft is now primarily just business software?


According to StatCounter, macOS only has 15% of the global desktop market share.

https://gs.statcounter.com/os-market-share/desktop/worldwide


I'm not convinced that Apple's advantage is due to ARM vs x86. I think it has more to do with Apple's exclusive rights to TSMC's most advanced process. After all, Apple is also beating Qualcomm's ARM Android CPUs.


Apple's advantage is manifold. They have an in-house chip design firm that is very talented and an architecture license for ARM. So they can make CPUs that do whatever they need. They also build the OS so can take advantage of new CPU and GPU features or go the other way and request hardware features to support software designs.

They're also using essentially the same cores between their mobile and desktop products. Apple is their own ARM ecosystem. An improved core for the A-series chips is an automatic improvement for the next M-series chip. Most arm licensees have to wait for ARM to come out with new cores.

The fact Apple can book the initial runs of any of TSMC's process nodes is just one of their many advantages over other ARM and x86 manufacturers.


Yeah, that's a good point. There's definitely something shady and untold about this whole thing, and that could explain it. Apple has deep pockets, and considering they have done shady deals (like the Google default search engine) this could be another one of those.


The ISA does not matter in a CPU design.

But the process node is also not the main reason.

What matters is only microarchitecture. And Apple has by far the most performant microarchitecture design of all CPUs.


ISA does matter in a CPU design and CPU performance:

- Variable instruction sizes make the front-end more complex and limit the decoding width;

- Delay slots make the superscalar front-end more complex;

- Page size limits VIPT L1 cache size;

- Dedicated SIMD/FP architectural registers allow dedicated SIMD/FP physical register files;

...

The choice and design of the ISA is extremely important; it's hard to argue that the ARM ISA has no impact on M1&M2 performance.

But the ISA choice is obviously not enough to explain the whole performance of the M1&M2. Likewise, the manufacturing process cannot fully explain the performance of the M1&M2.

The Apple microarchitecture is by far the most performant and efficient of all high-end superscalar CPU.

But be careful with simplistic explanations: the microarchitecture is always constrained by the ISA/architecture, and the x86 ISA has some flaws that can affect the microarchitecture (at least on power consumption).
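To make the page-size point concrete: a virtually indexed, physically tagged (VIPT) L1 can avoid aliasing only if its index bits fit inside the page offset, so its maximum size is page size times associativity. A quick sketch (the 8-way associativity is just an illustrative value):

    # Max alias-free VIPT L1 size = page size * associativity
    def max_vipt_l1_kib(page_kib, ways):
        return page_kib * ways

    print(max_vipt_l1_kib(4, 8))    # 32  -> x86 with 4 KiB pages
    print(max_vipt_l1_kib(16, 8))   # 128 -> AArch64 with the 16 KiB pages Apple uses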


Mike Clark, the architect of Zen, literally says in [1] that the ISA doesn't matter and that it's all about the microarchitecture.

Quote: “Although I've worked on x86 obviously for 28 years, it's just an ISA, and you can build a low-power design or a high-performance out any ISA. I mean, ISA does matter, but it's not the main component - you can change the ISA if you need some special instructions to do stuff, but really the microarchitecture is in a lot of ways independent of the ISA. There are some interesting quirks in the different ISAs, but at the end of the day, it's really about microarchitecture.”

In the end, the ISA is such a small part of the design of a modern high-performance CPU that its impact is almost negligible. It physically impacts only the decode unit, and the decode unit is only a few percent of the die area. Feel free to listen to the whole interview to get a feeling for it.

1: https://youtu.be/3vyNzgOP5yw


Qualcomm hasn't made a single Qualcomm core since Apple released their first aarch64 SoC. Qualcomm had zero competition and decided to not really work on anything. Apple blindsided it with its very competitive aarch64 cores, Qualcomm had nothing to show, so they switched to ARM's core design.

Customers kept paying Qualcomm for their SoC with ARM designed cores, so once again, Qualcomm had no reason to actually do anything but sit on their patents.

Intel had a similar story: since Sandy Bridge the "x86_64" part of the CPU barely changed; most of the performance gain came from a better process, more custom instructions (AVX2, etc.), and higher TDP (since Ryzen).

It's not ARM vs x86, it's Apple ARM cores vs everyone elses cores.


Continue hoping that people who don't want to run MacOS still won't want to?

I don't think they have any chance to match Apple in terms of efficiency while buying third party chips. The advantage comes from controlling the whole stack I think. Apple knows exactly what accelerators will be available for each generation, and their communication between hardware and software folks is presumably much tighter.

Is the Wintel laptop/Macbook gap even that much larger than the Android/iPhone gap?

The market for non-Apple devices is, I think, pretty large.


Microsoft has ported Windows and Office to ARM for a decade, and they have both x86-to-arm (Windows 10) and x64-to-arm (Windows 11) translation technologies. They also have ARM-based Surface devices in the Microsoft Store (looks like the Surface Pro X is the current model).

That said, the big business is still big business: Azure (Cloud), Office, and device management (MDM) / active directory are big focuses even in a heterogeneous computing environment that includes Chromebooks and Macs.


It's not clear if the ISA difference is so meaningful; perhaps it's only a small part of the performance boost. Don't forget that Apple moved from PPC to x86 to get better perf/watt, and the PPC ISA is closer to ARM than x86.

Intel or AMD back on their feet can probably match Apple in perf/watt. And I guess they are the closest competitors in the PC market.


I thought Apple switched to x86 because two console companies were buying most of the PPC fab output? I distinctly remember that being the reasoning. Until Ryzen hit with the 3000 series, x86 didn't have better perf/watt than PPC, at a glance.

In the 10 years between my 40 core HP server's release and the Ryzen 5950x's 16 cores, performance increased ~10%, but the TDP of the 5950x is 10% of the quad xeons in the HP. This is ignoring the fact that a single 5950x cost less in any market than a single xeon in that HP upon release.

Does anyone else remember the Cavium ThunderX processors? Whatever happened to those? The perf/watt on those was supposed to be outstanding...


Simply getting functional Windows on ARM would be helpful. The last Surface ARM attempt went poorly.


Windows on ARM is basically completely different at this point, it's perfectly functional; they have x64 emulation as of recently and it just uses normal UEFI for booting, just like most other normal laptops. Honestly the emulation is pretty good even if it's not as fast as Rosetta2, and they even now have a new 64-bit Windows ABI and calling convention that allows developers to incrementally port their applications to AArch64 on an individual DLL-by-DLL basis (so the emulator can handle transitioning between an x64 app and an AArch64 DLL, or the other way around.) They aren't just doing nothing.
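As a sketch of what that incremental porting looks like in practice (assuming MSVC's Arm64EC support; the file names here are made up): you rebuild just a hot DLL for Arm64EC while the main executable stays x64, and the emulator handles the transitions at the DLL boundary.

    rem Build only the hot-path DLL as Arm64EC; the x64 .exe keeps loading it by name
    cl /arm64EC /LD hotpath.c
    rem The unchanged x64 application continues to run under emulation
    myapp_x64.exe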

The biggest problem is the hardware: you're basically just buying a poorly performing laptop with probably lagging Linux support if you get tired of Windows, and you could just buy an M1 Macbook and get superior performance and battery life for the same cost, and you can even just run Windows on that using Parallels and still get good performance. The AArch64 laptop market is mostly just Qualcomm processors and Apple, and if you actually care about the performance profile, there's basically no comparison between the two right now with current offerings; the Mac is the winner, and you can even run Linux on it.


None of the major manufacturers are going to ship Linux to consumers. Having a version of Windows that actually has apps that they can ship means now building the hardware is worth it.


It's not as fast as Rosetta2 because M1 is about 2-3 times faster than the Qualcomm Microsoft is using


I have faith in MS software development - I believe they can get windows on ARM working - but as far as I can tell, all non-Apple ARM hardware implementations are significantly lacking in the laptop space. There are vendors making decent phone chips, but I've seen no indication of the ability to scale up to a laptop in the way Apple Silicon has.


Apple’s key advantage here, though, is one Microsoft would never voluntarily embrace: abandoning x86-64 in favor of ARM.

Apple is willing to force its partners and customers to make the switch or get left behind. Microsoft would never do that, so Windows on ARM will presumably languish in application support indefinitely.


Yes Apple is walking on familiar ground here. They abandoned M68K, then PPC, now Intel. They know they can do it, as they have done it before, and their customers will follow along, as they have done before.


x86 is dead, full of security issues and is energy-intensive

smartphones, even embedded devices, including cars nowadays are all on ARM, servers are building momentum too, not because it is shiny, but because of tangible gains on many aspects

> Apple is willing to force its partners and customers to make the switch or get left behind.

That's not true at all, the chip doesn't matter when you sell software, hardware and services

It's like changing the internals of your camera to provide a better experience and quality. Why do you care about it? In fact you don't! You want a better camera; the company will pick what's best for the better camera.

Apple provides a transparent translation layer, Rosetta, to accompany the transition; it's effortless for the users.

That's the problem with Microsoft: they are incapable of designing proper UX solutions to move their customers to better solutions; instead they force their customers to be stuck with inefficient ones. Microsoft doesn't even care, nor dare, to clean up their OS to provide up-to-date solutions.

It's a bloaty mess of 5 generations of different UI/UX

Choosing Windows prevents you from having a seamless experience from your Watch -> Phone -> Desktop -> Car

That's what Microsoft fanboys don't understand; they defend their poor decision making and their inefficient products, and ultimately it leads to the death of those products.

Microsoft Windows consumers are stuck

That's why Windows Mobile, Metro, UWP, WinUI all flopped, the platform is no longer up to date

And it's not just a chip issue, it's the whole ecosystem and culture: always too late to make changes, and here, incapable of providing a transition path, hence they are falling behind Apple.


> x86 is dead, full of security issues

to be clear though: spectre/meltdown are not an x86 issue. POWER, SPARC, and indeed even ARM (although only some of their products have OoO/speculation) were affected as well. There is no magic to ARM that magically makes it secure if you don't protect against side-effecting.

I generally agree with the rest of your points, Microsoft is stuck in legacy hell with x86 and they are stuck with a customer base that specifically values that (everyone else has departed for linux or osx, they have "dead sea effect"ed themselves into a high-maintenance customer base), and they've done a super shitty job in general with 5 different generations of UX lava-layered over the top, and x86 is clearly falling behind in energy efficiency. But security isn't something intrinsic to ARM or x86, you can design a secure x86 processor and you can design an insecure ARM processor.


> to be clear though: spectre/meltdown are not an x86 issue. POWER, SPARC, and indeed even ARM (although only some of their products have OoO/speculation) were affected as well. There is no magic to ARM that magically makes it secure if you don't protect against side-effecting.

IIRC, M1 was even vulnerable to some of the otherwise Intel-only Meltdown (cross-privilege boundaries) exploits, let alone the more-or-less ubiquitous Spectre (only within same-privilege boundaries) exploits.


Meltdown wasn't Intel-only - POWER and ARM A75 were affected as well. Meltdown affected everyone except AMD (who have had a similar issue surface themselves recently with their implementation of the PREFETCH instruction) and SPARC.

https://en.wikipedia.org/wiki/Meltdown_(security_vulnerabili...

You're not in the minority for thinking this, there was some serious journalistic miscarriage there. To a lot of people, Intel and AMD are the whole world and if it doesn't affect AMD then it's Intel-only. Even people in tech journalism.

(Though I remember Oracle eventually admitting SPARC was vulnerable as well, but I can't find it, so maybe not.)


> You're not in the minority for thinking this, there was some serious journalistic miscarriage there. To a lot of people, Intel and AMD are the whole world and if it doesn't affect AMD then it's Intel-only. Even people in journalism.

As I recall, the initial investigation focused on Intel, AMD, and some ARM implementations, so that was what was reported; I personally didn't attempt to follow up on any subsequent investigations on other architectures, so I was unaware of any specific results on SPARC et al, good or bad.


> That's not true at all, the chip doesn't matter when you sell software, hardware and services

Yes they do. I am typing this on a fully functional 10+ year old ThinkPad running Linux and still getting software updates.

I know people with Apple laptops that can no longer get security updates due to the chip change. Their only option is to install another OS to keep using that hardware.

But they chose to pay for a brand new model instead of leaving their OS of choice.

So, Apple is able to pull these people along, raking in the dough, because they are willing to send $1500+ to Apple every few years.

Good for Apple, PT Barnum comes to mind with Apple.


> I know people with with Apple Laptops that they can no longer get security updates due to the chip change. There only option is to install another OS to keep on that hardware.

Why do you lie? The newly announced macOS supports Intel-based Macs:

https://9to5mac.com/2022/06/06/macos-13-ventura-supported-ma...


Someone I know is unable to upgrade their Mac; I do not know what kind or how old it is. But it is not that old, maybe 10 years or so.


>That's what Microsoft fanboy don't understand

Maybe if you stop accusing people of being fanboys, discussions could be more productive.

In fact, your entire post is all trolling, FUD, and no substance or arguments.


We can never complain about Microsoft and Windows; it is always seen as "trolling", FUD, or arguments with no substance, despite everything being already documented and archived for over a decade. I'm not going to turn a comment into a 10-page essay when you can find the documented claims I am making with a quick Google search.

When you hide the problems, they resurface years later, and here we are!

Apple moving to ARM allows their entire lineup to have a seamless ecosystem; apps run natively everywhere: iOS/iPadOS/macOS.

Microsoft indeed is stuck and had to come up with a full VM to support an OS with a different architecture, and Android apps for a different kind of chip! Is that FUD? Am I a troll? Come on, let's be serious!


Except you haven't made any factual arguments here. Your comments can be summarized just as "Microsoft sucks, Apple is the best !!!111one".


> Your comments can be summarized just as "Microsoft sucks, Apple is the best !!!111one".

Exactly. I used to use Microsoft products until recently, when I switched all my machines to Linux. I don't even use Apple products; I have an M1 only to test my software, and for everything else it's a VM.

My claims are objective, from experience, and everything is documented and archived.

To your blind fanboy eyes, can only the opposite be true: "Apple sucks, Microsoft is the best"?

You might be living in an alternate reality where "Windows Phone XAML Edition Pro 2004" is #1 in the charts, and the cloud native OS is "Windows Bloated Server Millenium 1914 Product of Seattle Hell Yeah Brother"


Yeah, my impression is that the ARM Windows laptops were basically using cell phone chips.

And if you consider that arm cellphone chips are already behind Apple cellphone chips (which the M1/2 improved on)… not great performance for windows.


Even if they can get Windows on ARM working, they still need to get vendors to distribute ARM versions of their software. This is another leg up for Apple and the humble penguin.


How is this an advantage for desktop Linux? Yes, open-source software can simply be recompiled, but desktop Linux-based platforms (particularly GNU/Linux) have never been great for proprietary app developers, and adding ARM to the mix doesn't make this any better.


I use a Windows 11 ARM VM on my M1 MacBook with Parallels and the experience is pretty good.


Did you have to do anything legally dubious to get that Windows ARM VM running? I thought that Windows for ARM wasn't officially available except on Qualcomm devices.


No, it was extremely easy and I was surprised by it. Just install Parallels, select Windows 11, wait a bit, and it's done.


Do you mean the Surface X?


I use a Surface Pro X on ARM as a daily driver and it's excellent (on the beta release train, anyway), and I also use Linux and macOS daily.


The question is who is Apple competing with with these new chips? Is it other PC/laptop makers (Microsoft, Lenovo, HP, Dell etc) or is it solely Apple on Intel? Whether Microsoft (and everyone else) needs to react or not depends on whether Apple's overall market share in the segment is going up.


I just tried out the Surface X. (Arm-based Microsoft tablet running Windows.) There was a lot to like about it, but I returned it because it wouldn't connect to my printer and scanner.

In general, compared to a Macbook:

- It has a touchscreen

- It has a detachable keyboard

- It has a pen input

Microsoft's execution on the device is flawed (i.e., in order to use it as a laptop it needs a much sturdier hinged keyboard), but there are clear differentiators in their lineup.

IMO: Apple's lack of a touchscreen and detachable keyboard (or 270-degree fold) really hurts the MacBook lineup. If I could get a MacBook that I could also use as a tablet, or an iPad that truly ran macOS, I'd be happy.


I also invested in a Surface Pro X (I have macbooks but don't like to use them vs Windows), bought at a deep discount on Amazon. I persevered with the ARM-related issues and now love it. You really need to run the insiders' builds at present in order to get decent compatibility (e.g. Android apps and 64-bit Intel emulation). I have Docker and the complete stack of dev tools for my projects (Scala, Kotlin, Haskell, Rust, TS) running in WSL2 with VSCode IDE.


What compiler do you use?


For which language? but generally I use the common/default/stock compiler for any language.


There are plenty of people who don't like Mac.

Me, I am a Linux user; I had a job that "forced" me to use Linux back in 2009 (yes, my boss demanded everyone use Linux, in 2009, and I absolutely did not complain, as it had been my choice since 2005).

I came to Mac that year and was very enthusiastic about what I had heard was like a polished, commercially supported Linux distro.

I left three years later after having spent significant time trying to adapt to it.

I was relieved to get back, even to Windows.

Last fall I got a Mac Mini.

Some of the warts are now fixable, but I only use it for things I won't have to do in anger or fear or anything like that.


The only reason to buy a Windows laptop is so you can install Linux on it. If Asahi gets there, the last reason will vanish.


I am pretty happy in Intel+Linux. My device basically lasts all day. It does get kind of hot if I'm running some crunchy numerical code, but that's usually pretty short (assuming the thing I'm running is a laptop appropriate toy problem).


What even is a Windows laptop? A laptop for which Windows has drivers? You buy a laptop for which Windows has drivers in order to install Linux? That doesn't make sense.

What you mean to say is: you buy a laptop that Linux has drivers for. Windows is not in this equation, my friend.


I like how Apple is pushing the computing industry forward to have SoCs. However, as fast as the M1, M2, or Alder Lake processors are, the problem lies with the 45% of the HN demographic that's shipping dog-slow software, whether writing OS code at Microsoft, Linux, or Chrome, or the rest of the web devs. Hopefully the industry can transition to SoCs with documented APIs so we can skip the many layers of software, i.e. Firmware -> OS -> Drivers -> Userland Software, and get to just Hardware -> Thin OS (unikernel-like) -> Userland software.


Looks like a pretty minimal upgrade over the M1 Pro. apple.com and the keynote were also super vague about multi-display support, parallelization, etc.


Right, but you can't get an M1 Pro in a Macbook Air or 13" MBP. I think we'll probably also see an M2-based iPad Pro.

If I had to guess, we'll next see a Macbook Pro 14"/16" with M2 Pro/Max, and the Mac Pro will be M2 Ultra.


"Compared with the latest 10-core PC laptop chip, the CPU in M2 provides nearly twice the performance at the same power level"

"compared to the latest 12-core PC laptop chip [...] M2 provides nearly 90 percent of the peak performance"

That doesn't make sense. Twice the performance of a 10-core Intel CPU, but only 90% of the performance of a 12-core? This implies Intel more than doubles performance when going from 10 to 12 cores. Reading the footnotes, the 10- and 12-core Intel CPUs that Apple used for benchmarking are the i7-1255U and i7-1260P which have respectively 2P+8E and 4P+8E cores (performant and efficient cores). So the second Intel CPU actually has twice the number of performant cores than the first.

This means the benchmark mostly depends on performant cores and nearly doesn't use or doesn't depend on efficient cores. If so, that's a rather useless benchmark for evaluating the CPU as a whole (P and E cores.)

But what this also means is that Apple is being sneaky. The M2 has 4P + 4E cores (not revealed in the press release, but we can tell from the die shot). Thus comparing it to an Intel CPU with only 2P cores (i7-1255U) is guaranteed to make the M2 look better as the benchmark doesn't use E cores (see above.)

What I'd love to see is the M2 put up against a Zen 3 or 3+ mobile CPU like the Ryzen 7 6800U (15–28 W) which straight up has 8 regular ("performant") cores.


I would recommend looking at the graphs below the section you've quoted because they do explain away the incongruity that you're thinking is there.

It's 2x the perf if you compare the max power draw performance of the M2 vs the performance of the Intel CPU at the same power draw (so not maxed out)

It's 90% performance of the Intel CPU when they're both maxed out, while taking quarter of the power draw
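Put differently (still taking those figures at face value), the two claims describe different points on the power curve rather than contradicting each other. The implied efficiency at the Intel part's peak works out to roughly:

    # M2 vs the 12-core Intel part at the Intel part's peak, per the figures above
    rel_perf, rel_power = 0.90, 0.25   # ~90% of the performance at ~1/4 the power
    print(rel_perf / rel_power)        # 3.6 -> roughly 3.6x performance per watt at that point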


Err, you are right. I misread.


One note, I used the images of the chips themselves and the M2 chip is physically bigger than the M1 chip by 21.8%, but the article claims a 25% increase in transistors. So there's only an actual 3% increase (1.25 / 1.218) in the transistor density with this new process. Most of the advancement is presumably in the form of better heat transfer and cheaper silicon per unit area.


> It also delivers 50 percent more memory bandwidth

Anyone know if this means much in practice for a typical dev user?


No VP9 or AV1 acceleration.


The M1 already has VP9 acceleration in the “media engine” chunk of the SoC.

https://singhkays.com/blog/apple-silicon-m1-video-power-cons...

Though Apple doesn’t super explicitly say that.

As for AV1… well, we don't really know yet. That's deep in the weeds, and it's entirely possible the M2 does have accelerated decoding, but they just didn't spell that out yet.


The M1 accelerates VP9


At least not mentioned. YouTube 4k will eat a lot of battery…


Why watch 4k on a 1664p screen?


The bitrate is so much better; even on a 1080p screen it looks nice.


You can connect an external 4K screen. Also higher bitrate makes it better looking even on lower res screens.


They would probably rather you watch videos on appleTV anyway.


Can anyone fathom how an M2 MBP 13-inch would stack up against the current M1 Pro MBP 14/16-inch? This whole CPU/GPU thing is hard to make out. An 8-core M2 with 12% more oomph than an M1 makes it equivalent to an 8.96-core M1? What's wrong with clock speed...


Now that we have validation that Apple is sticking with a numeric naming convention, I wonder how they will handle the upcoming naming clash with the M7 through M12 motion coprocessors used in the A-series chips.


I think this is a non-issue. In 6 years nobody will remember those coprocessors any more.


I see a lot of hype around the AI cores. What do regular users use them for? Only thing I can think of is accelerating handwriting recognition / face-auth stuff.


Filters and Zoom backgrounds.


Funny how I skim the page for eight seconds, see a ton of marketing fluff, and immediately back out and come to the comments section here for real info.


That's a year and seven months since the M1 was released, so M3 is expected Jan. 2024? It'd be pretty rare for a January release; expect instead probably November 2023 or March 2024.

There's a joke here somewhere... all those poor M2 suckers when M3 is about to be released. Something like that.


I'll switch once I can play my full Steam library on a machine like this. The power is there for sure.


I'm only a very casual gamer these days (although anyone I go on a date with would see a gaming accessory or console and immediately think hardcore). I recently went down the gaming rabbit hole to see what I can do with my M1.

It seems like MacOS on the M1 architecture can play almost everything! Like, there are so many titles that never have OSX listed as a supported platform, but they can be played either out of the box or with a slight tweak. I guess that does preclude Steam releases if you don't have direct access to per-game installers.

And nowadays these indie games release on all major consoles, mobile, Windows, and macOS.

What are you encountering? A few examples of what's missing for you?


Three semi-broad examples, Elden Ring (Japanese dev), Horizon: Zero Dawn (Dutch dev, PS4 port), and Inscryption (indie dev) do not show MacOS support. I have not tried installing Windows-only games on a Mac. Can they be installed regardless if they don't show support? Does Valve have Mac-specific APIs to get them runnable like they do for Linux?


Did you get any VR games working?

https://www.applegamingwiki.com/wiki/Home paints a sorry picture of the state of M1 gaming (partially due to dropping 32 bit support).


No VR! I’m sad I can’t get to experience Half Life Alyx, but I mostly forget


Can you expand on “almost everything” and what you do to achieve that?


I look at tables online that show game compatibility and experiences

It's surprisingly good, and I think this is partially because of Apple’s network effects

But aside from that, the Rosetta translation works really well, allowing a lot of x86/x64 titles to work

And then there is Crossover which is a GUI for WINE

And then VMs

I got a Windows 95 game playing on my M1 yesterday with Crossover; more resource-intensive stuff seems to be doing well too, with maybe flagship games failing some enthusiast benchmark


Yeah, I wish Valve and Apple would make nice (like they did back in the day for Steam on Mac) enough to:

1) Update all the first-party Valve games to 64-bit since the 32-bit binaries are no longer supported

2) Bring Proton support to MacOS, which is what Steam uses to run Windows games on Linux and the Steam Deck console.


Apple would probably need to do the heavy lifting on (2) by either adding Vulkan support to their GPU drivers or by adding Metal support to Proton.


Yep. It would be awesome, but I feel like Apple wouldn't be very enthusiastic about putting in effort for an emulated solution over native games.


https://www.codeweavers.com/crossover

Works surprisingly well. You can find videos on youtube.


The comparison Apple makes with NVIDIA GPUs is very exaggerated. But yes, it should be able to game.


I was hoping to see the long-awaited Reality Glasses this keynote. Something like an M2 on each lens could drive a 90 fps 2K display. The libraries are hinting at how AR/VR will be done: Metal 3, lidar, etc.


Does this mean that an M2 Max could do 128GB RAM? That opens up some use cases beyond 64GB. My world has pretty much no improvement between 32GB and 128GB, but I'm currently on a 64GB machine.


It would mean 96GB. They increased from 2 memory channels per M1 to 3 memory channels per M2.


20% better, 20% more cost? Fair enough. I'll get the M1 for my mum


20% faster. The chassis and display have been changed as well. (Not to say you’ve made the wrong decision)


Not at all, your opinion is much appreciated. Nothing massively different in the chassis and display, right?


Nothing game changing IMO.

They added MagSafe charging.

It’s thinner and lighter (the previous model was already thin and light).

They added some new colors (I’m partial to the blue).

The screen is a few pixels taller (however a new notch takes some of those pixels away).

The camera is higher quality, the speakers are a bit better, and the screen gets a bit brighter.


Ah, the camera. That's big for her: 1080p vs 720p. But I think she's going to end up using her phone. Still, it does make me pause.

The weight and colours are a good point. Thanks for the summary! <3


24GB of RAM (formerly 16GB) is a big upgrade for me.


I'd still buy the 14" Pro for the more ports, slower chip for more $ or not.

And real function keys instead of a touchbar?

They seem to be making weirdly inconsistent choices in the product line.

I thought they were going to be getting rid of the touchbar (and maybe adding more ports?) and they were only still in the legacy 13" because it was legacy. But apparently they mean to indefinitely have a 13" Pro with a touchbar and a 14" Pro (actually the same size device, just less bezel, I think?) with function keys?

And the new M2 Air has a magsafe power connector (like the M1 14" and 16" Pro)... but the new M2 13" Pro does not? Why?


Actually, I just realized the new Air has a magsafe power connector and two additional USB-Cs, and does not have a touchbar.

I was about to buy a 14" M1 Pro, not because I needed the speed at all, but I wanted magsafe, I didn't want a touchbar, and two USB-C ports (inclusive of power supply) is not enough. Also the built in HDMI out was nice.

The new Air has everything I want except the HDMI. Separate magsafe power PLUS two more usb ports (that's enough for me), no touchbar... yeah, I'll be waiting for this and saving significant money over the 14" M1 Pro I was about to get.


> The new Air has everything I want except the HDMI

I use a USB-C -> HDMI cable. Works great on my Dell.


What's the best solution for when you take your computer to a conference room that has a male HDMI plug that you're supposed to use? I have a USB-C mini dock, but I'd rather not have to bring that everywhere.


I have a mini dock, less than 100g, that does usb-c to hdmi, vga, dvi. I just have that in my travel backpack. Works with my chrome book as well, and I think iPad too.


What product do you have, just so I have an example?


QGeeM USB C to HDMI Cable Adapter,QGeeM 6ft Braided 4K@60Hz Cable Adapter(Thunderbolt 3 Compatible) Compatible with iPad Pro,MacBook Pro 2018 iMac, Pixel,Galaxy S9 Note9 S8 Surface Book hdmi USB-c https://a.co/d/6xniK1E

I bought it to see if I could play movies from my phone. It didn't work, so I chucked it in a drawer.

More recently, I bought a 50" TV to use as a giant developer monitor, and this cable works better than my Dell's HDMI port. What happens is that the HDMI port can only do 4k at 30hz, but this does 4k at 60hz.

You'd think something like that wouldn't matter for an IDE, but shockingly, moving a mouse at 30hz is... Just... Weird.


Keep in mind that the air with the ram and storage of the 14 gets pretty close in price.


I guess, depending on what you mean by "pretty close".

M2 MacBook Air 13" with 8-core GPU (10-core would be $80 more)

* 16GB RAM, 512GB storage: $1599

* 16GB RAM, 1TB storage: $1799

M1 14" MacBook Pro with 14-core GPU

* 16GB RAM, 512GB storage: $1999

* 16GB RAM, 1TB storage: $2199

So $400 cheaper, ~20%


It gets closer depending on the country. In Germany it's 1960€ for the M2 Air with 16GB/512GB, and 2250€ for the base 14" model. That's a 13% difference compared to the 20% in the US and makes the M2 a lot less attractive.


Ah, I guess I was thinking of the education pricing on the 14 or the discounts I see all the time, but those would also apply to the Air


I don't think M2 is meant as an upgrade over last year's M1 X or Pro.


What is the 13" M2 Pro meant for, do you think? Why would you buy it instead of either the new M2 13" Air or the 14" M1 Pro? I can't find a reason.

I guess if you really love the touchbar, this is the only thing that has it? (That HN hates and we all thought they were getting rid of based on the M1 Pro's). Or you really hate magsafe power connectors, this is the only new laptop that lacks it?


It is meant to fill the gap in the product line between the Air and the 14" Pro. The fact that it did not get redesigned alongside the Air makes it a weird and confusing product. Performance-wise, it should beat out the Air thanks to the inclusion of a better cooling system. Just like with the M1 versions of the 13" and MBA.


I can't figure this one out either. The Air now has the same screen as the MBP 13", same processor, slimmer form factor. The Air is also $100 cheaper than the MBP, so that's not a justification. The only thing that's different on the specs page is the GPU core count: 10 versus 8. Why keep the 13"?


I'm guessing they can get slightly better sustained performance out of the 13" Pro than the Air because it has a fan? But still doesn't really seem to explain it. Unless they want to bring the Touchbar back to the other pros.. which hopefully isn't the case.


I feel like it's a business-driven decision to sell the 13" Pro. They can get rid of their Touchbar inventory, and I assume the margins on that one are ridiculously high, since it's a design that's been unchanged since 2016. Kind of like the iPhone SE that is repurposing the 2017 iPhone 8.


How much Touchbar inventory can they possibly have? And unlike the SE, which some people purchase because they like the form factor and low price, I feel like the only people who get a 13" MBP are uninformed about how it stacks up against the MBA.

I wanted to get a revamped MBP, but after seeing the horrendous stats (2 ports, no MagSafe, no SD/HDMI!) it was a no-brainer to get in line for the MBA, which is cheaper, lighter, and has more ports and a better camera.


Yeah the touchbar needs to stay dead.. hopefully this isn't it rising from the dead.


The Air can also be upgraded to the same proc spec as MBP.

It's interesting that they don't have the same screen. The Air is 0.3in larger and supports more colors?

Air 13.6-inch (diagonal) LED-backlit display with IPS technology; 2560-by-1664 native resolution at 224 pixels per inch with support for 1 billion colors

MBP 13.3-inch (diagonal) LED-backlit display with IPS technology; 2560-by-1600 native resolution at 227 pixels per inch with support for millions of colors

https://www.apple.com/macbook-air-m2/specs/

https://www.apple.com/macbook-pro-13/specs/


An... extra 64 pixels in height resolution?

Like... they just found a manufacturer that already made these screens and didn't want to retool for Apple's order, or what? Why the extra 64 pixels?


My guess is just different displays selected for products at different stages in their hardware lifecycle. The MBP shares a case dating to before M-series processors. The M2 Air is new hardware.


Don't forget that AppleCare and repairs (e.g., battery replacements) are cheaper for MBAs than MBPs.


Is the M2 faster than the M1 Pro/Max? They didn't do a direct comparison, but it's only 20% better than the original M1. I bet the pro/max still perform better.


The 13” feels like a stopgap product. Maybe they had supply chain problems and they couldn’t redesign it so they just put an M2 in it.


It could be the case they want to keep at least one machine with touchbar on the market for some poor souls who have invested in it.

Same logic for how they keep the iPhone SE with a physical home button.


Someone please explain to me why I need so much power. Can't even run ML on these. More browser tabs?

I miss my 100MHz IBM PC with 16 MB of RAM & Visual Studio.


Video/Photo editing


I don't recall owning a computer that was too slow to run Photoshop.

4K video editing - I admit it's a valid use case, not something I've ever had to do. But I doubt most people buying these are video editors.


I have a 2014 MBP with an i7 and 16GB of RAM.

RAW photos straight off my Sony A7iii bog down Lightroom, but not horribly so. Once I try to scroll through a few hundred of them and apply bulk edits things slow more. Photoshop is fine unless I open more than about 10 large files simultaneously.

Editing 1080p video in Final Cut Pro I can bog it down from time to time. 4K is utterly unworkable unless I let it convert it all to ProRes - even then it's quite slow.

I should also mention that during all of this the fan is absolutely screaming, and battery life is about 30-40 minutes - it's essentially unusable for demanding tasks on even a warm day when not plugged in

All-in-all it's acceptable, but I will get a used M1 MBP as soon as I can afford it.


$1500 to edit photos 20% faster? Was it slow before?


If you're doing it professionally, then that adds up.

This is such a ludicrous argument. If something is faster, it's faster. What's the problem?


If you're dealing with 45-61MP RAW files, yes. More RAM and more GPU and CPU speed help. Of course.


I'm a photographer, and almost no one deals with 45-61MP RAW files. You're talking about a subsegment of photographers, a subsegment of those who make very large images; almost no one. It's your money lol, I just don't understand why you're trying so hard to justify spending money on this incremental update.


I use 45MP RAW files from my Nikon Z7. Guess I'm almost no one. Thanks for the helpful comment. "lol"


This makes me think Microsoft should just pay up and buy Intel because it will be hard to compete with Macs in a year or so.


How much of Microsoft is the Windows business?


Microsoft makes less than 20% of their revenue from Windows.


Compete with what? Virtual Memory swapping that Apple just introduced?


Can someone tell me how big of a difference in performance the missing fan on the air will make against the pro 13”?


None, unless you’re in a warm room and have a long compile job. For that case, which only happened once to me, I just put my Macbook on the cold balcony.


Generally the difference is minor and only shows up on very long builds


only two usb-c ports :(


Yes, but 100% more usable ones than before, if you factor in the new MagSafe. I'm still pissed I would have to spend $2,000 to get a Mac with more than 2 USB ports (or HDMI, SD, and other standard fare from MBAs of yore). But I'll put up with that limitation to get one of these new MBAs.

If someone came out with a nub-sized flash drive that sat nearly flush with the edge of the Mac, I'd definitely get one. I was really hoping the base MBP would be revamped, and I'd be able to get an SD card slot and avoid Apple's insane storage tax...


Despite these improvements, it's not enough to replace M1 Pro when considering the price/performance


Apple A4 (2010) vs A5 (2011)

- CPU: 100% faster

- GPU: 600% faster

Apple M1 (2020) vs M2 (2022)

- CPU: 18% faster

- GPU: 35% faster

Diminishing returns.


That's what they call cherry-picking stats.


That's the same for all chip makers. It's the end of Moore's law that's bringing the diminishing returns, as you probably know.


It depends also on the % of what.

600% of 1$

18% of 1 000 000$

Which % do you prefer?


Not really a new chip version. Still 5nm.


how does m2 neural engine performance compare with popular contemporary nvidia laptop and desktop gpus?


yessss been waiting for a m2 mini for a new workstation


The only downside is now magsafe over usb-c :(

unless you can charge with both on new macs?


You can charge with both, in any port (I have a Macbook pro with Magsafe, but primarily use all the USB-C chargers I have lying around)


You can charge with both on the 14" and 16", at the apple store a few weeks ago they told me the next one's will likely have the same capabilities.


That's awesome, thought they were doing a 180 and forcing magsafe charging only.


Why is that a downside? Magsafe was one of the best things about the older, early-2010's Macbooks.


I for one love charging with USB-C. By now I have USB-C cables laying around a few different places in the house, and can charge almost anything at any one of them. No going to the other room to get the cable.


Magsafe is fine; having to travel with a separate charging block is terrible. The USB-C consolidation has been awesome for people like me who travel frequently- my phone, external battery, M1 Air, headphones, Nintendo Switch... all charge over USB-C, so I only have to travel with a single charger. It's wonderful


The new MagSafe doesn't have a captive cable on the charger. It is a MagSafe cable plugged into a USB C charger. You can charge via a USB C cable but you give up a port and you can't get quick charging.


The USB-C port is also awesome for docking station setups. Just one cable to plug in for power and all peripherals. But for mobile use I definitely like to have the choice of magsafe.


How do these on the new base model air compare to an m1 13”?

Any benchmarks yet?


On the return of MagSafe: they should have let Jony "Form over Function" Ive go much sooner. Maybe we'd still have audio jacks on iPhones.


I can't believe I'm saying this, but now that all my devices charge via USB-C I don't actually want MagSafe that much; I can bring one cable and charge all my portables and Mac + Razer laptops. Not worth it for me to lug around another cable just for MagSafe.


My MagSafe cord for the 2021 MBP stays at home and it's USB-C while on the go. Still, the return of MagSafe is welcome just so folks have the option.


The best of both worlds would probably be a magnetically detachable USB-C cable. There are such cables out there but they have very very bad quality, at least those few I've tried.


The best of both worlds is what we have today - you can use either magsafe or the usb-c ports, so you have the extra safety if you bring the extra cable, or the extra convenience if you can't be bothered. There's some third-party offerings out there for magsafe-style USB cables.


That's what the new magsafe cable basically is. A magnetic USB-C cable that (probably) isn't garbage quality.


The improved battery life with M1 is so good that I pretty much only plug in when I'm at my desk. Most of the situations in which MagSafe saved me before (and there were plenty) I'm just not using a charger at all, now. And at my desk, I'm plugging into my monitor, not directly into a wall outlet. In fact, I almost never plug my Air into its charging brick at all.


I got a 3rd party GaN charger with 2 USB-C ports and a USB-A port. I can use it for charging everything, and it's super easy to pack when traveling.


I am 100% with you. To add to this, in my two work environments (office and home), my laptop is charged via my monitor with a single cable. WTF would I want an additional cable? My magsafe adaptor went into the drawer as soon as I got it. Magsafe made the charger included in my m1 macbook pro _less useful_. My previous charger was great for vacations as it could charge my ipad, nintendo switch, headphones, or laptop.

That said I'm sure it's nice for _someone_.

Edit: I just realized that the MagSafe connector can be disconnected from the brick and a standard USB Type-C cable used instead, making the charger as useful as the old one, provided you buy an (admittedly cheap, in the big picture) cable.


A permanent setup where you charge via your monitor (or some other desk-based equivalent) is incredibly convenient, but that's not the only context in which these things get used. If you're on the couch, in a café, school, or whatever else, the magsafe bit is amazing. It's also the sort of feature that you don't give a damn about until it saves you - my parents' dogs tossed my iBook 12" on the floor many times by tripping on the power cord, and that stopped being a problem with the first magsafe mac I got.


Would you be OK with a USB-C to MagSafe adapter?

I totally get your point and can't disagree with it. But I really love the MagSafe connector for power. It's just so nice. USB-C feels slow and clunky in comparison. Just for a straight power-supply use case, I think MagSafe is superior to USB-C.

So I wonder if anyone is contemplating an adapter? There's probably going to be too much of a mismatch of power requirements or something to make it viable.


You actually have the option to just use any USB-C charger in the 3 USB-C ports though, at least on my work 14-inch M1 Pro MBP. I just keep the MagSafe charger in my backpack, because those are the situations where I'd want it (rather than at home... though with my dog, maybe I should get more MagSafe cables).


I'd have been just as happy if they'd given us another USB-C port. It was unacceptable that to get more than 2 free USB ports, you had to spend $2k on the 2021 MBP.


Airpods are pretty awesome. Have some trouble with switching between devices but still pretty great.

I actually run with Garmin (Spotify & downloaded music) + Airpods.


Back in the day when laptops still came with a CD/DVD drive by default I had a customer with a broken MacBook that didn't power on at all, and some expensive disc was stuck in the drive.

Pretty much all disc drives that have been produced in the history of disc drives come with a little pinhole where you can stick in a paperclip to manually push the opening mechanism exactly for this kind of scenario, so you can recover your disc if your computer or drive fails.

Except Apple computers, of course, because such a useful piece of functionality would be ugly and an abomination unto Saint Jobs, or something. So I had to spend a few hours opening up that MacBook, which very clearly wasn't designed with "opening up" in mind. I was lucky this machine was out of warranty and already dead, so putting it back together wasn't really a concern.


It's not like those Macbooks were all that difficult to take apart. I know I took mine apart to remove the DVD drive to replace it with an SSD.

It was something like this: https://everymac.com/systems/apple/macbook_pro/macbook-pro-u...


It's stupid if you need to do it in the first place, and/or that you need to hire an IT person to do it for you. What if it was still in warranty? What if you wanted it repaired? If you needed that disc you were screwed until the Apple Certified™ repair centre could do their thing.


Pretty sure those had a release, it just required sticking the pin in the right place in the disc drive opening.


I've used that hole and a paperclip to get a CD or 3 out of drives a great many times. Once I faced this issue with a 2009 MacBook (the white plastic one), and I managed to pull the disc out of its drive with very thin pliers.


How does the M2 compare to Intel's offerings?


Watt-for-watt, dollar-for-dollar, M1 steamrolls Intel. Intel does claim to currently have the most powerful consumer chip on the planet, if you ignore the watts and only count peak performance (effectively cooling the thing is a bit of a meme).

Note that Apple do not mention AMD. M1 and M2 probably still kick AMD to the dirt on the power efficiency front, but the cost for performance end would be difficult to quantify (and the AMD performance ceiling is also significantly higher).


Intel's chips are a bit faster at the top end but use far more energy. Certainly a problem for laptops, but I don't think most desktop users care. I guess Intel's prices are a bit better if you want more than 8GB mem/256GB SSD, but they're not that far apart.


You get a comparable performance at 15W with M2 to Intel at 55W.

Intel is not significantly faster, just more power hungry. That's the main difference in day-to-day usage.


One is a chip just announced/released by Apple. The other is a company floundering and stagnating in their product offerings.

Did you want to compare the M2 to a specific Intel CPU? The M2 is better.


[flagged]


5 years from now people will grok more fundamentally that launching 2 process nodes ahead of Intel via TSMC was the big win.

For now, we'll deal with marketing-ese about how "other CPU vendors have to choose between power and performance" from Apple's head of semiconductor engineering, and posts on this page like the one saying Microsoft doesn't care and Apple's about to take over computing.


They're only 1 process node ahead of Intel. Intel 10nm has roughly the same transistor density as TSMC 7nm. The M1 was TSMC 5nm.

And given that there's no way a single process node jump is going to give Intel a 75% uplift in instructions per clock (2 nodes wouldn't give them that either, for that matter), Intel is going to have to clock higher for comparable performance, and that's still going to put them behind on power consumption (which grows superlinearly with clock speed).

Which is to say, it's completely untrue that the only reason these chips look good for speed/power-consumption is the process node they're on. Apple came up with a super-wide architecture (way wider than anything we've ever seen in x86-64), and they made it pay off by getting good performance out of a much lower maximum clock, which bought them a ton of power efficiency.
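A rough sketch of why the wide-and-slow approach pays off, using the textbook dynamic-power relation (P ≈ C·V²·f, with voltage roughly tracking frequency). The numbers are purely illustrative, not measurements of any shipping core:

    # Dynamic power scales roughly with the cube of clock speed once voltage
    # has to rise along with frequency. Illustrative numbers only.

    def relative_power(clock_ratio: float) -> float:
        """Approximate dynamic power vs. baseline, assuming V scales with f."""
        return clock_ratio ** 3

    # A core with 1.5x the IPC running at 0.65x the clock of a narrower core:
    throughput = 1.5 * 0.65             # ~0.98x the narrow core's throughput
    power = relative_power(0.65)        # ~0.27x the narrow core's dynamic power

    print(f"throughput ~{throughput:.2f}x at ~{power:.2f}x the power")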


It seems like people just repeating this over and over but is there anything to back it up? Do we have a 12th gen Intel chip with a known die size and transistor count? Seems like Intel is deliberately coy about this. I gave up looking.


Yeah. The "nm" in a node designation is nothing but marketing now, since 3D FET designs have completely changed what "nm" means.


They were two ahead at launch (to wit, comparisons were against 10th gen Intel)


Can we switch to MT/mm2? (millions of transistors per square millimeter)


Can you maybe explain what you mean here? I can hardly understand your point about Intel, and don't seem to get what you are implying about "marketing-ese".

The M1 is a spectacular chip. The M2 seems like a fine iteration.


They mean Apple's advantage is in good part due to their anti-competitive practice of buying TSMC's entire 5nm production. Leaving none for others like AMD which had to compete on 7nm for the longest time. Same will happen for 3nm if nothing changes.

Whatever your opinion may be about Apple/AMD/Intel. Unfair competition is not good for consumers in the long run.

edit: instantly downvoted. As per tradition in Apple related posts on HN when faced with facts.


What's anti-competitive about that? Apple doesn't even make their own chips, they need outside vendor for that. AMD and Intel are free to offer TSMC even more money, if they want.


You think AMD's 200 billion market cap stands a chance against Apple's 2.3 trillion market cap?

Consumers only stand to lose in the long run. Including Apple consumers.


As far as I know TSMC is building a new factory in Arizona right now, and one more in Taiwan. That was largely financed by Apple's money, and is a huge win for consumers who will benefit from the most advanced chip manufacturing capabilities for the years to come.


> financed by Apple's money

> huge win for consumers

doubt


That buying however much of whatever from a single company can be read as "anti-competitive" reflects more on the market, which relies on a single company for its fundamentals, than the buyer.

To me it seems that the problem is not that Apple bought however much of whatever from TSMC, but rather that TSMC doesn't have competition at the moment. Hopefully that changes.

(I do think HN is a bit trigger-happy with downvotes lately. I don't often get downvoted, but sometimes people who reply to me with a different view do, and so they get grayed out and drop in the comments. I used to try to counter that, but my one vote doesn't work for very long, so I mostly gave up. But it's really annoying.)


Agree. Chip production is such an important aspect of the economy these days. It baffles me that the US still hasn't come up with a TSMC-like fab yet.

I get that it's really hard. Like rocket-science hard. But still.

Maybe Musk can start something in the sector.


The fact is, Intel is not using TSMC (yet) for their high-end CPUs that compete with Apple. Their stagnation has nothing to do with Apple monopolizing the fab's capacity. Since they [Intel] were on Macs before Apple Silicon, they are being used for direct comparisons.


> 5 years from now people will grok more fundamentally that launching 2 process nodes ahead of Intel via TSMC was the big win.

AMD will have process node parity with Apple this year - Zen4 will be on N5P, as will M2. I doubt that alone will be sufficient for x86 to catch up, they have a LOT of ground to make up.

(this is, of course, the "small" laptop chip for Apple, the M2 Pro/Max will add more CPU cores and a much larger GPU, but you can still extrapolate the performance trends once we see Zen4 and I doubt it's going to be all that flattering. AMD has said "minimum of 15% faster", but even if that works out to 40% on average, Apple just made their own 18% leap, and the current architectural gap is much larger than 22% according to SPEC2017 benchmarks.)

It's not all just "apple wants to go bigger" either - Apple's cores are quite svelte in terms of transistor count as well, they're in between Zen3 and Alder Lake transistor count (this is supposition, but I think they would still be slightly smaller than Alder Lake even if you removed the AVX-512 support from Alder Lake). Most of their transistors go towards a truly titanic GPU, the cores themselves are fairly small (the efficiency cores in particular are impressively small for the performance they give).

And yes, of course "Apple has chosen to target slightly lower clock rates with really high IPC", but that is enabled by design decisions that ARMv8 allows (really deep reorder buffer, really wide decode) that x86 cannot replicate as easily, you can't just triple x86's IPC by targeting a slightly lower clockrate and going wider.

And yes, ARM code is slightly less dense - about 12% larger than x86 when compiling the SPEC2017 test suite, according to the numbers from the RISC-V people. That's not where the difference comes from either, it's not just "high IPC on low density code".

I know what Jim Keller said, and he's right, x86 isn't dead, but it's not ahead right now either, even considering Apple is on 5nm. When AMD is on 5nm this year, we can re-assess and see whether that was the driving factor, or whether there are design reasons as well.

People seem to have interpreted Keller's comments as being "it is physically impossible for ISA to make any perceptible difference in perf-per-transistor or perf-per-watt" and I'm not sure that's a statement he would agree with. A 10-20% advantage to restructuring your architecture in a way that enables deeper reorder and better decoding vs x86, seems like a reasonable premise to me. Especially considering the baseline is x86, the quintessential legacy behemoth ISA. There has been a lot of work to keep it in play, but that means a lot of the "easy tricks" like instruction cache have already been exploited just to get it this far. Surely there are things that could have been done better from a clean start.


> When AMD is on 5nm this year, we can re-assess and see whether that was the driving factor, or whether there are design reasons as well.

Quick note: AMD's 5nm offerings will not be using a big.LITTLE configuration, so direct comparison with the M1 would probably not be very accurate for the purposes of comparing power consumption.


> Quick note: AMD's 5nm offerings will not be using a big.LITTLE configuration, so direct comparison with the M1 would probably not be very accurate for the purposes of comparing power consumption.

Ok, but the second we start saying Intel or AMD needs to make architectural changes to match M1 performance/power consumption, that in and of itself is a refutation of the GP's argument that the only reason M1 looks good is the manufacturing node. We're admitting that there's an architectural component to its results.


And Intel has big.LITTLE but they're not using TSMC nodes. Apple comparisons run on OSX or Asahi Linux, which doesn't run x86, so the software/ecosystem is different. Etc etc.

No comparison is ever perfect, you'll just have to take the best data we have and run with it. Complaining that a study isn't exactly perfect is trite, once you get beyond the "junior scientist makes obvious methodological error" tier it's honestly one of the least useful forms of criticism, someone is always going to think it should have been done better/differently (and wants you to take the time and spend the money to do it for them). But science is about doing the best you have and trying to make reasonable extrapolations about the things you can't.

Single-thread benchmarks on big vs little cores will get you IPC figures, and then you can scale those according to clocks you see on full-load conditions, for example.
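For what it's worth, a minimal sketch of that kind of scaling (assuming performance ≈ IPC × clock; the scores and clocks below are placeholders, not real benchmark data):

    # Estimate a crude IPC proxy from a single-thread score, then scale it to
    # the sustained all-core clock. Placeholder numbers, not real measurements.

    def ipc_proxy(single_thread_score: float, boost_clock_ghz: float) -> float:
        """Score per GHz stands in for IPC when comparing like-for-like benchmarks."""
        return single_thread_score / boost_clock_ghz

    ipc = ipc_proxy(single_thread_score=1900, boost_clock_ghz=5.0)
    sustained_all_core_ghz = 4.2
    per_core_full_load = ipc * sustained_all_core_ghz

    print(f"IPC proxy: {ipc:.0f} pts/GHz, per-core score at full load: ~{per_core_full_load:.0f}")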

Or you can simply compare it to a future 13th/14th gen Intel i3 with big.LITTLE, or an AMD quad-core APU. AMD has SMT, that's an advantage, Apple has single-threaded cores but a couple extra little cores, it's similar-ish.

Nothing is ever perfect. You just make do. It doesn't mean we throw up our hands and scream that if we can't be accurate to 1000 decimal points then we can never truly know anything.

(You may not have intended this, so just FYI: it kinda comes off like you're pre-stating that you won't accept the results if they don't come out the way you like, that you'll find some other difference between the two to latch onto. And there will always be some minor thing you can latch onto, no two designs are exactly identical. But that's not really an honest way to approach science, merely being able to theorize some differences isn't useful and if you feel strongly about it then you should do a similar test yourself to demonstrate.)


I definitely agree that there are reliable tests here (your suggested IPC count is "good enough" for these purposes), but the majority of benchmarking between these two machines wouldn't yield much of a comparison at all. I don't think it's unreasonable to note that comparing these chips directly is a fool's errand, especially since Apple has demonstrated that themselves with their own M1 benchmarking/graph fiascos.


How do you provide fanless laptop, and extended battery life if you don't care about maximizing performance while maintaining ultra low power usage?

That is why nobody buys windows based laptops anymore and the macbook air is the best selling laptop

You don't need Apple's marketing team to see how people are fed up with hot Windows laptops and their noisy fans (and blind btw, pun intended or maybe not lol)


> That is why nobody buys windows based laptops anymore and the macbook air is the best selling laptop

A huge majority buy windows based laptops.

MacBook Airs are the best-selling laptop because of the fragmentation of the Windows laptop market amongst manufacturers. macOS does not in any way make up a majority of laptops.

I get that it is 2001 again and we should all be shitting on MS, but let's try and keep some facts in the conversation.


Incapable of attacking my argument about fanless design/power efficiency, you instead nitpick the sales figures

> I get that it is 2001 again and we should all be shitting on MS, but let's try and keep some facts in the conversation.

That's a windows 11 with snapdragon for you! fanless! ha


It's not a "nitpick" you made a specific claim.

Not the person you're replying to but I'll reply: I think the M1 is really nice, although I don't care much for some other MacBook hardware design decisions and I don't care much for macOS either, so I didn't buy one, but I did spend some time considering it and looking at options.

In the meanwhile, my current ThinkPad is "effectively fanless, most of the time". What this means is that as I'm typing this it doesn't need any fans. It doesn't most of the time, even when I'm programming and compile my project (incremental compiles) it doesn't need the fans, and it remains fairly cool as well. With full compiles or some (not even all) games it does need the fans, and that's okay with me.

And this laptop is actually 4/5 years old; I got it "second-hand new". Newer ones are even better. Oh, and the battery also lasts about 15 hours on a full charge, which is not as good as a M1 machine, but "more than good enough" for me.

So "full fanless" would certainly be nice, but in the meanwhile "mostly fanless" is actually just fine.

Also remember that Apple only makes top-of-the-line laptops; if you buy a cheap Windows laptop: yeah, you're not getting an especially good laptop. But if you buy something in the same price range you often (not always) get something much more comparable.


Living in a bubble is always nice though. How many laptops are even used as laptops?

Most of the world is not using Apple laptops and is not going to. The Facebook community around you might, but that's not the entire world. The "nobody" in your list still accounts for what, 90% of the laptop buyers? Even more?

Most of the powerful laptops aren't Macs either; that crown goes to gaming laptops where - this is a big surprise - games and their performance count. Apple has no answer there.


> Most of the powerful laptops are neither Macs, as those go to the gaming laptops where - this is a big surprise - games and their performance counts. Apple has no answer there.

Games? The biggest market is the mobile market; the desktop market is shrinking year after year, and it'll become a niche very soon.

Ever heard of Genshin Impact? https://gamerant.com/genshin-impact-made-more-money-in-its-f...

And please learn to project a little when you do some analysis, yesterday is long gone, tomorrow is what's going on

And when I say nobody, I'm talking about the people making a deliberate choice. In that group, it's rare to find people choosing a Windows laptop, other than those replacing their IT fleets, or your granny picking a cheap laptop because she has no clue what an OS is anyway and everything comes pre-installed with Windows, for some reason ;)


Games, yes, games. We're talking about laptops here, not mobile phones. PC gaming industry is huge and continues to be (even in the future iterations you talk about).

When it comes to projections, the ~1300€ laptops are not going to be the ones taking over the world. Not in any realistic projection, unless the inflation goes to Turkey level.

Everyone makes a deliberate choice; don't discount someone's choice as not deliberate just because it doesn't align with yours. I don't pick a laptop for myself, but a desktop - since I don't need the mobility when I need grunt. There the M2 brings me nothing so far.

Your granny might make a more thoughtful choice than you do. She might buy a tool for herself while you're fancying over something shiny. That's not deliberate.


M1 and now M2 allow MacBooks to run iOS apps/games natively, so MacBooks can natively tap the biggest market for games. Moving forward that's huge, as more games are being ported to phones: Apex, Call of Duty, GTA, FIFA, Hearthstone, League of Legends, Fortnite, Valorant (plus indie titles like Dead Cells, Minecraft, Terraria)

And with cloud gaming coming, as the infrastructure improves and fiber becomes more and more popular, every kind of device will be able to run intensive games

Microsoft added Android VM support in windows 11 because they know what's going on, they have access to the numbers


> Games?

Yes, games. Even as a floundering mess, Blizzard rakes in $300-400M/quarter on a single game that is a PC exclusive, without insane microtransactions/loot boxes.


That's exactly why Blizzard is working on a mobile version of WoW, that's exactly why they made a mobile version of Hearthstone, and that's why Riot is porting their games to mobile too. $300-400M/quarter is nothing compared to what others are doing on mobile, and it's a strong IP with a dedicated player base that grew up with the game.

"In the first three months of 2022, Genshin Impact raked in £450million from player sales"


> fanless laptop

See also: "how do you provide ever thinner laptops?"

The reason you're so sensitive to fans is that Apple's (lack of) attention to cooling goes far, far beyond a joke. Modern processors will throttle at their maximum temperature (typically 100C for CPUs). Apple's approach has always been to depend on that as a cooling solution. People have improved the cooling on the M1 to something in the ballpark of the M1 Pro, and performance lifts to match.

M1 has certainly thrown the whole compute-per-TDP equation into chaos, and you'll definitely see more performance prior to hitting tj-max, but when that comes, performance will come crashing down. They don't have fanless cooling, they essentially have no cooling.

If your workloads don't require extended periods of compute, then the fanless Apples can't be beat. Claiming that the M1 beats x86 laptops across the board is, well, "very uninformed " to say it kindly. To say it frankly, fans of high-end devices (whether Windows or Linux) do get very noisy in the face of uninformed bullshit.


> If your workloads don't require extended periods of compute, then the fanless Apples can't be beat. Claiming that the M1 beats x86 laptops across the board is, well, "very uninformed " to say it kindly.

OK, but Apple also makes the MBP line with active cooling too?


Sure, but the parent comment was going on about fans.

Edit: you also get insanely specced x86 laptops that have no business being laptops, so for the few who actually buy those monsters the parent comment is also laughable.


The highest estimates I've seen for Mac laptop market share are 15%, most places put it at 6-8%.


I agree, a fanless cool laptop is preferable over a loud fan and hot laptop


Let’s hope they fixed the 32MB TLB bottleneck


It was super interesting for me to see Apple not directly compare M2 to M1 in any of the graphs. Why not directly tell us how much better it is than its predecessor, as opposed to PC laptop peers?


Huh?

https://www.apple.com/newsroom/images/live-action/wwdc-2022/...

This was the first comparison on the page. M2 has 18% more relative performance than the M1. You can argue about the relative part, but they certainly included the comparison (and do for GPU, etc...).


First graph of the first set of graphs on the page shows M2 as 18% better performance for the same power consumption as M1. First graph of the second set of graphs shows a GPU performance comparison between M1 and M2.


Ah, I missed those while I was watching the livestream, probably distracted with something at work, but what I saw was some 5 different graphs against a 10-core laptop PC, so I was like, what happened to the comparison with M1?


They had a graph with M1 and M2.


Where is the new Mac Pro? Where is the upgrade to the XDR Display? Where is the M2 Mac Mini? What a shit show.


Good faith answer?

Supply chains. Good ol’ supply chains.

The “Mac” is really the “MacBook”—very solid majority of devices sold are laptops, followed by iMacs, then minis, then a teeeeeny sliver of Mac Pros.

Well, probably. People infer it from quarterly earnings. Apple no longer breaks it down explicitly by category. But it’s a very safe assumption the biggest selling Macs, by far, are laptops, and they are prioritizing silicon for those.


For a very long time, the M2 has been expected to launch before an updated Mac Pro - which might still be using an M1 variant, and should be out by the end of the year.


Shitshow? Hardly. This is business, you only release the products that sell. Presumably the Mac Pro and Mac Minis aren't selling. Might have something to do with PCs doing all of the same stuff for less money. Packing power into a thin & cool laptop is where Apple shines now.


The XDR Display did get upgraded — you can now attach your iPhone to it and use it as an external camera for Zoom. Amazing! (The XDR-specific mount will be $249, and you will say thank you.) /s


maybe they're still in development, or test production runs?


Wake me up when other hardware makers catch up so I can finally care, as I won’t be purchasing another Apple brand product, thanks.



