I got an M1 for work and my very quick good/bad review:
It kind of sucks how much stuff wouldn't work on it without special install/compile from source instructions. But that's mainly just growing pains that go away.
There's also the issue that WASM multithreading basically does not work on M1 yet[1].
But, I am in love with a laptop that runs for days (of my 3-4 hours per day usage) without needing to be charged. I look at that laptop like my Kindle more than my phone: charging it is this occasional chore, not a daily "must get plugged in." I cannot quite articulate why, but this is such an awesome thing that's changed my behaviour patterns. It just sits there and I pull it up, use it, shut it, and put it down. I'll see "20% battery life" and think, "I'd better charge it tomorrow" rather than "I'd better interrupt everything I'm doing and move to an un-ergonomic location so that I can sit a hotplate on my lap while it charges."
I also love that it doesn't make noise or get hot.
Oh and another curious weirdness: when I open the lid in the morning, the cursor only moves at 30fps until I close the lid and re-open it.
I got the 16" M1 Max from work and adore it. I do Rails mostly and it's fantastic. We have a Rails based ETL app that pulls data from Oracle into a local Postgres with tons and tons of transformations. The Mac significantly outperforms our production server with Xeon Gold CPUs despite running the app in Rosetta due to Oracle's lack of arm64 support.
I've also never heard the fan, even running at 100% load for hours. I'm sure it's spinning a bit, but you can't hear it and the machine isn't hot. My coworkers' Dell laptops sound like a vacuum just running a Teams meeting.
My XPS 15 sounds like a jet engine when I'm watching a YouTube video, mainly because Ubuntu + Chrome + Nvidia chipset do not play well together, so it's all software-decoded. The solutions are either: "run all these brittle steps to build your own Chromium[1]"
or "go buy a different $3000 laptop because Linux (Ubuntu?) is very brittle when it comes to hardware support."
My third solution, which I promised myself I'd do over the holidays but never got to, is migrating fully over to MacOS so I can stop thinking so much about my tools and get back to thinking about my projects.
Nope, I wrote a dedicated app to do the ETL (which is Rails but doesn't actually run a webserver outside of development).
I'm transforming Peoplesoft data into something actually usable, so it's more complex than any off the shelf tools I looked at could tackle. It's my third time doing one of these so the code this time around is super clean and maintainable.
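Not the poster's actual code, but a minimal sketch of the kind of transform step such a pipeline chains together. Everything here is hypothetical (the `normalize_term` name, the four-digit term-code convention), just to illustrate turning an opaque ERP code into something usable downstream:

```ruby
# Hypothetical ETL transform step: a PeopleSoft-style four-digit term
# code (e.g. "2231") normalized into a year plus human-readable season.
# Illustrative only; real institutions encode terms differently.

SEASONS = { "1" => "Spring", "4" => "Summer", "7" => "Fall" }.freeze

# "2231" -> century digit "2", two-digit year "23", season code "1"
def normalize_term(code)
  raise ArgumentError, "bad term code: #{code}" unless code =~ /\A\d{4}\z/
  century, yy, season = code[0], code[1..2], code[3]
  year = (century == "2" ? 2000 : 1900) + yy.to_i
  { year: year, season: SEASONS.fetch(season, "Unknown") }
end
```

A real pipeline would be dozens of small, well-tested steps like this composed together, which is what makes the third rewrite "super clean and maintainable."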
The university I work for is switching to Workday next year so when we're actually provided details on that I'll be sure to revisit Kiba. If Workday's DB is cleaner it could be a good option.
I have PTSD from melting laptops so I often find myself checking the surface temp of my M2 air to reassure myself that this is real. On the occasions when it heats up a little - when I'm running 3 instances of intellij, a couple instances of vscode, some 20 tabs in chrome, 50 tabs in Safari in addition to Slack, and maybe a youtube video to just get away from it all - it is a bit disconcerting to me.
A performant way to do a whole bunch of GIS analysis on spatial data authored by humans and used by robots. I.e., "when a user uses these web-based CAD-like tools to adjust the city plan of the robot space (e.g. rules of the road, laneways, destinations, etc.), does it produce a usable result?"
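Not from that project, but checks like "is this destination actually inside the drawn zone?" bottom out in geometric primitives such as ray-casting point-in-polygon. A self-contained sketch (real GIS work would lean on a library like RGeo instead):

```ruby
# Ray-casting point-in-polygon: count how many polygon edges a
# horizontal ray from the point crosses; an odd count means inside.
# polygon is an array of [x, y] vertex pairs. Illustrative sketch only.
def point_in_polygon?(px, py, polygon)
  inside = false
  polygon.each_with_index do |(x1, y1), i|
    x2, y2 = polygon[(i + 1) % polygon.size]
    next unless (y1 > py) != (y2 > py)                 # edge straddles the ray
    x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1).to_f
    inside = !inside if px < x_cross
  end
  inside
end
```

The "performant" part is the hard bit: running thousands of these checks (plus intersection and buffering) fast enough for interactive CAD-like editing.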
I am wondering why no one at Intel/AMD/Asus/Dell, Microsoft, etc. has realized that there is low-hanging fruit to be had in the battery life department on current x86 hardware.
The reviewer, with far more limited capabilities than the PC industry giants, still manages to discover that Windows is boosting a single core to its max TDP for single-core browsing workloads, resulting in very high power consumption. Basically, a scheduling strategy from desktops and servers used as-is on laptops! And easily fixable in software!
SoC power consumption is the remaining bottleneck - that's not as easy, but if CPU/SoC/BIOS vendors and OEMs came together it would not be all that hard to fix either.
These two fixes would put x86 into a very competitive place in the laptop battery life department - it's not ARM vs x86, it's the integration pieces. Mind boggling that nothing is being done by PC makers.
It's not that surprising. Have you ever been in a meeting with members from multiple companies? You can hear a pin drop, they are all afraid of spilling secrets or just deferring to management (which is usually also in the room since those tend to be high profile meetings). The only way to get this done is to do it in-house like Apple does. I think MS has sincerely pushed on this with the Surfaces but I don't think they control enough of the vertical below chipset level to match Apple.
Basically, Intel and MS would have to merge, and on a deep cultural level. That's what Apple has.
The "we own the hardware and software" is a HUGE HUGE HUGE advantage for Apple. I don't know how you get to the battery life and basic stability levels of a Mac without that integration.
It's a similar situation to Android vs. iPhones in terms of device integration.
Although iPhones don't really get much better battery life than the best Android devices around, if you compare battery life/perf to the battery size, the amount of battery life they extract out of (comparably) small batteries is extraordinary. The biggest/best Samsung phone is 5000mAh, whereas the biggest/best iPhone is 4300mAh.
You just throw away efficiency at a system level when you design a phone based on another company's design (i.e., Qualcomm, usually, for Samsung). Qualcomm doesn't have appropriate incentives to make a device as integrated/efficient as Samsung/Apple do.
> it's not ARM vs x86, it's the integration pieces. Mind boggling that nothing is being done by PC makers.
Intel spent over a decade and billions of dollars to make a mobile x86 chip that could compete with ARM on battery life and they failed. And now they're getting taken to the cleaners by an ARM chip for laptops. I don't think it's the integration that's the problem, x86 is just too inefficient relative to ARM.
When it's more efficient to emulate x86 on an ARM chip than to use a native x86 processor you have to ask some questions about the ISA and design.
Like, in what way, and how much more inefficient, in 2022? Would it still be that way when Intel finally gets to 5nm/3nm? Is the video reviewer lying or mistaken when he says the two biggest differences that make up most of the power draw gap between x86 and ARM are not in the ISA?
> Is the video reviewer lying or mistaken when he says the two biggest differences that make up most of the power draw gap between x86 and ARM are not in the ISA?
Naive is a better word. The question is really, why don't designers (who have a vested interest in making a feature like battery life better to improve sales) take these "obvious" actions to improve their products? The answer is usually because they're not sound optimizations, or at least come with some heavy tradeoffs.
That's not at all a logical/technical argument - it's just further lazy speculation to claim that the only reason designers don't take these obvious actions to improve their products is that "the x86 ISA sucks". As pointed out, it is hard enough to get obvious things done when there is one company and 3 org units involved - let alone a CPU vendor, an OEM, a BIOS vendor, and an SoC with 10 other chips by different OEMs. Oh, and after that, the OS vendor.
None of those are ISA (https://en.wikipedia.org/wiki/Instruction_set_architecture) issues - they are firmware/os/SoC issues. The other issue is manufacturing process - AMD is getting to 5nm only recently and the results in lower power consumption are already telling.
IOW ARM doesn't have some magical ISA level stuff to do more stuff with less power or avoid doing stuff entirely that x86 couldn't replicate.
Straight-line performance per watt is not the problem IMO. The problem is something along the lines of sleep/wake states and how MS + Intel can’t seem to get it together to handle the times when the CPU doesn’t need to be working.
I think the root issue on sleep is Microsoft/vendor induced - this issue only really began when Microsoft introduced connected/modern standby, and started to incite OEMs to remove BIOS/UEFI support for S3 sleep.
It seems unfathomable - maybe there is a valid reason for modern standby, but I've yet to find it. S3 resume is quick on a good laptop with good firmware. Modern sleep seems to drain the laptop battery quickly for some reason, and yet this has been going on for years now.
Maybe standby drain isn't part of the Windows logo requirements and ought to be?
From what I've heard this is mostly on the board designers and the complexity of how x86 handles the multiple levels of sleep/power states. Taking a conservative approach is done to prevent bricking devices on wake.
I wouldn't blame MS for this. Linux has the same, if not worse, state.
> it's not ARM vs x86, it's the integration pieces
Eh, I'm not sure how true that is. There are no doubt improvements that could be made, but, well, compare battery life on an ARM Mac to a related x86 Mac; it is night and day.
I am not sure if you watched the video - but the improvements in the two areas he points out (stop single-core boosting to TDP, and turn off more unused SoC I/O components on idle) already get x86 very close to M1 power consumption.
> but, well, compare battery life on an ARM Mac to a related x86 Mac; it is night and day.
If I turn the screen all the way down so nothing can be seen, I can peg the processor constantly at 100% of my Intel Mac and still get 10 hours of battery life. The only caveat is that it's a 2010 Core 2 Duo, and what it does in 10 hours an M1 can probably do in 10 minutes.
Mac was already beating Windows machines on x86 when it came to battery life (especially when you consider battery size, too - the actual efficiency was much higher). I dunno how much they beat them on claimed battery life (because IME Apple is way, way more honest than PC vendors about how long their laptops run on battery), but real-world it was like 1.5-2x for a same-sized battery on similarly powerful hardware.
Their iOS devices were also wildly more efficient than Android devices for years (may still be—I haven't had much exposure to Android devices in three or four years) despite both being on the same processor architecture, so it wasn't the architecture per se making them more efficient.
Apple just seems to be the only major vendor that cares much about power efficiency at a software level. There's absolutely a ton of room for improvement on x86—Apple proved it when they were still on that architecture.
What the OS needs is a way to tell CPU-hungry applications no. One misbehaving app (looking at you, Teams) is enough to pull the CPU to max power consumption and wreck the battery life.
Just keeping the clocks low and letting banner ads rotate a little more slowly would make for much happier users. Even if I set the priority of an app to minimum, the OS still does its best to respond to CPU demands.
On macOS, apps can specify a desired Quality of Service per thread, so background threads get scheduled to the power-efficient E cores. I don't know if Windows has any similar mechanism, and POSIX threads certainly don't.
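Outside of Apple's APIs there's no direct QoS binding in most runtimes, but the underlying idea - marking work as background so the scheduler can deprioritize it - can at least be hinted at. A Ruby analogue using `Thread#priority` (a much weaker mechanism than macOS QoS classes, which actually steer work onto E cores; this is only a scheduling hint within the Ruby VM):

```ruby
# Hint that a worker thread is low priority so interactive work wins.
# This is NOT macOS QoS -- just the closest portable analogue in Ruby.
background = Thread.new do
  Thread.current.priority = -3   # lowest hint: schedule me last
  (1..100).sum                   # stand-in for deferrable background work
end
background.join
```

The gap between "priority hint inside one process" and "OS-wide QoS class mapped to efficiency cores" is exactly what the comment above is pointing at.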
> Mind boggling that nothing is being done by PC makers.
Correction: nothing is being done by Microsoft. They own Windows, they own the software, yet they decided to go full bloatware. For Microsoft, the people who complain are "trolls".
And this video is the result: mediocrity.
For some reason, Microsoft always dodges criticism and responsibility - that's what's mind boggling.
They only dodge it with those who still loyally use Windows. Those of us who 'escaped' that hellscape are fully aware what an utter pile of bloated crap it's become. No idea why anyone tries to defend it at this point; it's a shockingly poor operating system and stands out as easily the worst product (be it software or hardware) that Microsoft currently offers.
This is a weird take IMO. I use PC and Mac daily (primarily Mac). Windows works fine in my view. There is really nothing that impacts my day to day work other than some great third-party software that is missing on the PC side.
I can confirm that various Dell XPS 15 models and Microsoft Surface Book 2 and 3 models are just not usable without the power plug for my software development (and very rare gaming) needs. I also need the power plug for a simple thing like a Microsoft Teams meeting that lasts longer than one hour.
Compare that to my MacBook Air M1 which runs for a full day easily. Also, standby works.
With Dell I learned it's better to always shut down.
It staggers me that this is still a problem, because *more than 20 years ago* I switched from Win98 on a ThinkPad to a G3 Powerbook partly because of sleep.
I was working almost exclusively in Office docs back then, and the Mac and Win suites were (then as now) file-format compatible. What coding I did was on *nix servers I could SSH to. And I got real, real tired of often-crashing, slow-booting, sleep-sucks Win98 on a laptop.
Then I noticed a colleague who'd come into the consulting group from the design side, and kept his Mac. He could just open it, do something, and close it. And then open it again, and have it wake up normally. It crashed marginally less often than Win98 (this is pre-OS X), but the boot time was MUCH faster, so the crashes were less annoying. I bought a Mac and have been here ever since.
And you're telling me that even today, in the Year of Our Lord Two Thousand Twenty Three, that sleep still doesn't work for shit on big-name Windows laptops? That's bananas.
It's easier to make S3 work on 20 configurations than on 20,000.
Yes, MS, Dell, and HP suck for not making it work even on flagship notebooks, but that doesn't mean there are no Windows laptops with working sleep; it just means you don't hear from the millions of users of thousands of working configurations... and that HP/Dell/MS just suck in the hardware department.
NB: there were hundreds of laptops from HP/Dell/Asus/Acer that worked just fine. You just never heard complaints about them, because there were none.
All I know is that sleep eventually fails to work on every Wintel laptop I've ever touched, from the mid-1990s until today. Dell, IBM, Lenovo, HP, you name it.
I think it goes back to the unity of control Apple enjoys. Doesn't mean it CAN'T work for Windows, but the vendors would have to work harder to cooperate, and they clearly can't be arsed to do so.
Sleep mode is really bad. I had a Surface Pro 3 which was the first iteration of "connected standby" and it would run the battery down while sleeping in less than a day.
I was borrowing a newer Surface Pro from work more recently and hoped with the additional years of development they would have managed to fix it. Sadly not.
Should add it was newer but not current, maybe they've got sleep figured out by now.
But it was really surprising to me with the Surface Pro 3 because for years you'd been hearing "Microsoft can't make sleep work as well as Apple because they're stuck dealing with hardware and drivers from 3rd parties" and it turns out 3 generations into their own hardware venture it still sucked just as much. Connected Standby was probably a large step back even.
> Compare that to my MacBook Air M1 which runs for a full day easily. Also, standby works.
Huge credit to whoever wrote the standby/sleep code for Darwin on the M1s; I've never had to close my M1 Air's lid and worry about standby not working or it not resuming.
Windows is a mess and unreliable, and of Linux the less said the better - it doesn't even work with Secure Boot on.
Suspend/sleep mode on (recent) Windows laptops seems very hit-or-miss, since everyone started to move away from "proper" S3 sleep, and towards what Linux calls "s2idle" (modern/connected standby on Windows?)
It seems to be a fairly significant regression in terms of functionality, which nobody really asked for, but which results in significant idle drain, coupled with the risk of a laptop deciding to resume and heat up in a bag, etc.
Apple seems to have a (long standing) positive argument here - to point at how when you put a Mac to sleep, you can resume it days later with negligible battery drain. Windows seems to have lost this in the post-S3 sleep era.
I have no reason to believe my current M1 MBP won’t last as long as my 2012 MBP that’s still running. Why, do you suspect a MacBook isn’t going to last as long as a Dell laptop? I’d love to hear the reasoning for that one.
SSD failure. If the SSD dies in an M-series Mac, the machine is a brick - it can't even boot off an external drive in that case, because of how they handle the boot process. Meanwhile, almost every PC laptop uses an M.2 socketed SSD, and even the odd one that doesn't will still boot off an external drive.
When was the last time you had a problem with SSD failure? While I’m sure it happens, I don’t see a lot of complaints from people about their SSD failing.
In 20 years, my Macs have proven to be as well-built, hardware-wise, as the best Windows machines I had in the 90s (so, IBM-era ThinkPads). I have one in the house in use as a server that's 12 years old. Runs fine. It's slower than the M1 I'm typing on now (and obviously battery life would be awful), but if I had to, I could do work on it.
I carry an M1 MacBook Air (for Xcode) and an LG Gram 17 running Kubuntu for everything else (which is most of what I do). The Gram is probably the best laptop I've ever owned (it replaced an XPS 15) - it's bigger (17" vs the Air's 13"), lighter (2 oz less than the Air), and gets crazy life out of the battery.
When new, the Gram got 11-16 hours of battery (it now gets 8-10 hours) under development load (i.e. running a node app, a Go app, and a Django backend, plus support servers). The Air hasn't been used as hard as the Gram, but its battery gets similar performance. On the power management front, everything works on the Air. On the Gram, hibernate is disabled, and sleep works (I've left the Gram in my bag over a three-day weekend, and it was at about 20% on Tuesday when I opened it up). I ran Windows for about a week on the Gram and it got less than eight hours of battery. No idea why.
I've had quite good success with hibernate on Linux. It was a long time ago but I accidentally tried it on a laptop ~13 years ago & it just worked. The appeal was immediate- 3-5s boot/sleep rather than near instant but no battery drain!
I had one laptop where the wifi would only come up every other boot, but that did eventually, after many years, get resolved with a new kernel.
It's been really nice having no power drain on personal laptops! Agreed, I too see pretty excellent power consumption on Linux. The new connected-standby Windows stuff looks so awful - unbelievable power drain. I hope it largely remains a Windows problem and doesn't hit Linux too hard. To be honest, though, 20% after a weekend is much less than I'd want.
Absolutely love my (completely base model) M1 Air, and my M1 Pro MBP supplied by work. Both have shockingly amazing battery life.
My work one was replacing a 2016 Macbook Pro which struggled to last more than a couple of hours when I was doing dev work, all whilst trying to reach orbit with its fans. I've never heard the fans kick in on the new one.
As for the Air, I thought I'd be crippling myself a bit by only going for the basic 8GB RAM model, and other than when I installed a copy of Premiere Pro and tried rendering something, it's been amazing - even then it rendered fine, just took a fair bit more time than it would on a Pro.
I'm not sure I could go back to an overheating, crappy Intel "Mobile" based laptop - they shouldn't even be allowed to call those things mobile chips at this point, they're utterly shite.
This is actually probably the best comparison: latest Intel MBP to M1/M2 MBP. A lot of the discussion here is about Windows and OEMs not doing enough. If that were the case, then why couldn't Apple work their magic on Intel Macs? They had pretty crap battery life and performance too, compared to the new stuff. I doubt they would have gone through the work of developing their own processor if it were just a matter of getting power states and clock boosting right in software...
The last few years of Apple notebook revisions have been pretty incremental, until the M1 arrived. I was happy using 3-4 year old MBA's because the user experience was not that different, but the M1 models were a night and day difference- better battery life and better performance, cooler operation with no fan.
Seems practically impossible - if not technically, then definitely from a business standpoint (why were these potential gains unrealized for so long?).
Same experience here. I have a 16" M1 MBP for work (mostly Go and Rust development), and it's the best computer I've ever had. It's a beast that can handle anything thrown at it without even spinning the fans up, it's insane. It's closely followed by my personal 16GB M1 MBAir which I am currently basically only using as a LaTeX machine to write my PhD thesis on. The Air in particular has AMAZING battery life, I can go days and days without charging it while using it for several hours a day. It's also faster and more responsive than my previous i7 tower with 32GB RAM, and that's saying something.
I keep having to unlearn years of "Laptop = hot" and "<2 hours of battery if I'm pushing the laptop".
My 2014 got warm but never too hot on my lap, but when I got the 2019 i9 it was unbearable to put on my lap without some kind of insulation. Now my M1 Max barely feels warm at all on my lap - often cold - and the battery is really something else.
Sometimes I play Factorio sitting on my couch and with my 2014/2019 MBP I HAD to have the power cord nearby. Obviously I could go on battery for a while but as a rule I always plugged it in while using (see also: I LOVE having magsafe back, the 2019 always made me super anxious with only having USB-C for charging). Now with my M1 Max I can go for hours playing without thinking about power.
Likewise when I go up to see my parents/family I can jump on my laptop a number of times for quick tasks (~2-3 hours over a few days) and I never have to think about plugging it in. Maybe it was something I had installed on my 2014/2019 MBP but I felt like those discharged faster when not in use. My M1 Max feels much closer to an iPad where you can leave it unused for days and come back to still having 80%+ battery.
For me the worst part was not my lap but my left palm (2013 MBA). We got carpal tunnel syndrome; someday there's going to be a new syndrome that comes from constant heat being applied to your palms while using a laptop computer.
I have a 2019 MacBook Pro with an Intel processor.
I have a 2021 MacBook Pro with an Apple processor.
One is a work machine, the other is my personal machine.
---
Things I love about the 2021...
The keyboard is just spot on perfect. No stupid touch bar. It feels good to type on too.
It has MagSafe again.
It doesn't get hot. The Intel MBP has real heat issues. It sucks to use it on my lap, or lying down. Intel MBPs get really hot on video calls... and I spend all my time on video calls. For comparison, I shut everything down on the two machines and video called one from the other... and the 2019 had over 3x the power usage. And the 2021 didn't even heat up!
The screen is "burn a hole in your eyes at night if you forget to dim it a bit" bright. It's amazingly bright.
I thought the notch would be annoying, but if anything now the entire top of the 2019 feels like a notch. Meh. I don't care about the notch; it's never gotten in the way. And if anything, the consistent black space around the monitor on the 2021 makes me really notice how the 2019 has more black space on the top than it does on the sides.
The battery can go all day. Like no joke. And what's awesome is when I travel for work I don't even bring the charge brick with me. I know it seems like a small thing, but it means I can just use a smaller "sleeve" style case and it just makes traveling a little easier. One less thing I have to lug around on a day-flight.
---
Things I miss about the 2019...
The case on the 2021 is just flat out ugly and hideous. It's a total square box. It's thicker, it has more "give" and doesn't feel as solid. It feels cheap and shitty and hollow and all the things Apple laptops aren't supposed to feel like. It doesn't have any of the nice graceful curves that make you say, "Man these guys... how did they do all this?!" It's just a big thick ugly box that "thuds" when you tap on it and has visible give when you grab it in certain places. I think they even changed the finish texture on the aluminum. I just hate the way the 2021 feels. It feels cheap and flimsy. I hate the design choices on the 2021 too. Feels like they went back 10 years.
I've been using an M1 MacBook Pro for a year and a half. It's great, it's quiet, it doesn't get hot, and the keyboard has been reverted to a great one (compared to the previous model).
Silence while I work is something I hadn't realized was so precious - because none of my previous machines was ever able to be silent.
Doing a lot of web dev with Docker Desktop. I can't think of a flaw - it's not that it doesn't have any (it could be better for occasional gaming, say), but compared to other setups available on the market, it's just flawless.
It was this way for Windows for 1.5 decades with Boot Camp. On Intel chips they provided strong support for running Windows natively. In the end it seems like few people used it.
I think it's less about Mac hardware specifically, and more about hardware of that quality.
The best Linux users can do (for a laptop) is a Thinkpad (as they officially have support from Lenovo) or one of the more boutique brands (system76, starlabs, purism) and at best (my opinion, Thinkpads) they trade blows with Apple hardware.
As far as desktops go, building a computer for Linux is simple and easy and doesn't really have that many problems in my experience.
That would probably have either a negligible effect on sales or have a small upside overall. People who would buy a Mac to use with Linux are people who are not using macOS already. I would love to use the Mac hardware but I ain't giving up on Linux. The question is whether there are enough of us for them to make it worth providing support to Linux. Unfortunately, probably not.
I can do almost anything on a Mac terminal that I can do in a Linux terminal. So to _me_ there is nothing that a dedicated Linux environment can give me that OSX can't.
How about a usable package manager, or a usable window manager? Brew is extremely slow and cannot do versioning (as in, installing a package will result in random other packages being upgraded across major versions), and for window managers there are only two non-paid options, which are OK but still poorer than the alternatives available on Linux.
Versioning is done through the package maintainers. If there is no versioning for the package, it's not Apple's fault (you could argue it's Brew's fault for not requiring it).
If you think Brew should require versioning then open a request.
If you don't think there are enough free window managers, perhaps you should create one yourself or contribute to a non-paid project to ensure your wants are addressed.
It appears your problems are with 3rd parties and not Apple.
Packages within Brew are versioned, but Brew is incapable of understanding that and assumes you always want latest on anything.
> It appears your problems are with 3rd parties and not Apple.
No, my problem is that Apple can't be bothered to include basic features in their software, thus requiring endless third parties to fill the gaps pro bono / for money in the hopes Apple doesn't one day Sherlock them.
Maybe when they stop crippling their already expensive hardware to gouge customers.
Where's my multiple display outputs, Apple? I paid way more than any competing device, every one of which allowed for more than one extra display. I'm not a video editor, I don't want to buy an M1 Max just to get triple monitor support back.
It's only the entry level M1 that is limited to 1 extra display, although even then you can use DisplayLink. The M1 Pro supports two and the M1 Max supports 4.
I wouldn’t call the display limitation intentional crippling just yet. It seems like it’s a legacy of the history of the base M1 / M2 chips. Although if they don’t resolve it in the next couple of generations I’ll agree.
The problem is that the normal M1 only allows one external DisplayLink screen on macOS (Linux can get more screens out of the same hardware, I believe, as long as the I/O bandwidth doesn't get exceeded).
In the end, the problem is that the first M1 chips simply didn't have the I/O capacity to drive multiple screens normally like the Intel machines before them could. Higher-end and more recent chips have added more I/O (two screens for M1 Pro, four to five for M1 Max, five on M1 Ultra).
DisplayLink, a proprietary protocol that sends video over USB, was never a problem. The quality and performance of DisplayLink is far below DisplayPort, though, because the video needs to go through software and an additional layer of compression to fit within the USB data speeds.
If you need more than 1 external display you go with an M1 Pro or M1 Max. They support 2 and 4 screens respectively. Buying an entry level model and expecting it to come with higher end features is a user issue, not a product issue.
Hm one display connection seems to be reserved for the HDMI port if there is one. Considering how much trouble I tend to have with HDMI ports, that means the M1 Pro only supports ... one external display.
Well... no, it supports two. Mine's literally sitting on my desk right now plugged into 2 screens and has been for ~4 months. A you issue doesn't make it an issue with the product ;)
Sure, but can I do 2x USB-C to DisplayPort? Every time I plug something in via HDMI on my current Mac it tells me it's a TV. Fortunately, being Intel, I can use 2x USB-C instead.
If their OS weren't a dumpster fire of mind-bogglingly poor UX hidden beneath good and shiny UI (if anyone wants examples: not having the option to set different scroll directions for mouse and trackpad even though the UI implies you can, and useless, undebuggable error messages such as "A USB device is consuming too much power and has been shut down, reinsert it to use it again" without any indication of which device it's talking about, while all of them keep working), and if their hardware weren't utterly unmaintainable, I would share your opinion - but alas, that isn't the case.
I'm sorry, how is Apple not allowing you? You are free to run whatever YOU want on it. Just because something YOU want isn't created, well, that isn't Apple's or anyone else's problem. You should build what you need instead of expecting someone else to do it for you just because you demand it.
You're using the term "own" in a non-standard way to make your point.
When you buy a Mac, part of the deal is that it runs macOS. However, you still own the Mac. You can loan it to others or sell it. You can sleep with it under your pillow.
That is one place where Mac needs to improve, largely because you end up using VMs instead of containers to simulate your favorite container platform for development.
My M2 MacBook Air is the best computer I've owned, bar none. Can confirm the battery life, easily 22+ hours in my use.
My only known downside to low power consumption: when I wake, I usually lie in for 30 minutes of surfing with the M2 on my chest. It's quite cold. I usually visit a compute-intensive site like Ventusky to warm it up for a couple of minutes.
Is your link correct? Because it shows an article on Yahoo News titled "Surgery under sappers supervision: doctors retrieve unexploded VOG grenade out of soldiers body", which seems to have nothing to do with the topic of your comment.
I've had my M1s for over 2 years and not a day goes by that I don't think "Thank you Apple, this has made my life better." Absolute game changer. Has helped me do things that were previously impossible.
The battery life is amazing. But just as amazing is that my hands don't sweat when using my laptop anymore because it doesn't get anywhere near as hot as other laptops when under load.
On the flip side, I've sold every one of my old Apple laptops, usually for a few hundred bucks at least after 4-6 years of use. After 4-6 years I can get maybe $100 or a little more for any of the Lenovo and Dell laptops I owned.
I just breathed new life into a 2015 MacBook Pro: OpenCore Legacy Patcher to install Ventura, a 2TB NVMe SSD with the help of a random Amazon adapter, and a new battery in the mail. All can be had for a couple of hundred dollars or less.
I hope to delay by a year or two the inevitable purchase of a nicely specced MacBook, now that even the disk can no longer be upgraded, if I understand correctly.
I’ve administered several hundred Macs for a few years, and when they do break you’re at Apple’s mercy for turnaround. The business I worked for had surplus, so not that big a deal, but it would be a bitch for my personal business and life.
If all I need is parts then it’s a day max, or I can switch to whatever can run Fedora halfway decently.
Flipping again, I just go to the Apple Store, of which there are multiple in any major city. I was on a Surface for a while but Microsoft closing all their retail stores meant my next upgrade was to a MacBook.
Apple stops giving you software updates after seven years, which makes such a device unusable for normal use (browsing, banking, coding, etc.).
Also, the SSD will fail sooner or later.
Can't confirm: my parents still use one of my first work laptops, which is over 12 years old and runs Linux. It's enough for all their casual use cases.
Edit: throwing away laptops after 5 years won't help us live in a more environmentally friendly way, don't you think?
I am running code right now on my desk (archival ops) on a ~10-year-old MacBook Air (MacBookAir6,2 per System Report; 8GB RAM, 512GB SSD). Apple replaced the battery for $100.
We are already in a situation where, if you buy 32GB of RAM for your PC, something other than RAM is likely to be the bottleneck for the next 5 years.
Also, disk requirements are not increasing so drastically anymore, unless you are a gamer.
I had an ASUS that was only marginally thicker than a MBP, with upgradable RAM and storage (which puts it miles ahead of the MBA, which I ran out of both on immediately despite using it for almost nothing). The battery was likely also replaceable; I just never tried.
I just pay Apple for insurance and they replace my battery at no additional cost. And if I ever spill water on the laptop or damage it in any other way that's my fault, they replace it for a fraction of the actual cost of the thing.
It's not difficult to upgrade the RAM and storage on an M1. The problem is the limited availability of memory chips. But if you have pre-balled chips and a hot air station (even a cheap one will do), it's fairly quick to swap them out.
Even if you only have used chips, re-balling them with the right stencils is relatively easy.
I think legislators should compel Apple to make these chips available to the general public so such upgrades can become mainstream.
If the general public isn't comfortable doing that themselves, I'm sure any repair shop would be more than happy to do such an upgrade for a customer.
Batteries on Apple laptops have always been pretty easy to replace (if not the RAM and storage), just not swappable on the go. No glue, just screws; not sure if that's changed recently, though.
That hasn't been the case since the Retina models. My mid-2013 Retina MBP had to have the battery replaced, which involved replacing the top case, trackpad, keyboard... and the battery.
And then the charging circuit on the motherboard failed, for which only a like-for-like replacement was permitted. I had hoped to purchase one with more RAM, so I bought a ThinkPad instead of fixing this one.
No contest on battery life with MacBooks. Or on the trackpad, keyboard, screen, and performance, all in a very lightweight machine that just works.
Apple seems to have (more than others at least) attempted to deliver things that people actually value.
It often feels like there are two factions working against each other in Apple.
One faction believes they know best, and have a specific aesthetic in mind when designing products (for example that Magic Mouse that charges on the bottom, which was a design decision to prevent people using it as a wired mouse).
Another faction seems to try to bring what humans actually want to technology.
Everyone says they want a fast computer that can compute half the galaxy, yet when they have such a machine they end up jailed to their desks, only semi-portable.
Everyone says they're fine with 5-10 hours of battery life, but when you give people 18 hours, suddenly everyone talks about being "free", which is something I experienced myself.
It's hard to reconcile, since some people seem to hate Apple almost religiously, and they have some really great points! But nobody seems to be competing with Apple on the same terms. There is no other manufacturer of computer equipment whose whole package isn't a letdown in most areas; they usually beat Apple products in only 1 or 2 categories.
Whether that be chasing the numbers (i9 at 5GHz! 128GB of RAM!), some feature that's pretty good but not fully fleshed out, or "being a brand" and the aesthetics that go with it (Sony Vaios, for example).
The charge port on the underside of the Magic Mouse catches a lot of flak, but I don't really think it's a big deal. I had a Logitech mouse that charged that way and it was never an issue. It charged at a rate of 1 hour of use per 1 minute of charging (and I assume Apple's performs similarly), so even if it ran out of battery while in use, it was not a big deal to recharge.
> But nobody seems to be competing with Apple on the same terms. There is no other manufacturer of computer equipment that has a whole package which isn't a total letdown.
Very true. For the last 10 years, I've been looking at alternatives to MacBooks in the Windows ecosystem: there is none that gets the whole package right. With every alternative I've considered, there is always something they get wrong that Apple gets right. The lack of care for the end-to-end user experience is quite fascinating. The closest I've found are the Surface laptops.
Yes, the trackpad is a great example :) And also reliable sleep and resume. Vents in the screen hinge instead of on the bottom, to let you use the laptop on your lap. A keyboard with no flex when you type. Etc.
> for example that Magic Mouse that charges on the bottom, which was a design decision to prevent people using it as a wired mouse
This is actually not an issue at all. The bottom edge of the mouse sits really close to the desk surface, so fitting a charge port anywhere else isn't easy.
The OS tells you the battery level, so I've never had one die in over 10 years of use (same back when I had to put in AAs). I just plug it in at night and it's fine by morning.
You can charge it pretty quickly too. I really wish we'd get an update to it, though, to add wireless charging. There were wireless battery packs for the last version (the AA-battery one) which worked great.
A mousepad that charges would be neat. Comically, I find that I’m back on mousepad, not because of the need for traction or resolution but just to define a little bit of workspace.
The best of all time, IMO, were the old silver G4 PowerBook keyboards. The response was great (somehow both tactile and cushioned?), and the scalloped keys felt wonderful under the fingers.
That keyboard was probably too thick for the modern era though.
I understand the whole "stuck in a walled garden" complaint when talking about iOS. But exactly which executables have you not been able to run on macOS because Apple prevented you from doing so?
I use both a MacBook Air M1 and an X220 daily. I have the 9-cell battery for the X220 and run Linux on it. Battery life for most of the work I do is similar on both.
There are some interesting cases: running Docker drains the M1 (ARM images only) faster than the X220, while doing TypeScript work drains the X220 faster than the M1.
AFAIK, on macOS Docker starts up a large Linux VM and has it run the containers, while on Linux it uses the host OS for that. x86 or ARM, it's the same. That explains the extra battery usage (and RAM) on macOS vs Linux.
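You can see this for yourself with a quick sketch like the one below (assuming Docker Desktop is installed; the guard keeps it harmless on machines without Docker). Even on a macOS host, a container reports a Linux kernel, because it is actually running inside Docker's hidden VM.

```shell
# Print the host kernel name: "Darwin" on macOS, "Linux" on Linux.
uname -s

# A container always reports "Linux", even when the host is macOS,
# because it runs inside Docker Desktop's Linux VM.
# (Guarded so the snippet still runs where Docker is not installed.)
if command -v docker >/dev/null 2>&1; then
  docker run --rm alpine uname -s
else
  echo "docker not installed"
fi
```

On Linux there is no such VM: containers share the host kernel directly, which is why the extra memory reservation and battery cost only show up on the Mac.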
Except their OS updates reduce battery life by 2-3x, effectively zeroing out any gains.
I have a 14" M1 MBP that was doing just dandy until Ventura came along and now my battery life is a joke. It drains a ton from just light coding sessions (I'm talking 30% in 90 mins or so).
The M1 at its introduction was a little too good at its price point. The cynic in me says it will get hamstrung in software by piling a lot of useless 'ML' crap onto the OS that makes it perceivably slower. I've had my M1 laptop for two years and it still feels like it just came out of the box. I'm holding off on Ventura for as long as I can. The page showing the new features of macOS is a laughable piece of marketing, desperately trying to find added value or frame things as such.
People have been hating on the new System Settings app (a messy redesign that violates Apple's own design guidelines) and how almost every other enhancement is really minor.
However, I've really enjoyed Ventura, because of the following (non-exhaustive list):
* As a Mac admin: enterprise managed updates actually work compared to Monterey!
* As a Mac developer: SwiftUI graphics rendering is significantly faster, closing in on AppKit levels of performance with no changes required!
As a regular user:
* A battery life leak that I noticed with Monterey 12.6.1 was fixed and my battery life became much better
* The Weather app is absolutely amazing
* Live Text is even better than before
One issue that's been irking me is that I've had a lot more accidental Siri triggers while talking to other people, and even while watching movies and videos through the speakers. I'm going to re-do the Hey Siri set up to see if it fixes the issue.
The settings page doesn't work, for one thing. Not just badly designed (which I believe it is); it's just plain a buggy mess.
I've also had to manually fix permission problems on directories that the UI does not show as granted but are actually enabled and allowing file reads/writes (the UI shows permissions as not granted when they have been, which is incredibly dangerous).
I don't mind the new organization of System Settings.app, but the search field is extremely slow and doesn't always return a sensible top result. I was skeptical of it at first, but I think it was probably the right direction, assuming they continue to improve it.
It reduced battery life and increased resource consumption for me on the M2. I do roughly the same things every day, so I'm used to knowing when I need to plug in and can plan accordingly. Ventura has higher resource usage, and I need to plug in 70-90 minutes earlier, sometimes up to 120 minutes earlier.
Have you looked at CPU/energy utilization of applications during this? It could very well be that some update to your workflow is causing increased consumption you're not aware of.
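A rough first look at this is easy from the terminal (a sketch; `pcpu` is the %CPU column keyword and is accepted by both BSD and GNU `ps`):

```shell
# List the 5 processes currently using the most CPU.
# -e: every process; -o: choose output columns (pcpu = %CPU).
# Works on both macOS and Linux; the header row sorts to the bottom.
ps -eo pcpu,pid,comm | sort -rn | head -n 5
```

On macOS, Activity Monitor's Energy tab and `sudo powermetrics` give finer-grained, energy-focused numbers than plain %CPU snapshots.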
Yep, it seems like most of the CPU/energy goes to PyCharm/iTerm/Docker. This is the same set of tools and the same workflow I was using pre-Ventura, and I can very much notice the hit.
Contrarian anecdote... my M1 16" MBP is the best laptop I've ever had (and I have had quite a few). And I develop full time (lots of big Java, Go, and Rust builds). No battery issues at all.
While I'd love to be able to go back to my Thinkpad Debian life, the idea of leaving the combination of a great screen, superb battery life, large developer ecosystem, and (newly, the butterfly keys were trash) ok keyboard would make me want to cry.
The butterfly keys were a brilliant system that was objectively better than the keys they replaced (and were in turn replaced by). Unfortunately, people preferred the "feel" of the old keys (myself included). They also broke constantly and expensively in the early generations (although Apple picked up the tab on this in the end).
The modern keys are orders of magnitude better for typing. The low profile butterfly keys were just not durable.
I have large hands and often bottom out on keys/keyboards. I went through three MacBooks in two years at my last job. With the new MacBooks, I have no problem.
I think there was an update in the past that drained the battery more than it should have, BUT since the latest update I have the feeling that power consumption is even better than it has ever been on the 14". I watched Netflix for hours and hours and hours over the weekend. PS: I switched from Safari to Chrome; not sure how big the impact on the battery is with regard to media streaming.
Can't say I've had the same experience. Ventura's had zero noticeable impact on my battery life, and I leave it on battery almost all day whilst doing software dev.
The majority of the energy use appears to come from PyCharm/iTerm/Docker, which is the same workflow I've had since pre-Ventura, but with a noticeable battery hit now.
SoCs, and ARM especially, are the future, I don't doubt it. I am thankful Apple is pushing the industry to advance. But :-) until these kinds of battery advancements trickle down to Linux and Windows, I really don't care to be in Apple Prison(TM) as a trade-off.
Also, Apple has a long and storied history of specification marketing hyperbole, so I am extremely skeptical these chips will get 'real-world' 2x battery life.
I will keep building my cased x86 behemoths and refurbished Thinkpads as long as there is stock available :-) Hmm, where did that rotary phone go...
It's amazingly true. And it's not 2x, it's 3-5x depending on workload.
I've got an i9 MBP, and I've been using an M1 MBP for work for the last year. The M1 I don't even need to plug in; I can work on it for 2-3 days on a single charge. The i9 I essentially run plugged in all the time.
Also the fans never even start on the M1. They run a LOT on the i9 and it throttles and gets very hot often due to my workloads.
Apple may have a history, as you say, and I wish they'd publish more quantitative metrics -- but this thing really works. It's life changing.
The 'real-world' battery life genuinely is at least 2x (most of the time WAY more)... that's not marketing hyperbole, it's real-world people doing real-world jobs telling you.
What is your take on the "Apple prison" metaphor? I hear this a lot, but I can install whatever I want on my Mac and have zero issues doing what I need to.
It's true that iOS is a walled garden. I suppose if you have serious work to do on a smartphone without a keyboard, then the limitations could feel like a prison. I question how many people need to do this and wouldn't have a laptop handy, but that's just me.
I think battery life is important, but realistically, for a laptop (not a phone), how often are you not near a power source? I don't really think it's a top priority once you reach a certain point of battery longevity. ARM is always going to have a battery advantage over x86/x64; it would make more sense to compare the M2 to a PC using ARM anyhow.
Even though it seems like a minor thing, it is pretty liberating to just close up your laptop and carry it around without also carrying a power brick. Even if you're the type who always has a bag and does not mind carrying extra cords, it still is liberating to not have to hunt around for a power outlet. Just set up in the most convenient/comfy place and get to work.
I can work for 4-6 hours on battery anywhere; after that I have to charge. That's with a Surface Pro i7. My point is: how often are you going without recharging your battery to begin with, and how often are you out of reach of a plug? At some point it is no longer a top priority for most consumers. Also, comparing x86 chips to ARM chips on battery is nonsensical, given that ARM will ALWAYS win; it's not even a discussion. Why that is downvoted is insane. The performance is less about "Apple silicon" and more about ARM having a massive advantage in power consumption. Compare it to a PC with an ARM chip: it will still beat it across the board, but at least it will have been a fair comparison.
> it is pretty liberating to just close up your laptop and carry it around without also carrying a power brick
With USB-C (thanks, EU!) that's not a problem anymore; you can bet someone else already brought one, or the coworking space/bar/friend's house/whatever place has a spare.
[1] https://bugs.chromium.org/p/chromium/issues/detail?id=122868...