I'm a developer as well and don't use apps like VSCode or Java/Gradle... but instead develop open source projects in Python and Rust.
1. For my open source projects, running tests has sped up so much that I find it more of a joy to work on these projects now. Test suites that took ~60 seconds on my 2017 MacBook Pro i7 now take just ~15 seconds. That's a HUGE win in the edit, run-test-suite cycle.
2. My Rust projects now compile in WAY less time. Something that used to take 3 minutes to compile + link for a debug build is now down to 40 seconds. Once again, this has increased my productivity in the edit, compile, test cycle.
For other projects like node/npm the speedups are also massive; running the various pipelines that take source code to a final deployable artifact is much faster.
I use vim to develop, and it runs on Rosetta just fine, so I can continue to use all my existing plugins. I don't extensively use docker containers, but for those I connect to the cloud instead of running them locally, which is a move I was already making even on my Intel based MacBook Pro.
I got the 16GB 13" MacBook Pro. I, like the blog author, was thinking of this as a secondary laptop, something fun to have around, but I have hardly touched my old 15" MacBook Pro because this one does everything I already need. And does it faster.
I've been living with 16GB of RAM forever, so maybe I am used to fitting in that memory space... I avoid Electron based apps like the plague, and with being able to run iPad/iOS apps on the M1, I've been able to cut down even further since things like Authy are way faster to launch when they aren't large Chrome based hogs.
I am looking forward to the 16" MacBook Pro's with Apple Silicon based processors, but only because I do miss the additional screen real-estate, and prefer the larger size. But the 13" MacBook Pro M1 has been absolutely fantastic so far.
>I've been living with 16GB of RAM forever, so maybe I am used to fitting in that memory space... I avoid Electron based apps like the plague, and with being able to run iPad/iOS apps on the M1, I've been able to cut down even further since things like Authy are way faster to launch when they aren't large Chrome based hogs.
iOS apps on the Mac are a game changer for me. The UX isn't 100% perfect, but it sure is better than web apps, which are very often slow, resource intensive and kill your battery. I haven't touched the Twitter web app since I've installed the iOS/Catalyst app on my Mac. Even with its iOS-optimized user interface, the better performance of the iOS app makes it so much more enjoyable than the web version.
I'm very much looking forward to seeing other iOS apps arrive on the Mac, to replace their web variants.
There's really something to be said about how web developers have been killing the performance of their websites with their bloated frameworks or whatever. I know web apps will always be at a performance disadvantage compared to native apps, but the performance regression has gotten absurd. Using Reddit via the Apollo iOS app is a quick and buttery smooth experience, meanwhile the redesigned Reddit webpage is a laggy stuttering mess on my MacBook Pro.
> iOS apps on the Mac are a game changer for me. The UX isn't 100% perfect, but it sure is better than web apps, which are very often slow, resource intensive and kill your battery. I haven't touched the Twitter web app since I've installed the iOS/Catalyst app on my Mac. Even with its iOS-optimized user interface, the better performance of the iOS app makes it so much more enjoyable than the web version.
Totally agree, particularly for small utility apps and the like. Most good iOS apps launch instantly and sip power even on comparatively limited iPhone and iPad batteries, so the impact they have on a Macbook battery should be close to nothing. Night and day compared to electron stuff.
Similarly, it’s a joy to use the Jira app for macOS which uses Catalyst to port their iPad app to macOS. Compared to how long it takes to load and use Jira on the web, it’s just a bit snappier and nice. The dark mode support is also appreciated, though aesthetics could still improve to be a bit more mac-native... and because of Jira’s design, it’s possible the app will always lag in functionality compared to the website. But still— progress! :)
Are you serious? The JIRA app is by far the worst native Mac app I've ever used. If their website is anything to go by, it's not the underlying technology's fault, but I found the JIRA Mac app to be just as slow as the website.
I'm serious, yes, it's better. I'm not going to say it's as good as other apps. I mean, if you're keeping a to-do list, better to do that in Apple Reminders -- especially now that they've added the new shared-list feature and assigning todos. But if you have to put up with Jira, then yes, I personally find the Jira app to be faster than the website. It reminds me why I like apps as much as (or more than) websites.
I've also run Jira on a server before, and if you optimize the Java runtime's settings and over-provision, you can get a pretty fast Jira instance going. It's just that most folks rarely touch the default configurations, which are terrible, or don't shell out for SSDs when running it. And the cloud version... well, it seems a bit under-provisioned if you ask me. Jira doesn't appear to be written well enough to take full advantage of the cloud, yet.
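For the curious, the tuning I mean is mostly just giving the JVM a bigger, fixed-size heap and a modern collector. A rough sketch, assuming a self-hosted Jira Server where the heap is configured in `bin/setenv.sh` (variable names as in Atlassian's stock script; double-check against your version):

    # <jira-install>/bin/setenv.sh
    JVM_MINIMUM_MEMORY="4096m"    # pin min == max so the heap never resizes under load
    JVM_MAXIMUM_MEMORY="4096m"
    # extra JVM flags get appended via this variable in the stock script
    JVM_SUPPORT_RECOMMENDED_ARGS="-XX:+UseG1GC -XX:MaxGCPauseMillis=200"

That, plus fast disks, covers most of the "default configs are terrible" gap.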
The Jira Cloud iPad app does not appear in the App Store. Apps can opt out from the macOS App Store, and this is especially useful if there already is a macOS version. Why? Generally, any macOS version of an app -- even one like Jira which is nearly a straight port of the iOS app -- is 10x better than the iOS version. Running iOS apps on the M1 is a lot rougher around the edges compared to running Intel apps on the M1. I expect Apple will polish up how iOS apps run on macOS in future releases, if they maintain the feature in its current form.
I think a lot of the performance gain is due to switching from an aging 2017 Intel processor to a modern processor on a recent TSMC node (5nm in Apple's case).
Had Apple switched to AMD Ryzen processors on TSMC's 7nm node, the performance improvement would have been similarly stark.
Light travels about a foot in 1 nanosecond so moving RAM from a foot away to touching the CPU saves 2 ns in round-trip latency. That's a big win on modern systems.
In reality it's a bigger effect since electric signals travel slower than light. And the effect is yet bigger because off-package memory must be accessed through buses with several levels of synchronization and buffer and gate delays.
Apple gets away with "unified memory" in the M1 because having the memory on-package means a good deal of the bus and sync and contention logic becomes unnecessary. So everything that touches RAM gets a lot faster. And almost everything touches RAM.
Plenty more advantages than just round-trip latency. The fact that it's a single SoC in turn reduces traces on the motherboard, freeing up real estate.
We are finally reaching the convergence point between smartphones and PCs. It has been in the making for a while.
Now, whether that is a good thing for consumers (losing upgradability) is another question. But PCs have been becoming less and less flexible anyway.
We used to have slots for everything (GPU, network, audio, ...). Now we have ATX boards where at most you put in a GPU. SATA is going the way of the dodo, with M.2 becoming the new standard for storage. I mean, what do we even change on PCs anymore... CPU, memory, GPU.
GPUs will become external devices with USB4. Mass storage has been moving more and more to NAS or USB3 devices. The people who need 4+ HDDs are the exception, and they can get away with USB3 hubs + external enclosures.
I have questioned for a long time why we still have chipsets that artificially segment motherboards when the difference between them has become very small anyway. You can easily move that last bit of IO into the CPU SoC.
The day we buy AMD or Intel SoCs with some default CPU+RAM+IO combination is probably closer than most think.
Separate hardware is probably going to become a server / workstation / Pro-only feature (with big $$$ prices).
The reality is that hardware has reached a point where most people don't even upgrade for years anymore, and it's a smaller group / minority that really needs ultra-fast hardware.
Flexibility has moved from big motherboards to external devices connected over high-speed links.
Sorry if I have gone a bit off topic, but when you mentioned the onboard memory, it got me thinking about how we really are moving to a SoC/NUC/... future even for powerful hardware.
That's not entirely true. My Ryzen 4900HS laptop gets great battery life with integrated graphics. Not as good as the M1, but it's a Zen 2 chip and we have yet to see what Zen 3 mobile will do.
Apple clearly has a huge advantage here with a cutting edge process and, for all practical purposes, unlimited money to throw at this. Their cash on hand is more than AMD's full market cap.
My friend sent me a screenshot from `powercfg /batteryreport` on his Asus Zephyrus G14 (Ryzen 9 4900HS) and it had consumed 16,704 mWh of battery in 1:47:27 of active use. That tracks to about 8.5 hours (capacity showed just under 80 Wh).
Going by TSMC's claim of 30% better power consumption at iso-speed, that would put a hypothetical 5nm G14 at 11-12 hours. So that places the Apple system at a 25-36% battery-life advantage, instead of 80-100%.
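For anyone who wants to sanity-check the arithmetic (usage and pack size are pulled from the battery report above, and the 0.7 factor is TSMC's ~30% claim):

    awk 'BEGIN {
      draw = 16.704 / 1.79                    # Wh used / hours of active use = avg watts
      printf "avg draw    ~%.1f W\n", draw
      printf "est runtime ~%.1f h\n", 79 / draw          # ~79 Wh pack ("just under 80 Wh")
      printf "5nm guess   ~%.1f h\n", 79 / (draw * 0.7)
    }'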
And still, there are more catches: Apple is probably driving its screen at 60Hz versus 120Hz for the G14, and the screen tends to be one of the main drivers of battery life.
What I'm trying to say is that the talks revolve around the M1 vs x86 for battery consumption, but the big savings can probably come from other components in the system.
My Lenovo ThinkPad X390, which I've been using for a while as I write this, still has ~8h left to go.
Chrome with 30 tabs, IntelliJ, etc. open and running.
I do assume the M1 is doing a great job, don't get me wrong, but I would like to see a more objective comparison.
Personally speaking, the most interesting thing for me is that Apple now puts a very strong SoC in all its models, from Air to Pro, at a relatively good price point (as long as you don't upgrade anything...).
Do you think moving from Python 3.6 to 3.9 had any impact whatsoever? I'm surprised by such a large improvement in performance, as other reports I read indicated similar-to-slightly-worse Python performance on M1 vs x86.
I run the test suite across all of those Pythons: 3.6, 3.7, 3.8, and 3.9, using tox to drive the automation and test against every version to validate compatibility and that nothing breaks.
So there is no moving anything; the Pythons are compiled using pyenv for the Intel platform.
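If anyone wants to replicate the setup, it's roughly this (the version numbers are just examples, and the tox.ini is a minimal sketch rather than my real config):

    # build the interpreters (Intel builds, running under Rosetta on the M1)
    for v in 3.6.12 3.7.9 3.8.6 3.9.0; do pyenv install -s "$v"; done
    pyenv local 3.6.12 3.7.9 3.8.6 3.9.0    # expose all of them to tox

    cat > tox.ini <<'EOF'
    [tox]
    envlist = py36,py37,py38,py39

    [testenv]
    deps = pytest
    commands = pytest
    EOF

    tox    # runs the suite once per interpreter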
Which reports are you reading? Are they using C libraries or doing heavy math computation where there's no optimized ARM version yet?
Okay, I'm not finding any benchmark that supports the idea of a slowdown, so I probably remembered some rumour I read on Twitter or Reddit.
What I am finding now is that Python works more than fine on the M1, be it through Rosetta or native code, which is consistent with your experience.
Also, to clear things up a bit, I parsed your "3.6 -> 3.9" as meaning that you had moved from Py36 on Intel to Py39 on Apple, which is why I asked if maybe there were speedups or differences from moving to 3.9.
Ah, my apologies for not making that more clear. Currently all are running under Rosetta.
I was unable to get all the versions I needed compiled natively, which means for now I will use the non-native versions.
I have tested Py3.9 compiled for arm64, and one particular test suite went from 9 seconds to 6 seconds. Other test suites were not as dramatic, especially larger ones, where it might go down from 40 seconds to 36 seconds.
Still a speed improvement, but not as dramatic as the 2017 MacBook Pro i7 -> 2020 MacBook Pro M1.
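If you want to check which flavor a given interpreter actually is, this is the quick way (paths assume pyenv; adjust for wherever your builds live):

    # "x86_64" means it runs under Rosetta 2, "arm64" means a native build
    file "$(pyenv which python3.9)"
    python3.9 -c 'import platform; print(platform.machine())'

    # if you have a universal build, arch(1) can force it under Rosetta for comparison
    arch -x86_64 python3 -c 'import platform; print(platform.machine())'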
> I'm honestly for the first time in a decade a bit excited about an upcoming computer,
I think we are more excited about a shake-up in the whole computer world, one that may mean the death of ATX/motherboards/etc. as we know them, and a more profound move to eGPU/NUC/laptop-like powerful devices with low power usage.
This feels a bit like a wake-up call to the industry. Like how the first commercially successful smartphone, the original iPhone, fueled an entire industry to massively evolve for over 10 years. Which is kind of funny, because I felt like smartphone evolution was starting to lose steam. The convergence between PCs/laptops/smartphones will begin now.
Nobody actually knows, but I remember Apple saying something like a 2 year transition (which includes new Intel products). So if you're not in a rush that's likely the longest you'll wait.
It hasn't been merged yet as some of the dependencies aren't available, one of them apparently being LuaJIT, but if you build it yourself without that it shouldn't be a problem.
Gluing yourself at the hip to homebrew is a bad idea IMO. It's trivial to install things yourself. It's just a matter of `./configure && make && make install` and you're done.
Why do a `git clone pkg && cd pkg && ./configure && make && make install` instead of just a `brew install`? The whole point of a package manager is to make installations and upgrades easier.
You also need to track down and install all of the dependencies, and the dependencies' dependencies all the way down, and figure out the right configure flags to pass, what environment variables it needs, where it expects to look for header files, and the like.
Doing all of that from scratch takes hours and hours. Homebrew lets me not worry about that, and instead be productive.
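To make that concrete, Homebrew will happily show you exactly what you'd otherwise be chasing down by hand (MacVim is just an example formula):

    brew deps --tree macvim    # the full recursive dependency tree
    brew info macvim           # options, caveats, where it installs
    brew edit macvim           # the formula itself: configure flags, patches, env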
I'm doing my Rust development in a Docker container deployed on AWS w/Fargate and EFS to store my source, via VSCode Remote on my M1 MacBook Pro :) Thin client, arbitrarily fat server. VSCode Remote makes it a very pleasant experience. I'll document my experience once I've smoothed out the kinks.
The nice thing is I can just wait out the compatibility issues re: the ARM migration, especially with some of the third-party crates my Rust projects depend on currently failing to build.
Being able to easily install it using Homebrew is really where that need came from. I can use the existing bottles, and the recommendation right now is to not mix and match Intel/ARM binaries.
Also, the ARM version of Homebrew is not yet ready for primetime; there is a ton of software that doesn't compile for ARM yet (including MacVim).
Why are you using a laptop for development in the first place unless you have to be on the move? Why not take advantage of even faster CPUs or GPUs, more RAM, many times more screen space, more disk space, ergonomic keyboard and mouse, or even real speakers if you work from home?
In a similar vein: "Why are you using a Mac for development in the first place unless you have to develop software for Apple devices? Why not run Linux if everything you write runs on Linux servers?"
Your post reads less like a sincere question and more like "you should be doing it this way unless you can defend your position". Maybe--just maybe--people weigh up tradeoffs differently than you do personally.
FWIW 'on the move' can mean going into a meeting room or even just walking over to a coworker's desk with your computer. Desktops kinda suck when you work around humans who don't just stare at their screens with cans on all day.
You've touched on something that irks me to no end — the tendency of many Elite Coder Bros to say things like, "Didn't you see my enormous headphones on?! Can't you see I'm coding — don't you realize I think for a living?!"
Yeah, dude, we're all thinking. As a hands-on programmer-manager I can report that creating slides for the fundraising deck that'll raise money to pay you or writing that email about you taking bags of beef jerky home for your post-workout recovery meal can take as much sustained concentration as you spinning in your `while 1 { copy_compiler_error(); stack_overflow_it(); paste_code(); }` infinite loop.
I tend to get given large projects to do. I'm fairly senior, and once we've got past the brainstorming stage, the ideas have flowed forth, and the direction is, if not clear, then at least aspirational, I often get given the job of breaking ground.
Here's how I think.
I model interaction between distinct parts. I have a mental model of how X fits with Y, how X can affect Y, and how Y can in turn affect X. I'm not doing it with just X and Y, I painstakingly build this mental model[1] over as much of the problem space as I can, and having done this for many years now, I can cover a fair amount of ground before the complexity gets beyond my ability to model. It takes a while to create this, and then when some middle-management type wanders along, taps me on the shoulder and disturbs the concentration, and it all comes crashing down around me, I am less than best-pleased. Bonus points if it's just to "touch base" or "remember we have a meeting in 2 hours time", or ... you get the picture.
Why do I expend this enormous mental effort to gain such a fragile and ephemeral state ? Because I can mentally throw boundary conditions at it and "see" how things will react. It's how I deal with inherent complexity of large systems, and a couple of hours of mental effort can prevent me spending a week coding down a dead-end path. It's happened often enough now that even my line management understand it's worth the time - there's been plenty-a-meeting where I go in and say "yeah, I know we wanted to do <insert X> but I think there's a problem when Y and Z come into play under conditions A, B and C. I think <insert option gamma> is a better route even though we didn't think so at first".
Sometimes you really do just need to be able to be left alone and think. As someone who used to own the company before he sold it, and who's done pitches to VC's and other investors, I can quite categorically state that (for me), the slides, presentations, and client management is nowhere near the level of mental investment. Nowhere near.
Just my $0.02
---
[1] It's not visual, I have aphantasia, it's more firing-condition-based.
My wife and I discuss this all the time. Not all jobs require the same level of thought. Her work involves a lot of mechanical movement, practice, skill, talent, and some thinking. But most of her day-to-day can be done while listening to a podcast, like she's driving.
I can't do that.
So no,
We're not "all thinking", some tasks require deep thought and long periods of uninterrupted concentration.
The problem is that working in a team requires communication and management of interruptions; it is not useful to be the dick with the headphones on, which creates barriers (the attitude, not the headphones) to communication.
I do think it makes little sense, and it's a message board; sometimes things will be said directly. If a preference is not rational and has negative consequences then it's worth challenging. Who knows, maybe somebody will realize that squinting at the text on a 13-inch screen while looking down, with their hands in unnatural positions for many hours each day, is not great for them. I never said that you can't have a laptop to take with you. I'm talking about development like the OP was - where things like compile times matter and where you can very likely remotely access your desktop if you want to show things to others.
Much of the development world has moved to laptops, docks for ergonomic desktop use, and cloud compute for when beefier hardware is needed. As a developer, I haven't used a desktop machine either at work or at home in well over a decade.
Sorry to hear that. I tried to use a top-of-the-line 2019 MacBook Pro for some development while on the road and it doesn’t come close to a desktop, even when it’s augmented with external monitors, etc.
“Compared to the desktop (sitting), tablet and laptop use resulted in increased neck flexion (mean difference tablet 16.92°, 95% CI 12.79-21.04; laptop 10.92, 7.86-13.97, P < 0.001) and shoulder elevation (right; tablet 10.29, 5.27-15.11; laptop 7.36, 3.72-11.01, P < 0.001).”
“These findings suggest that using a tablet or laptop may increase neck flexion, potentially increasing posture strain.”
Ever want to code on the couch? Not everyone wants to spend all day sitting in a gaming chair in the man cave, illuminated by the light of their LED desktop enclosure.
Literally everyone who chooses a laptop as their main driver is doing it purely for the portability... We don't choose it because it's small; we choose it because, while most of the time we're stationary, when we do need to be portable it's easier to just undock and walk away than to make sure your laptop and desktop are synced.
I was considering picking up a new Mini so I could drive two displays, but then I remembered another laptop advantage: TouchID. I would really miss that.
I wouldn't class it as a hack, since it is pretty standard in enterprise environments. The downside is that it does consume CPU, which depending on your workload may or may not be noticeable.
Good point. I don’t think it handles all of my use cases (e.g., passwords and credit card info in Safari) but I’d forgotten that there are several instances where my finger and my watch are in competition to see who unlocks something first.
Not the original commenter, but it's nice to be able to pick up my laptop from my desk and move to sit in the kitchen, or the couch, or outside, or (in better days) a café. I really only barely use my desktop nowadays, and half the time it's over mosh.
Don't most companies also provide workstations at desks for engineers? I know many who only use their laptop for meetings (until WFH) and it always seemed like a huge waste.
Because I enjoy working from my home office hooked up to my 49" UltraWide, but I also enjoy working from a coffee shop, or from my couch, or from my bed, or when I go spend time with friends and do amateur photography I like having my laptop with me to do on the fly editing/reviewing of images before finishing a shoot...
Working from home on a Friday is probably the biggest reason for a lot of people, I guess.
Meetings are another one.
Doing presentations.
Also, for me at least: on-call.
I would prefer a workstation at home and a lightweight laptop with company network access, but hey, my company prefers to buy me a laptop for 3k over a desktop for 1k and a light laptop for 1k.
Personally can’t wait to upgrade. The point about 16GB being on the low side was nice to get confirmed. There has been a lot of conspiracy on 16GB being plenty enough, but as a developer I know that to future proof my next machine I would really like 64GB. I already have 16GB of ram on my 8 year old MacBook Pro.
I suspect there are a lot of us in that boat, with 5-to-8-year-old MacBook Pros. Apple really missed the mark with the intervening generation by removing useful features and replacing them with features unattractive to developers.
Maybe this big leap in processor / thermal / battery life (and a revision back to proper keyboards) will be enough to get us to switch! I think 64GB (or at least 32GB) would be a nice sweetener.
I'm still running a late 2013 MacBook, now on Big Sur. The battery life is now about 3-4 hours even if I'm doing dev work, but otherwise it's still pretty good for a 7 year old laptop. I had a Sony Vaio back in the early 00s, and it started out running XP 'ok', and then... Vista, which made it dog slow. My point being that I'm surprised a 7 year old laptop is still "good".
A new laptop isn't critical to my work at the moment, so I'm happy to wait for an M2, if that's what it will be called. Looking at Apple timelines a new chip tends to be 1-2 years apart, and I'm fine with that.
I'm also reluctant to give up Magsafe, or at least have to buy a magnetic USB C connector.
With Apple moving forward rapidly on this architecture change, I expect them to upgrade the processors at a faster pace than normal. They are 6 months into a 2 year plan to migrate all of their systems. I would expect the next version of M chips in 6 to 12 months with higher RAM limits and faster GPU. That would be targeted at their other laptops and the iMac. It will likely take longer for them to get to something that can be used in the Mac Pro and maybe an iMac Pro where RAM and GPU needs are even greater.
It seems unlikely. There's a lot of extra complexity with multi-socket and there's no actual reason for them to do this. I'm not a CPU designer but they might do the equivalent of what you said with a chiplet design like AMD (Intel also has announced plans to eventually get to chiplets).
I love Magsafe as well. However, if the new laptop battery truly lasts all day, then it won't be nearly as important. I'll just charge the laptop overnight, and treat it like my phone.
Maybe they will integrate wireless charging into the laptops soon?
I would agree, with the exception that I’ve noticed really CPU heavy things like builds that take > 1 min still complete a lot faster when plugged in, even on a full battery. I assume they still need to throttle somewhat when only running on battery.
It's not throttling, per se; the M1 prioritizes its efficiency cores when off mains power, and the performance cores when on mains power.
When unplugged, it engages the performance cores as little as possible. You should be able to run `powermetrics` on an interval while running a high-intensity task to confirm; on mains the performance cores pin first, then the efficiency cores ramp up, and the opposite happens on battery.
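Something along these lines, if anyone wants to see it themselves (sampler name per the macOS `powermetrics` man page; kick off a build or other heavy task first):

    # one sample per second, 30 samples, while the workload runs
    sudo powermetrics --samplers cpu_power -i 1000 -n 30
    # compare the efficiency-cluster vs performance-cluster frequency and power
    # lines between a run on mains power and a run on battery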
It's a big difference because the efficiency cores max out at about 2W draw, and the performance cores max out at around 18W. There's a big incentive in battery life to rely on that behavior.
I'm not sure yet if it's tunable, macOS doesn't make that a transparent behavior where you can just tweak it at will.
The Air will throttle if under full load long enough, but it's still mostly just ramping down the performance cores to stay cool.
With a true 'all day' battery, you'll probably only plug it in at a desk where tripping or knocking things over isn't an issue. While travelling, you just charge it at night or when you aren't using it. Much more like smartphone usage.
How many people are aching for a magsafe phone charger?
If it’s any consolation the battery life on my 6 month old 16” MBP is also 3-4 hours when I’m doing work. The CPU is an absolute dog, and the fans kick in under the faintest of loads.
Same battery life here. :( Love the actual machine but the battery life is a disappointment.
Just got a 16GB 500GB M1 Air and so far everything is at least as snappy as the 16" Pro, and the battery looks like a full 8 hour day + so far... it's a keeper I think. I also personally find the keyboard a bit nicer to type on than the 16"'s, so accuracy is up.
Moving from a 2015 Air to a 2018 Pro felt like a downgrade since the battery life was much, much worse. I have to keep the power plug around if I want to work on the couch for any amount of time.
That's crazy; are you using it as your daily primary workhorse?
I can't imagine how you got used to the performance you have right now. Either you are not doing anything with your device, or you would hugely benefit from upgrading at least every 3 years or so.
Yeah, maybe. It's still hard for me to envision giving up a 2014 and 2015 MBP. They both work flawlessly and don't compromise on any of the useful features, like having USB A ports, SD card slot, MagSafe, no touchbar, reasonably sized touchpad and so forth.
My plan is to get a min spec M1 air as a personal laptop and then get the real 14"/16" M2 version with touch, more cores and more RAM later on. As one apple employee told me, you never really want to get a v1 with apple anything if you can.
With the 'touchification' of Big Sur, it seems pretty likely Apple is going to release some sort of convertible MacBook, or maybe something iPad-style. Or maybe iPads can start running macOS?
Work laptop will get upgraded to an M1 one way or another anyway.
I keep seeing this, but the performance gains and the real-world (Docker and all) battery life are significant on the 16 inch MacBook Pro, more so than even my 2017 15".
I upgraded to a high spec MacBook Pro 16 after the M1 model was announced. I need solid virtualization support for Windows and Linux now, plus Big Sur seems kind of a mess from the design and privacy side of things. I'll probably wait 3-5 generations until everything with Apple Silicon is worked out to upgrade again, maybe get an Air as a device for travel if things look stable sooner.
The reason it has been stuck at 16GB is precisely because Intel mobile chips did not support any more than that until a year ago. So rather than blaming Apple, the blame should be placed on Intel.
I don't think more than 16GB of RAM should be expected for entry-level Apple laptops.
The low amount of RAM, energy inefficiency, and plateauing performance are the exact reasons why Apple is moving away from Intel and onto a platform where they can control their own destiny.
The actual reason the M1 Macs cap out at 16GB is that they use LPDDR4, and that on a 128-bit bus caps out at 16GB. Assuming that they move to LPDDR5 as it becomes available (hasn't Apple generally been pretty quick on the uptake of new RAM standards?), that limit goes up to 64GB.
It's a conservative spec for sure given the rest of the M1. Samsung has shipped their own LPDDR5 in the Galaxy S20 since February. Maybe production capacity is still too limited.
I believe that Samsung co-designed the memory with their interface block, and then it was effectively codified as the standard. Most other companies had to wait for the standard to be formed before they could start work on their IP. This gave Samsung a huge head start.
Intel's fault? Uhm, they've had some misses recently, but your statement about RAM support is incorrect.
The 7200U supports 32GB of RAM. That chip is over 4 years old and nothing special. Not sure how long that's been supported; I didn't feel like looking further back.
The rated battery life may not have, but the actual, real-world battery life certainly has. Speaking as the user of a 2019 MacBook Pro with DDR4 that I use for work.
I find myself scrambling for a power supply way more often than I ever did with my 2017 15" MacBook Pro.
Even with the older battery in the 2017, with more cycles, I get a longer battery life out of it than the 2019.
And I have the complete opposite experience. Pinning this on the switch from LPDDR3 to DDR4 seems like a very weak conclusion. Especially when it is known that processors and displays are the largest consumers of energy and the 2019 MacBook Pro boosts way higher with more cores than the 2017.
Apple sells the 16" macbook with 64gb of DDR4. It's their decision whether to use LPDDR or DDR. They choose LPDDR for some and DDR for others.
Pretty incredible to blame another company for a decision that was made inside Apple. Especially for a chip that was designed from scratch in apple, and therefore has no Intel constraints.
The mini is their cheapest computer. The Air has been their entry level laptop for the last several generations. The 13” Pro has had 2 versions since 2016 - a low-end 2 port model, and a higher end 4 port model.
Every device with an M1 available today is Apple’s lowest-end (and least expensive) offering.
> The reason it has been stuck at 16GB is precisely because Intel mobile chips did not support any more than that until a year ago. So rather than blaming Apple, the blame should be placed on Intel.
The reason some older laptops didn't have more RAM was twofold:
1. Either the BIOS wasn't written with such support
2. There were no RAM sticks big enough (nothing larger than 16GB in a single SO-DIMM), so you could have at most 2x16GB - but for a year or two now there have been SO-DIMMs with 32GB of RAM, so you can easily reach 64GB in a laptop.
GP means LPDDR3 support, which capped out at 16GB on Intel because their 10nm lines with LPDDR4 support were delayed for years. It was always possible to use regular DDR, but for laptops the low-power variant is important.
Yes, they do on the 16", after years of complaints from users who didn't understand the constraint; they added the maximum allowable battery size (~100Wh), and it still only lasts 3-4h in moderate use.
I don't know. I did what I'm currently doing (Ruby on Rails dev) on a 4 GB MacBook Pro until last year and it was fine.
At the moment I have around 2 GB free on my 8 GB Linux machine with a webserver, MariaDB, a JetBrains IDE, Slack and Firefox running. It would be easy and dirt cheap to increase the RAM to 16 or 32 GB but I'm too lazy to even order it and open the machine up since I have never felt the need to have more.
I'm just saying this so that people like me remember that they're still professionals and their work is valuable even if they don't need 64 GB of RAM.
IDEs such as IntelliJ can easily consume 8GB of memory on a moderately sized project of 100k loc. If one has to open multiple such projects or work on a very large project in the million loc scope then you’ll run out of 16 GB really fast.
Folks working with data intensive applications often need to trade off writing code to page data in and out of memory during development with larger dev boxes.
100k LOC would be, let's be generous, about 8 million characters (80/line average, which is high). 8GB of RAM for 8MB of input seems excessive to me. What's going on with that IDE that it needs so much for something so small? Even if that were 8 million distinct tokens/symbols in the input, that's still 1KB/token of memory consumed.
Except that is only what an IDE does at the very minimum. An IDE loads plugins, provides intellisense, documentation, potentially loads documentation of all the libraries, watches files, displays Git status, parses your code for syntax highlighting etc. etc.
You're not opening a TXT file with 100K lines in it.
For every key stroke a new immutable string is created with the content of the file/state. Plus data structure for AST/static analysis of every file in the project including dependencies. Plus another copy of the file for rendering with syntax highlight. One would think all this copying of data is very inefficient, but computers are good at moving data around. The most costly is rendering the glyphs.
> IDEs such as IntelliJ can easily consume 8GB of memory
Sounds more like a JetBrains (the IntelliJ company) problem. I used to develop on PhpStorm (another JetBrains product, same software under the hood) and got the slow startups, 2GB+ memory consumption for relatively small projects, way too many crashes, and it's expensive as hell for what it does (especially with the license change years ago)...
Eventually I switched to Visual Studio Code, and while people whine about how Electron is memory inefficient, it uses about 1/6 the memory of PhpStorm, with only a few features missing.
Even on a 32GB PC system, I barely use 16GB (and WSL2 eats up a lot, with a lot of Docker images. And it's not really Docker, just the Linux cache eating up memory). Not an issue on a Mac that does not need a VM-like layer.
It's about priorities sometimes. If people keep upgrading their memory, developers/companies simply push the responsibility onto the clients and don't bother spending time on optimizing.
If people start leaving software products by the wayside for being horribly inefficient messes, then maybe a bit of focus will come back to actually optimizing products! You will see JetBrains change its tune when VS Code etc. keep eating its market share.
I have worked on a large 10 year old Spring application with well over 300k loc for 3 years. I don't recall IntelliJ ever exceeding 3GB of RAM. IntelliJ remained snappy on my 2015 MacBook Pro with 16GB of RAM and an i7.
Same experience here. My Ubuntu laptop accidentally doubled down on 16GB of RAM because I forgot to give it a swap partition (whoops). It hangs maybe once a month under a VS Code, PostgreSQL, FastAPI/Python, React workload. Basically, not frequently enough to push a lazy person like me into changing the partition table. The same workload on my 7 year old Ubuntu desktop, with a swap partition on a fast SSD, very rarely ever hangs.
Certainly some devs need more than 16GB, but they know who they are. And you know who they are because, well, they love telling you :).
You can always add a swap file... that makes your life easier so you don't have hangs, and you don't need to repartition.
I use swap files all the time with things like Raspberry Pis and such because I don't want to mess with the SD card partitions, but it's still good to have that swap fallback from time to time since the onboard memory is limited.
Swap files are super easy to set up, take up no "brainpower", and disk space is cheap these days.
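For reference, the whole dance on a typical Linux box (the 4G size is just an example):

    sudo fallocate -l 4G /swapfile       # or: sudo dd if=/dev/zero of=/swapfile bs=1M count=4096
    sudo chmod 600 /swapfile
    sudo mkswap /swapfile
    sudo swapon /swapfile
    echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab   # make it survive reboots
    swapon --show                        # confirm it's active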
I think it really depends on whether your development workflow requires VMs/Docker or not. It seems crazy to me to have a bunch of containers running using 4-8GB each, but for many that is how they develop.
Not GP, but RubyMine (the Ruby IDE from JetBrains) has decent support for refactoring things, type inference, executing single tests with a debugger, etc.
I mean, it depends what you do. I have 16GB on my personal laptop, and essentially never swap, but the 16GB on my work machine is painful for large Scala projects.
As always, for some people 16 GB will be fine, for some it won’t.
8 GB when using the Android emulator and Xamarin has been a pain for me, but my iMac doesn't have an SSD (it's got an Apple Fusion drive). The machine was swapping all the time, e.g. when alt-tabbing between the Android emulator and Xamarin.
After upgrading to 24 GB the machine became very usable.
I wouldn't ever want to go back to 8 GB. Perhaps 16 GB might be usable for iOS and Xamarin development, especially if the Android emulator can run in some kind of HAXM [0] mode, but for ARM processors ... not sure if that's possible right now. I'm sure the SSD also makes a big difference compared to my Fusion drive right now.
Still it feels safer (more future-proof) if the machine has a bit more memory than 16 GB.
Honestly Xamarin is pretty nice these days if ones’ aim is to share a lot of code between iOS and Android projects while still being able to provide a native experience.
If virtual machines are an essential part of the workflow then 16GB is out of the question IMO. You could get away with Tiny Core Linux/Lubuntu in a VM to some extent, but when getting serious work done on both host and guest, 16GB would be a bottleneck.
That article seems confused and/or misinformed on many points, on many levels.
As for the GP's cited use case, there's no plausible way the compilers involved in the V8 build process would start using less memory due to their memory management machinery getting magically transformed by the hardware from tracing GC to refcounting GC, even assuming LLVM had been using tracing GC in the first place.
(Also, the tracing-GC memory overhead claims originating in Apple's marketing copy are way hyperbolic; modern tracing GCs don't require the doubled storage used in "mark and sweep" GCs.)
Many memory-hungry apps on Apple platforms - web browsers, compilers, photo/video editing, games, dev tools like VSCode/Electron/Java IDEs/Emacs/databases - are in this category.
I think the point is that memory bandwidth and SSD bandwidth and latency have improved so much that M1-based Macs are really fast at swapping. Combine that with compression (which macOS has done for a long time) and I could imagine iPadOS-like performance.
For normal users (such as people not compiling V8), swapping may be virtually unnoticeable.
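You can actually watch both effects on macOS with the stock tools, something like:

    sysctl vm.swapusage                     # total / used / free swap
    vm_stat | grep -iE 'compress|swap'      # compressor occupancy, swap-ins/outs
    memory_pressure                         # the kernel's overall pressure reading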
Macs have had fast SSDs for a while and as you say compression has also been done for a long time. M1 made no difference whatsoever to memory usage. The posts similar to above seem to just be an attempt to fanfic-justify what's an actual regression for some.
For many people 16GB is absolutely enough, and Apple had to start somewhere. But if 16GB wasn't enough for you yesterday, it isn't enough for you now.
Do you have a link? MKBHD just said it was "slightly faster" than his comparison system, and the numbers for the M1 appear to be around 3GB/s reads and 2.7GB/s writes in disk speed tests, which is pretty typical of a last-generation NVMe drive (current PCIe Gen4 drives are hitting 7GB/s reads & 5GB/s writes).
Then sounds like Apple put some really garbage SSDs in the 2020 Intel-based Air, not sure what you're looking for here?
But 2-3x won't change swap from grinding to a halt to perfectly smooth, either. It's still over 20x slower than RAM.
More significant for swap usage though is random reads and read latency, neither of which are going to be particularly impressive on the M1's SSDs. You need something like Optane to make that a compelling case.
I do wonder if it’s taking advantage of the cheap CPU power and memory bandwidth to be much more aggressive about memory compression. That could reduce swapping for some workloads, though probably not a v8 build.
What do you mean exactly? It is normal for swapping to happen even if you are using less than the physical memory, so as to trade rarely-used RAM for frequently-used disk cache which improves overall throughput
> There has been a lot of conspiracy on 16GB being plenty enough
I bought a computer with 16GB in 2011 and around 2015 it was already lagging quite a bit due to swap - switching to 64GB in 2016 was day and night, I don't have to fear launching arbitrarily many jobs anymore.
> Xcode runs FAST on the M1. Compiling the PSPDFKit PDF SDK (debug, arm64) can almost compete with the fastest Intel-based MacBook Pro Apple offers to date, with 8:49 min vs 7:31 min. For comparison, my Hackintosh builds the same in less than 5 minutes.
This tells you just how unrealistic the Geekbench benchmarks used by most reviewers really are - they were all forecasting the M1 completely humiliating everything from Intel, and yet in practice it's not quite right. It's still a dang impressive chip, though.
Sure, but his Hackintosh isn't running a laptop chip. 15W X86 CPUs would probably compile that faster than the M1 if the difference from Intel is that small.
This is one data point among many. Tons of devs have been tweeting their compile times, and in many/most cases the M1 spanks the best Intel Mac out there.
If anything, the numbers in TFA are an outlier rather than a contradiction.
I hate the "M1 vs Intel MacBook" comparison. Every Intel MacBook back to 2016 has broken thermals. They're all running at maybe half their rated clock speed. 13" MBP is a 4GHz part running at 1.4GHz. 16" MBP is a 4.8GHz part throttled to 2.3GHz. You're comparing M1 vs. a broken design which Apple broke.
Don't congratulate Apple for failing to ship trash.
There's an argument for efficiency on a laptop, no doubt, but that's not what the parent commenter is talking about.
M1 is the highest perf-per-watt CPU today, no question. Ignoring efficiency, there are plenty of faster CPUs both for single-core and multi-core tasks. That's what "my Hackintosh did the build in 5 minutes" is showing.
You're misunderstanding Intel's specs. If you want the chip to run within TDP you can only expect the base frequency across all cores, not the ridiculous turbo frequency. The best laptop chip Intel has right now is the i9-10980HK with 8 cores at a 2.4GHz base frequency and a 45W TDP. Apple's laptops are more than capable of dissipating the rated TDP and hitting the base frequencies (and often quite a bit higher), although the fans can be a bit loud. So Apple's designs are not broken, at least not by Intel's definition.
You can relax the power limits and try to clock it closer to the 5.3GHz turbo frequency. But how much power do you need? I can't find numbers specifically for the i9-10980HK, but it seems like the desktop i9-9900K needs over 160 watts [1] to hit a mere 4.7GHz across all cores, measured at the CPU package (ie. not including VRM losses). Overall system power would be in excess of 200 watts, perhaps 300 watts with a GPU. Good luck cooling that in a laptop unless it's 2 inches thick or has fans that sound like a jet engine.
You've got it backwards. Apple chooses the TDP. Intel provides the CPU to suit. Apple is choosing TDPs which are too small and then providing thermal solutions which only just meet that spec. They could provide better thermals without hurting anything else in the machine and get a higher base clock.
I assume they do this for market segmentation; see 2016 Touch Bar vs. non-Touch-Bar Pro. One fan vs. two.
The TDPs look appropriate for M1 parts. They're too small for Intel. I'm guessing that (a) Apple predicted the M1 transition sooner and (b) Apple designed ahead for Intel's roadmap (perf at reduced TDP) which never eventuated.
So, unfortunately, Apple have shipped a generation of laptops with inadequate cooling.
> Every Intel MacBook back to 2016 has broken thermals. They're all running at maybe half their rated clock speed.
Are you saying it's an unfair comparison? The Intel Macs are operating in the same environment as the M1 Macs. It doesn't matter if the Intel parts could be faster in theory, because you're still dealing with battery and size constraints. If you want unthrottled Intel CPU in a laptop, your only options are 6 pound, 2 inch thick gaming laptops with 30 minutes of battery life. Now comparing that (or worse, a desktop) to M1 is unfair.
> 13" MBP is a 4GHz part running at 1.4GHz. 16" MBP is a 4.8GHz part throttled to 2.3GHz.
Apple's thermal solutions could be better, but they are designed within Intel's power envelope specs. E.g. the i9-9880H in the current 16" MBP is only rated for 2.3GHz with all cores active at its 45W TDP. The i9-9880H is a 2.3GHz @ 45W part that can burst up to 4.8GHz for short periods, not the other way around.
That's one of my biggest sources of skepticism with the M1 on the long-term — in lieu of improving thermal management, they instead reinvented _everything_ to generate less heat. Which is great! The current state of thermal management at Apple will work great at low TDPs, but they've procrastinated instead of improving. If they don't ever learn how to handle heat, this arch will still have a hard ceiling.
There's nothing in M1 that indicates that Apple learned how to improve thermal management, but lots to indicate that they'd still rather make thinner/lighter devices that compromise on repair, expansion, or sustained high-end performance — the even deeper RAM integration, offering binned parts as the lower-end "budget" option instead of a designed solution, or offering Thunderbolt 3, fewer PCIe lanes, and a lower RAM cap as being enough for a MBP or Mini.
Which is why it's such a horribly meaningless comparison - the marketing (and the reviewers assisting it) blasted out comparisons with Intel's obsolete technology, hamstrung by awful thermals, and then proclaimed the M1 generally faster.
It would be interesting, though, to see what those numbers would look like with a fan. Anything that runs that long on an Air is going to start throttling.
> IntelliJ is working on porting the JetBrains Runtime to Apple Silicon. The apps currently work through Rosetta 2, however building via Gradle is extremely slow. Gradle creates code at runtime, which seems a particular bad combination with the Rosetta 2 ahead-of-time translation logic.
Java does not need to be run through emulation. Azul has already published ARM64 Zulu builds of OpenJDK (here: https://www.azul.com/downloads/zulu-community/?package=jdk) and they work great on the M1. I'm currently using IntelliJ running in emulation mode (an official release for the M1 was originally expected by the end of November) but build/run my projects on a local Zulu JVM with ARM64 support.
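For anyone doing the same: after installing the Zulu ARM64 build, it's worth confirming which JVM Gradle/IntelliJ will actually pick up. Roughly (the `-v 11` is just an example major version; use whatever you installed):

    /usr/libexec/java_home -V                            # list the installed JDKs
    file "$(/usr/libexec/java_home)/bin/java"            # should report arm64, not x86_64
    export JAVA_HOME="$(/usr/libexec/java_home -v 11)"   # pin builds to the Zulu JDK
    java -version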
Not sure why people are so worried about developers getting trapped by locked-down systems. I have plenty of other PCs running Linux, *BSD, and Windows that are much less locked down. I'm very aware Apple is controlling of their platforms. No, I'm not thrilled about certain aspects of the M1 platform, but most platforms I've used feel locked down in some shape or form. If Apple does try some of the outlandish theories out there, I'll simply cease to use it and probably trash Apple then, as it will be useless to me, but I'm not locked into anything.
All x86_64 Windows processor manufacturers are supporting the Microsoft firmware that offers secure attestation about whether your system was booted securely or not, so if by lockdown you mean “I can make undetectable changes to my system”, that’s coming to an end as rapidly as they can bring it to market (before Apple locks them out of it). You’ll still have the option to alter whatever you please, but software and websites will have the option to refuse service to your modified system.
Citing a twenty-plus-year-old manifesto which offers no guidance on how to reinterpret the idealistic view of “It is my right to modify” versus the modern day problem of “I must protect myself and others from malicious modifications”.
Stallman hasn’t been of much use for this, and his followers fervently cite his decades-old works each time I raise this point — but meanwhile, this is available today to any website accessed by any Apple Silicon MacBook, and it’s already deployed and in use in the (Linux) servers powering Azure Cloud, and it’ll probably reach the consumer Windows market in the next year.
Now what? After we get past “RMS told us this would happen”, after we blow off steam about “My ideals are being violated”, is there anything left to discuss and consider here?
I thought there was — for example: “Is it possible to reconcile the conflicting needs of safety and modifiability?” — but the prevalence of replies like the above over the past few weeks makes me think that I’m mistaken, and should simply let this go unnoticed until it’s too late and irrelevant what anyone believes.
> the modern day problem of “I must protect myself and others from malicious modifications”. Stallman hasn’t been of much use for this
Clearly we differ, because it looks to me like Stallman put forward the only plausible solution to someone taking control of my system without my knowledge or permission: open source. Granted, that wasn't what he was trying to do at the time, he was as you say perusing the "It is my right to modify" line. But that's how it's turned out.
You are apparently perfectly happy to have Apple / Google / Microsoft or whoever install whatever backdoors and spyware they please on your system. It's not like you have a choice, or will even know, so it's probably best you've made your peace with it; just as people have made their peace with Facebook, Google Chrome not deleting their own cookies, or Microsoft refusing to copy files because some virus checker had a copyright violation signature fire (that's actually happened to me).
Maybe I'd even be OK with trusting those companies, but I definitely draw the line at governments granting themselves the legal right to rummage through that same cookie jar, which is exactly what the Australian government did with its Assistance and Access bill. [0] I'm sure all governments do the same thing of course, including the Chinese government. A good indicator of how seriously governments themselves take this threat is how Huawei is being treated by Western governments. I have no doubt the Russians and Chinese view Western gear with the same level of suspicion. To me that is the only sane position to take.
Just to be clear, I'm not saying TPMs and the DRM they enable aren't useful, but the problem is the lack of visibility into what these black boxes you are carrying around with you and putting in your living room are doing. If you know what those boxes are doing, locking them down so hard they can't be compromised by someone who has physical access is a nice addition, although that threat scenario (someone with physical access) is very limited, so perhaps not a major addition. But what you seem to be applauding is locking them down so hard that even you, who has physical access, can't see what they are doing, and then you go on to pillory a person who proselytized making all software transparent, so everyone could see whether their systems are running software they approve of, and not malware or worse.
[0] Quick summary: the Assistance and Access bill gives the Aussie government the right to force any company to write spyware that won't be detected by their OSes (that's the "Assistance" bit), and then install it via their auto-update systems onto any device they nominate (that's the "Access" bit).
My personal opinion on this technology isn’t included here, and I’ve made a point of withholding it each time, specifically to deny the opportunity to invoke the messenger’s feelings as relevant to the issue at hand. What I feel about this doesn’t matter, because this is already live in two marketplaces and headed rapidly to a third. I could be pro, I could be con, I could be both/mixed or uncertain/apathetic (hint: it’s not a purist view on either side of the fence). Discussing my viewpoint isn’t even possible, yet half of the words in your reply are dedicated to your speculation about it.
Focus your energy on the real issues at hand:
How are we going to adapt to the reality of secure attestation? How are we going to confront it with technology? How should we legislate to protect against abuse of it? How can we make use of it appropriately?
My goal is to raise awareness, and based on the other half of your reply, I’ve succeeded with one person. That’s progress, I suppose.
At least in the US, my understanding is a 1975 law, the Magnuson–Moss Warranty Act, protects warranties on modified products as long as the modification didn't cause the issue. It comes up all the time with cars since there's an active mod community.
That being said, I usually back up and "factory restore" my computers before servicing. Mostly because I don't want to hand out my password or hand over my personal data, but I expect them to test that it's fixed, and that's easier in a generic OS. I also think it'd be odd to hand over a MacBook Pro (or even a Microsoft Surface) with Linux installed and expect their random, low-level tech to assess things.
I’m aware that Windows is locked down and even with Linux or *BSD parts of the platform you’re running on are mildly concerning at least. However, as you said I can still alter what I wish, whereas a lot of theories I’ve heard about Apple here recently involve turning the Mac platform into some sort of crippled sandbox in the spirit of mobile devices, or worse like some consoles that are so locked down the hackers haven’t beat it yet.
I do believe Microsoft would love to head down a path like that, but they are too burdened with the massive Frankenstein of legacy tech that is Windows. I also tend to share as bleak a view of the future of computing as many of the people voicing concerns; I just don't believe that a niche set of consumers can overpower the majority, who couldn't care less if you ranted to them about control or privacy or whatever.
> If Apple does try some of the outlandish theories out there, I'll simply cease to use it and probably trash Apple then, as it will be useless to me, but I'm not locked into anything.
And what if it is too late? E.g. Apple owning the entire space you want to develop apps for? Or, if this seems an impossible theory to you, how about competitors copying Apple's model?
Please understand that as a consumer you have power, but if you don't use it you can end up like the proverbial boiled frog.
I can understand this level of pearl-clutching when we’re talking about social media, but I just don’t see Apple as the five-alarm fire it’s made out to be.
Practically speaking, being “locked” into a system would only be a possibility in a world where highly distributed, massively supported OSes (Windows, Linux) didn't exist.
Anti-apple people have a hard time fathoming the degree to which many other “power users” are okay with privacy/security trade offs if it results in a more robust/stable environment that just works.
Problem is, it’s being framed as some sort of Faustian bargain when it’s really not that big of a deal.
Until it doesn't. I can open my Thinkpad (W530) and replace parts in it that die. Can you say the same for your Macbook? Are you okay with a future where everyone follows Apple's example because it generates more profits, where right-to-repair is a thing of the past? I know that sounds alarmist, but it's definitely the trajectory I see.
The EU is introducing laws that will combat planned obsolescence and protect the right to repair. The EU is also fed up with manufacturers thinking they can get away with everything. Those laws will also have an effect across the world, because it's easier to design one product that complies with the strictest laws than two products.
So yeah, Apple's gravy train is going to run into a massive roadblock. In the US, states are also starting to pass laws that push the right to repair.
It's one of the reasons why Apple "suddenly" allows repair shops to buy parts, as a way to change the narrative. Of course, buying those parts has "issues", Apple style. A grifter is going to grift.
The X1 Carbon doesn't have upgradeable RAM. On some level, I'm convinced that Lenovo wants to follow in Apple's footsteps and is testing the waters with the X1 Carbon series.
When parts in my MacBook Air die, I guess I'll find out. It's 10 years old now and doesn't show any sign of breakage. I'm pretty sure it's paid itself off several times over.
Since the new Air lacks even a fan, it seems even less likely to need service.
Homebrew is still a work in progress for Big Sur, let alone M1, and brew upgrade on an x86 Mac will give you this message:
Warning: You are using macOS 11.0.
We do not provide support for this released but not yet supported version.
You will encounter build failures with some formulae. Please create pull requests instead of asking for help on Homebrew's GitHub, Twitter or any other official channels. You are responsible for resolving any issues you experience while you are running this released but not yet supported version.
Well, I did have one problem when I upgraded Homebrew on Big Sur. For some reason `brew link unbound` failed with a `/usr/local/sbin is not writable` error. I looked on Ask Different and found (for an earlier version of OS X) this [1]. Basically, mkdir the directory.
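For anyone hitting the same thing, the fix referenced above is roughly this (a sketch, assuming the error is simply a missing or root-owned directory):

    # Create the directory Homebrew expects, hand it to your user,
    # then retry the failed link step.
    sudo mkdir -p /usr/local/sbin
    sudo chown "$(whoami)":admin /usr/local/sbin
    brew link unbound

`brew doctor` will usually flag unwritable directories under the Homebrew prefix as well.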
That was the same with Catalina when it first came out (same warning), so I don't think this will be an issue. Also, on Catalina it worked fine regardless of the warning.
So just bypass Homebrew and install your own stuff. It's not that hard. Homebrew isn't at all required, it's just a convenient tool that slightly simplifies some things.
Those things are very much worth simplifying: as more and more packages are starting to rely on pkgconfig, getting the headers recognized that ship with whatever SDK you are currently using (Xcode, command line tools) is very tricky and sometimes requires patching the build tools.
Homebrew is the perfect glue between the traditional Unix world and the somewhat unexpected Unix environment that macOS is providing these days (mostly a consequence of Xcode wanting to be a self-contained .app and /usr being read-only while still honoring the tradition of leaving /usr/local alone for you as a user)
This actually simply isn't true. Firstly, pkgconfig works perfectly well on MacOS and always has. It sounds like you more have issues with Xcode than you do with anything else. I've not heard of a commonly used package requiring any sort of patches to get it to compile on MacOS. MacOS is a first class citizen of the open source world and things generally work perfectly for it.
Yes you need to spend a bit of work to find out what dependencies you need, but the configure scripts will tell you that when they fail.
> Homebrew is the perfect glue between the traditional Unix world and the somewhat unexpected Unix environment that macOS is providing these days (mostly a consequence of Xcode wanting to be a self-contained .app and /usr being read-only while still honoring the tradition of leaving /usr/local alone for you as a user)
On Mac packages have always naturally installed into /usr/local and not /usr.
The problem is that the macOS SDK doesn’t ship pkgconfig files for the libraries included with macOS.
Unless you go the extra length, the “easy fix” is to compile your own version of those libraries which is fine until you need to be linked against a binary shipped with the OS (say you are trying to compile an Apache module).
At that point you will have conflicting symbol names between the binary that shipped with the OS and depends on its libraries and your binary that depends on the self-compiled libraries which might or might not have matching versions and/or custom patches.
Libedit, libxml2 and many others are examples of this. Their binaries are in fact installed in the read-only /usr as shipped by Apple, but the matching include files live in the Developer folder of either the command line developer tools or Xcode.
No open source package will be able to link to the system libraries by default with this split and when all they support is pkgconfig, you’re SOL without additional manual work
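To make that concrete with libxml2, here is a sketch of what that manual work looks like (`use_libxml.c` is a hypothetical source file, and the header path depends on which SDK `xcrun` selects):

    # No .pc file ships for Apple's libxml2, so this typically fails:
    pkg-config --cflags libxml-2.0 || echo "no pkg-config data for libxml-2.0"

    # The headers live inside the SDK, not under /usr/include:
    SDKROOT="$(xcrun --show-sdk-path)"
    ls "$SDKROOT/usr/include/libxml2/libxml/parser.h"

    # Workaround without building a second copy: point the compiler at the
    # SDK headers and link against the dylib that shipped with the OS.
    clang -I"$SDKROOT/usr/include/libxml2" -c use_libxml.c
    clang use_libxml.o -lxml2 -o use_libxml

This keeps you linked against the system library, which avoids the conflicting-symbol problem described above, but it means hand-feeding include paths that pkg-config would normally provide.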
I am an iOS developer and I have an M1. There are some issues that need to be ironed out, with some packages that I use not compiling for some reason on the arm iOS simulator, but that has diverted me to compiling on device, which isn't much of a hassle.
I am not bothered as much by the 16GB of RAM; it is still rather usable. What I really bought the machine for was the battery life. I've had it for a day or two and it is amazing how long I can stay away from my power socket. Also, this thing runs really cool. I have not even heard the fans spin up once, even during the recent Sydney heatwave (45°C).
I've been disappointed in Apple's hardware offerings for developers over the past few years and almost swore them off, but have a renewed interest in this machine. However, it's not because of anything Apple's done in particular, but because I'm in the process of fully transitioning my workflow to the cloud, and using a laptop only as a thin client.
Of course, there will always be a need for running dev toolchains locally, but I wonder how many other people are like me who would rather use Linux over an internet connection, and really only need a terminal emulator and an IDE.
This seems like a great use case for a Chromebook. Have you looked into that? It seems like the Apple premium may not be worth it. You get long battery life and a very slimmed-down machine that can run a browser and a terminal, and they are just as slim as an Air + much cheaper.
Chromebooks would be perfect machines, except that most Chromebooks don't have premium features: dim screens, capped at 1080p, cheap Intel CPUs, plastic chassis, ...
I am still waiting for a premium chromebook with ARM CPU and great battery life to hit the market.
There are probably plenty of people like us who do this. VS Code's remote development feature is quite popular, which indicates many people are going with this setup.
I’ve thought about it! The keyboard is too important for me, and still being able to check out source code locally and run a local IDE tips it in favor of laptops.
Assuming you prefer Apple's keyboards, I'm pretty sure the Magic Keyboard with Bluetooth works with iPads. The downside is you'd have to manage charging the keyboard and the iPad, but on the bright side you could choose to get the keyboard with the numeric keypad.
I have heard really good things about the keyboards for the iPads. I’m still using an iPad Air 2 with a Logitech Bluetooth keyboard which works wonders.
There are code editing apps for iOS, but I don’t know what they're capable of these days. Some use cloud servers, but as soon as you know you want the option then the laptop is probably the way to go. Happy hunting!
I travel and move around a lot (not so much now, but I still move around my country and to the office), so portability is very important to me. Another advantage of the MacBook line is that I can carry one Anker USB-C & A charger and charge everything I have.
I was looking into buying the new MacBook Air. Does anybody have a particular alternative to recommend? I am looking for a new laptop and haven't found anything similar to the MacBook Air (I hate the Touch Bar) for a similar price (the 16GB of RAM model; if it's expandable, even better). Anything similar is either a lot more expensive or the build quality is a lot worse.
Seems like a cool device. It’s too bad that apparently you need weird workarounds for more than one external monitor for the M1 MacBooks, which is my own personal dealbreaker.
I still can't decide. I want to get into iOS development after 10 years. Which computer should I go for?
What is the opinion on these M1 machines' longevity? Some tech people are comparing them to the first-gen iPad, like they will be sunset after a couple of years, vs the 2nd gen which was supported much longer. But I am not so sure; most of the issues with these first-gen M1 machines are software, so in theory they should be supported for a long time.
Apple still plans on releasing new Intel macs. I don't even think buying one of those is out of the question for some people.
If you look back at the Intel transition, if I remember correctly they released the MacBook Pro with a Core Duo in January of 2006 and Core 2 Duo in October. The latter being 64-bit. Sure, the latter ones were better, but the former ones worked fine. OS support seems to go with architecture generations. The early 2006 laptops supported Snow Leopard which had its last release in 2011. The next laptops supported Lion which had its last release in 2012 (but with hacks could get updated until 2018).
I think the major question is if you want to wait to see what the "pro" machines look like or if Apple has tweaks in 6 months to the current line up. The Mac Pro has some big questions about what may need to change if they're looking for parity with the current Mac Pro (improved multi-monitor support, more RAM options, upgradable GPUs). Those changes may show up in the high end Macbook Pros (likely not upgradable GPUs).
You will be inside of Xcode for most of your time - I just upgraded to a 13" Air (maxed out configuration) from my 2012 11" Macbook Air which was still fine with Xcode. I think any Mac will run Xcode well enough.
I have the new MBP and I can tell you it's a remarkable machine. Of course you need to check which SW you need to run and if it works (I assume it will). I guess everything else, native versions, will come with time.
The performance of these is very impressive - selfishly, I'm glad to read that the 16" is still faster for Xcode tasks (as I have the same spec as Pete and some of the commentary about the speed of the new models was making me a bit envious!), but I have no doubt the Pro models of these will be remarkably performant.
Personally I will try to wait it out until they release a new design... although may end up with a Mac Mini build machine
Sorry if this has already come up but apart from the excitement about performance what is the build quality like for the new MacBooks? The main issues I’ve been facing from Apple lately have been physical failures from keyboards to screens and ports. The 16” is better but already I’ve found some keyboard keys get unresponsive occasionally and reported battery health is deteriorating fast after only a handful of cycles...
Most of the complaints I have seen online are "will not power on", DOA-type issues. Not many yet (some review units out of the box, etc.).
Apple deems itself a consumer company and perfection is too expensive. Apple quality is generally going up every year, but slowly.
Apple makes incremental manufacturing improvements nearly weekly, just like software builds. I always say never buy first gen, or at least never on the first day. Never assume your machine is just going to work.
MacOS stores a full frame buffer for every window (visible or not), for compositing. If I recall correctly, the frame buffers of every active window are attributed to the Window Server. That can get big when there are lots of windows, especially on a high DPI display.
A full screen at 4K is 8 megapixels or 32MB or so. I guess if they double or triple buffer each app might take 100MB or so, but you'd still have to open 10 apps to reach 1 GB.
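A quick back-of-the-envelope check of that arithmetic, assuming 4 bytes per pixel:

    echo $(( 3840 * 2160 * 4 / 1024 / 1024 ))       # one 4K BGRA buffer: ~31 MiB
    echo $(( 3840 * 2160 * 4 * 3 / 1024 / 1024 ))   # triple-buffered window: ~94 MiB
    echo $(( 5120 * 2880 * 4 / 1024 / 1024 ))       # one buffer at a 5K "scaled" render: ~56 MiB

Ten full-screen-sized windows at 4K, triple-buffered, already lands around 1 GB, which lines up with the estimate above.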
10 app windows doesn't sound like a lot to me.
Besides that, if you enable resolution scaling macOS renders at a higher internal resolution and lets the compositor scale it down. Try that on a 5k screen...
I mean, what of it? 10 windows doesn’t seem like many, at all. I tend to end up with at least ten terminal windows by the end of the day, never mind everything else.
Really? Huh. I’ve never seen it use anywhere near that much on my Hackintosh, but I don’t have a super high-res screen or anything like that. I wonder what the difference is caused by.
On my 2015 15” MacBook Pro, I have about seven apps open, about half with multiple windows, and an external monitor plugged in. WindowServer is using 200MB.
Unless you use anything that does JIT compilation (e.g. the JVM): JetBrains IDEs, for example, are really slow under Rosetta.
Another pain point is code that relies heavily on SIMD. AVX/AVX2/AVX-512 instructions are not translated, so in those cases you'll be using slower kernels. My machine learning models are ~10 times slower with libtorch under Rosetta, though they are competitive when compiling libtorch natively.
(Source: I have used an M1 Air for a week, but decided to return it.)
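If you want to check whether something is actually running translated, there is a sysctl for it. A small sketch (the `arch` invocation just forces the x86-64 slice of a universal binary under Rosetta, e.g. to reproduce the slow path):

    # 1 = running under Rosetta 2, 0 = native arm64 (the key doesn't exist on Intel Macs)
    sysctl -n sysctl.proc_translated

    # Architecture the current shell is running as
    uname -m

    # Force the x86-64 slice of a universal binary to run under Rosetta
    arch -x86_64 uname -m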
I’ve been on a new M1 since the day after release. Love it. FAR faster than my previous Mac. But there is some work left to do. It crashes every day, usually more than once. Frustrating but still very happy to have it. Just bought a second one today for my wife.
Are there CPU manuals from Apple that explain how to optimize assembly code for their CPU and document any deviations from the ARM reference architecture?
There are always microarchitectural differences you can take advantage of, but Apple probably wants you to stick to the ISA as these things change in ways they don't want to document.
Apple of course uses these things in the lower levels of their software, so if you're using their APIs you're getting this for free across OSes and CPUs.
Micro-architectural differences. For instance, from the 386 to Pentium 3 (and on Athlon, I believe), shifting was cheap, due to the presence of a barrel shifter. This was dropped in the P4, and suddenly shifting was quite expensive (I think, though I’m not sure, that it may actually have come back in later P4s). That’s the sort of thing compiler writers need to know about.
The P4 was quite an anomaly[1] in the history of x86, optimised for clock speed at the expense of many other things. All the subsequent generations went back to the norm, with the exception of the Atom series, and even that one is not as unusual as the P4.
[1] Even the family/model/stepping designation has an oddity: the 486 was family 4, the Pentium was family 5, and then everything from the Pentium II/III up to the latest Core series, as well as the Atoms, uses family 6; but the P4 was family 15.
The costs of different operations differ between CPUs that implement the same instruction set. Compilers have cost tables for each processor family for when you target them (e.g. with -mtune).
It really depends on the actual design of the processor.
Some things like instruction density and inlining heuristics translate from processor to processor, but the actual microarchitecture (rather than the ISA) determines what is "good" code from platform to platform, e.g. good instruction scheduling depends on how the CPU is designed.
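A small illustration of that split, with illustrative flags and a hypothetical `hot_loop.c`: the first two builds target the same instruction set but use different scheduling/cost models, while the last asks a recent clang for M1-specific tuning.

    # Same x86-64 instruction set, tuned for different microarchitectures
    # (cross-compiling from an arm64 Mac would additionally need -target x86_64-apple-macos):
    clang -O2 -march=x86-64 -mtune=skylake -c hot_loop.c
    clang -O2 -march=x86-64 -mtune=znver2  -c hot_loop.c

    # On Apple Silicon, newer clang releases accept a CPU name for tuning:
    clang -O2 -mcpu=apple-m1 -c hot_loop.c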
Indeed, although it's worth saying that GNU was offered LLVM and said no. They have suffered from their own success, because new developers and the culture within programming are now post-GNU, and people can't remember the environment that led to the GPL becoming popular.
What an odd article. It spends several paragraphs describing nightmarish compatibility issues and lack of support or capability for various mainstream tools, but seems to have a completely unjustified and wildly optimistic point of view that all the problems are just short term transitional annoyances that will be fixed in Q1 2021 or shortly thereafter. It concludes M1 is worth the hype.
What a bizarre perspective with nothing to support it except apparently blind optimism.
Well, there’s also Apple’s success rate at pulling off transitions like this, and the fact that their developer partners know they have to either make the transition or abandon the platform.
Unlike Microsoft dipping its toes into the ARM waters, everyone knows Apple is committed to this.
Apple pulled off a similar transition once in recent history (the other examples are so antiquated that they teach us little about the present hardware industry), and that was a move toward the rest of the personal computing market --- not away.
This is a step toward mobile/iOS. Apple isn't the same underdog that it was when PowerPC was put to bed, and the transition is challenging because the Mac platform is bifurcated in the interim.
PowerPC to Intel isn't a good proxy for this because the technology is different, the transition strategy is different, and the company itself is different.
If the M1 transition is to be smooth, it has to achieve that on its own merits, without looking back in search of comparable transitions past.
The complaints raised in this article are mostly irrelevant to the typical consumer.
Rosetta 2, from all reports, works well enough that I’d argue they’ve already succeeded in making the transition smooth enough to call it a success.
The main non-developer concern raised in this piece is extraneous dialog messages, which sadly we’ve pretty well all been trained to ignore. I agree they’re unfortunate, but they’re hardly a show-stopper.
The real risks Apple faces, as far as I can see:
* Will developers do the needful for universal apps? That seems to be well underway, even among the giants.
* Will a lack of x86-64 Windows emulation prove to be a deal-breaker for too many users? I’d say that’s impossible to know, but I’m optimistic. (And if Apple once again is setting a trend for the industry, Windows itself may fully make this transition someday.)
* Will Apple be able to keep up with AMD and Intel in the chip race? Obviously early signs are promising.
Big risks, but mostly ones Apple can control, or at least influence. What other dangers am I missing?
Adding: the dependence on TSMC is obviously a risk, but I imagine Apple has some notion of what they can do if they lose that option.
Dev here, the burden of Apple far exceeds any claimed time savings. I think every dev knows exactly what I'm talking about. And if you aren't a dev, Apple treats us like replaceable Dogs.
Apple has killed any good will among nerdy developers by treating them like shit for 20 years. Sure there are still corporate developers that are forced to make iOS apps, but it's not like a hobbyist is going to buy an Apple computer for embedded, web dev, gaming, PC applications, etc...
Until Apple treats us (and maybe their non-dev customers) better, I don't see any reason outside iOS apps to ever buy a MacBook for dev.
> it's not like a hobbyist is going to buy an Apple computer for embedded, web dev, gaming, PC applications, etc...
Yeah, this is simply just false.
Apple shipping a consumer-grade POSIX OS is exactly why hobbyists choose it. It has first-tier support for every consumer app you need, with no configuration headaches, and has all the tools you want for modern development.
This is why most web devs today are still carrying Macbooks. Even if Apple has not been prioritizing building machines with beefy specs.
I’ve been doing web and server dev, and personal computing, on a Mac for those same last 20 years and I honestly can’t relate one bit to any of what you’re saying either as a dev or just an end user. In fact, the companies I feel most mistreated by are those who have pushed more and more cross platform (Electron & out of place UI/UX) onto the platform.
(A notable exception to that feeling is VSCode. While I would quite prefer a native app with its features, I’ll gladly pay the Electron tax because it’s the best editor I’ve ever used.)
Almost all the full stack web developers I know are using MacBooks for both front and back end codebases across half a dozen languages. It’s an advanced, modern operating system with nearly complete Linux compatibility.
It’s fine if it’s not for you, but to stereotype an entire part of the industry like this isn’t fair.
As they say, to know where the ball is headed, watch the full stack web developers. I know it was a half dozen because I had to use both hands to count.
I really don't know what you are talking about. All my development work could probably be done on a Linux computer, or Windows, but I choose to use a Mac just because it makes embedded work and web development so much easier.
I’ve gone through dozens of tutorials for installing tool XYZ for Mac to develop with embedded, and not once have I had to install drivers or mess with udev rules. Things just work when you plug them in.
Maybe most developers are professionals who just want to solve problems for their customers, on the platform that their customers are using, instead of shipping second-rate experiences based only on unsubstantiated assumptions about what Apple will do in the future?
This is certainly a hot take. Every event for “developers” I’ve been to recently Macs outnumber anything else by a significant margin.
I can say for me personally that yeah I have my issues with macOS, but the amount of fiddling I have to do with Linux and the different patterns I have to learn with Windows means macOS will likely continue to be my choice for years to come.
I’m a dev; fled to MacOS in 2005 or so because Linux on a laptop was just such a nightmare, and never really looked back. I gather things are better now, but still not _great_, and life is too short to have to sacrifice a goat every time a kernel update breaks your Bluetooth audio or whatever.
Apple’s far from perfect, but there is something to be said for ‘just works’.
Heh, been running Linux for a decade plus and don't remember any such problems recently.
Amusingly you mentioned bluetooth, apparently the new M1s are having serious bluetooth issues, so much that those with the apple bluetooth mouse and keyboard are resorting to wired devices.
> Dev here, the burden of Apple far exceeds any claimed time savings.
I have the complete opposite experience. I don't feel any burden from Apple when using a Mac. macOS being a Unix-like operating system makes developing on it a breeze. On Windows I need to use archaic developer tools like PowerShell. Linux commands I use on my production servers don't work in my development environment. If I want to use git in a sane manner from the command line I need to install an emulator that has its own issues.
There is a reason almost all companies in Silicon Valley use MacBooks for their developers.
> but it's not like a hobbyist is going to buy an Apple computer for embedded, web dev, gaming, PC applications, etc...
As someone else has said, this is exactly what people seem to be doing.
> On windows i need to use archaic developer tools like powershell. Linux commands I use on my production servers don't work on my development environment.
I mean, you can dislike Powershell and love Unix tools, but your choice of words is remarkably funny.
If anything, Unix tools are archaic: generally organically grown rather than designed, or designed around principles from 50 years ago, in many cases principles that have been superseded.
Powershell actually has a design, and a modern one at that.
Again, you might like one and dislike the other, it depends a lot on personal preference and familiarity.
Apple's terminal environment is POSIX-like. PowerShell was always a bit of a lame duck, or Microsoft wouldn't have bothered with WSL to get real power users to stick with their platform.
Powershell is the only mainstream shell that is close enough to the Xerox PARC workstations REPL experience.
As for WSL, you got it all wrong: after the failure of Project Astoria (running Android apps on UWP), Microsoft found a business opportunity in selling Windows to folks who buy Apple devices to do GNU/Linux work instead of supporting Linux OEMs, and who are unhappy with Apple no longer caring for them now that it no longer needs their money.
So they picked up the infrastructure, redid it as WSL, and started selling the feature to that crowd: now they can get the hardware that Apple doesn't sell them, while they keep paying proprietary garbage vendors instead of supporting the vendors in the Linux community.
I really wanted to like WSL 2 but it has a lot of bugs that may take years to get ironed out (unusably slow disk IO, memory leaks, virtual disks that grow forever, etc.)
I think it would probably work for web development as long as you keep everything inside WSL, but for other development tasks you're going to run into issues real quick.
I use WSL 2 for full time development on the stable release of Windows 10.
The last 2 issues are real as of today, but you can work around them so that they become non-issues in the end, by setting 1 config file value and running 1 command maybe once a month.
Also if you keep your source code inside WSL 2's file system then the first 2 issues are non-issues in practice.
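I don't know exactly which knobs the parent has in mind, but a commonly cited pair (my assumption, not necessarily theirs) is capping the VM's memory in `.wslconfig` and occasionally compacting the virtual disk after shutting WSL down:

    # %UserProfile%\.wslconfig  (limits how much RAM the WSL 2 VM can hold on to)
    [wsl2]
    memory=8GB

    # Roughly once a month, from a Windows prompt: stop the VM, then compact
    # the distro's ext4.vhdx with diskpart (the path varies per install).
    wsl --shutdown
    diskpart
    #   DISKPART> select vdisk file="C:\path\to\ext4.vhdx"
    #   DISKPART> compact vdisk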
> I think every dev knows exactly what I'm talking about. And if you aren't a dev, Apple treats us like replaceable Dogs.
No. Stop presuming to speak for all of us. Apple has generally treated me a lot better than Microsoft did, as a user and as a developer.
Downvoting the other side may give the impression that everybody is of the same opinion in this echo chamber, but it won’t change the reality that many people are happy with Apple and excited for the direction they’re taking.
Calm down. For the last 10 years CPU speed hasn't really mattered anymore, and if you aren't involved in the Apple ecosystem hula hoop (or JavaScripting, god forbid), I think the M1 is not very interesting until it allows other OSes to run on free hardware.
I’m sorry but this is just elitist nonsense. It’s disappointing to me that so many here simply can’t conceive that some people don’t share their priorities, and make conscious choices that reflect their own priorities.
I don't care if the inevitable future of the Mac platform is more locked down. In fact I welcome it. Does it limit what I can do with my device? Of course. Do I lament some of the control I’ve already given up? Sure. But the trade off is a safer and smoother computing experience that allows me to focus more on what I want to achieve with what the platform does provide.
I spent some time on Linux, I got into hyper-customizing every minute detail, I swapped out third party RAM and even early adopted SSDs by mounting them in the optical drive bay. I built a PC tower from personally selected components. I triple booted Windows and Linux on my first Intel Mac with community built boot loaders well before Boot Camp was a thing.
I can’t imagine spending my time that way anymore. I learned a lot, but now I just wanna get stuff done. I like having some of the system out of reach. I like knowing that it’s readonly and I like the kind of hardware advances Apple’s vertical integration has enabled.
It’s okay if we value different things, and it’s okay if we make different compromises to experience them.
I don't understand, so you don't care and it's okay if he cares, but if he says that people not caring is part of the problem then he's an elitist? What is your argument here, that he has no grounds to point out the apathy toward the long-term impact of your decisions?
You go so far as to validate the things he's concerned about and reiterate that you just don't care about them. I don't see what's wrong with pointing out the inherent shortsightedness of that line of thinking.
>it’s disappointing to me that so many here simply can’t conceive that some people don’t share their priorities, and make conscious choices that reflect their own priorities.
Have you considered that those people understand basic human psychology and that they take issue with the exact priorities that they are speaking against?
> I don't understand, so you don't care and it's okay if he cares, but if he says that people not caring is part of the problem then he's an elitist? What is your argument here, that he has no grounds to point out the apathy toward the long-term impact of your decisions?
What problem? How do my decisions to prefer one computer platform have any impact on GP or you or anyone? Apple turning the entire Mac line into iPads can’t and won’t eliminate more open platforms that you or GP may prefer. There’s no future inflection point where Apple has locked down the Mac platform in a way that enables them to storm into your home and replace your Linux computer (or whatever) with a Mac.
It’s elitist because the suggestion is that my preference represents an inability to evaluate the impact it has on me. It has no meaningful impact on you. It would have no meaningful impact on you if I decided to eschew all computing technology and go live in a monastery.
> You go so far as to validate the things he's concerned about and reiterate that you just don't care about them. I don't see what's wrong with pointing out the inherent shortsightedness of that line of thinking.
I am allowed to have different preferences and priorities. It’s not shortsighted, I see the compromise I’m making and I’m satisfied with what it provides in return. And since it in no way harms you or GP, it’s not your business to tell me to change my preferences.
> Have you considered that those people understand basic human psychology and that they take issue with the exact priorities that they are speaking against?
Have you considered that taking issue with other people’s preferences and imposing your own is far more controlling than just accepting that some people like different things, even if their preferences are personally limiting in a way you’re not comfortable with for yourself?
Why is it so difficult to consider that your calculations of risk-vs-reward are neither universal nor inherently correct? They likely did the math and found that the risk is worth the reward to them. Actual money in the hand now is worth more than theoretical money in the future, and so on.
I think it's funny that you accuse someone of not considering the alternatives with the supporting argument that 'they probably considered the alternatives'. What makes you think a for-profit company is thinking about the long-term negative impacts of their business on the industry when that's rarely the case in a free market?
>Actual money in the hand now is worth more than theoretical money in the future and so on.
Exactly, short-term profit will almost always trump long-term problems. Is that so difficult to consider?
This is a discussion regarding third party developers who use the Apple platform. I honestly have no clue how your statements tie into that in any way.
Third party developers are typically for-profit, and your argument was that those developers are capable of factoring in the long-term impacts of their decisions over the short-term profits when in reality that is very frequently not the case.
>They likely did the math and found that the risk is worth the reward to them
GP is saying that they likely did not do the math or that they prioritized short-term gains over the long-term impacts of their decisions as for-profit entities typically do.
I'm not sure what there was to misunderstand about my original comment since I more or less said exactly that.
>GP is saying that they likely did not do the math or that they prioritized short-term gains over the long-term impacts of their decisions as for-profit entities typically do.
No, GP said they did not consider future impact, period. I said that entities can consider future impact and still choose short-term profit (and that it is often rational to do so). I then pointed out that assuming someone else didn't do the math because their results are different than your results is egotistical.
I think they aren't wasting time and money. Look at the article's author -- they are investing time to earn money selling their product on this platform.
Also, your point about Mac future is pure speculation at this point, which contradicts public statements from Apple.
Apple contradicts public statements from Apple pretty consistently, they have a habit of saying they won't ever do something right up until they do it. I don't think there's anything unreasonable about those types of concerns because when/if Apple does go back on their public statements you won't have much recourse.
I used to live in that past, and you know what, it was great to be able to sell development tools to developers instead of being forced to work only with enterprise customers, the only ones left willing to pay for tools.
Everyone gets to pay for their tools in some form, even if it means getting them nth-hand out of some flea market; it's just that in some software development circles, for some strange reason, we get a bunch of people who feel entitled to earning money without paying others for their work.
With this argument iOS should have never taken off. Yet it's better than "open" Android standards. The thing is, Apple rewired people to stop thinking about RAM and CPUs and made developers write software that works on a 4-year-old iPhone. Try that on Android or Windows. Yes, there is a downside to it, but as long as consumers can win with the outcome, I personally think it should be fine.
Apple purposely doesn't implement standards that would make progressive web apps possible on their platforms, and they don't allow third party applications on their mobile platforms.
So... don't buy it? Unless you think it's the best hardware on the market at the best price point.
And what makes you so important that every hardware provider has to bend to your will? I can't run Shor's Algorithm on my Raspberry Pi, whose fault is that?
How will Apple take away your options? That's what's difficult to understand. Either the FOSS/DIY approach has merit and will continue to thrive, or it'll die because it becomes impractical, neither of those courses have anything to do with Apple trying to prevent Joe Bloggs becoming a node in a botnet.
"It's no different than consoles" is a pretty strong statement to make unsupported. There are many differences. You can make a more direct comparison between say, an iPhone and a game console, but a general-purpose laptop or desktop has very many differences from a game console, and ought to.
> "It's no different than consoles" is a pretty strong statement to make unsupported. There are many differences.
Like what? ps5/xbox series x/ps4/xbox one are very sophisticated, comparable to a modern desktop PC, yet it's totally closed off. Consoles no longer resemble embedded devices like previous gens. Heck, even the original nes used the MOS 6502 chip, which was very popular in PCs in the early days.
Interesting that the Java "write once, run anywhere" stuff ends up being the slowest, due to having to run through emulation if it interfaces with native code (natively compiled code: fast; Java JIT for the right arch: fast; Java JIT for the wrong arch due to old native integration: super slow).
You should really think about porting to Python 3. It's not actually that bad unless you heavily use something that changed. I ported a major Django project last year and it wasn't a major problem or time sink.
Most developers I know already use a Mac. It is a faster system for similar hardware (maybe due to the filesystem or something) and its Unix insides support development work better than Windows.
The M1 is just going to make Macs even more dominant.
And AMD is not going to rescue Windows/Intel. These night and day improvements are simply too big to be explained by TSMC being a better fab than Intel.
"Linux and Windows maintain the top spots for most popular platforms, with over half of the respondents reporting that they have done development work with them this year. We also see some year over year growth in the popularity of container technologies such as Docker and Kubernetes."
Which means we're looking at the question with the wrong granularity.
There are sectors of the industry where Windows is a hard requirement, ones where Macs are utterly dominant, and others where Linux is assumed and running a proprietary Apple OS gets you the side-eye.
> Windows is a hard requirement, ones where Macs are utterly dominant
I think you got it reversed. One can develop Android apps on macOS because there's no lock-in like Xcode. The same is not true for iOS apps, which require Xcode.
I don't have it reversed at all, mobile applications are just one sector of software development, and not a very large one.
There are entire industries (healthcare, to name but one), where Windows is simply a given, and all development targets that platform.
It's true that Macs are a requirement for iOS development, I was referring more to the observed fact that most devs in SF doing "cloud" whatever use a Mac, probably around 80%.
> most devs in SF doing "cloud" whatever use a Mac, probably around 80%.
They're just victims of group think, advertising, peer pressure or status signalling. Everywhere else in the world, 80% of people doing "cloud" whatever aren't using a Mac.
Maybe I should clarify the "most developers I know".
Inside my little circle of freelance Java / JavaScript developers, 2/3 are on a Mac. People buy their own work machine, unlike maybe when working for a big organisation where that is provided for you.
I don't get this blog, nor what the point of it is.
The author lists several bugs, but instead of posting the issue numbers with links, he posts his tweet with the bug numbers (without links). I'm not sure what the author is trying to accomplish here. Also, I guess these kinds of bugs, although not nice, exist mostly because macOS just went through the biggest change in its history, and supporting 2 completely different architectures is not an easy job. Of course we are at the mercy of Apple to fix them, but this is part of the deal when you buy such a machine at this point.
Then all the other points just show the same known issues: some commercial software has not yet been ported to the new architecture.
Software is going to be a big sticking point for this platform, and it may be more difficult to bring along other types of software outside of professional tools.
Apple cedes a lot of sales by not attempting to bring all the gaming companies into their fold. Just looking at Steam alone shows the disparity, and there is a lot of money to be had. The idea of just owning two systems is one that not everyone can justify.
Maybe I've only focused on graphics and gaming PCs, but it's really interesting to see how low-spec some of your work machines are in this thread; maybe I should write more efficient code. My desktop has 32GB of DDR4 RAM and my laptop is on 16GB. When I did controlmylights.net I was using almost all of the 32GB, maxing out the GPU/CPU. I don't normally run that heavy of a production on my system, but it was nice to be able to do it (2x resource-heavy OBS stream setups (Twitch/YouTube), MongoDB, Redis, a NodeJS application, openFrameworks). My desktop raised the temperature of the room by at least 5 degrees F; it was a mini space heater. I definitely could have made that project in something lighter, maybe using Rust, but it's nice to have the headroom. The next performance upgrade for me will be overclocking my 8700k.
Wow, HN is so strict with Apple.
I'm already seeing 4 ~ 5 top level comments on how Apple "locks down" computers and they are "taking our freedom"...
Apple has stated multiple times that they don't have any intention to lock down macOS any further; I can't really think why anyone would think Apple would lock down macOS. There's... really no reason, right?
My gut feeling is that there are a lot of people that don't like Apple, mostly due to their proprietary nature, and they just... argue against Apple. Before the M1 appeared, the argument was that Apple's Macs are expensive for nothing, they have terrible hardware, Touch Bar is bad, software quality has declined, etc... and now it's all about the user's freedom.
Seriously people. This article is about using dev tools on the M1 Mac. Let's not start arguing about how Apple is bad for freedom, etc.
It's quite the opposite actually. HN has a ridiculous Apple bias (the flood of blogspam mentioning M1 should be evidence of this), and any comments critical of it are shouted down. They can literally do nothing wrong in most eyes.
NO company can, or should, be above criticism. You will see comments critical of _all_ large companies in HN threads, but only defensive comments in these threads. Imagine a comment like yours on a thread about Amazon or Facebook.
I have a gut feeling of my own: that many people identify with their Apple products (identity politics) and are personally offended by any criticism that the organisation receives.
--
Edit: To put it in a little perspective... I invite you to view the contemporary Amazon thread. What would happen if you posted that everyone should focus on AMZN creating jobs and hiring people rather than criticizing them?
Maybe HN just has a diversity of opinion on apple products? There is plenty of reason to critique apple and plenty of reason to praise their products. I come to HN for the insight, and it is hard to get more insight about a tech company than from a bunch of tech nerds who disagree about it!
M1 has been out for a short time, and Apple Stores are not accepting walk-in customers. People aren't seeing their friends much due to the pandemic, so it's unlikely that those who bought the new machines are broadly showing them off.
Lots of forum praise about M1 seems like it's coming from people who have no hands-on experience with the new devices. I have no reason to believe that Apple pays people to stir up hype on the web, but I definitely sense that the vast majority of people hyping M1 haven't purchased an M1 device. My sense is that they just read some blogs, ogle some benchmarks, and regurgitate what they see.
That's not exactly a diversity of opinion. I can tell you I've been voted down a couple of times just for suggesting that there's too much hype and not enough information.
I've seen some blogs of techy people with hands on experience saying that they like the machine. I haven't really seen anyone who has it say they dislike it. But they do seem willing to jump through extra hoops to make the machine work for them.
So I think there is some reliable evidence in its favour. But I don't think you should be downvoted for disagreeing with that, since it's definitely up for debate! Someone seems to have downvoted your comment here, and I think that is the wrong action to take on what is clearly a thoughtful response on your part.
There is some genuine excitement about a new CPU architecture, that’s quite natural. If anything the opposite would be surprising on HN. Some of these people are Apple users, some of them are hoping another vendor will follow and offer powerful ARM laptops and desktops, some of them are just happy that a breakthrough has been made and that it will stimulate the competition. There is quite a lot of diversity in these opinions, actually.
You even find the usual contrarians who moan that it’s Apple, so anyone saying anything positive has to be a shill and the CPU has to be terrible (plenty of those in this thread).
Since when does he have to buy something to have an opinion? Are you saying that the people who complain about the M1 (or Macs in general) without having bought one should just shut up?
> Since when do he have to buy something to have an opinion?
It's pretty difficult to have an informed opinion about a product that you've never seen/used/tested/tried.
Due to the requirements for a shopping appointment at an Apple Store (hard to come by, if you check online), and due to social distancing, it's fairly unlikely that a person who hasn't bought/received an M1 machine can have an informed opinion about it -- especially so soon after the release.
That's why I take the hype regurgitation with a grain of salt. Has nothing to do with buying the right to have an opinion.
Apple dictating what tools we can and cannot use (e.g. Firefox not being allowed to use their own rendering engine on iOS, but forced to use Safari internally).
This should be enough to make you look elsewhere.
But then there is the locked bootloader, the Apple tax, etc. etc.
All the things you state are only for their phones, not their computers. The bootloader isn't locked at all. And as for the "Apple tax", it's largely a myth. If you look at the individual components used, they tend to be top-end binned parts across the board. There is some Apple markup, but it's not nearly as much as others say. You can't get a PC laptop with the same mix of low weight, high performance, and longevity.
> I can't really think why anyone would think Apple would lock down macOS.
Because they keep introducing new security features that get progressively more annoying to disable. Have you looked at how difficult it is to modify system files these days?
Security is the antithesis of convenience. Would you rather they send every unit out with Frontier Law levels of user protections? Every graphics artist, musician, businessperson, student, and Joe Bloggs who buys a Mac needs to study the implications of owning a computer for themselves and mitigate all the threats on their own. Suddenly Apple computers are infected with malware like Windows was in the 2000s and we just burn all the research Apple did on the right compromise for their entire user base because a few power users want to turn on a machine and debug system processes before they even log in.
I perceive a level of anti-Apple bias in HN threads, I don't think it's imagined, but more prevalent is the power user bias of people who have advanced knowledge of computer systems and want a superuser machine off the shelf.
To me, the fact that you call them "security features" and not some pejorative euphemism is a bit of a capitulation.
Some of the things Apple makes are certainly "security features" and not security features, but to your main point: Apple designed their security model to be "trust us or trust nobody" and the security model most users want is "I trust me". It is eminently possible to do that without going back to the wild west of not having any security, actually, the issue is that Apple forces you to go there if you aren't happy with anything they add on macOS.
Most power users maybe. Most consumer users don’t think about the fact they have to trust at all. And you can’t say you don’t want the security features and then say that it’s bad you have to turn them off if you don’t want them. The goalposts are Apple’s customers, my guesstimate is that 80% of those paying customers would rather be secured by the vendor they already trusted to produce the hardware, firmware, software, and infrastructure they use on a daily basis. Do you know of any market research about the level of security most users want?
No, most users would rather trust Apple. These are the people who drop their computer off at the Genius Bar for software problems. They never even think about rolling up their sleeves to fix a problem, never mind firing up the terminal.
`sudo spctl --master-disable` to disable Gatekeeper, and `csrutil disable` in Recovery Mode to disable rootless/System Integrity Protection. Just like that, it's a Mac from a decade ago in openness.
Almost all the security features are getting progressively less sensible to disable, though. You've got to be a for-real Jedi to be using a computer with easily modifiable system files these days.
I run beta software on my devices. Apple, in its infinite wisdom, believes that programs compiled on beta operating systems do not belong on the App Store. Thus, I modify SystemVersion.plist so Xcode will think it's running the latest release version rather than the beta.
This is just one of the things that you can do; I've occasionally patched system libraries, replaced resources, and so on depending on what I've wanted to do.
Do you think it's really that odd that they wouldn't want someone publishing releases from beta OS/Libraries?
Given that everything else you talked about can be turned off, do you think that would be a better default: to allow system libraries and such to be altered, and to require the protection to be explicitly turned on instead?
> Do you think it's really that odd that they wouldn't want someone publishing releases from beta OS/Libraries?
Yes, very. It's not like the toolchain really cares which version of XNU it's running on.
> Given that everything else you talked about can be turned off, do you think that would be a better default: to allow system libraries and such to be altered, and to require the protection to be explicitly turned on instead?
I liked what they did in Catalina, where you could remount the rootfs as writable. What they have in Big Sur is too much; you have to restart every time you want to modify anything and it is not possible to change anything after booting.
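For reference, the Catalina-era flow was about this simple (a sketch, with SIP already disabled from Recovery):

    # Catalina: remount the live system volume read-write, edit, done.
    sudo mount -uw /

    # Big Sur: the system volume is a sealed, signed snapshot, so the above is
    # no longer enough; you have to disable authenticated-root in Recovery,
    # modify a mounted copy, bless a new snapshot, and reboot into it.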
I don't see how that's any different from Microsoft, Amazon, or any other major tech company that makes the front page of HN.
In fact the only difference I can think of is that there's always some degree of reactionary discourse when Apple is mentioned by people who feel the need to defend the company personally. You never see this type of comment on a thread full of people complaining about Windows or AWS, but you almost always see it when there are a few comments critical of Apple.
Historically the attacks against Tesla have only been matched by the ferocity of political attack ads. So a lot of fans of Tesla get very defensive about them.
I think the reactionaries in any of these conversations are the ones advocating that Apple not improve security, i.e. that they don't make progress in protecting their users.
There are many reactive voices speaking out in support of Apple, but they are responding to those who complain (for the most part) from the outside, or on their way out of Apple's door.
I specifically mean reacting to the criticism Apple is getting. You see the same level of criticism of any major tech company in the comments section of the front page but very rarely are those criticisms followed up by a bunch of users defending the tech company in the comment section. You're arguing a weird semantic that doesn't really have anything to do with what I'm saying.
A cursory scan of the front page article concerning Amazon turns up a few standing in Amazon’s defence. I guess Apple isn’t the only tech giant with blindly loyal fans.
None of these comments are reacting to people complaining about Amazon in the comments or claiming that HN is unfairly biased against Amazon, which is what the comment I replied to was about.
Some of these are the replies to the claims of bias, some are the original claims themselves. The search bar is at the bottom of the home page. I may have been rather childish throughout this thread, it's a defence mechanism, but it doesn't mean I'm wrong, and it doesn't mean there isn't pro-Apple bias on HN. That doesn't mean there isn't pro-<FAANG> bias around here either.
People are entitled to these opinions, and to express them, as are you and I to participate in whatever you call this exchange. It's why I still love the internet even though it feels like it's raising my blood pressure from time to time. And why I go through spits and spats of spurning all social media and then participating in controversial discussions like this and other philosophical debates.
Would you happen to know any data science? I took a course on R but I never got to use it, but this is actually starting to sound like an interesting project. There's a lot of data entry level work to do I guess. My data chops aren't great but I can handle repetitive tasks.
Yeah, that’s reactive rather than reactionary. It’s a significant distinction, particularly as reactionary tends to be used pejoratively to describe the proverbial “stick in the mud”.
And my point was that the very fact that Apple is the one with defenders, among the technologists who tend to make up HN commenters, should make one wonder whether they aren’t doing some of those things right, not that the supporting commenters are mouth-breathing dullards who just “don’t get it”. Apple are getting bad press right now over pushing for a bill in the US designed to eliminate slavery from supply chains, and I will join the voices calling on them to take responsibility for that. But in reference to how they control their hardware-software ecosystem for their customers, the reactionary point of view is that they shouldn’t be progressing towards a user-safe environment.
Maybe don't get into an argument about semantics when it's pretty clear what the person you replied to originally meant. Half the comments in this thread are reacting to the negative comments about Apple in this thread. Typically for other big tech companies you don't see that because there's a huge bias towards Apple on HN. Idk why you went totally off the rails because of "cowardly" downvotes but I assume that you received them because you were very clearly misconstruing my statement.
Maybe don't use big words you completely misunderstand in a discussion concerning a complex topic littered with political commentary and obvious bias. Particularly if you are only pointing out the obvious bias.
Sorry you're so salty about downvotes that you're resorting to ad hominem and debating semantics instead of discussion, but you clearly don't have anything more productive to say so enjoy your well-founded sense of intellectual superiority I guess.
>Particularly if you are only pointing out the obvious bias.
Sure, let's pretend you didn't take my original comment completely differently than I meant it and then act like I'm too dumb to understand what I was talking about. Have a nice day.
How many times can I say "you were wrong" before you understand that I mean you used the wrong word to make a bad point that isn't true? Then you jumped down my throat about pointing out that you're wrong, and deflected the criticism to my minor point that you swapped the two terms.
I’m ready for bed, I had a nice day thanks. I took a small walk, learned a couple of songs and now I’m ignoring the TV while I sketch out designs for a web service, but your deliberate continuing obtuseness is keeping me entertained for the evening.
Edit (upon reflection): I'm salty about the downvotes because I don't yet have the karma to downvote comments myself, so I can only reply expressing my negative opinion. This so often results in further downvoting that I get further away from the privilege of a silent dismissal of comments with which I disagree.
> Apple have stated multiple times that they don't have any intention to lock down macOS more or less
What Apple says and what Apple does can be two different things, and that applies to any company in the world. Blindly believing them is naive at best.
> I can't really think why anyone would think Apple would lock down macOS
I can think of one: iOS. Locking down the software and hardware can allow Apple to funnel people to their own stores and services. It is not rocket science to figure out that a company is trying to increase its revenue. Right now you have only macOS as the operating system; Linux may never come, and Windows support is anyone's guess right now.
The easy path on M1 would have been to lock everything down. Use the same iBoot phones use, the same kernel configuration, and call it a day.
It takes a lot of engineering work to make more permissive security modes work. To re-architect booting. To make external drive booting work. To make developing kernel extensions on production systems work. To allow signing your own boot blob. Work that 99% of users don't even understand let alone care about.
If that isn't enough evidence then nothing ever will be.
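And those more permissive modes are user-visible rather than hidden away. A small sketch of the status checks (the reduced-security boot policy itself is toggled from recoveryOS's Startup Security Utility):

    # Is System Integrity Protection currently enabled?
    csrutil status

    # Is Gatekeeper assessment currently enabled?
    spctl --status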
Yep this is true of all big tech companies in general. It’s a bit like when punk rock became mainstream, but now it’s tech companies - or rather was ~10 years ago. People hate change.
These new CPUs will represent a huge generational leap and dropping of old baggage. A laptop which has nearly 20h battery life, is small/portable, and is still fast? Wow.
Anything Apple delivers on for these CPUs will find its way into Linux and Windows machines.
This is good for everyone. Eventually you’ll see the next generation of Lenovo’s X series or Dell’s XPS series with the same generational improvements.
One example of Apple locking down macOS is how they're phasing out custom kernel extensions. For example, firewalls used to be kernel extensions; now they use explicit APIs and no longer run in kernel space. One could say that it allows for a more stable operating system, but on the other hand macOS now bypasses firewalls for its own services, and this feature was abused to bypass the firewall with ordinary programs.
The margins on the App Store for iOS is enough of a reason to lock down the OS as far as they can. The difference is they can't practically lock it down to the point that app developers aren't impeded.
But all that said, given the quality of developer tools from Apple, I don't have much faith that they won't impede everything but Swift development in Xcode.
I have a different perspective. I do all my development remotely if I can now in cloud VMs. I know that doesn’t apply to some classes of development but it’s a hell of a lot less painful isolating it and having snapshots available when inevitably you break something. The amount of hours I’ve lost due to an OS upgrade or dev tool chain upgrade breaking each other is huge so I keep them well apart.
Thus I’ve been using an 8gb M1 Mac mini doing this for a week now and haven’t had any problems. It makes a very nice terminal computer with luxury desktop. When a suitable 27” approx iMac appears with an M-series CPU in it, I will buy one. It’s nice being able to triple swipe between a Mac desktop with native apps, a desktop full of terminals to Linux VMs and a windows desktop machine via RDP transparently!
And it's not just about sheer local grunt: the hardware is quieter, faster and cooler than the old i3 Mac mini I was using, which has a direct benefit to the end user. On the dev front it means it gets out of your way and stays out of your way.