> For every extra cycle a hardware engineer can squeeze out of silicon, you will find a programmer adding one hundred cycles to their program[6]. In other words, I fear that once devs figure out how powerful the M1 is, they will throw more "features" at it.
Can we please stop importing the "growth" fallacy into software ecosystems? Haven't we learnt enough about how unsustainable and damaging this is?
I hope Electron/CEF die soon, and people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.
> I hope Electron/CEF die soon, and people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.
Web technologies are fine, but what we really need is some kind of lightweight browser which allows you to use HTML/CSS/JS, but with far lower memory usage. I found https://ultralig.ht/ which seems to be exactly what I am looking for, but the license is a major turn off for most paid services. It makes sense for smaller, indie projects to adopt it, but I haven't seen many "desktop apps" using this in the wild.
What I'd really like to see with CEF et al. is JS being dropped in favor of directly controlling the DOM from the host language. Then we could, for example, write a Rust (or Kotlin, Zig, Haskell, etc.) desktop application that simply manipulated the DOM directly and had it rendered by an HTML+CSS layout engine. Folks could then write a React-like framework for that language (to help render & re-render the DOM in an elegant way).
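For what it's worth, you can get surprisingly close to this today by compiling Rust to WASM with the wasm-bindgen/web-sys crates; the catch is that every call still goes through a JS shim rather than hitting the engine directly. A minimal sketch (treat the exact crate versions and feature flags as assumptions):

```rust
// Cargo.toml (roughly): wasm-bindgen = "0.2",
// web-sys = { version = "0.3", features = ["Window", "Document", "Element", "HtmlElement", "Node"] }
use wasm_bindgen::prelude::*;

// Runs automatically when the WASM module is loaded into a page.
#[wasm_bindgen(start)]
pub fn run() -> Result<(), JsValue> {
    let window = web_sys::window().expect("no global window");
    let document = window.document().expect("no document on window");
    let body = document.body().expect("document has no body");

    // Build and attach a DOM node directly from Rust, with no hand-written JS.
    let p = document.create_element("p")?;
    p.set_text_content(Some("Hello from Rust"));
    body.append_child(&p)?;
    Ok(())
}
```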
Ultralight (https://ultralig.ht/) looks pretty cool. I think another possible option is Servo (https://github.com/servo/servo) – it was abandoned by Mozilla along with Rust during their layoffs a while back (but the project still seems to have a decent bit of activity). It would be great if some group of devs could revive the project, or a company could fund such a revival.
Eventually, we'll need to reflect on and explore whether HTML+CSS is really the best way to do layout, and perhaps consider porting the Android/iOS layout approach over to desktop. Maybe WPF/GTK/Qt/etc. even got things right, and HTML+CSS isn't the best way to do layout.
We already know HTML/CSS are not the best way to do layout. We already have many desktop-based UI frameworks for creating cross-platform desktop apps.
GTK, Qt, etc., which are also superior to the single-app, limited-screen-size-focused Android/iOS UI frameworks. More importantly, unlike iOS/Android, they're even cross-platform.
HTML/CSS/JS are successful for the same reason Electron exists. The success of the web means there's a huge developer base that knows how to work with HTML/CSS/JS, the original lingua franca of the web, and there are tons of libraries, frameworks, tools, components, etc. available in the HTML/CSS/JS world that make development easier and quicker.
> HTML/CSS/JS are successful for the same reason Electron exists. The success of the web means there's a huge developer base that knows how to work with HTML/CSS/JS
The actual reason is a different one, IMHO. The implementation of web standards enabled cross-platform engines that provide every UI customization one could possibly need for a SaaS product. Betting on the web brought many benefits for businesses and developers alike, such as:
- User data stays in the walled garden
- Users most probably already have a web-browser installed
- Responsibility to keep the runtime (web-browser) up-to-date is on the user side
- Automated updates (POV user)
- No installation instructions
I think it was as much a business decision as it was a decision by developers to bet on HTML/CSS/JS instead of GTK, Qt, etc.
Flutter does something like this, rendering on a canvas directly and eschewing HTML, CSS and JS altogether. It works pretty well. With WASM in the future though I suspect many other languages will adopt a similar paradigm.
> directly controlling the DOM from the host language
I think your path to this future is getting a DOM API specced for WASM (as in, not going via a Javascript bridge). WASI might help.
If and when that happens, then you can repurpose that same API without necessarily needing to compile to WASM. The biggest hurdle is that the current DOM API is a Javascript API, not an API describing e.g. memory layouts.
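To make that concrete: a JS-free DOM surface for WASM would have to be expressed in terms of opaque handles and (pointer, length) pairs into linear memory rather than JS object references. A purely hypothetical sketch (none of these imports exist in any spec today; the "dom" module and every function name are invented for illustration):

```rust
// Hypothetical WASM import surface for a JS-free DOM API. Nothing here exists
// today; the "dom" module and all function names are made up purely to show
// what an API "describing memory layouts" might look like.
#[link(wasm_import_module = "dom")]
extern "C" {
    fn dom_body() -> u32;                                              // opaque node handle
    fn dom_create_element(tag_ptr: *const u8, tag_len: usize) -> u32;  // returns a new handle
    fn dom_set_text(node: u32, text_ptr: *const u8, text_len: usize);
    fn dom_append_child(parent: u32, child: u32);
}

pub fn hello() {
    // Strings cross the boundary as (pointer, length) into linear memory;
    // nodes come back as plain integer handles instead of JS references.
    unsafe {
        let body = dom_body();
        let p = dom_create_element("p".as_ptr(), "p".len());
        let msg = "Hello, DOM without JS";
        dom_set_text(p, msg.as_ptr(), msg.len());
        dom_append_child(body, p);
    }
}
```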
The majority of users who own an M1 are never going to push it to 100% utilisation.
So whether their productivity app is written in Electron or hand-crafted assembler isn't going to make much of a difference to the end user experience.
Electron and similar apps serve a legitimate purpose in the marketplace, which is to allow developers to deliver a cross-platform app for the same cost as a single-platform one.
A machine that is 99% idle isn't running at 1% utilization all the time. It's idling 99% of the time and running flat out 1% of the time. The user pushes at least one core to the limit with every action they take, even if only for microseconds.
This is why performance matters even with a machine that is mostly idle. A faster system will respond faster to events. It will have lower latency when communicating over a network. It will handle user inputs faster. Users will complain that a program takes a second to respond even if the load average is 0.01.
(I know CPUs enter lower-performance states when lightly loaded. They can go from idle to their highest performance state in much less time than it takes for the user to notice latency.)
> whether their productivity app is written in Electron or hand-crafted assembler isn't going to make much of a difference to the end user experience
Subjectively, every Electron app I’ve used including the local beloved VSCode has been a painful UX. I can always tell. Nothing makes a fast computer feel slow quite like them, it’s amazing. I would settle for React Native at this point.
Electron is an example of modern software that’s slow. And it’s common and exceptionally slow.
I think most software is designed to load once and never be quit; you're probably just supposed to leave apps running in the background and let them swap to disk if needed.
But I have a hard time running like that, I quit programs when I’m done using them.
I wonder if there is a name for the idea that the more you earn the lazier you get.
Sysadmins used to know a lot but earned a pittance compared to "devops" or "SRE"; then those more expensive folks outsourced the majority of their hard work to cloud vendors who charge 5-11x the cost for compute.
Developers earn 5-10x more than 15 years ago, yet continue to choose solutions for their own convenience and time savings, claiming that it's better for the company if they have to work less.
Surely at some point it's just your job to consider the entire ecosystem and your contribution to it.
Because they -are- providing value, and because dev resources are limited. If you're a dev, do you think your company would want you to spend 2 weeks on building a new feature, or on "considering the entire ecosystem"?
And more than that - if you have a team that only/mostly has web experience, they're going to use Electron, and... what kind of company would want them to stop and go learn native app development, or hire extra people just to get a native app?
Note: I fully agree with you, I do feel that as a profession we really have descended into a weird "ship the MVP, ship new features non-stop, forget quality" reality, and I dislike it. I'd love it if everything was resource-thrifty native apps, and dev teams could stop chasing new and shiny things and work on improving their existing apps until they were shiny polished diamonds. I'd love to point to an app and say "see? this is 20 years of development, look at how nice it is, how perfectly thought out everything is, how everything just works flawlessly".
But the fact that that's not happening is not because devs are lazy, or not worth their money. It just doesn't make business sense.
I don’t think developers earn 5-10x more than 15 years ago. Assuming you’re using something like $250k as a typical salary (for dev jobs at US tech-hubs) I guarantee you the typical equivalent in 2008 was not $25-50k.
Average income in a tech-hub area in 2007 was $85,355 for "Software Developers"[0].
Salary for developers that we discuss on HN seems to be in the region of $250k base, but that's not factoring in TC, which seems to be the bulk of income for the last 5 years. It's not unheard of around here to hear people talking about pulling in half a million if they are ex-FAANG. Which is nearly everybody these days.
Sure, if they're calling into an Electron front-end. Otherwise, each front-end is different, so no amount of C++ and dev experience is going to take your Android UI calls and transform them into iOS UI calls. So you'll have to build your own wrappers over each native UI framework to interact with, costing more money.
Ah, you live in a parallel universe which is the same as ours, except that Qt and other frameworks that do exactly this without wasting a ton of resources don't exist?
On the contrary, decent developers should shame shoddy ones every chance they get. Maybe that would keep them from making horrible software that millions of people have to suffer every day.
Keep hoping, because it won’t happen. Practically no consumers care except for other software engineers complaining loudly on HN, and meanwhile companies save millions by not handcrafting their CRUD apps in assembly.
The problem is having to target 5 operating systems. An extremely complicated web app runs on Linux, Windows, Mac, Android, and iOS. Carefully crafted native UI widgets only target one platform at a time.
If you want to get rid of Electron, you personally have to volunteer to be responsible for 5x the work of anyone else, and accept bootcamp-graduate pay. Do you see why it's not happening?
Arguably the success of macOS has caused the expansion of Electron, because developers are using Macs and want to keep using Macs even when programming tools for Windows.
> Practically no consumers care except for other software engineers complaining loudly on HN, and meanwhile companies save millions by not handcrafting their CRUD apps in assembly.
And you know that because you've analyzed the whole market and know the opinion of the users of every piece of web software? Definitely not because you live in a bubble of other software engineers, so you're not exposed to the complaints of common folk?
No end consumers I know, nor even acquaintances of other people I know, know or care about Electron. You might say that I'm in a bubble, but at that point it's just the No True Scotsman fallacy.
You can make slow & buggy software anywhere, with any stack. Teams itself shipped a whole release focused on sucking significantly less & got much faster.
Discord has gotten pretty solid, VS Code is fast. The hate against Electron feels so selective & antagonistic: it picks its targets, never acknowledges any value, never recognizes any points for the other side. It is just incredibly lopsidedly biased, over something that doesn't feel real to so many.
I was hoping Tauri would be the one, by using a shared-library model. That would drop memory consumption down significantly. But they use WebKit on Linux, and I'm sorry, but I want a much better base that supports more modern capabilities.
Objectively he’s right and there’s a reason the iPhone mini was discontinued. The demand for mini phones is niche but very loud online. It isn’t economical to support a mainstream product for the handful of people who are actually willing to pay for that specific niche feature point.
Kinda like headphone ports and microSD slots on android phones. Are there models that do this? Yes! Is anyone willing to pay more for it? No! But they are very very loud online about it.
Smaller Android phones do exist as well; just like with headphone ports, nothing is stopping you from buying one instead of complaining on the internet!
> People want a small phone, but perhaps not if it costs like 6 phones? How is this hard to understand?
The iPhone mini wasn't priced like 6 phones, and it still didn't sell well. Again, you can't seem to grapple with the reality here that what you want is a very niche thing and it doesn't sell well as a mass-market product regardless of price. That is, after all, why Apple stopped making that size once again.
The iPhone SE is about as small as the market wants. Below that is the domain of mini phones - which do exist, but mostly from smaller Android vendors.
> No they don't. There is NOTHING on the market currently that is sized and priced like a samsung S5 mini today.
How do you know they care but have no choice? Are average people besides devs talking about how slow their web apps are?
For your small phone example, even Apple is cutting production of their smaller phones because not enough people are buying them. The entire premise really is overblown online.
The arrogance to believe that their software is so well written already that the only way to improve its performance is by using handcrafted assembly.
Just changing their high level data structures alone would yield 10x improvements, easily.
If they knew the main reason they have to shell out hundreds of dollars to upgrade their computers and phones is lazy/shoddy developers, they would care.
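To put a rough number on the "high level data structures" point: no assembly required, just a hedged Rust sketch (names and sizes invented) of swapping a repeated linear scan for a hash set, the kind of change that routinely turns quadratic work into linear and shows up as an order-of-magnitude win:

```rust
use std::collections::HashSet;

// Naive version: for each incoming id, scan the whole slice. O(n * m) comparisons.
fn any_seen_naive(seen: &[u64], incoming: &[u64]) -> bool {
    incoming.iter().any(|id| seen.contains(id))
}

// Same behavior, different data structure: build a HashSet once, then O(1) lookups.
// For tens of thousands of ids this is easily 10x+ faster than the naive version.
fn any_seen_fast(seen: &[u64], incoming: &[u64]) -> bool {
    let set: HashSet<u64> = seen.iter().copied().collect();
    incoming.iter().any(|id| set.contains(id))
}

fn main() {
    let seen: Vec<u64> = (0..50_000).collect();
    let incoming: Vec<u64> = (50_000..100_000).collect();
    assert_eq!(any_seen_naive(&seen, &incoming), any_seen_fast(&seen, &incoming));
}
```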
Nothing says you have to use it to make a game. I recall seeing articles here on HN about using it as a cross-platform app development framework. GDScript may be less appealing than JavaScript, however.
Electron is quite efficient. You get a modern GPU composited UI, and JavaScript is by far the most highly optimized scripting language out there.
The problem with Electron is RAM, disk space, and download size, which is because every app needs to ship its own copy of Blink. This is a solvable problem, but nobody is incentivized to solve it.
A similarly bad problem is that Electron apps by default do not adhere to operating system conventions and APIs, and often also do not use the features which make the OS good.
For example, all Electron apps I use have their own implementation of a spell checker rather than hooking into the system one. Normally in macOS, if I add a word to the system dictionary all apps will learn that word, but in each Electron app I will have to do it again, and again, ...
Many also implement their own UI widgets like basic buttons and input fields whose UX is not at all in line with native apps.
Most consumers don't care about native controls. In fact for a company it's more important to have a unified UX across all devices so that the user can pick up easily from one to the other, ie Macbook to iPhone. Slack is a good example of this.
Strong disagree with your assumption. I __hate__ unified controls across devices. When I am on an iOS device I want iOS controls, not the same as I have on Windows, and on Windows I want Windows controls, not something kind of like Mac OS. And I am 100% certain pretty much every consumer would actually agree if they were given the opportunity and knowledge about the situation, and the "unified" look is just something developers like me lie to ourselves about since it is easier to do it with Electron and not spend time and energy to do a properly good user experience.
Well, I can tell you, having done user interviews about precisely this topic for our teams, that users really don't care about device-specific controls; we've found that it's really devs that care a lot about it, and even then it's been specifically Apple users. Windows or Linux users didn't care as much, although Linux users cared a bit more than Windows users. Most of our interviewees said they either don't care or prefer unified controls per app.
No, we never asked them about our own application specifically, it was about what they preferred generally across all of their devices. So, no flaws here. You may just be projecting your own bias for OS specific controls onto the general populace.
I highly doubt your research was so perfect, since it seems it wasn't even peer reviewed and published, so that conveniently nobody can point out methodology flaws.
I never said it was some peer reviewed study, just that we did it for our team to figure out how to lay out the UX, that doesn't mean it's not statistically and methodologically valid. It seems you are also trying to project your biases on top of my comment so as to question any sort of research that we conducted.
Notice that the Slack app has a similar layout on both mobile and desktop, with a list of teams on the left, followed by channels, followed by the channel content. There are no device UI framework specific elements anywhere, it doesn't use iOS or Android style buttons nor is it designed with Fluent Design on Windows or HIG on macOS.
Agreed that this is a good example where designing for everyone ends up with a worse product for everyone. Same thing goes for Teams, GitHub Desktop, and Postman.
It's horrible having to constantly context switch _within the operating system I am using right now within this very moment_. Context switching when switching devices is not a problem at all.
If somebody just made Electron into a HTML/CSS/JavaScript engine capable of opening just the HTML/etc parts from, say, an HTTP server, and then shipped it to all users worldwide...
The RAM usage is from JavaScript heaps, not from multiple copies of the binaries.
(There basically aren't memory metrics that can show the effect of binary sizes on macOS. It'd show up as disk reads/page ins/general slowdowns instead, but SSDs are so fast that it's very hard to see one.)
> people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.
Wasn't OS X's whole new paradigm a switch to high-level languages and frameworks, with compositing window management to display super fancily rendered hello worlds?
Decrying cycle burning for the sake of high-level abstractions feels like a complete rebuttal of everything the OS X/macOS ecosystem stood for.
> Wasn't OS X's whole new paradigm a switch to high-level languages and frameworks, with compositing window management to display super fancily rendered hello worlds?
No, not really. You're making it sound like Classic Mac OS applications were written in assembly, when they were mostly C++. Switching to Objective-C was more of a lateral move, not a clear step forward in time or level of abstraction.
The Mac OS X graphics stack was a pretty big change from Classic Mac OS, but it wasn't that much of a change from NeXTSTEP, which worked with 12MB of RAM + 1.5MB VRAM. NeXTSTEP just wasn't as eye-catchingly colorful. Early OS X did pioneer gratuitous use of some effects like translucency with blurs in desktop GUIs, but since this was the time period where GPUs were becoming mainstream, these effects weren't egregiously resource-intensive.
And considering that OS X was adapted with relative ease to power a smartphone with 128MB of RAM (with the graphical effects turned up to 11), it definitely wasn't fundamentally bloated in the way that today's Electron apps are.
Many Obj-C libraries were a wrapping layer on top of the Carbon APIs, and in terms of abstraction Obj-C is also a level above C++. Passing everything as messages is an additional layer of indirection.
NeXTSTEP itself was a big change compared to X or Windows's GUI stack. NeXT machines were beefy and tuned for high workloads, so the user experience wasn't problematic, but it's still more resource use than the standard stacks of the time. I don't see it as a bad thing, but we should acknowledge that resource economy wasn't a goal of either NeXT or Apple at that time.
> 12MB of RAM + 1.5MB VRAM
NeXT launched in 1988, basically around the time when "640k should be enough for everyone" was a meme. Even in 1996, 12MB of RAM was nothing to sneeze at. So, sure, those are not enormous numbers, but they're far from the lower end of what was sold to consumers at the time.
> smartphone with 128MB of RAM
We should take a minute to note that 128MB of RAM was not small for a phone at the time.
Otherwise, sure, Electron apps use a lot more than that, but the point was always to pay for powerful hardware and use it to its full potential to run fancy apps. People loathe Electron apps' bloat, but ease of development has also always been one of the core values. Better to have apps that aren't fully optimized than no apps at all.
Electron’s bloat and power consumption wouldn’t bother me so much if the performance was actually good on a fast machine. But the experience is awful compared to a native app.
My guess is that Slack, for instance, would never make a full-blown native client for macOS. If it weren't Electron, they'd probably have gone with Mono or another cross-platform stack, and we'd see performance lag and memory bloat compared to a handcrafted native app.
I say that looking at macOS's pretty big market share compared to what it was a decade or two ago.
And we see tons of users choosing VS Code or IntelliJ's offerings over TextMate, Codea, or even BBEdit, so the market for handcrafted, delightful editors is also pretty slim. And that trend has been there since the Eclipse days, so nothing new either.
All in all, I really think the choice comes down to electron/cross compiled apps or no apps, in many many cases.
OSX used a high-level framework that could, at one point, produce a very fancy hello world (or highly sophisticated application) on a 25-MHz 68030 with 8 MiB of RAM.
To the CPU it always boils down to sequences of instructions operating on data, so the syntactic sugar put on top has to be undone first; the abbreviations have to be unrolled, one way or another.
But more importantly, it's really not about the lines of code you write once, or very few times in comparison -- it's about what the computer does every time the application starts up on any machine, or any time a user does something. Obviously, to an extent. You wouldn't write a thousand lines of code instead of 100 to make something that is already very fast 10% faster. But I certainly would write 10 lines instead of 1 if it meant the code is 10 times faster. It's just typing, after all.
My point was that CPU cycles mean nothing now. Compared to programmer cycles they are effectively free. So stuff it into RAM and forget about it. Why not? Complexity is bad, and being able to hide it (effectively) is good. I shouldn't be thinking about pointer allocation when building a text input box for a UI.
Still running an M1 Air 13", now into my third job since, and refusing to bother with company-provided computers. Still feeling as great as 3 years ago… will only upgrade if the next gen is substantially better (read: not just % faster somewhere).
Can comfortably develop on my lap for a full day if needed, only MS Teams draws power like crazy, but apart from that 8 hours+.
The main problem with devices like Apple's Mx series laptops is that everything works fine...until it doesn't. This device is non-repairable and non-upgradeable as far as CPU, RAM and disk are concerned.
Now people may claim that Apple hardware is superior to other manufacturers' and that it will last longer. So far I have found that to be false: I have more old non-Apple hardware that is still working fine, and a few Apple devices that can no longer be used as Apple devices, though at least I was able to replace the failed hard drives on them.
Similarly, people will claim that Apple's superior hardware is the reason for their inflated prices, but I have also found that to be false... it is Apple's perceived value that is driving those prices. Apple users will actually pay more for second-hand devices than what they would pay for new Apple devices. I saw this just recently on eBay, where I followed a few auctions and the ending price for a 2nd-hand Mx laptop was higher than what I would pay for a newer model in the store. Madness.
Just had a 2019 MacBook Pro 16" have its SSD die randomly. Took it to a repair shop and they said that since the internal SSD is soldered on to the logic board, there is no easy way to repair it without also getting a whole new board, so the machine is essentially completely bricked. I can't even boot it off of an external drive because Apple requires the boot volume to be the internal SSD!
I couldn't believe it was designed in such a fragile and user-hostile way. I took care of it, didn't drop it or spill anything on it, and it is bricked after just a couple of years due to a simple component failure beyond my control.
Yeah, that is an unfortunate design choice. I consider drives to be wear items, that is, they will wear much faster than the rest of the machine, and as such they should be field-replaceable units.
Might not be a secret, but it's not like Apple informs you of it.
Looking at the product page, you need to click on "How much storage is right for you" to find a vague: "MacBook Air storage is not user accessible".
What does "accessible" even mean? It's far from clear whether even Apple will upgrade/change it.
And as for the boot issue, it was not present on older models, and quite a lot of people only learned of it from the recent Rossmann video.
Casual classism on HN. You're suggesting not only that everyone do the research, but that they be capable of doing the research, understanding the research, and understanding whether they're being lied to, all potentially without any knowledge of how things work.
Meanwhile in another HN thread - "muahaha, look at these Europoors, how your economy is stagnating, no successful startups, no innovation. That's all due to your overregulation!!!111" /s
But hey, they're super environmentally conscious at least by adding a few recycled cans to the new laptop you'll end up buying after the repair isn't worth it.
> Now people may claim that Apple hardware is superior to other manufacturers and that it will last longer.
On average that's undeniably true.
But not for some "Apple is perfect" kind of reason; it's because Apple is producing high-quality laptops while other manufacturers are producing a lot of shit too, besides that.
> The main problem with devices like Apple's Mx series laptops is that everything works fine...until it doesn't. This device is non-repairable and non-upgradeable as far as CPU, RAM and disk are concerned.
Given inflation, the MacBook Airs are pretty crazy cheap. If they break you can just replace them. The base low-end M1 Air is only $1000 last I looked (and another $100 off for those in education). They're some of the best price:perf in a laptop the world has ever seen. "Inflated prices" is not an accurate description of an $899 retina M1.
Money is only part of the equation. It's pretty wasteful to have to toss a computer because one component failed, when that specific component would be replaceable in other brands.
This seems like a movement of the goalposts. I'm not that concerned about electronic waste, and I haven't seen any great arguments why anyone really should be.
It's not replaceable in other brands, because approximately no one is making a laptop remotely equivalent to the M1/M2. Perhaps the Surface Pro, but even that isn't a good comparison.
Remember when Apple made the first Air and it was so far ahead of every other laptop out there that Intel had to invent the term ultrabook to describe it so Asus and Dell could clone it? Yeah.
That's an insane thing to say. I am concerned about electronic waste. I am not concerned about it to the extent that I would never buy or use a new computer. I am concerned enough about it that I don't want to waste a perfectly good computer because it has been designed to be thrown away.
Would you say that someone concerned about food (or food-related) waste should not ever buy any food? Of course not. But don't buy food that you don't intend to eat, and don't buy food that causes unnecessary waste, like food that is unnecessarily individually wrapped.
I go out of my way to recycle things appropriately: dropping off old computers at the local e-waste depot, recycling old rechargeable batteries at retail, etc. It may very well be pointless, but at least I feel I tried.
I have the base model $1k M2 air and it functions excellently as a development machine. Both web dev for work (running multiple react clients, node servers, Pg instances, docker containers, etc all at the same time), and for basic game dev for recreation - though admittedly I’m using Godot for that which is among the lighter weight game engines available.
That is to say, it's far more than a glorified Facebook machine and is a truly excellent value. I much prefer it over my work-issued Intel MacBook Pro.
M1 Macs would make fantastic Linux laptops, because macOS is appallingly bad for backend development. It makes up for it in sheer hardware power and efficiency; otherwise I'd be completely at a loss as to why it's so loved.
I had a laptop with a replaceable CPU. By the time it was worth upgrading it made more sense to buy a new laptop rather than upgrading the CPU. For repairability it's also not really that different unless you're running something like a framework laptop and, even then, replacement mainboards cost about as much as a whole new laptop anyways.
I do wish the M* laptops had something like a CFexpress slot or similar though. The built in storage is blazing fast and power efficient but soldered. Thunderbolt or USB drives are plenty fast enough these days but the external nature of the ports is a bit of a pain. Large amounts of storage are also where the price inflation really is (since you can't just add cheap storage it has to be the blazing high end storage + markup) so this would help the most.
RAM I'd rather have soldered in these days on a laptop. Performance is higher, power usage is lower, and you can order it with enough that you don't need to wait for new modules to be invented to upgrade and have enough RAM 5 years from now. It does mean you can't take any old M1 from ebay and upgrade it, which is definitely a downside, but for a laptop I think the benefits outweigh the cons. One thing I've noticed is there are always people that order them maxed out and move on when the next generation comes so high RAM options are still available used. One thing that is lost is, again, you can't choose to cheap out and get bottom bin speed memory. You have to pay for 8 channel 6400 MHz RAM in your laptop. Not necessarily a problem with the manufacturing method, it's possible to solder cheap RAM, rather just an Appleism. On the other hand... you do always know what you're getting if you buy one, which could be a plus.
Anyway, all of this is to say that if you want a budget laptop which min-maxes its price/performance ratio at a lower cost, then the Apple laptops are definitely not the right pick. The reason for this is not so much how they are manufactured as where Apple targets the platform. The best price/performance optimization is probably a Chromebook with a monoboard and soldered RAM.
> The main problem with devices like Apple's Mx series laptops is that everything works fine...until it doesn't. This device is non-repairable and […]
The sad unspoken truth is that most Apple users are just rich enough to replace the whole thing when it breaks, and don't really care much about the environmental issues of Apple hardware.
I doubt you can find any data to back that claim up. My first macbook was in daily use for about 8 years. My current macbook is only four years old so I'm not expecting to replace it any time soon. Colleagues seem to have the same usage pattern. Apple hardware lasts a long time.
edit: no hard data after a quick google but every article on the first two pages estimates 5-8 years.
Most of my Apple laptops I have given to other people, who have used them for years. Of the three M laptops I have (including the first M1 Air), none has any issues so far. If one breaks, I would have it repaired at Apple. My personal main laptop has AppleCare to make that less costly, in case something breaks.
I often recommend Apple refurbished to friends who are strapped for cash, the argument being that it’ll last longer and be cheaper over the long run. So far haven’t really run into an issue with this. Call it the Sam Vimes theory of computer upgrades.
I'm a bit confused by the "still". How often would you expect to upgrade hardware? Keeping a computer for three years is completely normal for most people.
Thinkpad X220 checking in, still rocking my Sandy Bridge and still enjoying a real keyboard. It's about 12 now. Other laptops have come and gone.
I couldn't live with my macbook m1. I hated the keyboard and the desktop - why is there no delete key, pray? UTM crashed a lot. Asahi was lovely but I just didn't like touching the hardware.
I got my M1 Max MBP 14" at the end of 2021 (so almost 2 years ago now) and my previous machine was an X220. The main reasons I switched were:
- I was a consultant/contractor, and sometimes you have to run software you really don't want to (Citrix, Teams). You can kind of do this on Linux w/ emulation, but on an older machine it's painful.
- Can't drive a 4k monitor at 60fps
- 1366x768 is hard to webdev on
I also had a multi-machine setup so like, a Chromebook for being spied on, a Windows machine for gaming, and an air-gapped Windows machine for music production. My partner and I had decided to move abroad and it felt like a little much to bring 4 laptops over, so I reluctantly consolidated. In truth the only thing I really gave up was acquiescing to being spied on, but at this point that's more because I haven't set up a pi-hole (etc.) than anything.
I do miss (Arch) Linux, and obviously the X220 input system is unparalleled, but I'll say that I'm pretty accustomed to the MBP input system now (I learned Fn-Delete is actual delete, which actually improved my life significantly). I probably lost ~30WPM (I'm around 100-150WPM on a good keyboard) and it's definitely a worse ergonomic experience to have to leave home row to use the trackpad (you're telling me you want me to stretch my ulnar nerve even more?).
There are other niceties. My partner can airdrop me things. I can load truly ridiculous Excel spreadsheets. I can move my music production onto it. I can stream both an Snes9x emulator window and a webcam grab through OBS to game w/ my brother in the states.
Glossy screens on laptops don't make any sense to me though. The argument is you get better color reproduction, but there are great matte screens that do that (I have a 32" matte display with 100% AdobeRGB that proves it), and besides that, I bet if the difference matters to you you've got a high performance glossy screen on your desk in an office with controlled lighting. Laptops go places! Sometimes they're sunny! Matte screens please!
I upgrade my MacBook once per decade. I changed the battery, the SSD and the Bluetooth card about 6 months ago (8 years into owning my second MacBook). I need to change the screen (a 1 cm black strip on the left because of a shock) and the speakers are broken. Does anyone have a tip for replacing the screen cheaply? I spotted some going for $30 on AliExpress; it looked like the seller had a rather large stock.
Lower-end consumer laptops (think: Acer, Asus, ...) tend to fall apart after a year; that's what I was starting out with in university, being rather poor.
Now it's more like: the current device has zero issues, but the newer available models look increasingly tempting for my inner child to play with.
Also, I owe my career, salary and life situation basically to open source (so I donate regularly) and... Apple computers that work flawlessly in many situations where other laptops would already be dead or would require me to tinker with stuff instead of doing billable things.
More like: if I lost it today, I would buy the same thing again, maybe the next-gen M2/3, but everything basically identical to this one. There is no single other device out there I'd prefer to buy instead. Nothing shiny to chase that's better somehow...
Keep in mind that people in their 30s and older today either grew up or lived their working years during a time when upgrading computers every year or two was the norm.
Windows 2000 was peak Windows for me. It only got worse afterwards, one way or another... but hot take: Windows 8 "Metro" was kinda awesome and refreshing, though most people hated it :-)
My org insists that the disks of all obsolete machines are destroyed and the rest of the hardware sent for disposal.
I have a few systems that could be wiped and given away for charity but that’s not allowed.
This seems incredibly wasteful to me.
I think swift had another post about it a bit more recently, but I cannot find it now.
If you encrypt the disk at the OS level, you only need to throw away the keys, i.e., reformat the drive. There's no need to physically erase anything, as what is stored is just noise once the keys are gone.
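A minimal sketch of that crypto-erase idea in Rust, assuming roughly the aes-gcm crate's 0.10-era API (the key here stands in for a volume's data-encryption key, as used by FileVault, LUKS, etc.): once the key is gone, the bytes left on disk can't be decrypted or even authenticated, so discarding the key is the erase.

```rust
use aes_gcm::{
    aead::{Aead, AeadCore, KeyInit, OsRng},
    Aes256Gcm,
};

fn main() {
    // Stand-in for the volume's data-encryption key.
    let key = Aes256Gcm::generate_key(OsRng);
    let cipher = Aes256Gcm::new(&key);
    let nonce = Aes256Gcm::generate_nonce(&mut OsRng);

    // "What is stored" on disk is only ever this ciphertext.
    let on_disk = cipher
        .encrypt(&nonce, b"user data".as_ref())
        .expect("encryption failed");

    // Throwing away the key is the erase: any other key fails to decrypt or
    // authenticate, so the stored bytes are indistinguishable from noise.
    drop(cipher);
    let other = Aes256Gcm::new(&Aes256Gcm::generate_key(OsRng));
    assert!(other.decrypt(&nonce, on_disk.as_ref()).is_err());
}
```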
There’s a temptation that is removed by destroying or selling to a liquidator - and it removes any chance of apparent favoritism: Steve got to take home that twelve CPU rack mount and I just got an old mouse.
But even more common in large companies is that it’s all on leases and goes back to the leasing company to be liquidated on eBay; you can always find miles of three or five year old gear there.
Was wondering the same thing - going through a tech audit at the moment for ISO and that certainly wouldn’t fly, unless we were able to prove that we effectively treated it like we owned it.
I've worked at companies without strict audits, and previously at bigger corps that had audits, where we established parallel networks for developers. Either way, BYOD was possible after all.
Also, I insist on not having security based on VPN/VPC only, but on treating every service as if it has (and it often does have) a direct internet connection, so devs are _forced_ to think about security from day 1.
2020 M1 Air here, still running strong and will soon hit 3 years (bought the day it launched).
So strong that despite finding the 15-inch Air appealing just in terms of screen real estate, this current laptop doesn't feel dated. Especially since work provided me with an Intel MacBook Pro with a great spec that just doesn't compare, with the exception of external monitor count.
Out of curiosity, is the poor posture which comes with using a laptop (mainly over extended periods of time) something you actively think about?
This is the main reason why although I love my MacBook M1 Air keyboard and screen, I still use an external monitor and keyboard for the majority of my work.
Honestly don't think too much about it... probably I should care more. Just tend to change places regularly and run/swim daily.
Also, I keep quite a large font to be able to sit further away from the screen, and having a small view into the codebase forces me to know it better. (VS Code fullscreen of course, with clock/battery in the status line and a hidden dock.)
I like UTM too. I must be doing something wrong though. I created both x86 Ubuntu and Debian 10 VMs. They are usable, but not quick. I have been blown away by how fast Rosetta 2 x86 programs run on my M1. The Arm embedded GCC toolchain compiles code 4x faster than the native x86 MBP from a year or so ago. But the UTM VMs run about 3x slower.
The article mentions that they are using virtualisation (Arm Linux and Windows on Arm, on macOS and the M1), not emulation (x86 Ubuntu and Debian emulated on Arm on macOS and the M1).
Emulation is slow sadly. Virtualisation is approaching native speed.
“Both Ubuntu and Windows run at native speed since they are virtualized instead of emulated.”
You can't run x86 Linux quickly on macOS. What you can do is to run ARM Linux quickly on macOS and run x86 programs inside of that Linux with Rosetta quickly.
It is quicker because Rosetta is a proprietary Apple technology which apparently works superbly, but only supports userspace code. Emulating x86 with QEMU does not work as fast, unfortunately. I'm not qualified enough to judge whether it's a QEMU shortcoming or whether it's just not possible to emulate an entire VM as efficiently, but the practical result today is that you want to use Rosetta.
Yes. If you run x86 Linux in a VM on an M1 you are emulating/translating every single instruction; with a native OS you are only translating the application.
Interesting, this results in a fully functional desktop version?
I have an M2 and the hardware is great, but the OS that came with it not so much. If I could replace the OS with Ubuntu, that would be great. (Windows would also be great, but I have no experience installing it.)
I almost always use my MacBook Pro in clamshell mode attached to a Thunderbolt dock for external displays. Right now Asahi Linux does not support Thunderbolt, however it is a work in progress.
Does UTM work well with external displays? Preferably in full screen?
There's both Apple's Hypervisor framework (a lower-level framework for hypervisors to use to avoid needing a kernel extension), and Apple's Virtualization framework (which is a higher-level framework that lets you run Linux and macOS VMs). QEMU supports virtualization on macOS with the HVF accelerator, which uses the Hypervisor framework. UTM uses QEMU, supporting virtualisation through QEMU with HVF (as well as emulation of other architectures with QEMU). UTM also has support for using the Virtualization framework. But either way, with an ARM64 image, you'll be getting virtualization (unless you disable HVF when using QEMU).
I recently bought a M1 Air for $750 (25% less than regular retail) from amazon as a vacation laptop. I was expecting it to have quite a bit of performance limitation but the machine is actually fantastic. I think it may be one of the best values in a computer I have experienced in quite a while.
This thing is so accessible in its affordability and performance that I think I am kinda over obsessing over whether an application or service is Mac-only.
It has a price/power ratio that's off the charts, mainly because Apple wanted to show off what Apple Silicon can do while having to stay at $999 for the Air.
I love mine.
The M1 is great, but I really dislike the limitation to 2 external displays unless you get the Ultra-model CPU.
I've just gotten really used to my portrait-landscape-landscape setup! And of course I'm an outlier, so corporate IT won't buy me an Ultra CPU version.
I guess it's to force people to pay more, but it feels like such an arbitrary decision. Sure, restrict it to 2x6K displays, but I'm using 3x4K displays that the Intel MacBook Pro handles just fine.
Also, it's a real hassle trying to explain to angry developers why their ancient 4-year-old x86_64 Docker containers won't work natively. (The containers crash under Rosetta emulation.)
Compiling Emacs in 5 minutes vs 10 on Intel is really nice though!
I’ve seen the longstanding understood technical explanation for this limitation. I can’t recall what it is now, but it seemed more than plausible at the time. Of course, ‘Apple bad’ is always going to be more viral. And, of course, you can take any hardware component and “just add…” it to max spec.
The 15” MBA not getting a Pro variant like Mac mini is a bummer. I hope they do that at some point because I agree, a 1-monitor limit makes it a toy.
The base M1/M2 only getting 1 display does make sense from Apple's perspective though. The M1 is fundamentally a tablet chip, just a fast enough one that it crosses the line to ultraportable laptop as well (and it's quite fast for a tablet chip as a result). Tablets don't need 2-3 external displays. Nor does the average user of a MacBook Air. Remember this is a product that's essentially coming from a netbook heritage; 1 display is fine in that segment.
What doesn’t make sense is not offering a M2 Pro option especially on the 15” Air which could obviously handle it. Because other people do need it, and the MacBook Pro isn’t the same product, it’s got fans and a different display and chassis etc. There is a segment who is just using the MBA as a fanless ultrabook and the Pro cpu is necessary to support the needs of this segment. I really hope 15” MBA gets an M3 Pro option at some point.
And please for the love of god put the 13” touchbar model out of its misery. It looks like it’s getting refreshed again with the M3 line. The new chassis on the 14” and the 16” is just lightyears better.
> Compiling Emacs in 5 minutes vs 10 on Intel is really nice though!
I know the thread is one day old but in case you read this comment... That's really weird though: on my 7700X the latest version of Emacs compiles in 1m36s (and 1m39s, only 3s more, when it's in "eco mode", thermally throttled at 85 C).
On the 3700X it was already only 2m25s.
I think there's something wrong if a M1 needs 5 minutes to compile Emacs where an AMD 7700X desktop only takes 1m36s.
EDIT: I think we misunderstood each other, sorry about that! You were referring to 1 external display, I was referring to 2 being the limit of the Pro Apple CPUs.
Yeah, the M1 can do one external + the built in (so two)
The M1 Pro can do two external, the M1 Max can do four (three at a higher resolution, one at a lower).
I have a Max with five displays total (including the internal). I suspect with some trickery I could get more than five (some external adapters allow two displays to appear as one larger/higher-resolution one). But 34,216,704 pixels should be enough for anyone ;)
A little-known fact is that the M1 does not support nested virtualization. As a college professor I really miss that during Docker classes. Supposedly the M2 does. Can anyone confirm Docker works under Windows on Parallels?
I have to pitch in and recommend OrbStack[1] for virtualization here. If you need WSL-style, terminal-only Linux VMs or want Docker without the bloat of Docker Desktop, it's the perfect tool. If you go with Alpine as your distro, the whole thing takes under 200MB of disk space, and less RAM and CPU than most Electron apps when idle. It can do most things that WSL can do: Finder integration, lightning-fast and extremely easy installation, VS Code integration, running Mac commands from Linux and vice versa, effortless file sharing and network connections between the host and guests, fully terminal-based management, out-of-the-box integration with things like "open" and "pbcopy", VPN/proxy detection and autoconfiguration; it can even use Rosetta instead of QEMU for running Intel binaries, which makes them faster than on quite a few Intel laptops.
I'm not affiliated with the company, just a very happy user.
UTM is great, but sort of useless for many of my most acute Windows needs: talking to some random peripheral or device that only has a closed-source x86 windows driver.
I have an M1 MBP and an Asus Zephyrus G14 of a similar vintage, and I have to say that day-to-day, stock, the M1 MBP gets better battery life at idle and is less prone to getting into a state in which its fans need to spin up. I'd say they can both get over 10 hours of battery life easily, which gets to the point where who cares anymore, but it's easier to get the Asus to drop to only 5.
But, speed-wise, I haven't noticed a difference. They're both snappy. In fact the asus would be faster for all practical purposes of mine because:
* macOS has comically bad UX: it wishes for me to use the mouse far more often than it should, and fights me on keybindings.
* Programs like Blender and Stable Diffusion just aren't optimized for Apple. They're optimized for Nvidia. I can't say if the M1 GPU is actually good, because nobody seems to care to support it, which brings me to my final point:
It doesn't much matter whether the M1 is or isn't good as long as it's exclusive to Apple's walled garden. If I can't put it in my PC, it may as well not exist, both from a user's point of view and in terms of developer support.
The best outcome of the m-line of CPUs would be for hardware competitors to make another great ARM chip, that I can actually use elsewhere.
Weird, I've had the exact opposite experience with UTM. I need to get Ubuntu up and running with virtualization (emulation is too slow for GUI work), but the install simply hangs. If I forcibly reboot, the installed OS simply blackscreens and does nothing. I've spent a good few hours trying to get it to work (including turning off the display and trying to use the old tty terminal output), but eventually I gave up.
Have you tried updating UTM to the latest version? Many of these bugs have been fixed in newer versions. UTM doesn't auto update, so you have to check for updates yourself on their GitHub releases page: https://github.com/utmapp/UTM/releases
I don't know, but I reject the premise of the question (i.e. there is a consensus). But I'm no pollster.
Speaking for myself, to virtualize arm64 *nix on an M1/M2 Mac, UTM has worked whereas other alternatives didn't work for me (such as VirtualBox). I haven't put my hands on the glowing orb yet; I'm sure it will be mesmerizing and tell me to do things like watch Inception (inside a VM) ^ 4.
The Arm release is a "whoops, it got out before it was ready" moment, and the current status, apart from being buggy and slow, is that it only runs 32-bit x86 VMs on Arm.
Good point, definitely makes sense to use UTM if you need GUI apps. It's planned in the long term but in all honesty, I'm more focused on containers at the moment.
Also worth mentioning that OrbStack's philosophy is to be more like WSL (designed for integration first and foremost, not isolation) than UTM (full virtual machines, isolated from the host).
I only use UTM to run Chrome for testing, because when run natively with Selenium it steals the window focus all the time. I want the test to run while I'm coding and also be able to see and interact with it.
Edit: The project does not run on "docker machine" as per the graceful correction by OrbStack's developer below[2].
It's a proprietary container interface that - as far as I could tell - runs on the outdated and unsupported "docker machine"[1]. An alternative would be to run a more recent container runtime of your choosing in a VM through UTM which works beautifully.
Hey, dev here — not sure what you mean. OrbStack currently ships an official build of Docker Engine 24.0.5, which is the latest version as of writing. Previous versions used Alpine Linux's build.
I did intentionally delay the update from 23.x to 24.x by a few weeks while I waited for things to settle, because last time I updated from 22.x to 23.x shortly after release and some users were hit by bugs that hadn't been ironed out yet.
Edit: Noticed that you edited your comment to say that it uses "Docker Machine" instead. That's not the case. OrbStack uses a nested architecture where it runs Linux machines [1]. There's a special machine called "docker" that runs a copy of the Docker Engine, but that's not related to the old Docker Machine project.
My confusion must have come from mixed terminology used by people other than yourself on GitHub and in articles about your project that I skimmed through when looking into it.
The project might still not be for me regardless of its underpinnings, although I will say that it's immediately obvious that you pursue high standards and care a great deal about delivering a quality product.
I wish I could upgrade from my 13" mbp that's the last of the intel i5 series. My 16" M1 Max that I was issued at my last job was a dream, and now that I'm playing with SwiftUI and dealing with xcode, it's feeling slow and noisy. Unfortunately I was laid off, so no M1, and the ram upgrade to the 64gb or 96gb is insultingly expensive still. As nice as it is, it isn't a $4k boost in productivity.
M1 Max doesn’t get the HDMI 2.1 port (although you can coerce some cablematters adapters to do it), or the Wi-Fi 6E, and the M2 Max is generally faster especially in gpu (at the cost of higher power - perf/w doesn’t change much) but in general it’s not worth paying $2k more for.
(To be specific: the M1 Max has HDMI 2.1 (including things like HDMI Org VRR, afaik) but it's limited to 4K60. The M2 gets 4K120 and some other features. As always, examine each device for the feature support you care about, because HDMI 2.1 completely supplants the HDMI 2.0 standard and no HDMI 2.0 certs are being issued anymore, so "HDMI 2.1" devices can range from 4K120/8K support with 10-bit deep color down to 4K60 with 8-bit color and no VRR or any other features. Each individual feature is treated separately; an HDMI 2.1 receiver could support 4K120 but lack other optional features, and so on.)
And of course, this might not be the best time to be spending even $3k. But it doesn't have to be $5K for a fully loaded M2 or $4k for a decently loaded one. $3k gets you a lot of laptop too, just not the M2 right now.
Refurb is another option... you can do a 64GB/1TB with the M1 Max 24c GPU (vs 32c) for $2550 via refurb store if you can get the edu or mil/veteran discount. Personally for another $500 I think the 4TB and the nicer GPU would be worth it but again, it edges the cost down quite a bit.
Woot also has been carrying "manufacturer-refurbished" M1 Pro/Max for pretty good deals recently. IIRC the 16GB/512GB is $1550, and you can get 32GB/1TB for around $2100. And because it's manufacturer-recertified (Apple) it's eligible for applecare.
Good to know about the sales, though being in Canada might hamper me there. The refurb and edu discounts are pretty decent though, and I'd probably consider those if I really start pursuing an upgrade.
HDMI has always been a bit irrelevant for me, aside from using it with work-issued displays. I've always used displayport over thunderbolt, and don't know that HDMI offers any advantages. I'd prefer almost any other port over HDMI.
My issue is more that every time I spec out a new MBP, it genuinely feels like I'm being grifted, and it really turns me off more than in the previous 10+ years I've been buying Macs. $200 for a RAM upgrade was already too high, and hurt the wallet then, but now it's $500-$1500 CAD for RAM upgrades (inflated by the apparent, probably artificial, requirement to choose CPU upgrades that I'd never have chosen in the past and wouldn't likely get any value from).
If I choose the minimum CPU that allows 64GB, I then need to spend an additional $750 to add 8 GPU cores and go up to 96GB (not that I'd necessarily go that high or feel like I'd need to, but it's not like I can upgrade later, and it just stands to illustrate the point). $750 is my damn rent payment.
So before even going past the 512GB SSD or including AppleCare or taxes, the computer sits at $5100 CAD, if I were to max out the RAM, on the new 14".
> Both Ubuntu and Windows run at native speed since they are virtualized instead of emulated.
Could someone explain to me how that's possible? Because their website clearly says "On Intel Macs, x86/x64 operating system can be virtualized. In addition, lower performance emulation is available to run x86/x64 on Apple Silicon", and Windows on ARM64 is still in preview.
Meh, I have some old screwdrivers that still work too. I just don't get the hardware fetish. Just boot it up, run code, shut it down, and go enjoy summer and/or winter.
You were wrong. Ever hear people talking about AMD GPUs online?
It's crazy how normalized it is for certain fanboys to just call everyone who disagrees with them "brainless", "ruled by mindshare", "buying it for the blue bubble", etc. I see it all the time, even on HN; it is crazy how normalized the Android and AMD fans have made it to just insult a group of people to their faces. Even in a tightly moderated community!
But put out one article saying “yo M1 is a good laptop” and watch the “I thought we left fanboyism in the 90s!!!” replies flood in.