Ode to the M1 (fabiensanglard.net)
220 points by guiambros on Aug 13, 2023 | 282 comments



> Software "engineers"

> For every extra cycle a hardware engineer can squeeze out of silicon, you will find a programmer adding one hundred cycles to their program[6]. In other terms, I fear that once devs figure out how powerful the M1 is, they will throw more "features" at it.

Can we please stop replicating the "growth" fallacy into software ecosystems? Haven't we learnt enough about how unsustainable and damaging this is?

I hope Electron/CEF die soon, and people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.


> I hope Electron/CEF die soon, and people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.

Web technologies are fine, but what we really need is some kind of lightweight browser which allows you to use HTML/CSS/JS, but with far lower memory usage. I found https://ultralig.ht/ which seems to be exactly what I am looking for, but the license is a major turn off for most paid services. It makes sense for smaller, indie projects to adopt it, but I haven't seen many "desktop apps" using this in the wild.


What I'd really like to see with CEF et al, is JS being dropped, in favor of directly controlling the DOM from the host language. Then we could, for example, write a Rust (or Kotlin, Zig, Haskell, etc) desktop application that simply directly manipulated the DOM, and had it rendered by an HTML+CSS layout engine. Folks could then write a React-like framework for that language (to help render & re-render the DOM in an elegant way).
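
Purely as a thought experiment, here's a minimal Rust sketch of what that could look like. To be clear, the `Dom` trait and `Node` type below are invented stand-ins for the kind of interface such an engine could expose to the host language; no such crate exists:

  // Hypothetical API: `Dom` and `Node` are made-up placeholders, not a real crate.
  pub struct Node;

  impl Node {
      pub fn set_text(&mut self, _text: &str) {}
      pub fn append_child(&mut self, _child: Node) {}
      pub fn on_click(&mut self, _handler: impl Fn() + 'static) {}
  }

  pub trait Dom {
      fn create_element(&mut self, tag: &str) -> Node;
      fn body(&mut self) -> Node;
  }

  // Application code: the host language builds the tree directly and the
  // HTML+CSS layout engine renders it -- no JavaScript anywhere.
  pub fn build_ui(dom: &mut impl Dom) {
      let mut button = dom.create_element("button");
      button.set_text("Click me");
      button.on_click(|| println!("clicked"));
      dom.body().append_child(button);
  }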

Ultralight (https://ultralig.ht/) looks pretty cool. I think another possible option is Servo (https://github.com/servo/servo) – it was abandoned by Mozilla along with Rust during their layoffs a while back (but the project still seems to have a decent bit of activity). It would be great if some group of devs could revive the project, or a company could fund such a revival.

Eventually, we'll need to reflect on and explore whether HTML+CSS is really the best way to do layout, and perhaps consider porting the Android/iOS layout approach over to desktop. Maybe WPF/GTK/Qt/etc even got things right, and HTML+CSS isn't the best way to do layout.


We already know HTML/CSS are not the best way to do layout. We already have many desktop based UI frameworks to create cross platform desktop apps.

GTK, Qt, etc., which are also superior to the single-app, limited-screen-size-focused Android/iOS UI frameworks. More importantly, unlike iOS/Android, they're even cross-platform.

HTML/CSS/JS are successful because of the same reason electron exists. The success of the web means there’s a huge developer base that knows how to work with HTML/CSS/JS which was the original lingua franca of the web, and there are tons of libraries, frameworks, tools, components, etc available in the HTML/CSS/JS world that makes development easier and quicker.


> HTML/CSS/JS are successful because of the same reason electron exists. The success of the web means there’s a huge developer base that knows how to work with HTML/CSS/JS

The actual reason is a different one, IMHO. The implementation of web standards enabled cross-platform engines that provide every UI customization one could possibly need for a SaaS product. This approach brought many benefits for businesses and developers alike, such as:

- User data stays in the walled garden

- Users most probably already have a web-browser installed

- Responsibility to keep the runtime (web-browser) up-to-date is on the user side

- Automated updates (POV user)

- No installation instructions

I think it was as much a business decision as it was a decision by developers to bet on HTML/CSS/JS instead of GTK, Qt, etc.


Flutter does something like this, rendering on a canvas directly and eschewing HTML, CSS and JS altogether. It works pretty well. With WASM in the future though I suspect many other languages will adopt a similar paradigm.


> directly controlling the DOM from the host language

I think your path to this future is getting a DOM API specced for WASM (as in, not going via a Javascript bridge). WASI might help.

If and when that happens, then you can repurpose that same API without necessarily needing to compile to WASM. The biggest hurdle is that the current DOM API is a Javascript API, not an API describing e.g. memory layouts.
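
For contrast, here's roughly what DOM access from Rust compiled to WASM looks like today with wasm-bindgen/web-sys, where every call still goes through the JS bridge (this sketch assumes a wasm32 target and the relevant web-sys cargo features, e.g. Window, Document, Element, HtmlElement, Node, enabled):

  // Rust -> WASM, driving the DOM via web-sys; each call below is marshalled
  // across the JavaScript bridge rather than hitting the engine directly.
  use wasm_bindgen::prelude::*;

  #[wasm_bindgen(start)]
  pub fn run() -> Result<(), JsValue> {
      let window = web_sys::window().expect("no global window");
      let document = window.document().expect("window has no document");
      let body = document.body().expect("document has no body");

      let p = document.create_element("p")?;
      p.set_text_content(Some("Hello from Rust"));
      body.append_child(&p)?;
      Ok(())
  }

A DOM API specced at the WASM/WASI level would let bindings like these talk to the layout engine directly instead of bouncing through JS glue.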


Tauri is somewhat similar to this, it's much lighter weight than Electron.
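
For reference, the Rust side of a Tauri app is tiny -- this is roughly the Tauri 1.x starter template (a sketch from memory, so check the docs): the UI stays HTML/CSS/JS in the system webview, and functions marked as commands are callable from it.

  // Roughly the Tauri 1.x starter: the frontend runs in the OS webview and
  // can invoke Rust functions that are registered as commands.
  #[tauri::command]
  fn greet(name: &str) -> String {
      format!("Hello, {}! Greetings from the Rust backend.", name)
  }

  fn main() {
      tauri::Builder::default()
          .invoke_handler(tauri::generate_handler![greet])
          .run(tauri::generate_context!())
          .expect("error while running tauri application");
  }

Because it reuses the webview the OS already ships instead of bundling Chromium, the download and disk footprint is a fraction of an equivalent Electron app.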


The majority of users who own an M1 are never going to push it to 100% utilisation.

So whether their productivity app is written in Electron or hand-crafted assembler isn't going to make much of a difference to the end user experience.

Electron and similar apps serve a legitimate purpose in the marketplace which is to allow developers to deliver a cross-platform app for the same cost as a single platform one.


A machine that is 99% idle isn't running at 1% utilization all the time. It's idling 99% of the time and running flat out 1% of the time. The user pushes at least one core to the limit with every action they take, even if only for microseconds.

This is why performance matters even with a machine that is mostly idle. A faster system will respond faster to events. It will have lower latency when communicating over a network. It will handle user inputs faster. Users will complain that a program takes a second to respond even if the load average is 0.01.

(I know CPUs enter lower-performance states when lightly loaded. They can go from idle to their highest performance state in much less time than it takes for the user to notice latency.)


> whether their productivity app is written in Electron or hand-crafted assembler isn't going to make much of a difference to the end user experience

Subjectively, every Electron app I’ve used including the local beloved VSCode has been a painful UX. I can always tell. Nothing makes a fast computer feel slow quite like them, it’s amazing. I would settle for React Native at this point.


From the fine article: "Remember when Photoshop opened in 1s on SSD?"

I do remember. And it now takes A LOT longer on insanely fast CPUs and SSDs. And I hate it every time I have to wait for it.


So it’s not Electron that’s slow, but modern software?


Electron is an example of modern software that’s slow. And it’s common and exceptionally slow.

I think most software is designed to load once and never be quit; you’re probably just supposed to leave Photoshop running in the background and let it swap to disk if needed.

But I have a hard time running like that, I quit programs when I’m done using them.


Photoshop isn't Electron and yet it is slow. So is it Electron that sucks, or inefficient software sucks?


Why not both?

Inefficiency sucks, and electron enables that on a large scale.

I have to assume that photoshop could start faster, but trade offs would be made.


I have a liquid-cooled 14-core processor that runs at 5.7GHz. 96GB of DDR5. Top of the line Samsung M.2 drive.

Photoshop still takes a good 30 seconds+ to open.


“640k ought to be enough for anyone.”


Decent developers can do that in C++


Decenter developers can do that with a magnetized needle and a steady hand


I wonder if there is a name for the idea that the more you earn the lazier you get.

Sysadmins used to know a lot but earned a pittance compared to “devops” or “SRE”; then those more expensive folks outsourced the majority of their hard work to cloud vendors who charge 5-11x the cost for compute.

Developers earn 5-10x more than 15 years ago, yet continue to choose solutions for their own convenience and time savings, claiming that it's better for the company if they have to work less.

Surely at some point it's just your job to consider the entire ecosystem and your contribution to it.

Otherwise why are you worth all that money?


Because they -are- providing value, and because dev resources are limited. If you're a dev, do you think your company would want you to spend 2 weeks on building a new feature, or on "considering the entire ecosystem"?

And more than that - if you have a team that only/mostly has web experience, they're going to use Electron, and... what kind of company would want them to stop and go learn native app development, or hire extra people just to get a native app?

Note: I fully agree with you, I do feel that as a profession we really have descended into a weird "ship the MVP, ship new features non-stop, forget quality" reality, and I dislike it. I'd love it if everything was resource-thrifty native apps, and dev teams could stop chasing new and shiny things and work on improving their existing apps until they were shiny polished diamonds. I'd love to point to an app and say "see? this is 20 years of development, look at how nice it is, how perfectly thought out everything is, how everything just works flawlessly".

But the fact that that's not happening is not because devs are lazy, or not worth their money. It just doesn't make business sense.


> I'd love to point to an app and say "see? this is 20 years of development, look at how nice it is, how perfectly thought out everything is, how everything just works flawlessly".

https://en.wikipedia.org/wiki/Directory_Opus


I don’t think developers earn 5-10x more than 15 years ago. Assuming you’re using something like $250k as a typical salary (for dev jobs at US tech-hubs) I guarantee you the typical equivalent in 2008 was not $25-50k.


Average income in tech-hub area in 2007 was: $85,355 for "Software Developers"[0]

Salary for developers that we discuss on HN seems to be in the region of $250k base, but that's not factoring in TC- which seems to be the bulk of income for the last 5 years. It's not unheard of around here to hear people talking about pulling in half-million if they are ex-FAANG. Which is nearly everybody these days.

[0]: source is here: https://www.datamation.com/careers/developer-salary-levels-2... based on https://e-janco.com


I don’t think it’s nearly everybody these days, but that it may seem like that if you happen to be in FAANG-heavy local bubble.


Also, it's easier in C++ and it consumes 99% fewer resources.


Real hackers use C-x M-c M-butterfly

https://xkcd.com/378/


Sure, if they're calling into an Electron front-end. Otherwise, each front-end is different so no amount of C++ and dev experience is going to take your Android UI calls and transform them into iOS UI calls. So, you'll have to build your own wrappers over each native UI framework to interact with, costing more money.


Ah, you live in a parallel universe that is the same as ours, except that Qt and the other frameworks that do exactly this without wasting a ton of resources don't exist?


Decent developers don’t leave comments like this about other developers.


On the contrary, decent developers should shame shoddy ones every chance they get. Maybe that would keep them from making horrible software that millions of people have to suffer every day.


[flagged]


Deflect much?


Do you have a point to make or are you here just to be angry because you do write code for electron?


Keep hoping, because it won’t happen. Practically no consumers care except for other software engineers complaining loudly on HN, and meanwhile companies save millions by not handcrafting their CRUD apps in assembly.


The problem is having to target 5 operating systems. Extremely complicated web app runs on Linux, Windows, Mac, Android, and iOS. Carefully-crafted Native UI widgets only target one platform at a time.

If you want to get rid of Electron, you personally have to volunteer to be responsible for 5x the work of anyone else, and accept bootcamp-graduate pay. Do you see why it's not happening?


Arguably the success of macOS has caused the expansion of Electron, because developers are using Macs and want to keep using Macs even when building tools for Windows.


Someone (maybe Apple) ought to make a Swift -> Windows/Linux cross compiler...


Exactly this.

And HTML/CSS may have its quirks but I'll take it every day over fumbling with GTK or whatever


I just use Flutter instead, works great for all 5 OS.


> Practically no consumers care except for other software engineers complaining loudly on HN, and meanwhile companies save millions by not handcrafting their CRUD apps in assembly.

And you know that because you've analyzed the whole market and know the opinion of the users of every piece of web software? Definitely not because you live in a bubble of other software engineers, so you're not exposed to the complaints of common folk?


No end consumers I know, or even acquaintances of other people I know, know or care about Electron. You might say that I'm in a bubble, but at that point it's just the No True Scotsman fallacy.


Literally all of my acquaintances at my previous company complained about Teams being slow and buggy. I also see it constantly being mentioned online.


You can make slow & buggy software anywhere, with any stack. Teams themselves shipped a whole release focused around sucking significantly less & were getting much faster.

Discord has gotten pretty solid, and VS Code is fast. The hate against Electron feels so selective and antagonistic: it picks its targets, never acknowledges any value, never recognizes any points for the other side. It is just incredibly lopsidedly biased, over what doesn't feel real to so many.

I was hoping Tauri would be the one, by using a shared library model. That would drop memory consumption down significantly. But they use WebKit on Linux, and I'm sorry, but I want a much better base that supports more modern capabilities.


They don't know what it's called, but they still hate it when they're forced to use slow applications.


Consumers care but have no choice.

Like consumers that want to buy a phone that fits in the hand… there are none at a reasonable price.


The demand for small phones is completely overblown online.


Because people who don't go online need BIGGER phones? For what?


Objectively he's right, and there's a reason the iPhone mini was discontinued. The demand for mini phones is niche but very loud online. It isn't economical to support a mainstream product for the handful of people who are actually willing to pay for that specific niche feature.

Kinda like headphone ports and microSD slots on android phones. Are there models that do this? Yes! Is anyone willing to pay more for it? No! But they are very very loud online about it.

There do exist smaller android phones as well; just like with a headphone port, nothing is stopping you from buying one instead of complaining on the internet!


People want a small phone, but perhaps not if it costs like 6 phones? How is this hard to understand?

> Is anyone willing to pay more for it? No!

Is anyone willing to pick a phone over another (despite what paulmd's opinion might be)? YES.

> There do exist smaller android phones as well

No they don't. There is NOTHING on the market currently that is sized and priced like a samsung S5 mini today.


> People want a small phone, but perhaps not if it costs like 6 phones? How is this hard to understand?

iphone mini wasn't priced like 6 phones, and it still didn't sell well. again, you can't seem to grapple with the reality here that what you want is a very niche thing and it doesn't sell well as a mass-market product regardless of price. that is, after all, why apple stopped making that size once again.

the iphone SE is about as small as the market wants. Below that is the domain of mini phones - which do exist, but mostly from smaller android vendors.

> No they don't. There is NOTHING on the market currently that is sized and priced like a samsung S5 mini today.

s5 mini: 131.1 x 64.8 x 9.1 mm

cubot pocket 3: 133.2 x 55 x 12.3mm

unihertz jelly pro: 92.4 × 43 × 13 mm

just with literally a few minutes of searching.


> iphone mini wasn't priced like 6 phones

Still costs 4x as much as regular Android phones.

> you can't seem to grapple with the reality

You can't seem to grapple with the reality that 839€ for a phone is completely insane, and not everybody is a rich spoiled person like you.

> unihertz jelly pro: 92.4 × 43 × 13 mm

Funny… I am returning a Unihertz because of software issues (default Google apps just crashing)… sure it's small… but it was implied that the stuff must actually work.

> just with literally a few minutes of searching.

Too bad that being rich and spoiled didn't enrich your brain uh?

I hope it's ok for you that unlike you, I knew what I was talking about.


How do you know they care but have no choice? Are average people besides devs talking about how slow their web apps are?

For your small phone example, even Apple is cutting production of their smaller phones because not enough people are buying them. The entire premise really is overblown online.


> How do you know they care but have no choice?

Have you ever heard anyone say "This computer is too fast" or "this computer doesn't burn enough electricity"?

> For your small phone example, even Apple is cutting production of their smaller phones because not enough people are buying them.

Yeah I did mention "reasonably priced", which apple is not.


The arrogance to believe that their software is so well written already that the only way to improve its performance is by using handcrafted assembly.

Just changing their high level data structures alone would yield 10x improvements, easily.


The naivete to believe that consumers driving company sales even realize that performance is something they need to care about.


If they knew the main reason they have to shell out hundreds of dollars to upgrade their computers and phones is because of lazy/shoddy developers they would care.


For that you need a better alternative. I’m not aware of anything.


Godot?

Targets Windows, UWP, MacOS, Linux, BSD, Android, iOS, and Web: https://docs.godotengine.org/en/stable/about/faq.html

Nothing says you have to use it to make a game. I recall seeing articles here on HN about using it as a cross-platform app development framework. GDScript may be less appealing than JavaScript, however.


Godot is wonderful. But GDScript is a horrible, terrible, ill conceived idea.

So how's the QuakeC development ecosystem these days? How good is ChatGPT at generating QuakeC?

Who's still programming Unity3D in Managed JScript or Boo?


I wonder if Bevy might be a good alternative, it's in Rust which isn't a proprietary language.


How’s accessibility and access to native apis?


Flutter works pretty well for apps like these.



Electron is quite efficient. You get a modern GPU composited UI, and JavaScript is by far the most highly optimized scripting language out there.

The problem with Electron is RAM, disk space, and download size, which is because every app needs to ship its own copy of Blink. This is a solvable problem, but nobody is incentivized to solve it.


A similarly bad problem is that Electron apps by default do not adhere to operating system conventions and APIs, and often also do not use the features which make the OS good.

For example all Electron apps I use have their own implementation of a spell checker rather than hooking into the system one. Normally in Mac OS if I add a word to the system dictionary all apps will learn that word, but in each Electron app I will have to do it again, and again, ...

Many also implement their own UI widgets like basic buttons and input fields whose UX is not at all in line with native apps.


Most consumers don't care about native controls. In fact for a company it's more important to have a unified UX across all devices so that the user can pick up easily from one to the other, ie Macbook to iPhone. Slack is a good example of this.


Strong disagree with your assumption. I __hate__ unified controls across devices. When I am on an iOS device I want iOS controls, not the same as I have on Windows, and on Windows I want Windows controls, not something kind of like Mac OS. And I am 100% certain pretty much every consumer would actually agree if they were given the opportunity and knowledge about the situation. The "unified" look is just something developers like me lie to ourselves about, since it is easier to do it with Electron than to spend time and energy on a properly good user experience.


Well I can tell you, having done user interviews about precisely this topic for our teams, that users really don't care about device-specific controls; we've found that it's really devs that care a lot about it, and even then it's been specifically Apple users. Windows or Linux users didn't care as much, although Linux users cared a bit more than Windows users. Most of our interviewees said they either don't care or prefer unified controls per app.


I guess you limit your context to your own application and not the entire computing device and therefore you get flawed results.


No, we never asked them about our own application specifically, it was about what they preferred generally across all of their devices. So, no flaws here. You may just be projecting your own bias for OS specific controls onto the general populace.


I highly doubt your research was so perfect, since it seems it wasn't even peer reviewed and published, so that conveniently nobody can point out methodology flaws.


I never said it was some peer reviewed study, just that we did it for our team to figure out how to lay out the UX, that doesn't mean it's not statistically and methodologically valid. It seems you are also trying to project your biases on top of my comment so as to question any sort of research that we conducted.


Yeah because it doesn't really match with the state of the art.


What is the state of the art then? Where are your peer reviewed studies on this topic?


How are controls unified when on a device you have a mouse and 400cm² of space and on another you have a finger and 60cm² of space?


Notice that the Slack app has a similar layout on both mobile and desktop, with a list of teams on the left, followed by channels, followed by the channel content. There are no device UI framework specific elements anywhere, it doesn't use iOS or Android style buttons nor is it designed with Fluent Design on Windows or HIG on macOS.


Agreed that this is a good example where designing for everyone ends up with a worse product for everyone. Same thing goes for Teams, GitHub Desktop, and Postman.

It's horrible having to constantly context switch _within the operating system I am using right now within this very moment_. Context switching when switching devices is not a problem at all.


I wouldn't say worse, I appreciate not having to context switch between different devices.


The RAM usage comes with cache misses. It's NOT efficient. Just because you have a monster machine doesn't mean it's efficient.


If somebody just made Electron into a HTML/CSS/JavaScript engine capable of opening just the HTML/etc parts from, say, an HTTP server, and then shipped it to all users worldwide...


Heh

I wonder if anyone did that comparison, an Electron app vs Chrome + a local copy of node+express whatever running the backend


Why does it have to be node and express? You can compile lightweight Go, C++, Rust binaries.
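
To make "lightweight" concrete, here's a toy dependency-free Rust backend (standard library only, hypothetical port and page) that the system browser could load from localhost instead of the app shipping its own Chromium:

  // Minimal local "backend": serves a single HTML page over 127.0.0.1 using
  // only std, so the whole binary stays small.
  use std::io::{Read, Write};
  use std::net::TcpListener;

  fn main() -> std::io::Result<()> {
      let listener = TcpListener::bind("127.0.0.1:8080")?;
      println!("open http://127.0.0.1:8080/ in your browser");
      for stream in listener.incoming() {
          let mut stream = stream?;
          let mut buf = [0u8; 4096];
          let _ = stream.read(&mut buf)?; // read and ignore the request
          let body = "<!doctype html><h1>Hello from a small native binary</h1>";
          let response = format!(
              "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\nContent-Length: {}\r\n\r\n{}",
              body.len(),
              body
          );
          stream.write_all(response.as_bytes())?;
      }
      Ok(())
  }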


The RAM usage is from JavaScript heaps, not from multiple copies of the binaries.

(There basically aren't memory metrics that can show the effect of binary sizes on macOS. It'd show up as disk reads/page ins/general slowdowns instead, but SSDs are so fast that it's very hard to see one.)


Multiple copies of the same binary can't share a memory page for the code. They do take more RAM.


And there are no metrics you can look at to see the impact of this, so it's not contributing to any of the metrics you are seeing, as I said.


It has been solved decades ago: it’s called a browser.


> people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.

> Wasn't OS X's whole new paradigm a switch to high-level languages and frameworks, with compositing window management to display super fancily rendered hello worlds?

Decrying cycle burning for the sake of high-level abstractions feels like a complete rebuttal of everything the OSX/macOS ecosystem stood for.


> Wasn't OS X's whole new paradigm a switch to high-level languages and frameworks, with compositing window management to display super fancily rendered hello worlds?

No, not really. You're making it sound like Classic MacOS applications were written in assembly, when it was mostly C++. Switching to Objective-C was more of a lateral move, not a clear step forward in time or level of abstraction.

The Mac OS X graphics stack was a pretty big change from Classic MacOS, but it wasn't that much of a change from NeXTSTEP, which worked with 12MB of RAM + 1.5MB VRAM. NeXTSTEP just wasn't as eye-catchingly colorful. Early OS X did pioneer gratuitous use of some effects like translucency with blurs in desktop GUIs, but since this was the time period where GPUs were becoming mainstream, these effects weren't egregiously resource-intensive.

And considering that OS X was adapted with relative ease to power a smartphone with 128MB of RAM (with the graphical effects turned up to 11), it definitely wasn't fundamentally bloated in the way that today's Electron apps are.


Many Obj-C libraries were a wrapping layer on top of the Carbon APIs, and in terms of abstraction Obj-C is also a level above C++. Passing everything as messages is an additional layer of indirection.

NeXTSTEP itself was a big change compared to X or Windows's GUI stack. NeXT machines were beefy and built for heavy workloads, so the user experience wasn't problematic, but it was still more resource use than the standard stacks of the time. I don't see it as a bad thing, but we should acknowledge that resource economy wasn't a goal of either NeXT or Apple at that time.

> 12MB of RAM + 1.5MB VRAM

NeXT launched in 1988, basically around the time when "640k should be enough for everyone" was a meme. Even in 1996, 12MB of RAM was nothing to sneeze at. So, sure, those are not enormous numbers, but they're far from the lower end of what was sold to consumers at the time.

> smartphone with 128MB of RAM

We should take a minute to note that 128MB of RAM was not small for a phone at the time.

Otherwise, sure, electron apps use a lot more than that, but the point was always to pay for powerful hardware and use it at full potential to run fancy apps. People loathe electron apps' bloat, but ease of development has also always been one of the core values. Better to have apps that aren't fully optimized than no apps at all.


Electron’s bloat and power consumption wouldn’t bother me so much if the performance was actually good on a fast machine. But the experience is awful compared to a native app.


My guess is that Slack for instance would never make a full blown native client for macos. If it wasn't electron, they'd probably have gone with Mono or another cross platform stack, and we'd see performance lag and memory bloat compared to a handcrafted native app.

I say that looking at macos' pretty big market share compared to what it was a decade or two ago.

And we see tons of users choosing VSCode or IntelliJ's offering above Textmate, Codea, or even BBEdit, so the market for handcrafted delightful editors is also pretty slim. And that trend was there since the Eclipse days, so nothing new either.

All in all, I really think the choice comes down to electron/cross compiled apps or no apps, in many many cases.


I think Sublime Text does a good job of being both good and cross platform.

But yes I also remember the days when lots of software just didn’t exist for macOS and I was thankful for a webtech version.


OSX used a high-level framework that could, at one point, produce a very fancy hello world (or highly sophisticated application) on a 25-MHz 68030 with 8 MiB of RAM.

Cocoa and Electron are not the same.


>I hope Electron/CEF die soon, and people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.

Yes, so we can go back to writing a hundred lines of imperative code to render hello world.


To the CPU it always boils down to sequences of instructions operating on data, so the syntax sugar put on top has to be undone first, the abbreviations have to be unrolled, some way or another.

But more importantly it's really not about the lines of code you write once, or very few times in comparison -- it's about what the computer does every time the application starts up on any machine, or any time a user does something. Obviously, to an extent. You wouldn't write a thousand lines of code instead of 100 to make something that is already very fast 10% faster. But I certainly would write 10 lines instead of 1 if it meant the code is 10 times faster. It's just typing after all.


My point was that CPU cycles mean nothing now. Compared to programmer cycles they are effectively free. So stuff it into RAM and forget about it. Why not? Complexity is bad, and being able to hide it (effectively) is good. I shouldn't be thinking about pointer allocation when building a text input box for a UI.


4 in C. 11 in assembly. Not so bad, really.

https://i.imgur.com/r2WRnZk.png


I was about to make a joke about how many lines are required for an electron hello world but it was even more than I expected.

11 for the HTML document template

7 for the boilerplate package.json

3 more for the scripts addition to package.json

3 more for the “devdependencies”

20 for main.js

from: https://electronjs.org/docs/latest/tutorial/quick-start

A fairer assessment would be C++ with Qt to be honest though.


write a hello world application in gtk or qt and see how much assembly you get.


x86_64 assembly, gtk3, 21 lines / 19 instructions

build with e.g. gcc hw.s `pkg-config --libs gtk+-3.0`

  .globl main;main:
  push %rbx                         # save callee-saved register
  xor %esi, %esi                    # argv pointer = NULL
  xor %edi, %edi                    # argc pointer = NULL
  call gtk_init@PLT                 # gtk_init(NULL, NULL)
  lea .HW(%rip), %r8                # 5th arg: message format string
  xor %edx, %edx                    # 3rd arg: GTK_MESSAGE_INFO
  xor %esi, %esi                    # 2nd arg: flags = 0
  mov $2, %ecx                      # 4th arg: GTK_BUTTONS_CLOSE
  xor %edi, %edi                    # 1st arg: parent window = NULL
  xor %eax, %eax                    # varargs call: no vector registers used
  call gtk_message_dialog_new@PLT
  mov %rax, %rbx                    # save dialog pointer
  mov %rax, %rdi
  call gtk_dialog_run@PLT           # show the dialog, block until dismissed
  mov %rbx, %rdi
  call gtk_widget_destroy@PLT
  xor %eax, %eax                    # return 0
  pop %rbx
  ret
  .HW:.string "Hello World!"


Are you sure you couldn’t wrap that up in another library and just do everything in a single call?



let's move the goal posts to doing it in brainfuck while we're at it


What goalpost did I supposedly move?


It's super ironic that you've shown a screenshot from a web application.


Why, thank you!


Still running an M1 Air 13", now into my third job since, and refusing to bother with company-provided computers. Still feeling as great as 3 years ago… will only upgrade if the next gen is substantially better (read: not just % faster somewhere).

Can comfortably develop on my lap for a full day if needed, only MS Teams draws power like crazy, but apart from that 8 hours+.

Best computer I ever had I think.


The main problem with devices like Apple's Mx series laptops is that everything works fine...until it doesn't. This device is non-repairable and non-upgradeable as far as CPU, RAM and disk are concerned. Now people may claim that Apple hardware is superior to other manufacturers and that it will last longer. So far I have found that to be false: I have more old non-Apple hardware that is still working fine, and a few Apple devices that can no longer be used as Apple devices, but at least I was able to replace the failed hard drives on them. Similarly, people will claim that Apple's superior hardware is the reason for their inflated prices, but I have also found that to be false... it is Apple's perceived value that is driving those prices. Apple users will actually pay more for second-hand devices than what they would for new Apple devices. I saw this just recently on eBay, where I followed a few auctions and the ending price for a 2nd hand Mx laptop was higher than what I would pay for a newer model in the store. Madness.


Just had a 2019 MacBook Pro 16" have its SSD die randomly. Took it to a repair shop and they said that since the internal SSD is soldered on to the logic board, there is no easy way to repair it without also getting a whole new board, so the machine is essentially completely bricked. I can't even boot it off of an external drive because Apple requires the boot volume to be the internal SSD!

I couldn't believe it was designed in such a fragile and user-hostile way. I took care of it, didn't drop it or spill anything on it, and it is bricked after just a couple of years due to a simple component failure beyond my control.


Yeah, that is an unfortunate design choice. I consider drives to be wear items, that is, they will wear much faster than the rest of the machine, and as such should be field replaceable units.


But it’s not like this design was a secret that Apple kept from you, this is well known information.


Might not be a secret, but it's not like Apple informs you of it. Looking at the product page, you need to click on "How much storage is right for you" to find a vague: "MacBook Air storage is not user accessible".

What does "accessible" even mean? It's far from clear whether even Apple will upgrade/change it.

And as for the boot issue, it was not present on older models, and quite a lot of people only learned of it from the recent Rossmann video.


I would expect any person buying a device that costs a couple of thousand bucks to do some research before buying.


casual classism on HN. You're suggesting not only that everyone do the research, but that they be capable of doing the research, understanding the research, and understanding whether they're being lied to, all potentially without any knowledge of how things work.

remember how normal people operate ffs


Sorry but you don’t need a CS degree to do some basic research.

I’m not a “car guy” but I can still make an informed decision when buying a new car.


I honestly hope that EU regulatory pressure forces Apple to make their Macs more repairable. SSD failure = brick is just bad design.


Meanwhile in another HN thread - "muahaha, look at these Europoors, how your economy is stagnating, no successful startups, no innovation. That's all due to your overregulation!!!111" /s


true in the context of HN, but kind of a moot point from a more broad consumer perspective, I think.


But hey, they're super environmentally conscious at least by adding a few recycled cans to the new laptop you'll end up buying after the repair isn't worth it.


had the same thing happen to the ssd in my father's thinkpad ... replaced it with a new ssd.


> Now people may claim that Apple hardware is superior to other manufacturers and that it will last longer.

On the average that's undeniably true.

But not for some "Apple is perfect" kind of reason; it's because Apple is producing high-quality laptops while other manufacturers are producing a lot of shit besides.


> The main problem with devices like Apple's Mx series laptops is that everything works fine...until it doesn't. This device is non-repairable and non-upgradeable as far as CPU, RAM and disk are concerned.

Given inflation, the MacBook Airs are pretty crazy cheap. If they break you can just replace them. The base low-end M1 Air is only $1000 last I looked (and another $100 off for those in education). They're some of the best price:perf in a laptop the world has ever seen. "Inflated prices" is not an accurate description of an $899 retina M1.


Money is only part of the equation. It's pretty wasteful to have to toss a computer because one component failed, when that specific component would be replaceable in other brands.


This seems like a movement of the goalposts. I'm not that concerned about electronic waste, and I haven't seen any great arguments why anyone really should be.

It's not replaceable in other brands, because approximately no one is making a laptop remotely equivalent to the M1/M2. Perhaps the Surface Pro, but even that isn't a good comparison.

Remember when Apple made the first Air and it was so far ahead of every other laptop out there that Intel had to invent the term ultrabook to describe it so Asus and Dell could clone it? Yeah.


>I'm not concerned about electronic waste.

Why not?


If you’re concerned about electronic waste specifically you’d never buy a new computer, only repair computers destined for the trash.

Concerned about it in general is appropriate for governments and large companies, of course; they can make a dent.


That's an insane thing to say. I am concerned about electronic waste. I am not concerned about it to the extent that I would never buy or use a new computer. I am concerned enough about it that I don't want to waste a perfectly good computer because it has been designed to be thrown away.

Would you say that someone concerned about food (or food-related) waste should not ever buy any food? Of course not. But don't buy food that you don't intend to eat, and don't buy food that causes unnecessary waste, like food that is unnecessarily individually wrapped.


I go out of my way to recycle things appropriately: dropping off old computers at the local e-waste depot, recycling old rechargeable batteries at retail, etc. It may very well be pointless, but at least I feel I tried.


That’s good, but reuse is best - maybe something like https://www.pcsforpeople.org/ is nearby. They’re great.


Apple has a tradein program where you can trade in your current Apple device for a discount on a new device.


That's even worse! Trade-in programs are insidious; their purpose is to drain the market of second-hand goods, so as to increase sales of new ones.

Every time you sell a second-hand computer you are cutting into their sales of new ones.


But how much of what is traded in is actually repurposed?


And get paid peanuts in return?


The base model price is a little misleading.

256GB and 8GB are lowkey unusable for anything but a Facebook machine.

I always look at the price for the 16/500 or 16/1t machine as the reference point


I have the base model $1k M2 air and it functions excellently as a development machine. Both web dev for work (running multiple react clients, node servers, Pg instances, docker containers, etc all at the same time), and for basic game dev for recreation - though admittedly I’m using Godot for that which is among the lighter weight game engines available.

That is to say, it's far more than a glorified Facebook machine and is a truly excellent value. I much prefer it over my work-issued Intel MacBook Pro.


I used the M1 Air 8/256 as a dev machine for a whole year. It was faster than a 16/512 Intel 16" MBP from 2019.

Calling it a Facebook machine is silly.


That’s not true at all. Have you actually used one?

Edit: obviously if you need a windows vm you’re out of luck at 8gb ram


You’re out of luck if you need docker.

M1 Macs would make fantastic Linux laptops, because macOS is appallingly bad at backend development. It makes up for it in sheer hardware power and efficiency; otherwise I'd be completely at a loss as to why it's so loved.


> macOS is appallingly bad at backend development

That's news to me. In what way are they bad compared to Linux? Both can run docker, Intellij, Insomnia and Dbeaver.


The main thing is Docker requires emulation


True although if you develop for arm then you can use virtualisation. Still, performance wise I've not noticed any impact.


Docker runs fine... Both x64 and arm64 containers run


It requires a full VM. macOS is the only OS with that requirement.


And? It still runs fine.


It works. It doesn't work fine. It breaks, sometimes in a way that you need to clean up manually.


I had a laptop with a replaceable CPU. By the time it was worth upgrading it made more sense to buy a new laptop rather than upgrading the CPU. For repairability it's also not really that different unless you're running something like a framework laptop and, even then, replacement mainboards cost about as much as a whole new laptop anyways.

I do wish the M* laptops had something like a CFexpress slot or similar though. The built in storage is blazing fast and power efficient but soldered. Thunderbolt or USB drives are plenty fast enough these days but the external nature of the ports is a bit of a pain. Large amounts of storage are also where the price inflation really is (since you can't just add cheap storage it has to be the blazing high end storage + markup) so this would help the most.

RAM I'd rather have soldered in these days on a laptop. Performance is higher, power usage is lower, and you can order it with enough that you don't need to wait for new modules to be invented to upgrade and have enough RAM 5 years from now. It does mean you can't take any old M1 from ebay and upgrade it, which is definitely a downside, but for a laptop I think the benefits outweigh the cons. One thing I've noticed is there are always people that order them maxed out and move on when the next generation comes, so high-RAM options are still available used. One thing that is lost is, again, you can't choose to cheap out and get bottom-bin-speed memory. You have to pay for 8-channel 6400 MHz RAM in your laptop. That's not necessarily a problem with the manufacturing method, since it's possible to solder cheap RAM; it's just an Appleism. On the other hand... you do always know what you're getting if you buy one, which could be a plus.

Anyways, all of this is to say: if you want a budget laptop which minmaxes its price/performance ratio at a lower cost, then the Apple laptops are definitely not the right pick. The reason for this is not so much how they are manufactured as where Apple targets the platform. The best price/performance optimization is probably a Chromebook with a monoboard and soldered RAM.


> The main problem with devices like Apple's Mx series laptops is that everything works fine...until it doesn't. This device is non-repairable and […]

The sad unspoken truth is that most Apple users are just rich enough to replace the whole thing when it breaks, and don't really care much about the environmental issues of Apple hardware.


I doubt you can find any data to back that claim up. My first macbook was in daily use for about 8 years. My current macbook is only four years old so I'm not expecting to replace it any time soon. Colleagues seem to have the same usage pattern. Apple hardware lasts a long time.

edit: no hard data after a quick google but every article on the first two pages estimates 5-8 years.


Most of my Apple laptops I have given to other people, who have used them for years. Of the three M laptops I have (including the first M1 Air), none has any issues so far. If one breaks, I would have it repaired at Apple. My personal main laptop has AppleCare to make that less costly, in case something breaks.


I often recommend Apple refurbished to friends who are strapped for cash, the argument being that it’ll last longer and be cheaper over the long run. So far haven’t really run into an issue with this. Call it the Sam Vimes theory of computer upgrades.


I'm a bit confused by the "still". How often would you expect to upgrade hardware? Keeping a computer for three years is completely normal for most people.


True… here I am wondering if it's time to upgrade my 2017 MacBook Pro because a couple of keys started to fail and I'm having a hard time…


A thinkpad keyboard replacement cost ~50$ last time I needed it (and I thought it was unreasonably expensive).


I am using my Dell XPS 13 now for almost ten years. Still works perfectly.


I have a 10 year old MacBook Pro that works perfectly too. Unfortunately, Apple doesn’t allow OS upgrades.

I’ll probably get an M3 and use it for 8-10 years.


Ever tried OCLP [1]? It allows for patched MacOS installs on officially unsupported Apple hardware.

[1] https://dortania.github.io/OpenCore-Legacy-Patcher/


Thinkpad x220 checking in, still rocking my Sandy Bridge and still enjoying a real keyboard. It's about 12 now. Other laptops have come and gone.

I couldn't live with my macbook m1. I hated the keyboard and the desktop - why is there no delete key, pray? UTM crashed a lot. Asahi was lovely but I just didn't like touching the hardware.


I got my M1 Max MBP 14" at the end of 2021 (so almost 2 years ago now) and my previous machine was an X220. The main reasons I switched were:

- I was a consultant/contractor, and sometimes you have to run software you really don't want to (Citrix, Teams). You can kind of do this on Linux w/ emulation, but on an older machine it's painful.

- Can't drive a 4k monitor at 60fps

- 1366x768 is hard to webdev on

I also had a multi-machine setup so like, a Chromebook for being spied on, a Windows machine for gaming, and an air-gapped Windows machine for music production. My partner and I had decided to move abroad and it felt like a little much to bring 4 laptops over, so I reluctantly consolidated. In truth the only thing I really gave up was acquiescing to being spied on, but at this point that's more because I haven't set up a pi-hole (etc.) than anything.

I do miss (Arch) Linux, and obviously the X220 input system is unparalleled, but I'll say that I'm pretty accustomed to the MBP input system now (I learned Fn-Delete is actual delete, which actually improved my life significantly). I probably lost ~30WPM (I'm around 100-150WPM on a good keyboard) and it's definitely a worse ergonomic experience to have to leave home row to use the trackpad (you're telling me you want me to stretch my ulnar nerve even more?).

There are other niceties. My partner can airdrop me things. I can load truly ridiculous Excel spreadsheets. I can move my music production onto it. I can stream both an Snes9x emulator window and a webcam grab through OBS to game w/ my brother in the states.

Glossy screens on laptops don't make any sense to me though. The argument is you get better color reproduction, but there are great matte screens that do that (I have a 32" matte display with 100% AdobeRGB that proves it), and besides that, I bet if the difference matters to you you've got a high performance glossy screen on your desk in an office with controlled lighting. Laptops go places! Sometimes they're sunny! Matte screens please!


This year I upgraded from MBP 2015 to M2. I am still bothered by the notch on the M2.


I use a completely black background, I forget the notch exists plus I save a bit of battery.


This is the way, or have an external monitor as your main monitor.

Or have four external monitors; the reason I had to wait for the M1 Pro.


you're changing architecture, and memory and storage speed in one go; i think you're going to like it


I upgrade every 2-3 years. Although current device is first macbook so maybe i'll break that (+ an M1)


I upgrade my macbook once per decade. I changed the battery, the SSD and the bluetooth card about 6 months ago (8 years into owning my second macbook). I need to change the screen (1 cm black strip on the left because of a shock) and the speakers are broken. Does anyone have a tip for replacing the screen cheaply? I spotted some going for $30 on Aliexpress; it looked like the seller had a rather large stock.


Lower-end consumer laptops (think: Acer, Asus, ...) tend to fall apart after a year, this is what I was starting out with in university being rather poor.

Now it's more like: the current device has zero issues, but the newer available models look increasingly tempting for my inner child to play with.

Also, I owe my career, salary and life situation basically to open source (so I donate regularly) and... Apple computers that work flawlessly in many situations where other laptops would already be dead or would require me to tinker with stuff instead of doing billable things.


So true, 4-5 years seems reasonable to me, 3 years isn't much


it's Apple customers we're talking about here...


True, I am using my 2014 desktop and my work PC is from 2020. Totally normal...


What do you mean still? They’re only a few years old at this point.

It shouldn’t be impressive that a basically new computer still does everything that one would want to do.


Not OP, but I can describe a similar experience. Still feels better than brand new $current-year non m1 laptops.


More like: if I lost it today, I would buy the same thing again, maybe next gen M2/3, but everything basically identical to this one. There is no single other device out there I'd prefer to buy instead. Nothing shiny to chase that's better somehow...


Keep in mind that people in their 30s and older today either grew up or lived their working years during a time when upgrading computers every year or two was the norm.


I remember those days, but that hasn't been the case since intel released the Core 2 line.


That hasn't been necessary for 15 years.


I still remember the joy of buying additional/bigger RAM sticks for my PC and then being able to do more demanding stuff much better :-)


Also an M1 user. I remember back in my Windows days that things would slow down dramatically over a couple of years, mainly due to software bloat.


That seems to have gotten better but I don’t know if it’s because I’m smarter on what I install, or if windows is better.

The problem I have is Microsoft keeps shoving windows 11 down my throat; that damn start menu needs to remain where it was in 1995!


windows 2000 was peak windows for me. only got worse afterwards one way or another... but hot take: windows 8 "metro" was kinda awesome refreshing though most people hated it :-)


W2K was certainly peak UI for me - all my windows machines run as close to that UI as I can get them.


It’s not a great thing, but most companies supply employees new laptops every 3 years if they want to keep those employees.


is it rare the decommissioned machines are given back to the employee after cleanup?


Depends on how the computers were procured. Leasing deals won’t allow that, for instance.


My org insists that the disks of all obsolete machines are destroyed and the rest of the hardware sent for disposal. I have a few systems that could be wiped and given away for charity but that’s not allowed. This seems incredibly wasteful to me.


seems a lot of orgs work on cargo-cult knowledge

https://twitter.com/SwiftOnSecurity/status/56891586200393318... (2015)

https://twitter.com/SwiftOnSecurity/status/11917713453627965... (2019)

i think swift had another post about it a bit more recently but i cannot find it now

if you encrypt the disk at the os level, you only need to throw away the keys, ie: reformat the drive. no need to physically erase, as what is stored is just noise once the keys are gone.


Thanks for the links. I wonder if it’s entirely that, or if management fears that staff might somehow gain a benefit by taking obsolete hardware.


There’s a temptation that is removed by destroying or selling to a liquidator - and it removes any chance of apparent favoritism: Steve got to take home that twelve CPU rack mount and I just got an old mouse.

But even more common in large companies is that it’s all on leases and goes back to the leasing company to be liquidated on eBay; you can always find miles of three or five year old gear there.


ebay being the access normaliser to this equipment doesn't sound too bad

it substitutes the internal network/hierarchy bias for a mostly monetary one


maybe there's no point for us to argue over the decisions of management


>and refusing to bother with company provided computers

To you and people that do this:

Do you enroll your personal PC in the companies MDM like intune or Jamf?

Do you install their endpoint protection/antivirus/EDR/MDR?

or you just leave it unmanaged?


Was wondering the same thing - going through a tech audit at the moment for ISO and that certainly wouldn’t fly, unless we were able to prove that we effectively treated it like we owned it.


Working at companies without strict audits, and previously at bigger corps that had audits we established parallel networks for developers. Either way, BYOD was possible after all.

Also I insist on not having security based on VPN/VPC only, but on treating every service as if it had (and it often has) a direct internet connection, so devs are _forced_ to think about security from day 1.


2020 M1 Air here, still running strong and will soon hit 3 years (bought the day it launched).

So strong that despite finding the 15 inch Air appealing just in terms of screen real estate, this current laptop doesn't feel dated. Especially since work provided me with an Intel MacBook Pro with great specs but... it just doesn't compare, with the exception of external monitor count.


We got it the same day! I coded >100k LoC (the whole core of the OKcontract codebase) solely on that machine, without any external screen.

Working flawlessly since day 1.


Out of curiosity, is the poor posture which comes with using a laptop (mainly over extended periods of time) something you actively think about?

This is the main reason why although I love my MacBook M1 Air keyboard and screen, I still use an external monitor and keyboard for the majority of my work.


Good point!

Honestly don't think too much about it... probably I should care more. Just tend to change places regularly and run/swim daily.

Also, I keep quite a large font to be able to sit further away from the screen, and having a small view into the codebase forces me to know it better. (vscode fullscreen of course, with clock/battery in the status line, hidden dock)


Well, posture can be voluntarily modified, so it doesn't "come with using a laptop".


The remote monitor was huge for me - when we learned that the M1 could only do one external (without dirty tricks) I held off until the M1 Max.

Multiple monitors is just a baseline requirement in my opinion.


Same, but since I'm only using a personal computer I attach it at home mostly to a huge monitor and it's OK that way.

For work, at least 2 are desired.


It’s a 3 year old laptop, it better not feel dated


Give me a 3 year old laptop from anywhere else, and I'd say it would feel dated.


I like UTM too. I must be doing something wrong though. I created both x86 Ubuntu and Debian 10 VMs. They are usable, but not quick. I have been blown away by how fast Rosetta 2 x86 programs run on my M1. The arm embedded gcc toolchain compiles code 4x faster than on the native x86 MBPro from a year or so ago. But the UTM VMs run about 3x slower.


The article mentions that they are using virtualisation (arm Linux and Windows on Arm, on macOS and M1), not emulation (x86 Ubuntu and Debian emulated on arm, on macOS and M1).

Emulation is slow sadly. Virtualisation is approaching native speed.

“Both Ubuntu and Windows run at native speed since they are virtualized instead of emulated.”


You can't run x86 Linux quickly on macOS. What you can do is to run ARM Linux quickly on macOS and run x86 programs inside of that Linux with Rosetta quickly.


Wait you can use Rosetta inside a Linux VM?




Am I missing something? Is this quicker just because you are emulating only the program and not the OS?


It is quicker because Rosetta is a proprietary Apple technology which apparently works superbly, but only supports userspace code. Emulating x86 with qemu does not work as fast, unfortunately. I'm not qualified enough to judge whether it's a qemu shortcoming or whether it's just not possible to emulate an entire VM as efficiently, but the practical result for today is that you want to use Rosetta.


Yes. If you run x86 Linux in a VM on an M1 you are emulating/translating every single instruction; with a native OS you are only translating the application.


I use Ubuntu in UTM, but I installed the ARM64 server version, then the desktop packages.


Interesting, this results in a fully functional desktop version?

I have a M2 and the hardware is great, but the OS that came with it not so. If I could replace the OS with Ubuntu that would be great. (Windows would also be great, but I have no experience in installing that)


Yes. You wouldn’t replace the OS but you can make UTM full screen and hardly notice it’s not the main os

There are attempts to run Linux on the Mac natively but they have limitations.

You can also get utilities that abuse Mac OS to make it imitate windows or Linux if you want.


I almost always use my MacBook Pro in clamshell mode attached to a thunderbolt dock for external displays. Right now Asahi Linux does not support thunderbolt, however it is a work in progress.

Does UTM work well with external displays? Preferably in full screen?


> Interesting, this results in a fully functional desktop version?

It's very snappy. I run Firefox to work with my Internet banking site in an isolated environment.

I installed KDE -- I am very satisfied.


Are you using emulation or virtualization? You mention x86 versions but the M1 uses arm64. That may be why it is slow.


I think it would be faster to use an ARM64 VM and use Rosetta, FEX-Emu, or Box64 for x86 programs inside the VM.


I've had some great results using Orbstack's x86 Linux VMs, though primarily for Linux development.


Yes, before installing I read that UTM uses QEMU and did not pursue it any further. AFAIK only macOS runs natively on Apple Silicon; everything else is emulated.


As I understand it, an aarch64 Mac can virtualize aarch64 Linux and does not have to emulate it. Here's an Apple sample project: https://developer.apple.com/documentation/virtualization/run...


There's both Apple's Hypervisor framework (a lower-level framework for hypervisors to use to avoid needing a kernel extension), and Apple's Virtualization framework (which is a higher-level framework that lets you run Linux and macOS VMs). QEMU supports virtualization on macOS with the HVF accelerator, which uses the Hypervisor framework. UTM uses QEMU, supporting virtualisation through QEMU with HVF (as well as emulation of other architectures with QEMU). UTM also has support for using the Virtualization framework. But either way, with an ARM64 image, you'll be getting virtualization (unless you disable HVF when using QEMU).
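To make the distinction concrete, a Virtualization-framework Linux VM is only a handful of Swift calls rather than a QEMU command line. A rough sketch only: every path below is a placeholder, and a real setup also wants a serial console, more devices, and proper error handling.

    import Foundation
    import Virtualization

    // Sketch: boot an ARM64 Linux guest with Apple's Virtualization framework.
    func makeLinuxVM() throws -> VZVirtualMachine {
        let config = VZVirtualMachineConfiguration()
        config.cpuCount = 4
        config.memorySize = 4 * 1024 * 1024 * 1024  // 4 GiB

        // Boot a Linux kernel directly (an EFI boot loader + installer ISO also works).
        let boot = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/vmlinuz"))
        boot.initialRamdiskURL = URL(fileURLWithPath: "/path/to/initrd")
        boot.commandLine = "console=hvc0 root=/dev/vda"
        config.bootLoader = boot

        // Root disk as a virtio block device.
        let disk = try VZDiskImageStorageDeviceAttachment(
            url: URL(fileURLWithPath: "/path/to/disk.img"), readOnly: false)
        config.storageDevices = [VZVirtioBlockDeviceConfiguration(attachment: disk)]

        // NAT networking.
        let net = VZVirtioNetworkDeviceConfiguration()
        net.attachment = VZNATNetworkDeviceAttachment()
        config.networkDevices = [net]

        try config.validate()
        return VZVirtualMachine(configuration: config)  // uses the main queue by default
    }

    // Usage, from an app's main thread:
    //   let vm = try makeLinuxVM()
    //   vm.start { result in print("started:", result) }

Since the guest is ARM64, the CPU executes its instructions natively; the framework (or QEMU with HVF) only mediates memory, devices, and interrupts, which is why it runs so close to native speed.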


I recently bought a M1 Air for $750 (25% less than regular retail) from amazon as a vacation laptop. I was expecting it to have quite a bit of performance limitation but the machine is actually fantastic. I think it may be one of the best values in a computer I have experienced in quite a while.

This thing is so accessible in its affordability and performance that I think I am kinda over obsessing over whether an application or service is Mac-only.


It has a price/power ratio off the charts, mainly because Apple wanted to show off what Apple Silicon can do while having to stay at $999 for Air. I love mine.


There used to be (and still is somewhat) an entire class of Mac-only software; some of it was extremely well made.


OmniGraffle and SubEthaEdit in the house!

But in all seriousness, I use Notes and Photos like a second brain to store memories and information.


The M1 is great, but I really dislike the limitation to 2 external displays unless you get the ultra model CPU.

I've just gotten really used to my portrait + landscape + landscape setup! And of course I'm an outlier, so corporate IT won't buy me an Ultra CPU version.

I guess that it's to force people to pay more, but it feels like such an arbitrary decision. Sure, restrict it to 2x6K displays, but I'm using 3x4K displays that the Intel MacBook Pro handles just fine.

Also, it's a real hassle trying to explain to angry developers about why their 4 year old ancient x86_64 arch docker containers won't work natively. (The containers crash under Rosetta emulation)

Compiling Emacs in 5 minutes vs 10 on Intel is really nice though!


> I know it’s to force people…

I’ve seen the longstanding understood technical explanation for this limitation. I can’t recall what it is now, but it seemed more than plausible at the time. Of course, ‘Apple bad’ is always going to be more viral. And, of course, you can take any hardware component and “just add…” it to max spec.


I'll edit "I know" to "I guess". Sorry, I just woke up and saw this thread while making coffee.


The 15” MBA not getting a Pro variant like Mac mini is a bummer. I hope they do that at some point because I agree, a 1-monitor limit makes it a toy.

The base M1/M2 only getting 1 display does make sense from Apple's perspective though. The M1 is fundamentally a tablet chip, just a fast enough one that it crosses the line to ultraportable laptop as well (and it's quite fast for a tablet chip as a result). Tablets don't need 2-3 external displays. Nor does the average user of a MacBook Air. Remember this is a product that's essentially coming from a netbook heritage; 1 display is fine in that segment.

What doesn’t make sense is not offering a M2 Pro option especially on the 15” Air which could obviously handle it. Because other people do need it, and the MacBook Pro isn’t the same product, it’s got fans and a different display and chassis etc. There is a segment who is just using the MBA as a fanless ultrabook and the Pro cpu is necessary to support the needs of this segment. I really hope 15” MBA gets an M3 Pro option at some point.

And please for the love of god put the 13” touchbar model out of its misery. It looks like it’s getting refreshed again with the M3 line. The new chassis on the 14” and the 16” is just lightyears better.


> Compiling Emacs in 5 minutes vs 10 on Intel is really nice though!

I know the thread is one day old but in case you read this comment... That's really weird though: on my 7700X the latest version of Emacs compiles in 1m36s (and 1m39s, only 3s more, when it's in "eco mode", thermally throttled at 85 C).

On the 3700X it was already only 2m25s.

I think there's something wrong if a M1 needs 5 minutes to compile Emacs where an AMD 7700X desktop only takes 1m36s.


M1 Pro and greater break that one monitor limit.

There is also a trick with external docks that can get an M1 to use two external monitors but I just got a Max.


EDIT: I think we misunderstood each other, sorry about that! You were referring to 1 external display, I was referring to 2 being the limit of the Pro Apple CPUs.

----

This says otherwise: https://support.apple.com/kb/SP858?locale=en_US

M1 Pro: 2 external displays. M1 Ultra: 3 external displays.

https://support.apple.com/en-us/HT213503


Yeah, the M1 can do one external + the built in (so two)

The M1 Pro can do two external, the M1 Max can do four (three at a higher resolution, one at a lower).

I have a Max with five displays total (including the internal). I suspect with some trickery I could get more than five (some external adapters allow two displays to appear as one larger/higher resolution one. But 34,216,704 pixels should be enough for anyone ;)


A little-known fact is that the M1 does not support nested virtualization. As a college professor, I really miss that during Docker classes. Supposedly the M2 does. Can anyone confirm Docker works under Windows on Parallels?


I have to pitch in and recommend OrbStack[1] for virtualization here. If you need WSL-style, terminal-only Linux VMs or want Docker without the bloat of Docker Desktop, it's the perfect tool. If you go with Alpine as your distro, the whole thing takes under 200 MB of disk space, and less RAM and CPU than most Electron apps when idle. It can do most of what WSL can do: Finder integration, lightning-fast and extremely easy installation, VS Code integration, running Mac commands from Linux and vice versa, effortless file sharing and network connections between host and guests, fully terminal-based management, out-of-the-box integration with things like "open" and "pbcopy", and VPN/proxy detection and autoconfiguration. It can even use Rosetta instead of QEMU for running Intel binaries, which makes them faster than on quite a few Intel laptops.

I'm not affiliated with the company, just a very happy user.

[1] https://orbstack.dev/


UTM is great, but sort of useless for many of my most acute Windows needs: talking to some random peripheral or device that only has a closed-source x86 windows driver.


How does UTM compare to parallels? I’ve used parallels for years but I’m curious


I have an M1 MBP and an Asus Zephyrus G14 of similar vintage, and I have to say that day-to-day, stock, the M1 MBP gets better battery life at idle and is less prone to getting into a state in which its fans need to activate. I'd say they can both easily get over 10 hours of battery life, which is past the point where anyone cares anymore, but it's easier to get the Asus to drop to only 5.

But speed-wise, I haven't noticed a difference. They're both snappy. In fact the Asus would be faster for all practical purposes of mine because:

* macOS has comically bad UX; it wants me to use the mouse far more often than it should, and fights me on keybindings.

* Programs like Blender and Stable Diffusion just aren't optimized for Apple; they're optimized for Nvidia. I can't say whether the M1 GPU is actually good, because nobody seems to care to support it, which brings me to my final point:

It doesn't much matter whether the M1 is or isn't good as long as it's exclusive to Apple's walled garden. If I can't put it in my PC, it may as well not exist, both from a user's point of view and in terms of developer support. The best outcome of the M-line of CPUs would be for hardware competitors to make another great ARM chip that I can actually use elsewhere.


Weird, I've had the exact opposite experience with UTM. I need to get Ubuntu up and running with virtualization (emulation is too slow for GUI work), but the install simply hangs. If I forcibly reboot, the installed OS simply blackscreens and does nothing. I've spent a good few hours trying to get it to work (including turning off the display and trying to use the old tty terminal output), but eventually I gave up.


Have you tried updating UTM to the latest version? Many of these bugs have been fixed in newer versions. UTM doesn't auto update, so you have to check for updates yourself on their GitHub releases page: https://github.com/utmapp/UTM/releases


The auto update apparently only applies if you purchase from the app store.


Orbstack

I thought https://orbstack.dev/ is what people are using these days instead of UTM.

Am I mistaken?


I don't know, but I reject the premise of the question (i.e. there is a consensus). But I'm no pollster.

Speaking for myself, to virtualize arm64 *nix on an M1/M2 Mac, UTM has worked whereas other alternatives didn't work for me (such as VirtualBox). I haven't put my hands on the glowing orb yet; I'm sure it will be mesmerizing and tell me to do things like watch Inception (inside a VM) ^ 4.

And Orbstack has a comparison chart with UTM: https://docs.orbstack.dev/compare/utm


FYI vbox isn't meant to run arm-on-arm VMs.

The Arm release is a "whoops, it got out before it's ready" moment, and the current status, apart from it being buggy and slow, is that it only runs 32-bit x86 VMs on Arm.


UTM runs GUI apps, while Orbstack doesn’t. It‘s kind of like VirtualBox vs. Docker Desktop.


Good point, definitely makes sense to use UTM if you need GUI apps. It's planned in the long term but in all honesty, I'm more focused on containers at the moment.

Also worth mentioning that OrbStack's philosophy is to be more like WSL (designed for integration first and foremost, not isolation) than UTM (full virtual machines, isolated from the host).


It's usually the case that people are using several different competing tools rather than one specific "latest" tool, isn't it?


Maybe it's still getting out there? First time seeing it, looks super promising. Thanks for sharing :)


Still a bit newer, but thanks for mentioning OrbStack! Happy to answer questions as the dev.


I only use UTM to run Chrome for testing, because when run natively with Selenium it steals the window focus all the time. I want the tests to run while I'm coding, and to still be able to see and interact with them.


You are mistaken.


Orbstack does not seem to run Windows.


Edit: The project does not run on "docker machine" as per the graceful correction by OrbStack's developer below[2].

It's a proprietary container interface that - as far as I could tell - runs on the outdated and unsupported "docker machine"[1]. An alternative would be to run a more recent container runtime of your choosing in a VM through UTM which works beautifully.

[1]: https://github.com/docker/machine

[2]: https://news.ycombinator.com/item?id=37107233


Hey, dev here — not sure what you mean. OrbStack currently ships an official build of Docker Engine 24.0.5, which is the latest version as of writing. Previous versions used Alpine Linux's build.

I did intentionally delay the update from 23.x to 24.x by a few weeks while I waited for things to settle, because last time I updated from 22.x to 23.x shortly after release and some users were hit by bugs that hadn't been ironed out yet.

Edit: Noticed that you edited your comment to say that it uses "Docker Machine" instead. That's not the case. OrbStack uses a nested architecture where it runs Linux machines [1]. There's a special machine called "docker" that runs a copy of the Docker Engine, but that's not related to the old Docker Machine project.

[1] https://docs.orbstack.dev/architecture


I stand corrected! Glad you were here to do so.

My confusion must have come from mixed terminology being used by others than yourself on GitHub and in articles about your project I skimmed through when looking into it.

The project might still not be for me regardless of its underpinnings, although I will have to say that it's immediately obvious that you pursue high standards and care a great deal about delivering a quality product.


Makes sense, no worries!


I wish I could upgrade from my 13" MBP, which is the last of the Intel i5 series. The 16" M1 Max that I was issued at my last job was a dream, and now that I'm playing with SwiftUI and dealing with Xcode, this machine feels slow and noisy. Unfortunately I was laid off, so no M1, and the RAM upgrade to 64GB or 96GB is still insultingly expensive. As nice as it is, it isn't a $4k boost in productivity.


B&H has been periodically running deals on the loaded M1 Max 10C/32C 64GB/4TB for $3000-3200.

https://slickdeals.net/f/16807523-apple-macbook-pro-16-2-lap...

The M1 Max doesn't get the HDMI 2.1 port (although you can coerce some Cable Matters adapters to do it) or Wi-Fi 6E, and the M2 Max is generally faster, especially in GPU (at the cost of higher power; perf/W doesn't change much), but in general it's not worth paying $2k more for.


(To be specific: the M1 Max has HDMI 2.1 (including things like HDMI Forum VRR, afaik) but it's limited to 4K60. The M2 gets 4K120 and some other features. As always, examine each device for the feature support you care about, because HDMI 2.1 completely supplants the HDMI 2.0 standard and no HDMI 2.0 certs are being issued anymore, so "HDMI 2.1" devices can range from 4K120/8K support with 10-bit deep color down to 4K60 with 8-bit color and no VRR or any other features. Each individual feature is treated separately; an HDMI 2.1 receiver could support 4K120 but not, say, VRR, and so on.)

And of course, this might not be the best time to be spending even $3k. But it doesn't have to be $5K for a fully loaded M2 or $4k for a decently loaded one. $3k gets you a lot of laptop too, just not the M2 right now.

Refurb is another option... you can do a 64GB/1TB with the M1 Max 24c GPU (vs 32c) for $2550 via refurb store if you can get the edu or mil/veteran discount. Personally for another $500 I think the 4TB and the nicer GPU would be worth it but again, it edges the cost down quite a bit.

Woot also has been carrying "manufacturer-refurbished" M1 Pro/Max for pretty good deals recently. IIRC the 16GB/512GB is $1550, and you can get 32GB/1TB for around $2100. And because it's manufacturer-recertified (Apple) it's eligible for applecare.


Good to know about the sales, though being in Canada might hamper me there. The refurb and edu discounts are pretty decent though, and I'd probably consider those if I really start pursuing an upgrade.

HDMI has always been a bit irrelevant for me, aside from using it with work-issued displays. I've always used displayport over thunderbolt, and don't know that HDMI offers any advantages. I'd prefer almost any other port over HDMI.

My issue is more that every time I spec out a new MBP, it genuinely feels like I'm being grifted, and it turns me off more than at any point in the previous 10+ years I've been buying Macs. $200 for a RAM upgrade was already too high, and hurt the wallet then, but $500-$1500 CAD for RAM upgrades (inflated by the apparent, probably artificial, requirement to choose CPU upgrades that I'd never have chosen in the past and wouldn't likely get any value from) is on another level.

If I choose the minimum CPU that allows 64GB, I then need to spend an additional $750 to add 8 GPU cores and go up to 96GB (not that I'd necessarily go that high or feel like I'd need to, but it's not like I can upgrade later, and it just stands to illustrate the point). $750 is my damn rent payment.

So before even going past the 512GB SSD or including AppleCare or taxes, the computer sits at $5100 CAD if I were to max out the RAM on the new 14".


$4k / ($your-total-cost * 3), assuming 3 years of ownership. Your tools are more than just the laptop, but the idea, I believe, is valid.


If it's worth upgrading to, it's worth keeping for way longer than 3 years imo, but even then the amortized cost is quite high.


> Both Ubuntu and Windows run at native speed since they are virtualized instead of emulated.

Could someone explain to me how that's possible? Because their website clearly says "On Intel Macs, x86/x64 operating system can be virtualized. In addition, lower performance emulation is available to run x86/x64 on Apple Silicon" - and Windows on ARM64 is still in preview.


They run ARM builds of both Ubuntu and Windows. To run x86 programs, Ubuntu uses Rosetta and Windows uses its own built-in emulation.


I use Windows on ARM64 on a daily basis in Parallels. It has been a shipping product for a while now (on Surface hardware).


I went back to an M2 Pro mini recently, and have mostly the same experience: https://taoofmac.com/space/blog/2023/06/22/1200


The M1's lead architect is now at Tenstorrent, leading Ascalon.

Ascalon is expected to be released in 2024.



What am I supposed to see there?


> Atomic was founded by Sam Zeloof and Jim Keller.

I assumed you were referring to Jim.


I was not.

M1 was led by Wei-han Lien.

Ascalon is led by Wei-han Lien.

Jim Keller is (and continues to be) CEO of Tenstorrent.


Meh, I have some old screwdrivers that still work too. I just don't get hardware fetish. Just boot it up, run code, shut it down, and go enjoy summer and/or winter.


> It is not often I am blown away by a piece of hardware.

Indeed. The Korg M1 was definitely a milestone in the history of music synthesizers.

Oh wait…


When I saw the title I thought they were talking about the Korg M1 synthesizer (which also deserves an ode) :).


I wasn’t the only one, good. :-)


I actually have VMware Fusion for running Windows apps and things; I wonder if I should just switch wholesale.


I tried to clean the stuff off of my keyboard and accidentally broke it. Not amused.


Did you clean it with a hammer or what?


Just a sponge that was lightly dabbed


I believed fanboyism was left behind in the 2010s.


You were wrong. Ever hear people talking about AMD GPUs online?

It’s crazy how normalized it is for certain fanboys to just call everyone who disagrees with them “brainless”, “ruled by mindshare”, “buying it for the blue bubble”, etc. I see it all the time even on HN, it is crazy how normalized the android and AMD fans have made it to just insult a group of people to their faces. Even in a tightly moderated community!

But put out one article saying “yo M1 is a good laptop” and watch the “I thought we left fanboyism in the 90s!!!” replies flood in.


What about NVIDIA fanbois?

They will buy NVIDIA, irrespective of whatever else is available at any given price point.



