M1 Pro 14" MacBook Pro Running KDE Plasma 5 on Arch Linux ARM (twitter.com/marcan42)
392 points by nixcraft 77 days ago | 259 comments



I am so happy to see the quick progress of Linux on the ARM Macs. In the '80s and '90s, there were several competing processor architectures on the market, but for the last 20 years PCs have been x86 only. With Apple Silicon, there is now a real contender, actually surpassing the current x86 offerings in many respects. And that is why competition between architectures is so important. And of course it is just interesting from a software development perspective.

With Linux becoming a viable option on those machines, they become interesting for a far wider audience than just macOS users. Thanks to the great work by Alyssa, GPU acceleration should be close too.

Then let's see when Linus gets himself a Mac; he has already indicated that he would be interested in doing so as long as he doesn't have to port Linux himself.


> With Linux becoming a viable option on those machines, they become interesting for a far wider audience than just macOS users.

While the option of running Linux on one of these M1 chips is intriguing to many of us, I have a hard time seeing that this will bring these machines to a "far wider audience" than MacOS users. It does open up some niches, and in particular it could mean that people will still be able to make use of these laptops after Apple stops supporting them. But we're pretty much a rounding error for the duopoly that owns the desktop OS market.

I do share your admiration for the accomplishments of Alyssa and all of those who are porting an open source operating system to a new hardware design with little help from the manufacturer.


The size of the audience is moot, but the benefit to me, a software developer, is sizeable.

I have remained a Mac user through the most recent several major OS X/OSX/MacOs/macOS changes with increasing reluctance as Apple increases its ownership of my hardware. I "own" an M1 Mac. I would like the freedom to run a free OS with free drivers on it. I watch Asahi Linux and the associated work closely. I donate and I hope.


It's really hard to beat Apple's laptop hardware in overall quality. Maybe Framework will be there one day, but it's difficult to see them with an ARM laptop any time soon. I know more than one person who wants to just use Apple laptops with Linux, for example.


Yup. My grandmother wanted a laptop to use, so I slapped basically Chrome OS on an old 2012 MBP and she's been using that just fine. The hardware itself is still quite snappy; the GPU is way out of date, but for Facebook and YouTube and video calls? Works great!


>I would like the freedom to run a free OS with free drivers on it.

I may be reading this wrong, but it sounds like you think this is something of a limit because of Apple. You have always been able to do this if you could find software that works. You can't "use" their OS on non-Apple hardware though.

It's not Apple's responsibility to write code for software that is not theirs. Could they release "drivers"? For what? Why would they? The hope of selling 1k more hardware units to Linux devotees? Why would they want to incur the expense of that support when it is such a drop-in-the-ocean level of user base?


You are right, it is not Apple's responsibility; furthermore, I knew the score when I bought my M1 Max.

I merely wish that Apple didn't think it was their responsibility to track and monitor my use of "my" hardware. Free and open software allows me to use the hardware exclusively in accordance with my desires.


>track and monitor my use of "my" hardware.

In what way are they doing this? Is there something, other than the fact that no other software works on this new hardware yet, preventing you from running other software when (not if) it does arrive?


I thought they officially supported Linux on the rack version of the Mac Pro?


No, it mainly exists for music production which also uses racks, not for server-y applications.


Macs have huge penetration into the developer market. I could see a lot of devs whacking Linux on their M1 Macs to enjoy benefits like being able to run containers outside of a virtual machine.

If Apple waited 6 months and released a significantly cheaper 16 inch M1 Pro Mac with a non-XDR screen and in the old form factor (to save money on tooling), similar to how they do with the iPhone SE line, then they'd make so much damn money from devs jumping on board.


In the big developer markets, the price of the device isn't a huge factor. Today, the available stock is a bigger deal. At least for a daily driver.

I also want the highest resolution screen with crispest graphics possible if I'm staring at it all day (1080p IPS displays are awful for my eyes). The thing about Macs that makes them nicer than most comparable Linux machines is the display is much better.

One of the only reasons to consider the Mac over something like an XPS today though (other than that you can go out and buy it) is that it supports Adobe products, which you may need for front end work. However, if my company is all in on Figma and I'm just working on a generic backend, you bet your butt I'm asking for an XPS.


> In the big developer markets, the price of the device isn't a huge factor.

You might suffer from SV syndrome. I live in a small town in the east of France: anything Apple is unaffordable (without major sacrifices and uncertainty to be able to replace the machine if it breaks too soon).

Right now if I had to replace my current computer, I would have about 600€ total budget. And France isn't exactly a third world country (but we're doing our best to get there ;)


I'm also in France, though in a larger city, and what you said still rings true. I prefer older ThinkPads, as if something breaks I can either fix it myself or take it to just about any computer repair shop on earth and they'll have the ability to repair it. Plus they tend to run Debian with ease.


I'm not in SV, but my employers have all provided my machines. Some companies are better than others, but price usually isn't a factor at places I've worked. I also probably wouldn't work at a company that gives their engineers shitty computers, since that's like working at a restaurant that makes their cooks use dull knives (and there is a big market of places that will give you whatever set you want)


1920x1200 is good enough for me; I don't use anything Adobe, don't do graphics, etc.

Have you looked at CPU performance or CPU-only power efficiency of the XPS 17 vs the M1 Pro MacBook 16 for backend work?


The last I heard, Docker was around 2x slower than on Intel Macs, and that performance was already pretty bad.

I don't really care about power efficiency tbh, I'm plugged into the wall all day


I don't know where you heard that, but even if we accept that assertion, it's not relevant to this conversation.

We're talking about running Linux on the Mac, so you won't be paying that virtualisation penalty.

There is no reason to think ARM64 containers running on an ARM64 machine would run any worse than AMD64 images running on an AMD64 machine. So given M1 Pro/Max Macs consistently rate amongst the best performing laptops on the market, you should expect a similar experience when running ARM64 containers on Linux.

As more developers pick up ARM64 Macs, more binaries and containers will be released for ARM64. There was already a massive boost in this kind of thing after the launch of the M1.


I want to run the same containers I deploy though, we deploy native code, and right now we are not deploying on ARM.

So paying the price of Docker x86 images chomping CPU and being significantly slower is a downgrade from Intel.


Then maybe an ARM platform isn't the right choice for you.


No they wouldn't. There really aren't enough devs around to make that statement true. Plus, if so many devs would do that, they can buy the 6-month-old refurbs at that time and do what they want. If they aren't doing that, then they wouldn't buy the machine you're imagining.


Devs may be small in number relative to the consumer market, but they're big in impact when it comes to building out an ecosystem. People complain about Linux on Desktop, but the Linux on Desktop is unreasonably good considering its market share.

As devs build out that ecosystem it'll become easier to bring products to ARM. Eventually, we might even see something like the Steam Machines built on an ARM64 platform. If it ever happens, it'll happen on the backs of those trailblazing devs who built out the ecosystem, because they were able to get a pretty rockin' laptop on an ARM64 platform.

Devs also happen to be great evangelists for bringing technologies into the corporate ecosystem. This will create a demand for porting popular MDM and other corporate tooling over to ARM64.

I'm a dev and I'd love a machine like that. I actually think the older form factor is better, I carry my laptop to and from work every day, so I appreciate a lighter and slimmer laptop and I don't think I'm alone on that account. I couldn't care less about a screen that can get up to 1000+ nits, but only when playing certain videos.


I fail to see how any of that would make Apple a bunch of money.

I know Apple needs devs to make money. Lots of them. But I don't see any lack of them at the moment. In this context it appears we are just talking about selling more machines. You'd buy one. So would a few others. Apple wouldn't make a ton of money on that machine though.


There are enough iOS developers to make some difference. And I think that if it weren't for those iOS developers, who have no choice but to run MacOS, there would be many fewer developers using Macs.


Not just iOS; it's the issue that you need macOS to make applications for the Apple ecosystem regardless. I'm sure iOS is a big part, but I also imagine all the desktop applications that want to be cross-platform, so think of something like Photoshop. Or Microsoft has to have a Mac somewhere to compile Office for macOS.


You don't need a Mac anymore.

https://aws.amazon.com/ec2/instance-types/mac/


> I could see a lot of devs whacking Linux on their M1 Macs to enjoy benefits like being able to run containers outside of a virtual machine.

If Apple did something about the container situation, it might show a lot of developers that Apple notices them and gives a damn.


The containers won't run on M1 Macs.

Software packages baked into images (at least the ones meant to be run ultimately on the servers) are actually x86 binaries, so no luck running them on ARM CPUs.


People can't build new containers with the appropriate binaries?


They can, but typically, as a developer, you want to run containers with images that you'll later deploy to servers. And servers run on x86...
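
For what it's worth, the usual way around this is to publish multi-arch images, so the same tag resolves to an arm64 image on an M1 and an amd64 image on the servers. A minimal sketch driving `docker buildx` from Python (the image name and registry are placeholders, and it assumes buildx and QEMU binfmt emulation are already set up on the build machine):

    import subprocess

    # Hypothetical image name; replace with your own registry/repo.
    IMAGE = "registry.example.com/myteam/backend:latest"

    # Build one manifest that contains both an amd64 and an arm64 variant.
    # On an M1 the arm64 variant is built natively; the amd64 variant goes
    # through QEMU emulation (slow, but only at build time).
    subprocess.run(
        [
            "docker", "buildx", "build",
            "--platform", "linux/amd64,linux/arm64",
            "-t", IMAGE,
            "--push",  # push, so the multi-arch manifest ends up in the registry
            ".",
        ],
        check=True,
    )

With that in place, `docker pull` on the laptop and on the servers picks the matching architecture automatically; only images you don't control remain an x86-only problem.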


I would think most developers in the world are using corporate laptops. Don't think we'll ever see a fully unsupported OS installed on those.


There's plenty of companies that operate on the business model of selling 'supported' linux to the narcs in corporate IT.


Running containers outside a VM is my use case too.

Would love to see some performance numbers.


What is the advantage?


Docker Desktop for Mac runs Docker in a VM. It has poor disk I/O compared to native, and IIRC it is not as capable as native from a networking perspective (perhaps not a big issue). Also, as of recently it costs money to use Docker Desktop for Mac for work purposes.

So the advantage is you get better container performance, all the capability you would normally get with container workloads on their native platform (Linux), and you wouldn't have to use the Docker Desktop product anymore.
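
A crude way to see that bind-mount penalty for yourself is to time small-file writes against a volume-mounted directory versus the container's own filesystem. A rough sketch, assuming you start the container with something like `-v "$PWD":/mnt` (the paths, file count, and sizes are arbitrary):

    import os
    import time

    def write_files(directory: str, count: int = 500, size: int = 64 * 1024) -> float:
        """Write `count` files of `size` bytes each and return elapsed seconds."""
        os.makedirs(directory, exist_ok=True)
        payload = b"x" * size
        start = time.perf_counter()
        for i in range(count):
            with open(os.path.join(directory, f"f{i}"), "wb") as f:
                f.write(payload)
                f.flush()
                os.fsync(f.fileno())  # force the write through the file-sharing layer
        return time.perf_counter() - start

    if __name__ == "__main__":
        # /tmp lives on the container's own (native) filesystem;
        # /mnt is assumed to be a bind mount shared from the macOS host.
        print("container filesystem:", write_files("/tmp/io_test"), "s")
        print("host bind mount:     ", write_files("/mnt/io_test"), "s")

On Docker Desktop the bind-mount numbers are typically far worse than the native ones; on a Linux host the two are essentially identical, which is the point being made above.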


In addition to performance, I also ran into weird bugs with the Docker Desktop VM. The VM would run out of disk, among other things that would not occur with native Docker.


I was constantly seeing a coworker at my last job in that sort of situation. For some reason Docker on MacOS was inflating RAM use, IIRC.


It also has weird bouts of high CPU usage, even if running containers are not busy. I think it's due to overhead with osxfs handling local disk changes (e.g. you've mounted a volume with a large git repo, and in that repo on the host you switched to a much different revision). It's hard to troubleshoot because it's not obvious how to get into the VM to see what's going on.

FWIW, https://stackoverflow.com/a/68561693/3230028 is how you do that.

Overall, this is an issue that I think goes beyond Docker on a Mac. We have multiple examples of development targeting one platform, or involving tools native to that platform. But because of preference, or conflict with other tools, or policy, we end up with these emulation or compatibility layers like HyperKit/Docker for Mac, or WSL/WSL2, or WINE, or frankly WebAssembly/Emscripten/whatever. And I think the result is almost always worse than if people just found the most native set of tooling for their primary use case and used that.

What is it about macOS that makes Docker Desktop for Mac worth it, if your development workflow is heavily container-based? At my workplace, it's mostly policy and preference issues making Linux unsuitable for the majority of the user base, even though that majority is working with containers constantly and targeting Linux.

I guess you can pick at this and point out that we're not writing software in assembly for a reason, but I feel like there's a line where you have too much abstraction, adaptation, or emulation, and for me workflows built around something like Docker Desktop for Mac just so we can use macOS are over that line.


Docker on the Mac is 10x slower for some of my workloads than Docker on my linux cloud machine. (e.g., npm install is literally a few seconds vs 5 minutes)


Rich countries' developer market; in other tiers, not so much.


> It does open up some niches

A laptop that doesn't throttle down when unplugged from the wall, yet still maintains all-day battery life, is hardly a niche.

Laptops are supposed to be portable computers that work anywhere, not just luggable computers that must be plugged in to work.


>Laptops are supposed to be

By whose definition? There have always been trade-offs to achieve that portability. Processing power has always been one, mainly due to the electrical power demands. We're just now getting to battery tech that is impressive. It has taken this long to get to processing abilities that don't require being attached directly to a power line just to idle.

If "supposed to be" means we all have been "hoping and wishing one day" it might be possible, then sure, "supposed to be" it is.


>Processing power has always been one mainly due to the electrical power demands. We're just now getting to battery tech that is impressive.

The M1 Mac laptops are using the same battery tech as everyone else.

What has changed is the ratio of performance to power draw, and leaving behind the almost immediate thermal throttling you see in x86 laptops.

>The chips here aren’t only able to outclass any competitor laptop design, but also competes against the best desktop systems out there, you’d have to bring out server-class hardware to get ahead of the M1 Max – it’s just generally absurd.

https://www.anandtech.com/show/17024/apple-m1-max-performanc...

High performance, combined with all day battery life is a new thing, not a niche.


M1 Max Macs that have enough memory and processing power to be a mobile workstation can drain their battery in 1 hour if running at 100% CPU usage, so if you need all that performance, you will still need to be close to a wall outlet. If you don't need all that performance, then there are other laptops (not Apple) that can last long enough between charges too. The M1 Mac is currently an excellent laptop for regular users, but real power users either have to be close to the wall or use desktop CPUs to run their workloads. And no, web and software development does not require a lot of CPU power; I know, I am a developer, and I develop software for people who do need to run at 100% CPU for hours and hours, and an M1 laptop will not last them any longer than an i7.


>there are other laptops (not Apple) that can last long enough between charging too.

The reviews don't agree that the "not Apple" laptops have anywhere near the same performance and battery life.

>the new MacBook Pros with M1 Pro and M1 Max chips are incredible — the fastest laptops we’ve ever tested in some tasks, with some of the longest battery life we’ve ever seen.

https://www.theverge.com/22751921/apple-macbook-pro-14-16-in...


They can have the same battery tech, but if the rest of the hardware is less efficient, then the battery will not have the same "performance"


>can use their battery in 1 hour if running at 100% CPU usage

But I'm okay with that. I can be working at home/work and then move, pick up at a new location, and go right back to work without losing performance. Just need a plug.


> But we're pretty much a rounding error for the duopoly that owns the desktop OS market.

We are a rounding error when it comes to casual usage; but in the pro segment and especially for coding, Linux usage seems nontrivial.

According to Wikipedia [1], Linux had a 2.33% share of laptop/desktop OS usage. But in a Stack Overflow survey, 25% of programmers picked Linux.

[1]: https://en.m.wikipedia.org/wiki/Usage_share_of_operating_sys...


I run a small Linux compute cluster for a research institute with a ~8 kW budget. If it ran M1s, I'd have 5x more cores to work with (and the cores would be faster). How niche is this?
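
The arithmetic behind a claim like that is straightforward; what you get out of it depends entirely on what is currently in the racks. A back-of-the-envelope sketch with illustrative, assumed figures (none of these are measurements):

    # All numbers below are assumptions for illustration only.
    POWER_BUDGET_W = 8_000

    # Hypothetical existing node: older dual-socket x86, ~400 W at the wall, 28 cores.
    x86_node_w, x86_cores_per_node = 400, 28

    # Hypothetical M1-class node: ~40 W at the wall, 10 CPU cores.
    m1_node_w, m1_cores_per_node = 40, 10

    x86_cores = (POWER_BUDGET_W // x86_node_w) * x86_cores_per_node
    m1_cores = (POWER_BUDGET_W // m1_node_w) * m1_cores_per_node

    print(f"x86 cores within budget: {x86_cores}")
    print(f"M1 cores within budget:  {m1_cores} ({m1_cores / x86_cores:.1f}x)")

With these made-up numbers you land around 3-4x; swap in your own per-node wattage and core counts to see whether 5x holds for a given cluster.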


>How niche is this?

You, my friend, are the most special of special snowflakes.


Extremely niche


> It does open up some niches,

I would love to be able to run Nuke, so definitely one of those niches. There's not much *nix only software that can't be run on macOS, but there's definitely lots of macOS software not able to run on *nix. Being 1 reboot away from using whichever is needed is dream a little dream territory.


I agree with this sentiment. Overall, we should be praising projects like frame.work over proprietary and hard to port systems like the M1. I'm hoping some day the frame.work laptops will have a mainboard available similar to the M1, but I'd rather have repairable than not.


Agreed.

This project would open things up if it did something novel.

IMO a Linux distribution is the perfect base as a metaverse client for the entire internet.

Ditch the window manager that only acts like a desktop metaphor and log in to a 3D-capable viewport. Toggle between 2D and 3D representations, virtually load websites. Like applying AR to cyberspace. Natively relying on that ML-friendly GPU.

Something like Godot as the window manager process (controls abstracted behind a traditional default UX or something) and hacking away at its scene tree format. Update UX state to be a 2D UI if needed.

Storing the contents of a file as a hash to regenerate it, like procedural game engines do, would improve security if the user's login unlocks things. /home need not be a traditional filesystem at all.

There are a lot of ideas going unexplored due to the money being thrown at business as usual problems.


20 years ago IBM PowerPC was still a contender too. With Apple no less.

The x86/64 solo reign was more like 15 years.

But I miss it too. The 90s with all its amazing architectures. SPARC, Alpha, MIPS, PA-RISC, PowerPC. I still have several of those here at home :) Computers have become boring and it's nice ARM is shaking things up.

I wonder if M1 will ever be fully supported though. With full unrestricted video, 3D and AI acceleration. There seems to be a lot of Apple secret sauce in these processors.


> with all its amazing architectures. SPARC, Alpha, MIPS, PA-RISC, PowerPC

And here's the thing, mostly they weren't amazing. They were just expensive and not popular. But some things like SPARC were just headache-inducing

https://atiqcs.wordpress.com/2016/11/15/sparc-register-windo...


Every architecture has its quirks. But compared to x86 and its legacy, most RISC architectures were quite nice. And the point is, you had multiple competing architectures, so there was a chance to try out new ideas and find out which of those were actually good.


It was pretty amazing to enjoy 64-bit computing before Intel had it. They were more popular in some niches, for a time, than anything else.


Yes Intel's attitude is too much based on marketing sometimes.

"Consumers don't need 64-bit" (and trying to promote Itanium)

"Consumers don't need ECC RAM"

It holds back the industry now that they are the only PC platform.

PS: I think Itanium was a really good idea, but again marketing made it unviable. They wanted to position it purely for servers, just at a time when there was a real cost focus on servers using commodity hardware (e.g. from Google).


> And here's the thing, mostly they weren't amazing. They were just expensive and not popular.

I agree with this part, but the sparc register windows are completely reasonable and obvious.


It might be reasonable, but it's a chore, especially on Linux, to work with it https://news.ycombinator.com/item?id=23709902


"Avoiding needing to spill registers to the stack" is a focus of much optimisation work in lots of places - and, yeah, it's absolutely work but the performance gains can still be worth it.

Plus I'd argue that SPARC is enough simpler to work with than x86 in other ways that you could look at it as "spending your complexity budget somewhere else" more than it being more headache inducing.

(though admittedly I cut my teeth on assembly on an ARM2 so basically every modern architecture is kinda headache inducingly complicated to me ;)


> The x86/64 solo reign was more like 15 years.

Solo, yes, probably 15 years. But domination? Probably starting with the IBM PC in 1981 (ok, make that 1982 or 1983 to allow for the sales ramp up) until 2020. That's a very long run for a computer architecture.

I expect that the next architectures to rise to the top will be even more entrenched.

We're way past computing's early years, childhood and adolescence.


Yes, PCs have been around since 1981, but if you looked around, they were not that commonplace outside of businesses and quite rare in private use. There was the age of the home computers: first 8-bit (Atari, Commodore, TI), then the 68k machines, the first ARMs. In parallel, all the great workstation vendors with their RISC chips.

The PC for home usage really started only in the '90s; scientists would use workstations. It was only towards the end of the '90s that x86 caught up with them, and well into the 2000s when it overtook them. It was actually AMD who dealt the killing blow with the Athlon and especially the Opteron. At that time, Intel was pushing Itanium as the architecture for professional usage and kept x86 at 32 bit with more game-optimized processors (P4).


No offense, but that is all either wrong or ahistorical. You write about personal computing history like someone who experienced it after the fact and solely through blog posts. (are you writing about the PC revolution as you experienced it in some country other than the United States?)


Not sure what's wrong with OPs account. Care to give examples?

It roughly aligns with my experience during that time.


> Not sure what's wrong with OPs account.

Pretty much everything.

The OP made a number of statements about the various platforms' marketshare over time which were inaccurate if applied globally, but might have been accurate in a certain country, hence my question to them. All of this in service of an idea - that x86 hasn't been dominant for four decades - which is so absurdly counterfactual that it's silly to argue about. I mean, you guys know that these sales figures are public, right?

Regarding the assertion that PCs were ever "quite rare in private usages." There are some pretty good charts of the various home computers' marketshare available here:

https://arstechnica.com/features/2005/12/total-share/4/

https://arstechnica.com/features/2005/12/total-share/5/

https://arstechnica.com/features/2005/12/total-share/6/

It occurs to me after sharing those that one could look at those charts and assume all those IBM PCs were going to businesses (that's an AWFUL LOT of businesses!) but I don't have all day to spend on gathering this data... research game sales in the mid eighties if you have doubts about this! They were servicing a dying C64 and Apple II market, and a rapidly growing x86 market. x86 did not wait for nineties (how is this even an argument... jeez).

I still have the Athlon XP 3200+ system I built, but AMD didn't deal any kind of "killing blow" with anything. That's extremely silly. AMD did an admirable job of forcing 64-bit adoption on the PC a bit sooner than Intel would have liked (and driving multicore!), but x86's domination in the marketplace did not have to wait for that.

In terms of marketshare, ARM didn't matter much until it became an embedded standard. Regarding 68k, which the OP brought up for some reason, I offer this:

https://arstechnica.com/features/2005/12/total-share/7/


AMD did deliver the killing blow to Sun. At least at the company I worked for then, it was firmly Sun for the large compute servers until the Opteron arrived. PCs were nice as desktop machines, but large servers with multiple CPUs and lots of gigabytes of RAM were not feasible on x86 until the arrival of the Opterons. Eventually they would replace all SPARC-based compute servers.


By the time the Opteron came out in 2003, Google was already a large company. My guess is they were already the largest search engine in the world -- and they ran their web crawler and search engine entirely on PCs.

Hotmail was bought by Microsoft in 1997, then Microsoft promptly announced that the service would be switched over to Windows. Till then, according to Wikipedia [1], Hotmail ran some Sun boxes, but also a lot of FreeBSD, and I am almost certain that the FreeBSD was running on PCs. They were probably the largest email provider in the world in 1997.

The Apache web server "was the most popular web server by Spring 1996 and stayed like that until the Summer of 2014" [2]. It ran and runs almost exclusively on Linux, which in turn ran and runs almost exclusively on x86.

1: https://en.wikipedia.org/wiki/Outlook.com

2: https://blog.cloudware.bg/en/the-history-of-the-apache-http-...


Indeed, Intel got caught flat-footed and was pushing 64-bit only on Itanium, at a substantial price premium. AMD was first to market with the x86-64 instruction set and did quite well.


I wouldn't short-change AMD by saying their big success during that period was about the 32 vs 64 bit issue, or Itanium. They made the fastest 32-bit x86 chip in the world with the Athlon K7, and they did it four years before they launched their 64-bit chip.


Sure, but the Opteron doubled down on it. They added x86-64 to a server-class chip for the first time and they moved the memory controller on-chip, which made the AMD chips scale dramatically better under a variety of workloads.


I believe a lot of people had the experience your company had at that time. But...

x86 (and Linux and Windows) started killing Unix and the other architectures a lot sooner than Opteron. At some point in the late nineties, SGI's workstation people pretty much curled up on the floor in the fetal position, moaning, "Windows NT, Windows NT." That spectacle was downright undignified, although the NT box they produced was impressive in its way. (They made some nice contributions to Linux nevertheless. Would that they had taken Linux even more seriously.) It says something about their outlook on the future of MIPS, as well as UNIX, that SGI designed their Visual Workstation using Pentium in an era when Windows NT on MIPS was still a thing.

You can't quite say that AMD64 killed Sun. Sun actually made some decent, if overpriced, Opteron stuff. Sun's demise is a fun thing to discuss because former Sun employees often have an interesting opinion about where Sun went wrong. I'm waiting for the guy who says something like "yeah, that was my department's fault. We blew it and the company failed." So far I haven't seen that.


So am I. OP seems OK.


Well, sure, I experienced it in a country other than the United States. Only 4% of the current human population lives in the United States. To be precise, I am from Germany and lived through all the times I described. While PCs appeared in the '80s, it was not until the very late '80s and early '90s that they had some significant home usage. And as far as I know, Apples were very popular in the US at that time. In Europe they were not very common, due to even worse prices than today :p. But my school had some Apple IIs, which I eventually used with Pascal.

But for home usage, the "home computers" named before were popular. Universities used Suns till the late '90s.


I don't think the US home computer market followed the same path as Germany's. 68K-based machines never became huge sellers over here, for instance, and the Macintosh was the most successful of the bunch -- whereas it's my understanding that in a lot of Europe, the Amiga and the Atari ST were serious contenders even on into the '90s. Conversely, IBM PC clones had taken the lead in US sales by the end of the '80s and just never had any serious competition by the early '90s. It's not an exaggeration to say that Radio Shack was selling more Tandy 1000s than the Amiga and the ST were selling combined over here. (There were games with specific Tandy 1000 "enhanced graphics and sound," so it was actually considered a viable market all on its own!)

> Apples were very popular in the US at that time

Well, the Apple II line was popular in the US in the late 1970s through the mid-1980s -- at one point it was the best-selling 8-bit computer, taking the throne from the Radio Shack TRS-80 -- but the Mac was absolutely not a big seller in the 1990s; Apple survived because they dominated a few vertical markets like desktop publishing. For home computing, the first really successful Mac was arguably the iMac circa 2000.


> While PCs appeared in the 80ies, it was not until the very late 80ies and early 90ies until they had some significant home usage

That might have been true in Germany. It wasn't true in the United States. IBM PC (and moreso, PC clone) adoption was quite strong in the home from the mid eighties. That's all I meant by the question about nationality - I really do think there's a difference there when it comes to market share. (I probably shouldn't beat the subject to death but it was an interesting discussion)


I was responding to the original OP who mentioned Intel only.

I doubt it'll remain that stable. The mobile OSes have already embraced a number of platforms, not even all ARM based. Android seems to be quite flexible regarding architectures with its pre-compilation. I think this is a sign of what's to come.

RISC-V is also an upcoming player and if it's successful it may spawn more fully open contenders. As we move to more AI integration there's a whole new lifecycle opening up too in terms of ML coprocessors. We're in the same situation as early computers with multiple vendor-specific solutions.

I think on the security side there'll also be more hardware signature checking rather than the chain-based checks of Secure Boot. Rather than the OS checking if a program is legit, the CPU could do it (already done on some custom hardware like consoles).

So I don't think computing is really mature at all. It just has had a stable phase for a while.


> With the Apple Silicon, there is now a real contender, actually surpassing the current x86 offerings in many aspects

It's absolutely NOT a real contender for widespread use until you can buy a mini-ITX, microATX or regular ATX motherboard from any one of the dozen well-known Taiwanese motherboard manufacturers, and an individual CPU to socket into it. Or at least a selection of motherboards with CPUs soldered onto them from the same vendors.

The hardware availability is basically a walled garden.


I don't think whether you can buy it in a particular form factor is a good indicator of "contender for widespread use," though. It ignores any technical merits and what the chip can actually do. Can it enable someone to check email, watch YouTube, and check social media? Yes. Can it render graphics? Yes. If you put someone in front of a MacBook with an M1, can they accomplish everything they can on an Intel machine? Yes.

Now, is it probably priced out of being a real contender for widespread use? Most likely. Is it offered in configurations that suit everyone? Maybe not. But that doesn't mean it can't accomplish the same or similar tasks. If someone can sit at a computer and accomplish all of their normal tasks, then for the most part, it is a contender for widespread use; it is just a cost factor.


The subject matter of the parent is widespread Linux use, which is rather unlikely if the hardware is only available from one vendor.

Yes, iPhones are mainstream in terms of market share.

No, iPhones are not mainstream in any context related to use of one's own choice of GPL-licensed software.


it'll happen eventually :)

https://www.youtube.com/watch?v=lxdRSCQfhyw https://www.solid-run.com/arm-servers-networking-platforms/h...

- Layerscape LX2160A 16-core Arm Cortex A72 (up to 2GHz)
- up to 64GB DDR4 dual channel 3200MT/s
- 4 x SATA 3.0
- 1 x PCIe x8 Gen 3.0, open slot (can support x16)
- 4 x SFP+ ports (10GbE each)
- 3 x USB 3.0 & 3 x USB 2.0
- GPIO header
- 170mm x 170mm standard Mini ITX form factor


Isn't that much harder with architectures without a BIOS equivalent? The components on the motherboard have to be told how to talk to each other. Swapping out the CPU would require some reconfiguration.


I bet Apple will sell order(s) of magnitude more MacBooks than the combined total sales of motherboards in those form factors, so your definition of widespread is surprising to me.


Sure, but Apple's proving what is possible. It's only a matter of time until similar chips ship from the competition.


It's coming.


In a few more years RISC-V may become a thing for general use as well.


But will it be competitive enough?


I often find myself using ARM on the desktop nowadays. I use Android phones, Nvidia devices, SoCs (most famously the Raspberry Pi).


Linus did use a MacBook Air quite happily for several years. Running Linux, of course.


> With the Apple Silicon, there is now a real contender

Yeah, no. It's a walled ecosystem, not a 'contender' in the sense that it can't be horizontally integrated with other technologies.


It's not a walled ecosystem; people really need to stop saying this.

The iPhone is a walled garden. Macs are not a walled garden.


It is. You can’t buy Apple Silicon without buying everything else including a laptop and macOS. It’s entirely vertically integrated from iCloud account to transistors.

A real contender would be Amazon’s Annapurna Labs with their ARM processors or something with RISC-V.


The big problem is that you can't buy an M1 CPU like you can, e.g., an i9. I think Apple should be forced to open up their platform so other manufacturers could make laptops or desktops with that CPU.


I was surprised by glxgears/OpenGL running, but in a later tweet I read this:

> It's been running the glxgears demo (60% all-core CPU usage)

Looks like it'll be a while before this thing runs Linux with anywhere near acceptable performance if glxgears still runs in software at 60% CPU.


Yeah, it's LLVMpipe; it's basically doing everything in software, and Plasma uses quite a bit of OpenGL for animations and compositing. A lighter WM like Fluxbox could maybe be easier on the CPU. In any case, looking back at Nouveau, writing decent drivers for a GPU nobody has specs for is definitely challenging. Only time will tell.
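
If you're curious what the LLVMpipe experience feels like on hardware that already has a working GPU driver, Mesa lets you force software rendering with the LIBGL_ALWAYS_SOFTWARE environment variable. A quick sketch to confirm which renderer you're actually getting (assumes Mesa and the glxinfo tool from mesa-utils are installed):

    import os
    import subprocess

    # Ask Mesa to ignore the hardware driver and fall back to llvmpipe.
    env = dict(os.environ, LIBGL_ALWAYS_SOFTWARE="1")

    out = subprocess.run(["glxinfo"], env=env, capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "renderer string" in line:
            print(line.strip())  # expect something like "OpenGL renderer string: llvmpipe (LLVM ...)"

Running glxgears in the same environment gives a rough feel for how much CPU a composited desktop burns without a native driver.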


Working off imprecise documentation or reverse engineering everything is nothing new in the Linux world. At least on the M1, drivers seem to be able to exert full control over the card, compared to newer Nvidia cards where firmware signed by Nvidia is required to boost the card to a reasonable clock frequency!


I'll bet that's thanks to the mining market buying cheap cards and overclocking them into the ground. Very annoying for the wider market, certainly...


The % usage is not indicative of anything considering it will use as much CPU as possible to reach as high a frame rate as possible


I think by default glxgears has vsync enabled. But it's trivially possible to disable it with an env var. Considering the CPU usage I assume that is what was done. 60% CPU usage for running glxgears at 120Hz seems excessive.


There is no vsync with the dumb framebuffer backend I'm using. We have a real display driver, but it needs rebasing and adapting to work on these laptops (it was developed on the Mac Mini).


I always remember glxgears running at hundreds of FPS and it being used as a poor man’s benchmark. Perhaps this has changed of late though…


I always have had to set `vblank_mode=0` to get it to run as fast as possible.


He also mentions: "buttery smooth software only rendering"


If networking and KVM support exist for this (I think they do), then these would make great servers as well.


The only motivation for me to own Apple hardware in 2021 is the ability to use FOSS software in the natural habitat of Linux.

I don't trust Apple's software and "ideas" of computing at all. Just another big company with data gathering ambitions. After Catalina, macOS is a total joke. https://www.nytimes.com/2021/05/17/technology/apple-china-ce...

So let's hope that running Linux on M chips will be possible soon.


> After Catalina MacOs is a total joke

I'm running Monterey now, and nothing has changed for me between Catalina and Monterey. The security changes (like kernel extensions, some paths, etc) and deprecations (32-bit apps), that broke some of the software/devices I had, were already present in Catalina. What damning difference do you see in Big Sur and Monterey?


> What damning difference do you see in Big Sur and Monterey?

The UI/UX, I presume.


I must "OS" differently than GP. The changes were subtle and inconsequential, from my perspective. In fact, it made some window management software I had obsolete!


But if you don't want macOS, why wouldn't you buy an $800 laptop and not have to worry about ARM compatibility?


Because it's not a huge concern and compatibility has been making extremely fast progress.


I've been watching marcan's stream on YouTube over the last couple of days, where he had to fix a lot of stuff so this notebook could come to life. If you are interested in ARM/bootloaders/Linux/drivers, I highly recommend the stream recordings.


Mind linking me to their channel?



I had the same question and I found this: https://www.youtube.com/channel/UCxS98ISZNcuaJRCvy6JV6Fw


Are there any comparable non-Apple laptops which are well supported by Linux? I'd hate to pay the Apple tax, only to run Linux on the bare metal. Plus, I'm assuming "Apple care" goes out the window once you wipe out MacOS?


I'm not aware of any machines with comparable performance/efficiency from any other vendor.

Apple officially supports running your own kernels on these machines. This isn't a jailbreak, it's an official feature of the hardware and firmware design that Apple added, and which their EULA allows you to use in this way.

These machines are also not brickable, you can always do a DFU restore from macOS or (soon; Monterey needs some stuff I already figured out but haven't implemented yet, but it works for older versions on the M1s) Linux using idevicerestore, no matter how much you wipe or screw up.

(That said, I wouldn't recommend wiping macOS at this stage; we expect most users to have a dual-boot setup.)


I used to rant about paying so much extra for Apple hardware and call it the "Apple tax" too, but the truth is, no one else makes good laptops. You're paying for more than simply the brand.

I spent a fortune on a Lenovo X1 Extreme, and the thing bricked itself three times in 14 months, the last of which was outside the warranty window, and it's now basically impossible to get repaired (I couldn't even get anyone to look at it, and Lenovo is no help at all). Even if it weren't bricked, the shitty plastic case is already getting loose (the display hinge is loose/wobbly and squeaky); the speakers sound terrible; it's loud; it gets unbearably hot.

Dell laptops are pretty good with Linux, but the build quality is similarly poor. Linux-first laptops like System 76's are clunky and inferior.

It's really hard to beat MacBooks on a hardware level. I'd love to run Linux on one.


This has been my fear as well when looking at other laptops.

In terms of hardware and build quality, Apple is crushing it with the M1 MBP.

However, I just received my Framework laptop and am totally excited by it. The build quality is pretty good compared to my 2019 15” Intel MBP.

It feels like a hunk of metal, similar to the MBP. I love the size, the monitor is great, the keyboard is much better, and all the parts are replaceable. Running Arch with zero issues so far (although I haven't tried Bluetooth or the fingerprint reader yet).

The battery doesn't seem great (8hr real use) and the speakers are not even close to my MBP.

Other than those two items (both of which are fixable with battery pack + headphones) I find the offerings very similar.


> The battery doesnt seem great (8hr real use)

Eight hours of real use is fantastic. I've never gotten more than eight hours out of an MBP while doing actual work; it's usually more like 4-5 hours if I'm doing things like running Docker and frequently doing builds and running tests. Laptop manufacturers stretch their battery life claims like crazy. My MBP will last 12 hours if all I'm doing is using a terminal to SSH somewhere and doing very light web browsing, but if I'm doing dev work, forget it.


From my experience, a notable percentage of Dell laptops (especially the Precision and Latitude series) have decent Linux support. I have never been an owner of a laptop from System76 or any other brand that is targeting Linux users. However, I am really interested in getting a Framework Laptop [0] for myself, which looks nice and about which I have already read good opinions.

[0]: https://frame.work/


Apple Care covers hardware issues so installing Linux shouldn't void that unless you physically open up the laptop to do some weird thing.


Not exactly comparable and not ARM, but https://puri.sm/products/librem-14.

Another one is ARM, but very under-powered: https://www.pine64.org/pinebook-pro/.


After running ubuntu on a Dell XPS 15 with varying degrees of success the last four years, I placed an order for a Framework laptop that will be shipping this month. I don't expect the build quality to match that of apple, or even dell, but I've read linux support is good, and will only get better with more dedicated people on the platform.


My Framework is amazing and probably one of my favorite laptops in quite some time. I think the build quality is great all things considered


I just received mine while using a 2019 15” Intel MBP. Honestly I was shocked at how great the Framework is. The hype is real.

if mbp is a 10:

monitor 9

trackpad 8

keyboard 15

case 9

ports 20

battery 6

speakers 5

I am shocked by how well the trackpad works. I haven't messed with gestures yet, but the precision seems on par to me.

Linux support is first class.

I'm very happy with my decision. The M1 MBP is definitely better overall, but I would choose this laptop over it in a heartbeat.


It's a different aesthetic, but T- and P-series ThinkPads are well supported by various Linux distros and the experience is considerably smoother (and with better battery life) than 5-10 years ago.


Lenovo ThinkPad X1 Carbon with Linux


It seems they're at the stage where they could do with more development help...

For example, bringup of the wifi can probably happen in parallel with GPU work...


WiFi already works, but the existing patch (which was written by Corellium as part of their throwaway M1 Linux PR stunt earlier this year) needs a complete rewrite, among other things because the way it handles firmware is completely backwards. I know how I want to do it, just need to spend a day or so on that one, maybe another day on hooking up firmware copying into the installer :)


Wondering how the GPU is enabled. I thought the Apple Silicon GPU is largely a black box. Or has it already been reverse engineered to a certain level?


> Wondering how GPU is enabled

For this demo it's not. It's doing CPU rendering, which the M1 is apparently fast enough to do while providing decent UI performance (albeit while using 60% all-core CPU to do it).

> has it already been reverse engineered to a certain level?

It is. See:

- https://rosenzweig.io/blog/asahi-gpu-part-1.html

- https://rosenzweig.io/blog/asahi-gpu-part-2.html

- https://rosenzweig.io/blog/asahi-gpu-part-3.html

- https://rosenzweig.io/blog/asahi-gpu-part-4.html


Serious question since I don't know much: how can the display be used without the GPU?


The framebuffer doesn't have to be on the GPU side (and the M1 has a unified memory architecture). So the role of the GPU is just to make graphics-related computation faster. Things like composition and display output are kinda orthogonal.
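
Concretely, with the dumb framebuffer the kernel just exposes a chunk of that memory as /dev/fb0, and anything on the CPU side can write pixels straight into it. A minimal sketch that fills the screen with a solid colour (assumes a 32-bits-per-pixel XRGB framebuffer, ignores row stride/padding for simplicity, and needs permission to open /dev/fb0):

    import mmap

    # The kernel reports the framebuffer's virtual resolution here as "width,height".
    with open("/sys/class/graphics/fb0/virtual_size") as f:
        width, height = (int(v) for v in f.read().split(","))

    BYTES_PER_PIXEL = 4                      # assumption: 32bpp XRGB
    size = width * height * BYTES_PER_PIXEL

    blue = bytes([0xFF, 0x00, 0x00, 0x00])   # one XRGB pixel, little-endian: B, G, R, X

    with open("/dev/fb0", "r+b") as fb:
        buf = mmap.mmap(fb.fileno(), size)
        buf[:] = blue * (width * height)     # a plain CPU memory write; no GPU involved
        buf.close()

That is essentially what LLVMpipe plus the dumb framebuffer backend are doing here, just with the pixels computed by a software OpenGL rasteriser instead of a constant colour.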


Wow, things like this blog series are exactly what I'm interested in! Thanks a lot!


I guess the global menu ignores the notch in the middle and displays sections anyway? The difference in fonts between KDE and macOS is visible here at first sight (of course I know it's not surprising...). And maybe that's what Apple should do - tweak the GUI a bit? Because from what I've read, people complain that menus are getting cut off.


> I guess the global menu ignores the notch in the middle and displays sections anyway?

Yep, but that's not really an issue on KDE since most widgets will avoid the center of your bar unless you explicitly tell something to display there. I'm sure some frustrated MBP owner will push a fix for it in the future though, it's just a matter of time.

> The difference in fonts between KDE and macOS is visible here at first sight

Well, in this screenshot it is. As you can see, there's no GPU acceleration on anything here, which means they've probably got subpixel AA disabled (and obviously aren't running San Francisco system-wide). You can configure the two to be frighteningly similar though, in my experience.


Subpixel AA has nothing to do with the GPU; not sure if it's enabled by default or not though, I didn't touch it, but it is pretty irrelevant with high-DPI screens. Apple does not use subpixel AA by default any more for this reason (and because they often use scaled non-native resolutions the way their UI scaling is designed).


It never fails to boggle my mind. The sheer amount of work, time and money that goes into "freeing" this hardware from the clutches of Apple's walled garden. Truly remarkable, but maybe it's just a first-world problem.


>"So it basically ran for a whole day off of battery doing hardware experiments. Still had 30% left."

Are there metrics comparing efficiency / battery life of ARM linux and OS X, on the same hardware? This sounds promising.


This wouldn't be representative yet. Software rendering will kill the efficiency. We'll need to wait for the GPU to be properly supported.


Even without the GPU, it sounds like the M1 is quiet and efficient, more so than any HP/Dell/Lenovo laptop I've used. Most heat up quite quickly doing anything, and the battery life quickly drops once you go past "light web browsing".

If someone can pull off the GPU in Linux on these, I have a feeling they'll be some beastly Linux dev workstations too.


I just picked one up yesterday, I pulled it out of the box at 70 percent, installed all my crap, set it up to my liking, used it all afternoon and evening, closed the lid, went to bed, and now I'm using it to type this comment and the charger is still in the box.

It's frankly incredible.


I love Linux but after 5 years of using Apple products exclusively, I have learned to hate everything except Apple hardware.

Linux on macbooks would be an amazing dream.


I don't get this. The only reason I feel Apple has an advantage compared to competition is software integration.

M1 performance in laptops is great, but that's only been true for a year or so; for the last 5 years Apple laptops have been hot garbage.

As the owner of a fully loaded 2018 MBP (i9, upgraded GPU, 32 GB RAM), I can without a doubt say it's the worst premium device I've ever owned. The battery runs out on me after a 1-hour meeting - and I have >100 cycles on it. The amount of heat and noise it produces is surreal, and that's after I opened it and cleaned up the vents (which clog easily) and added thermal pads to connect the VRM to the chassis (which helped significantly; prior to that the CPU would downclock so badly I couldn't use anything on my device after 15 mins in a Google Meet connected to a 5k monitor).

Not to mention all the bugs I had to go through with it - it wasn't until 6 months after I purchased the device that I could actually use my 5k USB-C monitor at full resolution - it only started working after an OS upgrade.

And the keyboards failing being a known problem they replace out of warranty because they recognise their design sucks (luckily I use external keyboard 95% of the time).

Apple hardware was mediocre at best for the price they charge, up until M1.

In the mobile space, they are faster but as someone who switched from an iPhone to Samsung - I can't really say it matters. Phones are good and well rounded but nothing spectacular, hardware is on par with Samsung.

Again using the Mac/iOS combo is really nice so the software integration is next level, but considering their business practices I refuse to get locked in to the ecosystem, it's just too limiting.

And Linux on M1 would likely be subpar to any premium x86 device, Linux support sucked even on x86 Macs.


I understand the frustration, especially if your first impression of Apple is a MacBook Pro from the 2016-2019 era - you've probably seen the worst MacBooks available and not the best of Apple.

There were some good things, the displays were excellent, the touchpad is still the best in class, and the size/weight were excellent.

But USB-C only was a step too far, the Touch Bar wasn't the right move for a Pro audience (i), there wasn't enough room for cooling the Intel chips, and the keyboard situation was farcical - it's the primary interface to a Mac (can you imagine the outcry if iPhones had touch displays randomly not working, registering extra touches, etc.?!).

I think Apple gets a lot of leeway because the 2008-2015 MacBook Pros were probably the best laptops you could buy.

I owned a 2009 MacBook Pro, which in my opinion was the best laptop I'd ever owned and never made me question the amount of money I spent on it. The 2016 MacBook Pro was the exact opposite (mainly due to the keyboard being so bad).

I'm glad Apple have come to their senses and course corrected. I do wonder though for people that have only seen the 2016-2019 era if they will bother to try Apple again...

(i) I understand it probably would've made it too expensive to produce, but I think the Touch Bar would've gone down well on a MacBook Air, where I would imagine there are a lot more hunt-and-peck typists that'd appreciate and notice what's being displayed on the Touch Bar. As a touch typist I never looked down to see the Touch Bar, so it was mostly wasted on me.


> I'm glad Apple have come to their senses and course corrected. I do wonder though for people that have only seen the 2016-2019 era if they will bother to try Apple again...

I'm one of those angry bastards, and I even own an M1 Macbook Air. The hardware is impressive, no doubt, but MacOS frustrates me so much these days that I cannot daily-drive it for my workflow. Plus, once you make the switch to Linux it's really hard to see Apple products as an upgrade anymore. You're giving away your freedom, and condemning yourself to paying $8 to manage your windows properly or $10 just to hide some statusbar icons. And when all is said-and-done, I can't move that statusbar to the left side of my screen... God it frustrates me endlessly. When I saw how Big Sur redesigned everything, I just gave up on the OS altogether. The thousand papercuts I feel on MacOS are reduced to a couple hundred on Linux, the majority of which I can automate away without worrying about some bigger company pulling the rug out from under me.

I really wanted Apple Silicon to be a barnburner for me, and I was hoping beyond hope that they would take the extra space savings to add an M.2 drive or an easier to repair chassis. At this point though, I think I'm contented to just stop caring. Apple courageously headed in a direction I'm not ready to follow in, so I cut them loose in exchange for all my sweet creature comforts. And how comforting it is.


I sympathize with many of your takes, but have you looked into Framework[1] laptops?

They're currently only Intel based, but there's a marketplace where you can buy or sell just the mainboards once the AMD, RISC V, or ARM64 models become available.

[1] https://frame.work/


Framework looks great! I actually have no real need to upgrade my hardware right now though, as all my devices still run fine. I'd be very interested in picking up a RISC-V model once it hits manufacturing though, they seem very promising.


This is why Apple went to their own CPUs: the poor thermals of the recent x86 chips. The recent chips are like the PowerPC chips of 2005.


Do you have any 5nm x86 chips that you could compare it to?


It's probably going to be a cool toy, but being able to use the hardware to its full potential and also having Linux on ARM Macs be your daily work driver is going to be a humongous challenge.

I wonder if it will ever happen outside of very limited use cases.


What makes you say that?


They need to reverse engineer the GPU, the ML cores, the security chips. They need to figure out power management and use it as well as macOS does.

Apple will just throw some crap random docs at them, best case scenario. They won't discourage it actively but they won't help them either.

The time budget for high quality engineers working on this will also be limited, as there's no market that I can think of, only itch scratching. Which generally takes you to "good enough/mediocre", not to "awesome for the general public".


> The time budget for high quality engineers working on this will also be limited, as there's no market that I can think of, only itch scratching.

That's why I started this project backed by a Patreon. I knew this couldn't be accomplished via itch scratching; it has to be a job (not quite full time yet, but a good chunk thereof).

FWIW: the userspace GPU driver is now passing 90% or so of the GLES tests running on macOS; only the kernel side is not started yet. The ML cores are a niche use case and broadly not useful for anyone not doing machine learning on Linux (but some people are interested in them and have started taking a look). And power management already works to a significant extent; in the next week or so I'll be submitting v3 of the PMGR driver and v2 of the cpufreq driver for upstreaming, and starting work on SMC (needed for battery/charging info). And the security chip (SEP) has already been poked at; the mailbox interface to it is shared with other chips and has already been upstreamed, and the protocol on top isn't terribly complicated. It's mostly a matter of building a little driver to give userspace access to it, and writing some tooling to use it.


Well, I wish you luck!

Unfortunately I'm a late pragmatist so I'm probably half a decade away from using your work :-)

https://tbkconsult.com/wp-content/uploads/2016/05/Crossing-t...


If you look at the progress of Asahi Linux, I am very optimistic that Linux on Apple Silicon will become a reasonable thing quite soon. Most of the basic hardware infrastructure is already supported, and GPU support is coming along nicely too (90% of the GL tests already passing). There might be corners of the chip not supported for a long time, but if there were no neural net acceleration, it would be a pity, but not a showstopper for all other tasks.


I don't understand Apple. Why don't they provide docs so thousands of devs don't waste time reversing their APIs? It would also bring some good PR to Apple. But it seems they are too concerned and don't want their users to use other OSes.


I agree, I think Apple would profit from directly supporting Linux on their hardware. Their main profit is in selling the hardware, and they would sell more of it, probably to a lot of people who wouldn't buy Apple hardware right now. The Apple Silicon chips are damn nice.


Apple likes to upsell their users on additional Apple products.

If you're not running their software, you're not in their ecosystem and they can't upsell you.


No doubt about that. But I think most people who would buy a Mac specifically to run Linux on it wouldn't see a Mac without Linux as an alternative; they would just get some PC hardware. If they buy a Mac for Linux, they might try out macOS too, and that would be an opportunity for Apple to expand its user base.


> Their main profit is in selling the hardware

Not anymore - it's now about selling services to get recurring income. E.g. App Store (to extort money from developers), Apple Care, Apple Pay and iCloud (from users) etc.

That is why they are locking down the Mac systems too, making them a closed platform like the iPhone / iPad. They have been doing this for a while now. The only thing that now differentiates an M1 Mac from an iPad or iPhone is that the bootloader is not locked on the M1, so you can still theoretically install another OS. (Of course, the practical reality is that due to the closed-box nature of the M1 Apple Silicon SoC, non-macOS OSes can't exploit the full benefit of the hardware and will always offer subpar performance.)


No, they are not locking down the Mac; Asahi Linux is a good example of that. Also, they are not locking down users of macOS. As I wrote in a separate post, they do have some incentive to sell hardware to Linux enthusiasts, who wouldn't have considered a Mac otherwise. And once they have a Mac, they would also try macOS and all the Apple offerings.


> No, they are not locking down the Mac

It is quite evident now that Apple has been locking down the macs:

1. The first few Intel Mac Minis allowed you some level of customisation of both the hardware (change RAM or HDD / SSD) and software (install other full featured OS).

2. Then came the Mac Minis with soldered RAM and SSD. You could no longer customise the hardware. Software was still customisable and you could still install other OSes. (Recall that Apple even offered free drivers for another OS, i.e. Windows).

3. The current generation of M1 Minis now doesn't allow you to customise either the hardware (everything is soldered) or the software (no other viable OS except macOS). Technically you can install other OSes, but the reality is that currently only crippled versions of Linux and xBSD are available, and practically the only full-featured OS available for it is macOS.

The only difference between the M1 Mac platform and the iPhone / iPad platform is the unlocked bootloader on the Macs. And the only reason they haven't locked it yet is that they know that as soon as they do, many will avoid it. They are just waiting for the M1 Macs to reach a critical level of acceptance before they do so. (And till then, the lack of an alternative full-featured OS to macOS works to their advantage.)

> Asahi Linux is a good example of that

Asahi Linux is a distro built on the existing ARM-based Linux. Apple Silicon uses the ARM architecture, so it is not surprising that ARM Linux could be ported to it. But it will never be as full-featured as macOS on the M1, because no device drivers exist for the M1 platform. While their aim to reverse engineer the hardware is laudable, I see it as a pointless endeavor, as it will take years and Apple doesn't want any other OS to run on it. From their perspective, any other full-featured OS interferes with the planned obsolescence built into the M1, and also cuts into the recurring revenue model built into the services in macOS (iCloud, App Store, Apple Care etc).


The Intel Mac Minis don't have soldered RAM, it can be upgraded, though it is deep inside the device. The M1 Macs have soldered RAM and SSD, but here there is a clear technical tradeoff giving much better performance.

Asahi Linux contains device drivers for the M1 Macs, quite a bit of their work has been accepted to the Linux kernel already. GPU drivers are pretty close too.

Apple explicitly allows other OSes to run on their hardware; they deliberately enabled that.


Playing devil's advocate:

Apple provides the complete package. If you're homebrewing on top of their PCs, that's your problem and you're actually more of a nuisance to them. They don't actually get any benefit from you and you, being an advanced user, are very likely to have demands that are not representative of the general population and that also risk being difficult to accommodate. What could you possibly do that the All Mighty Apple™ can't do better?


> They don't actually get any benefit from you and you

Actually, they don't get a penny from me, because I can't run Linux on the M1. I like MacBooks. I like OSX.

I’d love to buy one of those M1 and even keep OSX on it but I’m never going to buy a computer whose OS updates could be stopped at any time with no replacement possibility.


My Cyrix-based machines don't get software updates anymore, even with any modern Linux distros :'(


Playing devil's advocate's devil's advocate: Maybe Apple is trying to cater to advanced users again?

In the last decade or so, we've seen Apple flip-flop their designs and priorities between content creators and content consumers. If you can remember back to the 2000s, with the first-gen Mac(book) Pros, those were actually pretty remarkable computers, albeit very expensive and arguably overpriced from a performance per dollar standpoint. They were proper workstations -- if you could afford them. They also ran Windows via Boot Camp, with first-party drivers. Apple recognized the incremental value of supporting "advanced users" then, even if it was just a small niche.

Then the iPhone and iPad came on the scene, and quickly overtook not only the Mac market but the computing market in general. Content consumption, and App Store profits, thus became most of Apple's revenue, and their priorities shifted accordingly. Once Jobs died and Jony Ive took over, they tried to refactor their computing devices to be more like their consumption devices, changing the Macs from workstations to, I don't know what, designer furniture with screens? The Mac Pro (trash can) was obsolete within a year, the Macbook Pros went from fantastic workstations to experimental art featuring terrible keyboards and no ports and broken-by-design cooling.

Enter the present day. Jony Ive left (good riddance!) and they are again free to pursue the content creator market and cater to user demands, not the holier-than-thou vision of their annoying designer. The Macbook Pro is back with a vengeance (hopefully the desktop Pro soon), now featuring a vastly superior chipset, with a powerful & efficient CPU/GPU, ML chips, hardware encoders, etc. If only it had a USB-A port. Anyway, those are all signals Apple is starting to take the workstation market more seriously again.

The market has also changed in the intervening years. Any ol' laptop will suffice for most users. For some, iPads with a keyboard or Chromebooks are more than enough. For the remaining power users, what are their options? Surface, ThinkPad, Latitude, and now Macbooks again, finally. But that requires accommodations for "pro" workflows and tooling, and these early signs are encouraging, or they wouldn't have bothered much past the Macbook Air.

If Apple was able to support Boot Camp for Windows, there's no reason they can't provide similar support for other ARM operating systems, whether that's Linux or Windows for ARM. Parallels can already run Windows on a M1. Adobe has ported much of their stuff over to M1, Unity is halfway there, Unreal is thinking about it, Docker works... which is to say, the industry is excited about the possibility that Apple is going to start caring again. Hopefully. Maybe.

Personally, I'm also intrigued by the possibility of M1 in data centers, in rack mounts, with amazing performance/watt metrics. If that happens, surely Linux development on M1 will pick up alongside, and that may trickle back down to desktop users? One can hope. I don't think Apple has ever been in this situation before: where its own chips (as opposed to integration and design) risk disrupting the status quo. It's an exciting time. We'll see if it lasts...


> I don't understand apple why don't they provide docs so thousands of Dev don't waste time reversing their api.

It's exactly because they don't want thousands of Devs wasting time at all. Why reimplement something that already works well as a computing platform?

There is no good reason why they would want to do this. But it's not like they're blocking everything and locking it down. They just haven't published docs etc - which can carry a significant maintenance cost.


Shitty explanation. If they don't want people to waste time, they should give them what they already know.


The good thing seems to be that Apple doesn't intend to break how they talk to the hardware across generations. That simplifies _their_ driver development, too.


On the plus side though there is a relatively small pool of actual hardware targets that they have to make it work on.


True. But for example Nvidia GPUs have never been reverse engineered fully, more than 20 years after their introduction. Because Nvidia updates their tech, just as Apple will.


Nvidia actively sabotaged reverse engineering efforts with their firmware shenanigans, and also Nvidia hardware is way, way worse to support than Apple. Apple actually makes very clean SoCs with neat software interfaces and keeps compatibility across SoCs except when they need to break things. Case in point: I brought up our existing M1 efforts on the M1 Pro in 4 days, and that involved changing several drivers because they actually made a big compatibility break in order to support more cores and more RAM (the new versions support over a terabyte and probably dozens of cores, so I doubt they'll change them again for quite a few chip generations).


Not OP, but it's most likely that Apple's software is designed to take full advantage of the hardware; to do the same with Linux would require (1) a lot of funding, and (2) documentation of the hardware, which, as I understand it, Apple is currently not willing to provide. So the folks who work on this have to do a lot of reverse engineering.


Honestly, I use Linux on a 2013 MacBook Air. 4 gigs of RAM and a Haswell i5, and it's an amazing machine. It's not my work machine for compiling and running a ton of apps, but it's super solid. Great battery life. Runs Chrome amazingly and Stardew Valley great.


You can run it in a VM


I just had a crazy idea: Apple selling systems with a nice Linux distribution that included full support for iCloud. Apple would lose money on App Store sales, but opening up to a slightly larger market might work. I would also be interested if they got into selling M1 racked systems for data centers. Why leave money on the table?


Great news, the progress is much faster than I expected!



What's the Win10/11 story on M1 Macs?


Wholly dependent on Microsoft. So far, they didn't say anything apart from indicating last year that they don't have immediate plans to do it.


> Wholly dependent on Microsoft.

Not wholly, surely? As an OEM, Apple would need to provide drivers. Given that their system isn’t an off-the-shelf SBBR/ServerReady ARM64 device, that’s going to be a not-insignificant amount of work that isn’t Microsoft’s responsibility.

Seeing as the Asahi Linux project has had to reverse-engineer pretty much everything, I don’t see how Microsoft could be totally on the hook.


It's precisely it not being a SystemReady ARM64 device that makes it Microsoft's responsibility. The Windows kernel requires core changes to support this hardware; it cannot be done with just drivers. Nobody can make Windows work on this without Microsoft's help. Same reason we had to make a few changes to core Linux plumbing, besides adding drivers.


Ah, thanks for the explanation! That makes sense.


Apple has publicly stated that they’re willing to do that, but Microsoft refuses to license Windows for the M1: https://www.macrumors.com/2020/11/20/craig-federighi-on-wind...


I'm surprised Apple stated that, as macOS (ARM) has sketchy support for dual booting, even with two macOS versions installed. So how could Apple be willing to do that if they haven't fleshed out dual-booting for Apple Silicon? Strange that they pass the ball to Microsoft (deflecting to their competitor) when Apple should be holding the ball until their side is capable enough.


Microsoft doesn't offer Windows 11 unless it is bundled with hardware. I am sure that if Apple wishes to sell Macbook Pro with Windows, Apple could reach out and Microsoft would be interested. But I really doubt that Apple would want to go down this route.


Eh? You can go out and buy a retail copy of Windows 11 from loads of places without hardware. E.g. here in the UK Scan, Ebuyer, Insight, etc. All major software/hardware sellers.


My bad. I should have specified Windows 11 ARM.


Bare-metal Win10/11 ARM is non-functional on the M1 machines, virtualised is officially unsupported [1], and there is no apparent talk of official support for either in the rumour mill. However, it wouldn't surprise me if there were a working dev build in Redmond somewhere.

[1] - https://www.theregister.com/2021/09/10/windows_11_m1/


Buy a copy of Parallels, run the Windows 11 ARM64 insider preview in a VM. Works pretty great for everything but gaming.

If you want to activate it, use slmgr with some sketchy third party KMS server.


Won't happen because it would work against their Surface efforts, I think


For a software-focused company like Microsoft, refusing to let Windows run on the M1 seems very strange to me…

If MSFT is serious about (eventually) transitioning the majority of consumer-oriented Windows machines to ARM, it surely makes sense for them to support the most widely adopted ARM-based laptop and desktop platform out there… to bolster development of ARM-native applications for Windows on ARM, if nothing else.

I don't own either machine, but I imagine the ARM-based Surface Pro X would make for a miserable development experience, especially compared to a hypothetical Windows on M1(X) MacBooks.

We shall see, I suppose, but I would be very surprised if we never see it happen.


no such story


"The End."


Not a technical comment, but wow, Plasma 5 looks great on that hardware.


Does Bluetooth work? Does the built in keyboard work? Trackpad? WiFi?


Yes to all of the above, with out-of-tree patches that need cleanup/rewrites. I'm not running them myself for that reason (I keep my tree to upstreamable things or nearly so), and doing so is on my list for the next week or two.

WiFi also needs some installer plumbing to copy the firmware to the right place. I already have it all planned out, just need to sit down and write it.


No


[flagged]


> Their only purpose is to encourage customers to buy from these companies.

No, you are assuming too much. People who do this are typically not interested in boosting Apple's sales at all. They merely want to make good use of good hardware.

In general, the ignorance or inability to understand why something is being done does not imply that it in fact is pointless.

For instance, I do not see very far into the physics of general relativity. But that does not entitle me to imply that the people who dedicate their lives to exploring this field are wasting their time.


> I really hope that all of these efforts end up failing miserably. Their only purpose is to encourage customers to buy from these companies.

Or to enable customers to migrate away bit-by-bit because cold turkey is hard.

I personally am running linux full-time on my personal machines, but I wouldn't have ever made the switch had I not been able to run linux in a VM for a few weeks and then dual boot for a few years and then run windows in a VM for another few years. If people back in the early 2000s had your mindset I'd probably be running a mix of windows and mac today.


>I really hope that all of these efforts end up failing miserably.

Deep breaths my friend. Maybe time for a walk in the sunshine.


Apple seems to draw so much actual pure hate...

It's like, I can understand hating an oil company or something like that, but Apple? Geez.


I think it's a product of cognitive dissonance. These people believe in freedom, as in free and open systems. But Apple freely chooses to make closed systems. But that's cheating, so they want to promote freedom by coercing Apple by law into making the products the way they want them to be. Except that meanwhile billions of people all over the world don't give a fig about all that and quite happily buy these products as Apple designs them. So huge numbers of people are freely choosing to buy things that are closed, and they shouldn't be allowed to do that. It must be stopped, because of freedom.

Plus of course "Open wins every time" turned out not to be true, which is impossible, so clearly Apple must have cheated.

If I had to hold all that in my head, I'd go a bit bananas too.


No, I only care about _my_ freedom to use the hardware however I want, not Apple's or anyone else's freedom. However, if people keep not giving a shit, my freedom will be more and more limited.

You're trying too hard.


> It'll be entertaining to watch them engage with their sisyphean mental gymnastics once Apple releases a new product with extra wrenches for their pleasure.

I mean, you're looking at the new Apple product. This project started on M1 and now it's being extended to work on M1 Pro.


Wrenches found so far: they changed some hardware blocks to scale up to ~2TB of RAM (up from 32GB theoretical max on the M1) and better support more CPU cores and multi-die setups (future product?). And they moved some fuse bits around.

Seems pretty reasonable to me.


A static image with no example videos of software rendering?


The only thing that would make it better is the M2 Pro where the Notch will be directly in the middle of the screen so that we can optimize zoom calls even more.


I don't understand why you'd want a Mac unless you intended to run its provided OS.

With Linux you can get the same hardware for half the price from other manufacturers.

EDIT: By 'same hardware' I mean 'hardware with comparable / equivalent performance'


Except there's no current hardware that has comparable / equivalent performance to Apple Silicon chips. The tightly integrated SoC allows for optimisations that x86 chips can't even get close to in certain workloads like video editing. That's not to say that M1 can do it all, but for the workloads it excels at (and it excels at the majority) it is absolutely the best choice for those that need it.

Even discounting the performance benefits of Apple Silicon, the reliability and longevity of Apple products in and of itself is reason enough for me to purchase them. I'm still using an early 2011 Macbook Pro as my personal laptop but I no longer get macOS upgrades; it's stuck on 10.13 and has been for some time so I'll be installing Linux on it at some point in the near future.

And I say all this as someone who once hated Apple for their price premiums. I still find their prices hard to swallow even though I have the means now compared to back then. But their prices are easier to justify for me once you take all of the above into account.


The only real attraction of the M1 Apple Silicon is the power saving it offers, especially on laptops, where it means longer battery life. Other performance metrics are just temporary gains that AMD / Intel will match with future CPUs. (Intel and AMD themselves used to compete similarly, with each bettering the other over the years, and it will be no different with Apple Silicon.)

Everybody understands that with soldered RAM, SSD and a closed SoC the M1 Apple Silicon Macs are just one step away from being a completely closed system like the iPhone / iPad platform - all that Apple has to do is lock the boot-loader of the Mac, like on iPhones / iPads, and tighten SIP to prevent installation outside of its App Store. When (not if) they do this, the users will be effectively trapped into the Apple ecosystem, with them beholden to Apple's mercy on how effectively and how long they can use their system (planned obsolescence).

This is why many with common sense have ignored the new Apple M1 systems, and continue to stick with the more open systems relying on AMD / Intel. The constant hyping of "Linux on M1" is meant to counter the perception of the closed-box macOS monoculture, and to give us the false hope that the M1 Macs are just like any other Intel / AMD computer. Whereas the reality is that unless Apple releases hardware documentation for it, all non-macOS operating systems on the M1 will always offer sub-par performance.


> Apple Silicon Macs are just one step away from being a completely closed system like the iPhone / iPad platform

Apple Silicon Macs are based on the iPhone / iPad platform. Apple chose to spend a significant amount of developer time adding the ability for users to securely load their own kernels, which is part of the new BootPolicy system that iDevices do not have, and a documented feature with multiple official tools to support it. There is a blog written by Apple's head of XNU development detailing how to use it. If Apple wanted to lock these machines down they would've just not done any of that.

As for soldered RAM, you would need 8 RAM sticks in individual channels to match the M1 Max's memory bandwidth, at a much higher power consumption. Modular RAM is no longer viable for low-power, high-performance laptops. Modular, low power, high performance: pick two. It's just the way the physics works. Carrying a 512-bit bus across a connector isn't free, it has a significant power/performance cost due to increased capacitance and decreased signal integrity.

(Soldered SSDs, sure, that's a valid concern, but it has nothing to do with the OS.)

> Whereas the reality is that unless Apple releases hardware documentation for it, all non-macOS operating systems on the M1 will always offer sub-par performance.

Funny enough, we already have better VM performance than macOS thanks to supporting the M1's vGIC (which macOS does not use yet), we've also figured out how to work around a USB death issue that affects macOS, and making simultaneous DisplayPort 1.4 + USB3 work (if at all possible) is already on my TODO list, because I just found out macOS can't do it.

No documentation doesn't mean we can't beat Apple at their own game.


> Modular RAM is no longer viable for low-power, high-performance laptops.

The fact that this is true makes me grumpy for all the obvious reasons, but even at my rather less informed level of understanding it's still obviously true.

My "solution" here is basically to enjoy the better batter life while grumbling quietly to myself ;)


Whatever you said doesn't at all change my assertion that the M1 is now just one step away from becoming a closed system - With the M1, Apple can now easily lock the bootloaders of M1 Macs any time and make it a completely closed platform like the iPhones / iPads. And it is evident that Apple has been planning this for years:

1. The first few Intel Mac Minis allowed you some level of customisation of both the hardware (change RAM or HDD / SSD) and software (install other full featured OS).

2. Then came the Mac Minis with soldered RAM and SSD. You could no longer customise the hardware. Software was still customisable and you could still install other OSes. (Recall that Apple even offered free drivers for another OS, i.e. Windows).

3. The current generation of M1 Minis now doesn't allow you to customise either the hardware (everything is soldered) or the software. Technically you can install other OSes, but the reality is that currently only crippled versions of Linux and xBSD are available, and practically the only full-featured OS available for it is macOS.

These are clear indicators of how Apple has been working slowly to lock down the Mac platform like their iOS platforms. (The strategy to keep you in denial - https://en.wikipedia.org/wiki/Boiling_frog - has been working great for them.) The reason for this is simple - BigTech is increasingly moving towards selling everything as a service. Services are more profitable because they create recurring income (even after the device is sold), which means more profits. And Apple's successful business model for this, which earns them billions of dollars, is the closed-platform model evident on iPhones / iPads.

> Modular RAM is no longer viable for low-power, high-performance laptops.

That is deliberately misleading and ignorant. Low Power DDR chips (which are soldered) were invented for mobile phones, where every watt drawn matters. On laptops they offer only negligible power savings that don't really matter - yes, the power saving is negligible on a laptop, as the difference in power used in modular RAM vs LPDDR is between 1 to 0.5 volt. That matters on phones because they have small batteries (5 - 15 Watt-hour), but that power saving is negligible on a laptop as it has bigger batteries (40 - 95 Watt-hour).

It's as good as meaningless when you consider what you lose - soldered RAM can't be upgraded and is hard to repair. The loss of this big benefit - repairability - hugely trumps the minor power savings. Apple, and others, know this very well - for them, the real attraction of soldered RAM is that it allows for planned obsolescence.

In fact, reasonable alternative RAM chips like DDR3L (low voltage) and DDR3U (ultra low voltage), which offer lower power in modular RAM and were specifically created for servers and laptops, have been deliberately ignored by computer manufacturers in favour of soldered LPDDR, and that is why the memory industry too has been forced to put more of its R&D into the latter.


> the M1 is now just one step away from becoming a closed system

Just like almost every other computer. Any manufacturer can do this at will with new systems or a firmware update. You are making a strawman argument. The question is whether they will, and the answer is they seem to have no intention to do so given how much time they spent not doing it. Soldering things down has nothing to do with locking down firmware.

> install other full featured OS

All current M1 machines allow you to install any full featured OS. Apple just isn't writing the drivers for us. There is nothing locked down when you run your own OS. I know because I also run macOS kernels under my hypervisor that way and everything works exactly as it does when booting in Apple-signed mode.

Your claim seems to boil down to "Apple changed their hardware in a way that Linux doesn't support". Well, duh. That's what we're fixing. Nothing about that means they're locking anything down.

> That is deliberately misleading and ignorant.

You didn't even bother to read my comment. The M1 Max has a 512-bit RAM bus. That is equal to 8 RAM sticks (DIMMs are 64 bits). Do you want a laptop with 8 RAM sticks?

Each one of those 4 RAM chips on the M1 Max has 8 RAM dies internally, each handling 16 bits as independent channels, for a total of 128 bits per chip. Typical DDR RAM chips as used in DIMMs are 8 bits each (which is why you get 8 per DIMM to make 64 bits). You would need 64 such chips to reach the same bus width (and thus similar bandwidth).
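To put rough numbers on that bus-width argument, here's a quick back-of-the-envelope sketch; the 6400 MT/s LPDDR5 transfer rate is an assumption on my part, the bit widths are just the counts above:

    # Back-of-the-envelope bandwidth math (LPDDR5-6400 is an assumed transfer rate).
    bus_width_bits = 512                       # M1 Max memory bus, per the count above
    transfer_rate_mts = 6400                   # mega-transfers per second (assumption)
    bandwidth_gb_s = (bus_width_bits / 8) * transfer_rate_mts / 1000
    print(f"{bandwidth_gb_s:.1f} GB/s")        # ~409.6 GB/s

    dimm_width_bits = 64                       # one DDR DIMM channel
    print(bus_width_bits // dimm_width_bits)   # 8 DIMM-wide channels just to match the bus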

You just can't make that modular short of putting your RAM on a thousand-pin LGA package like CPUs themselves, and that'd still increase power consumption (and significantly increase physical size, which again makes the power consumption problem worse as it makes your interconnect longer). It just can't work with DIMMs and with standard non-LP DDR technology.

> On laptops they offer only negligible power savings that don't really matter.

Yeah, until you make a laptop that is actually power efficient like this one, and then suddenly it does. These things draw milliwatts when idle and have aggressive RAM power saving that puts channels into low power modes after mere microseconds (I know because I've been investigating the memory controller power management config and benchmarking memory accesses). That's part of how they get amazing battery life. You can't do that with regular DDR RAM. These things will run for a week with light usage, and they can do that because they are based on extremely efficient mobile architectures. Your phone lasts a day; make the battery larger without increasing idle power usage and that's how you get a week.

> the difference in power used in modular RAM vs LPDDR is between 1 to 0.5 volt.

It is evident you have little hardware engineering experience if you think power is measured in volts. That's not how it works.

I would advise you to spend some time reading up on high-speed digital interfaces and learn about concepts such as Ohm's law, IR loss, capacitance, impedance, eye diagrams, and insertion loss. LPDDR RAM is much lower power than regular RAM precisely because it can optimize for very short interconnects, which is why you won't find DIMMs of it. It's not just about the voltage.


Your whole tactic during this debate is to just throw a lot of technical jargon around to try and confuse the reader and to evade the actual fact - all the hardware changes Apple has made have also been made to create a closed system to support their software services.

All the techno-babble spouted by you, to make yourself appear more knowledgeable, and the cheap potshot at me (also an engineer), desperately try to hide the fact that high-performing modular RAM architectures already exist, without all the modifications Apple has made to their hardware with the aim of deliberately creating an un-repairable and closed system.

I have clearly pointed out how Apple has been converting the Mac into a closed system, on both the hardware and software fronts, over the years. This is an indisputable fact, as the only thing that now distinguishes the M1 Macs from the iPhone / iPad platform is the bootloader.

And looking at Apple's business model, it is only logical that Apple will soon be locking the bootloader of the Macs too, once it reaches a critical level of acceptance (and we are far off from that, for now).

The M1 is undoubtedly a great piece of hardware - but it has been deliberately designed with built-in planned obsolescence. That's good for Apple's profit margin. But not for us consumers.


Ah yes, "technobabble".

I'll go back to porting Linux to this thing then; clearly you're not interested in hearing about how these things are designed.

It doesn't take being a senior hardware engineer to be able to count bus width and calculate bandwidth, though, so I'm still curious how you want a laptop built with the 8 DIMM slots it'd take to match the memory bandwidth of the M1 Max. Remember, these things have high performance integrated GPUs, that perform at the same level as discrete ones. Ever wonder why discrete GPUs haven't had modular RAM for a couple decades now? Yeah. Bus width.

> high performing modular RAM architecture already exist

The M1 Max has a memory bandwidth of 409 GB/sec. A top spec EPYC server (Rome) chip has 410 GB/sec of memory bandwidth, with 8 channels populated with the fastest RAM they'll take.

Indeed, high performing modular RAM architectures do exist. In servers that eat huge amounts of power and require 8 DIMMs to go that fast. Good luck fitting that into a laptop.

But I guess this is all still just technobabble to you :-)


> With Linux you can get the same hardware for half the price from other manufacturers.

That might have been true before, but today no one else is selling ARM laptops that can match the M1 Pro in performance and efficiency.


No one else is selling laptops period that can match the M1 Pro in performance and efficiency. Much less the M1 Max.


Agreed, but who will better take advantage of the CPU's architecture to utilize those gains: the same people making the hardware and the native OS, or a Linux distro? Just because a CPU can do something does not mean every OS / kernel is going to use it.

What I was getting at is that if you want Linux, you can get a good Lenovo or Dell for about $1000-1200 that would seem / feel comparable to a $2000 MacBook.


Some people just want Linux. Running it on Apple silicon (that is, aarch64) is exactly what they want; performance-wise, it'll run circles around anything else in that price range.


> EDIT: By 'same hardware' I mean 'hardware with comparable / equivalent performance'

Where are you finding laptops with comparable displays at half the price?


What do you want to compare about the display?

- Resolution? There the MacBooks are behind both in aspect ratio (worse than 3:2) and in pixels (4k or 4k+ vs...)

- Contrast? Not OLED, so worse.

- Refresh rate? Good enough, but far from top of the line.

- Gamut? That is a point. But also available elsewhere, e.g. Asus.

- Brightness? Outdoor-viewable screens have been as bright as that or brighter since Windows XP days, but fairly specialized. I guess I prefer contrast (OLED), but yeah, difficult to find outside a MacBook.

- Touch, Wacom pen? Oh, none.

It is a really great quality screen, but not "the best".


He suggested that you can't find a similar or better quality display for $1000, not that it's the best display that exists.

At that price point you can find something that, at best, meets three of the criteria you've listed while heavily compromising on the rest.


> It is a really great quality screen, but not "the best".

I never said it was the best. The previous poster said that you can find comparable hardware at half the price. At lower price points than MacBooks/Dell XPS/Microsoft Surface, I've usually found the screen quality to be the hardest to match.


Link us to better alternatives, please.


Asus, for example (OLED, $600).

Or Huawei (IPS, 3:2, touchscreen, $700 or so).

Or having pen and touch at all.


So, you just mentioned two different brands for features that a single laptop display has.


No, I mentioned features that the single laptop does not have at all, which you can get elsewhere and for much cheaper even.


Prefacing this with the fact that I'm not normally an Apple fan and use Linux on Intel and AMD CPUs as my daily driver.

Apple does make incredible hardware, but their software is really crappy, from UX to reliability to general usefulness. Since the M1 appeared, I've been looking at purchasing one purely for the hardware and then running something like Arch Linux on it, as that software experience is really hard to beat.

So in short, the combination of Apple hardware and open source/Linux software makes a lot of sense and is a pleasure to work with. The hardware Apple had been producing lately was kind of shit though, so it's only now that it starts being interesting again.


Yeah, I’ve been an Apple fan for years. As of a couple of days ago my work desk has an M1 MacBook Pro and a Ryzen 5800X running Linux Mint. The CPUs are remarkably similar - same core count, and only slightly different single-core performance. So the only main difference is software.

I expected the MacBook to blow Linux out of the water - after all, their hardware and software integration is excellent. The trackpad drivers and consistent UI is fantastic. But watching CPU usage on both machines, Linux mint stays lean and quiet while the MacBook has all sorts of weird processes popping up to do who knows what.

On macos the “WindowServer” process sometimes just pegs an entire core until I reboot. My usb-c Ethernet dongle doesn’t do hardware offload, so cpu usage goes way up when I use it. Firefox uses way more CPU on macos than it does on Linux. And there’s random processes all the time reporting things to apple or other garbage like that. I’ve been googling process names all day trying to figure out what all this crap does. Spotify alone uses 10% of a core on macos sometimes, even when it’s not even playing music.

It was a pain to get Linux working how I want it to. But now that it’s mostly[1] set up, it feels snappier and more reliable than macos. When I don’t touch the computer, it settles at 0% CPU; just like it should. I suppose that’s what the desktop looks like without the last decade of macos features that nobody really cares about.

I’m really surprised how close the competition feels between my two machines; though I miss Snow Leopard.

[1] Keyboard shortcuts are all over the place in Linux though. And I can’t even set keyboard shortcuts up how I’d like, because IntelliJ can’t use the meta key as a modifier. And the Linux trackpad drivers are nowhere near as well tuned as they are in macos. In Linux the trackpad is way too sensitive. I’m sure there’s a way to fix it hiding in a config file somewhere.
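(If it helps: on an X11 session with the libinput driver, that config usually lives in /etc/X11/xorg.conf.d/; something roughly like the sketch below. The values are just guesses you'd tune per device, and Wayland desktops expose the same knobs through their settings daemons instead.)

    Section "InputClass"
        Identifier "touchpad tuning"
        MatchIsTouchpad "on"
        Driver "libinput"
        # AccelSpeed ranges from -1 (slowest) to 1 (fastest); -0.5 is only a starting guess
        Option "AccelSpeed" "-0.5"
        Option "AccelProfile" "adaptive"
    EndSection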


Maybe because the M1 is a powerful, low-power-consumption CPU and it is very hard to get powerful aarch64 CPUs outside of this? Not everyone likes macOS.


Even if you run Linux, how do you ensure that the SoC does not, e.g., scan memory and send telemetry to Apple independently of the installed operating system?


If you're worried about these sorts of situations then there aren't going to be very many choices for you. You can't choose Intel because they have their management engine, you can't choose AMD because they don't own their own fabs.

You could go full air-gapped and get a machine without any network cards. Would still recommend a nice bunker deep inside a mountain just to be on the safe side.


AMD processors also have an equivalent to the Intel Management Engine: https://en.wikipedia.org/wiki/AMD_Platform_Security_Processo...


I wonder if you could get around this by having a multi-CPU machine setup (silicon from many vendors) and running a "Generals problem" style algorithm?


Well, there still is https://raptorcs.com/content/base/products.html if you're not dependent on x86-64... You'd still be relying on other people's fabs, of course. But it's a step up.


Ok, so because other companies do that it should be acceptable? I don't understand your patronising tone. How is that helpful?


Assuming your question was sincere: You cannot ensure that the SoC isn't doing something secret that the OS doesn't know about in this environment.

Every respondent so far assumed you knew this (because it's an unusually high-knowledge question to ask and an unusually obvious answer) and so assumed the actual purpose of your post was to declare that you personally wouldn't use an environment where you can't be sure the SoC isn't doing something nefarious. The people replying don't really find that declaration interesting. Your subsequent reply says that the choice to live with this particular risk is some sort of tacit acceptance, which implies you are not tacitly accepting it.

Since no such environment exists, except for edge case hypotheticals like only powering an air gapped computer inside a faraday cage in a lead bunker, most people expect that you too are tacitly accepting it, which makes this particular complaint feel like a non sequitur and not an invitation to discuss the broader issue.

If you have an actual real life use case about trying to harden a system to this degree, like you are responsible for nuclear safety or the unlock codes for the secret lunar base or something, I am sure people would like to read about it.


It is just that many of us are fed up with people who immediately move the goalposts every single time someone presents an alternative or an improvement.

The implication (intentionally or not) being that we can just as well continue to run Chrome on Windows 11 on open WiFis, because everything is broken all the time, and if not, any Three Letter Agency can always kidnap you or someone you care about and get their secrets that way.

Encryption works and massively increases the workload for would-be dragnet operators!

Having a choice of technology is good and also increases the workload for would-be dragnet operators.

Competition is good and forces companies to get their act together.

Trashing every good thing that happens because it isn't perfect is demotivating and I'm also absolutely fed up with it.


Your concerns mean there's no modern device you could possibly be using. So either you're bringing up something that no one, including you, actually cares about, or you're commenting on Hacker News using the butterfly method on HN servers (https://xkcd.com/378/).


I don't think Apple does this.


Simply put: you don't. You'd have to build your own chip factories and design your own chips if you want to ensure that the SoC doesn't leak.

There's a reason China and Russia build their own CPUs. There's nothing else you can do to ensure that nothing leaks.

In practice, you could probably run the machine for a while hooked up to an ethernet cable and sniff the outgoing packets. I don't think the M1 will send Apple telemetry on its own (though it certainly could, if it wanted to).
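(A minimal sketch of what that sniffing could look like from a second box acting as the gateway, or on a mirrored switch port, using scapy; the interface name, the target IP, and the log-new-destinations approach are illustrative assumptions, not a recipe.)

    # Log every distinct IP the machine under test sends packets to.
    # Requires root and scapy; "eth0" and 192.168.1.50 are placeholders to adapt.
    from scapy.all import sniff, IP

    seen = set()

    def log_destination(pkt):
        if IP in pkt and pkt[IP].dst not in seen:
            seen.add(pkt[IP].dst)
            print(pkt[IP].dst)

    sniff(iface="eth0", filter="src host 192.168.1.50",
          prn=log_destination, store=False)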


In theory, your photolithography machine could be back-doored, too. So better design that yourself, too, and/or take out your microscope (if digital, taking care that isn’t back-doored, either)

I also guess more advanced adversaries will add stuff that only becomes active after a long time, and may use steganography to hide what it’s sending.

So, bootstrap your system the way this guy does: http://paillard.claude.free.fr/video.mp4

(I don’t think he has gone past vacuum tubes)


What if someone backdoors education and everyone's been learning the insecure way of programming?....


Why backdoor all of education when they could just attack Stack Overflow to greater effect?


You also need to design your own software to design the chip, your own compiler, and the OS the compiler and design software run on...


All of which require trusted hardware to run on! The bootstrapping problem is real.

I wonder how the Chinese and Russians have validated custom chip designs. I don't think the compiler would be bugged specifically for chip design software, but there's no real way to tell...


If you want to ensure the SoC is not backdoored, your only option is Precursor, which can make a fairly solid claim that silicon backdoors for FPGAs are infeasible in the general case.

https://www.crowdsupply.com/sutajio-kosagi/precursor

If you're willing to trust that the silicon isn't secretly backdoored in some ridiculous invisible way, well, unlike Intel, the M1 does not have any hyper-privileged proprietary firmware by all indications, and that proprietary firmware which does exist is plaintext and analyzable (and not privileged to scan memory), so I have a much easier time trusting these machines than a random x86 PC (all-privileged SMM supervisor and ME/PSP) or Android phone (all-privileged TrustZone supervisor).


You can't. But that worry is not Apple specific. Unless you fab your own chip, this can always happen. And by "fab your own chip" I don't mean contracting TSMC to do it; I mean literally doing it yourself. TSMC could also place a trojan on the silicon.


Put a proxy between you and the internet and reject everything your OS didn't send (assuming you're using nothing but wired ethernet).


Assuming of course the M1 doesn't have SoC code to zero day your proxy server or zero day your OS and selectively allow itself to circumvent this.

But if Apple were really clever they'd add a secret undocumented radio built into the SoC package that sends signals modulated in an unusual way over an unusual low-bandwidth but high signal frequency (probably sub-CB Radio) that no one is monitoring.

Which now brings you to the scenario that you have to use a machine that not only never uses a network, but also exists in a physical place that no network signal could ever penetrate. And it goes without saying that having taken this level of caution, one should probably buy the computer with a fake name and all cash, ideally through a network of international proxy buyers, to be sure.


I omitted the word "reliability" in the phrase "high... signal" and didn't notice until after the edit window.


> place that no network signal could ever penetrate.

This is actually rather trivial to achieve: just put it in a Faraday cage.


If you read the Twitter thread, there are a lot of details on "Macs are locked down" posted by the OP and others below this comment: https://twitter.com/_Thaodan/status/1458581955688206340


You could run black-box experiments observing signals from the machine, or use some hardware reverse engineering approaches, e.g. by analyzing the firmware.



