Linux 5.19 (lwn.net)
351 points by anpep on Aug 1, 2022 | 230 comments



Somewhat surprisingly, Linux support for Apple Silicon MacBooks seems to be shaping up to be better than that for pretty much any other laptop. The Asahi team don’t have support for everything yet (notably no GPU) but what they do have support for seems to be high quality, well integrated with Linux’s conventions, and upstreamed into the mainline kernel! How much other hardware can claim that?

The Asahi developers have also stated that Apple tends to keep hardware peripheral interfaces stable across generations (they speculate that this is to keep things easy for their own OS dev teams), and that this has so far proven true for the M2 (and several iPhone generations before that), so once support is available, it should stay relevant for some time.

Given this, it seems like MacBooks could easily end up becoming the laptop of choice for Linux developers in the years to come.


If you take a regular Lenovo laptop, everything likely works out of the box with an upstream kernel. This is not quite the case yet with Asahi Linux, which does not yet support power management, USB, or sound.

There was a time when Apple laptops came with very good Linux support. It was the time when every bit of hardware support was present in Darwin (a 12" iBook G3 worked flawlessly with Linux, for example). I think that with each generation, support gets worse because the hardware is closing down. This translates to far more time between release and having something usable with Linux.


I used an iBook G3/G4 for years as a primary development laptop running Linux, and I think your recollection here is a bit off. The state of it was that:

- There were numerous laptops with near-full Linux support, the Apple hardware wasn't categorically better when it came to that.

- The power usage of x86 CPUs was atrocious at the time compared to PPC. Now history is repeating with Apple's M[12] line. Therefore people put up with a lot to get 2-3x longer battery life. As I recall I got 3-4 hours of active use out of my iBook, but (going to conferences) it felt like people's x86s almost always had to be plugged in.

- The Linux support for software suspend/hibernation was really flaky at the time, but it worked perfectly on Apple hardware, because all Linux had to do was to tell the hardware "do the suspend thing now" (IIRC by tweaking a file in /proc; a modern sysfs equivalent is sketched after this list). The hardware did a slow "heartbeat" with a front LED hidden behind the plastic frame when suspended (a nice effect). When running Linux it would do the exact same, as it was all done in hardware.

- There were still edge cases in hardware support, just as with any other laptop vendor, it all came down to what individual components happened to have Linux drivers. Some of this was better on Apple's hardware, some of it was worse.
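
(For the curious: the modern descendant of that /proc knob is the sysfs power interface. A minimal sketch, assuming your firmware actually exposes suspend-to-RAM:)

    cat /sys/power/state                    # list the sleep states the kernel offers, e.g. "freeze mem disk"
    echo mem | sudo tee /sys/power/state    # suspend to RAM right now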


This is such a trip down memory lane. The iBook G4 was a bit of a pain to set up with Ubuntu, but once you got it going it was pretty foolproof. I think I ran a machine on it until 12.04, when they stopped officially supporting PPC.


I used an iBook G3/G4 12" for several years with linux. I remember the G3 was supported completely out of the box. I loved it: it was small, battery lasted pretty long (I did spend some time fine tuning laptop-mode-tools), suspend-to-ram was working flawlessly and from what I remember all hardware was supported, including 3d graphics (ati radeon). For the G4, I remember there were some issues to get the wireless card working when it came out (Broadcom hardware I think), but the rest of the hardware also worked out of the box.


> The Linux support for software suspend/hibernation was really flaky at the time, but it worked perfectly on Apple hardware, because all Linux had to do was to tell the hardware "do the suspend thing now" (IIRC by tweaking a file in /proc).

That's interesting to read, because I ran Linux on a Macbook Pro for a few years around 2012 and the only thing on that machine that _never_ worked right was suspend and resume.


Pretty sure they were speaking of the PowerPC age, not Intel. The iBook and PowerBook G3 and G4, not the MacBook and MacBook Pro.


While I am a devoted fan of Thinkpad laptops (they are better than Apple MacBooks because of, at least as an option, non-glare screens, keyboards and soft-touch non-wrist-cutting palmrests: the two or three most important factors for a portable machine imho), that's only partially true, and rarely so with a new generation of hardware.

E.g. I remember getting a Thinkpad X1 Carbon 6th gen early in the cycle (I've also got 5th and 8th gen in the house, and an X1 Yoga 6th gen which sucks with that metal finish), and with the removal of "regular" S3 sleep, you close your laptop and it keeps running, potentially burning in your bag. It took Lenovo a year or so to add a "Linux sleep support" BIOS option, though it took the community less than that to provide DSDT patches to re-enable S3 sleep.
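
(For anyone who hasn't done this: "DSDT patches" here means overriding the firmware's ACPI table at boot via the kernel's initrd table-override mechanism. A rough sketch, assuming iasl from acpica-tools and a bootloader you can point at the patched initrd:)

    cat /sys/firmware/acpi/tables/DSDT > dsdt.dat
    iasl -d dsdt.dat                 # decompile to dsdt.dsl; edit it to restore S3
    iasl -tc dsdt.dsl                # recompile to dsdt.aml
    mkdir -p kernel/firmware/acpi && cp dsdt.aml kernel/firmware/acpi/
    find kernel | cpio -H newc --create > acpi_override
    cat acpi_override /boot/initrd.img > /boot/initrd.img.patched   # boot this instead of the original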

All I am saying is that Lenovo machines, esp. the Thinkpad line, are usually a great choice, but you can hit early-adopter hurdles just like with any other laptop. The good thing about them is that the community is huge and great, and that they are the best laptops around as far as usability on the go goes (as in, actually typing on them and seeing what you type).


The whole S3 sleep fiasco is the stuff of nightmares. I have a gen 6 X1 Carbon too and I've gotten it sleeping very well. I also have a Dell XPS 15 that I love but... it's been 2 years since it was brand new and it's still stuck on s2idle. As I understand it, Microsoft wants it this way so they can maintain network connectivity. My major gripe with the Lenovo is the 16 gigs of RAM soldered to the board. It's so nice to be able to pop open my Dell and put up to 64 gigs in there, while replacing the SSD in one of the two M.2 slots.

Then again, I'm typing this on a 2019 Macbook Pro (Intel) that has nary a single replaceable part.


> Microsoft wants it this way so they can maintain network connectivity

I liked the coolness of "Connected Standby" - the laptop sleeping, but playing music over Bluetooth. Other than coolness, I cannot say I found much use for it - maybe my SSH session didn't terminate? Don't remember.


The Windows-created S3 sleep debacle was annoying. But it's long since solved. Lenovo ships both sleep options in the BIOS now and Linux distros support both sleep styles because they have different drain profiles.


Quick note about screen glare - my very first action upon unpacking any Apple notebook is to apply a matte screen protector (the purists may wince). I'd recommend a screen protector on any expensive purchase, and the matte one has all but eliminated the glare for me.


I personally live in a country where it's hard to get any small thing easily online: I'd probably have to order it online from USA, UK or Germany, get it processed by customs and then hope it'd work well. To reduce the costs of trying out more than one item, I'd probably order a few different ones so the shipping costs and customs costs are contained. Which means that I'd be spending $150+ on a screen protector that might not work.

Or I could just look for a laptop that has an anti-glare screen ;-) And a good keyboard.

Too bad new AMD-based Thinkpad Z series do away with a bunch of the good things from X1 Carbon (soft-touch palmrests for one), or I'd seriously consider them to be able to drop my desktop entirely (Intel iGPUs struggle to do full screen video calls on 4k external screens under Linux, I am _hoping_ AMD 680M would do a better job).


The same thing happens to me with my company Thinkpad.

Closing the cover does not put it to sleep, no matter what I do to the settings (admittedly it’s a very locked down device so there may be settings that are not available to me).


Yeh same for me, I'm a big thinkpad fan (I own 5 of them, 7 if you include the ones I broke in attempts to do some outlandish mods), but I do occasionally find issues out the box, though that can be down to the distro you use as well.

That being said, an imperfect out the box experience just gives me an excuse to get another (old) thinkpad, so every cloud


Nonsense. The 10th gen X1 Carbon has basically nothing functional unless you are running master kernel, and even then it's not completely there. It's just an unusable paperweight at the moment, the OLED screen is pretty though.


As a long-time Thinkpad user, it usually helps to not go for the latest/greatest, but even so, the time between release and 99% working linux is in the region of months with Thinkpads, whereas it can be years-never with Mac.

I had some issues with my AMD X13 when getting it new - but they weren't show-stopping, just annoying, and they got ironed out over the next 6 months.


But that supports the point OP made above. Each generation of Lenovo laptop changes out lots of parts (presumably based on changing prices and supply contracts). Apple laptop hardware tends to be stable over time.


> Apple laptop hardware tends to be stable over time.

I'm confused where you got that idea. The T2 chips changed a lot. The most recent Intel MBPs have a different wifi chip with known-broken firmware for Linux. The sound handling also changed in 2019 and hasn't been reverse engineered yet. And that's before we even get to changing the entire architecture to M1. How is that fewer changes than Lenovo?


> whereas it can be years-never with Mac

Notably the Asahi project got M2 laptops up to parity with the M1 family in ~48 hours of dev work. No doubt this won't always be the case (there's bound to be major hardware revisions at some point) but at the moment this is quite promising looking towards the future.


Yes, I think the difference this time is motivation (because M1!) and some money going this way.


I have a 7th gen X1 Carbon and everything, including the fingerprint reader, worked out of the box with Fedora.


It is from August 2019 though.


> regular Lenovo laptop

This is true for what we nerds call a "lenovo laptop" (so mostly T series). Install, everything works.

Lenovo also sells a bunch of cheapish plasticky laptops that look like some random noname chinese OEM manufactured and lenovo just put a sticker on... well, linux support there is a bit hit and miss.


> Lenovo also sells a bunch of cheapish plasticky laptops that look like some random noname chinese OEM manufactured and lenovo just put a sticker on... well, linux support there is a bit hit and miss.

They look like that because they basically are.

There's really two companies called Lenovo. One of them makes the ThinkPad T, X, P and one of the budget lines (L or E, always forget which, we don't use either). And the ThinkStations.

The other makes the IdeaPads and the other ThinkPad budget line which lacks the great keyboard and Linux compatibility, it just looks a bit like a ThinkPad. And the Legion gaming stuff etc.

They're really two different companies with different factories. We have a global contract with Lenovo and we can't even order the consumer laptops. We wanted to get some Legions because some of our dev teams prefer the Legion with RTX over the ThinkPad P which come with Quadros. But they simply can't sell them to us. I think RTX has come to the ThinkPad P line since then anyway, not sure because I'm not involved anymore.

It seems like one company because as a consumer you can buy all of them from Lenovo.com, but that's not even really Lenovo, it's a third-party reseller called Digital River.


If it says "YOGA" or you can open it more than 180 degrees.....


> If you take a regular Lenovo laptop, everything likely works out of the box with an upstream kernel.

Never worked flawlessly in my experience.


I suppose it depends on your Lenovo series.

I have been using exclusively T and X laptops for the past 20+ years and they have always worked perfectly. There was a time when the battery life was better with Windows, but maybe only 2 or 3 years "in-between". Nowadays, hardware support is a non-issue for me.


I had a T14 AMD for 8 months or so last year and it was death by a thousand paper cuts. Wake from (S3) sleep would not work well, e.g. the trackpad/TrackPoint would often not come back up. Sometimes the GPU wouldn't wake properly. The fingerprint reader rarely worked. Bluetooth was very flaky. Battery life was much worse than on Windows. I had a Lenovo USB-C dock; however, Linux configured the lanes wrong, making it impossible to use 4k@60Hz through the dock (it worked fine on Windows).


> Wake from (S3) sleep would not work well, e.g. the trackpad/point would often not come back up. Sometimes the GPU wouldn't wake properly. The fingerprint reader rarely worked. Bluetooth was very flaky.

Except for the fingerprint one (I don't have a fingerprint reader), I had all these issues on my Windows laptop.


My experience, also using T and X series laptops matches yours. Some issues with resuming from sleep a long time ago and with firmware on a docking station once but otherwise only one gripe: I've never been entirely happy with the fan control. I found in the end with my T440s that leaving it to the OS worked well enough. Previously, with older X series machines, I had to both try to optimize and also ended up replacing fans.


I just wish other laptop hardware was on par with Apple. One can’t deny their attention to detail, fit and finish.


I just bought a Thinkpad because my home's other Windows machine died. The laptop is OK in general, and I love the matte screen, but the black plastic body is just horrible. It feels as if it's going to break just by opening the screen. Does Apple have some kind of patent for the aluminium body?


> black plastic body is just horrible

It might not "feel" premium or durable but it's way more durable than aluminum... Also doesn't heat up as much on your lap.


And it's not actually plastic but magnesium on the upmarket models. Mine has a few scuffs and the metal shines through


> Does apple have some kind of patent for the aluminium body?

No, they just have an absolutely huge farm of CNC machines cutting them out which is expensive to maintain, not generally available from contract manufacturers in the volumes required, and would take a huge capital outlay from competitors who wanted to replicate it.


Aluminium is heavy, cold, expensive, and it dents.

Plastic is better in most ways for a laptop shell.


So by dent I assume you’re meaning drops? How does plastic survive that better? It’ll shatter, leaving the laptop internals on the sidewalk. I’ll take a dented aluminum body over plastic thanks.


Aluminium has one great property: it conducts heat. Plastic laptops are saunas.


Aluminium is also highly recyclable.


The rapid progress of the Asahi Linux team is very impressive, but there is no need to exaggerate.

There have always been plenty of laptops with very good Linux support.

Of the 3 laptops I have had during the last 15 years (before those I used an Apple laptop, but then I switched to Linux), the first (an HP Compaq) and the third (a Dell Precision) just worked when I installed Linux, while on the second (a Lenovo) I had to spend a couple of days getting everything to work, mainly because it used NVIDIA Optimus, which required some workarounds; after that it worked flawlessly. During these years I have also used Linux on various company laptops, always without problems.

It will probably be at least some months before a MacBook works under Asahi Linux as well as most laptops with Intel/AMD CPUs already work on the day they are bought.


Yeah I'm going to use a completely custom SoC + enthusiast driven driver support - with no support from manufacturer - to be my daily driver as a developer.

Because I've got nothing better to do with my time than diagnose issues due to running a setup only 50 other people in the world use.


Most laptops don't use a completely custom SoC, but most of them have enthusiast driven driver support -- or worse, formerly-enthusiast driven driver support where nobody currently cares -- with no support from the manufacturer. And on less popular hardware; the most popular laptop models in the world are macs.

If you want manufacturer support, you need to use macOS or Windows or one of the very limited selection of laptops which feature Linux as an option. Most people who use Linux probably already use Linux on a computer without manufacturer support. If that's not for you though, that's totally fine.


PC component manufacturers offer Linux drivers for their hardware; sometimes it's binary blobs, etc., but at least there's some effort there. And companies like Intel and AMD are paying engineers to maintain it. For example, AMD and Intel both have open source Linux GPU drivers available. Does Apple even have specifications for their GPU available?


> PC component manufacturers offer Linux drivers for their hardware, sometimes it's binary blobs, etc. but at least there's some effort there.

I'm gonna say that it's a "generic" driver, which is totally fine on desktops but almost always (except for Thinkpads and Linux-focused laptop manufacturers) has a nasty edge case (modified chips or chip firmware) that just makes it incompatible. This isn't exclusive to Linux by the way, even Windows suffers from this exact problem (usually audio, fingerprint and touchpad).

Edit: for example, this is the FreeBSD code for HDA sound device (it's messy): https://github.com/freebsd/freebsd-src/blob/main/sys/dev/sou...


It's not the big components like the CPU and the GPU that's the problem most of the time, it's the wifi/bluetooth chip, the sound chip, the LAN controller, the USB controller, the trackpad, the power management stuff, display backlight, etc etc.


>GPU that's the problem most of the time

Developing a GPU driver for a custom chip with no public reference or support from Apple isn't a problem? I guess you're saying it's not a problem on other laptops - but for Apple Silicon it sounds more fundamental than those issues you mentioned.


Yeah, I was just countering the idea that "AMD and Intel make GPU drivers for Linux so drivers on PC hardware aren't a problem".

On Macs it's a big deal; about as big a deal as making open source AMD drivers was before AMD had official open source drivers. And people did that, and people are working on open source GPU drivers for Apple's GPU now.


Dell, one of the largest manufacturers of laptops, supports Linux and OSS. It's not some red-headed stepchild anymore.


Dell laptops support Linux only for a select few models. You can't just get any Dell and expect manufacturer support. I think "laptops from a few tiny Linux-focused companies, plus a few Linux models from Dell" counts as a "very limited selection", even though it's better than it used to be.


I'm not sure whether this is an argument for or against Asahi. The "50 other people in the world" statement is hardly true for a machine offered with a single SoC and zero customization options except RAM size and SSD.


Honestly. That’s what I’ve been doing my entire professional career.

Modern laptops are not materially different from SoCs, everything is soldered to the board and the CPU does basically everything memory related.

I guess you think that Intel designs are an open standard?

At the very least they’re extremely patent incumbered; with AMD and Intel having a sort of patent truce between them.


Intel and AMD have developers working on Linux support and drivers and support running Linux on their hardware.


Intel and AMD don't make the whole widget. They want people to buy the components, so publicly 'supporting' multiple OSs makes sense. Apple on the other hand want people to buy and use Macs and macOS...


I wonder if you ever got any meaningful support from an OEM. I never have. But I have gotten meaningful support from enthusiasts.


Enthusiast driven software can be really good. Eg the enthusiast community has given us OSS automated insulin delivery systems.


If some of those 50 are kernel devs Linux is likely to run just fine.


You've put in a support ticket with Apple, had an engineer assigned to it, and had it fixed promptly?


> it seems like MacBooks could easily end up becoming the laptop of choice for Linux developers in the years to come.

Having used MacBook Pro 14" with an M1 Max for ~6 months, I'd really wonder which developer submits themselves to such a glare-emitting screen and crappy keyboard (and potentially imprinting their palms/wrists on the sharp front edge), if you can get a Thinkpad X1 Carbon for much less?

Sure, performance was slightly better than the latest X1 Carbon, and battery life was significantly better, but if I am undocked for a prolonged period, that screen and keyboard make it very unpleasant to use. If I needed more performance, I'd go for a desktop or a workstation-type Thinkpad laptop, though I'd lose on the portability (though MacBook Pros — including 14" — are heavy as well).


I was surprised about this hard edge as well.

But as I only use it on a docking station, all good.


Did you experience any issues with USB-C monitors? I have a Dell U3219Q as a docking station, and my Thinkpads running Linux never put it into the state macOS manages to: it won't turn on after the Mac suspends, and it requires an intricate dance of unplugging it from the socket and then turning it on while connected for it to start working again, and not even that always works.


> I'd really wonder which developer

I’m wondering the same. It’s probably just a few?


Oh, I know it's masses of them (my previous company prescribed them, which is how I got the unfortunate experience). That does not make it any less perplexing to me ;)


lol


Don't understand what's so great about it. Pretty much every amd64 laptop works great with Linux. I've had Lenovos, Dells, Huaweis, HPs, etc.

Not to mention laptops designed specifically to target Linux like the Framework laptop.


You really have to use the hardware to understand; things like the trackpad quality, battery life, blissful silence, and general build quality are beyond most other laptops on the market.


> Don't understand what's so great about it. Pretty much every amd64 laptop works great with Linux. I've had Lenovos, Dells, Huaweis, HPs, etc.

It's not amd64, it's especially not Intel. It's not fugly or fnoisy. The bar is really so low.


I'll agree with you here. I had a MacBook Pro (2015) and switched to a System76 Pop!_OS Linux notebook (8th gen Intel; it's really a Clevo-OEMed notebook). It's not quite as well built, but the matte screen is nice, the keyboard and trackpad are good enough, and it has NVIDIA acceleration. Yeah, when pushed the fans spin up and the battery life is mediocre at best, but it can game (thanks Steam) and it's really fast. It's really not bad at all.

I got a Linux AMD laptop for work this year, much better on the battery and still very fast. It's been great.


> Somewhat surprisingly, Linux support for Apple Silicon MacBooks seems to be shaping up to be better than that for pretty much any other laptop

People are writing about Thinkpads below, but the real comparison should be to laptops that are sold with Linux on them, like the Dell XPS 13 Developer Edition, the HP Dev One, or the many laptops sold by System76 which even feature 'instant on' since they run open-source UEFI implementations.

The Apple Silicon laptops have gained Linux support faster than most Apple hardware in recent memory, including the Intel MacBooks from 2016 on. But Linux support has long been especially late on Apple machines because the hardware is custom and quirky and volunteer interest in reverse-engineering it has been relatively low.

And let's be real: a computer without GPU acceleration is virtually unusable. You need it these days for everything from basic features of your window manager to video playback.

That said, Intel and Microsoft have hampered Linux support recently with crap like S0ix and Pluton, so if the GPU support ever becomes stable enough to support most Linux window managers, the Apple Silicon laptops might seem attractive by comparison to laptops that aren't built for Linux. I would consider one at that point. But it's hard to be optimistic when Apple has no material reason not to pull the rug out from under Asahi at any time (not necessarily out of malice).


> MacBooks could easily end up becoming the laptop of choice for Linux developers

That is, if they want to subsidize with their own wallets the dominance of consumer-hostile computing that open source was created to avert.

Except for development such as this here, which aims at liberating users from a closed platform, I would have a philosophical problem to buy such hardware purely for my convenience.

Let's not fool ourselves: when proprietary hardware from platform giants like Amazon and Apple becomes dominant, killing open ecosystems, the end of the road can only be consumer coercion, regardless of whether a negligible fraction of users can still run Linux on such hardware.


Eh, Apple is merely not documenting their M1/M2 hardware as much as they should be; nothing is intentionally locked down in any way, their bootloader explicitly and intentionally supports booting alternative OSes, nothing requires being signed by Apple, etc.

Compare that to what's going on in the non-Mac world, with Pluton and SecureBoot stuff, where hardware is being cryptographically locked down. Or compare it to nvidia, who is intentionally making it impossible to make a good FOSS driver. Yet people have no problems buying thinkpads from Lenovo or nvidia cards from System76.

I'll take "undocumented hardware which needs to be reverse engineered, but once that's done we have a great driver and stuff just works" over "undocumented hardware which refuses to run unsigned software and the user can't install their own signing key" any day.


Yeah, if you can afford a Macbook, you can afford a Framework.


The framework is not at all competitive with a MacBook.


I don't think they are really targeting the same people.


I hear framework laptops can be quite loud though? With ARM Mac laptops, you have the choice between a machine which literally doesn't have a fan, and a machine which has a fan that's turned off unless you're maxing out both your CPU and your GPU for sustained periods of time (and even then it's just a wind sound, not a high pitched whine). The machines are great about coil whine too, which has been an issue for some PC laptops I've used even when their fan is off.


I use a framework and the laptop definitely isn't loud. The fan is off most of the time and even when running it's actually rather quiet. It might be running more than on an M1 (though I don't have personal experience), but it's far from annoying.


Can't you make your argument without the fallacious roping-in of Amazon? Apple isn't Amazon. Also, if Macs are actually as hostile to consumers and to open-source computing as you claim (and boy, is this ever an old and tired claim at this point), why is Linus still working so very hard and enthusiastically to support the platform in question?


In regard to proprietary hardware, it is my assertion that Apple and Amazon are essentially making the same move: throwing money they obtained by dominating their respective markets into closed hardware development that will allow them to lock in their market dominance in the future by limiting consumer choice. You might disagree, but there is nothing fallacious about this line of reasoning.

If you are looking for a fallacy, take your argument from authority: if Linus does something then it cannot possibly be wrong. Which doesn't even get at the point I'm making above, where I clearly contrast developing FOSS for the MacBook, which is clearly a good thing (what Linus is doing), with buying a development MacBook for convenience, thereby putting your money to work against software liberty.


Except we clearly see that Linus has bought Macs over and over again. Presumably because they're the best tool for the job and he likes them. And you still have failed (like everyone else) to make any real argument that Apple is somehow warring against "software liberty". Promoting your own platform doesn't amount to doing so, certainly.


Again, what Linus does and doesn't is not relevant for a discussion about morality - unless you are starting a religion centered around him as a moral model.

Fyi, the App Store license is deliberately incompatible with the GPL.


There is no such thing as "App Store license".


I was referring to the terms and conditions Apple imposes on developers that wish to publish software on the App store.


Luckily you're not required to publish software on the App Store. Many developers sell direct from their website these days.


How is it deliberately incompatible? How is VLC on the app store while claiming in its about page to be GPLv2 licensed? What specific language in the GPL or App Store T&C is incompatible?


You have to declare you own the intellectual rights to the underlying code, a thing the GPL does not grant. You can of course dual-license your own code, but you cannot own code released by someone else under the GPL, and so you cannot publish it in the App Store. It would be an absolutely trivial change for Apple to make their store friendly towards open source, and there have been multiple protests about this.

I think projects like VLC request some sort of copyright waiver from contributors so they can relicense in the future, or else they simply ignore the ToS and expose themselves to summary deletion.


There, this explains it: VLC actually is dual-licensed under MPLv2 as well, according to its about page. Thanks for clearing that up.


Maybe Linux UX is user hostile, eh?


The best Linux support is always for the hardware most Linux developers work on.

This seems obvious but really is not.

As long as MacBooks keep being the insane machines they now are, they will soon become (and keep being) the best Linux machines available.

Just because people will make sure of that.


Not disagreeing (or agreeing) with you about that, but as a coda I would note that I find it somewhat absurd that laptops have become the typical platform for developers. It implies something about what drives people's choices for working environment that I find strange. I do have a laptop, and I appreciate it being portable and thus being able to do work away from my (home) office. But most of the time I work on "desktop" system (Threadripper 2950X) with 3 screens, a fantastic keyboard, an even better mouse and many other things in the physical environment that improve my working experience. In short, it makes essentially no difference to me how "insane" MacBooks might be - they will never replace my "desktop" system for work.

Apparently, this is an area where if I'm not in the actual minority, I'm certainly in the quieter faction.


Many (most?) people don't have the luxury of two development machines. If I could have two, I'd probably get a laptop and a desktop too. But if I can only have one, then I'll take the laptop.


I have that luxury but the next round I’ll just get one powerful laptop and a great screen instead.

It’s way too much work syncing workspaces, repos, configuration, installed software, etc.


You can use a laptop as a desktop, but not the other way around.


The option to work anywhere is nice. I have a dock where I just plug one cable into my MacBook and the entire setup lights up and is ready to go. But sometimes I want to work from the living room, or at a coffee shop, etc., and all I have to do to enable that is unplug that cable and put my laptop in my bag. It's some pretty nice freedom to be able to do that.


This doesn't mean that support would necessarily regress on existing platforms, just that there will be a window of opportunity to catch up to the state of the art.

As for the fixed function accelerator blocks, vendors who cooperate will have more robust driver implementations derived from their work to support customers who rely on it operationally.


Apple doing some minimal work to support their own hardware with Linux might bring them some new business. And it sounds like it shouldn't even be that much work. Mainly just documenting their hardware would be good enough to bootstrap that.

For developers, Macs are just really nice hardware, and there are not a whole lot of similarly nice, Linux-friendly laptops out there. Most of them involve compromises: noisy fans, not-great screens, keyboards, and touchpads, mediocre performance, etc.

Typing this on a Samsung Galaxy Book with a meh screen, meh touch pad, alright keyboard, and unimpressive performance. It runs Manjaro. I do miss having a nice screen and input options. I'm actually using a wireless mouse for the first time in a decade plus. Never needed one with a mac.


You wildly underestimate how much work goes into documenting hardware.


It is a lot of work. But, not having really done it myself, wouldn't they already have most of this? At least one would hope they would.


They need to have that sort of documentation internally anyway for their macOS devs to work on it, right?


And good luck convincing the team of lawyers to groom tens of thousands of pages of internal confidential documentation and source code and prune what can be released to the public from what stays classified as a competitive advantage.

As lawyers usually play it safe (they're paid to defend the company), they'll tell you to keep everything internal, since Apple makes money selling finished products and services, not selling discrete chips to OEMs whose functionality needs to be explained to third parties.


Yeah, in the form of “here’s some excel spreadsheets from the hardware team and half-generated C sources from the validation team, just ask them for more info”.


> For developers, macs are just really nice hardware

eh, not quite. for me the OS is the most important part, with great hardware being icing on the cake. I’m not knocking linux, but it’s not for everyone.


Check out System76, they seem to be making really good laptops. Framework laptops fully support Linux too.


Firstly, I agree with your points here.

And while other comments here have pointed out that not all hardware is supported yet (a valid criticism), I feel it's also important to note that the current user experience of Asahi on Apple Silicon is stellar (as has been since it was released in Alpha).

I've been using Asahi for over 3 months now as my daily driver to do real development work (https://jasoneckert.github.io/myblog/asahi-linux/), and everything I need is there, and fast. Consequently, I'm not surprised Linus is doing the same.


> The Asahi developers have also stated that Apple tends to keep hardware peripheral interfaces stable across generations (they speculate that this is to keep things easy for their own OS dev teams)

It's crazy what happens when hardware designers collaborate with software developers. The current de-facto standard of working in silos with no regard for how the other party gets their part done is what's led us to the current global status quo.


> Given this, it seems like MacBooks could easily end up becoming the laptop of choice for Linux developers in the years to come.

What's the situation with traditional USB and HDMI ports? Dongles are easy to forget and easy to break, and lots of us have traditional USB devices lying around and 10-year-old TVs.


There's an HDMI port, 3 Thunderbolt 4 (USB-C) ports, an SD card reader and a headphone jack on the MacBook Pro.

https://www.apple.com/macbook-pro-14-and-16/specs/


My M1 has an HDMI 2.0 port.


That isn't currently supported under Linux.


It's not, and in general I don't think one can claim that linux support for the Apple Silicon machines is good now. What seems to have been the case so far is that once hardware is supported it mostly stays supported (including on new generations of the machines), which suggests that linux support for Apple Silicon machines may well get to the point where it's really good.


Definitely not the Air


> Given this, it seems like MacBooks could easily end up becoming the laptop of choice for Linux developers in the years to come.

Sounds like a failure for free software and open source, and the quest for progressing for open systems in general.


> to be better than that for pretty much any other laptop.

No GPU support is a big miss though. ARM also means that you can't make every binary made for x86 out there work, even if initiatives like Box86 and Box64 help bridge the gap.


The very interesting thing here is that Apple announced Rosetta for Linux, released together with Ventura (it assumes the Linux system is running in a VM, but it seems to work on plain Linux installations too).


> How much other hardware can claim that?

You mean aside from system76 or of any of the other Linux first hardware sellers?

> The Asahi developers have also stated that Apple tends to keep hardware peripheral interfaces stable across generations

The Asahi team has made an assumption about Apple's implementation of mobile chipsets that hasn't had a Linux competitor working to exploit it.

Nothing is stopping apple from rug pulling the project next cycle.

Nobody should be relying on this.


>Nothing is stopping apple from rug pulling the project next cycle.

Why would they do that? As others have pointed out, Microsoft, Intel, Nvidia, etc. are indeed pulling the rug out on the other side (in the name of security). Locking the Mac down 100% would have zero benefit to Apple and just generate negativity.


I wonder if it's possible to dual-boot the M1/M2 Macs? Having a laptop running both Asahi and macOS would be pretty fantastic (although my understanding is that application support is still early days).


That's how Asahi is built; you still have macOS on a separate partition, so you can boot to either OS at will.
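
If memory serves, the whole thing is driven from macOS by the Asahi installer script, roughly:

    # run in the macOS Terminal; it shrinks the macOS partition and installs Asahi alongside it
    curl https://alx.sh | sh

and at power-on you pick the OS from the startup picker (hold the power button).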


Emphasis on MacBook Vs laptops (plural)


Title correction: From an M2 MacBook Air, which was impressively supported before it had even shipped.

https://mobile.twitter.com/AsahiLinux/status/155396839473481...

https://mobile.twitter.com/AsahiLinux/status/155086604218397...


Check out the Asahi repos on GitHub: https://github.com/AsahiLinux

Didn't know it was fully usable right now, amazing work!

I think one of the few last things missing is the GPU module, which is being developed with the help of a VTuber; she puts up full live coding sessions on YouTube: https://youtube.com/c/AsahiLina


> I'll likely call [the next release] 6.0 since I'm starting to worry about getting confused by big numbers again.

I'm not sure whether to read this as a joke, or if there's really no semantic difference between x & y version numbers for the Linux kernel?

[Edit: that was a sincere question, not 'what a joke', I didn't mean anything against it.]

If I hadn't seen that I'd have been very cautious, a bit worried even, about upgrading to :shocked-face: six point oh, scoured for news & changelogs (yes yes as I should anyway.. actually what I do is assume patch will be fine, and check /r/archlinux for minor bumps - that is, I thought they were 'minor' and 'patch' rather than arbitrary) etc.


The first number being changed doesn't mean anything in particular, it's just like any other release.

Check the release post for 5.0: https://lkml.org/lkml/2019/3/3/236


So we can conclude that the big star of open source is not enforcing the child of open source :(

Ref SemVer: https://semver.org/


I find it irritating when people quote semver like it’s some kind of law.

It’s a random protocol someone came up with that some other people decided to follow. It’s nowhere near a de facto standard for version numbers: plenty of software has non-semver version numbers (in fact, this applies to all software I have worked on so far in my career!)


I think it's pretty clear semver doesn't really have a place in modern software. Except for Microsoft, no one cares about backwards compatibility, which semver is all about. All software is continually developed, every release contains both new features and bug fixes, which violates semver. The vast majority of software has a meaningless "1." or "0." tacked to the front to try to satisfy semver, until the project gets bored of typing it and just drops it.

I think most software should just have date-based versioning. It fits the development models we actually use far better, and actually communicates useful information to the user, unlike semver. Are you running a kernel from 2016? Might wanna update that.


SemVer has a significant benefit for the package-manager world. Many argue you should just read the changelog, but that is not the point. The package manager should know whether it can safely update a dependency to a later version, without someone always manually hardcoding the suitable version. It is everywhere: in Arch Linux, Debian, pip and Cargo. They all rely on version numbers which should themselves describe the impact of a change. If there is no standard, then it is always risky to update. On Debian it means manually testing every package. In Arch you accept the risk. If everyone followed the version schema, then you could trust the number in most cases.

My original message was a bit of a joke which some missed, but SemVer has a place and a need.

> All software is continually developed, every release contains both new features and bug fixes, which violates semver.

A combination of them is not a violation. The overall impact of the change should be described with the correct increment. It does not matter how you categorise the content of the change.
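
To make the package-manager point concrete, here is a sketch using pip's compatible-release operator (requests is just an arbitrary example package):

    # accept new minor/patch releases automatically, refuse the next major
    pip install 'requests~=2.28'     # PEP 440: equivalent to >=2.28, <3.0

The resolver can move to 2.29 or 2.30 unattended precisely because SemVer-style numbering promises those releases won't break the API.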


It's often a change in interface. If you build from source, old code simply won't compile against the new incompatible interface, and won't get to the testing stage.


The impact comes from the dependencies you are using. Depending on how the software is built, it can be noticed early, at build time. But it can also only show up at runtime, e.g. with dynamic libraries, which is hopefully caught at the testing stage.

However, how it impacts downstream, the dependents of your software, is something you cannot test. You can only inform about it. For people, with a changelog. For automated systems, with a version number. But if you do not follow systematic version numbering, you fail to inform automatic systems. They update to the later version of your software and the dependents will break.


> Except for Microsoft, no one cares about backwards compatibility

That would seem to be contradicted by Linus's "WE DO NOT BREAK USERSPACE"[1].

The Linux community clearly cares for backwards compatibility too.

1. https://linuxreviews.org/WE_DO_NOT_BREAK_USERSPACE


That's "Linux" the kernel, not the operating system. No one building a 'modern' desktop OS on top of that kernel cares. Download a 20 year old Linux binary and try running it on a random up to date linux distro. Unless it's a trivial program it will almost certainly fail.


This post is about the kernel, though.


> one cares about backwards compatibility, which semver is all about.

Interesting. I'd characterise it more as 'responsibly breaking backwards compatibility'. Just a tool to communicate the nature of changes, a really (really) brief summarising changelog - what's included, breaking changes, non-breaking new features, or just misc. fixes?

I quite like it, but I tried to phrase my top-level comment not to be about it specifically. It would have surprised me to learn that Linux used 'semver' - that doesn't mean there's no semantic meaning to the version numbers it uses though.


> backwards compatibility, which semver is all about

It's not about maintaining backwards compatibility. It's about making it explicit when you break compatibility. Whether you do or not is completely up to you - semver doesn't care.


The unofficial motto of the Linux kernel is "Don't break userspace" so semver wouldn't be useful.


And what's more, the second unofficial motto of the Linux kernel is "break kernel space"; they don't make any attempt at keeping in-kernel APIs stable. So if they followed semver for userspace, they'd be on 1.9000.0, if they followed semver for kernel space they'd be on 9000.0.0.


There's value in it, albeit less in the minor vs patch distinction I think. So you could imagine a world in which SemVer came first, Linux uses it, and we're on 1.5.19 but usually drop the 1. because we've agreed there'll never be a v2.


That's pretty much the state of the 3.x releases, but at some point numbers got big.


I'd say that was the state of the 2.6.x series.


In 2.6.x patch levels were the fourth number in the version, not the third.


What would be the point of that?


I don't think SemVer is an appropriate versioning scheme for the Linux kernel due to the extremely large number of subsystems that make up the whole.


That's one way of doing it, not the only way, and only tangentially related ('it can be useful to' sort of relationship) to open source.


Some developers (younger ones?) seem to think that the SemVer spec is a law of the universe, when in reality it's just something a GitHub guy put into the mix in the 2010s.


While it is not the law, it is the only reasonable attempt to solve dependency issues. If the software gets a major release whenever the author feels like it, then the version itself tells you nothing about the changes, and dependent software can never estimate the impact of an update. I agree that the Linux kernel is a bit different, since it sits beneath everything, but still.


> Then dependant software can never estimate the impact of update.

They can read the changelog. And if they can't be bothered to read the changelog, then why even update?


Semver is a bandaid on a gaping chest wound, that can best be summed up as "major version number changes may break everything, or they may not, but minor version numbers might not break everything, but they may."

It's more of a philosophy than a law, and it can't really be relied on that much; as often the developers themselves can't accurately predict what is a major breaking change and what isn't.

The Linux kernel itself has some baggage from the 2.4/2.6 era that Linus is explicitly walking away from.


Actually, testing is the bandaid. You always have the dilemma when updating a software's dependencies: do they break something? Of course you could manually check every piece of software's changelog every time to see whether it has some impact. The problem is that, in Linux for example, one dependency might have hundreds of dependents. Are you going to check all of them manually? If standard versioning automatically conveyed this information, a lot of time and testing would be saved.

> often the developers themselves can't accurately predict what is a major breaking change and what isn't.

There is a difference between a breaking change and a bug. All breaking changes are predictable. If you modify the API, it possibly breaks. If you don't modify the API but it breaks, it is just a bug. If you add a feature and don't modify the API, but something breaks, it is a bug. You are not expected to apply SemVer to bugs because you can't predict them.


> They can read the changelog. And if they can't be bothered to read the changelog, then why even update?

Package managers cannot read the changelog. That is why we have version numbers.


There aren't breaking changes.


It's happened in the linux kernel before. There's no "major" difference, it's just development as usual.


As far as I know the first two numbers basically don't have any meaning these days as far as the size of the changes goes.


Yup, they're just going up until Linus gets confused and bumps it up to the next "major" number.


Wait, if Linus is happy about supporting undocumented hardware, then does he like NVidia now too?


This is a specious argument. Supporting undocumented hardware is fine and always has been - what’s not is supporting gross non-free kernel blobs.


What if Apple provided a better GPU driver, as a non-free kernel blob?


Same as with e.g. MP3 or other such codecs, or non-open-source GPU drivers; it'll be offered as an option to end users in certain distributions. It's not difficult or some paradox: it's an end-user preference on the one hand, and the ideals of a distribution or whatnot on the other.


I have literally no idea – but since it hasn't happened and isn't likely to happen… who cares?


We can guess - the blob wouldn't be in the kernel, and the interface to the blob would reluctantly be added (or not added at all) and the installer would work against particular kernels.


I care. It's kind of strange if you're against undocumented software but still run on undocumented hardware. I'm hoping the providers of the tools I use can see this too.


Hardware doesn't change. The Linux kernel and related userspace move fast. Proprietary kernel drivers are painful for users and developers alike. Nvidia has held back the entire desktop and been a massive PITA for laptops.


I find it hard to believe that Nvidia has held back the entire desktop, when >50% of computers don't use Nvidia GPUs.


Nvidia is actively hostile. Apple is slightly willing to let Linux work. It's not comparable.


Apple provides no documentation or any other resources. Nvidia at least provides their closed source drivers and has put some effort into open sourcing parts of them. Obviously Nvidia is not great, but it is still much, much better than Apple, which just totally ignores Linux, and there is no guarantee some exec won't decide at some point in the future that they need to patch the 'security holes' which allow Linux to even run on M1.


I think Apple would have to be stupid to block booting other operating systems on the devices they have already sold. This could easily lead to a lawsuit, similar to the one Sony faced when they removed the OtherOS functionality from the PlayStation 3.


I believe Apple has been making changes to actively support the Asahi community in regards to bootloaders and such.


They have, though it's not clear if it's "engineer at Apple realizes with one small change they can make it easier for Linux" or "internally Apple is fine with supporting Linux on their hardware but they're not openly doing so".

I'm actually moderately surprised they are not actively helping, as often having a second OS option allows for various fun legal tricks when importing, bidding on government contracts, etc.


Apple actively helped Asahi Linux by changing the boot process. NVidia actively works against Linux by using things like the GPL condom. Nvidia is like that kid that is forced by a teacher to do something and then proceeds to do it in the most shitheaded way but just enough for it to work while actively working against the spirit of the request of the teacher.

Apple in contrast has helped Asahi Linux for literally no reason. It wasn't requested, it wasn't even required. It just made life easier for Asahi Linux.


Apple helped Asahi allegedly. There is no explicit support of Asahi from Apple, just to be 100% clear.

The change you're talking about is likely to have been for Asahi, but no statement from Apple so far about it.


Allegedly? Apple added a feature useless for themselves but something Asahi Linux benefited from greatly. Can't really make it more clear than this. Or do you expect Tim Cook to announce Asahi Linux in the next keynote?


Maybe the "allegedly" is misplaced; I hope my overall point is clear anyway. Apple hasn't "officially" helped Asahi; the change could have been made by a single contributor to the kernel, for example, or have been more of a "why not" change without regarding Asahi specifically. We (unless you work at Apple) don't really know at this point.


Ask marcan_42 for how much support Apple have provided. When it has come up, they have been pretty open about it...


Marcan has been pretty clear in saying the same thing as I've said here. Apple doesn't have much use of such a feature themselves (apparently), but no communication was made of why the change happened. However small chance is that the change was made for other purposes, we simply don't know why it was made.


https://twitter.com/marcan42/status/1554395184473190400

> "Okay, it's been over a year, and it's time to end the nonsense speculation."

> "I have heard from several Apple employees that:"

> "1. The boot method we use is for 3rd-party OSes, and Apple only use it to test that it works, because"

> "2. It is policy that it works."

> "Hacker News peanut gallery, you can drop the BS now. It's not an "assumption" that this stuff exists for 3rd-party OSes. It couldn't "be something internal Apple uses that could go away any minute". That is not how it works, it never was, and now I'm telling you it's official."

> "And this isn't even news because @XenoKovah (who invented and designed this entire Boot Policy 3rd party OS mechanism) already tweeted about this whole thing a long time ago, but apparently it needs to be restated."

So no, we do know why this was made and no, Marcan hasn't made the claim that you have.


How the hell is a company that releases official drivers to support an OS working "against it"? How did you get yourself into a mental state where you convinced yourself that a company that officially supports your OS is "working against it" and is somehow worse than a company that will do absolutely nothing to support the OS?


But it's still discouraging to the companies that actually support Linux.


Ok it’s on them to make a laptop that can compete with an Apple Silicon Air then


While the main focus is on the surprising compatibility of M2, I still laughed a bit at this:

> (*) I'll likely call it 6.0 since I'm starting to worry about getting confused by big numbers again.

It seems that he ran out of fingers and toes again.


REAL developers only need to count from zero to one.


Nothing against the Asahi team, it sounds like they're doing fantastic work. But I'm a little surprised at the talk (which I've seen before) which indicates theirs is the first approach allowing you to use Linux on ARM Macs. A few months ago I used UTM Virtual Machines (which itself uses QEMU) to get ARM Linux up and running on my M1 Mini, and it worked brilliantly. Admittedly I only use the command line interface for working on Linux -- maybe the GUI is lacking? -- but I effortlessly got G++, Perl 5, and Raku working on it, and used that to get a full build of my $work up and running. It was actually easier than getting the same setup working on the MacOS side of things, with great performance.


That's not comparable. Asahi is about running natively on the hardware.


I guess working on bare metal was implied. Virtualization has always been useful, but it's another use case. Asahi enables the user to bypass macOS entirely, which I guess is the only satisfying way for Linus Torvalds to use Apple silicon.


What would happen if Linus decided he likes macOS better? He's kinda stuck at this point, no?


Why would he be stuck? Many people are pragmatists over purists, wouldn't be far out if Linus was as well. I've written many tools myself that were better than anything I could find, but at a later point I found a "competitor" that did something better than me, and subsequently dumped my tool in favor of the better one. I for one don't care who writes the software, if it's better it's better.


They have desktop Linux running, not only the command line. Apple has also been making changes to help Asahi.


So Linus wants to use an arm64 machine to build and test the kernel. Why not use AWS Graviton instead? Graviton3 may be slightly less powerful than the M1 for single-core workloads, but for multicore workloads like building the kernel it's much better suited. In fact, I don't understand why Linus uses a laptop for development. You can easily configure a more powerful cloud instance for the type of work he does. It may cost more than a MacBook Air M1, but I'm sure the Linux Foundation can afford it.


Maybe Linus wants to work on his laptop and not the cloud, because that's just the way he wants to work. And he wants an M1 / M2 because he likes something about them.

That doesn't feel very controversial, someone wanting to work the way they want to and not some other way.


You're assuming that he does the work using a good internet connection, is not physically mobile, and doesn't care about latency if you're proposing AWS as a replacement for a laptop. Which may happen to be true... but these are still serious assumptions.


And the news hidden in the news: this will be renamed Linux kernel 6.0.


The next kernel will be 6.0, not the one that was just pushed out and tagged.


Right, sorry.


Anyone using it full-time? Just curious how the simple things work for you. The biggest question for me would be the keyboard, since Linux seems to be much more Windows-like.

PS: I'm on a P1 Gen 4 ThinkPad, very happy with it, BUT it is sometimes really noisy.


As the sibling said, the keyboards are basically the same, especially the US ANSI variant (horizontal Enter).

Localized versions can be different; the French one, for example, has some symbols in very different places.

Also, the ALT (Option, key to the right of the left CTRL) and Super (Command, key to the left of the space bar) are reversed, but you can tell Linux to swap them.

I use `options hid_apple fnmode=2 swap_opt_cmd=1` to have function keys register as Fx without the FN and to have the ALT key next to the space bar.
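In case it's useful to anyone, a minimal sketch of making that stick across reboots (the conf file name is arbitrary, and if hid_apple is loaded from the initramfs you may need to regenerate it):

    # /etc/modprobe.d/hid_apple.conf -- picked up the next time the module loads
    options hid_apple fnmode=2 swap_opt_cmd=1

    # Or flip the parameters at runtime to try them out first
    echo 2 | sudo tee /sys/module/hid_apple/parameters/fnmode
    echo 1 | sudo tee /sys/module/hid_apple/parameters/swap_opt_cmd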


Apple keyboards are the same as Windows ones; they just have different labels. The Alt key is called "option", Ctrl is "control", and the Windows key is "command".


Lots more different than that:

+ Windows has the Windows key (which Linux calls "super")

+ Windows also has the context menu key, which I think is pretty useless to be honest, but others might get value from it

+ The Mac keyboard layout is US-like, even on European models, which means a bunch of keys are in different places from their European IBM counterparts (like @ and "). The # and / keys I find particularly hard to locate on Macs. I often end up redefining my Mac keyboard to be an IBM layout, even though it then differs from the key caps. It's a nightmare for anyone else using my Mac, but much easier for me, who's had ~40 years of muscle memory.

Not the keyboard hardware (so somewhat out of scope for this context), but it's also worth noting that there are some macOS keyboard shortcuts that differ in really confusing ways too. Like:

+ Ctrl+C vs Cmd+C

+ Text area navigation on a Mac is totally different too. Home and End buttons behave differently. Ctrl+Arrow keys don't work. Shift+Arrow keys don't select. etc

Every time I spend a few months in Linux and then switch to a Mac (or vice versa) I inevitably have a few days of pain relearning shortcut keys.


Every Norwegian Mac laptop keyboard has been physically identical to every Norwegian PC laptop keyboard I've used, with the exception of the label changes and the missing context menu key.

And Mac keyboards have the Windows key, they just call it "command" instead of "Windows" or "super" but it's in the same place.


The Command key is not in the same place as the Windows key; the Command key neighbors the spacebar, whereas the Windows key is wedged between Alt and Ctrl.


Oh you're right. I use right cmd/right super so rarely that I never noticed, despite regularly using both types.


It also doesn’t function the same, with the windows key being used mostly (only?) for global shortcuts like locking the machine.


> + Windows also has the context menu key, which I think is pretty useless to be honest, but others might get value from it

As a curiosity, this is actually very useful for keyboard-only users.

On X11 at least, context menu display is typically associated with Shift+F10. However, in some cases (I think the file panel in Sublime Text 3 was a notable case; not sure about v4), this binding doesn't work.

Context menu key gives a binding that is guaranteed to work :)


> Windows also has the context menu key, which I think is pretty useless to be honest, but others might get value from it

It is a convenient key to map the compose key onto.
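On X11 that's a one-liner (per session; most desktop environments expose the same XKB option in their keyboard settings):

    # Turn the Menu key into a Compose key for the current X session
    setxkbmap -option compose:menu
    # e.g. Compose, ', e  ->  é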


> The Mac keyboard layout is US-like

This is absolutely false. European Mac keyboards are ISO. The labeling doesn’t matter, as rolling your own layout is trivial.


Being ISO doesn't mean it's not "US-like". The UK layout, for example, does differ considerably between Macs and IBM, with the UK Mac layout being closer to a US layout than a UK IBM layout (examples posted here: https://news.ycombinator.com/item?id=32304113). I couldn't give a crap if the UK Mac layout is an ISO standard or not because that doesn't magically change the way I use a keyboard.

Furthermore I made the exact same point about redefining the layout. You claim to debunk my post yet go on to make literally the same points I did. In literally the same bullet point you're trying to debunk too!

(I'd forgotten why I usually avoid commenting on Apple-based threads. Every word is analysed, and if it can be interpreted in even a remotely disparaging way by either Apple or Linux fanboys, then they'll stop reading the entirety of your comment and pull that one phrase out with some terse rebuttal phrased as if the original poster was an idiot. Irrespective of whether that rebuttal is actually just reiterating the OP's comment, or of whether the original comment, when read in full, was actually negative at all.)


Well, actually, you are both partially right and partially wrong.

The French Apple layout is closer to the Windows layout than to the US Mac version. But it is quite divergent from both the US layout and French Windows, because it includes localized typographic enhancements inherited from the 1980s. They basically localized differently on Windows and macOS.

There is another ISO international flavor of the Apple keyboard that looks closer to the US version, but it never gained traction in the French market. I guess that even if switching between the Windows and Mac French layouts requires some time to rewire muscle memory, it's still easier than opting for the international layout.

For instance, on both the Windows and Mac French layouts, the number row requires Shift to produce digits, because by default those keys are used for accented characters.

PS: I have a mechanical Mac international ISO keyboard because I sometimes switch to US, but when writing in French I still, to this day, use the French layout from muscle memory.

What is actually a shame on Apple's part is that some dev-oriented shortcuts (like Cmd+Shift+Period) have never been properly localized in macOS, which is why using the international keyboard helps in some contexts (the shortcut fails because Shift is already required to produce the period on the classic French layout).


>The Mac keyboard layout is US-like, even on European models. Which means a bunch of keys are in different places from their European IBM counterparts (like @ and ").

I just checked three external Apple keyboards, one integrated Apple Keyboard and a Logitech G15, and as far as I can tell, all of them have the @ and " in the same places.


They do differ on the UK keyboards (examples below). Other keyboards for other European languages will differ in different ways (differ both from the UK layout and also from the US layout).

UK Mac keyboard: https://www.apple.com/uk/shop/product/MMMR3B/A/magic-keyboar...

A random UK "IBM" keyboard: https://www.amazon.co.uk/Ultra-Classic-style-keyboard-Black/...


> macOS keyboard shortcuts that differ in really confusing ways too. Like + Ctrl+C vs Cmd+C + Text area navigation on a Mac is totally different too. Home and End buttons behave differently. Ctrl+Arrow keys don't work. Shift+Arrow keys don't select. etc

Cmd+C is actually more comfortable to me, as you use your thumb instead of pinky. My pinky is always fatigued after using a Win/Lin machine.

Shift + Arrow keys do select text, you must have something configured differently.

I don't remember what Ctrl + Arrow keys do on Win/Lin; assuming it's jumping by word boundaries, that is Option + Arrow keys on a Mac.

Overall my experience with Apple keyboard shortcuts is much more positive than on Windows and Linux. On Windows it feels like they've run out of modifier keys by locking the Windows key behind Windows-specific features, and Linux mostly just copied Windows.


> Cmd+C is actually more comfortable to me, as you use your thumb instead of pinky. My pinky is always fatigued after using a Win/Lin machine.

I'd say they're the same in terms of comfort but I do appreciate CMD+C when working in the terminal.

> Shift + Arrow keys do select text, you must have something configured differently.

They select text differently. I went into more detail about the differences in a different post on this same thread.

> Overall my experience with Apple keyboard shortcuts is much more positive than on Window and Linux. On Windows it feels like they've run out of modifier keys due to locking the Windows key behind windows specific features, and Linux mostly just copied windows.

I think it's 100% down to muscle memory. If you're more familiar with the Mac shortcuts then you'll prefer those, and likewise for the Windows/Linux shortcuts. Saying one is better than the other is rather silly when it's entirely down to what you've committed to muscle memory.

Hence why my point wasn't about preference but rather just pointing out that there are differences one has to adapt to when switching from one platform to another (whichever direction that switch might be).


> Hence why my point wasn't about preference but rather just pointing out that there are differences one has to adapt to when switching from one platform to another (whichever direction that switch might be).

Think about this from a high level; it’s obvious, yes? Why did you write an essay on “Windows, Linux, and Mac key shortcuts are different” and then “it’s clearly subjective”? These are obvious statements.


> Why did you write an essay on “Windows, Linux, and Mac key shortcuts are different”

How is my original post an essay? It was basically just a bullet pointed list. Are we really that deep into the Twitter generation that anything more than a couple of sentences is considered an "essay"? Or are you just throwing that term about to be derogatory about my comments?

I don't really understand what your problem is here. The OP said Macs and IBMs differ, then someone else replied saying there's only one difference. That's where I replied with a bullet-pointed list of additional items the commenter before me missed.

The conversation was really that simple.

All this extra stuff about "essays", preferences, subjectiveness, etc. is additional context you're adding, and not something I was ever discussing. It doesn't matter what you, or anyone else, prefers in relation to my post. Literally the only point I was making was that they differ, because the commenter before me seemed unaware of many of the differences. That's literally it.


> The Mac keyboard layout is US-like

In Israel we have the option of choosing Macs with either the US layout or the local (ISO) keyboard layout. What changes is mainly the Enter key, which spans two rows on the European models, the "`" character, and the characters next to the Shift keys.


Indeed, that's my point. What you've described (Mac US vs Mac UK) is less significant than comparing a UK Mac and a UK IBM keyboard.

I'm not suggesting that a UK Mac keyboard is "the same" as a US Mac. Just that for someone used to a UK IBM keyboard, a UK Mac keyboard will feel different (and vice versa for Mac users switching to IBM keyboards).


Much of that is not the keyboard but the OS, so running Linux on an Apple keyboard will not have the last two issues.

As for # in macOS: don't use a UK layout, use Irish or Australian, which are the same except for Shift-3.


> Much of that is not the keyboard but the OS, so running Linux on an Apple keyboard will not have the last two issues.

Only the last two items, and I did prefix those items saying it was the OS, so they wouldn't be affected by running Linux. I'm not trying to deceive people here.

> As for # in macOS: don't use a UK layout, use Irish or Australian, which are the same except for Shift-3.

...or you can just redefine the keyboard layout, as I also stated in my comment (I assume you did actually read it?).

None of the problems I raised were intended to be negative about Apple so you don't need to jump to the defence here. I'm just stating the differences between Mac and IBM/Microsoft keyboard layouts.


Sometimes Mac OS doesn't like to co-operate with "just change the keyboard layout": https://apple.stackexchange.com/questions/255082/external-is...


Text navigation works in the same way on Linux/mac using readline key bindings.


Not sure what you're implying here. CLI applications? Because they'd obviously behave the same.

The issues I've noticed are around desktop applications. For example if I select some text in KWrite (KDE's "Notepad") by using SHIFT+CTRL+LEFT_ARROW then it will select the word. If I do the equivalent in Notes (macOS) it will select the whole line, not just the word (I tested this just now too).

These platform-specific behavioural differences aren't unique to KWrite vs Notes; I just picked those applications as an example. It also isn't just SHIFT+CTRL+LEFT_ARROW that differs. There are quite a few subtle differences when using the keyboard to navigate around text areas. None of them significant, but they're always enough to break my concentration when trying to get stuff done.

A lot of these can be redefined, both on Linux and on macOS. So you can absolutely make the two platforms behave the same way, given enough time and motivation. We are just talking about defaults here.
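For the macOS side, Cocoa text fields read a user key-bindings file; here's a minimal sketch for Linux-style Home/End, assuming the standard path (the selectors are the stock NSResponder ones, pick whichever behaviours bother you):

    /* ~/Library/KeyBindings/DefaultKeyBinding.dict -- apps pick it up after a restart */
    {
        "\UF729"  = moveToBeginningOfLine:;                      /* Home */
        "\UF72B"  = moveToEndOfLine:;                            /* End  */
        "$\UF729" = moveToBeginningOfLineAndModifySelection:;    /* Shift+Home selects to line start */
        "$\UF72B" = moveToEndOfLineAndModifySelection:;          /* Shift+End selects to line end   */
    }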


Text entry fields in Mac programs accept Emacs-mode Ctrl combos: C-a, C-e, C-u, etc.


I just tried that in 3 different applications: Finder (search bar), Notes, and Slack (just in case it was an Electron-specific behaviour). I couldn't get those readline/emacs bindings to work in any of them. This is a brand new MacBook Pro with all the defaults for the UK region, so it's definitely not something I've (knowingly) disabled.

Is this something that needs to be enabled? Or is it dependent on something else to work? It's definitely a feature I'd welcome.


Weird! https://support.apple.com/en-us/HT201236 lists them under "document shortcuts." Well, not c-u although I swear that one worked too. Maybe they removed them and this document is out of date? I haven't used a mac in a long time, so it's possible. Damn shame if they did, I always liked that feature.

[edit] A friend checked on his Monterey machine and said they work as expected in system text boxes.


Ahh c-u was the one I tried so that explains why it didn't work.

Thanks for the link, that will come in handy!


> SHIFT+CTRL+LEFT_ARROW then it will select the word. If I do the equivalent in Notes (macOS) it will select the whole line, not just the word (I tested this just now too).

Use option instead of control here.


Ahh yes, that works. Thanks for the tip. That'll come very in handy.


Not using it full-time, but I have a partition for it. I went without a DE on my install; figured I'd hold off until we get GPU support. I'm on an M1 Pro and all the essential hardware works, so it's fine for my purposes (C++ development).


I'm running Ubuntu arm64 on my M1 MacBook Pro 14 inch via Parallels + a Bluetooth keyboard.

Ubuntu's not a daily driver, there's a bit of initial effort in switching between macOS + Linux keyboards, but it's usable.


“ On a personal note, the most interesting part here is that I did the release (and am writing this) on an arm64 laptop.”

Some comments seem to miss this. It seems he tried out an ARM Mac notebook, not in a big way, but it will be for travelling.

For the release itself, tbh, I am confused by his last merge window (*) comment. Is it 6.1 or not? I am not sure. (Should it be an odd number if it replaces 5.19, or 6.0 as it comes after 5.19?)


Linus' third time using ARM64, once "a decade+ ago when the Macbook Air was the only real thin-and-lite [machine] around", confuses me. The original MacBook Air and its successors a decade ago used Intel, IIRC?

<edit>Wikipedia agrees with my memory.</>


Quoting directly from the text:

> It's the third time I'm using Apple hardware for Linux development

No mention of ARM64 there.


Oh, I misinterpreted. How do I delete my old posting...

Although Apple-branded hardware is not Apple-developed hardware, so I read that in a different way. But now I realize that PowerPC was not an Apple development either, so he means Apple branding. So in my head, this is his first time using Apple hardware (the CPU, not just the keyboard, casing, etc.) for kernel compilation.

Thanks, Apple, for insisting in your PR statements that "this is the first Apple hardware" for clamshell laptops.


He said "It's the third time I'm using Apple hardware" (i.e macs, not arm64)


Third time using Apple Hardware.



