Hacker News
KDE runs on the Apple M2 with full GPU acceleration (vt.social)
368 points by c80e74f077 on Nov 25, 2022 | 123 comments



Asahi is getting closer and closer to "daily driver" usability at an amazing pace.

Anyone have an idea how soon we should expect GPU support to be in mainline?


I've been using it as a daily driver for a couple of months easily, but my daily driving happens not to require the not-quite-perfect device drivers. I shut down instead of sleeping, for instance, which works just fine, since boot and login are super fast and everything gets restored.


Possibly stupid q: Why buy a mac if you are just going to run Linux on it? I suspect any comparable PC would be more economical (w/ exception of power draw).


There are a number of reasons.

1. I prefer Mac hardware to any PC hardware (I don't know any manufacturers who come close to apple in hardware quality, so I don't think the "comparable PC" you cite even exists in reality).

2. I prefer to use Linux, since I'm more familiar with it, I'm more likely to be able to debug it when things go wrong (macOS Just Works more reliably, but when it doesn't, I'm stuck), and also I work on software that runs in prod on Linux and I don't want to deal with Docker for Mac.

3. While this is not yet the case, I think it's likely that someday Asahi will run better and more reliably on macs than mainstream distros run on PC laptops. The reason is that they only have one target (or, I suppose, one very closely related family of targets) whereas there are a pile of different PC vendors that are all subtly broken in different ways. I've _never_ seen a high-end PC laptop run Linux without tons of bugs and weird quirks; to get a solid Linux laptop experience, you seem to need to eschew discrete graphics cards and use a system that's a few years old at minimum.


I would probably suggest that Lenovo comes pretty close to Apple in terms of build quality. I have both Macs and ThinkPads and they both feel pretty good. The Lenovo ThinkPad X1 Carbon Gen 10 is available with Fedora Linux as an option[1] and pretty much everything works out of the box with great performance.

If you're in Europe, you can also get the ThinkPad Z13 and Z16 (which are AMD-based laptops) with Fedora as well, and that should be coming to the North American Lenovo store soon (hopefully).

Lenovo works with Fedora to ensure that things work, and there's a nice process to make sure everything stays "good" with Fedora Linux on Lenovo hardware.

[1]: https://www.lenovo.com/us/en/p/laptops/thinkpad/thinkpadx1/t...


That laptop is not “high-end”. i7 processor, standard (not 4K) screen resolution, Intel graphics.

I also have a thinkpad, and it gives me no end of issues, presumably largely due to the Nvidia GPU. There are also some weird non-GPU-related quirks; for example, charging over USB-C sporadically stops working (requiring an unplug and re-plug of the cable), especially when the battery is low.


My X1 has lasted through 3+ years of really hard use and only really has issues due to my own fiddling with clocks. It also has weird TB3 eGPU issues, but that's true of nearly every laptop with a dedicated GPU and TB3, which is a surprisingly small list. It took a 64GB RAM upgrade easily, and RAID NVMe with ease.


The X1s have great build quality compared to other non-Apple laptops, but mine didn't last but about 1.5 years of daily work use before the case itself started failing (case metal chipping and bending + keys failing, battery issues, ports failing, etc etc). On the other hand, my 2014 macbook air still looks almost new even after almost 7 years of daily use.

I think it's as close as it gets, but it's still nowhere near Apple. I'd love to see a premium non-Apple manufacturer.


I still have a gen 1 X1 Extreme and it's going strong. Runs Fedora flawlessly. If I don't buy Apple, then I ain't buying anything other than Lenovo.

Work gave me a Dell XPS 15, one of the worst laptops ever. Two jobs, two XPSes, two generations apart, and both terribly plagued with issues.


I have an X1 Yoga, the one that started using USB-C ports instead of the older rectangular charging ports. I've had to buy 3 cables now, which are not cheap because they are attached to a brick, and I resoldered the cable myself twice as well, so basically 5 times I've had to replace it in the space of 2 years. I've been using ThinkPads for as long as I can remember and this was never an issue for me.

I think the USB-C port is connected directly to the motherboard as well, so I suppose I should be grateful that it's only the cable that's been breaking. I'm going to sell this off soon and I'm pretty much done with Lenovo for the foreseeable future.


It would be easier to buy a >60w USB charging brick and a USB cable. Then you can just replace the cable next time it breaks. They're only a few £

This is how I power my Xiaomi and Macbook laptops.


Lenovo isn't close. I have several generations of X1 Carbons, and they all do terrible things like:

- Scratch their own screens when lids are closed
- Have failing LCD connectors (especially true with my 4K variant)
- Flex terribly

I bought an M2 Air recently, and it's a far better machine.


To be fair, MacBooks would still scratch their screens if they weren't using glass; ThinkPads don't if they have the glass variant as well. There isn't a true matte option from Apple, and it shows why: even if they ever release the nano-texture frosted version, it loses to basic matte panels in viewing range and glare reduction.


Apple computers used to break their own plastic by hitting themselves with the closing magnets and some protruding pieces of plastic.

I guess that's why they make them of metal now… Fewer problems with bad design and with using the user as a heat sink.


I think metal conducts heat better, so in the fan era they can run quieter


My 14 inch MB Pro also scratches and stains its own screen with the lid closed. It's the only problem I have with this machine.


Yeah my expensive Lenovo Slim 7 Pro Carbon edition has just had the hinge fail at 11 months. I'm sick and tired of high end Windows laptops all failing due to hardware issues. I am seriously considering a Mac for the first time in two decades in disgust with the state of reliability of Windows laptops. But I don't want to run Apple software and live in their walled garden.


Why not get a Surface, then? The keyboard is fantastic and magnetically attached, and you could opt for Intel or ARM. Disclaimer: I have a Surface Pro X and it's truly excellent build quality.


I heard of significant driver and stability issues on some of them, despite being all MS. How's yours going, software wise?


You speak like someone who has never tried an Apple Silicon laptop; they are in a different league.


Not even close when it comes to build quality if we're being honest


Is there an ARM Lenovo with Linux?


x13s


I have some hope Framework laptops are going to end up filling this niche eventually. Right now, Linux is an afterthought on them, but it's the same for the Apple M1/M2 series.


I get 12+ hours of battery life on my M1 Air, no longer bother with cables and outlets at coffee shops.

What else I could buy of similar weight/size/battery/quality?

Even if I knew of an alternative, there are other unexpected perks to going with Apple: I travel constantly, and occasionally selling my old one and switching to a new machine is easy, whereas with other brands it would be impossible. Amortized cost is less than $1/day.

If this Asahi thing pans out (I'm guessing maybe in a year or two it won't use twice the battery) I'll immediately dual boot and spend the majority of my time in it. :)


This also happened when Macs transitioned to Intel and Core 2 Duo laptops were released. At the same price point, PCs were much noisier and had much poorer battery life.

ARM laptops from other brands are starting to pop up but they will take at least a year to catch up in terms of performance.

Right now, ThinkPad X13s runs Linux very decently, but it's less powerful than the M1.

In the US market it might be cheaper than M1 Mac Airs. In Europe it's 50% more expensive and customer support is poor.


> This also happened when Macs transitioned to Intel and Core 2 Duo laptops were released. At the same price point, PCs were much noisier and had much poorer battery life.

I had one of those MacBooks… very silent, but the fan broke quite soon, and to use it I had to limit the CPU frequency or it would overheat and shut down.

I'll take the noise over the sleek computer with the air intake and exhaust placed in the same hole that overheats constantly.

The machine was completely unusable in the summer; the bottom would get scorching hot. I could NOT place it on my legs.


There were ARM laptops before the M1; nobody wanted them. The M1 was ahead because it was one silicon node ahead for a year. The competition has already caught up:

https://www.cpubenchmark.net/power_performance.html


That comparison chart is not very accurate. The Core i7-1250U CPUMark score of 13453 is at 29W (TDP up), not 9W (TDP down), so the performance per watt ratio they're using is inaccurate. Also, if you look at the CPUMark score distributions, you see a bimodal distribution because it mixes the 10 W (MacBook air) and 15W (MacBook Pro) configurations.

The 10W (MacBook Air) variant of the M1 achieves a median CPUMark score of ~14500. That's a lot better than the Intel CPU's 13453 at 29W. Now, if you limit the Intel CPU to a lower TDP by underclocking, the performance per watt can improve substantially (since the efficiency plummets when going for peak clock speeds), but it's still considerably behind the M1.
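A rough back-of-the-envelope sketch of the comparison above, using only the CPUMark scores and wattages quoted in this thread (not independently measured figures):

```python
# Performance-per-watt comparison using the numbers quoted above:
# Core i7-1250U scoring 13453 at its 29 W TDP-up configuration,
# and the ~10 W MacBook Air M1 at a median CPUMark of ~14500.
chips = {
    "Core i7-1250U @ 29 W": (13453, 29),
    "Apple M1 (Air) @ ~10 W": (14500, 10),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.0f} CPUMark points/watt")
```

Even taking the Intel score at face value, dividing by the wattage actually drawn (rather than the 9 W TDP-down figure the chart assumes) puts the M1 roughly 3x ahead on this metric.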


10W is the typical TDP, not the max TDP of the M1 chip. AFAIK the M1 does not specify a max TDP, which can go higher than the max TDP listed for the i7.

https://www.anandtech.com/show/17024/apple-m1-max-performanc...


Yes, but not in terms of heat and battery. What fanless laptop do you suggest with similar specs to a MacBook Air M1?

The best thing I have found that runs Linux reasonably well (still with lots of caveats) is the X13s. And I cannot get it at a decent price.


Heat and battery life are directly tied to performance per watt.

With equally performing software, the one with the best performance per watt will run cooler and longer.

The graph I linked is not raw performance, but performance per watt.


Sure, but in practical terms are there any laptop makers that sell any CPU from that list, say an i3-1210U, in a good case and with good cooling?

Because, as I said in the original post, even when Apple was selling Core 2 Duos, their cooling was better than most other manufacturers. I was often buying Macs to run Linux because of this.

Also see what the parent post says about performance per Watt, which mirrors my experience as a user. No current x86_64 machine is close to ARM if you want low heat.


Nobody wanted an ARM laptop because it doesn't run Windows/existing software. Apple bundles an OS and a decent emulator for old software. If they didn't, nobody would want it.


There were ARM Surface laptops before M1 Macs. They ran existing software, except x64 apps sadly. Windows Phone was Windows on ARM.

People didn't like ARM Windows laptops, first because the performance was crap, then because of the poor compatibility.


Interesting product. 230g lighter with 10 hours longer battery life than an M1 Air.

> In Europe it's 50% more expensive

It's basically the same price in the UK. £84 more than an equivalent M1 Air on the Lenovo site right now.


That's a good price. In mainland EU, right now, the X13s is €1490 in lenovo.com. A MacBook Air M1 is €1200, and I can get almost 10% off using academic discount.

Plus, the Snapdragon CPU in the X13s is still far from the M1 in regular benchmarks. Linux support is better in the X13s, except for the camera. It's an IR one, and this might never be well supported.


The hardware is pretty good. Utterly unmaintainable, unserviceable and unupgradeable, extremely overpriced tiering (e.g. adding storage or RAM, which you have to do at purchase time), with limited options (e.g. I cannot stand glossy screens with shitty reflections everywhere causing eye strain), but still very good. However the software (macOS) is pretty shit and IMO hard to adapt to coming from any other OS.

Raw performance per watt, and per weight/dimensions is best in class. For pure performance (e.g. an Asus ROG Zephyrus) or lightness (e.g. LG Gram) there are better options, but if you want all three it's hard to beat.

I personally think the hardware is so good, even with the caveats, but the software so bad that I'm honestly tempted to get an Air for portability or a Pro as a daily driver when Asahi Linux is good enough for me and the prices are right (so some sale or something, sticker prices are ridiculous if you max everything, and you kind of have to due to the impossibility of upgrades).


Serviceability is improved (though still not amazing) in all the machines with chassis redesigned during the "M" era — the "notched" MBP 14/16 and Air have easily accessible bottom screws (no need to remove rubber feet first) and don't have batteries glued in. Keyboards can be changed independently from the top case too. Notably, many Windows laptops fail on both of those counts (like the LG Gram, which hides screws under adhesive-attached feet).

But yes, it's difficult to find laptops as well-rounded as MacBooks are. Generally laptops will require you to make significant sacrifices in multiple categories to be good at one or two things, which is less true of MacBooks (particularly the 14"/16" Pro models), especially if you want good performance without the laptop being huge and bulky and/or have horrible battery life with constantly-screaming fans. The 14"/16" models get you performance in the ballpark of a desktop Ryzen 5800X while unplugged and still getting great battery life while also being silent and still reasonably portable, along with a killer screen, great speakers, decent keyboard and great trackpad.


Macs, for me, are the software. This is something that became pretty evident during the Intel era. And I bought Macs exclusively when the hardware was both much slower and pricier than PCs (68k and PPC), because I loved the software so much.

Funny to read such an opposite opinion.

I don’t mind the pretty casing, but it’s icing on the cake.


I'm quite sure it's just a question of what you are used to. It is for me, anyway.

My first PC (~year 2000) came with Windows but I wanted to use some software that only existed for Unix at the time and I was used to work in Unix anyway, so I heard about Linux and installed it. Great, I got an OS I was used to and the software I needed for my project.

When I finally had to use Windows for work a couple of years later, it took time to adapt and, even to this day, I just find it easier to use Linux. It's just a matter of what you are used to.

Last year I bought a MacBook, because of the M1, and I can't get used to the "weirdness" of macOS, especially the keyboard and the window management. Every other machine I use (Linux, Windows or ChromeOS) uses the same keybindings, but in macOS the same software I use everywhere else (e.g. Chrome) has been forced to change the standard keybindings to something else, and it's not even configurable. Programs just don't implement things like C-c to copy and C-v to paste. Programs bind that functionality to S-c and S-v instead. WTF? This means there is no remapping of the keyboard that can fix this, since the software itself is broken.

For me, this makes the machine pretty unusable. I'm a keyboard guy and quite fast at it. But when I'm in macOS I waste a lot of time finding the right keybindings even for switching windows. Example: S-w to close a tab but C-TAB to switch tabs %~(


> Last year I bought a MacBook, because of the M1, and I can't get used to the "weirdness" of MacOS, specially the keyboard and the window management.

For what it's worth, long time Mac users feel that *nix desktops and Windows have the same kind of "weirdness" you describe here. The majority of modern macOS conventions can be traced back to the original 1985 Mac or the 5-10 years following its introduction.

I started on macOS but can switch between control schemes pretty fluidly these days, thanks to having regularly used all three major OSes for several years. That said I wish there were at least one Linux DE that cloned macOS conventions as faithfully as the rest have cloned Windows conventions (with the exception of GNOME, which is more like what you'd get if you turned iPadOS into a desktop OS with Windows keyboard shortcuts).


There is a tool that makes Linux act as if it has Mac keybindings. https://github.com/rbreaves/kinto I've been using it, with some custom config, and it's made life a lot easier as I use a Mac and Linux laptop at the same time.


Thanks for the mention. I have tried hard to be faithful to Mac conventions, and to be honest it was both a harder and an easier problem to tackle: several months, maybe a year or more, before it got to be really solid. It sorta requires one to think about it in layers so you're not killing yourself remapping every little thing individually, lol.


The command key is one of the great accidental geniuses of macOS (née Mac OS X).

It used to be just a Mac thing, but when the OS became Unix, you suddenly had a GUI key (command) and a command line key (control). Which is great for mental split and flexibility.

You can have Ctrl-C to cancel and Cmd-C to copy on the same Terminal app.

And you get to use both command/option/shift + arrow keys along with readline/Emacs shortcuts on any native text input. So sweet.


I wonder if this response is because common Linux desktop environments are _so_ derivative of Windows these days. I started using MacOS around 2004 (the G4 iBook was _amazing_ in its day; if you wanted a Unix-y laptop with decent battery life, working power management and wifi, and vaguely affordable, it was the only game in town), having previously been using Linux since about 2001 and Windows before that. At that time, Linux desktop environments generally didn't use the same conventions as Windows anyway, so moving to MacOS wasn't that jarring.


You can swap cmd and ctrl in the keyboard prefs to give you more Windows/Linux like key commands.

Personally I think CMD is more ergonomic but muscle memory trumps all I suppose


Unfortunately, that doesn't solve the problem.

The issue is not which labels keys have printed on them. I don't look at my keyboards and I always remap keys for them to be in the positions I like. That's fine and works in MacOS.

The issue is that the same application (Chrome, for example, but happens with most of them) uses the same key combinations in Windows, Linux, Chrome OS and Android (I use DeX from time to time on my S7 Tab) to do something but then decides to do something different in MacOS _without giving the user the possibility to fix it_.

Chrome example: C-w closes a tab and C-TAB changes tab in the first four OSs. In MacOS, though, it's S-w and C-TAB, respectively. There is no remapping which can fix this because the correspondence between keymappings on a given modifier key is not a bijection. The only possibility would be that each program gave the user the option to "use standard keymappings". But very few do, AFAIK. Emacs is the only one that works the same on all systems. But I also need a browser ¯\_(ツ)_/¯.

I'm not saying that the macOS convention is worse or better than the standard. In fact, I personally like the idea of having the CTRL-like functionality on my thumbs rather than my pinkies, and I could migrate to this layout on all computers I use (via key remapping, in the same way that I have already remapped, on every computer I use, the ESC key onto the place where CAPS normally sits). I'm just saying that using macOS makes me slower, both when using macOS itself _and_ when using the other OSs, since suddenly I have to consciously decide which key combo to use. Normally this was something that just happened subconsciously and didn't interrupt my flow.
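The "not a bijection" point can be illustrated with a toy model (the bindings and the modifier-swap function here are hypothetical, just mirroring the Chrome example in the comment above):

```python
# Keybindings modeled as (modifier, key) -> action. Chrome on
# Linux uses Ctrl for both actions; Chrome on macOS splits them
# between Cmd (close tab) and Ctrl (switch tab).
linux_bindings = {
    ("Ctrl", "w"): "close tab",
    ("Ctrl", "Tab"): "next tab",
}

# A keyboard remap can only act on the modifier key itself,
# e.g. swapping Ctrl and Cmd system-wide:
swap = {"Ctrl": "Cmd", "Cmd": "Ctrl"}

remapped = {
    (swap.get(mod, mod), key): action
    for (mod, key), action in linux_bindings.items()
}

print(remapped[("Cmd", "w")])        # close tab -- as macOS Chrome wants
print(("Ctrl", "Tab") in remapped)   # False -- but "next tab" moved to
                                     # Cmd-Tab, while macOS Chrome still
                                     # expects Ctrl-Tab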


I use an MS sculpt keyboard and use Karabiner to swap the keys.


You can swap them in the system preferences without needing extra software. At least for the modifier keys


That doesn't work for key combinations such as Cmd+Tab (which only switches between applications, but not between windows of the same application) and Cmd+` (which switches between windows of the same application), you have to change them too, and some of them flat out aren't configurable out of the box, requiring third party software.


Is S = command?

What I'm seeing is: close a tab -> cmd+w; copy something -> cmd+c; switch tab -> cmd+option+left/right arrow, switch window -> cmd + `


Yes, coutego is using Emacs notation for key chords, in which S-whatever means what macOS users would call cmd+whatever, Windows users would call win+whatever, etc. "S" stands for "Super" key.


Same thing with keyboard mapping, it really bothers me. However I've found that using a keylogger to remap (Karabiner) works decently with very few exceptions (iTerm).


I'm personally a fan of macOS as well, but I can see the draw in wanting to run something else on them. I have a ThinkPad dual booting Windows and Fedora and it's not terrible as far as generic x86 laptops go, but in many ways it's not as nice as my work MacBook. If Linux ran as well on say a MacBook Air as it does on that ThinkPad, the ThinkPad would likely be replaced with an Air and any Windows needs being handled by a Windows VM or RDP session to my custom built tower.


> And I bought Macs exclusively when the hardware was both much slower and pricier than PCs (68k and PPC), because I loved the software so much.

Incredible that these myths are being perpetuated by a Mac user. The fact is that at release, new Macs (including 68k and PPC) were always the most performant machines available in their class. Always. You could buy a cheaper x86, but it wasn't as powerful on that date. And the fact is, Mac prices were and are always within $100 of a feature-matched PC. The problem was consistently that PC consumers didn't want the included hardware features, wanted other features that weren't included, and translated that they could buy a much shittier PC for less as Macs being outrageously expensive. It was always bullshit. Ornery PC consumers were never Apple's market. Macs were and are the computer "for the rest of us." And, ironically, everything Apple designs gets copied poorly by Microsoft and PC manufacturers. Asahi is the inverse of Hackintosh. Both ideas are somewhat ridiculous. I want a Ford engine in my Chevy powered by tomato soup!


At release, they mostly were. As soon as G3 was released it screamed. But then Intel kept bumping MHz every 6 months or so. Same with the G5, but that margin lasted even less. The G4 was just a sad period in the PPC line up on desktops. They made for good laptops though.


> The G4 was just a sad period in the PPC line up on desktops.

Not really. In 1999, the US government classified them as supercomputers and banned their export to over 50 countries. Apple tried to make hay of it, but almost immediately started lobbying to get the ban lifted. The last G4 tower was released in June 2003 and discontinued a year later and just got old by 2006 because Motorola already left AIM, and IBM wasn't delivering. Not Apple's fault, and the reason for the platform jump to Intel. But, again, at release, any of the high end G4 machines were faster than any consumer Intel tower, though 2003-2006 gave them plenty of time to catch up and pass the G4. The Mirrored Drive Doors 2003 dual 1.25GHz G4 was... is still a pretty sweet machine. It is still in use in Pro Tools studios because Digi equipment and plugins were expensive, and none of the PPC Digi components work on Intel. Try sourcing one. You'll be shocked what MDD 2003 DP sell for in 2022 at 18-19yo.


Yeah, I was around back then. It was faster if the code was heavily optimized for AltiVec like Photoshop Gaussian Blur, image interpolation, audio filters, etc. But most general purpose code was not.

The G4s were overclocked for most of their life and stuck behind a 133mhz bus that choked the whole system. The MDDs you cite even gained the unaffectionate “Wind Tunnel” nick name. I hope whoever worked in Pro Tools with such machines had it running in a different room.

I remember a video studio still running a 68k Quadra that had some crazy expensive Avid board in 2001 or something, so I'm not surprised by high end equipment lasting a long time. Though I imagine you could emulate it on a laptop these days. Depending on 20yo hardware that ran really hot for most of its life it's not a recipe for peace of mind.

I was a huge enthusiast of 68k and the PPC and was devastated when Apple switched to Intel. Mediocrity won, I thought. Even though both architectures were much more interesting than x86, the reality of chip manufacturing is that scale is almost everything. Intel had it then, mobile phone chips have it now.


I started to use a Mac for work a few months ago, first time in my life.

I am still often frustrated with keyboard shortcuts, despite having installed a dedicated software to not feel in such an alien place. Sure there is a lot of muscle memory you can blame here. But how does it happens that the default OS doesn't provide the software option? Brew is nice, but here too it's community work filling the hole of the default.

I miss the home, end and del key on the integrated keyboard.

The only way to shutdown the integrated screen and still have the camera usable is to duplicate the screen and diminish brightness to zero. Or use a magnet. Seriously?

No key to show the contextual keyboard.

Where is my select and paste with middle click, outside iTerm2 (community provided)?

Why is there no straight forward way to browse the actual file path in Finder, when a shortcut allows to copy it? It's possibly the file manager that made me feel the most clueless in my life.

It is not like everything is utterly horrible, but I was very surprised at how frustrating it could be as a daily driver. I didn't discovered anything that I would miss from it when I go back on something as basic as a vanilla Gnome.


> The only way to shutdown the integrated screen and still have the camera usable is to duplicate the screen and diminish brightness to zero. Or use a magnet. Seriously?

Can you describe what your use case for this is, because this complaint sounds truly bizarre to me. You want to use the webcam at the top of the laptop display, while not using the laptop display itself, but using a separate display (and thus presumably not looking anywhere near the webcam)? Are you pointing the webcam at something other than yourself?

> No key to show the contextual keyboard.

What do you mean by contextual keyboard?

> Where is my select and paste with middle click, outside iTerm2 (community provided)?

The first-party Terminal provides that feature, too. But the rest of the OS doesn't natively.


Well, I use the webcam for visio calls. So I'm not completely facing the camera usually, indeed.

I meant contextual menu. The only way to open in it in a Mac arms seems to be to use the secondary mouse button.

My bad for the paste in the terminal. I moved to an other terminal for an other reason I can't remember right now. But yeah it is the middle click in general that I miss. To be fair, Windows doesn't provide it either.


> Why is there no straight forward way to browse the actual file path in Finder, when a shortcut allows to copy it?

In Finder, selecting View > Show Path Bar will place a persistent path bar on each window. Additionally command clicking the proxy icon that appears when hovering over finder titlebars will open a path menu (and also works in any application with a proxy icon in its titlebar).

To go to a path, select Go > Go to Folder… or tap Command-Shift-G. This can be rebound to a more convenient shortcut in System (Preferences|Settings) > Keyboard > Keyboard Shortcuts.

Command-Option-C will copy the path of the currently selected item to the clipboard.


Wonderful, thank you so much. :)


Option + arrow / cmd + arrow does exactly the same as end / home etc, if you've just not found that yet.

I always find it interesting when people complain about keyboard shortcuts on macOS - but I feel exactly the same when I use anything else.

macOS keyboard shortcuts are amazing and os-wide. But they're not made obvious. Its really kinda snobbish that apple just assume you know them they treat it like 'because obviously youve used a mac forever'.


Incredibly useful document for the new mac user with experience on other platforms - https://support.apple.com/en-gb/HT201236


They aren't obvious on Windows either, are they? Keyboard shortcuts are mostly a pro user thing.

And don’t get me started on the way that OS inserts special keys. How do you insert ® on Windows? Alt-01whoCanRemember? On the Mac it’s usually something that makes sense like Option + R. Ç? Option-C. ƒ? You guessed it.


>They aren't obvious on Windows either, are they?

No, of course. But the gap between linux and windows for that matter is really smaller, thus my surprise. All the more with the way Apple is marketed as so great in ergonomics.

>Keyboard shortcuts are mostly a pro user thing.

Sure, I would not use a Mac had my employer not provided it. Like many coders out there I guess. But Apple is not willing to pay attention to the adoption ease for this population it seems. Or at least, it doesn't feel like this to me.

>And don’t get me started on the way that OS inserts special keys. How do you insert ® on Windows?

I use a bépo layout everywhere, with it the answer is obvious. It comes out of the box in linux distros. Mac and Windows require third party installation. The Mac one is a bit less functional/buggy. The worst issue being that my IDE won't recognize the combination for underscore. It's more a responsibility of IDE producer here certainly. But still, it makes the Mac UX far less pleasant from a dev point of view.


I have little experience with Linux desktop environments, I use it for servers through the command line only, but the little interaction I had seemed more of an attempt to mimic Windows, so it’s no surprise it behaves similarly.

Most web and mobile developers I know use Macs, by far. Windows development is targeted to enterprise custom software, in my experience. It pays handsomely, but no one seems crazy in love with the stack.

I don’t think Apple has to cater to users of other platforms specially when it considers its conventions superior.

Regarding bépo layout, dvorak is already a tiny niche. Bépo is a niche within a niche. I wouldn’t expect wide support anywhere really.


I remember Mac-style mnemonic shortcuts for special characters way better than I do alt codes. If I were building a Mac style DE that's probably one of the features that would be added.


Of course you do. Some of the things being discussed here are a matter of preference and/or habit.

But remembering a the first mnemonic letter vs 4 random digits is indisputably easier.


And if you're used to Emacs/Readline keybindings, those will work in most Mac text inputs


I went to documentation and found those, nothing to complaint about here. But muscle memory is not something you can switch right away easily in my experience. I'm OK to look at them and possibly learn them when I'll have time for this.

In the meanwhile, I wanted something that would let me focus on my work, not being distracted by basic key combination struggle every few inputs. Karabiner, which is community driven, led me to such a mostly OK situation here.

To me what is baffling is that Apple, with its ridiculously high revenue stream and all its marketing on great UX, is unable to provide that out of the box.


>I miss the home, end and del key on the integrated keyboard.

What do you mean? They’re all there on Apple extended keyboards and are accessible via the fn key on laptops. I mostly use command/option plus the arrow keys, which, along with shift, are also an amazing Mac feature.

>The only way to shutdown the integrated screen and still have the camera usable is to duplicate the screen and diminish brightness to zero. Or use a magnet. Seriously?

I don’t know if I understand you but you can turn the screen off without sleeping in multiple ways, like keyboard shortcut (Ctrl-Shift-Eject) or assign a screen hot corner for the mouse.

> No key to show the contextual keyboard.

What?

>Where is my select and paste with middle click, outside iTerm2 (community provided)?

It’s not a thing; I use the middle button for Exposé. Selecting with the middle button pressed sounds barbaric, though. Your dexterity goes out the window.

>Why is there no straight forward way to browse the actual file path in Finder, when a shortcut allows to copy it?

There are a couple. "Straightforward" might just mean "accustomed to" in this context, though.

>It's possibly the file manager that made me feel the most clueless in my life. It is not like everything is utterly horrible, but I was very surprised at how frustrating it could be as a daily driver.

The Finder leads a double life. It inherited traits from both the classic Mac OS (spatial Finder) and the NeXT (column browser). And it shows. Both can be very powerful but their coexistence is confusing at first.


On second thought, there were some transformative pieces of hardware, though.

The iMac 5k for me, almost a decade later, is still better than anything other vendors have to offer. It’s my childhood dream monitor. Such a shame that they never sold it separately.

The M series laptops seem like an inflection point as well. A fanless powerhouse with more than a day's work of battery life and best in class monitor and trackpad.


> Such a shame that they never sold it separately.

Well, they sold the controversial LG 5K, which was the same panel, but certainly not the same build quality. I've got one, and it's... fine, and for a very long time was literally the only 5K monitor you could buy, but for the price it is not a well-built piece of kit. (And the first two versions had weird bugs)

They now (nearly a decade later) finally sell a fully first-party one, which is very similar.


The 2016 LG 5K is still my preferred monitor. In retrospect it was a surprisingly good investment at its $974 introductory price (~$1100 in 2022.)

The Apple Studio Display certainly improves on it, but at ~$1600 it costs more than a midrange 24" iMac.

It's too bad that the iMac 5K didn't support Target Display Mode. Maybe Apple will bring it back someday along with a new 27" iMac.


I was really hoping this would come back once it was feasible (at the time it came out, the only vaguely well-supported way to do it would have been two DP cables, but later versions of DP and Thunderbolt 3 and better can handle 5k without trouble), but, given that the 5K iMac is now gone, I wouldn't hold out too much hope...



You’re correct! How could I forget. Thanks. Shows I haven’t been shopping for Macs in a long time.


I am with the other guy: great hardware with the M1, married to barely usable software. I hadn't been forced to use it in a decade, and IMHO it hasn't gotten much better. At least it's got brew going for it.

Horrible peripherals, too. I guess you love them for the same reasons I hate them.


> extremely overpriced tiering (e.g. adding storage or RAM)

As I understand it, Apple uses a system-in-package (SiP) multi-chip module that mounts the RAM inside the same package as the main M1/M2 SoC.

It seems to work well in terms of memory bandwidth, unified memory architecture, and physical size, but it's hard to crack that SiP/MCM open to add more RAM.

And it's even harder to add RAM to an SoC die itself. And the GPU is integrated as well (although in theory one could connect an eGPU over Thunderbolt - assuming the driver issues could be sorted out somehow.)

Some old Macs from the 1990s included an external L2 cache SRAM slot. But cache RAM upgrades became impossible once the L2 cache was integrated on the CPU die.


Even on the remaining Intel models, having Apple upgrade the RAM is a bad deal. The Intel Mac mini has socketed RAM, and they want $1000 for 64GB of DDR4-2666. You can get 64GB of DDR4-3200 SO-DIMMs for less than $200 right now. Even in 2018 it was a bad deal; I put 32GB in the one I had for less than a third of the price Apple was asking.

And for storage that isn't on the SoC, it's just flash chips on the board, so the cost shouldn't be much different from what the M.2 drive manufacturers pay. Yet it costs considerably more.
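Back-of-envelope, the prices quoted above work out to roughly a 5x markup. Retail SO-DIMM prices fluctuate, so treat these figures as illustrative rather than exact:

```python
# Rough markup arithmetic using the prices quoted above
# (retail prices vary; these are ballpark figures, not a benchmark).
apple_64gb_usd = 1000   # Apple's 64GB DDR4-2666 upgrade price
retail_64gb_usd = 200   # typical 64GB DDR4-3200 SO-DIMM street price
print(f"{apple_64gb_usd / retail_64gb_usd:.0f}x markup")  # -> 5x markup
```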


Glossy screens have better picture quality (black levels) in any situation where you can control the lighting. That’s usually true for laptops because you can just put it on your lap and swing around to avoid sunshine.


> That’s usually true for laptops because you can just put it on your lap and swing around to avoid sunshine.

So the options are extreme eye strain due to lights/sun/my own reflection, or extreme back pain from having to sit like a prawn to keep the light off the laptop? I'd take slightly lower visual quality (it doesn't matter in the slightest for what I do on a screen) over either of those.


The glossy screens' higher contrast should lead to less eye strain in the end. That's a big reason for better visual quality, like HiDPI displays and OLEDs.


It's true for everything. You can control the lighting and orientation of a desktop even more easily. Matte screens are the industry's worst mistake, and I just can't understand why everybody hasn't ditched them already.


There are no comparable laptops to M[1,2] Macs, AFAIK. Linux on an M1 simply flies. It's just stupidly quick. A MacBook with Linux is the most amazing Linux machine that exists, even without the GPU acceleration.

I bought my first MacBook just because of the M1 processor and /despite/ the OS, which is ok-ish but not my cup of tea. I'm looking forward to running Linux on it as my daily driver.


Not that I would buy a new Mac and install Linux on it...

But if I did, it would be because Apple has unmatched hardware build quality. (The battery life would, as you mention, also be a nice thing.)


Linux is more user-friendly for some developers. Linux support with Asahi is also markedly better than on any comparable PC. Hardware-wise it beats all Windows laptops.

Essentially it comes down to this: Macs have great hardware but shitty software. The latter is what Asahi fixes.


The Mac Mini is actually a pretty nice little unit, and not priced too terribly.


I do not (currently) use this, but I am interested in it. My reason is simple: I have a Mac and not a PC running Linux because my employer provides Macs, not PCs. If I got to use the computer I wanted, then I'd use a PC and run Linux (as I have at prior employers who didn't have/care about MDM).


Two reasons: battery life + touchpad.


Simple:

Apple makes the best-built laptops. No other brand even comes close.

Linux is the best OS. I have no idea how people can work on Windows; I think other people used to Linux will agree. macOS feels like a really old fork of Unix that has not kept up with open-source Linux.


Power+performance, great battery life, great build quality, great hardware...


Yup


not at all a silly question...

it runs forever and _fast_, the ergonomics are kick ass for my body dimensions (i mean, it's comfortable), and it's _silent_.


> any comparable PC

Not sure such a thing exists!


Question for those in the know: are there any substantial changes from M1 to M2? I'm sure lots of tuning took place, but is there any major component that was completely overhauled?


There was quite a bit of small annoying stuff, but nothing major.

The GPU and display controller were initially expected to have a large amount of changes, but this turned out not to be the case.

The amount of change between M1 -> M1 Pro/Max/Ultra and between M1 Pro/Max/Ultra -> M2 is similar.


Another question for those in the know: there seem to be tons of weird GPU problems on macOS under the M1: weird cursor tails, choppy scrolling, and very occasional panics originating in the GPU drivers. Were any workarounds for the unstable GPU behavior discovered during the RE & driver implementation?

Edit: I've directly observed these on my machine, and it doesn't look like an isolated incident. There is a video in [2] below.

[0] https://discussions.apple.com/thread/253679057

[1] https://discussions.apple.com/thread/252777347

[2] https://www.reddit.com/r/apple/comments/u486mi/macbook_pro_1...

[3] https://www.reddit.com/r/mac/comments/oldbb9/mba_m1_cursor_g...

[4] https://www.reddit.com/r/applehelp/comments/kfkuqi/is_there_...

[5] https://www.reddit.com/r/mac/comments/r037h2/is_this_amount_...

[6] https://forums.macrumors.com/threads/mac-mini-m1-mouse-curso...


It looks like your particular piece of hardware is defective. Try having it replaced.


YMMV but on an MBP 16" M1 Pro driving its internal display alongside an Apple Thunderbolt Display (yes, the one they sold from 2011-2016) I've seen no graphical problems whatsoever in the past year.


Agreed with other commenter: I'd disregard the reports. Been using an M1 for 6 months now on macOS. One kernel panic when closing the lid. Solid machine and no issues with the GPU.


> there are what seem to be tons of weird GPU problems on macOS under M1— weird cursor tails, choppy scrolling, and very occasional panics that derive from GPU drivers.

Never heard of these. Been using M1 for a year. I don’t think it’s worth taking seriously.


Are you sure you're not using any DisplayLink cables/drivers?

I had a choppy mouse at one point, but only with a Bluetooth mouse. Bluetooth polls at a slower rate in any case, but I think it might have had to do with some interference.


I'm aware of the Bluetooth issues; it's a similar experience. However, this happens with the built-in trackpad and built-in display, and occurs seemingly at random. The Bluetooth issue initially made it very difficult to find any discussion of the M1 issue.


Issue free for over a year on M1


Sort of a “spec bump+”.

The biggest change, IIRC, is that the M1 was based on the A12(?) but the M2 was based on the A14(?). So the CPU/GPU design was newer. They tweaked and improved other modules like the neural engine too.

So it wasn’t just clock speed, but to most end users it was just somewhat faster and more mature.

Nothing special/amazing/transformative.


M1 was based on the A14 cores, the M2 is based on the A15 cores.


I've been reading bits and pieces from the people doing Linux for Apple SoCs for a while and it sounds like the evolution has been mostly incremental going way back (like A7 era).


No.


Sad that iPads don't have open bootloaders. I'd be happy using my M1 iPad Pro from time to time.


Some people have been saying the reason Macs don't support 4K 120Hz over Thunderbolt-to-HDMI 2.1 cables is a software limitation. I wonder whether Linux on a MacBook solves that.


Most often it's a hardware limitation, because the cables/adapters have the MCDP2900 converter chip inside, even when they advertise HDMI 2.1 support. That's the same chip inside the built-in HDMI port of the new MacBook Pro, and its datasheet [0] says it only supports up to 60Hz.

That chip is also the reason for a lot of the support emails I get about Lunar (https://lunar.fyi/), because it seems to break DDC/CI, and hardware brightness control stops working through cables and ports that use it.

[0] https://media.digikey.com/pdf/Data%20Sheets/MegaChips%20PDFs...
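The 60Hz ceiling is consistent with simple bandwidth arithmetic: 4K at 120Hz needs roughly twice the data rate of 4K at 60Hz, which overflows what an HDMI 2.0-class converter can carry. A rough sketch, where the 25% overhead factor for blanking intervals is an approximation rather than an exact figure:

```python
# Approximate uncompressed video bandwidth in Gbit/s.
# overhead loosely accounts for blanking intervals; real values vary by timing.
def gbps(width, height, hz, bpp=24, overhead=1.25):
    return width * height * hz * bpp * overhead / 1e9

print(round(gbps(3840, 2160, 60), 1))   # -> 14.9 (within HDMI 2.0's 18 Gbps)
print(round(gbps(3840, 2160, 120), 1))  # -> 29.9 (needs HDMI 2.1)
```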


Same with Macs not supporting DisplayPort daisy-chaining over USB-C (not Thunderbolt): it's purely a software limitation.


Can anyone confirm whether nested virtualization is available when running Asahi on the M1/M2?


The M1 doesn't have nested virtualization in hardware. The M2 does, but nested virt on arm64 isn't quite in upstream Linux yet.


This is great news and a big win for consumers!


(false statement about a buggy M1)


M1RACLES was a security flaw that was hyped as a joke, because it was such a weak bug, and yet it was hyped to oblivion. It totally does not deserve even a mention on the M1 Wikipedia page.

The flaw means that two malicious processes, already on the system, can potentially communicate without the OS being aware. But they already could: through pipes, desktop icons, files, inter-process communication, screen-grabbing each other, over the network, from a remote website; take your pick. So what would two malicious processes, already on a system with a pre-agreed communication protocol, need a weird processor bug for? Absolutely nothing. It's not supposed to happen, but it's basically useless when you are twice-pwned already.

The other flaw that was found was that Pointer Authentication (PAC) could be defeated on the M1 with the PACMAN attack. However, PAC is actually an ARM standard, added in ARMv8.3, that affects all implementers; the M1 just happens to be the most notable chip with it. Versions before ARMv8.3 didn't have PAC at all, so even with it defeated you aren't worse off than you were before PAC existed. It's just a "sad, we tried, but oh well" thing from ARM's perspective.
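To make the "they already could" point concrete, here is a minimal POSIX-only sketch of two cooperating processes (stand-ins for the hypothetical "malicious" pair) exchanging data over an ordinary pipe, one of the many channels that already exist without any CPU bug:

```python
# Two cooperating processes communicating over a plain OS pipe.
import os

r, w = os.pipe()
pid = os.fork()
if pid == 0:                  # child: the "sender"
    os.close(r)
    os.write(w, b"secret")    # a single small write is atomic on a pipe
    os._exit(0)
os.close(w)                   # parent: the "receiver"
data = os.read(r, 16)
os.waitpid(pid, 0)
print(data.decode())          # -> secret
```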


Notably, almost every other Arm A-series processor that supported PAC was also susceptible to the same attack [1]. The issue is just that actually buying such processors is nigh impossible (up until this year it was actually impossible; now you just need to do research on a phone SoC), whereas anyone can buy an Apple silicon device from a million different places.

[1] https://developer.arm.com/documentation/ka005109/


Thank you! I am learning something new every day.



