Major Linux Problems on the Desktop, 2021 edition (itvision.altervista.org)
217 points by jamesgeck0 3 months ago | 328 comments



This is a detailed, depressing and really damning view of the current state of the Linux desktop. Some general thoughts:

1. Some things are clearly not Linux's fault, like the lack of proper support from NVidia;

2. The decision to make device drivers part of the kernel (as modules or otherwise) is really showing itself to be a problem. Interestingly, this was one of the motivations behind Fuchsia, since device drivers are a huge pain point.

3. Some of these were news to me just because I hadn't attempted to use them (e.g. the 10-bit color depth issues). That one's kinda sad because it doesn't even seem to require such hardware to expose those issues. Just set your color depth in xorg.conf and watch the world burn (apparently).

4. Further to (3), it seems like a lot of issues remain unresolved just because they don't affect or aren't prioritized by the devs. Possibly they require large architectural changes. Whatever the case, this is the downside of not having central product direction. Of course there are benefits to that too, but this is a list of issues;

5. The Wayland list is damning as it reads like serious architectural errors were made for something that was meant to learn from X; and

6. PulseAudio continues to be the answer to the question nobody asked.

This should be sobering to anyone with even a modicum of objectivity because it reads like a list of problems that will never be solved.


> 6. PulseAudio continues to be the answer to the question nobody asked.

  1. How do I change the output destination for audio from typical desktop/consumer apps without building it into every such app?

  2. How do I deliver audio across a network?

  3. How do I use bluetooth audio devices that (were) not yet supported by the ALSA stack?

  4. How do I allow multiple applications to use the same audio hardware without either (a) requiring (like JACK) that they all use the same sample rate and buffer size or (b) using the rather unreliable dmix ALSA library layer?

  5. How do I provide a common volume control panel that allows me to control the relative gain of different applications all in one place?
I could go on.


Like I get that PulseAudio was unreliable for many years and that fostered hate for it, but it's absolutely crazy that people say it's useless. You need something to collect sound streams from applications and mics, mix them, transform them, and output them to a bunch of different kinds of hardware with different requirements. Like Pipewire is fantastic but it's just a more reliable PA.
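
To make that concrete, here's roughly what "handing a stream to the sound server" looks like from an app's point of view - a minimal sketch using libpulse's simple API (the build command, file name, and numbers are illustrative assumptions, not anyone's canonical example):

  /* Minimal sketch: play one second of a 440 Hz tone through PulseAudio's
   * "simple" API. The app never touches hardware; it hands samples to the
   * daemon, which mixes them with other streams and routes them to whatever
   * sink the user picked. Build (assumed): cc tone.c -lpulse-simple -lm */
  #include <pulse/simple.h>
  #include <pulse/error.h>
  #include <math.h>
  #include <stdint.h>
  #include <stdio.h>

  int main(void) {
      pa_sample_spec ss = { .format = PA_SAMPLE_S16LE, .rate = 44100, .channels = 1 };
      int error;
      /* NULL server/device = "let the daemon decide where this stream goes" */
      pa_simple *s = pa_simple_new(NULL, "tone-demo", PA_STREAM_PLAYBACK,
                                   NULL, "sine", &ss, NULL, NULL, &error);
      if (!s) {
          fprintf(stderr, "pa_simple_new: %s\n", pa_strerror(error));
          return 1;
      }
      static int16_t buf[44100];
      for (int i = 0; i < 44100; i++)
          buf[i] = (int16_t)(32767 * 0.2 * sin(2 * M_PI * 440 * i / 44100.0));
      if (pa_simple_write(s, buf, sizeof buf, &error) < 0)
          fprintf(stderr, "pa_simple_write: %s\n", pa_strerror(error));
      pa_simple_drain(s, &error);  /* block until the daemon has played it out */
      pa_simple_free(s);
      return 0;
  }

The point is that NULL device argument: the app doesn't pick hardware, the daemon does, which is what makes per-app routing and per-app volume possible at all.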


Yes, it was a necessary endeavor, but the implementation never lived up to the task. I'm glad Pipewire is now getting traction, as I fear PulseAudio will never be able to be the stable and robust solution we need for audio on Linux.

In particular, it handles sleep modes, hot-plugging of sources, and BT badly, which in 2021 is not acceptable. I regularly have to set things up manually using pavucontrol.

But yes, we needed it, and I thank the people that tried.


> Like I get that PulseAudio was unreliable for many years and that fostered hate

Unreliable, and a shocking resource hog. Did that part ever get better, or did we just stop caring because memory sizes and core counts kept going up?


Neither. When PulseAudio was built there were problems all up and down the stack: the kernel, ALSA, and userspace.

The effort around PulseAudio stabilized and resolved most of the problems in the kernel and ALSA, leaving most of the remaining problems in PulseAudio's own implementation.

PipeWire is now building on the giant amount of work the PulseAudio effort produced, and is able to reuse PulseAudio's IPC framework to seamlessly replace PulseAudio itself.


I stopped caring because PipeWire works well and ships as a drop-in replacement for PulseAudio and Jack by default.


It's not quite a drop-in replacement for JACK yet, but I have very little doubt that it will get there.


What about PipeWire?


PipeWire is an intentional swamp draining effort that takes a lot of these ideas/solutions, and applies them more broadly. In the audio domain it does so by scaling to the requirements of competing APIs like JACK and implementing them as well. But it also expands them to the video domain (think arbitrating access to camera devices, or processing chains spanning multiple processes), which has a lot of the same challenges. And underneath it's built on Linux kernel infrastructure (e.g. dmabuf) that wasn't as present during the early PulseAudio days.

A decent one-sentence summary of PipeWire is: shell pipes for A/V between processes, with pluggable policy (hello, mixer) and plugins to inject sources into the graph.


When PulseAudio came around, PipeWire didn't exist. In fact, the infrastructure required to build it didn't either.


An answer to most of these might be to fix and extend ALSA rather than wallpapering over it and treating it like abandonware.


ALSA was written to allow pseudo-devices to be implemented in user-space, which is how PulseAudio (and now Pipewire) does what it does.

Nobody is abandoning ALSA, it continues to be both the device driver layer and the lowest layer of the audio stack in user-space.

[ EDIT: you could make an argument that after many years of experience, it's time to put the lessons learned from JACK and PulseAudio into ALSA itself, and have ALSA provide the sorts of audio capabilities that the two of them, combined, enabled. I don't have much of a position on this compared with the Pipewire approach, which is to do the same thing, but outside of ALSA. There are good arguments for both, but Pipewire is what we have, and it remains as dependent on ALSA as JACK and PulseAudio and just about all other Linux audio has been since 2000 ]


ALSA is already overcomplicated. What I expect from ALSA is being able to enumerate audio devices and being able to send audio to hardware without any transformations. The rest is the work for the userspace daemon.


That's what ALSA does. You don't have to use any other features of it. However, the task is a bit more than you're describing, because in 2021, users expect things like:

  * hot-pluggable devices (so you need a device notification protocol)

  * ability to lock the device to prevent settings modifications (very important in pro-audio/music creation settings). macOS calls this "hog mode".
Also, you're eliding the distinction between the capabilities of an API and where it gets implemented. Contemporary audio has all sorts of requirements that could be/should be/are implemented in user space, but it's unclear if they should be a part of the ALSA API or some layer(s) above that.
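
For what it's worth, the "enumerate devices and push samples" core the grandparent asks for really is small. A rough sketch against alsa-lib (build command assumed, error handling mostly omitted):

  /* Rough sketch of the ALSA layer described above: list PCM devices, then
   * write raw samples to one. No mixing, no routing - that's the userspace
   * daemon's job. Build (assumed): cc alsa-demo.c -lasound */
  #include <alsa/asoundlib.h>
  #include <stdint.h>
  #include <stdio.h>
  #include <stdlib.h>

  int main(void) {
      /* 1. Enumerate PCM devices via the name-hint API (-1 = all cards) */
      void **hints;
      if (snd_device_name_hint(-1, "pcm", &hints) == 0) {
          for (void **h = hints; *h; h++) {
              char *name = snd_device_name_get_hint(*h, "NAME");
              if (name) { puts(name); free(name); }
          }
          snd_device_name_free_hint(hints);
      }

      /* 2. Send audio straight to a device (here: the "default" PCM) */
      snd_pcm_t *pcm;
      if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0) < 0)
          return 1;
      snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE, SND_PCM_ACCESS_RW_INTERLEAVED,
                         1 /* channel */, 48000 /* rate */, 1 /* allow resample */,
                         100000 /* latency, us */);
      int16_t silence[4800] = {0};        /* 0.1 s of silence */
      snd_pcm_writei(pcm, silence, 4800); /* count is in frames, not bytes */
      snd_pcm_drain(pcm);
      snd_pcm_close(pcm);
      return 0;
  }

Everything beyond this - mixing, routing, per-app volume, device notification - is exactly the stuff the userspace daemons were built to add on top.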


I guess? If I have a video player or a game/emulator in the foreground then I think it deserves direct access to the audio device and I just don't like the idea of a userspace sound server in the middle. For a music player or whatever other random stuff, sure, mix it however, send it over Bluetooth, who cares. I've given up fighting PulseAudio, though, and with one audio device I can't have it both ways.


PulseAudio obeys a (dbus) protocol that Lennart and I worked out that allows JACK (or Ardour) to show up and say "hey, I want the device". Any other application could implement it. Pipewire has (AFAIK) implemented the same protocol.


That is super cool, thanks for the information!


which is funny because that's exactly how ALSA was started: https://lobste.rs/s/moflv5/freebsd_audio_from_perspective_ha...


I've been developing audio & MIDI software for Linux since about 1997. Maybe that makes me a special case, but none of the experience described by the post is familiar to me. I even used to use xmms with ALSA.

There are some grains of truth in that account, but they've been turned into a paragraph of hand-waving half-truth.


That comment is somewhat problematic and I went into detail about it last week: https://news.ycombinator.com/item?id=28856907


Great comment. Sorry that I missed this at the time.


That's hilarious. And sad.

https://www.jwz.org/doc/cadt.html

Copy paste the link. Or:

> I'm so totally impressed at this Way New Development Paradigm. Let's call it the "Cascade of Attention-Deficit Teenagers" model, or "CADT" for short.

> It hardly seems worth even having a bug system if the frequency of from-scratch rewrites always outstrips the pace of bug fixing. Why not be honest and resign yourself to the fact that version 0.8 is followed by version 0.8, which is then followed by version 0.8?

> But that's what happens when there is no incentive for people to do the parts of programming that aren't fun. Fixing bugs isn't fun; going through the bug list isn't fun; but rewriting everything from scratch is fun (because "this time it will be done right", ha ha) and so that's what happens, over and over again.


The 3rd quoted point... if you apply it to the Linux audio stack, then... this is so absolutely fucking disrespectful to people like Takashi Iwai, who has spent 20 years fixing bugs and doing so much "un-fun" work on ALSA.

Y'all can vent on Lennart as much as you want, but every time you do you forget the heroes who have worked out of the spotlight, doing the boring grunt work that has kept ALSA working for two decades as audio hardware has changed dramatically.


How would my quote apply to people who do that kind of work?

The whole article is about people that <<don't>> do that.


I'd say what's interesting is that the people who don't do that are probably overrepresented within the Linux community. The average tenure of devs there is in excess of the average tenure of most tech employees today. It might be more meaningful to measure continuity as provided by organizations (i.e. communities vs. corporations, for example) as they address turnover, but I'm not so sure. Frankly, tech churn seems to be about equal on both sides. Other systems have revamped their audio stacks on occasion as well. It's the economics/business models that favor certain kinds of finish, especially backwards compatibility.

I'd further submit the Linux story there has improved over time. For example PulseAudio didn't attempt to provide much of an upgrade path to apps using aRts and ESD (its predecessors), but PipeWire provides continuity for apps written against ALSA, PulseAudio and JACK. And as PulseAudio and JACK were difficult to run side-by-side at times, it actually reduces fragmentation as well. By aligning audio and video device access within one system it also generalizes system behavior.


Where we put our attention says a lot.

People like Takashi get almost zero attention, because so many people spend their time blabbering about people doing "sexy" stuff. JWZ does this too. I believe we'd be much better off if we ignored more of the flashy stuff we don't like, and instead supported and lauded the unflashy stuff and the people who do it.


It's pretty rude to insult someone by calling them an "attention deficit teenager" regardless of what they're working on. I would recommend against quoting that article.


I see you chose to ignore what prompted the rant. Is it deliberate?


I generally try to avoid responding to rants because every time I do that, the author gets angry and starts complaining at me. In my experience, people who rant are just frustrated and blowing off steam, and are not trying to be taken seriously or start a constructive dialogue. If a person actually has some leadership/direction and a solid grasp on the full picture then there is no need to rant.

But you may want to read my comment here: https://news.ycombinator.com/item?id=28856907


> 5. The Wayland list is damning as it reads like serious architectural errors were made for something that was meant to learn from X; and

The Wayland section is also perhaps the most technically inaccurate of the post, for example in its comparisons to how competing systems are put together - more than once it lays the blame for not providing a certain kind of API at the feet of Wayland, when in the other systems named, the equivalent API is also not provided by the windowing/compositing system but at an orthogonal library level. In other cases the criticism reflects broad technological trends that are equally true for other systems from the same era (e.g. client-side rasterization), and for the same motivating reasons.


> The Wayland section is also perhaps the most technically inaccurate of the post

The author is open to valid criticism and updates the article whenever someone points out inaccuracies or falsehoods.

You and the guy replying to you both say nothing technical or factual aside from "it is all wrong and/or inaccurate". That's very hard to act upon.

> In other cases the criticism reflects broad technological trends that are equally true for other systems from the same era (e.g. client-side rasterization), and for the same motivating reasons.

Yet other systems provide a level of integration and ease of use (APIs) which are leaps and bounds above what the Wayland protocol offers. That's the main gripe.

Check this for more info: https://gitlab.freedesktop.org/wayland/wayland/-/issues/233


I posted more technical comments elsewhere in the thread.


I noticed that as well. The entire Wayland section is completely confusing and incomprehensible to me, and also filled with numerous false statements. It's just very poorly researched.


> 5. The Wayland list is damning as it reads like serious architectural errors were made for something that was meant to learn from X; and

Not really - remember that Wayland is mostly just doing things the way every other OS (iOS, macOS, Android, Windows) does them - passing around raster buffers to user processes.

Most of the complaints have to do with the politics behind Gnome+GTK, KDE+Qt, and FreeDesktop. It was always ridiculous to have all of these settings stored in and managed by X11 - no other OS manages user shortcut and keyboard settings this way. When you take away X11, presumably you'd want to take away the bad separation-of-concerns as well, right? So it doesn't make sense for Wayland to be managing these things, right?

But then someone else needs to pick up the ball. The fact that everyone is writing their own compositor is just another symptom of the ridiculous project-fragmentation that has always afflicted desktop linux. They could have cooperated and made a common compositor, but everyone had to build their own.


> They could have cooperated and made a common compositor, but everyone had to build their own.

I don't think it's bad to have different projects with different goals.

One thing that's also worth pointing out is wlroots, which is a library that can be used by lots of different compositors to help share code, which kind of is "everyone coming together to make a common compositor".


> Not really

Yes really. It makes no sense that every application has different, non-unified font rendering - or rendering in general. Also, things like kinetic scrolling and all kinds of application behavior are now potentially completely different for different applications. It's a mess; we need standardized solutions, and such a solution already exists: X11.

> The fact that everyone is writing their own compositor is just another symptom of the ridiculous project-fragmentation that has always afflicted desktop linux. They could have cooperated and made a common compositor, but everyone had to build their own.

The Linux community already has huge problems agreeing on any standard at all, and X11 is such a standard, with a huge amount of support functionality. This is rare and should not be squandered. X11 is not pretty, but the insistence on backwards compatibility led to the growth of a huge ecosystem (just count the number of window managers). If anything, I consider Wayland (and GNOME) a deliberate act of sabotage.


> Yes really. It makes no sense that every application has different, non-unified font rendering - or rendering in general. Also, things like kinetic scrolling and all kinds of application behavior are now potentially completely different for different applications. It's a mess; we need standardized solutions, and such a solution already exists: X11.

A lot of this commentary just doesn't quite reflect reality, even if it's meant well. There's a lot of misunderstanding on what really is different now, or how things are different from other systems.

- Font rasterization and text shaping in all major applications have converged on freetype and harfbuzz.

- Kinetic scrolling/input event handling has converged (or is in the process of converging) on libinput.

- This is true for X11-based systems as well! The X protocol historically does contain commands for server-side text rendering, but those haven't been used in decades - applications started moving to client-side rasterization ages ago, and the X server moved to using libinput (where its code originated, before then being intentionally moved out).

- In competing systems such as Windows/macOS, the situation is often the same or reflects the same tech trends. Often there are legacy server-side APIs to do things, but apps have long since moved to newer library APIs. Often there's more than one way to do things (say, text rendering on Windows: pick GDI or Direct2D) and the perceptual same-ness has just been achieved through an effort of alignment. In fact it can actually be worse on other systems nowadays - native app development on Windows has really deteriorated and you can find many apps that just ship freetype/harfbuzz there (bundled with the cross-platform UI lib they picked) and don't match Windows font rendering.

- The separation of duties between the windowing/compositing system and orthogonal library APIs is often the same. For example, in both iOS and Android the display server is not responsible for text rendering in any way. On Android the latter is done through freetype/harfbuzz as well.

- Wayland compositors already share a lot of code in much the same way that X-based desktops did, e.g. by graphics drivers having moved out of the X server into the kernel and Mesa, and input having been moved to libinput.


X11 is not really a viable solution anymore. The API has some pretty major flaws that there are no plans to fix. Or rather, the plan to fix them has been to ship Wayland as the new API. It should continue to be backwards compatible because of XWayland. Also, I don't believe X11 has any API for kinetic scrolling; that is typically implemented by the desktop environment, so I'm confused as to what you mean.

"I consider Wayland (and GNOME) a deliberate act of sabotage."

KDE is actually fully behind Wayland as well. It's not something that GNOME is doing just to inconvenience you. https://community.kde.org/KWin/Wayland#Why_Plasma_needs_Wayl...


> X11 is not really a viable solution anymore.

These are very unsubstantiated claims. X11 works perfectly. Wayland on the other hand is not a viable solution at all at this point. You can't even take screenshots.

> The API has some pretty major flaws that there are no plans to fix.

Except for the 15-bit coordinate limitation, it has no major flaws and can be extended if needed. If anything has major flaws, it is Wayland. As an exercise I recommend implementing a native screenshot application on Wayland and comparing that experience with the X11 equivalent. The Wayland API is pure garbage.

> KDE is actually fully behind Wayland as well.

Maybe in words only. The Actual Wayland version of KDE has major bugs and people doing serious work always switch back to the X11 session where it works flawlessly. Besides that I don't think KDE is any good either.


> These are very unsubstantiated claims. X11 works perfectly. Wayland on the other hand is not a viable solution at all at this point. You can't even take screenshots.

You either haven't used Wayland, or haven't used it lately. You can take screenshots just fine. I used to do it regularly with KDE's Spectacle, and now I use Flameshot.

Screen sharing also works.

> Maybe in words only. The Actual Wayland version of KDE has major bugs and people doing serious work always switch back to the X11 session where it works flawlessly. Besides that I don't think KDE is any good either.

I use Plasma with Wayland to do my work and it's been a breeze, no pun intended.


I'm using GNOME on Wayland and pressing the "Print Screen" key takes a screenshot so I'm not really sure what you're talking about, it would help if you could clarify.

My experience developing screenshot/screencast/webcam applications on Wayland has been significantly better than it has been on X11. Using X11 APIs for those things is somewhat of a misuse and breaks in corner cases, as those APIs are only really safe to use by the window manager. Extending X11 is also a bad idea because that's just more things that have to get plumbed through Wayland/the X server/toolkits/libraries. Pipewire really makes it easy to use and convenient, you just use that and then you don't have to worry about messing with the lower levels of the stack.
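
To illustrate the "misuse" point: the classic X11 screenshot is basically just XGetImage on the root window. A sketch (error handling and image encoding omitted; build assumed as cc shot.c -lX11):

  /* Sketch: grab the root window's pixels the traditional X11 way.
   * Any client can do this, with no permission model or synchronization. */
  #include <X11/Xlib.h>
  #include <X11/Xutil.h>
  #include <stdio.h>

  int main(void) {
      Display *dpy = XOpenDisplay(NULL);
      if (!dpy) return 1;
      Window root = DefaultRootWindow(dpy);
      XWindowAttributes wa;
      XGetWindowAttributes(dpy, root, &wa);
      XImage *img = XGetImage(dpy, root, 0, 0, wa.width, wa.height,
                              AllPlanes, ZPixmap);
      if (!img) return 1;
      /* Sample one pixel; a real tool would walk img->data and encode a PNG */
      unsigned long p = XGetPixel(img, 0, 0);
      printf("%dx%d, pixel(0,0) = 0x%06lx\n", wa.width, wa.height, p);
      XDestroyImage(img);
      XCloseDisplay(dpy);
      return 0;
  }

It's short precisely because any client may read the whole screen with no isolation - which is what breaks in corner cases, and what the portal/Pipewire design replaces.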

If you don't think KDE or GNOME are good then we don't have much to talk about; there are no other viable open source desktop platforms on Linux. I certainly don't think they're free of flaws, but they are the best we've got.


> My experience developing screenshot/screencast/webcam applications on Wayland has been significantly better than it has been on X11

I use Firefox on X11 and screen sharing in Google meet just works.

A colleague uses Firefox on Wayland and screen sharing in Google meet does not work.

Maybe his setup is just incorrect, but it does not sound like things have become significantly better.


Everything works on my Wayland. BTW I use swaywm, and everything mentioned as flawed works perfectly for me. The quality of screen sharing is better than on other OSes, even under low bandwidth.


It works for me with GNOME and Firefox 93; your colleague may have to update. Not Google Meet, but I did it successfully with Jitsi.

But I am referring specifically to the developer experience I had with the API, not the progress of any given application in implementing this API.


Scanning the OP list.

It seems to involve a lot of "X hardware isn't supported" and "you can't divide up Y output" - can't have two graphics drivers or two audio systems, have to reboot when things crash, etc. I hope these get fixed but I can't see how they're either "major" or damning. I mean, Linux is intended to be utterly reliable as a server OS, but the pain of rebooting when X crashes is otherwise a function of how likely X is to crash. If it's unusual and rebooting is reasonably easy, what is the problem? Windows itself got by with rebooting as the solution back when problems were far more common than they seem to me to be now (once every 1-4 weeks my mouse driver dies and I need to reboot - and that problem IS supposed to be fixable without a reboot but isn't, but who cares).

I mean, I have had "fuck around 'till it works" problems on Windows when I used it ten years ago and I have had them on my various Linux laptops since then.

* As I understand it, the major/damning problem on Linux is that a lot of gaming generally isn't supported. That's different from all these problems, which are "serious annoyance you can often work around".

Like your #3 is "awful, damning, I never noticed and still haven't tested..."

Let's keep this in perspective.

Edit: a problem mentioned that I would consider damning/major is using the CPU for video. That is a serious headache for video conferencing. A list that doesn't pile up all the snafus but instead prioritizes them would be good (I think there are actually far more headache-type issues than even this list has, but I can still use my cheap Ubuntu/Mate laptop for nearly everything and enjoy it).


> a problem mentioned that I would consider damning/major is using the CPU for video. That is a serious headache for video conferencing

If you have a hardware codec, graphics card, or codec support built into your chipset like many AMD and Intel chipsets have, then you can accelerate video encoding/decoding via VA-API, VDPAU or whatever Nvidia's solution is. Check out this page[1].

[1] https://wiki.archlinux.org/title/Hardware_video_acceleration
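
As a quick sanity check that the hardware path exists, something like the following lists the codec profiles your driver exposes - a sketch in the spirit of the vainfo tool, assuming libva with its X11 backend (build command assumed):

  /* Rough VA-API probe. Build (assumed): cc vaprobe.c -lva -lva-x11 -lX11 */
  #include <va/va.h>
  #include <va/va_x11.h>
  #include <X11/Xlib.h>
  #include <stdio.h>
  #include <stdlib.h>

  int main(void) {
      Display *x = XOpenDisplay(NULL);
      if (!x) return 1;
      VADisplay va = vaGetDisplay(x);
      int major, minor;
      if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS) return 1;
      printf("VA-API %d.%d, driver: %s\n", major, minor, vaQueryVendorString(va));

      int n = vaMaxNumProfiles(va);
      VAProfile *profiles = malloc(n * sizeof *profiles);
      if (vaQueryConfigProfiles(va, profiles, &n) == VA_STATUS_SUCCESS)
          for (int i = 0; i < n; i++)
              printf("profile id %d supported\n", profiles[i]); /* H.264, HEVC, ... */
      free(profiles);
      vaTerminate(va);
      XCloseDisplay(x);
      return 0;
  }

If this prints a non-empty profile list, players like mpv or Firefox (with the right flags) can use those codecs instead of the CPU.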


> 5. The Wayland list is damning as it reads like serious architectural errors were made for something that was meant to learn from X; and

Exactly my thought as well. However, I am pretty happy with Wayland as my daily driver (except for some weird issues in Firefox). Anybody with more knowledge about Wayland's development care to comment?


I have done some hacking on Wayland and most of the comments there about Wayland are misleading or wrong. I could go through that list but I don't know if I'm really interested to do that unless the author reads the feedback and could be convinced to change their article. It is really exhausting to deal with online misinformation unless it's stopped at the source.


You could email him or leave comments under the article. All valid criticism has been addressed and the article keeps being updated. It's its 500th revision or something like that.


Maybe I'll go through it if I have time, but there are several other things that are really wrong or misleading in the article; it's a research project in itself to correct all the misinformation there. (A lot of the article is mostly correct though, which probably makes it an even worse situation for a reader.)

Also a huge part of it is that the author is asking for things that don't really make sense e.g.

"Wayland does not provide a basic universal unified graphical toolkit/API (and Wayland developers will not implement it) akin to Win32"

Wayland sits at a level below the toolkit, it is an implementation detail in the toolkit. You're supposed to build toolkits on top of it. It wouldn't be feasible (or even wanted) to move the toolkit inside Wayland. A lot of these other criticisms are along these lines, i.e. a misunderstanding of what Wayland actually is. That bug report that was filed is not even actionable because of this.

Also edit: Sorry, I just noticed you're the author, you could mention that :) I'm not trying to give you a hard time here, but that whole part needs to be rewritten I think, because basically everything in it stems from a major misconception and is not constructive feedback because of that. I am actually in agreement with your thesis though, the "Linux desktop" (whatever collection of random 1000 git repositories that refers to this week) is pretty flawed. And Wayland definitely is flawed too, just not in any of the ways that were mentioned.


> there are several other things that are really wrong or misleading in the article

When I see generalizations about a ton of stuff I instantly get suspicious about such claims. The article cannot be as wrong as you're trying so hard to portray it.

> it's a research project in itself to correct all the misinformation there

You're really a fan of "misinformation". Sorry, the guy who wrote it, has been using Linux for more than two decades and is trying to be as unbiased and factual as possible. Looks like you're just too invested in Linux and you've been personally shaken that such an article even exists in the first place.

> Wayland sits at a level below the toolkit, it is an implementation detail in the toolkit. You're supposed to build toolkits on top of it. It wouldn't be feasible (or even wanted) to move the toolkit inside Wayland.

What's wrong with this claim exactly? You're not disproving it in the slightest, you are _explaining_ why it has been done the way it's been done. Most people don't give a damn about it.

Most 32-bit Windows applications continue to run on Windows 11, 26 years later. Linux, be it X11, Wayland or Mir, has no alternative to that. Yes, the author wants some LTS API for graphics, do you have anything to add to that? How does having Qt/Gnome/EFL/etc even address the issues shown in the article like inability to configure all the toolkits font rendering and other features simultaneously?

Had we had a low-level API akin to Win32 which toolkits like e.g. Gnome/Qt could be built upon, that would never have been an issue; meanwhile, Linux developers continue to reinvent the wheel and _not_ provide universal solutions that would allow building software on top of them which will work for decades without recompilation.

You really don't see where the author comes from, yeah, Windows, where software can be developed and work for decades.

Just don't start BS'ing me with "everything could be recompiled for $next versions of Gnome/Qt" - this has _not_ been the case, we've lost hundreds of GTK 1/2, Qt 1/2/3/4 applications which no one wants to port to newer toolkits.

Actually please don't bother with your "refutations". I see where you're coming from and I'm sure you'll continue in the same vein which is "Linux is built differently, get used to it".

I can, I have, but ISVs and game developers will _not_. If you're OK with no commercial software and AAA games for Linux, most average people are _not_. They want to play native games and run native Adobe/Autodesk/etc. software under Linux.

I can imagine you could be content with having always incomplete layers of emulation and VMs which further prove that other OSes like Windows are built and designed better, since Win32 APIs are so easy to emulate.

Had you started differently I'd have been glad to continue arguing with you but you've used the meanest and dirtiest words to describe the work put into the article which is an instant turn off and a clear indication that you might have a superiority complex.


The article actually is that wrong because it stems from one faulty piece of information, I tried to mention that. Wayland is not meant to sit at the layer of the stack you're thinking it is. You could be asking for all those features in something else, but that's a different project you'd be talking about and a different project you'd need to be criticizing, not Wayland.

"You're really a fan of 'misinformation'. Sorry, the guy who wrote it, has been using Linux for more than two decades and is trying to be as unbiased and factual as possible. Looks like you're just too invested in Linux and you've been personally shaken that such an article even exists in the first place."

This is wrong, please avoid making these accusations of bad faith, and please do not spread this misinformation about me. That really hurts me when you do that because you are lying to my face about the thing I know most: myself. Sadly this kind of unprofessional behavior is so common in the Linux world that I think some of us have gotten used to it. But it needs to stop. I also have been using Linux for more than two decades. Linux on the desktop sucks. It's bad. I'm in total agreement with you there. It's deeply flawed in a lot of ways.

"You're not disproving it in the slightest, you are _explaining_ why it has been done the way it's been done. "

There's nothing to disprove here though, that's what I'm saying. I'm saying the claim doesn't make any sense, it's like saying "the Linux kernel is bad because it doesn't have a feature to do my taxes". Well yes, Linux is just a kernel. It's not tax software. If you want to use tax software you'll have to find something built for that purpose, and then if you find the tax software makes an error you could report it to the authors of it. If you tried to report it to the kernel developers they'd say sorry there's nothing we can do about that. See what I mean? Trying to make all these complaints about Wayland doesn't make sense. What you're asking for is something different entirely.

"Linux, be it X11, Wayland or Mir, has no alternative to that. Yes, the author wants some LTS API for graphics, do you have anything to add to that?"

Actually XWayland serves this purpose to keep old legacy apps running. Win32 API is not a good example since most modern Windows applications are not built using the Win32 API either. But old X11 software should work just fine; if not, then you should mention the particular bug you're having so that it can get fixed. I'm not sure what you mean by "LTS API" though, you could just ship any old version of an API you want.

"You really don't see where the author comes from, yeah, Windows, where software can be developed and work for decades."

No need to get defensive. I'm curious what's your experience with Docker? That's been good for me to keep old software running.

"Had we had a low lever API akin to Win32 which toolkits like e.g. Gnome/Qt could be built upon"

Well we did have that, it was called Xt, and those toolkits moved away from it because it was very ugly :) I get what you're saying but it sounds like you're asking for them to have developed the perfect API 20 years ago which I hope you understand is not realistic. We know all the things we know now because of the benefit of hindsight.

"Just don't start BS'ing me with "everything could be recompiled for $next versions of Gnome/Qt" - this has _not_ been the case, we've lost hundreds of GTK 1/2, Qt 1/2/3/4 applications which no one wants to port to newer toolkits."

You don't need to port those to newer toolkits though, you can just ship the old version.

"Actually please don't bother with your 'refutations'. I see where you're coming from and I'm sure you'll continue in the same vein which is 'Linux is built differently, get used to it'."

I assure you, you're wrong about me. I'm not trying to refute you, I'm trying to explain that some of these problems actually do have solutions, so that you can use it to help yourself. And the thing about misinformation is that it spreads to other people and gives them the wrong ideas too, so if we want to have any chance to stop it then we need to take responsibility ourselves.

"I can, I have but ISVs and game developers will _not_. If you're OK with no commercial software and AAA games for Linux, absolute most average people are _not_. They want to play native games and run native Adobe/Autodesk/etc. software under Linux. I can imagine you could be content with having always incomplete layers of emulation and VMs which further prove that other OSes like Windows are built and designed better, since Win32 APIs are so easy to emulate."

I'm in total agreement about this. The Linux APIs are a total mess and unappealing to developers. Windows has Linux beat in an extreme way. But that's not an excuse to make false or misleading statements, we still need to get our criticisms correct if we want to have a chance to fix things. Edit: In my opinion the issue with gaming is totally separate from X11/Wayland and everything else. Linux just does not have any real gaming APIs. There's Vulkan and there's SDL and then that's it. That also is something that probably needs to happen at a layer above Wayland.

"Had you started differently I'd have been glad to continue arguing with you but you've used the meanest and dirtiest words to describe the work put into the article which is an instant turn off and a clear indication that you might have a superiority complex. "

I don't think I'm superior to you and I'm not criticizing you as a person. You're great. However some of the things you've said aren't correct, and I do think that you, as an intelligent being, deserve to know that so you can improve your article. There's no nicer way for me to put it, nobody likes to hear that they fucked up, but sometimes we need to hear it. Please do not write these articles if you're going to get defensive and don't want to field any criticism, that is not the way to have a discourse with the community. Edit: Also you've referred to this as an argument which it isn't. I'm not interested in arguing with you, I'm interested in correcting these issues. If you knew me you would probably find that I agree with you in most areas.


Let's quickly disagree with you as well:

> It's deeply flawed in a lot of ways.

Again, a bloody insult. You carefully choose words to insult the author, are you like that IRL as well?

> I'm curious what's your experience with Docker?

It's basically a virtual machine, almost like WSL for Windows. I couldn't care less about it because it's useful for packaging server applications, not for end users or ISVs.

> Actually XWayland serves this purpose to keep old legacy apps running.

Tell me how I can run KDE2/KDE3 applications in my Fedora 35 or Ubuntu 20.10. Your "actually" is a MISINFORMATION when right now I cannot use most graphical applications just from 10 years ago.

> You don't need to port those to newer toolkits though, you can just ship the old version.

Will not work, Qt2/Qt3/GTK1 are not available under new Linux distros in any shape or form. "Too costly to maintain". GTK2 will be removed soon.

Linux proponents always like to talk about various "workarounds" which are just NOT THERE for the average user. Docker is one of them.

No one will ship anything because Linux distros don't have enough manpower to maintain currently available software.

> I'm trying to explain that some of these problems actually do have solutions

This is FALSE, period. Case in point: http://blogg.forteller.net/2016/humble-test/

Linux distros do not give a damn about any software which they don't include. They don't care about any forward or backward compatibility. Ubuntu just recently tried to remove all 32bit support only then they realized their Wine users would not appreciate it. Most don't care about any stability as they don't give you an option of using say e.g. the current stable kernel or LTS kernels. Period.

Again, the article is about _major current unresolved_ issues in Linux and it stands by it.

Your "solutions", "workarounds", etc. are worth nothing if the average user who's going to install Fedora 35 in two weeks will not be able to use any of them.


"Again, a bloody insult. You carefully choose words to insult the author, are you like that IRL as well?"

I'm not insulting the author, that is a statement about the article. If you are suggesting that everything you write has no flaws then I'm sure you understand how that's not a reasonable thing to say. I write plenty of stuff that has flaws. But I'm willing to field criticism and correct it if it's wrong, that's how we collaborate to make something that's twice as good because it combines both our knowledge. Please do not write these articles if you are not interested to field criticism, thanks. Also please avoid trying to second guess the technical direction of a project unless you are involved in its development in some way, otherwise more misinformation like this is very likely to occur.

"It's basically a virtual machine, almost like WSL for Windows. I couldn't care less about it because it's useful for packaging server applications, not for end users or ISVs."

I'm not sure I agree, I've seen many ISVs that ship docker containers, even for desktop apps. You may want to look more into this technology.

"Tell me how I can run KDE2/KDE3 applications in my Fedora 35 or Ubuntu 20.10. Your 'actually' is a MISINFORMATION when right now I cannot use most graphical applications just from 10 years ago."

Someone would have to package those applications for those systems. I could walk you through the process if you want help. It's not exactly an easy process and we will probably have to solve some issues but I think it's doable. The real problem here seems to be that those distros have stopped shipping those applications. Edit: Actually I think this would be perfect to do in a flatpak or snap, you really should consider asking someone to do that and making that more prominent in the article. I'd suggest that you avoid setting arbitrary time frames like "right now" or "in two weeks" or anything like that.

"Will not work, Qt2/Qt3/GTK1 are not available under new Linux distros in any shape or form. 'Too costly to maintain'. GTK2 will be removed soon. Linux proponents always like to talk about various 'workarounds' which are just NOT THERE for the average user. Docker is one of them."

So again that would be the real problem, those things are too costly to maintain and there is no manpower. All of that is true and you can just mention that in the article. Not sure what can be done about it, if you have some ideas I'd love to hear them.

"This is FALSE, period. Case in point: http://blogg.forteller.net/2016/humble-test/"

For whatever reason the spreadsheet doesn't load so I can't see what you're talking about or try to figure out what the problem is. But those are games so I doubt the problem has anything to do with X or Wayland or KDE or anything like that. It's possible it could be something simple, like a few missing libraries. But maybe not, I really don't know, and if you don't know either then I would suggest against saying that it's true or false. This is not something we can productively speculate about on a blog or social media, it needs to be investigated as an actual bug.

"Linux distros do not give a damn about any software which they don't include. They don't care about any forward or backward compatibility. Ubuntu just recently tried to remove all 32bit support only then they realized their Wine users would not appreciate it. Most don't care about any stability as they don't give you an option of using say e.g. the current stable kernel or LTS kernels. Period."

All of this is true but is again unrelated to the rest of the things you said. Please make the article more about this and what can be done about it, if you offer solutions on how to fix it that would be even better. Right now, the article offers very little in the way of constructive comments.

"Your 'solutions', 'workarounds', etc. are worth nothing if the average user who's going to install Fedora 35 in two weeks will not be able to use any of them. "

It would be nice if you could mention more examples of what this average user using Fedora is trying to do. It's not helpful to discuss theoretical users, a valid bug report needs concrete steps to determine what the problem actually is. I think that would also be a major problem with the article, most of the commentary is phrased as "X doesn't have Y" or "X doesn't let me do Y" when that's not really a meaningful thing to say in many cases. It's more helpful to phrase it as "I tried to do Y thing, the result was Z instead, here are the steps I took using software A B and C".


> This is wrong, please avoid making these accusations of bad faith, and please do not spread this misinformation about me.

If you could stop being so rude and insulting I could discuss this further with you. _It was you_ who started with "misinformation" even though it was _never_ the author's intent.

> Wayland is not meant to sit at the layer of the stack you're thinking it is. You could be asking for all those features in something else, but that's a different project you'd be talking about and a different project you'd need to be criticizing, not Wayland.

ISVs and game developers need stable APIs to base their applications on and no Linux library or middleware or whatever you want to call it provides it currently, Wayland or not. You seem to be nit-picky as hell without trying to remotely understand what the author wants and has in mind, and that is "write once and be able to run for decades".

Here's the item you're opposed to and calling "misinformation":

> Applications (or GUI toolkits) must implement their own font antialiasing - there's no API for setting system-wide font rendering. Most sane and advanced windowing systems work exactly this way - Windows, Android, Mac OS X. In Wayland all clients (read applications) are totally independent.

Where the hell is the misinformation exactly? Do graphical libraries for Wayland use common _shared_ APIs, interfaces and configuration to render fonts? Are those even standardized? Can I be sure that an application which doesn't use GTK/Qt/EFL will use my ~/.config/fontconfig? No. That's the answer.

OK, they currently all maybe use FreeType/Pango/Cairo/Skia/whatever, but will these libraries be there in 10 years time? Just 15 years ago _nothing_ in Linux used Pango. It was XFT. No bloody standards, no APIs to guarantee anything in the long term. Well, under Windows all Win32 applications render fonts exactly the same (let's not talk about UWP and other fancy crap) using your font antialiasing settings.

BTW, right bloody now, in Fedora 35, GTK, Qt applications and web browsers all render fonts differently. I can upload screenshots for you to see. So much for "misinformation". Oh, God.

OK, let's call this article, which, every time it shows up, gathers hundreds of comments, "a load of absolute bullcrap/misinformation" (which you're so fond of) and Linux is perfect.

If you really want to be technical and factual, approach what article says from the same PoV, not "I will _interpret_ this and that this way based on the _status quo_ and call it misinformation".

Good luck and have fun calling everything you disagree with (even though it's not technically false) "misinformation". BTW, you've used "disagree" quite a lot but still call everything "misinformation". Make up your mind already. Either something is dead wrong or not. First you made it abundantly clear that it's _all wrong/false_ and now you're "disagreeing".

Lastly, there are 12 items (!) under Wayland, one of which you disagree with, yet the whole article is disinformation. OMFG. This certainly invalidates everything.

Sorry, you've rattled me too much to continue.


"_It was you_ who started with "misinformation" even though it was _never_ the author's intent."

I am not saying it is your intent, most people who get things wrong are not intending to do so. It still however can be misinformation. I think you are confusing terms, the word "disinformation" is what usually refers to intentional false information. I'm not saying you're spreading disinformation and I'm not criticizing you as a person. My issue is with the words that are written in the article. Please avoid taking criticism of your work personally, I'm sure you are a good and beautiful person.

"ISVs and game developers need stable APIs to base their applications on and no Linux library or middleware or whatever you want to call it provides it currently"

I agree, this is the real problem and you could just state that. It has nothing to do with X or Wayland or whatever.

"what the author wants and has in mind, and that is 'write once and be able to run for decades'"

Actually I do understand this, and I feel the article doesn't communicate it well, I wish you would be clearer about what problems you're having here with shipping apps like this. The criticism of all kind of unrelated things is distracting, just mention what you're trying to run that isn't working.

"Do graphical libraries for Wayland use common _shared_ APIs, interfaces and configuration to render fonts? Are those even standardized?"

The answer to all of those is yes, they use freetype.

"Can I be sure that an application which doesn't use GTK/Qt/EFL will use my ~/.config/fontconfig ? No."

I'm really not sure what you're asking for here, if the app opts out of the standard then of course it won't work. You can also implement your own font rendering on Windows/MacOS and opt out of the standard there too. There is absolutely nothing X or Wayland or Mac or Windows or anything can do if an app decides they are going to implement their own font rendering library. And I assume you're talking about games here, those are a great example of something that intentionally doesn't follow the system settings on any platform.
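
To illustrate the shared path: any app (or toolkit) that plays by the rules asks fontconfig, which merges the system config with ~/.config/fontconfig, and then rasterizes with freetype. A minimal sketch of the lookup half - this is roughly the equivalent of what GTK and Qt do before drawing text (build command assumed):

  /* Resolve "sans-serif" and the user's rendering settings via fontconfig.
   * Build (assumed): cc fcdemo.c -lfontconfig */
  #include <fontconfig/fontconfig.h>
  #include <stdio.h>

  int main(void) {
      FcInit();
      FcPattern *pat = FcNameParse((const FcChar8 *)"sans-serif");
      FcConfigSubstitute(NULL, pat, FcMatchPattern);  /* apply user config rules */
      FcDefaultSubstitute(pat);

      FcResult res;
      FcPattern *match = FcFontMatch(NULL, pat, &res);
      FcChar8 *file = NULL;
      FcBool aa = FcFalse;
      int hintstyle = 0;
      FcPatternGetString(match, FC_FILE, 0, &file);
      FcPatternGetBool(match, FC_ANTIALIAS, 0, &aa);
      FcPatternGetInteger(match, FC_HINT_STYLE, 0, &hintstyle);
      printf("font file:  %s\nantialias:  %d\nhint style: %d\n",
             file ? (char *)file : "?", aa, hintstyle);

      FcPatternDestroy(match);
      FcPatternDestroy(pat);
      FcFini();
      return 0;
  }

An app that skips this lookup and hardcodes its own rendering simply isn't following the standard, on any OS.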

"OK, they currently all maybe use FreeType/Pango/Cairo/Skia/whatever, but will these libraries be there in 10 years time?"

Well yes, the code will probably still be around, I think github has archived all this stuff for 1000 years or something like that.

"Just 15 years ago _nothing_ in Linux used Pango. It was XFT. No bloody standards, no APIs to guarantee anything in the long term."

I have no idea what you mean, you can still use Xft. Pango has more features however so you'll have to upgrade if you want those. That's generally how it works, new APIs get new features, the old APIs stay the same and remain stable.

"Well, under Windows all Win32 applications render fonts exactly the same (let's not talk about UWP and other fancy crap) using your font antialiasing settings."

I don't understand why you are saying disregard UWP, that would be a demonstration of the same problem!

"gathers hundreds of comments 'a load of absolute bullcrap/misinformation' (which you're so fond of) and Linux is perfect."

Linux isn't perfect, I've said repeatedly that it's heavily flawed.

"Good luck and have fun calling everything you disagree with (even though it's not technically false) "misinformation". BTW, you've used 'disagree' quite a lot but still call everything "misinformation. Make up your mind already. Either something is dead wrong or not. First you made it abundantly clear that it's _all wrong/false_ and now you're 'disagreeing'. Sorry, you've rattled me too much to continue. "

Please avoid this kind of combative comment. I've said I agree with you in subjective matters in what constitutes being "good" or "bad" and what we think needs to happen. In matters of the facts of what the software actually is doing right now, that's where we need to worry about facts and misinformation. You owe it to your readers to be as accurate as possible if you want to have a positive effect. Also I apologize if you feel rattled, it's not my intent to upset you, my intent is to get the facts corrected.


Anyway I'll get into it. From my perspective from actually working on this, the problem with most of these statements is that none of these things can actually be fixed in Wayland. Wayland is just a really basic protocol for putting buffers on the screen, it doesn't do much else and it's intentionally not good at being a general purpose API for desktop-y things. Your article seems to assume that it is, which is a misconception, and only causes confusion. If you actually tried to fix these problems by making changes in Wayland then you would make the problem worse. Most of what is mentioned absolutely are problems, but they need to be fixed elsewhere, and the claim should probably be rephrased to reflect that if you want it to be accurate. To be clear, I am in no way trying to downplay any of these problems.
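
To make "really basic protocol" concrete, here is approximately the smallest useful Wayland client: it connects and prints the interfaces the compositor advertises, and nothing more (a sketch, assuming libwayland-client is installed; build command assumed):

  /* Connect to the compositor and dump the globals it advertises.
   * Build (assumed): cc wldump.c -lwayland-client */
  #include <wayland-client.h>
  #include <stdio.h>

  static void on_global(void *data, struct wl_registry *reg, uint32_t name,
                        const char *interface, uint32_t version) {
      printf("%-40s v%u\n", interface, version);  /* e.g. wl_compositor, wl_seat */
  }
  static void on_global_remove(void *data, struct wl_registry *reg, uint32_t name) {}

  static const struct wl_registry_listener listener = { on_global, on_global_remove };

  int main(void) {
      struct wl_display *dpy = wl_display_connect(NULL);  /* $WAYLAND_DISPLAY */
      if (!dpy) { fprintf(stderr, "no Wayland compositor running\n"); return 1; }
      struct wl_registry *reg = wl_display_get_registry(dpy);
      wl_registry_add_listener(reg, &listener, NULL);
      wl_display_roundtrip(dpy);  /* process the initial burst of globals */
      wl_display_disconnect(dpy);
      return 0;
  }

Everything people ask of "Wayland" - screenshots, settings, toolkits - shows up either as one of those optional extension interfaces or in a layer above, which is the point of the list below.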

"Wayland doesn't provide multiple APIs for various crucial desktop features"

So some of these are in the portal API, not wayland. It doesn't really make sense to put them in Wayland, some implementations do provide them in Wayland but that's more of an implementation detail and applications wouldn't actually be expected to use that API. Some of them are still missing from the portal API however, which is a problem, so maybe this should be edited down to just those. I'll talk about each of them too.

- global keyboard shortcuts - It's not clear what this means. There were some proposals for this but none adequately explained the problem.

- drag-n-drop - Not sure what else you're looking for here, Wayland has this.

- systray - There are implementations of this, it doesn't belong in the wayland protocol. It would be for a panel to communicate with an app.

- screenshotting, screencasting - This is in the portal API.

- UI rendering acceleration - Not sure what this means, Wayland clients can just use OpenGL or Vulkan.

"which must be implemented by the window manager/compositor, which means multiple applications using the said features can only support their own Wayland compositor implementations"

I'm not sure what this means, applications can use an abstraction layer that deals with it for them. Most applications already are using such an abstraction layer like GTK/Qt/SDL/etc, they should not have to deal with the raw X11/Wayland API.

"Wayland doesn't provide a universal basic window compositor"

This is not the use case of Wayland and it would not be possible for Wayland to provide this. You could build such a compositor on top of Wayland.

"thus each desktop environment has to reinvent the wheel and some environments simply don't have enough manpower to write a compositor e.g. XFCE/IceWM. This also leads to a huge amount of duplication of work (and bugs as well) because various desktop environments need to reimplemen the same features and APIs over and over again."

Well no, if they're short on manpower they can just use an existing compositor such as Mir, wlroots, mutter, etc. The real problem here is that they are short on manpower, nothing can be done in Wayland or X11 or anything to really fix that.

"Wayland and its window managers/compositors have very high requirements in terms of support from both hardware and kernel drivers which prevents it from working on far too many configurations. X.org and Windows on the other hand can work on an actual rock using e.g. VESA."

So this is an actual problem but it's not really related to Wayland, e.g. if you want to run an X compositor on an old PC that will also have bad performance because it requires hardware acceleration. And if an app itself requires Vulkan or something then that definitely won't work. I'm not sure what can be done about this and this seems like complaining about an unsolvable problem; you will also have similar problems if you try to install Windows 11 or macOS Monterey on an old PC and try to run modern apps on it.

"Applications (or GUI toolkits) must implement their own font antialiasing"

This is completely wrong, basically all applications and UI toolkits just use freetype.

"there's no API for setting system-wide font rendering"

This is also completely wrong, this is what fontconfig is for. It's not the best API though so this section could be rephrased as a review of freetype and fontconfig.

"Most sane and advanced windowing systems work exactly this way - Windows, Android, Mac OS X."

This is also completely wrong, I just checked Android and MacOS and there is no setting to change the font rendering. Not sure about Windows as I don't use it.

"Wayland does not provide a basic universal unified graphical toolkit/API (and Wayland developers will not implement it)"

See what I said above, Wayland cannot implement this. If you wanted this you would have to build it on top of Wayland. Wayland is an API to use when implementing a toolkit.

"akin to Win32 which means applications using different toolkits (e.g. Qt, GTK, Enlightenment) may look and behave differently and there's no way to configure all of them once and for all."

Toolkit fragmentation is a real thing which should be mentioned in this article, but see above again, this is not something that Wayland is ever going to be in any position to fix.

"Firstly, forget about performance/bandwidth efficient RDP protocol (it's already implemented but it works by sending the updates of large chunks of the screen, i.e. a lot like old highly inefficient VNC)"

Another thing that Wayland is not in the position to fix. This would be nice to have but would be up to the toolkits to implement.

"forget about OpenGL pass-through, forget about raw compressed video pass-through"

This has to be implemented in the VM host, there is nothing a Wayland implementation can really do about it. The Wayland protocol only receives the dmabufs and YUV buffers and passes them off, it doesn't say anything about what to do with them.

"In case you're interested all these features work in Microsoft's RDP."

Yeah so that would be a problem with the RDP implementation, FreeRDP or whatever.

"Secondly, forget about proper output rotation/scaling/ratio change."

That thread is really confusing and nonsensical to me. If you have non square pixels then you just make the application aware of the vertical/horizontal DPI in some way. Wayland doesn't even need to get involved, it will just display your buffer as you pass it. Wayland itself doesn't assign any unit to a pixel.

"Wayland doesn't allow XWayland applications to change display resolution which could make running games slower as the compositor needs to upscale each game frame."

This is more of a problem with X11 since there is no "safe" API to change the display resolution there. Maybe someone could fix it but those apps really need to be ported to a different API.

"Also software upscaling might not be the best option."

Well most compositors are using hardware upscaling.

"Wayland doesn't allow applications to exclusively grab mouse/keyboard which is required for games and applications like VMs."

This is wrong, that's what the pointer constraints protocol is for.

"Applies to the X server/protocol as well: neither X.org, nor Wayland window managers offer a way to extend/modify window's title bars and File Open/Save dialogs."

This again has nothing to do with X.org or Wayland. The title bars and file dialogs are handled by the toolkit.

"there's no unified toolkit and no unified window manager (or protocol)."

So I think you could just replace this entire section with this statement and then go from there.

"Wayland compositors don't have a universal method of storing and configuring screen/session/keyboard/mouse settings."

This one's potentially a real problem, but again, it's not clear what "screen/session/keyboard/mouse settings" are. That could be thousands of things. And this also has nothing to do with Wayland because it's not a protocol for configuring settings.

"Currently there's no standard way to remap keys under Wayland."

See above, this has nothing to do with Wayland.

"Wayland applications cannot run without a Wayland compositor and in case it crashes, all the running apps die."

This statement is badly worded, technically this isn't true and it just needs to be implemented. Which is the real problem and is currently underway, see here: https://lists.freedesktop.org/archives/wayland-devel/2021-Au...

"An assortment of various other general and KDE specific issues."

These are valid issues, please make these issues more prominent.


And just a few more things.

The sections on video drivers, printers, Xorg and the Linux kernel I think are mostly correct, but I would say you probably want to remove any paragraphs that have no sources, or ones that have low-quality sources. Random Slashdot and Reddit posts are not reliable sources, and they make your article look sloppy. It's not responsible to put that information out there with nonexistent/bad sources, so probably take those paragraphs out and put them back in once reliable sources are located. It would be best to have a tracking issue or official statement from the developers for each of those problems.

I think this article is good for maybe a first draft but it needs a lot more editing to really be a powerful piece. You are clearly a smart and capable person and you can do much better than this. You may even want to take it down completely until everything is fixed. I had some other issues but I can't remember what they were, and in any case I think you have enough to work with here. So I'm looking forward to getting these corrected and seeing future revisions of this article. Just let me know if you need more help, thanks.


Wayland and Xorg sections have been reworked. Most of your Wayland concerns have been addressed (but not all).

If you still have other concerns and want to contribute, please do.

Only like I said before, let's talk about items which are _factually wrong right now_ as we speak.


> you probably want to remove any paragraphs that have no sources, or ones that have low-quality sources. Random Slashdot and Reddit posts are not reliable sources, and they make your article look sloppy.

Since there are no official/factual sources but the sentiment is shared by lots of people, what would you do about that? Come to think of it, _nothing_ in the article has rock-solid proofs or sources. If we continue along those lines, it should simply not exist as it's so effing "bad", yet it's common knowledge that 1) APIs in Linux are abandoned all the time 2) there's nothing similar to Win32 3) there are tons of regressions all the time 4) HW support is spotty at best despite proclamations to the contrary 5) there are lots of bugs in core applications, e.g. DEs 6) older applications are abandoned with no official way to run them - and people debate even these things based on anecdata: "Linux works for me, which means it's perfect".

> It would be best to have a tracking issue or official statement from the developers for each of those problems.

No one cares. Most developers will oppose this idea HARD. The list has existed since 2009. If anything on it has been solved, it was due to developers being _personally_ annoyed by the said issues. Not because the Open Source community demanded it, or someone paid attention to bug reports - no, it was either developers addressing bugs which affected them or companies like Valve sponsoring the things they need (wine/Proton/DXVK).

> I think this article is good for maybe a first draft but it needs a lot more editing to really be a powerful piece.

Nah, just "misinformation". I will never forgive you for using this word, as I have a very strong attitude about people who use it willy-nilly to talk about anything they merely _disagree_ with.

While I'm totally OK with people having different opinions about something which has not been established or proven scientifically, there can be no opinions about something which has been conclusively proven, e.g. evolution, climate change, gravity, etc. etc. etc. You do not debate something which has long become the scientific fact. The author has a scientific background, opposes religions and loves science, so to think that he's willingly spreading misinformation is an insult of insane proportions.

> You may even want to take it down completely until everything is fixed.

Nah, your posts will be reviewed later; if there's truly valuable material, meaning it's really good in terms of refuting the article head-on, it will be addressed just like it's been done before. If you find the article so repulsive, don't visit this abomination made of falsehoods. Sorry for kinda overreacting, but you started off horribly.

Likely everything that goes under "I disagree because" or "This happens because", followed by mostly inane "arguments" why Linux is "different" from well-established OS'es with good stable APIs and excellent documentation, will not be addressed, because your opinion, while it's very dear to you, does _not_ solve or address the issues presented therein. Open Source fans have a long list of opinions about Open Source, almost none of which have helped address anything since 1991. Most open source fans love to think of themselves as a select few who are so great for choosing open source, yet the absolute majority of them haven't contributed in any way, shape or form. The guy who wrote this ugly, largely false article (according to you) has helped resolve hundreds of bugs, including in the Linux kernel, GCC, Xorg and other core Linux projects. Applying the word "misinformation" to his work and efforts is terribly insincere, damning and extremely insulting.

> I had some other issues but I can't remember what they were, and in any case I think you have enough to work with here. So I'm looking forward to getting these corrected and seeing future revisions of this article. Just let me know if you need more help, thanks.

Slow and steady.


> the portal API

I don't care about lightweight virtualization solutions, I care about native applications. If we start talking about virtualization, let's forget about Linux as a software platform altogether. Why would anyone want to run Linux if you can run Flatpaks under Windows, FreeBSD, etc. etc. etc. not to mention increased memory consumption, disk consumption, slower startup, and ... ISVs just don't like it. They like to compile native applications and be done with that like they do for Windows, MacOS, Android and iOS.

> - systray - There are implementations of this

Exactly, "implementations" - which means you cannot write an app which could run under all Wayland DEs and minimize to the systray. This is not an issue under Xorg Windows MacOS - only under Wayland.

> global keyboard shortcuts - It's not clear what this means. There were some proposals for this but none adequately explained the problem.

"proposals", exactly, no APIs, no standard way to make 'em work in all Wayland compositors.

> drag-n-drop

Removed.

> screenshotting, screencasting - This is in the portal API.

Again, Flatpak.

> UI rendering acceleration - Not sure what this means, Wayland clients can just use OpenGL or Vulkan.

It means that under Windows you use Win32/D3D and, as a programmer, you don't have to think about how to make your application run fast using system-wide GPU hardware acceleration. Again, there's no common API underneath GTK, Qt and EFL which provides it, which means a zoo of implementations, bugs and problems.

Still, you're right, it has nothing to do with Wayland, so moved to a new section and rephrased! Good point.

> I'm not sure what this means, applications can use an abstraction layer that deals with it for them.

Doesn't exist currently.

> Well no, if they're short on manpower they can just use an existing compositor such as Mir, wlroots, mutter, etc.

Their lead programmers think otherwise, or they would have started using them a long time ago. Please argue with them. Is this an issue right now? YES!

> I'm not sure what can be done about this, and it seems like complaining about an unsolvable problem - you will have similar problems if you try to install Windows 11 or MacOS Monterey on an old PC and run modern apps on it.

Windows and X.org can both run on VESA-compatible GPUs, which also means you have a safe mode in case your graphics driver misbehaves. No such option with Wayland compositors. Is this an issue with Wayland itself? No. Is this a valid current issue? Yes.

> This is completely wrong, basically all applications and UI toolkits just use freetype.

Already mentioned that GTK, Qt and EFL based applications and Web browsers here on this PC render fonts differently despite all using FreeType. They don't just use FreeType - there's no such thing as "just using FreeType".

> This is also completely wrong, this is what fontconfig is for. It's not the best API though so this section could be rephrased as a review of freetype and fontconfig.

See the previous paragraph. I don't care why they look different and render fonts differently, but the issue remains.

> This is also completely wrong, I just checked Android and MacOS and there is no setting to change the font rendering. Not sure about Windows as I don't use it.

In Android all apps look exactly the same to me. In MacOS, last time I used it, all apps looked exactly the same. In Windows, classic Win32 applications all look the same: https://www.thewindowsclub.com/disable-font-smoothing-window...

> See what I said above, Wayland cannot implement this. If you wanted this you would have to build it on top of Wayland. Wayland is an API to use when implementing a toolkit.

> Toolkit fragmentation is a real thing which should be mentioned in this article, but see above again, this is not something that Wayland is ever going to be in any position to fix.

Addressed.

> Maybe someone could fix it but those apps really need to be ported to a different API.

Windows has moved from GDI to WDDM 1/2/3 transparently and most old applications continue to work. Not a good argument, sorry.

> Well most compositors are using hardware upscaling.

Addressed/removed.

> This is wrong, that's what the pointer constraints protocol is for.

Can you provide more info please?

> This again has nothing to do with X.org or Wayland. The title bars and file dialogs are handled by the toolkit.

> So I think you could just replace this entire section with this statement and then go from there.

Moved/addressed.

> This one's potentially a real problem, but again, it's not clear what "screen/session/keyboard/mouse settings" are. That could be thousands of things. And this also has nothing to do with Wayland because it's not a protocol for configuring settings.

Does Wayland necessitate a compositor implementation for the user? Yes. Does the user want to configure each of a dozen possible compositors? Very unlikely - they want to configure their graphics and related settings once and for all.

> See above, this has nothing to do with Wayland.

I've lost you here.

> This statement is badly worded, technically this isn't true and it just needs to be implemented.

Is this an issue RIGHT NOW? Yes.

> These are valid issues, please make these issues more prominent.

Copying and pasting is not my forte.

-------------------------------------------------------

P.S. I beg you - please stop talking about "this needs to be implemented", "this could work if", etc. Expressions like these unfortunately prove the article right.

You started with "misinformation" yet the article describes the CURRENT ISSUES plaguing Linux on the desktop.

If you want to actually make a difference (about the article), start talking about _the status quo_.


> 6. PulseAudio continues to be the answer to the question nobody asked.

Getting rid of PulseAudio though became a question to an answer nobody wanted.

Axing this *Kit MacOS-wannabe software and useless middleware is half the battle these days if you want to make usable Linux desktop software.


This comment is really confusing to me. If you're trying to build a comprehensive desktop platform then you actually want more and more middleware. App developers are enticed to a platform when they have a lot of structure they can work with; it's part of why Apple has been so successful at that.


This post is 100% accurate. One thing that people won't believe is that it's almost 2022 and there is still no hardware acceleration when playing video in your browser on Linux - yup, the CPU is pegging like crazy to play stuff that is trivial to decode with a GPU. I have a laptop from 2021 and yet somehow VLC is not able to use the GPU for video decoding.

Something that Windows has been doing for the last 15 years. So you get slow video rendering and high CPU / battery usage.

Other crazy stuff: have you tried installing that version of X11 to get the latest OpenGL, then putting that firmware in the right folder, etc. etc.? Does your kernel have the backport from version 5.10.x? No? Well, then you get a black screen when you come back from sleep, etc. ... same problems as 20 years ago on Linux.

https://wiki.archlinux.org/title/chromium#Force_GPU_accelera...


I know it's proving the point in the article, but I thought I'd let you know Firefox can be configured to do hardware acceleration. There's a nice article in the Arch Linux wiki that describes the steps needed to turn it on: https://wiki.archlinux.org/title/Firefox#Hardware_video_acce...

I've done this and the battery life on my ThinkPad (Intel graphics) while watching videos in Firefox has improved heaps. My last 20% lasts around an hour instead of only 15 mins.

It's silly how it's turned off by default even though it works just fine. But again proving the point of the article I guess...
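
For reference - treat the exact pref name as from-memory and defer to the wiki - the core toggle lives in about:config, alongside installing the right VA-API driver package for your GPU:

  media.ffmpeg.vaapi.enabled = true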


Firefox has enabled hardware acceleration in GNOME Shell by default as of versions 84 (X11) and 85 (Wayland), if you are using AMD or Intel graphics.[1] The feature is enabled in KDE Plasma and XFCE by default (X11 only) as of version 88, also for AMD and Intel graphics.[2]

[1] https://mastransky.wordpress.com/2021/01/10/firefox-were-fin...

[2] https://bugzilla.mozilla.org/show_bug.cgi?id=1696495


I believe you are referring to hardware acceleration for stuff other than video (which I guess is what WebRender is about). I have no problems with this, and indeed WebRender is enabled by default in my installation of Firefox 93, GNOME, Wayland, Intel graphics.

Re-reading my post I realise I didn't specifically say hardware video acceleration -- this is where applications like Firefox do video decoding etc. on the GPU instead of in software. This is not enabled by default on Linux yet.


> It's silly how it's turned off by default even though it works just fine

This can't really be stated based on anecdata; for example, my case has been unsuccessful - I had issues, so I had to disable it.


Why are some people not willing to spend money on these things? Isn't it a good thing to be willing to?

HW acceleration, great sleep support, nice trackpad drivers, good battery life, well integrated high-resolution display support, high res audio (AAC) over Bluetooth or integration to your other devices etc. etc.

Where's the freedom in not even being able to pay/opt-in for these things in your OS?

How does this mindset provide freedom for the developers of a complicated driver or another great piece of SW?


> Why are some people not willing to spend money on these things? Isn't it a good thing to be willing to?

From what I understand from other comments on this post, it seems this particular problem can only be perfected by more testing across multiple devices etc - that requires a wider effort that is quite difficult to achieve.

For example, I use Linux as my daily driver and I sometimes fix bugs here and there. But, personally for me, this particular one requires so much effort for not-that-revolutionary benefits. More context: I rarely watch long videos on my laptop when on the move. I'd rather fix something that improves my workflow than a bit more of battery life.

Sleep, otoh, is very important to me, so I make sure to buy laptops that DO NOT have NVIDIA stuff in them. For gaming I've got my desktop. Result: my XPS 13 and ThinkPad sleep and resume instantly; WiFi, audio etc. work fine, including attaching a dock. I guess it's really a YMMV thing, which, as the blog points out, doesn't help Linux's popularity.


I’m not sure why you linked to docs talking about video decode acceleration in the browser after saying there is none.

Anyways, it exists, I use it every day, but it’s not zero effort — like most things on Linux.


For the folks at home, you can try

  google-chrome \
    --ignore-gpu-blocklist \
    --enable-features=VaapiVideoDecoder \
    --use-gl=desktop \
    --disable-gpu-driver-bug-workarounds \
    --gpu-testing-vendor-id=0x[vendor_id] \
    --gpu-testing-device-id=0x[device_id] \
    --enable-crashpad \
    --flag-switches-begin --enable-gpu-rasterization --flag-switches-end
The vendor id and device id of your graphics card can be identified via

  lspci -nnk
As an example, the output for my graphics card is

  02:00.0 VGA compatible controller [0300]: Advanced Micro Devices, Inc. [AMD/ATI] Lexa PRO [Radeon 540/540X/550/550X / RX 540X/550/550X] [1002:699f] (rev c7)
          Subsystem: Sapphire Technology Limited Lexa PRO [Radeon RX 550] [1da2:e367]
          Kernel driver in use: amdgpu
          Kernel modules: amdgpu
Here the vendor id is 1002 and the device id 699f. You can check whether everything is working by navigating to

  chrome://gpu/
Many of the graphics features should now have hardware acceleration enabled.

There are probably a few more flags here than necessary, but I found these sufficient to force 4K 60 fps Stadia support on Linux. (You would also need the Stadia+ chrome extension if that were your goal: https://chrome.google.com/webstore/detail/stadia%20-extensio... .)


That's not what I could call "working". Most people out there will simply never use such a "solution".


I think you misunderstood my post. My purpose wasn't to argue that the situation is ideal -- I would much prefer it if the process were automatic, as it is on other operating systems -- but rather to assist others in enabling video acceleration.


Hardware decoding in the browser should be zero effort in 2021; it does not make sense that it's not a thing.

I use Chrome and Ubuntu 20.04 + an Intel Iris (CPU from 2020) and it's not possible to make it work.

I'm a dev and I know Linux very well, so imagine average people.


So this is Hacker News... anyone with the knowledge: how difficult is it, actually, to enable hardware decoding for the vast range of HW out there?

Like, how many hours of work / manpower do you need to have HW decoding support for the hardware out there, and to keep it up to date, safe, and updated for new HW, etc.?


How can Windows do it? It's not like MS has lots of resources to do this very specific thing, and Windows needs to support almost all consumer hardware. That makes me think there's some driver-level barrier that Linux devs just haven't cared enough about.


Windows has DXVA which has been standardized for over a decade.

Linux still cannot settle on a single universal API (VA-API, VDPAU and NVDEC all compete) which works regardless of the graphics system or the various underlying drivers. In short, it's a bloody mess.
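
If you want to see the fragmentation first-hand, each API ships its own introspection tool (assuming the libva-utils and vdpauinfo packages, which most distros don't install by default):

  vainfo      # lists the decode/encode profiles your VA-API driver advertises
  vdpauinfo   # the same exercise, but for the competing VDPAU API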


Windows and OS X ship APIs for acceleration.


This is a far more specifically problematic issue than a lot of the list imo.


I didn't know this, wow. How do you watch videos on a laptop like that?


Let the CPU decode/render it, I guess, and say bye-bye to your battery life.


So I've been running Linux (and various BSDs) on the Desktop for at least 20 years or so. All flavors and distros. And I think the problems in 2021 are mostly the same as they have always been: caused by vendors.

It always comes down to what you're willing to tolerate in your desktop and hardware support and what vendors are dishing out.

Most people can just install Linux nowadays on a fresh PC and be done with it. It mostly just works which is amazing in many respects.

But - if you come up against some bug for some specific piece of hardware you have (eg graphics driver is not working properly out of the box), you're in for some work. And that's fine if you are OK with that.

Personally, I am not OK with that anymore and here are my rules:

1. If I have a random PC I will just run Windows 99% of the time.

2. If I want desktop Linux, I will purchase from a vendor who is testing with their hardware (e.g. System76 or PINE64), OR I might buy a Chromebook for Crostini. But I don't consider Chrome OS or Android "Linux", because they are at heart proprietary operating systems that happen to utilize the Linux kernel, IMO.

I consider this whole issue orthogonal to systems like macOS. I appreciate the work being done for M1 by the Asahi folks and will donate to it, of course.

I believe in 2021 it's time for me to start voting with my dollars if I want Desktop Linux (or phone Linux..) to be a thing. As much as I enjoy building a software system that matches my expectations, it's time for me to fully pass these expectations back to vendors, many of whom deserve my support.


I mostly agree. Except that on my personal machines, I will run Linux or a BSD variant 99% of the time. ;-) At work, I never had much of a choice; it was Windows all the way.

I know it sucks if the OS does not support your hardware well. And I don't blame people who do not have the time or patience to deal with that. I certainly don't. I used to love spending weekends getting sound or wifi to work, but I have been "too old for this s___" for more than a decade.

But personally, I have not run into any problems in more than a decade. I will admit, though, that I have learned to do a little (and I mean little) research before purchasing hardware. Back in the early 2000's, Win modems and printers used to be a minefield for Linux users, but these days, picking hardware that Just Works(tm) is shockingly easy to someone who remembers the "good" old days.

I think the main barrier to most people is that Linux is just different from Windows, starting with picking a distro, then picking a desktop environment, and so forth. I still consider it fun to experiment with different distros and desktops, but I can understand that most people don't want to spend/waste their time with that. Which also means that PC vendors have pretty much no incentive to put Linux on their machines except for targeting developers.

I don't think Linux (or any of the BSD systems) will ever become a mainstream OS on the desktop, and I can live with that. I might sound a bit cynical, but at least it means that Linux is not and will not be an attractive target for malware.

Given all the crap that Windows users put up with, be it malware or system updates randomly breaking things, I think a fair amount of people might find it worth the effort to give Linux a try. I used to work as a Windows admin, so I'm not just guessing here. But if it hasn't happened by now, I doubt it's going to anytime soon. People tend to like what they know, and with all the third-party software available on Windows and Windows only, Microsoft is going to spend a lot of effort maintaining backwards compatibility for a long time.


> Given all the crap that Windows users put up with, be it malware or system updates randomly breaking things, I think a fair amount of people might find it worth the effort to give Linux a try.

System updates don't break things on Linux?

The only reason malware is not more popular is because Linux is a less attractive platform to target due to its low desktop market share, not because it's actually more secure. Same goes for macOS.


I agree with you 100% on the malware part. If Linux got a share of, say, 30% on the desktop, things would start looking quite different, I am sure.

Updates... well, there's updates and there's upgrades. On Debian and CentOS, I have never encountered any trouble with updates. You pay the price by using software that is usually rather dated, but in my experience, once it works, it keeps working.

On rolling-release systems, where there is no clear distinction between updates and upgrades, I have encountered plenty of trouble on Gentoo and openSUSE Tumbleweed. Arch Linux, I'm told, is rock-solid, but I guess once you manage to actually install it, nothing scares you anymore. ;-) (Just kidding, I know that Arch has an actual installer by now. But until fairly recently, installing Arch made the OpenBSD installer look almost decadent.) (Also, to be fair, I was on Gentoo's unstable branch, so in a way, I was kind of asking for trouble.)


> Given all the crap that Windows users put up with, be it malware or system updates randomly breaking things, I think a fair amount of people might find it worth the effort to give Linux a try.

This. And it would be wrong to say that running Windows is easier - it is not. It just has the vendor support to make it a little bit easier with respect to some hardware, sometimes.


Hehe, the update thing bit me badly more than once. Coming from a Debian background, I just approved any and all incoming updates in WSUS when I started working as a Windows admin.

Oh, the stories I could tell... ;-) Siemens SIMATIC, specifically, was so sensitive, I eventually stopped approving Windows Updates for our automation people until I got them to move all their SIMATIC software to virtual machines that had no access to our internal network. Printers, too, were a constant source of head-scratching and cursing. *manic laughter*

The worst part, to me, was that Windows seems to go out of its way to make troubleshooting unnecessarily difficult. I don't mind if a system misbehaves every now and then, as long as I have good chance of figuring out the source of the problem. But troubleshooting Windows is more frustrating than parsing error messages from a C++ compiler complaining about template instantiation problems (at least for me, having only the most superficial experience with C++).


> And I think the problems in 2021 are mostly the same as they have always been: caused by vendors.

Fewer than 5% of the issues listed in the article have anything to do with HW vendors, while the remaining 95% are caused by Open Source developers who work independently and don't particularly care about the grand vision/unified system which just works.


Weird list of non-problems and not understanding how Linux works (and a few problems). A few points picked at random:

"There's no concept of drivers in Linux aside from proprietary drivers for NVIDIA/AMD GPUs which are separate packages: almost all drivers are already either in the kernel or various complementary packages (like foomatic/sane/etc)."

I would say this is a feature rather than a bug.

"It's impossible for the user to understand whether their hardware is indeed supported by the running Linux distro and whether all the required drivers are indeed installed and working properly (e.g. all the firmware files are available and loaded or necessary printer filters are installed)."

Your hardware generally either works or doesn't work. If you are looking for Linux-compatible hardware to purchase, you'll need to do a few internet searches first, which is the same as any other OS.

"No high level, stable, sane (truly forward and backward compatible) and standardized API for developing GUI applications (like core Win32 API - most Windows 95 applications still run fine in Windows 10 - that's 24 years of binary compatibility). Both GTK and Qt (incompatible GTK versions 1, 2, 3, 4 and incompatible Qt versions 4, 5, 6 just for the last decade) don't strive to be backwards compatible. The Qt company also changed the licensing model for their toolkit which makes using the library under Linux problematic to say the least."

Complete misunderstanding of how Linux is distributed. The distros keep your software working with newer versions of Gtk or Qt, and deal with licensing. It's not something that end users need to worry about.

Also the obsession with "screen tearing"! I just don't get this frequent objection. I watch videos often in X11 which is supposed to be impossible because of this "tearing" thing, but somehow it doesn't bother me at all.


This is nitpicking and ignoring that the end users have problems because of all of this, and they are not addressed because of this kind of hand waving.

I donate to the FSF, I contribute to FOSS, and I take those kinds of articles very seriously, because they contain very important information about what we need to work on next.

Of course, that doesn't mean we should take it at face value. Like with the Linux drivers situation, the idea is not to say "yeah, let's change that". But one should recognize the challenges it can bring, and find a way to make it better for those who want to get hardware support.

Staying at the top of our ivory tower while stating we are technically correct - the best kind of correct - is not a good long-term strategy.


Screen tearing is something I've seen happen a few times across different hardware combinations. Sometimes it just works out of the box (e.g. my laptop using Intel graphics on Arch Linux experiences no tearing in any application), sometimes it's a huge problem (e.g. my Raspberry Pi using ARM graphics experiences mild to severe screen tearing). What's really confusing about the Raspberry Pi example is the conflicting info online. Some people swear turning off compositing solves it, others say turning it on solves it. For me turning it off sorted it, even though this went against most of the advice I read in Raspberry Pi forums etc. Most of my experiences with Linux have been somewhere in between the two extreme examples I've just described.

Meanwhile Windows literally just works on every device I've tried it on. I can't remember the last time I saw any screen tearing of any kind on a non-Linux device. Whenever I'm out and about and see screen tearing I'm like "Ah it's running Linux" and chances are I'm right.

For what it's worth, I use Linux as the only OS on my ThinkPad and love it. I do acknowledge its flaws though, and graphics is an area with room for improvement.


The only time I've seen tearing in Windows is when I had no graphics drivers installed.

That really shouldn't be the baseline for an effective graphics driver...


> Weird list of non-problems and not understanding how Linux works

I would like to add this attitude to the list.


Each point he made is really weak too. It puts all the burden on the user compared to other mainstream operating systems.

> There's no concept of drivers in Linux ...

I'm not well versed in drivers, but this explanation makes it sound like hardware needs to target a specific linux driver to work as intended.

So instead of someone building their own driver against some api and including it with the product, they have to either design their hardware around linux (and maintain compatibility with windows/mac), or make a pull request for support and go through the normal development channels?

Is that really how that works? No wonder graphics + wifi cards have had so much trouble through the years...

> Your hardware generally either works or doesn't work. ... Linux-compatible hardware to purchase ...

I've never bought a device that didn't work with windows. Ever.

I understand what causes that, but for better or for worse, Windows wins for usability here. And I'm not entirely convinced no one can come up with a system to smooth this out better.

> ... The distros keep your software working with newer versions of Gtk or Qt, and deal with licensing. It's not something that end users need to worry about.

Until something breaks and I do have to worry about it. Then it's explicitly my problem to solve, with only google on my side to help.

> Also the obsession with "screen tearing" ...

It doesn't bother me either, but "it doesn't bother me so it's a non-problem" is so unsympathetic to the typical user.

Is no one besides techie programmers supposed to be using Linux? That's what it seems like most of the time.


> It puts all the burden on the user compared to other mainstream operating systems

When I asked for help with my sound (in Linux it's only 50% volume and sounds like it's coming from a cheap speaker inside of a tin can), I was blamed for:

- Using a laptop

- The brand of laptop

- The audio hardware of my laptop (it's their fault, not the kernel's "generic" driver's)

Then was told to either carry around an external DAC + Speakers or buy a desktop with a "proper" audio card.

It was the first time in 6 years I'd installed Linux and it lasted less than a week.

This happens all the time.


And even if a user has had the opposite experience:

"Oh, I had an issue and the community was so helpful, I got my issue resolved in a few days from going back and forth with nice people."

Some may disagree but I shouldn't have to enter communities of people to just get my computer to run. I want to put the flash drive in, wait an hour, and get onto what I actually wanted to do with a computer.

And it's sad, because I believe I share the same vision of people developing this software.

I want a world where every computer can have an OS that's free, completely open source, user friendly for every day computing, and empowers you to truly OWN the hunk of plastic in your hands.

It really seems linux is for no one but kernel hackers, software engineers, or to run on server hardware.


It's a bit like in sports. If you play ping pong, for example, there is a certain group of players that blame the paddle, the balls, the table, the shoes, the floor, etc.

There is only one thing they don't blame.


"Not fair, my controller was broken!"


> Is that really how that works? No wonder graphics + wifi cards have had so much trouble through the years...

No, it's not at all! Linux of course has the concept of drivers. In most cases the drivers are integrated into the Linux kernel codebase itself, and nowadays are in by far most cases contributed by the companies making the hardware (or contractors they pay to do so). In (much) rarer cases drivers are developed by the community. Overall this makes it quite a lot like for example Windows, where hardware vendors develop drivers and hand them off to Microsoft to distribute through Windows Update. Linux is roughly the same, with the drivers coming down through kernel (i.e. system) updates. A few drivers are maintained and distributed entirely separately by their vendors, but this is usually less convenient for the users in the end.

Within the kernel codebase, there are various frameworks catering toward device drivers of specific categories, so that a driver developer doesn't have to start from zero when implementing the driver for their device. Let's say a lot of storage drivers have to solve roughly the same problems, so of course it makes sense to share some code rather than reinvent wheels. This sort of code sharing is encouraged by the developer community, but not enforced.

For some categories of devices there are generic drivers, but those tend to reflect the existence of industry standards enabling their creation. For example USB Mass Storage and USB HID devices are standardized and don't need device-specific drivers (roughly speaking).

The bulk of the Linux kernel source code is drivers, and the bulk of those is contributed or paid for by hw companies.

In some cases kernel drivers also communicate with device-specific userspace components. For example when it comes to graphics drivers, in some cases only part of the driver resides in the kernel and some of the upper layers (e.g. the OpenGL API implementation) is organized through projects like Mesa, again with significant vendor participation.
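
To make the "drivers live in the kernel tree" model concrete, here's a minimal sketch of what every in-tree or out-of-tree module starts from (a toy, not a real driver - a real one would register with a subsystem framework such as PCI or USB instead of just logging):

  #include <linux/init.h>
  #include <linux/kernel.h>
  #include <linux/module.h>

  /* Toy module: just logs on load/unload. A real driver would register
   * with a bus/subsystem framework (PCI, USB, ...) here instead. */
  static int __init hello_init(void)
  {
          pr_info("hello: loaded\n");
          return 0;
  }

  static void __exit hello_exit(void)
  {
          pr_info("hello: unloaded\n");
  }

  module_init(hello_init);
  module_exit(hello_exit);
  MODULE_LICENSE("GPL");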

> Is no one besides techie programmers supposed to being using linux? That's what it seems like most of the time.

There's plenty of Linux desktop devs that would take screen tearing to be a very serious problem :).


Thank you so much for the well explained reply, I appreciate the effort :)

My assumptions are based on my own experience.

Under windows, if a device driver isn't tracked by microsoft, it's typically provided as an installer program from the company that made the product.

But I very rarely see that for linux, besides some passing recollections of nvidia or wifi card binary blobs.

Is there a reason for that? If I made some weird hardware that needed custom drivers, do I target the appropriate kernel APIs then get it submitted for the next kernel version?

And, to save you the breath, could you recommend any blog posts or articles that dig more into these things? I want to learn more.


> Is there a reason for that? If I made some weird hardware that needed custom drivers, do I target the appropriate kernel APIs then get it submitted for the next kernel version?

Yep, there's a few reasons for that:

- Broadly speaking, the Linux kernel has a generic framework for "modules" that can be loaded and unloaded at runtime. Modules can do many things, including providing device drivers. It's possible to develop and build a module separately from the kernel, and within the code of the module make use of the aforementioned targeted driver frameworks (a minimal build sketch follows this list).

- However, the Linux kernel intentionally does not provide a stable API or ABI for modules. And the driver frameworks are internal unstable API as well (unlike the kernel userspace API, which is never allowed to be broken). That does not mean it is impossible to develop and release a driver independently from the kernel - in practice, a fair amount of the API/ABI changes rarely - but it makes it much less practical and convenient than contributing the driver to the kernel source tree and developing it in lockstep.

- There's a wide range of (often heated) arguments around the issue of whether there should be a stable API and ABI for (driver) modules. For now the result of a debate spanning decades is I would say a loose agreement that getting hw vendors to contribute their drivers upstream has netted various benefits, such as the ability to maintain/update the driver when vendors cease to exist, get vendors to contribute to shared frameworks, make the drivers do things the vendor never planned for (e.g. expanding support to a new CPU architecture!), allow the kernel to develop faster (want to make a fundamental change somewhere? port all the driver users in one swoop and move things forward), etc.

- Overall this has lead to an ecosystem where this is now business as usual for I'd say the majority of hw vendors, who will make "contribute the driver to Linux before we release the hw" simply part of their dev and business routine.

- Some vendors still stubbornly refuse to (or have historical legal reasons they can't) contribute drivers however and have opted to try and maintain them separately. For those cases there exist some mitigating hacks. For example a little open source glue module that sits between the kernel and a proprietary binary-only driver, and distro frameworks that will recompile the glue module at boot time to fit an updated kernel (this gets around the ABI issue, but also some hairy legal license incompatibilities). Nobody loves these, but they exist and in some cases work OK where the vendor is very active in shouldering their maintenance burden.
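
For a taste of the out-of-tree workflow mentioned above, building and loading a module against the running kernel is a couple of commands (assuming the distro's kernel headers package is installed, and using the toy hello.c sketched earlier in the thread):

  echo 'obj-m += hello.o' > Makefile            # one-line kbuild Makefile
  make -C /lib/modules/$(uname -r)/build M=$PWD modules
  sudo insmod hello.ko && dmesg | tail -n 1     # should log "hello: loaded"
  sudo rmmod hello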

A few decades in, I honestly find it hard to say which approach is better for which device. For example, you could say if the OS has a stable driver API/ABI, it means the vendor can cease to exist and their old driver release for my obscure hw will continue to work. This is true, but it also means you can never fix a bug in the driver, and the driver may no longer work with a major new version of the OS (say, Win 95 to Win XP, or Win XP to Vista) or in a computer with the same ports but a different CPU arch (say, x86 to ARM). In many ways the Linux kernel is the biggest actively maintained collection of "runs old obscure hardware" drivers around today, and there are plenty of gaps in "every hw works with Windows" that are filled by it.

Practical example: Recently I wanted to digitize some of my folks' old Hi8 family video tapes. My mother's boyfriend used to do this himself with an old EyeTV video capture stick they bought for their Mac circa OS X 10.3. Its drivers ceased to work around OS X 10.6 and EyeTV never provided support for their old product for newer versions of the OS, because they moved on to newer products. The fix? Capture from Linux using the same HW. Why does Linux have a driver for it? Because all these branded capture sticks map back to a just a few popular chips built into these products, and Linux has a well-maintained driver for the entire family, while the MacOS ones are vendor-specific.

And it runs deep: The entire mythical origin story of the free software movement began with a guy at MIT being upset about a bug in the driver for their expensive laser printer and the vendor refusing to release the source code and holding the device commercially hostage. This motivated the project that gave us modern open source licensing (GNU and the GPL, the latter used by Linux) and changed the entire software industry!


Shipping drivers with the kernel also has some big disadvantages, like the need to run a bleeding edge kernel version if you want support for newer devices.

I had to wait a few months before Ubuntu moved to a kernel version with the support for my motherboard's network adapter.


Yep, definitely. Potential lag time on newly-released hardware can be a major issue if the device vendor doesn't upstream the driver early enough, and/or if the system vendor doesn't provide kernel updates in a timely manner. The optionally more direct channel from device vendor to HW customer can be a big advantage then, as can the (maybe?) faster turnaround times of the Windows Update channel (these days, Windows Update seems to do a really good job of addressing the annoying driver hunting and manual updating Windows was plagued with in the past).

I've been hit by this as well, when I bought an nVidia card in its launch month years ago and had to fiddle before the drivers became widely available. That said, nVidia is by far the most prominent example for swimming against the tide of the general rhythm of the Linux world and the other GPU vendors do a better job with open sourcing and early upstreaming.


Yeah seriously. Been dealing with this attitude from the Linux Desktop community for literal decades now.


It will never be the year of the Linux desktop if they can't put themselves in the shoes of a non-Linux hacker.


> It will never be the year of the Linux desktop if they can't put themselves in the shoes of a non-Linux hacker.

My honest impression is that the developers of the software are often quite in tune with the needs and pains of the users, and the sort of "your problem is not a problem" responses lacking empathy you see are largely from other users. This isn't universally true, but I would say it holds broadly. And even in cases where it isn't, that one uncaring dev is often not representative of the project as a whole.

What I mean is - I think it's valid to call this a frequent failing of the broader Linux community, with users included. It bothers me as well. But the ones who do the most to make the software are often the ones who do care the most. And for them it's also quite a bad experience when generalizations go the other way.


> My honest impression is that the developers of the software are often quite in tune with the needs and pains of the users, and the sort of "your problem is not a problem" responses lacking empathy you see are largely from other users.

Perhaps you have never perused the GNOME issue tracker then.


It's like "Works on my system" manifested as a person.


> The distros keep your software working with newer versions of Gtk or Qt

This doesn't help obscure software (like BambooTracker or vgmtrans) which isn't shipped by most distributions.

Additionally a distribution isn't going to port a Qt 5 app to Qt 6 for you; if the original developer doesn't, someone else might do it a few years later, or never. Sure, a distribution might ship Qt 5 indefinitely, but it's going to slowly fall behind over the next few years, like what's been happening to Qt 4.

Worse yet is software dependent on the KF5 KDE support libraries. Ironically, old creaky-looking software like Strawberry (based off Clementine, based off KDE 3's Amarok) and Qsynth/QjackCtl (dated look and feel) are the first ones to port to Qt 6 (before KDE applications), because they don't depend on KDE libraries which need to be ported first.


The desktop application ecosystem is kind of a mess from what I've experienced; it's also possible I have the wrong expectations, as I'm not usually building desktop apps. GNOME and Qt libraries generally have to be installed in order for apps that use them to work. I found that when I was trying to make an app, this ended up getting in my way far more than anticipated.

There's a whole new class of electron-like (usually WebView oriented) frameworks appearing for languages now that make building desktop apps fairly straightforward. This is kinda nice because the framework then deals with environment integration (for instance X11 vs Wayland, or Windows and MacOS) -- but at a minimum it makes it so you can build native feeling apps that are entirely web frontends. The only limitation here is that I've found UI components for OS-native look and feel (as opposed to mobile) to be fairly limited in quantity.

At the end of the day, I guess what I'm alluding to is: maybe Qt and GNOME are great, but I feel like a lot of applications outside of serving a particular OS or ecosystem will need to build with UIs meant to run on multiple machines with minimal dependencies.


> There's a whole new class of electron-like (usually WebView oriented) frameworks appearing for languages now that make building desktop apps fairly straightforward.

It was already fairly straightforward. Now it's just understandable to people who are mostly familiar with web-centric development, which is a different claim entirely.

Qt is not an ecosystem; KDE is (like GNOME). Qt (like GTK, at least historically) cannot be an ecosystem because it has to be cross-platform.

You can build GUI apps in Qt or in other GUI toolkits and they will just work. They do not need integration into a desktop environment (though it can help in some cases if they are).

"OS-native look and feel" is overrated. There are numerous apps from Apple, particularly in the media creation sector, that do not conform to their own "native look and feel", and several of the most successful apps in this sector from 3rd parties also ignore it. Maybe for "office/productivity" apps it is important...


"OS-native look and feel" - +1 on being overrated. In most cases, I use programs, sometimes with diverging methods for doing common things, but it's not a deal-breaker. If you are not familiar with how a complex piece of software works and you need to use it, you have a bigger problem than if the UI seems a little different than other applications on the platform or if a few keybindings are different than expected.

We work in the programs that we use primarily. Browsers are one such example. I wonder if they break OS UI guidelines; websites themselves certainly do. The only issue is if the look and feel is good enough on desktop applications. KDE makes things look consistent even with non-KDE/Qt programs, so I am happy enough.


On Windows, Electron-based apps generally bundle Electron (Chromium and Node.js) with the app, resulting in a large download size. You can do that with Windows Qt apps as well, and it generally works out too (the app is larger than a Win32 app but smaller than an Electron app). I have less experience with GTK but I assume it works the same.

Linux distributions expect Qt and GTK apps to use a system toolkit. Some like Arch try to package Electron apps the same way (using a system Electron). However Electron releases incompatible major versions often, and you often have to install multiple Electron versions in parallel for different apps, and which version changes over time, so I have to juggle installing/uninstalling "electron" and "electron12/13/..." based on when apps update. (Admittedly, I also have Qt5/6 installed and GTK3/4 installed, but I expect these version numbers to remain stable for the next 5+ years.)


Question: If we want to write a GUI app and have the binary continue to work on anything Linux just about forever, how far does static linking get you?

In most cases disk is not the resource most in need of optimizing. Granted, you don't get bugfixes in your libraries, but you made the call that they were working well enough - and if security is at issue, old binaries probably aren't a good answer ever?

Statically linking to everything including libc should be forwards compatible with any kernel upgrade - "don't break userspace".
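
For the libc part at least, this is easy to try - a sketch assuming the musl-gcc wrapper is installed and some app.c to compile (glibc's -static has long-standing caveats around NSS/DNS):

  musl-gcc -static -O2 -o app app.c
  file app   # should report "statically linked"
  ldd app    # should report "not a dynamic executable"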

Your app isn't the X server. Do X upgrades break it? I imagine a distro switch to Wayland would? Or do they keep an X ABI backwards compatible? Anything else that could change under you? Other servers than X? Anything else I'm missing?

Anyone else remember being told that the answer was Java? What happened to that?


I can't speak to the long-term limitations of static linking. I will say that the JVM is a user-side hurdle which is undesirable. I usually use Rust or Go for cross compilation.


Meanwhile the UI framework situation on Windows became much worse than on Linux. Windows 2000 had comctl, but it was consistent with user32; on XP they became different; Vista added ribbons; then came WPF, UWP, Xamarin, Electron, Metro, immersive UI; now Windows 11 has added something new that looks like the tenth UI framework.


> I watch videos often in X11 which is supposed to be impossible because of this "tearing" thing, but somehow it doesn't bother me at all.

It doesn't bother you or you don't have tearing?


Like OP, I never have tearing in Xorg. I also will never own Nvidia. The issue isn't X11 which is perfectly happy to run in any environment including Wayland or GDI. It's crappy drivers that are handled by the B-team devs.


I think the GUI applications thing is another of those times where it's a tradeoff. Much like the way drivers work, it's fair to be annoyed with it, but it replaces (more or less equally) annoyances in Windows's opposite approach.

Yes, your usual desktop Linux distro doesn't expect you to be running a GTK application that was built twenty years ago. If you do that, you probably need to do some manual steps. (That's why it installs to ~2 GB). On the other hand, GNOME uses one version of one UI toolkit across all of its core applications. If you click through the Settings application, you will not - at any point - come across a control panel from 1995. Meanwhile, want to make a native app for GNOME? Great, you have one option[1]. The lack of screwing around with backwards compatibility - freely dropping stuff when it isn't useful anymore - makes for a thinner, tidier stack in a lot of ways.

It's better if we can have both - obviously it sucks knowing you'll have to "fix" something eventually because Lennart invented a newer, better duotronic transporter buffer - and maybe stuff like flatpak will be able to help with that in the future, but I for one am grateful desktop Linux isn't forced to drag around a mountain of shims and forgotten libraries for a minority of users.

[1] https://developer.gnome.org/


> which is the same as any other OS

Apple's value story is that you can get a machine guaranteed to run MacOS by walking into a store and buying their hardware, no Internet search necessary (unless you need directions to the store).

There are a couple companies that do that for the Linux ecosystem, and I suspect it would make Linux easier to use if more were out there (but I don't know if the market would support them).


> There are a couple companies that do that for the Linux ecosystem, and I suspect it would make Linux easier to use if more were out there (but I don't know if the market would support them).

Even then those Linux systems are only supported with whatever distro the vendor ships. If you try to run a different distro that has license restrictions for closed binary blobs that "Linux machine" still may not work right.


Yeah, and that might be as much a marketing problem as a technology problem. We may need to be considering another major category of desktop / laptop OS experience besides Windows, MacOSX, Chromebook, and "Linux (pick your distro)": "Linux (vendor)." Like Linux (pick your distro) but more likely to be working out-of-the-box and more likely to break under exotic manipulations than Linux (pick your distro).

I have a great ThinkPad X1 on which I just tried to swap the window manager it shipped with for xmonad. Everything broke. It's not particularly intuitive that the list of things depending on the window manager config involves wifi, audio control, Bluetooth control, power management (the machine no longer sleeps when the screen is shut), screen auto-configuration when a monitor is plugged in or unplugged, and what happens when you plug a USB drive in... But they do.

From an adoption standpoint, there might be meat on the bones of the messaging "Linux on the desktop is great if you buy from a vendor; just don't assume that any old hack you do to the default install is going to just work." Then widen the set of "blessed" hacks that do work from that base template of well-verified default configuration.


Screen tearing is like keming - once you see it, you see it everywhere.


At least that one is possible to fix, with Intel graphics drivers anyway. I get a little obsessed when I see deviations like this on the screen. That's one example, but there are also burnt-in pixels or pressure spots on the screen that can drive me nuts.

Here's hoping for distros to make that solution the default.


Wayland fixed screen tearing. There's also the TearFree setting for Xorg.
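
For reference, with the intel DDX driver the latter is a one-option snippet, conventionally dropped into something like /etc/X11/xorg.conf.d/20-intel.conf (file name and section contents are the usual Arch-wiki-style example; adapt the Driver line to your hardware):

  Section "Device"
    Identifier "Intel Graphics"
    Driver     "intel"
    Option     "TearFree" "true"
  EndSection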


No video tearing in years. Sure, I can haul an old laptop out of the basement. Qt 5.6 is holding me back a bit. That is about money, though, and not Linux. It is easier to absorb license fees in commercial software, sure. Need to read what the KDE people are doing.


Have you read the Security category?


> you'll need to do a few internet searches first, which is the same as any other OS.

For everything I can buy in a store I can be sure there's a Windows driver/userspace software, and for everything above a couple hundred dollars I can be reasonably sure there's an OS X solution (although that one seriously changed in the last years with the deprecation of 32-bit in 10.15 and the shitshow that is 11+).

For Linux? Oh jesus. That requires dedicated research and, worst case, manually compiling stuff from some random github repository where someone hacked a solution.


Regarding third-party drivers, especially from random Chinese vendors: I like to treat them like they don't exist. Mainline drivers tend to be higher quality, and they are more or less guaranteed to keep working indefinitely. The only exceptions I currently have are an nVidia GPU in a secondary computer (GPU compute box) and a Brother printer. Both have a long history of working well, and have packages, not tarballs or Git repositories that need compilation. The Brother drivers are also user-space only.

So, IMO, if you are doing it right, it's more a matter of research and less a matter of compiling your drivers from an obscure repository.


So, how does one do it right? What are THE right steps to cross-check what's available from a local vendor (to make eventual warranty claims easier) against Linux compatibility?

I have actually been using the Linux desktop for 20 years and it still feels like groping in the dark every time I buy some peripheral.

Plus I can confirm the regressions mentioned in the article caused by relentless churn in the kernel are very real. Oh, and userspace churn too. I have a whole system from the gcc 4.9/Qt 4 era saved to chroot into whenever I need some "old" software. For a lay user such hoops are impossible.


> I have a whole system from the gcc 4.9/Qt 4 era saved to chroot into whenever I need some "old" software. For a lay user such hoops are impossible.

A simple "docker run -it -v /:/mnt debian/eol:$CODENAME bash" should give you a working Debian chroot without any hassle - you can go as far back as Debian potato from 2000 per https://hub.docker.com/r/debian/eol/. In theory according to a SO post (https://askubuntu.com/a/1161647) you should even be able to get GUI applications working inside a Docker container.
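For the GUI case, the usual trick from that answer is sharing the host's X11 socket with the container. A rough sketch (the image tag is an example, you may also need "xhost +local:" on the host, and note this weakens X11 isolation):

  docker run -it \
    -e DISPLAY="$DISPLAY" \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    debian/eol:jessie bash
  # inside the container: apt-get update && apt-get install -y x11-apps && xeyes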


You're missing the point, confirming the generally dismissive or even ignorant attitude. What if there's no deb package? How do you make the home directory accessible?

The point was, every time the Gtk/Qt major version is updated, the whole desktop gets obsolete. Perfectly fine applications must be reworked, abandoned, or run through chroot/Docker mucking. Throw in the transitions to systemd and Wayland.


> You're missing the point, confirming the generally dismissive or even ignorant attitude.

I get the issue if you're still dealing with chroots. I hated this with a passion, too - but using Docker is a real breeze in comparison.

> What if there's no deb package?

A Debian archive docker image behaves just like an ordinary Debian install would - they are fancy chroots. Install the dependencies, run the usual "./configure && make && make install" dance, that's it.

> How do you make the home directory accessible?

Add a "-v /home:/home" in the invocation.
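Putting it together, a hypothetical full session for building an old Qt 4 app could look like this (package names and paths are illustrative):

  docker run -it -v /home:/home debian/eol:jessie bash
  # then, inside the container:
  apt-get update
  apt-get install -y build-essential libqt4-dev
  cd /home/alice/src/old-qt4-app
  ./configure && make && make install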

> The point was, every time the Gtk/Qt major version is updated, the whole desktop gets obsolete. Perfectly fine applications must be reworked, abandoned, or run through chroot/Docker mucking. Throw in the transitions to systemd and Wayland.

Agree with you on that one, with one exception... systemd unit files are so much easier to deal with than init.d shell scripts.
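For comparison, a complete service definition can be this short. A minimal sketch for a hypothetical "myapp" daemon:

  # /etc/systemd/system/myapp.service
  [Unit]
  Description=My example daemon
  After=network.target

  [Service]
  ExecStart=/usr/local/bin/myapp --foreground
  Restart=on-failure

  [Install]
  WantedBy=multi-user.target

Enable it with "systemctl enable --now myapp.service"; the init.d equivalent is pages of boilerplate.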


I don't trust systemd and docker, sorry. No idea if it's my bad luck or what, every time I tried out such a complex thing with "easy" interface, I ran into some insidious bug.


I recently got a TI-99/4A and ran a program that did a pretty poor job of waiting for vblank to do sprite position updates. Result: Horrible, horrible tearing.

You "don't have a problem with" tearing because tearing has always been with us and some people are just used to it. But other people are bothered by it, and it doesn't have to be this way. Notably, one of the goals of Wayland is to solve tearing where X hasn't.


> Your hardware generally either works or doesn't work. If you are looking for Linux-compatible hardware to purchase, you'll need to do a few internet searches first, which is the same as any other OS.

I'm sorry, but this simply isn't true. I've seen this specious argument for almost 30 years now. It was not true in 1994, and it is not true in 2021. The fact that this is still making the rounds, is perhaps the most damning indictment of desktop Linux.

When making a hardware purchase for Windows or Mac, pretty much everything is supported right out of the box. I pick up a webcam, it works. I pick out a printer, it works. I don't have to worry the hardware is supported on Windows or Mac. It is, right out of the box. It's a very rare occurrence if I have to download anything to support it, and usually then it turns out to be needless OEM crapware.

When it comes to Linux it's a crapshoot. Hardware descriptions often don't say whether it's supported. So you have to start googling around to see if it's supported, and if you're lucky, you'll get an answer, but often you don't. Instead, you just find people asking if it is supported, or maybe talking about how it's kind of broken. A lot of this has to do with the fact that kernel support is based on chipsets -- which makes sense from an engineering perspective -- but makes zero sense from a consumer perspective. This information isn't easily available.

Purchasing desktop hardware for Linux is like walking through a minefield, and it's always been like that. You'll never get everything to work all at the same time, so you'll just have to satisfy yourself with whichever subset of features you want supported; but of course, you won't actually know which features are broken until you actually start using it, because you'll never list and test all your cases before purchase.

While not stated in the above post, the lack of hardware support is often followed up with either a hopeful "It's open source! It will get better!" or a dismissive "You've got the source. Fix it yourself."

That's a lie. Even as a software engineer, I don't have the time or knowledge to develop my own driver for some sound card, or to figure out why everything coming out of my printer has a pink hue. I don't even know where to begin to debug that, and that's assuming I have the time to do it. If an SWE can't do it, there's zero hope for anyone else to fix their own problems.

So what about the developer community? Surely they'll fix it! In my experience, the answer is probably not. I've never seen partially supported hardware go to full support. Either the developer moves on because it's "good enough" for him (e.g. see your comment about how screen tearing "doesn't bother me at all"), or some newer version comes out, and everything stops to (partially) support the new one.

"Linux on the desktop" sounded great 20 years ago, and I really wanted it on my desktop. But then I realized that it's always tinkering. Desktop linux is like having a broken down car for your daily driver. What I wanted wasn't "Linux", but rather a unix for my desktop.

Then MacOSX came out. An honest-to-god unix, but with hardware support. Once I made the jump, I haven't looked back. Now the only Linux devices I own are a Raspberry Pi and an underused NAS. I'm cool with that.


Plus a million.

As for MacOS, yeah, it's truly great, but the hardware is quite expensive and limited to what Apple sells you. And then Apple deprecates older devices all the time, and you're left with an OS that cannot run any modern apps, including the web browser, which makes using those machines a major threat to your security.


Until you try to compile your Qt-4 software and… dang!


> which is the same as any other OS.

Have you ever used Windows?


The author has monetised all outgoing links by showing an ad before redirecting. They are making money off this publicity, nothing more.


There are barely any economic incentives to make the Linux desktop usable. I can't name anyone whose income depends, to any significant degree, on people being happy with a Linux desktop. It's a miracle it works as well as it does, in my view.

Red Hat, SUSE and Canonical make some money off of Linux and sell desktop operating systems, but their customers are the enterprise (their users have to use what they get) and their primary revenue stream is around servers anyway. They finance part of the development of Gnome or KDE, but that's not a lot of money. The Gnome Foundation's total income last year was under $1 million, KDE e.V.'s was under €250k. These are by far the largest and most professional Linux desktop projects.

One tiny exception is Elementary OS, which makes some donation money directly from its community to fund a usable desktop experience. There you see the difference: it's a tiny project but is being well received for its core experience. Sadly it's too small to have a significant impact on the wider Linux ecosystem.

If this situation doesn't change, the free desktop will always be viewed as a hobby project for people scratching their own itch. Many like it that way. But vendors will find it hard to take the Linux desktop seriously, and this limits adoption from a wider audience too.

The Linux desktop has no shortage of opinions, but a big shortage of money.


First, there are a lot of developers scratching their own itch, as you say. Second, there are probably more paid developers doing desktop work than you think. They work for Red Hat, Suse, Valve (not necessarily directly employed by them), AMD (excellent drivers these days), Blue Systems (a philanthropic organization supporting mostly KDE), the Qt Company, etc.

If you add up all the paid developers working mostly for the benefit of desktop Linux, and "partial" paid developers whose work somehow benefits desktop Linux, it's probably several hundred at least.


A lot of distros accept donations, how is Elementary different? Zorin even has a "paid version" which people just torrent anyways.


This makes me glad I switched from being a Linux desktop enthusiast over a decade ago. I'm still a Linux server enthusiast, and I greatly appreciate all the hard work volunteers put into Linux desktop software, but the hassle and lack of polish keeps me on the Mac for desktops. It was fun for a while, Linux desktop; I wish you the best, but I've moved on.


Linux on the desktop was great when I was in high school and college and had tons of free time to investigate, debug, and understand things and very little money to spend on solutions. I learned a lot.

Now I have far less time and far more money. I use Windows. It just works.


I don't run desktop Linux anymore because I got a substantial discount on a Macbook about a year ago and my current laptop was falling apart, but I really don't feel like this has been an issue in the last ~5 years (at least for me). Before last year, I had run Linux for nearly ten years, and the last five of which I would typically be happy enough if I just flashed Ubuntu to a flash drive, plugged into my laptop, installed, and moved on with my life. Generally, Linux would just work; I would get my games through Steam, basically everything else through the Ubuntu store, and things generally worked fine out of the box.

I will say that prior to ~2016, it was a completely different story. Whenever I had to reinstall Linux (which was frequent), I could expect to spend several hours hunting down drivers and figuring out how to finagle GRUB or DKMS to work, but I largely don't feel like this is an issue anymore, at least not with the "mainstream" distros like Ubuntu or Suse.


Same here. That's why I'm waiting for some hopefully in-depth reviews/benchmarks of the M1 Pro and Max. Windows is usually a second-rate citizen when it comes to dev tools and usually gets the cool stuff last; Mac usually gets first-class-citizen treatment.


This might change in the near future, with WSL2/WSLg.

Additionally, unless ARM starts replacing x86 on servers, WSL2/x86 is a lot closer to the production setup than either emulation on M1 or developing for the ARM platform while targeting x86 in production.


Exactly the same. I just hope they don't fill Windows with ads.


I've used Linux daily for the last ~4 years and I would not say macOS is better. Every OS has issues, but with Linux I understand what's going on and can configure it to my liking.


Cool! I definitely learned a lot about Unix and how OS’s work in general because of Linux, and I really appreciate that. What I also learned is I don’t care about configuring my desktop and rather spend time hacking on Web Application code.


"What I also learned is I don’t care about configuring my desktop and rather spend time hacking on Web Application code."

This is fair, and it's a selling point for Windows and OSX. I also dislike having to tinker more than necessary. As long as you're on the happy path, those work pretty well for most people. But if you're not on the happy path, it can get real painful real fast.

With Linux I'm able to set things like audio (pipewire), foreign language inputs (fcitx), etc. exactly how I want them - and that means no desktop environment at all aside from i3. I had a MacBook for a while and ended up getting frustrated in certain situations due to the lack of freedom to configure things exactly how I want, even if other parts of the OS were more polished.


Congratulations on getting your ports back!


What's really crazy is that Apple can screw up several times in a row (abrupt removal of all ports; Touch Bar; flawed keyboard design), even such that the screwups overlap and are occurring all at once, and... there's not really anyone else to go to. Their products are basically their own category—everyone else is that far behind on overall UX, capabilities, and average (maybe not worst-case) experience—so your choices are "just live with the bad parts and hope they fix it next time" or "slice off your nose".

It really sucks. I'd love for them to have anyone nipping at their heels.

(Windows user from '94-present [gaming and other non-serious uses, for most of that time], mainly a desktop Linux user from 2000-2011 and I still try it out every couple years, FWIW—alternated between nodding along and going "holy crap, still?!" for all of TFA)


Maybe in regard to their new APUs, but for everything else I think PCs are just as great.

There have also been some really nice Windows laptops in recent years. I got a business HP laptop (I think it was the ProBook series) for work when I started 2 years ago. It was awesome: great trackpad, keyboard, battery life, speed, etc.

Had to move over to a MacBook Pro 16" due to some technical setup in the workplace. It's great too, but not really ahead on anything. Lower battery life, more heat, slower. Also Intel, though. Lots of issues with connecting non-Apple displays.

Also heard the Dell XPS should be great.

I just think many people think MacBook good, PC bad, because they ARE willing to spend $2500 on a Mac, but on their last PC they spent $1000, and then they compare.

But I concede, currently Apple's APUs have no match.


> I just think many people think MacBook good, PC bad, because they ARE willing to spend $2500 on a Mac, but on their last PC they spent $1000, and then they compare.

FWIW we've never spent more than ~$1300 on a personal MacBook (older Pro, when the lower-end models were much cheaper, and a recent Air; my wife's owned two Airs). Aside from one case when a workplace provided me a much higher-end unit than I would have requested—I'd really rather not have a discrete graphics card, thanks, as they're the cause of a solid 50% of all problems with laptops, both Mac and PC, in my experience—I don't think a company's ever spent more than ~$1700 on a work MacBook for me.


> Lot of issues with connecting non-Apple displays.

I have the 16" Macbook pro as well, and haven't ever had any issues connecting to an HDMI monitor/TV with just a cheapy USB-C dongle I bought off Amazon. What kind of issues are you having?


The place I worked when the all-USB-C Macs rolled out, I do recall a lot of "that monitor doesn't work no matter what", and "that one works but only with this cable and dongle, swap either and it won't work", and "that one works but only with the DisplayPort dongle, not HDMI", and "that one works but it'll blank about once an hour and you'll have to unplug it and plug it back in", etc.

I hope that's been ironed out by now, but don't really know (I WFH now and have for years, and my monitor's fine, but that's just one monitor)


We have issues with our dual HP screen setup going completely black for 5-10 seconds, randomly, 3-5 times throughout a workday.

When it happens the screen setup/orientation often reset, so left becomes right. We've tried multiple different cables, cheap ones, short ones, expensive ones, Thunderbolt 3 etc. etc. Console does log an error when it happens, but we don't have the expertise to look into it. And really who to contact? HP would point fingers at Apple and vice versa, or Apple at our corporate setup. Doesn't seem to happen with only 1 screen attached.
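For anyone more capable who hits the same thing: those errors should be retrievable from the macOS unified log from a terminal. A sketch, where the time window and filter term are guesses to adjust:

  log show --last 10m --predicate 'eventMessage CONTAINS[c] "display"'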

For headset connection we use an off-brand adapter where we plug in the wireless USB-A dongle, but you sound like a robot. Probably the adapter's fault, but same issue: who do you contact? We found that a particular brand of very cheap, small USB-C-to-A adapter works. I am using Bluetooth to connect my headset, but it's very spotty, requiring me to reset the Bluetooth module once a week to get a connection. So why am I using Bluetooth, you might ask, and not the dongle? Well, if I use the dongle it interferes with my personal Logitech mouse/keyboard setup, so that begins to lag.

I know the solution is to use AirPods, an Apple keyboard and mouse/trackpad, as well as Apple screens and adapters... but that's expensive and shouldn't be necessary.


Sometimes I feel like I’m the only one that never uses any kind of port for anything but charging. But I’m glad people are glad!


And a 120 Hz screen.


> I’m still a Linux server enthusiast

Ditto for about two decades, but I'm working on becoming a FreeBSD server enthusiast instead. Between being the "native" platform for ZFS, which is a huge deal all on its own, and my disliking a whole bunch of things about the direction of Linux (I am pretty sure Red Hat is pushing almost all the new stuff I dislike, in that they're the de-facto owners of the projects and are having great political success pushing them on everyone else) I think it's time to start surveying the broader field of free server unices.

Still working on getting used to Jails over Docker + LXC. Docker's so damn convenient, even if you just treat it as a cross-distro isolated-install package manager with a huge and up-to-date package list, and don't leverage anything else it does.


Ah, funny. The OS X launch drove me to Linux/BSD. Classic Mac OS I still boot once a year; sometimes I boot into NetBSD. And I was a Mac user who would complain about the idiocy of Windows in the '90s. Typing this on Linux mobile.


What is the "this" you're referring to?

The article is really a rather random list of issues that different demographics might or will encounter. Is there any specific issue on there that made you glad you switched?


Well, once I bought a laptop with integrated Intel graphics because it was supposed to have good Linux support. Suspend/wake didn't work, and 3D didn't work. I spent months on a mailing list trying to figure out the problem, and I ended up being the one merging the fix for Ubuntu; you can probably still find that on launchpad.net.

I get it, that was over ten years ago, and maybe things are better, maybe not according to this guy. I’m just tired of this kind of finickiness and would rather focus on hacking on server code.


I once bought a laptop with a trackpad and it came with Windows ME. The trackpad kept dropping. It was a problem with the driver crashing.

I once got a macbook from work, and the mechanical stress from carrying it in my backpack broke something internally. Started randomly rebooting. I had to bring it in, b/c Apple would not allow me to swap the broken piece myself. Got it back with everything wiped and all my files gone.

This is why Windows will never work for normal people and will never become popular. This is why Apple won't work for any common use like a backpack and will never become useful /s

But first: thanks a lot for spending the time to make a patch!

Yet, my point is: sure, that is valid criticism. Stuff should Just Work. But it also is kind of anecdotal and a little random to write off the entire desktop experience of Linux based on a piece of unsupported hardware 10 years ago. We all have experienced bad hardware support everywhere, with every OS, with every ecosystem, haven't we? Hell, even one of my ThinkPads did weird stuff once.


Sure, maybe we should gather a random group of people, give each of them a laptop with Linux and Windows, and tell them to do some common tasks on both of them. Let's see which one they prefer, and have less issues with.

Common tasks include:

- Zoom/Google Meet meetings including screen sharing

- Playing music on bluetooth headphones

- Watching movies on Youtube/Netflix

- Playing games (with official support for both platforms)

- Using external monitor with different DPI


It’s not just a one off though, I’ve used windows everyday, at home or at work since windows existed, I also used Linux on the desktop full time for about eight years, it wasn’t a couple months of me trying something and not liking it. Now I’ve been using windows and Mac OS on desktops at home and at work for the past 11ish years —- while still working with Linux and windows servers everyday for the past 20 years. I’m pretty familiar with the strengths and weaknesses of all of these systems, and I agree, if you value configurability a lot, Linux can be a nice desktop —- but I’m not wrong for not valuing that. What I want is a desktop experience with fewer rough edges than what I’ve gotten from Linux so I can just write my app code and not worry about drivers or whatever. I think you’re being pretty silly if you’re trying to tell me Mac OS hardware support on macs has a comparable number of rough edges to Linux on random x hardware — because no I haven’t had Mac hardware incompatibility issues with Mac OS, that’s almost the whole point of what they do, and the knock on effect is their limited hardware options— but that’s a trade off that works for me, and it works for a lot of other people.


> But it also is kind of anecdotal and a little random to write off the entire desktop experience of Linux based on a piece of unsupported hardware 10 years ago.

This is a fair point but I think anyone who has used Linux for any length of time knows it isn't just one issue. It's a whole crowd of them. Windows and Mac have issues too, but far fewer.

I assume that he has experienced more issues than that and that was just the worst one.


I understand some of these are problems for some users on some hardware for some use cases, I really do. But that doesn't change the fact that I'm not particularly a Linux specialist, but have happily and without trouble used Linux as my primary desktop OS since 2007. It's gotten easier every year, with a little hiccup when Gnome got too useless and I had to learn something new. I've used it on old and new hardware, high- and low-spec machines, and have had no problems. I guess I don't get what expectation it's not meeting at this point.


As a dev involved with some of the cited technologies, or using them for commercial products throughout my career (desktops, phones, smart TVs, cars, gaming gear): this is a low-quality, badly researched piece. The errors and conceptual misunderstandings in it are so numerous and overwhelming that it is difficult to respond with anything other than an exhaustive blow-by-blow refutation. The incorrect claims are not limited to Linux, either; a lot of the technical info about Windows/MacOS/Android is equally incorrect.

That said: It makes me yearn for what a well-researched version of the same piece would look like and what it might accomplish as a signpost :-). A lot of this "what are the major problems, and what are we doing to solve them" information is currently rather fragmented and distributed over presentations at Akademy/GUADEC/XDC. The closest to a decent aggregation I can think of would perhaps be Christian Schaller's updates on what the desktop team has in store for new Fedora versions, but even that is selective by nature of the product.


> The errors and conceptual misunderstandings in it are so numerous and overwhelming that it is difficult to respond with anything other than an exhaustive blow-by-blow refutation.

Normally, people who spew things like this don't really have anything factual to say, in my experience.

The author has extensively edited the article over the past decade and addressed all the valid criticism.


I'm not familiar with Linux that much so I'm curious, do you have a couple examples of errors in the piece?


In general, the article has an odd mix of "this sucks about computers in general right now" and "this sucks about Linux specifically and with better alternatives available" points, and I think it is worse for not picking a consistent direction. It feels a bit like a collection of claims found on forum threads over the years, by authors with differing levels of expertise and at different points in the timeline. For example this point:

> Traditional Linux/Unix (ext4/ reiser/ xfs/ jfs/ btrfs/etc.) filesystems can be problematic when being used on mass media storage.

- This is not equally true for all of the file systems named and for all of mass storage media. For quite a few of these combinations, Linux actually has the best solution available. Some of these file systems are also very niche and a general user will never even encounter them. Standard systems default to the most generally useful ones.

- I can take a guess that this is probably thinking of things like inadequate TRIM support, in which case it may be outdated (see the sketch after this list), and it's also no different from the highly complex, often device-vendor-specific scenario trees on the competing systems.

- In the combinations where it is true, it is generally also true for broadly used competing filesystems like NTFS - and for the same reasons (lack of wear leveling, etc.).
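On the TRIM point specifically: on any modern distro with systemd, periodic TRIM is a one-liner, which is part of why I suspect that claim is outdated:

  # run fstrim (weekly by default) on all mounted filesystems that support it
  sudo systemctl enable --now fstrim.timer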

In general, the points that focus on general user experience more broadly and are not Linux-specific tend toward being the better ones in the piece. The more technical and Linux-specific it gets, the worse:

> Wayland doesn't provide multiple APIs for various crucial desktop features (global keyboard shortcuts, drag-n-drop, systray, screenshotting, screencasting, UI rendering acceleration, and others) which must be implemented by the window manager/compositor, which means multiple applications using the said features can only support their own Wayland compositor implementations.

It's a conceptual mistake to assume that Wayland (an IPC tech stack and protocol to regulate the exchange of pixel buffers between processes and arbitrate shared access to I/O hardware with event routing, plus facilities to add further protocols) should be responsible for providing all of the listed features. The equivalent features in other systems are usually provided by separate APIs as well.

It's also mistaken to assume that just because Wayland does not provide a feature, a standard does not exist that fills the role, making it a non-problem. In Linux desktop architecture Wayland intentionally specializes in the functions listed above, while many other IPC problems are addressed through D-Bus and standardized D-Bus protocols. An example of this from the list would be the systray.
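To make the systray example concrete: tray icons on modern desktops speak the StatusNotifierItem D-Bus protocol, and you can watch the registrations on a running session. A rough sketch (output depends on which apps are running):

  busctl --user list | grep -i StatusNotifier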

Some other entries in the list are just outdated and have been addressed by later work in the Wayland protocol family.

> Wayland doesn't provide a universal basic window compositor thus each desktop environment has to reinvent the wheel

In practice the Wayland implementation landscape has greatly consolidated. There are projects providing Wayland implementations as middleware used by many of the smaller HMIs, e.g. wlroots. Having a standard protocol spec independent from implementation does not mean implementations can't be reused, that's just orthogonal.

> Wayland and its window managers/compositors have very high requirements in terms of support from both hardware and kernel drivers which prevents it from working on far too many configurations. X.org and Windows on the other hand can work on an actual rock using e.g. VESA.

Nothing in Wayland requires specific hardware.

> Applications (or GUI toolkits) must implement their own font antialiasing - there's no API for setting system-wide font rendering. Most sane and advanced windowing systems work exactly this way - Windows, Android, Mac OS X.

This is correct as far as the first claim goes, but then goes off the rails when describing the competing systems and in painting it as a problem. First off, font rasterization and text shaping on Linux have long since standardized, with all major app stacks consolidating on the freetype and harfbuzz libraries. Those two are used in the same roles on Android. On Windows and MacOS, modern APIs for this are likewise no longer part of the display server/windowing system, and often there are actually multiple APIs for the job supporting different features. Windows, say, added Direct2D in addition to the existing GDI at some point. Plenty of Windows apps also render through freetype/harfbuzz these days, bundled by the cross-platform UI libs they use in lieu of Microsoft APIs.

I.e. the technical mistake here is assuming Wayland is somehow unique here and not just reflecting broad tech trends. In some cases the reality on Linux is actually more consolidated than elsewhere, and this is one: There is no meaningful competition to the freetype+harfbuzz stack.
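And while rasterization is client-side, the configuration side is effectively centralized via fontconfig, which freetype-based toolkits consult. A minimal per-user sketch, with example values:

  <?xml version="1.0"?>
  <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
  <!-- ~/.config/fontconfig/fonts.conf -->
  <fontconfig>
    <match target="font">
      <edit name="antialias" mode="assign"><bool>true</bool></edit>
      <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
    </match>
  </fontconfig>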

This inaccuracy is where it gets really complicated: If you know to recognize it as such, this talking point originates in internal debates in the Linux community about the X to Wayland transition, the logic of the change-averse going "X had standard fonts and Wayland does not, so it's bad!" - even though that's already wrong, because while X had standard fonts at inception, they stopped being used by apps decades ago. And then it's backed up by an incorrect comparison to the other systems, because the original author of the claim, a Linux user, didn't know better. This sort of thing makes this list hard to evaluate by users of other systems - the content is often sifted from a strange place. A lot of this reads like excerpts from a flame war on fictional Star Trek technology. Phasers are bad because tachyon couplings suck vs. Klingon disruptor grids!

> Wayland does not provide a basic universal unified graphical toolkit/API (and Wayland developers will not implement it) akin to Win32 which means applications using different toolkits (e.g. Qt, GTK, Enlightenment) may look and behave differently and there's no way to configure all of them once and for all.

The API scope of Win32 falls far short of what a modern UI lib provides, and Microsoft has many different generations and flavors of UI libs on top of Win32 (MFC, WinForms ... Maui is the latest one I think?). Many Windows apps are in fact written using non-native toolkits such as Qt and Electron as well these days.

This is a variation/continuation of the earlier sort of inaccuracy. Broadly speaking, windowing systems never particularly favored server-side rendering or server-side toolkits, with Wayland's predecessor, the X Windowing System, just a short-lived odd one out - but even in X-based desktops, apps converted to client-side rasterization using UI toolkit stacks long, long before Wayland came about. Disagreeing with these directions is fine, but it's disagreement with much of an entire industry and spanning many systems.

The rest of the Wayland section is largely the same - a weird mix of incorrect, misattributed and outdated.


> This is not equally true for all of the file systems named and for all of mass storage media. For quite a few of these combinations, Linux actually has the best solution available. Some of these file systems are also very niche and a general user will never even encounter them. Standard systems default to the most generally useful ones.

I've stopped reading here - you did *not* follow the corresponding bug report, and you are trying to refute something you took 0 seconds to understand; as a result your comments on this issue are 100% worthless and equally irrelevant.

Good luck and have fun.

Maybe next time you could be a little bit more attentive and inquiring.


Saying "Don't call this FUD" up top doesn't prevent this from being possibly the most blatant FUD I've seen in a long time.

Check if your hardware supports your OS before you buy it. Guess what? Then it all works great. What a waste of time.


People are used to Windows where literally all hardware works. So that statement is not that evident.


>People are used to Windows where literally all hardware works

Hahaha fuck no. I spent two weeks trying to get a Lenovo Y50's touchpad to work properly on modern Windows. It's impossible. Official drivers are unmaintained, and whatever you get, whether generic or frankensteined-in, works terribly compared to the out-of-the-box experience on a Linux distro.


Just get/buy a PC pre-installed with Windows?

Don't most people do this?


You could equally well get or buy a PC pre-installed with Linux where someone else made sure everything was well-supported. It's not that Windows has inherently better hardware support due to its architecture—rather, hardware manufacturers put in the effort to ensure that their (new) hardware can be used with the latest few versions of Windows, since they would otherwise lose a large part of their target market.

On the other hand, Linux tends to be much better at supporting older hardware that the manufacturers are no longer interested in selling, as well as working across a broader range of hardware platforms ranging from embedded IoT devices to supercomputers.


lol, a family member's Acer laptop (pre-installed with Windows) had audio driver issues to the point where the audio output jack simply didn't work. No drivers are available for the machine because it's pre-installed -- so why would there be issues?

The audio out works perfectly under Ubuntu.


> People are used to Windows where literally all hardware works. So that statement is not that evident.

I wouldn't be so confident. I've had my fair share of crappy drivers on Windows as well, to the point that if the hardware is old, there's a good chance it will run smoother on Linux.


This is correct. The "happy path" for a new Linux installation is much worse than for Windows. It's gotten much better over the last decade, but it's still much harder.

Further, if you really want to drive adoption, the Linux onboarding process has to be better than the Windows onboarding process, which is massively difficult because the Windows process is "purchase a computer, hit the power button, and watch the OS as configured by the manufacturer boot."


I have anecdata of the opposite: my ThinkPad T590 cannot install Windows with the .iso from Microsoft; you need hardware drivers for the NVMe SSD off another disk during the install. I think Windows and Linux are about equal in terms of hardware support, with Windows trending towards better support for tomorrow's hardware (it works when the new product launches) and Linux needing 1-2 weeks before the new hardware gets a driver update and packages are updated.


> needing 1-2 weeks

Sometimes up to several years or never.

Are VirtualBox and VMWare host drivers included in the Linux kernel? Granted they are not "drivers" per se but anyways.

OK, let's talk about hardware: literally thousands of WiFi adapters continue not to be supported by the Linux kernel, some of which were released as long as five years ago.

And then, if you have a Windows driver you know it works and it's full featured.

The Linux driver may offer very rudimentary support, just enough to make something work.

So, it's not so simple. Far from it.


Exactly. Or just buy an integrated system from people who support Linux.

Example - Intel. I've never had a single hardware support problem with a NUC. Intel processor, Intel video, Intel wifi, Intel bluetooth. All supported right in the kernel, out of the box.

AMD is getting there, too. I currently run a 4x4 with Ryzen and Vega, and an Intel AX200. All peachy.


Twenty years ago, I had fun building new Linux systems. Today, I just want to use the system for doing other things (I don't need another hobby), so when it was time to replace the old Debian Thinkpad, I bought a Darter Pro from System76.

I expected an experience similar to buying from a MSFT OEM or an Apple MacBook (the price was certainly comparable); however, I had to spend a fair amount of time getting _basic_ things to work (wifi - Intel - kept cutting out after a few minutes, problems with graphics, driving external monitors, etc.). I think System76 probably did the best they could, but they don't have a silver bullet for the problems discussed in TFA.


Anecdata: I have an Intel/Dell laptop and Bluetooth is a trainwreck on Focal Fossa, let alone display modes and orientation changes. Video playback in Firefox works sometimes; sometimes I need to reboot because it freezes and then won't play again even if I kill the process (I think it is related to going into suspend with a video paused). Yeah, I'm just one person, but damn, it is frustrating to do anything other than dev work on Linux.


You're running a two-year-old kernel on Focal. Try something newer. I have Intel WiFi and Bluetooth and they work perfectly fine on Linux, as does video playback in Firefox.


Lol. ~30 years of effort and the answer is still: "Download the latest version."

Linux is a workhorse, not a Windows replacement for the unwashed masses.


This is like using Windows 7 and complaining when it's Windows 10 that has compatibility with your hardware.

LTS releases are explicit that they have limited hardware support, that is if they even attempt to backport new/updated hardware support at all. That's the point of Ubuntu's LTS Hardware Enablement (HWE) program[1], although it isn't perfect and I wouldn't rely on it for hardware support.

[1] https://wiki.ubuntu.com/Kernel/LTSEnablementStack
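For what it's worth, opting into the HWE stack on 20.04 is a single command (package name per the wiki above; newer point releases then track a newer kernel):

  sudo apt install --install-recommends linux-generic-hwe-20.04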


20.04 is literally the latest LTS version of Ubuntu. Windows 7 is TWELVE YEARS OLD. Your analogy is garbage.

On top of that, I've run the auto updater that comes with the GUI (does more than sudo apt update).

If that isn't sufficient to use 4-year-old Bluetooth headphones that are older than the OS, and Firefox to watch Netflix, it is far, far worse than Windows. Look, I really want this to succeed, but be honest: it's far behind Windows and Mac.


I didn't make the comparison because of its age, but because it's a version of Windows that came before 10, and I forgot Windows 8 even existed. Fill in Windows 8 or 8.1 or whatever for Windows 7.

IMO, your average desktop user's use case isn't served well by an LTS if they plan on using relatively recent hardware, peripherals or software. LTS desktop releases suit Canonical's corporate customers' needs well, though, which is why they were released in the first place. A distribution that uses up-to-date software suits most use cases.

Bluetooth headphone support is something that bit me in the past, as well, but it's an issue that was solved with PipeWire. Ubuntu 20.04 still uses PulseAudio.


Windows 7 has long been unsupported, Windows 8.1 support has recently been dropped by most OEMs as well since Microsoft has deprecated Windows 8.1 earlier than expected.

Four years ago anything you bought was fully supported by Windows 7 and 8.1 aside from lazy OEMs who refused to provide older OSes' drivers.


> "Download the latest version."

Yeah, and if you're running something like Debian 11, CentOS or Ubuntu LTS the answer will be, "Go compile the latest kernel".


> Check if your hardware supports your OS before you buy it. Guess what? Then it all works great. What a waste of time.

Who runs into hardware compatibility issues with Linux? People installing it themselves.

Who installs Linux themselves? Tech geeks/enthusiasts.

Who should know better than to install an operating system before checking for hardware support? Tech geeks/enthusiasts.

That's why I can't take those people seriously. If hardware support was a real problem, Apple would have gone out of business long ago. People like that aren't trying to have an honest discussion, they're just trying to bash something they don't like or understand.

There are many issues that need to be overcome in the Linux community before the Linux desktop can go mainstream, but we're not going to find answers to those problems by listening to these people.


>Who installs Linux themselves? Tech geeks/enthusiasts.

Perhaps this wouldn't be the case if we fixed some of the issues in the article :)

There are plenty of people "on the fence" but problems mentioned in the article (among others) keep them away.


What type of person knows how to install an operating system (not even Linux; someone who can install Windows or MacOS on an empty hard drive) but doesn't consider themselves to be a "computer person"?

If the Linux desktop can't take off until my grandma is able to install it herself, then well, yeah it's truly fucked.


If I want something to work on my mac, I look for a mac logo on packaging.

I can assume the company has done proper QA and testing to make sure that's the case.

If I want something to work on a typical linux installation, I have to dig through hardware wikis, pages of forum posts, and be ready to dive into config files in case the info above is wrong.

And even then there can be less of a guarantee

The year of the linux desktop will happen once Ubuntu or whatever distro is most popular has a logo that appears next to the windows and/or mac compatibility stickers.

And to add, if anything, we SHOULD be listening to geek enthusiasts if we want linux to go mainstream. Those are the people actually using it day to day, and the ones that will be installing it on other people's computers.

Do you really think we'll get to mainstream adoption by just ignoring issues techies face that a typical user would also face?


> The year of the linux desktop will happen once Ubuntu or whatever distro is most popular has a logo that appears next to the windows and/or mac compatibility stickers.

That's wrong and ignorant. I installed Kubuntu on my main desktop ~7 years ago, and have had zero compatibility or stability issues, even through 2 LTS upgrades. If everyone who wanted to try Linux used my exact hardware, the Linux desktop would not suddenly take off.

> And to add, if anything, we SHOULD be listening to geek enthusiasts if we want linux to go mainstream. Those are the people actually using it day to day, and the ones that will be installing it on other people's computers.

> Do you really think we'll get to mainstream adoption by just ignoring issues techies face that a typical user would also face?

Did you reply to the wrong person? I didn't say anything even remotely close to this.


> Who installs Linux themselves? Tech geeks/enthusiasts.

> Who should know better than to install an operating system before checking for hardware support? Tech geeks/enthusiasts.

> That's why I can't take those people seriously. ... we're not going to find answers to those problems by listening to these people.

Am I missing something here? My point is that the issues techies face are the same ones a non-techie would face. So why wouldn't you listen to that kind of feedback from techies?

It's not like they're complaining about kernel internals or architecture, they're complaining about user experience compared to other mainstream operating systems.

If a linux desktop distro can't or isn't willing to address that, then it's simply not designed for a typical user as much as people want it to be.


Ok, I see the mix-up. When I said "those people", I wasn't referring to tech people in general, just the ones who would say things like "the Linux desktop is doomed because of hardware support".


> The year of the linux desktop will happen once Ubuntu or whatever distro is most popular has a logo that appears next to the windows and/or mac compatibility stickers.

The last network card I bought that had a Tux on the box used an unmaintained, GPL-but-out-of-kernel driver that no longer compiled on mainline kernels.


Let me refine that then. The sticker needs the same guarantees the windows and macos ones do if it wants to be on the box.

Not just some kind of pandering for market share to get linux users to buy it.


Exactly this. These are the exact reasons why having to 'install a Linux distro' these days means it will always be limited to tech geeks and enthusiasts and not widely adopted as a mainstream desktop.

It also explains why Windows 11 (with WSL) is probably the best Linux distro for the desktop, as has already been 'painfully' admitted. [0]

[0] https://news.ycombinator.com/item?id=28960864


> Exactly this. These are the exact reasons why having to 'install a Linux distro' these days means it will always be limited to tech geeks and enthusiasts and not widely adopted as a mainstream desktop.

I don't think you've thought this through. How many Windows users out there have installed it themselves? How many could install it themselves? How many could complete an installation if somebody else booted up the installer for them on a machine with a single empty hard drive?

The popularity of Windows and MacOS have nothing to do with how easy they are to install, or with "hardware support".


> I don't think you've thought this through.

Are you sure?

> How many...

Windows is pre-installed on the majority of OEM desktops and laptops, so there is no 'install' or 'install it themselves' step. Same for Apple computers with macOS, and that is 100% a given for users.

ChromeOS succeeded due to this. But given that is going to be replaced by Fuchsia, we are just going to see Google putting Fuchsia on the desktop (and phones) instead.

The reasons why Google decided to migrate is another story but the hint is related to "hardware support" similar to how Apple supports their devices.


> Check if your hardware supports your OS before you buy it. Guess what? Then it all works great.

Not even then. I have an Ubuntu laptop direct from Dell (who has been selling Ubuntu laptops for 14 years), and every time I go to open it the battery is dead. So it's plugged in constantly. My cats love to sit on the closed lid because of how warm it is. Ostensibly it's asleep, but I can tell by touch that it's running hot, and that's why it's dead within a few hours of being off the charger, even if it's not "on". I mean, it runs hotter sitting there idle and off than my new Surface does when it's on and compiling my Rust code. That's sad!

This is a laptop direct from a hardware manufacturer who supports Linux. This kind of thing was a problem when hibernation first became a thing. Okay, understandable. But 2+ decades later, in 2021, I can't purchase an Ubuntu laptop directly from a manufacturer that has this figured out. That's a problem for Linux. It's one thing to fall short. It's another to fall short for 2 decades.
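For what it's worth, the hot-while-asleep symptom is often the firmware defaulting to shallow "s2idle" suspend instead of deep S3. A quick check, as a hedged sketch (whether "deep" is even offered depends on the machine):

  cat /sys/power/mem_sleep                    # prints e.g. "[s2idle] deep"
  echo deep | sudo tee /sys/power/mem_sleep   # switch for this boot, if supported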


I have a Dell Inspiron with an AMD A12 processor that I bought with Windows preinstalled in 2019. I never even booted into Windows; I just booted straight from a USB stick and installed Elementary OS. Putting the system to sleep was an issue, but now I have Linux Mint Cinnamon installed on it. No more issues with sleep.

I can expect to use the laptop until it packs up.


What does Dell say if you report this as a defect in their product?


I'm free to return it, but otherwise they will investigate the issue and get back to me. As if.


> Check if your hardware supports your OS before you buy it. Guess what? Then it all works great. What a waste of time.

Ever opened bugzilla.kernel.org? Ever seen the number of regressions? In the past seven days alone:

https://bugzilla.kernel.org/buglist.cgi?chfield=%5BBug%20cre...

Count the number of "stopped working" - at least a dozen and these are only from the people who have bothered to find the kernel bugzilla and file bug reports which is not an easy feat.

There's no such thing as "checking if your hardware is properly supported under Linux". Everything may stop working at any time.


Windows users don't need to check as their hardware is fully supported as soon as they press the ON button on their new PC.

On top of that, why waste time installing a distro or playing around with display managers and Linux NVIDIA drivers when WSL does it all on Windows?

The fact is: WSL saves time and makes installing Linux redundant and actually a waste of time.

Oh dear.


> The fact is: WSL saves time and makes installing Linux redundant and actually a waste of time.

If I ever need to run an OS that spies on me, I'll use Windows, then WSL to run Linux. But I don't want that, so I run Linux. It's freedom, which is more important to me.


> The fact is: WSL saves time and makes installing Linux redundant and actually a waste of time.

Not sure what point you are making here. Yes, you can virtualize an OS, and WSL is a nice way to do it?


> Yes, you can virtualize an OS, and WSL is a nice way to do it?

Yes. That's the point. Installing Linux using WSL is several clicks away on the Microsoft Store, rather than wasting hours migrating, rebooting, formatting, dual-booting and then installing a Linux distro on the system.

WSL saves lots of time and makes installing Linux on another partition or in general a waste of time.


If you are building both Windows and Linux software, that might be convenient. But it's not particularly more convenient than any other virtualization layer, just more limited (you can't install other OSes) and with a host OS that is difficult to maintain and secure. WSL doesn't seem to compete with dual boot; it competes with other virtual machine software, now that it is just a nicely dressed VM.


But it's not FUD. These are issues that affect real-life end users, and many of them are hardware-independent.


> Guess what? Then it all works great.

You know, except for literally all the non-hardware related problems in that list.


> Saying "Don't call this FUD" up top doesn't prevent this from being possibly the most blatant FUD I've seen in a long time.

Anything more concrete, please? The entire 12-thousand-word piece cannot be wrong considering it's shared so much and has attracted over three thousand comments, most of which share the spirit of the article.


"Check if your hardware supports your OS before you buy it"

Have you ever tried this? It's shockingly difficult. Even if someone says "yes, this device works under Linux 100%", half the time you'll see ANOTHER post telling you it's garbage on Linux.


The common path for someone using Linux for home computing is transitioning from Windows on hardware they already own.


This is, unfortunately, not necessarily true (of course, the strategy makes sense).

I "voted with my money", and the result is that my laptop, which is supposed to be the poster child of Linux compatibility, has at least one serious hardware compatibility problem out of the box, along with at least another minor one.


The thing is, Linux distros make it super easy to check whether your hardware will work. Just download the distro you want and transfer it to a USB stick. Reboot your PC with it and check any connected hardware. If it works with the live system, it will work when installed on the PC.
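The "transfer to a USB stick" step is itself one command on an existing Linux box. A sketch (the ISO name is an example; double-check the target device with lsblk first, because dd will happily overwrite the wrong disk):

  sudo dd if=ubuntu-21.10-desktop-amd64.iso of=/dev/sdX bs=4M status=progress oflag=sync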


What happens when you want to upgrade to a new PC? Do you take the live USB stick to the computer store and try to boot it?


> Check if your hardware supports your OS before you buy it. Guess what? Then it all works great. What a waste of time.

Say I am buying a Thinkpad T480s and a Thinkpad X1 (2018). What would you suggest I check?


In this scenario, the fact that Lenovo will sell you a Thinkpad X1 with Linux pre-installed shows there is a solution (how easy it is to get to that solution on your own without just buying one pre-configured from them may be an exercise for the reader). https://www.lenovo.com/us/en/p/laptops/thinkpad/thinkpadx1/x...


Why shouldn't it be expected to work with what I already own?


I'd love to nitpick and disagree with some of these, but I think that misses the broader point:

I'm someone who believes that it would be a huge benefit to the world if desktop Linux were 10X more popular than it currently is. And ultimately, if there are concrete problems that are holding back Linux adoption, that's good. That means they can be fixed (at least in theory), and that as a result desktop Linux will have wider adoption.

If it turns out that all these problems are just "not our fault, dumb end users, just learn to google" and so on, then we're screwed. Linux adoption is just out of our control and there's nothing to be done.


Years ago I used Linux as my daily driver. That was before kids and a real house and all the other things that go with being a father and husband. Now, I ain't got time for that. I need something that works and doesn't require a lot of fiddling. I also support four other machines for various family members (total of eight including mine). I encouraged everyone to buy Macs. That lowered my support effort by 80%. I use one Windows machine for work. No Linux anymore (except WSL2) and I wouldn't even consider it.


My solution to the excessive fiddling was Debian Stable. I've been using it since 2007-ish and it does not get in the way of anything. But I agree with you on the Mac OS thing: the only tech support I have to provide is babysitting Time Machine and the occasional hardware failure (which just requires a payment to Apple and two round trips to the Apple store).


It's somewhere in the middle. I think that what hampers Linux adoption is the same thing that hampered Windows adoption: lack of OEM preinstalls. Windows was a nuisance that took too many resources and slowed your system down, so nobody ran it and few ISVs targeted it as a priority -- until a 386SX with VGA and 4 MiB of RAM became the default low-end option in 1990 or so. Then, vendors started bundling Windows 3.0 with their systems (likely urged by Microsoft) and suddenly there was a huge Windows installed base, and applications software increasingly targeted Windows.

Oh, and the last thing people booted MS-DOS (or "MS-DOS mode" in Windows 9x) for was the last thing Linux enthusiasts boot Windows for today: gaming.

In short, none of the UI bikeshedding we do to Linux will do any good. We have to capture OEM hearts and minds.


> I'm someone who believes that it would be a huge benefit to the world if desktop Linux were 10X more popular than it currently is.

Linux's unpopularity is a direct result of its bugginess and of not adhering to certain popular proprietary-OS principles, which are: 1) API/ABI stability, including at the kernel level; 2) strict QA/QC; 3) broad hardware support; 4) rich API support.

Linux is lacking everywhere.


The biggest issue preventing me from switching to Linux on my MacBook Pro is trackpad support.

Before I tried Linux, I never realized how important a good trackpad driver is. The mouse never moves where I want it, so I have to constantly correct. It registers a tap when I don't want it to, yet doesn't tap when I do. Trying to use a trackpad on Linux is so frustrating.

Whereas on macOS I've mastered the trackpad so that using it is easier than a mouse. I can move to and click on a random button in like 1sec.

I also need to mention that macOS has two-finger scrolling and four-finger navigation through multiple windows/screens. I use both features a ton. Linux has two-finger support, but it sucks, and I don't know of any four-finger navigation.

I've tried both the default Linux driver and mtrack; mtrack is slightly better, but still nothing compared to whatever macOS uses. AFAIK this is a well-known problem, but until it's resolved I just can't imagine using Linux on a laptop. Having a good trackpad just makes everything so much easier.


It turns out great trackpad support is a relatively hard problem. There's low-level analysis, heuristics, and even some honest-to-god machine learning pattern recognition in the MacOSX software that determines "Is the user doing something or did their palm bump the trackpad there?"


Trackpad support has gotten a lot better with Wayland and its libinput support. I have a Macbook Pro and trackpad support under Linux via Wayland is comparable to macOS.

Wayland and libinput support multitouch gestures. Two finger scrolling and four finger gestures work well.


thanks, I am actually going to try this

Linux is otherwise a very good OS for my needs, but bad trackpad support is a deal-breaker


Check out gestures[1] if you get libinput working with Wayland or X. libinput for X has fewer features and less gesture support, though.

[1] https://gitlab.com/cunidev/gestures
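One quick way to verify gestures are actually reaching libinput before layering tools on top (needs root; swipe with three or four fingers and watch for GESTURE_SWIPE events):

  sudo libinput debug-events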


That's an interesting list. A lot of those problems, however, are things that most people won't ever see. Of course, some will have an impact on feature availability (like how HDR is still nowhere to be found, be it on Xorg, Wayland or Vulkan apps).

Users care about:

* Hardware compatibility. This is actually markedly improved. I did not do any research when building my PC, and yet everything works, including the Intel WiFi 6 card, which doesn't work on Windows 10 out of the box; I had to use Linux to download the driver.

Yeah, maybe some CPU power states are not working great on my processor. I haven't hooked up a Kill-a-Watt to make sure. Temps look fine.

* "Every game ships broken". Ok, great. However, I've been working through my entire Steam library and the vast majority work great. The latest failure I saw was in Sea of Thieves (in-game voice chat broken). Graphics, though? Working great, including in other titles like Cyberpunk. Vulkan and Proton were a game changer, together with the AMD open source drivers.

* "NVidia this, NVidia that". Yeah. That's the main thing. If you don't need a feature that's specific to NVidia, please don't buy their cards. I don't care if you are running Linux or Windows, they are actively hostile to the ecosystem. If you ARE on Linux, then good luck. You depend on their proprietary blob and that's very hit and miss. AMD will work much better on Linux (most likely, out of the box with open source drivers).

Yes, for now they have some tech that's specific to them, like DLSS. Their raytracing performance is better than the competition's (but mind you, they are the ones actively pushing for raytracing support in mainstream games and spending marketing dollars to entice customers). They have CUDA (then again, they did whatever they could to push it). I'm actually surprised they haven't come up with a fully proprietary rendering API yet.

There was a reason why Linus gave NVidia the finger.


A ton of anecdata, yet

1) Tons of Wifi adapters are not supported by the Linux kernel - you need to scavenge the web and compile random git repos

2) The games _you_ play work perfectly, which apparently means all others play well too. Oh, great. Now please check DXVK, which already contains hundreds of quirks, thus mimicking the AMD/NVIDIA Windows drivers in making badly coded games work.

3) https://gitlab.freedesktop.org/drm/amd/-/issues https://gitlab.freedesktop.org/drm/intel/-/issues - absolutely nothing there. All work perfectly.

Good luck and have fun.


Many developers don't seem to understand that people just want their computers to work. They also seem not to understand that when you're throwing away a codebase, you're throwing away years of experience in the form of bugfixes and user feedback. Everything must change and everything must be invented again. There's just no way you're ever going to get a solid desktop that way.


I've actually read through this rant some years past. It's mostly low-SNR opinionated junk mixed with some valid observations. In general, I could not recommend it.


> mostly low-SNR opinionated junk

Care to expand on that? The poll under the article kinda proves the opposite.

