
Linux Problems on the Desktop (2018) - iron0013
https://itvision.altervista.org/why.linux.is.not.ready.for.the.desktop.current.html
======
fooblat
I have been a daily linux desktop user since 1997 and I will freely admit that
part of the enjoyment for me was getting everything working and setting up my
perfect desktop environment.

These days, as many others have pointed out, if you choose the right hardware
everything just works.

At my company, we are currently in the middle of upgrading aging Windows 10
desktops in the Customer Service department to Ubuntu LTS. So far the feedback
is universally positive from the CS agents. Ubuntu runs faster on the existing
hardware and that's about all they notice. Chrome is still Chrome, and that's
what they use for all the CS apps, including VoIP calling.

------
0xb100db1ade
What I get out of this is that while each of these issues may be individually
dismissed, they together make for a very "non-premium" experience.

I use Linux myself, but when people I know try it out, they immediately leave
after encountering problems with audio, DPI scaling, etc.

~~~
baroffoos
I have found it's super dependent on what hardware you get, especially on
laptops. I always pick hardware that works perfectly with Linux and I am left
with a very premium experience, but when using Linux on bad hardware like
MacBooks and Broadcom Wi-Fi, everything just doesn't work right.

~~~
ilovecaching
Yep. Exactly this. If you get a laptop that a GNOME developer uses, chances
are you’ll run into maybe two issues a year. But try to get GNOME to run on
two 4K displays and an RTX.

But here’s the thing. Linux has won. Desktop is a dying market. Linux is
literally the most used operating system on phone and in the data center,
which is where it counts.

The DE has gotten much better, indirectly, from all the work that's gone into
those other use cases. But we still have a gamer market that is incredibly
proprietary holding up the middle finger to those of us who think proprietary
drivers and codecs are evil relics of a less open world.

If you really want to see Linux on the desktop, boycott NVIDIA, or write them
a letter.

~~~
Animats
_Desktop is a dying market._

Actually, tablets are a dying market.[1] Laptops are a declining market. Fewer
desktops are being sold, but they're lasting longer because there's no reason
to replace them.

[1] [https://www.statista.com/statistics/272595/global-shipments-...](https://www.statista.com/statistics/272595/global-shipments-forecast-for-tablets-laptops-and-desktop-pcs/)

~~~
rbanffy
So, as we move forward, all these desktops will end up having stable and
mature support. Drivers are often lacking for bleeding edge components, but
stuff that was released 3 or 5 years ago tends to be very solid unless it's
something obscure nobody has ever seen.

~~~
Animats
Microsoft doesn't want you to have stable and mature support. They want you
using the latest everything and paying for it each month. Microsoft pushed
hard to force users to upgrade to Windows 10.[1] Even today, many enterprise
users refuse to convert. They don't want Microsoft's new rent, rather than
own, software. They don't want Microsoft looking into their machines. They
don't want to store corporate data in Microsoft's cloud. But they will
convert, like good sheep.

[1] [https://www.forbes.com/sites/tonybradley/2016/01/22/resistan...](https://www.forbes.com/sites/tonybradley/2016/01/22/resistance-is-futile-you-will-be-assimilated-into-windows-10/#3a776c34af39)

~~~
rbanffy
Come to Linux. We have cake.

And your 3 year old hardware probably will work a lot better than it does
under Windows ;-)

~~~
Animats
I'm on Linux. Xorg crashes on every login, and the crash reporter crashes
complaining the dump file is too short.

------
cossovich
I've been using Ubuntu LTS versions since 12.04 on Thinkpad T and X series
laptops and I'm a very happy camper - out of the box Ubuntu doesn't suck for
me, it "just works". I moved from OS X on latest Apple laptops to make my
daily job (interaction design + web development) more productive (e.g.
workstation running the same OS as servers, tooling etc) but now it's my
preferred OS + hardware combo from an end-user perspective. I have to switch
back to an Apple machine for testing and pairing with co-workers at least once
a week, and between the new Apple laptop keyboard, the random reboots (waking
from sleep), shitty web font rendering and intermittent errors relating to
Apple ID, I don't miss it. I really loved OS X quite a few years ago but
between the latest hardware (don't get me started about cords/dongles needed
for a 2018 Macbook Air) and growing list of OS X quirks I'm always happy to
return to Ubuntu 18.04 on my Thinkpad T450s.

I think many of the points raised in the article affect people making desktop
software for Linux rather than end-users of desktop Linux. It seems like a
global list of issues for the entire desktop Linux ecosystem - which is
totally valid but I think a more accurate title of the article might be "Why
developing desktop software on Linux sucks" or "Why creating a desktop Linux
distribution sucks" because I think my desktop Linux setup rocks!

~~~
pitaj
What file explorer do you use? Nautilus pisses me off. I'm 100x more
productive with Explorer on Windows.

Some features I'd like:

- Being able to open the context menu for the current folder, even if there
  are enough files to fill the view, without going up a level

- Being able to jump to files/folders in the current directory by name without
  opening search results

- Being able to add functionality to the context menu

~~~
AsyncAwait
Try Dolphin.

~~~
shaan7
Yep ^

------
Animats
He's right, of course. The Linux community has been in denial about this for
years.

At the kernel level and close to it, the areas that consistently give trouble
are video/GPU support and audio support. GPUs are hard, but there's no excuse
for the mess in audio persisting for a decade. Video/GPU support is tough, but
the current situation, where you have a choice of five different NVidia
drivers for the same board, all with different bugs, is not good.

As the author points out, regression failures are a big problem. The sheer
bloat of Linux has made it unmaintainable. And who wants that job? Big chunks
of important code are abandonware.

~~~
rbanffy
> The Linux community has been in denial about this for years.

That's not right. We all know support for some hardware is spotty and we all
have learned to avoid that. My laptops tend to use Intel GPUs, for instance,
because I want to work on them, not fix them.

I'm eyeing that new Lenovo thingie with an epaper keyboard, but I know it'll
run Windows and probably never be upgraded because nobody will write the
drivers to keep that thing alive past Windows 12.

> you have a choice of five different NVidia drivers for the same board, all
> with different bugs

Stop buying NVidia hardware. They actively sabotage Linux development. AMD is
much better in that regard. Buy AMD instead
([https://www.phoronix.com/scan.php?page=news_item&px=AMD-Hiri...](https://www.phoronix.com/scan.php?page=news_item&px=AMD-Hiring-10-More-Open-Source)).

> The sheer bloat of Linux has made it unmaintainable.

Nope. It's still moving forward and it's still quite reliable. All my
workloads run on it (except my pets that run on FreeBSD and OpenIndiana
because I get a kick out of managing different OSs).

> Big chunks of important code are abandonware.

There is a process to move obsolete codebases out of the kernel. That's why
you can't use one of those half-IDE CD-ROMs that came with "multimedia kits"
of the early 90's.

~~~
sametmax
> Nope. It's still moving forward and it's still quite reliable. All my
> workloads run on it (except my pets that run on FreeBSD and OpenIndiana
> because I get a kick out of managing different OSs).

That's the denial there.

I'm a Linux guy. I'm posting this from a Linux distro.

But realistically, we do have a huge amount of technical debt, and less and
less incentive to work on it.

Case in point: every time we touch something to improve it, we break things
for a year or two. PulseAudio? Took 4 years to become stable. Systemd? 3 years
at least. NetworkManager crashed for a good 6 years, and still can't work
decently with sleep mode.

We manage to provide features because the Linux kernel devs are incredibly
competent. They also limited the bloat to a manageable stack on their side.
But around that, it's the wild west.

~~~
jim-jim-jim
If complexity's got you down, you might have better luck with OpenBSD. Coming
from Linux, you'll be amazed by how simple it can be and how much of it just
works.

Though non-intel graphics are still shit on it afaik.

~~~
sametmax
Linux has already usability issues because it's a niche. I'm not going to use
a niche of a niche.

------
Mister_Snuggles
I wish that there would be one unified API for creating desktop programs on
Linux. Right now it's somewhat coalesced on GTK/GNOME and Qt/KDE, though there
are a number of others out there.

I use Linux in a VM for very hobbyist level embedded development (think
Arduino and the like). Driver problems are non-existent, all of the technical
problems are non-issues in this environment. The problems that I see are all
to do with the lack of a common set of services for building a GUI
application.

Why do my text editor and Arduino IDE use different file pickers? It's because
my text editor uses the KDE API, but the Arduino IDE uses something else. GIMP
uses yet a different file picker from the other two. LibreOffice uses yet
another file picker, that's similar to Kate's but slightly different. I'm sure
that installing Atom and VS Code would introduce me to two more file pickers.

The reason for this is that each of these programs uses a different GUI
toolkit and, as a result, has a different concept of what a file picker needs
to look like. Some of them don't even agree on which order the Open and Cancel
buttons should be in.

Network transparency is another thing that suffers from this. On Windows, you
can basically use a UNC path (\\server\share\path\to\file.txt) almost
anywhere because the entire system from the file picker all the way down to
the file APIs knows about UNC paths. In Linux, KDE apps do this one way, GNOME
apps do it a different way, and command line tools need you to somehow mount
the target server before you can even think about it. I last seriously used
Windows about 14 years ago and I _still_ miss this greatly.

None of these are insurmountable problems, but it needs someone to make a
decision about the one true way to do things.

~~~
meditate
>someone to make a decision about the one true way to do things.

Things don't really work this way in free and open source development. There
is no one person to make decisions, consensus is reached when the quality of
something raises "above the bar" and actually improves things for all involved
parties. If someone wants there to be an über-library that serves everyone's
use case then it's up to them to go and do the work to build that.

And it has been getting better in this regard. For example KDE and GNOME used
to have their own IPC, multimedia & audio mixing backends, but now both have
converged on DBus, GStreamer and PulseAudio, in part because these were
intentionally built to be flexible low-level solutions. I'm sure there are
more examples of this too but those are the first that come to mind.

~~~
Mister_Snuggles
You're absolutely right. I wonder if something like DBus and PulseAudio could
happen with my UNC pain point.

With the assumption that the goal is for "vi //server/share/file.txt" to work
the same as "notepad.exe \\server\share\file.txt" does on Windows, here are
my thoughts.

First off, notepad.exe doesn't really care about the fact that it's a UNC
path. It just opens the file with CreateFile (either CreateFileW or
CreateFileA).

There would need to be replacements for the libc file functions. These could
be a shim in front of libc, or baked right into libc. Note, there's a LOT more
needed than "just" new file functions - any functions that do anything with
paths need to be looked at. Shells would likely need some changes to work
properly, though it's not like the Windows shell can truly do much with UNC
paths - copying files to/from works, but you can't cd into them.

How does it ask for credentials? If it's via DBus, a desktop environment could
provide the authentication prompts, but what about a pure-commandline system?
Maybe the transport is just SSH and relies on the existing public key
authentication? But what if you're just doing a one-off thing and don't want
to set that up? Using SSH is probably a decent idea since it's got
authentication, security, and a file transfer protocol, already built in.

On top of all of this, when you open //server/share/file.txt for writing, what
does that actually mean? Is there a file descriptor? How does that work with
the kernel? Does libc now manage all file descriptors with only a subset
corresponding to kernel file descriptors? Could a pure user-space solution
fake this well enough to actually work? Would this need to be a FUSE
filesystem along with some daemon to automatically unmount the remote servers
when the mount is no longer needed? Would it be something like the
automounter, just a lot better? Does a kernel need changes for any of this to
work?
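
To make the discussion concrete, here's a toy Python sketch of just the
path-translation half of such a shim. Everything here is illustrative (the
FUSE root and its layout are made up, loosely modeled on how GVfs exposes
mounts under /run/user/$UID); a real shim would do this inside open()/libc in
C, not in Python:

```python
import os

# Hypothetical per-user FUSE mount root (illustrative only; GVfs uses a
# similar scheme under /run/user/$UID/gvfs with its own path naming).
FUSE_ROOT = "/run/user/{uid}/netfs"

def translate_unc(path, uid=None):
    """Rewrite a //server/share/... path onto a local FUSE mount point.

    Paths that don't look like UNC-style network paths are returned
    unchanged, so the shim stays transparent for ordinary files.
    """
    if not path.startswith("//") or path.startswith("///"):
        return path  # not a UNC-style path
    parts = path[2:].split("/")
    if len(parts) < 2:
        return path  # need at least a server and a share
    if uid is None:
        uid = os.getuid()
    server, share = parts[0], parts[1]
    rest = "/".join(parts[2:])
    root = FUSE_ROOT.format(uid=uid)
    # Drop empty components so "//server/share" maps cleanly too.
    return "/".join(p for p in [root, server, share, rest] if p)
```

The interesting part is everything this sketch skips: who mounts the share on
first access, how credentials are obtained, and what the file descriptor means
to the kernel.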

This is one of those things that touches so many layers and potentially
interacts with so many parts of the system, potentially all the way down to
the kernel.

My guess, and I don't actually think this will happen, is that Apple will do
something like this on Mac OS X and have a reasonable mapping to the BSD world
underneath, then someone in the Linux community will come along and do
something similar in a way that's better suited for Linux. As a parallel,
Apple came out with launchd in 2005 to replace init scripts, systemd made an
appearance in 2010 - both do very similar jobs, with launchd tailored to the
needs of MacOS and systemd tailored to the needs of Linux. Maybe something
similar could happen with UNC-like file sharing.

~~~
meditate
All that has been doable for quite some time, you could mount SMB shares like
that with smbfs since early releases of Samba, and later with the CIFS fs
driver. You do need root to mount things that way, so it isn't ideal.

For the more complicated stuff it can be done but not everything is available
via a simple GUI. GNOME and KDE have their own virtual filesystem layers in
userspace, GVfs and KIO, I don't know what KIO does but GVfs supports a bunch
of network backends and has a FUSE driver that can mount its own virtual
filesystems and expose them to outside applications. So the features are there
but I don't think they are well-presented right now, maybe someone can prove
me wrong though.
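
For what it's worth, the GVfs side of this is scriptable from the command line
too. A rough sketch of the flow (server and share names are placeholders, and
the exact FUSE path naming may vary by version):

```
# Mount an SMB share into the per-user GVfs daemon
gio mount smb://server/share

# The FUSE bridge then exposes it to ordinary, non-GNOME applications
# under /run/user/$UID/gvfs/, e.g.:
vi /run/user/1000/gvfs/smb-share:server=server,share=share/file.txt
```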

It would have been nice if the kernel had better support for fine-grained
control over filesystems like HURD or Plan 9 do. But instead it was decided
that it was better to handle those things with userspace daemons, so that's
where we are now.

~~~
Mister_Snuggles
These aren't the same thing though. The GNOME and KDE VFS layers only apply
for applications written for those APIs. It's not a universal thing.

Being able to mount a CIFS filesystem is fine, but it's not the same thing. In
Windows, you can basically use a UNC path anywhere because CreateFile knows
how to deal with it. The point is that you don't _need_ to mount the remote
filesystem (the Windows-equivalent being mapping a network drive).

What I'm really looking for is the user experience, not the underlying
protocol. On Windows, I can just go "notepad.exe \\server\share\file.txt" and
edit the file, on Linux I need to either use a KDE application or go through
the ceremony of mounting the remote filesystem. It's the fact that the feature
is silo'd into GNOME and KDE (and the fact that it doesn't even exist on Mac
OS, but that's another issue) that bugs me.

~~~
meditate
There is currently no kernel interface that I know of to do that. It probably
wouldn't be too hard to hook into an open() on an invalid path and try to do
something (mount a network fs, call out to GVfs or KIO, etc), but I can tell
you you will meet resistance if you try, because things like "//stuff" and
"smb://stuff" are already valid local file paths on Linux. So I leave it up to
you to figure out how to do this without breaking things.

~~~
Mister_Snuggles
Yeah, this is definitely not an easy problem to solve given the design of
Linux.

I don't know why I didn't remember this earlier, but I actually explored this
a number of years ago and came up with two things that are close, but not
quite there:

First was to use a systemd automount unit[0], but I didn't really get anywhere
with it. From the looks of it you have to know all the possible things you
could want to automount, it can't do wildcards. Being able to do some kind of
pattern matching on the requested path and translate that into a mount command
would go a long way to making this work.
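
For reference, a static systemd mount/automount pair looks something like the
sketch below (the server, share, and mount point are placeholders). The
limitation described above is that you need one such pair per share, since
systemd requires the unit file name to match the mount path (mnt-media ↔
/mnt/media), which is part of why wildcarding doesn't fit the model:

```
# /etc/systemd/system/mnt-media.mount
[Mount]
What=//server/media
Where=/mnt/media
Type=cifs
Options=username=me,_netdev

# /etc/systemd/system/mnt-media.automount
[Automount]
Where=/mnt/media

[Install]
WantedBy=multi-user.target
```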

I also explored the good old automounter[1][2], but it has a lot of the
limitations that systemd's does. It does have the advantage of supporting host
maps, which gets me a bit closer to what I'm looking for. The unfortunate
thing that remains is that this is NFS instead of a modern protocol. If this
were somehow backed by sshfs, I suspect it would be quite useful. Of
course, sshfs is missing the concept of shares but that's not a showstopper by
any means. Authentication becomes a problem since the automounter probably
can't ask the user for a password, and may not even know which user is
requesting the mount.

I have no idea how well either will work in practice. Modern Linux on the
desktop is a very different environment than the one the automounter and NFS
were built for. The systemd automounter looks like it serves a very specific
purpose and can't currently do what I want.

Maybe all we really need is a modernized automounter and/or some extra
features in systemd's automounter. These could lead to "vi
/net/server/share/file.txt" working as expected which, quite honestly, is
basically the same as what I suggested earlier.

[0]
[https://www.freedesktop.org/software/systemd/man/systemd.aut...](https://www.freedesktop.org/software/systemd/man/systemd.automount.html)

[1]
[https://linux.die.net/man/8/automount](https://linux.die.net/man/8/automount)

[2]
[https://linux.die.net/man/5/auto.master](https://linux.die.net/man/5/auto.master)

~~~
ti_ranger
> I also explored the good old automounter[1][2], but it has a lot of the
> limitations that systemd's does. It does have the advantage of supporting
> host maps, which gets me a bit closer to what I'm looking for. The
> unfortunate thing that remains is that this is NFS instead of a modern
> protocol.

What limitations affect you?

(At home, I have linux running on an HP MicroServer as my NAS, it exports
filesystems via NFS. Other machines run autofs with the hosts map, so for
example my wife's desktop - and mine for that matter - auto-mounts NFS shares
on-demand and she can open any file directly in any application by accessing
/net/$hostname/$path).
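
(For anyone wanting to replicate this, the hosts map is a one-line autofs
config; the NFS server just needs its exports visible to the clients:)

```
# /etc/auto.master
/net  -hosts
```

After restarting autofs, any exported path becomes reachable as
/net/<hostname>/<exported-path> on first access.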

NFSv4 is pretty modern ...

I believe this should also work for CIFS, if the server-side supports unix
extensions (to do user mapping on a single connection), but I haven't had a
chance to try it yet in my limited time at home.

> Authentication becomes a problem since the automounter probably can't ask
> the user for a password, and may not even know which user is requesting the
> mount.

If you have Kerberos setup, NFSv4 does the right thing ...

If you don't have Kerberos setup, then you're probably ok with just normal NFS
user mapping.

~~~
Mister_Snuggles
Interesting, I'll have to give automount another look.

The last time I tried it was years ago, so I can't remember what limitations I
found. If I get a chance to do this in the near future I'll report back.

------
smlacy
How about a list of all the things that really work great?

I've been using Linux on the desktop for nearly 20 years and I have to say
it's fantastic, despite the occasional headache, which seems to come up far
less frequently than on other major desktop operating systems.

~~~
cheerlessbog
I'm guessing you're relatively tech savvy and enjoy troubleshooting (within
reason). Most people aren't, of course.

~~~
thecrumb
I have a friend who was complaining about Windows 10 so I set him up with
Ubuntu. He's about as dumb as it gets with computers and he has never had an
issue with Linux. The few things he's installed have been in the software
store, and he clicks the apt update prompt once a week. He's much happier and
less frustrated with Linux vs Windows.

~~~
gerdesj
Cool beans mate. I think you are describing someone actively updating their
system because it isn't too onerous, rather than because it's a good idea
(risk:reward).

Many, many years ago a decision was made regarding Windows' software
management, and ever since, updates take sodding ages, sometimes require
multiple reboots, and are generally unpleasant. One day that will be fixed -
it is not normal.

------
jrockway
I feel like a lot of the usability quirks that Linux has are trying to
shoehorn a multi-user system into a single-user context. For example, there is
so much work done (and even complaining in favor of doing that work in this
article) to make it possible for multiple people to sit at the same computer.
Nobody does that! Most people have more than one computer! Why do people spend
their free time on that use case?

This could be a long rant, so I'll keep it short... but someday I'm just going
to rip the concept of users out of Linux and see what it looks like. Oh no,
you say, malware will get you! Unlikely. Malware running as my user can fuck
over my life just as easily as malware running as root. So why even pretend
that that's a good isolation model? It doesn't prevent any attacks.

(As for how Linux in 2019 is doing... I recently switched back to Ubuntu for a
desktop. Whenever I lock the screen and have DPMS enabled, it forgets that I
have two monitors and that I want 200% DPI scaling when it wakes back up.
What? Back in my day you had to hard-code the resolution and monitor
configuration in the X11R6 config and there was no way to change it without
restarting the X server. May I please have those days back? At least once it
started working, it kept working.)

~~~
AnIdiotOnTheNet
Right? The only reason user accounts exist at all in Desktop OSs is that all
of them today were originally server OSs. Placing restrictions on user
accounts is only useful for protecting the system from users, which is a valid
concern on a network with shared resources but worse than useless on a
personal desktop.

Mobile OSs got this right: on a personal device, the permissions model should
be applied to the applications.

~~~
JonathonW
> Placing restrictions on user accounts is only useful for protecting the
> system from users, which is a valid concern on a network with shared
> resources but worse than useless on a personal desktop.

True for home users; not necessarily true for corporate users-- where
computers are IT-managed (i.e. "don't let end users fuck them up") and may be
shared (which is highly situational-- the degree to which computers are shared
varies highly from company to company, or even deployment to deployment).

Heck, it's not even unheard of to end up with multiple "simultaneous" users on
a single-seat desktop machine-- every major OS these days supports some form
of fast user switching, which will leave one user's programs running while
another user's physically sitting at the machine.

~~~
AnIdiotOnTheNet
A few things, since I work in IT: we don't give a damn about your workstation.
It's a fungible resource. Reimaging is easy and relatively quick. We even let
you have local admin because who cares. Users are not prone to playing around
with settings they don't understand, in my experience. If someone was
constantly needing their workstation reimaged we'd probably just fire them for
being incompetent. Ideally, the OS would be completely separate from the
applications and configuration and be immutable, and that would go a long way
towards eliminating those kinds of problems.

We managed to share home desktop computers in the 90s without significant
problems, even though the OSs we used didn't support multiple user accounts at
all. And there's no reason you need user accounts to accomplish what you're
describing. You can still have profiles (preferences, application configs,
etc), and you can encrypt them with a passphrase if you have any reason not to
trust others using the same device.

> Heck, it's not even unheard of to end up with multiple "simultaneous" users
> on a single-seat desktop machine

A vanishingly small use case inside an already vanishingly small use case.

~~~
IWeldMelons
Not everyone lives in a first world country. I live in Central Asia, and
$200 a month is a decent salary here. And it is normal to share a computer
between different family members.

~~~
AnIdiotOnTheNet
Again, we did that in the 90s here in the west all the time and user accounts
weren't necessary and their introduction did not really solve anything.

------
swebs
>Year 2015 welcomed us with 134 vulnerabilities in one package alone:
WebKitGTK+ WSA-2015-0002. I'm not implying that Linux is worse than
Windows/MacOS proprietary/closed software - I'm just saying that the mantra
that open source is more secure by definition because everyone can read the
code is apparently totally wrong.

Huh? Those 134 vulnerabilities were found _because_ people can see the code.
If it were closed source, they would probably still be there today.

~~~
maxerickson
Closed source software is not opaque, people study how it works all the time.

------
jcastro
I get that some people don't like linux, but some of these examples are just
ridiculous.

Linux is administered over SSH, therefore administrators don't know how to
check things, and therefore they don't bother to update systems because
"they're afraid that something will break." C'mon.

~~~
dcbadacd
That's unfortunately true.

------
blaze33
Linux as the open-sourced work of brilliant software developers wouldn't power
most servers if it sucked.

But could designing a good desktop require more than just good code?

Good kernels successfully run code. Good desktops successfully help users. I
guess different goals require different designs?

Edit: to clarify, I didn't mean desktops don't require well-designed software.
I just had in mind that a desktop also has to take human psychology and human
limitations into account.

~~~
gmueckl
Wait, you are confounding some things. A software design is good when it
allows all of its parts to be elegant and meet the requirements. In no way
does that say that a desktop OS is required to be designed badly.

Linux is powering servers and high performance computing because it is good at
these things: a mostly static hardware configuration set up once during system
installation, high performance, modularity, and the ability to inspect a
deeply running system if you are an expert. It ticks all the boxes for these
specific environments.

On the desktop, not so much. For example, the concept of device files is
hindering use cases that should "just work". When I plug in USB headphones, a
new audio device is created. Fine. But I need to enter the device file name or
ALSA device string into half a dozen programs to use it. All I would want is
to have the audio rerouted automatically. PulseAudio was touted as the
solution to that problem, but at what cost? We're now literally stacking audio
systems on top of audio systems and sacrificing to arcane gods to have it
work.
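
(To be fair, once PulseAudio is working, the rerouting is at least scriptable;
the sink name and stream index below are illustrative:)

```
# List sinks, then make the USB headset the default for new streams
pactl list short sinks
pactl set-default-sink alsa_output.usb-headset.analog-stereo

# Move an already-playing stream (index from `pactl list short sink-inputs`)
pactl move-sink-input 7 alsa_output.usb-headset.analog-stereo
```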

When I plug in a USB drive, I now have to look up its device file name in
order to mount it manually. The software stack required to automount it from a
desktop environment is atrociously complex, because it requires root
privileges to mount a device not listed in /etc/fstab with a user flag. And
because any number of drives can be connected in any possible order, no
entries in fstab can be made.

This clash of UNIX-like concepts and modern user expectations is what is
holding Linux back. The underpinnings are not bad. They were just designed for
a different task.

So, yes, you can build a user friendly OS. Yes, it can have a clean design.
But it won't be called Linux anymore.

~~~
fatboy93
Just curious, but I had the same issues; turns out I hadn't installed gvfs and
the associated handlers for mounting/unmounting drives. Installing it resolved
most of my USB connection issues.

~~~
gmueckl
I know that there are solutions (I use KDE, so it works differently there).
The point I am trying to make is that UNIX device files were never designed to
deal with the dynamic hardware configuration we see today, especially on
laptops with peripherals getting plugged in and yanked periodically. And some of
the solutions, like gvfs, are overly complex user space workarounds. A system
that accounts for these dynamic usage patterns would have to look different.
But it does by no means have to be ugly code. In fact, it would probably be
much simpler and more elegant than the current Linux desktop user space.

------
hartator
The only thing about Linux I never really liked is the package system.

Like if you want to install new software, you usually don't get a .exe or
.dmg. If you are lucky, the developer or some fans took the time to package
it. Then you can do 'apt-get', 'yum' or 'pacman'. However, packages go stale
and sometimes don't match the original author's intent. You can also build
from source, but it takes time and you have to know a bit of CLI. It never
felt like true freedom to me. More like whatever the community feels makes
sense for whatever distribution's weird dictatorship. Just a feeling, and I
still love and support Linux.

~~~
gerdesj
You clearly have never used Linux in any form. There are multiple package
managers, granted.

However, one strength they all share is: "you usually don't get .exe or .dmg"!
Absolutely! Apps are integrated and not simply add-ons as they are in Windows
or Apple land. When I want to install say libreoffice or wireshark I simply
ask the system to install them. I absolutely do not browse the internet and
download something, extract it and run some "installer". When I update my
system, all apps and the OS are updated in one go.
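
On a Debian-based system, for instance, that "ask" and the combined update are
each a single command (package names here are just examples):

```
sudo apt install libreoffice wireshark   # install apps from the repos
sudo apt update && sudo apt upgrade      # update OS and all apps in one go
```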

My system is curated for me, end to end, to a greater or lesser extent. When I
update, all my system is updated - OS, apps and all.

I don't think yours is (whatever it is).

~~~
hartator
If you need the latest version of Wireshark, it's getting complicated. And
when you have a setup with Wireshark that works, why update it? I am dubious
that Wireshark can be an attack vector, so security updates won't be useful.
Installing the latest versions is very straightforward when you are on OS X or
Windows. You just go to the website and download it. Plus you get the original
binaries, not a doctored version made to fit whatever launcher and log
organization is currently trending at Ubuntu HQ. Don't get me wrong, I still
love my Ubuntus. But this "we need to rewrite all software to fit our
distributions" is not a strength. I think this line of thinking is kind of
shared by Linus himself.

~~~
mr_toad
Going to dozens or even hundreds of different websites to download the latest
versions of software and then manually installing them all isn’t fast at all.

cf. apt-get upgrade or similar, which takes a few seconds.

~~~
sandov
>c.f. apt-get upgrade or similar, which takes a few seconds.

Except when the package isn't included, or it isn't the version you needed,
because then you have to spend 40 minutes trying to install all the
dependencies and building the package from source, instead of the 3 minutes it
would have taken to install an .exe on Windows.

~~~
ti_ranger
> because then you have to spend 40 minutes trying to install all the
> dependencies and building the package from source, instead of the 3 minutes
> it would have taken to install an .exe on Windows.

No, you take 2 minutes to install it using flatpak, or download a .appimage
file.

And if neither of those are available, you spend that 40 minutes packaging the
software, and submitting it to the distro you use for inclusion.

(If it was already packaged, but not new enough, that's a ~5 min job to do the
update for your distro)

------
Anon1096
Most of these aren't Linux-on-desktop issues that any day-to-day user will
ever encounter. A lot of them are, yes, but stuff like the author's issues with
X.org being a bad piece of software is completely orthogonal to anything a
desktop user of Linux will care about. This list would be a lot better if it
were cut down to just the major issues.

~~~
AnIdiotOnTheNet
Why is the defense for "there are problems with Linux" always some appeal to
the workflow of a strawman 'average user'? A class of people which, I might
add, still show no interest in using Linux.

~~~
yjftsjthsd-h
Because as a general rule the power users already have minimal problems and
can fix them pretty easily. So the next concern is people who aren't familiar
with the system, i.e. average users.

~~~
AnIdiotOnTheNet
Have you considered that even many power users don't want to use Linux
Desktop? It has by no means won over even that crowd.

------
hn_throwaway_99
I've been using a Google Pixelbook (their flagship Chromebook product) as my
main development machine for the past couple months, and I love it.

The ChromeOS "Crostini" project allows you to run a full Debian instance in a
container, but with the other benefits of a Chromebook and ChromeOS. Each new
release of ChromeOS brings further enhancements to Crostini, e.g. right now
backups are manual but a native "one click" backup is coming soon -
[https://www.aboutchromebooks.com/news/crostini-linux-
backup-...](https://www.aboutchromebooks.com/news/crostini-linux-backup-
restore-import-export-tremplin-chrome-os-74/) . FWIW some of the tools I run
are the IntelliJ suite, Atom, postgres, Docker, etc.

IMO Linux on the desktop is awesome, it just happens to be running in a
container on ChromeOS.

~~~
hartator
Oh cool. My 2017 MacBook Pro is just getting the stuck-key issue.
Performance-wise, is the Chrome browser faster than the Windows one?

~~~
jamiek88
Apple will fix that for free btw.

I took my out-of-warranty 2017 MBP in and had no issues getting a repair.

------
SMFloris
I use Linux for gaming almost everyday with Steam/Proton. It works flawlessly
90% of the time. I find no issues with my linux desktop whatsoever. It is
stable, I rarely restart it, updates so far didn't break a thing.

~~~
simonh
To be viable as a general desktop environment for non-technical users, that
has to be true not just for some users like yourself, but for all but a few in
a thousand users, across a very wide cross-section of available hardware.

~~~
AsyncAwait
> across a very wide cross section of available hardware

This is a sticking point I just cannot get behind. When you're purchasing a
computer running Windows, it is optimized for Windows. When you're purchasing
a Mac, it is _optimized_ to run macOS.

So it should logically follow that if you want hardware _optimized_ to run
Linux, you should purchase that specifically. Expecting Linux to work
flawlessly on any random junk is a feat you're not expecting of any other OS.

Therefore by that logic, for Linux to be good enough on the desktop, it has to
ascend to places no other OS does.

~~~
breakingcups
What? When I'm purchasing a computer, any computer, I know with (near) 100%
certainty it will run Windows flawlessly. I definitely expect Windows to run
on any random junk.

~~~
AsyncAwait
> What? When I'm purchasing a computer, any computer, I know with (near) 100%
> certainty it will run Windows flawlessly. I definitely expect Windows to run
> on any random junk.

So do you expect Windows to run flawlessly on a Chromebook? I'd guess not.
When you're purchasing random hardware in a store, it most likely comes pre-
installed with Windows and has been made for and tuned for Windows.

It's just that Windows has such market share that the vast majority of
computers are pre-installed with Windows and have drivers primarily for
Windows.

~~~
kkarakk
Same difference. If Chromebooks weren't actively hostile to Windows they'd be
running Windows too. It's easier.

~~~
AsyncAwait
If certain Windows laptops weren't actively hostile to Linux, they'd be
running it smoothly out of the box too, what's your point?

------
sandGorgon
> _Create a universal packaging format for bundling software which supports
> signatures, weak dependencies, isolation (aka sandboxing /virtualization),
> clean uninstallation and standard APIs to make it possible to integrate an
> application with your DE._

Well this battle is lost. There was always the RPM/DEB/PKGBUILD split. But
rather than unifying the standards, we now have Flatpak vs Snap split.

It is seriously frustrating that when distros can agree on core infrastructure
stuff like systemd-vs-upstart and wayland-vs-mir ... we still have a software
distribution split that is more political than technical.

This ultimately hurts Linux, because there's never going to be a clear
monopoly in the packaging space. Someone or other is going to say "I can
only package for X. All others can go figure it out themselves".

I don't know... maybe APK (Android) has won as the predominant Linux packaging
format? I'm kind of waiting for ChromeOS as the true Linux distro.

~~~
geezerjay
It's not a political issue if you want to force distros to move their entire
packaging infrastructure to a backward-incompatible format that might still
not be compatible with other distros, because they ship specific versions of
some packages which are, in addition, packaged in a very particular way.

Moving away from RPM/DEB/PKGBUILD is not a simple problem, because the problem
isn't really which packaging infrastructure is used.

~~~
sandGorgon
But I'm not talking about that. I'm talking about TWO competing FUTURE formats
that were invented to carry the battle forward.

Snap and Flatpak are already incompatible with what came before. They chose
not to come to a middle ground quite deliberately.

------
fghtr
Everyone for some reason expects that GNU/Linux should work on every hardware
configuration. I don't understand that, honestly. Why is there no such
requirement for MacOS? Why does Windows-certified hardware have to work
flawlessly with GNU/Linux? Just buy a desktop or laptop certified for
GNU/Linux and stop complaining.

~~~
Krasnol
> Why is there no such requirement for MacOS?

Probably because nobody expects much from MacOS anyway?

One relevant part of the Apple world is that it works within its narrow
world of hardware and software. That is how they sell it.

There is much, much more Windows-certified hardware out there. Much of it
isn't used (for Windows) anymore and is therefore cheap as hell, for example.
This is the hardware you'd expect your software to work with. This is where
it's worth investing time.

~~~
neuronic
> One relevant part of the Apple world is, that it works

You already figured out why people enjoy using Macs. It works, and I don't
need to cycle through three different system compositions before graphics,
audio and WiFi work.

~~~
Krasnol
I'm not sure if I misunderstood OPs comment or if your comment is in no way
related to the topic.

But yeah, thanks for the ad. I guess... ;)

------
wwarner
All this is true, but not really fair. Count up the issues raised by the OP,
and Nvidia really has the most power to stabilize the Linux desktop. I read
that gamers who want peak performance on Linux can get the proprietary Nvidia
drivers running well. So it's possible, but it will never be without the
friction of open source running alongside signed proprietary binaries.

What I would love to know is why linux isn't equal or superior to Apple and
Msft in power management.

~~~
shereadsthenews
Linux seems equal in power management to Apple. On my ThinkPad X1 Carbon with
the 57 W-h battery I get a good solid 8 hours of work and often the system
consumes as little as 4W. My MacBook Pro with a 54.5 W-h battery runs for at
most 4 hours. Now, it's true that I had to edit dozens of little configs to
get the ThinkPad with Linux up to where a ThinkPad with Windows is right out
of the box, such as enabling PCIe ASPM and making it still be enabled after
resume from suspend, but to my knowledge there are not such power efficiency
hacks available to the user on macOS, and it's bad by default.
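For reference, the kernel exposes the system-wide ASPM policy at
/sys/module/pcie_aspm/parameters/policy, where the active policy is the
bracketed token on a single line. A minimal sketch of reading it (the
parsing helper is mine, and whether the file exists and is writable depends
on kernel config):

```python
def active_aspm_policy(text: str) -> str:
    """Parse the one-line policy file the kernel exposes at
    /sys/module/pcie_aspm/parameters/policy. The active policy is
    the bracketed token, e.g.
    'default performance [powersave] powersupersave'."""
    for token in text.split():
        if token.startswith("[") and token.endswith("]"):
            return token[1:-1]
    raise ValueError("no active policy marked")

# On a real system you'd read the sysfs file directly:
# with open("/sys/module/pcie_aspm/parameters/policy") as f:
#     print(active_aspm_policy(f.read()))
print(active_aspm_policy("default performance [powersave] powersupersave"))
# → powersave
```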

~~~
asark
The biggest power efficiency hack for macOS is to avoid running JS anywhere
other than in Safari, and there as little as possible. Chrome and Firefox both
eat battery like crazy. Firefox worse than Chrome, but either will take a
couple hours off your battery life versus Safari. And running lots of heavy JS
tabs or "apps" will do even worse.

Not that that necessarily has anything to do with the differences you're
seeing in particular, just worth noting. Wrong browser, 20-30% worse battery
life. Wrong site open in some tab in the background, 30-60% worse battery
life. The overwhelming majority of what takes my Mac battery life from 8+
hours to 4 or less is Javascript or some heavy Java IDE or something. User
software that gives a damn about how much power it uses makes a huge
difference, well outside of just "don't run video games on battery power" and
other obvious stuff.

------
ekianjo
This article is wrong on numerous of its claims, either because some are just
false, or because some are skewed by a perception of how things "should" work
which is utter bollocks.

Like LTS not being suitable is complete nonsense. You can easily add a repo to
have up to date GPU drivers and never have to worry about it.

The claim that Steam has only indie games completely misses the Proton era.

I could go on and on. It seems like a hit piece from someone who is out to
prevent people from considering switching.

------
newnewpdro
That page is so riddled with inaccurate/mischaracterized statements I couldn't
even make it past the halfway point.

It's not without any truths, but there's a clearly demonstrated preference for
divisive negativity over communicating facts.

------
UncleEntity
> All native Linux filesystems are case sensitive about filenames which
> utterly confuses most users. This wonderful peculiarity doesn't have any
> sensible rationale. Less than 0.01% of users in the Linux world depend on
> this feature.

The horror...

Though I wonder why users would be confused since this only really applies to
the command line and there were (multiple) other points complaining about
users having to use the command line.

    
    
      user: cd Foo
      bash: cd: Foo: No such file or directory
      user: wget https://www.microsoft.com/en-us/software-download/windows10/win10.iso && echo "screw you linux!1!"

------
magicalhippo
I've been running KDE Neon on a NUC for a while, and Kubuntu before that since
the KDE 3.5 days. But it's still not ready to become my primary desktop.

The main issue for me is RDP-like (not VNC-like) remote desktop experience.
Without that I'm not even gonna try.

I mean the kind where I check my desktop before I leave for work, resume the
session remotely from work (when compiling or whatever), and pick up when I
get home. With performance which makes one forget it's a remote session, and
bidir clipboard sharing.

So until then, Windows on the desktop it is. I can always run Linux on the NUC
or in a VM.

~~~
blackfawn
I've been quite happy with NoMachine[0]. It has probably even more options
than RDP (file transfer and drive sharing options, various graphics and audio
options, etc.) You can even watch videos through it if your connection is
decent. You can pick up an existing X session or start a new one.

[0]: [https://www.nomachine.com](https://www.nomachine.com)

~~~
magicalhippo
I just tried, again. Even on LAN (same switch in fact) the performance is
worse than what I got when I was RDP'ing to my desktop in Norway from a hotel
in Hawaii. And it didn't forward audio at all.

I'll check it out from work tomorrow, but so far it's usable but not exactly
great when compared to RDP.

------
Dahoon
> _" most Windows 95 applications still run fine in Windows 10"_

Uhm, no. Not really. With hacks by the user (or OS) maybe, but if hacks are
allowed then most of that list can be deleted.

~~~
gsich
Yes, really. Most games do not, however. I don't have programs to test with,
though.

Edit:
[http://www.ecsis.net/pub/netuser/ftp.html](http://www.ecsis.net/pub/netuser/ftp.html)

tried Rftp, worked with Windows 10. At least 22 years of compatibility.

~~~
Dahoon
Testing a single .exe as simple as Calculator doesn't show much. How about
running StarOffice 4.0?

~~~
gsich
It shows compatibility, which you doubted in the previous post ("no, not
really"). Doesn't matter how simple you deem the program.

Where can I download it?

------
jaimex2
Steam's Linux support is fantastic these days. Almost everything works through
their Wine fork.

~~~
pjmlp
And yet it hardly matters.

[https://www.engadget.com/2019/02/19/linux-gaming-steam-
valve...](https://www.engadget.com/2019/02/19/linux-gaming-steam-valve-epic-
games-store/)

------
arianvanp
> multiGPU rendering for games in Linux is not currently possible in any shape
> or form (no games or solutions).

That's because CrossFire has been abandoned in favor of proper mGPU support in
the new Vulkan API, which is 100% supported on Linux. This is total FUD.

------
deafcalculus
The PS4 is basically a modified FreeBSD. So "fixing" the Linux desktop is
probably doable, but extremely difficult to do profitably. Valve seemed
interested when Windows introduced the app store and seemed to be going the
App Store / Play Store way, but now that Microsoft is losing interest in
Windows, they don't have as much incentive to pursue Linux.

~~~
ddebernardy
PS4 is not really comparable. (Nor is macOS, for that matter.) It's one thing
to get some *nix flavor to work on standardized hardware. It's another to make
it work on the wide variety of computing devices where you could install
Windows.

~~~
deafcalculus
True, but hardware compatibility isn't the only thing wrong with Linux
desktop. There's a lot of broken stuff on the software side too (ex: X windows
as the article points out). In fact, hardware compatibility hasn't been that
much of an issue on desktops in recent times. We have a couple hundred Linux
desktops (not laptops) at my university, and what I find is that if you're
using a last-gen Intel CPU with an iGPU on a desktop with Ethernet, hardware
compatibility isn't much of an issue at all.

------
rawmodz
Linux "sucks". True. For the last two weeks I was trying to install a distro
that would work with my Broadcom internet adapter, but no success. Also BT did
not work well. Of course I could get WiFi to work after a heavy and bloody
battle, and only at 2.4 GHz, but then BT stopped working. The point is that
all of those work fine under Windows 10. No hassle. My question is, why is
there no Linux distro that works just fine out of the box? I am not a systems
programming specialist, like 95% of PC users. Millions of people would like to
install Linux but just can't do it due to lack of knowledge. The average
person just wants to download, install and run a system, without hunting for
lost drivers and kernels or wasting time searching the internet for a
solution. I went through dozens of distros and could not find one that would
work with my ACEPC T11 mini PC. Windows 10 just does.

~~~
jon-wood
Broadcom chipsets are notorious under Linux, because support has had to be
reverse-engineered without help from Broadcom. The reason it works out of the
box on Windows 10 is that the drivers are fully supported, with Broadcom
engineers providing a full implementation of everything - if they did the same

~~~
AnIdiotOnTheNet
Linux isn't blameless for this problem, because the kernel developers
religiously enforce that drivers must be open source.

------
ensiferum
The problem is that a lot of these issues would require co-operation and
agreement from multiple different teams to have a solution that would cover
the _whole_ system. But in the Linux world nobody really has the authority to
_make_ things work and play nicely, and to manage and design the whole
system architecture. Distros do what they can of course, but they don't want
to deviate too much with custom patching from upstream. So their work doesn't
solve all of these problems.

The year of the Linux on desktop is always 10 years from _now_ ;-)

Ps.

I'm a Linux user and I quite like it on a desktop, wouldn't bother on a Laptop
anymore; getting things like suspend-to-RAM, hibernate, Bluetooth, and WiFi to
work reliably is just too much effort.

------
xvilka
"Linux has a 255 bytes limitation for file names (this translates to just 63
four-byte characters in UTF-8) - not a great deal but copying or using files
or directories with long names from your Windows PC can become a serious
challenge."

Not sure what he is speaking about.

~~~
vbarrielle
I'm curious how one gets long names from a Windows PC, given the path length
limits on Windows:
[https://stackoverflow.com/a/1880453](https://stackoverflow.com/a/1880453)

Maybe I'm not understanding that limit correctly but it looks like the whole
path cannot be longer than 260 characters. I've hit that limit when
transferring files from linux to windows.

As you mention, the Linux limit is in bytes, so the issue could appear with
some character sets, but it still looks like the path limits are a lot more
drastic under Windows.
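To make the byte-vs-character distinction concrete: Linux checks each path
component against NAME_MAX = 255 *bytes* of the encoded name, so multi-byte
UTF-8 names hit the wall well before 255 characters. A quick sketch of the
arithmetic (the helper name is mine):

```python
def fits_name_max(name: str, limit: int = 255) -> bool:
    # Linux limits a single path component to NAME_MAX = 255 bytes
    # of the *encoded* filename, not 255 characters.
    return len(name.encode("utf-8")) <= limit

print(fits_name_max("a" * 255))      # 255 ASCII chars = 255 bytes -> True
print(fits_name_max("\u6f22" * 86))  # 86 CJK chars x 3 bytes = 258 bytes -> False
```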

~~~
kkarakk
The Unicode version of the Windows API lets you use longer paths, up to
roughly 32,767 characters (it is handled transparently in Windows Explorer).

An example of a function that uses it: [https://docs.microsoft.com/en-
us/windows/desktop/api/fileapi...](https://docs.microsoft.com/en-
us/windows/desktop/api/fileapi/nf-fileapi-findfirstfilea)
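A sketch of the mechanism: the extended-length form prepends a \\?\ prefix,
which tells the Unicode ("W") file APIs to bypass the legacy 260-character
MAX_PATH handling and accept paths up to roughly 32,767 characters. The
helper below is hypothetical and illustrative only:

```python
def to_extended_length(path: str) -> str:
    """Prefix an absolute Windows path with \\\\?\\ so the Unicode
    ("W") file APIs accept it beyond the legacy 260-char MAX_PATH.
    Illustrative only: real code should also normalize the path and
    handle UNC paths, which use the \\\\?\\UNC\\ form instead."""
    prefix = "\\\\?\\"
    if path.startswith(prefix):
        return path  # already extended-length; idempotent
    return prefix + path

print(to_extended_length(r"C:\very\deep\tree"))  # \\?\C:\very\deep\tree
```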

------
stendinator
_For instance Microsoft and Apple regularly update ntoskrnl.exe and
mach_kernel respectively for security fixes, but it's unheard of that these
updates ever compromised the boot process._

I had updates rendering my computer unbootable by messing up the MBR at least
twice.

~~~
jaimex2
Exactly. You can't walk away from Windows 10 without the chance of coming back
to a bricked system because it auto-updated and broke.

~~~
sgc
This issue of Windows Update, and other lock-in trends, is 80% of the reason I
just (in the last month) switched to a Linux desktop. The other 20% is that
the UI on Linux has finally progressed enough that I could consider it the
least-worst option. That is a first; I tried many times before and wound up
reverting to Windows.

So far the only thing that really bugs me is that I have to apply fixes in the
terminal. Like installing a printer only to find the scanner function was
installed but wasn't configured - and the only way to get it to work is read a
few forums until you can figure out which of the myriad installed utilities
will actually return the info you need to configure the driver, etc. It's
often a few, because the first half dozen results are wrong!

Strangely enough, I don't find anything in Linux to rival Notepad++, WinSCP,
or AutoHotkey - just the type of thing I was expecting to find new and
improved. Those programs just have a better UI, and great functionality, to
get things done without going full command-line commando or buying into the
resource bloat of a full IDE. Right now I am using both Sublime Text and
Visual Studio Code, because they each have things the other lacks. I could
just use Notepad++ for everything before. Doubling my RAM made that less
painful than it would have been otherwise. Obviously a very personal itch, but
one I suspect a number of people who are on the cusp of accepting a Linux
desktop might have.

But I trust Linux's stability since I have had zero issues in decades of
administering my servers, and don't worry about updates the way I do on a less
informative OS. Specifically, once you are used to apt, the list of packages
that will be installed tells you a lot about what might break or not on your
system. And there will always be a way to revert to older libs if necessary,
which might not be true on a locked OS.

~~~
aepiepaey
You may want to try Filezilla or gftp for something similar to WinSCP.

~~~
sgc
I'm using filezilla, it really doesn't compare in terms of ui or even
functionality.

------
billfruit
With Linux, I do find that if you have a problem, finding solutions on the
internet is quite easy. Some distributions, like Arch, have such detailed
documentation, active forums, and IRC channels that finding help is not
difficult, unless you have some exotic setup.

OTOH, getting help for Windows issues is often more difficult. If you visit
Windows forums, the suggested solutions are often pretty generic; since one
can't peek under the hood, the amount of diagnostics possible is limited. Even
if a problem is solved in Windows by reinstalling or restoring from backup,
most often one does not know what exactly caused it.

------
rbanffy
I refuse to take seriously a person who complains of lacking Adobe Flash
support in 2018.

------
bayesian_horse
I'm really happy with Linux on the desktop. Yes, occasionally I have problems
with graphics drivers (NVIDIA), but normally not, both on an older desktop PC
and a laptop, and I really prefer it.

------
dschuetz
I believe that those problems are the price of free development. When
everybody can contribute anything, which is a good thing in itself, there is
no streamlining or refining of the end product. There are so many interfaces
and standards, often used in _parallel_, that it's actually an achievement
when a Linux system is still able to boot. Open source is a good thing, but
not in regard to user experience and stability. There is _no guarantee_ for
anything. When it works - it works. When it doesn't work - people will
hopefully fix it, or at least contribute resources. When it doesn't work and
nobody fixes it, what then? I wish I had the expertise to fix everything
myself if needed, but let's stay realistic. Nor do I have resources to spare.
I have to rely on others' efforts.

I've cross-compiled a Linux system from scratch for PowerPC once and it worked
great. It worked up until I needed to recompile glibc because of some bugs
threatening system security. Then everything went to hell. My prime interest
in Linux and open source ceased at that very moment. The dependency
system in Linux is a nightmare, because basically there is none that
guarantees consistency. How are cross- and circular dependencies even
possible? And that's the price for freedom - chaos. It's great that anything
works somehow. But there is no future to build upon.

------
sprash
Wayland was supposed to be the rescue from arcane, outdated X11, and it is
severely disappointing. The reason is that it does not offer enough features.
It should have font rendering, it should have window decorations by default,
it should have native UI widgets, and much more. None of this is offered, and
the responsibility is pushed off to others like GNOME, KDE and, believe it or
not, X11.

IMHO the monolithic kitchen sink approach of systemd should be applied to
wayland. Not the other way around.

~~~
Meing4im
I would argue that Wayland is still in beta. They are still designing
extensions to reimplement X11 desktop features. But with wlroots, things have
picked up pace.

~~~
AnIdiotOnTheNet
Beta for over a decade. That's Google levels of beta. When can we expect a
real wayland release then? 2035? When it will still be 10 years behind DWM?

------
Mikeb85
As imperfect as Linux is, it's still better than Windows' dumpster fire and
Apple's jail.

And as of right now I have an Acer Swift with an AMD Ryzen/Vega APU which is
running Ubuntu completely flawlessly. As in, everything works, all the time -
WiFi, suspend, function keys, plugging in external monitors, etc...

It could be better, but I can say that about every product I've ever owned
(and yes I've had an iPhone, it's the reason I never bought another Apple
product ever again).

------
AtlasBarfed
I think one of the disturbing parts of the linux ecosystem is how it is
starting to resemble the software turnover of a gigantic enterprise.

Gigantic enterprises have functioning systems, but the people that wrote them
eventually leave, and the complexity of the systems is such that when they
need to be updated, they are just rewritten from the ground up, with the usual
mixed bag of new bugs, new features, and missing features from the previous
regime.

This just seems to be what Wayland and systemd are, especially reading the list.

The number of distros is really really counterproductive.

Linux also DESPERATELY needs a massive hardware support information site with
graphical matrices, to help guide purchases. This alone might shame hardware
providers into sponsoring device drivers.

But realistically the window of Linux desktop adoption passed with Nadella
taking over Microsoft and starting to right the ship on Windows, even if it is
still utter garbage. There was a solid five years where Windows was being
utterly insane and that was the time to strike.

Linux probably should just concentrate its desktop resources into a near-
perfect clone of OS X, so it can at least unite forces with the Macintosh
people on usability and interface familiarity.

------
doctorpangloss
Well, a big problem here might be _engineering thinking,_ as opposed to
_design thinking_.

If you view the problem as, "I have to enumerate all the [Linux on Desktop]
problems, and if I enumerate enough problems and fix all the ones I enumerated
in the right order, then I've solved [Linux on Desktop]," then "Main Linux
problems on the desktop" is, by virtue of having done exactly that, and not
being the first or last iteration of that, part of the problem!

Not that I begrudge Linus Torvalds for saying things sort of like, Linux is
evolved, not designed. It just shows that one of the big holes it has is its
use on the desktop, and the major Linux designed product (Android) isn't
really libre, and that Linus is a brilliant guy but he doesn't have answers to
everything nor claims to.

Meanwhile take a look at elementaryOS, which I think has a lot of opinions
you'd never find in an evolved or engineering-rational platform (like their
own UI programming language). I think if they had the resources of a giant
corporation they could make a meaningful impact on the desktop market.

I'd say it's very similar to a debate I heard from head of an architecture
school attached to a university better known for its engineering. "When we
looked at expanding the campus, decisions were made in terms of parking spaces
per square foot, and whichever had the most parking spaces per square foot is
where we would build the building." It's not that he went around arguing that
his ambitious and less efficient designs are better just because he came in
with qualitative or emotional impacts incalculable by an enumerative cost-
benefit analysis.

He's just saying enumerating all the considerations, and then solving, is a
really reductive way of thinking about things.

------
xvilka
As usual, NVIDIA is a major cause of Linux desktop problems. What a surprise!

------
kakwa_
The Linux Desktop has problems, yes, but that's Ok.

We just have to accept that Linux will never be a mainstream desktop
environment.

A desktop environment needs a well-polished experience. And this experience
can only be created by centralized organizations with extensive resources:
developers, QA, designers, ergonomists, user-behavior studies, and a common
and consistent vision of what the experience should look like.

In the Linux DE world, a project can consider itself lucky if it has enough
developers.

Yet these developers are doing a wonderful job, especially given the lack of
resources, with applications that are more than usable and can solve 90% of
use cases.

Yes, Linux DEs are sometimes a little clunky, and the overall experience might
hit issues not solvable by the average user. But that's Ok; having a
well-polished experience would require an order of magnitude more resources,
and an organization far more vertical than the existing array of communities.

Diverting so many resources to the DE goal would be a mistake. Even if by
some miracle we managed to solve all the issues mentioned, we would be "X but
different/better", and this business model doesn't generally do so well; it
will not displace Windows/macOS.

Linux and the OSS ecosystem should remain what it is today: a powerful and
rich toolbox for building things, from giant web services or phones to vacuum
cleaners, cars or milking machines, and this toolbox should keep improving.

It doesn't mean we should not look around and see what is happening around us
in the DE world, but it should not be the absolute priority. Building a Linux
desktop should not jeopardize the use cases that made Linux a success.

Linux is a clunky Desktop Environment, yes, but this clunky environment has
enabled me to build software reaching thousands of people.

Thank you to all the devs who have built this (mostly) working experience.

------
stendinator
All this time spent creating this list could've been spent fixing at least one
single little thing.

~~~
sullyj3
Making the issues known isn't just pointless complaining. Everyone who reads
this might potentially be interested in contributing.

~~~
stendinator
Nice strawman. I never said it's pointless complaining.

~~~
petersellers
That's certainly what you implied with your post, though.

------
MivLives
I use Linux, but only in a VM. Every year or two I try to install it on actual
hardware. In the past ten years it has gotten so much better.

That said it is still not quite at the level I would use it as my main OS. If
my laptop stops working when I'm away from my desk I really don't want to have
to fix something broken. I need it to just work. And for all its many
shortcomings, Windows has been hammered on and written for a complete idiot,
which makes keeping it just working a lot easier.

I will keep monitoring the situation, of course, and the second I feel
comfortable enough I will make the switch. Until then I'll just keep using a
VM.

------
peterwwillis
There are a couple main problems with Linux as a desktop.

1) Hardware support, obviously. While one or two vendors will produce a
small number of Linux-compatible drivers, they don't do QC testing of all
their products on Linux, whereas they almost certainly do for Windows or Mac.

Distros, and OSS devs in general, have to support a wide range of software and
hardware in every possible configuration. This includes not only individual
components on a system, but how they are tied together, and the proprietary
extensions (keyboard buttons) that allow the user to operate them. But there's
no way any company could possibly test all software with all hardware. Even if
they did such insane amounts of testing, they'd need to pay someone to fix all
the bugs that would come out of it. No OSS company I'm aware of has the
bankroll of an Apple or Microsoft, to say nothing of all the hardware vendors'
investments.

Trying to support OSS on proprietary platforms is like trying to become the
development and support team for every such product in existence, and those
products are often black boxes. The only _reliable_ option is to pick a
distro, then find hardware which has been explicitly certified for that
distro. This is usually a short list, and becomes shorter as you try to find
something that fits your needs and budget.

2) An even more intractable problem: limitations of the software.

Do I need some software which is platform-dependent? Then I should use that
platform. Trying to shoehorn it into Linux is just a recipe for frustration
and support calls to your cousin's son Eddie who you heard is really good with
this Linux thing.

Then there's the difficulty of operating a system which is only designed to
work in a particular way. Want to use some software which doesn't have an
official package? Good luck figuring out how to install it. Have some problem
on the system? Good luck figuring out what magical combination of "console
commands" might make it work again. And don't even bother telling your ISP or
work that you use Linux when you call with a support problem, because they'll
just tell you to get bent.

Really, it all comes down to money. Nobody is spending the money for Linux to
become an officially supported desktop, because it would be unaffordable.
Linux will always be a hobbyist OS as long as nobody supports it.

------
cuillevel3
I find the 'how to fix Linux' chapter hilarious.

Anyhow, the article touches on a lot of the problem areas, and it would be
great if companies profiting from Linux would start investing in the desktop.

Regarding security, Linux has some advantages, like being open source and not
being an attractive target for malware.

I remember there was a Black Hat talk a few years back detailing the security
features of Windows. I'd love to see a comparison with Ubuntu or Fedora.

------
kgwxd
"Problems stemming from low Linux popularity"

That's the only real problem. Given popularity, everything else would quickly
stop being an issue.

~~~
shaan7
And popularity will happen only if the problems go away :P

------
blt
Linux will never completely shed its heritage as a software development system
for a machine hooked up to a teletype and a tape drive.

~~~
pjmlp
NeXTSTEP and macOS did it pretty well.

------
username231
> linux problems on the desktop

> Linux is administered by ssh

Right... on desktops, Windows is far worse with updates as far as breaking
things goes. We have to deactivate the internet on Windows 10 machines so the
updates don't break everything.

Microsoft does not care about breaking stuff with their updates at all. Ubuntu
is the polar opposite.

------
bb88
The article is not entirely accurate, and can be hit or miss. Granted, I'm
cherry-picking, but I'm cherry-picking the pieces of which I have direct
knowledge and which are definitely misleading.

> ! X.org architecture is inherently insecure - even if you run a desktop GUI
> application under a different user in your desktop session, e.g. using sudo
> and xhost, then that "foreign" application can grab any input events and
> also make screenshots of the entire screen.

A Linux user would say, "So that's not a bug, that's the power of the root
user. Don't do that or lock down your root access."

Then this one:

> ! The kernel cannot recover from video, sound and network drivers' crashes
> (I'm very sorry for drawing a comparison with Windows Vista/7/8 where this
> feature is implemented and works beautifully in a lot of cases).

To be fair, if you google "Nvidia BSOD" you'll get something like this from
four months ago:

[https://www.eteknix.com/nvidia-release-new-2080ti-drivers-stop-bsod-issues/](https://www.eteknix.com/nvidia-release-new-2080ti-drivers-stop-bsod-issues/)

And if you think Macs are awesome (the software might be, I don't know), the
hardware has some serious issues, which has turned me off from buying one.

[https://www.youtube.com/results?search_query=louis+rossmann](https://www.youtube.com/results?search_query=louis+rossmann)

Also to be fair, it's possible that Dell laptops have similar issues, but
Apple hardware failures make a lot of noise in the media.

~~~
swebs
>> ! X.org architecture is inherently insecure - even if you run a desktop GUI
application under a different user in your desktop session, e.g. using sudo
and xhost, then that "foreign" application can grab any input events and also
make screenshots of the entire screen.

>A linux user would say, "So that's not a bug, that's the power of the root
user. Don't do that or lock down your root access."

It's actually recognized as a huge problem and one of the driving forces
behind Wayland.
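
The quoted behavior is easy to reproduce. An illustrative session, not meant
to be run as-is: "mallory" is a hypothetical second local account, and the
desktop is assumed to be on display :0:

```
# As the logged-in desktop user, grant another local user access to the display:
xhost +SI:localuser:mallory

# Anything now run as "mallory" can screenshot the entire session...
sudo -u mallory xwd -root -display :0 -out /tmp/screen.xwd

# ...or watch every keystroke and pointer event via XInput2:
sudo -u mallory env DISPLAY=:0 xinput test-xi2 --root
```

Wayland closes this hole by not letting one client observe another's input or
output at all.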

------
dsego
A few killer apps and none of this would matter. Unfortunately, the linux
desktop has none, which is a shame.

------
mijoharas
One surprise for me is that there is no mention of Bluetooth. I have a
"reliable" way of connecting a Bluetooth headset: trying to connect between 2
and 10 times until it works. There are constant transient failures, which is
not what I expect from software in 2019.

------
sandov
I think it would be useful to separate these into problems that are Linux's(!)
fault and problems that are someone else's fault (e.g. NVIDIA, the hardware
market, intellectual property laws, the user himself).

(!) With "Linux" I mean GNU/Linux distros.

------
Octoth0rpe
> Android is not Linux

I really, _really_ hate this statement. Android meets the only definition of
'linux' that matters IMO: it runs on a linux kernel. That's linux, full stop.

That said, I do find myself agreeing with most of the points listed there. I'm
just not sure I'd call them 'linux on the desktop' issues per se. They're
wayland/xorg issues, deb/rpm/Flatpak/snap issues, pulseaudio/alsa issues,
gtk/qt issues, etc. None of these are tied to the Linux kernel, and all of
those technologies can and do run on BSD kernels (er, maybe not alsa).

Android is linux. Chromeos is linux. The issues above are really issues with
the free/libre desktop distributions, mostly not linux per se.

....maybe I'm just a pedant on this issue.

~~~
swebs
I'd just like to interject for a moment. What you're referring to as Linux is
in fact GNU/Linux, or as I've recently taken to calling it, GNU plus Linux.
Linux is not an operating system unto itself, but rather another free
component of a fully functioning GNU system made useful by the GNU corelibs,
shell utilities and vital system components comprising a full OS as defined by
POSIX.

Many computer users run a modified version of the GNU system every day,
without realizing it. Through a peculiar turn of events, the version of GNU
which is widely used today is often called Linux, and many of its users are
not aware that it is basically the GNU system, developed by the GNU Project.

There really is a Linux, and these people are using it, but it is just a part
of the system they use. Linux is the kernel: the program in the system that
allocates the machine's resources to the other programs that you run. The
kernel is an essential part of an operating system, but useless by itself; it
can only function in the context of a complete operating system. Linux is
normally used in combination with the GNU operating system: the whole system
is basically GNU with Linux added, or GNU/Linux. All the so-called Linux
distributions are really distributions of GNU/Linux!

~~~
kkarakk
/g/ on hackernews

------
jmakov
Switched to Windows in 2018 because my Killer WiFi card was not supported on
Linux. For the first time in 10 years I was able to suspend my laptop without
having to worry it would wake up in my bag and overheat.

~~~
syn0byte
Linux support sucks for a high-end network card that runs the Linux kernel
internally for TCP offload and acceleration*?

That's some deep irony.

*Their original product lines, at any rate. Newer wireless models don't
specify.

------
rejd
[https://lkml.org/lkml/2016/4/7/423](https://lkml.org/lkml/2016/4/7/423)

------
AtlasBarfed
If Google was truly not evil they would supply leadership and the billion
dollars.

AWS and Google alone make hundreds of billions on the back of Linux and give
back almost nothing.

------
Sangeppato
He's absolutely right. The problem is that Windows 10 sucks as well, just in
different ways (check out "why windows 10 sucks", from the same author)

------
Havoc
Yeah, this is why I stick to Windows for desktop. Just the thought of having
to fight PulseAudio and bizarre gfx driver issues gives me PTSD.

It's cool for servers though

------
Iolaum
I'll just chime in with something cool I heard once:

Linux is very user friendly, it's just picky about its friends.

P.S. Exclusive home linux user for 3 years and counting.

------
ilovecaching
I use GNOME (Fedora 29). It works fine, except for things that NVIDIA and
Cisco keep locked down (drivers and codecs). Other than that, GNOME is just
kind of a wacky desktop (Activities and the lack of a desktop launcher are
just stupid). But it does work, and I would be fine recommending it to my
grandma with some extensions installed to make the UX more like Windows.

~~~
ilovecaching
I should also mention that while Linux has improved, macOS has seriously
regressed. Bugs all the time, and if you want to plug your Mac into anything
not made by Apple, good luck.

------
michaelmrose
There are plenty of valid complaints, including many touched on in the
article, but there is also some unvarnished nonsense.

"Pulseaudio is unsuitable for multiuser mode - yes, many people share their
PCs (an untested solution can be found here)."

This would lead one to believe that multiple graphical logins playing sound
doesn't work. In actuality, PulseAudio runs as one process per user, and the
observed behavior is the same as switching users on a Windows machine: when
you switch to a user's graphical session, that user's sound comes out of your
speakers. Further, this allows it to be configured per user and to run without
superuser privileges. The only thing you can't do is, say, play music as one
user, switch to another user's desktop, and keep hearing those tunes alongside
the second user's applications.

In most cases you actually don't want random apps, which by design you can't
affect or silence, playing over your desktop. So this is the right decision in
every possible way.

"No reliable sound system, no reliable unified software audio mixing
(implemented in all modern OSes except Linux), many old or/and proprietary
applications still open audio output exclusively causing major user problems
and headaches."

The last application I recall having an issue with grabbing the sound device
directly was a 15-year-old version of Skype. PulseAudio does all of the above,
and since it is and has been the standard, desktop apps are expected to
integrate with it; by and large that has been the case for about a decade now.
The fact that somewhere out there old, broken, non-compliant apps exist isn't
a compelling argument. All popular platforms have a mixture of crappy and
useful apps. People generally deal with this by using their favorite search
engine to find good applications for their task.

"What if the user decides to switch from Windows to Linux when he/she already
has some hardware? When people purchase a Windows PC do they research
anything? No, they rightly assume everything will work out of the box right
from the get-go."

They "rightly" expect that hardware the manufacturer doesn't want to support
on Linux will be reverse engineered by volunteers, because they already
invested their money in a manufacturer that only supports Windows. When the
kind of person who could help you out makes $200k at Google, it turns out that
$50k worth of their time is not available to protect your $99 investment in a
printer.

If you pay $15 a month for Hulu + HBO, $10 a month for Spotify, and $14 a
month for Netflix, plus $150 a month for cable, then over the next 5 years you
will pay:

$900 for Hulu, $600 for Spotify, $840 for Netflix, and $9,000 for cable.

If you paid for all 4, you bought a halfway-OK used car. Perhaps a fraction of
that might buy a more polished experience you don't need to piss and moan
about.
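
Those five-year figures check out (60 months at the monthly prices above):

```shell
# 5 years = 60 months; multiply each monthly price out
for item in "hulu+hbo 15" "spotify 10" "netflix 14" "cable 150"; do
  set -- $item                  # $1 = service, $2 = monthly price
  echo "$1: \$$(($2 * 60))"
done
# → hulu+hbo: $900
# → spotify: $600
# → netflix: $840
# → cable: $9000
```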

Ultimately everyone will eventually upgrade their machine, and almost all
hardware a consumer would encounter or consider can also run Windows. If you
are considering switching to Linux on your current machine, consider instead
making Linux a part of your NEXT purchase. If you don't like it, you can
easily turn around and put Windows on it, or hey, put both.

------
dcbadacd
Whoa, this article has brought up some painful memories, and I agree with so
much of it, and yet I can't imagine using Windows; it pisses me off even more.

> Most distros don't allow you to easily set up a server with e.g. such a
> configuration: Samba, SMTP/POP3, Apache HTTP Auth and FTP where all users
> are virtual. LDAP is a PITA. Authentication against MySQL/any other DB is
> also a PITA.

Just thinking about it gives me PTSD. I do not want a thousand users or
folders on my system; they're a pain to migrate, and it's really a PITA to
make everything virtual. Could someone please give me a nice short guide on
how to set up Dovecot so that it stores all e-mails in PostgreSQL?
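
For what it's worth, Dovecot doesn't store mail bodies in PostgreSQL (mail
lives in maildir/mbox/dbox), but virtual users authenticated against Postgres
are well supported. A minimal sketch, where the database, table, and column
names are my own assumptions:

```
# /etc/dovecot/conf.d/auth-sql.conf.ext
passdb {
  driver = sql
  args = /etc/dovecot/dovecot-sql.conf.ext
}
userdb {
  # all virtual users map to one system user owning the mail store
  driver = static
  args = uid=vmail gid=vmail home=/var/vmail/%d/%n
}

# /etc/dovecot/dovecot-sql.conf.ext
driver = pgsql
connect = host=localhost dbname=mailserver user=dovecot password=secret
default_pass_scheme = SHA512-CRYPT
password_query = SELECT email AS user, password \
  FROM virtual_users WHERE email = '%u'
```

Not a full guide, but it's the shape of the "everything virtual" setup the
article claims is a PITA.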

> KDE is spiralling out of control (besides, its code quality is beyond
> horrible - several crucial parts of the KDE SC, like KMail/akonadi, are
> barely functional): people refuse to maintain literally hundreds of KDE
> packages.

KMail has had so many bugs for me; I should report them, but man, it's such a
pain in the ass. Such ridiculous bugs too (e.g. a connection loss spawning
1000 error boxes, so many that my GPU can't handle the layered transparency).

> ! Linux security/permissions management is a bloody mess: PAM, SeLinux,
> Udev, HAL (replaced with udisk/upower/libudev), PolicyKit, ConsoleKit and
> usual Unix permissions (/etc/passwd, /etc/group) all have their separate
> incompatible permissions management systems spread all over the file system.
> Quite often people cannot use their digital devices unless they switch to a
> super user.

In theory they're all separate things, but they interleave so much, there has
to be a better system.

> ! No equivalent of some hardcore Windows software like ArchiCAD/3ds
> Max/Adobe Premier/Adobe Photoshop/Corel Draw/DVD authoring applications/etc.
> Home and enterprise users just won't bother installing Linux until they can
> get their work done.

I really miss SolidWorks/Fusion 360; there's no effort at all to port those :(

> ! Open source drivers have certain, sometimes very serious problems
> (Intel-!, NVIDIA and AMD):

Flickering windows, low FPS, laggy videos: an everyday struggle :( Changing
the compositor's OpenGL mode fixes it, though.

> ! An insane number of regressions in the Linux kernel, when with every new
> kernel release some hardware can stop working inexplicably. I have
> personally reported two serious audio playback regressions, which have been
> consequently resolved, however most users don't know how to file bugs, how
> to bisect regressions, how to identify faulty components.

I was affected by this too; QEMU was broken for me for almost half a year.

There's one thing that's very subjective in the article, even if the author
claims otherwise. I like to call it Winduslexia; it doesn't afflict anyone
other than former Windows users:

> All native Linux filesystems are case sensitive about filenames which
> utterly confuses most users. This wonderful peculiarity doesn't have any
> sensible rationale. Less than 0.01% of users in the Linux world depend on
> this feature.
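
For readers who haven't hit this, a minimal shell demonstration of the
behavior being complained about:

```shell
cd "$(mktemp -d)"                # fresh empty directory
touch readme.txt README.txt      # two names differing only in case
ls -1 | wc -l                    # → 2 distinct files on ext4/btrfs/xfs
```

On a default-configured NTFS or APFS volume the second `touch` would hit the
same file and the count would be 1.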

------
benbristow
TL;DR is here for anyone wondering:
[https://itvision.altervista.org/why.linux.is.not.ready.for.t...](https://itvision.altervista.org/why.linux.is.not.ready.for.the.desktop.current.html#Summary)

------
hutzlibu
So many words...

old, but sadly still valid:

[https://xkcd.com/456/](https://xkcd.com/456/)

