Ubuntu squeezes more performance than Windows 11 on new AMD Zen 4 Threadripper (phoronix.com)
351 points by marcodiego 5 months ago | hide | past | favorite | 388 comments



Not surprised though, with AMD being able to patch the Linux kernel and scheduler to its liking in order to squeeze out the maximum performance for its architecture, versus depending on Microsoft to have the goodwill to figure it out on their own dime.

I'd be curious if the businesses buying these $20k+, 96-core, HEDT workstations actually do run Windows on them in order for this to be that big of a problem for Microsoft worth addressing, or if it's Linux all the way anyway for them so Windows isn't even on the radar.

Anyone here in the know?

Also, obligatory: "But can it run Crysis?" (in software render)


I've worked at companies where we buy these types of workstations for FEA/CFD/ML.

If the company has an overbearing IT department, we run Windows for "security". At places where we move fast and break things, we run Ubuntu and IT tells us to manage it ourselves.

For anyone wondering why we don't use the cloud or a server: we do, too. But model setup, licensing, and small jobs are easier and quicker to do locally.


> "...depending on Microsoft to have the good will to figure it out on their own dime."

They don't have to figure it out on their own; everybody maintains close contact. For example, Intel has a field office literally one block away from Microsoft's main campus. One would assume AMD also has a field office similarly nearby. (EDIT: they do, a couple of blocks further.)


One of the major markets for these would be visual art studios. They aren't generally Linux shops.


Funny you mention this because a lot of big visual arts studios use Mac for the artwork in the Adobe ecosystem, and heavily invested in Linux for the 3D rendering pipelines as they moved away from SGI/BSD with Windows almost nowhere on the radar.

I'd estimate the number of big-name shops running Windows is a minority, with only smaller indie studios like Corridor Digital being all-in on Windows because it's a jack-of-all-trades, can-run-anything OS, and managing that one ecosystem without sysadmins and an internal IT department is a lot cheaper and easier for small businesses of amateur-professionals.


The majority of professional movie and television VFX work is now done on Windows and Linux computers, with Linux especially being used for the rendering portion of the process.

Mac was the dominant platform for a long time, but Windows caught up and zoomed past Mac about a decade ago. Same or better performance, cheaper hardware, cheaper software, easier to upgrade/repair, and more choice for all of the above.

There's a reason that Apple has to pay big bucks to Olivia Rodrigo and others to use Apple products to film and edit their music videos: it's because Apple fell out of favor with creatives and Apple is trying to buy its way back in.


I think Apple lost a lot of goodwill in the market with the disaster that was the initial release of Final Cut Pro X (and probably rightly so).

And now I see quite a few people moving away from Premiere to Resolve, which seems to take a pretty platform-agnostic stance.


My experience is exclusively with small studios, which were Windows shops.


Yup - A while back (not sure if changed much since), Weta Digital people would talk about being pretty much mostly Linux for modelling/rendering/etc and heavily Mac for audio. Very little Windows.


"a lot" is nowhere near "all"


And none is nowhere near "a lot".


None is nowhere, a lot is somewhere and all is everywhere. sort of.


The improvements coming for Wayland's HDR/color management are likely to help with that; the features they're aiming for appear to beat Windows' more slapdash implementation by offering per-window color management, where window contents are accurately tonemapped and composited within the widest color space the monitor supports.

Adobe would need to be incentivized to port their suite over for it to be taken seriously, but maybe Wine could bridge that gap at first.

Ah, the mythical Linux Desktop. One day...


Will Wayland be able to render mixed HDR and SDR content correctly, with e.g. an HDR video on YouTube rendering with extended range while the rest of the screen renders as normal?

Currently only macOS can do that. With Windows you have to choose between SDR and HDR display modes which affects everything on screen regardless of type, which makes SDR content look dingy in HDR mode.


On that subject, anyone know why shifting to HDR dims everything that way? My mental model of it is that SDR brightness goes from 0 to 100, and HDR brightness goes from -100 to 100, and that turning on HDR moves everything not HDR-aware down to the bottom of the brightness space.

I could look this up, but never think about it outside of conversations like this and figure it might be more fun to talk about it.


Consider that your display can only do 0-100 brightness (not really but for sake of argument)

In SDR, you map the full SDR range (also 0-100) to that 0-100.

When you add HDR, you’re now adding levels above 100 (let’s say 0-200).

If your display can only do up to 100, you now need to put all the 0-100 stuff in 0-50. Or you get a display that can also display 0-200.

Very few computer displays can go beyond a standard SDR range of ~500 nits.
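The toy model above can be sketched in a few lines of Python (all numbers are the illustrative units from the parent comment, not real nits):

```python
# Toy model of why SDR content dims when HDR mode is on, assuming a
# panel that tops out at 100 (arbitrary units) and an HDR signal
# range that extends to 200. All numbers are illustrative.

def sdr_mode(sdr_level, panel_max=100):
    # SDR-only mode: the full 0-100 SDR range maps to the full panel.
    return sdr_level / 100 * panel_max

def hdr_mode(level, signal_max=200, panel_max=100):
    # HDR mode: the 0-200 signal range is squeezed into the panel,
    # so SDR white (100) now lands at half the panel's brightness.
    return level / signal_max * panel_max

print(sdr_mode(100))  # 100.0 -> full brightness in SDR-only mode
print(hdr_mode(100))  # 50.0  -> same SDR white, now half brightness
print(hdr_mode(200))  # 100.0 -> only HDR highlights reach the top
```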


But why isn't SDR scaled to 0-200, then? That is, why isn't "fully bright SDR" mapped to "fully bright HDR"?


Well, it's pretty arbitrary. What is missing from most image (or sound) data is metadata to say what physical intensity is represented by the signal. I.e. how many nits (or decibels) should be emitted for this signal.

AFAIK, most encoding standards only define a relative interpretation of the values within one data set. And even if standards did have a way to pin absolute physical ranges, many applications and content producers would likely ignore this and do their own normalization anyway, based on preconceptions about typical consumers or consumer equipment.

To have a "correct" mixing, you would need all content to be labeled with its intended normalization, so that you can apply the right gain to place each source into the output. And of course there might be a need for user policy to adjust the mix. I think an HDR compositor ought to have gain adjustments per stream, just like the audio mixer layer.
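A minimal sketch of that idea, with hypothetical names and made-up reference levels: each stream gets a gain that places its reference white at a chosen point in the output range, much like channel faders on an audio mixer.

```python
# Hypothetical compositor sketch: per-stream gain places each source's
# reference white at a common output level (all numbers illustrative).

def place(stream_levels, reference_white, output_white):
    # Scale so this stream's reference white lands at output_white.
    gain = output_white / reference_white
    return [level * gain for level in stream_levels]

# An SDR clip (white = 100) and an HDR clip (white = 100, peaks at 400)
sdr = place([50, 100], reference_white=100, output_white=200)
hdr = place([100, 400], reference_white=100, output_white=200)
print(sdr)  # [100.0, 200.0] -- SDR white matches the HDR reference white
print(hdr)  # [200.0, 800.0] -- highlights extend above it
```

Without metadata pinning each source's reference white, that first argument is a guess, which is exactly the problem described above.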


What context are you talking about?

In single mode where it’s just SDR, it’s mapped to take the full range of the display up to whatever is deemed a comfortable cap.

In a mixed HDR/SDR mode, the H is the range above the S, so it doesn't make sense to scale it up. I prefer Apple's terminology of Extended Dynamic Range because it makes clearer that it's the range above the SDR range.

Now you could say that you intend for that SDR to be treated as HDR, but without extra information you don’t know what the scaling should be. Doing a naive linear scale will always look wrong.


Because Windows maps SDR content to the sRGB color space, which nobody except designers uses. Most monitors today ship with a much brighter, higher-contrast, vivid color profile by default. If you toggle your monitor to its sRGB profile, you should see colors that look really similar to SDR content in Windows' HDR mode.

I don't like it either. But there is surely no way for Microsoft to choose a color profile that matches what every monitor looks like with HDR off, given how many monitor manufacturers there are in the market. I think choosing the safest sRGB option is understandable.


Hopefully, DCI-P3 will be standard in the future.

I have a monitor with 187% sRGB, 129% Adobe RGB and 133% DCI-P3 gamut volume. But to get correct sRGB colors with maximum coverage on the monitor I need to clamp the ICC profile via Novideo sRGB. Without it, sRGB content looks oversaturated in the orange spectrum.


It's more like SDR goes from 0 to 255 and HDR goes from 0 to 1024. In SDR mode, 255 = (say) 500 nits while in HDR mode 1024 = 1000 nits and thus 255 = 250 nits so SDR content looks dimmer.


A better way to describe it, IMO:

SDR goes from 1-100. HDR goes from 0.01 to 100. Twice as many orders of magnitude difference from bottom to top. So if you peg the top to max brightness in both cases, the HDR looks brighter because the contrast is bigger.

(Note that this is an analogy. In other words, it's wrong, but a way of looking at it)
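Putting the analogy into numbers (using the made-up ranges above):

```python
import math

# Contrast expressed in orders of magnitude (log10 of top/bottom level),
# using the analogy's illustrative ranges.
sdr_orders = math.log10(100 / 1)     # SDR spans 1..100
hdr_orders = math.log10(100 / 0.01)  # HDR spans 0.01..100

# With both tops pegged to max brightness, HDR covers twice the
# orders of magnitude, so it reads as having far deeper contrast.
print(sdr_orders, hdr_orders)
```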


I am super excited for this, Wayland seems extremely promising. I'll probably try Fedora out again soon - my last experience with Wayland was GNOME and it was super nice.


We sell systems based on these high-end workstations and yes they do run Linux, Rocky Linux 8 at the moment.

Still stuck on Xorg because reasons, but for HDR monitoring you're usually using a display interface card with SDI or HDMI outputs anyway.


Pretty sure Microsoft is very cooperative on working with AMD and others on this stuff, it may just take a little longer in this case.


> versus depending on Microsoft to have the good will to figure it out on their own dime.

Cutler stated in a recent interview that he had a 96-core machine as one of his two daily drivers. I wondered at the time if he pre-announced that CPU since the press about it seemed to reach me a few days later.


Epyc has had 96-core cpus out for a bit over a year now, so probably that.


Epic Games runs mostly Windows Threadrippers for Unreal Engine development. Compiling Unreal faster (or anything, really, even Windows itself) is a compelling argument.


The portion running a hypervisor would also be interesting to know. That’s one hell of a processor.


Back in the day I used Linux for rendering my Blender animations because my tests had shown it was 10% faster than the same machine running Windows 10. With render times of more than 8 days this quickly paid off.


I noticed that AMD/Xilinx Vivado synthesis runs 20% faster on WSL2 than on Windows. After seeing such differences, using it on plain Windows becomes unpleasant.


Yeah, when I was doing more of that sort of thing I'd generally use ISE or Vivado on a big Ubuntu server and VNC or X-forward it to my machine (the X forwarding could be a bit janky with Vivado but it was OK for getting things going in a pinch).


In the context of Linux adoption on the desktop it is not big news, unfortunately. To my mind, the overall end-user experience is already better on Linux than on Windows, where similar tasks are more complicated, even before the raw performance starts to matter.

Better performance would be a prerequisite indeed - no one wants to move to a slower OS - but then there must be other convincing reasons to drive desktop user adoption. And no roadblocks.

Edit: sorry, rephrased a lot, but the core idea is hopefully the same.


> And yet, users don't convert to Linux. So there must be other reasons, and performance benefits don't seem to work so far.

For enterprise desktop purposes:

One central reason is that there is no Excel for GNU/Linux. Accept it or not: the workflows of lots of departments are deeply intertwined with Excel (and it would take an insane amount of work to replace Excel by some other spreadsheet application). Another important reason is that GNU/Linux is not some singular operating system, but a proliferation of various different distributions - this is not something that enterprise customers like.


We run thousands of Linux desktop workstations from low-end workstations for developers to dual-socket machines with >1TB of memory at work. That's primarily because productivity for our use cases is so much higher with a Linux/Unix environment than Windows that it's not even remotely funny. However, almost all users still have either the standard corporate-issue Windows laptop, or a Windows VM on their workstation. Practically exclusively for Desktop Excel-based administrative tasks.

Generally speaking, the vast majority of our userbase likes their Linux workstations far better than their Windows machines, and that's after experiencing the significant downgrade to Gnome desktops due to Red Hat removing KDE.


As a dev, I wish I worked there; IBM was the only other place I worked where many devs, and even a few thousand ordinary users like marketing, ran Linux. Also, I wasn't aware Red Hat removed KDE, what an odd and crappy move. KDE rocks, Gnome doesn't. RPM-based distros sooner or later corrupt themselves; I run and recommend Ubuntu or Kubuntu (for KDE people), with Snap disabled via a small apt preferences file.


Would it? I exclusively use Office 365 in the browser for all my Excel needs. It works perfectly, even if it's a bit sluggish.

But I admit, Excel is a side hustle for me.


I rarely, if ever, use Excel. But I find that Outlook in the browser wipes the floor with local Outlook, be it "OG" or "new" Outlook. Everything is much smoother, there are no window widgets crapping up (right now, the min / max / close and window title of my new outlook are black on a dark grey background - I'm using the windows dark theme). This holds even in Firefox on Linux – it's how I mostly interact with MS Office.

What's also fun to me is that the other day, I tried "opening in Word" a .docx attachment I received in my locally-installed "new" Outlook. It didn't even try to open my locally installed Word or ask me anything. It uploaded the file to OneDrive without asking and proceeded to launch Word online to read it.


The end user experience is way better on Windows than Linux. And I say that as someone who uses Linux (Ubuntu GNOME) daily. You really have to value the freedom of choice that FOSS brings to appreciate Linux. If you don't care, as most people don't, you're probably better off with Windows.

Of course you can counter my reasoning by listing a lot of problems with Windows. And I can assure you that most people don't care about the stuff you (as a Linux user) would care about.

Windows just works. It has Office, Excel, Word, PowerPoint. It's familiar. Instructions for any kind of software contain Windows instructions. Friends and family can help you if you have problems. Etc.

(Windows is the only OS that can handle different fractional scaling factors properly for multiple displays. Fractional scaling on Linux is a joke, and macOS doesn't care about low DPI screens at all so everything looks terrible.)


I agreed with this for a long time, but in the past 4ish years I no longer agree. On Windows I regularly run into frustrating issues with audio, drivers, printers, random odd errors.

On linux (currently PoP OS, first on a System76, now a Dell) I have not been running into issues. The biggest issue I ran into was strange Wifi connection issues, which System76 had me submit logs for and identified it as a failed Wifi card. But my audio, drivers, printer, scanner and day to day tasks just work.

Yes there are features that are missing on one platform or another. But day to day is now much more pleasant for me on desktop linux as I run into far less frustrating issues.


I'll reinforce what the parent is saying: for most users, Windows just works. Although I don't like Macs too much, they also just work.

I'm a Linux fan, but I can get myself productive in a new Windows install with no grief. For Linux, I always need to go online to remember "that config change" I had to make to get things working for me - and even then, from time to time I still have to tweak and work around a problem or two as I add more stuff to my computer. What you just wrote as your biggest issue is actually a fairly common Linux experience - submit or search logs from some piece of hardware to diagnose some little glitch.

I'm willing to put in the time, but nobody else in my family is.


Windows is not fault-free. But for most people (>95%) it works without issues.


Reliability is not that high. But Geek Squad can help when it doesn't work.


I'm not sure I agree that "most people don't care about the stuff you (as a Linux user) would care about". Some people put up with Windows problems because they don't know it could be different; to them, that's just how computers work. Take automatic updates, for example. More than once I've heard from a friend how they lost unsaved work or an overnight render because their PC rebooted for updates. On Linux, meanwhile, months-long uptimes are the norm, and you can trust it won't reboot on you out of the blue. Same with ads and other Windows annoyances.

Other people are bound to Windows by their software needs. You're right that many important programs are just not available on Linux (to your Office example I'd also add the Adobe suite). But the end user experience of Windows itself has been downright abysmal for a long time now, and for casual users who don't need much besides Chrome and a couple other Electron apps, modern Linux desktop might genuinely be the better choice.


I'd say the windows experience is maybe more familiar, but not better. As another commenter said, people are just used to computers being crappy the windows way. And even that familiarity may not be all that great with all the things moving around in windows 11.

As another commenter has said, window management is a shitshow, and I don't even mean "missing X feature I love from this other WM". Virtual desktops are broken. You have UAC windows sometimes going to the front, sometimes staying in the back. Sometimes they're at the front covering your old window but they don't have focus, even though the caret is blinking. You get windows maximizing behind the taskbar (Teams). Window management is handed-off to applications, so a broken application will poison the rest of the window management.

> Windows is the only OS that can handle different fractional scaling factors properly for multiple displays

I don't know what you mean by "properly", likely not the same thing as I do.

That it kinda sorta works as in "can turn it on"? But even basic OS things are borked. Have you tried opening the start menu after switching scaling factors?

My use case: I have a laptop running at 100%, and an external monitor running at 150%. If I boot it up this way, the start menu looks fine. If I plug or unplug the external screen while running, the start menu breaks. But here's the kicker, which shows the quality engineering: pressing the start button shows an OK menu. Start typing looking for something, and it becomes blurry! This is under windows 11 with all the latest updates.

Also, Wayland on Linux supports this, too. I don't use Wayland, but I understand that whatever problems there are, as in "not all apps are compatible" is the same problem with windows: the app has to cooperate. Try opening some configuration panels in Windows on a 200% screen, and you better have some good glasses on hand.


> The end user experience is way better on Windows than Linux.

Window management in Windows 10 is so terrible that it makes TWM look modern.

Cluttered title bars, so you need to aim for some free space to be able to move the window to another monitor; 1px window borders - good luck resizing windows on HD and UHD monitors. Gray on gray. Terrible scrollbars.

And the best one: Programs keeping files open after you close them.


20% extra performance matters a lot to anybody buying a $5000 Threadripper machine. If they didn't care about performance, they could have saved a lot of money...


Tangent

I have this suspicion that Windows has basically a layer of analytics tracking over the top of the DE that slows the system for most desktop uses, and that they essentially "tunnel" through that layer when a program actually needs the performance. Like gaming or exporting a video project.

This is why Windows is able to benchmark high for specific tasks, but for overall usage it feels very slow compared to every linux desktop.

Am I crazy or is there a nugget of truth there?


Brother we have known this for years.


That's good to know! Is there any formal proof of said theory? Like what is this analytics layer called, and is the tunneling something that devs are aware of? Just interested in putting some names to this theory so I don't sound like a loon when I try and describe it.


Oh it's definitely still a theory - but we've collectively proven it through our anecdotal experiences over the last 20+ years of using Windows.


That is not how it works lol


It doesn’t actually work that way, but the net result is the same. Windows is a Trojan horse that presents itself as an OS but is really just a delivery mechanism for crap from Microsoft. OneDrive. Cortana. Edge. It’s all so invasive.


I hope the community really gets behind Snap or Flatpak or one of the other systems for bringing a modern permissions and privileges system to the Linux desktop. It would help me be a lot more comfortable recommending Linux to non-technical people.

What distro would you recommend to somebody like me who wants to be asked before an application gets access to my location, microphone, camera, network, etc...?


Snap can die in a fire.

Flatpak... I'll warily try it for "user facing" apps.

But after a clusterfuck of trying to get a flatpak Jellyfin to work (HW accel fell over, and it wanted me to dumpster-dive into Flatseal to unsandbox it so that media OUTSIDE OF HOME could be accessed... I never got around to even getting it to start at boot), I'll shy the hell away from it for daemons.


As someone who has been using Linux exclusively for my personal machines since 2007, here's the chicken-and-egg problem that the Linux world is constantly battling: new technology is always being released and pushed out before it actually works.

It's happened constantly since I started using Linux: KDE 4.0, PulseAudio, btrfs (to be fair, most distros didn't really push btrfs as a default), Wayland, etc, and now Flatpak and Snap.

And it's not to say that I don't understand the dilemma: you need to get user feedback to improve the tech, and the pool of Linux users is very small (for desktop stuff). But what that means is that desktop Linux is almost always in a state of partial brokenness. (And please also note that I'm giving a 100% pass on driver issues--that's a given when hardware manufacturers simply only care about working with Windows.)

Honestly, when it comes to "just working," desktop Linux is worse today than it was when I first installed it on my budget laptop in 2007. Wayland is mostly fine for my uses, but others still struggle with its missing features; snaps/flatpaks are slow, bloated, and have strange behaviors because of the permissions models that are NEVER going to be obvious to a casual computer user; and managing an Ubuntu install is still really bad UX.

As an example of the latter, I usually run an "expert" distro on my personal machines, but I decided to slap Ubuntu on a PC that I hooked up to my TV, and after a year or so of just chugging along and running updates whenever prompted, it started displaying errors that my boot partition was full. You've gotta be kidding me... THIS is supposed to be the "easy" distro for normal humans? How the hell do you expect non-tech people to micromanage the /boot partition and delete old kernels and shit?
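(For anyone hitting the same wall: the usual manual fix on Ubuntu is a single apt command to purge old kernels and free /boot - which, per the point above, a casual user should never have to know exists.)

```
# Remove old kernel packages and other no-longer-needed dependencies,
# freeing space in /boot (standard apt command on Ubuntu/Debian):
sudo apt-get autoremove --purge
```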

So, while I agree with all of the technical issues and complaints about things like permissions, security, etc, that advocate for Wayland and Flatpaks/Snaps and whatever, you can't pretend to care about adoption while also pushing out half-baked technology to "stable" versions as though everyone will be happy with beta-at-best functionality.


I think part of it is because a lot of these more user-oriented features (desktop compositing, sandboxed app stores, hdr, etc.) don't get a lot of attention on Linux until after the two major commercial operating systems get them and prove people actually want them.

The monied interests driving a lot of Linux development do not have the same need for these things as end users. To them, Linux is a server, or a lightweight embedded OS, or needed for some other specialized use case.


That's absolutely the issue. Linux on the desktop is extremely niche, and as such, it doesn't get much attention.


I'm not sure what distros you have been using, but Linux Mint and Debian (if you set it up right) are extremely stable. Snaps have been slow, but I have never experienced that with Flatpaks; a reasonable critique is that they take a lot of disk space and too much time updating the different versions of underlying tech like GTK or Qt.

Ubuntu has not been a good distro for a few years now. Even Fedora is more stable now and you get new software.

We've never had as good a Linux desktop situation as we have now. The biggest complaint is that if you want to use graphics-optimized programs you kinda have to use Nvidia, and that's still problematic on Linux.


My experience does not match your assertions. The Linux desktop has never been more accessible.

My mother-in-law has been happily running Manjaro with GNOME for a while. Now, truth be told, she was navigating GNOME 2 Ubuntu more than a decade ago, but things were mostly not working back then. Upgrading versions was a minefield with mostly terrible results.

Maybe the state of things suited you better somehow. I just can't imagine how.


> Honestly, when it comes to "just working," desktop Linux is worse today than it was when I first installed it on my budget laptop in 2007.

I bought a new Alienware PC in summer 2022, installed Ubuntu immediately, wiping Windows. Everything worked. Still does. This happens on every new Dell (or Alienware), HP or Lenovo desktop or laptop I've bought the past 2 decades. I can't buy a new machine that has issues with Linux, what on Earth could I be doing wrong that they all work.

Honestly, do you work in Microsoft marketing.


> Honestly, do you work in Microsoft marketing.

I shouldn't feed the trolls, but I'll take the bait this time.

Read my comment again. In it I literally say that I run Linux exclusively on my personal machines. I don't have a Windows computer in my home and the only Mac I have that's still running macOS is a work-issued laptop. What are your Linux-never-Windows credentials? Because I bet they don't beat mine unless you're just significantly older than me and have been doing the same as me but for longer.

If one were to read my comment history, they could only conclude that I'm a huge Microsoft hater/skeptic.

Given that the ONLY mention of Microsoft or Windows in my comment was a statement that hardware manufacturers only test their drivers with Windows, this comment is just a strange emotional reaction to my point.

> Everything worked. Still does.

Read the rest of the thread about Flatpak and Snap. Read people's complaints about Wayland. Dig up some old discussions about PulseAudio and Xorg and how we used to have to hand-edit xorg.conf files to set up multiple monitors. If you insist that there are no problems, shortcomings, or missing functionality with Flatpaks, Snaps, and Wayland, you're deluding yourself.


I'm in that older+longer cohort so I'll comment...

I think you are talking about workflow-breaking changes or stability issues during transitions to newer Linux bits. You explicitly excluded hardware/driver issues that a lot of people gripe about with Linux on laptops. I think your experience here can vary dramatically depending on your choice of Linux distribution, your upgrade tempo, and your expectations.

I've had some frustrations with the UEFI transition and secure boot, because it forced me do research when I just wanted my system to boot. I've similarly suffered some regressions with MATE recently, where it boots to a black screen and I had to dig around to find that switching to a text console and restarting lightdm would get it unstuck. I'm struggling to answer whether these are "driver" things or not.

But, I've been on Fedora for ~20 years now and am happily oblivious to Flatpak, Snap, or even Docker because they have nothing to do with my day to day experience. I also don't think I've dipped my toes into Wayland yet. I've been using XFCE or the MATE desktop rather than GNOME. But I do remember when ALSA was the new thing, then pulseaudio, and now pipewire. I mostly didn't care unless trying to setup some specific sound peripheral. I never had btrfs anywhere except an experimental system, because I always customized partitioning and never really considered the defaults to matter.

If I exclude hardware/driver issues, it seems like this same story applies to Windows. It all depends on what part of the ecosystem you consider to be part of your platform experience. Some people might have some favorite apps that essentially work the same as 25 years ago, while others have experienced multiple upheavals as third parties abandoned or hijacked an old favorite application or forced some kind of migration. It's only gotten worse with all the rent-seeking cloud integrations with everything.


The thing about Snap and Flatpak is that while Canonical is heavily pushing snaps, the community seems to be mostly behind Flatpak.

What this boils down to for regular use is that some big companies only offer a snap for their proprietary app while most open source stuff seems to be on flatpak. With significant overlap of course.


Ideally, we get the best of both worlds. The isolation technology that Flatpak uses (bubblewrap) isn't exclusive to Flatpaked apps, so someone could probably make a distro with isolation by-default if they were motivated enough.

> What distro would you recommend to somebody like me who wants to be asked before an application gets access

Use an immutable distro like Silverblue or Kinoite. Those are more-or-less Flatpak-only, extremely stable and fairly rigid with security.
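To illustrate that the isolation isn't Flatpak-exclusive, bubblewrap can be driven directly from the command line. A rough sketch of a Flatpak-style sandbox (the flags are bwrap's; the bound paths are illustrative):

```
# Run a shell with a read-only /usr, fresh /proc and /dev, no network,
# its own PID namespace, and a throwaway directory standing in for $HOME:
bwrap --ro-bind /usr /usr \
      --symlink usr/bin /bin \
      --symlink usr/lib /lib --symlink usr/lib64 /lib64 \
      --proc /proc --dev /dev \
      --unshare-net --unshare-pid \
      --bind "$HOME/sandbox" "$HOME" \
      bash
```

A motivated distro could wrap every launcher in something like this by default; Flatpak just automates picking the binds per app.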


> The isolation technology that Flatpak uses (bubblewrap) isn't exclusive to Flatpacked

I wish Flatpaked apps actually used the isolation (most don't, so http://flatkill.org/2020/ is still true today), but that isolation is mostly under the control of the packager, not you.

On this point, Deno ( https://docs.deno.com/runtime/manual/basics/permissions ) and pledge.com ( https://justine.lol/pledge ) are better; you have to explicitly allow-list what you want, instead of hoping that the potentially very untrustworthy and third-party packager had your best interests in mind and denylisting what you don't want.


Most casual users want to see a quick list of permissions and just click allow, not build their own allowlist for every app they download.

If you want to build the allowlist yourself, doesn't firejail already do what you want?


It would be nice to grant permissions in a fine grained way and not just have a blanket accept or reject. For example, if I download a weather app, chances are it will want location and internet access. If I don't want it to have my precise location I should be able to deny location but grant access to the internet.


Enough of the the community is behind Flatpak that it's possible to run an immutable distro and get most of your apps as flatpaks without much trouble.

Snap is terrible; there is no reason to consider it. Mostly because there is only one app store allowed and it's really slow to launch apps.


Snap can't even fix basic things, like running stuff over X11, or having your homedir on a different hard drive than /.


> I hope the community really gets behind Snap or Flatpak

NO. No. No. No Snap. No flatpak. No rolling release distribution nightmares either. Standard LTS for me (with snap and flatpak disabled with an apt preferences file).
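The apt preferences trick mentioned here looks roughly like the pin file Linux Mint ships (path and values are the common convention; a negative priority stops apt from ever installing the package):

```
# /etc/apt/preferences.d/nosnap.pref
Package: snapd
Pin: release a=*
Pin-Priority: -10
```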


A standard stable LTS system with the ability to update your gui apps is basically what flatpak is all about. I agree though that Canonical is trying too hard to move system packages over to snaps.


Snaps are a dumpster fire on Ubuntu. In particular Firefox.

Flatpak... I tried a dosbox flatpak and it was 3 gigs. Regular package? 3 megs.

I run Linux Mint to avoid them as much as possible. All of the flatpaks pretend nobody has files anywhere other than the home directory, and softlinks don't work. Have an mdadm RAID in /mnt? Can't see it.

So basic frequent use cases can't be supported by a "modern security and permission system". Then no thanks.

All communication about this with the developers of snap, for example, is dismissed with a high amount of condescension. Which is pretty typical of most things whenever you have to deal with security: they impose the dogma, and you have to comply with it.

And the delay whenever the snap has to interact with the file system: there's something like a five-second delay every time it happens, even when accessing an SSD-based file system. That is unacceptable.

Look, we do not have infinite Moore's law steppings, or gigahertz of serial speed, coming down the pipeline. Moore's law is running out. We can't be adopting some system that's stealing, like, 50% of the practical performance of the user interface. Hard disk drives are going to stop getting bigger; the technologies we use for increasing disk capacity now already reduce the reliability of the drive.

So snaps and their ilk really are not something I look forward to. For now, just run distros like Linux Mint which don't use them.


> I tried a dosbox flatpak and it was 3 gigs. Regular package? 3 megs.

You are counting the runtime size. That is shared between all apps that use the same runtime. As I already had it installed for another app, the download size for the dosbox flatpak on my fedora machine was 1.6MB.

I also recommend installing flatseal to easily see and manage file access permissions of flatpak apps.
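And to the /mnt complaint upthread: what flatseal does through a GUI can also be done per app with `flatpak override` (the app ID below is illustrative):

```shell
# Let one app see a mount outside $HOME; the override persists across runs.
flatpak override --user --filesystem=/mnt/raid com.dosbox_x.DOSBox-X

# Check what an app is currently granted:
flatpak info --show-permissions com.dosbox_x.DOSBox-X
```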


[deleted]


Now try saving something from Firefox.

By default, it shoves it into some parallel file system within the snap.

Oh did you forget to actually explicitly save it into the downloads directory? Yeah, you have no idea where it is now.

Oh and of course when the download file explorer widget opens up, you can't get the mouse focus. Probably because there's some security boundary between Firefox and the file system.

Firefox's usability vastly decreased as a result of snaps.

Finally, well, this isn't a big deal to me, but apparently a lot of the open source people don't like it because the Firefox snap is basically just a call-the-internet, pull-down-a-binary command that fetches a huge binary blob, and they don't like this.


Weird. I use the Firefox snap (due to my lazy inertia), and my saves/downloads are all in $HOME/Downloads

I don't think I did anything to change that either.


Yes! Downloading things in Firefox has been annoying too now that you mention it.


I know what you meant, but it's generally really unfair to even mention or phrase it like that. "Downloading things in snap" is more correct.

Snap has done such a disservice to open source software’s perceived quality.


Ok.


This is why I've been using Firefox downloaded directly from Mozilla for a while now instead of the broken one shipped with Ubuntu. I just need to manually update via the "Help -> About" every once in a while, but things look much better otherwise.


You don’t need to do that. You can install it via apt from the Mozilla ppa, then it updates automatically along with everything else.


It's been nearly a year now that I've been using Ubuntu 22.04 (and that the Firefox from Mozilla has had no problems), so I don't remember all the details. I also tried using the apt version of Firefox, as others also suggested, but it eventually got automatically replaced with the snap version after an update.


Yes, it’s stupid. Recent Ubuntu releases have added meta packages to apt that actually instruct snap to install instead. You have to first add the Mozilla Firefox ppa and then use apt pinning to say that you want the “Firefox” package from this repo and not that one. But then it works forever after.
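For anyone searching later, the pinning most guides describe looks roughly like this (assuming the Mozilla Team PPA; adjust the origin string to whichever repo you actually added):

```
# /etc/apt/preferences.d/mozilla-firefox
Package: *
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 1001
```

A priority above 1000 makes apt prefer the PPA build even over the already-installed snap-transition package.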


Thanks, I’ll try that soon. Chrome has nothing that Firefox doesn’t have, for me. As long as it works. :)


It's a well known fact that firefox startup time when packaged as snap is significantly worse. This should be less noticeable for actual use.

> On Firefox 119 I thought I was losing my mental faculties (even more) since I kept losing track of the cursor for a whole day. Turns out that it’s a Windows “feature” where you hide the cursor when you start typing into some input box. Maybe this is a Firefox-wide regression on Linux though. Difficult to find other examples of it on Linux via Google.

Firefox hides the cursor while typing, but it should unhide as soon as it's moved (at least it does for me)

> - More fun than a problem: `firefox --version` spews a warning: `update.go:85: cannot change mount namespace according to change mount ...` > - My dear XCompose wasn’t respected because of course you have to copy it to inside some dang `/home/me/snap/common/utensils/computer-applicances/apps/Firefox` tree

Those should both be snap specific.

> - The keyboard just does not work sometimes. No other apps seem to have this problem. So it’s apparently not some idiotic “applications that use X work but under Wayland it doesn’t” (or vice versa). Does a restart help? No. Seemingly only a computer reboot.

I don't know about that one. Could be Ubuntu or snap, but it does not happen on Arch. Out of curiosity, are you running firefox through xwayland or native wayland? (It says so on the about:support page under "Window Protocol")


I hope for a future where we are not trapped between Windows, macOS, and Linux. They are all very complicated and carry a lot of state. They're great for general-purpose computing, but it would be cool if you could use the same hardware in these GP computers with lightweight firmware that doesn't get in the way and can be tuned to a specific task better. Simple, responsive GUI; could run your specialty software etc. with direct hardware interaction.


> Simple, responsive GUI, could run your specialty software etc with direct hardware interaction.

Are a "simple and responsive" network protocol stack and file system, for example, also included in the FW? There is a reason why systems that used to work like that (game consoles) started to ship with a general-purpose OS to handle such things some time ago.


Sounds a bit like what UIO tries to do in Linux? https://www.kernel.org/doc/html/v4.12/driver-api/uio-howto.h...


That's what I've been thinking about.

Modern runtimes try to do a lot of OS' job(s) - memory management like gc, threads management, etc, etc.

What if we stopped using OSes and their perf penalties and ran our software (with its runtimes like CLR/JVM/WASM-ish) directly on the hardware?


Is there really performance to gain? I doubt a well configured OS is slowing the CPU down.

The alternative could be GPU-like devices: a PCI-E card with sockets for the chip and some RAM, running its own firmware. Intel tried that with its Xeon Phi line, and they opted to run Linux (called uOS) on the accelerator board, so apparently that was never an issue.


Do you mean something like MirageOS? https://mirage.io/


> with direct hardware interaction

you're not gonna see this ever again. it'd be reckless to do so with how unsafe it would be


sounds like you wanna do work in the embedded space


I've been playing around with Ubuntu training myself for the day that I have to give up windows 10. My only complaints are that Linux can be tedious, everything is just 43 commands away; it requires greater understanding of not just what the system is doing, but how it is doing it; and the documentation is written like a Wikipedia article on advanced nuclear theory.

Windows and OSX have, for the most part, just worked, and it was easier for me to understand what was going on, and how to engage tasks, fix problems, and use the system. With Linux it can feel like I'm fighting with the system at times.

I'm enjoying the experience but it has not been without frustration. Perhaps I should have chosen another flavor?


No worries. I have installed Windows 11 lately and I find it tedious. It constantly wants to register me somewhere, it asks me to have this as the standard, and it is always updating something and asking me to reboot. And why can't I remove certain things from my taskbar? So, I guess, we both have our difficulties. Like always.


Windows 11 should be classified as a hate crime. Just the fact that I'm deceitfully "required" to create a Microsoft account just to use the system is an abomination. On some systems, I've seen that there is a way to bypass that account creation and just have a local account, but it was so hidden and deceitful that it was… emotional…


This isn't any different than being "required" to create an Apple account. I agree that it is a problem that both systems make you jump through hoops to use them without doing that, but it doesn't seem to elicit the same emotions that come from it being Microsoft.

They're both trillion dollar companies that only care about keeping you in their walled garden.


> This isn't any different than being "required" to create an Apple account.

There's no requirement to create an apple account.


The same goes for the Microsoft account. That's the whole point. I have been happily running Windows without a Microsoft accounts for years. You can just skip the step during installation.


There isn't a requirement to create a Microsoft account, either.


Actually, Windows 11 Home does require one. And I don't mean "You can just be offline" or "Just click use Domain Account", but actually "Without Internet and a Microsoft Account, you cannot continue".

Maybe someone found a script/custom installation image, but the previous posters point about it being required stands for that one SKU at least. (I hope that there's a EU version that one can acquire in the future that gets around that, but I don't know if the Account requirement falls under the current legislation.)


Indeed there isn't. I bought an OEM license for gaming and decided to unplug my ethernet just in case. Wasn't asked to create an account. Not sure if this is the only way to avoid it short of installing one of those cool corporate builds.

I read things will improve next year thanks to the EU, though: https://arstechnica.com/gadgets/2023/11/europeans-can-soon-s...


It varies apparently. HP laptops have no option for local account as of four months ago. It's a hard stop.


As both a Windows and Mac user, Windows seems way more aggressive about having this online account. I don't know, it's got that kind of hectoring tone: "Well, you've been stalling for 2 months and now it's time to have a login created."


"Windows is a service and updates are part of that service. So if you could reboot to apply updates, say, now, that would be terrific."


> And why can't I remove certain things from my taskbar?

You actually can; it's just that they don't expose most of it to the user. However, people have built tools [1] that give you a pretty UI to manage those more complex configs and apply them to Windows. Bonus points in that you can use all the old Windows UI elements from previous versions of Windows and mix and match them as you choose.

Honestly I couldn't tolerate Windows 11 without something like ExplorerPatcher.

1. https://github.com/valinet/ExplorerPatcher


Be careful. I used this for a while, but then Windows updated and ended up bricking my normal user account. I had to go into safe mode and figure out the incantations to uninstall and resolve it. All during a high-priority time for needing my computer, of course.


Yeah, I'm not sure how long ago you used it, but ExplorerPatcher nowadays is set up so that it disables itself if Explorer crashes more than about twice in a row, so that you can debug the issue. Normally it's just a matter of fetching the new symbols (which happens automatically) and occasionally updating. Afterwards you can relaunch EP... or you can just leave it disabled if you are busy.


I've been able to disable most of the things I do not like on win11 pro, so far, but if it weren't for how much I use Ableton I would be all in on Linux at this point.

If anybody knows anyone at Ableton, please tell them to do a Linux edition :P


Or at least check that it works on Wine. I find these days that when Windows programs work on Wine they work very, very well.


Hah, a friend of mine was swearing it wouldn't work. How's performance and latency?


Maybe this tool can help you disable some of those things: https://www.oo-software.com/en/shutup10


I honestly don't understand how macOS and Linux are better with respect to OS updates. With Ubuntu I always have to update and reboot, almost weekly. And with macOS if there's an update, you can at least schedule it, but not without this big bright red notification badge that's always in your face and can't be dismissed.


>And why can't I remove certain things from my taskbar?

IDK, because the real question here is why can't YOU do it? Because it's definitely possible; anyone can do it with a few clicks from the GUI. There's nothing stopping you from doing that.

Pretty sure my 5 year old nephew can figure out how to get to Right click -> Taskbar setting, where stuff can be removed from the taskbar.

There are plenty of faults with Win11, but the fact that you haven't managed this, is more an issue on your side ("I tried nothing and it doesn't work") rather than with the product in question.

Excuse the bluntness but I'm tired of seeing all the FUD spread on HN and I need to correct it.


How do I remove Teams or the stupid weather badge from the taskbar? I can remove Explorer and Edge with a right-click; just those two I cannot. It is inconsistent.

That it is not impossible with a few more clicks, and that I have to know where to find it, well, duh.


>How do i remove teams or the stupid weather badge from the task bar?

In the time you have spent writing this rant/question here, you could have typed it instead into Google, Copilot or ChatGPT and gotten your answer already, instead of falsely complaining that "it's impossible to remove something" when it's not.

Yes, some things will always be unintuitive to the user that's new to any OS, but that's the issue with any other OS including MacOS and Linux distros when you're brand new to them and use it for the first time.

There's always gonna be a learning curve and you'll need to Google a bit to figure some stuff out in the beginning regardless of your OS of choice, but that's a long way from saying "it cannot be done" when the main issue is you can't be bothered to Google something basic.

Excuse the bluntness here in my comments, but how do people manage to get into highly paid technical careers while not being able to Google "how to remove X from Windows taskbar?".


On Windows 11

- Right click Taskbar

- Choose "Taskbar Settings"

- Untick Widget. Weather badge gone.

- Untick Teams (if you have it- I don't, and don't have Teams popping up)


Sometimes I think people complaining about Windows are just too lazy to look through the settings and search for the things they want to change. Windows has a lot of things to complain and be 'angry' about, but 90% of the issues I see people mentioning are completely solvable with a few clicks.


>Sometimes I think people complaining about Windows are just too lazy to look through the settings and search for the things they want to change.

Sometimes?!


I really dislike this line of reasoning. Windows has a much larger installed base, so finding support and solutions is pretty easy. Whether that is a video, a blog or a forum, you can always find someone with the same problem.

No, windows nagging you to register is not in the same ballpark as your audio not working or your video resolution changing with multiple monitors. No, windows updates adding or removing features are not equivalent to the problems linux has, many of which are showstoppers.

You're trying to be snarky and this is one reason why so many regular Joes are reluctant to adopt linux. you never know when you will need the terminal and when you will encounter an unhelpful response like this.


So serious. I find the Linux community indeed very resourceful. And the ones for Windows, after you have clicked thousands of banners away that tell you the workaround for avoiding the registering after a long text of nothing, I find them resourceful too.

My point was, that it is always difficult to enter a new domain. So it is a shared pain.


My limited point here is that linux has a lot of problems that are showstoppers. You sometimes cannot do what you want to do.

The average person cares about usability first. Visually, Linux is excellent and very usable. The problem is, should things go wrong, the level of technical skill needed to use community resources is relatively high.


> Visually, Linux is excellent and very usable.

On a superficial level, maybe. But generally it's still very unpolished and inconsistent, and has poor UX design. Of course it's a matter of taste.

Not sure why your comment is being downvoted though? I find it hard to imagine how could anyone disagree with it from the perspective of an “average” computer user.


> But generally it’s still very unpolished, inconsistent and has poor UX design.

Came here to say that. If you stick to the 2 or 3 desktop apps that they offer and do everything else in a terminal, then no worries. But the minute you try to get things done, beyond just the terminal - it comes apart.


>I find the linux community indeed very resourceful

I do too. As a "power user", however, it feels like for people who don't know how to, fear, or don't want to use the command line and don't know how to read documentation, accomplishing even basic stuff on Linux would be a struggle.


The old joke goes, though, that you have to prompt-engineer the Linux community to get a useful response, e.g., "Linux sucks because I can't do X" instead of "How do I do X?"


I don't disagree with you, but have you ever had to google a Windows issue in recent history? You get 90 links of 'Microsoft experts' that ask you to run "sfc /scannow" or "chkdsk", none of them having a clue what they are talking about.

I think it is usually easier to fix Windows if you have 'audio not working', but Windows 10+ has started to have a lot of non-trivial issues, and there are no simple Google fixes. I have had serious problems as an IT worker with EVERY large Windows update in the last 3 years.

On the other side of things, I have seen linux desktop issues that are a nightmare to figure out, including audio not working after a recent update.


> You get 90 links of 'Microsoft experts' that ask you to run "sfc /scannow" or "chkdsk".

adding the word "reddit" to the search usually improves the results significantly


I tried searching for workarounds of network shares causing explorer freezes when no connection can be made.

The only "solution" I found was sfc /scannow


> Perhaps I should have chosen another flavor?

That'll ignite a holy war.

The trick with Linux isn't to learn the flavour, it's to learn Linux: once you get an overview of how all the bits work, switching distros is (much) more straightforward.

Ubuntu is an excellent first choice because it's one of the major distros, so you'll be able to google stuff more easily.

My personal recommendation would be Fedora with Cinnamon (if you want a GUI familiar from Windows 10, at least in approach), which is excellent.


Where do I learn modern desktop Linux? I have been using it on servers since 1993, and used it on the desktop from 2004 to 2017 (and was very unhappy with it), so my desktop knowledge is severely outdated.


This is almost me. I used Linux on and off over the last 15 years. Mostly in servers, but I often installed a linux partition to try it out.

I always went back to windows since windows was much simpler and I thought KDE and Gnome was similar anyway, I didn't see any benefit in switching.

I permanently switched to Linux in August. I found out about window managers, and now I see a real benefit in using Linux over Windows. When I'm forced to use Windows at work, I try to emulate the functionality of a WM.

What helped me is support for most of my devices. Linux progressed a lot in the last few years.


Can you hotplug a GPU, or do you still need to (effectively) reboot? (I know it's only the window manager that needed a restart, but if all programs run under it, well.)


What do you mean? Switching between integrated and dedicated GPU? Or opening your case and directly removing or adding a GPU?

If it's the first, I think there is support for this. There's NVIDIA Optimus, for instance.

The second one, I never thought it could be a use case, even less that Windows would support it. I always turn off my computer before doing anything on my motherboard.


I mean eGPU over Thunderbolt.


Oh yeah, didn't think of that one. Following what's written on the Arch wiki https://wiki.archlinux.org/title/External_GPU, Xorg doesn't support it and never will. Wayland seems to support it, but I don't know what that implies in use. The issues for KDE, GNOME and wlroots are all merged.


I'm not sure what there is to learn, if you have a good working knowledge of Linux servers. This is exactly why I prefer the terminal to any of the GUI components that may exist. Much more stable interface. As far as programs that I use daily, other than a few obscure VPN clients, I've rarely, if ever, needed anything that I couldn't find on AUR. For that reason and for reasons of having up-to-date software available, I like something in the Arch family for Desktop use. On the other hand, .deb packages are often available for more obscure software like those VPN clients...


> I'm not sure what there is to learn, if you have a good working knowledge of Linux servers.

Everything about the graphics and windowing stack, for starters. Why isn't my external display being detected after waking from sleep? Why is it always set to the wrong resolution at boot? Why doesn't VLC have any controls? Why is this simple Unity game suddenly running at 2 fps?

Thermal management, CPU governors, that sort of thing. Not really an issue on servers, occasionally misbehaving on laptops.

Occasional stuff around device connectivity - bluetooth speakers, NetworkManager, etc.


I would be really interested in what you were unhappy with? The only major thing that truly changed after 2017 was that Wine became almost universally compatible.

It was freakish how few issues I had in 2019 when I switched completely.


Some experiences

Ubuntu distupgrade breaks my machine so badly I need to spend days putting it together again

Switched to Arch

Upgrade. Bluetooth breaks. MFC device breaks. One of the two almost always.

Compared to this, the only thing I dislike about Windows upgrades is how it can wake up the machine even from the S0 state (wtf!) to upgrade and reboot. But aside from that, it's a complete non-event when it upgrades. It's been nearly six years and I've never had an issue.

I had a client using an F5 VPN requiring MFA and there was no MFA support except on Windows. I got it working with an old Firefox running as root (because that supported old style extensions and I managed to find an old extension that worked). The only question regarding this setup was from the head of IT at this company asking but not receiving any responses on how to do this.

Weird enterprise wifi always a headache.

I just got a GPD G1. Here's my experience getting it to work:

1. Plug in power.
2. Plug in the Thunderbolt cable.
3. Install the AMD Adrenaline driver only (the only step that requires a tiny bit of expertise; don't install all the extra stuff).
4. Reboot.
5. It works.
6. If I unplug it, the machine goes back to the nVidia GPU inside the laptop. No fuss.

Tell me true, is this going to work on Linux? I seriously doubt it.


2017 was in the early phase of a pretty disruptive era for Linux, e.g. systemd, audio, Wayland, app sandboxes, etc. I get the feeling things are settling now, and stabilising/consolidating into a better place overall with less churn. But for a while things weren't as stable as they typically were before that.


I tried Wayland maybe a little over a year ago and the copy-and-paste behavior was bad enough to switch back. I recently switched again to get FreeSync to work on multi-monitor, and the issue is now manageable.

App sandboxes I feel less positive about: I dislike them since I think they only fix stuff that is already unacceptable to happen.

As for audio: yes, while PulseAudio was the default I remember some issues with it (mostly related to S/PDIF). This has also lessened with PipeWire.


Depends on your focus. If it is privacy or security, you may want to consider Tails and Qubes OS recommended by Edward Snowden.

[1] https://tails.net/

[2] https://www.qubes-os.org/

[3] https://www.youtube.com/watch?v=s8-B9d7iz1A


Qubes is cool, but Tails is for Tor and is meant to be a boot-from-live-cd or flash drive.

The whole point is that you want all instances of Tails to look the same on the network so they can't fingerprint you. Once you start making it a daily driver, picking up a lot of cookies, making it your own, etc. then the point of using it is moot.


Install Gentoo if you want to learn how a modern functional Linux system is put together


> That'll ignite a holy war.

And that has been the major problem with Linux for years.


The duplication of work is staggering. The best people at Mint are working on the same problems as those at Fedora and PopOS and Qubes and SUSE… infinite loop.


Any comment on Arch one you get going? I find the documentation quite thorough for the niggly bits, and have used it to troubleshoot Ubuntu (taking into account differences between the distros).


I am an Arch enthusiast, since I like building my system up and learning how things are put together, but it is not a terribly good platform for learning how to use or manage a Linux system. One of the issues is that you end up with a system that is tailored to your own needs. While that sounds great, you have to make an effort to generalize those skills, since no one else will have a setup quite like yours.


Thank you. The documentation is really important for me, I guess. I feel more comfortable with documentation that carefully lays everything out; screenshots are gold.

Is there a book that you would recommend that would help a novice like myself learn Linux better?


IMO, the best way to get comfortable with Linux is to get comfortable with the command line, because although every distribution is going to have different UI and built-in apps, the command line is going to stay pretty consistent. Also, a lot of troubleshooting you Google is going to involve interacting with the command line, and it's essential to understand what the commands you're executing are actually doing.

I'd recommend The Linux Command Line by William Shotts to get started.


Was trying to configure a network bridge for a VM just the other day from the CLI. The guide (for Ubuntu, which I was also using) used nmcli (NetworkManager); tried it, and: command not found. Back to searching, and I was nudged toward systemd networking by Stack Overflow, which didn't work either. Turns out that my system was using Netplan. Three different systems to handle networking, really? OK, ChatGPT, convert this nmcli command to Netplan; sure, here you go, just put this in your netplan config file and apply the config. Ended up with a botched network config on a headless system.


netplan(.io) is an abstraction layer on top of either NetworkManager (GUI installs) or systemd-networkd (servers/non-GUI) and is not really needed except as a convenience for Canonical's own designs for automated mass deployments especially linked to cloud-init. Under the hood it just converts its YAML configuration files into the syntax for the underlying actual network management tool.

For NetworkManager it'll write the config file to /run/NetworkManager/system-connections/ and for networkd to /run/systemd/network/ on EVERY boot since /run/ is a tmpfs (file-system in RAM).

For almost all servers, and most workstations, netplan is an unnecessary indirection since most hosts (including containers) have pretty static network configurations that only require writing once (to /etc/NetworkManager/system-connections/ or /etc/systemd/network/ ).

nmcli is the NetworkManager command-line tool. There is also nmtui for a text user interface. These are terminal alternatives to the GUI applets such as nm-applet (network-manager-gnome) or plasma-nm for KDE.

networkctl is the CLI interface to systemd-networkd. There is no widely used GUI interface to it (yet).
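To connect this back to the grandparent's problem: under netplan, the VM bridge is a few lines of YAML (the NIC name and file name here are assumptions; check `ip link` for yours):

```
# /etc/netplan/01-br0.yaml: bridge one NIC, DHCP on the bridge itself.
network:
  version: 2
  ethernets:
    enp3s0:
      dhcp4: no
  bridges:
    br0:
      interfaces: [enp3s0]
      dhcp4: yes
```

On a headless box, `sudo netplan try` is the safer way to apply it: it rolls the config back automatically if you don't confirm within the timeout (i.e. if you've locked yourself out).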


That's the exact experience I went through about a year ago trying to set up a bridged VM on a headless Ubuntu system. I mean right down to the sequence of nmcli, systemd, and Netplan, winding up with wiping it all away and just running Virtual Box on a way overpowered and mostly idle Win 10 system. Because I just wanted to run a VM connected to my local LAN.

Linux networking and DNS resolution, while working fine for the happy path, are a dumpster fire from a system management viewpoint. Especially if you want to do anything even mildly off-script. And I say this as a Linux user since before the kernel hit 1.0.

I don't know, maybe it's just a documentation problem. The accumulated junk of 50 years of obsolete documentation that you have to wade through to find out that the whiz-bang Linux distro you're using today is not the Linux which worked fine last year.

The exit off my lawn is in this direction.


And that right there is why Linux is so frustrating to start with. It is easier to solve most problems via the command line; however, a normal user should never, ever have to touch a command line. Everything a common person needs should be easily accessible via a GUI.


> Everything for a common person should be easily accessible via a gui.

They are; it's just that there are many different GUIs. Each distro by definition has different ones, and you can easily change them yourself, etc.

There are dozens of sound GUIs, hell, I've tried 4 redshift GUIs before settling on QRedshift. It's so easy to write software for Linux that you have countless opportunities. That's why the command line is easier - it always works(ish).


Only way to do it is to do it.

- try something like Mint if you are used to windows and PopOS if you are used to OSX

- remember that everything in GUI has a commandline equivalent


I have been eyeing up Fedora Onyx for the immutable root


I've been using an immutable Fedora for quite a while now and it's amazing. I use toolbox and run everything in containers, including GUI apps, so not even using Flatpak which I'm not a big fan of. I think I have like 3 or 4 packages installed on top of the base image. The peace of mind while upgrading can't be overstated. Highly recommended.


I run mongodb for work and that requires openssl1.

On Ubuntu I just copy the libs over to /usr/lib

But that wouldn't be an option on an immutable system.

I will look into toolbox.

Edit: looks like toolbox is exactly what I'm after!


You mean Fedora Silverblue? Haven’t heard of Onyx.

Edit: ok so I learned about Onyx, which appears to be doing roughly the same thing?


There are different spins of Fedora that are immutable and based on rpm-ostree and flatpaks, the variation being the desktop manager environment:

* Silverblue: GNOME

* Onyx: Budgie

* Kinoite: KDE Plasma

* Sericea: Sway


And don't forget the community spins based off ublue, like Bazzite!


Just installed Onyx yesterday on my work machine! So far so good, some niggles as always (it's my first time using Budgie), but overall fairly usable and easy.


Same here, but the other way around. I find Linux much simpler to grasp than endless hard-to-navigate menus, windows where each is from a different land, the registry, the shell, and what not. I rarely use Windows, but when I do, it feels like MS either ruined it or I simply forgot how to do things. I used to think like you, so I guess it's just a matter of taking the leap and learning a new OS.

Bonus points: no ads, no Edge or Drive bullshit, and nearly any small quirk that annoys me can be gotten rid of (which admittedly can take some effort BUT it can be done)

Example:

- Linux: just type the command vetted by hundreds of people on StackOverflow to edit the config file

- Windows: open the app, click this and that... Where is it? Ohhh, outside of the screen, because it uses a super duper stylish gaming UI. Let me change system-wide UI scaling just to see the thing I want to toggle. No, there was no way to just drag that window.
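To make that concrete, here is a toy version of the "just type the command" workflow. The config file and its keys below are made up purely for illustration; real apps keep theirs under ~/.config or /etc:

```shell
# Fabricated config file standing in for a real app's settings.
printf 'dark_mode=false\nfont_size=12\n' > app.conf

# One copy-pasteable, reproducible command instead of hunting through menus:
sed -i 's/^dark_mode=false$/dark_mode=true/' app.conf

cat app.conf
```

The point is that the sed line can be pasted from a forum answer and behaves identically on every machine, which a series of clicks never can.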


If you’re having to faff about with the registry every day you’re doing it wrong!


If you're having to use the terminal for anything other than your own work every day, something has also gone wrong (actually, even then it should be optional, though I find it easier to run make/gcc/etc. directly vs. using an IDE), so I think it's a fair comparison.


If you never have to faff about with the registry, then you are either lucky or using a different OS.

I include installing a fresh copy of Windows to resolve the inevitable corrosion that Windows has over time.


> I include installing a fresh copy of Windows to resolve the inevitable corrosion that Windows has over time.

Remember that one from the past and countless times I had to do this for others because Windows would become a mess.

Nowadays I just upgrade old Linux distros that have been running for 5+ years, out of security concerns or boredom. Magically, they still run the same as a fresh install.


One of my favorite things about NixOS is that this problem is permanently solved.


I've been a Linux power user since the late 90s, and I can tell you that if you're a Windows or Mac power user with zero Linux experience, you're not going to have a good time in Linux.

Linux has come amazingly far, but I still wouldn't recommend it for everyone.

It's all about context. You're a power user so you're going to want to do advanced things, which requires total immersion in a completely different ecosystem.

But my 80 year old father on the other hand, he just wants to edit documents, scan images, browse facebook and play solitaire. He can run Fedora Silverblue with no problems.


I have been using Linux for 16 years in total, 7+ of them as my primary OS.

Ubuntu itself is a performance hog due to snaps. And Ubuntu is NOT beginner friendly. As an ex-IT guy and a Linux enthusiast, I can tell you it was not beginner friendly when I deployed it to developers and tech people at my company, and it is not beginner friendly when I install it for friends & family.

I used Ubuntu for work until last week, mostly out of compatibility worries, since most work-related tools support Ubuntu if you want to use Linux at work. It used to use 7.5-8 GB of RAM and >40% CPU. So I switched to Linux Mint, which is based on Ubuntu but faster and better for UX. Since Monday, RAM usage has gone down to a max of 6 GB and <25% CPU, for the same workload. And since it is Ubuntu compatible, Linux Mint supports all the work-related stuff people need Ubuntu for. I also have a lot of horror stories with Ubuntu.

I recommend Linux Mint if you want a point-release distro. It literally takes care of you and works OOTB. The only problem is that it doesn't have Wayland support yet. If you have the time to invest (which I think you don't, since you mentioned tedious), and only then, I recommend Arch Linux.

Another green flag is to prefer community-led Linux flavours over corporate-funded ones for long-term UX. There have been way too many cuts and bleeds to trust a corporate-run Linux distro, CentOS and Ubuntu being the latest examples. The only exception so far has been openSUSE, IMO. I have heard good things about PopOS as well, though I haven't used it enough to comment.

I really see FreeBSD as my north star. But it is not yet there for my work and desktop usecases.


Arch can be made easier if you use Manjaro[0], for example. I used Manjaro for years before switching to Mac, and it was rock stable. I had Timeshift[1] installed and set up to be able to roll back just in case I messed something up. In general though, I never felt like I had to mess around with the terminal and a bunch of commands, only if I really wanted to.

[0] https://manjaro.org/

[1] https://github.com/linuxmint/timeshift


I used Manjaro for years before moving to Arch Linux myself. Manjaro broke their two-week testing cycle promise. It used to be stable. Then they started cherry-picking systemd and KDE directly from upstream instead of following the traditional Arch stable repo -> Manjaro unstable -> Manjaro testing -> Manjaro stable cycle for a package. Things started breaking, and they were hostile to the community, which resulted in a new forum. I have both technical and moral reasons to ditch Manjaro, having been an early ~2016 Manjaro user.

I also have a sour taste regarding Manjaro because I was one of the minority of people who supported them when they wanted to kickstart a company around Manjaro. And I am going to ignore the whole leadership issue and the ousting of Jonathon (R.I.P. Jonathon!). I walked away from Manjaro because there was no technical reason or joy left in using it, and partly for Jonathon. He was the glue for our community. Old Manjaro users know how awesome that community was initially.


Thank you for the detailed explanation! I didn't know about any of that. I either got "lucky" and switched to Mac before all these happened, or just didn't pay enough attention to see these changes.


I mean, if you didn't know, you probably came in after all these things happened. A lot of the old community fled to Arch or Endeavour.

It is good right? You had a pleasant time with Manjaro. Fun is good. Excuse me if I have kind of spoiled it for you. :)


Is it possible that some of the perceived performance gains come from comparing an old installation (with lots of tools and packages that are no longer used) with a brand new installation?


I don't think so, based on my experience. I have enough experience with snap performance issues. When I was using snaps in another distro (Manjaro), removing them shaved a nice 11-12 seconds off my boot time. Snaps add boot time. They are slow to start. They also have these performance issues. What I didn't know, or somehow missed, was that the Firefox snap was eating way more memory than I thought. The whole package was not appealing or solving my problems.

Linux Mint, in my past experience, has always stood tall even in old installations. So I don't expect that as the reason, and I don't expect LM to slow down either. But thanks, I will keep an eye on it and see.


Arch is terrible due to their policy of releasing packages with barely any testing.

Generally accepted recommendations:

- Windows converts: Mint

- General: Fedora / PopOS

- Rolling: Suse Tumbleweed

- Config fanatics: Nix

- Minimalistic fanatics: Void / Alpine


> Arch is terrible due to their bare testing package release policy.

It is NOT. This is the biggest mystery I have about people's perception of Arch Linux. Arch provides the latest upstream stable packages, unlike point releases. NOT developer branches of packages, BUT STABLE versions, as soon as they are available. Point releases like Ubuntu give you old versions of software and backport security patches from the latest stable releases onto those versions, which again leads to possible bugs, etc. Outside of Linux, you will never see anyone calling the latest stable version of a piece of software "bleeding edge". You don't call iOS 17 bleeding edge; it is the latest stable version of iOS. Windows 11 is not bleeding edge; it is the latest stable version of Windows.

I have a 5+ year old Arch installation from which I am writing this comment. It is fast, easy to use, and predictable. Point releases always give you an easy jumpstart but are often a pain to maintain in the longer run. Arch definitely requires an initial time investment (it is, after all, a DIY distro, not a managed one), but it is easier to maintain. I use Arch because I am lazy. I don't want unpredictable changes.

I am not going to complain about getting Firefox 119 (the latest STABLE version at the time of writing) within a week or two. Especially in these times, when we should be running the latest packages for security and stability.

Also, the repos in a Linux distro (like stable/unstable/testing) test a package against the distro. They do NOT check the stability of the software itself; they check the stability of a package against that particular distro.


Not who you're replying to but I'm sorry, he's right.

The GRUB incident last summer was the last straw for me. I found myself with an unbootable system. The r/arch sub was radio silent. r/EndeavourOS posted a sticky about the GRUB issue. On researching it more, they were shipping a build of GRUB off master (not even a release!). When I asked an Arch dev if he thought it was appropriate to do that, given he'd just left thousands of people without a bootable system, he dismissed me with something to the effect of 'If you can't repair your bootloader, maybe you should use an easier distribution'.

You know what? I agreed with him. I formatted my laptop and installed PopOS, and it's been really nice running a system where the devs actually seem to care about stability.


Oh no. I am sorry that happened to you. My experience have been pleasant from the community. But I understand that might not be everyone's experience.

Might I suggest setting up Timeshift, or something like BTRFS snapshots, for rollback? I never had to use it on my 5+ year old Arch install, but I did once while using Manjaro (an Arch derivative). When a system breaks, I just boot a Linux Mint live USB, which bundles Timeshift, and revert to the last working state. Then you just wait for a fix before you update again.

I suggest this because, in my experience, point releases create problems in the long term, so they are not without issues either. This might be helpful.

PS: I have heard nice things about PopOS. Hope things are fun over there. :)


> It is NOT.

It is.

Arch has an explicit policy that if there are breaking issues with a package, they will push it regardless and it is your task to read about these yourself in the package update/release messages.

No sane person is going to do that for the many many packages that their system is comprised of.

SUSE holds fresh packages for a little while and puts them through a more rigorous testing process before pushing them to Tumbleweed. This leads to much, much less breakage.

Maybe you aren't aware, but Tumbleweed is also a rolling distro. This is why I specifically mention it as a replacement for Arch.


Hi,

This is an interesting point you make. I agree that this is the policy, and I am not a fan of it either. The solution is keeping an eye on the forums and news. But like I mentioned just above, I have had zero issues with plain sudo pacman -Syu for the last 5 years. I have only done manual intervention twice, both times as per https://archlinux.org/news/. One was something I can't recall, and the other was the recent JRE/JDK change. Nothing broke.

Tumbleweed updates come in big batches because of the way they update, which I am not a fan of either. Not to mention, I am a big fan of the Arch community. Moreover, despite having only heard good things about openSUSE (AUR is based on OpenSUSE tech IIRC), I would like to use only community-led distros; they are always better for end users. Arch especially: whatever technical problems exist, they are always improving things. Look at archinstall. Now I can have a new Arch install in like <5 minutes.


Why not a stable distro with KDE for Windows convert? KDE is the closest thing that looks like Windows.


Linux Mint also looks pretty close to Windows, and they seem to explicitly try to accommodate Windows-experienced folks, with a lot of preinstalled tools and a similar UX.

I run NixOS with KDE on both my desktop and laptop, and whilst it is closer to Windows than GNOME is, it's not as close as Mint, or rather, Cinnamon. KDE looks close to Windows with the default UI, but the UX differences are more noticeable.

Another good distro for Windows users would be Zorin, with their Windows layout.


What makes me stay on KDE is the file manager (Dolphin), it's even better than the Windows one.

I would be on Gnome otherwise, but its file manager is for people that don't actually manage files. It's quasi useless (at least for my needs).


It's the complete opposite for me. With Linux I'm in control of the OS and can make it work the way I like. Sure, you need to know what you're doing but the learning curve these days is rather easy. With Windows and macOS I feel like they're treating me like an idiot and I always have to settle for 80-90% of the behaviour I want if I'm lucky.


> Windows and OSX have, for the most part, just worked

I bought a System76 machine running their own PopOS distro and everything Just Works as I'd expect, with the added benefit of actually having the ability to muck around with the innards of my system when I want to, unlike Mac or Windows, which are increasingly locked-down, opaque, and user-hostile. Neither Apple nor Microsoft are consumer desktop OS-focused companies these days; the former is a mostly a phone manufacturer and the latter is a confusing mess that might be best described as an enterprise software company. They have no material incentive to care about the quality of Mac or Windows, and it shows; their desktop OSes are afterthoughts.


>I bought a System76 machine running their own PopOS distro and everything Just Works as I'd expect

Including touchpad gestures and browser hardware video acceleration out of the box?


It has touchpad gestures, and I even used a GUI program (I think it's called Touche, trivially installed via the app store) to customize the gestures further (it's quite powerful, it lets you assign arbitrary keys or shell scripts to gestures and scope them by context, e.g. I assigned four-finger swipe left/right to Alt+Left/Right while inside Firefox to perform back/forward navigation (this was before Firefox started shipping with gestures natively)).

As for browser hardware video acceleration, I sure hope so (the model I got has a beefy GPU) but I'm not sure how to check, and in any case I reflexively watch all videos in 480p after years of watching my old Windows laptop overheat while trying to decode video so I'm the wrong person to ask. :P


Firefox has had hardware video decoding enabled by default on Linux for years now; only Google Chrome doesn't want to bother doing the work to enable it. That said, in some cases hardware video decoding may be unavailable due to IP issues that forbid the necessary code from being distributed freely.


YMMV.

I switched off of OSX as my primary machine around 2015 and moved to full time Linux. Just decided to commit and fully dove in. I tried several different flavors for months at a time from Fedora to Ubuntu. Eventually I settled on PopOS.

I was really happy with everything up until my work shifted to management and spending > half my day on Zoom calls. For some reason, I periodically would have issues with peripherals. Picking the wrong mic, having to close and reopen Zoom, correct camera not working, etc. None of this is an issue if you're on a laptop but it regularly was a problem since I used my machine as a desktop all day.

I decided I finally need to get a machine with a beefy GPU this year with all of the LLM stuff happening (plus my son is getting into PC gaming) so I bought an Alienware desktop with Windows. First Windows computer I've owned since 2005. Now I have a 3 screen setup where my middle and left screen are the Windows computer, my fully loaded System76 Meerkat is on the right using Barrier and 99% of my development work is remote on that machine.

No issues with the peripherals for meetings on the Windows machine. Still greatly prefer the Linux machine for everything else but the reliability of knowing that my peripherals are going to work for meetings has been important. Plus, I started a tech podcast (Carolina Code Cast) and that's been important for all of the A/V that goes with it.

All that to say, there are tradeoffs. Eventually, I hope that Linux will have first class peripheral support. I'd like nothing more than to install it on this Alienware machine one day.


> Still greatly prefer the Linux machine for everything else but the reliability of knowing that my peripherals are going to work for meetings has been important.

As a long time Ubuntu user, used to use Skype, now Teams, Google Meet, Google Voice (used to be Hangouts) and sometimes Zoom, I have had no issues, or no more than when work makes me use Windows, when using peripherals.


Have you tried pipewire? It fixed some weird issues like the ones you mention for me


I have not. Any good links?


Yes Google "Ubuntu 2022" or "Kubuntu 2022" if you prefer KDE (I do), install and run it, that's all you have to do.


In that case, the problem still exists. I was running the most recent PopOS (Ubuntu based) and it was still a problem.


PopOS isn't from Canonical. Use Ubuntu or Kubuntu for a wonderful Linux Desktop experience. It's only worked for me since 2006.


> "For some reason, I periodically would have issues with peripherals. Picking the wrong mic, having to close and reopen Zoom, correct camera not working, etc."

This problem is probably even worse on Windows for me, FYI.


Have you tried a modern Fedora? I don't know about Alienware but for me in a wide range of laptops (including Intel Macbook Pros) hardware issues are a thing of the past.


For 90% of things I'd agree with you. When it was plugged into a USB hub that had a couple of cameras, mics and speakers there would be issues.

It's entirely possible that the hub itself was a problem but it hasn't been an issue thus far with Windows.

I _really_ hoped that WSL would be better than it is when I decided to try this. It's quirky.


Ubuntu is mostly fine and a good middle ground for new users between bleeding edge (Arch, Fedora) and outdated software (Debian) IMHO. It's also probably the best place to start simply because of the mountains of documentation that exist for it because of its large userbase.

I would also like to add that often you'll find answers to stuff telling you to use a couple of commands when instead you could use existing gui tools. That might be because commands are faster for people that already know them, but often they're not the only way.


Instructions like "type this and that in a terminal" are short and perfectly reproducible. Instructions like "open Settings, find this, click that" are less precise and can leave people stuck halfway.

Even on Windows forums there are often NET USE or Powershell commands mixed with instructions to perform the same task using GUI apps.


I deeply appreciate the GUIs.


As an ex-UNIX zealot who came back to Windows around Windows 7 and then settled on using GNU/Linux from VMware, I think Windows has gotten worse, but still not bad enough for me to use GNU/Linux as my main OS.

Just last weekend I spent a good part of it fixing the way installing clang messed up Ubuntu's system clang, plugged in via LAN cable because, after all these years, my little Asus netbook still cannot keep a stable WLAN connection to my router.


I feel the exact opposite. Changing things in Linux is usually just a checkbox in the settings. Whereas changing things in Mac or Windows can be an uphill battle, digging through registry keys, or installing third-party software just to change basic features.

For example, how do you rebind the "switch window" hotkey on a Mac from Cmd+Tab to Alt+Tab? There's no way to do it out of the box. You need to install some third party program called AltTab just to set hotkeys.

How do you disable tracking in Windows 10? Well here's a 50 step plan. And you better pray that none of these settings will be reset the next time there's a forced update.

https://nordvpn.com/blog/disable-windows-10-tracking/


Your Windows link is interestingly all examples of "just a checkbox in the settings", no registry editing, no group policy fiddling and definitely no obscure DOS/PowerShell commands to paste!

(Personally I go further with Win10 - disabling web crap in the start menu does need a registry change. I gather Win11 is worse still.)

OTOH a Linux change I wanted to make recently was to rename GNOME's "Files" and "Files" (!) apps to "Nautilus" and "Nemo" so I could tell the f*king difference. Right-click & rename? Nope! We're in editing-little-files-land straight away.
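For what it's worth, that rename is doable without root: a user-level .desktop file shadows the system one, and only its Name= line needs changing. A sketch under assumptions (the entry below is a fabricated stand-in so the example is self-contained; normally you would copy the real file from /usr/share/applications first):

```shell
mkdir -p ~/.local/share/applications

# Stand-in for the system-provided launcher entry
# (really you would: cp /usr/share/applications/org.gnome.Nautilus.desktop ~/.local/share/applications/).
cat > ~/.local/share/applications/org.gnome.Nautilus.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Files
Exec=nautilus --new-window
EOF

# The user-level copy wins over the system one, so only the display name needs editing:
sed -i 's/^Name=Files$/Name=Nautilus/' ~/.local/share/applications/org.gnome.Nautilus.desktop
grep '^Name=' ~/.local/share/applications/org.gnome.Nautilus.desktop
```

Still editing-little-files-land, of course, but at least it survives updates and never touches the system copy.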


About 15 years ago I received the advice, "If you want to be a hacker, stop using Windows and start using Linux." Today, I am well aware that this is not the only path to enlightenment, but I count it as some of the best professional advice that I have ever received. This is colored by the direction my career has taken: the technologies I use are open source, and that fits much better into the Linux box than others.

It's also not about writing code. Sure, when using Linux exclusively, sometimes you might have to hack together a little script to make your computer do what you want, but that's really not necessary, especially in the year of our Lord 2023. It's about tooling. So many younger devs that I meet still have an irrational fear of the command line, an inability to use built-in documentation (like manpages), and a fear of trying things (because web browsers exist). Worst of all is their lack of understanding of Unix permissions. We all know the guy who just pastes `chmod -R 777 .` or something from StackExchange. Since most of our production software still lives on Linux, knowing the proper way to configure these environments is valuable (though unfortunately undervalued, in my opinion, since improper configuration can still "work fine").
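To spell out why that paste is dangerous, here is a tiny Python sketch of what the permission bits actually grant (only standard stat constants, nothing project-specific):

```python
import stat

# chmod -R 777 sets read/write/execute for owner, group, AND everyone else.
everyone = 0o777
# A common sane default for directories: only the owner can write.
sensible = 0o755

assert everyone & stat.S_IWOTH          # 777: any local user can modify the files
assert not (sensible & stat.S_IWOTH)    # 755: "other" users cannot write...
assert sensible & stat.S_IXOTH          # ...but can still traverse the directory

print(oct(everyone), oct(sensible))     # 0o777 0o755
```

The "works fine" trap is exactly that world-write bit: everything runs, right up until another user or compromised service scribbles over your files.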

Using Linux full-time for years will make you more than comfortable. And yes, it probably will take years. You may come to prefer the terminal to most of the GUI wrappers provided in desktop Linux distros. You won't even notice when you come out the other side. You'll realize that everything only seemed like it was 43 commands away because you only knew 2 commands to begin with. Typing the most common commands will be second nature and take less time than moving your hand to the mouse. Anything that does need to be reasoned out and typed slowly you will learn to embed in a script, with comments so you can remember how it works, and that will save you even more time.

Most importantly, in the end you will have the confidence in your abilities to write long, condescending comments on Hacker News. I kid of course, but you will no longer fear tooling (though you may grow weary of it - looking at you nodeJS), and I truly believe that's a more important and difficult skill than reading and writing code of all sorts.


Interesting, my experience is the opposite. Since Windows 10 came out, it became almost unusable, with random problems even after debloating scripts. Ubuntu has been my go-to system for a while now, and I basically just want to watch YouTube and edit text.


> My only complaints are that Linux can be tedious

After using Linux daily over 10 years now, this is exactly how I feel about Windows now. I instantly get frustrated any time I have to use Windows.


My biggest gripe with Linux is that it's an incredibly brittle system that you can easily blow up, either through no fault of my own (a grub update once made my system unbootable), issues which were triggered by trying to work around ubuntu's shortcomings (I installed an nvidia driver from a ppa, which broke my system), or through curious experimentation gone wrong.

Combined with the constant weird bugs ranging from annoying to potentially system-breaking, and the lack of resources you have for troubleshooting (because few people use it, and the system changes so fast that existing posts become outdated), it's really hard to fix, too.

I honestly try to never update my Linux box, unless I'm prepared to invest time into potentially having to fix it. And Windows' annoying updates are absolutely nothing compared to the daily barrage of packages, each triggering a restart.


> With Linux it can feel like I'm fighting with the system at times.

All your points are valid, and it's been a meme for a while now that the year of linux on the desktop is always next year.

HOWEVER, consider the following:

When you're on Linux you're fighting the shortcomings of the operating system (and learning stuff as you go), whereas on Windows/macOS you're fighting companies actively trying to screw you over (and over and over again, always in new ways).

The question now becomes: what fight are you willing to fight?


Most of your interaction with your computer is via the Desktop Environment. Which is Gnome in a default Ubuntu installation. If you're not particularly interested in exploring alternate operating system philosophies, I'd recommend looking into other Desktop Environments to use on Ubuntu. In particular, I'd recommend XFCE or Mate since they behave conventionally.

One thing to understand when moving from a proprietary to an open source OS is how much control you have over the system. Just about any part of it can be disabled or upgraded or swapped out for something different. If you learn where the seams are and how the different pieces work together, it gives you far greater control over your computer than any proprietary OS. If you expect everything to just work, you may be setting yourself up for disappointment.


I have been switched for a year now, for the 4th? 5th? time in my life, and it looks like I'm gonna end up back on windows once again.

The perpetual problem with Linux is that it is written and maintained by people who love Linux. It's great if you just need a machine to check email, and it's great if you are well versed in Linux OS structure and know the CLI command set through and through.

But if you are middle of the road power user, linux is just a constant annoying nightmare of reading forum posts and copy+pasting seemingly random strings of characters into the terminal hoping that it will fix the sound issue on the video you're trying to play.


My personal experience has been that Windows is easier to get up and running as long as I use it as designed. Getting off that happy path turns into a fiasco for me. I'm sure a lot of my difficulties would be trivial for a Windows expert to fix, but I'm not one.

For me, it's way easier to bend Linux to my will.

(I'm typing this on a Mac. I spent my first months on a Mac trying to make it act like my Linux desktop, and hated it. Then I decided to try a month doing everything the Mac way instead, and ended up loving it. Go figure.)


I've been having a reasonably good experience with Pop OS, which is Ubuntu-based but different in a bunch of ways. The makers (System76) put it on the computers they sell so they have a strong incentive to make it "just work", and that seems to have mostly worked out well.

As someone with years of light Linux experience but no patience for pissing about with config all day, my golden rule is to avoid updating anything ahead of the version that the distro bundles. E.g. no I shall not be updating my Nvidia drivers independently!

I like docker for arbitrary command line software I haven't checked the source for, Flatpak (or Snap/AppImage I guess) for big complicated GUI software, and if those aren't options I sometimes run things in a VM if I don't trust them not to screw up the system (hello Tizen Studio!). I ended up installing Blender from Steam, of all places.


I hear you, but windows seems like a black box in a lot of ways. I don’t have any trust that what it tells me about running processes is accurate. Under Linux I feel like I am actually in control of the machine on a much greater scale. And I’m sure you know know what they say about great power..


As others have also said, you should start by learning the basic concepts, not a specific flavor/distro, to understand how everything fits together. You sound like someone who could very well succeed at that, since you haven't given up yet :)

I can only recommend starting to read https://wiki.archlinux.org/title/Installation_guide

Once I installed arch I never looked back. I heard good things of the rolling distros of Suse and Fedora as well, but arch feels more vanilla than the others to me.

If you have a lot of time and passion you could also have a run of linux from scratch. You do it once for the learning part and then choose a distro of your liking.


"Once I installed arch I never looked back"

I'm also very fond of Arch Linux and I use it on my laptop.

However Arch is very much DIY and is great for the person who wants to get their hands dirty because with Arch you assemble and maintain the installation yourself.

For someone who wants things to work out of the box with no (or minimal) tinkering, I would recommend a different distro.

Arch is a distro for learners and enthusiasts. But it would be frustrating for someone who just wants a working OS.


If you want the primary benefit of arch (AUR, best package manager repo) but don't want to do a custom set up then Manjaro is a very good choice.


Yeah the control you get over Linux is both good and bad. You can pretty much get it to work exactly how you want, but getting there is very complex, and remembering CLI commands is extremely difficult for me even after many years of doing linux server admin.

It's slowly getting better though, things are starting to have sane defaults more often, and there are more easy to use GUI settings for things, instead of needing to deal with stuff that requires having documentation open on the side.

My benchmark for 'good' is that I shouldn't have to look at documentation or remember CLI commands to set up and use an OS.


So... your complaints are about Linux as a desktop, direct-use environment. The linked article is about a datacenter device (or at least a dedicated build box; virtually all 7995WX chips are going to be deployed headless). But all the skills you need to deploy and administer a dedicated 96-core monster are ones you can productively apply to install packages or whatnot on your Ubuntu/Fedora/Arch/Mint/whatever laptop.

But also, all that said: if you want a Linux laptop where everything "normal" works with no fuss, just buy a Chromebook.


I have used all the main OSes. Linux (plain LTS Ubuntu) has worked way better than anything else, as long as the software was available. With Figma and an iPad I do not need a Mac or Windows anymore. I still have a separate hard disk with Windows for the occasional game.

Sometimes I reboot and people are surprised how quick I am back. Also I use only wired accessories, so that might be a factor.


Reading the responses to this comment is interesting: wildly different experiences with Windows and Linux, likely due to the different use cases involved, the hardware the OS is running on, and the software/drivers installed.

My 2c: If you use common PC hardware with standard peripherals, and stick to software in your distro's package manager, you're fine. Ubuntu and Mint are popular and easy-to-use choices. Most people have an easier time than I do, but it feels irresponsible not to post this; I've been trying Linux every few years for the past 20. My experience hasn't deviated much from this:

I've found that once you start installing drivers for non-standard hardware, or installing software not in your package manager, there's a pattern of

  - C+P text into CLI
  - Errors
  - Sudo edit system config files 
  - Browse internet forums, Stack Overflow, and generally channel XKCD: 979. Tens of browser tabs open; hopefully one has the answer.
  - End up kicked out of the GUI after a reboot
  - The system is totalled, in that it's easier to do a clean install than fix it.

It feels like there's a tension between "you're not supposed to use sudo except in special cases" and "you have to use sudo to do anything."


When you say Ubuntu I presume you mean the default Gnome version? If you're really looking to replace Windows then KDE (i.e. Kubuntu) is far closer in terms of UX and 22.04 is pretty solid in my experience. Still tedious at times of course.


I've had good results out-of-the-box with PopOS, an Ubuntu variant.


How long have you been using Windows for? You're "playing around" with Ubuntu and expecting to be as good at it as at your full-time OS of many years?


I would recommend Pop OS as a distro that just works. You don’t have to configure anything, and the built in pop shop meets all my software needs.


Maybe you should try FreeBSD, it's much more coherent


Linux Mint is quite good. It’s my go to, the UI makes sense


What kind of tasks are you talking about here that require that many commands? Sometimes tasks do take that many steps, because you're doing more complex things than you could without the CLI.


>My only complaints are that Linux can be tedious, everything is just 43 commands away

That is because you are using Ubuntu, which is a prehistoric Linux OS. Try Fedora Cinnamon.

People complain about Linux, but they're using something that is deliberately a decade old for 'stability'. Meanwhile I have zero issues with Fedora Cinnamon's stability. I think Ubuntu and its children are cashing in on name recognition. It's time to move on.


> My only complaints are that Linux can be tedious, everything is just 43 commands away

Perhaps your training was done with chatgpt or some other procedural text generator because that hasn't been the case for quite a while now. Some people should stick with poorly engineered operating systems such as Windows, since stepping out of one's comfort zone is not for everyone. Some people _need_ to be told what to do and how to think in all aspects of life.


43's an exaggeration, but when something goes wrong the command-line is generally the only way to fix it even in the friendliest of distros.

In the old days in order to effectively use Linux at all anyone would have had to become familiar with the terminal and how it all works. Nowadays users only have to do that when something goes wrong - and of course that's great progress, but it does mean that those users are less prepared to deal with issues.

It's also easier to get help than it used to be ("RTFM n00b"), but that help is almost always in the form of some arcane command without any explanation of what it does, so it doesn't really help anyone learn.

Personally I'd like to see all this stuff hooked up to a GUI, with easily accessible docs for everything built in.

(I think Windows is actually quite good, unfortunately parts of Microsoft have become very user-hostile. I wish they'd just say "buy the new Windows! It's $200, we'll support it for 6 years, it will have no ads, it won't require any "cloud" and it will not harvest your personal data.")


> Personally I'd like to see all this stuff hooked up to a GUI, with easily accessible docs for everything built in.

That is indeed something I'd also like to see more of, but the number of times I've needed the command line is quite low. Ironically I have more issues with my Windows install on a very modern laptop: drivers suck, I need the command line to disable Windows' built-in spyware, file sharing is flaky, and so on. My Linux machines all work; Linux on the same laptop just works. Also, Linux, not being a commercial OS, can be expected to have some issues around the edges, but overall it beats Windows in all technical areas except where vendor lock-in kicks in.


And some people need to think about how inherently superior they are to most other people in order to feel good about themselves.


I ran an experiment a while ago. My windows work machine takes a good 3 minutes to compile one of my C++ projects, but a Linux VM on that same machine builds the same project in 45 seconds.

That's really all I need to know about Windows


On a work machine, that's more likely to be due to excessive background services installed by the company, particularly antivirus (which is known to kill disk performance, which means it's particularly awful for compiles). Don't get me wrong, I despise windows on its own merits, but I don't think that particular case is a fair comparison; there's a good chance that a completely clean windows VM in the same circumstances would also be faster. (Now, that might still be slower than linux, at which point we would have a fair comparison)


Nope. This is a totally vanilla Windows install.

The general consensus I've found while researching this problem is that Windows Defender inserts itself into the compiler process to monitor it for malware.


On Windows antivirus is a necessity; not so on Linux. So OP made a fair, practical, real-life comparison, not a synthetic benchmark.


It's not even that, windows forces you to use Defender. It's always on, and you can only disable it temporarily. Even when it's disabled, the process still seems to be active and consuming resources.

It's Norton all over again, except now the malware is an integral part of the operating system and it's harder to remove than any real virus.


Why wouldn’t the first assumption be that the bulk of these differences arise from compiler and runtime library (memcpy, etc.) efficiency distinctions rather than the OS?

I appreciate that to the end user it doesn’t matter which parts of the stack are better tuned, but framing this as “Windows vs. Linux” seems unjustified without more evidence.


I can't find any mention of the compilers, compiler settings or whether any prebuilt binaries were used. That makes it even harder to distinguish between compiler and runtime performance impacts.


I have two rigs here at home. A 12700k gaming box with a high end Nvme disk. Also, a 13900k workstation with Nvme disks as well. The gaming box runs Windows 11, and it feels generally snappy until I’m navigating the filesystem or dealing with compression. On the flip side the other box is running Debian 12 and feels significantly faster. Could be 12th gen vs 13th gen but I don’t think that’s the case.


I have a Zen 2 Threadripper box that I dual boot (it's a workstation for coding that I sometimes play games on), and I can confirm. I don't know what the exact technical reason is or why nobody in Redmond is embarrassed. But anything relating to the filesystem is simply dog-slow on Windows, and the more high-end your system is, the more this becomes the major bottleneck.


I blame Windows Defender (anti-virus).

It's the first thing I disable whenever I want to do some serious disk I/O.

Also, Windows still uses NTFS, which dates back to 2000. For compatibility reasons, fair enough. But surely NTFS wasn't designed with SSDs in mind, let alone NVMe drives.


> I blame Windows Defender (anti-virus).

It would be nice if Windows Defender toned down its aggressive on-the-fly scanning of anything I/O-related. For any I/O activity, it regularly consumes up to 30% of my CPU. Moving files to a different directory revs up Windows Defender even if it has already scanned that folder in the past.

I wish there were granular control over Windows Defender so that it would not be aggressive with every I/O activity. I assume Microsoft did this intentionally due to the rise of ransomware attacks and the inept people who mindlessly click on anything; they don't want to be liable for being lax.


You may want to give dev drive a try https://learn.microsoft.com/en-us/windows/dev-drive/


Exactly. If only I could tell Windows Defender to ignore every node_modules directory I have in my computer. Literally millions of files.
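For what it's worth, Defender does support path exclusions with wildcards via `Add-MpPreference` in an elevated PowerShell. A sketch, with hypothetical paths; each `*` matches a single path segment, so whether a handful of patterns covers every node_modules on your machine depends on how your projects are laid out:

```powershell
# Run from an elevated PowerShell. Paths are examples; adjust to your layout.
Add-MpPreference -ExclusionPath "C:\Users\*\source\*\node_modules"
Add-MpPreference -ExclusionPath "C:\Users\*\source\*\*\node_modules"

# Verify what is currently excluded:
Get-MpPreference | Select-Object -ExpandProperty ExclusionPath
```

The obvious trade-off: anything dropped into an excluded directory is no longer scanned, which is exactly where supply-chain malware in npm packages would land.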


The solution is to not use windows.


NTFS has had several updates over the years, and supports trim just fine.

Of course, that doesn't make it comparable to ext4 or zfs or whatever, but I wouldn't expect it to be the worst bottleneck in the system.


Some updates, but it's still old. There is only so much you can do while still maintaining Microsoft's legendary backwards compatibility.

For example I'm running Windows 11 Pro on this machine with a brand new NVME.

"fsutil.exe fsinfo ntfsinfo c:" tells me [0] I'm running NTFS v3.1 which was released 22 years ago [1].

[0] https://i.imgur.com/VcXNO4P.png

[1] https://en.wikipedia.org/wiki/NTFS


> There is only so much you can do while still maintaining Microsoft's legendary backwards compatibility.

...like adding support for more filesystems.

I'm curious about what (and when) other parts of the IO stack have been updated. Something has to have changed to add trim support: I'm guessing the scheduler.


Indeed. It's impressive how they managed to extend it.

Wikipedia provides some hints:

> Although subsequent versions of Windows added new file system-related features, they did not change NTFS itself. For example, Windows Vista implemented NTFS symbolic links, Transactional NTFS, partition shrinking, and self-healing.[21] NTFS symbolic links are a new feature in the file system; all the others are new operating system features that make use of NTFS features already in place.

If I had to guess, most of these new functionalities leverage NTFS metadata: https://ntfs.com/ntfs-system-files.htm


TRIM has nothing to do with performance. It’s for longevity of the disk.


It's because the NT filesystem is optimized for a different workload. This is why WSL2 (which you should be using) uses ext4 (that's the default with an Ubuntu distro, at least).
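This is easy to see from inside a WSL2 distro: the Linux root typically reports ext4, while Windows drives come in over a 9p bridge, which is where the small-file slowness lives. A quick check (mount point names may vary):

```shell
# Show the filesystem type backing the Linux root vs. a mounted Windows drive.
df -T /                                   # typically ext4 inside WSL2
df -T /mnt/c 2>/dev/null || echo "/mnt/c not mounted (not running under WSL)"
```

The practical upshot: keep source trees and build output on the ext4 side (`~/project`), not under `/mnt/c`.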


> the NT filesystem is optimized for a different workload

a different workload than browsing files in explorer?


TWUP (Transactional Windows Update Processing)


LOL had me fooled for a minute there.

oh wait maybe this isn't a joke? https://en.wikipedia.org/wiki/Transactional_NTFS


IIRC there is a filesystem filter layer on Windows where various apps like antivirus hook themselves, which is why filesystem access under WSL1 was actually faster than on Windows itself.


Isn’t this hilarious? In order to achieve high performance on windows you need to access your file system via a virtual machine. Typically it’s the other way around.


The day I realized I could find files by extension just like that, in under a second, from the terminal or Nautilus changed my life.
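For anyone who hasn't hit this yet, a minimal self-contained example (throwaway paths under /tmp, just for illustration):

```shell
# Build a small demo tree, then find every .md file in it by extension.
mkdir -p /tmp/ext-demo/a /tmp/ext-demo/b
touch /tmp/ext-demo/a/notes.md /tmp/ext-demo/b/todo.md /tmp/ext-demo/b/pic.png
find /tmp/ext-demo -type f -name '*.md'
```

If the `fd` tool is installed, `fd -e md` is a shorter, faster equivalent.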


My experience has always been that a Linux workstation environment outperforms Windows simply on account of the known poor performance of filesystem stuff on Windows vs Linux.

It's just different trade-offs. Windows has a user space env that does a shitload of background filesystem indexing/scanning. And then corporate environments usually cake on virus and compliance checkers on top of that. And then my understanding is that NTFS itself has always been optimized for different workloads than most Linux filesystems.

But right back into the 90s it was always like this.

When I boot this Ryzen laptop of mine into Windows, the core DE comes up really fast but there's then a good 2-3 minutes of busy cursors and background I/O hogging the machine before it becomes responsive. Ubuntu takes longer to boot, but then is immediately snappy as hell.

For mass market, consumer user stuff, the trade-offs make sense, I guess.

