I know that open source and Linux in general don't need/want another DE, but selfishly... I really want System76 to succeed. They give me the same kind of feeling I had when Apple was kicking butt in the early 2000s. And I think that for them to become a general-audience company, they need to really own more of their software. They have some great ideas, and they've already extended GNOME with plugins to within an inch of its life; it's time to move out on their own. And it's not like a new DE means apps won't work. DEs are just the window dressing and computer management.
What I really wonder about is Wayland support. Is this going to be a brand-new DE that's X only? That would be a real shame. I know System76 has stuck doggedly to X because they sell so many NVidia cards, but NVidia supports GBM now.
> I know that open source and Linux in general don't need/want another DE
Unpopular opinion, but more DEs is fine and good, especially if they will have teams that are either (a) large or (b) well-funded. Plasma and GNOME are very good, and Unity was actually great to use in its heyday.
Imo what we don't really need more of are the conservative, under-resourced 'classic look and feel' DEs like most of the minor players in the space. Those tend to end up incomplete and ill-performing, and there are already lots of them. I hope the existing ones thrive, but I don't think having more of them would do much good.
But anything as good as the big two, but with a different focus? Let's see it!
If this DE has low resource usage, a working freedesktop screensaver protocol implementation, and working idle detection, but doesn't force their toolbar and window manager on me, I'll take it immediately.
Lxqt is the closest to allowing that because it wraps standard tools when possible, but this wrapping of standard tools also means that those tools don't really work.
For example, I use slock for screenlocking, but there is actually no working third party freedesktop screensaver implementation that doesn't tie you into their DE.
Xsecurelock seems to have hacks for it, but it can't even do something as simple as just showing an image without breaking with the wrong window compositor.
I tend to disagree here. I recently switched back to macOS after years of running both. The DEs on Linux are just one generation behind. Things have gotten better, but macOS and, to an extent, Windows are ahead.
Linux DEs would be well advised to onboard more graphics and UI experts to make them more user-friendly and stable. Desktop metaphors have more or less stabilized and it's counter-productive to have 5 takes on the Linux-side alone.
The entire problem starts and ends with this term. It is ambiguous, personal, localized, fluid and case-dependent.
What is friendly to one user is insulting to another. What is friendly to me at 10:00 in the morning is frustrating when I have a presentation in two minutes. What is friendly to an American pensioned welder is insulting to an Iranian accountant.
It's interesting how people want such different things. I can't stand Windows for 5 minutes before its user interface starts to really annoy me. And the Mac looks great, but all the Unix technology underneath is from the Stone Age.
Just to explain why I love Linux so much - when I'm in Gnome, I'm not bothered by any alerts or notifications or popups or sounds or ads or anything! It's a silent, stable, good looking desktop that runs my programs and doesn't get in my way at all. It's amazing! I truly love it.
I don't disagree, which is why I use KDE, as it is trivial to get it to look and behave like macOS, and to do so stably. Probably why many KDE distros that aren't following KDE's defaults use it too (Garuda Linux, XeroLinux, FerenOS).
I think Linux DEs are fine, it's just that the default config / on-boarding rarely asks what the user wants and what they need.
More distros/DEs need to incorporate a UX/Desktop Layout Switcher, like Ubuntu Budgie, Zorin OS (GNOME, XFCE), Manjaro (GNOME; there are plans in the forum to do KDE as well?), and FerenOS. It'll solve the issue of personal preference in UX, as people could just one-click choose what feels/looks best for them.
I personally don't mind taking a few hours setting up the UX I want with Fedora KDE, but I doubt most others do.
If the majority of companies that rely on Linux use it as a server for some kind of service(s), desktop environments or even graphics are not a priority. That's not to say that such use-cases don't exist. But the focus is, generally, not on outshining another OS or platform through the user interface or graphics.
A DE is pointless without the apps that run on Linux.
And that's the problem with the proliferation of DEs. Applications, which range from having no developers actively working on them, to the best teams at Microsoft working on them, have to handle yet another DE, and have to deal with the bugs raised by the users of yet another DE, etc.
UI/UX is hard enough to begin with, but requiring devs to either maintain several UI/UXs, or try and come up with a design that works across the proliferation of DEs is highly counterproductive.
> UI/UX is hard enough to begin with, but requiring devs to either maintain several UI/UXs, or try and come up with a design that works across the proliferation of DEs is highly counterproductive.
Very few developers actually attempt to do this, on any platform— virtually none, as far as I can tell. It's slightly more common on macOS for developers to actually try to emulate or reuse the design language and look-and-feel of the base system. But on every platform, it's extremely common for GUI apps to just throw the design language and other details of platforms they run on completely out the window. It's the M.O. for the most popular cross-platform toolkit right now (Electron). For every app that maintains multiple UIs, there are at least ten that do that, maybe a hundred!
Besides, COSMIC doesn't use a new graphical toolkit. It uses GTK+, just like GNOME.
A lot of developers are very concerned about UI/UX. Linux just makes it impossible for them to do so.
But even without them caring for UI/UX, the proliferation of DEs, even with common UI toolkits in the Linux world means they spend a lot of time and energy working on bugs that are not true bugs but are artifacts of a UI theme that the developer did not consider while developing their app.
For example, it's almost certain that the unified tooltip in COSMIC will lead to bugs complaining about how users are unable to drag the window by clicking on the chrome. In this case it would be fairly easy for a dev to close the bug because this is such a prominent and obvious difference in COSMIC. But there will certainly be other UI differences that will not be as easy to pin down to being the result of COSMIC's UI changes, and will not be as easy for devs to close.
> And that's the problem with the proliferation of DEs. Applications, which range from having no developers actively working on them, to the best teams at Microsoft working on them, have to handle yet another DE, and have to deal with the bugs raised by the users of yet another DE, etc.
Ideally, that's what the freedesktop.org specifications help with. You build against the specs and interfaces, and the code should work across DEs that implement specs from freedesktop.org. In my (albeit limited) experience of writing against freedesktop.org specs, it works pretty well.
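As a small illustration of coding against a freedesktop spec rather than a specific DE, here's a sketch of the XDG Base Directory lookup rule in Rust. The function name `xdg_config_home` is mine, and the environment values are passed in as parameters just to keep the example self-contained:

```rust
use std::path::PathBuf;

// Per the freedesktop XDG Base Directory spec: use $XDG_CONFIG_HOME
// if it is set and non-empty, otherwise fall back to $HOME/.config.
fn xdg_config_home(xdg: Option<&str>, home: Option<&str>) -> Option<PathBuf> {
    match xdg {
        Some(dir) if !dir.is_empty() => Some(PathBuf::from(dir)),
        _ => home.map(|h| PathBuf::from(h).join(".config")),
    }
}

fn main() {
    // An explicit setting wins.
    assert_eq!(
        xdg_config_home(Some("/tmp/conf"), Some("/home/demo")),
        Some(PathBuf::from("/tmp/conf"))
    );
    // Unset (or empty) falls back to ~/.config.
    assert_eq!(
        xdg_config_home(None, Some("/home/demo")),
        Some(PathBuf::from("/home/demo/.config"))
    );
}
```

Any app or DE that follows the same rule will agree on where config lives, which is exactly the kind of cross-DE interoperability the specs buy you.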
I think devs can use Gtk if they're developing for Linux or don't mind using pyGtk for multi-platform, use Qt or Flutter otherwise if they're going multi-platform, make sure everything follows freedesktop specifications, and then just pack it to Flatpak, test it for Gnome and KDE... then just tell everyone whose DE doesn't support it to just deal with it.
Heck, many apps are web-based and use Electron anyway, and those are generally multi-platform.
I think if whatever app they make/port works via Flatpak, it should work on all/most modern distros.
Many app developers have complained that even within Gnome, apps developed using GTK get a very high number of bugs that are caused by unique customizations that the user may have installed and no one else does. These bugs, despite being significant in number, consume an even more outsized amount of resources and energy because they are very difficult to reproduce.
Yeah, a new DE would be welcome. I hope this will be less opinionated and less barebones than GNOME; that one just needs too many add-ons to be useful.
I use KDE now which does provide a lot of choice but I'd love something that has tiling built in. And yeah I know there's add-ons for KDE that do that :)
You never know what's possible in the future but I imagine right now they're working on the core features so supporting every compositor around is down the road a bit.
> DEs are just the window dressing and computer management.
That's how you end up in a situation where the contents of the window clash with the dressing because more and more apps can't be themed. If they want a consistent look they'll need to fork or write a whole new set of apps.
The irony is that the reason authors give for removing theming is that they want a consistent look.
For some apps I don't care too much, because I run them in full screen anyway, and their specific look is adjusted to their function: IDEs, DAWs, even graphic editors. But if they do support DE-wide themes, I do appreciate that!
If you’re thinking of the website I’m thinking of [1], the screenshot they lead with isn’t just inconsistent. It’s broken, with light text on a white background. And I’ve seen apps fail in that same way for real when trying custom GTK themes.
It’s the same reason web browsers can’t swap in a different stylesheet for a site without extreme measures like Reader Mode that strip almost all of the design work out of the site: it’s hard to mix design work from two different people (the application author and the theme author) without risking a total mess.
Not only that, you also end up with half-baked applications, because there isn't a consistent development stack, so applications are stuck with UNIX IPC for communicating between themselves.
GNU/Linux apps will never have something like OLE 2.0 or XPC that actually works consistently across desktops.
There is D-BUS, but not everyone cares it is there.
As a Linux newbie I'm wary of distributions that include many of their own custom apps. I know most of these teams are just a few people, and I doubt their ability to create the "Apple" (OS + Apps) experience. Alas, I'm new to this, so perhaps I'm missing something.
I don't mind system76 succeeding, that would be great, but I have not really liked any of their laptop offerings. I also really don't want another half-funded DE for Linux. I really wish they would dedicate these resources to improving GNOME and GTK instead. Things like VRR, DPI scaling, touchpad behaviour, prioritising WiFi driver bugs, and basics like copy/paste interactions between apps and screen sharing, et al., are still less-than-great experiences on Linux, and improving them is more important than a new half-baked UI.
If what we want is a rewrite of GNOME with COSMIC features, that's not going to happen with GNOME. We already submit patches to GTK, and GTK4 is gaining layer-shell protocol support as a result.
Screen sharing is already made easy with Pipewire. I've not seen any issues with copy/paste between applications. WiFi driver issues can only be fixed by kernel driver developers with hardware documentation for the exact model of the PCI device they're writing a driver for. The same goes for touchpad drivers, but I've not had any issues with them on our laptops. We sell systems where we can vouch for the quality of the driver support in Linux for the WiFi and touchpad.
In my opinion, you're wasting your time with layer shell. The design of it is very X11-like and flawed. Panels setting their own position is problematic and prevents the shell from doing layout updates in one pass. You'll want to drop it eventually and use a private shell similar to the way it's done in weston.
I left PopOS and never looked back when I needed nvidia-docker to do some deep learning work and learned they had knowingly made it difficult to impossible to reliably use nvidia-docker with PopOS, and they recommended their own unmaintained project that doesn’t even intend to be a replacement for nvidia-docker. It couldn’t have been more useless to me.
Breaking compatibility with Ubuntu-compatible dev tools is one thing, but the lack of awareness or consideration of tooling standards is evidence that Pop!_OS is not for me.
I won't ever defend nvidia, but... I really think Broadcom is worse. They've ruined tons of routers and wireless cards. Also, the Raspberry Pi is full of blobs because of them.
I’m not looking to blame anybody. System76’s approach just conflicts with my priorities when choosing an Ubuntu-based distro. I prefer to not have my OS surprise me with requirements that I replace a major portable dev tool standard with one of their bespoke unmaintained side projects. I doubt I am alone and maybe others will find the info useful. I just can’t trust their work any longer after realizing they were serious about that tool of theirs being a replacement.
`sudo apt install nvidia-docker2` works on 21.10. I have been packaging NVIDIA's container toolkit since tensorman was created, because tensorman doesn't work without nvidia's container toolkit. It's a dependency.
Nvidia-docker is not the same as nvidia-container-toolkit, and neither of them had official support for 21.10 the last time I checked (a few months ago).
It is a component of nvidia-container-toolkit. Take one minute to browse the Pop repository and you'll find three packaging repositories forked from NVIDIA. libnvidia-container, nvidia-container-toolkit, and nvidia-docker. Installing nvidia-docker installs support for the whole system. Now you have functioning NVIDIA support in containers.
This is a new repository that replaces a pre-existing nvidia-container-runtime repository that we've been using since 20.04 that does the same thing. It still remains that tensorman would not function without NVIDIA docker container support, because it's literally just a convenient frontend for interacting with Docker's CLI.
I never said otherwise. As you know, nvidia-container-toolkit only very recently merged with nvidia-docker, and you know perfectly well that you have not always packaged or supported nvidia-docker.
They occasionally have driver mismatches between the upstream containers and the distro-provided driver. This happens because NVIDIA only supports Ubuntu LTS releases and Pop moves faster. I solved it forever by pinning NVIDIA's repos to a higher priority. Otherwise I can't figure out what "unmaintained project" you're talking about, or what "tooling standards" are.
They are most likely referring to `tensorman` https://support.system76.com/articles/tensorman/ which is regrettably premised on the notion that you have any intention of using Tensorflow in the first place.
In all actuality, the correct article to reference is https://support.system76.com/articles/cuda/ but alas, this is presently broken on PopOS 21.04 unless you pin NVIDIA's repos as you've suggested.
At the very least, an unpleasant experience. Probably some blame to be shared by system76 and nvidia.
It wouldn’t be significant if the Pop OS developers weren’t telling users that they don’t need nvidia-docker. Of course, they can tell users whatever they want, but I’m not going to trust their OS if this is how they plan to handle things like this.
I am actually participating in a thread right now where contributors are happily trying to solve this issue. Their devs are very helpful if you reach out.
Since when have we told anyone that they don't need nvidia docker packages installed? You can validate that the Debian packaging has always shown nvidia-container-runtime/nvidia-docker as a recommends for CUDA support.
I’m sorry, I don’t quite recall the details but it was a few different things. The main one is forcing certain Nvidia libraries in the package manager so you need to take a few steps to undo some of the Pop OS Nvidia installation, which breaks other Pop OS stuff, of course.
There is no need to undo anything. We regularly test CUDA functionality with our packaging, and it's the whole point of having the packaging in the first place. So you don't have to manually install things from a third party repository or deb.
“CUDA functionality” and nvidia-docker are not the same thing and you know it. This is the attitude that has me turned off. Just acknowledge it and we could move on! Unless you deleted them, anyone can search your GitHub to find your comments telling users to use your bespoke tensorman tool instead of installing nvidia-docker, acting like one is a replacement for the other. And with the same presumptive attitude.
I have no idea why you're trying to argue with me over absolutely nothing. You could have saved so much time by asking questions instead of accusing me of things I haven't done, and spreading misinformation about it.
CUDA support in Docker requires nvidia-docker+nvidia-container-toolkit+libnvidia-container, and that is the whole point of having nvidia-docker installed. We validate that CUDA functions inside of Docker when pushing updates to nvidia-container-toolkit and friends.
Tensorman is not a replacement for nvidia-docker, and you're not going to get anywhere by trying to convince me about the functionality of something that I personally wrote. It is quite literally just a simple command-runner that runs docker commands, specifically for the purpose of managing the official Tensorflow Docker images, and getting a more streamlined setup for managing your local Docker images based on them. The sole purpose is to replace our previous tensorflow packaging in Pop!_OS that became impossible to package because newer versions of Tensorflow did not build beyond 18.04. Tensorflow themselves recommend using their docker images instead of trying to package and install their libraries on your host OS.
I agree there's an old-school Apple feel to it. The difference may be that System76 is concerned with user freedom: from coreboot to COSMIC, they intend to produce and support FLOSS.
I think if Apple had embraced free software early in its history, it could have known a much brighter future (err, present?). Hell, even HyperCard may still be alive and kicking.
I developed software for Mac OS X around 2005 and I can guarantee you that Apple certainly wasn't kicking butt back then and is probably even worse off today with OS X becoming bloated and full of cruft.
> lots of Linux-based hardware that they design in-house.
Are their Clevo-based laptops designed in-house? If they are, why are they designed with such abysmal speakers, mediocre webcam and no attempt to go past the full-HD screen resolution? Why not design something cool like Framework?
Nope, I don't consider those in-house. They are working on an in-house laptop, and if it ends up anything like their in-house desktops, it could be really nice.
This would really be a game changer. Unfortunately, you need a really large number of units before building your own custom laptop hardware is feasible. I really wish System76 success with that.
I always thought that, if Dell were smart, they would do their own Linux-based brand, where they would reuse parts from mainline Dell. Kind of like how they produce gaming laptops.
In November 2019, System76 announced that it would start designing and developing laptops in-house starting January 2020. The development process was estimated to take 2-3 years. The first generation of System76’s in-house desktop line (Thelio) took over 2 years to develop.
Sure, but there are already a number of different desktops hurting for maintainers. There is a lot of hardware that doesn't work great with my OS (which might be a BSD), and I want them to fix that.
Not re-implementing GNOME. Not re-implementing macOS, which GNOME strives to imitate. OTOH I see the value of that: many people got used to macOS, and making things similar, keeping the cognitive load of switching low, makes business sense. Same as with Windows in the early 2000s.
HackerNews is really a bubble sometimes. I don't get who the "many people" that "got used to macOS" are.
Apple had a blockbuster quarter of sales in Q4 2021, outselling every individual Wintel manufacturer, let alone Chromebooks. And after that, guess what: they now have 8% of desktop/laptop market share, instead of the 6% they used to have. You can guess who covers most of the other 90+ percent.
I have seen this comment a few times but basically half the desktop environments for Linux are GTK. So, using GTK does not equal “re-implementing Gnome”.
I use Cinnamon with Manjaro - it is not GNOME but it is GTK. Unity was the same. MATE is old GNOME but quite different from new GNOME — still GTK though. XFCE, LXDE, and pretty much every popular DE outside KDE and FVWM2 is based on GTK.
Choosing Rust has nothing to do with re-implementing GNOME either, although I do like that choice. If nothing else, this ensures well-tested Rust bindings and paves the way for devs to write other GTK apps in Rust, perhaps even ones intended to target GNOME itself.
> But I'm always skeptical when underlying language choice is featured prominently as a selling point for any new project.
> It tells me, this is a technology-first, users-second enthusiast project.
> And thus, I'll be surprised if it tackles the deepest issues users need solved.
Some of the deepest issues that users need solved are ones that Rust was designed to solve at the language and compiler level.
1. System stability and memory efficiency, zero or fewer crashes due to memory-safety or thread-safety problems.
2. Security and assurance via the elimination of entire classes of attack vectors like buffer overflows.
3. Highly performant, responsive applications that are a joy to use.
Solving problems at the compiler level eliminates the reliance on fallible programmers to do so. People also tend to discount maintaining those solutions over years of dev team turnover, startup failures, etc. Building those solutions, capabilities, and constraints into the language itself makes maintenance over the entire product lifecycle more consistent.
It's like buying a Toyota, you know from the brand alone that you're getting a certain baseline level of reliability, maintainability, durability, and longevity. Rust is like the Toyota of programming languages - it can produce many different types of cars/programs, but they're all guaranteed to come with a baseline level of assurance, and to eliminate common classes of problems that degrade the end-user experience.
It may not make sense for other languages to be prominently featured as a selling point, but Rust is an exception.
> It's like buying a Toyota, you know from the brand alone that you're getting a certain baseline level of reliability, maintainability, durability, and longevity. Rust is like the Toyota of programming languages - it can produce many different types of cars/programs, but they're all guaranteed to come with a baseline level of assurance, and to eliminate common classes of problems that degrade the end-user experience.
It's possible to make buggy, slow, crappy software in any language. Rust is no exception. Nobody should derive any confidence from the language a product is written in.
> It's possible to make buggy, slow, crappy software in any language.
True.
> Rust is no exception.
Also true.
> Nobody should derive any confidence from the language a product is written in.
This, on the other hand, is a non-sequitur, or at least it's too strong, because Rust can do things that C can't do: safe deterministic memory management and safe concurrency without data races. These things are not just available, they are the defaults in Rust. You have to go out of your way to get these things wrong. So, I wouldn't derive absolute confidence from an application being written in Rust. But I'd be willing to bet a whole lot that it has fewer memory bugs than an equivalent application written in C.
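To make that concrete, here's a minimal sketch of the shared-state concurrency that Rust makes safe by default: handing threads a bare `&mut` counter would not compile, so the explicitly synchronized version below is the path of least resistance, and no update is ever lost.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared mutable state must be explicitly synchronized (Arc + Mutex);
    // the compiler rejects unsynchronized sharing across threads.
    let counter = Arc::new(Mutex::new(0u32));

    let handles: Vec<_> = (0..8)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                for _ in 0..1000 {
                    *counter.lock().unwrap() += 1;
                }
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    // No data race, no lost updates: every increment is accounted for.
    assert_eq!(*counter.lock().unwrap(), 8000);
}
```

The equivalent C program with a plain shared `int` compiles fine and races silently; here the unsafe version simply can't be written by accident.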
Exactly, this is what I should have explicitly said. Language defaults have a powerful effect on a project's norms and culture.
> You have to go out of your way to get these things wrong.
Yes, you actually have to try to write shitty Rust code, at least where there are defaults that prevent it. Thus the average Rust program will have that baseline level of quality. Like TQM/TPM/Six-Sigma for manufacturing software.
IMO the fact that the actix-web debacle was so well publicised speaks volumes in itself. The issues were actually discovered ahead of time by people reading the source code rather than people encountering bugs or vulnerabilities in the wild. That's a huge improvement over C software. Of course there are still bugs in Rust programs and libraries, but much fewer. And I think fewer to the point that eliminating them entirely might actually be feasible with enough effort.
For every vocal example there are tons that fly under the radar until they are in the news for the wrong reasons.
It is no accident that Ada and Java also have security guidelines similar to MISRA: even with memory corruption bugs out of the picture, as log4j recently reminded us, security only happens when it is part of the daily process, regardless of the language.
Isn't the whole reason you're mentioning actix-web because the community mostly balked at the amount of unsafe code in it? Surely that should be some evidence that the norms in Rustland are qualitatively different in many ways to what you'd find elsewhere?
Yes, pretty much like that. The author put "unsafe" blocks everywhere and the rest is history. Unsafe as a keyword is a kind of pledge: as a dev, you swear to the compiler that you've checked this part for correctness/safety. If you don't, you may get bitten, but the simple fact that "grep -R unsafe" will yield most (if not all) instances of potential memory fuckups is a huge win for auditability.
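A tiny illustrative sketch of what that pledge looks like in practice: the safe API is bounds-checked, while the unchecked variant compiles only inside an `unsafe` block, which is exactly what an auditor's grep will surface.

```rust
fn main() {
    let v = vec![10, 20, 30];

    // Safe, bounds-checked access: out-of-range yields None, not UB.
    assert_eq!(v.get(1), Some(&20));
    assert_eq!(v.get(99), None);

    // The `unsafe` block is the dev's pledge: "I have verified this
    // index is in bounds." It is also what `grep -R unsafe` will find.
    let last = unsafe { *v.get_unchecked(v.len() - 1) };
    assert_eq!(last, 30);
}
```

If the pledge inside the block is wrong, the bug is at least confined to a greppable, auditable region instead of being anywhere in the codebase.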
Rust's unsafe wouldn't have done anything to protect against a hypothetical log4r.
As for the memory fuckups you are right, though it's a pity that they are using a C GUI library.
Even if a binding has been deemed safe upon validation, that only remains valid until the next GTK version update; even minor ones might break the assumptions validated in the unsafe code blocks.
> Rust's unsafe wouldn't have done anything to protect against a hypothetical log4r.
You are correct. No programming language will protect you from writing flawed application logic.
> pity that they are using a C GUI library. (...) Gtk version update, even minor ones might break the assumptions
I don't know about that. GTK is more "battletested" than most frameworks, and has a lot of contributors to fix bugs. Also, and that's a very important point, GTK cares for accessibility! Using any other framework would have meant excluding a lot of people from the GUI.
As for assumptions, I certainly hope GTK devs don't push breaking changes to the ABI in minor updates. But I guess that's yet another assumption that can be broken.
But the point is the exact opposite because any use of unsafe is clearly marked as such, and therefore a code audit will instantly find its use and question it. Hence the fallout surrounding it.
Also, that is beside the point: unsafe only deals with a very specific class of errors that plague C code bases (and those of languages copy-paste compatible with it).
> Rust's unsafe wouldn't have done anything to protect against a hypothetical log4r.
My point is that Actix Web was still safer than it would have been if it were written in C++ (which is Rust’s whole goal as far as safety), and moreover the norms of the Rust community are so strong that most of the unsafe usage has since been removed, since it was unnecessary. Unsafe deals with memory errors, which are consistently the largest class of security vulnerabilities in native codebases. This is uncontroversial at this point. Rust won’t protect you from log4j, but it never claimed to. Java still protects you from mishandling memory, despite all the stupid automatic classloading features we keep discovering in it.
This is true and also mostly unhelpful. It's possible to build a car without a muffler. Does that mean it happens often enough that we need to dedicate a whole pooh-pooh session to the possibility? Is a manufacturing process not more likely to be useful if it prevents this error from ever occurring?
There has to be a middle ground between "Nobody should derive any confidence from the language a product is written in" and "I only use products written in language X".
You shouldn't derive all confidence in a product from its implementation language, but when you are looking at all available products and need to filter down to the ones you might be interested in, implementation language can be a useful litmus test for understanding how a product might behave without doing a complete in-depth evaluation.
For example if I learn that a desktop application is written in javascript, it is likely to be using electron or an OS webview. Or if an application is written in C++ it is unlikely to have GC pauses cause stuttery UI. Both of those assumptions may not be true (Sciter for JS and Boehm GC for C++ respectively). However those assumptions are a useful starting point unless proven otherwise.
You didn’t give any reasons about adoption of Vala. You said it’s hard to hire Rust programmers, and then just “Remember Vala anyone?” without actually saying anything more about Vala.
Well, what about Vala? Obviously C, Python, and especially JavaScript are infinitely more popular programming languages. Yet the number of developers building GTK apps with any of them seems to be in the same ballpark.
I would add from experience that using Rust increases the speed of development, like any language higher-level than C, and without the need to track down memory problems. It shifts focus more onto the actual development. Cargo helps this cause as well, though not always.
Maybe you're right in some fundamental way, but as an example iOS, macOS, Android or Windows are all considered stable and efficient, secure and highly performant by most end-users. As a specialist one can quibble and point to counter-examples, but all of these are fine choices. The only big differences are in privacy and level of control offered.
It's more likely that by the time Rust makes an impact in Linux stability and performance - because you're using an OS, not just desktop environment - nobody will even remember what System76 was.
Ok, but GNOME Shell uses JavaScript, which is where most of its defects (both security flaws and slowdowns) come from, not C. Calling JavaScript from Rust isn't going to be a materially different experience than calling JavaScript from C.
Ah just to clarify, the blog post you referenced is from this past June. COSMIC based on Rust is a new development that System76 has not blogged about yet besides mentioning it in a Reddit post.
As a System76 fan, I too am curious to see what the point is. I have no idea what Rust is nor do I care.
From what I know, System76 didn't shill COSMIC as a Rust DE; the only reference to Rust was in some Reddit comment. Most Pop!_OS users don't know what Rust is, so it wouldn't make sense to talk about COSMIC as a Rust project.
Now, 90-95% of users use GNOME or KDE, so having some competition from a `new` GTK4 DE can't be bad. I would like to see COSMIC gain attention, and maybe be ported to other distros.
The people that System76 sells to most likely buy their hardware. In which case the word 'Rust' is worrying at least. Would you buy a laptop that ships with rust?
Jokes aside: during early development it makes perfect sense to advertise tech being used, because the audience is so different at that time. So even if system76 would "shill" it as a Rust DE, at this phase, that would be fine, because of the highly technical audience.
System76 is just using Rust for COSMIC because the existing engineering talent here already use and like it for other things they work on at the company, like their device firmwares. It's a good common/familiar language for the teams they have. I don't think they're expecting Rust to do any magic for them.
They do this to rank higher here on Hacker News. What turns me off most is that they succeed, which indicates that a large part of the HN readership is biased toward Rust. Personally, I'm pretty sick of this evangelism and hype because I fell for it some years ago and started a project in Rust: countless problems and dark corners. I moved in time to a mature garbage-collected language. There's a reason Rust has succeeded mostly in the small niches where it's a good fit, and not everywhere (backend services, CLIs) like prominent evangelists advertised.
However, for System 76 Rust is a very good choice.
But calling it "Rust based" when it's GTK is clearly a bit bold ...
Especially when you have top voted sibling comments saying "Some of the deepest issues that users need solved are ones that Rust was designed to solve at the language and compiler level."
What?! The year of the Linux desktop hasn't been eluding us because of buffer overflows and memory safety. The Rust hype train is incredible. The Linux desktop problems have nothing to do with the goal Rust sets out to solve.
It's definitely a good choice of language for the problem, don't get me wrong, but choosing that language doesn't improve the chances of COSMIC revolutionising the Linux desktop situation one iota.
Actually, a lot of the early Linux desktop was hampered by this exact problem. I remember in the late 2000s experiencing segfaults constantly in a lot of applications, Mesa/X11, the kernel, and even the desktop environment itself. You had to spend a lot of time tracking down a setup that works and sticking to it when you find it. It's gotten remarkably better since then, but GNOME Shell is still really fragile with segfaults when developing extensions for it.
That said, this is not the sole benefit of Rust. It is more a side effect of aliasing-XOR-mutability than the primary benefit of Rust as a whole. Take the C and Vala landscape as a comparison: neither supports message passing with channels, and async support is virtually an unmaintainable hack. Vala thankfully has generics, but C doesn't, and it shows. It also doesn't help that your options for building software are pretty much limited to GLib and whatever sparse set of libraries might be available from the package manager. With Rust, by contrast, there's a vast repository of immutable, versioned libraries you can leverage, and publishing open source libraries on the platform is easy.
I don't know about revolutionizing the Linux desktop, but I can assure you Rust made me a better programmer (even when I write C or Python), despite my being a very bad programmer to start with.
I also know that Linux desktops are often victims of memory bugs, and that a higher-level language for extensions is something people appreciate, that's why there's JS in GNOME.
> I'm always skeptical when underlying language choice is featured prominently as a selling point for any new project.
> It tells me, this is a technology-first, users-second enthusiast project.
I disagree. There are thousands of pieces of software out there using convenient technologies at the expense of performance. I think a real, legitimate advantage of Unreal over Unity is that it uses C++ instead of C# for gameplay code; I've never been CPU-bound in Unreal. JS algorithms often run 100x slower than Java ones, and if GNOME were written in Rust we wouldn't have had quite so many memory leaks.
Also, pg credits much of his startup's success to his language choice, which let him iterate faster.
For a relatively new language that still isn't everywhere, it's a good demonstration of its viability for such a project. It also either testifies to the richness of the ecosystem, or promises that the ecosystem will get enriched in the process, since the development appears to be open source.
Original Nautilus and early GNOME developer here. I am very out of touch with what is current, so excuse some possibly ignorant questions. How is this a Rust-based environment if it is based on GTK? I assume GTK is still essentially the C-based GTK we used with some improvements.
Why is this called Rust-based? I’ll do some more research but would like to get some insight from more knowledgeable sources.
How pure does your program have to be to count as Rust for you? How few libraries in C or C++ may it call? The developers are writing all their code in Rust, so it is in Rust…
1) do the yak-shaving and create a whole new GUI stack in Rust (which would be an absolute boon to the Rust community but will be a tremendous effort), or
2) switch to Qt (and basically become KDE)
Thinking about it, maybe Sciter (https://sciter.com/) would be an okay foundation to build a DE in (lightweight stack, flexible theming, solid Rust bindings). But then it isn’t open source (only the interface is, you need to pay for source access), so maybe not.
No stable release yet, and you generally want your DE to be the most dependable, bug-free part of your software, especially if you are selling hardware with your own software pre-installed.
But it's looking good so far; gonna check this out for myself. I was rooting for Azul, but Iced seems to be further ahead.
Can you clarify what you mean by "these projects never do"? Your comment comes across as dismissive. Especially since I already noted that it was lacking a11y in my original comment.
Anyway, although a11y and i18n support have not been implemented yet, they are planned.
It wasn't meant as dismissive of the project. It's totally fine to write a UI library that has another focus (games, for example). I'm just tired of those projects being suggested as alternatives to native/GTK/Qt in discussions here and elsewhere.
If Iced actually gets cross-platform accessibility support, that's great! Very few projects have that today. More certainly wouldn't hurt. But until it does, you shouldn't base a DE on it.
Edit: The issue you linked to is about RTL support. That's also required, but doesn't touch on a11y.
Iced looks pretty cool but it's focused on being a cross-platform toolkit and will probably not ever be the best choice for making programs native to Linux, like a desktop environment. If you thought there were enough problems between GTK and Qt with skinning, fonts, keybindings, and all that stuff, adding another toolkit to the mix just makes it worse.
With the transition from version 3 to 4, GTK is now more focused on being a generic UI toolkit, with GTK4-based libadwaita now being the place to be for GNOME-specific patterns.
>Implementing keyboard selection inside vte means that every terminal based on vte benefits; adding only hooks for you means that all the other terminals get no benefit.
The patch being pushed wasn't going to help other projects. The developer of that project was working on a fork of the library but it appears it didn't get much use and was abandoned: https://github.com/thestinger/vte-ng
I don't see much more that anyone from GNOME could have done there.
libadwaita exists solely to move Gnome specific stuff out of GTK into libadwaita.
And it's hard to see how an open source project with as many developers as Gnome/GTK have can have some secret agenda that is counter to what they are saying publicly and creating libraries and code publicly to implement.
So Xfce == GNOME, and LXDE is GNOME or KDE, depending on which version we're talking about?
UI toolkit doesn't equal a desktop environment. There is tons of stuff in the GNOME ecosystem (some of which it will make a lot of sense for System76 to reuse, no doubt) that can be entirely ignored when you're using GTK to, essentially, draw your UI widgets for you.
Understood. I have made significant contributions to GTK. My contributions possibly had memory leaks or memory corruption issues that will bring down the higher Rust layer. I am trying to understand the purpose of the Rust layer. It is fine with me if it is because Rust is interesting, but what is being presented doesn’t seem like a Rust desktop environment _to me_.
The big selling point of Rust with respect to these kinds of scenarios is that C and C++ libraries very often say things like "You must not call function CallAfterFoo before function Foo is called", or "Once you call DestroyObject, that object must not be used again", or "You must not call SomethingDangerous while a ResourceOwningObject exists", and so on.
In well-tested libraries like GTK, SQLite, Curl, and such, they are often quite robust just based on having been heavily developed and tested by many people over a very long time, and there are still ways that they can be misused and abused that are usually well-documented and warned against. A well-developed Rust wrapper actually makes it impossible to misuse one of these libraries from Rust, and therefore better enables a much smaller team of developers to write secure, robust applications. Rust can guarantee these documented restrictions at the type level and even make impossible many error conditions.
So even though the UI is GTK, Rust still enables developers to write more robust applications with less fear. Personally, I find that GTK with Rust is a very pleasant experience. It's less about guaranteeing that the lower libraries have no bugs and more about preventing people from interacting with the libraries in dangerous or buggy ways.
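A minimal sketch of how Rust's ownership rules can encode such lifecycle contracts in the type system (the `Connection` API below is hypothetical, not part of GTK):

```rust
// A C-style lifecycle rule like "once you call destroy, the object must
// not be used again" becomes a compile-time guarantee when the wrapper
// takes ownership. (Hypothetical API, for illustration only.)
struct Connection {
    name: String,
}

impl Connection {
    // connect() is the only way to obtain a Connection, so
    // "use before init" cannot even be expressed.
    fn connect(name: &str) -> Connection {
        Connection { name: name.to_string() }
    }

    fn query(&self) -> String {
        format!("result from {}", self.name)
    }

    // close() takes `self` by value, so the handle is gone afterwards:
    // "use after destroy" fails to compile instead of crashing at runtime.
    fn close(self) {}
}

fn main() {
    let conn = Connection::connect("db");
    assert_eq!(conn.query(), "result from db");
    conn.close();
    // conn.query(); // compile error: `conn` was moved by close()
    println!("ok");
}
```

The same idea is what well-made Rust wrappers around C libraries rely on: the documented restriction becomes a move, a borrow, or a lifetime, and violating it is a compile error rather than a segfault.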
I've made some small contributions to gtk-rs-core (the library providing rust bindings to glib, gdk...).
While the lower layers written in C do impact the overall safety, the bindings are made to be as safe as possible.
For example: every GLib parameter that may be NULL becomes an Option<T> in Rust.
GObject's methods are defined on traits and checked by the Rust type system.
There are also some macros to provide an easy and safe interface to the GLib type system.
All of this directly applies to gtk-rs.
Overall, the bindings are well documented and with many examples. There's even a book. Also, there's a great community around them.
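The NULL-to-`Option` mapping can be sketched like this; the C-style function here is a stand-in, since the real gtk-rs bindings are generated from GObject introspection annotations:

```rust
use std::ffi::CStr;
use std::os::raw::c_char;

// Stand-in for a C function that may return NULL (hypothetical symbol;
// not an actual GLib function).
unsafe fn c_get_tooltip(has_tooltip: bool) -> *const c_char {
    if has_tooltip {
        b"Save the file\0".as_ptr() as *const c_char
    } else {
        std::ptr::null()
    }
}

// The safe wrapper maps NULL to None, so callers are forced to handle
// the "no value" case instead of dereferencing a null pointer.
fn get_tooltip(has_tooltip: bool) -> Option<String> {
    let ptr = unsafe { c_get_tooltip(has_tooltip) };
    if ptr.is_null() {
        None
    } else {
        Some(unsafe { CStr::from_ptr(ptr) }.to_string_lossy().into_owned())
    }
}

fn main() {
    assert_eq!(get_tooltip(true).as_deref(), Some("Save the file"));
    assert_eq!(get_tooltip(false), None);
    println!("ok");
}
```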
It's not 100% Rust but I think you would expect that GTK itself is fairly well tested - probably much better tested than any new app you write that uses it, so it is still worth it to write that app in Rust.
And it has to be said that memory safety isn't Rust's only compelling feature. It also has a pretty great build system, a decent library ecosystem, a very strong type system which gives you an "if it compiles it works" experience surprisingly often, probably the best multithreading system, etc. etc.
The issue is that Rust is extremely opinionated as to how code should be structured, and there is going to be considerable interface friction with a system that is built around aliasing data every time there is a callback.
Do you have an example of that friction? Looking through the docs at e.g. https://gtk-rs.org/gtk4-rs/stable/latest/book/hello_world.ht... it doesn't seem that bad but not having used it I would be curious how it works out once you're building a serious app.
Things are already getting quite hairy by page 4[1] of that link. And that example shows data completely owned by the GObject. If you need to pass a mutable reference to third-party data to a GObject, it's not going to work in Rust. Imagine a button that you click to change the contents of a GtkTreeModel connected to a GtkTreeView; you're going to need to work hard. You're going to write a fair bit of Rust-specific glue code to work around these issues.
You can avoid virtually all friction by sidestepping shared mutable references altogether and writing your GTK application in a more idiomatic Rust way. For example, instead of passing shared references of application state to every widget signal, you can pass async Rust channels to signals. You can spawn a single async event loop attached to GLib's main context that listens for events sent from those signals, and manage all your widgets and application state in one place.
There's some libraries that effectively automate GTK in this way, such as relm4.
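A minimal sketch of that single-event-loop pattern, with `std::sync::mpsc` standing in for an async channel spawned on GLib's main context, and a hypothetical `Event` type:

```rust
use std::sync::mpsc;

// Messages that widget signal handlers would send (hypothetical events).
enum Event {
    ButtonClicked,
    TextChanged(String),
}

// One event loop owns all mutable application state. In a real GTK app
// this would be an async task on glib's MainContext reading from an
// async channel; a plain mpsc receiver stands in here.
fn run_event_loop(rx: mpsc::Receiver<Event>) -> (u32, String) {
    let mut clicks = 0u32;
    let mut text = String::new();
    for event in rx {
        match event {
            Event::ButtonClicked => clicks += 1,
            Event::TextChanged(s) => text = s,
        }
    }
    (clicks, text)
}

fn main() {
    let (tx, rx) = mpsc::channel();

    // Each signal handler captures only a cheap clone of the sender,
    // so no shared mutable state ever crosses a closure boundary.
    let button_tx = tx.clone();
    button_tx.send(Event::ButtonClicked).unwrap();
    tx.send(Event::TextChanged("hello".into())).unwrap();
    drop(tx);
    drop(button_tx);

    let (clicks, text) = run_event_loop(rx);
    assert_eq!((clicks, text.as_str()), (1, "hello"));
    println!("clicks: {clicks}, text: {text}");
}
```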
Asking as someone who's never spent significant time/effort working with Gtk/GObject in C (only a bit in Python). Isn't it still generally desirable to have the compiler yell at you, if you can't convince it you know what you're doing? Rather than allowing the possibility of memory corruption.
I do agree that the code in the example is far from beautiful. I wonder if we were to redesign GObject from scratch, if we could make interfacing with Python, Rust, JS, etc a bit less hairy.
The work you have to do to show your compiler that certain constraints hold (say, that i and j point into the same array and that i < j) can be extremely cumbersome, even in cases where it's obvious that they do hold. That makes it not seem worth the trouble.
Of course, that "obvious" may turn out to be incorrect for some edge condition.
Basically, that’s the same reason why mathematicians don’t put all proofs through a proof assistant.
Of course, to the extent you assumed things, the machine didn't test them.
The core of TLS 1.3 was proved before it shipped. But, the proof makes an assumption which has consequences. It assumes when communicating using an agreed shared secret† you have a separate shared secret for every such pairing. So Alice and Bob need a shared secret but (and this is where humans trip up compared to what was actually proved) the proof says Bob and Alice also need a shared secret different from the one for Alice and Bob.
The consequence of this accidentally missed assumption is the Selfie attack. Alice and Bob share a secret S and communicate over the Network using TLS 1.3. Bob fed the cat half an hour ago. The cat has employed Mallory to trick Alice into feeding it even though Bob already did. Alice sends a message on the Network. It is encrypted with S and it says "Did you feed the cat?". Mallory doesn't know S and can't read this message or tamper with it. But Mallory simply redirects the message back to Alice. Alice receives a message, properly encrypted using S, which says "Did you feed the cat?". She presumes this message is from Bob, so she answers "No, go ahead and feed the cat". Mallory redirects this response back to Alice too. Alice receives "No, go ahead and feed the cat" and she concludes Bob hasn't fed the cat so she feeds it again.
Oops.
The proof was fine, but we brought an assumption along that we did not clearly articulate.
† You never use this mode in your web browser, but IoT things might do this because it's easier than all that stuff with certificates.
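The reflection can be modeled as a toy program (illustrative only; this is not real cryptography):

```rust
// Toy model of the Selfie attack: "sealing" under a shared key proves
// only that *someone who knows the key* sent the message, not which
// party. (A tagged tuple stands in for AEAD encryption.)
fn seal(key: u64, msg: &str) -> (u64, String) {
    (key, msg.to_string())
}

fn open_sealed(key: u64, sealed: &(u64, String)) -> Option<String> {
    (sealed.0 == key).then(|| sealed.1.clone())
}

fn main() {
    // One secret for both directions: Mallory reflects Alice's own
    // message back, and Alice accepts it as if it came from Bob.
    let shared = 42;
    let from_alice = seal(shared, "Did you feed the cat?");
    assert_eq!(
        open_sealed(shared, &from_alice).as_deref(),
        Some("Did you feed the cat?")
    );

    // Distinct per-direction keys: the reflected message is rejected,
    // which is the assumption the proof silently relied on.
    let (alice_to_bob, bob_to_alice) = (1u64, 2u64);
    let msg = seal(alice_to_bob, "Did you feed the cat?");
    assert_eq!(open_sealed(bob_to_alice, &msg), None);
    println!("ok");
}
```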
> The work you have to do to show your compiler that certain constraints hold
In Rust, that's what the unsafe keyword is for. It doesn't mean that the code is actually unsafe, but rather that the compiler should trust you on this one.
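For example, a slice access where we, rather than the compiler, vouch for the bounds invariant (a sketch):

```rust
// `get_unchecked` skips the bounds check that safe indexing performs;
// the `unsafe` block marks exactly where we vouch for the invariant
// that both indices are in range.
fn sum_first_two(v: &[u32]) -> u32 {
    assert!(v.len() >= 2); // we establish the invariant ourselves
    unsafe { *v.get_unchecked(0) + *v.get_unchecked(1) }
}

fn main() {
    assert_eq!(sum_first_two(&[3, 4, 5]), 7);
    println!("ok");
}
```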
Depends on what parts of the environment you're looking at. The compositor that's the basis of the entire environment will be written entirely in Rust, based on smithay. But applications written for COSMIC are currently based on GTK4.
GTK is currently the best GUI toolkit for Rust application developers. It's the only toolkit that's fully functional with first class and official bindings. It's been pretty well endorsed by GNOME for most of their new applications lately.
There are some Rust GUI toolkits out there that are shaping up, such as some former Qt developers actively developing sixtyfps, but it'll be a while before we have a GUI toolkit in the Rust space that's truly ready for complex application development.
This blog post focuses on some superficialities of how their DE's apps will look slightly different. That's understandable for first impressions.
I do hope these superficialities don't have all of System76's focus, as they're a dime a dozen in Linux DEs. Even the category of "we kind of look like Gnome, but with more familiar workflows" is oversaturated amongst Linux desktops (Budgie, Xfce, Cinnamon, MATE, Elementary/Pantheon, even "Gnome+extensions" are all in this category to various degrees). I suppose one distinguishing factor that Cosmic has is a strong Wayland focus, which is still missing from nearly all Gtk based alternatives.
System76 with Pop!_OS has an opportunity to tackle topics head-on, like "we can make fractional scaling work somewhat decently across all apps" (IIUC currently requires shipping a forked XWayland, unfortunately), "we can make trackpads the best they can be" (requires shipping some forked libinput-related things, IIUC), or "we can make font rendering the best it can be". That's the actual desktop environment stuff I'd be interested in.
A desktop environment needs more vision than shipping the same old Linux desktop problems with some other apps. I really hope System76 can make an effort there. They're trying to make their paycheck depend more on their own Linux desktop's success, and that I can only encourage.
> System76 with Pop!_OS has an opportunity to tackle topics head-on, like "we can make fractional scaling work somewhat decently across all apps" (IIUC currently requires shipping a forked XWayland, unfortunately)
I'm excited to see System76's implementation of fractional scaling in this new desktop environment. Since they have actually sold laptops with 1080p and sometimes 4K displays, they have a real incentive to get this feature working smoothly on Wayland.
System76 previously developed a HiDPI daemon for X11 to be used with GNOME Shell:
It handles multiple scaling factors, including fractional ones, flawlessly across displays.
If the next version of COSMIC supports fractional scaling on Wayland as well as this daemon does on X11, this alone would make the entire project worthwhile. GNOME Shell still hides fine-grained fractional scaling behind an experimental flag for both X11 and Wayland,[1] with X11 needing a patch for Mutter.[2]
The only distro that manages to do this OOTB is Ubuntu. I don't know how many patent demons they had to slay, but it's second only to Windows 7's rendering.
It's early enough that I'm giving the benefit of the doubt on the spacing issues. They're obvious enough that I would hope System76 (who seem to care about design in most respects) are aware and will get it eventually.
I think it's because designing good user interfaces is hard and very time-consuming (at least for me it certainly is). And generally Linux users are more content to put up with inconsistencies and annoyances, because many probably don't care and leave most things in a default state, or customize everything to their heart's desire.
I'm not saying this is exclusive to Linux distros, I sometimes find the UI in Windows 10 to be confusing, having to jump between the settings and control panel application to find some niche option. It's apparent that parts of that UI were made 20 years ago while others are made with modern toolkits.
I hope that System 76, being more consumer oriented than other companies that mainly develop for Linux, listens to feedback from a wide range of users and manages to develop an ecosystem to be the "MacOS of Linux workstations" in the sense that everything is polished and working out of the box and everyone from regular home users to advanced professionals and enthusiasts can pick up and use without major inconveniences.
> I'm not saying this is exclusive to Linux distros, I sometimes find the UI in Windows 10 to be confusing, having to jump between the settings and control panel application to find some niche option. It's apparent that parts of that UI were made 20 years ago while others are made with modern toolkits.
They're rewriting all of that but it's a pain in the neck to do it.
My guess is that they'll finish in 10 years :-))
If you want compatibility worries, check their Windows Terminal blog.
Or Raymond Chen's blog for some real compatibility howlers.
> Why is it that so many Linux GUIs — between apps and desktop environments — suffer from a lack of attention to detail?
Because nobody who notices them files issues. Open source projects do not have the UI teams of Apple and MS. Please find the bug tracker and file the issues that you've noticed. Thank you!
It's still very early and there's no way that the people making this can't see the alignment and spacing problems. I would assume they will get to it when they can.
The other stuff (like rounding everything) is a deliberate choice.
How is that the case? Open source software only thrives when there is a community around it that is dedicated to making it better. Many people seem to think that only programmers can do this, but there's always a need for designers and documentation writers to step in and help as well. Admittedly, some programmers have trouble accepting advice and criticism from non-programmer contributors, but I do believe this has been getting better over time.
What is the alternative, if not for people who notice problems and care about them to step in and contribute?
I honestly don't quite understand what System76's intentions with COSMIC are. To me it feels like a regression from the previous GNOME, and I've kicked their plugins to get regular GNOME back.
It just... doesn't do anything my old custom Openbox or i3 setups didn't do (imo better). Nor is it better at being GNOME than GNOME.
And now that I'm getting old and have gotten used to stock GNOME, it just seems to make me change my workflow again for no good reason.
I have machines where I run Alpine with sway customized. I have had machines where I've run without a DE at all. I've customized things that weren't meant to be, and not customized things that were.
But if I want something where I can walk up to any somewhat modern machine (currently typing this on an old Alienware running pop), and install an OS that gives me an i3-like environment with no extra work? Pop is fantastic. There are things that I would do differently. But they're minor and being able to just go is worth a lot.
Right. And so Pop!_OS gives you a decent tiling window manager, with sane defaults, without having to spend hours on customising your configuration. What's more, the tiling they provide is an optional feature. You can enable and disable it at any time without messing with your config for hours.
It had to be hijacked because HJKL are often used for keyboard navigation for people who prefer right-handed home-row navigation. Super+Esc is the shortcut you're looking for.
"There’s no visible distinction between the window title bar and the body of the window."
Strikes me as a bit amusing, as this was the case also in early versions of GNOME 3. I believe it was abandoned because non-GNOME apps (or more specifically, ones that didn't follow the GNOME HIG) had a visible distinction and greater consistency was desired. Changing GNOME's own look was the way to do so.
(New dark pattern: popups which have an X box to dismiss them being harder to dismiss. First, the box around the X disappeared. Then, the X moves to different places in the box. (Bing does this for their house ads.) Then, the active portion of the X shrinks to make it harder to hit.)
After Unity, I tried a couple of DEs but then forced myself to stay with GNOME because I wanted to stop the endless tinkering and get work done. I tried it for half a year and then switched to Sway, a tiling window manager.
Sway has little to no footprint. I have configured my system the way I like it. Now I do minor tweaks sometimes but nothing major. My entire configuration is in a single file! I am not going back to DEs ever.
You don't need a tiling window manager to get everything in a single config file. Fluxbox has historically been great at this, while still being extremely light on resources.
Could you share your config? I also use sway but to me it seems like behind the statement "I have configured my system the way I like it" there is quite a lot of configuration. Also when saying "My entire configuration is in a single file", do you also include things that are normally configured in a DE settings app, like wifi, disk auto mounting, bluetooth connections, sound settings/volumes, display configurations, etc?
I like (and use everyday) sway and I like using WM's as a part of puzzling together a system. I just don't think you are comparing apples to apples here.
Sway is a window manager, not a DE. The right thing to compare it to is Mutter. I also use sway, but quite a few of my applications are from Gnome (like evince, nautilus, gnome-calendar, fractal...).
Selfishly, as a MacOS refugee and someone who has an on/off relationship with desktop Linux as my main OS (and a desire to use it daily), I would love it if there were a shameless clone of the MacOS DE for Linux.
I don't care for customisation, I just want a sensible default that I can get up and running with immediately.
Gnome41 on Debian Bookworm is my current setup and am actually typing this comment from there.
I use it about once a week and as far as I can tell, Gnome40+ is giving off signs that I might be able to use it as a daily driver (for work).
Looking forward to the updated Files application and whatever else is coming in G42.
That said, there are a lot of design choices that cater more to mobile form factors and the desktop experience suffers as a result.
Then there are non-DE-related issues like application installation. Chromium has bizarre window and mouse performance issues related to Wayland. Graphics card drivers are difficult to install, even for someone who knows how to read documentation. Flatpak has a permissions structure that doesn't always make sense for certain applications (like OBS, Discord, IDEs), and installing applications using a package manager is very hit and miss (e.g. OBS is broken in Debian Bookworm when installed via apt because of a qt5 dependency).
I don't mean to sound negative - I only complain because I love Linux and want to be able to go to my friends and colleagues and say "you can use this" - feeling confident that they will have an experience on par with MacOS.
That Chrome bug seems frustrating... have you tried changing chrome to use Wayland instead of XWayland? Go to chrome://flags and change Preferred Ozone platform from X11 to Wayland.
It's a great approximation, but I haven't had a lot of success using Elementary.
The best experience I have had so far is Gnome41 on Debian Bookworm (my current setup, which I log into about once a week), though the DE has some design choices that cater more to mobile form factors at the expense of desktop usability (I don't want to appear ungrateful towards the volunteers developing it; it's a great project, and I know critique is easy).
I'm talking about a complete rip off. Things like per-window virtual desktops, screenshot/video recording via hotkey. The global menu. The rock solid stability. Window decorations, spacing, fonts and overall feel. MacOS's DE _feels_ really solid.
Sounds like you just want to keep using MacOS. All of that should be more or less easy to tweak on Linux, but that's currently the trade-off. What you're asking for is for someone to rip off MacOS, but it's only acceptable if it's as rigid and as well supported as MacOS, which is developed by a trillion-dollar company.
That or perhaps join the Elementary OS QA team so that it can have a chance at reaching that really solid _feel_.
As someone who started on PCs running Windows and Linux for decades, I moved to MacOS for work because it's a zero config Unix based system with a productive and ergonomic desktop environment. I don't have an inherent bias towards Apple/MacOS and currently no longer use it (though I have an old x86 MacBook I use when not working on my desktop)
It was always an issue that I was not able to use my existing PC to run MacOS due to Apple's restrictive license model and since Apple's move to proprietary hardware (M1), I have abandoned MacOS.
I am actively polling various Linux variations to see if something offers me the same level of productivity but am using Windows+Linux within VBox for my work/engineering requirements in the meantime.
So it's not that I want to keep using MacOS specifically, it's that I need to use a Unix based operating system that has the polish, usability and stability of MacOS.
I respect that the Linux ecosystem is maintained by volunteers and I have as much opportunity as anyone to contribute (and I have been trying to contribute by submitting QA tickets to the Gnome project). That said, I would be more than happy to pay a substantial annual license fee for a Linux experience that rivals that of MacOS.
EDIT:
> All of that should be more or less easy to tweak on Linux
A lot of this functionality simply does not exist on Linux. With multi-monitor setups, virtual desktops scroll on both screens. Gnome 40 is the first DE to pseudo-replicate MacOS's model.
Screenshots and screen recordings are close but lack the nuance of MacOS (and Windows) that make them valuable. For instance no DE offers the ability to draw notes on a screenshotted area, copy the modified version to clipboard (rather than save a file). There is also no way that I know where you can create an mp4 video of a selected area of the screen as easily as you can take a screenshot.
This is invaluable for me as I like to send detailed representations of bugs, with screenshots, notes and videos. Really helps when working remotely or contributing to something like Gnome.
Licensing is where it gets weird. If you want a bit more control than Gnome, I have since switched to KDE and find it more flexible and feature rich. I’m enjoying discovering the KDE ecosystem and the “have it your way” philosophy.
The DE included software might not get you there but there’s seems to be plenty of variations you could install. Again it’s not as polished but I’m not super picky about the specifics of how my OS appears to function.
I'm looking to add some niceties myself to KDE soon as a part of an attempt to learn some C++ and/or Rust.
I think if you gave KDE a try, you'd be really surprised. I too like fancy screenshots with drop-shadow compositor effects and transparent elements; the Spectacle screenshot utility does this perfectly for me, and can be bound to a universal keybind. Desktop utilities like 'peek' give you instant gif-recording capabilities for the more advanced stuff, and of course there are full-fledged apps like OBS for more advanced capture work.
> That said, I would be more than happy to pay a substantial annual license fee for a Linux experience that rivals that of MacOS.
Are you looking for BSD?
Joking aside, you're going to have to be a lot more specific than "rivals" MacOS. There are a lot of cases where MacOS has dug itself into arbitrary, insurmountable leads (eg. nobody is going to add haptics to your touchpad with a software update), but there are also cases where Linux whoops MacOS up and down the block (full WINE functionality, Vulkan drivers, Nvidia compatibility, system-agnostic filesystems, etc.)
> A lot of this functionality simply does not exist on Linux. With multi monitor setups, virtual desktops scroll on both screens. Gnome 4 is the first DE to pseudoreplicate MacOS's model.
This is simply untrue: KDE scrolls all monitors by default, and most X11 sessions can be configured to do the same, afaik. The only holdouts are esoteric window managers like i3wm, where arbitrary desktop/monitor association is just part of their design paradigm. I'm pretty sure GNOME 3 could be configured to do the same, too (the developers just didn't put the option in the default settings app because they don't trust you, I guess). Not sure where you're getting this impression from, but it sounds like a misunderstanding.
> Screenshots and screen recordings are close but lack the nuance of MacOS (and Windows) that make them valuable. For instance no DE offers the ability to draw notes on a screenshotted area, copy the modified version to clipboard (rather than save a file).
Again, KDE's default screenshot utility ("Spectacle") has robust annotation and saving options. I just let it save the screenshotted areas to my clipboard, and it works fine. It should probably be compatible with your GNOME system too, if you want to try it out.
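Spectacle also has a CLI, which is handy for binding to a custom shortcut; a sketch (flag names may vary by KDE release, so check `spectacle --help` first):

```shell
# Grab a rectangular region and copy it straight to the clipboard,
# without opening the Spectacle window afterwards.
spectacle --background --region --copy-image
```

Bound to a key, this gets you the "select area, annotate nothing, clipboard only" flow the parent comment describes.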
TL;DR:
Linux is not MacOS, for better and for worse. Most of the features you've listed are indeed configuration options, but nobody really ships a "MacOS clone" distro because it doesn't exist. You can't pay developers to make something like that because there's not a single other organization on earth with 200 billion dollars in liquid cash to spend on decadent UIs and eye candy.
Nice, exactly what Linux needs the most - another desktop environment!
Sarcasm aside, it does look pretty good. But between indistinguishable window title bars and yet another settings app (wanna bet it will miss some config so users will still need to run one of the others?), I think I'll pass. As far as I'm concerned this problem was solved ages ago, so I simply don't understand why the designers keep mucking with it. Maybe I'm just getting old. :-/
> Nice, exactly what Linux needs the most - another desktop environment!
One could argue that because no single desktop environment has taken over the Linux mindshare, we haven't invented the one yet, so people keep trying. It's not until something takes over the mindshare (like systemd did) that we can all unite and start improving upon the same base.
I'd argue that the battle over desktop environment supremacy leaves desktop Linux less compelling overall. There is so much choice that it's paralyzing and new users constantly second guess themselves. You have thousands of developers doing their own thing rather than working together to make mainstream choices like Gnome more viable.
Gnome doesn't want thousands of developers implementing their own visions for their projects, and others don't want to donate their time to implementing Gnome's vision.
The proper thing is for them all to implement their own visions, and for people to use or work on what they please.
Nobody asks why Tesla, Ford, and Toyota are wasting everyone's resources by being different companies. Nobody suggests having different car manufacturers is paralyzing. Nobody opines that once the perfect car is invented we can just deprecate the rest, because those would all be silly positions.
The car market and computer UIs are evolving, multidimensional entities whose product lines are and have been dependent upon many parties pulling in different directions.
It's like looking at a duck on a pond and asking why the universe couldn't have just made the duck directly, since it's an ideal choice for that place and time. Of course it couldn't possibly work like that: arriving at that exact solution without intermediate steps would be impossible, and besides, the duck is no good in the desert or tundra.
Also, Gnome is flawed in so many ways: from leaking memory due to an unfixable mismatch between JS and compiled code, to nonsensical handling of multiple desktops, to add-ons that both rely on monkey-patching your desktop (due to the lack of an add-on API) and can kill your whole session with a single crash, to hostility towards theming, to ugly header bars, to hostility towards supporting non-Gnome desktops.
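For concreteness, here's a minimal sketch of the monkey-patching pattern GNOME Shell extensions rely on. This is plain JavaScript with a hypothetical `panel` mock standing in for a live shell object (the real thing lives inside the shell process via `imports.ui.main`), not the actual GNOME API:

```javascript
// "panel" is a mock of a live shell object; extensions mutate the
// real equivalent in place because there is no sandboxed add-on API.
const panel = {
  items: [],
  addToStatusArea(name) { this.items.push(name); },
};

let origAddToStatusArea = null;

function enable() {
  // Save the original method and overwrite it on the live object.
  origAddToStatusArea = panel.addToStatusArea;
  panel.addToStatusArea = function (name) {
    // An uncaught exception here propagates into the shell itself,
    // which is how one buggy extension can take down the session.
    return origAddToStatusArea.call(this, `[patched] ${name}`);
  };
}

function disable() {
  // Restore the original method so the patch is reversible.
  panel.addToStatusArea = origAddToStatusArea;
}

enable();
panel.addToStatusArea("clock");    // goes through the patched method
disable();
panel.addToStatusArea("battery");  // goes through the original again

console.log(panel.items); // -> [ '[patched] clock', 'battery' ]
```

Because the patch rewrites live state rather than registering through an API, two extensions touching the same method can silently clobber each other.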
> Gnome doesn't want thousands of developers implementing their own visions for their projects and others don't want to donate their time to implementing gnomes vision.
This is definitely looking more and more like the case. The GNOME team's insistence on cutting people out of their workflow has left them with fewer contributors... which leaves them with a worse desktop and a roadmap that's being pushed further and further back. Every GNOME contributor you meet will try to deny it, but it really has come to a boiling point over the last few months. There have been so many feature cuts, unaddressed issues, internal hostility and asinine workarounds (text is broken on native apps, so ship with Flatpak until we get it fixed! wait, Flatpak has more issues?) that I really struggle to have any respect for their leadership now. They're throwing out pieces of the plane to get it off the runway, and the complete lack of communication combined with their internal holier-than-thou social ethics has turned a transparent, thriving community of users into a miasma of apologists, skeptics and people who have never tried anything else.
I've been playing with Silverblue lately[0], which is my first real exposure to GNOME 4X, and I would argue that it isn't even a desktop. It's a mobile interface. It wants badly to be installed on a phone and use a touchscreen.
[0] After initially attempting Kinoite, which has some rather unfortunate problems that don't seem to be getting addressed.
It's not easy for me to take this comment seriously because gnome doesn't actually have anything anyone could define as leadership. It's a decentralized open source thing. You might be incorrectly assuming bad faith.
GNOME has a foundation[0] that is responsible for handling their funding as well as setting goals/standards for their desktop. As a matter of fact, they're the most well-funded desktop environment in the entire Linux desktop landscape. I think criticism is fair, especially when it pertains to removing features and working against the community using funding that is, in large part, raised by said community.
That isn't how gnome works, being decentralized and all. The foundation pays for very little of the development and has little to no influence over what developers actually do. If you wanted the community to have more influence over development, the way to do that would actually be to get more money for the foundation so they can afford to hire more developers from the community. Right now, they don't employ any. The funding they have now is actually shockingly small for a nonprofit based in the USA, and almost unnoticeable compared to what a tech company based in the USA would have.
Removal of features would still happen though because that's a natural part of any software project responding to the ever-shifting priorities of a large group of users.
But that's the whole thing. Redhat and the foundation don't have any leadership over a decentralized group of open source developers. Gnome is not like the Linux kernel where there's one giant repository with one lead maintainer serving as a bottleneck. There are individuals who hold more influence but that's more because a lot of other volunteers chose to follow them, not because they seized power from anybody.
>Also, Gnome is flawed in so many ways: from leaking memory due to an unfixable mismatch between JS and compiled code, to nonsensical handling of multiple desktops, to add-ons that both rely on monkey-patching your desktop (due to the lack of an add-on API) and can kill your whole session with a single crash, to hostility towards theming, to ugly header bars, to hostility towards supporting non-Gnome desktops.
Most of these issues are just bugs, not issues with some vision. The Gnome people I've talked to want them fixed and want help with fixing them. But I get that it's a lot easier to complain on Hacker News than it is to fix architectural issues in a large codebase.
How they handle multiple monitors is a choice and the memory leaking is a design decision as is the lack of addon api and a system where any crashing addon takes out the whole session.
The only one of those that's an intentional design is multiple monitors. The other ones are known shortcomings that are just really hard to fix. The memory leak was fixed a few years ago: https://feaneron.com/2018/04/20/the-infamous-gnome-shell-mem...
It is worth noting that the first release of Gnome 3 which had the leak from the outset was in 2011 and in 2018 they mostly "fixed" the problem by just constantly running the garbage collector and marveling at how not so horrible that is.
Unfortunately the problem actually continued for some at least into 2020.
In summary: a broken design nearly completely ignored for 7 years, then patched over the next 2 years with a band-aid that doesn't actually solve the underlying issue. 11 years in and the design flaw is still with us. I appreciate the fact that many of the contributors aren't paid for their efforts and I applaud their efforts to give to the community, but large parts of Gnome ought to be regarded as learning experiences, not something to build the rest of our software on top of.
I hope System76's work will be a better basis especially since their financial success could fund future efforts.
I understand. Tesla, Ford, and Toyota all have cars that work though. They all work so well you can basically swap any of them out and the user doesn't have to change anything. The user can pick any of those cars and do well.
I'm not talking about cars though, I'm talking about Linux not getting a lot of traction on desktop because of too much choice. The mainstream environments are often not viable for somebody trying to switch. We can ignore it and create a dozen more distros and environments but this pollutes the Linux landscape even further.
You're right, a car is a bad analogy. Cars are several orders of magnitude more complex than a desktop environment. They probably even have more lines of code.
A desktop environment isn't that difficult to make; that's why we have so many. Gnome and KDE peaked 10 years ago and they're now stuck in a local maximum, afraid to move outside their comfort zone. A new desktop environment is an opportunity to question assumptions and try to find a better design.
If KDE and GNOME are stuck, it's not because they can't innovate with their UIs enough with traditional ways of interacting with a computer. In fact, I don't think innovating in a way that doesn't involve new ways to interact (such as VR/AR) is very desirable to most users.
The truth is, most people don't want to install operating systems, even if they're free and easier to install than Windows. The only way to go further is to ship with hardware and for that combination of hardware and software to be in demand. On top of that, when the operating system/desktop environment has reached a decent baseline of general usability like KDE Plasma and GNOME have, the software ecosystem around the OS/DE is more of a limiting factor than the UIs of the OS/DE.
The fact that System76 sells laptops could actually give their DE a significant advantage over other small DEs if they ship their own DE. It's unlikely to overtake KDE and GNOME since System76 doesn't sell tons of laptops like Dell or other big hardware companies, but I guess never say never.
I don't expect instant world domination, but I hope the Steam Deck (which will ship KDE Plasma for its desktop mode) will greatly boost the popularity and developer community of KDE. With success, there is a chance at a snowball effect where new users who enjoy the product contribute and bring attention, which brings more new users, etc. All things considered, the share of people using the Steam Deck isn't going to be that big compared to Windows or even MacOS, but it is getting a lot of attention.
> A desktop environment isn't that difficult to make, that's why we have so many.
We actually don't have that many, and they actually are pretty difficult to make. Linux has quite a lot of window managers, but not complete desktop environments. The 2 most popular ones require a lot of maintenance. Even GNOME, which has gained a bit of a reputation for dropping features, is very large in scope.
Not at all, if you would say "A window manager isn't that difficult to make, that's why we have so many.", I would agree.
A desktop environment is so much more: it's a full-stack development experience, with development tooling, APIs, and UI/UX workflows that users can be confident will work across most applications, unless those applications go out of their way not to support the native idioms of the desktop environment being used.
Most of those "so many experiments" fail at being half as good as GNOME/KDE, which are full desktop environments on par with other desktop and mobile platforms.
There's no easy answer. The freedom of choice is beautiful and part of why I use Linux but an ecosystem can become polluted and drive away new users.
I don't know what else to say. I'm describing the landscape as I see it.
I do disagree that KDE or Gnome are stuck, I thought Gnome 40 was a nice refresh. It's evolved quite a bit from the Gnome 2 of Red Hat Linux that I used as a youngster.
Insofar as the overall level of hardware support and available software on Linux suits you, you could certainly use any of several interfaces, especially on X11.
I'm pretty sure there will never be The One due to political and psychological factors. Many people use Linux specifically because there isn't One way to do things.