Not really a relevant thing to talk about here, but I find there's this trend in the Linux ecosystem where people explicitly implement more and more policy at the code level instead of building flexible abstractions and turning user decisions into data. I'm not talking about exposing user-friendly configuration options; this is about mechanism over policy. The presenter mentioned "use-case over mechanism/policy", but, given a good mechanism, use cases are just policies.
Although I do understand the value of the current approach in the industry (i.e. MVP), I think it does come with a cost - you get a large corpus of low-density code that springs surprises in many different corners. Red Hat-funded projects are really leading this trend (I'm looking at you, systemd, the project). If you read their code, you'll instantly notice that it's becoming more and more labor-intensive, and less and less hacker-friendly. We are losing smartness for the sake of grunt work.
Some might think Wayland is a smart approach, but it's actually not. It's mostly just pure grunt work - a showcase of sheer engineering horsepower. It's a large-scale refactoring and optimization of the modern-day X11 desktop ecosystem. Everything works in essentially the same way, but things happen in slightly different places, and some features are banned by policy (arbitrarily drawn up by the Wayland devs). These missing special features now need to be implemented explicitly as "protocols".
Now, I really want to ask - where are we heading here? Are we really stepping forward? I don't think so. We're just going sideways.
Waiting for the steam client to get wayland->x11 fallback runtime function tables. xwayland is there for software that no longer receives technical maintenance (not the steam client, yet).
But I plan to write my own wayland compositor, probably based on linux dma-buf, like the one running on the steam deck (basically an improved fork of the valve one).
And in assembly, x86_64 at first, but I am lurking on risc-v.
IIRC, the blob stuff should be able to handle buffers it created itself with no problem. The issue here is foreign buffers, which are not supported yet. Supposedly version 525.x will do that, so let's see how that goes.
nvidia drivers generally work but will remain unsupported for as long as they are not fully open source. Providing support requires the ability to debug the driver when issues arise.
(Improvements to Nvidia support are accepted, as long as they are general improvements and not just hacks around a driver bug.)
That policy does not doom anything. wlroots does not accept driver-bug hacks for any driver, and is merely consistent in that regard. It works for others, so it can work for Nvidia. Things will just always move slowly as long as Nvidia is the only one that can fix bugs.
Maybe it will be better one day: they are (also slowly) moving in the right direction, and nouveau might be revived by taking some bits from their new (barebones) kernel driver. But until then, swear your allegiance to the GPU vendors that play nice.
I was trying to run swaywm this morning to no avail, as it refused to start when it detected the nvidia driver in use. Why doesn't wlroots work with nvidia drivers? Is it because nvidia drivers are closed source that wlroots (i.e. swaywm) cannot work with them? A bit disappointed.
Every other driver uses GBM, which is well supported by all the various software related to gpu buffer management. Nvidia decided to not use GBM and instead use this EGLStreams thing, which meant it wasn't compatible. Politically, this was contentious since nvidia was offloading a huge development burden onto the ecosystem to create parallel implementations of buffer management specifically for nvidia. Wlroots ended up deciding this was such an inane thing for nvidia to do, that they refused to implement EGLStreams at all. Eventually, nvidia backpedaled and decided to implement GBM. Now they are just behind, but everything will probably eventually support nvidia.
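For anyone unfamiliar with what "using GBM" actually means in practice, here is a rough sketch of the generic buffer-allocation path compositors expect every driver to provide. This is not wlroots code, just the idea; the render-node path is an assumption, and real compositors get the device fd from logind or pick the node matching the chosen GPU.

    /* Minimal GBM allocation sketch.  Build (roughly): cc gbm-alloc.c -lgbm */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <gbm.h>

    int main(void) {
        /* Path is an assumption; any DRM render node or card node works. */
        int fd = open("/dev/dri/renderD128", O_RDWR);
        if (fd < 0) {
            perror("open render node");
            return 1;
        }
        struct gbm_device *gbm = gbm_create_device(fd);

        /* Allocate a buffer the display hardware can scan out and the GPU can
           render into.  With EGLStreams there was no equivalent generic
           allocator, which is what forced the parallel code paths. */
        struct gbm_bo *bo = gbm_bo_create(gbm, 1920, 1080, GBM_FORMAT_XRGB8888,
                                          GBM_BO_USE_SCANOUT | GBM_BO_USE_RENDERING);
        if (!bo) {
            fprintf(stderr, "gbm_bo_create failed\n");
        } else {
            printf("allocated 1920x1080 buffer, stride %u\n", gbm_bo_get_stride(bo));
            gbm_bo_destroy(bo);
        }
        gbm_device_destroy(gbm);
        close(fd);
        return 0;
    }

The point is that this one allocator behaves the same on every GBM-capable driver, whereas EGLStreams required a second, vendor-specific path for the same job.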
Because nvidia drivers traditionally didn't provide the APIs that sway and other wayland compositors expected (drm, dmabuf, atomic) and tried to do their own thing (EGLStreams). For compositors it meant having two code paths, one nvidia-specific and another generic. Some compositors implemented both, others implemented only the generic one.
For comparison, it would be like complaining that Windows/DirectX games do not run with a GPU that doesn't come with DirectX driver. Well, duh.
However, Nvidia has signaled a u-turn and says they are going to be more cooperative. How and when, well, that remains to be seen.
Current Nvidia drivers support GBM, and thus should work with wlroots. Their unsupported status means that the wlroots maintainers cannot help if it does not work, as they cannot debug a closed-source driver, making it impossible to see what might be wrong.
General improvements that help nvidia compatibility are accepted, though, if others identify an issue.
It does work, you have to add --unsupported-gpu and there's a bunch of other tweaks to make it good. There's a long thread on the Nvidia UNIX driver forums on how to configure it.
It is still unsupported though, as wlroots only actively supports open source video drivers.
All the block diagrams prominently putting systemd-logind between the X11/Wayland compositor and the kernel are very misleading, and seem likely to create more misinformed ire for systemd.
systemd-logind is basically just an arbiter of the seats for reasons like you can only have one drm-master on a given drm device at a time. It's not involved in all X11/Wayland-compositor<->kernel interactions. Once the seat has been acquired it gets out of the way. That should not be some huge box right in the middle of the two.
You can trivially confirm this by stracing systemd-logind on your system while interacting with your desktop. It's just sitting idle in epoll_wait().
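To make the "acquires the seat, then gets out of the way" point concrete, here is a rough sketch of that handoff using the documented org.freedesktop.login1 D-Bus API via sd-bus. Error handling is trimmed and the /dev/dri/card0 path is just an assumption; it's an illustration of the flow, not compositor code.

    /* Build (roughly): cc take-drm.c -lsystemd */
    #include <stdint.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <fcntl.h>
    #include <sys/stat.h>
    #include <sys/sysmacros.h>
    #include <systemd/sd-bus.h>

    int main(void) {
        sd_bus *bus = NULL;
        sd_bus_error err = SD_BUS_ERROR_NULL;
        sd_bus_message *session_reply = NULL, *dev_reply = NULL;
        const char *session_path = NULL;
        struct stat st;
        int drm_fd = -1, inactive = 0;

        sd_bus_open_system(&bus);               /* logind lives on the system bus */

        /* Find the object path of the session we are running in. */
        sd_bus_call_method(bus, "org.freedesktop.login1", "/org/freedesktop/login1",
                           "org.freedesktop.login1.Manager", "GetSessionByPID",
                           &err, &session_reply, "u", (uint32_t) getpid());
        sd_bus_message_read(session_reply, "o", &session_path);

        /* Become the controller of that session (only one controller at a time)... */
        sd_bus_call_method(bus, "org.freedesktop.login1", session_path,
                           "org.freedesktop.login1.Session", "TakeControl",
                           &err, NULL, "b", 0);

        /* ...and ask logind to open the DRM device for us.  This is where the
           "one drm-master per device" rule is arbitrated. */
        stat("/dev/dri/card0", &st);
        sd_bus_call_method(bus, "org.freedesktop.login1", session_path,
                           "org.freedesktop.login1.Session", "TakeDevice",
                           &err, &dev_reply, "uu",
                           (uint32_t) major(st.st_rdev), (uint32_t) minor(st.st_rdev));
        sd_bus_message_read(dev_reply, "hb", &drm_fd, &inactive);
        drm_fd = fcntl(drm_fd, F_DUPFD_CLOEXEC, 3); /* fd is owned by the message; keep our own copy */

        printf("got DRM fd %d (inactive=%d)\n", drm_fd, inactive);

        sd_bus_message_unref(dev_reply);
        sd_bus_message_unref(session_reply);
        sd_bus_error_free(&err);
        sd_bus_unref(bus);
        return 0;
    }

After TakeDevice returns, rendering and modesetting go straight to the kernel through that fd; logind only gets involved again for VT/seat switches, which is exactly why stracing it shows it idling in epoll_wait().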
It's to handle multiseat scenarios, where a single linux box serves multiple "seats", i.e. several monitor+input sets facilitating multiple seated IRL users.
Imagine multiple kiosks in a train station/airport running off a single box in a corner. Or a library with a dozen monitors+keyboards hanging off a single box. Consumer PCs are so powerful these days it's quite practical to do such things, at least in terms of the hardware.
*NIX has historically cared about such multi-user use cases, and Linux/Libre software specifically has proudly facilitated making more use of our hardware vs. being wasteful by forcing purchase of more machines through superficial software limitations. The entire era of shared web hosting that arguably made Linux a commercial success throughout the 90s and early 2000s was a product of hosting more customers per less expensive commodity PC.
I really wanted to like wayland as the 'next big thing', but for the life of me I can't.
* I've yet to see any actual, honest to god benefits for the end user. Literally everything I've seen is the fact that it makes development easier for the wayland devs and harder for literally everyone else.
* Trying to use a wayland window manager outside of gnome or kde leaves you reliant on a bunch of hackish utilities (redshift, taskbar, screen recording / sharing, etc) which are available on AUR and basically nowhere else
* Scaling and perf for xwayland apps (i.e. most apps) is pretty bad.
Is wayland the future? Maybe, but I'll happily use xorg for the next decade if needed until the user experience is rock solid over on wayland.
I used to be a Wayland maximalist and have somewhat given up. X11 works just fine. The theoretical security benefits haven't really been realized and the incompatibility inside of the ecosystem due to the aggressive "out of scope" response by the Wayland team massively killed my enthusiasm. All it has done is cement the Gnome monopoly because instead of general-purpose desktop tools, you now mostly have Gnome shell (or KDE) extensions for everything and all other desktops are left out.
I have no experience with variable refresh, but I have been using up to three monitors with different resolutions, up to 4k, under X11, for almost a decade, without any problems whatsoever (mostly with NVIDIA GPUs, where their Settings utility simplifies the configuration of a multi-monitor layout).
X11 is very far from an ideal graphics system and I would like to see it replaced by a better system, which would still have to also implement the X protocol for the legacy applications.
Nevertheless, I have not seen any argument yet that would indicate that Wayland is the appropriate replacement for X11. On the contrary, some of the ideas on which Wayland was originally based were definitely wrong and they have shown that the Wayland developers lacked experience about how many computers are used.
Even if some of the initial mistakes have since been patched, that lack of vision at Wayland's origin makes me skeptical even about the quality of the Wayland parts about which I do not know anything.
Not defending wayland, but X11 has serious downsides for me with multiple monitors.
Are you using fractional scaling (or different scaling values per monitor)? Does vsync work? How about the scrolling stutter at high refresh rates? And if you want to use variable refresh, you have to disable/unplug all the other monitors and reboot.
All my monitors are fixed 60 Hz, so I have never used variable refresh and there is no scrolling stutter.
Vsync works, but I must choose which of the monitors it synchronizes with.
For about a decade, I have used only 4k monitors, even starting with the early models that were seen by the computer as multiple monitors, because HDMI and DisplayPort could not carry 4k @ 60 Hz on a single link at that time.
Nevertheless, I still cannot understand why anyone would want to use any kind of "scaling" in relation to a monitor.
Any kind of "scaling" is guaranteed to generate sub-optimal images.
The correct way to deal with monitor resolutions is to set for each monitor the appropriate dots-per-inch value, depending on the monitor size and resolution.
With the right DPI value, all typefaces and vector drawings will be rendered beautifully. Scaling is never needed per monitor, but only per window, either for the windows containing bitmap images, i.e. pictures or movies, or for the windows that contain GUIs implemented by incompetent programmers in Java with typefaces sized in pixels, instead of using scalable typefaces sized in points, like any decent (non-Java) GUI.
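To make the arithmetic behind that concrete (a trivial sketch, not from the thread): a point is 1/72 of an inch, so once the DPI is set correctly the pixel size of text follows the monitor automatically, with no global scale factor involved.

    #include <stdio.h>

    /* 1 pt = 1/72 inch, so pixel size = points * dpi / 72 */
    static double pt_to_px(double points, double dpi) {
        return points * dpi / 72.0;
    }

    int main(void) {
        printf("12 pt at  96 dpi = %.0f px\n", pt_to_px(12, 96));   /* 16 px */
        printf("12 pt at 192 dpi = %.0f px\n", pt_to_px(12, 192));  /* 32 px */
        return 0;
    }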
I did not need to set different DPI values for each monitor belonging to a multi-monitor layout, so I do not know if that is possible in X11.
There is one DPI setting per X server I think, and I use 192 because I have two 4k screens. Which forces me to xrandr the 1440p display at 2160p then output a blurry 1080p to match the 4k screens at 2x scale.
As for why use scaling, I can't read anything at 4k native, and the UI elements don't scale correctly with the font DPI. Windows has fully functional 150% and 175% options (one of the few things Windows does well now), and macOS has really nice global super-sampling options between 1080p and 1440p HiDPI.
I am also using 192 dpi with a pair of 4k monitors, a 27-inch and a 24-inch, both at the native resolution of 2160p, so everything is crisp.
With the default font size set at 12 points, everything is very easy to read. Even 10-point or 9-point fonts are still comfortable, despite the fact that otherwise I need glasses for reading small printed books.
Under XFCE, almost everything is scaled fine on 4k and 192 dpi. The only exception is that some of the scrollbars remain rather narrow, but they are still usable.
> On the contrary, some of the ideas on which Wayland was originally based were definitely wrong and they have shown that the Wayland developers lacked experience about how many computers are used.
Wayland's "just don't send callbacks to hidden windows" approach is completely backwards and should have been replaced by the same event-based visibility notification every other GUI uses. The scaling model was wrong, but they've finally admitted that and real fractional scaling (as opposed to over-rendering+downscaling) is close. Wayland blandly dragged X11's biggest technical debt -- implicit synchronization -- along with it, even though every other modern GUI synchronizes explicitly (i.e. moves a lot of work out of the critical path of drawing a frame).
Most significantly, the "only the focused app should be able to read input" is wildly, fantastically wrong, mind-bendingly deviated from the norm on literally every other graphical user interface in the history of humanity, and utterly incomprehensible to anyone who is not an outright "cybersecurity" fetishist. Imagine a windowing system where you are playing a video game with a USB controller, you mouse over to a window to send a text message and your video game loses the ability to process the controller input. This is clearly surprising behavior, if not outright user-hostile, and only Wayland gets this wrong. Rather than fixing it, Wayland devs gaslight users and developers into believing anyone who opposes this bad behavior is anti-security and therefore pro-badguy, or something. So, each separate desktop environment tribe is now having to produce their own bespoke utilities to mux input clientside to enable normal use of computers, because the Wayland protocols are empowered by divine right instead of technical merit.
I want to add that despite the above litany I do regard Wayland as a marked improvement over X11 and I have no interest in going back to the bad old days. None of the problems with Wayland are unfixable in a 2.0 or so bad that it's worth going back to X11. The actual big problem with Wayland is the unwillingness of the core developers to take advice, but that was a problem back when they were the X11 core developers, and I don't see it changing. For those of us who don't have to interact with them, it's a non-issue.
> The actual big problem with Wayland is the unwillingness of the core developers to take advice, but that was a problem back when they were the X11 core developers, and I don't see it changing.
Is there a solution to this? Any alternative at all? Like, I agree with this problem, and it makes me almost irrationally angry that fundamental software on what should be a flexible and open platform is built by people who are so hostile to basic settings or extensible functionality that they are even more egotistical than Apple engineers :/.
The fundamental problem is this kind of stuff is hard work and to a first approximation, nobody wants to do it. So if the X11 core developers and the Wayland developers (who are generally the same people) want to do it their way, and their way is better than nothing, we kind of have to deal with their way, don't we?
I mean, I can run X11 as long as possible, but sooner or later, I'm probably going to have to deal with Wayland. Especially since I'm also not willing to consider Mac because I've suffered enough, and Windows 11 looks like it might piss me off enough to go back to an opensource desktop.
Alternatives probably look like borrowing from other projects that have managed to wrangle things. Android doesn't use X or Wayland, afaik, but I don't know that it makes a good base for a desktop. I believe ChromeOS uses X11 (EDIT: I'm probably wrong, looks like they use Wayland) and their own window management etc, that doesn't help if you don't like X11/Wayland.
Otherwise, maybe it's possible to build on top of Windows APIs. There's NDISWrapper; maybe someone crazy could build something to use GPU drivers for Windows, and run wine or something. If you look around, you'll see articles arguing that the portable executable format for Linux GUIs is Windows PE, and it sort of makes sense, a little. That'd be a big transition in expectations though.
I really like the idea of window isolation - I can run something in a container and not care about it misbehaving unless it’s leveraging a zero day, but I also agree that screen sharing / recording is a fundamental need. Surely there’s a middle-ground, like policy based access?
Making a GUI that supports isolating a client is a great innovation. Making a GUI that force-isolates ALL clients is obnoxious. I really think the functionality should be behind a gate like OpenBSD's pledge/unveil tools -- a process (or cgroup etc) should declare that it should be isolated, then launch whatever. Otherwise it should continue to work the way people expect computers to work! But this and other suggestions were disregarded because security.
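For reference, the OpenBSD opt-in model being pointed at looks roughly like this - the process declares its own restrictions before doing anything interesting, rather than the window system force-isolating everyone. These are OpenBSD-only calls and the path is hypothetical; elsewhere this is only an illustration of the shape of the idea.

    #include <err.h>
    #include <unistd.h>

    int main(void) {
        /* Limit filesystem visibility to what this program actually needs
           (the path here is made up for the example). */
        if (unveil("/usr/share/myapp", "r") == -1 || unveil(NULL, NULL) == -1)
            err(1, "unveil");

        /* Then drop to a small set of allowed kernel facilities. */
        if (pledge("stdio rpath", NULL) == -1)
            err(1, "pledge");

        /* ...continue (or exec the real client); everything after this point
           runs under the restrictions the process itself asked for. */
        return 0;
    }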
The one that has been most frequently discussed was the lack of support in the beginning for screen sharing and the like, but there are many others.
X11 is such a central part of the software required to use a computer that it is not acceptable for a substitute to implement only a subset of its functions.
For any X11 replacement, the architecture should have been conceived from the beginning to enable the implementation of all X11 functions, even if for some of them lower performance caused by interposed compatibility layers would be acceptable, in the hope that they will eventually be deprecated.
It should have been obvious for the Wayland developers that some of the X11 features will never be deprecated, e.g. screen snapshots, screen sharing or remote desktop access.
Instead of trying to remove such features, they should have tried from the beginning to find better solutions for them than in X11.
While a network protocol should be implemented by a program distinct from the graphics system, when designing any modern graphics API one should ensure that it has good compatibility with something like the Remote Desktop Protocol.
In general the Wayland designers have attempted to minimize the work they had to do by claiming that various functions belong in other programs, but their arguments are not convincing.
Most, if not all, functions of a window manager should be implemented in the same process as the graphics system, even if the window manager should be some kind of replaceable plugin. That includes the window decorations. A GUI application must be concerned only with the client area of a window.
Moreover, it is inexcusable that Wayland has not been designed with an up-to-date color management since the beginning.
> While a network protocol should be implemented by a program distinct from the graphics system
Who uses just one computer anymore? We shouldn’t make architectural choices that assume one GPU local to an app server, because in real life GPUs are found attached to each user’s display.
My multi-monitor X11 setup works fine, thank you. But my setup is mostly static, did you mean dynamically adding/replacing monitors? I think I remember doing that without issue on KDE some 10 years back.
Wayland needs a restart. I get (and applaud) the aggressive "not in scope" as the developers don't want to be maintaining the universe and dealing with nVidia, AMD, and Intel is difficult enough. I really wonder if implementing this stuff in something other than C/C++ (Go, Nim, D, Ada, Zig, Rust, etc.) would be an improvement as it would force developers to come to grips with "No, you DON'T own that memory and you can't bash on it and people who want to use your system won't either. Figure out how to abstract that."
It's become pretty clear that their abstractions aren't correct. If I have to link against a gigantic C library just to get window decorations, we have a deeper problem.
> Currently there is 11 THOUSAND lines of Rust in wlroots-rs. All of this code is just wrapper code, it doesn’t do anything but memory management. This isn’t just repeated code either, I defined a very complicated and ugly macro to try to make it easier.
> This wrapper code doesn’t cover even half of the API surface of wlroots. It’s exhausting writing wlroots-rs code, memory management is constantly on my mind because that’s the whole purpose of the library. It’s a very boring problem and it’s always at odds with usability - see the motivation for the escape from callback hell described above.
Disclaimer: I don't know Rust beyond println!("Hello World"); C is my preferred language for decades now.
The example Rust code in that linked post is just an absolute unreadable horror show. If that's representative of what it's really like to interop C with Rust or just write real Rust code in general, I'm strongly discouraged from bothering to put more effort into the language.
Can anyone skilled and experienced enough with Rust to judge what's going on there comment? Is the author just a noob making a mess of things as anyone can do in any language, or is this limited to C interop? Or is Rust normally such an inscrutably verbose disaster?
I've written a bit of Rust/C interop. It doesn't look like that; most of the time it's... not pleasant, since it's interop, but certainly not this terrible.
On the other hand, I can't think of a way to do it better either. wlroots might just be uniquely horrible here.
Author here. Certainly part of the ugliness of wlroots-rs is my fault (unsafe code + macros is not a fun combination).
However, I tried quite a few ways to make this safe and ergonomic. Perhaps some better way exists that I missed, but even if a solution exists the work needed to implement it is very, very boring.
wlroots is very much a C API, but I admit I was expecting it to be easier to use from Rust, and that is my current biggest criticism of the language.
Normal Rust is usually not this inscrutable - however, it does have a lot of features that could make it this hard to read. It requires a certain amount of discipline that you don't need in simpler languages like Go. I for one never write macros anymore if I can help it.
> It's become pretty clear that their abstractions aren't correct. If I have to link against a gigantic C library just to get window decorations, we have a deeper problem.
Is there a write-up of the critique of this issue?
I read the Waycooler blog posts but they didn't go into details.
Lack of server side decorations in the core spec, and limitations of wlroots, are unrelated so I'm not sure what they're getting at (except to support the larger idea that Wayland is poorly designed, which I disagree with. However, I think I'm a little too attached because of my involvement)
Wayland and the surrounding ecosystem is written with the fundamental assumption that you will be coding in C/C++. Abstractions and architectures were written assuming that.
In the last 15 years, that assumption has broken. People want to write applications specifically avoiding C/C++ and even Gnome/KDE.
Wayland has not adjusted to this new reality. It is not clear that it may even be able to.
The problem is that this kind of graphics system programming is a gigantic pile of work while being very fiddly with non-obvious details that you have to get correct. Very little of it is "fun" and the people qualified to be doing it are capable of earning a lot more money doing basically anything else. The fact that almost exactly the same people doing Wayland were the ones doing X11 previously shows that nobody is jumping up and down for the chance to attack this.
You can link against libdecor to get window decorations. It is a young project and there are certainly lots of issues with it, but at least it is not one of the big GUI toolkits.
FWIW, I much prefer learning and using 4-5 "hackish utilities" to the giant blob of design-by-committee that is Gnome or KDE. Redshift took like 1 minute to set up once, and has worked flawlessly w/o intervention since. Taskbar is actually better than on any distribution including Mac or Windows, IMO. I can put anything on the system in my task bar with a few minutes of work. Try that anywhere else. Screen recording is still a bit buggy, but usually because projects (like Electron) haven't updated their dependencies.
Love Sway.
I'd kind of say it the other way around. I don't notice any real downside over X, configuration is much nicer, and apparently it's more secure or something? Why not.
1. I was told to ~piss off~ because I use Nvidia for Blender and don't want to switch to AMD. The hostility was very discouraging.
2. I adjusted my workstation long ago so I wouldn't have to deal with the screen issue. Yeah, it would be nice to use differing monitors, but before Wayland was really viable I had to solve the problem properly to keep working. The net result is that this killer feature had to be solved by me with hardware before Wayland could even have a chance to solve it with software.
I can understand that. This doesn't excuse the hostility, but also understand that, to the developers, it's been a huge slap in the face that nvidia didn't feel like adopting the standard that the rest of the Linux graphics community had adopted. Having to support another rendering backend, just because nvidia chooses to -- yet again -- abuse the Linux community, is hard to feel good about.
Now, it's inexcusable to pass that hostility on to regular users, assuming they are asking politely, and aren't acting entitled to a solution. But I understand where the hostility comes from.
(It's also telling that nvidia has, more recently, decided to stop doing their own thing, and adopt the established standard. Even they realized -- though it took them way more time than it should have -- that what they were asking was too much.)
Well, what seemed like an attempt to split the ecosystem didn't work and the ship was sailing in the opposite direction. The community holding firm helped the greater good in this case.
I find the general defaults in most Wayland compositors to be much more pleasant than X-org defaults.
Libinput and Pipewire in particular solve a huge selection of issues I would routinely run into on linux machines in the 2010's.
Not limited to wayland, but definitely pushed by wayland and developed in tandem with a similar set of goals (renovate mostly legacy spaces that are critical in the end-user space).
I have also - not ever - had to touch a terminal/file editor to make an external display work with wayland. Which... is just hands down pleasant. And frankly an experience that does not exist with X. For the most part - I just plug it in and things work as expected. Including much better scaling/hidpi support.
Honestly - not having to mess with conf files for basic input/output has been a hallmark of my experience with Wayland.
I enjoy quite a bit more deep diving in linux than most any typical user (I run arch as a daily driver and host about 15 apps in a k8s cluster in my basement - I'm not afraid of some configuration), but I'm tired of wasting time making devices work like they should. Personally - I don't find that experience enjoyable or satisfying. I'd like my mouse/touchpad/monitor/headset to just work when I connect them, and wayland based compositors seem to get those mostly right.
I switched 5 years ago - back when screen sharing was still... rough, and a lot of the benefits weren't present because major applications (namely - browsers) were still running in Xwayland. The experience today is just downright pleasant. I enjoy my wayland boxes far more than my work macbook, and the last hold-out has been a Windows 10 box for games, which thanks to valve and wine will not be getting upgraded to windows 11.
> Trying to use a wayland window manager outside of gnome or kde leaves you reliant on a bunch of hackish utilities (redshift, taskbar, screen recording / sharing, etc) which are available on AUR and basically nowhere else.
Because redshift is designed for X11. If you want redshift for Wayland then you use gammastep. Taskbar options if you don't want to use whatever is part of a common window manager would be Waybar, swaybar, and i3statusrs. Screen recording works well with OBS. All this works without relying on the AUR BTW.
Nope, wrong, this doesn't work for KDE or GNOME or most of the big compositors for that matter. There really is no standard way as I mentioned in another comment. Anyway, from the gammastep README:
> Why doesn't this work with Wayland compositors like Mutter, KWin, Mir, and Enlightenment?
> This program supports the wlroots protocol for gamma adjustments, but several Wayland compositors do not support this protocol. GNOME and KDE have their own built-in mechanisms for adjusting color temperature on Wayland.
For GNOME, I might as well say here for reference, using gsettings to adjust the keys in org.gnome.desktop.a11y.magnifier seems to work for now, but I'm sure they'll find a reason to remove that soon enough.
The original comment was that redshift was needed outside of Gnome or KDE, so I think the comment you're responding to is addressing that. It still seems like a legitimate gripe that it doesn't work for many other compositors, but I think the inclusion of KDE/Gnome here is a bit of an accidental goal post shift for the parent comment.
Yeah, I didn't bother with the proper names for these. I needed something to make my display color warmer, control brightness, a proper lock / login screen, sanely configurable task bar.
These exist, but while they are first class citizens on AUR (which is really just building from source), they are mostly absent from the ubuntu repos. You wind up chasing down and compiling nine different dependencies for a fecking backlighting program before realizing you just don't care anymore and you happily switch back to i3wm on xorg and move on with your life.
> The original comment was that redshift was needed outside of Gnome or KDE,
but even if you use gnome or kde, you can still use redshift in X11, because these things are independent. E.g. maybe you have a redshift config file associated with a screen / location; it shouldn't matter which DE you are using for it.
Trying to use a wayland window manager outside of gnome or kde leaves you reliant on a bunch of hackish utilities (redshift, taskbar, screen recording / sharing, etc) which are available on AUR and basically nowhere else
LOL as a long time "just a WM" user, I don't find any of these to be that hackish.
I'm using wayland right now not because I give a f*ck about graphics servers, but because, having a 2K laptop monitor and two 4K external monitors, I find the "monitor-independent scaling" supported by wayland better than X11's "one scaling for all monitors". I just had to solve an issue related to screen sharing in browsers, but following the archlinux wiki was straightforward enough for me.
But yeah, I use KDE, so for other DEs it's probably not worth it?
> * I've yet to see any actual, honest to god benefits for the end user. Literally everything I've seen is the fact that it makes development easier for the wayland devs and harder for literally everyone else.
When Wayland was proposed, its biggest benefit was for direct rendering. At the time AIGLX was still the main hack for getting decent compositing/GL performance and had some major drawbacks. Wayland was designed in a manner more similar to the DRs in other desktop OSes (Windows and OS X, primarily). But, in its long development and adoption period, X's DRI/DRM infrastructure has had plenty of time to mature and many of those arguments have been made moot. In fact, it still generally outperforms Wayland in many cases:
For what it's worth, most (all?) the comparisons in that phoronix article [1] compare the performance of apps running under XWayland (in a Wayland session) versus an Xorg session. That's because most steam games use SDL 2 which still uses XWayland by default (the default has been changed in SDL 3 though [2]).
So using the data from that article to conclude that "[Xorg] still generally outperforms Wayland" seems wrong. A comparison between native Wayland apps vs Xorg apps would have been a lot more relevant though.
> I've yet to see any actual, honest to god benefits for the end user.
X11 reduces my laptop battery life to about 33%. Possibly because the refresh rate is 144Hz, I don't know. Battery life is a substantial use-case.
The perceived latency is also way lower, resulting in a more pleasurable experience (and that matters).
> Trying to use a wayland window manager outside of gnome or kde leaves you reliant on a bunch of hackish utilities
That's due to the philosophy of those compositors (compose something bigger with simple tools, i.e. the unix philosophy). Nothing in Wayland is preventing the implementation of a full shell (as Mutter and KWin prove).
> Scaling and perf for xwayland apps (i.e. most apps) is pretty bad.
The fractional scaling extension is pretty new, but was designed to rectify this.
My big peeve with Wayland is Gnome and their f$&#(_ing refusal to implement decorations. And screen sharing is mostly broken on Ubuntu 22.04 (but that's probably because Ubuntu is utter shite; screensharing has been rock-solid on my distro-hopping personal PC).
> * I've yet to see any actual, honest to god benefits for the end user. Literally everything I've seen is the fact that it makes development easier for the wayland devs and harder for literally everyone else.
You can see some small benefits in GNOME like movies still playing while window switching, nice shadows, no more tearing, etc. but honestly who cares but aesthetes. None of this was in popular demand. The security is nice but when was that ever exploited in the wild? Mostly I see downsides, like now there's multiple ways (or no way in some compositors) to dim screens...
I'm sure eventually we'll get all our features back but for now it's mostly a bunch of small irritations that add up.
I mean in thumbnails and while switching workspaces and all that... There's never any frozen frames, skips or anything. Yes, eventually X got compositing, but not for most of its life and not in mainstream WMs.
Yeah, I did mention that in my comment and also that it's not Metacity/Mutter, KWin or popular. On the other hand, I've never seen Compiz perform this well.
I'm a very casual Ubuntu user, but I use external monitors with my laptop closed. I fixed the "5 second input lag when main display is off" bug a bunch of times, it seemed to break again every third time I restarted. Eventually I saw a snarky "or you losers could just switch to Wayland" comment, I tried it, and I haven't had a problem since.
I'm learning a lot about the politics of the situation in here! For me, Wayland worked.
Frame-perfect rendering, and heterogeneous and fractional DPI, are the main user benefits today. Future benefits that we might get as a result of the improved stack include better crash recovery for display servers, better multi-GPU and hotplugging functionality, and better support for variable refresh rate/HDR/etc. Also, the new stack is about far more than just the Wayland devs, but as someone who has implemented a Wayland client from scratch in Go, I do not believe it is at all worse than Xorg. The biggest issue I can see today is rough edges where different compositors implement the protocols differently. But that seems to improve over time, and users of UI toolkits generally needn't care.
Most Wayland compositors outside of GNOME, KDE and Weston are wlroots-based. Whether the packages have been added to Arch repos is kind of neither here nor there, a lot of them are reasonably mature and will do what they say. Wlrobs for example will work just fine if you want high performance screen capture inside of wlroots without relying on desktop portals.
XWayland apps scale fine, they just don't render at high DPI. I agree this is bad if it's most apps. That said, you may not have realized but a lot of apps that start in XWayland today work fine with Wayland/XDG Shell with a command line flag or environment variable. Chromium and Electron apps will switch soon when the ozone platform hint flag is changed to default to "auto". That means that Firefox, Chrome, Electron, SDL, GLFW, Qt 5, Qt 6, GTK 3 and GTK 4 apps can all start in native Wayland assuming the apps themselves don't depend on X11, including many tricky ones like OBS. Even Wine might soon have a native Wayland driver[1], further knocking down the number of apps that hard-depend on X. (For me, the main ones I can think of are Krita and dolphin-emu.)
I think the vast majority of Wayland issues today just stem from the NVIDIA experience. And to be fair, that is a completely valid reason to not use Wayland. It's a shame though, because the developers behind Xorg have no interest in maintaining parts of X outside of XWayland. It's tough to break many of these chicken-and-egg problems with 80% of the Linux user base being shafted. Of course, it's not just about Wayland, it's about the whole stack. NVIDIA users had to wait longer to get many basic creature comforts, like KMS support. It's very telling that only one major GPU vendor has issues like this with open source OSes, it's just unfortunate that it has to be the biggest one on desktop.
Of course, everything is always just around the bend. But with improving NVIDIA drivers and concrete timelines for apps defaulting to Wayland when possible, 2023 seems like it might just be the turning point. I'm patiently awaiting new NVIDIA driver releases... (But for now, you will have a way, way better time on Intel and AMD. It's leagues ahead.)
I'd rather not. More people should write wayland clients though, and complain to the makers of existing compositors where they fall short. Which is in a lot of places, and they are definitely not sufficiently robust.
That makes no sense. Those are big and don't fit all purposes. It would also be a shame, as the underlying protocol is just fine and reasonably easy to integrate with, even if it is poorly documented in many places.
Which is not really something that can be said about the big toolkits, which tend to dictate how execution happens.
But that attitude certainly seems common among the people that build compositors. Having more alternative implementations would be very healthy for the environment.
Yes. Similarly, a crash of Xorg will bring down all running applications. However, thanks to standardized interfaces in X11, both the window manager and the compositor can independently crash or even be replaced at runtime by a different program without affecting running programs. This just shows how much further ahead X11 is on a conceptual level and how bad the Wayland design philosophy really is.
If X works, why bother? If you're missing something it's not hard to test, but if there's nothing driving you to it then why bother at this point? It'll keep getting better over time
However, applications that just grabbed X11 framebuffer won't work. They have to use respective APIs with user control ("portals") for that. Some do (Chrome, Firefox), some don't (MS Teams).
If you're forced to use an application that only supports the X API, you can kludge it to work by VNCing to yourself with x11vnc and then screencasting the x11vnc window.
I use it every day. I can share screen in Google Meet, Zoom, Slack Huddle, and many others. It doesn't work in Signal, though, because Signal is using an outdated version of Electron.
Since each company sort of sticks with one tool for every employee, it could be kind of a deal breaker if they pick a tool that can't screen share on Wayland.
A friend of mine couldn't share his screen on Meet or Teams, and I had to change it from Wayland to X11 to make it work. His was an Ubuntu 22.04 (GNOME) install, about 4 months old. Wonder why that happened.
No! The Wayland protocol provides no mechanisms for screen sharing and it is not intended that it ever will. Screen sharing has to be done by external protocols that are either compositor specific or using a somewhat standardized dbus protocol that is completely separate from Wayland.
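As a concrete illustration that this lives outside Wayland entirely: the standardized route is the xdg-desktop-portal ScreenCast D-Bus interface, and you can probe for it on the session bus without touching the compositor at all. A minimal sd-bus sketch (the bus name, object path, interface and property are the documented portal ones; error handling is trimmed):

    /* Build (roughly): cc portal-probe.c -lsystemd */
    #include <stdint.h>
    #include <stdio.h>
    #include <systemd/sd-bus.h>

    int main(void) {
        sd_bus *bus = NULL;
        sd_bus_error err = SD_BUS_ERROR_NULL;
        sd_bus_message *reply = NULL;
        uint32_t source_types = 0;

        sd_bus_open_user(&bus);   /* the portal lives on the session bus */

        int r = sd_bus_get_property(bus, "org.freedesktop.portal.Desktop",
                                    "/org/freedesktop/portal/desktop",
                                    "org.freedesktop.portal.ScreenCast",
                                    "AvailableSourceTypes", &err, &reply, "u");
        if (r < 0) {
            fprintf(stderr, "no ScreenCast portal backend available (%s)\n",
                    err.message ? err.message : "unknown error");
        } else {
            sd_bus_message_read(reply, "u", &source_types);
            /* bitmask: 1 = monitors, 2 = windows, 4 = virtual */
            printf("ScreenCast available, source types bitmask: %u\n", source_types);
        }
        sd_bus_error_free(&err);
        sd_bus_message_unref(reply);
        sd_bus_unref(bus);
        return 0;
    }

The actual capture is then delivered as a PipeWire stream by whichever portal backend is installed (xdg-desktop-portal-wlr, -gnome, -kde, ...), which is why screen sharing behaviour differs per desktop rather than per Wayland protocol version.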
It seems to work for a lot of people, but I still get constant crashes using xdg-desktop-portal-wlr, despite trying multiple GPUs, extensive testing etc. I will probably dive into the code when I get some time.