GNOME Shell and Mutter (feaneron.com)



The problem of supporting default server-side decorations in Mutter is still not resolved. Toolkits like SDL and applications like mpv can't make basic window controls (such as the title bar and title bar buttons) look consistent without it, unless they start depending on GTK.

The issue seems to be on both sides. SDL, mpv, etc. don't want to depend on GTK, but it looks like the Mutter developers don't want to provide server-side decorations either. The end result is very messy.

Unlike Mutter, KWin for instance does depend on Qt, and it provides server-side decorations, which allows such non-GUI-toolkit windows to easily get minimal controls and a title bar.


I consistently have a problem running MATE Desktop where I have to disable CSD in apps that try to use it by default - off the top of my head, Chrome, Chromium, Firefox and VSCode all have CSD functionality but are unusable in MATE unless CSD is disabled.


For me Firefox seems to use SSD by default (maybe because I'm using KDE), unless I disable Customize > Title Bar; then it starts using CSD, which looks rather bad.


Got a link to a description of client- vs. server-side decorations?


A few years ago, some KDE developers invested quite some time in a CSD/SSD hybrid concept called DWD, marrying the best of both worlds. Unfortunately it didn't gain any traction, but I still have hopes they'll tackle it again once all the Wayland work is mostly done.

https://kver.wordpress.com/2014/10/25/presenting-dwd-a-candi...

https://kver.wordpress.com/2014/10/25/what-if-kde-started-us...


It was a terrible idea: inventing a new GUI toolkit for the window manager that applications would have to support (and that they couldn't use with other window managers), instead of just allowing applications to draw in the title bar / do their own decorations.


Note that Mutter already draws decorations for X11 windows. There's also a standardized Wayland protocol to make the compositor draw window decorations. It's just that GNOME doesn't like server-side decorations, from a design point of view.


It is not due to a design point of view. It is about synchronizing painting the chrome with painting the window content, for frame-perfect rendering. With SSDs, where a different process has that responsibility, you can't really do that.

Rasterman, of Enlightenment WM fame, also wrote about this, and he holds the same opinion as the GNOME folks. He also noted that other OSes (i.e. Windows and macOS) also do client-side rendering. You just don't get to choose whether you will link the library that does it, since it is required to open the window surface in the first place.


And it's always amazingly convenient and user-friendly when a macOS application freezes and you can't move its windows or interact with it at all. Especially when it's fullscreen. At some point perfection in one area impedes functionality in another, and I think one frame of lag between the window frame and the window contents is an acceptable price to pay for always-accessible window controls. I know I'll never use GNOME and I'll live happily, but the issue is that their stubbornness starts leaking GNOME implementation details to everyone else. They are not acting like good community contributors.


In most WMs/DEs/compositors, you can move any window with Mod+left-click drag. Works on frozen clients just fine :)


Please remove this silly JS engine that's been implemented poorly. It didn't attract developers en masse, and it has ruined performance.


Fully agree. Ubuntu dropping Unity for GNOME made me a happy XFCE user.

I still tried to use it for a couple of months, but the performance difference is quite noticeable.


I don't think the JS interface is the main performance bottleneck, honestly. I would start to look at Clutter and modernizing the GPU backend (e.g. using UBOs, using the depth buffer, batching texture binds, etc.) before thinking about killing the JS interpreter.


It may or may not be anymore, but at one time not long ago it definitely was. When I cared enough to actually profile its performance, you could literally just move the mouse to the top corner to expose the dashboard and back, and each time you did this, 4 MB of memory was gone for good. That particular bug, a JS engine memory leak, has since been fixed. In fact, I invite anyone to look at the number of 'memory leak' commits in GNOME 3 / GNOME Shell regarding its JS use.


Might be, but it really does lag behind what Unity was capable of, performance-wise.


With GTK 4 being based on a GPU-accelerated scene graph, isn't it time to deprecate Clutter?


This is a complex question. Using GTK+ in a window manager / compositor is tricky because GTK+ itself wants to talk to the window manager / compositor. So it's easy to end up in a deadlock. A DRM/libinput backend would have to be made for GDK and that's not an easy task, there are a lot of conceptual mismatches. I'm torn on whether it's the right way to go, basically.


Wait what? There are hundreds of popular gnome extensions.


With various degrees of CPU hogging.


which programming language has a similar popularity and would exhibit less CPU hogging?


If you want specific examples, C and C++ to start with, which the old GNOME plugins used to be written in, before they went JavaScript crazy with the UI.

And it doesn't need to be C or C++; anything else that compiles to native code AOT would do as well.


Neither C nor C++ has the same developer reach or safety.


Developer reach in what sense?

Script kiddies writing GNOME extensions with 100% CPU use and memory leaks?

As for safety, yes, C++ and especially C are quite good at memory corruption bugs, which I am quite aware of, if you look at my comment history.

Exactly because of that I took care to expressly mention: "And it doesn't need to be C or C++; anything else that compiles to native code AOT would do as well."

And while JavaScript might be safe in terms of memory corruption issues, there are enough WAT opportunities in the language.


Disagree. The "Dash to Panel", "Gnomesome" and "Topicons redux" extensions have got GNOME back to exactly where I wanted it. They're not extensions that add any appreciable CPU usage, because they're altering shell behaviour rather than rendering stuff in a tight loop.

My favourite tiling WM is awesome, and the majority of its interesting functionality is scripted in Lua, which is fast but more of a niche language.

However, awesome doesn't support Wayland, and the awesome-clones for Wayland are all early-development and don't work very well for me.

GNOME supports wayland, has session management, styles things consistently, has sensible defaults, and thanks to the JS engine and the extensions repo, can do exactly what I want it to do.

(Except for those !£$"£$% gigantic title bars which GNOME apps like to stuff navigation things into, so you can't really remove them, wasting SO much screen real estate....)


> (Except for those !£$"£$% gigantic title bars which GNOME apps like to stuff navigation things into, so you can't really remove them, wasting SO much screen real estate....)

The point of the gigantic titlebars and widgets in GTK3 is to support touchscreens - you can "grab" the big titlebar on a touchscreen and move the window easily, which you wouldn't be able to do with traditional window decorations. Once you're constrained to a bigger titlebar, having widgets in there as well is just sensible. Windows 10 has chosen the alternative of just making the titlebar thicker without stuffing anything in there, which wastes more screen space! (I still think there should be a traditional, mouse-oriented layout for systems which don't support touch-- but guess what, that's quite hard. The default should be touch-friendly.)


How about a PR? /s

No, but really, saying things like "I hate the technical decisions made by your project, please change it" without really participating in the project is silly. It is like "why not Rust?" all over again.


It's 2019 and the leading Linux DE still doesn't have fractional scaling. Doesn't a single dev use a 1440p screen?


No matter how you implement fractional scaling, it's not going to look good. The reason is simple: half pixels don't exist. If an app wants to draw a 1px solid border, it can't draw it on a 1.5x-scaled display, because 1.5 pixels don't exist.
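
To make the half-pixel problem concrete, here is a tiny Python sketch (my own illustration, with made-up numbers) of what happens to a 1-logical-pixel border under a 1.5x scale:

    # 1 logical pixel at a 1.5x scale factor
    scale = 1.5
    edge_logical = 10      # logical x coordinate of the border
    width_logical = 1      # border is 1 logical pixel wide

    edge_device = edge_logical * scale    # 15.0 -> still lands on a pixel boundary
    width_device = width_logical * scale  # 1.5  -> there is no half device pixel

    # The renderer must round to 1 or 2 device pixels (uneven borders)
    # or anti-alias across two pixels (a slightly blurry border).
    print(edge_device, width_device)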


It has looked good on macOS for years now. Even Windows figured out a way to make it work on those displays (although it's worse than on macOS).

So your explanation, while common amongst Linux devs, is a rather cheap cop-out compared to the two bigger competitors, who have mostly solved the problem.

The end result is that Linux DEs can be horrible to use on many HiDPI laptops, which don't have resolutions that make integer scaling feasible.


macOS doesn't expose fractional scaling to applications, it is done by the output encoder by scaling the entire framebuffer. For software, it is strictly integer only.

It is also careful about which resolutions it supports; it is not just any random scale requested. It always scales down, without losing many pixels. For example, with a 2560x1600 physical display (rMBP13), you can go to a logical 2880x1800 (@1.78 scale) or 3360x2100 (@1.52 scale), but not further.

With Linux (and also Windows), people have weird requests that would not fly in the macOS world.


>macOS doesn't expose fractional scaling to applications, it is done by the output encoder by scaling the entire framebuffer. For software, it is strictly integer only.

Wayland works the same way.


I haven't looked into the sources, but my impression was that it uses the GPU to scale surfaces and then compose them into the final framebuffer, which has the same resolution as the display. Not that it uses the output encoder.

The quick test is: when I take a screenshot, I get a file with the same dimensions as the physical display, not the logical resolution. On macOS, it is exactly the other way around.

Scaling with the GPU is more flexible; you can do things per surface instead of per desktop. But it is slightly more complicated (because you do things per surface instead of per desktop), it takes GPU capacity and memory bandwidth away from applications when they might need it, and it is more power hungry when the GPU could otherwise be idle.


That's up to the compositor, but pretty much everyone does it per surface indeed.

What everyone means by "it's like macOS" is that the same "render at ceil(scale) and downscale" general principle is used.
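
As a rough sketch of that principle (my own illustration, not how any particular compositor is implemented), the buffer math in Python looks like this:

    import math

    def buffer_sizes(logical_w, logical_h, scale):
        # Render at the next integer scale, then downscale to the display.
        render_scale = math.ceil(scale)            # e.g. 2 for a 1.5x display
        buf = (logical_w * render_scale, logical_h * render_scale)
        downscale = scale / render_scale           # e.g. 0.75: 2x buffer -> 1.5x output
        return buf, downscale

    # A 1280x800 logical surface on a 1.5x display:
    print(buffer_sizes(1280, 800, 1.5))  # ((2560, 1600), 0.75)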


And that is an ok solution as well - something Wayland or other Linux compositors can't do, forcing you to be stuck in either the "too tiny" or "too large" uncanny valley of UIs.


Because Linux and Windows users ask for different things than Mac users.

With a 2560x1440 display, you won't get a fullhd@2x-sized desktop (scale 133%) on Apple. That is something expected on Linux or Windows, but Apple will give you at most 1680x1050@2x (152%; they're 16:10, so the height differs) here.

It is compounded by the issue that Apple traditionally used 72 dpi for the @1x scale, while Windows and Linux used 96 dpi. Apple apps look quite good at lower resolutions, while Linux (ok, GNOME) dug itself into a hole and looks good at dpis much higher than 96. For Apple, it is acceptable to run 2880x1800 or 3360x2100 on a 2560x1600 display, but when GNOME looks passable at 1600x900@1x, good at 1920x1080@1x, and you've got 2560x1440, so you need to display a 4K (1920x1080@2x) logical resolution on that... that's more "interesting".


Pretty much all Wayland compositors support fractional scaling with independent scales for different monitors.


Drawing it is possible; it would get approximated, and the pixels are so small that the user wouldn't notice the difference.

But then hit-testing it with mouse pointer is another level.


That's not how scaling works. Click the reply button on this comment, then when you see the form press ctrl++ and ctrl+- to increase and decrease zoom. The borders of the input/submit fields do not get blurry, despite being 1px thick.


That's exactly how desktop scaling works.

Browsers have luxuries that a desktop compositor with desktop apps does not have.


I see your point. Mine is: if a browser can do that, GTK and Qt applications can do that too. Of course the compositor works on a different layer, but it can still resize non-compliant applications.


And we hit the typical problem with software evolution.

The apps would have to cooperate, and that would require a rewrite, in many cases re-architecting them.

Who is going to do that? Nobody. We need to live with the legacy that we have. For how resizing non-compliant applications works (and looks), see the experimental fractional scaling support in Mutter.


Most if not all Windows applications can be resized properly without being updated. Remember that it's the widget toolkit (Win32, GTK, Qt, etc.) that does the resizing, not the application.

The app tells the toolkit to draw a button at 10,10 with a size of 100x20. The toolkit multiplies everything by 1.5 and then draws the button. The app doesn't need to know that this is happening at all. Of course this is not possible with some applications that do weird things, but for the 99% this works, and the 1% can just be resized by the compositor. That's exactly what Windows does.
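
A hypothetical toolkit-side version of that button example, sketched in Python (names and numbers are made up for illustration):

    class Toolkit:
        # Pretend widget toolkit that hides the display scale from the app.
        def __init__(self, scale):
            self.scale = scale

        def draw_button(self, x, y, w, h):
            # The app works in logical units; the toolkit converts to device
            # pixels. Rounding here is where fractional scales get lossy.
            px = [round(v * self.scale) for v in (x, y, w, h)]
            print("button at device pixels:", px)

    Toolkit(1.5).draw_button(10, 10, 100, 20)  # -> [15, 15, 150, 30]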


In the Linux world, you cannot rely on the widget toolkit doing the heavy lifting. The API is defined at the socket protocol level, not at the level of library symbols, so you have to handle it there. The old xeyes or xfontsel, or other Athena, Motif, OpenLook, FLTK or whatever forgotten apps that conform to the X11 protocol, must be handled too.

Scaling at the compositor level is the easy part: Wayland compositors have a per-surface flag for how a surface is supposed to be scaled in the composition, and they are also aware of the scaling needed by each display. The issue is that X11 clients - unlike Wayland clients - do not announce what they are capable of. As you noted, both Windows and macOS can fix this in the framework, where Linux cannot. macOS even took the shotgun approach and required a flag in the executable header to announce scaling capability. Such an approach would not fly in the Linux world.

But Windows is not a garden of roses either; I still see a fair share of blurry dialogs (until recently, even the VSCode installer/updater).


For what it's worth, GNOME is very usable on a 1440p screen if one sets Font Scaling to 1.6 (in Tweak Tools) and leaves the general scaling at factor 1. The result is that the fonts have the right size, while GUI elements are a bit smaller than usual (which is actually quite nice because it saves space). Maybe that's why this hasn't been fixed properly for so long.
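
For reference, the Tweaks font-scaling knob appears to map to a GSettings key, so the same effect can be scripted; a minimal PyGObject sketch, assuming the org.gnome.desktop.interface text-scaling-factor key:

    from gi.repository import Gio

    # Scale fonts to 1.6x while leaving the widget/window scale at 1.
    settings = Gio.Settings.new("org.gnome.desktop.interface")
    settings.set_double("text-scaling-factor", 1.6)
    Gio.Settings.sync()  # flush pending writes to dconf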


Uh, yes it does (since 3.26)? It's experimental and requires Wayland:

https://www.omgubuntu.co.uk/2017/09/enable-fractional-scalin...

But it does work. I personally think it looks way worse than just going to 1x (or 2x) and bumping up (or down) the font size.


That's because it assumes 96 dpi for every X11 client and scales it up, even for non-fractional scaling (i.e. @2x), if the experimental support is enabled. There is supposedly work being done to address the problem by using two Xwayland servers: one exposing the real dpi for dpi-aware clients, and one with a fixed dpi for all the rest.

Unfortunately, this is highlighted by the fact that all current Linux browsers are X11 clients and not Wayland clients, so using it would mean watching the upscaling ugliness on your display every day.


I'm not trying to imply the situation is perfect, but it's way more workable than you are letting on.

In the GNOME world, you are stuck with GTK3 apps if you want fractional scaling that looks alright, but there are options for web browsers.

* Epiphany has supported Wayland since the time of the dinosaurs. And I know it's not a perfect browser, but it (1) looks native, (2) starts up in a second, (3) has libva support for hardware-accelerated video, and (4) has a built-in ad blocker.

As for mainstream browsers:

* Developer builds of Firefox support Wayland straight from Mozilla: https://www.phoronix.com/scan.php?page=news_item&px=Firefox-...

* Most big-name distros have some blessed (like Fedora) or less blessed (AUR/PPA) builds of the latest stable version of Firefox with baked-in Wayland support.


I admit that I haven't looked at Epiphany for a long time, because at the time I did (~2 years ago), it didn't even support SPNEGO/GSSAPI.

The Fedora Wayland Firefox package is experimental for a reason; there are bugs that would be quite embarrassing for a default browser.


Epiphany, for all its virtues, is simply not that fast of a browser compared to Firefox or Chrome. If I ever find myself with enough free time to dedicate to an OSS project, it will be Epiphany. Never has there been such a diamond in the rough.

It's very serviceable if you are on a fast computer, and hardware accelerated video in a linux web browser is such a treat, but I bet any given release of the big boys has more patches in it than a year's worth of epiphany builds.

As for firefox-wayland, I've seen that fat bug tracker, and it doesn't have WebRender or many of the fancier performance features working, but it does run well. I've been bouncing between the two browsers for a few months now and there don't seem to be any showstopping issues. It's just not as shiny as the real-deal Firefox.


Wayland Firefox: for me, it doesn't work at all with scaling, and on another machine with no scaling, opening a new window (in the sense of a new Wayland surface, so that includes dialogs) could take 30-60 seconds, during which Firefox would be completely unresponsive (this was fixed in v65 yesterday).

Hw-accelerated video: some distributions (Fedora, SUSE, Arch?) have patched Chromium with VA-API support. On some GPUs the allow_rgb10_configs option raises its ugly head, though. It also works only in native X11, not in Xwayland. Other than that, it is the only way to watch YouTube during the more intensive compiles :)


I'm not sure what made it into v65, but Nightly works pretty well with different scale monitors :)


Yes, v65 as packaged by Fedora finally works!


Monitor/laptop makers don't help either by introducing weird size/resolution combinations. The only reasonable company seems to be Apple, with 220 dpi displays that are perfect for 2x scaling.


Even Apple configures their new laptops with non-2x scaling by default.


Yes, but you only get reasonable resolutions that look good on the given hardware, not any random choice. Apple doesn't have to solve how to display a 1920x1080 desktop on a 2560x1440 display (i.e. @1.333x scale), because they don't offer that choice.


I use 1.5 scaling in KDE and it mostly works fine except for a bug where horizontal lines appear in the terminal.


Same, though I found that it doesn't appear in xfce4-terminal.


After seeing fractional scaling in Windows, I'd rather just not. 150% was really, really bad on my 1440p screen. Fonts were probably the most horrible I've seen in my life, even for Windows. I just turned scaling off and increased font size where I could.


No, because real developers use CLI apps without fancy GUIs. /s


If you think the support for fractional scaling is bad, wait until you see the support for having monitors with different DPI settings.

Linux on the desktop is death by a thousand cuts. I would argue that it's getting worse and worse, because new technologies keep being introduced (scaling, GPU switching like Optimus, touch, good desktop composition, etc.) and they are never well implemented. 15 years ago the biggest problem you could have with Linux was having to recompile your kernel to make your sound work, or something like that.


Any current Wayland compositor handles fractional scaling, different-scale displays, touch, and good composition very well. (I don't know about Optimus; I don't buy systems with Nvidia or switchable graphics.)


The Bumblebee tool supports Nvidia Optimus graphics, even without a proprietary driver.


One request - could you guys help out the Teamviewer and Anydesk folks? Their products still don't work on Wayland. There have been zillions of threads and questions on Stack Overflow about this. It would be a godsend.


In all likelihood, work will have to be done for each and every possible Wayland environment unless everyone can agree on standards.

The logical solution is to use X for now.


I thought xdg-desktop-portal was the agreed on standard for that already?

https://old.reddit.com/r/kde/comments/9i7p8e/complete_waylan...


It should be prioritized more, then. At least for screen capture on Wayland there seems to be a common proposal to use PipeWire.


There are already ways to do this on Wayland, e.g. xdg-desktop-portal. It's just that those clients don't support it yet.

This is more of a client bug than a Mutter bug.
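
For illustration, the simplest of those portal interfaces can be driven from a few lines of Python over D-Bus; a sketch, assuming PyGObject and the org.freedesktop.portal.Screenshot interface (a remote-desktop app would instead go through the ScreenCast/RemoteDesktop portals plus PipeWire):

    from gi.repository import Gio, GLib

    bus = Gio.bus_get_sync(Gio.BusType.SESSION, None)
    portal = Gio.DBusProxy.new_sync(
        bus, Gio.DBusProxyFlags.NONE, None,
        "org.freedesktop.portal.Desktop",
        "/org/freedesktop/portal/desktop",
        "org.freedesktop.portal.Screenshot",
        None)

    # Ask the portal for a screenshot; the image URI arrives asynchronously
    # on the returned Request object's Response signal (not handled here).
    request = portal.call_sync(
        "Screenshot",
        GLib.Variant("(sa{sv})", ("", {})),  # empty parent window, default options
        Gio.DBusCallFlags.NONE, -1, None)
    print("portal request:", request)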


> Thanks to the fantastic work by Jasper St. Pierre, GJS now supports overriding virtual functions.

It's very unfortunate Jasper is no longer participating in the GNOME community. I'm hopeful he'll one day return.


Heh, I had no idea I attracted such a wide audience. I'm still around!

These days I'm a professional graphics programmer -- I worked on a few games, fell in love with that, and I'm currently trying to figure out what's next. I don't know if Linux and GNOME are the right communities for me anymore for quite a few reasons.

As always, if people have questions about the work I did, or just want to chat, please feel free to reach out -- my email is in my git commits :)


Interesting! Are those low frame rates visible in the graph constrained by energy saving/minimizing work, or caused by performance issues?





