A few HiDPI tricks for Linux (yossarian.net)
139 points by woodruffw on Jan 30, 2021 | 189 comments



Credit where credit is due: this is the one area where Windows is the best OS these days. It will not only do fractional scaling, it will do display-independent fractional scaling - whack in the scaling percentage on each screen, and in recent versions of Windows 10 it works well, looks great, and remembers your settings when you plug that monitor back in. It is perhaps THE thing, along with WSL2 (now giving me a ‘real’ Ubuntu command line as well as a Docker/Kubernetes experience where disks are mounted on both sides and things work seamlessly via VS Code), that made me choose Windows over the Mac and Linux this time around.


This also works well on GNOME Wayland (and probably other Wayland compositors), and has for about 1.5 years now (for me at least). In the display settings you can pick a scale per monitor, and almost all apps will respect it. (It seems some apps with custom toolkits are lagging, but I can't remember the last time I ran into one of them.)


I got a 4K display for my Linux machine in 2017; Fedora+Gnome+Wayland worked fine at 200% scale back then.

That's 4 years. After those 4 years, there are people still trying to make their X11 square peg fit into the HiDPI round hole.

X11 is dead. Nobody is going to retrofit HiDPI into it as it is in Wayland. In another 4 years, it is still going to be as annoying as it is today to fix up X11 desktops and apps; maybe even more so.


> worked fine at 200% scale

My X11 setup also worked perfectly fine back then, at 200%. Maybe even earlier, 2016 or so.

The caveat is, of course, 200%. Try 150% and it is really a pain in the ass.


> Try 150% and it is really a pain in the ass.

Yes and no.

Wayland clients work fine with 150%, or 125%, or whatever scaling. X11 clients don't; they are blurry, as they are upscaled from @1X scale.

In the past, I considered that a problem, but I no longer do. You see, 20 years ago, one of the great things Apple did with their migration to a Unix-like system was to make X11 clients second-class citizens, beyond any doubt. You had to run the X11 server manually, and later they even stopped shipping XQuartz with the system. It was a message that, while X11 apps worked, if you were the author, you should really make your apps native.

The blurriness of X clients is a similar message here. You want it sharp? Ask its author for a native client. Dragging their feet won't solve the problem. Chrome, and by extension Electron, have been "working" on Wayland support since, what, 2016? For 5 years? They obviously do not see any need for it. The blurriness is a kick to make them see the need.

Similarly, JetBrains. Recently, there was an article about how they partnered with Azul to rework their JIT for the M1. The M1 was a product that wasn't on the market yet, but they were already working on supporting it. On the other hand, they haven't managed to introduce Wayland support over the last few years. The difference is that they know the M1 is inevitable, but they can afford to delay Wayland; the users will excuse it, will look for hacks on their own, and JetBrains won't be blamed anyway.

So if anything, the problem is the too-nice XWayland integration, and Ubuntu delaying the switch to a default Wayland session, giving ISVs time to ignore Wayland support for another couple of years. Distributions have to signal inevitable changes; the problems that appear from not doing so will be their own, and solving them later will be harder.


Been using Ubuntu MATE on X11 for five years on two 4k monitors, and I have to correct posts like this once every few days.


Works fine for me and always has, since my SXGA+ ThinkPad X61T. Currently running KDE Plasma on a WQHD screen and everything native (Qt and GTK) is great; only WINE programs are too small.


Electron is the biggest offender these days. I saw that they added support a few months ago, but it seems the apps haven't turned it on yet.
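
If the app bundles a recent enough Electron/Chromium, you can often try the Ozone flags yourself (flag names as of Chromium ~87; whether an app forwards them is hit or miss, and "some-electron-app" is just a placeholder):

  google-chrome --enable-features=UseOzonePlatform --ozone-platform=wayland
  some-electron-app --enable-features=UseOzonePlatform --ozone-platform=wayland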


The good thing about Electron apps is they all usually accept ctrl+shift+plus to quickly zoom in the whole application, not just the text. Just like zooming any page in the browser.

The bad thing is it applies to all windows of that app, so you can't mix big and small vscode if you put one window on the laptop's HD screen and one window on a 4k external monitor.


You can do it on X11 using xrandr (crappy, but it works).
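
For the curious, a minimal sketch of the usual xrandr trick (output names are examples; run plain xrandr to see yours). The idea is to let everything render at 2x (e.g. Xft.dpi: 192) and then stretch the low-DPI monitor so the 2x-drawn UI looks normal-sized on it:

  xrandr --output eDP-1 --auto \
         --output HDMI-1 --auto --scale 2x2 --right-of eDP-1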


In my experience with Windows 10, it depends on the app. A lot of apps are blurry. Even some of the ones shipped with the OS, made by Microsoft, not to mention third-party apps. Linux with KDE works better than Windows here[1].

I've also read the opposite opinion about macOS - namely, that it's the platform that does it best. I've ordered my first Mac (mini). It's arriving in 2 weeks. Will see then.

Overall, poor HiDPI support on Windows is one of the main reasons why I decided that I don't want to switch to Windows as my main driver and instead try macOS.

[1]: GNOME on the other hand is terrible. It's possible to get i3 to work with HiDPI, but there's no end to the work you need to perform to do that.


Yep. Windows 10 is fantastic in this regard.

MacOS on the other hand sucks. Basically the recommendation is to run at a lower, blurry resolution if you want bigger buttons and text...

At the same time, iOS has great scaling and text sizing options, and macOS’ interface is so abstract that it should be very easy to vectorize it, enabling seamless scaling to any size wanted.

Really baffling.


Huh? I have a 4k TV I've been using connected to my gaming computer for a while running Windows 10. I've had no end of issues with various programs becoming horribly confused by display scaling. Some applications ignore the display scaling entirely and end up tiny, and other applications (particularly some games) end up with 4x-size UI elements - which looks utterly awful. It's pretty clear lots of game developers never test with HiDPI displays.

In comparison I've never had a problem on macOS. All macOS applications I run seem to render correctly no matter what resolution and DPI settings my various displays use. And I can move most applications between displays just fine - with no pop-in or strange behaviour as the resolution of the containing monitor changes. (Except for some web browsers - which detect the transition and re-render content at the new resolution.)


If you restrict yourself to Google Chrome, Chromium-based Edge, apps using Electron and apps recently written by Microsoft (Settings, Timer, image viewer, video player, etc), then display scaling is very slick on Windows 10. (I didn't try with multiple monitors, but I did use the Settings app to change the scaling factor frequently.)

On OSX in contrast, if you change the display resolution in System Preferences, most apps do not automatically adapt, with the result that you have to close their windows and re-open them before your system is usable. And at non-integral scaling factors, everything is blurry, at least on the non-retina displays I tried. In contrast, there is zero blurriness in the apps I just listed in Windows 10, regardless of which scale factor is chosen in the Settings app. On a 1080p monitor, the choices are 100%, 125%, 150% and 175%.

Gnome also lets you choose the scale factor in increments of 25%, and in a few months, after more apps have been adapted to work "directly" with Wayland (without the intervention of XWayland), it promises to be as slick as Windows 10 currently is. (In fact, I personally prefer Gnome, because on Windows changing the scale factor by 25% can result in the stems and lines of the letters abruptly becoming twice as thick, but I vastly prefer either to MacOS's blurriness when the display is not being run at its native resolution. Then again, I had the luxury of being very choosy about which apps I used on Windows, relying mainly on a web browser and vscode.)


OSX scaling is implemented such that non-int scaling makes pixel-perfect rendering literally impossible. (Hello moire!)

Windows has pretty much perfect handling of non-int scaling, though it's on apps to do the right thing. Modern browsers (including Firefox) handle this great, though other apps may or may not.


This is only true for 4k monitors with MacOS. Anything under that looks terrible these days as it won't offer the HiDPI modes on those.


I'm sorry, what? I've used Windows 10 scaling and it looks horrible compared to macOS's native 2x HiDPI scaling. The computers are made for HiDPI, and it looks very natural compared to Windows, which doesn't scale all elements by the same amount.


I use macOS on a good old 24” 1080p monitor. It’s either “tiny text town” or “blurry shit town”.


24" 1080p is not hidpi; it is normal @1X scale.

If you have it blurry, it is because you upscale @1X.


Ok please help me then.

I have a 1080p screen, and the default rendering of macOS is too small for me.

How can I get the UI to be pixel perfect and bigger without setting a lower than native resolution?

On Windows I do this by setting the scaling factor to 125%.


When you switch to a lower resolution, the operating system cannot do anything; whatever you see is either your GPU's or your display scaler's fault.

It will not be pixel perfect, but: in the display control panel, under the "Customized" group, you should have a "larger text" option as the first one. This will keep the native display resolution, and the description will say that "it will look like XY resolution".

However, I don't currently have a @1X monitor available to test it; it works with a @2X display, which might not be the same - there it works by rendering at a higher resolution, which is how @2X normally works. But it's worth a try.


That “larger text” option is the same: it scales down to a non-native resolution (except on the default setting). But because the pixels are so small, you notice it less.

So I think we are both aware of what Windows and macOS can do.

And for me it’s clear Windows’ solution is superior, as it can scale the interface almost arbitrarily, pixel-perfect, on any screen.


This is only true for 4k monitors on MacOS - and I often run into non-4K monitors in my work hotdesking situation (though admittedly I mainly WFH at the moment). And Windows solved the problem of not scaling elements by the same amount several years ago.


Yep - all our screens at work are not 4K but 2K (half of them being the ultra-wide curved ones), and I find that the MacBooks look just awful on them, as macOS doesn't see them as 'retina'. It was really strange at first, as I always remembered Macs having great font rendering even when displays were much lower res than that, but I can't stand how it looks on them. Having Windows scale to 150% on them just looks so much better. I have a 4K 15" ThinkPad and a 4K 27" external at home and, even though they are the same resolution, they are such different sizes that I like to scale them differently, to make things closer to the same size when I drag windows back and forth - but only when docked. At other times I scale at a lower percentage to give me more real estate. With Windows I can do this on the fly these days, without even logging off, and everything looks great and 'just works'.


My MacBook powering my 4K monitor doesn’t look blurry at all, and I have it set to “2560x1440” scaling in the display settings.

It does however run a tiny bit slower due to that non-integer retina scaling.


It does look surprisingly good, but it's still blurry, unless they changed it this year. Alternating-color pixel grids always moire there.


I don’t believe they’ve changed it, but tbqh I swap back and forth daily from my MacBook to my gaming PC over the same DisplayPort connection, both set at 150% display scaling (basically), and from where I’m sitting I can’t tell the difference — and there are a heap more UI elements on Windows (various icons, apps, etc) that look way, way worse due to not being compatible, or not having hi-res assets, perhaps?

For me it’s a bit of a wash day to day


When I heard what macOS did for fractional scaling, I said “wait, they can’t actually be doing that, surely? I mean, that’s obviously a stupidly bad way of doing it!” And even if the proper solution required more work for individual apps (it’d be only a tiny bit at most), Apple’s in a better position than Microsoft to require that their developers get it right.

I’ve still not actually seen it for myself, and even having heard from multiple sources that yes, it really does work that way, I still find it hard to believe. It’s just… “here, let’s put higher-resolution displays in our laptops, but then scale things so badly that somehow it manages to be worse than a lower-resolution 1920×1080 panel would have been”. Snatching defeat from the jaws of victory.

I also find it utterly baffling.


Have you even used macOS on a HiDPI display? I use this every day on several setups and it shines at handling this. Windows, while it can do HiDPI as well, is basically a turd in terms of user friendliness and experience.


I’m not talking about HiDPI. I’m talking about people who like all UI elements and text a little bigger.


Hold the Option key while choosing the desired "size" in Screen settings. It gives you more options, and there's a solution for everyone. I think my resolution is 2640x something on my 4k. Crisp and large enough, although some would find it too small.


I’m not saying it’s a lack of resolutions to choose from.

Let’s say a default button is 100px wide with 10pt font size and I have a 1920x1080 screen.

If I choose “125%” on Windows, it will render the button 125px wide with a 12.5pt font. With the caveat that I can render fewer buttons on my screen.

On macOS I just need to pick a lower resolution, and then, after all UI elements are rendered, it is scaled up to 1920x1080. This results in a scaled-up, not-pixel-perfect, and thus blurry picture.

So Windows renders bigger UI elements at native resolution. macOS always renders a button 100px wide, you can just pick a lower resolution on a high res screen so those 100px seem bigger.

See also https://en.m.wikipedia.org/wiki/Resolution_independence


You have a good point with non-HiDPI screens like a 1080p screen, but I guess that's falling out of fashion anyway. With 4k screens the problem does not exist. The Windows solution is certainly an engineer's approach, but overall it's not necessarily the best when it comes to accessibility. Look at macOS accessibility settings - it's comprehensive.


Your positive experiences with this do not mirror mine. I have one 4K main display at 200% scaling, and an old 19" second screen I use for IRC and music playback at 100% scale. Every time I wake my machine, the windows from the second screen are shuffled around to the main screen, and windows on the main screen are resized and moved around. It absolutely drives me up the wall, and it's clear that this configuration has never been tested by anyone at Microsoft.


Don't worry, that bug isn't caused by scaling.


It still fails in many situations. For example, switching between RDP sessions on standard and HiDPI displays will randomly get the desktop stuck in the wrong mode until you change the resolution, or do one of the actions that seem to trigger a refresh of that setting. I also get random switching when starting a Lenovo laptop: occasionally it will think it's not HiDPI at startup. At best I'd give Windows a "least broken" mark.


Wayland has a proper protocol for fractional per-output scaling: https://wayland-book.com/surfaces-in-depth/hidpi.html. While there are many reasons why I prefer Wayland over X, this is the biggest.

X only allows setting a DPI per "display", but not per screen (multiple monitor setups where you can drag windows between screens use one X "display"). So you have to resort to these hacks.
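
For reference, the classic global hack looks like this (a sketch; 144 is just an example value, and it applies to every monitor at once):

  # ~/.Xresources
  Xft.dpi: 144

  # apply it
  xrdb -merge ~/.Xresources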


The protocol you linked to only supports integer scaling.


Some other resources that may be helpful for anyone looking are (of course) the Arch wiki [0]. I also found it helpful to set some things like DPI based on the machine (I share my dotfiles across my computers), which can be done pretty easily using e.g. the hostname [1]. (Note that GitHub doesn't perfectly render the org file; you'd want to see the raw file for details if interested in the literal dotfiles.)

[0] https://wiki.archlinux.org/index.php/HiDPI

[1] https://github.com/podiki/dot.me/tree/master/x11
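
As a rough sketch of the hostname approach (hypothetical hostnames and values; my actual dotfiles differ):

  # in ~/.xinitrc, before the window manager starts
  case "$(hostname)" in
    hidpi-laptop) echo "Xft.dpi: 192" | xrdb -merge ;;
    *)            echo "Xft.dpi: 96"  | xrdb -merge ;;
  esac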


Also, with XSETTINGS it is possible for most settings to be dynamic, which is useful on a laptop where you plug and unplug screens. That's what Gnome and XFCE are using. With Awesome WM, I am using a script [1] to update XSETTINGS (through the xsettingsd configuration file). I explain more on my blog [2].

[1] https://github.com/vincentbernat/awesome-configuration/blob/... [2] https://vincent.bernat.ch/en/blog/2018-4k-hidpi-dual-screen-...
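
To give an idea, an xsettingsd.conf for a 2x screen might look like this (values illustrative; XSETTINGS encodes Xft/DPI as dpi * 1024):

  # ~/.config/xsettingsd/xsettingsd.conf
  # 192 dpi * 1024
  Xft/DPI 196608
  Gdk/WindowScalingFactor 2
  # base 96 dpi * 1024, paired with the scaling factor above
  Gdk/UnscaledDPI 98304

xsettingsd rereads the file on SIGHUP (killall -HUP xsettingsd), which is what makes the plug/unplug script approach work.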


Nice, thanks for sharing! At most I've also hooked up a projector to my computer, but never needed to deal with any setting changes really (just for gaming or rarely for a presentation).

A little off topic, but I've run into some weird issues with DisplayPort/Nvidia/Dell monitors where the monitor is not recognized on unplug/replug. In the past I know there were some issues with older Nvidia drivers and this particular monitor (P2715Q), though this seems like a weird problem that I've seen others have with different configurations online. It led to me needing a weird set of steps (I wrote about this in my Valve Index impressions [0]), using xrandr to add the display mode and switching to a virtual console too. I'm guessing this is mostly due to the older Dell monitor, which has always been a bit wonky with DisplayPort. I assume things are better these days more broadly.

Relevant part of [0]:

  xrandr --newmode "4K" 533.250 3840 3888 3920 4000 2160 2163 2168 2222 +hsync -vsync
  xrandr --addmode DP-0 4K
  xrandr --output DP-0 --mode 4K
  # switch to virtual terminal, i.e. Ctrl-Alt-F2
  # switch to DisplayPort input on monitor which should display the terminal
  env DISPLAY=:0 xset dpms force off; sleep 5; env DISPLAY=:0 xset dpms force on
  # switch back to X, i.e. Ctrl-Alt-F7
The modeline to add can be found ahead of time with xrandr --verbose or using a tool like cvt or gtf (see the Arch wiki). Sometimes, though, I just need to power cycle the monitor or unplug and replug some cables. I hope this is just this older hardware, and probably solved, like everything else here, with a new GPU that has more DisplayPort outputs.

[0] https://boilingsteam.com/the-valve-index-on-linux-on-a-min-s...
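
For reference, the reduced-blanking variant is what produces a modeline like the one above; something like this, though the exact numbers may differ slightly from my hand-tuned one:

  $ cvt -r 3840 2160 60
  Modeline "3840x2160R"  533.00  3840 3888 3920 4000  2160 2163 2168 2222 +hsync -vsync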


To add to the hardware being mentioned in this discussion, for reference: I use a 27" 4k monitor (Dell P2715Q) at 1.5x scaling and a laptop with a 15" 4k screen (Dell XPS 15 9550) at 2x scaling. Both on Arch, X11, and just a window manager (currently Xmonad, but just before that StumpWM and i3, too). One of the reasons is that I prefer to have direct control over Xresources, xinitrc, etc. to set up DPI, color management, and so on.


I set up Linux on a Dell XPS 13 with the HiDPI screen. With both Ubuntu and Manjaro it was relatively painless, especially using GNOME via Wayland. i3 and sway also worked well. I had heard about a lot of issues with Linux and HiDPI, and was worried I would have to return/swap for a FHD screen, but everything has worked almost flawlessly.


I have used a 4k screen for several years now without any issues on GNOME with 200% scaling.

However, I recently bought a ThinkPad T14 with a 1080p screen. With 100% scaling, the fonts and widgets are too small. The ideal scaling seems to be 125% or 150%. GNOME has experimental support for fractional scaling. This works generally OK for Wayland applications, but windows of X11 applications are incredibly blurry, both on the laptop display with 150% scaling and on the external screen with 200% scaling. I think they are rendered at 100% and then scaled up. There are various workarounds for X11 applications, but it seems that none of them really works with e.g. JetBrains IDEs (there are some bug reports about it).

Windows 10, on the other hand, handles fractional scaling and per-screen scaling just fine. It's frustrating, because I highly prefer GNOME and Linux and if all applications supported Wayland, there wouldn't be any major problems.
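
(For anyone who wants to try it, the experimental fractional scaling I mean is mutter's flag; as far as I know it is enabled like this:)

  gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"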


I have a similar problem with a QHD screen where I want 150% scaling. The worst part of using the experimental scaling is that it seems to consume a lot more CPU, and thus power, on my laptop.


Get rid of the 1080, it's 2021.


14" WQHD would have the same problem, you would just switch 125% to 150%.

The ideal would be 3200x2000, but sadly, nobody sells such laptops anymore. Everyone jumps from WQHD to 4k. With 4k, you have exactly the same problem, except instead of 125% you need 250%.


I've had a 15" 4k laptop monitor for five years, and it looks fantastic at 200%. I enabled large fonts.

I'd love a squarer monitor, however; they are making a comeback. Hopefully my next purchase.


> I enabled large fonts.

In Windows, Large Fonts is basically 125% (120 dpi vs the base 96 dpi; double those for @2X HiDPI), except it enlarges only fonts and pushes dialog controls around, but doesn't touch raster assets. It is something that has been part of Windows since the '90s, but in many applications it never worked right, because the developers never knew about the option and never tested it (looking at most Delphi developers here).

Changing scaling to 125% (since Windows 8 and 10) changes font sizes, grid calculations in window controls, and bitmaps; the subtle brokenness of some apps disappears, just to be replaced by another set of subtle brokenness. However, the good news is that the chance of developers testing in this mode is higher than them testing Large Fonts.

I also have a 1080p 14" laptop somewhere; at 100% the display is too tiny, at 125% it is great. I also used to have a 1600x900 14" laptop (T430s) - that one was perfect at 100% (except it was a TN display, which was a different downside). Prior to that, 1440x900 (T400) - that one was a bit too large.


Ubuntu Mate, says 192 dpi currently.


192 is exactly @2X scale... so it probably just means you have 200% scale, and you have probably increased the point size of the system fonts.

Your laptop display is higher density than 192 dpi; at 4k, 192 dpi would be a 23" display.


Yes, it's beautiful. Would never go back to 1080.


>but everything has worked almost flawlessly.

It's because everyone having issues is using X, likely with 2 monitors at different DPIs. On Wayland it's pretty much flawless, unless you use 2 monitors at different scales and load an X application.


> unless you use 2 monitors at different scales and load an X application

Agree 100%, this is the only issue I ran into using Wayland (Sway) with different DPI monitors.

I only want to stress that the impact of this issue is very limited—at least for me. Most of the apps that I use run natively on Wayland and scale perfectly: Firefox (with some extra setup), Telegram Desktop, Alacritty, all GTK and KDE apps.
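
(The "extra setup" for Firefox is, as far as I remember, just opting into the native Wayland backend via an environment variable:)

  MOZ_ENABLE_WAYLAND=1 firefox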

The apps that do get blurry are the ones made by big companies for which the Linux experience is not a first priority (thus they don't care about native Wayland support): Zoom, Chromium, Google Earth, Discord, Steam, Skype. I don't use them much, and when I do, I try to keep them on my big (non-scaled) monitor, and it works fine.


The sad thing is that Google Earth is a Qt app. If they updated the Qt version they ship with, it would be fine. However, Google Earth looks abandoned.


The XPS 13 with 4k is gorgeous. Pair it with an external 4k and everything is seamless. If you go 1080 or whatever for the external monitor, it becomes a pain to switch resolutions and scaling. I briefly had to do that and had a script to adjust things, but PyCharm, for instance, needed to be restarted.

Once I upgraded to a 4k external it's a great setup. I actually have dual 4k monitors now and that XPS can handle it easily. No software issues either. The most I've had to do was adjust the Zoom launcher to be HiDPI.


My only issue is that my mid-sized 4k screen is a bit clunky at 2x UI scaling, but otherwise I don't have an issue until I plug into a standard monitor, where 2x scaling is ridiculous.

But I find 1x scaling too small to enjoy on the QHD screen. Multiple integer scale factors are possible, but it is a bit complex, and when I did it the machine was prone to crashing...

That said I still love it.


I'm using linux mint with fractional scaling on just one of my displays. (The other display is just 1:1). Everything seems to just work fine.

Well, except for an outstanding bug where, after I log in, I get a duplicate, stationary mouse cursor appearing somewhere on the screen that just hovers over any open windows. It's annoying and weird.


This article makes it seem like these steps are necessary for most Linux setups, when in reality you can do per monitor fractional scaling from a GUI as long as you're using GNOME on Wayland. No need to fiddle with per app settings, it works transparently.


Except the author is using X11, as many people will. And what this shows is how many different environments / toolkits / settings there are out there. It would be nice if distributions would do a better job of handling this for you. I've run into the same issues myself, and it's kind of hard to believe that I even had to figure out these DPI configuration settings in 2021.


>It would be nice if distributions would do a better job of handling this for you

They did; it's called Wayland. If you want to use a legacy tool, you shouldn't expect it to work as nicely as the current generation of software.


> If you want to use a legacy tool you shouldn't expect it to work as nice as the current generation software.

I'm not using the "current generation" because it doesn't work as nice as the "legacy" software. Have y'all figured out how to do screen sharing without per-compositor hacks yet?


> Have y'all figured out how to do screen sharing without per-compositor hacks yet?

Yes, it is called PipeWire.


But distros do handle this for you. I installed Arch last year with sway and “it just works”.

Like the OP says, if you want HiDPI fractional scaling, that's a solved problem on Wayland. If you want to stick with X11, then that's never going to work great.


He's using "i3 with a heavily customized userspace" which isn't exactly what most people would use. Gnome on X11 lets you (integer) scale monitors through a GUI at the very least.


Yeah but there’s a very good chance that won’t work with various hardware configurations. I was able to get per-monitor scaling to work only after giving up use of half my GPUs, and still there were problems with several apps. This was on stock Ubuntu; I didn’t get too crazy customizing because I just wanted to find a Linux desktop that worked.


>I was able to get per-monitor scaling to work

X11 is not capable of doing this properly. Wayland is, and as you probably know, Nvidia refuses to support it, which is out of the control of Linux developers. You can get everything working if you use the open-source Nvidia drivers, but then they run slow as shit because Nvidia blocks them from setting the clock speed on the GPU.

Basically every problem falls back on Nvidia, and the Linux desktop people have done everything in their power to make it work.


> Basically every problem falls back on nvidia

So... get an AMD GPU instead?


Yes, either that or put up with the many problems of X


The point is, you don't have to figure out DPI configuration settings in 2021 if you use distributions as they are supposed to be used.

If you insist on your favorite x11 window manager, the integration is on you. No point in complaining then, it is self-inflicted pain.


> distributions as they are supposed to be used.

As much as I like the concept of Wayland, until it's a one-click, always-works, no-caveats option in Ubuntu, it essentially doesn't exist. Ubuntu 20.04, the latest release of the most popular distribution, does not have Wayland enabled by default. It actively discourages you from trying it if you have Nvidia, which again is the most popular graphics card - I'm not sure if one is supposed to skip the proprietary drivers or not with Wayland, but on X11 using Nouveau is not a pleasant experience.


Ubuntu was criticized for that; every time they release LTS without default Wayland session, they delay the solution for another 2 years.

One of the reasons Apple is able to move fast is that they do not fart around with introducing new things and deprecating old ones; if you are not ready for that, it is your problem - stick with the old release. So most vendors get ready for the new things, because they know there's no way to avoid them. In Linux land, we have people putting hacks on top of other hacks just to avoid switching to the new thing, and in this case, the existing Ubuntu LTS releases are part of the problem.

Also, Nvidia is not the most popular GPU in general. It is a gamer card. Most used GPU is Intel. Intel has some 70% share; the rest is split between Nvidia and AMD.


Nvidia is the most popular graphics for gamers; the most popular graphics for general computer users is integrated with an Intel CPU.


That hasn’t been the case for me with Ubuntu or Raspbian.


I am using a Retina display on which everything looks tiny. I struggled with setting up Qt scaling, GTK 3 scaling, icon sizes, font DPI, and resolutions. The problem is there is no single setting which all UI libraries follow, and this makes your life hell.

Even worse is when you connect an external monitor which is not HiDPI, and now everything is bigger than usual.

Instead, scale the whole display; this works perfectly. I have explained it here: https://np.reddit.com/r/linuxquestions/comments/4c2o4p/what_... (except for one hiccup, but it's way better than tuning individual knobs).

I have added the command to /etc/X11/xinit/xinitrc so it scales automatically when I log in:

  xrandr --output eDP1 --scale 0.60x0.60


> Even worse is when you connect an external monitor which is not HiDPI and now everything is bigger than usual.

This has been a real bugbear for me. I'm a big fan of small sharp bitmap terminal fonts, but as my new laptop is now the same resolution as my external monitor I have a choice between comically small on the laptop display or comically huge on the monitor.


What if you want to display a pixel-perfect image? Perhaps this is a rare use case, but it's important for some things, like dithering. Is there any way to display an image where every pixel matches up to a display pixel?


Plenty of art is made assuming crisp lines too.


The question is, how do we fix this? Is there a best setting to use, and if not, who will join the working group to define one?


It's fixed in Wayland.


No, it's not; see my comment [1].

Wayland just punts on the issue by removing the compositor from any decision involving scaling. The "Wayland Way" is for the compositor to relay information (monitor-reported scale, etc.) to the client and have the client figure out how to scale itself correctly.

Unfortunately this means that these HiDPI bugs have to be fixed over and over in myriad apps and toolkits, and software (cough cough chromium) that is so complicated that it cannot use a toolkit.

Some compositors like sway will let you do in-compositor scaling, but this is admittedly a kludge: it produces blurry scaled-up output instead of rendering at the right resolution in the first place. The sway authors acknowledge this; the feature is only there as a "better than nothing" workaround.

[1] https://news.ycombinator.com/item?id=25973875


It would actually be great if Wayland allowed clients to handle fractional HiDPI themselves, but it doesn't, which makes fractional scaling much worse than it needs to be. Wayland defines integer scaling only, and then compositors scale 2x content down to 1.5x or whatever it is you want. This means that if you have an app that's capable of drawing at arbitrary scale, you are still forced to draw at 2x and then be scaled down by the compositor. This is needlessly blurry and a performance penalty. Text can be drawn at arbitrary scale, but is blurry because of this. Browsers can render at any resolution, but pay substantial performance penalties because of this (2x at common fractional scales). Anything that renders images also gets needless downscales, with both quality and performance penalties. It's a baffling decision not to do this client-side, and hopefully it will be fixed as the protocol evolves.


This was discussed at length years ago and it was decided that passing a fractional scale value to the client is not what you want, makes the code much more complicated and doesn't really solve the blurriness. A bit of background here: https://lists.freedesktop.org/archives/wayland-devel/2013-Ma...

To illustrate, I usually use a simple example: How do you render a 20px tall font at scale 1 2/3 without being blurry or without rounding errors? And if you wanted to avoid any additional scaling artifacts or rounding errors, how tall would you make the output buffer, in pixels? What happens when you try to stretch this window across multiple screens?


>To illustrate, I usually use a simple example: How do you render a 20px tall font at scale 1 2/3?

You replace it with a 33px tall font.

This is what Gnome 3.38 actually does in my experience. On the computer on which I am writing this, I have a scale factor of 1.50 set in the Displays pane of Gnome Settings. Google Chrome and vscode show up in the output of xlsclients, which I am told means that they are talking to XWayland (and there are web pages promising that in a few months those 2 apps will be adapted to talk directly to Wayland), and those 2 apps are blurry. Gnome Terminal and Evince (my PDF reader) do not show up in the output of xlsclients, and those 2 apps are not blurry, but their default text size and the size of their UI elements are consistent with everything being scaled by 1.5.
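
(If you want to run the same check, anything xlsclients prints is talking X11, i.e. going through XWayland; Wayland-native apps won't show up:)

  xlsclients | awk '{ print $2 }' | sort -u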


> You replace it with a 33px tall font.

You cannot. You would need a 33.333...px font; you cannot ignore the fractional pixel errors, otherwise you will get odd bugs.


Yeah, well, what Gnome does is: if you ask it for 150% scaling, it actually gives you 147% or 152% scaling to make the numbers come out even.


I don't know how Gnome does that[1], but Apple certainly does, and it allows them to limit compounding of the pixel errors to a group of 8 or 9 pixels. Each new pixel group starts without any compounded error.

[1] Last time I checked, they didn't scale the entire framebuffer as Apple does with the output scaler. It looked like they were scaling surfaces individually on the GPU and then composing the scaled surfaces into a framebuffer with the same dimensions as the display resolution.


> This was discussed at length years ago and it was decided that passing a fractional scale value to the client is not what you want, makes the code much more complicated and doesn't really solve the blurriness.

From your link:

"""

> While rendering at rational factor can't be of better quality, it can

> however be done at a much higher quality than downscaling, faster and with

> less memory usage.

Perhaps, but that benefit diminishes compared to the cons.

"""

Being blurry and losing performance was deemed acceptable for other benefits. Maybe I'm discounting those benefits, but the penalties are very real: 2x performance penalties in performance-sensitive code and blurry scalable content like fonts are real downsides.

Looking further down the thread what I think would be the ideal solution is what is being proposed and doesn't get any response:

https://lists.freedesktop.org/archives/wayland-devel/2013-Ma...

The current solution actually seems much more complex than this, as the protocol needs to actually know about scaling, instead of the scaling factor being just some metadata about how things should be (and have been) rendered, with everything else handled within the client and compositor code.

> How do you render a 20px tall font at scale 1 2/3 without being blurry or without rounding errors?

Fonts are not sized in pixels for the most part. Modern fonts are resolution-independent, and the renderers try hard to fit them to however many pixels you have. Faking them out by telling them "here's a 200x200 box to do your work in" and then later scaling that box down to 150x150 prevents that code from working properly.

> What happens when you try to stretch this window across multiple screens?

For this case you render at the maximum of the two fractional scales and scale down on the screen with the lower one. It's no worse than the current solution for that screen, and better for the one that matches.


The solution in that email is essentially how it works, the only difference is that the scaling factor is forced to be an integer.

>By faking them out and telling them "here's a 200x200 box to do your work" and then later scaling down that box to 150x150 prevents that code from working properly.

The question I was getting at is: how do you write a renderer that does the right thing and works properly when given a box of 133.333333 x 133.333333 pixels? If you say "round in some direction", that is now a policy that must propagate to every bit of the stack that touches input and rendering.

>It's no worse than the current solution for that screen and better for the one that matches.

It actually is much worse for the one that doesn't match, because now you have lost the ability to do pixel-perfect scaling.


It's not how it works at all for fractional scaling, which is the point. And you don't give it a 133.33px box; that's what you're in effect doing now by having the compositor scale the whole screen. You give it an actual 133px box, because, since you're working at native resolution, apps get snapped to pixel boundaries.

> It actually is much worse for the one that doesn't match, because now you have lost the ability to do pixel-perfect scaling.

I don't know what you mean by this. Pixel perfect scaling is precisely what's not possible right now. If you have a 1.3x screen and a 1.5x screen you can't render 1:1 to any of them. With this you could at least have 1:1 in the 1.5x screen and be pixel perfect there.


You can't be pixel perfect in the 1.5x screen if your renderer is scaling correctly and thus is rendering lines at 1.5 pixels wide. The result is the same on the 1.5x display, and on the 1.1x display it's worse, because now you're re-scaling the line that is already anti-aliased, blurring it twice.

If you get a 133px box, you are cutting off pixels at the bottom.

Edit: What you are asking is possible, right now. Just leave your primary display at 1x scale, change the secondary to 11/15 scale, and then turn up the sizing in your applications. It's actually better that nothing is needed in the Wayland protocol to do that.


> You can't be pixel perfect in the 1.5x screen if your renderer is scaling correctly and thus is rendering lines at 1.5 pixels wide.

Pixel-perfect is not the standard. If you're rendering a PDF everything starts out in vector space and there's nothing special about 1x and 2x. It may very well be that 1.5x is the scale at which the PDF line becomes pixel aligned. You just don't know. But the app might know, and that's the point. Asking it to do 2x and then scaling by a factor it's not aware of is strictly worse than giving it all the information to make good decisions.

> If you get a 133px box, you are cutting off pixels at the bottom.

No you're not. You're just sizing the window to integer pixel sizes. Just like every X WM and Wayland compositor does. Just like sway doesn't let you do a 133.333px window in 1x it shouldn't let you do one in 1.5x. But it does, so the whole desktop is not pixel aligned, not even the start of the windows.

> Edit: What you are asking is possible, right now. Just leave your primary display at 1x scale, change the secondary to 11/15 scale, and then turn up the sizing in your applications. It's actually better that nothing is needed in the Wayland protocol to do that.

This is precisely what I do. Increase the font scale in GTK and the interface scale in Firefox. That this gives me better results than the current solution is evidence that it can be done better. Unfortunately it breaks down in mixed-DPI situations.


With the PDF you would see the same results using a filtered scaler. There is unfortunately no way to do this that doesn't break down in mixed-DPI situations, because of the rounding issues. If you want to add this to a Wayland implementation, I won't (and can't) stop you, but I think it's a bad idea. Just from my experience trying to write various scaled interfaces: it becomes impossible to ensure that anything is pixel-aligned after you introduce a non-integer transform in the graph. When the top node in the graph is the screen, that limits a lot what you can do there.

>You're just sizing the window to integer pixel sizes. Just like every X WM and Wayland compositor does.

To be clear, this is if you configured your output scaling to be 2/3. 133px is absolutely not correct there.


> There is unfortunately no way to do this that doesn't break down in mixed-DPI situations because of the rounding issues.

I don't think I've been able to explain it, then. There's strictly less rounding than today this way, because from the point of view of Wayland every window is at native resolution and it's up to the client to scale.

> To be clear, this is if you configured your output scaling to be 2/3. 133px is absolutely not correct there.

I don't understand what you're saying. 133px is correct, just like 134px is correct. The way WMs and compositors work is they split the screen between windows at integer coordinates. That's how it works at 1x and there's no reason to not work like that at 1.5x. They're completely independent issues.


It's not correct because you are not rendering at 2/3 scale anymore, you are rendering at 0.665 or 0.67 scale. What you are describing is rounding, you can't do this without rounding. Worse, with that method the scale technically changes every time you resize the window. This is hard to explain in text on HN and I don't have reference images to illustrate this at the moment, sorry.


> It's not correct because you are not rendering at 2/3 scale anymore, you are rendering at 0.665 or 0.67 scale.

You won't get the same result as rendering the entire screen at the same scale because you're snapping windows to integer borders. But that's the whole point. The app however is scaling at 2/3 within that window box and that's actually what you want because it gives better results. There's nothing special about scaling the whole screen that you can't do by scaling each individual window after first snapping all windows to integer sizes.


If the app is still rendering into that box at 2/3 scale, then it is either cutting off pixels at the bottom or you will get a seam at the bottom.


Only if the app is broken. The rendering engine needs to be able to fill whatever pixel size at whatever scale. If you mean that you can no longer do 2x and scale down at the exact perfect scale, then that's true, but it's only a limitation of that way of rendering. It's also broken for whole-screen scaling, just less noticeably. Firefox and Chrome don't have that issue at all, for example, and neither does anything that renders fonts and vectors. At a guess, this is probably where all this came from: GTK decided integer scaling was all it was going to support, and then everything else was decided as if all clients behaved like that. I doubt even that's a problem - the differences in scale from doing 2x-and-scale-down per app instead of per screen are minuscule.


There is no "non-broken" way that an app can consistently render at a real number scale into an integer sized buffer. Everything is going to have rounding or blurring in some way. This is unavoidable.

If you follow the email I linked a while ago, this was mostly a decision in Weston, not really particularly related to GTK. But if you want to follow the conversation on where Gnome is at with this, see here: https://gitlab.gnome.org/GNOME/mutter/-/issues/478

The end comments there are what I'm getting at, the only reliable way to do this is to just bypass scaling completely and have something where the client says "never scale my buffer at all on any monitor." But of course that has the same issues with mixed DPI and you would only really want to use it for performance-sensitive applications that are not particularly sensitive to scale.


> There is no "non-broken" way that an app can consistently render at a real number scale into an integer sized buffer. Everything is going to have rounding or blurring in some way. This is unavoidable.

This is just not true. There's nothing special about 1x and 2x when you're rendering a font or a vector. 1.5x may very well be the scale at which everything snaps into the pixel grid, because that's how the pt->px conversion happened to play out at that given DPI. Thinking of 1x and 2x rendering as special cases is exactly the problem here. And even for things where you want to pixel-align, if you leave that to the client, some can make more intelligent decisions. There's nothing stopping a UI library from snapping everything to the pixel grid at arbitrary scales by varying the spacing between elements slightly between steps. That's the whole point: the client can do better things in a lot of cases, and while the discussion was about Weston, the Wayland protocol has ended up with this limitation, which makes sense since Weston is the reference implementation. There's currently no way for a client to request the scale factor of the screen and render directly at that factor.

> But of course that has the same issues with mixed DPI and you would only really want to use it for performance-sensitive applications that are not particularly sensitive to scale.

There's no problem with mixed-DPI. If you have a 1x screen and a 1.5x screen render at 1.5x and scale down in the lower screen when spanning two screens. When in one screen exclusively render directly at 1x or 1.5x.


The real serious issues are not really with fonts and vectors, it's when you have a surface within a surface. (In the context of a browser, think flash plugins, JS canvas and the like) Those have to be pixel sized, which means practically you are forced to 1x, 2x, 3x... It is unfortunately a special case that is incredibly common, and it has to be handled. Solving this by snapping everything to the pixel grid is another form of rounding that has its own set of artifacts with elements jittering around when the window gets resized, and it breaks down again if you try to have surfaces within those subsurfaces. (e.g. You decide to display a cached buffer within your flash app or JS canvas) These problems are exacerbated further when you have mixed DPI, it actually makes it much worse, and you still have one screen that is going to be scaled down and blurry.

At some point maybe I'll put together a post to illustrate this, I know it seems easy when you think about it but if you try to implement it you'll see it causes a huge number of problems. They are not unsolvable but the amount of clients that are going to care enough to solve them all is very small. (Fundamentally this cannot really be solved in a browser either for the reasons I described above)

If you want to work on this, please do it, but don't say I didn't warn you that it's a rabbit hole. I've actually done everything you're suggesting before in other projects, and honestly we've really only scratched the surface of why trying to do this kind of scaling is problematic.


I don't get the argument that browsers are an example of how hard this is when browsers are today already completely capable of doing the right thing as clients. And so does Windows by the way. What keeps me from having a fractionally scaled desktop is Wayland. Firefox already works, and so does Chrome.


You already have to propagate code all over to handle integer multiples, don't you?

The email seems to be worried about things happening at the wayland level with fractions, but that's easily solved by rounding entire windows to the nearest pixel.


You do have to propagate the scaling factor, but as long as it's an integer scale, you don't have to convert all your coordinates to floating point everywhere and enforce rounding rules.

I'd also like to point out that if you are rounding in your rendering anywhere, you are now likely either introducing blurriness again, or something is getting scaled incorrectly. The only real way I know to do this without causing artifacts or error build-up is to have the compositor do the rounding only as the last step, which is the way it works now, and you have to use an integer scale for that. As you describe, this is the "easily solved" way, but it's only really easy because the compositor gets to handle it internally -- once you push that problem to clients it gets significantly harder.


Quite the opposite. Once you push the problem to the client a lot of them already know how to do it and by not doing two scaling steps you can make much better decisions. And you don't need coordinates as floating point at all. What you need to do is tell the app "you have a 150x150 window on which to display content at 1.33x scale, just give me back a buffer with those properties". Wayland can work at integer coordinates all the way. The app also draws at integer coordinates it just uses the scaling factor to know what to do. It can even just replicate the 2x and then scale down solution if it doesn't know how to do anything better but plenty of code does know how to do something better and much faster. Browsers, image viewers, 3D renderers, PDF viewers, are all natively able to scale arbitrarily and yet are forced by wayland to draw at 2x and be scaled down.


They aren't forced to draw at 2x and be scaled down. They only need to do that if you have enabled fractional scaling. If you really want to avoid the overdraw associated with that, just don't use any scaling on your primary display and then turn the font size in your client up. Anything else that passes the fractional scale onto the client is going to have issues. The reason you pass the buffer scale in wayland is because you explicitly want the compositor to scale your buffer and you want it to do it correctly. If you don't want that, don't bother with messing with fractional scaling.

>you have a 150x150 window on which to display content at 1.33x scale, just give me back a buffer with those properties

Again, this comes back to the issue where now you need to deal with rounding in the client. Only the simplest browsers, image viewers, 3D renderers, and PDF viewers, ones with no UI chrome, could get away with just passing on a fractionally scaled buffer. And for those, you can do what I described above: disable fractional scaling and do your scaling in the client; you don't need any special support in the compositor. I believe in most applications you can press Ctrl-Plus and Ctrl-Minus :)


> They aren't forced to draw at 2x and be scaled down. They only need to do that if you have enabled fractional scaling.

Fractional scaling is exactly what we're talking about. At 1x/2x everything works.

> If you really want to avoid the overdraw associated with that, just don't use any scaling on your primary display and then turn the font size in your client up.

That's what I do but doesn't work with mixed-DPI. The fact that it works and improves things just shows how this could be done better.

> The reason you pass the buffer scale in wayland is because you explicitly want the compositor to scale your buffer and you want it to do it correctly. If you don't want that, don't bother with messing with fractional scaling.

That's circular. You're arguing that the way things are done is an argument for things to continue to be done like that.

> Only the simplest of browsers, image viewers, 3D renderers, PDF viewers that have no UI chrome could get away with just passing on a fractionally scaled buffer.

Both Firefox and Chrome have their whole UI scalable at arbitrary values and have for many years. See layout.css.devPixelsPerPx in Firefox for example. There's nothing special about the window where the Webpage/PDF/3D gets rendered versus the UI. Vector UIs that can scale arbitrarily not only exist they're widely used. And the worst case is still "if you don't know any better do 2x and scale down yourself client".

> And for those, you can do what I described above, disable fractional scaling and do your scaling in the client, you don't need any special support in the compositor. I believe in most applications you can press Ctrl-Plus and Ctrl-Minus :)

You can't disable fractional scaling on a per-program basis, it wouldn't work at all, and it would still be broken in a mixed DPI setting. I want the Browser/PDF/Image/3D windows on my 1440p screen to have different scaling from my 4K screen and for them to switch scaling automatically, just like they resize automatically, when I switch them between screens. That's currently not possible because there's no support for it in Wayland.


>Fractional scaling is exactly what we're talking about. ... That's circular. You're arguing that the way things are done is an argument for things to continue to be done like that.

Actually no, what you are describing is ideally having no scaling at all in the compositor on any particular monitor (unless, of course, a window is stretched between them). I'm not arguing for things to be done any particular way; I'm saying don't misuse a feature that was designed to do something other than what you're asking for.

For a vector UI that is supposed to be pixel-aligned, you can't render that at a non-integer scale without the same blurriness that you would get with doing the scaling in the compositor. That's different from rendering print content that has no relation to pixels.


> For a vector UI that is supposed to be pixel-aligned, you can't render that at a non-integer scale without the same blurriness that you would get with doing the scaling in the compositor. That's different from rendering print content that has no relation to pixels.

That's one case where the downside is the same, not even worse, and the cases where that's not the case are broken. Wayland pushes a lot of things to the client, so it's surprising that the client can't make this decision when at worst the result has the same problem. And there are a lot of clients that could take advantage of this today. The browsers we are using to have this discussion are made needlessly blurry because of this. They have the whole stack ready for fractional scaling themselves and are forced to 2x and scale down by the compositor.


> I'd also like to point out that if you are rounding in your rendering anywhere, you are now likely either introducing blurriness again, or something is getting scaled incorrectly.

Some things will be blurry, but no worse than having the compositor do it. Other things will not be blurry, and that's a huge huge benefit.

> artifacts or error build-up

Integer multiples risk artifacts too, if the program is actually supposed to be rendering in high resolution. Error build-up I'm not sure should be a big worry when you can have 53 bits of precision for a window a couple thousand pixels across.

> As you describe, this is the "easily solved" way

I'm not sure you understood what I meant. I was saying that when possible the compositor should round the window to the nearest pixel, then have the client render at that resolution, so then the compositor would not do any scaling at all.


>Some things will be blurry, but no worse than having the compositor do it. Other things will not be blurry, and that's a huge huge benefit.

This is the same trade-off as you would get without it.

The precision isn't an issue, the error build-up is with rounding and it becomes particularly problematic when you have subsurfaces.

As I have said in another sibling comment, if you don't want the compositor to do any scaling, then don't enable scaling in the compositor.


> This is the same trade-off as you would get without it.

Using the compositor method makes everything blurry. It's not the same trade-off.

> then don't enable scaling in the compositor.

Turning up the font size as you suggested is not at all a proper solution.

I want to pass fractional scaling on to the program whenever possible. If it can't handle it, then 2x followed by shrinking can be the fallback.


> The "Wayland Way" is for the compoisitor to relay information (monitor-reported scale, etc) to the client, and have the client figure out how to scale itself correctly.

FWIW this is also the "Xorg Way" (RandR provides all the necessary information); it is just that clients do not bother.

Ideally, the window manager would use the information provided by RandR to set up default scale levels for each monitor, and then send notifications to applications whenever a window changes monitor so they can alter their scale levels. (This would also allow a custom per-window scale level applied on top of the per-monitor level, in case someone wants to scale a specific window up or down - e.g. scaling up some notepad-like editor for a screencast.)

This does require each client (and toolkit) to support arbitrary scaling... and window managers to agree on such a message, though I do remember an email about this topic being posted on the Xorg mailing list some time ago. AFAIK Qt should already provide the necessary functionality for this.


> FWIW this is also the "Xorg Way" (RandR provides all the necessary information); it is just that clients do not bother.

One of the differences is that Wayland surfaces have a scale property, so the compositor knows when clients don't bother and can do the scaling for them. Under X11, some clients bother, some don't, and the compositor doesn't know which are which.


Right now there isn't any protocol for doing what I described; however, this information could easily be added. EWMH (or something like it) needs to be extended for scaling support. Applications could do the scaling themselves as they detect their toplevel windows being moved around, but I'm not aware of any application doing that, and it would be fighting the window manager, so it isn't a good idea anyway.

But the X server already provides all the necessary information and mechanism for implementing this, it is the clients that need to use it: the window managers need to implement some way to inform toplevel windows that they need to change scale (e.g. via a custom message), applications need to inform the window manager and the compositor that they support such scaling messages (e.g. via a window attribute) and compositors need to be able to scale windows that do not support this scaling (this will probably need a special window manager <-> compositor protocol for compositors that are independent from the window manager while still allowing the window manager to handle scaling without having to also be a compositor).


> But the X server already provides all the necessary information and mechanism for implementing this,

The X server provides that at the display level, not at the screen level. If you do that at the display level, you cannot move windows between displays; the application has to destroy and recreate them (out of all applications, only Emacs can do that). If you want to use them per screen, as is done today, then the screens must have identical density.

> the window managers need to implement some way to inform toplevel windows that they need to change scale (e.g. via a custom message), applications need to inform the window manager and the compositor that they support such scaling messages (e.g. via a window attribute) and compositors need to be able to scale windows that do not support this scaling (this will probably need a special window manager <-> compositor protocol for compositors that are independent from the window manager while still allowing the window manager to handle scaling without having to also be a compositor).

Defining the protocol would be the easier part; persuading application authors to support it would be the hard part. They would ignore it for years to come. And that's for applications that are maintained or can be updated.

So when you have to make changes, you might as well change the protocol and aim for multiple additional objectives - which is exactly what Wayland did.


What you are describing is not enough; the X server or the X compositor would still need to scale windows up/down if they don't support scaling, or if you want fractional scaling. It's not something an old-style window manager can do -- there is just no real "Xorg way" to do this right now.


This is only needed if a program doesn't support scaling itself, and it is something that a compositor can support. In addition, there is a branch by Keith Packard for server-side window scaling that could also be used for this and work independently of any window manager (though a compositor can also be made to work with any window manager - for example, there are people using a compositor with Window Maker even though Window Maker doesn't support compositing).


>Unfortunately this means that these HiDPI bugs have to be fixed over and over in myriad apps and toolkits

There is nothing that can be done about this from the display server's point of view. Applications and toolkits have to be updated to support high DPI.


Is it intrinsically fixed, or is it up to the compositor to do the right thing? (implying it may be broken in some compositors)


There are only integral buffer scales, so it's entirely up to the compositor or the client to do fractional scaling.

https://wayland-client-d.dpldocs.info/wayland.client.protoco...


I don't know anything about the wayland API. Does this imply that windows can't have a different scaling on each screen they may overlap on? Or can a window be split across different surfaces?


Wayland surfaces (windows) have an integer property that says at what scale they are rendered. The compositor then scales them appropriately for each target display. If a surface spans two displays with different resolutions, it will be scaled correctly on both (up or down, depending on the application; in practice, downscaled on the lower-resolution one).


All I know is that the GNOME settings have a scaling section, and everything respects that scale.


I have dotfiles synced across my various machines, and I run i3 on all of them. My configs for DPI consist entirely of the following:

Xresources:

    Xft.dpi:120
    st.font: monospace:size=14
xsession:

    # haven't had time to dig into this, some systems need
    # DPI set twice
    xSetDpi () { xrandr --dpi "$1"; xrandr --dpi "$1"; }
That function is called in a case statement alongside other machine-specific startup config, but it's just a matter of calling e.g. xSetDpi 144.
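
For the curious, the dispatch is nothing fancy; a minimal sketch (hostnames and values are invented for illustration):

    case "$(hostname)" in
        thinkpad-11) xSetDpi 120 ;;  # 11" HD panel
        xps-12)      xSetDpi 144 ;;  # 12" FHD panel
        *)           xSetDpi 96  ;;  # standard-DPI monitors
    esac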

emacs.d/init.el:

    (set-face-attribute 'default nil :height xxx)
Again, this is per-host, but it's a simple cond expression.

qutebrowser/config.py:

    # even with DPI set, I typically find web text too small
    c.zoom.default = '175%'
I promise you that this does not feel like any significant effort for me. It's ~10 lines across a few config files, each of which I would be maintaining anyway - none of these configs were born for the express purpose of handling DPI or zoom or anything like that.

I value having a system that is configured exactly to my preferences. I recognize that having multiple configs is a turnoff for some. I can't debate personal preference.

I have spent much more time trying to get something like GNOME to behave the way I want than I have on all of my current config. XFCE is a pretty good experience for those who want a DE and to configure scaling through system settings.

Machines:

1. 11" HD, OpenBSD current

2. 12" FHD, Artix Linux

3. 2x27" WQHD, Artix Linux


My settings are:

GTK2 apps do not support HiDPI correctly. oomox (and similar) can be used to generate a scaled GTK2 theme, but it won't solve all your problems.

GTK3 supports HiDPI well. If you use Gnome, Xfce and others, you can configure this via the preferences UI. Otherwise, add this to your .profile file:

    # scale all GTK3 widgets 2x
    export GDK_SCALE=2
    # keep text at its DPI-derived size so fonts aren't scaled twice
    export GDK_DPI_SCALE=0.5
Many Java apps can be scaled by adding an argument...

    # honored by JDK 9+ (Java2D HiDPI support)
    java -Dsun.java2d.uiScale=2
For Qt you can add this to your .profile file:

    # disable Qt's per-screen auto-detection so the explicit factor wins
    export QT_AUTO_SCREEN_SCALE_FACTOR=0
    export QT_SCALE_FACTOR=2

    # For theming, requires qt5ct
    export QT_QPA_PLATFORMTHEME=qt5ct
For Spotify... I use ncspot for listening, and the web version to manage my playlists (although you can also manage them with ncspot). ncspot is a keyboard-friendly, lightweight terminal client.

    https://github.com/hrkfdn/ncspot
I use FreeOffice as my office suite; you can configure HiDPI through the menus and it works perfectly. Same with Steam.


If anybody knows how to forcibly override the "pixels per mm" reported by a monitor, you will be my hero forever.

Every solution I've found so far (including the Qt/GTK environment variables for overriding DPI) ends up causing scaling during compositing instead, rather than getting the apps to rasterize their fonts at the correct size in the first place. The worst offenders are anything derived from Chromium (e.g. qutebrowser, whose developers say the problem is a QtWebEngine bug that they can't fix).

I'm using sway, but almost all the Wayland compositors lack this override-the-monitor-reported-DPI functionality, because (for good reason) their philosophy is to securely multiplex the hardware, not abstract it away. They just pass through whatever values the monitor reports, to the client.

IMHO this does not make sense for physical pixel size, because it disregards the physical distance between the user's face and the pixels. What you really want to size your fonts based on is the number of arc-radians (or arc-degrees) per pixel, measured at the user's eyeball. In order to compute that you need to know both the physical pixel size (which can be queried from the device) and the physical eyeball-screen distance (which will need to be user-configurable).
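
As a rough worked example (assuming a 163 DPI panel viewed from 60 cm; both numbers are only illustrative):

    pixel pitch = 25.4 mm / 163 ≈ 0.156 mm
    angle per pixel ≈ 0.156 mm / 600 mm ≈ 0.00026 rad ≈ 0.9 arc-minutes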


> If anybody knows how to forcibly override the "pixels per mm" reported by a monitor, you will be my hero forever.

The brute-force way to alter everything your monitor reports, is to dump its EDID data, patch it (there are programs for this, or you can resort to a hex editor and the EDID spec), and then tell your kernel to load that patched EDID in place of the real EDID when it recognizes your monitor.

On Linux, that's apparently accomplished by plopping the patched EDID file into /lib/firmware: https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...

(I've never actually done this with Linux, but I've done it in macOS on a Hackintosh, because macOS frequently thinks televisions connected over HDMI should be fed YUV instead of sRGB. You can patch the EDID to force macOS to treat the TV as a monitor.)
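
For the Linux side, a minimal sketch of the flow; the HDMI-A-1 connector and the file names are placeholders:

    # read the monitor's current EDID from sysfs (connector name varies)
    cat /sys/class/drm/card0-HDMI-A-1/edid > edid.bin
    # inspect it, e.g. with the edid-decode tool, then patch as needed
    edid-decode edid.bin
    # install the patched blob where the kernel firmware loader can find it
    sudo mkdir -p /lib/firmware/edid
    sudo cp edid-patched.bin /lib/firmware/edid/
    # and boot with: drm.edid_firmware=HDMI-A-1:edid/edid-patched.bin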


I don't know of any Wayland implementations that let you do that; it wouldn't be hard to hack in, but Wayland applications should not be using that information to decide scaling. They should be using wl_output.scale.

In X11, you can do it with xrandr --fbmm. Technically you can also set this information per-monitor but I don't think a command line option was ever added for it.
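
For example, to make a 3840x2160 framebuffer report dimensions corresponding to roughly 96 DPI (the target value here is just illustrative):

    # 3840 px / (1016 mm / 25.4 mm-per-inch) = 96 DPI
    xrandr --fbmm 1016x572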


> In X11, you can do it with xrandr --fbmm

Yeah, that's basically what I want, but for sway.

> Wayland applications should not be using that information to decide scaling. They should be using the wl_output.scale.

Maybe so, but the physical size is definitely exposed to the clients via the wayland protocol (see below). I'm pretty sure that the QT toolkit and Chromium both use the reported values.

    <event name="geometry">
      <description summary="properties of the output">
    The geometry event describes geometric properties of the output.
    The event is sent when binding to the output object and whenever
    any of the properties change.
    
    The physical size can be set to zero if it doesn't make sense for this
    output (e.g. for projectors or virtual outputs).
      </description>
      <arg name="x" type="int" />
      <arg name="y" type="int" />
      <arg name="physical_width" type="int" />
      <arg name="physical_height" type="int" />
      <arg name="subpixel" type="int" enum="subpixel" />
      <arg name="make" type="string" />
      <arg name="model" type="string" />
      <arg name="transform" type="int" enum="transform" />
    </event>
I will probably end up hacking it into sway, like I've done with other things.

It's just that the swaywm/libwlroots codebase is so utterly illegible. "wlroots is largely documented through comments in the headers. Read 'em." Yeah, those headers with no comments anywhere, egregious amounts of boilerplate code (because C), and naming conventions that take weeks to learn. Argh. I've spent entire days trying to figure out "how do I get a sway_foo pointer from a wlr_foo pointer". It's madness. There's so much typecasting going on (with totally undocumented rules for when said typecasts are safe) that you can't even use the compiler-visible types as a guide for these kinds of questions.


Is there some reason you cannot just set the output scale? This almost certainly already does what you want. If you're talking about X11 applications running on XWayland, I think you can still change the properties there with xrandr.
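
(In sway, for example, that's a single line in ~/.config/sway/config; "DP-1" here is a placeholder for whatever `swaymsg -t get_outputs` reports:)

    # scale this output 2x
    output DP-1 scale 2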


I've definitely tried that, and it has the wrong effect in a lot of cases. The worst is Chromium (via QtWebEngine, via qutebrowser), which seems to ignore the setting.

No, I'm not using XWayland.

I edited my comment above after you posted -- the Wayland protocol unfortunately does expose the screen size in millimeters to the client. I'm pretty sure Chromium and QT use this, which is why I need to lie to them.

Another question: is there any sort of "Wayland socket proxy"? Like a stupid simple program that offers up a $WAYLAND_DISPLAY socket to a client, opens its own to a server, and then passes all results through unmodified? It would be really easy to hack a change like this into a proxy like that, but I can't seem to find one. Sort of surprising; the simplicity of the Wayland protocol makes it seem like such a thing should exist. Sort of like waypipe, but without the cross-network magic.


Qt applications are almost certainly using the output scale property; I've tried those recently. I haven't tested Chromium's Wayland backend recently, but Chromium is ignoring those physical values: https://github.com/chromium/chromium/blob/99314be8152e688baf...

Maybe your Chromium is not using the Wayland backend correctly? If a Wayland application is using the physical values to determine scaling instead of the output scale, that's a bug, and you should probably consider reporting it. It's probably a mistake that the physical size information is there; it's unreliable, as you have seen.

Edit: Rather than messing around with proxies, if you want to do a quick test to confirm, you could simply hack wlroots here to send a constant value: https://github.com/swaywm/wlroots/blob/b3e76d667855c16e97a4d...


I think Wayland compositors are a better option for this. That's where all the modern Linux desktop development happens these days.


I have a 2013 Chromebook Pixel with a 240 DPI screen, and I've also found that the Xft.dpi setting takes care of nearly everything, although there are some interface elements in some applications that still display tiny type. The author seems to be suggesting that 163 DPI is high resolution; is it commonly understood to be so? Because that would look a bit fuzzy to me.


Every screen's density of pixels is different.


No, I once saw two that were the same.


This is not true. Your pixel density is a function of your resolution and dimensions. “HiDPI” is a nebulous term that refers to high resolution screens that are also relatively small.


If DPI means “dots per inch”, then how can the size of the screen come into it?

And what is “not true”?


As the phrase "dots per inch" implies, it's a ratio. Specifically, it's the ratio between your resolution (say, 4K) and the size of your screen.

What is "not true" is that every screen's pixel density is the same. Different screens have different sizes and resolutions, which means they have different pixel densities.


Who said that every screen’s pixel density is the same? What kind of bizarre claim would that be?

DPI is a resolution. It is not the ratio between resolution and size. It already has the units of resolution: 1/length.


Sorry, I badly misread your original comment: I read it as "I saw [a reference saying that] the two were the same," not "I've seen at least two displays with the same density."


Oh, OK. No problem.


You need to go reread the comment you called not true.


My experience with 4K has essentially been that everything renders too small at the display sizes I want to have on my desk - and the fractional scaling support is always quite a bit of work to get right (as evidenced by this post).

I've found that 2k (2560x) displays at 27" give me great real estate without any need to scale.


1440p at 27" with no scaling is perfect for my desktop. I wish it were more popular, because it's a great compromise between quality and performance for gaming.

The relatively common use of "2k" to mean 2560x1440 has always struck me as confusing though, since it's not half as wide as 4k. Words mean whatever people agree they mean, but I feel like "2k" is too ambiguous.


I'd argue that "2K" for 2560px is completely the wrong term. Even "3K" would be a better term.


Not related to HiDPI, but looking at the picture of his monitor setup:

From an ergonomics standpoint (i.e. the health of your neck), the upper edge of your monitor shouldn't be higher than your eye level. Looking straight or looking down is fine.

Thus the monitor on the left is probably not very ergonomic.


This is the recommended default, and I assume it is for good reason, but I would caution against adopting it on faith. If you feel pain, try different positions out. I had the setup you recommend for many years. At first I started getting shoulder pain, and realized that I was stooping over to accommodate when looking down. I forced myself (at regular intervals, and at a modest cost to productivity) to have better posture, and to tilt my neck downward instead. It fixed the shoulder pain, but my neck was uncomfortable. Not in pain, but constantly uncomfortable. I kept with it because I knew this was the "proper" ergonomics.

Then I realized: I spend most of my time staring at the bottom half, or bottom edge, of the screen - typing in a terminal, adding functions at the bottom of a file, etc. I moved my monitor such that my eyes are almost dead-center vertically, and I haven't had any issues since, neither neck nor shoulders. I kept the improved posture as well, so I'm sure that ultimately helped too.

I'm sure the recommended setup is recommended for a reason, and works for most, but hopefully my experience is useful to some.


Author here: you’re absolutely right. My previous vertical monitor was a lot shorter; this one has definitely not been great for my neck. I’ll probably switch it back to horizontal at some point.


What size monitors are those? I use a pair of 24" 4K monitors in roughly the same configuration as yours. 27" would definitely be too tall.

I find having a vertical 24" monitor to be super useful, especially for reading typical PDF files as an entire page fits nicely on the screen.

My vertical monitor is a bit closer to the desk, and the horizontal monitor is much higher than yours, about 7.5" above the desk. The bottom edges are not aligned; the vertical monitor bottom edge is about 5" below the bottom of the horizontal monitor, and its top edge is about 3" above the top of the horizontal monitor.

The top of the horizontal monitor is just below my eye height. So the top of the vertical one is a couple of inches above that, but this is not a problem, as I don't spend much time looking at the very top of that monitor.

Each monitor is tilted separately. The horizontal one is tilted back only slightly; the vertical one tilted more. So the center of each monitor is perpendicular to my line of sight.

At the moment I'm using the two monitors with a Linux desktop, but sometimes I connect them both to a ThinkPad for a three-monitor setup with the ThinkPad's display and keyboard. When I do this, I raise the horizontal monitor up another inch so the ThinkPad display tucks in underneath it.

With your horizontal monitor sitting so close to the desk, I have to wonder if your seating position may be too low? Are your forearms horizontal when typing? (Forgive me if I'm being nosy or presumptuous, just curious.)

One related tip for anyone reading this, regardless of your monitor arrangement: if you wear glasses, do not use progressive lenses at your computer. Instead, get a pair of single vision lenses with a fairly tall aspect ratio so they let you see the entire screen(s) without tilting your head back. Don't ask your optometrist for a "reading" prescription - that tends to be about a 16" distance, and your monitors are probably farther from your eyes than that. Measure the distance from your eyes to the monitors and get single vision lenses made for that distance.


> What size monitors are those? I use a pair of 24" 4K monitors in roughly the same configuration as yours. 27" would definitely be too tall.

They're 24", like yours. It sounds like our setups are actually nearly identical: the top of my horizontal monitor is also just below my eye height, and I spend most of my time looking at the middle of the vertical one. I should probably move them around a bit so I can actually look at the vertical one without turning my head, though.

> With your horizontal monitor sitting so close to the desk, I have to wonder if your seating position may be too low? Are your forearms horizontal when typing? (Forgive me if I'm being nosy or presumptuous, just curious.)

I appreciate it! I think my seating is right, at least based on what I was taught about ergonomics -- my forearms are horizontal and I don't have to pivot my wrists up or down while typing or moving my mouse.


Actually, I prefer looking up and down over looking left and right. So for me it is normal to have the upper monitor way above my eye level.


HiDPI support on Linux is terrible. After struggling to make it work consistently across various applications on my Dell XPS 13 with a HiDPI screen, I consciously chose my next laptop to have just an HD screen. I am so happy, as everything finally works normally.


Nope, I've had two 4k screens under Ubuntu Mate for five years. It's literally one checkbox in the control panel.


I tried Ubuntu + i3 on a Lenovo X1 Carbon. The fonts, the settings, the software, the differences and incompatibilities every day just killed me. I went back to Mac + the Magnet window manager. I could never tell whether I was actually getting the setting I had configured, or whether a font was being displayed correctly. It was a constant study of web searches and esoteric knowledge, such as evident in this blog, to get whatever you wanted - things that just worked on a Mac or even Windows.


To provide some counter-anecdata, if you are in the lucky position to have screens that work at exactly 200% (4K at 24", 5K at 27"), then Ubuntu and its derivatives will Just Work out of the box. At least that was my experience around 2017-2019. Fractional scaling factors like 150% are tricky, but macOS does not properly support them either.

The only app that didn't pick up my DPI settings on Linux was Spotify, but all web-ish apps support ctrl +/-...which honestly strikes me as a big usability win over native toolkits.

I actually found my 5K screen more enjoyable under Linux than macOS, because the performance of JetBrains IDEs on Retina Macs seems to flip-flop between okay and abysmal every year.

Mixed-DPI setups are where macOS is truly superior.


macOS supports fractional scaling just fine, by rounding up and then downscaling. Sure it's not pixel-perfect to the display, but at such high ppi it's not really noticeable, and certainly much better than trying to get actual fractional resolutions working.

Apple tried true resolution independence in the early 2000s and it didn't work out. 2x and 3x scaling proved to be a much better solution.


I liked Apple's approach when the iPhone 4 and MacBook Pro 2012 came out. Adding @2x bitmaps was easy enough for developers, and the transition was quick and painless.

Since then, however, Apple's UI has (regrettably) become less about pixel-perfect bitmaps and more about black typography on white roundrects, and almost all Macs run at a scaled resolution by default, which is then typically used to display web-based apps that support fractional scaling.

I really think that Windows' approach has aged much better. Kudos to Microsoft. We could have our cake and eat it too if Apple sold Macs with slightly higher-resolution displays, and also a 5K 27" display because other manufacturers can't be bothered.


macOS does a very similar thing to what GNOME/Wayland does; except in Linux land, due to misc reasons[1], it works only for Wayland apps.

Yet there are cries from the Internets that it is not pixel-perfect and that another solution (Windows/Qt/Android-like) should be used.

[1] Mostly that you can't just update client-side libs like you can in Mac and Windows land and be done; there will always be some legacy X11 client that connects directly to the display server and ignores your modifications. On Mac and Windows, apps cannot talk directly to the display server; the protocol is not public.


Same opinion here. Whereas on a Mac you set things in one place for all of your apps, and it accounts for different scaling factors on different displays, this just seems like a huge pain, having to set things on an almost per-app basis.


Plasma on Wayland has per-screen DPI scaling to any percentage you want. The only thing holding me back from switching is screen recording/sharing woes.


Have you set up pipewire? It doesn't seem to be the default in any distros yet but it works fine for me. The only real issue is that the permission dialog on Firefox makes you select the window or screen to share twice. Not seamless but gets the job done.


Setting the DisplaySize in /etc/X11/xorg.conf.d/90-monitor.conf is how I like to solve the DPI issue. The advantage is that even the display manager (in my case SDDM) runs with the correct DPI.
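
A minimal sketch of such a file, assuming the server matches the Monitor section to a DP-1 output (the identifier and the millimeter values for a 27" 16:9 panel are only examples; use your own output name and measured dimensions):

    Section "Monitor"
        Identifier "DP-1"
        DisplaySize 598 336
    EndSection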

If you want to find the correct dimensions for your monitor, this page might help you:

https://www.displayspecifications.com


Wayland has proper scaling support as a first-class thing, so you just configure it once and forget about it. This includes per-output scaling.

I was actually surprised when using a 2020 Mac that macOS doesn't actually support scaling on external displays. It seems that Linux/Wayland is in the lead here.


Yes! Scaling was a nightmare for me on X11 since I have a high resolution laptop screen and then a lower resolution external monitor. I switched to Sway and scaling, switching monitors etc. was so much simpler and just worked.


Is there any desktop environment where you can set font and interface element sizes in points (1/72 inch) and let the system handle all the scaling?


Nice.

If you, like me, are blind as a bat, may I recommend `Xcursor.size` as well.
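
(It's one line in ~/.Xresources; the size is in pixels, and 48 is just an example value:)

    Xcursor.size: 48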

Anyone know how to get mouse cursor comet trails in Linux?


No idea how to get a comet trail on your cursor, but I did find oneko. Maybe it can help out if you want a cat chasing your cursor: http://www.daidouji.com/oneko/


Thanks for this! My mouse cursor is tiny on 4k monitors with i3


The author brought up mouse acceleration, which reminded me of why I switched back from Linux to a Mac: the mouse curves were terrible IMHO, with seemingly no way to change them. The parameters you can tweak in X11 didn't let me change the fundamental curve, and to me they were so much worse than Apple's that it rendered the system frustrating to use. I don't know if Wayland suffers from the same issue or not, but googling around, the many places where people similarly complained certainly weren't answered by a magic Wayland config. Does anyone know if it's possible to get an Apple-like mouse curve in Linux?


I managed (well, managed to make the sensitivity low enough that it was usable even at 400 DPI...) by recompiling libinput. Yeah, really.


Seems like a patch others (i.e. me) would like to see.


Shouldn’t that be something that’s pulled into upstream? And if they refuse, fork libinput?


Since the libinput upstream is one guy and nobody stepped up to help him, good luck with long-term maintenance of your fork.


It's this kind of defeatist, responsibility-shifting attitude that holds Linux back from becoming a widely used desktop platform. What's the worst case with a fork, it stagnates and remains mostly identical to the upstream?


Is it defeatist to point out that there is no need to fork, just to communicate with the upstream?

Especially if you do not have the resources for maintenance. If you had them, and spent some time and effort in the problem domain, then maybe, just maybe, you would come to the same conclusions and results as the original you forked from.


Like I said, it is something that should be submitted as a pull request to the upstream, and if that fails, then fork. The fork is a secondary path to cooperation.

If the conclusion is that the platform cannot support cursor acceleration the proper way (i.e. the way all commercial OSes do), the premise of the library is flawed. If not, then it must be the whole OS that's flawed, because that needs to be possible on a desktop system. Linux's cursor acceleration sucks; there is no excuse for the whole UI layer of the ecosystem being as bad as it is.


I think "there is only one person working full-time on maintaining Linux input libraries and he's overworked" is probably a decent excuse. I don't know what you plan to do, but I would advise against forking unless you can hire a team to accumulate testing hardware and work on this for multiple years. What we have seen too often is these forks just fizzle and get forgotten, especially if they are only focused on getting one feature to work on one very specific device.


I’ve heard of tons of these patches people write for libinput, yet they are never in master. There’s nothing in the libinput README or FAQ about how they’re looking for help, and there are quite a few different authors. How is anyone supposed to know help is needed?


This post was going around a while ago, I don't think anything has really changed for the better since then: https://who-t.blogspot.com/2019/10/libinputs-bus-factor-is-1...


Well, I put it out there, but it was (apparently) very personal. I use Zowie mice at 800 DPI, and setting the GUI sensitivity to 0.1 (or 1, etc., whatever is lowest) would result in several desktops' worth of movement per mousepad (22 inches) - really bad overall. I just wanted a single left-to-right swipe to correspond to the monitor edges, which was never a problem on any OS until recently, when they oddly started treating low DPI at 1000 Hz really strangely.

I still have the patches somewhere and will post them soon. They basically let you set mouse speed scaling over a larger range.


What an incredible waste of time: messing with DPI and crappy apps pulling settings from here and there, in 2021. Linux desktop userspace is a dumpster fire.


Author here: as they say, Linux is free if your time is worthless.

In my case, my time isn't worthless, but I do enjoy configuring my system. The tweaks shown in this blog post took me maybe half a day; I haven't needed to make any significant changes since.


Well, it's good that it holds up. I also have to make this tradeoff, but I'm obviously less enthusiastic about it. Switching to a single HiDPI display has made it more tolerable, with bigger DMs.


The problem is largely X11. Wayland desktops such as GNOME + Wayland applications generally work very well with HiDPI.

Even X11 applications work reasonably well on XWayland on e.g. GNOME, as long as all screens use 100% or 200% scaling. Unfortunately, with fractional scaling things break down with X11 applications.


Indeed. I'm running my 4k monitor at 100% and just increase the fonts in the apps themselves. It seems to work fine and there's no scaling involved.


If you actually use a modern config like the defaults on Fedora, it all just works. 100% of these problems come from people trying to use the proprietary NVIDIA drivers, which are filled with issues.



