No matter how you implement fractional scaling, it's not going to look good. The reason is simple: half pixels don't exist. If an app wants to draw a 1px solid border, it can't draw it on a 1.5x-scaled display, because 1.5 pixels don't exist.
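The arithmetic behind that complaint, as a tiny illustration (not tied to any particular toolkit or compositor):

    # Illustrative only: what happens to a 1 logical px border at various scale factors.
    # At non-integer scales it maps to a non-integer number of device pixels, so it has
    # to be rounded to 1 or 2 px, or antialiased across two (which reads as "blurry").
    for scale in (1.0, 1.25, 1.5, 2.0):
        device_px = 1 * scale
        note = "exact" if device_px == int(device_px) else "round or antialias"
        print(f"{scale}x -> {device_px} device px ({note})")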
It has looked good on macOS for years now. Even Windows figured out a way to make it work on those displays (although it's worse than on macOS).
So your explanation, while common amongst Linux devs, is a rather cheap cop-out compared to the two bigger competitors, who have mostly solved the problem.
The end result is that Linux DEs can be horrible to use on many HiDPI laptops, which don't have exact resolutions that make integer scaling feasible.
macOS doesn't expose fractional scaling to applications; it is done by the output encoder, by scaling the entire framebuffer. For software, it is strictly integer only.
It is also careful with the resolutions it supports; it's not just any random scale you request. It always scales down, but without losing many pixels. For example, with a 2560x1600 physical display (rMBP13), you can go to a logical 2880x1800 (@1.78 scale) or 3360x2100 (@1.52 scale), but not further.
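Rough back-of-the-envelope arithmetic behind those numbers (just the relationship, not anything Apple actually ships):

    # Apps draw at a strict 2x; the composed framebuffer is then scaled down to the
    # panel, so the scale the user effectively sees is 2 * panel_width / rendered_width.
    def effective_scale(panel_w, rendered_w):
        return 2 * panel_w / rendered_w

    panel_w = 2560                            # rMBP13 panel is 2560x1600
    for rendered_w in (2560, 2880, 3360):     # 2560x1600, 2880x1800, 3360x2100 framebuffers
        print(rendered_w, round(effective_scale(panel_w, rendered_w), 2))
    # 2560 -> 2.0, 2880 -> 1.78, 3360 -> 1.52, matching the figures above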
With Linux (and also Windows), people have weird requests that would not fly in the macOS world.
>macOS doesn't expose fractional scaling to applications; it is done by the output encoder, by scaling the entire framebuffer. For software, it is strictly integer only.
I haven't looked into the sources, but my impression was that it uses the GPU to scale surfaces and then composes them into the final framebuffer, which has the same resolution as the display - not that it uses the output encoder.
The quick test: when I take a screenshot, I get a file whose dimensions match the physical display, not the logical resolution. On macOS, it is exactly the other way around.
Scaling with the GPU is more flexible - you can do things per surface instead of per desktop - but it is slightly more complicated (precisely because you do things per surface instead of per desktop), it takes GPU capacity/memory bandwidth away from applications when they might need it, and it is more power hungry when the GPU could otherwise be idle.
And that is an ok solution as well - something Wayland or other Linux compositors can't do, forcing you to be stuck in either the "too tiny" or "too large" uncanny valley of UIs.
Because Linux and Windows users ask for different things than Mac users.
With a 2560x1440 display, you won't get a fullhd@2x sized desktop (scale 133%) on Apple. That's something expected on Linux or Windows, but Apple will give you at most 1680x1050@2x (152%; their displays are 16:10, so the height differs) here.
It is compounded by the issue that Apple traditionally used 72 dpi for @1x scale, while Windows and Linux used 96 dpi. Apple apps look quite good at lower resolutions, while Linux (ok, Gnome) dug itself into a hole and only looks good at dpis much higher than 96. For Apple, it is acceptable to run 2880x1800 or 3360x2100 on a 2560x1600 display, but when Gnome looks passable at 1600x900@1x, good at 1920x1080@1x, and you've got 2560x1440, you need to render a 4K (1920x1080@2x) logical resolution onto that display... that's more "interesting".
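To make those percentages concrete, here is the same back-of-the-envelope formula as above (illustrative only, not any vendor's actual computation):

    # Render at 2x the logical size, then scale the result down to the panel width.
    def effective_scale(panel_w, logical_w):
        return 2 * panel_w / (2 * logical_w)

    print(round(effective_scale(2560, 1920) * 100))  # fullhd@2x on a 2560-wide panel: 133
    print(round(effective_scale(2560, 1680) * 100))  # 1680x1050@2x on Apple's 2560x1600: 152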
That's not how scaling works. Click the reply button on this comment, then when you see the form press ctrl++ and ctrl+- to increase and decrease zoom. The borders of the input/submit fields do not get blurry, despite being 1px thick.
I see your point. Mine is: if a browser can do that, GTK and Qt applications can do it too. Of course the compositor works at a different layer, but it can still resize non-compliant applications.
And we hit the typical problem with software evolution.
The apps would have to cooperate, and that would require rewriting them - in many cases, re-architecting them.
Who is going to do that? Nobody. We have to live with the legacy that we have. For how resizing non-compliant applications works (and looks), see the experimental fractional scaling support in Mutter.
Most, if not all, Windows applications can be resized properly without being updated. Remember that it's the widget toolkit (Win32, GTK, Qt, etc.) that is doing the resizing, not the application.
The app tells the toolkit to draw a button at 10,10 with a size of 100,20. The toolkit multiplies everything by 1.5 and then draws the button. The app doesn't need to know that this is happening at all. Of course this is not possible with some applications that do weird things, but for the 99% this works, and the 1% can just be resized by the compositor. That's exactly what Windows does.
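A toy sketch of what that toolkit-level scaling amounts to (hypothetical names, not actual Win32/GTK/Qt API):

    # Toy model of toolkit-side DPI scaling; the class and method names are made up.
    class Toolkit:
        def __init__(self, scale):
            self.scale = scale                        # e.g. 1.5 on a 150% display

        def draw_button(self, x, y, w, h, label):
            # The app passed logical coordinates; convert to device pixels here,
            # so the app never has to know the display scale exists.
            px = [round(v * self.scale) for v in (x, y, w, h)]
            print(f"button '{label}' at device pixels {px}")

    Toolkit(scale=1.5).draw_button(10, 10, 100, 20, "OK")
    # -> button 'OK' at device pixels [15, 15, 150, 30]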
In the Linux world, you cannot rely on the widget toolkit doing the heavy lifting. The API is defined at the socket-protocol level, not at the level of library symbols, so you have to handle it there. The old xeyes or xfontsel, or other Athena, Motif, OpenLook, FLTK or whatever forgotten apps that conform to the X11 protocol, must be handled too. Scaling at the compositor level is the easy part: Wayland compositors have a per-surface flag saying how the surface is supposed to be scaled during composition, and they are also aware of the scaling each display needs. The issue is that X11 clients - unlike Wayland clients - do not announce what they are capable of. As you noted, both Windows and macOS can fix this in the framework, whereas Linux cannot. macOS even took the shotgun approach and required a flag in the executable header to announce scaling capability. Such an approach is not workable in the Linux world.
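Roughly the per-client decision that leaves a compositor with (a schematic sketch, not Mutter's or any real compositor's code; the field names are made up):

    # Schematic: Wayland clients announce the scale their buffers were drawn at;
    # legacy X11 clients announce nothing, so the compositor must assume 1x and upscale.
    def compose(surface, output_scale):
        drawn_at = surface.get("buffer_scale")      # None for a legacy X11 client
        factor = output_scale / (drawn_at or 1)     # upscaling a 1x buffer -> blurry
        print(f"{surface['name']}: blit scaled by {factor}")

    compose({"name": "wayland-app", "buffer_scale": 2}, output_scale=2.0)  # factor 1.0, crisp
    compose({"name": "xeyes"}, output_scale=1.5)                           # factor 1.5, blurry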
But Windows is not a garden of roses either; I still see a fair share of blurry dialogs (until recently, even the VS Code installer/updater).