
True! VNC is not ideal because it's pretty old by now. Chrome Remote Desktop would've been a better example, and even that is behind what can be done, as I believe it still uses VP8. Even a lossless codec like ffv1 might fit within a budget of 8 MiB/s, but I'm not sure it's necessary, as even old h264 does a pretty convincing job at very low bitrates.

Here's a snippet of my 2256x1504 screen, uncompressed:

https://files.catbox.moe/9k6cnm.png

Here's a snippet of my 2256x1504 screen from an OBS recording:

https://files.catbox.moe/va46ze.png

This is using x264 at just 0.5 MiB/s. Not even pegging a CPU core.
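
For reference, OBS isn't doing anything exotic here; a roughly equivalent capture can be reproduced with plain ffmpeg. This is just a sketch (the :0.0 display name and 60 fps are assumptions on my part), and 0.5 MiB/s works out to roughly 4 Mbit/s for the -b:v value:

    # software x264 screen capture at ~0.5 MiB/s (~4 Mbit/s)
    ffmpeg -f x11grab -framerate 60 -video_size 2256x1504 -i :0.0 \
           -c:v libx264 -preset veryfast -b:v 4M -pix_fmt yuv420p capture.mp4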

If you move really fast, then there are some artifacts during motion (same recording):

https://files.catbox.moe/myhnc7.png

...But they are not really noticeable in motion, and it clears up quickly.

I don't have a high-framerate, high-DPI display to test, but I'm guessing most people will only strongly care about one or the other, since displays that do both are pretty expensive.

And yeah, chroma subsampling on top of subpixel rendering should hurt legibility in theory, but in practice it's difficult for me to tell any difference.

I've played around for a bit and haven't gone above 1 MiB/s so far; I'd probably need to play a video to push past that.




> Here's a snippet of my 2256x1504 screen from an OBS recording:

I wouldn't be able to stand something like this at all; it looks horrible to me. The text is all smudged.


The only place where the text looks remotely smudged to me is in the low-contrast bits in the header. It’s very difficult for me to tell the difference otherwise, especially considering that it’s high DPI.

And h264 is old, and I’m using software x264 with fairly modest settings. More modern general video codecs like h265, VP9, perhaps even AV1 can eke out slightly better fidelity at similar bitrates, at the cost of higher complexity. (But if it can be hardware accelerated at both ends, it basically doesn’t matter.)
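
To illustrate the hardware-acceleration point, here's a sketch of what an offloaded encode might look like with VA-API. The render node path and the availability of the hevc_vaapi encoder are assumptions about the machine; check what vainfo reports before trying it:

    # HEVC screen capture with the encode offloaded to the GPU via VA-API
    ffmpeg -vaapi_device /dev/dri/renderD128 \
           -f x11grab -framerate 60 -video_size 2256x1504 -i :0.0 \
           -vf 'format=nv12,hwupload' -c:v hevc_vaapi -b:v 4M capture.mkv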

And these codecs are designed for general video content… it would be instructive to see exactly what kind of performance could be achieved with lossless codecs or codecs designed for screen capture, like ffv1 or TSC2.
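
If someone did want to run that comparison, a minimal sketch of the lossless side would be something like this (purely illustrative; the resulting bitrate is heavily content-dependent, since static desktop content compresses very differently from full-motion video):

    # lossless FFV1 capture, for comparison against the lossy runs above
    ffmpeg -f x11grab -framerate 30 -video_size 2256x1504 -i :0.0 \
           -c:v ffv1 -level 3 capture-ffv1.mkv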

It would be… but honestly, there’s no point, because all I was trying to illustrate is that I sincerely doubt 8 MiB/s is the best that can ever be done for a decent desktop experience. Judging by Qt issue reports, it’s worse than what Qt used to be able to accomplish. If you really like your X11 setup, there’s no reason to change it, because it isn’t going to become unusable any time soon. Even if you switch to Wayland in the future, you should still be able to use `ssh -X` with Xwayland as if nothing ever really changed.

This is all a serious tangent. The actual point was that, again, X11 doesn’t have any built-in scaling. All along, it was Qt 4+, GTK 3+, and other X11 clients that have been handling all of the details. And traditionally, it wasn’t done well. Even contemporarily, it still has issues. Appealing to the “way X11 did it” makes no sense because (1) X11 as a protocol or server never did anything, and (2) even then, historically, toolkits have had a lot of trouble dealing with it. The fact that you set the DPI for Xft specifically, which is just a font rendering library, hints at the reality: what oldschool X11 “scaling” amounted to in the 2000s was changing how font sizes were calculated. Modern toolkits just read this value to infer the setting, and it still isn’t good enough for many modern setups that Linux desktops want to support.
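
For context, that Xft knob is a single X resource; the sketch below uses 144 DPI as an arbitrary example value:

    # ~/.Xresources: tell Xft-based font rendering to assume 144 DPI
    Xft.dpi: 144

    # load it into the X server's resource database, then verify
    xrdb -merge ~/.Xresources
    xrdb -query | grep Xft.dpi

It only changes how point sizes get converted to pixels for text; icons, widget metrics, and pixmaps are untouched, which is exactly why toolkits ended up layering their own scaling on top of it.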



