Now all X-enabled windowing toolkits can use the rock-solid framebuffer without modifications.
Xorg on Linux already ships with a framebuffer driver (or maybe even two different ones), but we didn't get the same "works everywhere" experience we have with our custom solution (based on NetBSD's earlier work), perhaps due to framebuffer bugs on Linux, or maybe something else entirely.
The drawback is that refreshes get slower as the screen resolution goes up. (We weren't happy with 800x600 because of the LCD scaling blur from horrible on-controller scalers, so instead we decode the EDID data from the monitor/adapter to find the native resolution.) At high enough resolutions there is a clear 10-100 ms delay in updates.
And even if you don't want a GUI framework and would rather write your app differently, you can start with cairo and pangocairo (or something similar) so that you don't have to care about low-level drawing primitives.
Also, the whole clock is basically 100 lines of fairly straightforward code in all its glory. How much would a DRM/Pango/Cairo solution weigh?
A minimal DRM example isn't much more, by the way:
And this does a lot more of the stuff you want, like telling you what connectors (think HDMI, or LVDS internally) the hardware has, which of those have displays connected, and what modes those displays support, and it can easily be extended to do 3D rendering.
Beyond the HW planes, there is a lot of stuff DRM enables that you simply can't do with /dev/fb but absolutely want, like true VSync-synchronized rendering.
I'm not sure why you'd need to replicate anything complicated like Wayland. You can use the DRM API just like you'd use /dev/fb0; it just has more options and features.
As for cairo, it's obviously a heavier solution, but also more featureful, better optimized, and output-neutral.
Does the Linux system console support Unicode in the first place?
edit: the linked header file explains that a single PSF2 file can actually encode any number of glyphs (not just 256), and additional meta-information after the bitmaps spells out which Unicode code points or combining character sequences each bitmap corresponds to.
There are terminal emulators, such as fbterm or kmscon, that use the framebuffer and might have better Unicode support.
Anyone have any experience with it?
Was it easy to set up? Does it make for a comfortable environment? Is it relatively easy to write your own applications that can output on top of the frame buffer, e.g. if you are developing a game with OpenGL and want to develop in this environment and to test your game from there as well?
One could also use MPlayer + libcaca + framebuffer.
I have various files in Chinese and Japanese, and they display correctly for me in Linux, even under the console.
About the latency thing ... having recently switched back from OSX to Gentoo as my laptop's native desktop, Linux is great. Latency on all operations, even in X11, is noticeably lower and more pleasant than even typing in iTerm2 on a top-end MacBook Pro, and root on ZFS is awesome :)
The primary issues with the console are UTF-8 support and irritations around mode setting, font resizing, and video chip (re)initialization when booting and switching to/from X11, since my laptop (Dell XPS15 9560) has two graphics chips and they don't play perfectly nice together under many Linux kernel configurations. Angband latency, though, is awesome :)
Enjoy it while it lasts; The Powers That Be (mainly Red Hat) have decided that Wayland and compositing are the future.
--- before 2018-01-31 17:12:13.626560688 -0800
+++ after 2018-01-31 17:12:17.494580179 -0800
@@ -1 +1 @@
-uint32_t buf = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fb, 0);
+uint32_t *buf = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fb, 0);
The current interface is DRM (drivers/gpu/drm), and there specifically, atomic modesetting:
So you can have multiple framebuffers at the same time, with arbitrary color formats (like 8-bit grayscale, 16/32-bit (A)RGB, YUYV, etc.). Bilinear scaling, alpha-blending, and mirroring/90/180/270-degree rotations are often supported as well.
It's important to understand that these framebuffers are not composited anywhere else; they are actually scanned out to the display device in real time, on the fly.
A traditional framebuffer can of course be emulated by setting a single layer to cover the full screen without alpha-blending.
Starting with Android N, Google has been locking down what is possible via the NDK, and apps can no longer directly read outside their own sandboxed filesystem, so no /dev.
For a single-screen embedded application it might be more than enough, and you don't need to load a WM, etc.