In a world where one user has a 1366x768 screen and her neighbor has a 4k display, we need to move past interface code specifying pixel sizes up front, both on the web and off.
The problem with abstract units is that for small features you never really know how big they will end up. It's easy to get elements that should look similar, but some are three pixels wide and some are four, or the spacing is a little ragged from rounding. Some toolkits avoid this by just antialiasing everything. Then you try to draw a detailed graph or a musical score, and some of your horizontal lines are sharp and crisp because they land solidly inside a pixel row, and some are a grey smear because they straddle a boundary.
This is probably a large reason why many graphs and charts you see on the web look like they were drawn by a child with a felt tip marker. A 3 pixel wide line is pretty forgiving about where it falls.
(I'm mostly kidding but in my experience, the sorts of apps that you build with this sort of library invariably end up dealing with pixels anyway. There's no religious reason why resolution-independent scaling code has to live in the UI primitive library, especially since it probably can't be confined there.)
p/s: obviously I have NO idea how easy or complex this would be; it's just an idea, given that text contained in an element sometimes bleeds, overflows, or is outright missing
And as long as we have photographs, we're going to have raster images, for the foreseeable future.
What is the appeal of the implementation hiding in the header file? Is it a 'thing' nowadays? It makes every file that includes it read all 15k lines, and makes you put a special #define in one of your source files, just to avoid having a C file. That doesn't feel like a good trade-off to me. If I used this code in a project, I'd build a static library out of it first thing, and go about my business.
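The static-library route mentioned above can be sketched like this, assuming the header is nuklear.h and a Unix toolchain; the file name nuklear_impl.c is invented for this example:

```c
/* nuklear_impl.c -- the one translation unit that owns the
   implementation; everything else just includes the header. */
#define NK_IMPLEMENTATION
#include "nuklear.h"
```

Then compile it once and archive it, e.g. `cc -c nuklear_impl.c && ar rcs libnuklear.a nuklear_impl.o`, and link the rest of the project against libnuklear.a; the 15k-line implementation section is parsed only in that one file.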
If compile-time is a concern (it isn't in my experience, at least if the rest of the project is C++ code), you can still wrap the library-header into your own header and put include-guards around it :)
PS: also see here: https://github.com/nothings/stb#why-single-file-headers
It's an absolute mess, the lib files are incompatible between Visual Studio versions, whether threaded/non-threaded/DLL/static MSVC runtime is used, whether linktime-code-generation is used, if there's C++ code in it, a release lib cannot be linked against a debug executable, etc etc etc...
Only sane solution is to drop the sources directly into your project. Whether it's just a header or a header/source pair is just a small detail, admittedly.
It needs to abuse the preprocessor to avoid having multiple conflicting copies of the functions.
The implementation mode requires you to define the preprocessor macro
NK_IMPLEMENTATION in *one* .c/.cpp file before #including this file, e.g.:
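The quoted docs mean a fragment like this, in exactly one source file of the project:

```c
#define NK_IMPLEMENTATION
#include "nuklear.h"
```

Every other file includes "nuklear.h" without the define and sees only the declarations.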
Having now read the rationale for single header libraries, I am completely unconvinced. These people either don't know how their tools work, or are making bad choices to coddle people who don't. Even forgetting the wasted time parsing the #ifdef'ed out implementation section over and over, it screws up separate compilation. Every time you tweak the implementation section, all the files that #include it will recompile for no good reason.
Even if the compile time is affected (which I never noticed) this is only a minor inconvenience compared to the wins those 3 header files provide.
I prefer the 2-file approach myself, but the basic idea of distributing source files that the user adds to their own project is a good one. I much prefer it to a solution full of random projects, each of which inevitably needs its settings carefully tweaked to make sure everything works properly. Though I do admit that is not a very high bar.
It seems that you're expected to provide the layer between the OS and nuklear. They probably have some default implementations for demos and such.
So, you glue it to the OS yourself. That's... well actually I guess if someone marries it to SDL it could be very useful.
It might just be the screenshots (I haven't tried either of them), but comparing the two README files, I can't say that they are real alternatives, even if both are, very likely, very good toolkits!
I tested imgui, and changing the code let me select a bigger font. I wish it were possible to change the widget look and feel, because it looks minimalist.
The X11 example doesn't seem to run here, on my Debian sid + Nvidia blob drivers. The GL* examples do work, though, albeit with an unreadable font (I'm on a 32" 3180x2160 screen):
X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 70 (X_PolyFillRectangle)
Serial number of failed request: 29
Current serial number in output stream: 102
Or a straight port to Rust (either imgui or nuklear)
A demo program can be found here:
A more complex, feature complete demo can be found here:
Surprisingly, in a lot of GUI libraries, it's not an easy thing.
gcc main.c -std=c89 -pedantic -O2 -D_POSIX_C_SOURCE=200809L -o bin/zahnrad -lX11 -lm
$ ldd bin/zahnrad
libX11.so.6 => /usr/lib/libX11.so.6 (0x00007f8468746000)
libm.so.6 => /usr/lib/libm.so.6 (0x00007f8468441000)
libc.so.6 => /usr/lib/libc.so.6 (0x00007f84680a0000)
libxcb.so.1 => /usr/lib/libxcb.so.1 (0x00007f8467e7d000)
libdl.so.2 => /usr/lib/libdl.so.2 (0x00007f8467c79000)
libXau.so.6 => /usr/lib/libXau.so.6 (0x00007f8467a75000)
libXdmcp.so.6 => /usr/lib/libXdmcp.so.6 (0x00007f846786f000)
There's no reason you couldn't take the same approach but only process/redraw in response to an event (e.g. on a mouse move or click). But since immediate mode GUIs tend to be used in contexts where you're already drawing every frame (games and interactive tools), people tend to use them this way as it fits the best.
The main defining characteristic of an immediate mode gui is it tries not to store state for each widget. Rather than having a large "object graph" (retained mode), you take a more functional approach and draw what you need every frame. There's much less bugs/complexity trying to keep your Model and View in sync, but somethings that need extra state (e.g. tree view expanded state) get more complicated.
I did not dig further, but I like the general idea, as GUI toolkits are biiiiig.
- In the Makefile, use
LIBS = -lglfw3 -framework OpenGL -lm -lGLEW
- In the shader code, change "#version 300 es" to "#version 400" for both the vertex and fragment shader.
I'll try to make a PR with conditionals if I can find the time today.
You need to install glfw3 and glew via brew:
$ brew install glew
$ brew tap homebrew/versions
$ brew install --build-bottle --static glfw3
$ brew install homebrew/versions/glfw3
$ cd examples; make all
Someone concerned about licensing can't touch this with a ten foot pole.
Header only libs just make all these problems go away, and (IMO) there's not really any downside.
Well, the downside is that it's literally in one header file. Besides being gigantic, it's extremely annoying to search and parse by hand; this is an issue when that file is also serving as the documentation. Especially in this case, it would be much nicer to have a "nuklear" folder with a bunch of separate (appropriately named) headers. The single `nuklear.h` header could just `#include` them as necessary.
IMO, 20,000 lines is already way too much, but when it reaches 100,000, even editing it will be a chore, let alone using it. It will simply be unmaintainable and unusable. I say this having worked on code bases which have reached that point and attempted to deal with the issues it creates. There's a reason every decently sized code base out there uses multiple files and a hierarchy. It's a natural way to create modular code, and modular code is simply better code.
That's why I still use a buggy BRIEF clone instead of a modern IDE. If I were forced to use someone else's idea of a code browser that was architected around files rather than subroutines, I'd probably be more inclined to agree with you.
The idea that the most appropriate way to organize source code happens to correspond to something that a (likely-long-dead) OS implementer chose to call a "file" is a notion that doesn't get challenged often enough.
On the same note, proper modularization of source files also encourages better discipline for dependencies among modules, such as discouraging circular dependencies.
If it's really light-weight and self-contained it will be very useful for some embedded systems with a screen.
linux-vdso.so.1 => (0x00007fff46f49000)
libSDL2-2.0.so.0 => /usr/lib/x86_64-linux-gnu/libSDL2-2.0.so.0 (0x00007f3587140000)
libGL.so.1 => /usr/lib/fglrx/libGL.so.1 (0x00007f3586f5e000)
libGLEW.so.1.6 => /usr/lib/x86_64-linux-gnu/libGLEW.so.1.6 (0x00007f3586cf2000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f3586934000)
libasound.so.2 => /usr/lib/x86_64-linux-gnu/libasound.so.2 (0x00007f3586647000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f358634a000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f3586146000)
libpulse-simple.so.0 => /usr/lib/x86_64-linux-gnu/libpulse-simple.so.0 (0x00007f3585f42000)
libpulse.so.0 => /usr/lib/x86_64-linux-gnu/libpulse.so.0 (0x00007f3585cf9000)
libX11.so.6 => /usr/lib/x86_64-linux-gnu/libX11.so.6 (0x00007f35859c4000)
libXext.so.6 => /usr/lib/x86_64-linux-gnu/libXext.so.6 (0x00007f35857b3000)
libXcursor.so.1 => /usr/lib/x86_64-linux-gnu/libXcursor.so.1 (0x00007f35855a8000)
libXinerama.so.1 => /usr/lib/x86_64-linux-gnu/libXinerama.so.1 (0x00007f35853a5000)
libXi.so.6 => /usr/lib/x86_64-linux-gnu/libXi.so.6 (0x00007f3585195000)
libXrandr.so.2 => /usr/lib/x86_64-linux-gnu/libXrandr.so.2 (0x00007f3584f8c000)
libXss.so.1 => /usr/lib/x86_64-linux-gnu/libXss.so.1 (0x00007f3584d88000)
libXxf86vm.so.1 => /usr/lib/x86_64-linux-gnu/libXxf86vm.so.1 (0x00007f3584b83000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f3584965000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f358475d000)
libatiuki.so.1 => /usr/lib/fglrx/libatiuki.so.1 (0x00007f3584640000)
libpulsecommon-1.1.so => /usr/lib/x86_64-linux-gnu/libpulsecommon-1.1.so (0x00007f35843e1000)
libjson.so.0 => /usr/lib/x86_64-linux-gnu/libjson.so.0 (0x00007f35841d9000)
libdbus-1.so.3 => /lib/x86_64-linux-gnu/libdbus-1.so.3 (0x00007f3583f94000)
libxcb.so.1 => /usr/lib/x86_64-linux-gnu/libxcb.so.1 (0x00007f3583d76000)
libXrender.so.1 => /usr/lib/x86_64-linux-gnu/libXrender.so.1 (0x00007f3583b6c000)
libXfixes.so.3 => /usr/lib/x86_64-linux-gnu/libXfixes.so.3 (0x00007f3583965000)
libwrap.so.0 => /lib/x86_64-linux-gnu/libwrap.so.0 (0x00007f358375c000)
libsndfile.so.1 => /usr/lib/x86_64-linux-gnu/libsndfile.so.1 (0x00007f35834f5000)
libasyncns.so.0 => /usr/lib/x86_64-linux-gnu/libasyncns.so.0 (0x00007f35832ef000)
libXau.so.6 => /usr/lib/x86_64-linux-gnu/libXau.so.6 (0x00007f35830ec000)
libXdmcp.so.6 => /usr/lib/x86_64-linux-gnu/libXdmcp.so.6 (0x00007f3582ee5000)
libnsl.so.1 => /lib/x86_64-linux-gnu/libnsl.so.1 (0x00007f3582ccb000)
libFLAC.so.8 => /usr/lib/x86_64-linux-gnu/libFLAC.so.8 (0x00007f3582a81000)
libvorbisenc.so.2 => /usr/lib/x86_64-linux-gnu/libvorbisenc.so.2 (0x00007f35825b1000)
libvorbis.so.0 => /usr/lib/x86_64-linux-gnu/libvorbis.so.0 (0x00007f3582385000)
libogg.so.0 => /usr/lib/x86_64-linux-gnu/libogg.so.0 (0x00007f358217e000)
libresolv.so.2 => /lib/x86_64-linux-gnu/libresolv.so.2 (0x00007f3581f61000)
In my case I integrated imgui with my "weekend engine" Oryol, which has a very slim wrapper around 3D APIs, and I have this running in WebGL (via emscripten) and GLES2 on Raspberry Pi 2 (should also work on the original Pi). The resulting executables are small enough for most 32-bit embedded platforms (a couple hundred KBytes), and have a much shorter list of system dependencies. On Raspberry Pi it doesn't even need a window system, it runs straight from the Linux terminal and uses the Pi's EGL wrapper.
For instance here's the WebGL imgui demo (325 kByte): http://floooh.github.io/oryol/asmjs/ImGuiDemo.html
To get something on screen, it provides a number of backends, including SDL. However, there is also one that talks to X11 directly, providing the minimum dependencies needed on Linux (as far as I'm aware).
Basically the library generates a list of drawing commands. It is up to the underlying backend (SDL, GLFW, X11, your own embedded one,....) to draw these on-screen.
This is what ldd on my build of SDL looks like:
linux-vdso.so.1 => (0x00007ffda33d9000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007ff1e4c51000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007ff1e4a4d000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007ff1e4830000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007ff1e4628000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007ff1e426a000)
So that list is not SDL's contribution; that demo was just linking to a lot of things, most of which look unnecessary (and sloppy, though this also makes a case for why people want header-only libraries: many people don't want to think about this part). For example, why is a GUI demo linking to FLAC and all the Ogg Vorbis libraries? SDL doesn't use or depend on these either.
So in theory you could just write the rendering code for ncurses, browser, etc.