I've got a few patches in pkg-config[1]. That was the kind of contribution I like to do: small, improves behavior while removing code, in a very central and popular tool...
I gave up contributing after being ignored[2]. Around the same time, other people posting patches to the mailing list were also ignored. Sad.
I can't describe how much I hate purely e-mail-based merge requests. If your e-mail doesn't get a response within a week or so, everyone will have forgotten about it, and the relevant people will have gotten 1,000 new e-mails since yours. With something like gitlab/github/sr.ht/gogs/gitea/anything else, your MR remains in the relatively short list of open merge requests (short compared to an inbox with all incoming e-mail), clearly visible in a web interface.
Issues you create work the same way.
I very, very rarely have my MR or issue completely ignored for years on projects which use a web issue/MR tracker, except with unmaintained single-person hobby projects. But it seems to be the rule with e-mail, even for actively maintained, serious projects.
I've used email workflows before. On a project I used to work on (the Glasgow Haskell Compiler), people originally just attached .patch files to issues in the bug tracker we used, Trac. This was how it worked for years, and while it did work, it had similar outcomes: tons of patches that were just silently forgotten. The only saving grace was that I had a search macro set up that could find issues matching "WITH attachment AND status = 'Ready for Review'", but of course that requires people knowing to fill out the forms properly: to specify that this issue has a patch, that it fixes the problem, and that it's ready for review. Why would someone who spent 2 hours writing a bugfix know all that? I have hundreds of hours in this realm, so to me it's second nature, but it's a joke to think it's intimately or immediately familiar to anyone else.
Occasionally someone would email me to ask me to look at a patch, and to ask whether they had done something wrong that caused it to be ignored. Setting aside actual technical complaints, probably 80% of the time my answer was "Actually, I just missed this one" or "It didn't have the 'patch' field set, so my search didn't find it." I had to do this often, and it was annoying. I didn't blame anyone for it, but pretending it's not annoying would also be a lie.
Honestly, I don't think GitHub is the be-all and end-all of project contribution, and it still has some really annoying flaws. I don't like its branch-based merge workflow; I prefer non-branch-based merge workflows, which email does provide. But beyond that, email has almost no advantages in my experience.
I also suspect there's a reinforcement mechanism going on these days. People who think email is good for this task already love email; they think it's amazing. In some sense I can agree, and in the past it really was amazing. But fewer and fewer people even "bother" with email in my experience; to them, it's a necessary evil where shit gets shoveled at them 24/7/365, all so that they can pluck a single piece of information from the heap of trash, or reset their account passwords. To them it is not a bright, shining feature of the internet.
Email is fault-tolerant communication. If you aren't getting a response, then... reply back. If you still aren't getting a response, then the community isn't interested at the moment. This is how open source works, and it's how a small collection of maintainers can keep their sanity and avoid burnout in the face of thousands of people demanding time and attention. No one can possibly juggle all of the requests like yours. Creating an issue, sending a pull request, etc. costs the maintainer time, and that time is incredibly valuable.
The maintainers' time is valuable, yes, which is exactly why it's good to have a system where they can get to things in their own time, one that doesn't rely on contributors repeatedly pinging a mailing list.
> I very, very rarely experience having my MR or issue completely ignored for years with projects which use a web issue/MR tracker, except for with unmaintained single-person hobby projects.
I mean, it might not be "completely ignored", but after the initial triage by a maintainer I have definitely seen extremely large projects ignore a core issue for years while, occasionally, random third parties show up to cry. The worst part is that if people stop actively commenting with "me too" (even if an official developer chastises them for it every time), the issue often gets auto-locked or deleted. (Or, in the case of a pull request, the code rots to the point where something auto-closes it because it no longer merges.)
Back in the day, I also posted a bunch of patches to pkg-config, but all of them were ignored by the maintainer(s). I'm not surprised people eventually started creating their own rewrites because the project was very poorly run.
For one thing, pkg-config really should emit -isystem instead of -I for headers. -isystem tells tooling not to bother checking those headers for conformity; without it, my vi sometimes produces thousands of lines of warnings for standard header files, and I have to mark them -isystem manually. cmake smartly converts pkg-config's -I to -isystem nowadays, which is great.
That really seems like it should be the purview of the build system (such as cmake), not pkg-config. The flags provided by pkg-config should be as simple, obvious, easy to reason about, and portable as possible (and, worst case, as consistent as possible, so that you can convert them to some different flag format, as cmake arguably does here). The goal of -isystem has nothing to do with syntax highlighting, even if your text editor may also be reading it (though I am unsure from where, lol)... it has non-trivial effects on things like #include search order.
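For the record, the rewrite in question looks roughly like this (a sketch; the glib-2.0 paths are illustrative and vary by system):

```
# pkg-config hands out plain -I flags:
$ pkg-config --cflags glib-2.0
-I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include

# A build system that wants quieter diagnostics can reissue them as
# system include paths before invoking the compiler, roughly what
# CMake does with imported include directories:
$ cc -isystem /usr/include/glib-2.0 -isystem /usr/lib/glib-2.0/include -c main.c
```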
Slightly off-topic: one thing pkg-config really needs is .pc syntax to say when static linking is unsupported by the package as installed, making `pkg-config --static` invocations for that package return EXIT_FAILURE, so that attempts to link statically fail clearly instead of wasting people's time.
Some distros have been deliberately breaking static linking, but the .pc files and pkg-config on those systems continue to mislead build processes into attempting static builds. Arch in particular has started just silently omitting .a files, even for libraries that have no problem with static linking. It's infuriating if you don't know they've done this and are trusting `pkg-config --static` to Just Work. Arch should be able to stick something in the .pc files for when they've omitted the .a file. As-is it's just producing a broken development environment.
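Sketching what that could look like; the `Static:` field below is hypothetical, invented for illustration, and not part of any real .pc grammar:

```
# foo.pc -- sketch only; "Static:" is NOT real pkg-config syntax
prefix=/usr
libdir=${prefix}/lib

Name: foo
Description: Example library
Version: 1.2.3
Cflags: -I${prefix}/include
Libs: -L${libdir} -lfoo
Libs.private: -lm
# The distro would set this when it ships no libfoo.a, and
# `pkg-config --static foo` would then exit non-zero instead of
# emitting flags that cannot work.
Static: no
```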
Hah, I was going to say the same thing: it's also maddening when a package's autoconf accepts --disable-shared --enable-static and then fails to pass --static to pkg-config.
I understand how we got here, but the pkg-config format being the least common denominator of build-tool package discovery makes me sad.
I support it, barely, in most of my packages, but I never use it. cmake and vcpkg have completely supplanted it in my workflows, and cmake is quickly approaching the point where we might be able to describe it as the new least common denominator.
Everyone hates cmake, of course, but to paraphrase Stroustrup: there are only two kinds of tools, the ones people complain about and the ones nobody uses.
AFAICT, to use CMake for dependency discovery you have to either be CMake or require CMake: the full, interdependent coupling of a macro language and half of a build tool. Regardless of the merits of either part, that's a lot to implement and a lot to commit to.
Pkg-config, on the other hand, is rather small. The initial implementation was bulky enough to annoy people into at least two rewrites, but the concepts are few and the requirements on the host build system practically nonexistent.
(One place where the CMake-like approach won is the editable form of scientific documents. I like TeX to bits, but you can't even tokenize it without being TeX, and even if you are, essentially the only operation you can perform is typesetting; the two even feed into each other. Thus the current two decades of half-working AST-producing parsers, ugly HTML generators, DVI extractors, and whatnot.)
So, no, I’m not the least bit sad CMake didn’t win the game, not any more than I’m sad autom4te didn’t. Pkg-config isn’t ideal, but as far as not forcing the whole world into its image it’s miles better.
> Pkg-config, on the other hand, is rather small. The initial implementation was bulky enough to annoy people into at least two rewrites, but the concepts are few and the requirements on the host build system practically nonexistent.
Yet after all these years it's still a pain on Windows with the Visual Studio world. And if it doesn't work with VS (as in: being able to have a VS solution which uses pkg-config automatically to find libraries / flags / ... without having to fudge with scripting it yourself), it's 100% irrelevant to any cross-platform discussion (and thus to most C / C++ / native ... software development).
If you are serious about cross-platform, the answer is simple, and has been the same for decades: don't use Visual Studio to build for Windows, as it isn't offering you anything of value. This answer used to be maybe a bit controversial--hell: I even supported Visual Studio for years back in the early '00s--but now even Chrome builds with clang... if it is good enough for Chrome, it is good enough for you.
The main software I develop (https://ossia.io) has been built with clang/libc++ on Windows for 4+ years now, so you are preaching to the choir; but many companies insist on building with MSVC, and you cannot just tell them "no".
There’s “builds on Windows” cross-platform and there’s “works on a big-endian MIPS router with a 2.6.x-series Linux kernel and an ancient Busybox” cross-platform. And a lot more. All of those are important to somebody. So “100% irrelevant to any cross-platform discussion” for lack of Windows support sounds exaggerated to me, let alone for lack of Visual Studio support.
Also, well. Before the article under discussion, there was no decent Windows support, because no pkg-config user was willing to donate their time or money to improve the Microsoft Windows ecosystem. Now somebody has come along, and there is. Perhaps that will happen to the Microsoft Visual Studio ecosystem as well, in time, or perhaps not. Probably not going to be me, though.
(In the case of CMake, the support appeared when Microsoft spent their money on improving the Visual Studio ecosystem, because C++ programmers not using Visual Studio became numerous enough, and CMake popular enough among them, that not being able to interoperate with their work diminished the IDE’s value. I don’t see that happening with pkg-config, mainly because a lot of its usage is in the C world, and Microsoft doesn’t really care all that much about that.)
CMake is not a build system, it does not replace make, or vsproj, or even pkgconfig.
It's a _meta_ build system - that is, it generates makefiles/vsproj/etc.
CMake does not _find_ libraries magically. You can have CMake modules to use pkgconfig, or custom search scenarios, or whatever else you want, to find libraries and flags.
Really it's two different things working at different levels, one does not replace the other.
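For instance, the stock FindPkgConfig module that ships with CMake bridges the two. A minimal sketch (glib-2.0 stands in for any .pc-providing dependency):

```
# CMakeLists.txt: let CMake query pkg-config for a dependency
find_package(PkgConfig REQUIRED)
pkg_check_modules(GLIB REQUIRED IMPORTED_TARGET glib-2.0)

add_executable(app main.c)
# The imported target carries the include dirs and link flags
# that pkg-config reported.
target_link_libraries(app PRIVATE PkgConfig::GLIB)
```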
CMake has an independent module resolution system with Find*.cmake files. It's not great, and not a lot of libraries support it, but if you're on a Linux system you may find those Find*.cmake files in /usr/lib/cmake.
I'm referring to the dedicated PackageNameConfig.cmake files that you'll find pretty much everywhere these days, not the old FindPackage.cmake modules.
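For what it's worth, consuming one of those config-file packages looks something like this (fmt is merely an example of a library that ships one):

```
# Config-file mode: find_package() loads the package's own config
# file, typically installed under <prefix>/lib/cmake/fmt/.
find_package(fmt CONFIG REQUIRED)
target_link_libraries(app PRIVATE fmt::fmt)
```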
This is correct. CMake is a configuration tool, not a build system. It generates Makefiles, Xcode projects, VS solutions, or the like. It's meant as a replacement for autoconf or the ./configure scripts previously used to auto-configure Makefiles.
CMake-generated Makefiles call CMake at build time, unlike autotools, where m4 and Perl are no longer required once you’ve built the distribution tarball. If you really, really want, with autoconf alone you can even write the config.h by hand; the template lists all the defines that should go there. (With automake added to the mix, you probably can’t.)
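To make that concrete: the config.h.in template that autoheader produces is just a list of #undef lines, one per feature test, which you could resolve by hand (the macro names here are common examples, shown for illustration):

```
/* config.h.in, as generated by autoheader: one #undef per check */
#undef HAVE_UNISTD_H
#undef HAVE_STRDUP

/* A hand-written config.h for a platform you already know: */
#define HAVE_UNISTD_H 1
#define HAVE_STRDUP 1
```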
> to paraphrase Stroustrup: there are only two kinds of tools, the ones people complain about and the ones nobody uses.
Yeah, no. One of the interesting things is that we routinely use some tools without even really thinking about them. We don't complain about them because we never think of them at all, even though we're using them; they're that unobtrusive.
In its original context, Bjarne's claim is belied by the StackOverflow surveys. You can choose to imagine that "nobody uses" Rust, which famously keeps topping the "most loved" charts, but not far down the list is Python.
It would be nice if our industry could manage something better than .pc files, but I'm an old man; when I wrote all my early code this didn't exist, and your best chance was to cargo-cult some auto-detection of common libraries.
What's the problem with pkg-config? You mean the slightly inconsistent behavior across those different implementations? Are they really relevant in practice? I never really had problems with it.
Exactly my thoughts. We can do better than both of these solutions, but it's going to take an industry-wide effort to adopt the practice. I like what vcpkg is doing... I'm a cmake guy myself, but I always found pkg-config useful, to the point where you're upset when it's not present, and that presents a problem. Build chains need consistency, reproducibility, and some form of order.
Flag soup is terrible for dependency management. It only works for a single compiler (family); there are no semantics attached to what each flag means; and the undefined ordering or grouping of flags like `-ffast-math` or `-fno-fast-math` (which are ABI-affecting) means you don't know what you'll actually get. More fun: if you have multiple prefixes of dependencies, one `-L` can change what any following `-l` flag means (which is the main reason I think absolute paths to libraries are far better on the link line).
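A sketch of that `-L`/`-l` ambiguity (the paths are made up):

```
# Both prefixes install a libfoo; with flag soup, whichever -L comes
# first wins for every subsequent -l:
cc main.o -L/opt/a/lib -L/opt/b/lib -lfoo   # links /opt/a/lib's libfoo
cc main.o -L/opt/b/lib -L/opt/a/lib -lfoo   # links /opt/b/lib's libfoo

# An absolute path leaves nothing to search order:
cc main.o /opt/b/lib/libfoo.a
```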
I feel like pkg-config should really return a list of linkable library files instead of using -L/-l, but I figure it does the latter because of something to do with libtool and .la files, right?
Very nice, and I liked the detailed description of implementation choices. A bunch of which I personally wouldn't agree with (of course) but that's programming for you. :)
As a minor detail, even as a long-time computer user of the technical kind, I found it difficult to distinguish between "pkgconfig" and "pkg-config" as two separate things throughout the text, probably because the dash is silent when I pronounce these words. Not sure how to fix it; perhaps by actually making the names longer ("the Freedesktop pkgconfig" and "pkgconfig.org", for instance).
> If a crazy person — or well-known multinational corporation — comes along and puts a space in their system’s installation “prefix”, this .pc will not work.
Comedy option: `C:\PROGRA~1` still works as expected on my Win10 box.
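To illustrate the failure mode (a sketch; `foo` is a stand-in package):

```
# foo.pc installed under a prefix containing a space:
prefix=C:/Program Files/foo
Cflags: -I${prefix}/include

# Naive whitespace-splitting of `pkg-config --cflags foo` yields two
# "words", -IC:/Program and Files/foo/include, and the build breaks.
# The DOS 8.3 short name sidesteps the whole problem:
prefix=C:/PROGRA~1/foo
```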
I love to see slim alternatives to bigger, older tools (acknowledging that both versions have their place in the world).
However, I have to ask: why C? Apparently the author is very comfortable using it, and I'm sure using C has enabled them to make a very efficient implementation with a small binary size.
But this program is just slinging around strings, right? Is manual memory management actually important, or even desirable?
ANSI C (and, to a lesser extent, C99) is pretty much the only language you can compile anywhere, forever. For this project, portability is the #1 requirement.
If he used, say, Rust, then the target system would need a Rust compiler, and furthermore the language is not even close to being as standardized as C is at the moment; i.e., who knows whether in the future you will still be able to compile a program written in 2023's dialect?
Lua is pretty small, portable, and easy to build; you could ship the Lua source code with a "lua-pkg-config" so you don't need to install anything else. Lua is not my favourite language, but for something like this it might be a nice fit.
Yes, the official Lua implementation is written in a very widely supported subset of C, so it is probably one of the most portable languages, along with C itself. Lua might well be a bit more concise for string processing like this, but in such a case that's really just syntactic sugar. The C is uncomplicated and efficient.
Everyone and their dog can compile C, and the initial purpose of this project was to get something working on Windows. As you say, this is a relatively uncomplicated program, and there's certainly no need for higher-level features.
Is the only time you choose C when you need manual memory management? I suspect that had nothing to do with their decision.
Rust would have been a better option, but it's probably just that the author is more familiar with C. Maybe also because the author wants it to be more cross-platform.
[1] https://gitlab.freedesktop.org/pkg-config/pkg-config/-/commi... https://bugs.freedesktop.org/show_bug.cgi?id=98215
[2] https://lists.freedesktop.org/archives/pkg-config/2018-May/0...