Are We Modules Yet? (arewemodulesyet.org)
96 points by dureuill 20 days ago | 90 comments



I've been working professionally with C++ since 2001, and I'm currently a team lead.

The compensation packages we are able to offer to new hires means we're generally hiring from the middle of the talent pool, not the upper tier.

The complexity of C++ has long since outpaced the fluency of the hiring pool. In my experience, the average C++ professional (who applies to our open job ads) knows C++14, and might not balk at an occasional C++17 feature.

It doesn't matter if the compilers support modules or not, because in practice, I won't be able to use modules in the workplace for ages.

--

Standard disclaimer - I'm not able to predict the crush of changes coming as generative AI for software development proliferates.


I haven't coded C++ professionally since a couple of standards ago. However, I believe that something like C++ modules can be picked up quickly even by "middle of the talent pool" devs, because it's a useful feature for them. What might hinder modules adoption, besides compiler availability, is the rest of the tooling ecosystem and the particular idiosyncrasies that most C++ projects have.


In every major project I've been involved in (and it's not terribly many, to be fair) things keeping us on previous versions were almost always libraries or other support software, rarely if ever was it the devs.


Really? For me, it has been almost exclusively management not wanting to invest the manpower necessary for the adoption.

If the technology stack allows it, I assume most passionate developers would rather develop with newer toolchains than older ones.


These might be two sides of the same coin - usually the reason there is a huge manpower requirement is because of the libraries and supporting tools.

Most developers want to use the newest and greatest, but are held back.


Why not use them as soon as the compiler supports them? Your teammates will either ask you what they are (since you are the team lead), or look them up on cppreference.com.


I used to do this back when C++11 came out and ended up regretting it. When a feature just comes out, you can understand it in isolation, but it's very hard to foresee how it will fit in with other features, tools, libraries, etc. So you use it in a certain way according to how various blog posts and thought leaders say you should use it, and then after a year you realize that no one could have anticipated that the new feature has all kinds of footguns, and certain features even end up being deprecated or superseded by yet another new feature.

When a new feature comes out, it's best to let it settle in a bit, maybe experiment using it on smaller side projects, but avoid diving into using it too early before it's really well understood.

For example, I now cringe every time I see curly-brace initialization of the form T{...} and how it was advocated as the one true way to initialize everything in C++, only for everyone to realize a year later that it has its own footguns (like with initializer lists). With C++20 fixing almost all of the original problems that led to T{...}, the best practice nowadays is to go back to just using plain old T(...); there's little to no reason to use T{...} anymore.
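
To make the footgun concrete, a small sketch of the classic std::vector trap (variable names invented for illustration):

    #include <vector>

    int main() {
        // Braces prefer the std::initializer_list constructor:
        std::vector<int> a{10, 2};   // two elements: {10, 2}
        // Parentheses pick the (count, value) constructor:
        std::vector<int> b(10, 2);   // ten elements, each equal to 2
        return (a.size() == 2 && b.size() == 10) ? 0 : 1;
    }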

There was also Herb Sutter's Always Use Auto, which was then revised to Almost Always Use Auto, and now I think most developers take the sensible approach of using auto for redundant names like iterators, or unnamable types like lambda expressions, and avoid using it everywhere so as not to turn the codebase into an opaque soup of inscrutable keywords.
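
A minimal sketch of that sensible middle ground (names invented for illustration):

    #include <map>
    #include <string>

    int main() {
        std::map<std::string, int> hits{{"modules", 1}};

        // auto for a redundant, noisy iterator type:
        for (auto it = hits.begin(); it != hits.end(); ++it)
            ++it->second;

        // auto for an unnamable type: a lambda's closure type has no name
        auto bump = [](int n) { return n + 1; };

        // but keep the concrete type where it carries useful information:
        int total = bump(hits["modules"]);
        return total == 3 ? 0 : 1;
    }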


Also looking at C++ every day isn't a very fetching proposition.


Yeah, I think grandparent is confusing "not knowing the language features" with "wanting nothing to do with the latest metaprogramming mess the cooks are serving". Job listings with C++ are a mess; you never know what level of insanity you are going to get (bonus shoutout to those that say "C/C++").


It's a chicken and egg problem.

You mention the average C++ programmer won't know the latest features, but if you did find an enthusiast who knew the latest features, you and the team probably wouldn't allow the use of those new features.

I can't imagine a more soul draining job than maintaining a corporate C++ codebase. Talk about doing the bare minimum.


> I can't imagine a more soul draining job than maintaining a corporate C++ codebase.

To me, that's anything web-related in Java/C#/Go/JS.


> I can't imagine a more soul draining job than maintaining a corporate C++ codebase.

Especially if most of the developers learnt Microsoft Visual C++ and believe that is proper C++!


re: changes coming from generative AI, it would need fluency in modules and other modern formulae. It would get that by being trained on modern conventions. So it's the same problem then, isn't it?


I feel that the site would benefit from a paragraph or two about what C++ Modules are and why devs should use them


You know any other language that uses the header/cpp split like C/C++ do?

This is C++'s way of finally getting rid of them, akin to Swift or Rust.
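
For the curious, a minimal sketch of what that looks like with a C++20 named module; the module name `greeter` is invented, and this assumes a toolchain and build system that already handle module dependency scanning:

    // greeter.cppm -- module interface unit (replaces greeter.h + greeter.cpp)
    module;                      // global module fragment: legacy #includes go here
    #include <string>

    export module greeter;

    export std::string greet(const std::string& name) {
        return "Hello, " + name + "!";
    }

    // main.cpp -- a separate file; it imports the module instead of including a header
    #include <iostream>
    import greeter;

    int main() {
        std::cout << greet("modules") << '\n';
    }

The interface is compiled once into a binary module interface rather than re-parsed textually by every consumer, and macros defined inside it don't leak out.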


It's called putting all your code in the .h


Which is going to murder your compilation times. Modules do it better; the MS Office team reports the following: worst case, a 0.9% regression compared to PCH; best case, a 21.2% improvement compared to PCH.

(From https://m.youtube.com/live/H6GQUg5JquU?si=1iC_OvRQ_MprzDTQ&t...)


And what are the metrics in comparison to not using PCH?


Agreed, I’m not a C++ developer and I’ve barely guessed what this is about


> I’m not a C++ developer and I’ve barely guessed what this is about

I feel those statements are related.


But concepts are easily transferable, so a line or two could help a lot :)


I wrote my own build system to use C++20 modules before CMake even had support for them, and while I have probably had net benefit from using them, I can’t recommend them for anyone in their own projects at this point.

The feature has so many irregularities that could only come out of a standards process, there are too many compiler bugs (just try using header units), the different implementations are too fragmented (I’m only using clang, which makes this easier on me), and there is a lack of things like module maps that would dramatically improve usability.


As a longtime C++ user, I'd sooner just upgrade to a better system programming language. C++ is a weird mess.


C++ has long surpassed the point where mere mortals like me can understand it; it's so loaded with baggage, footguns, and inscrutable standards language that honestly I think the only thing keeping it going is institutional inertia and "backwards compatibility" (air quotes).

I work extensively in the embedded space and unfortunately C and C++ are still pretty much the only viable languages. I can not wait until the day rust or some other language finally supplants them.


I'm currently doing work with Rust on ESP32 platforms and I'll have to say, it's not quite ready yet. Debug tools still have issues, and we're facing some problems with vendor-specific magic in ESP-IDF version 5.


What's stopping Rust from being used in embedded?


Among other things, tooling and vendor libraries. Vendor libraries are often composed of thousands upon thousands of lines of auto-generated C headers and written in some bespoke format. Demonstration code and/or sample drivers are almost invariably provided in C. Of course you _can_ rewrite these in Rust, but if you're an engineer trying to get shit working, you'd first basically have to reinvent the whole wheel just to do bringup.

I don't even want to talk about the state of proprietary vendor tooling...


Documentation is severely lacking, and vendor-specific libraries and build systems sometimes interfere with Cargo.

There's also the problem of rust-analyzer being relatively flaky in general, and even more so when used with environment-specific / Kconfig / build-system feature flags that enable or disable certain library headers.


The grass is always greener. Rust also has rough edges and there's maybe a fraction of a fraction of a percent of Rust code out there to work on in the corpus of systems software, and a lot of it has to interface with C anyway. Even crates in Rust are overly complicated and they've set up a system that allows people to squat on well known names.

I think if you want to work on systems software you should enjoy working with legacy cruft, otherwise you're just going to be miserable. I mean it's everywhere from the language, to the POSIX APIs, to device drivers, to hardware quirks... C++ is only a quarter of the problem.


Which one would you recommend? My ideal C++ alternative would be something like Swift but faster.


For low-level compiled system applications: Rust, Zig etc.

For compiled garbage-collected applications (web/cli): Go.

For high-level applications (web/cli/etl/desktop): Java, C#.

Also here is good writeup: https://hackernoon.com/the-real-c-killers-not-you-rust

discussed here two times:

https://news.ycombinator.com/item?id=34792932

https://news.ycombinator.com/item?id=39770467


I have a similar opinion: For low level stuff Rust or more "obscure" langs like Zig or Carbon offer a lot in that space. The moment you leave low-level I'd always go with C#/Java or TS for web-stuff. The productivity gains you get from switching from C++ to C# are absolutely insane.


Little reason to use Go here. C# is a much better language at allowing you to tactically write high-performance low-level code where it matters and rely on higher-level abstractions where it doesn't (struct generics are just like Rust's, letting you expend a little effort to achieve zero-cost abstractions, although not as convenient as just using objects everywhere).

Go is inadequate, poorly typed, has abysmal FFI overhead, and produces bloated binaries, as it does none of the metadata compression and other tricks that C#'s AOT compilation does.


With Go, I can easily cross-compile a statically linked binary for each of the major platforms.

You cannot do that with C#.


Cross compilation story in C# is very good actually.


With Go, I can use my Linux CI server to compile a statically linked binary for MacOS.


You can publish across platforms with modern .NET


But the fact that developers often need to be able to cover any one (or even multiple) of these areas, and that language proficiency (with platform APIs and various quirks) is quite hard to achieve, makes me think that the actual alternative is the elephant in the room, the one not even listed here: JavaScript.


Having been programming continuously in JavaScript for over 20 years with all the popular frameworks, I'd try to stay away from it as much as possible:

1. Dynamic typing must die. Except for the R/Julia/Python (aka JuPyteR) notebook use case, where it's awesome. My list includes only statically typed languages. TypeScript is much better, but its type checking still fails sometimes, unlike in truly statically typed languages.

2. NPM is a mess that allows any transitive dependency to run arbitrary code on your machine at installation time (including cleaning up after itself). Compare that to Java's Maven: library quality is much better, and no arbitrary code runs, just downloading.


> allows any transitive dependency to run arbitrary code on your machine at a time of installation

This point gets parroted so often on HN [1]. You can install packages with the --ignore-scripts flag to disable this behavior, or just set the option globally in your NPM config file. I do like the way Bun disables lifecycle scripts by default [2], but it takes me all of two seconds to run `npm config set ignore-scripts=true` on a new machine, so it's basically a non-issue for me.

[1] https://news.ycombinator.com/item?id=38797176

[2] https://bun.sh/docs/cli/install#lifecycle-scripts


Are you seriously mentioning JS as a C++ alternative? I am confused.


I am old enough to remember when everything was written in C++, except embedded and OS kernels in C. Today even my chat, email client and IDE are written in JavaScript...


> Today even my chat, email client and IDE are written in JavaScript...

Yeah I don't consider that a good thing :-)


Just so you know, Rust is in many ways a lot like Swift, and not just because many Rust compiler devs went later to work on the Swift compiler.


I think it depends on what you use C++ for. For low-level high-performance systems work, the thing I probably miss most in many of the alternatives is the extensive metaprogramming and generics capabilities of C++. This is unfortunate given both the power of this language feature and how much opportunity there is to improve the ergonomics of C++ metaprogramming.


Rust, since you're mentioning speed. It was refreshing to use, partially because the standard library is so much nicer.


That's C# :)


And yet C++ is the best programming language.

If you had to pick only one language to use, for everything, you'd pick C++. It can do it all, from bit fields to polymorphic classes to closures; it's safer and saner than C (you haven't read the standards if you think otherwise); and it's got a level of support and maturity (and probably lifespan) beyond any other comparable language.


I love this having just started reading through the C++ 20 stuff.

However, a key opportunity is missed in that neither the icon nor the site links in the footer link to a short definition of the language before modules (the lack), the impact of modules on the design of the language at present (the real), and its place in the future of programming languages (the imaginary and the symbolic).


Funny... the first poster and I clicked reply at moments the HN front-end can't tell apart. The meta point is the same.


After writing build systems for a C/C++ operating system and years optimising builds for C/C++ operating systems the major disaster by far is the C preprocessor.

This is the source of all the evil. Even a hello world program involves reading through hundreds of kilobytes, often megabytes, of headers that have to be parsed again and again for every source file, but which can produce totally different outcomes in each case depending on the compiler, the OS, the definitions on the command line, whatever's defined in the source code itself, and how the filesystem is laid out.
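
A tiny, made-up illustration of that context dependence: the "same" header yields different code in every translation unit, depending on what was defined before it was included, so it has to be re-parsed every time.

    // platform.h -- its contents depend on what the build system or the
    // including file defined beforehand (macro names invented).
    #if defined(PLATFORM_POSIX)
        using handle_t = int;
    #elif defined(PLATFORM_WIN32)
        using handle_t = void*;
    #else
        #error "no platform selected"
    #endif

    #ifdef ENABLE_TRACING
        #define TRACE(msg) trace_impl(msg)   // trace_impl is hypothetical
    #else
        #define TRACE(msg) ((void)0)
    #endif

A module interface, by contrast, is compiled once, and importers can't change its meaning with macros, which is much friendlier to dependency tracking.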

You can forget managing the dependencies on large projects this way; they are overwhelming. Every build system tends to be leaky and imperfect so as not to get drowned in dependencies, and the fanciest systems all tend to have big holes here or there, or they have to use huge "catchall" dependencies to try to be correct at the cost of efficiency.

I hoped modules would remove this problem but so far I'm not sure. I'd love to get the opinion of someone who has used them. My read-ups about it didn't seem that hopeful - I got the impression of them being a bit like pre-compiled headers.


C/C++ sucks to write dev tooling (e.g. syntax highlighting, LSPs, static analyzers) for. Pretty much everyone leans on libclang for parsing because very few people are insane enough to try to reimplement a parser themselves, let alone all the GNU extensions. And even then, macros make robust parsing really difficult. Imagine trying to parse a file that contains two programming languages that can be arbitrarily interleaved at almost the character level. That's basically what C/C++ are.
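
A deliberately silly, entirely invented example of that interleaving: none of the lines below parse as C++ on their own, yet after preprocessing it is an ordinary class definition, and a highlighter or LSP has to cope with both layers at once.

    #define BEGIN_CLASS(name) class name { public:
    #define END_CLASS         };
    #define BODY(...)         { __VA_ARGS__ }

    BEGIN_CLASS(Widget)
        int poke() BODY(return 42;)
    END_CLASS

    int main() { return Widget{}.poke() == 42 ? 0 : 1; }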


Named modules (not header units[1], which are a workaround for libraries not yet migrated to C++ standard modules) straight-up disallow exporting macros. Which is a good thing. I can't stand macros.

[1]: https://clang.llvm.org/docs/StandardCPlusPlusModules.html#he...
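
A quick sketch of what that buys you (module and macro names invented): a macro defined inside a named module's interface stays local to it.

    // limits.cppm -- named module interface
    export module limits_mod;

    #define INTERNAL_LIMIT 64            // visible only in this file

    export constexpr int limit() { return INTERNAL_LIMIT; }

    // consumer.cpp -- a separate file
    import limits_mod;

    static_assert(limit() == 64);        // the exported function comes through
    // ...but INTERNAL_LIMIT is not defined here; named modules don't export macros.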


> Estimated finish by: Thu Nov 03 3892

I think this line on its own sums it up.


I think calling GCC's support of modules "partial" is a tad generous. It's pretty easy to hit ICEs/segfaults when trying to use modules with GCC, which is a good reason why it's not worth it for libraries to support modules at all.



Standard library support is not there in any meaningful way across compilers, so I fail to see why anyone would adopt modules at present. MSVC is the furthest along in its support, last I looked. Once compiler support is available, I would expect usage to increase fairly rapidly. As with many things C++, modules are a nice-to-have, adopting a successful strategy from other languages/ecosystems, and many will choose to never adopt them.


Estimated completion by 3892 is cute, but not surprising really given that only 1/4 compilers and 1/3 stdlibs support these completely. Presumably most projects aren't going to try to migrate until it's more generally available (I haven't looked at them in years, not sure if it's easy or possible to support it for one toolchain but not another).


No compiler comes close to completely supporting modules. MSVC is the furthest along, and it does not by any means fully support modules. I think an upcoming version will finally have somewhat usable support for import std;
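
When it does land, usage is pleasantly terse. A minimal sketch, assuming a toolchain whose standard library actually ships the std module:

    import std;   // replaces a stack of #include <...> lines

    int main() {
        std::vector<int> v{1, 2, 3};
        std::cout << std::accumulate(v.begin(), v.end(), 0) << '\n';   // prints 6
    }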


Some notes:

This website scrapes vcpkg's registry[1], which contains many C libraries which are unlikely to ever receive C++20 module updates. Many are primarily binary executable packages, like lunarg-vulkantools. It is quite unfair to judge C++ module support by this. There are even bugs in the table: the issue tracking Vulkan-Hpp module links to https://github.com/KhronosGroup/Vulkan-Hpp/issues/121, but it was actually implemented in https://github.com/KhronosGroup/Vulkan-Hpp/issues/1580 (full disclosure: I implemented it).

Boost maintainers have picked up on this[2], which is big.

The big 3 compilers have had a myriad of bugs, ICEs, and redefinition errors, despite what is claimed on cppreference[3]. VS 2022 17.10 will only just fix some of these, and G++'s module support isn't even released yet. Clang 18 has seemingly full(er) support for C++20 modules, but clangd is broken, and it seems mixing standard library headers and `import std` might still break, as will header units (`import <header>`).

CMake released C++20 modules support with 3.28, and will release `import std` support with 3.30.

This is painful but IMO worth the paper cuts that the bleeding-edge adopters will experience in the next year or so as modules are implemented.

I fully believe that a good one-third to half of build time and power consumption in the past 40+ years of compiling C and C++ code (considerably more so in the case of template-heavy C++ header-only libraries and projects) has gone to parsing and re-parsing headers and the resultant output.

Headers are a distinctly 1970s approach to compartmentalisation. Other languages sorted out dependency and library/import resolution years ago; there's no reason the C and C++ world has to be stuck with essentially copy-pasting code over and over. The embarrassingly parallel building that results from headers is fake; it takes more time and more energy than strictly necessary.

[1]: https://vcpkg.link/browse/all

[2]: https://anarthal.github.io/cppblog/modules

[3]: https://en.cppreference.com/w/cpp/compiler_support/20


Ha, there I was assuming this would be about JavaScript. Good to know we all have struggles.


Same, I was ready to come in and complain, but then I was just confused. Been using C++ daily for 5 years but have never heard of C++ modules.


As of this writing, the term "debug" has not appeared a single time anywhere in this discussion. What's the VS Code step-through debugger experience for C++ modules?


What's up with the American flag branding on the logo?


It's a play on the old Uncle Sam "I want YOU" recruiting posters.


It's referencing a very famous recruitment poster from the World Wars


My C/C++ development environment is now Python + C extensions, which I treat as a portable runtime and build system that does what I need


Who all are doing the `arewe<x>yet` Web sites, other than Rust and this C++ one?


There are quite a few in the JavaScript/browser world:

- Browser Houdini effort: https://ishoudinireadyyet.com/

- Service workers: https://jakearchibald.github.io/isserviceworkerready/

- Svelte 5 new version: https://svelte-5-preview.vercel.app/status

- Turbo bundler Rust rewrite: https://areweturboyet.com/

And prob much more


It is a Mozilla culture thing that turned into a Rust thing, and yeah this is the first one I’ve seen outside of those two general communities so far.

Here is my, uh, “favorite” https://arewereorganizedyet.com/


That one is great (but sad).



Wonder why it seems like all the Qt5 entries are "no help wanted"...


Qt5 is in maintenance mode. Qt6 has long been released.


Qt 5.15 at this point has been out for 4 years, already out of "normal" commercial LTS and will reach the end of extended commercial LTS next year. They don't have any incentive to do this kind of change.


For the wizards: where would be a good place to start for someone who did a Python bootcamp but wants to learn C++ and contribute to some of these?


C++ is a big language compared to Python. Many of us with years of C++ experience don't fully understand the power and complexity of modules, partially because so few libraries and compilers support them, as shown by TFA.

I'd recommend becoming an expert in Python modules. How they're packaged, how they're referenced and installed by pip, etc. Then learn how headers and translation units work in C++. How templates operate is an important concept to understand. Jumping right to C++ modules without a deeper understanding of the C++ compiler or without a reference point for other languages' module concepts will only lead to confusion.
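
As a concrete starting point for the header/translation-unit model mentioned above (file and function names invented):

    // counter.h -- declarations only; textually copied into every file that includes it
    #pragma once
    int next_id();

    // counter.cpp -- one translation unit, compiled separately, provides the definition
    #include "counter.h"
    int next_id() { static int id = 0; return ++id; }

    // main.cpp -- another translation unit; the linker stitches the two together
    #include "counter.h"
    #include <iostream>
    int main() { std::cout << next_id() << '\n'; }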

If you're totally new to C++, I'd actually recommend reading "The C++ Programming Language" cover to cover just to "know what you don't know" and then roll up your sleeves and get some experience with a hobby project.


If you aren't aware of the C++ Core Guidelines[1], they should be on your radar.

Also, it might not be a popular opinion, but I think Bjarne's books are just fine.

A Tour of C++ (3rd edition) [2]

Principles and Practice Using C++ (3rd Edition) was just published in April 2023 [3]

[1] https://github.com/isocpp/CppCoreGuidelines/blob/master/CppC...

[2] https://www.stroustrup.com/tour3.html

[3] https://www.stroustrup.com/programming.html


I still like the Effective Modern C++ books by Scott Meyers, although they are a few years old already.

The modern parts of C++ are alright (inelegant and cumbersome, but alright), but because the language has grown over the years and best practices have changed, it's difficult to see which parts are worth learning as a newcomer.

That's why I like the books mentioned above. They give concrete do's and don'ts, and explain some important concepts like move semantics and why and how best practices changed.

Keep in mind that these books require some preliminary knowledge, but you should be fine if you learn the basic stuff from an online tutorial before going through the books.


I'd start with an actual use case or existing project. Here's a fun one: https://github.com/assaultcube/AC



Learn Rust instead. This ship has holes.


Would love this in Unreal, but one can dream.


Silly question: What's the difference between C++20 modules and https://conan.io? (Google was vague, and ChatGPT, you know, sometimes makes things up, so I'd rather ask fellow humans...)

(edit: Conan seems to address C++20 modules and seems to seek compatibility, but as a non-C++ developer I'm not sure I read it right https://blog.conan.io/2023/10/17/modules-the-packaging-story...)


3892?


As always, I first read the comments and then look at the post. I was really intrigued by what this number could mean.

I did not expect it to be the expected year of completion.


Is it just me, or does it seem like modules are a good idea that is totally dead on arrival? It's been four years, we've had a whole new edition of the C++ standard in the meantime, and Clang and GCC still don't have full support for modules.


The issue is that everyone has their own hopes and dreams for what modules in C++ should and could be, and what ended up being standardized is an unbelievably complicated and overengineered mess. A small number of people worked on standardizing modules, and the proposal is so complicated and obscure I really doubt if most of the people who voted to approve it actually understood what it is they were even voting on.

In the future, things should not get standardized without a working implementation of the feature that people can actually use. Even better would be to have multiple similar non-standard features that each compiler can implement, and then standardization can serve as a way for them to converge.


The problem with most new features in the standard is that they are so complicated and tough to implement that it takes ages, leading to the current situation where there are features already planned for C++26 while compilers barely support C++20.

It's even worse for embedded, where it takes even longer for these modern compilers to be included in the toolchains, etc. For example, at work we are looking forward to being able to use GCC 13 later this year so we can use some features that were lacking in the GCC 11.3 we are using currently.

I don't think we can expect modules to be widely adopted before like 2026-38.


Yeah. Does gp not use C++? No new feature gets support in like less than 4 years. :p And I don't think that's a bad thing unless it's replacing a premature abomination feature released too soon, like `auto_ptr`. That's getting removed entirely soon; imagine if it were harder to do that because it got adopted faster and so much code relied on it that WG21 decided to keep it in "deprecated" limbo forever.

Also, we're on GCC 8! TnT '17 is the highest it can go.



