This is the great strength and weakness of C++. Increasingly the answer to C++'s rough edges is "We don't do things that way anymore. Everyone does X now", where X is the hot new thing. RAII is the best example I can think of, where some people insist that no one uses the "new" and "delete" keywords anymore. Except for all the C++ devs who do, and all the existing C++ code that does and must be maintained.
It leads to the current situation where you have C++ "the language" which is everything, and then C++ "the subset that everyone uses" where that subset constantly changes with time and development context.
I think I agree with the sentiment but not the example: no one seriously advocates for no-new/no-delete (collections must still be written, somewhere), but rather that new/delete are generally a code smell and there are idioms that can help isolate bugs. Part of maintaining old code is updating to those idioms.
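To make "isolate" concrete, here's a minimal sketch of that idiom (the Buffer class and its members are invented purely for illustration): new[]/delete[] appear only inside one small class, so the rest of the code base never touches raw allocation.

    #include <cstddef>

    // Toy fixed-size buffer: new[]/delete[] live only inside this class,
    // so callers never write a raw allocation themselves.
    class Buffer {
        int*        data_;
        std::size_t size_;
    public:
        explicit Buffer(std::size_t n) : data_(new int[n]{}), size_(n) {}
        ~Buffer() { delete[] data_; }
        Buffer(const Buffer&) = delete;             // keep ownership simple
        Buffer& operator=(const Buffer&) = delete;
        int&        operator[](std::size_t i) { return data_[i]; }
        std::size_t size() const { return size_; }
    };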
But yeah, this kind of thing hit me recently on the interview circuit. I wrote some correct C++ (in that it was correct for the idioms in vogue when I last wrote C++ regularly for money) but I got feedback that I wasn't as senior as they hoped due to my (lack of) C++ knowledge. Part of that was a shitty interviewer, but it's also just a fundamental part of the language. If you leave for a few years or try to change shops, you find that everything under you has been removed, or that a completely different subset of the language is being used elsewhere. The complete lack of an ecosystem just reinforces that.
To be fair I imagine the same happening to someone coming to a Java interview still writing Java 8, or writing C# as if .NET Framework 4.8 is the latest version (C# 7.3).
I kinda look to when the thing finally stabilizes as a sign of how bad the problem was. For instance, Javascript front end was a nightmare for a long time, but it seems to have finally stabilized into a reasonably stable configuration with a couple of winners and some minor specialized choices, and the endless churn is now a minor sideshow instead of something that changes the default choice every six months. There was a bad problem there, but it seems to have been satisfactorily conquered for now. (I expect that as static typing creeps ever more deeply into the JS ecosystem it may at some point cross a critical threshold and cause some more churn, but at least for now things seem more stable.)
While C++'s churn frequency seems to be lower than the Javascript front end churn frequency, as an outsider it still seems like "best practices" in C++ are churning every 1.5-2 years; it's been happening for my entire career, and it's still happening. If I seem a bit unsympathetic to the claims that the problems are solved if you just write your C++ code this way now, it's because I first heard that in 1998 or so, for a set of common practices now considered laughably out of date, of course.
At some point it becomes more cost-effective to just "churn" on to Rust next time, because even though Rust is a younger language that spent that same time frame going through its early design iterations, it still seems to have settled into a lower-frequency churn rate for "best practices" than C++.
There's probably some interesting and deeply profound reason why C++ just can't seem to stabilize across what is approaching an entire human generation, but I'm nowhere near interested enough in learning it to actually learn the amount of C++ it would take to find it.
> it still seems like it has settled into a lower-frequency churn rate lately for "best practices" than C++.
Rust is still adding a ton of new language features, especially around async, compile-time code evaluation and the type system (const generics, GAT/HKT, existential types, etc.). We'll very likely see further developments in more areas next, e.g. to match C++ developments in parallel and heterogeneous compute (GPUs and the like), or to add forms of proof-carrying code, etc.
A lot of that is not what I mean by "churn". What I mean by "churn" is changes in best practice. Python has been adding a lot of features, but with the possible exception of static typing support, most of them haven't made many changes to what constitutes best practices. They might make "nicer ways to write that code" but the old styles haven't been deemed wrong. async is also not exactly what I mean; this allows new code to be written that mostly couldn't before. This one is only a partial miss though since it did deprecate some older libraries, but those libraries weren't really deemed "the right answer" either.
C++ is constantly changing what a "best practice" is. The latest hotness from three-generations-ago churn is now considered broken.
C++ gets new features that provide a better way to solve common coding problems. This is not accidental, and not a problem: the new features were added because they offer that better way.
Failing to use the new feature is just failing to write code the best way that is available right now. In 2013 you had no choice but to do it the old way; but you don't have to anymore, because the language and std library have caught up with you.
Being responsive to the needs of its community of programmers is job one for a language committee. If new, better ways don't arise, your language has stagnated.
You're right it's not accidental, but the problem is that when you continually add new features you end up creating a confusing mess. Yes, the new features may respond to some need in the community, but by adding them you've also:
- Introduced possibly unforeseen issues, because features are never added in isolation; they interact with one another, and the more features you have the harder it is to test them all.
- Created confusion, because now all the old information on the internet is out of date.
- Forced everyone to update their tooling to support the new features.
- Made it harder for new people to start learning the language.
> Failing to use the new feature is just failing to write code the best way that is available right now. In 2013 you had no choice but to do it the old way; but you don't have to anymore, because the language and std library have caught up with you.
This is a nice idea but it doesn't reflect reality. If the C++ user survey [0] is to be believed, a full 67% of users were restricted from using the latest C++ at the time (either fully or for certain features).
> Being responsive to the needs of its community of programmers is job one for a language committee.
This is true but it also must be balanced against mission scope creep and design considerations. Being everything to everyone is not design.
People not using the latest Standard are waiting for support within their particular environment, not because they want to stay with a less performant version of the language.
> Failing to use the new feature is just failing to write code the best way that is available right now. In 2013 you had no choice but to do it the old way; but you don't have to anymore, because the language and std library have caught up with you.
What I hear you saying is that failing to use new features is a failure to write the best code available, and that before maybe you didn't have a choice but not anymore.
And yet, a huge chunk of C++ devs are forbidden by their organizations from using various C++ features that have been added throughout the years. It's not just C++17; it goes back even further. This is the whole point of TFA. So it's not just that people are "waiting" for tools to catch up, unless you have evidence of this.
Organizational inertia is a problem, but not a problem that a language or Standard can fix.
When an organizational logjam is broken, programmers can immediately switch over to the better, more modern way of coding enabled by the newer Standard they are allowed to use. If they were on C++98, and now they can use C++14, they are still better off than before, even if the current Standard is C++20: they will be able to code according to best practices for C++14, which are better than for C++98.
C++20 is adding a module system to C++. That's a major change to the compilation model.
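For readers who haven't seen it, a minimal sketch of what that looks like (the file names follow the MSVC convention and the `math` module is made up; toolchain support still varies):

    // math.ixx -- defines and exports a module
    export module math;
    export int add(int a, int b) { return a + b; }

    // main.cpp -- consumes it with import instead of #include
    import math;
    int main() { return add(2, 3); }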
It is also adding concepts, a major change in the way templates are to be written. The very article we're commenting on is discussing how exceptions should possibly be replaced by another error reporting mechanism; several proposals are in flight for this. There are also the "destructive moves" proposals, which have the potential to change a lot about how we write types.
A telltale sign that these changes are major is that the entire std has to be "modularized" to support modules, and modified to support concepts. Similarly, if exceptions are revised, a large chunk of exception-using functions in the std would need to be modified.
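As a rough illustration of the concepts change (a minimal sketch, not drawn from the article; the function names are invented): here is the same constraint written the old SFINAE way and the C++20 way.

    #include <concepts>
    #include <type_traits>

    // Pre-C++20: constrain a template via SFINAE.
    template <typename T,
              typename = std::enable_if_t<std::is_integral_v<T>>>
    T twice_old(T x) { return x + x; }

    // C++20: the same constraint expressed with a concept.
    template <std::integral T>
    T twice(T x) { return x + x; }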
On the Rust side, I think the only change that has even a comparable impact is const generics (and maybe specialization).
Existential types and GAT will change how to express some traits (allowing a LendingIterator for example), but I don't expect they will affect a large portion of Rust's std.
Also of note is that the Rust changes add new systems orthogonal to the existing features (const generics fill an obvious void compared to C++, same with GAT and existential types, where Rust's expressivity is limited in comparison with C++ atm). By contrast, in C++ the module system replaces headers, and a change to exceptions would replace the current exception system, creating churn.
That's probably correct; I have not used concepts yet (stuck in C++14 right now). I certainly have my beefs with both features (modules being orthogonal to namespaces, no standard way to find modules, new and exciting ways of committing ODR violations, a generally complicated module system with quirks when there is so much prior art on modules in other languages, concepts being structural and not nominal, concepts being a "lower bound" on behavior and not an "upper bound" (thus not eliminating duck typing)), but my larger point is the scope of these changes to the language, not their (purported) benefits, which I'm by and large unable to assess right now.
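To illustrate the "structural, not nominal" complaint (a hypothetical sketch; the type and concept names are invented): any type with a matching member satisfies the concept, whether or not it was meant to.

    // A concept is satisfied structurally: no opt-in declaration needed.
    template <typename T>
    concept Drawable = requires(T t) { t.draw(); };

    struct Widget { void draw() {} };   // intended match
    struct Cowboy { void draw() {} };   // accidental match: also Drawable

    static_assert(Drawable<Widget>);
    static_assert(Drawable<Cowboy>);    // duck typing survives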
> it still seems like "best practices" on C++ are churning around 1.5-2 years, it's been happening for my entire career, and it's still happening
I have to disagree with this quite strongly. What "best practices" churn are you seeing? Your reference example of "don't use new & delete" (which I agree with) was a single best practice change that happened ~10 years ago. Similarly https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines is like 5-7 years old now, and afaik hasn't had any significant revisions?
The "churn" was really pre-C++11 to post-C++11. It was more like a python 3 moment than churn, other than it's taking a long, long time for code bases to catch up.
I think you're mostly right, but there is a bunch of fluctuation around just how much ugly template magic you're supposed to use vs. plain old imperative control flow (<algorithm>, ranges, I'm looking at you!)
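As a concrete (and invented) example of that fluctuation, here is the same computation written imperatively and with C++20 ranges; which one counts as "best practice" depends on who you ask:

    #include <ranges>
    #include <vector>

    // Plain imperative control flow.
    int sum_even_squares_loop(const std::vector<int>& v) {
        int total = 0;
        for (int x : v)
            if (x % 2 == 0) total += x * x;
        return total;
    }

    // The same computation with C++20 ranges.
    int sum_even_squares_ranges(const std::vector<int>& v) {
        int total = 0;
        for (int y : v | std::views::filter([](int x) { return x % 2 == 0; })
                       | std::views::transform([](int x) { return x * x; }))
            total += y;
        return total;
    }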
Sure, but that in itself is also not an argument. The space of programming languages in general is moving forward and languages keep adding more (usually higher-level) features. C++ is mostly trying to keep up.
"I will contend that conceptual integrity is the most important consideration in system design. It is better to have a system omit certain anomalous features and improvements, but to reflect one set of design ideas, than to have one that contains many good but independent and uncoordinated ideas."
No, a language can stay current by evolving to fit its purpose. A good example of this is Matlab. Matlab is older than C++ and is still heavily used today, enough that it's one of the few remaining programming languages to support an actual business.
Matlab is not the same language it was in the 70s. But the core language is still largely about manipulating arrays and it does it well. It has evolved by adding functionality through toolboxes and creating new, innovative developer tooling. But it hasn't jumped on every PL bandwagon that has driven by over the last 50 years.
Matlab is still one of the top 20 programming languages in the world after nearly 50 years, according to the TIOBE index [0]. It has maintained a large user base and achieved profitability over this time, not by adopting every PL trend that has come and gone, but by adapting to new developments while staying focused on its essence as a language. It proves you can stay current without doing what C++ is doing.
Matlab's usage is down a bit since a peak in 2017, but over the last 20 years it's up over 300%, and it's done so as a for-profit language. Mathworks is a billion-dollar company, which is quite an achievement in the PL space in 2022.
Meanwhile C++ usage is up over the last couple years but the long term trend has been a steady decline over the last 20 years [1]. From 14% to a low of 4%, now back up to around 8%.
Citing TIOBE instantly demonstrates a fatally bankrupt argument: changes in TIOBE ratings have essentially nothing to do with actual usage, or with anything else quantifiable.
TIOBE is statistical noise. You would equally meaningfully cite your tea leaves, or crows flying overhead.
Okay, well then I suppose you have a better citation for your unsupported assertion that Matlab has been in decline for many years. I've backed up most of my assertions with citations, I think it's time you brought some sources to the discussion. What is your basis for anything that you've been saying here?
I mean, if TIOBE were really statistical noise as you claim, there wouldn't be clear industry trends reflected in the data, like the rise of Python in the last few years. And yet we see it, so clearly it's measuring something.
That is code for "it is evolving and I cannot be bothered to keep up".
All languages that are actually useful evolve. They get features that other people need, and you don't, just yet. Some of those features do turn out not to be perfect, because they are made by humans.
Comparing an old language to a new language, the new language will have many fewer of those both because it leaves many old things behind, and because it gets benefit of hindsight for the rest. Give it time, and it will accumulate "incoherence" of its own; the faster it evolves, the faster that happens.
The alternative is for a language not to be used. Then, it can stay pristine and "coherent", and not useful.
> "it is evolving and I cannot be bothered to keep up"
No, I actually teach C++ and have been keeping up with it for decades. It was the second language I learned in 1994 and I still code in it professionally today.
> Give it time, and it will accumulate "incoherence" of its own; the faster it evolves, the faster that happens.
Like I pointed out with Matlab, it's an older language than C++ yet is mostly coherent, far more so than C++. This has less to do with C++'s age, the march of time, or the human condition; otherwise more old languages would be as incoherent as C++, yet that's not the case.
Please leave the personal attacks out of this, thanks. I've said nothing personally against you and yet you've turned to calling into question my profession rather than the points I've raised. I understand maybe it may feel like I'm attacking you personally as I'm criticizing a language which I gather you are very fond of, but criticizing C++ is not criticizing you, and I would appreciate if you show me the same respect I've shown you. That's not what this site is for, and if you want to engage in that kind of back and forth I'd kindly decline.
My students give me high marks, my department (which includes faculty who have contributed to C++ spec over the years) is satisfied with my teaching, and I graduate students that go on to work at top companies and research labs around the world. I'm doing my job just fine, let's stick to talking about C++.
Adding features from literally every other language, just to create a mess, isn't really a selling point.
Right now, to learn C++ in a generic way, you need to learn pretty much all of programming paradigms and all of their variations - which is not a thing I would consider a plus.
>Right now, to learn C++ in a generic way, you need to learn pretty much all of programming paradigms and all of their variations - which is not a thing I would consider a plus.
I disagree, I would recommend learning enough where the tool starts providing value to YOU for YOUR problem domain and then pause/resume as needed. The main purpose of any programming language is to be productive in it and use it to solve a problem. In my opinion, there is no reason to learn more than you need about C++ unless you were a compiler author, on the C++ standards committee, or something similar.
"Generic," maybe, but in practice many people learn C++ as one of, as it were, two languages - one for application code developers and the other, for those who build libraries and provide APIs.
One of the reasons why I try to avoid C++ is that it's an unopinionated multi paradigm kitchen sink language.
There are great uses and great features, but there are so many of them and everyone has their own opinions... Even in this thread there's a clear subset of people who "adore" C++ exceptions.
So interesting; to me, a language being opinionated is the main reason to put it in the do-not-use bin. I strongly believe that the best way to develop is through embedded domain-specific languages adapted to individual problems, and opinionated languages are always way too limiting for that.
It is not the job of a general-purpose language to insert its own opinions. That is the system designer's job. Fighting with the language's opinions is a recipe for failure.
When it's an embedded DSL you always have the escape hatch of the entire host language, and it is not shocking to use it: being in an eDSL does not mean that you have to agree to it religiously; it's always a case-by-case engineering tradeoff.
Whereas when you're in an opinionated language and cannot do what you want... ugly hacks quickly happen, such as people using bash scripts, mustache templates, etc. to preprocess their code to get what they want.
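A toy sketch of what that escape hatch looks like (everything here is invented for illustration): a tiny pipeline eDSL embedded in C++, with plain host-language code dropped in mid-stream.

    #include <vector>

    // Toy pipeline eDSL: `|` applies a step to every element.
    template <typename T, typename F>
    std::vector<T> operator|(std::vector<T> v, F f) {
        for (auto& x : v) x = f(x);
        return v;
    }

    int main() {
        std::vector<int> v{1, 2, 3};
        v = v | [](int x) { return x * 2; }    // eDSL step
              | [](int x) { return x + 1; };   // another step
        // Escape hatch: ordinary C++ in the middle of eDSL code.
        for (auto& x : v) if (x > 5) x = 0;    // v is now {3, 0, 0}
        return 0;
    }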
There's no such thing as an "unopinionated" kitchen sink language. Language features have all sorts of unforeseen interactions that must be handled somehow, and good high-level design is needed to ensure that the interactions are sensible.
That's not my point. The point is that this feature once was the hot new thing, and in the future there will be a hot new way to do the same thing in addition to all the old ways, because that's how C++ evolves.
And I would have to say the average C++ dev did not know about RAII in 1993.
For the sake of pedantry only: I don't think we called it RAII in 1993-1996, but the technique was in use in that period, though it wasn't standardized in any way. IIRC, Mac developers would have been widely exposed to it by Metrowerks PowerPlant during that time.
>It leads to the current situation where you have C++ "the language" which is everything, and then C++ "the subset that everyone uses" where that subset constantly changes with time and development context.
But what exactly is wrong with that? I don't quite understand your argument here...
They’re probably referring to preferring std::make_unique or std::make_shared to bare new/delete. Using either of the former makes the ownership semantics clear and avoids the need to remember to call delete at the appropriate time.
As well as smart pointers, which others have mentioned, standard library containers are another way to handle dynamic allocation in certain situations. The vector container is probably the best example here; it alone provides massive safety and usability benefits over using new or malloc directly!
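A small sketch of the contrast (the Connection type is made up for illustration):

    #include <memory>
    #include <vector>

    struct Connection { /* ... */ };

    void old_style() {
        Connection* c = new Connection();  // must remember delete on every path
        // ... use c ...
        delete c;                          // leaked if a throw or early return skips this
    }

    void modern_style() {
        auto c = std::make_unique<Connection>();  // ownership is explicit and automatic
        std::vector<int> buffer(1024);            // the container owns its allocation
        // ... use c and buffer ...
    }   // both released here, even if an exception is thrown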