The more features a language has, the more I have to know just to read someone's code or be productive in a company - and the more chance someone doesn't know about potential harmful interactions and side effects.
As per my sibling, write me something as straightforward as a C++ RAII-using dtor in C without "using C in crazy ways". I will not hold my breath. The cost of a very minimal subset of C++ is literally zero, and yet significantly improves the likelihood of your code actually working. You're picking social baggage in either case: either understanding the subset of C++ that your team is using or expecting all of your developers to be perfect where C++ (and other languages--shouts, Rust!) just do it for you, correctly.
Technology is socially interesting in that incompleteness and inexpressivity are so often misread as "elegance" or "minimalism". That a language is "simple" is not a feature when it offloads all of the danger onto the (almost invariably failing to consider critical information at the worst possible time, and I include myself at the forefront of that characterization!) developer. This is why we have tools that compensate for the most common, and most destructive, of our mistakes.
(This post should not be construed as any particular endorsement of C++. C++ is a gong show when used improperly. But at least it's possible for a mere mortal to use it properly.)
You can do essentially all the "C in crazy ways" in C++ as well, and people do. In my opinion and experience, it isn't what the language provides, it is how it is used in practice - and again from my experience (YMMV), C is used sanely and C++ is not.
The idea that people use C "sanely" more often than C++ (or, you know, something actually good--further shouts, Rust!) doesn't pass the smell test. Are you checking the retval of every sprintf? Are you writing goto drops in every method where there's allocated memory, diligently checking every error code, and properly bailing out, every time? If so, you're that one percent. But you're probably not. And that's not a slight--I'm not, either. That's why I am proud to count myself as a member of a tool-using species, because we build (and have built) better tools to compensate for our flaws.
> If so, you're that one percent.
I probably am that one percent (and I don't use sprintf, because there is no sane way to use it). When I say people use C "sanely", I do not mean that they never err in any way, far from it. And my observation (your mileage obviously varies) is only that the C code bases I tend to use and meet (e.g. x264, ffmpeg/libav, the Linux kernel, the Nim compiler) tend to be saner than the C++ code bases I tend to use and meet (e.g. libzmq, although that one has improved dramatically since 4.0 and is now almost sane, Boost, the STL).
I admit that I have not yet met a C++11 codebase with lambdas - that might have restored sanity. But even if it did, that would not retroactively bestow the goodness on the millions of lines of code already out there.
I stress again - I am not passing judgement on the language, but on how it is used in practice, through my own sample set. If I work on a project in which I can dictate the exact subset, choose the people, etc., I might pick C++. But in most projects I'm involved in, the constraints are dictated in some way or other that makes C at least as good a choice as C++ (and often better).
Except that I have seen C++11 codebases that relied heavily on lambdas, and that was far from sane. I am very familiar with lambdas from pure functional languages and partially functional ones (Python, ...). But all the syntax specifics, the brackets, and so on in C++ made it a very annoying process to understand what was even going on at all.
Using short functions would be more lines of code. But at least I would have known right away what's happening in the code.
Hence why I eventually did a Turbo Pascal -> C++ transition, with a very very short stop in C.
I was lucky that our technical school also had access to Turbo C++ 1.0 for MS-DOS in their software library, as I was not getting why I should downgrade myself from Turbo Pascal 6.0 to C.
Already in those days of "The C++ Report", and when "The C/C++ Users Journal" was still called "The C Users Journal", there were articles on how to put C++ features to good use for increased safety.
And this is a major culture gap between C and C++ that I have observed since then: yes, we also get the performance-obsessed ones, but there are also the security-minded ones.
I seldom see talks about how to increase type safety in C. The C security annex (TR 24731, adopted into C11 as Annex K) is a joke, as it keeps pointers and sizes separated, and it is so highly regarded that C11 made it optional.
The C++ community, on the other hand, improves the standard library to decrease the unsafe use cases, promotes guidelines, and is trying to reduce the amount of UB.
#define NEW(T, p, args...) (((p) = malloc(sizeof(T))) && T ## _construct((p), args))
#define DELETE(T, p) ((p) && (T ## _destruct(p), free(p), (p) = NULL))
But beyond that, it all went downhill. There was no reason to make C++ the most bloated language ever.
Tell me this for example, does the following use a cast or a copy constructor?
Because you know, it's all super clear.
But the language jumped the shark starting with C++0x.
I seldom meet C developers who are able to distinguish between ANSI C and "my compiler's C"; most extrapolate from "my compiler's C, version Y" how the language should behave.
Then they port the code to "my compiler's C, version Y + 1", or to another compiler vendor; their code gets broken, they blame the compiler, only to find out that the code was already broken from an ANSI C point of view.
Is signed arithmetic using C in crazy ways? Is shifting by an arbitrary value without checking to make sure that the number of bits is in the valid range of the type using C in crazy ways?
Undefined behavior is everywhere in C.