exceptions open the door to your control flow being rug-pulled in exchange for... an error message? realistically all you can do is wrap chunks of code in a `try { } catch (const std::exception &e)` block and hope they've overridden `std::exception::what()`. an enum-based approach gives you both better error messages and certainty about which errors can occur and when.
I replied to the overhead claim; you are moving the goalposts.
I don't think we will ever have consensus about the control flow aspects of exceptions. For example, once you employ RAII, it's easier to write error-agnostic functions with exceptions than with error codes. With error codes you have to explicitly propagate the error up, while with exceptions it is automatic. Sometimes this is desirable. It's arguable that the function isn't really agnostic to exceptions, as you have to employ RAII to qualify, but in modern C++ codebases that should be the default, really.
Having said that you probably don't want to introduce exceptions to a large code base that was designed with no exceptions in mind, and this definitely applies to the Linux kernel.
edit: an enum-based approach giving better error messages, really? You get a fixed error message per enumerator, but with exceptions you can have a custom error message that includes context.
You wrote that if you were ever to code with exceptions, you would do a useless, disruptive thing no one recommends, and get undesirable results. Trolling.
increased compile times are a reasonable tradeoff for correctness, no? hardware can always be thrown at the problem. as soon as someone does a C oopsie, you're going to lose whatever time you saved to debugging
Vague, sweeping attacks that serve no purpose other than stoking religious wars. When I see statements like that, I see another person who thinks it's simply fashionable to hate popular languages because they heard someone like Torvalds, Thompson, or Stallman say it. If you have a fundamental explanation like they do, then it's at least reasonable, disregarding the irrelevant context of a Linux kernel forum thread. But I rarely see that; instead I find muttering like "cpp is bloated, unstable, overcomplicated crap with ugly syntax, that's why I use rust, or nim, or any other pretty new thing that was conceived 35+ years later".
For instance:
> c++ is so brittle that a gentle breeze could cause a cacophony of soundness issues
Okay, how so, anything about C++ or just C in general?
There's a design principle here, rather than just some sort of hubris.
In languages as powerful as Rust and C++ you can express programs which fall into three categories. Two of these aren't very interesting: the programs which are obviously valid, and the programs which are obviously nonsense. We know what to do with these programs: the former should result in emitting correct machine code, the latter should cause a diagnostic (error message). The problem is the last group, programs whose validity is difficult to discern. The bigger and more complicated your software, the more likely it is to end up in this last group.
In C++ the third category is treated as valid. In the ISO standard this is achieved by having clauses which declare that in certain cases the program is "ill-formed, no diagnostic required", which means the standard can't tell you what happens, but you won't necessarily get an error message; your compiler may spit out a program that does... something. Maybe it does what you expected, and maybe it doesn't; the standard asks nothing more.
In Rust these are treated as invalid. If you try hard enough (or cheat and Google for one) you can write Rust programs where you can reason through why they should work, but the compiler says no. You get an error message explaining why the compiler won't accept this program.
Now, if Rust's compiler doesn't like your program, you can rewrite it so that it's valid. Once you do that, which is often very easy, you're definitely in that first category of correct programs, hooray. The program might well do something you didn't intend, the compiler isn't a mind reader and has no idea that you meant to write "Fnord" in that text output and "Ford" is a typo, but it's definitely a valid Rust program.
In the C++ case we can't tell. Maybe our program is the ravings of a lunatic, the compiler isn't obliged to mention that and we are in blissful ignorance. This also provides little impetus for the standard's authors or compiler vendors to reduce the size of the third category of programs, after all they seem to compile just fine.
Obviously it'd be excellent in theory to completely eliminate the third category. Unfortunately Rice's Theorem says we cannot do that.
Inherently. Because of Rice's theorem you can't just ensure all the valid programs are correctly identified (basically you'd need to solve the halting problem to pull that off). But the ISO document doesn't allow you to do what Rust does and simply reject some of the valid programs when you aren't sure.
Now of course you could build a compiler which rejects valid programs anyway and is thus not complying with the ISO document. Arguably some modes of popular C++ compilers are exactly that today. But now we're straying from the ISO C++ standard. And I'm pretty sure I haven't seen a modern C++ compiler which reliably puts all the "no diagnostic required" stuff into the "we generate a diagnostic and reject the program" category, even with a flag, but if you know of one I'm interested.