A little controversial, but I would have liked the larger programming community to address this rather than write completely new languages.
Think of all the effort that has gone into Rust: compilers, package managers, libraries, etc. What would have happened if that same effort had gone into better analyzers, linters, compilers, etc., for C/C++? Rather than rewriting everything to be made "perfect" (which will never happen), that effort would have been applied to the metric shit ton of existing code, making it better. Not perfect, but better.
Of course that work is dirty and necessarily political, and making a new language is sexy...
The reason this wasn't done is that we believed (and it still seems to be true) that you cannot make C or C++ memory safe at compile time. Or rather, you cannot do it without introducing backwards incompatibility, which means you're effectively making a new language anyway, and that's a non-starter for those languages (as it should be, IMHO). There's a concrete sketch of the problem below.
It had nothing to do with "sexy" and everything to do with engineering goals. I assure you, a lot of work in Rust is dirty, political, and not sexy too :)
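To make the compile-time point concrete, here's a minimal sketch (plain C, nothing project-specific): the program below compiles cleanly under gcc -Wall -Wextra, yet it contains a use-after-free. Proving it safe would require the compiler to track the pointer's lifetime across the function boundary, and C's type system has no vocabulary for that.

    #include <stdio.h>
    #include <stdlib.h>

    int *make_value(void) {
        int *p = malloc(sizeof *p);
        if (p) *p = 42;
        return p;
    }

    int main(void) {
        int *p = make_value();
        free(p);
        printf("%d\n", *p); /* use-after-free: undefined behavior, no diagnostic */
        return 0;
    }

Rust rejects the equivalent program at compile time, but only because ownership is part of the type system; bolting that onto C retroactively rejects existing code, which is exactly the incompatibility problem.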
There is already a metric shitton of work being poured into making C safer. Everybody realizes that most existing C code will be used for a long time. That's why we have a bunch of different sanitizers, fuzzers, static analyzers and so on. The problem is that these tools only find a fraction of the possible memory corruption bugs you have, and many people don't even bother to use them. You can still find bugs in many open source projects with a simple fuzzer. Sanitizers are only as good as your testsuite, and except for SQLite, which project has that kind of coverage?
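To ground the "only as good as your testsuite" point, a hedged sketch: AddressSanitizer catches the overflow below, but only on a run that actually feeds it a long argument. Build with gcc -fsanitize=address -g, run it with a short test input, and it reports nothing.

    #include <string.h>

    static void copy_arg(const char *arg) {
        char buf[8];
        strcpy(buf, arg); /* overflows buf whenever strlen(arg) >= 8 */
    }

    int main(int argc, char **argv) {
        if (argc > 1)
            copy_arg(argv[1]);
        return 0;
    }

A fuzzer finds this in seconds; a testsuite that only ever passes "hi" never will. That's the coverage problem in miniature.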
As someone who has dabbled in the C++ program analysis space before: a _ton_ of effort has gone into this already, and there's a lot of promising work in the space.
There's active research on how to best do aliasing/pointer analyses, both for optimizers and for linters.
I've often said that the holy grail of C++ program analysis is aliasing info that is perfect, local, and can be computed in a reasonable time. The current situation on this is basically "pick one", _sometimes_ "pick two" if you're good. Rust's design gets you all three by picking different defaults and asking for some extra annotations from the user. The kicker is: it's all based on existing research!
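To give a concrete feel for that trade-off (a hedged, deliberately tiny sketch): C does have an aliasing annotation, restrict, but it's a promise the compiler trusts rather than verifies. Violating it is silent undefined behavior, so the "perfect, local" info it provides is only as good as the programmer.

    /* The optimizer may assume *src is unchanged between the two reads. */
    void add_twice(int *restrict dst, const int *restrict src) {
        *dst += *src;
        *dst += *src;
    }

    int main(void) {
        int x = 1;
        add_twice(&x, &x); /* dst and src alias: undefined behavior, compiles silently */
        return 0;
    }

Rust's &mut expresses the same no-aliasing promise, but the borrow checker rejects the aliasing call at compile time instead of trusting you. Same research lineage, different default.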
Firefox has its own static analysis and IIRC has looked into other static analysis tooling as well (the GC analyses are pretty awesome). C++ static analysis is good, but not good enough, in this case at least.
To me, it's quite clear: the target language would have forked from its base, of necessity creating an incompatible ecosystem (no one wants an improvement after which you can still make the same old mistakes, and some mistakes are embedded in the C/C++ syntax itself), and we'd have ended up with something slightly worse than Rust, using somewhat more familiar tools.
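One hedged example of a mistake that lives in the syntax itself: arrays decay to bare pointers at every call boundary, so the length is gone before any analyzer even gets to look at the callee.

    #include <stddef.h>

    void zero_ten(int a[10]) {            /* the [10] is decoration: a is just int* */
        for (size_t i = 0; i <= 10; i++)  /* off-by-one writes past the end */
            a[i] = 0;
    }

    int main(void) {
        int buf[10];
        zero_ten(buf); /* typically compiles without a diagnostic */
        return 0;
    }

Any "fixed C" that made the [10] actually mean something would reject mountains of existing code, which is exactly the fork being described.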