To a first approximation, none of those would have happened in a memory-safe language.
Of course, plenty of other things will. But it's not like choosing memory-unsafety makes those other things less likely (memory unsafety does not, for instance, prevent incorrect ACL specification), so that's not very compelling.
Think of it this way. Suppose C didn't exist, and somebody proposed a new language today that was memory unsafe. Would it have any success, or be laughed out of existence?
In fact, go ahead and suppose C did exist, because it would still not be taken seriously.
If it's so wonderful that C is memory-unsafe, where are all the new languages that are also memory-unsafe?
(Note that having specially called-out "unsafe" operators does not make a language "unsafe" the way C is.)
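To make that distinction concrete, here is a minimal sketch in Rust (the function names are hypothetical, not from any project discussed here). The unsafe operation exists, but it must be lexically marked, so it is opt-in and greppable, whereas in C every pointer access is implicitly in that mode:

```rust
// Safe by default: an out-of-range index panics deterministically
// instead of reading whatever bytes happen to sit past the buffer.
fn read_third(v: &[u8]) -> u8 {
    v[2]
}

// The unchecked version is still available, but the `unsafe` block
// must be spelled out, so audits can find every such site.
fn read_third_unchecked(v: &[u8]) -> u8 {
    unsafe { *v.get_unchecked(2) }
}

fn main() {
    let data = [10u8, 20, 30];
    assert_eq!(read_third(&data), 30);
    assert_eq!(read_third_unchecked(&data), 30);
    println!("ok");
}
```

The point is that a language with a fenced-off `unsafe` escape hatch keeps the default path checked; C has no such fence, so the whole program is the escape hatch.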
A few big projects written in C prove nothing about them or about C; you need the larger picture, with many instances, to support a claim. Certain preventable vulnerabilities have occurred mostly in C software (another commenter gave an example), and many of them occurred in the very C projects you reference. So, using C instead of a language that prevents them was the first cause of their existence.
It doesn't; it's a fallacy. A lot of code that we think of today as buggy was written in an era in which people did not understand the threats and consequences of certain decisions. The reason we have nice things today is that we were lucky enough to have languages like C to experiment on and with and learn from. "Safe" languages of today have the benefit of hindsight that good old C didn't have. Comparing legacy code in C to modern code in whatever you deem worthy is not only unhelpful but also very deceptive.
Compare new to new and tell me: how many web, SMTP, IMAP, or DNS servers in Python, C#, or Rust do we run today? There were some attempts but very few successful ones, despite all the goodness that comes with better languages, libs, and tools. I bet there's a reason behind it, other than just "people didn't try hard enough". There's old stuff out there being used today that we should definitely phase out. But we'll replace system software with new software written in systems languages. Most likely C, perhaps Rust. But definitely not with ones written in "safe" languages that should deliver (if they are so awesome) but somehow didn't.
"that we were lucky enough to have languages like C to experiment on and with and learn. "Safe" languages of today have the benefit of hindsight that good old C didn't have. "
I've already detailed some of what was known about avoiding reliability and security problems before C and UNIX in the link below.
I know at least MULTICS's tricks were known at the time of UNIX, because its name is a pun on MULTICS. Anyway, the better parts were largely eliminated to make C and UNIX efficient enough to run on a PDP-11. So, it's a fact that good stuff was known, they stripped it out due to constraints, and they never revisited those decisions as the constraints lightened. That language, toolset, and coding style were adopted by many developers who continued to do things that way.
Meanwhile, in parallel, other trends were running. The evidence by the 1970's suggested that building secure systems requires specs of behavior, a security policy, analysis that the one satisfies the other, careful choice of language, pentesting, minimal TCBs, interface protection, and so on. Starting with the MULTICS security evaluation, which talked about "insufficient argument validation" (ex: buffer overflows), that crowd demonstrated in one project and paper after another how to build secure systems, plus where the most bang for the buck was in terms of effort.
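"Insufficient argument validation" is still the root of the classic overflow read. A minimal sketch in Rust (the record/field names are made up for illustration) of how a safe language turns that failure from silent memory disclosure into a checkable error:

```rust
// In C, `record[index]` with an unvalidated index reads past the buffer
// and leaks or corrupts adjacent memory. Here the access is validated:
// `.get()` returns None instead of touching out-of-bounds memory.
fn get_field(record: &[u8], index: usize) -> Option<u8> {
    record.get(index).copied()
}

fn main() {
    let record = [1u8, 2, 3, 4];
    // In-range lookup succeeds.
    assert_eq!(get_field(&record, 2), Some(3));
    // The classic overflow read is refused rather than leaking memory.
    assert_eq!(get_field(&record, 100), None);
    println!("ok");
}
```

The caller is forced by the type system to handle the out-of-range case, which is exactly the argument validation the MULTICS evaluation found missing.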
In parallel, people like Margaret Hamilton, Dijkstra, Wirth, and Hansen determined how to make software that was correct by construction. Hamilton did that in the 60's on Apollo. Dijkstra's method of layering simple components with careful interfaces made nearly flawless software. Wirth & Gutknecht did a whole PC, a safe (but efficient) language, a compiler, and an OS in 2 years, and it performed fine. They kept building on that. Hansen and Dijkstra built things like Concurrent Pascal that eliminated concurrency errors, on top of Wirth-style work that reduced sequential errors. One group after another built on such approaches with better results than the UNIX and C crowds in terms of functionality vs. labor available vs. defects found.
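Concurrent Pascal eliminated a class of concurrency errors by forcing shared variables behind monitors. A rough modern analogue, sketched in Rust (not any of those historical systems; the counter example is mine): shared state must sit inside a synchronization type, and the compiler rejects any attempt to mutate it from multiple threads without one.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// The counter can only be reached through the Mutex, so every access
// is serialized by construction; a bare `&mut usize` shared across
// threads would simply not compile.
fn parallel_count(threads: usize, per_thread: usize) -> usize {
    let total = Arc::new(Mutex::new(0usize));
    let handles: Vec<_> = (0..threads)
        .map(|_| {
            let total = Arc::clone(&total);
            thread::spawn(move || {
                for _ in 0..per_thread {
                    *total.lock().unwrap() += 1;
                }
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let n = *total.lock().unwrap();
    n
}

fn main() {
    // Deterministic despite the interleaving: no lost updates.
    assert_eq!(parallel_count(4, 1000), 4000);
    println!("ok");
}
```

The monitor idea is the same: the data race is prevented by the language's rules rather than by programmer discipline.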
So, this stuff was well-known at repeated points in the implementation of UNIX, server software, compilers, etc. Better approaches are finally mainstreaming with the likes of Go (Wirth-inspired) and Rust (Ada redux?). Other safe languages were plenty mature before those showed up. Yet many projects focused on security still use C. So what I'm talking about not only happened then: it's still happening now.
"Compare new to new and tell me: how many web, smtp, imap, dns servers in Python, C# or Rust do we run today?"
Let me just apply your words to another problem. How much enterprise transaction processing uses C and UNIX today vs. IBM and COBOL? There were some attempts to switch but very few successful ones, despite all the goodness that comes with better languages, libs, and tools. I bet there's a reason behind it. Yep, but it's not that IBM and COBOL were best. I'd venture people not being willing to put the work in, fear of breaking things, inertia, not knowing about better approaches, etc. The usual. It's currently happening to the C, C++, and Java stacks. :)
"But definitely now with ones written in "safe" languages that should deliver (if they are so awesome) but somehow didn't."
They did. Wirth made a whole series of them that delivered in many ways. Borland Delphi, Oberon, Component Pascal, and Modula-3 saw use in industry. GEMSOS used Pascal with coding guidelines to hit high security in the 1980's. Ada improved defect counts in every empirical study I saw. A tasking profile for Ada, and Eiffel, both knocked out plenty of concurrency errors. MULTICS used PL/I to avoid issues that plagued C. Academics working on large projects like compilers and sometimes OSes had positive experiences with the Lisps, MLs, and Haskell. And so on.
They proved their worth every time they were used. People's gripes could've been fixed with less effort than what's been put into making C code maintainable, reliable, secure, etc. Instead, they kept trying to build modern stuff on a language & OS stripped of its best features to run on a PDP-11. And stayed on that bandwagon to the end.