They have a very strict patch review process in place.
They have kernel static analysers.
The kernel has been adding security gates through the years.
Yet 68% of 2018 CVEs were caused by memory-corruption bugs; most of the rest stemmed from undefined behaviour and numeric conversions, with only a tiny fraction being the kind of logic errors that are bound to happen in any programming language.
Just the other day we had a Linux 5.0 pre-release bug caused by a GCC "optimization".
Oracle went with hardware memory tagging for Solaris/SPARC; Apple is following along on iOS; Microsoft's security advisory (as of the BlueHat conference) is C# + Rust + constrained C++; and Google is pushing hardware memory tagging, the Kernel Self Protection Project, and constraints on what the NDK is allowed to do, while Fuchsia gains more Rust/Go modules by the day.
What is clear is that hoping for the best and waiting for mythical 10x developers who write perfect C and C++ code won't ever work.
 Source, Google talk at Linux Kernel Summit 2018.
 Following the Core Guidelines
Also, porting the drivers to Fuchsia should be relatively easy, thanks to the Treble changes that effectively turned their Android Linux kernel into a kind of hybrid microkernel.
Android is already being ported to run on top of Fuchsia.
Google isn't the only one; the IoT space is getting crowded with BSD/MIT/Apache-licensed POSIX-compatible OSes, including Zephyr from the Linux Foundation, which is completely unrelated to Linux.
28 years ago no one would have believed that Linux distributions would eventually kill commercial UNIX.
That's certainly a fair point.
I was thinking more along the lines that, at least for consumer usage of Linux, some really interesting features are finally gaining traction from various companies (3D accelerators, for instance). I was just fearing that ditching existing work for the new shiny thing would send us back to square one.
Let's quit pretending that any software written in any other language would be more secure and have fewer bugs.
It's difficult to consider working on OSes in not-C, because C owns the market and ecosystem, and everything revolves around it.
But that's certainly a different reason than the absurd notion that it's impossible to produce a language with stronger guarantees for bug-prone programming tasks (many languages do: some with GC, some with stricter typing, some with static lifetime tracking, and so it goes on). Or that our CS theorists are so utterly incompetent that they couldn't manage to come up with even a single useful improvement to the divine language, 50 years after the great wizard Thompson etched its wisdom into a core dump.
Let's quit pretending that (continued) market dominance correlates to (continued) quality.
>If there is a safer, cheaper, faster language to use, someone would be using it
Someone is using other languages. Urbit, Redox, Lisp machines, and MirageOS are all not-C systems/OSes. But you aren't talking about writing an OS; you're talking about writing a popular OS, of which there are basically three worth noting, all of which happen to have sprung up around a similar timeframe, coincidentally when C was at its most popular...
You seem to have confused the software industry for some kind of meritocracy... it is not. It follows trends harder than the fashion industry. Look ...anywhere... for examples: OSes, DBs, languages, game design and tooling, the AI winters, the .com crash, everything about the web, HN itself, etc. Hell, your own argument is just bandwagoning, with nothing about technical merit ("smarter people than I are using it, thus I should" is exactly how we end up in this state).
C only got outside UNIX in the mid-90's.
Its ubiquity is a historical accident, by no means permanent, and thankfully some vendors are finally walking away from it, as shown by Microsoft's security advisory on best practices for future Windows development.
Those university students went out to found startups that created the UNIX workstation market, like Sun.
Had Bell Labs been allowed to sell UNIX, C would have been a footnote in systems programming languages.
EDIT: This is how well C was thought out.
"Oh, it was quite a while ago. I kind of stopped when C came out. That was a big blow. We were making so much good progress on optimizations and transformations. We were getting rid of just one nice problem after another. When C came out, at one of the SIGPLAN compiler conferences, there was a debate between Steve Johnson from Bell Labs, who was supporting C, and one of our people, Bill Harrison, who was working on a project that I had at that time supporting automatic optimization...The nubbin of the debate was Steve's defense of not having to build optimizers anymore because the programmer would take care of it. That it was really a programmer's issue.... Seibel: Do you think C is a reasonable language if they had restricted its use to operating-system kernels? Allen: Oh, yeah. That would have been fine. And, in fact, you need to have something like that, something where experts can really fine-tune without big bottlenecks because those are key problems to solve. By 1960, we had a long list of amazing languages: Lisp, APL, Fortran, COBOL, Algol 60. These are higher-level than C. We have seriously regressed, since C developed. C has destroyed our ability to advance the state of the art in automatic optimization, automatic parallelization, automatic mapping of a high-level language to the machine. This is one of the reasons compilers are ... basically not taught much anymore in the colleges and universities."
-- Fran Allen interview, Excerpted from: Peter Seibel. Coders at Work: Reflections on the Craft of Programming
Don't get me wrong, I do like C. But it wasn't the runaway success, nor does it hold quite the monopoly, you suggest.