
Linux is another good example of how C doesn't cut it, even among top developers.

They have a very strict patch review process in place.

They have kernel static analysers.

The kernel has been adding security gates through the years.

Yet 68% of 2018 kernel CVEs were caused by memory corruption bugs, with most of the rest being UB and numeric conversions, and only a tiny fraction remaining of the logic-error kind that is bound to happen in any programming language. [1]

Just the other day we had a Linux 5.0 pre-release bug caused by a GCC "optimization".

Oracle went with hardware memory tagging for Solaris/SPARC, Apple is following along on iOS, and Microsoft's security guidance (as of the BlueHat conference) is C# + Rust + constrained C++ [2]. Google is pushing hardware memory tagging and the Kernel Self Protection Project, constraining what the NDK is allowed to do, and Fuchsia is getting more Rust/Go modules by the day.

What is clear is that hoping for the best won't work: the mythical 10x developers who write perfect C and C++ code don't exist.

[1] Source, Google talk at Linux Kernel Summit 2018.

[2] Following the Core Guidelines

I still wonder what the plans for Fuchsia are. Is Google really thinking about just throwing out the millions of man-years which have been sunk into Linux?

They don't need to please everyone, just make their use case work.

Also, porting the drivers to Fuchsia should be relatively easy, thanks to the Treble changes, which turned the Android Linux kernel into a kind of hybrid microkernel.

Android is already being ported to run on top of Fuchsia.

Google isn't the only one; the IoT space is getting crowded with BSD/MIT/Apache POSIX-compatible OSes, including Zephyr from the Linux Foundation, which is completely unrelated to Linux.

28 years ago no one would believe that Linux distributions would eventually kill commercial UNIX.

> 28 years ago no one would believe that Linux distributions would eventually kill commercial UNIX.

That's certainly a fair point.

I was thinking more along the lines that, at least for consumer usage of Linux, some really interesting features (3D accelerators, for instance) are finally gaining traction from various companies. I was just fearing that ditching existing work for the new shiny thing would send us back to square one.

3D acceleration works just fine in Android.

The kernel is mostly drivers. If they only support a bit of hardware, they'd be tossing a lot of stuff they don't need.

And almost every operating system in the world is written in C by top developers who are in the know. If there was a better way, everyone would do it, but they don't.

Let's quit pretending that any software written in any other language would be more secure and have fewer bugs.

A monopoly is naturally self-sustaining. It's absurd to think there couldn't possibly be a "better way", that C is somehow perfect in its niche (that all-encompassing niche that once covered anything and everything not grabbed by Perl). But the major experience in developing OSes is with C: the knowledge, experience, tooling, etc. all depend on C, because C already has complete and total dominance in the area.

It's difficult to consider working on OS's in not-C, because C owns the market and ecosystem, and everything revolves around it.

But that's certainly a different reason than the absurd notion that it's impossible to produce a language with stronger guarantees for bug-prone programming tasks (there are many languages that do: some with GC, some with stricter typing, some with static lifetime tracking, and so it goes on), or that our CS theorists are so utterly incompetent that they couldn't manage even a single useful improvement on the divine language, 50 years after the great wizard Thompson etched its wisdom into a core dump.

Let's quit pretending that (continued) market dominance correlates to (continued) quality.

If there is a safer, cheaper, faster language to use, someone would be using it. For another language to take C's place, it needs to be at least two of those, maybe all three and more. Market dominance, if there is such a thing in this area, has nothing to do with it.

Market dominance can be read as ecosystem dominance, and that reading obviously exists. More specifically, the C language could be completely worthless on its own, but made good enough to be usable through tooling (e.g. fuzzing, Valgrind, etc.). As it turns out, there's a lot more involved in writing an OS than just the language; and it's all so conjoined at the hip, enough so that it makes it quite difficult to change just the language. But this is an argument that despite C's flaws, its existence is on the whole preferable to a total rewrite.

> If there is a safer, cheaper, faster language to use, someone would be using it

Someone is using other languages: Urbit, Redox, Lisp machines, and MirageOS are all non-C systems/OSes. But you aren't talking about writing an OS, you're talking about writing a popular OS. Of which there are basically three worth noting, all of which happened to spring up around a similar timeframe, coincidentally when C was at its most popular...

You seem to have confused the software industry for some kind of meritocracy... it is not. It follows trends harder than the fashion industry. Look ...anywhere... for examples: OS, DB, language, game design and tooling, the AI winters, the .com crash, everything about the web, HN itself, etc. Hell, your own argument is just bandwagoning, with nothing about technical merit ("smarter people than I are using it, thus I should..." is exactly how we end up in this state).

Let's stop hand-waving away C security exploits caused by top developers, in spite of best practices.

C only got outside UNIX in the mid-'90s.

Its ubiquity is a historical accident, by no means permanent, and thankfully some vendors are finally walking away from it, as shown by Microsoft's security guidance for future Windows development best practices.

I'm pretty sure C was pretty well thought out and wasn't used by accident in any way. After almost 50 years of usage, I'm not sure we can say it's "by no means permanent" either.

C only got where it is thanks to Bell Labs not being allowed to sell UNIX commercially, thus giving it away for a symbolic price to universities, alongside source code.

Those university students went out to found startups that created the UNIX workstation market, like Sun.

Had Bell Labs been allowed to sell UNIX, C would have been a footnote in systems programming languages.

Instead gratis won and we got the JavaScript of systems programming.

EDIT: This is how well C was thought out.

"Oh, it was quite a while ago. I kind of stopped when C came out. That was a big blow. We were making so much good progress on optimizations and transformations. We were getting rid of just one nice problem after another. When C came out, at one of the SIGPLAN compiler conferences, there was a debate between Steve Johnson from Bell Labs, who was supporting C, and one of our people, Bill Harrison, who was working on a project that I had at that time supporting automatic optimization...The nubbin of the debate was Steve's defense of not having to build optimizers anymore because the programmer would take care of it. That it was really a programmer's issue.... Seibel: Do you think C is a reasonable language if they had restricted its use to operating-system kernels? Allen: Oh, yeah. That would have been fine. And, in fact, you need to have something like that, something where experts can really fine-tune without big bottlenecks because those are key problems to solve. By 1960, we had a long list of amazing languages: Lisp, APL, Fortran, COBOL, Algol 60. These are higher-level than C. We have seriously regressed, since C developed. C has destroyed our ability to advance the state of the art in automatic optimization, automatic parallelization, automatic mapping of a high-level language to the machine. This is one of the reasons compilers are ... basically not taught much anymore in the colleges and universities."

-- Fran Allen interview, Excerpted from: Peter Seibel. Coders at Work: Reflections on the Craft of Programming

Modern C is very different from K&R C, which itself was based on other programming languages before it (like B; got to love their naming convention for programming languages!). But the history of C aside, it wasn't the only language that operating systems were built on: LISP, Pascal, and obviously assembly / machine code too. In fact, Pascal was a very popular systems language in home computing in the 80s and early 90s. If I recall correctly, it was used heavily by Microsoft and Apple too.

Don't get me wrong, I do like C. But it wasn't the runaway success, nor does it hold quite the monopoly, you suggest.

Humans are fallible. Relying on them to make the same safety decisions afforded by languages such as Rust is asinine.

That is exactly why we need safety rails.

