
And why? Because in practice, C on Unix was better for writing actual programs that actually worked to do things that people cared about. And that's why it matters: Because C was actually better in practice. Everybody seems to be in denial about that; maybe they should learn to deal with it.

Was that "actually better" due to C, or to Unix? Well, maybe some of both, but Unix was written in C, so C was useful for writing the operating system that you blame for C taking over the world...

If you don't understand why C was better in practice, you're going to fail in attempts to create something better, and never know why. You're just going to be ignored, while you whine about how your way was better.



Completely wrong!

Back in those days we used to pay for our compilers, usually in the thousands of dollars, remember that?

Only languages delivered as part of what was then the OS vendor's tooling got used at most work locations.

It always required lots of persuasion to buy compilers for languages not delivered as a standard part of the OS.

So as UNIX managed to gain a foothold in enterprises and universities, pushing out mainframes and other OSes, C gained mind-share.

UNIX took over the world by accident: AT&T initially provided the code for free to universities (which it came to regret later), and those universities happened to have people like Bill Joy and Scott McNealy, who went on to found successful startups using the OS they had enjoyed as students.

Had AT&T never given the code away for free, or had those startups floundered, UNIX would be another footnote alongside MULTICS and friends.

Just like nowadays no one sane would use JavaScript if browsers had first-class support for other programming languages.


There is an element of truth in what you say. You seem to think it's the whole story; I do not. Had C not been good in practice, "free" wouldn't have been enough.

Those OSes written in other system programming languages: Why didn't their computers take over the world? I mean, sure, AT&T was pumping money into Unix, but other companies were pumping money into the competition. Why did Unix win? It's not just because of evil AT&T. It's because Unix could deliver working features, and others struggled to keep up.

Why could Unix deliver working features? Yes, partly because of AT&T's money. But partly because C turned out to be really useful as a systems programming language.

These other languages you mentioned a few posts ago: Sure, they looked good on paper, only where's the beef? When it came time to deliver, what was written in them? More functionality was written in C, which is why Unix won.

I'll repeat my previous statement: If you don't understand why C was better in practice, you're going to fail in attempts to create something better, and never know why. You're just going to be ignored, while you whine about how your way was better.

History ignored your "better" languages. The world moved on from them, for good reasons. You think they were better, but in real life, they weren't.


What can you do in C that you can't do in, say, Algol 68? Or even Pascal (once you get to a recent enough version supporting dynamic arrays/first-class pointers)?

I think the situation is not far off from what pjmlp is saying: there were several more or less equally good languages available at the time, and C won by virtue of being in the right place at the right time. There are lots of really nice things about C from a systems point of view, but most languages of the 70s had those too. C was a nice language for the late 70s (not so much today, IMO), but not hugely nicer than its contemporaries.


You can do it in those languages (at least you can in a modern Pascal; I don't know enough about Algol to say).

But remember, the original claim by pjmlp was "It is a big flaw to ignore what other language communities were doing. Other systems programming languages older than C did it properly." That doesn't apply to the Pascal you speak of, because at the time C arose, Pascal wasn't "once you get to a recent enough version supporting dynamic arrays/first-class pointers". It was Pascal with the size of an array being part of the type of an array. That's type safety, sure. But it also means (to use an example that I have personal experience with), if you're writing a numerical simulation on a 2D array, and you want to let the user specify how big the mesh is, and then allocate memory to hold the mesh, you have a problem. You can't define a variable-sized array at all in the Pascal of the time.

Now try to think about how you'd write a memory allocator in that. Good luck.
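
To make the contrast concrete, here's roughly what that mesh case looks like on the C side - a minimal sketch in modern C spelling, with made-up names and the size read from the user at run time:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t rows, cols;

        /* Let the user pick the mesh size at run time. */
        if (scanf("%zu %zu", &rows, &cols) != 2)
            return 1;

        /* One flat block sized at run time -- exactly what the fixed-size
           Pascal arrays of the era couldn't express. */
        double *mesh = malloc(rows * cols * sizeof *mesh);
        if (mesh == NULL)
            return 1;

        /* Treat it as 2D by hand: element (i, j) lives at mesh[i * cols + j]. */
        for (size_t i = 0; i < rows; i++)
            for (size_t j = 0; j < cols; j++)
                mesh[i * cols + j] = 0.0;

        free(mesh);
        return 0;
    }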

Another example: We were on an embedded system. To write to hardware registers, we had to call an assembly-language subroutine. In C, we would have simply used a pointer to an absolute memory address.
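
In C it's just a cast and a dereference. A minimal sketch (modern C spelling; the address and bit layout here are invented for illustration):

    #include <stdint.h>

    /* Made-up register address and bit layout -- the real ones come from
       the board's memory map. */
    #define STATUS_REG ((volatile uint8_t *)0x40001000u)

    void wait_until_ready(void)
    {
        /* Spin until the device raises its (assumed) ready bit. */
        while ((*STATUS_REG & 0x01u) == 0)
            ;
    }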

This is what C gave you - you could just do things without the language getting in the way. Yes, you could cut yourself on C's sharp edges, whereas Pascal protected you. But when you needed the sharp edges to cut something, Pascal didn't have them, and you were stuck.

Again, I don't know enough about Algol to meaningfully compare it to C. The Pascal of the day wasn't the answer, though.


You are assuming hardware registers were available via memory mapped address. Many machines used only IO ports. Where are the C features for that?

Pascal was originally meant for teaching, so it doesn't count. The first time it was used for writing an OS was with the Object Pascal dialect for the Mac OS.

What counts were Algol 60, Algol 68, Algol W, PL/I, PL/M, Mesa among many others.

As for the claim that only C was usable without Assembly, check the B5000 from the Burroughs Corporation, developed in 1961.

EDIT: where => were


> You are assuming hardware registers were available via memory mapped address.

I'm not assuming it. The environment I was in had memory-mapped IO.

> Many machines used only IO ports. Where are the C features for that?

C compilers for machines that only had IO ports typically had a library function to do it. At least, by the era of Turbo C, they did. (The x86 architecture is the first one I was familiar with that did IO that way, and Turbo C was the first C compiler I had on it. I presume that, if earlier architectures did IO that way, C compilers for those architectures had similar capabilities, but I do not know that first-hand.)
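
For what it's worth, here's a sketch of how that looked from Turbo C, assuming its <dos.h> inportb/outportb routines - to the caller it's just another function call:

    #include <dos.h>

    #define STATUS_PORT 0x64   /* example port number, nothing special */

    unsigned char read_status(void)
    {
        return inportb(STATUS_PORT);
    }

    void write_control(unsigned char value)
    {
        outportb(STATUS_PORT, value);
    }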

> Pascal was originally used for teaching it doesn't count.

I was replying to pcwalton, who asked what you can do in C that you can't do in Pascal. So it may not count to you (four comments upthread from here), but it counted to the comment I was replying to (two comments upthread from here). Perhaps your comments would be better addressed to him/her.

> As for the statement of only C being usable without Assembly, check the B5000 from the Burroughs Corporation, developed in 1961.

I didn't say that only C was usable without assembly. I said that using Pascal, in order to write to memory-mapped IO registers, you had to call a function written in assembly, and that in C you didn't have to do that. I never said only C was usable that way. Please stop putting words in my mouth that I didn't say.


> I said that using Pascal, in order to write to memory-mapped IO registers, you had to call a function written in assembly, and that in C you didn't have to do that. I never said only C was usable that way. Please stop putting words in my mouth that I didn't say.

In what language do you think the Turbo C library functions for port IO and all those handy BIOS and MS-DOS calls were written in?

Assembly, of course.

As for doing memory mapped IO with Pascal, you could do something like this in Turbo Pascal. Other dialects had similar extensions.

    var
       videoMem : array [0..255] of byte absolute $A000:$0000;

    begin
      videoMem [0] := 12;
    end;


> In what language do you think the Turbo C library functions for port IO and all those handy BIOS and MS-DOS calls were written in? Assembly, of course.

Of course. But they still looked like just another function call in C, and they came with the compiler, so the programmer writing in C didn't care.

In my Pascal example, there was no such function, so we had to write it, so we had to care.

You seem to be consistently trying to make my words say things that I am not saying, and then arguing against positions that are only in your own mind. It's getting quite tedious.


Pascal does count, because people wrote real software in it, it was better than C, Pascal-P was one of the best portability hacks ever, the UCSD P-system used it, and Modula-2 built a real OS with a lot of the same features. That said, if people counter with Pascal's limitations, we can remind them of its purpose, as you did, where the limitation was due to that purpose. Modula-2 is my go-to reference in these discussions because it's closest to C's niche: resource-constrained systems programming w/out garbage collection. And it was better than C, with successors that were all better than C. :)

"As for the statement of only C being usable without Assembly, check the B5000 from the Burroughs Corporation, developed in 1961."

Good call. The first great, overall system as I see it. Might also remind them that Wirth's and Jürg Gutknecht's Lilith workstation ran on Modula-2 and assembly. It's the most comparable to UNIX in terms of hardware and personnel constraints, and it shows that even constrained teams could do better than C or UNIX. The Oberon systems used Oberon and assembly as well, with a GC'd OS and software. That helps in another recurring debate about OS's in "managed code." ;)

So, yes, that C is necessary for systems programming is a long-running myth refuted by examples which were better because they didn't use it.


He's close to the whole truth. All of you talking about this would know it if you merely looked up the history of the C language and UNIX. C's lineage is as follows: ALGOL60 -> CPL -> BCPL -> B -> C. Compare ALGOL60, PL/I, Pascal, or Modula-2 to C to see just how little C did for people. Why did they take all the good features out and introduce the dangerous alternatives? They needed something whose apps and compilers would work on their PDP-11 with easy reimplementation. That's it.

Note: Niklaus Wirth's solution to the same problem was much better: P-code. He made an idealized assembler that anyone could port to any machine. His compiler and standard library targeted it. It kept all the design advantages of Pascal with even more implementation simplicity than C. It got ported to something like 70 architectures/machines.

Now, for OS's. Let's start with Burroughs MCP. The Burroughs OS was written in a high-level language (ALGOL variant), supported interface checks for all function calls, bounds-checked arrays, protected the stack, had code vs data checking, used virtual memory, and so on. That's awesome and might have given hackers a fight!

Later on, MULTICS tried to make a computer as reliable as a utility, with a microkernel, implementation in PL/I to reduce language-related defects, a reverse-growing stack to prevent overflows, no support for null-terminated strings (C's favorite), and more. It was indeed very reliable, easy to use, and seemed easy to maintain. You'd have to ask a Multician to be sure.

So, the OS's were comprehensible, used languages that made reliability/security easier, had interface/array/stack protections of various sorts, consistent design, and all kinds of features. Problem? Mainframes were expensive. The minicomputers Thompson and Ritchie had were affordable, but their proprietary OS's were along the lines of DOS. You can't do great language or OS architecture on a PDP-11 because it's barely a computer. It would still be useful, they thought, if it had just enough of a real language and OS to do useful work.

So, they designed a language and OS where simplicity dominated everything. They took out almost all the features that improved safety, security, and maintenance, and used a monolithic style for the kernel. Even the inefficient way UNIX apps share data was influenced by hardware constraints. The hardware limitations are also why users had to look for executables in /bin or /sbin for decades: the original machine ran out of space on one disk, so they mounted another for the rest of the executables. All that crap is still there because fixing it might break apps and require fixing them, too. Curious, did you think those were clever design decisions rather than "we can't do anything better without running out of memory or buying a real computer, so let's just (insert long-term design problem here)"?

The overall philosophy is described in Gabriel's Worse is Better essay:

https://www.dreamsongs.com/RiseOfWorseIsBetter.html

As Gabriel noted, UNIX's simplicity, source availability, and ability to run on cheap hardware made it spread like a virus. At some point, network effects took over: so many people and so much software were using it that sheer momentum added to the spread. Proprietary UNIX's, GNU, and Linux added more momentum. After much turd polishing, it's gotten pretty usable and reliable in practice while getting into all sorts of things. One look underneath shows what it really is, though, with not much hope of it getting better in any fundamental way:

https://queue.acm.org/detail.cfm?id=2349257

So, aside from people not knowing the history, there isn't much left to debate about why C and UNIX were designed badly, apart from the merits of the overall UNIX architecture vs others. The weaknesses of C and UNIX were deliberately built in by the authors to work around the hardware limitations of their PDP-11. As those limitations disappeared, the weaknesses stayed in the system, because FOSS typically won't fix apps to eliminate OS crud any quicker than IBM or Microsoft will. Countless hours of productivity, piles of money, and plenty of peace of mind were lost over the decades to these bad design decisions in the form of crashes or hacks.

Using a UNIX is fine if you've determined it's the best in cost-benefit analysis but let's be honest where the costs are and why they're there. For the Why, it started on a hunk of garbage. That's it. Over time, when it could be fixed, developers were just too lazy to fix it plus the apps depending on such bad decisions. They still are. So, band-aids everywhere it is! :)





