I'm a big C proponent, via C++, and I've thought about this article on-and-off since it first came out. My team uses C++ to define new container types and some type-safe translation utilities. If I think about it, our containers are, at their root, fancy fat pointers.
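For the unfamiliar: a fat pointer just bundles an address with its extent, so bounds can be checked at the point of use. A minimal sketch of the idea in C (the names are illustrative, not our actual containers):

    #include <stddef.h>
    #include <assert.h>

    /* A "fat pointer": the address travels with its length. */
    typedef struct {
        int    *data;
        size_t  len;
    } int_span;

    /* Checked access: the comparison a bare int* cannot make,
       because a bare pointer does not know its own extent. */
    static int span_get(int_span s, size_t i) {
        assert(i < s.len);
        return s.data[i];
    }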
Fixed but not enforced (which makes sense given compatibility with C). And that's often the problem: each successive "the flaws are now fixed" iteration adds one more way to do the same thing.
C's biggest mistake was that it originated in 1972 with people trying to create a language for writing operating systems, instead of originating in 2000 with people trying to create a language that didn't allow you to make the mistakes you could make in C.
Yeah, yeah, yeah. C is horrible, and programmers are all idiots because they didn't see how wonderful the stuff you like is. Sure. To steal a line from Jane Austen (or at least a film adaptation), "You tell yourself that, if it gives you comfort."
The reality is that your approach didn't meet the needs of the real world as well as the C approach did. You may dislike it, you may resent it, but your way didn't work very well.
It's not only because Unix was free, and C came along for the ride. It's also that Multics took several times as long to write, and required bigger hardware. (So Unix would have been cheaper even if the OS was not free, both because of lower cost to write, and because of lower hardware cost for a system to run it on.) And less cost opened up far more possible uses, so it spread quickly and widely. (How many Multics installations were there, ever? Wikipedia says about 80.) Which is better, the language and OS that have flaws, or the language and OS that run on hardware you don't have and can't afford?
And Unix was far more portable. Didn't have a PDP-11 either? No worries; it was almost entirely written in C. You could port it to your machine with a C compiler and a little bit of assembly (a very little bit, compared to any other OS). Didn't have C either? It was a small language; it wasn't that hard to write a compiler for, compared to many other languages. If you were a university, you could implement C and port Unix on your own. And once one university had done so for a particular kind of computer, everyone else could usually use their implementation.
And the final nail in the coffin of your style: programmers of that era far preferred languages that got out of their way to languages that held their hand. Your preferred kind of language failed because it didn't work with people. It may have worked with the people you wanted to have, but it didn't work with the people who actually existed as programmers at the time. A tool that doesn't fit the people using it is a bad tool.
But all is not lost for people like you. There are other languages that fit your preferences, and you can still use them. Just stop expecting them to become mainstream languages. The languages that the majority of programmers found better (for their use) are the languages that won. And that majority was not composed entirely of fools and sheep.
All the languages I care about that keep me away from C are mainstream.
I have not had any reason to touch C since 2001, other than to keep up with WG14 work, exploit analysis, and security mitigations from a SecDevOps point of view.
History lesson: there were OSes other than Multics.
Also, UNIX was only written in C as of Version 5, and contrary to urban myths spread by the UNIX Church, it wasn't the only OS written in a portable systems programming language.
It was the only one that came with free beer source tapes.
Now, with governments looking into cybersecurity bills, let's see how great C really is.
Were any of the others written in a portable language, and able to run on something as small as a PDP-11? Were any of them as easy to port as Unix was? Were any of them written in a language as easy to implement as C was?
If not, then you didn't really answer what I said. You made an argument that used some of the same words, but didn't actually answer any of the substance.
The security complaints... yeah. A safer language would give you a smaller footprint of vulnerabilities, which would have made your life easier for the last two decades. (Maybe you earned being bitter about C.) That's unrelated to the spread of C and Unix in the 1970s and 80s, though.
Ah yes, the PDP-11 example, as if hardware from 1958 - 1972 was more powerful.
Yes, they were as easy to port as UNIX was, for those that owned the code. Unisys still sells the Burroughs system as ClearPath MCP, nowadays running perfectly fine on modern hardware.
The 1980s, he says:

"Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."

-- C.A.R. Hoare in 1980, during his Turing Award speech
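To make "these checks" concrete: a subscript check is just a comparison against the array's extent before each access, which Hoare's compilers emitted automatically. A hand-written sketch of the same thing in C (a hypothetical helper, purely illustrative):

    #include <stdio.h>
    #include <stdlib.h>

    /* What a checked subscript does, spelled out by hand.
       In a checked language, the compiler generates this for you. */
    static int checked_get(const int *a, size_t n, size_t i) {
        if (i >= n) {
            fprintf(stderr, "subscript %zu out of range [0,%zu)\n", i, n);
            abort();   /* fail loudly instead of corrupting memory */
        }
        return a[i];
    }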
"Oh, it was quite a while ago. I kind of stopped when C came out. That was a big blow. We were making so much good progress on optimizations and transformations. We were getting rid of just one nice problem after another. When C came out, at one of the SIGPLAN compiler conferences, there was a debate between Steve Johnson from Bell Labs, who was supporting C, and one of our people, Bill Harrison, who was working on a project that I had at that time supporting automatic optimization...The nubbin of the debate was Steve's defense of not having to build optimizers anymore because the programmer would take care of it. That it was really a programmer's issue.... Seibel: Do you think C is a reasonable language if they had restricted its use to operating-system kernels? Allen: Oh, yeah. That would have been fine. And, in fact, you need to have something like that, something where experts can really fine-tune without big bottlenecks because those are key problems to solve. By 1960, we had a long list of amazing languages: Lisp, APL, Fortran, COBOL, Algol 60. These are higher-level than C. We have seriously regressed, since C developed. C has destroyed our ability to advance the state of the art in automatic optimization, automatic parallelization, automatic mapping of a high-level language to the machine. This is one of the reasons compilers are ... basically not taught much anymore in the colleges and universities."
-- Fran Allen, excerpted from Peter Seibel, Coders at Work: Reflections on the Craft of Programming
" We really are using a 1970s era operating system well past its sell-by date. We get a lot done, and we have fun, but let's face it, the fundamental design of Unix is older than many of the readers of Slashdot, while lots of different, great ideas about computing and networks have been developed in the last 30 years. Using Unix is the computing equivalent of listening only to music by David Cassidy. "
-- Rob Pike, in his Slashdot interview
And to finish on security,
> The combination of BASED and REFER leaves the compiler to do the error prone pointer arithmetic while having the same innate efficiency as the clumsy equivalent in C. Add to this that PL/1 (like most contemporary languages) included bounds checking and the result is significantly superior to C.
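For readers who never used PL/I: BASED with REFER lets a structure carry its own array extent, and the compiler does the address arithmetic and checking. The "clumsy equivalent in C" is, I'd assume, roughly the hand-maintained length-plus-flexible-array-member pattern; a sketch under that assumption, with illustrative names:

    #include <stdlib.h>
    #include <string.h>

    /* C analogue of a self-describing PL/I structure: the length
       travels with the data, but the programmer, not the compiler,
       must maintain the invariant and do every check. */
    struct buf {
        size_t len;
        char   bytes[];   /* C99 flexible array member */
    };

    static struct buf *buf_new(size_t n) {
        struct buf *b = malloc(sizeof *b + n);
        if (b) {
            b->len = n;
            memset(b->bytes, 0, n);
        }
        return b;
    }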
> Although we entertained occasional thoughts about implementing one of the major languages of the time like Fortran, PL/I, or Algol 68, such a project seemed hopelessly large for our resources: much simpler and smaller tools were called for. All these languages influenced our work, but it was more fun to do things on our own.
Two points in response. Then I'm done. You can have the last word if you want it (and if you're still reading this, this long after the initial post.)
First, "such a project seemed hopelessly large for our resources: much simpler and smaller tools were called for." This actually is what I was arguing - you could do things (like write or port an OS) with a much smaller group, which opened doors that were closed by more "advanced" tools/languages. The barriers to entry were lower, so lot more people were able to do a lot more things. There was massive value in that.
Second, it's not like the legislature made it illegal to work on all these other approaches. They were abandoned because the people working on them abandoned them. Nobody put a gun to their head.
This kind of goes back to the first point. C opened a lot of doors. Research stopped in some directions, because new areas opened up that people found more interesting.
In fact, this whole thing has a bit of a flavor of the elites mourning because the common people can now read and write, and are deciding what they're going to read and write, and it's not what the elites think they should. You (and the people you quote) are the elites that the people left behind. You think that yours is the right path; but the people disagree, and they don't care what you think.
(The point about hardware from 1958 being less powerful than a PDP-11 I will concede.)
> The nubbin of the debate was Steve's defense of not having to build optimizers anymore because the programmer would take care of it.
C compiler development has forgotten this. C compilers must not be too clever in optimizing. They should generate tidy code: allocate registers well, keep local variables in registers, and peephole away poor instruction sequences. But every memory access written in the program should actually happen.
Maybe Steve and all the early C people were philosophically wrong, but that's the language they designed; it should have been respected. In C, how you write the code is supposed to matter, like in assembly language.
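A concrete instance of the disagreement (behavior varies by compiler and flags, so take this as the typical case): the load written in the loop below is usually hoisted by a modern optimizer, so memory is read once instead of on every iteration; marking the flag volatile is how you force every written access to actually happen.

    /* As written, 'ready' is read on every iteration of the source.
       A modern optimizer will typically hoist the load and spin on a
       register, since nothing in the loop changes 'ready'. Declaring
       it 'volatile int ready;' makes every access really happen,
       which is the behavior argued for above. */
    extern int ready;

    void wait_for_flag(void) {
        while (!ready)
            ;
    }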
The problem with this approach, as anyone who coded on 8- and 16-bit home computers is well aware: anyone with a medium skill level in assembly programming could easily outperform the code generated by classical C compilers. Compiler vendors had to win the compiler benchmarks in computer magazines, so optimize by exploiting every UB trick in the book they did.
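One of those UB tricks, for the curious (typical for GCC and Clang with optimization on, though the exact behavior is compiler- and flag-dependent): signed overflow is undefined, so the "overflow check" below may be folded to a constant.

    /* Looks like an overflow check. Because signed overflow is UB,
       an optimizer may assume it never happens and compile this
       function down to 'return 1;'. */
    int will_not_overflow(int x) {
        return x + 1 > x;
    }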
Not by a long shot. The ability to switch between array notation and pointers is fantastic. People crying about it in 2023 is like crying that a 1971 Dodge Challenger doesn't have ABS and airbags.
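The duality in question, in a few lines: a[i] is defined as *(a + i), so either spelling works with either an array or a pointer.

    #include <stdio.h>

    int main(void) {
        int a[4] = {10, 20, 30, 40};
        int *p = a;                 /* the array decays to a pointer */
        /* a[i] is *(a + i) by definition, so all four print 30: */
        printf("%d %d %d %d\n", a[2], *(a + 2), p[2], *(p + 2));
        return 0;
    }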
Edit: I just realised the article was written by D's author! So maybe that is C's biggest mistake.