
On the history and justification of the C programming language - Tomte
http://pastebin.com/UAQaWuWG
======
dboreham
This completely misses the key factor: (most) computers were not big or fast
enough to run high level languages back then. So people who only got to use
small computers had to code in assembler (or cross-compile on a big
computer).

So anyone who could come up with a high(er) level language that would run on
smaller machines was a winner. It meant the difference between a life of
misery hacking assembler or a life of milk and honey coding in a high level
language. There was no "Oh, let's be stupid and omit <safety feature du jour>
from our language". It was more like "Wow, I got a compiler to run on that
PDP-11/20 and it only had to be split into 5 passes to fit into core".

There were many, many simple languages back then that occupied this niche. Only
C has survived into the modern era (thanks to open source compilers and Unix
and Microsoft's adoption of C for Windows NT). If things had turned out
differently we would be complaining about the lack of safety in code still
being written in Bliss or IMP or PL/S.

~~~
stcredzero
_This completely misses the key factor: (most) computers were not big or fast
enough to run high level languages back then_

From the 1st paragraph, that's what I thought they were going to talk about:

 _totally contradicts real reasons why C is designed the way it is. Even back
then, there were better languages /OS's [1] that cheap hardware just hadn't
caught up to. That cheap hardware's limitations, and them alone, drove the
design decisions of BCPL, B, and then C._

But you're right, they're mostly talking about the compilers.

 _So anyone who could come up with a high(er) level language that would run on
smaller machines was a winner._

I think that's what the Objective-C folks were thinking. I guess the author
would lament that they chose C as the other partner.

 _Only C has survived into the modern era_

Forth was around in those days. So was Smalltalk and Lisp. So maybe, "survived
as more than a niche language?" (Forth and Smalltalk are really from the 70's.
You could argue that only Lisp is from "those days.")

~~~
dboreham
I suppose I was thinking of the set of languages commonly used back then for
programming smaller computers. Smalltalk definitely wasn't aimed at small
machines. I guess I just wouldn't include Forth and Lisp in that set. Not sure
exactly why, to be honest. Forth certainly ran on very small machines. Both
were considered "weird" where I came from, but I acknowledge that isn't a
universally held view ;)

~~~
hga
Lisp first ran on a vacuum tube computer, which was definitely "small", and it
can certainly run fairly nicely on current "small" machines, although not
"very small" like I recall of Forth.

Smalltalk's machines weren't all that large by today's standards: memory was
128kB, expandable to 512kB per Wikipedia, but the expansions were addressable
only by bank switching. 128kB is the same memory budget as the Unix PDP-11/45's
for a single process.

~~~
nickpsecurity
Didn't know Smalltalk constraints were similar. Might have to add Smalltalk to
the essay somehow. Which computer did LISP start on again?

~~~
dalke
IBM 704 -
[https://en.wikipedia.org/wiki/Lisp_(programming_language)#Hi...](https://en.wikipedia.org/wiki/Lisp_\(programming_language\)#History)

~~~
nickpsecurity
Thanks! Wow. That's some tough hardware to work with. So, both Fortran and
LISP were developed on the same machine. Guess I just never thought about the
LISP hardware, given I started with the AI books & went straight to modern
work. Had I done so, I'd have wondered less about whether it could work in
constrained environments haha.

~~~
dalke
You view things perhaps too much through modern eyes.

For them, that was some amazing hardware to work with. A lot of new ideas were
tested out on the 704.

~~~
nickpsecurity
Oh yeah, I'm sure. I'm using modern eyes intentionally, in the context that many
people think a LISP or dynamic language can't run on constrained hardware. The
704 was pretty constrained by today's standards. Refutes that claim.

~~~
dalke
No, you weren't using "modern eyes intentionally in context". You wrote "I'd
have wondered less about whether it could work in constrained environments",
and your description comes across as unintentional. That's what I'm referring
to.

I grew up with microcomputers. My first machine had 16K, which I think is
about what a 704 had. There's a lot you can do with 16K. Altair BASIC ran on
4K machines. Even modern micropython, meant for microcontrollers, works in 64K,
and that language wasn't designed to be space efficient.

Oh, and from
[http://edn.embarcadero.com/article/20693](http://edn.embarcadero.com/article/20693)
:

> Turbo Pascal version 1.0 shipped on one floppy disk. The total number of
> files on the disk was 10. Total floppy disk space used was 131,297 bytes.
> Total size of TURBO.COM (including the integrated development environment
> with compiler, Wordstar-style editor, and run-in-memory system) was 33,280
> bytes.

~~~
nickpsecurity
"No, you weren't using "modern eyes intentionally in context". You wrote "I'd
have wondered less about whether it could work in constrained environments",
and your description comes across as unintentional. That's what I'm referring
to."

You're putting a lot of weight into words I wrote casually. The intended
meaning was that I hadn't investigated what a good, useful-today
implementation of LISP could do in what would be a highly-constrained system
today. Particularly, on cheap microcontrollers used in embedded systems. I
have seen LISPs like femtolisp that use around 1K for implementation with
good performance. So, original LISP being able to run on a 704 implied quite a
lot to me.

On the other hand, BASIC and Pascal are low-level, static 3GLs. They're easy to
compile into efficient code even without much optimization. I pulled it off for
BASIC without knowing much about compilers. I'm sure it took talent to get the
mix of functionality and efficiency in stuff like Turbo Pascal or some of the
Altair apps. It's not telling me much about how a dynamic LISP will work on
those platforms, though, given its execution style is so different. Especially
if I code in a way that doesn't penalize readability or maintenance for
efficiency.

Fortunately, there's a pile of papers I can go through if I want to determine
an optimal way to do that, many of them from the 70's-80's. The 704 result
just tells me embedded impact could happen if I kept it efficient. The
Micropython work is also impressive. I hope the point of my reply is more
clear now.

------
TorKlingberg
> There's a recurring theme where people think the C language's design is good
> for systems programming, even today. These people think that someone sat
> down, thought of every tradeoff, and made the best ones for system
> programming.

Or, people think that a good practical language can only come from using it to
solve real problems and iterating on the design. Sitting down and conjuring up
the perfect language just does not work.

Now, I am the first to agree that C has plenty of flaws. For one, the integer
types are just a mess. I will be happy if something can replace it, but so far
most attempts have ended up either too slow or too complicated.

~~~
kabdib
An example of convergent evolution -

Apple was primarily a Pascal shop back in the early 80s, when they were
developing the Lisa. But since bog-standard Pascal is useless for systems
programming, Apple extended it with things they needed: Pointer arithmetic (
_not_ auto-scaling by base type, so you were constantly multiplying by
sizeof), oh yeah, sizeof, and a bunch of other stuff I've forgotten. Pascal's
miserable I/O system was left on the floor. NEW was largely replaced by a heap
manager that worked for small systems. And so on.
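
To make the scaling point concrete, here's a tiny C sketch (my own
illustration, nothing Apple-specific): C scales pointer arithmetic by the
pointee size, so the constant multiplying-by-sizeof only shows up when you
drop to raw bytes.

    #include <stdio.h>

    int main(void) {
        int buf[4] = {10, 20, 30, 40};

        int *p = buf;
        p = p + 1;                       /* scaled: advances sizeof(int) bytes, now at buf[1] */

        char *raw = (char *)buf;
        raw = raw + 1 * sizeof(int);     /* unscaled byte arithmetic: same element, by hand */

        printf("%d %d\n", *p, *(int *)raw);   /* prints "20 20" */
        return 0;
    }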

So what Apple wound up with was "C, with nested procedures". The two languages
were nearly identical in semantics. I could spend a day hacking on something,
and if you asked me in the evening which language I'd been using I would have
had to think about it.

The types of bugs that people wrote didn't seem to vary depending on the
language they chose.

My informal survey is that other companies extended their versions of Pascal,
making similar choices. These companies (Apollo? a few others) have sunk
without a trace.

Languages that look a lot like C are what you get from systems level people
doing near-bare-metal programming. You're not going to fix the language, you
need to convince the people doing the work that you have something better. And
for the most part, something better doesn't exist.

You _can_ write low-level systems in higher-level languages; Microsoft's
Midori was an OS built from the metal up in C#, with a lot of interesting deep
thinking about abstractions and security. Smalltalk / SELF / FORTH are
interesting until you need to scale. LISP Machines didn't take the world by
storm in the 80s, and we didn't get very interesting fallout from all the
LISP-based OS work that was being done. ADA . . . I know one guy who loves
ADA, but he can't tell me what he works on.

Successful languages seem to be the result of commercially successful systems,
not the other way around.

~~~
microcolonel
This is exactly what I think of when I read this kind of unilateral negativity
about C. I love writing programs in C. I don't need to learn any new semantics
ever. Even the integer types, as the parent was complaining about, have been
largely cleaned up (I use (u)int(32,64)_t and (s)size_t and everything just
works).
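
For what it's worth, a minimal sketch of what that looks like (note that
ssize_t is POSIX, from <sys/types.h>, rather than ISO C):

    #include <stdint.h>     /* uint32_t, uint64_t */
    #include <stddef.h>     /* size_t */
    #include <inttypes.h>   /* PRIu32, PRIu64 */
    #include <stdio.h>

    int main(void) {
        uint32_t crc = 0xFFFFFFFFu;    /* exactly 32 bits on every platform */
        uint64_t off = 1ull << 40;     /* exactly 64 bits */
        size_t   len = sizeof off;     /* unsigned type for object sizes */

        printf("crc=%" PRIu32 " off=%" PRIu64 " len=%zu\n", crc, off, len);
        return 0;
    }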

The problem space is well understood, the tools are widely-deployed and
reliable, the performance is excellent. I just don't see how people can be so
negative about something which has been so immensely useful without getting in
the way.

~~~
blub
OP probably meant that C integer types are a mess because they underflow,
overflow or result in undefined behaviour at the smallest mistake. :)

------
CrLf
The best things always have detractors, yet they live on.

C is over 40 years old and is still in widespread use today, even though there
have been (supposedly) better languages targeting the same use cases for
decades now. There is more to it than just inertia.

I've been using it less and less over the years, mostly because I've also been
tackling fewer and fewer problems in the domains where it excels. It is still
my favorite language, though.

~~~
optforfon
I respectfully disagree. I think C is a giant disaster that will continue to
plague us entirely due to inertia. You're really underestimating how
entrenched it is. Try to go program a micro in something other than C. Try
programming a driver. The toolchains simply don't exist, and compatibility
with several decades of work is also shaky.

C is a terrible cross-platform assembly because it doesn't allow you the level
of control which one would expect in this day and age. Maybe it was written in
the days when CPUs were brain-dead simple, but basic things that are available
on most architectures aren't part of the language. It has no notion of a CPU
cache, it has no notion of branching (you can't tag a likely branch and an
unlikely one), it even goes as far as to ignore user keywords for inlining and
provides no keyword for blocking inlining. RVO is implicit magic that you just
pray happens. Const != immutability.

When you write C you have no notion of what the compiler is going to output
whatsoever.

A lot of these things are available through compiler extensions (so great, now
it's not cross-platform), but even still the language is broken. "Expert C
Programming" has a really long chapter that goes over all the more confusing,
subtle problems C has beyond just arrays being pointers-but-not-really.
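
Roughly what the extension route looks like, as a sketch assuming GCC or Clang
(which is exactly the not-cross-platform complaint; the function names here
are made up for illustration):

    /* branch hints */
    #define likely(x)   __builtin_expect(!!(x), 1)
    #define unlikely(x) __builtin_expect(!!(x), 0)

    /* ask the compiler never to inline this error path */
    __attribute__((noinline))
    static int handle_error(void) {
        return -1;
    }

    /* ask the compiler to always inline the hot path */
    __attribute__((always_inline))
    static inline int process(int x) {
        return x * 2 + 1;
    }

    int run(const int *data, int n) {
        int total = 0;
        if (unlikely(data == 0))       /* hint: this branch is rarely taken */
            return handle_error();
        for (int i = 0; i < n; i++)
            total += process(data[i]);
        return total;
    }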

I have to use C pretty regularly and all I can think is "God.. why hasn't
anyone made this better yet"

~~~
ArkyBeagle
'C' is quite sufficient for micros and drivers. Micros are supposed to be
small. Python's catching on with the kids for systems programming because of
Arduino. Drivers are ... well, _drivers_. If you write 'C', you _have to look
at the assembly_.

And adding inline assembler to manage the hardware - caching, that sort of
thing - is just how it's done. Maybe it's possible to make a library ( or a
bunch of #defines ) that manages this for you?
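
Something like this, as a rough sketch assuming avr-gcc on an AVR part (the
instructions and the asm syntax are compiler- and chip-specific):

    /* wrap the inline assembler once, use it everywhere */
    #define cli()  __asm__ __volatile__ ("cli" ::: "memory")  /* disable interrupts */
    #define sei()  __asm__ __volatile__ ("sei" ::: "memory")  /* enable interrupts  */

    static volatile unsigned char shared_counter;

    void bump_counter(void) {
        cli();                 /* enter critical section; "memory" clobber stops reordering */
        shared_counter++;
        sei();                 /* leave critical section */
    }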

I really am sorry you have to use it, but there are rather large populations
of people who go completely untroubled by it.

~~~
nickpsecurity
"And adding inline assembler to manage the hardware - caching, that sort of
thing - is just how it's done. Maybe it's possible to make a library ( or a
bunch of #defines ) that manages this for you?"

The current ones might be like C on that part, or might do something different.
I haven't evaluated that. Here's a few:

[http://www.astrobe.com/Oberon.htm](http://www.astrobe.com/Oberon.htm)

[http://www.mikroe.com/mikropascal/#](http://www.mikroe.com/mikropascal/#)

[http://www.mikroe.com/mikrobasic/](http://www.mikroe.com/mikrobasic/)

~~~
ArkyBeagle
Why is it we're not all using Oberon again? :) Ah, I remember having meetings
about things like that... now _that_ was inertia.

SFAIK (meaning I don't really know), the various "Mikro" compilers probably do
the same thing that the native toolsets for, say, PIC do, and simply add
default, built-in symbols for bits in control registers and things like turning
interrupts off/on. That keeps the amount of assembly further down.

I feel confident in saying that because they'd want to make defection from the
default (say, AVR) compiler chain as painless as possible.

And of course, Linux drivers/ioctl() help a lot with bringing kernel mode
things into user space. Using assembler in a driver is just one of those
things we've come to accept.

------
agentultra
... as if all development of C stopped with the first ANSI standard?

C99 and C11 are great languages. VLAs, anonymous structs, variable
initializers, atomics... they're not the C of the 80s and 90s. They still have
the usual warts, but the tooling is great and support for the newest features
is coming along on most platforms.

While the spirit of "trust the programmer" might have been a historical
accident, there is some merit to it. Our chief problem as programmers is
managing data: taking some source inputs, performing a transformation, and
sending the output somewhere. That's true of video games, web applications,
embedded sensors, etc. As a high-level language, C is one of the few that
gives the programmer some control over how memory is mapped to the target
platform and allows them to build the appropriate abstractions on that
hardware.

It's also a loaded weapon... humans make mistakes and inevitably those
mistakes make it into our programs. When working in a language like C one must
conduct oneself accordingly.

I'm a fan of FP and ML-style languages. I'm looking forward to a day when we
can have our cake and eat it too. It'd be nice to be able to define how memory
should be handled for a particular application at the granularity of the
target platform while still getting the safety guarantees.

I like where Blow is going with his Jai language. It'd be neat to see more
research in this area.

~~~
pcwalton
> C99 and C11 are great languages.

I think the amount of undefined behavior in C--and that undefined behavior is
necessary to compensate for its design mistakes (e.g. signed
overflow)--disqualifies it from being a "great language". Empirically,
virtually every single large C program in existence depends on undefined
behavior to continue working. And that UB is very difficult to spot [1].

> It'd be nice to be able to define how memory should be handled for a
> particular application at the granularity of the target platform while still
> getting the safety guarantees.

> I like where Blow is going with his Jai language. It'd be neat to see more
> research in this area.

You said that you wanted safety guarantees, but Jai is explicitly not about
safety guarantees. In fact, being memory unsafe is one of the key design
criteria behind the language. That's what differentiates it from for example
Rust.

[1]:
[https://news.ycombinator.com/item?id=11728324](https://news.ycombinator.com/item?id=11728324)

~~~
vonmoltke
> I think the amount of undefined behavior in C--and that undefined behavior
> is necessary to compensate for its design mistakes (e.g. signed
> overflow)--disqualifies it from being a "great language".

The undefined behavior is there primarily to account for the variation in
platforms. Higher-level languages avoid this by hiding it in a platform-
specific compiler or VM, which in turn limits the number of platforms the code
can actually be run on.

> Empirically, virtually every single large C program in existence depends on
> undefined behavior to continue working.

Citation needed. I get that undefined behavior is hard to spot, but having
spent years writing cross-platform C I cannot accept that every large program
depends on it without some evidence.

~~~
pcwalton
> The undefined behavior is there primarily to account for the variation in
> platforms. Higher-level languages avoid this by hiding it in a platform-
> specific compiler or VM, which in turn limits the number of platforms the
> code can actually be run on.

It may have started this way in the committee (though that isn't true for e.g.
type-based alias rules), but nowadays it's depended on for performance. You
lose a lot of ability to reason about loop trip count if signed overflow is
defined, for example.
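
A minimal sketch of the kind of loop I mean (my own illustration):

    /* Because signed overflow is undefined, the compiler may assume i never
     * wraps, so it can treat the trip count as exactly n + 1 and vectorize
     * the loop or turn it into a closed form.  If i were unsigned (wrapping
     * defined), i <= n with n == UINT_MAX would loop forever, and that
     * reasoning would be unavailable. */
    long sum_up_to(int n) {
        long total = 0;
        for (int i = 0; i <= n; i++)
            total += i;
        return total;
    }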

> Citation needed. I get that undefined behavior is hard to spot, but having
> spent years writing cross-platform C I cannot accept that every large
> program depends on it without some evidence.

The probability that a program depends on any of (nonexhaustive list) (a)
inspecting the value of a dangling pointer; (b) taking the offset from NULL;
(c) type-punning through a union; (d) type-punning through any type other than
char; (e) signed overflow; (f) order of operations between sequence points
quickly approaches 1.

Here's a very interesting survey investigating this:
[https://www.cl.cam.ac.uk/~pes20/cerberus/notes50-survey-
disc...](https://www.cl.cam.ac.uk/~pes20/cerberus/notes50-survey-
discussion.html)

~~~
vonmoltke
> It may have started this way in the committee (though that isn't true for
> e.g. type based alias rules), but nowadays it's depended on for performance.
> You lose a lot of ability to reason about loop trip count if signed overflow
> is defined, for example.

I agree it is depended on in many cases, and should not be. That is on the
_programmers_ though, not the language designers. The behavior does not exist
to cover deficiencies in the _language_ but those of its _users_.

> The probability that a program depends on any of (nonexhaustive list) (a)
> inspecting the value of a dangling pointer; (b) taking the offset from NULL;
> (c) type-punning through a union; (d) type-punning through any type other
> than char; (e) signed overflow; (f) order of operations between sequence
> points quickly approaches 1.

Just on your short list I can tell you the probability of any of our C code
depending on that approached zero:

> inspecting the value of a dangling pointer

Our code was malloc once, free never. Dangling pointers were by definition
impossible.
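
Roughly this pattern, as a sketch (hypothetical names, not our actual code):

    #include <stdio.h>
    #include <stdlib.h>

    struct state {
        double *samples;
        size_t  count;
    };

    /* All allocation happens once at startup; nothing is ever freed, so no
     * pointer can dangle from a free(). */
    static struct state *init_state(size_t count) {
        struct state *s = malloc(sizeof *s);
        double *buf = malloc(count * sizeof *buf);
        if (!s || !buf) {
            fprintf(stderr, "out of memory at startup\n");
            exit(EXIT_FAILURE);          /* fail fast rather than run degraded */
        }
        s->samples = buf;
        s->count = count;
        return s;                        /* valid for the life of the process */
    }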

> taking the offset from NULL
>
> type-punning through a union
>
> type-punning through any type other than char
>
> signed overflow
>
> order of operations between sequence points

Our coding standards covered all of those, so their existence would be a code
review failure.

We weren't specifically writing to MISRA C, but I will wager that your
statement is not accurate for any project that follows it or any similar
standard. Yes, it is a lot of work that you don't need to do with a more
controlled and less flexible language. I'm not going to challenge that. I'm
just challenging the idea that nobody is writing C that avoids undefined
behavior.

~~~
catnaroek
> Our code was malloc once, free never. Dangling pointers were by definition
> impossible.

Dangling pointers can arise in other ways: use an uninitialized pointer
variable, take the wrong offset from an otherwise valid pointer, etc.

~~~
vonmoltke
Those are indeterminate values but they aren't dangling pointers. Dangling
pointers are specifically references to an object that has been freed.

Plus, the cases you highlighted are programming errors, not reliance on
undefined behavior.

------
dleslie
The author snorts at aspects of C that led to its commonplace usage.

> 9\. New design goal: trim out anything hard to compile from CPL. Naturally
> eliminates most features for robust and maintainable programming.

Naturally? If those features are hard to compile, then they are hard to reason
about; the author begs the question in their statement, but I am not convinced
by their conclusion. There's a tendency for things which are hard to reason
about to be difficult to port to other systems; it is a complexity issue, and
the complexity of a behaviour influences how easily or swiftly it can be
re-implemented by others.

> 11\. The result of these was BCPL [5]: a typeless, word-oriented language
> with few keywords and unrestricted use of memory. Created philosophy of "the
> programmer is in charge and gets no help." Compiler was easy to write on
> Most Godawful Computer on Earth. Ran fast on it, too.

Great!

Sounds like BCPL was an excellent tool for their use case, and in providing
power with a low difficulty for implementation, it would be easy for others to
enjoy its benefits.

> 14\. [...] Stripped out or changed most features benefiting safety/security,
> maintenance, and consistency.

Because they were unnecessary. Don't over-engineer your products, or you'll
end up with a product that fails in the market; like MULTICS. It's like they
learned a thing or two from their past ventures.

My conclusion: C and Unix succeeded because they were simple to re-implement
and/or port. Other, more complex systems were too hard or near-impossible to
re-implement and/or port. At a time when good, cheap software was in demand
but scarce, the pair were well-positioned to sweep the market, and so they did.
Unix may have failed to reach the desktop, but C made it there, because it was
easy to implement, had a small footprint, and produced fast code.

It's not like C and Unix stopped development, either. Let's not ignore the
last 30 years, now.

~~~
hga
_Don't over-engineer your products, or you'll end up with a product that
fails in the market; like MULTICS._

While it might have failed in the long term anyway, Multics didn't fail
because of market reasons. It failed because first, Honeywell was incompetent
at developing serious hardware, e.g. officially blaming the failure of the
first (and only, I guess) clean slate Multics CPU project on the decision to
microcode the machine. Something which IBM had previously used in the
System/360 to dominate the market....

That caused more and more problems as its async CPUs fell further and further
behind, although this was still in the discrete logic days, and you could
easily use up to 6 in a configuration, and more with a kludge. But it was
still selling well, especially in France, where Bull seriously pushed it.

Then it was deliberately killed off due to internal politics. It seems that
trying to smash together three computer companies (Honeywell's original stuff,
GE's including Multics, and SDS -> Xerox -> Honeywell) creates a situation
where killing off rival systems makes sense to political winners who just
might not be technological winners.

That Multics was doomed by all this was self-evident to me by 1979, one
reason I never spent any serious time on it.

~~~
nickpsecurity
Interesting. I hadn't heard that angle before. So, you don't think a cost of
something like $7 million for probably less performance was going to end it?
As far as the rivals part, that same thing happened once HP had both NonStop
and OpenVMS. One had to go. Whereas Unisys did it on purpose when it acquired
CTOS, a distributed mainframe. They weren't taking any chances on that one haha.

~~~
hga
It all depends on what you want to do with it. Crunch numbers? No way after
some point. Prepare the Air Force's annual budget submission without
information leaking? A real example of a perfect fit. Or provide the IT for
MIT's EECS department? Good fit, until MIT-MULTICS was shut down as part of
Honeywell killing Multics and the Cambridge lab which did work on it (I played
a role in moving off that).

~~~
nickpsecurity
Ok, well that makes sense.

------
stephen82
I remember reading an answer on Quora by a professional C developer about why
none of the existing newly designed programming languages will ever replace C,
and to be honest with you, I agree with him.

[https://www.quora.com/Which-language-has-the-brightest-
futur...](https://www.quora.com/Which-language-has-the-brightest-future-in-
replacement-of-C-between-D-Go-and-Rust-And-Why/answer/Roger-DiPaolo)

~~~
fulafel
OTOH C has been replaced in many of its previous application areas, and the
process continues. A lot of the software people now write in JS, Python,
Clojure, C# or CUDA would have been written in C 20+ years ago.

------
moron4hire
EDIT: I got lost in the comment thread. I was intending to reply to this
article: [https://www.quora.com/Which-language-has-the-brightest-
futur...](https://www.quora.com/Which-language-has-the-brightest-future-in-
replacement-of-C-between-D-Go-and-Rust-And-Why/answer/Roger-DiPaolo)

Do Go, Rust, or D programmers really fantasize about their language replacing
C in any other context than their own work? The article sets up a huge
strawman of a caricature of a person to start, and then uses really rather
spurious arguments to take pot-shots at that strawman.

But just some easy ones, right off the bat:

"C is not a language for programmers that need to be babysat"

I think the number of buffer overflow exploits that we have seen in the wild
over the years indicates that C programmers really do need to be babysat. If
you're starting a project and you say "let's do it in C", the rest of your
team needs to be taking a really hard look at you and asking, "and just who do
you think you are?"

I have never met anyone who knows how to program in C "correctly", because
those that have opinions on how it's done always seem to be at odds with some
other cohort that is equally convinced of their own superiority.

~~~
ArkyBeagle
It's possible to write perfectly correct[1] code in 'C'. It's just that some
people can't be bothered, and they can't be bothered because it's somewhat
tedious.

[1] not to C.A.R. Tony Hoare levels of "provably correct", the _other_ sort
of correct.

Pre-open source, I _never saw buffer overflows_. When I was dealing with full
time, professional 'C' coders, this _didn't happen_. Because we all talked to
each other and taught each other how to _not do that_. It's like any other
skill or bundle of habits.

I did see the odd "signed integer going negative" or other integer overflow
problem, but that was much more about broken analysis and flawed testing.
Yeah, it took a little more time to make sure this was the case, but it was
just _required_.

------
planteen
I assume the author meant Bjarne Stroustrup for items 20 and 21. It's spelled
wrong twice, as Bjourne and Bjourn.

> Conclusion 3: C and UNIX should be avoided where possible because they're
> Bad by Design for reasons that stopped applying sometime in the 80's or
> early 90's.

What does the author propose instead? Is there a major OS that has a kernel
not written in C?

~~~
dalke
Perhaps the IBM operating systems?
[https://en.wikipedia.org/wiki/IBM_zEnterprise_System#Operati...](https://en.wikipedia.org/wiki/IBM_zEnterprise_System#Operating_systems)
. I can't tell what they are written in. Their heritage extends to before C
existed.

It also depends on where your major/minor cutoff is located.

~~~
planteen
Oh, true, I forgot about mainframe things. It also looks like VMS has other
languages than C:
[https://en.wikipedia.org/wiki/OpenVMS](https://en.wikipedia.org/wiki/OpenVMS)

I don't really know where the major/minor cutoff is myself. :)

Pretty much every OS I can think of is in C - Windows, *nix, RTEMS, INTEGRITY,
VxWorks.

~~~
nickpsecurity
Remember that a few of those RTOS's, although the OS itself is C, have
runtimes for Ada and Java that apparently sold well enough that they continue
to be updated. The only RTOS I know of that's still done in Ada is MaRTE OS:

[http://marte.unican.es/](http://marte.unican.es/)

Well, Muen is done in SPARK, but it's a separation kernel rather than a full OS:

[https://muen.sk/#_features](https://muen.sk/#_features)

That it was integrated into the GenodeOS framework shows one could build a
system around it, but it's not quite one itself. There's also the JavaCard
OS's for low-level stuff and even Java CPU's on the market.

------
hossbeast
The conclusions do not follow from the facts presented.

------
nickpsecurity
@ Tomte

Thanks for posting it, as the comments have valuable feedback. I was holding
off on responses until it dropped off the front page to let the discussions
happen unimpeded. I can already see ways to make the case more clear. What I
haven't seen is any contradictory evidence supplied with citations about (a)
why C's justifications _then_ no longer apply outside maybe embedded, or (b)
why it was better at a design level than others. A few good opinions, though.
A few comments also have some misinformation about C that the article itself
addresses, despite citations in the article... from the inventors of BCPL and
C... directly contradicting their claims. That's the psychological effect I
mention: cause and effect are so far separated in history that affected
people, not knowing the bad causes (or no longer applicable ones), start
inventing better, false causes to reinforce their choice of language or faith
in its design process.

All in all, I've enjoyed reading the feedback on my ranty summary of that
Vimeo vid, which I did in half an hour or so after doing a C debate. I did
condense hours of watching/reading into something digestible, but the
presentation is still shit in a number of ways. Many of the points survived,
with others needing to be improved or replaced to make the case clearer. Will
do in the near future. :)

------
raarts
In a world that is overflowing with competing and incompatible operating
systems, it's the most portable and fast language that wins over the hearts of
those writing software.

~~~
nickpsecurity
The _OS_ won due to a number of factors. The language came with it. Then, as
the OS spread, the compilers for its source language spread. Then that
language was everywhere. With that OS and language everywhere, especially on
cheap machines, the ecosystem effect kicks in, where people start building on,
using, or targeting it. Being the most portable and fast language isn't why it
succeeded, as UNIX could've been built with alternatives. UNIX itself was the
killer app that caused C to flourish.

------
sam4ritan
This makes a whole lot of good points. C is really not suited for most modern
applications, and some of its design choices are questionable (to say the
least).

On the other hand: in my opinion, C is a good language to show beginning
programmers how memory management is done. Basically, it could be a third
learned language, after a scripting language (like Python) and an OOP language
(Java or C++).

~~~
dalke
It's hard to figure out which are the good points and which are hyperbole.
"EDSAC was the second electronic digital stored-program computer to go into
regular service" and with "the world's first assembler" dating from 1949, says
the Wikipedia reference. As such, many languages can be traced back to that
"Most Godawful Computer on Earth (TM)".

Or, "4\. ALGOL60 [4] had many nice features of languages today but was too
theoretical: only existed on paper." Yet its reference 4, at
[https://en.wikipedia.org/wiki/ALGOL_60#ALGOL_60_implementati...](https://en.wikipedia.org/wiki/ALGOL_60#ALGOL_60_implementations_timeline)
, lists a number of ALGOL60 implementations even in the 1960s.

~~~
dllthomas
Algol60 could never be taken seriously as a mainstream language, lacking
Algol-like syntax.

~~~
AnimalMuppet
Algol60 lacked _Algol-like_ syntax?

In one sense, that's almost tautologically impossible. I presume you mean
"like the _other_ Algols", which makes it a non-absurd statement, but... could
you expand or clarify what you mean here?

~~~
dalke
I believe it's a joke. Gold isn't a _gold-like_ substance because it is gold.
Algol isn't an _Algol-like_ language because it's Algol.

If so, I don't believe it's a good joke.

~~~
dllthomas
It's a joke, but not the one you thought.

The joke is that the syntax of Algol didn't actually look all that much like
what people mean when they say "Algol-like syntax". No curly braces anywhere
to be seen.

~~~
dalke
As my introduction to Algol-like languages was through Pascal, the joke
completely passed me by.

~~~
dllthomas
I could see that putting you in a place where encountering people misusing
"Algol-like" to mean C-like, when trying to distinguish from something more
like Algol, would be extra noticeable and annoying.

Likely you have just been lucky :)

~~~
dalke
I hang around computational chemists. They wouldn't say "Algol-like" in the
first place. So, double lucky!

------
AnimalMuppet
C is horrible but Algol60, CPL, and Algol68 were wonderful? Um... sure. (Backs
away slowly...)

This reeks of ideological blindness, of having reached a conclusion and then
throwing mud to try to support it, no matter how biased a twist on the facts
it takes.

------
pif
Calling UNIX Bad by Design... oh my gosh!

~~~
gcr
If you thought that was entertaining, you'll get a kick out of reading the old
UNIX Hater's Handbook.

[http://www.vbcf.ac.at/fileadmin/user_upload/BioComp/training...](http://www.vbcf.ac.at/fileadmin/user_upload/BioComp/training/unix_haters_handbook.pdf)

In particular, at least check out Page 36, showing an articulately hand-drawn
Dennis Ritchie, in stoic defiance, blowing raspberries at UNIX critics through
his thick-rimmed glasses. It's by far the high point of the book.

~~~
loeg
The nice part about the Unix Haters book is that the authors were actually
familiar with their subject material :-).

~~~
nickpsecurity
The funny thing about gcr's comment is that I posted ESR's review of the UNIX
Hater's Handbook here before. He agreed with a good chunk of it while noting
other stuff had changed or was ranting. As for your comment, I came to C after
using alternative systems languages that compiled faster, crashed less, and
ran fast. After I figured out the basics of C & read secure coding books, I
said "F#$@ this!" It was an unnecessary and horrible development experience.
So, I coded up a 4GL that let me develop in an industrial BASIC that compiled
to C apps with checks in, so I could use their compilers and libraries.
Macro'd out boilerplate. People thought my "C code" was fast and thorough, but
I never wrote in that garbage language except for prototyping the tool. (Long
gone, so don't ask.) My application areas were OS-level stuff, games, AI,
networked apps, and GUI apps. Shows how easy it would've been to ditch, given
I sucked at compiler design.

Eventually, I discovered how it was created, that most proponents' claims
weren't an accurate depiction, and did this write-up. Glad you enjoyed that I
kept it close to subject material from its inventors. Meanwhile, the C
inventors went on to design, with Pike's help, a better language that solves
most of C's problems while maintaining a Wirth-like level of safety and
development. Pick up the Go language if you want to know what that experience
is like. Personally, I'm more for a subset of Modula-3 with Rust's dynamic
safety, SPARK Ada's extensions for correctness, and macros for DSL and
productivity benefits. Or an efficient functional one with similar properties.

------
myst
Good overview. Wrong conclusions.

------
gravypod
I think a large problem with this is that people unreasonably idealize the
features of ALGOL 60 and many other "Perfect Systems Languages".
The issue is that there are not any major open source projects to examine that
use these languages to see how they actually work in the wild. For example,
writing a Java codebase with nothing other then the standard library you can
do some very nice design patterns. You can come up with some very clean and
concise code and everything can be perfect. Then, when you go to integrate
with someone else's code (see Spring) you notice that everything becomes a
mess since they are using some crazy things that make it very difficult to
integrate with it.

Due to this, it's impossible to say how well another language would behave in
the wild, since I don't think anyone can predict how well the language will be
used to its full extent.

That being said, I think C succeeded because it was SIMPLE. Not only for the
compiler, but I can teach pretty much anyone how to understand C in most
likely a few days to a week. It's very simple, and there's very little you
need to know to start writing your programs.

On the other hand, ALGOL is MASSIVE. There are 106 reserved identifiers in the
language according to Wikipedia, and I don't think I could ever remember what
every one of them did.

Currently C has about 32 keywords, and I can tell you in full detail what
every one of them means, except union, as I don't work much with C.

This is good for the beginning programmer, since it shortens the time they
need to hit the books and helps them jump straight into writing.

Something that's good for the advanced programmer, in my opinion, is the use
of braces and not words.

I know this is kind of controversial, but I do honestly think that code
should be easily segmented by the eye, not just readable.

Begin and End only help so much, and by so much I mean not at all. Words
should not be syntax, in my opinion. Seeing { } is less to look at and
understand than seeing Begin and End.

This same phenomenon can be seen in the recent Linux 1000-patches debacle.

Linus said he didn't want to replace Unix file permission numbers with
symbolic values. Why? Because it's much, much faster for someone to see the
numbers and know what's going on than to have to read a bunch of text. It's
much quicker to parse.

That's where the breakdown comes in these discussions. People will throw
"You've never used these, how do you know yours is faster."

While that's true that doesn't really present a good argument for changing
everything.

I also hate needing to type := for assignment. It makes no sense. The beginner
issue of missing the second equals sign in an if statement has a much simpler
fix: use some kind of IDE/editor with a linter. You'll not have this problem.
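
For example (and compilers typically catch this too; gcc -Wall flags an
assignment used as a truth value):

    #include <stdio.h>

    int main(void) {
        int x = 3;

        if (x = 5)          /* bug: assigns 5 to x, condition is always true */
            printf("oops, x is now %d\n", x);

        if (x == 5)         /* the intended comparison */
            printf("x equals 5\n");

        return 0;
    }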

~~~
AnimalMuppet
> I can teach pretty much anyone how to understand C in most likely a few days
> to a week.

I learned C from reading K&R over a Thanksgiving weekend, without even a
compiler to try it out on. The only thing that confused me was the explanation
of argc and argv.

~~~
nickpsecurity
I learned an industrial BASIC in a few hours, if you add it all up. Learning
how to use it reliably took weeks, with safety mostly coming by default.
Learning that for large stuff took several months, where I learned programming
practice. How long do these milestones take for the average C developer, and
with what productivity? Note that my Pentium 3 at 400MHz could compile the
compiler in a second or two. Fast iterations. :)

Learning C was a different experience entirely. The mental work involved was
huge, with the level I worked at for _any job_ being micro-managing the
machine. It's only simple on the surface. The job is much harder than in a
BASIC, Oberon, or whatever with selective use of unsafe features.

~~~
AnimalMuppet
I said that I understood the language after reading K&R. The language is
simple. Using it safely? Not so simple. I didn't claim that it was.

On the other hand, I learned BASIC on a TRS-80. Safe? Not with the POKE
command being part of the language.

~~~
nickpsecurity
Ok. Fair enough haha.

