
The Unreasonable Effectiveness of C - daschl1
http://damienkatz.net/2013/01/the_unreasonable_effectiveness_of_c.html
======
madhadron
Oh for heaven's sake. Yet more ignorance.

A more realistic view of C:

\- C is straightforward to compile into fast machine code...on a PDP-11. Its
virtual machine does not match modern architectures very well, and its
explicitness about details of its machine means FORTRAN compilers typically
produce faster code. The C virtual machine does not provide an accurate model
of why your code is fast on a modern machine. The social necessity of
implementing a decent C compiler may have stifled our architecture development
(go look at the Burroughs systems, or the Connection Machine, and tell me how
well C's virtual machine maps to them).

\- C's standard library is a joke. Its shortcomings, particularly around
string handling, have been responsible for an appalling fraction of the
security holes of the past forty years.

\- C's tooling is hardly something to brag about, especially compared to its
contemporaries like Smalltalk and Lisp. Most of the debuggers people use with
C are command line monstrosities. Compare them to the standard debuggers of,
say, Squeak or Allegro Common Lisp.

\- Claiming a fast build/debug/run cycle for C is sad. It seems fast because
of the failure in this area of C++. Go look at Turbo Pascal if you want to
know how to make the build/debug/run cycle fast.

\- Claiming that C is callable from anywhere via its standard ABI equates all
the world with Unix. Sadly, that's almost true today, but maybe it's
because of the ubiquity of C rather than the other way around.

So, before writing about the glories of C, please go familiarize yourself with
modern FORTRAN, ALGOL 60 and 68, Turbo Pascal's compiler and environment, a
good Smalltalk like Squeak or Pharo, and the state of modern pipelined
processor architectures.

~~~
SeanDav
Madhadron, you make a lot of claims but provide no detail. Also using terms
like "virtual machine" with respect to C is plainly ridiculous and a case of
bullshit baffles brains.

Turbo Pascal vs C? Really? In its time Turbo Pascal was an amazing piece of
software but in the grand scheme of things it is a pimple compared to the
whale that is C. Please compare all software written in Turbo Pascal as
opposed to C if you have any doubts. The same goes for Smalltalk, Lisp, Algol
60/68. All great products/languages but very niche.

Fortran can be faster than C in some areas but again it is a niche language.

I could go on a lot more but quite frankly I don't think your post merits much
more discussion and is borderline trollish.

~~~
eropple
_> Madhadron, you make a lot of claims but provide no detail. Also using terms
like "virtual machine" with respect to C is plainly ridiculous and a case of
bullshit baffles brains._

C as a very thin virtual machine is a common conception and not an incorrect
one--C runs on many systems with noncontiguous memory segments but presents
memory as a single contiguous space, for example. The idea of C as a virtual machine
is much of the basis of LLVM, and to the best of my knowledge I've never
worked on a computer where C represented the underlying hardware without
significant abstractions.

If you're going to accuse somebody of trolling, you should know what you're
talking about first.

~~~
SeanDav
_> C as a very thin virtual machine is a common conception and not an
incorrect one_

I have worked extensively in the past with C and have never heard it referred
to as a virtual machine, thin or otherwise. I understand that the OP probably
means "computation model" or something similar, but I felt that the use of the
phrase "virtual machine" was a bit on the bombastic side, and that, in
addition to the general tone of the post, made me think the post was
borderline trolling.

BTW. I would be quite happy to be proved wrong about C commonly being referred
to as a thin virtual machine - what books/literature refer to C in this way?

~~~
jerf
"I have worked extensively in the past with C and have never heard it referred
to as a virtual machine, thin or otherwise."

True. Usually other terms are used; for instance, "memory model". If you
google that you'll find some things. As you read them, notice that the model
may or may not match the hardware you are actually running on; given the
simplicity of C's model, nowadays it almost never does.

C is a low-level language that lets you get close to the machine, and even lets
you drop into assembler, but it is true that it is enforcing/providing a
certain set of constraints on how the library and code will act that do not
necessarily match the underlying machine. It may not be a "virtual machine",
depending on exactly how you define that term, but the idea isn't that far
off.

Also, this is a good thing, not a criticism. If it really was just a "high
level assembler" that provided no guarantees about the system, it would be as
portable as assembler, which is to say, not at all.

For a much clearer example of this sort of thing in a language similarly
thought to be "close to the metal", look at C++'s much more thorough
specification of its memory and concurrency model, and again notice that this
model is expected to be provided everywhere you can use a C++ compiler,
regardless of the underlying hardware or OS. It is, in its own way, a virtual
machine specification.

------
cageface
_C is a fantastic high level language._

Nonsense. This sudden C fad is totally baffling to me. C is a great language
for low-level work and it does have an attractive minimal elegance but is in
absolutely no sense of the term a _high level language_. A language with next
to no standard containers or algorithms, manual memory management, raw
pointers, a barely functional string type and minimal standard library and
concurrency primitives is just the _wrong_ choice for any application that
doesn't need the kind of low-level control C provides.

Go ahead and jump on the bandwagon but don't come crying to me when you wind
up with an unmaintainable mess riddled with security holes and fiendishly
subtle memory errors.

~~~
Thrymr
> C... is in absolutely no sense of the term a high level language.

Of course it is. You're not worrying about how many registers your CPU has, or
when you swap a register out to memory. All that has been abstracted away for
you. The fact that you can access memory locations directly gives you some
low-level access, but in a very real sense C is a high-level language. There
are higher-level languages with more abstraction, of course. From the article:
"It's not as high level as Java or C#, and certainly no where near as high
level as Erlang, Python, or Javascript."

~~~
ryeguy
The problem is that the "level" of a language is relative to other languages.
If C is a high level language, that means you can group it into the same pool
as Java, C#, Haskell, Python, and Ruby. Don't you see a bit of a difference
here? Unless there is a significant number of actual _languages_, not
_concepts_, that are below C, it logically has to be called a "low level
language" because there really isn't much below it but there are TONS of
languages above it.

At the very least, it's something like a medium-level language. Grouping it
with much higher-abstracted languages is just wrong.

~~~
peterevans
Here's the thing: "high-level", when applied to a programming language, has a
historical context. It means something specific (see:
<http://en.wikipedia.org/wiki/High-level_programming_language>), and what it
means and has always meant is that the language in question abstracts away
registers and allows structured programming (as in, through the use of if,
while, switch, for, as opposed to using labels and branching).

It's fine if you want to call "high-level" relative. But it should be
acknowledged that "high-level" is not simply a strict comparison between the
features of one language and another. And it SHOULD be acknowledged that C is,
by convention, a high-level language. Python is definitely a higher-level
language, but C is still a high-level language.

~~~
bunderbunder
Here's the thing: The English language is polysemous. Computer science jargon
even more so.

Meaning that any particular term can often have many gradations of meaning. So
"X means Y" does not necessarily imply "X does not mean Z". Especially when Y
and Z are similar concepts.

I suppose we could lament the inherent ambiguity of the jargon, but the truth
is that for the most part problems only result when ambiguous language is used
in combination with an argumentative person armed with equal measures of
pedantic zeal and failure to grasp the fundamental characteristics of natural
language. Without that element, for the most part any reasonably knowledgeable
person should be able to figure out which of the particular meanings is at
play from context. For example, even though the term "high level language" is
used in multiple ways, the statement "C is not a high level language" is not
all that ambiguous, even when considered in isolation. As long as you're
willing to grant that the person making the statement is not an idiot, then
it's trivial to determine that they weren't using the "everything but machine
and assembly language" definition.

~~~
saraid216
> polysemous

I did not know this word. Thank you.

------
shadowmint
It also has a few downsides:

\- The build system is broken.

vs. make. qmake. cmake. autotools. scons. 'modern' makefiles (>_> what does
that even mean? Yes, I'm looking at you Google) There's a whole ecosystem of
tools out there to solve this.

\- There are a few rubbish IDEs, most of which support C as a second-class
citizen.

VS has officially abandoned C; xcode grudgingly supports it. The CDT is
mediocre. There's little or no support for refactoring or doing other things
on large code bases, and the majority of the time the work has to be done
manually.

\- Because the build system is broken, dependency management is hell

Got one library that uses scons, another using autoconf, and want to build a
local dev instance of a library _without installing it into the system path_?
Goooood luck.

This is made even worse by arcane things like rpath, which mean that dynamic
libraries only work when they are in specific paths relative to the
application binary (I'm looking at you OSX).

\- It's a terrible collaboration platform.

Why? Because the barrier to entry is high. Submitting a patch that works,
doesn't break other things, and doesn't introduce serious security holes or
memory leaks is hard.

Astonishingly, some successful projects like SDL are actually written in C,
but most C projects are not collaboration friendly.

\- There are no frameworks for doing common high productivity tasks in C, only
low level system operations, or vastly complicated frameworks with terrible
APIs (>_> GTK).

I'll just whip up a C REST service that talks to mysql and point nginx at it.
Um... yeah. I'm sure there's an easy way to do that. ...perhaps...?

How about a quick UI application? We'll just use GTK, that's portable and
easy. ...Or, not very easy, and vastly complicated to setup and build.

These issues aren't unique to C, but they're certainly issues.

I'm not really sure I'd happily wander around telling people how amazing C is
at everything.

It's a tool for some jobs; definitely not all.

~~~
kyrra
> It's a tool for some jobs; definitely not all.

I'd completely agree. I would not use C for doing any web platform work
(writing a REST service as you say). I might write a webserver in C if I had
tight memory constraints.

Where I see C still being very useful: embedded applications, drivers, and
latency sensitive code.

When you are trying to push the most I/O possible through a network interface
or disk interface, C allows extremely tight controls on memory and CPU usage.
Especially if you are coding to a specific CPU for an embedded product, you
can really tweak the app to perform as required (though, this may require some
assembly to do what you need).

~~~
dakimov
I have done low-level and mobile programming on very restricted platforms and
I cannot see any reason why in the world I would use C instead of C++.
Basically there is always an opportunity to use C++ if you can use C. Myths
that C++ is slower are spread by people who just do not know C++ well or are
not skilled/clever enough to use it.

~~~
yuushi
Indeed. And there are numerous reasons why C++ code can be significantly
faster: first, inlining of code that in C would have to go through function
pointers (à la qsort vs std::sort); second, things such as expression
templates for matrix libraries.

------
geophile
C/C++/Java. A programmer's version of Rock/Paper/Scissors.

Ignoring pre-history (BASIC, FORTRAN, PDP-11 assembler, Z80 assembler,
Pascal), I started out in C, many years ago. I found myself using macros and
libraries to provide useful combinations of state and functions. I was
reinventing objects and found C++.

I was a very happy user of C++ for many years, starting very early on (cfront
days). But I was burned by the complexity of the language, and the extremely
subtle interaction of features. And I was tired of memory management. I was
longing for Java, and then it appeared.

And I was happy. As I was learning the language, I was sure I missed
something. Every object is in the heap? Really? There is really no way to have
one object physically embedded within another? But everything else was so
nice, I didn't care.

And now I'm writing a couple of systems that would like to use many gigabytes
of memory, containing millions of objects, some small, some large. The
per-object overhead is killing me. GC tuning is a nightmare. I'm implementing
suballocation schemes. I'm writing microbenchmarks to compare working with
ordinary objects with objects serialized to byte arrays. And since C++ has
become a hideous mess, far more complicated than the early version that burned
me, I long for C again.

So I don't like any language right now.

~~~
slurry
> Ignoring pre-history (BASIC, FORTRAN, PDP-11 assembler, Z80 assembler,
> Pascal)

A side effect of C universalization, especially with open source, is that
people forget all about that pre-history. Hacker culture now is mostly C/Unix
with a dash of Lisp and Smalltalk heritage.

But from what I remember of microcomputer culture in the early 80s, hackers in
the field were doing line-numbered Basic and Assembly language - if you had
exotic tastes maybe Forth or something. If you were a Basic guy C was
DANGEROUS (my programming teacher spoke of C the way Nancy Reagan spoke of
crack cocaine) and if you were an assembly guy C was a slow hog.

And if you had access to a "real" computer, chances are it was running PL/I,
COBOL or Fortran, not C.

The 1983 video game Tron had levels named after programming languages:
"BASIC", "RPG", "COBOL", "FORTRAN", "SNOBOL", "PL1", "PASCAL", "ALGOL",
"ASSEMBLY", "OS" and "JCL". C apparently did not merit mention; its complete
dominance didn't come until some ways into the PC era.

~~~
geophile
SNOBOL. I loved SNOBOL. A completely bizarre, very powerful language. I took
two wonderful compiler classes from R. B. K. Dewar, who worked on the Spitbol
implementation. He was also involved with the SETL language, which really
impressed me.

~~~
eugenejen
And SETL influenced ABC, then Python.

------
AshleysBrain
I always get bashed for saying I like C++, but I genuinely don't get how C
programmers manage without code like this:

    
    
        std::map<std::string, std::vector<std::string>> foo;
    

One line and you've set up a nontrivial data structure with automatic memory
management. No macro horrors (which are a diabolical way of implementing what
C++ templates do well).

Since you can code like C in C++, I'm not sure why more people don't use
C-with-templates as a programming style.

~~~
xamuel
C guy here. I'm assuming you're serious and not being sarcastic... which is a
nontrivial assumption, because that line you posted looks like something out
of the foul depths of hell.

You aren't going to run out of lines any time soon. Who cares whether your
data structure definition is 1 line or 5?

~~~
hackinthebochs
> because that line you posted looks like something out of the foul depths of
> hell.

I must say I laughed out loud at that, mainly because I've been on both sides
of the divide when it comes to opinions on verbose std definitions. It is an
odd feeling to simultaneously feel revulsion and nostalgia towards a line of
code.

I do think there is a non-trivial, and sometimes massive boon gained from
concise definitions. It's the difference between an acronym in natural
language vs referencing a concept by its full verbose name. The shorter the
definition of a concept, the fewer units used by your working memory when
referencing that concept, thus freeing your mind to higher level
considerations.

~~~
to3m
I agree - I've found the STL a massive productivity boost, for all its
ugliness. It's worth it for std::vector and std::string alone, and I also like
std::map (even though the API teeters on the boundary between genius and
insanity) and std::set. All there for your use, with your own data types, out
of the box.

I discovered the STL one day in February 1999, by accident, when looking for
something else in the VC++ help. Once I realised what I'd found, I gave up C
entirely within about 1 week.

------
galaktor
"I always have it in the back of my head that I want to make a slightly better
C. Just to clean up some of the rough edges and fix some of the more egregious
problems."

I immediately thought of this: "Go is like a better C, from the guys that
didn’t bring you C++" — Ikai Lan [1]

And that's what it feels like to me when I use it.

[1] <http://go-lang.cat-v.org/quotes>

------
skrebbel
To nitpick on a single statement:

> _C has the fastest development interactivity of any mainstream statically
> typed language._

What? Despite all its shortcomings, Java in a modern IDE effectively has
_zero_ build time. The level of interactivity is as fast as that of
interpreted languages. I haven't seen anyone manage this with C yet.

~~~
henrik_w
Agree completely about zero build time!

I've been using IntelliJ IDEA for Java development for the last 4 years, and
before that I used C and C++ using Emacs for 7 years.

I am way more productive in IntelliJ IDEA than I was before. One reason is the
instant feedback on syntax errors when I type the code. I don't need to
compile to see them, as I used to in C and C++. Another reason is the
navigation support you get in an IDE.

I've written more about the differences in development environment here:
[http://henrikwarne.com/2012/06/17/programmer-productivity-
em...](http://henrikwarne.com/2012/06/17/programmer-productivity-emacs-versus-
intellij-idea/)

~~~
alexkus
That's the IDE not the language.

Java, itself, does not highlight your syntax errors. Nor does C.

There are IDEs out there that will do exactly the same thing for you for your
C code.

~~~
chadzawistowski
The difficulty of writing static analysis software does depend on the
language, though. In Visual Studio, for example, intellisense and
autocompletion are vastly superior in C# compared to C++.

As a matter of interest, which C IDEs are you referring to? I'd like to check
them out.

~~~
octopus
Try Xcode 4.5.x (which uses Clang as the default C compiler).

VS2012 has similar capabilities for C++.

~~~
chadzawistowski
VS2012 is actually the IDE I was thinking of when I said VS C++ support is
inferior to C#'s. While there is intellisense, it does not display any
documentation. The closest thing it offers is variable names for a method's
arguments.

------
stcredzero
1) Pick a popular language

2) Figure out something controversial to say about it and a justification for
that

3) Write it up and post it to reddit, HN, &c

4) Enjoy the hits

EDIT: Which is enabled by this - <http://news.ycombinator.com/item?id=5037649>

~~~
manish_gill
Pretty much this. Instead of reading about how awesome C is, I would much
rather read a concrete example of a problem which you were facing and how you
used C to solve it. Get to learn something in the process.

------
sramsay
I feel like I'm always trying to make this point to people, and doing so with
far less eloquence. The fact that Katz is (a) an experienced and accomplished
developer and (b) someone with very non-trivial experience with hot, hip,
super-high-level languages should not go unnoticed. He's not some cranky old
embedded systems programmer; he's doing thoroughly modern development.

Someone the other day said that the great thing about C is that you can get
your head around it. When Java, or C++, or Erlang, or Common Lisp, or Haskell
programming becomes unbearable, it's almost always because the chain of
abstractions overwhelms the person building the system. And to be honest, it's
why I'm a little less enthusiastic than most of the people here about Rust.

~~~
MichaelGG
Can you give me an example of a problem in Haskell, where the abstractions are
so complicated you can't get your head around it, but putting it back in C
makes it easy enough to deal with?

A more plausible explanation is that you wouldn't even attempt a similar
design in C because it'd be obviously impossible.

~~~
dmpk2k
An industrial operating system. In C, you know what you've got: its exact
dimensions, location, and lifetime. Since you're not relying on a GC, you can
make latency more predictable and with much lower upper bounds.

On that topic, I'd rather write crypto code in C too.

------
plg
If you tried to use an F1 racecar as your day to day commuter, of course you
would complain about it. It would be horrible. Tune the engine every day!?
Tire grip changes over TIME? Must watch oil temps so carefully? What a pain in
the ass. Also easy to crash.

If you tried to use your soccer-mom minivan to participate in F1 races, of
course you would complain. I can't tune the engine? I can't tinker with the
oil pressure?!? How do I adjust performance for different tracks?! This thing
sucks. Also it's slow.

These kinds of debates about C and other "higher" level languages are growing
tiresome. We live in a wonderful world full of different tools for different
purposes. Maybe your grandma doesn't want to worry about details under the
hood, and she also doesn't need to race F1 cars, so a minivan suits her just
fine. Maybe you value being able to tune your code for maximum speed and you
don't mind (or even enjoy) the challenges involved in C code. Good for you.

yawn...

~~~
kgabis
I really like your analogy.

------
zvrba
Oh, spare me, a high-level language that can't even do arithmetic properly. It's
anything _but_ "damn successful as an abstraction over the underlying
machine".

Case in point: Whenever you're doing signed arithmetic, and it overflows,
you're in the land of undefined behaviour. (See here for an example of how
this can bite you: <http://thiemonagel.de/2010/01/signed-integer-overflow/>)

Another case in point: type-punning through pointer casting is also UB.
[Going through a union is legal, though.]

I'm pretty sure the author hasn't written a line of ANSI-C compliant code in
his life, otherwise he'd never write something like this.

~~~
cookiecaper
The author doesn't claim it's perfect or that there's no way to improve on it,
but I think he has a very salient point, which is that most of the OO
buzzwords and programming fads that come and go in "higher-level languages"
simply end up making a bigger mess of things as a project grows. Basically,
the author loves C because it is simple, straightforward, and restrictive --
it forces you to write [relatively] simple, straightforward code too, instead
of concocting a terrible Frankenstein of custom classes and types intertangled
into a grotesque, intractable mass of dependencies and subdependencies. If
something is in C, you know it is going to be built from the basics, and in
many cases, this simplicity is a life saver as a project matures.

There are definitely annoyances and issues, but they are known and can be
taken into account with much less hassle than attempting to grok a Java
project that requires you to traverse into the basest-level of classes like
BusinessObject2013SingletonDispatcherFactoryFactory every time something needs
to be debugged or fixed.

It doesn't mean you should use C for your Web 2.0 startup, but his point is
well taken. I heard someone 'round these parts once acknowledge that the
modern equivalent of spaghetti code (i.e., code that intractably descends
through hundreds of code paths with gotos, etc.) is OO hell, i.e., code with
huge dependency stacks and equally intractable and unjustifiable inheritance
models, where you have to descend into all kinds of classes and special cases
to make a meaningful change.

>I'm pretty sure the author hasn't written a line of ANSI-C compliant code in
his life, otherwise he'd never write something like this.

Damien Katz is actually fairly accomplished, and it definitely sounds like
he's written multiple lines of C to me.

~~~
zvrba
> OO buzzwords and programming fads that come and go in "higher-level
> languages" simply end up making a bigger mess of things as a project grows

No, it's not the language features which make a mess. It's people lacking
judgement and common sense. (Like, trying to apply patterns everywhere. Been
domain-specific [crypto] consultant on such a project and watched it smash the
schedule by more than 2x. It wasn't the language [Java], it was stupid
people.)

> sounds like he's written multiple lines of C to me

I don't dispute that he's written a lot of C. I _DO_, however, dispute that
he's written much ANSI C. If you're writing ANSI C, you have to account for AT
LEAST all of the following behaviors:

[https://www.securecoding.cert.org/confluence/display/seccode...](https://www.securecoding.cert.org/confluence/display/seccode/CC.+Undefined+Behavior)
[https://www.securecoding.cert.org/confluence/display/seccode...](https://www.securecoding.cert.org/confluence/display/seccode/DD.+Unspecified+Behavior)

For example, if f is some function, then in the sequence

    
    
        int x = 0;
        f(x++, x++);
    

the call to f is undefined behavior because x is modified twice without an
intervening sequence point. Anybody claiming that such a language is
"high-level" is a moron, regardless of their "accomplishments".

~~~
ehaliewicz
In Scheme, argument evaluation order is not specified. So (pretending that
set! returns the assigned value), (f (set! x (+ 1 x)) (set! x (+ 1 x))) is
undefined and unpredictable as well. I don't think anyone would claim Scheme
isn't high level.

~~~
zvrba
"Undefined behavior" in C is much more insidious than merely "unspecified" or
"unpredictable". All those security holes and exploits resulting from buffer
overflows, stack overwrites, heap spraying, etc. are manifestations of UB made
possible by a badly written program. (Or by the interaction between an invalid
program and an optimistic compiler, as in the case of assuming strict pointer
aliasing.)

------
haberman
C is the language that doesn't force any preconceived notions about how the
world should work onto you. Sure, C strings are NUL-terminated by convention,
but even that is something you can almost completely ignore if you want to
build up your own parallel stack of software that does it differently.

It is for this reason that C (and sometimes C++) are what people use when they
have a new idea about higher-level programming abstractions. With C, you can
be totally free of other people's big ideas (and their associated
costs/complications) and invent something new.

Anyone who wants you to give up C in favor of their
language/framework/VM/runtime is selling their own vision for how the world
should work. That's fine and sometimes buying in will save you a lot of
hassle. But what they're offering was almost certainly written in C or C++.
Essentially their pitch is equivalent to "I have written the last C program
you'll ever need."

In the early 90s that was Perl. In the late 90s, Java. In the early 2000s,
.NET, then Ruby, then fast JavaScript.

This history is an oversimplification of course, but the real question is:
would you really prefer a history in which we had, at some point, decided that
the current VM that was in vogue should become the new replacement for C? That
we'd tailor all our hardware to it, and no future VM would be "native" but
would have to run on top of a different VM?

C is the key to why we have general-purpose computers that can run a wide
variety of different languages efficiently, and why programs that need to be
particularly efficient can always drop down to lower-level programming to get
extra performance.

~~~
kscaldef
> C is the language that doesn't force any preconceived notions about how the
> world should work onto you

Some preconceived notions forced upon you by C, off the top of my head:

\- problems should be solved by describing a linear sequence of steps (as
opposed to logic/declarative programming)

\- a variable can have different values at different times in the execution of
a program (in contrast to standard mathematical conventions)

\- a function can return different results when called multiple times with the
same arguments (again, in contrast to the standard mathematical meaning of the
term)

\- there is random access storage of information (with constant time access
and update)

~~~
neumann_alfred
_problems should be solved by describing a linear sequence of steps (as
opposed to logic/declarative programming)_

Linear sequences of steps processing data arrayed in some linear fashion or
other _are what computers do._ It is certainly what a single core does.

So you would rather have someone else write a C program for you that abstracts
it away and calls it a new way to program? That's OK, just know that you'll
still have those pesky linear steps under the hood. You know, at _some_ point
the neat source you write actually has to get converted into stuff CPU can
make the slightest bit of sense out of.

Of course it's nice to try to "get the computer think like the programmer
instead of the other way around" (kind of the motivation for the inventor of
the compiler IIRC, in times when people really had to "speak binary" to the
computer), but it also kinda sucks when people completely loose track of the
machine they're programming, and think their abstraction layers du jour grow
on trees or something.

 _a variable can have different values at different times in the execution of
a program (in contrast to standard mathematical conventions)_

Your problem is nomenclature? Why not simply declare constant variables where
you need them and move on? And are you really trying to twist C giving you the
choice into "forcing preconceived notions" on programmers? That is hilarious,
I will give you that much.

 _a function can return different results when called multiple times with the
same arguments (again, in contrast to the standard mathematical meaning of the
term)_

"Function" can also mean a lot of people coming together for a wedding or
something. I think that's why you would call them "C function" if you wanted
to be precise.

 _there is random access storage of information (with constant time access and
update)_

Again, that's just how "the world" works. But C hardly came up with that,
which is why all languages have it.. yes, all of them. Some abstract it away
from you, sure, but they still have it. So you pay overhead and control "tax"
in return for convenience and expressiveness; that's not a bad thing per
se, often it's the sane and productive choice, but to me dissing C is just
shitting where you eat. The only reasons I can imagine for it are jealousy or
ignorance.

~~~
kscaldef
> Linear sequences of steps processing data arrayed in some linear fashion or
> others is what computers do

No, it's what some computers do. It's almost certainly not what the computer
you are using currently does. It's quite possible it's not what any computer
you've ever used does.

> So you would rather ...

I'm not expressing a preference. If you think I was, you've misunderstood my
comment. I'm making observations, not judgements. I'm not anti-C; it's a fine
language for many things. However, I do think that the article's author is not
being as even-handed as he claims when it comes to some of the shortcomings
of C.

> Your problem is nomenclature?

Again, I'm not making a judgement. C exists in a paradigm where writing

    
    
        x = 3
        x = 4
    

makes sense. In other paradigms it would be logically inconsistent.

>>there is random access storage of information (with constant time access and
update)

>Again, that's just how "the world" works.

That's absolutely not how "the world" works. Again, you've probably never even
used a computer where it was true.

> dissing C is just shitting where you eat. The only reasons I can imagine for
> it are jealousy or ignorance.

I don't think I've made any statement anywhere in this discussion that's
"dissing C".

~~~
neumann_alfred
How does this computer right here not consist of a whole lot of linear
sequenceS? (Notice the plural, btw? And the fact that "linear" says nothing
about "serial" vs "parallel", either?) How is data not arrayed linearly?
Please elaborate; "because I said so" is not enough.

------
lukego
I can relate to what he's saying. I've also been falling back to C pretty
often over the years and finding it pleasant when I arrive there. LuaJIT+C is
my latest experimental compromise.

My thoughts in more depth at [http://blog.lukego.com/blog/2012/09/25/lukes-
highly-opiniona...](http://blog.lukego.com/blog/2012/09/25/lukes-highly-
opinionated-programming-language-roundup-2012/)

------
kscaldef
I take significant issue with this paragraph:

    
    
        When you write something to be fast in C, you know why 
        it's fast, and it doesn't degrade significantly with
        different compilers or environments the way different 
        VMs will, the way GC settings can radically affect 
        performance and pauses, or the way interaction of one 
        piece of code in an application will totally change
        the garbage collection profile for the rest.
    

The implementation of malloc & free that your C implementation uses
significantly impacts performance (both time and space) just like the choice
of GC implementation in other languages. It is not uncommon for memory
fragmentation to be a serious problem for large or long-running C programs,
and at the point that you have to start worrying about that and working around
it, you've broken the abstraction in just the same way that the author
complains about for Java, Erlang, Haskell, etc.

Also, as others have pointed out, performance can also depend significantly on
how well C's machine model maps to the actual machine architecture you're
running on.

------
voidlogic
It's often worked out well to write the 10% most CPU-bound parts of large Java
or PHP applications in C. C definitely has its place.

I wonder if this gentleman has tried Go? If he is a big C fan, Go might be a
great choice for the other 90%. Not only is Go a safer version of C with lots
of great modern features, it integrates very well with C via cGo.

If memory usage and speed are your preeminent concerns, C is certainly a force
to be reckoned with: [http://benchmarksgame.alioth.debian.org/u64q/which-
programs-...](http://benchmarksgame.alioth.debian.org/u64q/which-programs-are-
best.php?calc=chart&gcc=on&gpp=on&ghc=on&go=on&sbcl=on&java=on&csharp=on&scala=on&fsharp=on&hipe=on&php=on&erlang=on&python3=on&yarv=on&perl=on&xfullcpu=1&xmem=1&xloc=0&nbody=1&fannkuchredux=1&meteor=1&fasta=1&fastaredux=1&spectralnorm=1&revcomp=1&mandelbrot=1&knucleotide=1&regexdna=1&pidigits=1&chameneosredux=1&threadring=1&binarytreesredux=1&binarytrees=1)

------
jws
_Closures? HA!_

If you are working in C and think you might like closures and a different
concurrency model than pthreads, then you should look into
clang+libBlocksRuntime+libdispatch. That gives you the blocks(closures) known
to OS X and iOS programmers and a pretty spiffy queue based concurrency model.
I'm never going back to pthreads. You can't make me.

Slightly sadly, I haven't seen a distribution shipping a modern version of
libdispatch. There is one in github, but it may or may not have a problem in
its read/write support. I use the old version that comes in my distribution.

------
afandian
The author's next love affair will be Go, and he won't be back. I see it as a
very real successor to C. I have written about 6k lines of Go (on a project
that I had previously written in C) and I'm deliriously happy with it (partly
because it's just more fun to write than C). Granted, you can't write a dylib
or kernel, but it sounds like for the author's case it would be a good fit.

~~~
unphased
I am not sure about this. Every analysis of performance when it comes to Go
shows it lagging behind even Java. It is simply not mature enough.

------
vannevar
When you're an expert in C, you tend to forget how much you've really learned.
And not just about the language itself, or the available libraries, but _about
your own coding style_. As an experienced programmer, you have habits and
intuition that make developing and debugging your code vastly easier than for
a novice. The reason that C has given so much ground to a language like Java
is not that Java is intrinsically more powerful, but that novice and
intermediate programmers can be much more productive with it from the start.
Companies don't want to hire a 10-year C veteran at $125/hr to code up their
CRUD app, even if one were readily available. They want to be able to hire
interchangeable $50/hr Java guys with a bit of experience, who won't have to
waste days at a time trying to track down intermittent memory problems or re-
writing the wheel because they aren't aware someone wrote a C lib for
something Java has built-in.

~~~
cunac
Actually they will take a Java $125/hr guy and still be ahead in terms of
time/cost.

------
njharman
> Faster Build-Run-Debug Cycles

Interactive languages blow anything with a Build-Run cycle away. "Faster"
doesn't matter.

Having a Build cycle is not one of C's advantages. It is a trade off (and a
good one, if you care about speed of execution vs speed of development).

------
mschaef
I've recently been doing a little bit of C hacking on a small app with a web
based interface. I've missed the higher level collections and I/O facilities,
but some parts of the experience have been really refreshing. There are mature
tools, the software builds in no time at all, and it's been easy to make it
fast to both run and start up.

Dropping down to C has taken an attitude adjustment, and a willingness to
drop certain kinds of features, but it's really been quite refreshing.

------
fusiongyro
> I've had intense and torrid love affairs with Java, C++, and Erlang.

Oh, but this time it's different, this time it's the real thing?

------
scott_w
"But amazingly it's proven much more predictable when we'll hit issues and how
to debug and fix them. In the long run, it's more productive."

That's nothing to do with C. It's because Damien worked on CouchDB first, and
learned a lot from that. If I were to go and rewrite our current system in
Java or Scala, I'd also be able to predict where performance, and other,
issues would crop up, simply because I've already encountered a number of them
before.

~~~
mavelikara
"It wasn't, it was a race condition bug in core Erlang. We only found the
problem via code inspection of Erlang. This is a fundamental problem in any
language that abstracts away too much of the computer."

Another way to look at this is that all languages which offer poor
abstractions force you to write more code, by definition. Up front you will
have to write much more code, but when the inevitable bug comes up, you will
have the luxury of debugging code you authored. For some individuals and
teams, this trade-off is a sensible one.

~~~
scott_w
Both Jeff Atwood[1] and Patrick Wyatt[2] have written posts pointing out that
the bug is more likely to be in your code than in the platform.

There are instances where writing it yourself is the better option, but more
often than not you will want to offload that burden onto a library that deals
with the details so you can focus on doing what's important to your
system/business.

[1] [http://www.codinghorror.com/blog/2008/03/the-first-rule-
of-p...](http://www.codinghorror.com/blog/2008/03/the-first-rule-of-
programming-its-always-your-fault.html) [2]
<http://www.codeofhonor.com/blog/whose-bug-is-this-anyway>

------
darwinGod
To folks who think C code is in general a mess of pointers, macros, and gotos:
go download the Postgres codebase, build cscope, open vim... and try browsing
the code.

Arguably, it's one of the most beautiful C codebases you can get hold of.

~~~
millerm
I just did. I agree with you.

------
linuxhansl
C is a fine language. So is Java, and so are Python, Perl, Javascript, as well
as Lisp and Scheme.

Saying C is always effective without further qualification is like saying that
high speed racing cars are always effective. You need vans, trucks,
bulldozers, etc, etc.

Then there is the question of what is really meant by "effectiveness". Is it
code execution speed, maintainability, collaboration, the compile-execute
cycle, extensibility, etc.?

As always, it depends on what you want to do. In many cases C is the most
logical choice. But in other cases Java or other languages get the job done
more "effectively".

Anybody who is limiting him/herself to a single language to "rule them all" is
certainly not effective.

Edit: Usual spelling mistakes.

------
krizzz
Thank you for writing that. It makes me feel like I'm not completely insane
because there are other people who think the same. If you know what you are
doing - C is your best option. The problem is that there are thousands of
people these days who do not know what they are doing and basically they do
not understand how computers and operating systems work. They need very high
levels of abstraction which is costly and usually introduces a whole bunch of
new problems. Add all those weird frameworks/libraries on top of that and you
end up paying 10x more for hardware and get 10x slower product. Though, it's
really hard to find a decent and productive C programmer today, so people
settle for what's easily available. Another problem is that those "cheaper
resources" are defining what programming is these days. I think there is a
huge difference between programming and scripting... Call me Dino. BTW, loved
the banana example :)

------
robot
"With C++ you still have to know everything you knew in C, plus a bunch of
other ridiculous shit."

So true. C++ is designed by someone who is moderately smart but not a great
engineer. There are too many details without much benefit, and you are
encouraged to hide things in obfuscated ways, rather than doing
straightforward programming.

------
stiff
This should have been an article about having made the wrong language choice
for the project at hand; as an attempt at an objective evaluation of C it
seems very confused in several places:

 _Faster Build-Run-Debug Cycles [...] C has the fastest development
interactivity of any mainstream statically typed language._

I could understand someone praising Go for this, because the language
designers specifically targeted fast builds for example by making the syntax
easy to parse efficiently, hence achieving good build times while keeping the
language relatively high level, but C simply does little for the programmer
and hence the builds are faster than in Java or Haskell. You could maybe
admire the decades of work put into C compilers, but not C the language itself
for it.

Also, if you criticize high-level languages for being far removed from the
computer, it would be fair to point out that the broader "development
interactivity" of C compared to something like Smalltalk is rather bad.
Consider also that this removal from the computer could be as much an argument
against our current computer architecture as it is an argument against high-
level programming languages. That we have invested decades into a technology
and can't easily switch to a different one doesn't mean we should stay
uncritical about its weaknesses.

 _Ubiquitous Debuggers and Useful Crash Dumps_

I suspect this is a sign of disappointment with Erlang, where debugging seems
to be a complete mess; the last time I was doing some development in it I had
to fire up some really weird-looking debugger and run the program under it
just to get something resembling a stack trace with the filename and line
number for the error I got in the program. But you do have lots of debuggers
and crash dumps in the JVM, for example, and I don't think you all that often
need to examine the level below the JVM when debugging JVM-based programs.
It's somewhat funny to praise C for being convenient in interactions with
other C code.

 _Contrast this to OO languages where codebases tend to evolve massive
interdependent interfaces of complex types, where the arguments and return
types are more complex types and the complexity is fractal, each type is a
class defined in terms of methods with arguments and return types or more
complex return types._

I don't know what this is even supposed to mean. From what I know most large
scale C programs end up desperately trying to emulate some of the features of
higher-level languages to create more strict module boundaries and prevent
implementation details from breaking the library interfaces.

------
sharjeel
I re-read the article as a satire on C and thoroughly enjoyed it

------
shmerl
I can't agree that C is _as high level as C++_. C++ templates and many
features in C++11 are way higher level abstractions than C.

~~~
alexchamberlain
Templates and namespaces are reason enough to use C++; add in STL and you have
a massive winner.

------
taylodl
If your C code is portable, then you've chosen the wrong language. C is best
when assembly would have been appropriate but you want to write something in a
more maintainable, higher-level language. C shines when you need to optimize
for a specific operating platform.

------
guilloche
My humble opinion: Scripting languages like python do have merits.

As a 20+ year C++ programmer, I have sadly found that there seem to be no
scenarios in which C++ plays very well. Compared to C, C++ projects are much
harder to maintain. Whenever C++ should be used, C is always a better option.

~~~
zmmmmm
Well, I think people forget history a bit sometimes. For 20 years or so the
most common exploit in any system was a buffer overflow, often in string
manipulation. I think we're finally getting past that, but C's lack of
abstraction over string pointers was a major, major problem and even today if
I was writing a secure system I would choose C++ if only to use the
std::string class.

~~~
dakimov
This is not exactly a problem of the C language, but a design flaw of the C
standard library. Also, there are folks out there who still use unprotected
buffers in C++ instead of safe types.

------
pimentel
"C is a weak, statically typed language"

Wouldn't that imply that a variable of one type could be coerced into another
type?

I would say C is strongly typed...

~~~
dspeyer

        int i = 42;
        float f = *(float*)&i;
    

I'm not saying you _should_ do this, but you can.

Or to use an example the author didn't regret:

    
    
        struct foo {
          int a;
          int b;
          int c;
        }__attribute__((packed));
    
        void print(struct foo *f) {
          int *p = (int *)f;
          int i;
          for (i = 0; i < sizeof(struct foo)/sizeof(int); i++) {
            printf("%d,", p[i]);
          }
        }

~~~
emeraldd
Here's one I used in a GC implementation a while back. That last 'uint8_t
obj[]' is used to hold the object that was actually allocated.

    
    
       struct meta_obj {
           meta_obj_type *next; // next object in our list
           mark_type mark;
           size_t size;
           gc_type_def type_def;
           uint8_t obj[]; // contained object
       };
    

Or another (contrived) example:

    
    
       struct obj_type {
           obj_type_enum type;
       };
    
       struct string_obj_type {
           obj_type_enum type;
           char *c;
       };
    

You start with a collection of obj_type pointers and cast them to the
appropriate pointer type when you have identified the actual contained struct.
Useful if you need to have a heterogeneous list of things.

~~~
chj
Is this standard? I mean the unknown size field 'obj'.

~~~
dagw
It was added in C99 (as a "flexible array member").

------
mjs
"bolted-on support for concurrency"--just what are your options when it comes
to concurrency and C?

~~~
RodgerTheGreat
If you agree with Hans Boehm, "Threads Cannot be Implemented as a Library":
<http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf>

~~~
pdw
The C11 standard added concurrency primitives to the language.

~~~
mjs
Interesting. Do you know what the tool support is like? Which compilers
implement it, whether debuggers support it, etc.?

~~~
andyzweb
Tool support is pretty much non-existent. Expect it to have the same amount of
tooling and support as pthreads.

------
alexakarpov
So, with your kind of experience, how often did you prove Greenspun's tenth
rule in practice? =)

------
michaelochurch
I'm a language design buff, so you might be able to guess my biases: FP is
good, OO is bad, every language should have higher-order functions, yadda
yadda.

I finally decided to get a deeper knowledge of C, something I've been saying I
"should do" for years. I'm learning it via Zed Shaw's _Learn C The Hard Way_
and I'm very impressed by the language. It does what it does very well. I
wouldn't use it for a complex web app, but it's a fine language for things
that are small but in which quality (which includes performance) is extremely
important.

I'm growing to like C a lot, and I think people who don't bother to learn it
are short-changing themselves in a major way. That said, there are a lot of
great languages out there that are better-suited to high-level projects... so
long as the concerns remain high-level.

~~~
hackinthebochs
Interesting, I would have considered deep knowledge of C a prerequisite to
being a language design buff. Did you have experience implementing languages?

~~~
michaelochurch
No, most of my experience was with high-level languages like Haskell, Python,
and Clojure, with my original interest being more in design than
implementation. One of the reasons I'm learning C is to have a better grasp of
implementation, and eventually (if needed) be able to competently implement
languages.

For most of my time, I focused more on user experience (syntax, type system,
workflow) and its effects on development culture. Watching language choices
make or break companies made me very opinionated. That said, as I get older,
I'm finding it harder to call specific languages "good" or "bad". They all
have their niches. Enterprise Java is horrible, but that's not James Gosling's
fault.

------
martinced
Regarding the performance of C, it is interesting to note that although hardly
anyone does it, Java, for example, can still be very very low-level if you
want.

You can still allocate a gigantic primitive array (say of Java 32-bits
integers) and mess with it, including doing bit-trickery, etc. and that is
amazingly fast.

When people want the uttermost speed in Java, they dodge objects as much as
they can to prevent too much GC'ing.

For example, that is how the people behind the amazing LMAX Disruptor
"pattern" (hardly a "pattern" in the OO sense) manage to process about 12
million events per second... On a single thread. In Java.

"Thankfully" Java still allows you to create a gigantic primitive array and to
mess directly inside that array.

------
dschiptsov
Why not just stay with the definition that C is a portable assembler to write
OSes and Lisps?)

------
eriksank
C is the language someone would create if he just came out of a 5-year long
attempt to build an operating system such as multics and wants to do it right
this time ;-)

------
Roybatty
The C language is wonderful because you can look at Quake I, II, and III and
instantly see the intent. In C++ you can't.

~~~
VMG
Quake3 is written in C++

~~~
gillianseed
Although Quake 3 contains some C++ code (in splines/ iirc), it's probably 99%
pure C code, so I don't think it can be labeled as being written in C++.

<https://github.com/id-Software/Quake-III-Arena>

