

Why I Still Like C - hypermatt
http://mitchgolden.com/essays/psychotic/psychotic10.txt

======
pmjordan
This isn't crankiness. If you try to generalise the thoughts on asm/C then it
boils down to this:

\- _Abstractions will always be leaky_

and

\- _be prepared for when an abstraction leaks_.

As for the ideology part:

 _think for yourself and try to see beyond the hype_

And I have to say I absolutely agree.

For example, I've written maybe a handful of lines of assembly, and maybe as
many lines of inline assembly. However, understanding assembly language has
saved me many times. Compiler bugs exist. Sometimes you don't have access to
debug info and a debugger, just the instruction pointer and a register dump.
Sometimes the compiler just _can't_ optimise well enough.

C is just one level of abstraction higher, so that's probably the level that's
going to leak through when your high-level-language interpreter fails for some
bizarre reason. (recent Ruby segfaults anyone?)

It's not turtles all the way down, but rather a menagerie of different
abstractions, and it's useful to know your zoology, as it were.

------
tialys
I've just finished my freshman Java class and was severely disillusioned at
how far from 'programming' it felt. I picked up a copy of K&R and have been
enjoying myself since. I actually feel bad for my fellow majors who think that
Java/OOP is the only way that things can be done.

~~~
jrockway
_I actually feel bad for my fellow majors who think that Java/OOP is the only
way that things can be done._

I feel bad for people that think imperative-style programming is the only way
things can be done.

C is not the anti-Java. It's Java minus modern features and with more rope to
hang yourself with. (Then again, someone actually told me that spending an
entire day with Valgrind hunting down a memory leak was fun.)

If you want power, non-Java-ness, and modern features, learn a language like
Lisp, Haskell, OCaml, etc. And learn Perl for actually getting things done.

~~~
tx
... meanwhile the entire Internet is powered by software written in C,
including the very same browser you typed this message in.

I've been hearing how obsolete C is ever since high school (1994), yet
successful programs not written in a C-family language are an order of
magnitude less common: everything from web servers to browsers, OSes, office
suites, and imaging software - all C. Meanwhile, I've never had to install a
JVM on any of my computers; somehow I've never run across a single piece of
useful software written in Java (with the exception of evaluating Jython and
toying with Eclipse&Netbeans at work).

Perhaps it's running on my phone. If so, that's probably why the damn thing
feels so anemic compared to Objective-C powered iPhone.

I don't know how C programmers do it, but somehow they just get things done.

Moreover, C remains the only choice for writing truly portable code. There
isn't a reasonably popular CPU on this planet that doesn't have a C compiler
for it, while Java runs only on "java computers".

~~~
jrockway
C is entrenched. Everyone uses it, and because everyone uses it, everyone
continues to use it. That doesn't make it good, though, just popular. Read
pg's "Beating the Averages" for a more eloquent explanation.

You seem to think that Java is C's competition. It's not. I think we can all
agree that Java is one of the worst programming languages ever created. (The
VM is nice though.) Nobody is telling you to use Java or C++. They are telling
you to use a dynamic language (Perl/Python/Ruby) or a functional language
(Haskell/OCaml/Lisp).

 _I don't know how C programmers do it, but somehow they just get things
done._

Sure, but their programs usually leak memory (ever use Firefox?), and their
libraries handle errors by terminating the whole program (see PulseAudio; any
error condition immediately calls exit(1)).

 _There isn't a reasonably popular CPU on this planet that doesn't have a C
compiler for it, while Java runs only on "java computers"._

I don't optimize my life for solving problems I don't have. Every machine I
want my code to run on has Perl, Lisp, Haskell... whatever. If C was the
highest-level language I could use, then obviously I'd use it. But it's not,
so I don't.

Finally, gcj will compile Java down to machine code; ghc compiles Haskell to
native code. So I don't ever see a situation where I will absolutely have to
use C for some reason (other than interfacing with C libraries, of course).

~~~
jey
" _Sure, but their programs usually leak memory (ever use Firefox?), and their
libraries handle errors by terminating the whole program (see PulseAudio; any
error condition immediately calls exit(1))._ "

That's a poorly written program. You can just as easily call System.exit(1)
in Java, or, to do it in a more Javaesque way, throw new Error("im too lazy
to do this right").

~~~
jrockway
But C doesn't have any mechanism other than exiting or returning a value that
indicates "false". Both are non-optimal.

One (exit) makes the code simpler, but makes the consuming application
extremely flaky; the other bloats the code with a weird API where functions
modify their arguments:

    
    
       errorcode_t do_some_work(int arg1, int arg2, int *result)
    

and forces the client code to check every single error and decide right then
and there what to do.

It's possible to write code that way, but why would you want to? This is a
solved problem; why keep using a square wheel when round ones are available?

(Yes, there is longjmp. Still a lot of work to use that square wheel.)

In the end, don't think I care what programming language you use. It doesn't
matter. But I do think that C should be dying off now; it is much less useful
than it was 30 years ago.

~~~
huhtenberg
> _But C doesn't have any mechanism other than exiting or returning a value
> that indicates "false"._

Of course it does; in fact, you just said it yourself - long jumps. Back in
the late '90s I worked on firmware for Point of Sale terminals, PIN pads,
etc. It was written in C and its error handling was exception-based. Behind
the scenes it was just a handful of setjmp/longjmp wrappers, but it did
nevertheless implement the semantics of try/catch/etc.

Additionally, "exiting or returning false" are not the only two options, nor
are they even the most commonly used ones, at least in the projects that I
was exposed to. Kernel code (e.g. Linux, BSD) routinely uses an int return
value and still somehow manages to be both readable and functional without
being "bloated" or using a "weird API".

Yet another thing to consider is that optional exceptions (such as those in
C++) come with a non-negligible performance hit, so it is considered an
absolute no-no to use them in the "fast path" parts of the code.

> _But I do think that C should be dying off now_

But it just doesn't. Bummer :)

~~~
plinkplonk
"Of course it does; in fact, you just said it yourself - long jumps. Back in
the late '90s I worked on firmware for Point of Sale terminals, PIN pads,
etc. It was written in C and its error handling was exception-based. Behind
the scenes it was just a handful of setjmp/longjmp wrappers, but it did
nevertheless implement the semantics of try/catch/etc."

David Hanson's book "C Interfaces and Implementations" explains how this is
done and provides (very high quality) source code you can use/learn from.

------
jrockway
Most of the article is "why I think learning assembly is important". I don't
know assembly, but I do know how computers work (and I even know you have to
trigger an interrupt to make a system call). I think he should just get to the
point and say "people should learn how the computer works", rather than
suggesting that you learn a programming language that makes you aware of some
aspects of how a computer works.

 _It's like a very sharp knife: you can cut yourself as well as whatever it is
you're trying to slice. It's really easy to write broken or inflexible code in
it._

This is true of all powerful tools. I've seen horribly unmaintainable Lisp and
Haskell. I've also seen very clean and maintainable Perl. C is no different
than any other programming language, although it does actively discourage good
code. You can ignore the discouragement, but why not use a language that
encourages maintainability, or at least automatically deallocates memory when
you're done using it?

Anyway, I don't think this article shows why the author likes C, or why
someone else should learn C. Sure, learn C if you want to learn C. But you can
also learn about your computer by just learning about your computer.

~~~
IsaacSchlueter
Dial your time machine back a few decades, and you'll see assembly programmers
telling those C upstarts that they should learn assembly for all the same
reasons. A few more, and it's the logic circuit designers griping that you
can't trust instruction cards because you can't see the program. A few more,
and it's the vacuum tube enthusiasts complaining that digital switches are
slower, less reliable, and too small to debug properly.

If you want a visceral relationship with your hardware, get a soldering iron
and go to town. That's a respectable hobby, and confers mega geek-points. If
you want to write software and be productive at it, learn a real language.

C is a powerful tool, and every programmer should know it. But it's woefully
underpowered as a fulltime software development tool.

~~~
ajross
And you know what? All those curmudgeons were right (except for the bit about
vacuum tubes: transistor logic was never slower or less reliable than tubes,
ever). If you don't know how your tools are put together, there will be areas
of software development _at_ _all_ _levels_ which will forever remain voodoo
to you.

You can do an awful lot of useful work while relying on voodoo, but eventually
the voodoo will catch up with you and you'll end up with an ugly mess where a
better trained developer would produce an elegant hack. That's no less true
for a web developer using rails than it is for an embedded systems driver
developer.

Basically, I've never known a great hacker who doesn't understand CPU
architectures and at least a little bit of digital logic design.

As for C specifically, let's just agree to disagree. I'll put the sum total
of "great hacks" written in C up against any amount of C++ or Java you can
find. It's true that for some problems (web development being a good example)
there are better tools (scripting languages, databases) for the domain. But
that alone doesn't make C "woefully underpowered".

~~~
jimbokun
"It's true that for some problems (web development being a good example) there
are better tools (scripting languages, databases)..."

most likely implemented in C.

~~~
IsaacSchlueter
It's implemented in machine language somewhere. Programmers should know what a
register is. It's implemented in electrons, too. And, some of the best
programmers I've known were EE majors.

Don't get me wrong, it's wise to know your craft. That's never a bad thing.
But it's foolish to use a tool that is, frankly, not as powerful.

Saying a tool is "powerful" doesn't mean "you can do lots of stuff with it."
You CAN saw down a tree with a screwdriver. But a chainsaw is a much more
powerful tool for that job. Yes, a screwdriver can do lots of things that a
chainsaw can't, which only proves that it's a more versatile tool. The _power_
of a tool depends on the context of the problem you're trying to solve.

I'm guessing that screwdrivers are used in the assembly of chainsaws. They
still suck for cutting down trees. Same with C vs just about every other
language. C is a fairly small step up from assembly. An important step, no
doubt, but for most tasks that software developers face, especially on the
web, it's not appropriate.

~~~
ajross
I'm going to go out on a limb and guess that you've never actually worked
seriously in C. You probably had to maintain someone else's code at some
point, got confused by the linker or debugger semantics, and decided you hated
it. [disclosure: I peeked at your blog and resume, and it seems I'm broadly
correct.] That's fine: work in C is grounded deeper than web development. You
don't have the scaffolding around you to provide a fallback for changes.
Sometimes you need to fix bugs from first principles, and that takes a lot
more knowledge up front than web work does.

But that's not the same as saying that someone who _has_ that background is as
unproductive in C as you are. Seriously: spend some time writing something
serious in C, you might like it better. It will certainly seem more powerful
than it does to you right now.

~~~
IsaacSchlueter
You're "broadly correct" in the sense that the Republican Party is "broadly
libertarian."

I spent 5 years writing almost 100% C and C++ in college, with detours into
Lisp, Java, VB, and Ada. (Actually, I'd grown up on Basic, so VB wasn't a
detour, really.) After school, I worked at a VB shop, but built a few
side-projects in C++. I've had to fix things that were broken, and I don't
have a problem with the linker or debugger semantics. I've used templates,
and both has-a and is-a object extension. I am certainly not a world-class
expert in it, but it's not like I tried it once, got burned, and decided I
hated it.

I'm a web developer because I like web development better. For a variety of
reasons, not the least of them the challenge of building code that is so
portable it will run on 12 different browser/os combinations, and the
opportunity to work in many different languages, I find front-end development
much more rewarding.

My aversion to C is based on a simple rule of thumb:

    
    
      (usefulness of a feature) / (tokens required to implement it)
    

Lines of code is a pretty good fill-in for "tokens", but I had to change the
rule when someone pointed out that a 500 character regex in Perl is hardly an
elegant or maintainable program :)

Another way to express this is that the elegance of a solution is the number
of tokens that are not essential to the solution and do not aid in
understanding the intent of the code. For example, compare Javascript's
closures with Lisp's or Erlang's syntax. On the other side, compare any of
those with the mess of class and object boilerplate in C++ or Java or PHP.

Conceptual cruft is even more pervasive. If encapsulation _requires_
class/object constructs, then there is no way to get past that.

Some programmers will be able to create more elegant solutions than others, of
course. But without changing languages, you can't get past the cruft that is
built into language. (Or, in the case of C, the cruft that was not removed
from the language.)

Again, "power" does not mean "you can do anything with it." C can be used to
tell a computer to do anything that computer can do. But I'd argue that that's
a naive view of the power of a programming language. There are things that are
trivial to express in other languages, and require several lines to do in C.
Just something as simple as checking that two strings hold the same value
takes about twice the conceptual overhead in C that it does in almost any
other language.

Or is there a version of C with lexical closures, first-class functions and
strings, and a garbage collector that I'm not familiar with?

I'm not sure what you mean by _You don't have the scaffolding around you to
provide a fallback for changes._

The claim that C programming takes more knowledge up front than web work does
is frankly presumptuous and a bit misguided. I've seen the Javascript that C
experts write. It's terrible. In fact, I'd argue that since quality web work
requires an understanding of semantic HTML, CSS, Javascript, maybe some Flash
and/or Canvas chops, and probably at least one kind of server-side language
like PHP or Java, it requires much more knowledge up front than almost
anything else out there.

------
Enlightenment13
The #1 thing that I hate about C++ is all the f_cking side effects that
people can bury in their constructors and everywhere else under the f_cking
sun. I want to start swinging a baseball bat at all the f_cking idiots that
don't think in advance of using the latest f_cking cool C++ whatsamacallit.

When you are maintaining a very, very large project, the best thing about C
is that when you read the code it is fairly obvious what is going on, and you
don't have to worry about some innocent-looking code causing a bunch of other
crap to be executed behind your back.

On the downside of C, namespace collisions are a problem, as is the lack of
any way to hide data and functions.

I vote they should add C++-style classes to fix the namespace problem, and
private and public keywords to hide data and functions. I don't want all of
the object-oriented B.S. like constructors and destructors that makes it
difficult to read and debug other people's code.

Don't get me wrong.....I think some features of C++ are very useful,
especially operator overloading, which has allowed us to make it easier to
code big programs that handle lots of longer integers and weird integer sizes
like 56 bits, or 112 bits, or 128 bits. Yeah, these days we have uint64_t in
C99 compilers, but back in the mid-'90s we had to write code in C++ just
because it had operator-overloaded classes, which we also had to write
ourselves.

Though I didn't state it earlier...I do real-time embedded software...so I'm
not going to be writing an interrupt handler in Perl or Python anytime soon.
C is not going away in the real-time / embedded software / driver world
anytime soon. It is still very, very common!

------
utnick
article reminds me of my granddad complaining that us youngsters don't know
how to kill and clean chickens because we can just go to the grocery store and
buy them already ready to go.

~~~
rw
Programming is not like murdering chickens.

~~~
ken
Programming in C is maybe a _little_ bit like murdering chickens.

------
catechin
Obviously you should be familiar with C. This entire conversation, however, is
misguided by comparing it to Java or to fancy high-level languages. It's
apples and oranges.

Of all of those, only C/C++ offers extremely high performance as its key
feature. Being primarily a shorthand for assembly, C occupies a permanent
place as the systems and performance language of choice and will remain there.
If your application demands performance, use it. If not, then don't.

~~~
seano
Sure, but first write the application in a higher level language and then, if
required, you can re-write the bottleneck in C. Otherwise what you are doing
amounts to premature optimisation.

~~~
rw
_Otherwise what you are doing amounts to premature optimisation._

That's a good insight - what language you use is also a form of optimization.

------
Enlightenment13
>> C is a powerful tool, and every programmer should know it. But it's
woefully underpowered as a fulltime software development tool.

Only an inexperienced fool says such a thing! The bottom line is that it all
depends on what you are trying to do. For embedded / real-time / drivers /
O/S, both C and C++ are perfect, but for middleware and scripting and many
other high-level things they aren't the best language.

------
hsmyers
Just as a useful benchmark for all of you higher level language fans--- try
programming without your C libraries...

~~~
Oompa
Completely agreed. I've been doing Euler problems in Ruby/Java, and my friend
does them in C. The difference there is very noticeable, and quite interesting
at the same time. Obviously our programs are only compared when we use the
same algorithm.

~~~
Tichy
What difference is noticeable? Speed? Code ugliness? Coding speed?

~~~
Oompa
I can typically speed through them faster, but the speed of the code execution
is so much faster in C. Especially in basic number crunching.

------
Enlightenment13
The #1 group that says C should be dying off is f-ing book authors and
sellers! Why? Because they can't write a bunch of new books on an older
language. It is old...so we can't write books and make money on those older
things. New stuff is cool...yeah, new stuff is cool...buy our books...buy our
books.

------
jimbokun
What's the answer to the p, q, r question?

~~~
parenthesis
What p points to shouldn't be modified (but p can be changed to point to
something else).

q mustn't be changed to point to something else (but what it points to can be
changed -- except that in this particular example, q will point to a string
constant, which shouldn't be modified).

Both of the above.

~~~
jimbokun
Thanks!

------
eventhough
What is the answer to the second question?

int i[10]; <- allocates memory for 10 integers

&i <- pointer to the first element in the array (i.e. i[0]).

correct or not?

~~~
gizmo
Geez. No.

int i[10]; does allocate memory for 10 integers (on the stack)

So i has the type int *. A pointer to a value: an array. A pointer to a value
is the same thing as a pointer to the first value in the array. That's why
you can use ++i, increment the pointer, and it will point to the second value
in the array (++i => *i == i[1])

So &i is, wait for it, an int **, that is, a pointer to an array. So if you
have an array of arrays, then you need to dereference twice. E.g. argv. It's
a char **, so an array of strings, which is an array of arrays of characters.

~~~
kylec
If you declare

int i[10];

you can't do

++i

because modification of the value of i is not allowed.

~~~
gizmo
Yeah, but that's not the point. It's about which address points to which
place. An array always points to its 0th member in C by default - that was
the point. Declare int *j = i; and you can do as much pointer shuffling as
you want.

