
C Craft: C is the desert island language. - nkurz
http://www-cs-students.stanford.edu/~blynn/c/ch01.html
======
Groxx
> _In my Eiffel days, I was encouraged to write "integer", not "int",
> "character", not "char", and so on. I believe Java encourages this practice
> too. Supposedly clarity is maximized. But how can this be if we do the
> opposite when we speak? Do you say “taxi cab”, or “taximeter cabriolet”?
> Redundant utterances are tiresome and worse still, obscure the idea being
> expressed._

I see this argument a lot, and it strikes me as people complaining because
they don't use the tools their language provides.

Typedefs are the answer to excessive name length, and they're _nearly_
everywhere. Just create a couple typedef files, and import them as needed -
future programmers get the full names easily, while you can program in your
pseudo-K version of C for maximum keyboard efficiency. I have a handful of
such files, they're endlessly useful - why write
`Pop_froM_lIsT_WhIch_COntaiNs_speCiFik_type_X`, doing battle with naming-
scheme-X that only employee-Y uses (and their associated spelling errors) when
you can do so once, and write `pop` from then on, unambiguously?

The upside of typedefs for this comparison is that they're precisely what we
do with spoken language - nobody knew what a "taxi cab" was until someone told
them it was the shorter version of "taximeter cabriolet", or until the full
phrase was well enough known that it could be inferred accurately by the
average person.

~~~
chittis
Given that Java doesn't have typedefs, how would you reduce the verbosity of:

    ConcurrentHashMap<String, Integer> foo = new ConcurrentHashMap<String, Integer>();

~~~
Groxx
There you're largely screwed, but Java is well-known for being extremely
verbose, and of course there will be languages that don't have basic features
others have had for decades.

The closest you can get in Java is to wrap it in a private, internal class.
Which also lets you rename some verbose or frequently-combined operations,
say:

    
    
      foo.someoneCouldntComeUpWithAShorterName() -> wrapped.succinct()
      -- or --
      foo.store(val);
      foo.store(val);
      foo.store(val); // I hate void functions...
      -> wrapped.store(val,val,val);
    

But yes, it's a pain. Java's largely a pain.

~~~
jedsmith
I see your Java, and raise you Objective-C:

    
    
        NSString *thing = [NSString stringWithFormat:@"Result: %@", [foo someoneCouldntComeUpWithAShorterNameWithZipcode:8205 name:@"larry" parcels:5]];
        NSDictionary *request = [NSDictionary dictionaryWithObjectsAndKeys:@"iphoneapp/0.1", @"User-Agent", @"news.ycombinator.com", @"Host", nil];
    

This isn't to kick off pet language wars, by the way, just to point out that
my OCD tic to wrap to 80 characters has absolutely zero place in Xcode. Or
Java.

~~~
Stormbringer
_"just to point out that my OCD tic to wrap to 80 characters has absolutely
zero place in Xcode. Or Java."_

Dude. Are you still using CGA? Unless for your IDE you're running emacs on
your wristwatch or phone there's no place for that shit in the real world, and
you have no excuse.

Ditch that green cathode ray tube, here, have 50c and go buy yourself a real
monitor. :D

~~~
berntb
It is fascinating to see people having 24+" monitors -- and seeing ca 30 lines
of code on their screen, because of the real estate eaten by their IDEs.

To add maiming to injury, these guys write in modern Cobol (a.k.a. Java).

Not to mention that they can't print the code -- since it is 150+ chars
wide(!) -- and browse it at a cafe. (I love my iPad, but it'll be iPad 3
before it is half as nice for browsing code [edit: as paper].)

I have about 2 full pages of code on the same size of monitor. Of scripting
language. I doubt that I have abnormally bad memory or write too complex
code... I just like to be productive.

Edit: OK, downvote if you want. But please also add a counterargument _for_
wasting all that screen area?

~~~
gnosis
_"Not to mention that they can't print the code -- since it is 150+ chars
wide(!) -- and browse it at a cafe."_

You can't print code longer than 150 characters wide?

Since when?

~~~
berntb
Sigh, are you trolling or new? OK, I'll answer.

The point of printing and browsing code is that it is easier and nicer than
reading it on a screen (you'll have a portable with you at the cafe anyway to
search, annotate, etc.). If you get lots of broken lines, it isn't readable.

Point was, you lose a capability with long lines -- a nice way to go over
code. This is important for many of us.

Edit: Point to jedsmith; write it off as grumpy-old-man syndrome. Sigh, I just
can't fathom that this isn't understood by everyone anymore.

~~~
gnosis
Broken lines?

We no longer live in the days of typewriters or daisy-wheel printers with
fixed column widths.

We now have these wonderful things called laser printers which can print using
fonts.

Fonts are resizeable. Even to the point that 150+ columns could be fit on one
unbroken line.

And even lines that long can be quite readable if you print in landscape.

~~~
berntb
Let me first note that printing was a minor sub-point.

Second, in what you quote from: _easier and nicer than to read it on a
screen_.

So your argument seems to be that it is equally nice to read paper in a
smaller font. Huh? Also, my eyes can't handle 4 pages on a page any more --
and your eyes won't be able to do that after 40, either.

------
enneff
It's nice to see that the things he describes as C's greatest virtues are the
same things we carried over into the design of Go. (<http://golang.org/>)

~~~
naner
It appears he has something to say about that, too:

<http://www-cs-students.stanford.edu/~blynn/c2go/>

------
gcv
I love the link to OTCC (the Obfuscated Tiny C Compiler,
<http://bellard.org/otcc/>). It frankly blows my mind.

~~~
roel_v
I was quite disappointed though that most of the obfuscation comes from the
use of the preprocessor. I can make any program in any language look hard if I
can rewrite it with an m4 macro before running it.

------
onan_barbarian
There are some reasonable things here, but:

"Lumping together function pointers that operate on the same data structure is
essentially object-oriented programming" is unbelievably contentious
regardless of which of the various definitions of OOP you follow.

~~~
jedsmith
I would strongly disagree that it's contentious. (Think that
over...disagreement about contention. I raised my eyebrow too.)

Some of the more _exotic_ features of what people call "OOP", like
polymorphism, take a bit more care and feeding to pull off in straight C. They
are most certainly possible, that being said.

Unwind "OOP" to "working with objects that contain data and methods together,"
and structs with function pointers fit the bill. I've had people stand in my
face and practically shout that C++ is object-oriented and C isn't, and one
refused to believe or admit that C++ basically compiles down to C in the end.

~~~
tsuraan
If you try to equate C and C++ by saying C++ can compile to C (not sure if
that's what you're saying but it sort of looks like it), you might want to
change your argument a bit. Haskell can also compile to C, but that doesn't
make C strongly-typed, functional, or garbage collected.

------
angusgr
I'm deeply conflicted about this article because I agree with many of its
premises. For instance: that C is superior to full-blown C++, that object-
oriented programming is no panacea, that simplicity is good.

I have serious concerns, however, especially with the first page and the first
few chapters.

\- They support the biases of the myopic programmer who believes that because
he or she knows C, they know everything one must know about programming. I
know that isn't the entire point, but you can get that from the article, and I
have met such programmers. You don't want to work with them, even on projects
written in C.

\- The "C Craft" section largely describes hacks to work around shortcomings
in the syntax or semantics of C.

\- The languages used for "vs" comparison are: FORTRAN, C++, Java. Fish in a
barrel, anyone? Haskell, APL & J are presented as curios. Python is only
mentioned in passing, as an inferior choice to Haskell for rapid prototyping
of mathematically-oriented code.

\- Go is presented as the "better C", which is encouraging but I'd feel more
encouraged if the author showed they were properly familiar with some
additional modern programming languages and the cases in which one might use
them.

\- The assertion that "you can write object-oriented code in C" is accurate,
although I think a better point to stress is "you can write mostly-well-
modularised code in C, and that's what you want a lot of the time."

\- The author also ignores the reality that object-oriented C really only
scales up to a certain amount of object-orientedness, and then it becomes
very unwieldy if you are not very careful. Unwieldy at a scale where using a
small subset of C++ (i.e. "C with classes") would remove the overhead, improve
the code's signal-to-noise ratio, and still not bring in most of the "bad C++"
that the author is talking about.

\- The author seems to have chosen to define certain terms as they see fit.
For instance, simplicity is defined in terms of brevity & terseness but the
example used to prove the point is that Eiffel requires "character" and
"integer" whereas C only requires "char" and "int".

For an alternative point of view on what constitutes "brevity" and
"simplicity", see the common C idioms for filtering or mapping any variable
sized data structure. The only time it becomes less brief is if you rewrite it
in C++ w/ STL or Boost. ;)

\- It's also telling that in Chapter 2 the Fibonacci counterpoint to Java is
in Haskell, not C. That's because a full C program would look pretty similar
to the Java program quoted, albeit without the sore thumb of wrapping it all
in a class.

Anyhow, I should quit ranting but IMHO (a) you should know C, (b) you should
respect C but (c) you should know some other languages and use C only when you
actually need to.

(c) may not apply if you're a super-whizz genius C programmer, some of those
people seem like they can carry off ridiculous use cases without making
horrible messes. Most of us are not those people. ;)

~~~
blub
"that C is superior to full-blown C++[...]"

I disagree that C is superior to C++ for most projects.

First of all, C++ offers higher-level semantics, which leads to an increase in
productivity. According to [1], the ratio of C++ to C LOC is 1 to 1-2.5, but
I'd be interested to read other studies or articles. My experience and [1]
also contradict your statement that rewriting code with STL/Boost will lead
to increased code size. It's almost impossible for C++ to be more verbose than
C - consider that you can write C-like code directly in C++.

Second of all, for a developer that has good knowledge of both C and C++,
writing quality C is harder. C is very prone to programmer errors, much more
so than C++ where you can choose to use libraries/language features that
completely eliminate some of these errors. e.g: scope-based destruction, the
STL data structures, string classes, etc.

Third of all, when discussing C++ many try to make it look like a frightening
monster ("full-blown C++"?) where template meta-programming interacts with
automatic conversions and operator overloading and all the other features in
the worst possible way. This is not much different from claiming C is bad
because you can use macros and void* for everything. The languages should be
compared by real-world use patterns, not dreamed-up criteria.

[1] Estimating Software Costs (Jones 1998) and Software Cost Estimation with
Cocomo II (Boehm 2000)

"[...]that object-oriented programming is no panacea[...]"

While it is true that some have argued for OOP as a panacea, this is mostly a
strawman. OOP is one possible solution that works _most of the time_.

~~~
Stormbringer
LOC means nothing, and if you want to generate subtle programmer errors, then
templates are definitely the way to go.

~~~
blub
LOC means a lot when it comes to:

* reading and trying to understand source code

* maintaining source code

* typing and programming-related injuries

The dynamically-typed camp argues that even declaring a variable is a pain.
Conciseness is one of the favourite features of JS/Ruby/Python developers, and
prolixity is universally loathed in Java.

Please expand on your assertion that subtle programming errors are caused by
templates. I'm especially interested in errors arising from using templates
from the STL or to build generic code, i.e. not metaprogramming.

~~~
rdtsc
> reading and trying to understand source code ... maintaining source code

I think it helps, but in general those things are not related very well. For
example, guess what this line of code does (this is APL):

    
    
        X[⍋X+.≠' ';]
    

According to APL's Wikipedia entry it "sorts a word list stored in matrix X
according to word length". It is just 1 line of code so it wins there. But now
imagine reading code like that and maintaining it.

~~~
blub
An extreme example doesn't prove much and there are many languages that
successfully balance readability with LOC. e.g: Python, JS.

~~~
rdtsc
Right, I was just trying to highlight that a simple metric of LOC is not
enough. One can write cryptic code to minimize LOC, but that usually leads to
code that is harder to read and maintain.

------
cafard
Desert island language, as in "like trying to build a hut, and manage your
hunting and gathering with the Swiss Army knife that happened to be in your
pocket"?

I agree that for Torvalds' work it is the only sane choice. But I'm not
writing kernels.

~~~
forgotAgain
It's along the lines of "if you were stuck on a desert island and could only
have one whatever, which would it be?"

So if you had to choose one programming language to do all of your work in,
which language would it be?

~~~
cafard
Actually, I understood. I used to listen to Desert Island Disks 15 or 20 years
ago--a local NPR affiliate's version.

One programming language? As I recall, DI Disks let one pick 10 pieces of
music, and threw in the Bible, all of Shakespeare, and a luxury of your
choosing. But I suppose that if one were not distracted by King James, King
Lear, or a luxury, the sensible thing would be to follow in the footsteps of
Wall, Ousterhout, Van Rossum, and Matz: use C and create an extensible
scripting language.

(edit: spelling)

------
d_r
Another excellent resource (a Git tutorial) by the same author:
<http://www-cs-students.stanford.edu/~blynn/gitmagic/>

------
tptacek
You can easily beat "static const volatile signed long long int bar" with a
function pointer (and without the unrealistic redundant indirection of "int *
const ... * const foo"). Start with static const volatile signed long long int
(*foo)(const volatile signed long long int bar). :)

~~~
leon_
Calls through function pointers can't be resolved at compile time and thus
won't be optimized. Depending on the program you're working on, you might want
to consider this.

~~~
anon_d
Not strictly true: it is quite possible to statically resolve many uses of
function pointers.

------
discreteevent
"Other object-oriented programming preachers rebelled in practice, by spending
most of their time in “fun” languages like Perl or Python. After all, who
enjoys the boilerplate and verbosity of Java?"

\- What does the verbosity of Java vs Python have to do with the validity of
object-oriented programming?

"New languages like Erlang and Go have become popular despite not being
object-oriented. In fact, Joe Armstrong, inventor of Erlang, explains why OO
sucks."

\- Actually later Joe Armstrong came back and said the following:

"Actually it’s a kind of 180 degree turn because I wrote a blog article that
said "Why object-oriented programming is silly" or "Why it sucks". I wrote
that years ago and I sort of believed that for years. Then, my thesis
supervisor, Seif Haridi, stopped me one day and he said "You’re wrong! Erlang
is object oriented!" and I said "No, it’s not!" and he said "Yes, it is! It’s
more object-oriented than any other programming language." And I thought "Why
is he saying that?" He said "What’s object oriented?" Well, we have to have
encapsulation, so Erlang has got strong isolation or it tries to have strong
isolation, tries to isolate computations and to me that’s the most important
thing. If we’re not isolated, I can write a crack program and your program can
crash my program, so it doesn’t matter.

You have to protect people from each other. You need strong isolation and you
need polymorphism, you need polymorphic messages because otherwise you can’t
program. Everybody’s got to have a "print myself" method or something like
that. That makes programming easy. The rest, the classes and the methods, is
just how you organize your program; that's abstract data types and things. He
said the big thing about object-oriented programming is the messaging: it's
not about the objects, it's not about the classes. And he said "Unfortunately
people picked up on the minor things, which is the objects and classes, and
made a religion of it, and they forgot all about the messaging."

------
nadam
It says:

"In my Eiffel days, I was encouraged to write "integer", not "int",
"character", not "char", and so on. I believe Java encourages this practice
too."

1\. No. Java uses char, int, and so on.

2\. No, C is absolutely not concise, even compared to Java. At least not
compared to how I write Java code.

Look at this Java method:

    String strangeConcat(String str1, String str2) {
        return str1.substring(0, str1.length() - 1) + str2.substring(1, str2.length());
    }

Is it really more concise in C? How do you tell the caller to free the memory
of the returned string?

And if you compare C to Ruby, Python, Scala, Clojure etc... then it is no
question that C is not a concise language for most tasks.

~~~
ElliotH
In Java int and char are primitive types. They don't provide methods so:

    
    
      int i = 4;
      i.toString(); // Compile error - int is a primitive, not a class, so i isn't an object and has no methods.
    
      Integer j = 4;
      j.toString(); // Returns "4" as a String, because Integer is a class.
    

Java would 'encourage' the latter practice at times because it's useful to be
able to call methods on characters and integers and such, but there is a
difference.

~~~
msg
But you wouldn't do this in Java (need a random string for an integer).

Instead you might see:

    
    
      logger.fatal( String.format("Program crashed on index %d: %s", i, message) );
    

I think C has a similar construct, not quite as verbose, that would save 7
characters out of the line...

~~~
ElliotH
There are other methods on Integer and such - it was a contrived example
because I was too lazy to read the docs.

------
dasil003
Just the notion of a single desert island language is a bit humorous, but if I
had to choose, I'd pick something meatier like Haskell - and not out of any
lack of fondness for C.

------
swah
I'm assuming the island has power and modern computers.

------
granite_scones
The "Power" section made me think of Blub.
<http://www.paulgraham.com/avg.html>

------
ryanisinallofus
Good thing we don't work on deserted islands right?

------
Ruudjah
> Not only is C easy for humans to understand,

Is he being funny or serious?

~~~
billforsternz
Serious. C is a simple, elegant language. If your natural inclination is
towards low level tools without elaborate abstractions built in, then yes it
is easy to understand. Easier than the equivalent assembly language for
example.

~~~
chalst
Assembly language is generally easy, except that it typically doesn't
insulate you from issues of memory addressing, &c. The big advantage of C over
assembler is portability.

