
How can C Programs be so Reliable? - ltratt
http://tratt.net/laurie/tech_articles/articles/how_can_c_programs_be_so_reliable
======
jd
People often write in higher level languages because they want lots of bad
code fast. Almost all business applications are CRUD apps
(create/retrieve/update/destroy) with some business logic, and they're
generally written in C#. The app may crash when you click the wrong button,
but the app is cheap to develop and the programmers are easily replaceable.

Of course I'm generalizing, and a lot of C# programmers write great and
reliable code. The point is that they often don't -need- to write great code,
because lousy code is good enough for all business purposes (except when the
software is your product, which is rarely the case).

The second issue is that people who choose C for their projects tend to (a)
understand low-level concepts, (b) care about speed / memory usage /
reliability / dependencies, (c) don't consider development time that
important. That you create more reliable software that way is obvious - the
same programmers would create reliable software in C# (or a similar
language). But there aren't many situations in which development speed
doesn't matter,
speed and memory usage don't matter, and dependencies don't matter but, for
some reason, reliability is very important.

~~~
scott_s
I come from the systems research world. I've seen people choose C not for the
reasons you mention, but just because it's what they know best. This is not
always the best thing to do.

One instance is a colleague who needed to do data post-processing, and just
did it in C because it was most familiar. A language like Perl or Python would
have been a better choice, and saved him time in the long run, since string
manipulation in C is tedious and error prone.

~~~
bsaunder
Your colleague may be interested in pcre (perl compatible regular expressions)
see <http://www.pcre.org>

But I agree, C isn't the best choice for that task.

------
scott_s
The author has a unique perspective, since he - somehow - skipped learning C
until now. He programmed in assembly before, and from his other work he is
clearly familiar with dynamic and more modern languages.

Consequently, his perspective on C is that of someone who is new to the
language, yet also understands both the fundamentals of what his code will
compile down to, and the higher level facilities that later languages and
programming models abstracted away.

~~~
qwph
I'm still not convinced that there's anything particularly magical about C
here. Reliable programs are written by people who:

* understand the problem domain

* know the implementation language _and its supporting library_

* pay attention to detail

Admittedly, some languages fit some problem domains better than others, but
90% of the time, picking the language you're personally most familiar with
will be as good a choice as any.

~~~
scott_s
The author comes to a similar conclusion.

------
wheels
This misses two points I consider important:

\- A whole lot of C code is old. Old, actively maintained code tends to be
more reliable than new code.

\- C tends to be employed in relatively predictable sub-systems. Something
like a device driver has a relatively predictable set of states relative to a
GUI application. There aren't that many paths. C code for GUIs, in my
experience, tends to be at least as buggy as code in other languages.

~~~
VinzO
You seem to forget that in embedded systems, C is still the most used
language. So most of the new systems developed these days have new C code. And
embedded systems are everywhere.

Also, a big part of embedded systems have to be reliable for years in hostile
environments without external intervention, so I wouldn't say these are
easily predictable subsystems.

~~~
demallien
Hmmm, I've worked in the embedded world for the last 10 years, and I can tell
you that the bulk of embedded code is far from hardened. It appalls me the
number of times that mobile phones, set-top boxes, etc. crash. Typically,
reliability is low because it is hard to run decent unit tests in an embedded
environment. You're obliged to write a type of simulator that you can run on a
PC, but the simulator never has the exact same behaviour as the target system,
so you can never truly verify correct code behaviour.

~~~
janm
Yes, this is true: A lot of embedded code is poorly tested. Even more is
poorly designed and written by poor programmers (like most software).

Some people have to deal with the machine, and C is a good model of the
machine. Even reliable systems have some core in C (or an equivalent) that
provides an abstraction where higher level abstractions can be expressed.

Testing isn't the ultimate solution; some people are just able to use the
machine better and produce better code. Testing crappy code doesn't really
help. A subset of developers produce tools that, by providing appropriate
abstractions, let others build systems with less risk.

------
jodrellblank
Considering that this is the first (alpha) release of the first significant C
program he's written, that it's for a new (and therefore little-used)
programming language, that the page of Converge tools makes no mention of
testing tools, and that the page itself has error messages on it, the claim
that it's "not riddled with bugs" seems a touch, erm, cheerful.

------
jwilliams
The thing about C is that you are generally very aware of the side effects.
Aside from the libraries you use, the (data) structures are yours. When you
set something to NULL, you know damn well what that means to you.

As the author alludes to as well - in C you're made more aware of the error
conditions you can handle and the ones you can't. So you can code to a level
of robustness... Exceptions in Java are all well and good, but I haven't seen
many implementations that _do_ anything with IOException except for cascade
it.

This works really well for programs/modules that can fit in the head of a
single programmer. When you go beyond that it gets pretty messy - which is
where some of the advantages of metaphors like OO start to help... Course,
there is the argument that modules should never get that big, but that's
another debate.

~~~
Hexstream
"As the author alludes to as well - in C you're made more aware of the error
conditions you can handle and the ones you can't."

You mean like when a function silently returns -1 to indicate failure and then
you wonder why your program returned a wrong result (if you're lucky enough to
even notice)? In the bigger part of most of my programs I want a big, flashy,
loud, total failure by default if anything goes wrong.

~~~
parenthesis

      if ((result = foo()) == -1) {
         fprintf(stderr, "!$*!$&^!$*!!!!\n");
         exit(1);
      }

~~~
qwph
If _exit()_ is too abrupt, you can always use _setjmp()_ and _longjmp()_ to
set up a non-local jump to an error handler.

~~~
jmtulloss
setjmp/longjmp is essentially what exception handlers do for you. I love C for
its simplicity, but I'm not certain that leaving exception handling out of the
language was a good idea. There are a lot of people who would agree with me
there, including (I believe) a few Bell labs veterans who wrote the language
in the first place.

~~~
LogicHoleFlaw
The nice thing about exceptions beyond setjmp/longjmp (in C++ at least) is the
destructor semantics which you can use to guarantee that resources are cleaned
up on error. There are better ways to handle such things, such as scoped
resource allocation, but exceptions do an ok job.

~~~
qwph
You can actually use setjmp()/longjmp() to implement that kind of thing, it
turns out. Here's but one example:

<http://www.on-time.com/ddj0011.htm>

I'm almost tempted to make an analogy with scheme's (call-with-current-
continuation) here, but I think that might be pushing it.

------
mleonhard
> What I realised is that neither exception-based approach is appropriate when
> one wishes to make software as robust as possible.

The author misses the main point of exceptions: they let us separate data
processing code from error handling code. This is why we are more productive
in languages that have exceptions and our code is easier to maintain.

C requires us to handle errors throughout our program, tightly coupling the
data processing code with the error handling code. With apologies to Mr.
Spencer, I would declare that:

"Those who don't code in languages that lack exceptions are doomed to
reimplement them, poorly."
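A small sketch of the coupling in question (the helper functions are
hypothetical stand-ins): every step of the data path has to check and
propagate the previous step's status, so the error path is woven through the
processing logic.

```c
#include <stdio.h>

/* Hypothetical helpers: each returns 0 on success, -1 on failure. */
static int read_record(int id, int *out) {
    if (id < 0)
        return -1;
    *out = id * 2;               /* stand-in for real I/O */
    return 0;
}

static int transform(int in, int *out) {
    if (in > 100)
        return -1;
    *out = in + 1;
    return 0;
}

/* Data-processing code and error-handling code are interleaved:
 * every call site checks, and propagates, the status by hand. */
static int process(int id, int *result) {
    int raw;
    if (read_record(id, &raw) == -1)
        return -1;
    if (transform(raw, result) == -1)
        return -1;
    return 0;
}
```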

~~~
demallien
I have to disagree. From my perspective it is a mistake to think that error
handling can be separated from the main code path. The classic example I use
to demonstrate this is all of those tedious discussions programmers have over
whether something is an exception or just normal behaviour of a system. For
me, that's an immediate red flag that a non-real distinction is being made.

I honestly don't see any advantage to exceptions over C-style return codes,
with one important ...euh... exception: the boiler plate for exception
handlers can be well handled by modern IDEs. All the other supposed advantages
seem to be just waffling to me. Take the whole 'Oh, but exceptions make
handling errors the default!' kind of argument (several examples on this
thread already). Yes, sure, you _do_ have to write exception handlers for all
errors in a language such as Java. But my experience is that if I'm writing
use-once-and-throw-away code in C, I'll just not use the return code. In Java
I'll
just stick a whopping great big try/catch around the whole app, and be done
with it. If I'm trying to write stable code that's going to be around for a
while in C, I check the error codes returned by a function every single time,
which gives me around about as much work as when I am using Java, and actually
handling different exceptions correctly.

All of which means, for me at least, that exceptions don't add anything to a
language, but they do make the language just a little bit harder to learn
(remembering exactly how any given language has implemented exceptions, and
which resources can still be safely used when, is a pain, as each language
tends to have subtle differences that can bite you).

------
msluyter
After having read a lot of these types of language debates, the only
conclusions I can safely arrive at are 1) that every language has its
advocates and 2) some people are highly productive in their preferred
language, much more so than the _average_ programmer would be in their
preferred language. But the real question, imho, is how do average programmers
compare? How will the same app written in C++ and Java compare, when written
by non-superstars? The question merits some empirical research.

------
alecco
The problem with C is that it leaves too many things completely wild. Strings
and memory management are laissez-faire, and everybody does whatever they
think is right.

A typical performance issue is calling malloc/free all the time; it can be
avoided, but there is no standard way to do so.
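One common (if entirely non-standard) workaround is a bump-pointer arena: a
single malloc up front, cheap allocations carved out of it, and a single free
for everything at the end. A minimal sketch:

```c
#include <stddef.h>
#include <stdlib.h>

/* Minimal bump-pointer arena: one malloc up front, one free at the end. */
typedef struct {
    char  *base;
    size_t used;
    size_t cap;
} arena_t;

static int arena_init(arena_t *a, size_t cap) {
    a->base = malloc(cap);
    a->used = 0;
    a->cap  = cap;
    return a->base ? 0 : -1;
}

static void *arena_alloc(arena_t *a, size_t n) {
    n = (n + 7) & ~(size_t)7;    /* round up to 8-byte alignment */
    if (a->base == NULL || a->used + n > a->cap)
        return NULL;             /* out of pool space */
    void *p = a->base + a->used;
    a->used += n;
    return p;
}

static void arena_free_all(arena_t *a) {
    free(a->base);               /* one free releases every allocation */
    a->base = NULL;
    a->used = a->cap = 0;
}
```

Allocation is a pointer bump, and there is no per-object free to forget - the
trade-off being that nothing can be released individually.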

I hope some great features of C++ get backported to C some day. But don't bet
money on that :(

~~~
holygoat
The later C standards add additional features. However, I'd argue that there
is no such thing as a "great feature" in C++: almost everything in the
language is a mistake, either intrinsically or in combination with other
'features'.

It's a blind pig with fifteen legs, trying to put on its own lipstick while
riding a unicycle.

~~~
orib
Spoken like someone who truly doesn't understand C. You even managed to
confuse it with C++ (you might as well have said "Java" in the place of C++,
it's about as close to C as C++ is)

Well done.

 _edit: Ignore this post. I misread. I'm an idiot. Sorry about that_

~~~
kaens
I think you misread the comment you were responding to, and the comment that
it was responding to.

~~~
orib
Yes, yes I did.

------
alecco
Programs compiled from C don't change much when run, while dynamic
applications vary significantly from run to run due to the large runtime
environment (e.g. GC). This is bliss for debugging. I've never seen anything
like gdb for
any other language. You can backtrack, set and change things, watch for
expressions, and even run one line of _compiled_ code at a time to see what's
breaking.

Also C programs usually are made to run many times and stay alive, so the
attitude of the developer tends to be more careful.

------
maurycy
As a person who spent a few years doing C programming and then got trapped in
the Ruby realm, I must say that the main difference is focus. If you have to
focus on memory management and strings, you think much more about the code
you write.

Theoretically, high level programming should enable you to focus on the
abstractions much more, but somehow it doesn't work this way.

------
jmah
See also: No Silver Bullet

------
axod
On the downside, most bugs ever discovered are most likely in C code,
especially memory leaks, buffer overflows, etc.

------
sharkfish
_If, as in the case of extsmail, one wants to be robust against errors, one
has to handle all possible error paths oneself._

That's not really a bad thing, as the author points out.

One thing I've always felt a sense of dread about in C# and Java (last Java I
did was back in 2001) is that I never truly knew what errors were lurking with
their exception handling. It would be really nice if all error possibilities
were listed in the documentation so I could pick precisely what to handle.

~~~
jd
It's even worse: a top-level function can throw the union of the exceptions
thrown by the functions it calls. Therefore, changing a low-level function's
exception signature changes the exception signature of all higher-level
functions too. That always worried me, but it doesn't seem to be a big deal
in practice.

~~~
nostrademons
It's a huge deal with checked exceptions in Java. You have to change the
signature of every caller, and so on, when you change a low-level function.
Abstraction leakage galore; it's why many libraries just throw a single
generic exception type whenever anything goes wrong (which defeats the purpose
of exceptions) or subclass RuntimeException (which defeats the purpose of
checked exceptions).

Unchecked exceptions don't seem to be a problem, because oftentimes you don't
care what specifically went wrong, you just need to know that _something_ went
wrong and abort appropriately.

------
lst
Do they? Mature C code: yes. Modern C code: no - it's often afflicted by
very bad algorithms...

