
Taking C Seriously - tomh
http://www.subfurther.com/blog/2011/11/01/taking-c-seriously/
======
DiabloD3
I'm confused. Did C stop being taken seriously at some point? C is the only
language that you can truly get things done without having to fight the
language every step of the way.

I can't say the same of... well, any other language I've used in the past 5
years.

~~~
Blunt
Agreed. And if you do any sort of embedded work, you must know C and C++. If
history is any predictor of the future, chips will continue to get smaller, and
the "embedded" development world, I think, will proliferate and demand that
coders know C and C++ for at least the next several years, until fancy
tools/compilers argue that things like Java and C# code bloat don't matter
because a 4Gig embedded system will be the norm... ahem..

~~~
gte910h
There has been a huge swing against C++ in the embedded world since 2007 or
so. ABI issues, etc.

Not saying it disappeared, but there is now a bifurcation between C and C++
programmers.

------
zeteo
>C was developed between 1969 and 1973, making it nearly 40 years old [...] C
ha[s] already made an exceptionally good start on a century-long reign

In all fairness, development continued past 1973. Most C programmers today
would have some trouble even reading the code from the 1978 first edition of
K&R; and the void* pointer, which the author refers to later in the article,
was part of ANSI C (~1990).

~~~
DiabloD3
Development on C hasn't stopped. C1x is coming soon (probably 2012 or 13), and
has a lot of interesting enhancements including bounds checking.

~~~
cygx
According to Herb Sutter's obit for Dennis Ritchie (
<http://herbsutter.com/2011/10/12/dennis-ritchie/> ), C1x already passed its
final ballot, so all that's missing is an official announcement.

It would be nice to know if there are any changes from the April draft,
though...

------
cageface
Pure C seems to be enjoying a bit of language fad hipness lately but I don't
know anybody that has to maintain a large body of non-trivial, low-level code
that chooses pure C over C++. There's a good reason that everything from
Solaris to V8 to Photoshop to Quake is written in C++ and not C.

That C is still in wide use after 40 years is a testament to the elegance of
its original design but let's not get carried away.

~~~
pvarangot
The Linux kernel, Subversion and PHP are pure C. And don't get me started on
the BSDs or binutils... ld and libbfd are pure C and its C++ replacement
(gold) is still not widely used.

And those came to mind without even thinking about it... surely with two more
minutes, or a trip to ohloh, I could fire a couple more large bodies of
non-trivial low-level C code at you.

~~~
comex
> The Linux kernel, Subversion and PHP are pure C. And don't get me started on
> the BSDs or binutils... ld and libbfd are pure C and its C++ replacement
> (gold) is still not widely used.

Please pick gcc to mention or, indeed, any other C program ever written,
before libbfd ;)

~~~
danieldk
But then, the new star (clang/LLVM) is written in C++.

------
jeffreymcmanus
Saying that "TIOBE rankings are at least the best system we currently have"
for gauging uptake of a programming language is like saying that astrology is
the best system we have for predicting the future.

~~~
thisrod
It would be interesting - but a lot of work - to add "time to modify" and
"time to debug" metrics.

Each task would come with a modification, and an error to make in implementing
the original task. You'd recruit undergraduates, who hadn't used the language
before. Some would be given the original program to modify, others the broken
version to fix. You'd measure how long they took to do it.

This way, language communities that gamed the machine benchmarks would pay a
price on the human ones.

~~~
derleth
Yes, because untrained undergraduates are the programming workforce in every
industry, especially aerospace and medical device makers.

~~~
Roboprog
Hello, "Enterprise"? Beam me up, there's no intelligent life here.

------
jonathansizz
Those TIOBE numbers (linked to in the article) are quite interesting.

As well as C refusing to go away, there's a noticeable surge for C#,
Objective-C and Lua, and a substantial erosion in the popularity of trendy
languages like Python and Ruby.

~~~
malkia
Lua is small and embeddable. You can run multiple independent instances of it
in the same process. The Lua table is a very useful data structure.

Then you have LuaJIT - also a small, easy-to-embed solution that approaches
the speed of standard "C" code - and its main weakness, from what I understood
(Mike Pall had a post about it), is its garbage collector.

On top of that, FFI bindings make "C" calls extremely fast when the JIT is
active, and not very bad (though slower) when the interpreter runs.

And the Lua/LuaJIT license allows you to embed it in your application without
the need to share source code.

Mike Pall is working on PPC and ARM versions (and I think there might already
be a PPC JIT).

Last but not least - an awesome community (comp.lang.lua).

~~~
strait
Using LuaJIT with minimal FFI C code for optimization seems to be the best way
forward for maximizing both performance and maintainability.

What would be really interesting is to see someone highlight specific cases
where this approach ultimately fails to measure up in performance with using
pure C.

I would think that the LuaJIT approach would be tens of times more
maintainable for a sufficiently large application, so it's really imperative
here that we ask 'Why not?'

~~~
malkia
You can always find cases where it would fail. And that's okay. For what it
does, and what it allows, it's simply unbelievable (at least to me) how well
it succeeds (I understand very little of compilers, and little of
interpreters).

One area that doesn't translate easily is OpenMP (www.openmp.org), inline
assembly, and SSE packed floats. But that's okay, and even then there is
probably a better alternative - a language more suited to such tasks than "C" -
OpenCL (www.khronos.org) or DirectCompute.

But for general coding, it's very very good.

------
garyhalverson
Whoa, that page is hard on the eyes... I didn't get far into the article, but
I think C has lived on both in its own right and as a foundation that has
spawned other great languages. I don't see it (or variations of it) going away
anytime soon.

------
derleth
The problem is, nothing in this article even begins to address the actual
criticism people have of C, instead saying that it's possible to write good C
code and that Java is a memory hog. The first is true, the second is
debatable, both as to whether it's the case and whether it matters.

For example, read this: <http://www.jwz.org/doc/gc.html>

> In a large application, a good garbage collector is more efficient than
> malloc/free.

My point isn't necessarily to disagree with the article, but to point out that
the article has practically nothing to disagree with. It has no substance.

~~~
qdog
JWZ also states "Java and Emacs are really bad examples of GC-oriented
environments." in that same article.

And later, in 2000, he states "Today, I program in C."
<http://www.jwz.org/doc/java.html>

JWZ is an entertaining read, but unless he's posted an update somewhere in the
last 11 years, it isn't clear to me where he comes down on C at the moment.

Btw, I program in C.

~~~
mzl
In the 2009 "Coders at Work" interview, JWZ says that he mostly writes Perl
scripts for small one-off tasks, and every now and then a new screensaver in
C.

