
Why Learn to Program in C? - AlexeyBrin
http://philipbuuck.com/why-learn-to-program-in-c
======
bluedino
C is great. I wouldn't recommend starting programming in C, though. At least
not with the standard exercises you see: get user input, read a file, do this
and that with it.

Why? The C standard library is full of gotchas and tricks and goofy shit that
makes it unsuitable for new programmers. The comp.lang.c FAQ should be
standard reading material for anyone taking a C class. Look at all the posts
on StackOverflow about the stdio functions 'not working right' and asking how
to do menial things like collect keypresses.

C works well as a first language if you teach it like assembly (use of which
as a teaching language is another debate): flip some bits and do some loops.
But not for collecting user input/output. Half of your students are going to
get stuck trying to collect a list of names from the user, before they even
get a chance to get stuck trying to sort the list.

~~~
DyslexicAtheist
My first language was C and I loved it. Maybe it takes a certain type of
mindset and tenaciousness (madness?) that will put up with the gotchas. It was
exactly those gotchas that got me hooked, though. It was similar gotchas that
made me love Linux, the terminal, vi, etc ...

The argument that one "should study other languages before they're fit to
learn C" is terrible IMO and somewhat condescending. It's like saying one must
learn "nedit/pico" before starting with editors like "emacs/vi". Please stop
repeating that - especially if you have not done much C.

I am probably too old to understand the constant hate and criticism on why C
shouldn't be taught to beginners. Not everything "has to be easy". Starting
with C & Linux, I always had an advantage later in my career because it was
these skills that enabled me to drill deeper than those who didn't know C &
system programming.

C isn't just hard for beginners because of the language, but because of the
rest of the toolchain. If you can get over this, then learning C will also
teach you how to debug with gdb, strace, and valgrind, teach you about
low-level system calls, and teach you a great deal about how a program can be
loaded and structured, e.g. dynamic & static linking. Also, where the
resources are used and how to eliminate a program's bottlenecks. What is a
heap, stack, or static memory? How does memory management or
garbage-collection work on a modern OS? What is IPC, and what do the
individual system calls do & require as input? How do you compile a program
and optimize it for different target platforms? How do you write Makefiles or
use autotools, etc ...

Tell me again which other language forces you to learn these things? You could
easily build a career just on C & Linux because of this "additional learning
curve" and extra knowledge that a good C developer will have.

Sorry for all my bias (as I said my views might be very outdated :-))

~~~
gnuvince
Learning to program is about solving problems using a computer. Any
programming language allows this; however, some languages make it harder than
others, and C is in that category. A language like Python has very few quirks
that will stump a person coding for the first time in their life; in C they'll
wonder "what does int main(int argc, char **argv) mean?", "why is there a &
before my variable in a scanf but not in printf?", "what are 'nul-terminated'
strings?", etc.

A programmer should definitely eventually learn C, but there's no reason it
needs to be their first language and definitely no reason it should be their
only language.

~~~
vram22
>in C they'll wonder "what does int main(int argc, char **argv) mean?

They'll only wonder that if the book / course / instructor teaching them C,
does NOT explain it to them. Which would be more of a reflection on said book
/ course / instructor than on the C language or the students.

~~~
na85
For students who are just learning to wrestle with programming concepts,
Python is far superior because it lets the student focus more on what to
express and when, and less on remembering silly things like curly braces and
semicolons.

~~~
vram22
First, braces and semicolons are not silly. They are just a different form of
syntax than indentation. (And I say that as a Pythonista of many years, who
also learned and then used C a lot before learning Python.)

Second, your point may (or not) be correct, but it is at best tangentially
related to the point I was replying to, or to mine.

~~~
na85
I don't think it's tangentially related at all. The cognitive load of trying
to program for the first time is quite high.

If the student doesn't have to focus on remembering whether or not you need a
semicolon or a colon on switch statement cases, they can focus more on
actually grokking the process of translating thoughts and algorithms to code.

------
philipbuuck
I'm the author of the article.

I don't think C should be the first language for most new programmers. The
article doesn't recommend that. Rather, my argument is that C provides the
lowest level abstraction that (most) modern programmers could ever need. By
understanding C, you give yourself the ability to break into the black box of
any higher-level language or framework. You're not at their mercy if they
provide odd abstraction layers or bugs in their code.

It seems to me, at least anecdotally, that beginner programmers using C feel
like they can't do as much as other beginners using languages like Python or
Java. Often this is because graphics are so much faster to get to when working
with the libraries that those other languages offer. Without a standard
graphics library, C feels like the weaker choice. How wrong that is, but I've
heard this sentiment from several different beginners.

I think part of what draws me to the Handmade Dev community is that we are
finding so many people who know how to code, but want to feel more in control
of their programs. Many people following my Handmade Quake project have told
me that they're considering learning C for the first time ever. Maybe that's a
good position for C to take in 2016 - a language to learn once you've learned
your first language and some simple logic but want to understand the hardware
better, or figure out exactly what code is running when you turn on your
Python interpreter. I hope the community is able to plant that seed in the
minds of as many programmers as possible.

------
pklausler
Learning C is to computer programming what the study of human anatomy is to
medicine. You may not enjoy it, but you will be dangerously uninformed if you
don't do the work.

------
cballard
I'm surprised that people have managed to get through a CS program without
learning C (or, at least, C++)! It seems to me that programming should be
taught simultaneously in Haskell (or Lisp?) and C.

\- C teaches you how computers work.

\- Haskell teaches you how programming logic works.

~~~
Tomte
C doesn't really tell you how computers work; a healthy dose of Hennessy's
book and an accompanying lecture works much better than learning C.

But C has immense cultural value. Lots and lots of extant source code,
important source code: Linux. The BSDs. PostgreSQL, SQLite, whatever.

C is the lingua franca. "Everybody" speaks at least a little C, so it can be
used as pseudocode notation (without learning some specific, maybe even ad-
hoc, pseudo language).

~~~
allbody
> a healthy dose of Hennessy's book

Most any CS book seems to have been written in C, Java, C++ and lately,
Python. Are there books on computer architecture, algorithms, compilers,
whatever written in Scheme?

~~~
Tomte
I have no idea what you're talking about.

Are you interested in finding such a book? Do you try to refute any of my
points? If so, which one?

~~~
allbody
I am not refuting. I was actually asking if there are CS books written in
Lisp. Almost any CS book I've seen relies on C, C++ or something along those
lines.

> Are you interested in finding such a book?

Yes.

~~~
ratboy666
Start with "Structure and Interpretation of Computer Programs".

Google it -- it's freely available on-line, along with a video lecture series.

It's different, in that it doesn't teach Scheme (Lisp). It teaches computation
(aka CS).

Go from there.

------
ZanyProgrammer
Until not too long ago, De Anza College still taught C as an introductory
programming language (I think because some recently retired instructors wrote
a textbook on it). Even now, from what I've seen of the syllabi online, their
C++ course that replaced it is still mostly C with some sprinkling of classes
added on.

I thought it was a horrible idea, and I had a pretty miserable experience at
De Anza. If you're just starting out programming, you should simply get used
to writing code, and not fighting low level details of the language. C is the
kind of thing that is useful, perhaps only as an advanced undergraduate
course. Most beginning programmers need to start writing code, not get bogged
down in pointer hell. And obviously most beginning programmers aren't going to
be hacking away at *nix kernels or anything advanced like that.

~~~
gjulianm
My experience is the reverse. I think that beginning with C gives you the
necessary knowledge of the inner workings of programs and systems that is
required if you want to end up with a degree in CS/CE/similar. I've seen too
many programmers who started with Java or Python and then didn't know what
exactly memory is, what "allocation" is, or similar concepts that are required
if you want to understand what's going on with your programs.

~~~
ldjb
I do think C is a very useful programming language to learn. Certainly,
concepts like memory and pointers can be crucial knowledge. But I disagree
that C needs to be a programmer's first language. I've seen plenty of
programmers who started out with Python or Java, then later went on to learn C
and were none the worse. I myself started out with Visual Basic. Just because
you didn't start out with C doesn't mean you can't subsequently learn it and
low-level concepts.

------
huangc10
One of the best academic decisions I've made as a student was to take a course
on Operating Systems in C. I had taken a beginner's course in C previously, so
it was an intermediate class.

Nothing compares to spending hours and hours grinding through thousands of
lines of C and building your own passable OS from a skeleton kernel. It was
the hardest course I've ever taken in college and it was my favorite.

I can honestly say I'm a better programmer and engineer after that experience
and would recommend anyone to take a course in OS (hopefully in C), and you'll
know what I mean.

~~~
belltyler
I took a very similar sounding course at Purdue (CS 503) and would second all
of these sentiments.

------
yodsanklai
I think it doesn't really matter what people learn as a first programming
language. Eventually, any serious programmer will become familiar with the
most common languages and paradigms, C being one of them. It may not be the
best choice for beginners, but it's valid nonetheless. People can have fun
learning algorithms without worrying too much about pointers and complicated
type declarations. It's not like it's going to mess up their brain and prevent
them from learning higher-level languages later on.

------
StillBored
Ugh, while I 100% agree that learning C (or similar lower level language)
should be a prereq for any programming job, the use of the K&R C book is
emblematic of what is wrong with C. Too many people pick up that book and try
to emulate what they read.

I firmly believe that book will teach you every bad habit you should never use
in actual practice. 30 years ago, many of those "tricks" made sense given the
compilers and computers of the time, but today you're much better off just
writing straight, clean C (for example, using array indexes instead of
pointers). Not only because it's more maintainable, but because it often
generates more efficient code, due to hardware support for base+displacement
addressing and compilers' abilities to understand the intent and optimize the
resulting code.
Furthermore, code like what you find in that book is what has given C a bad
name due to buffer overflows and the dozens of other things people think are
the fault of "C" rather than the programming environment around it. Not the
least of which is a formatting scheme that hides bugs.

So, PLEASE, if you are learning C, stay as far away from the K&R book as you
can until you have gained enough proficiency to understand the pitfalls of
many of the techniques taught in that book.

~~~
vram22
> hardware support for base+displacement addressing

Not sure exactly what you mean by the "hardware support" part, unless it is
machine instruction support of some kind for C syntax. (But I thought that a
basic level of support for that was present in most processors - things like
base + offset addressing mode, indirect addressing via the value in a
register, etc.)

But:

a[i] is equivalent to *(a + i) in C. (The K&R book says so somewhere, in
either the original or the ANSI C edition.)

And *(a + i) is equivalent to *(i + a), by commutativity (a mathematical
property), which is equivalent to i[a]. [1]

So you can write any of the above 4 expressions equivalently. I verified it by
writing and compiling such code after reading this in the K&R book years ago;
still remember being surprised until I saw the logic made sense.

[1] Or at least, it was so, some years ago, not sure about in C99 and such.

~~~
StillBored
> Not sure exactly what you mean by the "hardware support" part,

Yes, I should have been clearer and said no-penalty, or minimal-penalty,
support for base+offset address calculation. A fair number of machines a
couple of decades ago had additional latency or throughput penalties for
addressing modes that required address calculation. This is still true in some
cases for more complex calculations. So the use of pointer arithmetic was in
many cases noticeably faster than using array syntax even though they are
functionally the same. This was also partially due to the compilers not being
smart enough (or not having enough registers) to dedicate one explicitly for
the resulting address (and using register indirect) if the programmer was
using a base[offset] calculation.

For example, the ARM I'm using at the moment has a 1-cycle latency penalty for
certain scaled displacements. This means that if you're looping over an array
of 16-bit ints, it will be faster to use a pointer to the integers if the
compiler fails to increment the index by 2 and instead uses a (base + offset * 2)
load/store. This means the compiler has to be fairly smart about code that
uses the index's absolute value as well as its "offset" value in an array.

(BTW the particular ARM in question does base+offset*4 or 8 at full speed).

~~~
vram22
Got it now, thanks. Did not know that.

------
autoreleasepool
If anyone is interested in learning C as a first language, the lectures from
Harvard's CS50 course make it more than feasible[0]. IMO the practical
advantage of learning C first is that all other curly-brace languages become
very easy to pick up.

[0] [https://cs50.harvard.edu/lectures](https://cs50.harvard.edu/lectures)

------
kozukumi
I am a strong believer in learning C as your first language. Why? Well, a
number of reasons, but the two main ones are: 1) it teaches you about what is
happening at a lower level so you have some idea about what is happening
behind the scenes of that Java code you wrote or whatever and 2) it makes you
appreciate just how lucky you have it in higher level languages.

However I don't think you should start writing all your programs in C! Use the
right tool for the job. If that job is something C makes sense for then go for
it but if you are writing a webapp I wouldn't recommend you use C (although I
would if you needed a fast, native library to call from your webapp).

~~~
wtetzner
> although I would if you needed a fast, native library to call from your
> webapp

Honestly, at this point you're probably better off using Rust for that.

------
stared
C is an abstraction. Assembly is an abstraction. Even physical transistors are
abstractions. You can go down the ladder as deep as you wish, there is no
bottom. (Sure, you can delude yourself that X is the fundamental level of
Y... as long as you define Y properly.)

In this grand scheme of things, C is not better than Python. I don't mean
that we should avoid going deeper. I mean that each level of abstraction has its
own benefits.

BTW: How many of you started learning natural numbers not from counting
fruits/blocks/candies, but from the ZF set theory axioms and the von Neumann
construction?

~~~
vram22
>How many of you started learning natural numbers not from counting
fruits/blocks/candies, but from the ZF set theory axioms

Ha ha, good one. I don't even know what the ZF set theory is. Not putting down
theory either in general or in this case, of course.

~~~
amagumori
Zermelo-Fraenkel set theory, an axiomatic set theory that basically forms a
logical "foundation" for math. I think this comparison of counting blocks to
zf set theory is super hyperbolic and not accurate though.

~~~
stared
It is hyperbolic, but not far from "Python is magic, C is real programming".
(It would be more like "Python is magic, a Turing machine is real programming".)

------
scottLobster
While I agree with this in theory, I can't help but imagine that 20-30 years
ago you could take this post, replace every instance of "C" with "assembly"
and it would relate to a lot of developers at the time.

The march of technology (and knowledge in general) is one of constant
abstraction. We figure out ways to do more with less so we have more capacity
left over for further advancement. You don't know 10x10=100 because your brain
derives it every single time, you just memorize the fact, hopefully with
understanding. In software terms that means higher level languages with
sophisticated libraries, and languages like C are relegated to more of an
infrastructure role.

This isn't necessarily bad. If the technology is more capable then everybody
wins at the end of the day. As the author points out there will always be a
niche of enthusiasts/zealots who will lead the charge and further the field.
The remaining 99% will learn what those people come up with and apply it. To
the best of my knowledge that's how it's always been, and it's worked pretty
well so far. So I'd say let people start off with high level languages if they
want, but expose them to lower level languages and let them choose which
demographic they end up joining.

------
jdbernard
Why do so many people seem to think the author is advocating that you learn C
as a first language? Did you read the article? He explicitly says the
opposite:

 _Does someone brand new to programming, who’s excited about making changes in
the HTML on a page or setting up a blog to later customize, or just wants a
slow introduction to more complex topics, really need to worry about cache
coherency? Or threading? Clearly that’s a problem for down the road. Maybe
that’s a road that some developers will never need to walk down at all,
depending on their focus._

His position is that, while C may not be the best language for everyone coming
into programming, it is beneficial to learn it eventually because it allows you
to peel back the layers of abstraction underneath all of the goodies in your
high-level language. This allows you to better understand and reason about
even your higher-level code and opens the door to fixing and understanding a
lot of things that would have constrained you previously.

I completely, whole-heartedly agree. We don't always need to be able to plumb
the depths of our abstraction stack. I don't believe it is necessary for a
beginner. But having the ability to go up or down in the stack is extremely
valuable. C is worth learning.

~~~
philipbuuck
It's funny how the comments went down that road. I don't think C is a good
choice for a first language. I would argue that C++ is an even worse choice
than C, but that's for another article...

------
goodcanadian
I find it odd that anyone would question the value of learning C. I find the
negative attitudes toward C to be very odd in general. Now, it is clear to me
that I am in the minority, but I can't understand why people would be doing
serious work in an interpreted language. I detest garbage collection as I have
no control over it. I am sure I could write statements like this all day.

Don't get me wrong: I use perl when it makes sense; I use Java when it makes
sense. I quite like playing with new languages and seeing what I can do with
them. There are many languages that are safer than C or easier to do
particular tasks than with C, but I am very rarely happy with the trade offs.
When I am trying to do serious work that is meant to stand the test of time,
I'll use C.

~~~
munificent
> I can't understand why people would be doing serious work in an interpreted
> language. I detest garbage collection as I have no control over it.

You have no control over C's register allocator. (Well, very little,
`register` doesn't mean much.) Assuming you use it, you have no control over
how the standard library's malloc() and free() work. malloc() could be best-
fit, first-fit, something else entirely, who knows. No control over
fragmentation. You have no control over the CPU's branch predictor or cache
eviction policy.

You don't control the OS's thread scheduler, virtual memory implementation,
etc.

Everyone picks a spot where things below that level are details they don't
want to care about. You may have picked a slightly lower level, but there's
nothing fundamentally different between you liking C and someone else liking
Java (which is GC but not interpreted), Forth (interpreted but not GC), or
Ruby (GC and interpreted).

People do serious work in Ruby for the exact same reason you do serious work
in C: it lets them control the things they need to control and ignore the
things they don't.

~~~
goodcanadian
Yes, but interpreted is slow, and garbage collection causes pauses at
essentially random times. These things are fine for one-off sorts of problems
where it doesn't matter much if it takes a little longer. If you are writing
an application that is meant to be run thousands or millions of times, I start
to wonder why people don't care more about these trade-offs (the answer is, of
course, they do . . . if they are, for example, running large infrastructure
like Google or Amazon).

My point isn't/wasn't so much to crap on other languages, but to express
dismay at why people crap on C so much or view it as irrelevant. Ruby is
definitely not irrelevant. Why would people think C is?

~~~
umanwizard
The main language at Amazon is Java. Lots of Java code is being run millions
and millions of times there. Java isn't as slow as you think it is.

~~~
goodcanadian
I didn't say Java is slow. I said that garbage collection introduces
unpredictable pauses. This is potentially an issue with Java, but it all
depends on the problem you are trying to solve. In many cases, it won't
matter, but in some, it will.

How is this relevant to the idea that C is an important and relevant language?

------
gozur88
The problem with learning C is it's easy to code things that work most of the
time, but not always. A great many C coders don't _really_ understand the
architecture, and they don't get the immediate feedback that learning
requires.

In my last big C project it wasn't uncommon at all to find memory leaks and
out of scope references to stack variables. You'd add a line of code and the
whole thing would blow up because it changed the state of the stack and
unmasked a problem in another module. I would find the culprit and figure out
if they were being sloppy or they just didn't understand, and it was usually
the latter. This from people who had been programming professionally for a
year or two.

------
gavman
While I agree with the premise that it is important to know C, I also agree
with many of the comments here suggesting that there are other better first
languages out there. As a recent college grad ('15), my intro course was in
Python and then the next course was in C. Python was easy to jump into--it
usually just made sense and worked. While learning about loops, lists,
strings, etc, that was useful. Equally useful was then moving to C and getting
a better understanding of the "lower level" (especially in the sophomore
computer systems course). But you don't have to understand caches, page
tables, and threading to jump into a first programming class and master the
principles.

------
rikkus
Whenever I don't quite understand how something works in a programming
language, I try to imagine how I might implement it in C. Or I look up the
implementation, or write it myself. This helps me understand from 'first
principles' or near enough.

------
Apocryphon
Classic Joel Spolsky essay: "Back to Basics" \-
[http://www.joelonsoftware.com/articles/fog0000000319.html](http://www.joelonsoftware.com/articles/fog0000000319.html)

------
eswat
I enrolled in my university the same year they switched from teaching C to
teaching Java in the CS curriculum. As someone that learned C++ in high school
I was less than enthusiastic about this change.

But what surprised me more was that the courses became more institutionalized.
Professors were preparing you for a cushy career as an employee at a bank or
large corporation instead of showing you how to use the computer to solve a
dizzying number of technical, social, and financial problems. I could
understand a technical college emphasizing a safe, risk-free entry into a nice
career but I expected better from a university.

------
craked5
In my college here in Lisbon, almost all the courses that you did were in C:
Programming 1 and 2, Operating Systems, Distributed Systems, etc., except
for OO classes. They changed it to Python after I left, but I am glad that I
had the opportunity to learn C, although I don't use it very often in the
"working world". I reckon that Python is a nicer language for a freshman who
has never seen code before, but they should have kept stuff like OS and
Distributed Systems in C.

------
xyzzy4
C is a beautiful and pure language. It is like the Latin of programming
languages. I recommend that everyone at least learn it.

~~~
pklausler
A language can be beautiful, pure, or Latin, but not all three.

------
drelihan
Ruby, Python, Perl, PHP, and Java are all (mostly) built in C (not to mention
most databases). Why not learn to program in C?

~~~
bachmeier
You realize that computers don't literally run C code, don't you? By that
argument you should go further and build your own hardware, then write your
own OS...you get the point.

~~~
philipbuuck
I think C is a good choice to stop. I know it's an abstraction itself, but
beyond that you get into hardware specifics and often-closed OS code.

~~~
cyphar
> often-closed OS code.

If you use GNU/Linux, OpenBSD, FreeBSD, NetBSD, SmartOS, OpenIndiana,
OpenSolaris, etc this isn't a problem.

------
peter303
You can write very C-like code inside of Java or C++. Just write one class and
stuff all the variables and subroutine methods inside it. Many years ago I had
to fix some bugs in such "converted" code. The programmer just wrapped old C
code into one humungous dummy class.

------
amorphid
I liked having Ruby as a first language. The REPL made it easy to see what was
going on (even if I didn't understand it), the community's fondness for testing
is fabulous, and I became comfortable doing things at the command line.

------
xufi
Interesting. I haven't delved much into C, but no doubt I will try to get more
into it during my spare time. It's a good basis that some other higher-level
languages drew their cues from.

------
qwertyuiop924
The whole handmade thing seems kind of silly. Yes, understanding what's
beneath abstractions is important, but if you want to build software well, you
should use your abstractions well, because not every piece of software needs
lightning speed.

~~~
adrusi
I don't think the author is arguing everyone should use C for everything. He
is arguing that everyone should write something in C so that they understand
the abstractions they use better. Abstraction is great because it lets you
build your conceptual structure atop a virtual foundation, but your structure
will be more grounded and therefore more sound if you learn to connect your
understanding of the virtual foundation to its physical origins.

~~~
philipbuuck
Yep, exactly. Abstractions help make us more productive, but if we don't
understand what it is they're actually abstracting, I believe they're limiting
us, not enabling us.

~~~
qwertyuiop924
I'll toast to that. No, I was talking about the "Handmade Manifesto" that you
linked to, which seemed to say that high-level abstractions are EVIL.

