
Learn C, Then Learn Computer Science - qrohlf
http://qrohlf.com/posts/learn-to-code-then-learn-cs/
======
betterunix
Personally I take a different view: learn a high-level language, use it to
study computer science, and if you feel like writing some low-level code go
ahead and learn C (though even then, you could probably do better by
bootstrapping a compiler for your favorite high-level language and adding some
extensions for low-level operations). C is not the best or even a particularly
good language to try to learn abstract CS concepts with, professors who choose
it are needlessly stressing out their students, and at the end of the day you
can learn a lot more about algorithms and data structures when you are not
busy worrying about pointers or trying to avoid the things that "everyone
knows to avoid" (except people keep failing to do so).

I say this as someone who watched the catastrophic results of trying to use
C++ as a language to teach data structures. It is far more important to be
able to think abstractly than to know how to deal with pointers.

~~~
soup10
C++ and C are completely different beasts. C is an elegant clean small
language with few distracting elements, making it very suitable for a teaching
language. C++ is a grotesquerie more suitable for a carnival horror show than
something you want to introduce to new programmers.

Unfortunately at some point, teaching OOP became all the rage in schools, and
so C++ is chosen instead. That's a mistake, Java or C# should be used for
teaching OOP, C should be used to teach data structures and at least some
algorithms. Without hands on memory allocation, you're not really getting a
full understanding of how data structures work.

~~~
betterunix
"C++ is a grotesquerie more suitable for a carnival horror show than something
you want to introduce to new programmers."

In my experience TAing an undergrad course that used C++, almost all of the
things that left students scratching their heads were things that are present
in C. No garbage collector, no built-in way to determine array sizes at
runtime, no way to determine if a pointer has already been deallocated, no
requirement that non-void functions actually return a value on all control
paths, etc. The worst thing C++ does is to amplify these problems
(particularly that last one -- yes, I know, use -Wall, but someone who is just
starting to use a language would not know that, and having to teach compiler
flags is an even worse distraction from the subject matter of the course).

Really the problem is not language size at all. Python is a big language too,
but it does not have the above problems. Common Lisp is just as big as C++ (in
terms of the number of pages in the standard), yet these are not problems
Lispers have. Scheme is a small language, like C, yet the elegance and
expressive power of Scheme is on a completely different level from C.

The problem is that the few abstractions C presents are hard to deal with,
especially for beginners, and the abstractions that C could present are sorely
missed. Even the abstractions C presents are unreliable, with loads of
undefined behavior and plenty of ways to break the abstractions.

Really, the fact that real-world C programmers have to pick a subset of the
language and enforce various style standards and coding conventions speaks
volumes about the suitability of C for beginners. If we were going to require
students to use a specific subset of C, why not just write a compiler for that
subset and use that to teach? The answer is pretty clear: if we were going to
write a compiler for a new language that was suitable for teaching students,
we would write something better. Why bother when we already have better
languages to choose from? Save C for the OS course, and only as long as it
remains relevant there.

~~~
YZF
Did you teach standard library container classes? reference counted pointers?
Doesn't sound like it. So did you really teach C++?

~~~
betterunix
It was a course on data structures, not on the C++ standard library.
Reference counting would have been the wrong thing for the lesson on doubly
linked lists -- which was basically the first lesson of the course. "Modern"
C++ would not have helped at all for the student who had a function with a
missing return statement and a confusing error log.

Frankly, almost none of the C++11 features are actually useful for teaching
data structures, and those that are relevant would only confuse students.
Basically, only _auto_ and the _three kinds of smart pointers_ are relevant to
an introductory data structures course. At the end of the day those would only
create as many problems as they solve. For example, _unique_ptr_ means that
there is only one "owner," right? Wrong, _get()_ returns a raw pointer to the
object, and you can make a new _unique_ptr_ from that. Sure it is easy to
avoid -- if you are an expert with lots of C++ experience, who follows coding
guidelines and all that. The data structures students had little to no C++
experience and would almost certainly have done what I just described -- and
that is just one of many ways they can and will screw up C++11 features.

At the end of the day, C++ is too complicated, too poorly defined, and has all
the wrong abstractions for basic CS courses.

~~~
YZF
So you were basically teaching C? How does C++ come into this discussion at
all?

How do you teach a basic data structure course in Python?

Pascal seems like a much better choice for data structures than C or C++. Or
you can use Scala and teach functional data structures if you're adventurous
...

------
ChuckMcM
Don't confuse the strategies :-) A small liberal arts school has a strategy of
broad-based understanding of theory and the 'art', with graduate school
providing the nuts-and-bolts specialization; a large engineering school (USC)
has the strategy of maximal skill in execution of the art with enough theory
to be effective. USC will assume that you will go to work right after you get
your 4-year degree; Lewis and Clark may be assuming you're going to have at
least another 2 years of work on your master's degree to get those bits in
order.

In my experience, learning how the computer worked made it easier for me to
learn to program it. My wife was much more of an algorithmic thinker who
couldn't care less whether a variable was in a register or not, and focused on
how things flowed through the program. I think they are both valid approaches.

~~~
RogerL
"I think they are both valid approaches."

Only for limited ranges of "valid." If you are implementing a CRUD app with
low performance requirements, you can certainly become an effective programmer
without knowing the low-level details.

But if you want to be an engineer? Good luck. I've worked with people who
didn't know the stuff behind C, and they are pretty useless as soon as
performance matters, or you need to talk to hardware, crank through a lot of
numerical computations, minimize the power usage of a cellphone app, or debug
complicated situations. They don't have a good mental model of the machine in
their mind, and so they cannot figure out what is going on, reason about the
performance of a modern chip, and so on.

There are a lot of arguments for expediency on HN. Probably it is partially
justified, but I will always advocate for an engineering education, as opposed
to a 'learn rails in 3 weeks and get a job' approach. Education and knowledge
enables you to tackle any problem that comes your way.

So, while you can learn to be an effective debugger of a Python script (say)
without detailed knowledge of the machine, mastering the fundamentals will
make you an effective debugger _and_ enable you to do so much more. To stick
with Python: what happens if the Python implementation itself is buggy? There
is a large subset of programmers who cannot cope with that situation.

Why limit yourself? If this is going to be your career, the way you pay your
mortgage, feed your kids, and so on, invest in yourself. I have trouble
thinking of a downside to acquiring the knowledge, but I can trivially think
of the upside.

~~~
actsasbuffoon
There are advantages to the other side. Many fascinating developments were
made possible by those who prefer to think functionally, as opposed to
imperatively. Modern relational database engines are fairly easy to implement
once you understand set theory. Lambda calculus has taught me how to create a
linked list using nothing but partially applied higher order functions.
Software transactional memory was invented in Haskell, and it's easy to see
why it happened there first.

The world needs people who think mechanically as well as people who think at a
very high level of abstraction. I'm glad people like you exist, as tech would
be vastly behind where it is now otherwise. I'm also glad there are people
like me. We may not always see problems the same way, but the world is a
better place because of our differences.

~~~
ori_b
If you aren't able to think about both sides, you've got a massive hole in
your education.

I'd be willing to bet that the people that came up with software transactional
memory were comfortable in C and assembly as well as Haskell, and hadn't
bottlenecked themselves with the false dichotomy of thinking functionally vs
thinking imperatively.

The world needs people who think mechanically _AND_ at a high level of
abstraction _at the same time_.

------
anaphor
"The psychological profiling [of a programmer] is mostly the ability to shift
levels of abstraction, from low level to high level. To see something in the
small and to see something in the large." -- Don Knuth

------
pjmlp
Please don't!

Learn Delphi/Object Pascal, Ada, Modula-2, Rust and discover control over
memory managerment doesn't require throwing safety out of the window.
Performance can be fine-tuned to the 1% hotspots that really require playing
dirty tricks.

And that there are modular systems programming languages with compile times
that put C to shame.

Then cry as you are forced to adopt C to be understood by the rest of the
world.

~~~
kabdib
The problem being, if you show up and all you know is Delphi / Pascal / Ada,
you're going to be unemployable.

[I'm a veteran of a ton of Object Pascal, btw. Done some Delphi, too -- it was
nice. But.]

In general, pushing a niche language as The One does not do a beginner a
service. Delphi has never been more than a niche.

We had a guy at Apple decide to do his project in Oberon. Super smart guy who
did pretty good work. He wound up getting fired because it wasn't a smart
choice for the people around him who had to take over the code.

I don't know why some people over-index on programming language. You use
what's appropriate. If you're working alone, do whatever floats your boat. If
you're on a team, using the Latest Shiny Thing or The Most Correct Safest
Thing may not be the best choice. If you're a student, using your professor's
favorite language (we used to call these little pet horrors PL/Prof or PROF-
TRAN) outside of the course is probably a bad choice -- you want to learn what
the market is, and you need to know what the weak points of your chosen
language are.

It's particularly poignant to see "Java only" kids come out of school who are
utterly ignorant of memory models, barely know what XOR is, and who can only
express themselves in terms of classes and interfaces. There's more to life
than that.

~~~
pjmlp
> The problem being, if you show up and all you know is Delphi / Pascal / Ada,
> you're going to be unemployable.

Sure, it is all about learning for the job I guess.

My rant goes into another direction actually.

Young kids nowadays tend to think not only that C is the only player in its
field, but that nothing else ever existed before it.

Whereas many of us remember the days when C was just another systems
programming language among many others to choose from.

So if they learn only C, they fall into the mentality that bounds checking,
modules, and proper vectors belong to the realm of VM languages, while only C
allows for full control.

If on the other hand they learn about the Algol family of systems programming
languages, besides C, they will be aware of other languages that offer the
same feature set as C and compile to native code as well, while offering
higher productivity.

Having that knowledge will make them better C programmers as well, as they
might develop a better sense of safety and discipline while coding in C.

------
UNIXgod
Learn C when you need to grok how memory works in computing. Learn Scheme to
grok the science of computing. Learn BSD when you want to understand operating
systems and the network stack. Grok engineering when you write a scheme->c
translator running on BSD.

~~~
hobbyist
Could you elaborate more or give some references on the second and third part?
I am done with the first. I seriously need some profound knowledge on second
and third, which a lot of people like you talk about. I need to put a plan to
get there too. Scheme to C looks fun though :-). Where should I start
first?

~~~
gamegoblin
If you want to implement Scheme in C, I'd take a look at Scheme From Scratch
([http://michaux.ca/articles/scheme-from-scratch-bootstrap-v0_...](http://michaux.ca/articles/scheme-from-scratch-bootstrap-v0_1-integers))

It's pretty excellent for learning how to implement a simple language in C.

------
habosa
I think C is a great language to teach someone _why_ all of the CS theory
matters. When you write Java, almost everything you do will "work" and it will
probably run pretty fast on your Core i7. When you write C you have to be much
more deliberate. There's no massive standard library to do arr.sort() for you
and you have to really think about how to implement some basic processes that
you took for granted. I think C made me a much better Java programmer because
I now understand what the most minimal way to complete a task can be. No more
HashMap<String, ArrayList<Object>> once you've played with structs and C
arrays.

~~~
tbirdz
"There's no massive standard library to do arr.sort() for you and you have to
really think about how to implement some basic processes that you took for
granted"

Not massive by any means, but the C standard library does include qsort()
(typically implemented as quicksort) in stdlib.h.

------
Kapura
This article resonated deeply with my own experience. Like Rohlf, I learned to
program TI-Basic in middle school, and like him, I've had the ability to see
several different teaching paradigms when it comes to computer science.

In the high school computer science courses I was able to take, the programs
were relatively simple, with what was, in my opinion, a too-overt emphasis on
object-oriented programming. It was in Java, and very little of the class
focused on the "how" aspect of the language.

My first CS courses at university were in Ada. I don't know how many of y'all
have programmed in Ada, but I think about it like a wordier C that doesn't let
you just get away with shit because it's strongly typed. This was the language
we were taught basic control structures in and, to be fair, the way we were
taught focused on good styling so that our code would not be unreadable mush.

The UNIX class was in C, and it is, in my opinion, the most rigorous course
that most of the students will take on their way to the bachelor's degree. The
students build, across a series of projects, a shell that can execute commands
and run simple scripts. The sorts of problems encountered introduce the
students to the nature of C at a very low level, as well as the sorts of
structures that can be used to solve practical problems.

Many of the high level more theoretical classes are language-agnostic, and
many of the students use either Python or C#. I think that it's good that
these higher-level theoretical classes allow us to choose our own language,
because I think that the ways in which different programmers attack the same
problem might bias them towards one or another language.

Recently, the lower-level classes had their programming language changed.
Heeding the argument that Ada wasn't super relevant in the higher-level
classes, the faculty changed the language to Python. This was, I think, a
mistake. My opinion is that the lower level classes should be taught in C for
many of the same reasons Rohlf mentioned. I think it's important that the
students have an idea of how their data is being represented on the disk so
that they can better understand the costs associated with certain tasks.

The best argument that I heard for Python was that not everybody taking these
intro classes goes on to write the sort of code that requires something fast or
efficient. Python makes data processing easy, which is really the most
important thing that students should learn in intro-to-programming classes.
I'd say this is a fair point.

~~~
cturner
Ada's powerful. The typing system is the best of the mainstream imperative
languages. You can write good cross-platform code without thinking about
endian issues. Almost as fast as C, strong gcc support.

It's a shame it's so damn verbose. A fortnight ago I was trying to brainstorm
how I could put a tighter syntax on it, possibly with some kind of pre-
processor. Ideas welcome!

~~~
pjmlp
> It's a shame it's so damn verbose.

When a big part of your job is to read the code from others of various skill
levels, one learns to have a soft spot for verbosity.

~~~
ori_b
After reading probably millions of lines of code, I find myself wishing people
would learn the difference between verbosity and clarity.

More words do not necessarily lead to clearer, more understandable code.

------
YZF
I would argue C is a difficult first language. It's even a difficult 2nd or
3rd language.

I would start people off programming at a higher level of abstraction and once
they can do that dive into how things work. You could start the other way but
it may be too theoretical and discouraging.

Pascal used to be the language of choice and IMO can still work well. Things
got more complicated when we got all those competing paradigms and now you
have a matrix of language/paradigm. Should you teach OO? Functional?
Procedural?

I also think it's important to lay out some theoretical foundations as you're
teaching the first language. You don't want to overdo it but you need to start
building some basic ideas, notations, concepts. As long as it doesn't get in
people's way in their ability to actually build something...

If you want to understand how computers work, there's no substitute for
programming in assembler, preferably under different hardware architectures.

This debate about how to teach programming/CS/software engineering is endless.
Different people and different schools have different goals and different
capabilities. A well rounded software engineer definitely needs to have a good
mix of theory and practice. A CS researcher perhaps needs a different mix.

~~~
vinceguidry
C is an excellent 2nd language. The language itself is obtuse, but the
underlying model is fairly simple. C++ is a horrible one. These days, probably
nobody should learn C++ and should probably pick up Go instead.

As a teenager, I picked up a book on C++ having already learned BASIC, Pascal,
and MS-DOS, but never got proficient in it. I should have picked C. The "++"
led me to believe that it was a better language than C, and I didn't have any
real programmers around to tell me otherwise.

Oh the missed opportunities...

~~~
YZF
C++ has many followers but it's a multi-headed beast and some of it can be
much harder to grasp. I think it's easier to argue that it's better than C;
all you need to do is to look at C code that tries to implement some patterns
that are more naturally expressed in C++. Go isn't really a substitute for C++
but one might argue D is.

Sounds like C++ and you didn't quite work out. I'd still encourage trying to
figure out how some pieces of C++ can improve your C code and use those.
You're welcome to stick to C in the rest of your code. A lot of "real
programmers" do use C++ successfully and a lot of software you use has C++ in
its DNA...

EDIT: (That said I would only expose new students to C++ after they've seen C
and some higher level language so they can appreciate the niche that it fills)

~~~
vinceguidry
Whoops, I didn't mean to give the impression that C++ guys weren't real
programmers. Of course they are. I literally meant "real programmers" as in,
anybody who had more experience than I did. I was completely self-taught.

I will probably never touch C++ again. I solve all my problems with Ruby and
if I ever need to go lower-level, I'll break out C and maybe Lua.

------
mathattack
From personal experience, I learned theory and structure first. It took a long
time to "get" variables as locations in memory, and I had to relearn it many
times.

Harvard (which sends a lot of people on to PhDs) forces everyone to learn C
in the introductory CS50. I like how they do it. If you don't get "the
machine" early, it is hard to get it later.

------
NAFV_P
> _This semester I used function pointers in an assignment and later heard
> them described as “Quinn’s magic C voodoo”._

I still think I have to polish up on 'em:

    
    
      #include <stdio.h>
      #include <ctype.h>
      //
      // return 1 if any character of s fails the predicate f, else 0
      int chk_arg(const char *s, int (*f)(int)) {
        while (*s)
          if (!f((unsigned char)*s))  // cast avoids UB for negative char values
            return 1;
          else
            ++s;
        return 0;
      }
      //
      int main(void) {
        const char *s[] = { "1048leet576", "numbers only pls\n" };
        if (chk_arg(s[0], isdigit))
          printf("%s", s[1]);  // "%s" avoids a format-string bug
        return 0;
      }
    

> _This was confusing - my classmates were juniors and seniors who were 3+
> years into a computer science degree, yet many of them didn’t seem to have
> an understanding of how computers worked. They could write high-level code
> and analyze algorithms, but had never used malloc._

What the hell, CS students who haven't used malloc? I've got no formal
background in CS, and yesterday I wrote a binary search tree in C that passed
valgrind.

~~~
moccajoghurt
I am positive that 95% of the CS students with a BSc. degree in my college
haven't even seen function pointers. We learn the basics of C, then OOP in
C++, and the rest of the degree is Java. It's a farce, and as an employer I
wouldn't even care whether my employee has a fancy degree. You won't find real
coders if you choose them by their degree. At least in Germany it's like that.
I am not sure if it's different in other countries.

~~~
pjmlp
I am a Portuguese living in Germany, and every time someone describes to me
how CS degrees work here, I always feel sorry for my work colleagues.

When I took my degree in Portugal in the mid-90's, it was a 5 year long
degree, with heavy mix of theory and practical subjects in lots of programming
languages.

I got to use C, C++, Prolog, Camllight, Java, PL/I, Algol, Pascal as
compulsory ones. Plus many many others that I got to learn/use as part of the
compiler design lectures I used to attend.

All CS areas had introductory levels as compulsory, with the more advanced
ones as optional subjects. Given the way the credits were set up, quite a few
of those advanced ones were also required.

------
gwu78
To truly appreciate C I think having to program in assembly first helps.

I say learn assembly first, then learn C.

Then do whatever you want.

~~~
moccajoghurt
On the other hand... give a student Python in the first semester and he'll get
hooked. In the second semester you can start the approach you proposed.

------
gpcz
Joel Spolsky wrote a good article about this called "Back to Basics" (src:
[http://www.joelonsoftware.com/articles/fog0000000319.html](http://www.joelonsoftware.com/articles/fog0000000319.html)
). My favorite quote from it (capitalization modified) is, "If you want to
teach somebody something well, you have to start at the very lowest level.
It's like Karate Kid. Wax on, wax off. Wax on, wax off. Do that for three
weeks. Then knocking the other kid's head off is easy."

------
was_hellbanned
I think we need to break all of this down into at least three entirely
separate areas:

1) Computer Science, which is, in my opinion, entirely a subset of
mathematics, and should be taught as such.

2) Coding, which is the ability to break a problem down and describe it in a
series of simple steps.

3) Craft and application, which is everything about how to use tools, best
practices for code architecture, and everything else that every new graduate
is usually terrible at.

I think we could satisfy most of the world's need for "programmers" by
teaching #2 and #3 in a trade school model.

~~~
marcosdumay
The problem here appears when you want to teach #1. You shouldn't do it
without also teaching #2 and #3 (isn't that obvious enough?), but #3 is often
out, and it seems that there are some people trying to not "lose time" with #2
either.

~~~
was_hellbanned
I would say someone on a full CS-and-engineering track should study #2
(intro), #1 (rigorous CS/math), and finish with #3, since that's where you
want to focus on the latest industry standards and trends. This is similar to
my own CS university experience, but I'd like to see it broken out into much
better defined areas, with different ratios of focus for different career
tracks. I also think a full program should take significantly longer than a
4-year degree.

Incidentally, I think #1 could easily be taught without #3 at all. I get the
impression that there are quite a few academic people who tinker with
interesting algorithmic work, in esoteric languages, who couldn't effectively
write maintainable code for mass deployment. Not that there's anything wrong
with that, it's just a different area of specialization.

------
knappador
My recipe for the best possible syllabus for CS 103:

* Turing machine -- just the tape and making it do the equivalent of procedures so that we can compose things. Brainfuck is a valid homework medium.

* A relevant ISA such as ARM -- We can let Thumb/2 be bonus points, but the real goal is just to convey that ISA's are usually written to take the most generally abstractable things you would do with a Turing machine and do it more succinctly and faster in hardware.

* Build a function stack with the goal of understanding how procedures become generalized by calling conventions or optimized by inlining. The lesson is that we stay pure for flexibility and get dirty for raw speed.

* Introduce Scheme and tail recursion optimization. Now the whole of functional programming and the abstractions -- and their optimizations -- are opened up with a firm footing in the machine.

Turing and Church arguably laid down most of CS, so it's obvious that you want
to relate the two, creating the most generally useful bridge concepts for the
long life of a programmer who, if they never let up, will inevitably touch
both high-level and low-level languages.

OS's are the final frontier here, but cannot possibly be squeezed down due to
things like MMU functionality totally clouding up what is what. Making a
machine look like it belongs to the program is what OS's are good at, and this
is a different kind of abstraction to grok than the relationship between the
machine and the programmer.

------
austinz
I come from an EE background, so I learned C and studied various aspects of
computer hardware and computer architecture to some relative level of detail
before doing any sort of algorithms or software design. Sometimes I wonder
what sort of software engineer I'd be if I had gone the other way - studied CS
to start off and then descended down the stack later on. Would I be better?
Worse? How would I think about problems, and what mental models would I go for
first?

~~~
erichocean
I've wondered the same, but in the reverse direction: by the time I started
programming, I had read—not kidding—thousands of papers on computer science,
and read hundreds of books.

I just had no interest in coding. (Or math for that matter, though ironically,
I also read math books.)

Anyway, I've always wondered if I'm a better or worse CTO for having taken
that route. I _do_ know I code much less than my peers, even though (it seems
to me) I get much better results with far less coding effort.

~~~
erichocean
I'm not sure why, but this comment stuck with me and I kept thinking about it.

Here's an example of where I think learning CS, then programming, helped.

I had written my own JavaScript application framework, called Blossom[0], and
I wanted "real physics". I took a look at the various 2D physics ports that
had been made to JavaScript, and they all had pretty bad performance. Also, I
knew from my work as a filmmaker that "real physics" is actually a pain in the
ass, because what you actually want is directable physics that _feel_ real.
(Stick with me, I'm getting there...)

So, I took a step back and determined that what I wanted was to be able to use
tween curves like normal, but have the actual path that was taken depend on
the velocity and/or acceleration of the movement when the user released their
finger, yet be "directed" according to the tween curve.

That required doing math, specifically, getting first and second derivatives
from your tween curves. Yuck. Except, CS to the rescue, there are these things
called Dual numbers[1], and when you use them, you get automatic
differentiation[2] for free, without having to rewrite your code.

So I wrote a Dual number implementation for JavaScript, grabbed a JS parser
and rewrote the tween functions at load time (i.e. with code, automatically)
to use the Dual numbers so that I could extract the velocity and acceleration.
I then took the original curve based on the user's movement, blended it with
the tween derivatives, and the resulting animations were actually really
great. The whole thing was efficient enough to get real time performance on an
iPad 3 in mobile Safari, entirely in JavaScript.

And the total time from start of work to the physics-based Dual-number tween
animations in Blossom? Two days, and about 2K LOC.

If I hadn't known about Dual numbers, there's no way I'd have gotten there just
by sitting down and programming, and who knows how long it took the FB team to
do the dynamics for FB Home, or the Apple team to add UI Dynamics to iOS 7. My
guess is that it took longer than two days...

[0]
[https://github.com/erichocean/blossom](https://github.com/erichocean/blossom)

[1]
[http://en.wikipedia.org/wiki/Dual_number](http://en.wikipedia.org/wiki/Dual_number)

[2]
[http://en.wikipedia.org/wiki/Automatic_differentiation](http://en.wikipedia.org/wiki/Automatic_differentiation)

------
dylandrop
In my time as a CS student, I've never needed to use C nor understand its
syntax to get through my classes. I think you're confusing the necessity of
learning C with understanding how a computer works, and the two are not the
same.

I think the most useful class I took that explained this (in my sophomore
year) was a class on digital logic and micro-architecture. Just about every
school I know teaches some class like this early on in your education (at
least teaching the basics -- gates, latches, flip-flops, adders, etc.) In
mine, it even carried up through some basic assembly code. This class gave me
enough to know how computers are working at a fundamental level to be able to
use that to my advantage when implementing projects.

Lastly -- you're getting a CS degree, not a programming degree from a
vocational school. Does everyone who researches... quantum computing, or
natural language processing, or graph theory have to know and communicate in
C? I think there are other pursuits beyond programming that you get from a CS
degree, and to say it's necessary to learn C pivots your school closer to a
vocational school. Not that that's bad -- it's just different than what you
signed up for.

------
rismay
Good stuff. However, I don't think that "learning to code" is being used in
the same sense that Terence Eden meant. I think what Quinn Rohlf is trying to
say is, "Learn how computers work." And what Eden is trying to say is, "Learn
how to be logical." And for a hat trick of HNs submissions, what Andrew Wulf
is trying to say over at thecodist.com is that in general you should just,
"Learn how to keep learning."

Taken as a sum of collective knowledge, I think everyone is right in this
case. I think the general idea people are trying to get at is this: 1\. To
work in technology, you must love learning. (Don't get steamrolled). 2\. To
work in technology, you should know how computers really work. (Learn about
C.) 3\. Working in technology is not about learning where to put the curly
brackets. It's about learning how to think logically. (Learn Computer
Science.)

In this sense, I think a great way to get started is by working on a project
that you love and in a medium you love. Then as you start hitting limits,
descend to C and learn algorithms.

------
frou_dh
I find the notion of syscalls as the interface to an OS more intriguing than
the world according to C. C as it stands may be ubiquitous, but other
languages can conceivably be implemented as peers to it (rather than on top of
it - it took me far too long to realise that). It's fundamental neither on
bare metal nor in the presence of an OS.

~~~
pjmlp
Yep, back when C was mainly a UNIX-only language, there were other languages
to choose from.

As UNIX spread into the enterprise, C started to gain weight.

If UNIX hadn't been successful, C would just be yet another language.

------
madmax96
Terence Eden was saying the same thing that this post said, which is why I was
confused at the contradictory tone in this article. It's vital to a _good_
computer science education to understand pointers at a hardware level,
differences between stack and heap variables, etc. It's a tragedy that the
students at Lewis and Clark looked at pointers to functions and saw voodoo,
but it would also be wrong to teach them only the low-level stuff,
since there most certainly is enough to fill a college experience with. That
being said, the only thing that seems missing at Lewis and Clark was a good
introductory course. A good background on what languages are doing behind the
scenes, as provided by something like C, is a part of computer science and is
necessary for a thorough understanding of the subject.

It just seems like all the issues talked about were computer science issues.

------
dljsjr
I'm not convinced that these things need to be mutually exclusive. In the CS
department at my university, the only language used for the first 3 years is
C. But all of the courses are split into a "lecture" and "lab" component. In
lecture you learn about Computer Science; you learn data structures,
algorithms, computational complexity, graph theory, proofs, summations, stats,
combinatorics, etc. In lab you learn Programming; you implement a lot of these
algorithms and data structures in C.

Abstract vs. Concrete Data Types are introduced very early as representations
for the theoretical structures discussed in class and then we are given
projects that can be solved using concepts and theorems introduced with the
expectation that we will implement the appropriate structures and algorithms
in C. The code itself is read over carefully and is just as important as
whether the output is good. Now, this is partially a luxury afforded by my
university's small size; freshman and sophomore CS classes aren't taught in
200 person lecture halls, they're 30-40 students at the max. My first exposure
to a linked list was writing one in C, not using one in Lisp. You model them,
write proofs, recurrence relations, graphs, etc. on exams, and implement them
in projects so that you show an aptitude for both the theory and the
mechanics.

By the time you hit your fourth year, you take Programming Languages and
Theory of Computation. Automata/Machines, Grammars, BNF, Compilers, etc. in
lecture; seeing these concepts applied by studying and writing code in OO
languages, LISPs, and Prologs in lab. And you'd sure as hell better understand
how the garbage collector in your favorite high level language works or you'll
get hosed on the exam. There are other staples mixed in, of course: a whole
semester devoted to Java/OO (I think a whole semester devoted to functional
programming and Lisp would be better, but that's another story), a semester of
OS, and a semester of Networking. But focusing on just the programming vs.
science side of it, there's no reason both can't be taught effectively.

~~~
YZF
And how did that work? I'm just trying to imagine those first-year students
staring at segmentation faults or trying to decipher some other non-intuitive
C behaviour (promotion rules, = vs. ==, pointer arithmetic, various implicit
rules). Talk about being thrown into the deep end of the pool to swim...

When a language like Pascal is taught as a first language students can focus
on the mechanics of their algorithms rather than struggling with language
oddities and machine architecture. I think that's important for new
programmers to be able to focus on the flow and mechanics without worrying
about other details. Once you get that you can "advance" to things that leak
more of the underlying architecture into your program. There are probably many
other languages that fit that bill.

EDIT (replying to jkrems): E.g. in C if(a=1) assigns 1 to a. I think some
modern compilers at least warn on this but they didn't use to and it's
perfectly legal. You can't do this in Pascal. You can still make a mistake but
you'll get a compiler error. This is a very common mistake for beginners...

~~~
jkrems
Can you elaborate what you mean by "= vs. =="? Since you mention Pascal I'm
guessing that you'd say ":=" and "=" would be better, but since the general
model of math is different from procedural code, I'm not sure that's an
uncontested truth. I learned Pascal in high school and ":=" was confusing to
more people than "==", even among first time programmers.

~~~
elclanrs
I find this paper [1] very interesting; it elaborates on this issue: "From
experience it appears that there are three major semantic hurdles which trip
up novice imperative programmers. In order they are: assignment and sequence,
recursion/iteration and concurrency".

[1]
[http://www.eis.mdx.ac.uk/research/PhDArea/saeed/paper1.pdf](http://www.eis.mdx.ac.uk/research/PhDArea/saeed/paper1.pdf)

------
xarien
You're essentially advocating a bottom-up approach as opposed to a top-down
approach (albeit in a pretty confusing manner). Both work for different people
and circumstances. Personally, I think C is a bit high-level for a true
bottom-up approach, which is more in line with a computer engineering
education than with a CS program at a liberal arts college. A good approach is
one I experienced while attending a tier 1 engineering school over a decade
ago: writing assembly for a simple simulated CPU with a small instruction set
(RISC or CISC doesn't really matter, although IMO RISC is far easier for less
experienced individuals to pick up).

------
yoha
It really depends on what you're aiming for.

\- For pure mathematical stuff, graphs and such, you can skip programming.

\- Making algorithms/protocols/etc. with simple models requires understanding
what a computer can do, so learn at least Python.

\- For making new architectures or efficient functions (like for crypto), go
to C.

I am suggesting Python because it is probably the simplest programming
language to learn.

I'm not saying that learning Python when you do purely theoretical stuff is
useless, just that you can skip it and still work. Even if you do no CS-
related things, programming is useful.

tl;dr: go for Python, unless you need to know the specific workings of a
computer

~~~
betterunix
"I am suggesting Python, because it is probably the simplest programming
language to learn."

[https://en.wikipedia.org/wiki/Scheme_%28programming_language...](https://en.wikipedia.org/wiki/Scheme_%28programming_language%29)

(Otherwise I agree, start with something like this to learn CS)

------
bmoresbest55
I am a recent CS graduate. I did a year of C after my first year in Java, then
finished the final two years with courses in areas of interest and software
development. I was able to do all this at a single school, and I feel I got a
full education in terms of covering a large area of study and understanding
coding as well as the science behind it. I do, however, still feel overwhelmed
at times and am constantly learning: Lynda for language basics, because
languages are cool, and Two Scoops, because Python and Django are awesome.

------
garrettdreyfus
I learned to program in Python, and later in front-end web technologies such
as JS, CSS, and HTML. I have recently started to learn about operating
systems, with the eventual goal of contributing to the Linux kernel. Before
now I had never touched C, and I ran into very similar problems. My first C
program forked a child; the child initialized a shared memory object, wrote a
Collatz sequence to it, and exited. Then the parent read and printed from the
memory object. It took over two hours because C was so alien to me.

------
icedchai
I'm biased, since I taught myself C when I was a teenager. My first language
was BASIC (first on the Apple II, then the Amiga.) When I got to high school,
they were teaching Pascal on old 8086's. It felt like such a step backwards.

This was back in the early 90's. When I got to college, I was able to opt out
of the first couple of intro CS classes (and get credit for them) by taking a
test or two.

------
mgmeyers
Does anyone know of any good self-study resources (or online courses) on this
subject?

~~~
dobbsbob
The C Programming Language by Brian W. Kernighan and Dennis M. Ritchie
(Kernighan has two other good programming books too), and/or look around
YouTube for tutorials on C. MIT has a course online; the lecture notes were
good enough for me, plus the K&R book:
[http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-087-practical-programming-in-c-january-iap-2010/lecture-notes/](http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-087-practical-programming-in-c-january-iap-2010/lecture-notes/)

This is a good course about architecture
[http://www.nand2tetris.org/course.php](http://www.nand2tetris.org/course.php)

~~~
dmunoz
> look around youtube for tutorials on C

This is likely to be horrible advice. I don't doubt that there are good C
tutorials on YouTube, but there are probably also hundreds of extremely poor
tutorials that are seductive and even highly rated, because the people rating
them don't know any better.

K&R C is excellent. I don't like overbearing advice, but "Every serious
programmer should read K&R C" isn't bad as far as such advice goes.

I don't particularly like the "Learn X The Hard Way" series (maybe it is the
right way, but I find it too painful), but there is Learn C The Hard Way [0].

[0]
[http://c.learncodethehardway.org/book/](http://c.learncodethehardway.org/book/)

------
maaarghk
Hey, it sounds like if you write a book, I'll buy it ;P

------
niio
ignore this, using it to "save" article.

------
blahbl4hblahtoo
Lots of people confuse learning to code in C with learning how hardware works.
It's close in some ways, but not all.

EDIT: Wow. I'm amazed at how many people think that C is "how memory works."
Just wow. This is one of those things where you're confusing the map for the
territory and ascribing value to something because it's "hard." C is a fairly
high-level abstraction for interacting with a computer. I know that everyone
has heard that C is close to being assembler, but what you are forgetting is
that assembler is also an abstraction over the hardware. The reason I think
it's important to point this out is that malloc and free aren't magical. Other
techniques are just as valid, and the notion of "low level" is misleading. C
makes tradeoffs that actually give rise to all of the vulnerabilities and
stability problems in all the software that you have ever used. Those
tradeoffs are REAL, and because lots of you buy into this macho nonsense about
not using safe collection types, everyone on the planet has to deal with
malware. So many of you buy your own bullshit at an astonishing level... it's
really breathtaking to see how many of you can't see the built-in assumptions
in what you are saying.

~~~
smorrow
To elaborate on this...

malloc and free aren't even a part of C itself. Other than static (globals and
the "static" keyword) and automatic (stack) allocation, there isn't any memory
management in C itself. malloc and free are library functions wrapping system
calls; that's something to do with the operating system. The machine itself
has NOTHING like malloc and free.

I and many people know C but know fuck all about assembly programming (for
now).

~~~
spc476
malloc() and free() (not to mention realloc() and calloc()) are part of the
Standard C Library, and _must_ be provided for a C compiler to be certified as
ANSI compliant. Granted, you don't have to use any Standard C Library
functions, but a C compiler _must_ have them (or else, it can't claim ANSI
compliance).

~~~
smorrow
I knew they were part of the standard library and the specification -- I mean,
even the stdio stuff is, right? -- I had no idea the _compiler_ had any
involvement in that, though.

I know gcc provides them for you if you don't specify the right includes, but
that's gcc, like.

I think I'm going to go on considering them not technically part of C itself,
until I have more information.

Thanks.

~~~
spc476
(I'm not sure if you'll see this or not, but it's worth a shot)

Like I said, before a compiler can be certified ANSI C, it _must_ provide the
Standard C Library (for whatever platform the compiler is for). As a
programmer, you don't have to use the supplied functions, but they are there,
and they do comprise a part of the C Standard. And because the functions are
defined, the compiler writer can do some pretty neat things.

For instance, include string.h, and that informs the compiler you want to use
the string and memory functions defined by the C Standard. The compiler can
then generate code directly for, say, memcpy() instead of generating a call to
said function (it can inline it, even though C89 made no standard method of
inlining a function). Or the function sin() if you include math.h (some CPUs
support a single instruction for the sin function). Don't include string.h,
and well ... what happens is up to the compiler. Most just give a warning
about an undefined function, assume it's defined as "int unknownfunction()"
and leave it up to the link phase to either find it or not.

Yes, there is a distinction between the C language, and the C library, but
_both_ must be provided if a C compiler wants to conform to the C Standard.

~~~
smorrow
That is quite informative, and also quite bizarre (for C). TIL.

------
michaelochurch
Learn C _and_ learn computer science.

Let's get away from subjective "shoulds" and talk about practical stuff. You
want to learn things, as much as possible, that (a) are likely to remain
useful in 15 years and (b) are going to qualify you for the highest quality of
jobs.

C, as a language, passes this test. So do the fundamentals of computer
science; they may not bring you quantity in job opportunities, but they'll
give you access to _quality_. Aside from that? Well, it's hard to predict the
future. I can't make individual calls. Who would have thought JavaScript would
be so popular in 2014? The whole language is a hack. But if you learned it in
2000, you still have a high-demand skill set.

Learning one thing or one small set of things won't do, though. You need to
future-proof yourself with a wider knowledge portfolio, and very few jobs will
teach a person to do that.

On the other hand, if you learn a lot of useless and parochial crap that won't
generalize (i.e. the quirks of your own corporate codebase) you get to a semi-
depressed state of "learnout" (you struggle to assimilate new things, because
you've filled your brain with unrewarding crap) and that's no good either.

~~~
dclara
"The whole language is a hack."

Interestingly enough, it's a must for web development even though it lacks
structure from a CS point of view.

How about other scripting languages, such as Python or Ruby?

~~~
ForHackernews
Python is growing in importance in scientific computing, displacing languages
like R, MATLAB, and Fortran (yes, Fortran). There's a pretty good chance it
will still be in wide use in science in 10-15 years.

Ruby is harder to predict. It seems like a lot of the web developers who would
have used Rails a few years ago are now opting for server-side JavaScript,
based on Node.js.

~~~
dclara
Thank you. But do you mean Julia will replace R, Matlab, and Fortran?

~~~
emkemp
I'll chime in and agree that Python is challenging R, Matlab, and Fortran
thanks to some powerful third-party libraries. There is a LOT you can do with
NumPy/SciPy[1], and Matplotlib[2].

[1] [http://www.scipy.org](http://www.scipy.org) [2]
[http://matplotlib.org](http://matplotlib.org)

