
Learn C The Hard Way - llambda
http://c.learncodethehardway.org/
======
kevinalexbrown
I recently started working through Learn C the Hard Way, and after doing a few
chapters I wrote down what I liked about it in a notebook. Digging it up, here
it is:

"Why I like Learn C the Hard Way:

- Opinionated. I think opinionated textbooks are great because they limit
their scope and focus on something. Rather than being an authoritative
reference (who uses text references anymore?), it's a framework for learning.

- Emphasizes reading and _editing_ which contributes to overall understanding

- It's lean. Goes with opinionated, but it's nice that it doesn't repeat
what's been done, but sends you there directly, i.e. don't waste time writing
about "strcmp", send me to "man 3 strcmp" since I need to get familiar with it
anyway."

Disclaimer: I'm a bit new to low-level coding, so feel free to point out why
these reasons for liking the text might be naive.

~~~
oblique63
I too just started reading this one.

For me, my first foray into 'low-level' programming was during my freshman
year in a CS101 course that used C++. That first course on its own wasn't too
bad, but the subsequent 'introductory' courses for more C++ and Data
Structures + Algorithms were just terrible. I couldn't absorb anything useful
from them because I was just overwhelmed with the constant bombardment of
Segmentation Faults everywhere. Poor teachers plus the high learning overhead
of a low-level language made me want to stray as far away
as possible from ever using a 'C-language' again. If it wasn't for me
discovering Python around the same time, I probably would've just given up and
pursued music or audio engineering instead.

Fast forward to last year, having now learned a ton more languages, I decided
to go back and start reading K&R. I didn't manage to get through a whole lot,
but from what I read, I liked it. The problem was that I didn't really feel
motivated to keep plowing through it. Looking through the snippets of code in
it, more and more I began wondering if some things were just outdated
practices, or if they were in fact crazy idioms that I would have to get used
to when dealing with C. That doubt (as silly as it might sound) did put me off
a bit. The explanations in the text were top notch, but again, the code just
didn't compel me to type it in like I knew I should.

Now I started reading LCTHW just a few days ago, and I have to say, it is
quite awesome. It's a lot more minimal and sparse than I'm used to for a book,
but I have been seriously surprised at how effective that method is. At first
I was a bit annoyed that a lot of parts were basically saying 'just google
it', because I look to books to present me with a more coherent explanation of
things than I can find scattered around the web or in arcane documentation.
But as I went on with it, I noticed that the things he makes you look up on
your own are generally specific enough that it makes the task of filtering out
all the nonsense much easier than if you were researching it on your own. I
quickly got into the groove of the book, because it just builds your
confidence that you can be pretty self-sufficient in a new environment, even
at the early stages of your learning. Combine that with his 'code first,
explain later' format, and that book is surprisingly motivating. In just 2
days, I went through the first 17 chapters (currently stuck on the extra
credit for the Database exercise), and I just want to keep on going. I think
this is the first time I've ever experienced this level of enjoyment and drive
from a programming tutorial. I have no idea why this is, but it's damn
impressive. I can't wait to get to the K&R portion at the end.

Thanks Zed!

~~~
zedshaw
> I noticed that the things he makes you look up on your own are generally
> specific enough that it makes the task of filtering out all the nonsense
> much easier than if you were researching it on your own.

That's my trick. I actually go googling and make sure that it's something you
can find easily with a little nudge. Part of the goal of my books is to teach
basic research skills so you can survive on your own. Glad you got that.

And, you're welcome. I'm still working on it, but feel free to fill out
comments with problems you hit.

------
Locke1689
Hmm, I've always viewed K&R as kind of a paragon of C perfection -- I first
learned C from K&R when I was 16 or so and have used it as a trusty companion
to systems C programming all the way through grad school. Not only do I view
it as a good book to learn C, I also view it as one of the best computer
science textbooks ever written, partially because it manages to encompass so
much of the C and UNIX style along with the programming language itself.

So I guess my question is: why not K&R? Learn Python and Learn Ruby always
made sense to me because there's a serious lack of definitive texts on
learning those languages, especially if you've never programmed before.
However, in my opinion C is not the best first language and there already
exists a fairly definitive text on it. So I would love it if someone could let
me know what I'm missing.

~~~
nikuda
Your question is addressed by the author at the end of the book:

http://c.learncodethehardway.org/book/learn-c-the-hard-waych55.html

~~~
pantaloons
He brings no pedagogical issues to bear; it's simply a facile critique of
"style". I don't think that answers the question of why not K&R at all.

Some may consider the points well taken; not surprisingly, K&R had the
foresight to respond in kind, two decades earlier:

 _Our aim is to show the essential elements of the language in real programs,
but without getting bogged down in details, rules, and exceptions._

~~~
zedshaw
Wrong. I make a clear example of the copy() function being broken, give a
demonstration of fuzzing it to break it, and show how to do it yourself. And
if you think the copy() function is valid, then you also think the strcpy()
function is valid, and therefore you don't know what you're talking about.
Everyone who is aware of secure C coding knows strcpy() is buggy and the cause
of most buffer overflows.

~~~
angersock
Dude, stop saying it's broken.

For higher-level languages where we have intelligent string objects, yeah,
bounds checking is assumed, but this is something that originated in
assembly-level code. If you call it with broken memory, of course it won't
work correctly.

You're doing a good job spreading knowledge--don't spread misinformation.

 _Everyone who is aware of secure C coding knows strcpy() is buggy and the
cause of most buffer overflows._

If strcpy() were truly buggy and unpredictable in its implementation, it
wouldn't be nearly so useful as an attack vector. Be accurate: strcpy() is
_unsafe_, not _buggy_. Sheesh.

~~~
zedshaw
No, it's defective and buggy. You can't prove logically that the while loop
inside will end given any random input. An implementation with a length will
end given any input. _That_ is a bug. If you wrote code like that then I'd
flag it as a bug, so how is it that strcpy is somehow reclassified as "unsafe"
but yeah totally not a bug?

It's so poorly designed that it should be considered wrong, buggy, defective,
unsafe, and removed. To go around saying "it's totally alright" when even
_Microsoft_, bringer of more viruses than a whorehouse, has deprecated it:

http://msdn.microsoft.com/en-us/library/kk6xf663%28v=vs.80%29.aspx

is idiotic thinking. Go ahead and come up with all the distinctive
classifications you want: it's a defect to use strcpy because it's buggy, and
copy() in that code is the same.

~~~
judofyr
> You can't prove logically that the while loop inside will end given _any
> random input_.

C is all about layers, and at the bottom you just assume that "all input given
to this function is safe". If you're passing random input (without any error
checking) to pretty much _any_ of the built-in functions, You're Doing
Something Wrong.

Whether you call strcpy/copy _buggy_ or _unsafe_ doesn't really matter. The
implementation on all platforms pretty much follows the standard, with its
well-known issues. Sometimes it's the right tool for the job; sometimes not.

It's also important to remember that strcpy_s doesn't just magically solve all
your problems. Someone might think that this code is safe:

    
        strcpy_s(dst, 10, src);
    

But if dst is shorter than 10, you'll have a problem.

Going from ASCIIZ to length-encoded strings isn't something you can just do in
the middle of a function; it requires changes all over the place. K&R was
written with ASCIIZ and low-level programming in mind. There's nothing
inherently _wrong_ about this; it has both advantages and disadvantages. Your
book is written with length-encoded strings in mind (which I think is the best
way to teach today).

I love the concept of this chapter, and I pretty much agree with your
criticisms of strcpy/copy, but suddenly you go from "this code has issues" to
"this code is bad, bad, bad; never use this code; only silly people would
write this code" and so on (at least that's how I feel when I'm reading it). I
think you should place more emphasis on "this code was written to only work
with certain input; look how badly it performs under fuzz-testing; see how
easily we can fix it!".

~~~
jballanc
I'm going to have to side with Zed here. If I'm teaching an introductory class
on Chemistry, you'd better believe that when I reach the section on cyanide
I'm going to tell students: "bad, bad, bad; never use this chemical"! If those
introductory students were to take a more advanced class, then I would
probably tell them: "well, ok, cyanide isn't going to kill you instantly and
is actually really useful for a wide range of applications".

Part of being a good teacher is recognizing that there are limits to how much
you can expect a student to learn at a given level, and then making sure their
knowledge is as "complete" as possible within those limits.

~~~
judofyr
> If I'm teaching an introductory class on Chemistry, you'd better believe
> that when I reach the section on cyanide I'm going to tell students: "bad,
> bad, bad; never use this chemical"!

I agree. So would I. However, what Zed is doing in the last chapter is showing
code written by other people. If you taught your students about an experiment
done by other (widely regarded) researchers, would you say "bad, bad, bad;
they should never have used these chemicals"?

I would say: "See, they used it here, but only because they were very, very,
very careful. Let's explore different ways this breaks down. … As you can see,
I'd recommend you to _not_ do what these people did."

------
drucken
_Caveat: only scanned the structure and read the final chapter._

Regarding the final chapter: K&R, like most programming books (especially
those for a new language), shows pedagogical code, not code for production
use. I highly doubt it was ever intended as the equivalent of the modern-day
"Code Complete"!

The value of LCTHW, apart from the intro and availability, is its unusual
dissecting/analytical approach, which I welcome and am grateful for, and for
this reason I will read it in its entirety.

------
babarock
Wasn't this same book on the front page of HN just a few weeks ago?

It just seems so random to see it appear again today. Was there a special
release announcement that I'm not seeing?

~~~
farslan
The page that was on the front page a few weeks ago was just a response to
K&R's C book from the author himself.

------
bretthopper
This should be titled: Learn C The Hard Way

It's for the C version, not the top level domain.

edit: title was fixed. Thanks!

------
o2sd
OK. NOT TROLLING, but can I ask why anyone would want to learn C, other than
to develop device drivers, kernel modules or other arcane software that is yet
to be replaced by C++?

I asked this question of a younger programmer the other day, because it seemed
to me, that to HIM, learning C was a rite of passage, and that he was less of
a man for not knowing it.

I find macho philosophies in software development both amusing and
counter-productive. (If you really want to be macho, become a Lisp hacker.)
My amusement may be personal, but the counter-productivity of using C, when
better abstractions (i.e. programming languages) already exist, is real. It
creates fiefdoms and priesthoods that are counter-evolutionary and hard to
maintain, and leads to the death of much software.

Personally, I would rather std::string be pored over by many eyeballs and
evolved than change strcpy to _strcpy or strlcpy. Unless I am working for NASA
on an embedded device for a satellite, I would rather Moore's law or SMP or
DSMP give me the speed I need, than give up productivity to squeeze every last
CPU cycle. Developer time is a lot more expensive than hardware (except on
satellites and space stations).

Apologies if it is your ambition to work for NASA on embedded devices in space
stations, I just think you may as well learn C++. You get most of C, plus some
really useful and productive abstractions as well.

If you want speed, learn inter-process communication and the principles of
symmetric multi processing. With C++, you also get to abstract away the
problems of strcpy and strlen, replace Byzantine function references with
class methods, and 35 parameter functions with polymorphism. Best of all, most
of the programming world will still think you are manly if you know C++, so
you get that too.

~~~
Synaesthesia
Objective-C is a strict superset of C, so learning C helps a lot for any iOS
or Mac programming.

C++ is also heavily based on C, so knowing C helps when learning C++.

I agree C is hardly used anymore, and for good reason, but it's still
interesting to learn.

~~~
masklinn
> I agree C is hardly used anymore

Any time you want to provide libraries, you'll likely use C: every language
has a C FFI, and it's not possible to have a C++ FFI. So you'd have to rely on
`extern "C"`, and then build a bunch of stuff over your OO code so it can be
used procedurally.

Often it's not worth it. C is the lowest common denominator of languages; if
you want to be accessible to all languages... you'll probably use C.

~~~
o2sd
> Often not worth it, C is the lowest common denominator of languages, if you
> want to be accessible to all languages... you'll probably use C.

Personally I see that not so much as a feature of C, but as a limitation of
other languages. Interoperability is a Hard Problem(tm) which deserves more
thought and effort than it currently receives. It's easy to say 'If you want
interoperability, use C' because that's the current state of affairs. It would
be better if interop was a solved problem.

~~~
masklinn
You need a common language to interoperate. Currently, C is that common
language: everybody understands C.

Some languages have built language-specific interop (e.g. Erlang with Java);
it's usually broken and often not even as good as going through C.

> It would be better if interop was a solved problem.

It is: go through C.

