
Massacring C Pointers - signa11
https://wozniak.ca/blog/2018/06/25/Massacring-C-Pointers/index.html
======
patrickyeon
A few years ago, I saw a classroom video (I would guess ~8-10 year olds, in
the American public school system) as a demonstration of a teacher's technique
called "my favourite wrong answer". She would have them solve a problem (a
math problem, in this particular case), collect the answers, show the
distribution of the answers, and then pick out one answer (possibly with the
student's name redacted) as her "favourite wrong answer". And then she would
work through why she liked it so much with the students. In the examples, it
was cases where the person was on the right track, but then made an incorrect,
but justifiable by _some_ standard, step in solving it.

It seemed like a good way to really try to understand where someone's
comprehension broke down. It felt like it added some legitimacy to the
students who got the answer wrong, if others could look at it and go "oh, I
see why you thought that!" instead of just "wow try to get it right next
time". I believe it's part of what good teachers (not limited to school
educators, mind you) are doing all the time: looking for the student's gaps
and trying to correct those, instead of just repeating the lesson that has
already failed to stick.

I guess this just makes me think of that teacher, trying to work out what her
pupils were misunderstanding, by looking at their answers.

~~~
nwmcsween
The issue with really learning computer science outside the top well-known
schools is that you're basically on your own for your education (or worse, if
taught wrong). Open source software teaches you some things, but usually by
trial by fire.

~~~
eesmith
Really? I went to a non-top school and putting aside the formal education
component, I felt like I also had plenty of co-learning. That is, I remember
many study groups where we helped each other learn, including for programming.

I certainly didn't feel like I was on my own.

~~~
faceplanted
Similar experience here. My university helped people form study groups near
final exams and had a "homework club", where final-year and grad students were
paid to help first- and second-year students with their assignments for a
couple of hours a week, and people were encouraged to meet each other in that
club and find "study buddies" (but no-one actually called it that, because
ugh).

Covered everything from the programming to the multivariable calculus and data
science (much harder to find students confident enough to teach others for
those last two, though).

------
phaedrus
When I started my CS degree the school was transitioning from teaching C++ to
teaching Java, and the state of instruction in C++ was almost as bad as these
examples. I had a professor who wanted us to use "new Foo()" everywhere in our
code (even local / static variables) because it, "gets the students ready for
Java." No matching delete, of course, or mention of RAII. We were supposed to
"pretend" we had a garbage collector. By that logic one might prepare students
for a course in Spanish by speaking English with a "-o" on the end of every
word.

On one of the early homework assignments I realized my professor misunderstood
how pointers work - he seemed to believe that (re)assigning pointers created
chains rather than changing what the pointer points at. I.e. given "int x,y;
int_pointer a,b,c; a=b=c=&x;" he seemed to believe that then executing "c=&y;"
would also cause "a" and "b" to point at "y".

I spent the first page of my turned-in assignment excoriating his lack of
understanding. Then I presented a class template that implemented a special
smart pointer which _did_ behave in the unusual way he seemed to think C
pointers work, so that I could write the code exactly as he presented in the
assignment and make it actually work.

In retrospect I could have been nicer and considered other pedagogical factors
beside technical correctness. I think he took it with more grace than I
deserved.

~~~
Improvotter
Some of my professors also have ridiculous code standards or exam standards.
One of them teaches "algorithms and data structures", but he's not allowing us
to even use a break or continue in our loops because he thinks it's "bad
practice". We end up with a nested hellhole. Another example from the same
professor (correct me if I'm wrong on this as I'm not a C++ expert), but he's
constantly inheriting from unique or shared pointers in his classes. Like why
though? Why do you have to complicate your code like this? Why can't you just
have a normal class with unique or shared pointers in it?

Lastly, there's my class on Linux, taught by another professor. It includes
bash and C++ programming and focuses a bit on the POSIX API. Our exam
was last week and we had 3 parts. The first part was a theory exam by the
professor that taught the class. It was about bash/POSIX commands and very
little Linux-specific stuff. He expected us to know all of the options of all
of the commands you can think of: cut, split... You had to know all of them
off the top of your head. It's ridiculous. I'm pretty sure I failed
that one, but I absolutely nailed the next part which was about bash scripting
(where you could use if statements and the like, the first part did not allow
it, only redirection and piping). The man is mad (hah man, that's what I
needed during that first exam part).

~~~
0xffff2
>Another example from the same professor (correct me if I'm wrong on this as
I'm not a C++ expert), but he's constantly inheriting from unique or shared
pointers in his classes. Like why though?

You are not wrong. This completely breaks my brain. I really would love to
hear the professor's explanation for this.

~~~
daemin
The same thing as people inheriting from standard containers. It does not make
sense!

It's like people see inheritance and then forget that containment is a thing.

------
setquk
This doesn't surprise me really. When I was at university the programming
textbooks we had were vile, nasty, and plainly incorrect. And those of us who
dared challenge them by writing correct, robust code were penalised, as the
staff teaching the course didn't properly understand the domain of what they
were teaching, didn't have any real experience, and assumed we were doing it
wrong. We learned quickly to approach education with a high level of
scepticism, to get our info from more than one source (textbook), and to find
out which ones were reputable and which ones were garbage.

Basically there was a monolith in the middle of the course, a crap textbook,
and monkeys were praying to it as the authority on everything.

~~~
phaedrus
Along the lines of taking the authority of a book as a replacement for
critical thinking... I was once hired to rewrite some assembly language
spaghetti code a more-electronic-than-software engineer had worked on for two
years. Ostensibly he was ordered to help me accomplish this, but he was more
like what in court would be described as a "hostile witness".

I spent a couple of months "hacking", just familiarizing myself with the
microcontroller and its instruction set, and the tooling. (In fact, I had no
source code to look at during this time, as he would not surrender it!) When I
finally got his source, I rewrote it to a 100% functional equivalent in two
weeks, 1/5 the size without all the unnecessary control path duplication and
without pointless register moves. At the time I found the way he had used
register moves particularly puzzling because it resembled the way an un-
optimized compiler might work.

Months later after I was no longer on that project, I got a call from him. He
was very flustered and wanted to know where I "got" a particular sequence of
instructions from. I was like, come again? He said, "It's not in the book. You
used a sequence of instructions that's not in the Book." (The Microchip
programming manual.) He asked about another block of 3 or 4 instructions -
also not in the book (in the combination I used them in).

Slowly it dawned on me - I'm quite certain he didn't understand what any
individual instruction "did". That whole level of abstraction didn't
exist for him. He programmed in assembly, yes, but only using blocks of
example instructions from the Book. Suddenly the pointless register moves made
sense - he was acting as a human compiler, without an optimization step.

Years later I realized I should have asked him, "and how do you think the
example code in the book was written?"

~~~
daemin
That seriously sounds like someone that has only learnt through rote
memorisation. Absolutely great at taking tests and doing things by the book
but completely unable to apply the things they should have learned to a
problem that falls slightly outside the examples.

Great story though, even though it makes me shudder.

------
waffle_ss
I was going to ask if any attempt was made to reach the book author for
comment, but it looks like he died in 2007. RIP.

[https://www.findagrave.com/memorial/29007415/robert-
joseph-t...](https://www.findagrave.com/memorial/29007415/robert-joseph-
traister)

~~~
delhanty
It appears that someone having read your comment here has left a horribly
abusive comment on that memorial page now under the handle "Screw You Bob".

> ... The Hacker News army is here.

"Screw You Bob" you do not speak for me. Hopefully, you don't speak for many
people on HN.

Words fail me. Can't we let the dead rest in peace?

~~~
gambiting
I'm surprised there is no way to report one of those on the site.

~~~
butthole2
There is a feedback form in the lower right and the email address.

------
kosma
The fixed-location variable allocation strategy the author mentions is called
overlaying or compile-time stack[0]. It's still very much alive today, thanks
to architectures like 8051 that are not really stack-friendly (even though
they do have a stack).

> _The Keil C51 C Compiler works with the LX51 Linker to store function
> arguments and local variables in fixed memory locations using well-defined
> names_

[0]
[http://www.keil.com/support/man/docs/bl51/bl51_overlaying.ht...](http://www.keil.com/support/man/docs/bl51/bl51_overlaying.htm)

~~~
cptnapalm
Is there anything I can read to learn more about overlaying in C? I was
reading the 2.11 BSD source, to the limits of my abilities, and there were a
lot of references to overlaying.

~~~
basementcat
On a modern POSIX environment, you can hack together an approximation of an
overlay by loading some code to an area of memory, mprotect( ptr, len,
PROT_EXEC) the segment, ((int(*)())ptr)() to call the function and if you did
everything right, you don't segfault too hard. Later, you can overwrite that
area of memory with other code and repeat.

This is similar to what actually goes on under the hood of the dynamic loader.
[http://tldp.org/HOWTO/Program-Library-HOWTO/dl-
libraries.htm...](http://tldp.org/HOWTO/Program-Library-HOWTO/dl-
libraries.html)

~~~
burfog
That is a different sort of overlay, a code overlay.

The other example was data, with the compiler assisting by changing local
variables to have fixed addresses that get carefully reused for different
variables at different times.

------
the_lurker
Yesterday I encountered a program similar to the one shown in this link on an
HN comment chain. I am genuinely confused as to why this program is bad. I am
a student and I do not know the best practices regarding pointers, but it is
how I would write a program to combine two strings.

Can someone please elaborate on why it is bad? Are there any good resources to
fill gaps in my knowledge?

Thanks in advance.

Edit: Thank you guys for pointing out so many problems. It seems that I have a
lot to learn. :)

~~~
int0x80
It is wrong in many ways.

It copies s and then t to a fixed size buffer, without any checks. That will
write to invalid memory (probably smashing the stack) if len(s), len(t) or
len(s) + len(t) > 100.

It returns a pointer to a stack-allocated buffer (r) to the caller. The array
will be invalid when the function returns, as automatic variables only live in
the function's scope (during the call); they are deallocated when the function
returns.

To do this right you have various strategies.

1\. Allocate a buffer of len(s) + len(t) + 1 with malloc, copy the strings and
return it. Have the caller free it when it's done. This can be inappropriate
because of the dynamic allocation.

2\. Have the caller pass a destination buffer and its size. If you know the
char *'s are zero-terminated, check that you have space for them in the dest
buffer. If not, truncate or error out. Most of the time, this is the preferred
solution.

3\. Use a static local buffer, and return it to the caller. You may need to
truncate the copy too. Not recommended: the function will also not be
reentrant (unsafe with multiple threads).

You can use libc functions like strncpy (C89+), snprintf (C99+) etc to make a
"size-checked" copy with various automatic truncation semantics. You can refer
to their man pages for details.

Edit: to the downvoters, please point out what is wrong in the comment.

~~~
simias
You're right, although any mention of strncpy should come with a big
disclaimer that it's effectively broken, because you can end up with a
non-null-terminated string in some cases. strlcpy should be the way to go, but
unfortunately it's not part of the C standard and not available everywhere
(sometimes for rather bullshit reasons IMO, but that's a different story).

~~~
jmts
Documented behaviour is a little different to "effectively broken". The
difference between strncpy and strlcpy is that strlcpy will always
NUL-terminate the last byte for you. There is nothing stopping you from doing
the same thing yourself when you use strncpy. If you care enough, write your
own strlcpy - it's only one extra line.

~~~
simias
I'm not saying it's hard to work around, but I maintain it's broken. You have
a function that deals with C strings that in some conditions produces
something that's not a C string, can't be trivially distinguished from one,
and will trigger undefined behavior if used like one. It's terrible ergonomics
and almost certainly not what you want in any situation.

You can argue that truncation is an error condition, but then it ought to
notify you somehow, for instance by returning NULL in such a case. And even
then it's inconsistent with snprintf, which doesn't have the same behaviour
and always terminates with '\0' even in case of truncation (assuming a
non-zero buffer length, of course).

It's just an unnecessary footgun that serves no practical purpose. It would be
like a date function that gives you today's date, except on the 4th of
December, when it replies that it's the 31st of February. Not hard to work
around, but still broken.

~~~
caf
It's not broken, but it _is_ misnamed.

This is because it is not intended to work with the same kind of string that
the other str* functions work with (ie. an ordinary null terminated string).

Instead it's supposed to work with fixed-width string fields that pad out
values shorter than the field width with nulls. This is how original UNIX
directory entries were stored.

See how the name is copied into u.u_dbuf here:
[https://github.com/hephaex/unix-v6/blob/daa355109625a50e6b10...](https://github.com/hephaex/unix-v6/blob/daa355109625a50e6b1080184dee30c9136549d1/ken/nami.c#L72)

~~~
simias
I see your point, but at this point I think it's just a matter of taste. I
don't really see how having a function meant to deal with a special case of
character buffers disguised as a general-purpose string manipulation routine
in the stdlib could be considered reasonable. I understand why it's here, I
understand the history, I understand why it made sense at some point to have
such a function, but you won't be able to convince me that it's not broken or
that it shouldn't be deprecated in favor of strlcpy (ditto for
strncat/strlcat). After all, it is in <string.h>, not <fixed-width-string.h>;
it's pretty heinous that it fails at the very low bar of actually producing a
valid C string every time (especially given the very high prejudice of having
rogue unterminated "strings" in a C program).

~~~
caf
It seems fairly unlikely that the C standard would add strlcpy() and strlcat()
when it already has strcpy_s() and strcat_s() in Annex K.

------
gwbas1c
I've definitely bought a few programming books that, on closer inspection,
just appeared to be money-grabs from the author.

But, my biggest memory is the book I didn't buy. I once worked with a
programmer who wasn't very good, and then I heard he wrote a book. I typed his
name into Amazon, and there was his book. It was all about the half-baked
concepts he was trying to put into our failing project. (Ultimately canceled
because we couldn't ship a very simple product. We couldn't ship it because
everyone just wanted to add code generators and additional layers around a
database... Instead of learning how to use a database.)

I couldn't get out of that job fast enough.

------
narag
When I was learning C in the eighties, I bought a book about 3D programming,
the worst programming book I've read. I believe that examples worked, at least
the ones that I typed did, but the style was atrocious. The concept of
function parameters seemed to be totally alien to the author. The idiot
created x1, X1, x2, X3, x, xthis, xthat... variables instead. He was a former
BASIC book author too.

I can't warn you off it because I threw it in the trash long ago.

~~~
stinos
_He was a former BASIC book author too_

Hmm, I'm starting to see a pattern. Is it possible that BASIC, plus the lack
of internet back in the day, plus atrocious books are the reasons for turning
people into terrible programmers? I happen to know only a couple of seniors,
but without exception their code, no matter what language it's written in
today, is horrible on all fronts. I used to think it was a lack of attention
to detail, a lack of wanting to strive for even the tiniest bit more than just
'good enough for today'. Possibly stemming from lack of education and lack of
continuous self-education. But maybe there's more to it. Maybe they were
influenced by a bad book. And/or by a not-so-optimal language like BASIC.

~~~
jerf
There's a sense in which the hardest programming language you'll ever learn is
actually your second one. I think the reason for this is that with just one
language under your belt, you have very little ability to distinguish between
the abstractions the programming language offers you and the capabilities of
the machine, and to distinguish between the abstractions the programming
language offers you and the capabilities of programming itself. So for your
first language, you're learning what is actually just an approximation and
simplification of that first language, where all three of those things are all
mixed together so you don't have to spend the cognitive effort to understand
the differences and you can develop and rely on huge misconceptions without
seeming to pay too large a price, but with your second, you're unlearning
errors about the machine, unlearning errors about programming in general, and
also learning a second language. Particularly difficult if you're making the
leap from something like BASIC to C, where the second language is also
substantially more difficult than the first.

For people of a certain psychological orientation, there is the
additional challenge that having put away your first language, you now think
you are a "Programmer (TM)", and learning that second language and learning
that you have a number of misconceptions can strike at your very identity.
People can get psychologically attached to their misconceptions if it means
retaining the illusion that they have mastery.

Nowadays the easiest way to screw this up is to go to a computer
science/engineering program that uses just one language. As tempting as it may
be from a curriculum simplicity perspective, it's a big mistake. I've
interviewed a number of people who think that Java === computing. Not even the
"JVM", mind you, but Java, the language, itself. I don't blame Java for this,
it's the education. Java itself is not a great lens to understand computer
capabilities through, and it's a miserable language to be your lens to
understand the general capabilities of programming through, especially 10
years ago. (It's slowly getting better, with easy closures and such, but it's
still stuff bolted on the side 20 years later.)

Looking at it from that perspective you can see why 8-bit-era BASIC was even
worse than that. It offers a very impoverished view of the computer's
capabilities _and_ a very impoverished view of the possibilities of computing.
(It was possible to rehabilitate BASIC into at least a passable language; I'm
glad I don't have to use Visual Basic to do my job, but it's still light years
ahead of the BASICs that still used line numbers, and I've done Real Work (TM)
in it, albeit a long time ago.) A 21st-century Java-only programmer is
substantially better equipped than a 20th-century 8-bit-era BASIC-only
programmer.

(By "8-bit-era", I mean the timeframe, not necessarily the CPU. I'm fairly
sure there were BASIC implementations with line numbers and such on non-8-bit-
machines, and they'd still be dangerous. But as computers got into the 16- and
especially the 32-bit era, even BASIC had to grow up.)

~~~
LambdaComplex
> I've interviewed a number of people who think that Java === computing. Not
> even the "JVM", mind you, but Java, the language, itself.

Could you elaborate on this? What exactly made you realize that was how/what
they thought?

~~~
jerf
I had to ponder on what it is that really sets this sort of person apart, and
I think it's the sort of sneering disdain at the idea that any of the other
languages in the world are worth anything, or have any good ideas. Or maybe
it's the way that when you ask them what's good or bad about some other
language, you get back just a list of differences those languages have with
Java, and it is simply assumed that all differences are ways in which they are
inferior to Java.

And let me say again that it's not specifically Java. I've seen a couple of
people that way with C, for instance, though not in an interview situation.

------
bcaa7f3a8bbc
> _Pointers to functions are seen mainly as a way to obfuscate your program.
> "A pointer to a function serves to hide the name and source code of that
> function. Muddying the waters is not normally a purposeful routine in C
> programming, but with the security placed on software these days, there is
> an element of misdirection that seems to be growing." (p. 109)_

> _" GIGO (garbage in, garbage out) is a term coined to describe computer
> output based on erroneous input. The same applies to a human being." (p.
> 152) — ???_

(like the readers of this book?)

Priceless.

------
DanBC
I'd be interested in a review of the author's C++ pointer book _Conquering C++
Pointers_

[https://www.amazon.com/Conquering-Pointers-Robert-J-
Traister...](https://www.amazon.com/Conquering-Pointers-Robert-J-
Traister/dp/0126974209/)

------
bcaa7f3a8bbc
> _" Both programs also contain another value of 43. This is the constant that
> was written directly into the program." (p. 29) — I have no idea what this
> means._

> _I believe that the author thinks that integer constants are stored
> somewhere in memory. The reason I think this is that earlier there was a
> strange thing about a "constant being written directly into the program."
> Later on page 44 there is talk about string constants and "setting aside
> memory for constants." I'm wondering now…_

Yes, most of the book is wrong, and in this example the author probably also
presented the idea in a wrong way.

But the author is right to have the ideas that a _" constant [is] being
written directly into the program"_ (by the compiler!) and that _" integer
constants are stored somewhere in memory"_; those make perfect sense. Of
course the integer constants and string constants are all allocated and stored
somewhere in memory (or somewhere that can be mapped as memory). They are
usually known as the text segment and data segment.

> _…(remember, the array name becomes a pointer when used without the
> subscripting brackets) "_

> _" …while a pointer, as always, is a special variable that holds the address
> of a memory location." (p. 57) — Still wrong, but slightly less wrong._

Good enough IMHO. It is true that an array "name" is a pointer to its base
address.

~~~
caf
_It is true that an array "name" is a pointer to its base address..._

No, it's not. It _is_ true that an expression of array type, when it is not
the subject of either the unary-& or sizeof operators, _evaluates to_ a
pointer to the array's first element.

    
    
    sizeof array

gives the size of the whole array, not the size of a pointer.

    
    
    &array

gives the address of the whole array, not the address of a pointer.

~~~
bcaa7f3a8bbc
I've been programming under the useful but wrong assumption that

> array[x] and *(ptr+x) are completely equivalent, so array and ptr are
> equivalent.

until now. Thanks for the clarification.

~~~
8xde0wcNwpslOw
Strictly speaking, the initial part of your assumption is not wrong, but the
later conclusion is.

array[x] and *(array+x) are indeed equivalent for any identifiers 'array' and
'x' (assuming one of those evaluates as a pointer value, and the other as an
integer value; otherwise the code is incorrect). In fact, in this context an
actual array is not subject to either unary-& or sizeof operators, so it
evaluates to a pointer value, fulfilling the precondition.

This is why "array subscripting" also works directly with pointers (i.e.
"ptr[x]"), and from the equivalence above follows one of the common useless
facts: you can swap the identifiers (i.e. "x[array]").

(This comment is probably confusing enough without saying that "(&array)[x]"
is valid code too, but isn't the same thing as those before.)

------
ensiferum
Heh, not only this but there are also plenty of C++ books that are literally
less than worthless.

Additionally another common domain of clueless writing is computer graphics
and the related math. There are so many articles written by enthusiastic
people (no doubt) where the information is just adding noise. Finding
trustworthy good quality information requires that you know a considerable
amount already so you know what is good and what is not (talking about online
content here) :)

~~~
flukus
Didn't some of the C++ books have to be crappy in some ways due to arbitrary
limitations of free (as in beer) compilers? One I vaguely remember imposed a
ridiculously (even for the time) small stack size, so a lot of examples had to
do unnecessary heap allocation just so readers could compile them.

I think it was this book ([https://www.amazon.com/Flights-Fantasy-Programming-
Video-Games/dp/1878739182/](https://www.amazon.com/Flights-Fantasy-Programming-
Video-Games/dp/1878739182/)) that taught me 3D
programming better than anything else. The code was readable, the maths was
well explained and it included sections on how to do things without those
newfangled maths co-processors. I'd love to buy a copy now just to see if it
really was a good book or if it led me astray.

------
bjoli
I doubt books like these are as common today, but tutorials are everywhere. I
don't know how often I have found scheme tutorials that teach a language I
barely understand. Not dangerous maybe, but very weird nonetheless.

I see beginners writing code like that all the time, which makes me sad.

~~~
buckminster
Yeah, a few months (a year?) ago I was getting back into Javascript, so I went
to the Mozilla website [1] to refresh my knowledge of prototypal inheritance.
And it seemed all wrong. So I actually ran their short bits of example code in
Firefox, and they worked like I expected them to. Their documentation is just
nonsense.

There hasn't been a lot of activity on it recently so it's probably still
wrong.

[1]: [https://developer.mozilla.org/en-
US/docs/Web/JavaScript/Guid...](https://developer.mozilla.org/en-
US/docs/Web/JavaScript/Guide/Details_of_the_Object_Model)

------
gumby
It's books like this that lead to inventions like Java and Go, with explicit
aims of avoiding certain difficult constructs.

Not that pointers are particularly difficult in the scheme of things, but if
someone who doesn't understand them tries to teach them, pointers inevitably
_do_ become "difficult"

------
DanielBMarkham
_"...But like analyzing a terrible movie that somehow gets made, it's more fun
to reason through the “behind the scenes” parts..."_

I do too. Many times I find it much more interesting understanding how the
sausage is made than what the actual sausage tastes like.

This is a great meta review. As somebody who's just published a book for tech
teams, I need to be acutely aware of my own work to make sure I'm not falling
down the same hole: taking a little bit of knowledge and trying to fluff
it out to appear to be a comprehensive body of work.

It's not just the 80s and old coding books. There has been quite a trend over
the last decade or so of people over-publishing (I guess that's the term):
promising all sorts of things while delivering little.

In some ways I think this is okay. The received wisdom is that you don't have
to know everything, you just have to know more than the reader and be able to
explain how to move them a bit forward. Perhaps the key is attitude. Woz says
_"...I've always found the authors come from a position of earnestness,
attempting to draw the best conclusions based on decent principles and what
they knew at the time they wrote it..."_

A little humility, careful scoping, and honesty in a tech book can go a long
ways. (Insert long discussion here about whether people would buy such a book,
and how people are much more naturally attracted to books with a strong
emotional impact "Make money with C Now!" than they are books that simply try
to helpfully explain something without all the glitz) There is a natural
tension at work here.

------
nemanjaboric
It seems that the author followed the same logic as the author of the "English
As She Is Spoke"[1] phrase book between Portuguese and English, with the
French dictionary in between.

[1][https://en.wikipedia.org/wiki/English_As_She_Is_Spoke](https://en.wikipedia.org/wiki/English_As_She_Is_Spoke)

------
ZephyrP
I always wondered about the C tutorials Brian Kernighan mentions in his talk
([https://www.youtube.com/watch?v=8SUkrR7ZfTA](https://www.youtube.com/watch?v=8SUkrR7ZfTA));
many of the examples seem intentionally designed to be incorrect by some
trickster spirit.

Now I know the even darker truth.

~~~
leni536
The exact same code sample appears in the talk as in the article.

~~~
jwilk
Yes, the article mentions the sample was taken from the talk.

------
apo
This was a fun read, but left me with the question: What book on C pointers
would be the polar opposite of this one?

I'd like to read that book.

~~~
SloopJon
While I haven't read either of them, _Pointers on C_ and _Understanding and
Using C Pointers_ both have good reviews.

------
oldcynic
Make sure to click the link at the end of the article for code samples with
the potential for "Segmentation fault (core dumped)" in 4 lines.

Even more concerning is the book seems to have some positive reviews on
Amazon(!), and just one shredding it.

~~~
saagarjha
The example given at the top of the article can cause a segmentation fault.
See the other comments here for how.

~~~
oldcynic
Returning out-of-bounds pointers will do that. I was more impressed by the
range of errors in his 4-line examples - the sort of thing you'd expect from a
struggling student, not a tutor or author. :)

------
krylon
This reminds me of the first textbook I tried to learn C++ from. It was fun to
read, because the authors seemed to put more energy into writing limericks
than into explaining object-oriented design. I don't think it touched on
inheritance, and things like, I don't know, inline functions or templates were
not mentioned at all.

It did not even contain working example programs, let alone exercises. It was
so bad, as the saying goes, it was not even wrong. I still have that book on
my shelf as a reminder to not blindly buy the first/cheapest textbook I can
find.

------
__david__
Oh wow, I have the "Going From BASIC to C" book! I remember reading it first
because I knew Apple BASIC really well. I also had K&R. Looking back, I wonder
if learning C was made harder or easier by that book…

------
userbinator
I've found that it's usually those with embedded-systems experience who are
most knowledgeable about pointers, but I suppose BASIC experience with embedded
systems doesn't count --- the ones I'm referring to usually started in Asm/C,
and others I know who started in Asm (not necessarily for embedded) are also
extremely good at pointer use.

The modern equivalent would probably be Arduino experience. I wonder if there
are similar examples in books out there about C++ written by someone with only
that...

------
korethr
I took a look at some of the transcribed code examples. Understand, I consider
myself a novice at C, one who's just starting to get a clue about pointers.
But reading the code examples, more than once I found myself going,
"Wait...what?".

I briefly tried to learn C++ and C in the 90s. I'm somewhat glad I didn't find
this book in the library. I think it would have made attempting to learn
harder, or given me some bad and dangerous habits.

------
martin1975
Apparently Mr Traister has written a number of other books -
[https://www.amazon.com/default/e/B001H6UPHY/](https://www.amazon.com/default/e/B001H6UPHY/)
\- according to Amazon...

Made me wonder if the quality of the content in the other 11 books is anything
like the one the Woz took apart.

~~~
_7c18
It's not Steve Wozniak. They just happen to share the same last name. From
[https://wozniak.ca](https://wozniak.ca):

> I’m Geoff Wozniak, just one of those persons on the Internet. My blog is
> hosted here, but not much else at the moment.

~~~
martin1975
Thank you!

------
ngvrnd
[https://www.findagrave.com/memorial/29007415/robert-joseph-traister](https://www.findagrave.com/memorial/29007415/robert-joseph-traister)

------
g051051
I have this book, a pristine first edition, with maximum wrongness. It's
pristine because I bought it when I started learning C in 1990, and never
opened it after my first read through.

------
microtherion
Usually, that kind of venom in C book reviews is reserved for books written by
Herbert Schildt ;-)

------
azernik
Bad reviews are always so much more fun to read than good ones.

~~~
gumby
To review your comment: "The author's well meaning and accurate comment is
that well-written negative reviews are typically far more fun to read than
positive ones of any quality -- in the positive case you usually want to
simply read the book, while the implication of the comment is that the
pleasure of reading the negative ones is a delicious _Schadenfreude_.

"Regrettably, this insight was obscured by a regrettable poor choice of
terminology ("bad" and "good"). The comments, azernik, has enough HN karma to
suggest that this error ought be assigned to the casual, off-the-cuff nature
of internet commenting, and that this comment is not up to the usual work
(i.e. comments) of the author.

~~~
jwilk
"Regrettably" and "regrettable" in a single sentence, "comments" instead of
"commenter", a missing "to" after "ought", lowercase "internet"; all that in a
single paragraph. The error density in gumby's English is astounding.

------
abakus
WTF is this "for (x=y; ...)" part?

~~~
__david__
That's just standard C. It was not really part of the WTF.

~~~
abakus
but y is uninitialized, am I missing something?

~~~
__david__
Yes. The line right above it:

    y = strlen(r);

------
leafario2
Author of book is a Markov chain?

------
blackrock
The only thing worse than C, is that we got stuck with Java.

------
bitL
C'mon, viewing a book from 1990 through the lens of 2018 is bound to be funny;
almost nothing in programming languages aged well, barring some theoretical
principles. Many of the things the book author tried to avoid (like not using
integer indexes) were necessary for writing fast code on 4 MHz processors with
256 kB of RAM in primitive compilers back in the day...

Can you please add some xBase book reviews to the mix next, for more outdated
fun and facepalming from the lofty vantage point of three decades? /s

~~~
pjc50
No, all of that example was malpractice in the 80s when it was written. Note
the context: it's highlighted as malpractice in a talk by Brian Kernighan!

~~~
bitL
I'll bite: argument from authority can be reversed - if Brian didn't mess up
the language design, these sorts of "code drivels" wouldn't have appeared! I
think you've heard that argument from many language designers already... Then
we can argue that back in those times it was something progressive etc. And we
are straight back with the argument I was making. When you read StackOverflow,
books on Deep Learning or computation on Spark, how much drivel did you go
through already? Even the language authors are sometimes confused about some
unplanned side effects...

~~~
msla
> if Brian didn't mess up the language design, these sorts of "code drivels"
> wouldn't have appeared!

OK, ignoring the s/Brian/Dennis/ snafu... WTF is a "code drivel"? What are you
trying to say here?

Second, your point is irrelevant: It's pointless to write a book about driving
cars and fill it with a long rant about how riding motorcycles is so much
better. It's a _non sequitur_, and false advertising. Much like how this book
is presented as a good resource for learning C and is, in fact, a horrible
example of precisely how little the author understood C.

> Then we can argue that back in those times it was something progressive etc.

Aside from your infelicitous attempt at English, this is wrong: This book was
_never_ good, and claiming it was insults the past.

~~~
bitL
I love intellectual debates, but this really isn't one. I just have the
feeling this is an "intellectual" mob in progress. BTW, it's wonderful to
attack people for their English skills; I speak 8 languages, but not all of
them at C2 level, despite being a grad of one of the top US colleges. Would
you take my sincere apologies for offending you by not being on your level?

~~~
msla
I get the impression that you're using odd phrasings deliberately, to
obfuscate a point too weak to stand on its own, and to avoid responding to the
points of others.

