
Essential C (2003) [pdf] - saadadasd
http://cslibrary.stanford.edu/101/EssentialC.pdf
======
cblum
Man, I miss making a living as a C programmer. My happiest days as a dev were
when my "stack" was Linux, C, Makefiles, and some shell scripts. Only thing
I'd change is that source control was svn instead of git.

Sure, there was a lot more to type. Debugging was harder. But there was a
beauty to that. A simple mental model. A sense of control.

These days all jobs seem to be sort of online crap. Piles and piles of layers
and complexity and heterogeneous frameworks and tools. Being on call. Never
being able to truly master anything because everything changes all the time.

/nostalgia

~~~
freedomben
Oh man, I feel exactly the same my friend. People who have never gotten into
the C world find it frightening, but it's a beautifully simple language loaded
with (at times dangerous) power. It's so much closer to the hardware that you
can't use a declarative approach easily (which now that I've drunk the
functional programming kool-aid, I do love), but in many ways it's actually
much simpler to understand. You can also be pretty declarative if you are just
smart about breaking things into functions.

I still glance longingly at my dusty paper copy of The Linux Programming
Interface ([https://nostarch.com/tlpi](https://nostarch.com/tlpi))

/nostalgia

~~~
tombert
Honestly, if you use one of the available GCs out there (like Boehm's), and
give up on static typing, and heavily rely on function pointers, you _can_
write C similar to how you'd write something like Haskell. Yes, it won't go as
fast as the most idiomatic C, and you can't really make an operating system
if you have a GC, but really, how often do most of us actually write code that
can't use a GC these days? Even with a GC, it'll still probably perform better
than 90% of languages.

~~~
cblum
At that point it's practically not C anymore :)

I'm not a fan of adding GC to C. I've had my fair share of stress caused by
GC issues. It's great 99% of the time but when you run into performance issues
caused by the GC it becomes a very very leaky abstraction.

~~~
kevin_thibedeau
One may as well just use C# without classes.

~~~
n4r9
Pedantically speaking that's impossible. But loosely speaking that's the
essence of how I write most of my C#.

------
tombert
_Potential Flame War Warning_

I've gotten in several arguments with people about why I like C more than C++,
and that's in no small part because I find C to be a lot simpler than C++.
This is an example: I feel you can learn enough C to actually start doing
stuff from a 45-page manual, whereas I wrote C++ for about a year and never
really felt I had a real handle on all the idioms.

I know there are probably a lot of objective reasons why C++ is safer or
something, but I've always felt that if you embrace GCC's extensions and
glibc (and Boehm's GC for the parts that need to be a bit safer), you end
up with a language that's simple to learn, and has a lot of the features I
actually use in other languages.

That said, this is coming from a very-much-not systems programmer, and I
mostly do Lispey stuff nowadays.

~~~
userbinator
_where with C++ I did it for about a year and never really felt I had a real
handle on all the idioms._

With all the Modern C++ changes happening, it seems like the standards
committee is actively making it harder for people who want to understand the
language completely. I much prefer C to C++ for the same reason, although I
think some features from C++ are genuinely useful, like classes.

~~~
tombert
I found that when I wrote C++, I almost exclusively ended up using features
that were already in C; obviously there are no classes in C, but I was happy
enough using structs and functions that take a pointer to that struct type as
the first argument.

~~~
Longhanks
Which is why the functions in <algorithm> and many more STL headers are just
that: templated functions. There's nothing wrong with them.

However, modern C++ enables safer and clearer semantics than what raw pointers
offer, such as references or smart pointers.

------
andystanton
Question for C experts, the article states:

 _C takes the middle road -- variables may be declared within the body of a
function, but they must follow a '{'. More modern languages like Java and C++
allow you to declare variables on any line, which is handy._

I know this is not the case in C11 for example, but is there a compile-time
speed-up when declaring variables in this way, or any other benefit?

~~~
beagle3
Old C compilers would first parse declarations, then allocate space on the
local stack (usually by subtracting the number of bytes needed from the stack
pointer), and go on to compile code, knowing that - from that point on -
there was a well-defined stack structure (and very often in those days, even a
constant offset from the "frame pointer" or "base pointer", which would be
copied from the stack pointer at that point).

Introducing additional variables later means that, if you emit code "as you
go", you'll be less efficient. That makes no difference today, but it was a
big deal in the days C was designed; most compilers back then were single-pass,
emit-as-you-go. There are relatively simple ways to deal with this even under
the single-pass constraint, but with the compiler construction techniques
prevalent at the time, it was considered harder.

There was always an issue of fixups with scope-exiting control transfers like
_break_ and _goto_ - however, those are simpler, and don't harm emit-as-you-go
compilation to the same extent.

~~~
jandrese
On the other hand, one nice thing about C compilers is that they are _fast_. I
always inwardly shudder when I see a C++ file including Boost, because I know
the compiler is going to have to chew on it for several seconds every time
there's a change.

~~~
rleigh
This slowness isn't the result of where you can declare variables though!

------
walshemj
One thing though: I have never pronounced char as car.

~~~
liamcardenas
Me neither. If it's supposed to be an abbreviation of "character", I would
pronounce it "care" (based on my SoCal accent, at least).

I call it "char" because that's how it's spelled, though.

~~~
jwdunne
I've always pronounced it "car". I've never made the connection with my
northern English pronunciation of character, i.e. "caractah" - though the
Mancunian glottal-stop "er" is hard to articulate :)

------
cestith
As much support as has been built into processors and compilers to get better
performance out of C's simple machine model, I have to wonder whether a new
low-level language aimed at exposing things like SIMD, large caches, and SMT
directly to the programmer might catch on.

------
hans0l074
When I used to program in C decades ago, I felt the need to read literature,
write small test programs etc - before launching into the actual problem
solving/solution building exercise. That is, I wanted some sort of internal
"armed with the knowledge now, I can get down to brass tacks..." feeling. But
I see younger devs these days using advanced frameworks as lego blocks and
getting right into "making it work/prototyping" mode, with bits and pieces
from Stackoverflow etc., whereas I find myself hesitating to even start. I'm
jealous.

~~~
romeisendcoming
Yes, and these devs come up with the worst kind of spaghetti dependencies and
nightmare operational environments. This, plus mega-scale deployment, has
brought us to the container/automation/orchestration stage, where no one is
competent to deal with anything beyond bundling code into the one environment
where it worked once and pushing it everywhere.

------
dang
From 2016:
[https://news.ycombinator.com/item?id=11671985](https://news.ycombinator.com/item?id=11671985)

~~~
kensai
I don't understand why HN doesn't recognize the previous link as the same and
give points to that older thread, instead of creating a new one and then
forcing a member to post the old link with the old discussion - and forcing me
to post this silly comment...

Is it designed to work like that or an oversight?

~~~
greenyoda
That previous thread was from 2016. HN Guidelines allow reposting of an
article if it hasn't had significant discussion in the last year:

[https://news.ycombinator.com/newsfaq.html](https://news.ycombinator.com/newsfaq.html)

However, links to the original discussion of an old article might be useful if
people are interested in the topic but no discussion occurs on the current
posting. Or, the previous discussion may have some interesting comments that
are worth revisiting.

------
rmdashrfstar
I’ve been reading through a lot of your comments and you’ve made me want to
review my existing code bases and simplify things! I’ve been that intern who
has had a large legacy C/C++ codebase dropped in my lap, and I would certainly
have appreciated it if the original developers had had your mindset going into
their development. It makes sense to me why we, as developers, should write
code that is performant but readable first - code that allows those in the
future (assuming the project we’re working on succeeds) to easily maintain,
debug, and improve the codebase. If a language-specific construct offers a
significant advantage that isn’t achievable in a more straightforward,
language-agnostic way, then its intent and function should be clearly
annotated in comments by the original developer (with reasoning, and pitfalls
to avoid when debugging, refactoring, extending, etc.).

------
chmaynard
When I started taking undergraduate CS courses at Stanford in the late '80s,
there were some wonderful instructors teaching intro courses, including Stuart
Reges, Mike Cleron, and later Nick Parlante. It's great to see that Nick is
still at it and has had a distinguished teaching career at Stanford.

------
gamma-male
Remember. Don't write stuff like a = b[i++] or a ^ c | d or use all these
ambiguous C specific tricks that make it harder for everyone to read your
code.

~~~
huhtenberg
> a = b[i++]

You can't be serious. How is this ambiguous in the slightest? Assuming that
you actually know some rudimentary C.

~~~
gamma-male
Because a = b[++i] means something different. It's a dense and unnecessary
shortcut.

~~~
userbinator
At the risk of being downvoted again (which I don't much care about, though it
does say a lot about the mindset...), is it really so unreasonable to ask
programmers using a language - and more importantly, a language they will be
using very frequently - to just _learn the language_!?!?

The amount of dumbing-down that I've seen happen to programming is already
beyond ridiculous. You should definitely look at APL, Lisp, or some of the
other more expressive languages out there if you think anything beyond the
equivalent of glorified Asm (one statement per line, one operation per
statement, one use per variable...) is a "dense and unnecessary shortcut".

I think this is a relevant article to start understanding the opposite point
of view:
[https://news.ycombinator.com/item?id=13565743](https://news.ycombinator.com/item?id=13565743)

~~~
rleigh
You're absolutely right that developers should know this stuff. But that's to
some degree the wrong point to make. When considering the maintainability and
correctness of the codebase, "clever" hacks are often not desirable. Nowadays,
there's no performance gain to be had by indulging in such tricks; the
optimiser will do the right thing for both cases.

The problem for me here is not that it's "too complex", but that it's
ambiguous even for experts at a casual glance. It makes perfect sense _right
now_, after you've written it, but months or years later the person doing
maintenance or debugging might glance over it and miss the subtlety of pre-
vs post-increment while they're busy with other tasks and deadlines. It does
make sense to avoid such pitfalls, where possible.

------
bflatt72
Great resource. Thanks

------
deepGem
"All this makes C fast but fragile."

IMO all these features of C make programming, and therefore system design,
anti-fragile. Anecdotal, based on interactions with a few C programmers who
work on embedded systems.

