
Modern C [pdf] - brakmic
http://icube-icps.unistra.fr/img_auth.php/d/db/ModernC.pdf
======
jackhack
I wish I could like this book, but after reviewing the first chapter I can
only imagine the confusion of students. I very much support the idea of
breaking the book into levels, but it attempts to cover far too much, far too
quickly, and I don't believe this book would be useful for those who are not
already familiar with the language.

I've been writing C since the late 1980s, moved to mostly C++ by the mid 90s,
C# in the 2000s, and now I've come back to C. Most recently built some
realtime components and drivers, having to drop back to C77. I mention this
because I've taught many colleagues along the way, I'm sensitive to the places
where beginners tend to get hung up, and I've come to anticipate many of their
questions. Let me take a moment to illustrate the basis of the problems I see:

"Too much, too fast." The best example is right on page 2: a program which
demonstrates a complex printf format string, along with arrays and loops. I
can't help but sarcastically ask "Are you sure that is how you want to
introduce someone to the language?" A beginner's eyes will glaze over.

Seriously, the way to introduce the language is with simple examples. Explain
that main is the entry point where every program begins running, and that main
returns its success or failure to the operating system (or other program that
called it). 3 lines of code.

Then add a SIMPLE print, if you wish, or a variable declaration. int. float.
char. Again, it MUST be simple. Introduce loops. Then show how to move some
functionality out of main into a subroutine/a new method/new function, how to
call that function, and return results. Talk about header files, etc.

From there, dive into the rest of the base language... talk up arrays, memory
management, heap/stack, pointers, libraries, exceptions, etc.

But this is only my experience, and I'm sure that it is different for others.
Kind regards.

~~~
xenihn
Do you have any recommendations for books?

~~~
pieterr
K&R:
[https://www.amazon.com/gp/aw/d/0131103628/](https://www.amazon.com/gp/aw/d/0131103628/)

~~~
weaksauce
Is that still relevant to how C looks and acts nowadays? I know you will learn
a lot from it and it's an excellent book, but surely there is a better
reference that is more up to date. Maybe not, though.

~~~
joeberon
I think it's still really helpful for learning C and as a reference, but the
code has a very terse and difficult-to-read style that I wouldn't recommend
actually coding in. For example, this is introduced in the first chapter,
before anything is even studied in depth:
[https://www.dropbox.com/s/4hbwyid5jwen43t/Screenshot%202016-...](https://www.dropbox.com/s/4hbwyid5jwen43t/Screenshot%202016-11-28%2019.14.02.png?dl=0)

------
colanderman
While I like a lot of what's in here,

    
    
        for (size_t i = 9; i <= 9; --i)
    

is a pretty terrible example to put in the second chapter. I would not let
that line pass code review. There is no need or place for cutesy cleverness in
C.

EDIT: Ugh, just found this too:

    
    
        isset[!!largeA[i]] += 1;
    

Not only is that confusingly cutesy, but _largeA[i] is a double_. Please DON'T
write – or encourage beginners to write! – such smug code!

EDIT2: In section 5 is the statement that unsigned integers "can be optimized
best." This is flatly untrue on x86 and I suspect many other architectures.
Compilers can and do take advantage of undefined signed overflow to optimize
signed arithmetic; the same is not possible with unsigned arithmetic. See
[https://kristerw.blogspot.com/2016/02/how-undefined-signed-o...](https://kristerw.blogspot.com/2016/02/how-undefined-signed-overflow-enables.html)

~~~
yoha
Decrementing loops is the one place where I have indulged in some trickery. I
do:

    
    
        for (size_t i = 9; i --> 0; )
    

This has the advantage of being very easy to pattern-match once known.
Obviously, for a beginner, I would just do:

    
    
        for (int i = 9; i >= 0; i -= 1)

~~~
colanderman
_Everyone_ should write your second example. The first does nothing but
confuse. C has enough hazing rituals without garbage like "-->".

The fewer tricks and patterns you use in C, the higher chance _actual_ bugs
have of being caught. Cutesy tricks like "-->" confuse human analysis and
gain nothing.

~~~
yoha
This is about weighing correctness versus readability. In the "arrow operator"
version, the readability is decreased; in the "proper" version, a type cast is
required, and this can lead to bugs with values greater than
2^sizeof(ssize_t).

Obviously, I just follow the convention when contributing to an existing
project.

~~~
pwdisswordfish
...values greater than 10?

------
pawadu
The author has also been involved in development of "musl", a modern C11
compliant standard library implementation:

[http://www.musl-libc.org](http://www.musl-libc.org)

[https://gustedt.wordpress.com/2014/10/14/musl-1-1-5-with-ful...](https://gustedt.wordpress.com/2014/10/14/musl-1-1-5-with-full-c11-library-support/)

~~~
stinos
_complaint_

yeah I hear that often when talking about C11 :]

~~~
pawadu
hah! fixed that for you.

Who would complain about something as wonderful as C11, outside it not being
available for your compiler?

~~~
stinos
_Who would complain about something as wonderful as C11_

Beats me, but see the rest of the thread, I guess :] In all fairness, sometimes
it's the right tool for the job, sometimes it isn't.

------
SwellJoe
It's been a decade or more since I've worked in C (and have never been a heavy
C coder). Is "modern C" really a thing?

I mean, is there some subset of C that is safer than what I think of when I
think of C? I know about stuff like reference counting techniques, rather than
manual memory management, for example, and that goes miles towards safer
coding. But, even so, the variety of ways you can shoot yourself in the foot
with C is seemingly beyond counting. Are threads and async easier and/or
safer now than 10-20 years ago, and with more direct language or standard
library support? Is memory management in the standard library safer today? Are
there concurrency primitives (beyond low-level interacting with epoll or
kqueues or even fork or whatever)?

I mean, it's obviously _possible_ to write reliable, safe, secure, software in
C (Linux, Git, SQLite, all come to mind), but how much easier has it gotten?
Would anyone choose C for a new systems project with no legacy baggage or
dependencies, in a world with Rust and Go?

~~~
SQLite
Go has chosen to omit assert(), because assert() is frequently misused they
say. Antibiotics are also frequently misused, but that is not a good reason to
prohibit them. The omission of assert() makes Go a non-starter.

Rust seems more promising, but it is still not to the point where I am
interested in rewriting SQLite in Rust, though I may revisit this decision in
future years.

Some current reasons to continue to prefer C over Rust:

(1) Rust is new and shiny and evolving. For a long-term project like SQLite,
we want old and boring and static.

(2) As far as I know, there is still just a single reference implementation of
rustc. I'd like to see two or more independent implementations.

(3) Rust's ever-tightening interdependence with Cargo and Git is
disappointing.

(4) While improving, Rust still needs better tooling for things like coverage
analysis.

(5) Rust has "immutable variables". Seriously? How can an object be both
variable and immutable? I realize this is just an unfortunate choice of
terminology and not a fundamental flaw in the language, but I believe details
like this need to be worked out before Rust is considered "mature".

~~~
mixedCase
>The omission of assert() makes Go a non-starter.

A small syntactic sugar you can trivially implement yourself makes Go a non-
starter?

Go doesn't include assert in the language because you're supposed to do
_better_ than assert. Assert easily allows lazy programmers to let their
programs freely crash without properly handling error conditions. Go prevents
you from compiling with unused variables, and that combined with the Go
documentation goes a long way towards teaching new Go programmers how they're
expected to work.

And neither Go nor Rust could possibly be a good fit for SQLite. A Go hello
world is bigger than all of SQLite, while an idiomatic Rust one is on par, and
neither is nearly as portable as the current C implementation, one that is both
programmatically tested and battle-tested like pretty much nothing else in the
world.

~~~
PDoyle
> Assert easily allows lazy programmers to let their programs freely crash
> without properly handling error conditions.

This is what he meant by misuse. Properly-used assertions are meant to
document and check conditions that were thought to be _impossible_ by the
developer. Not just unlikely, or illegal, but impossible. If a condition is
possible, and you check it with assert, that's a bug.

~~~
dvirsky
I came to detest assert when I was working on a project with another guy, who
used it generously, as a substitute for assertions in (non existing) unit
tests. Nothing would annoy me more than working on my code, running it, and
having all sorts of weird assertions pop up all over the place from this guy's
code.

------
chrisd1100
I'm really surprised by the "hate" for C that is appearing in these comments.
Whatever happened to actually enjoying the danger of getting low-level? Is
assembly also useless because it isn't readable?

There is a lot of great code written in C, and a lot of crappy code written in
C. Because C doesn't protect you from yourself, it exacerbates any design
flaws your code may have, and makes logical errors ever more insidious. So in
this sense, the quality of the C you write is really a reflection of you as a
C programmer, not the shortcomings of the language. Maybe you've been badly
burned by C in the past, but keep an open mind and understand that C can be
beautiful.

~~~
yekim
Hear, hear!

Unfortunately, C does get a lot of hate on HN. I suspect it has to do with
this site's demographics. Many (not all) of the HN clan seem to be oriented
towards / mostly familiar with web based technologies. I suspect that for many
who have tried, going from a web dev environment to a C oriented dev
environment feels like a robust shock to the system.

I'd also be willing to bet that there's an age bias at play here; C has been
around, like, forever. It is certainly not the new hotness. Most (not all)
people that I know who enjoy it and are proficient at it, are 40 or older.
Much of the web based dev crowd that hang around HN seem to be in their 20s,
and as it is a time honored tradition to poo-poo the ideas / methods / tech of
the older generation(s), it's not surprising that C doesn't get a lot of love.

Yes, I realize I'm painting with broad strokes here. It'd be interesting to
see a survey or three that correlates age ranges with tech used on a day-to-day
basis, to see if these assumptions are legit. (Anyone got any survey data up
their sleeve they'd be willing to share?)

Me personally - I love it all. C, C++, Java, Python, Javascript, Rust,
Haskell, Scheme, etc. Making computers do things for you, and for other
people, by writing detailed instructions is quite possibly one of the funnest
things in the world. Double bonus for getting paid to do it!

~~~
paulmd
It's not just that HN does a lot of webdev. It's that even in its element as a
"systems language" it's virtually impossible to write 100% safe C/C++ code and
guarantee that it will remain safe into the future, even for experts who are
making every effort to do it right. There are just too many gotchas with
"undefined behavior" and too many clever compilers out there waiting for you
to make a mistake.

One only needs to look at something like the OpenSSL library to see the
problem. You really need to hammer the hell out of C code with something like
AFL to get at a reasonable majority of bugs - and you could hammer out every
last bug one day and then the next day a compiler starts optimizing away your
safety checks. This isn't a theoretical problem, this actually happens. Code
rot is a very real problem in C++, to a far more massive extent than any other
language.

[http://blog.llvm.org/2011/05/what-every-c-programmer-should-...](http://blog.llvm.org/2011/05/what-every-c-programmer-should-know_14.html)

[http://www.kb.cert.org/vuls/id/162289](http://www.kb.cert.org/vuls/id/162289)

Personal opinion here, but with few exceptions C/C++ are inappropriate
languages for starting new development at this point. I realize the tooling is
not there yet but I would rather see something like Rust used in almost all
performance-sensitive applications where C/C++ are currently used. Unless you
can guarantee that you are operating in a trusted environment and will only
ever operate on trusted data, C/C++ is just not the right language for the
job.

Yes, it's fast, but at what cost? I would gladly give up a massive fraction of
my performance for better security and portability - and that's why I program
Java. Not that Java is perfect either, but at least I can be certain that the
sands aren't shifting out underneath my programs.

I would actually say that porting the Linux kernel to Rust would be very high
on my wish-list at this point. I am well aware of just how enormous that task
would be and I might as well wish for a pony too, but it gives me heartburn to
think of just how much C code is sitting there operating in the most untrusted
of environments on the most untrusted of data. I have every faith in the
kernel guys to do it right, but the reality is there is a lot of attack
surface there and it's really easy to make a mistake in C/C++. It may not even
be a mistake _today_, only when the compiler gets a little more clever.

~~~
clarry
I don't think anyone can demonstrate that it is virtually impossible to write
100% safe C code. Sure, you can always find people who don't know how to write
a proper safety check. That doesn't mean nobody knows. You can always find
people who ignore or don't know about best practices, but that doesn't mean
everyone's like them. And you can find people who write goto fail; and ignore
the warnings about unreachable code posted by any half-decent compiler or
static analyzer, yet there are people who will pay attention to that kind of
stuff. People scream UB, UB, C is evil because of UB, but goto fail is
essentially a logic bug, something you could have implemented in any language.
It doesn't need UB to happen.

~~~
sqeaky
I think that you have the formulation backwards. You claim that people can
just write better, and should attain perfection.

> I don't think anyone can demonstrate that it is virtually impossible to
> write 100% safe C code.

I think most people come at it the other way. Most people are aware that they
are fallible and want tools to help with that. Most people strive for
perfection and none will ever actually attain it.

> I don't think anyone can demonstrate that it is virtually impossible to
> discover errors safely in C code.

There is a huge difference simply moving from C to C++ with exceptions. The
type system in C++ can detect several classes of errors at compile time and
prevent them from going into the results.

Then, for runtime problems, if an underlying function throws, the error cannot
simply be ignored. Any programmer can miss a single statement, or worse,
refactor a function with a void return into one that returns an error code
(which then results in every caller ignoring the return value). However, it
takes a special kind of malice to carelessly use something like catch(...) in
C++ to disregard exceptions so that runtime errors are swallowed. C++ with
exceptions has saner defaults because it fails fast, and the failure path
itself doesn't need tests until it starts doing something meaningful.

Now imagine the advances in error detection moving to languages that catch
additional classes of errors.

~~~
clarry
> which then results in every caller ignoring the return value

And a whole load of compiler warnings. Worse yet, people who ignore warnings
might ignore them.

> Now imagine the advances in error detection moving to languages that catch
> additional classes of errors.

Languages don't catch errors, tools do. The C tooling has been and still is
constantly improving.

~~~
pjmlp
Lint was created for C in 1979, as the language authors saw how easy it was to
make errors; static analysis is still largely ignored by the majority of C
developers nowadays.

[https://www.bell-labs.com/usr/dmr/www/chist.html](https://www.bell-labs.com/usr/dmr/www/chist.html)

I have yet to see it being used in enterprise C code.

------
marmaduke
It's nice to see this perspective kept alive. I put some effort into a
numerical library (github.com/maedoc/sddekit) in C99, and I didn't find the
language lacking until I tried to imitate inherited interfaces with virtual
dispatch by hand (empirically, I can say, a poor move in C lib design).

I did find it useful to apply rules like using only uint32_t, double & bool as
primitives.

My main wish is that it would be possible to opt into automatic const &
restrict, as a compiler flag or pragma, so that something like

[https://github.com/maedoc/sddekit/blob/master/doc/C.md#alias...](https://github.com/maedoc/sddekit/blob/master/doc/C.md#aliasing-and-mutation)

would be easier to do.

------
ape4
Goto is considered useful by the book:

The use of goto and similar jumps in programming languages has been subject to
intensive debate, starting from an article by Dijkstra [1968]. Still today you
will find people that seriously object to code as it is given here, but let us
try to be pragmatic about that: code with or without goto can be ugly and hard
to follow.

~~~
thegeomaster
I've found goto to be a good way of dealing with exceptions in low-level C.
For example:

    
    
        void* foo() {
            int handle = get_some_handle();
            if (handle < 0) {
                 goto fail;
            }
    
            void* something = some_function(handle);
            if (something == NULL) {
                goto free_handle;
            }
    
            void* something_else = some_other_function(something);
            if (something_else == NULL) {
                goto free_something;
            }
    
            return something_else;
    
        free_something:
            free_something(something);
        
        free_handle:
            free_handle(handle);
    
        fail:
            return NULL;
        }
    

I've seen this pattern frequently in the Linux source code. I think this is an
example of a case where usage of goto improves readability and reduces errors.

~~~
ape4
This looks fine to me. I wonder if C/C++ could be improved by introducing a
new keyword `bail` which is the same as `goto` but is only allowed to jump to
the bottom of the function. That way, codebases can outlaw `goto` but keep
`bail`.

~~~
72deluxe
Are you looking for "return"?

With C++ you can ensure that your destructors do the tidy-up, e.g. a messy
example:

    
    
        struct cleaner {
            cleaner(string *toCleanup) : m_x(toCleanup) { }
            ~cleaner() { delete m_x; m_x = nullptr; }
            string *m_x;
        };
    

~~~
ape4
Yup, C++ has autocleanup. Of course, if you are using fopen() instead of
something more modern, you'll need to fclose(). Adding a new keyword to C is a
long shot. Perhaps compilers could detect non-cleanup use of goto and give a
warning.

~~~
int_19h
A typical C++ codebase I'm working on these days would have scope guards
implemented via macros, such that you can do e.g.:

    
    
        FILE* f = fopen(...);
        SCOPE_GUARD({ fclose(f); });
    

This is mainly used for one-off calls to some native API, where writing a
proper RAII wrapper for the managed resource is not worth it.

------
AlexeyBrin
The author provides the book as a free-to-download pdf at [http://icube-icps.unistra.fr/img_auth.php/d/db/ModernC.pdf](http://icube-icps.unistra.fr/img_auth.php/d/db/ModernC.pdf)

------
zunzun
I personally prefer Prehistoric C, which only has the two language keywords
"ugh" and "grunt". Modern C has too many keywords for my taste.

~~~
oblio
> "grunt"

I'm sorry, you cannot use this keyword as it will cause a conflict with
Javascript tools.

~~~
bitwize
Most people use gulp now -- the _reimplementation_ of the reimplementation of
make in JavaScript because reasons.

~~~
fb03
So true. I'm helping people on a Node app and it was the first time I had to
use it to "build" (build what?) the website I'm supposed to write the API for,
lol.

------
nabla9
What alternatives are there to C/C++ if you want to write a library that you
can call from Python, R, Matlab, Java, Rust, Lua, node.js ... and have good
performance?

Old ones like Ada and Fortran, of course.

There are newcomers like Rust and Go. Are their C APIs mature and portable?

~~~
biokoda
> Are their C APIs mature and portable?

Rust's C API is completely mature and portable. In fact, if you are writing a
library that you want to have a C interface to, Rust is a fantastic choice.

------
pcr0
For someone who studied basic C/C++ in university and is interested in hacking
around in C, should I read this over K&R?

~~~
sk1pper
Don't skip K&R. Probably read both. They're both pretty short.

A really, really good one is Advanced Programming in the Unix Environment.
But, it's pretty expensive.

~~~
grandiego
APUE does not teach C, but the Unix API. Granted, it is an excellent resource
for people interested in that OS family.

K&R is a great resource which covers a lot beyond the syntax, but is obviously
dated from the standard's perspective.

------
plg
[https://pragprog.com/magazines/2011-03/punk-rock-languages](https://pragprog.com/magazines/2011-03/punk-rock-languages)

------
Waterluvian
I've had no luck learning a language on its own. But I've had a lot of luck
learning languages as part of something bigger, like C# via Unity, or Swift via
2D game dev in Xcode.

Any suggestions on what I should apply C to as a way to learn it?

~~~
dagw
Arduino and other micro controllers. First of all it's really fun (YMMV).
There is something about writing code that makes things happen in the physical
world which is satisfying in ways that writing code that just affects bits on
a computer isn't. Secondly, it's one of the realms where C is still genuinely
important. When you are working on problems where a few hundred bytes this way
or that is the difference between success and failure, you really start to
appreciate what C has to offer.

~~~
uabstraction
If you want to push this to the extreme, check out the MSP430 Launchpad and
fiddle around with msp430-gcc. Pick up a Bluetooth serial module (can be had on
a breakout board for ≈$40) and remote-control something from any Android phone.
Lots of fun to be had.

Just be aware, microcontroller C programming is pretty far out compared to
regular systems programming. Lots of tasks involve writing bits to seemingly
random memory-mapped registers to change the state of the controller... and
forget about including your favorite libraries. You're lucky if the standard
library fits on the chip. It's very similar to OS kernel development in that
regard.

------
kruhft
The best book I had for learning more about C was titled 'Writing Bug Free
Code For Windows', from the late '90s/early 2000s. It contained a complete
object-oriented system using simple header tricks and data hiding, plus covered
all sorts of pre-processor tricks that aren't evident until you dig into what C
can really do. I'm sure it's impossible to find now, but recommended.

~~~
sea6ear
Is this the book you are talking about [1]? If so, it looks like the author
has put it online for free.

[1]
[https://www.duckware.com/bugfreec/index.html](https://www.duckware.com/bugfreec/index.html)

~~~
kruhft
There you go! Thanks :)

------
minxomat
I get a 500 error. Here's an archive link:
[http://web.archive.org/web/20161128093244/http://icube-icps....](http://web.archive.org/web/20161128093244/http://icube-icps.unistra.fr/img_auth.php/d/db/ModernC.pdf)

------
qwertyuiop924
Can any C programmers evaluate this book? I don't do a lot of C, so I can't
really do it.

Does it advocate good best practices?

Does it talk about pitfalls?

Does it overemphasize new, possibly less widely implemented, features?

Does it do/not do anything else we should know about?

~~~
shakna
> Does it advocate good best practices?

That may come down to opinion. For example, type qualifiers are bound to the
left.

Traditionally you would write:

    
    
        char *var;
    

They advocate keeping type on the left, name on the right, so:

    
    
        char* var;
    

A few things like that are covered under "Warning to experienced C
programmers".

Personally, I prefer it, but have always done what everyone else expects, so
there are no fights over styling.

> Does it talk about pitfalls?

Absolutely.

At a glance, they talk about the unexpected way C treats truthy values (if it
ain't 0, it's true), accidentally dereferencing NULL, and even go into goto,
when it's good and when it's bad.

> Does it overemphasize new, possibly less widely implemented, features?

Yes. They assume a C11 compiler, and state it in the introduction. At the
moment, GCC and clang have some disagreements with how some C11 features
should be treated (GCC accepts a char or a char* for _Generic, clang requires
it to be char*. clang is more technically correct, but GCC is more flexible),
and MSVC is still struggling to implement most of it. [0]

> Does it do/not do anything else we should know about?

I probably need a week to read it more fully, but I'll quote from the end of
the introduction:

> Last but not least comes ambition. It discusses my personal ideas for a
> future development of C. C as it is today has some rough edges and
> particularities that only have historical justification. I propose possible
> paths to improve on the lack of general constants, to simplify the memory
> model, and more generally to improve the modularity of the language. This
> level is clearly much more specialized than the others, most C programmers
> can probably live without it, but the curious ones among you could perhaps
> take up some of the ideas.

[0] [https://msdn.microsoft.com/en-us/library/hh567368.aspx](https://msdn.microsoft.com/en-us/library/hh567368.aspx)

Edit: escaping _

~~~
burfog
Note that var1 and var2 are not the same type:

    
    
        char* var1, var2;
    

Traditional style makes this clear.

~~~
pawadu
> We bind type modifiers and qualifiers to the left.

Good idea in theory but your example shows how bad it behaves in practice.

~~~
jjesus
I usually declare variables one per line, so the type is still clear:

    
    
        char* var1;
        char var2;
    

~~~
pawadu
C does not in any way forbid having two definitions on the same line, hence

    
    
        char var2, *var1;
    

is valid C, while in Java this would be illegal:

    
    
        char var2, [] var1;
    

So the syntax is correct, and all you are doing is adding style rules that make
it less readable.

~~~
shakna
Having one var per line, with left bound typing seems more readable to me.

That said, I understand the differences in traditions.

I think that C's syntax is flexible enough that we can each go our own way and
decide which is more readable.

------
frag
I want to cry... remembering the good old times of C programming... ohhhh

------
awinter-py
Yikes. Important words that don't appear in this: 'static analysis',
'verification'.

On the 'wow' side, had no idea there was a _Generic macro. Pretty cool.

------
pksadiq
I would recommend that anyone who hires a programmer should test his/her
knowledge in C (especially in areas like code that produces undefined or
unspecified results), even if the candidate is never going to code in C, ever.

If he/she knows these concepts well, that means he/she has invested much time,
and probably knows other things well enough (or can learn them easily).

~~~
loup-vaillant
Let's not confuse low level understanding with the fine points of the C
virtual machine, which drifts from whatever platform you are working on by the
year.

What you really want is an understanding of how whatever code the candidate
may write will map to the underlying hardware. Test for that.

~~~
uryga
do you have any pointers for reading about "the C virtual machine"? I'd love
an accessible explanation of how the model of the hardware presented by C
differs from actual modern chips, but those keywords make it hard to google
for.

~~~
loup-vaillant
Well, like the sibling comment, I can only point to the standard. I'll give
one example: the segmented memory model. In C, the difference between two
pointers is a valid operation iff they point to the same object (or one slot
past it). So it works if it's the same array or malloc() block, but something
as simple as

    
    
      int a;
      int b;
      ptrdiff_t d = &b - &a;
    

is undefined. Modern computers, including most embedded platforms, have a flat
memory model by now, and could implement the operation above without problem.

Another difference I know of is signed integer overflow. Most platforms use a
2's complement architecture, where signed overflow simply wraps around. In C,
such an operation is undefined.

Yet another difference relates to pointer aliasing. On most platforms, a
pointer is just a pointer to a memory slot. In C, it is assumed for
optimization purposes that pointers of different types cannot point to
overlapping regions. This prevents practical stuff like type punning, for
which you have to use unions.

------
joveian
I haven't looked at this updated version (site is busy :/) but the version I
looked at a while ago is quite good.

The author's use of register to avoid aliasing is something I hadn't heard
before and seems like a good idea in some cases.

Beyond the learning C aspects, I really hope that some of the author's
suggestions for language extensions are implemented.

------
giis
I looked at the table of contents and jumped into a few pages. It looks like
good material on best practices, optimization rules, and tips. I quickly
bookmarked it; definitely worth reading.

IMO, the title here is misleading; I don't think any new features were added
to C to make it modern.

~~~
lfowles
Eh? C11 isn't modern?

~~~
mitchty
It is indeed. But it's more a polishing of C99, IMO. At least Annex K is
optional now, IIRC.

~~~
cygx
The C standards kind of have a main theme, e.g. numerics (aka eating Fortran's
lunch) for C99 (_Complex, restrict, variable-length arrays, type-generic math
functions, ...).

While C11 is indeed to some degree a polishing of C99, its theme is multi-
threading.

~~~
mitchty
Yep, I've not tried the C11 threads yet; pthreads tend to work for me, and in
the kernel, well, it's not like I'll be using C11 threads anyway. So it's been
a bit of a "maybe someday" task. :)

------
sndean
Pardon my noobness, but if I learned and became proficient in C and knew
nothing else, would I have a marketable skill?

Is it possible for C to be a standalone skill, where one's job could be 100%
programming in C, or do you need a lot of auxiliary knowledge outside of that?

~~~
crpatino
> would I have a marketable skill?

Yes. Systems programming and Embedded are your best and most visible playing
fields, but many large, legacy applications were written in C and continue to
be maintained.

> Is it possible for C to be a standalone skill,

No. As others have pointed out, the language + standard library is very
spartan and will only take you so far. This will only get you an entry-level
position, and only in teams that are big enough to have some senior people
with spare capacity for mentoring, and a stream of small, self-contained tasks
for you to do while in training.

To be able to work independently you need to have at least some basic
knowledge of the whole toolchain: compiling (you need to know by heart the
different steps taken by the compiler, and at least its 20% most common
cmdline flags), building (make), program analysis (lint, valgrind), debugging
(gdb, or whatever comes with the compiler you are using), third-party
libraries (pick 2-3 of: glib, pthreads, antlr, curses, openssl, etc.), and
standards (MISRA, POSIX - which is at least as much about the API of Unix-like
OSes as it is about the C language).

From there, there are more tools to help you, but those are typically OS
dependent and are not exclusive to C.

------
leighflix
Would any C lovers recommend this book to people who already know programming
(but not systems programming) and want to learn C?

I personally know Java, (lil bit) Elixir, and Python.

EDIT: I'll also be reading K&R alongside it.

------
pawadu
Any idea if it will be available as a physical book?

------
manish_gill
If people aren't so into this book, can anyone suggest some other book beyond
K&R?

~~~
hackcrafter
Learn C the Hard Way[0] was Zed Shaw's attempt at a K&R replacement[1].

If you like it, great; if you don't, you have company[2].

[0] [https://learncodethehardway.org/c/](https://learncodethehardway.org/c/)

[1]
[https://web.archive.org/web/20141205223016/http://c.learncod...](https://web.archive.org/web/20141205223016/http://c.learncodethehardway.org/book/krcritique.html)

[2] [http://hentenaar.com/dont-learn-c-the-wrong-
way](http://hentenaar.com/dont-learn-c-the-wrong-way)

------
MrRobotic
I haven't used C much. What kind of features are in the book that make it
modern?

~~~
cygx
Being based on a language revision that is only 5 years old instead of 17 or
27, for one.

------
hitlin37
Hi, is there an epub version of this book? The pdf format is painful to read.

~~~
kasabali
Seconded. If that's not feasible, I would appreciate a narrow PDF (like A5)
with minimal margins, which should be good enough for reading on a Kindle.

------
kzrdude
Will this book be printed? I would love to get a paper copy.

------
maqbool
Where can I find the errata list for this book?

------
aleksei
Another day for the HN crowd to express their distaste for C :)

~~~
buserror
Yeah, again and again. It's like blaming a hammer for the potential of
breaking your finger when you use it, and proposing the use of a spoon
instead.

~~~
adamnemecek
This C apologism is holding the industry back. Software development has
changed many times over since C came out and it just isn't a good tool for
tackling a lot of the issues that we have today. Just about every software
project written in C has some serious bugs.

I personally judge a language by how well it lets you define abstractions. In
C's case, it doesn't let you do that very well.

~~~
buserror
"Just about every software project written has some serious bugs"

There, fixed it for you. Seriously, vulnerabilities and bugs are found
_everywhere_ not just in C.

I've been programming for 35 years, in so many languages I lost count, and
every time I've seen the 'let's not use C because it'll lead to bugs' it was
to be replaced by another thing that was ALSO leading to bugs, and/or become
so bloated it was in itself... a bug.

~~~
pcwalton
> Seriously, vulnerabilities and bugs are found everywhere not just in C.

The black-and-white security fallacy again!

The simple fact (and this is a fact, not an opinion) is that the most severe
security problems—remote code execution, in particular—are found _way way way
more_ in programs written in C and C++ than in other languages.

~~~
paulmd
I strongly agree with your point in general: memory safety is a major problem
with C/C++ to an extent that is not found in other languages. Even worse,
these bugs often surface long after the code is written, when the compiler
gets "smarter" and does a trick with your code. Call it "code rot" or whatever
you want, but it happens a lot more, and a lot faster, in C/C++ than in other
languages.

However, the nature of where C/C++ code is used does lend itself to severe
problems. When you have a JVM vulnerability, you shouldn't be able to get any
farther than a regular user with only sandbox privileges. When you have a
kernel vulnerability, by definition you have the keys to the kingdom.

Of course there are always holes that can be used to escalate, but that also
comes back to the problems of having a primarily C/C++ ecosystem...

------
fb03
ITT: Heated arguments and zealotry. In summary: "C is outdated, its ubiquity
is just a historical accident"

"Better tools exist to do this job"

"C is not needed anymore" (Yet no contender has ever come close to it, hehehe
--my2c)

There, saved you a ton of reading time.

~~~
wangchow
A time will come when our entire concept of programming will shift due to
advances in hardware unlike anything we have today. Consider quantum computers
or some biological machine, etc.

Those who use C and assembly I imagine would be better equipped to understand
the new paradigms. It's best to understand how to implement data structures in
their most rudimentary form because implementing them on new platforms becomes
easier.

In addition, higher-level idioms become easy to understand if the parts that
make up the whole are understood. And underneath all those layers of
translation and compilation we have raw assembly and the bare machine.

~~~
fb03
I agree!

I also believe efforts like LLVM are actively trying to 'bridge the gap'
between both worlds (totally raw vs. fully dynamic/scripted). Stuff like
emscripten is enabling the old farts and the newcomers to share common
ground, and that's amazing... I just hope these youngsters keep learning stuff
instead of just piling framework after framework as the new 'hot shit' gets
released every 6 months.. really, ADHD is in full effect, especially in the
webdev world, and imho that's hurtful.

o/

------
timthorn
Discussion at:
[https://news.ycombinator.com/item?id=13052486](https://news.ycombinator.com/item?id=13052486)

~~~
sctb
We've merged the discussions into this thread.

------
grabcocque
I mean, the problem with C is not that it is old, but that it is dangerous.

~~~
quickben
Every language is, just about different things.

Array out of bounds is dangerous even in Excel VB.

------
grabcocque
It strikes me as odd you'd even go to the lengths of producing such a book. If
you really wanted to protect people from the worst vagaries of C, the book
should simply say "don't".

~~~
AsyncAwait
C is still the lingua franca of computing, like it or not.

------
grabcocque
Sounds like an oxymoron. "Make sure your buffer overflow exploits are up to
the minute! Make sure your systematic lack of memory safety totally captures
the zeitgeist!"

