
Let’s Stop Bashing C - ingve
http://h2co3.org/blog/index.php/2016/12/01/lets-stop-bashing-c/
======
ekidd
I couldn't care less about integer division and minor details of syntax, but
that's not going to stop me from being grumpy about C's continuing popularity.

Every C program I have ever worked on has been riddled with integer overflows,
undefined behavior, buffer overflows and bugs. This was perfectly acceptable
for a Vax or a Sun workstation in the 80s, and forgivable in the mid-90s,
when plenty of developers still only had 8 MB of RAM and wasted half of that
on Netscape Navigator.

But we live in an age of Pwn2Own and ClusterFuzz, where even the tiniest of
bugs can be used as step 4 in a 6-step exploit, and where C compilers reserve
the right to completely and silently break your program at the faintest whiff
of undefined behavior.

I know my limitations: I'm not smart enough to write C code that deals with
hostile data off the network. Neither are the authors of 99% of the C code
that I've seen, with the possible exceptions of djb and a few OpenBSD
contributors. It's long past time to invent and use better systems languages.
Choose what you want, but choose something safer.

~~~
Annatar
That's an easily correctable problem: learn the living daylights out of
programming in assembler, and then C will be like a walk in the park: no more
overflows, no more pointer screwups, no more forgetting to free the memory,
because you'll know exactly what will happen in the chip and the memory when
that code compiles.

You'll even start writing code which relies on integer overflow (rollover from
#$ff to #$00 again) because, depending on the situation, it will give you a
performance boost, or make the algorithm implementation simpler, or both, and
you'll know it.

 _It's long past time to invent and use better systems languages._

I can't comment on Go, but the textbook examples of Rust and even treatises on
the subject from recognized experts in the computer science field (Adam
Leventhal) show it to be an utterly overcomplicated language for which there
should be a criminal trial and a prison sentence of several decades.
Immutables, borrowing, no, I have to stop; my head hurts already, and I'm
getting a migraine thinking about solving even the simplest of classic input-
output problems with such an overly complicated language. (And my head wants
to explode when I compare Rust with AWK for solving input/output problems, the
biggest reason why computers were invented.)

So... C, C, and more C, because it's infinitely flexible and simple. I like
simple, because simple is smart. Complicated and complex isn't.

Let's all please be humane to one another and stop inventing more languages.
We need fewer languages, not a bazillion variations on the one central task:
making the computer do something. And for that, we already have all the
languages we could ever want or need; and for the theoretical case where we
don't, there is always the almighty assembler, which imposes no limitations
on what one can program.

~~~
pcwalton
> I can't comment on Go, but the textbook examples of Rust and even treatises
> on the subject from recognized experts in the computer science field (Adam
> Leventhal) show it to be an utterly overcomplicated language for which there
> should be a criminal trial and a prison sentence of several decades.

You sure are angry with me for working on a programming language.

~~~
sanderjd
Yeah I don't get this sort of attitude at all. It seems like a very simple
sequence of steps:

1\. Creating something non-trivial in language X

2\. Identifying things that language X made difficult

3\. Brainstorming ways to implement solutions to those issues within a set of
acceptable trade-offs

4\. Concluding that some of the brainstormed ideas might actually work

5\. Working really hard to implement and iterate on them

I can't figure out which step the parent commenter scorns. I guess probably
step 2, where they see no pain points to begin with. That's fine, but trying
to convince people they haven't felt pain that they have felt is a silly
battle. I guess they're actually implying that people who have felt that pain
are just not smart enough.

~~~
Annatar
No, I'm implying that people are either too egotistical or that they
overestimate their insights, or both, when they create a new language. These
days it seems like there is a new language every 15 minutes. It's getting to
be ridiculous.

And the worst part of it is, no such language covers all problems effectively;
they all have flaws in one way or another... so the next guy comes along
thinking he can do it better, and just repeats the mistake of his or her
predecessor(s).

Instead of realizing that this would just exacerbate the problem, they go
ahead and make it worse.

 _I guess they're actually implying that people who have felt that pain are
just not smart enough._

It is extremely difficult to solve a complex problem in a simple manner. One
literally has to be a genius with a lifetime of experience and insights from
that experience in order to be able to pull that off, and even then, it
doesn't always work and requires multiple iterations, and lots and lots of
experimentation. Case in point: Go, still a work in progress by exactly such
people. That should be instructive, but it seems that it isn't.

~~~
sanderjd
> And the worst part of it is, no such language covers all problems
> effectively; they all have flaws in one way or another...

Which is exactly why you don't try to create a language that covers all
problems effectively, you define the problems you'd like to speak to. If you
have defined those problems in such a way that they are shared by many people,
and if you solve those problems in a way that many people find effective, then
you may have created a compelling solution. How does that exacerbate "the
problem"?

Honestly, what even is "the problem"? Why does it bother you that there are
new languages created? The "literal geniuses" from the Bell Labs era
experimented a ton, and that was a great thing then, and remains a great thing
now. It's always valuable to point people to prior art, but it makes no sense
to bemoan research and experimentation.

~~~
Annatar
_Honestly, what even is "the problem"? Why does it bother you that there are
new languages created?_

The problem is that new languages have such fatal flaws that they make the
problems they are trying to solve even worse.

The second, and much more serious, problem is that it takes at least ten years
to truly master a programming language. Humans only live about 70 years on
average, which means that it is impossible to keep up with every new language
that comes out. I mean, I live for this stuff, but I would literally have to be
chained to the computer my entire life, without time for anything else. That is
nonsense.

Anybody who thinks they have mastered a computer language in less than ten
years is either a genius, or unbelievably egotistical and arrogant; and
knowing a whole bunch of computer languages in a perfunctory manner, just
enough to make one dangerous, leads to a really shitty job of sorting out that
mess afterwards... that's the problem. My problem, and likely other people's
problem.

~~~
sanderjd
But _you_ don't have to learn all of them... If you think a new language makes
the problems it's trying to solve even worse, you definitely shouldn't bother
with it. But many others may disagree with you, and decide to invest that 10
years in really learning it. There is no problem with this.

What I don't understand about your position is its absolutism. You keep
talking about people being egotistical and arrogant, but it seems like you're
the only one on this thread being those things, by claiming your subjective
opinion as objective fact, and then making ridiculous hyperbolic statements
about throwing people in jail based on it. It's not a contradiction for you to
personally dislike a language's philosophy and for that language to have been
a worthwhile creation and valuable to many people.

Your feelings about different languages are just, like, your opinion man.

~~~
Annatar
_But many others may disagree with you, and decide to invest that 10 years in
really learning it. There is no problem with this._

Just wait until you invest ten years of your life to master a language, only
for everyone else to move on to the next new trend, with you stuck holding the
bag.

AngularJS, Rust, and node.js, is it? I can't wait to ask you how you're
feeling about all that time you invested into all of those, when all the kids
move on to something else and they're no longer popular, for no reason other
than the newcomers' ignorance, or because something shinier came along, but
didn't actually add any value, just re-implemented the old thing in a worse
way.

We can then have a discussion about how you're feeling, having wasted a decade
or more of your life on what will by then be perceived as irrelevant. That's of
course under the (optimistic) assumption that HN is still going to be popular
in ten years...

------
kstrauser
My complaint against C is that some of the smartest people on the planet use
it to write infrastructure that regularly fails catastrophically. If the
experts who love it aren't capable of writing correct code, I have no chance
whatsoever. Humans just aren't good at the fiddly bits (remembering to free();
not to free() twice; not to use after free(); accidentally adding a second
clause to a non-bracketed if expression; how many times do we see these in
vulnerability reports?).

C is obviously very powerful. There's no denying that. But it does
approximately zero of the things to help programmers that a modern language
does. It's not even strongly typed in the usual sense (is there the equivalent
of casting void pointers anywhere else?).

Finally, C is slow. Yeah, you read that right. Its main problem is that it has
no non-wizardry high level semantics for parallelism. For example, modern
languages have map() to say "do this operation on all of these items". In C,
you have to write that as "for each item in this collection, do this thing,
then the next, then the next, ...". If the compiler is exceptionally smart, it
might infer what's going on and help out. But with map(), you're explicitly
telling the language "I want all of these things to be done" and it doesn't
have to _infer_ anything.
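
To make the contrast concrete, here's a rough sketch in C (the function names
are invented for illustration, and OpenMP is just one common bolt-on way of
stating the parallel intent):

    
    
      #include <stddef.h>
      
      /* The plain C idiom: "do this to item 0, then item 1, then..." --
         the compiler has to infer that the iterations are independent. */
      void scale_serial(float *v, size_t n, float k) {
          for (size_t i = 0; i < n; i++)
              v[i] *= k;
      }
      
      /* With OpenMP (compile with -fopenmp), the programmer states
         outright that iterations are independent, much as map() does. */
      void scale_parallel(float *v, size_t n, float k) {
          #pragma omp parallel for
          for (size_t i = 0; i < n; i++)
              v[i] *= k;
      }
    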

I respect C. It's given us a lot of great stuff. But there's no way I'd
willingly start a new project in C today given the alternatives. I am dumb,
and given the choice between languages which help me and languages which are
basically a small step up from assembler, I'm going with one that gives me a
reasonable shot at writing safe, fast, and understandable code.

~~~
dzdt
Are there any real world examples (with actual measurement) of languages where
map uses hardware parallelisation to be faster than a c loop? Serious
question! My impression is, no, outside of something like hadoop.

~~~
kstrauser
I don't have any off the top of my head, but I used a forking map in Python
where the inner call was an image processing operation, and it was
approximately #cores faster. I think it'd be enormously dependent on workload:
if the inner operation is adding two ints, the gains might be lost in the
parallelization overhead.

Map is just one example of a high-level construct that permits faster code,
though. A lot of us here have done async IO in C because that was the option
we had available to us for doing it, but I'd vastly rather do such stuff in
Python (3), Go, or Node where the language itself gives you constructs for
describing what you're trying to accomplish.
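
For contrast, here's roughly what "wait until something is readable" looks
like in plain C with poll(2), one of the traditional options (a minimal
sketch, error handling omitted):

    
    
      #include <poll.h>
      #include <stdio.h>
      #include <unistd.h>
      
      int main(void) {
          /* Watch one descriptor (stdin) for readability. */
          struct pollfd fds[1] = {{ .fd = STDIN_FILENO, .events = POLLIN }};
      
          /* Block until it is readable, then consume one chunk; a real
             event loop repeats this and dispatches on per-fd state.    */
          if (poll(fds, 1, -1) > 0 && (fds[0].revents & POLLIN)) {
              char buf[256];
              ssize_t n = read(STDIN_FILENO, buf, sizeof buf);
              printf("read %zd bytes\n", n);
          }
          return 0;
      }
    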

 _My impression is, no, outside of something like hadoop._

That's a pretty big exclusion, though! We use Hadoop-like stuff on, ahem,
sizable quantities of data. That's not exactly an edge case.

------
pmontra
> Spaces are for the human eye, while braces are for the compiler

and for the editor auto indenter too.

I don't like doing the job of the compiler but this is an exception.

I've already wasted more time in a few months of Python hand-aligning and
debugging code after moving it around than it saved me by not having
to type } or end at the end of the block. Note that Python still has a { of
sorts in most cases: the mandatory :

And how about the time lost because of "IndentationError: unexpected indent"
when copy-pasting code from the editor to the command line interpreter?

On the point of bashing C, the post is an answer to just a few of the points of
[https://eev.ee/blog/2016/12/01/lets-stop-
copying-c/](https://eev.ee/blog/2016/12/01/lets-stop-copying-c/). It's mostly
about syntax and it's an interesting read. I particularly like "No hyphens in
identifiers".

~~~
robert_tweed
Perhaps C's biggest mistake (at least in terms of syntax) is allowing _fewer_
braces than it could have. Specifically, for statements like if and for, which
allow blocks, the braces aren't required, so you can end up with code like
this:

    
    
      if(foo)
         bar; baz;
    

Which, when you throw in a preprocessor, has probably caused more bugs than
anything other than out-of-bounds pointer errors.

Of course, the fact that everything, including a block, is a single statement
in C is one of the things that gave me my first big "whoa" moment when
learning to program, and makes me glad that C was the first "real" language I
learned (after Basic). K&R C is far more elegant than people give it credit
for, probably because modern, real-world C is full of all sorts of weird stuff
(and often egregious preprocessor abuse - yes, I'm looking at you, PHP), or
people learn C++ first and assume it's all the same.

~~~
RUG3Y
Is the above code example, though legal, considered bad practice?

~~~
nkurz

      if (foo)
         bar; baz;
    

_Is the above code example, though legal, considered bad practice?_

Yes, misleading indentation like this is bad practice for anyone who believes
that there is such a thing as "bad practice". For clarity, for those who
do not know C, the problem is that "baz" is executed unconditionally, even
though the formatting misleadingly implies that it depends on "foo".

Unfortunately (in my opinion) there is no consensus on whether the following
is bad practice:

    
    
      if (foo)
         bar;
    

My belief is that this formatting should be avoided, to prevent the case where
someone not familiar with the rules of C (or not thinking about them) edits it
to add a manually indented "baz" on the next line:

    
    
      if (foo)
         bar;
         baz;
    

Personally, I'm fine with a single line without braces:

    
    
      if (foo) bar;
    

But I believe that as soon as the body is moved to another line, braces should
be required:

    
    
      if (foo) {
         bar;
      }
    

This belief is common, but not universally shared. Some go farther and say
that braces should always be required (there is a good argument for this).
Others say that a two line if statement without braces is just fine (I think
they are wrong).

~~~
RUG3Y
Thanks for your informative answer.

------
coreyp_1
I like this short article for a few reasons:

1\. It makes the case that just because you don't like something, it doesn't
mean that there's not a good reason behind it.

2\. Just because you like something, it doesn't mean that it is good (or bad).

3\. The things that the naysayers constantly bring up just aren't really that
bad to begin with.

C is amazingly powerful, and I love it. I love JavaScript, too (especially
since ES6!), but I don't confuse where I should use one rather than the other.
I tolerate Python. ;)

Languages are just tools, and you should use the one most appropriate for the
job at hand.

~~~
quanticle
Tools should be ergonomic and should have guards and other safety mechanisms
to make it easy to do what you intend to do, and difficult to do things that
you do not intend to do. C tends to do the opposite of that. It makes certain
things that you want to do more difficult than necessary, while making it very
easy to do things that no reasonable person would intend to do. Eevee's
original post highlights that.

In fact, if you look at this post, some of the so-called defenses seem more
like indictments to me. For example, in the increment/decrement section, there
is this:

 _But look, there’s an even more direct argument: the ++ and -- operators are
not even “one” operator. There is a prefix and a postfix variation of them
that do slightly different things. The usual semantics is that the prefix ones
evaulate to the already-modified expression, while postfix ones yield the
original value. This can be very convenient, as in different contexts, code
might require the value of one state or another, so having both versions can
lead to more concise code and less off-by-one errors. Therefore, one can’t
simply say that “++ is equivalent to += 1“, as it is simply not true: it’s not
equivalent to at least one of them. In particular, in C, it’s not equivalent
with the postfix increment operator._

This illustrates exactly the point that Eevee was trying to make. It's hard
enough to justify having increment and decrement as their own operators (as
opposed to a performance optimization implemented by the compiler). Having two
variants, which work exactly the same in the vast majority of instances but
do subtly different things in e.g. if-statements and for-loops, is complete and
utter madness.
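
For readers who don't write C, a minimal illustration of the difference in
question (values invented):

    
    
      #include <stdio.h>
      
      int main(void) {
          int i = 5;
          int a = i++;   /* postfix: a gets the old value (5); i is now 6 */
          int b = ++i;   /* prefix:  i becomes 7 first, so b gets 7       */
          printf("a=%d b=%d i=%d\n", a, b, i);   /* a=5 b=7 i=7 */
          return 0;
      }
    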

~~~
accatyyc
I really disagree that this is madness. It may be madness to someone not used
to coding in C. But if you code a lot of C you _will_ know when to use which
one of these, and you will know that they're immensely useful.

For example, iterating and adding stuff to arrays:

    
    
      array[index++] = object;
    

instead of

    
    
      array[index] = object; index += 1;
    

and the other way around:

    
    
      array[++index] = object;
    

instead of:

    
    
      index += 1; array[index] = object;
    

There is no way you will miss incrementing your index here, or do it in the
wrong place. This is so common that as a C programmer you WILL recognize the
pattern.

It's even better when iterating pointers instead of integers. May I ask what
the following means to you?

    
    
      float *float_ptr; float_ptr += 1;
    

I think writing ++float_ptr here is just much clearer. Incrementing
pointers by integers just feels wrong when what you really are doing is
advancing the address by sizeof(*float_ptr).
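
To spell out the scaling point (a small self-contained sketch, values
invented):

    
    
      #include <stdio.h>
      
      int main(void) {
          float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
          float *p = a;
          p += 1;               /* the address advances by sizeof(float)
                                   bytes: pointer arithmetic is scaled
                                   to the pointee type, not to 1 byte   */
          printf("%f\n", *p);   /* 2.000000 */
          return 0;
      }
    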

~~~
pka
Until the logic is complex enough and you screw it up. Then you write over
'\0' in some string and have a root exploit. Congratulations!

    
    
      float *float_ptr; float_ptr += 1;
    

is no more confusing than

    
    
      float float_ptr[...]; float_ptr[1];
    

------
notacoward
Even more important, let's stop seeing bash.

If somebody had set out to create a language prone to error and exploitation,
bash would result. (It's fine for interactive use BTW, just not as a language
in which to write anything complex or long-lived.) While it might not have C's
pointer/memory problems, its lack of real types or scope rules and its other
idiosyncrasies (e.g. making it nearly impossible to distinguish "" from an
undefined variable or the whole $*/$@ mess) make it even more of a nightmare.
I've actually gotten pretty handy with it because I had to, but it's a far
worse crime against developer sanity than C ever was.

~~~
seagreen
Totally agree! By the way, the Haskell community is working to make scripting
easier in Haskell, so if you want to go from one of the most buggy languages
straight to one of the least, now you can:
[https://docs.haskellstack.org/en/stable/GUIDE/#script-
interp...](https://docs.haskellstack.org/en/stable/GUIDE/#script-interpreter)

(Downsides: Haskell is weird (but good), you have to have Stack installed, and
the first time you run the script it will take a while to download
dependencies)

More resources: [https://github.com/Gabriel439/post-
rfc/blob/master/sotu.md#s...](https://github.com/Gabriel439/post-
rfc/blob/master/sotu.md#scripting--command-line-applications)

------
kutkloon7
Hating C because it has integer division is like hating Haskell because it is
functional. It is inherently low level. C's real problem is that it has
undefined behavior in a lot of cases. C with some kind of compile-time memory
protection and no undefined behavior (so basically Rust) would be a pretty
sweet systems programming language.
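
A classic illustration of how compilers exploit that undefined behavior (a
sketch; what actually happens depends on the compiler and optimization level):

    
    
      #include <limits.h>
      #include <stdio.h>
      
      int main(void) {
          int x = INT_MAX;
          /* Signed overflow is undefined, so an optimizer is allowed to
             assume x + 1 > x always holds and fold this test to true.  */
          if (x + 1 > x)
              printf("no overflow here\n");
          else
              printf("overflow\n");
          return 0;
      }
    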

------
radiowave
I do wonder whether this might have missed the point.

"Let's Stop Copying C" isn't a criticism of C, so much as a criticism of other
languages that have applied a kind of conceptual copy-paste of C-like
characteristics.

Demonstrating that there are good reasons for those characteristics to exist
in C talks entirely past the question of whether the sharp edges of some of
those characteristics are still warranted in other languages with other design
goals.

~~~
m45t3r
The author completely missed the point. It is pretty clear in Eevee's post
(hell, it is even in the title) that she isn't saying C is bad per se;
rather, she thinks that modern higher-level languages copying parts of C's
design is silly, because it really is. Heck, Java has a goddamn reserved goto
keyword for the sake of it.

For sure, the author takes on the most controversial of Eevee's points, which
AFAIK are mostly (her) personal preferences. However, he does not engage with
the interesting parts of her argument, like weak typing, C-style loops* or
textual inclusion.

This post was a much less interesting read than Eevee's, which makes me
wonder if the author chose a click-bait title just to try to get some clicks
on his blog following Eevee's post...

*: really, this is the most misguided thing in the whole C family of languages; it kinda makes sense in C, but if you're writing a modern language that only supports this kind of loop (pre-ES6 JavaScript comes to mind), you're doing it wrong.

------
pjc50
Interesting list of things to bash C and Algol-derived languages on. I really
like Eevee's original approach of doing a comparative survey rather than
talking about things in isolation. ( [https://eev.ee/blog/2016/12/01/lets-
stop-copying-c/](https://eev.ee/blog/2016/12/01/lets-stop-copying-c/) )

Point by point:

\- textual inclusion and macros: yeah, this technology was obviously chosen
for ease of implementation on small systems and makes little sense. Now
obsolete and a handicap. Especially in C++.

\- optional block delimiters: responsible for gotofail and similar errors.

\- modulo: symptom of C not having standardised maths semantics. Standardise
your semantics, language designers! Don't just say "whatever the CPU gives us,
I'm sure it'll be fine".

\- octal: Nobody uses goddam octal and it's a trap for people used to leading
zeroes in decimal.

If we're changing this then I'd like to appeal for some features from Verilog
e.g. 8'b1101_1101 : leading width specifier, and the internal underscore which
is ignored but provides visual clarity.

\- power operator: meh.

\- switch: actual operation is kind of bananas, see Duff's device. Should be
block-based.

\- integer division: Careful. Huge argument here about maths.

(tbc)

~~~
dom0
> \- octal: Nobody uses goddam octal and it's a trap for people used to
> leading zeroes in decimal.

0b..., 0x..., 0o... the latter is still a bit subtle (0Oo), but far more
distinct and far less likely to happen by accident.
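
The trap in question, for anyone who hasn't stepped on it (a minimal C
illustration):

    
    
      #include <stdio.h>
      
      int main(void) {
          int n = 017;         /* leading zero means octal: fifteen */
          printf("%d\n", n);   /* prints 15, not 17                 */
          /* int m = 019;  -- would not even compile: 9 is not an
             octal digit                                            */
          return 0;
      }
    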

> \- switch: actual operation is kind of bananas, see Duff's device. Should be
> block-based.

the goto-table semantics of switch are kinda useful in many instances, OTOH a
compiler should be smart enough today to achieve the same degree of rather
simple optimization.

------
module0000
Writing performant interpreters is _really hard_. People wrote those
performant interpreters in C because it was necessary. Rather than blog about
what's wrong with C or C++, why not write re-write your Python or Ruby
interpreter in Rust/Swift/Whatever - that should _really_ show us dinosaur
C/C++ programmers that we were wrong all along.

PS: it's not as easy as you might imagine.

~~~
Manishearth
The original post wasn't about what's wrong with C or C++. It was about
other languages blindly picking up design decisions from C even though
they don't really have a reason to.

And Rust has been consistently delivering software with performance on par
with or faster than C counterparts.
[http://blog.burntsushi.net/ripgrep/](http://blog.burntsushi.net/ripgrep/) is
a recent example. Servo is a longer-term one.

------
IshKebab
Weak arguments.

Integer division _is_ misleading. It uses the mathematical division operator
for something that doesn't behave in that way. Having a separate 'integer
division' operator (e.g. //) is much clearer.
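
A minimal C example of the conflation (the same lines read identically in
many C-family languages):

    
    
      #include <stdio.h>
      
      int main(void) {
          int x = 7, y = 2;
          printf("%d\n", x / y);           /* 3: both operands are ints,
                                              so / silently truncates   */
          printf("%f\n", x / (double)y);   /* 3.500000: same operator,
                                              different meaning         */
          return 0;
      }
    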

Nobody could say that the pre/post increment operators aren't confusing. They
feature heavily in 'trick question' C++ tests (along with undefined behaviour,
type promotion and so on).

~~~
zamalek
> It uses the mathematical division operator for something that doesn't behave
> in that way.

Floating point does not behave in a mathematical way, either.[1][2] Unless
your language uses fractional representation using big integers for real
numbers.

> that the pre/post increment aren't confusing

The operator and the retrieval of the variable are two separate operations
that occur in the order that you read them. No less trivial than "I" before
"E" except after "C".

[1]: [https://en.wikipedia.org/wiki/Single-precision_floating-
poin...](https://en.wikipedia.org/wiki/Single-precision_floating-point_format)
[2]:
[https://en.wikipedia.org/wiki/Arithmetic_underflow](https://en.wikipedia.org/wiki/Arithmetic_underflow)

~~~
IshKebab
> Floating point does not behave in a mathematical way

It approximately does. The only common way that newbies would get tripped up
is something like (0.1 + 0.2) != 0.3. Maybe that is a case for not allowing the
== operator to operate on floats (use a function/keyword instead), but I think
that may be a step too far.

~~~
vonmoltke
I don't think that is a step too far. When I was writing embedded code, == and
!= were prohibited for floating-point values. I support such a rule in
general.
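
A sketch of the kind of helper such a rule pushes you toward (the name and
tolerances here are illustrative, not from any particular coding standard):

    
    
      #include <math.h>
      #include <stdbool.h>
      
      /* Approximate equality: the difference is compared against a
         tolerance scaled to the operands' magnitude, with an absolute
         floor for values near zero.                                   */
      static bool nearly_equal(double a, double b,
                               double rel_tol, double abs_tol)
      {
          double diff  = fabs(a - b);
          double scale = fmax(fabs(a), fabs(b));
          return diff <= fmax(rel_tol * scale, abs_tol);
      }
      
      /* nearly_equal(0.1 + 0.2, 0.3, 1e-9, 1e-12) -> true */
    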

~~~
krylon
Years ago, I worked on software that dealt with computational geometry a lot,
and I quickly found out the hard way that the equality operator is pretty much
useless for floating point values.

On the other hand, that is a property of floating point numbers, regardless of
what language one uses.

~~~
dragonwriter
> On the other hand, that is a property of floating point numbers, regardless
> of what language one uses.

That's true, but, OTOH, having floating point rather than an exact
representation be the default for values written as decimal literals, and the
only non-integer type supported by convenient operators, is a
language-specific "feature" of C that many other languages don't share.
(Though, to be fair, there are also many newer and popular languages that
share both of those features, and more that share just the first.)

------
defgeneric
C really is still a beautiful language.

~~~
yoodenvranx
I like the language itself, I just dread the tooling around it. If there were
a better build system, easier integration of libraries, better strings and an
overhauled standard library, I would start using it again.

~~~
SwellJoe
It struck me during the "Modern C" discussion a day or two ago that C would be
very well served by a good package and build system with dependency management
(akin to npm, cargo, cpanm, gem, etc.). While there are some pathological
cases (npm, with its insanely long dependency graphs... I once installed a
small package that pulled in 53,000 files in dependencies), a massive
standard library that everybody is using, contributing to, and testing is
just incredibly valuable. There's a gazillion lines of C code out there, but
finding/using/distributing it can be tricky, especially if it's not part of
the standard OS distribution.

Make, while I quite liked it in its day, has some real limitations. cmake,
while better on some fronts, doesn't actually solve the right set of problems
(or, it solves the set of problems as they were perceived ~15 years ago).

I wish there were a strict subset of C (compliant with standards, but that
prohibits the trickiest bits during build), with a good package/build system,
a huge/modern library of high level functionality. I always enjoyed poking at
C, but I just can't spend enough time coding in C to ever be really good at
it. I can go years between working on C code, so I forget all the gotchas by
the time I look at it again. It's not forgiving of casual users the way Ruby,
Perl, Python, Go, and even (some) Java can be.

Then again, I guess Go is kinda that language for some classes of problem (not
the very lowest level systems stuff, but I rarely do any of that kind of
coding, anyway; I haven't touched a kernel module in a decade).

~~~
yoodenvranx
> a good package and build system with dependency management (akin to npm,
> cargo, cpanm, gem, etc.)

In most cases I don't even need a full build system. I do most of my hacking
in Python and I often create small "throwaway libraries" to give my project a
nicer structure. In Python you can do this by just creating a sub-directory
and an __init__.py file. This takes 45 seconds and then you can easily "import
lib_xyz" in your project. And if you need the library in another project you
can just copy/paste the directory. (I know, this is not an ideal solution, but
when I am hacking on some stuff it is a good way to do things because it takes
no effort or brainpower.)

~~~
SwellJoe
You can whip up a makefile in seconds to do simple stuff, too. That's not the
problem I'm talking about; it's not that make is _hard_, it's that it is
incomplete (and kinda hard, for advanced stuff).

------
jetru
Bashing C is like bashing English. Well, maybe Tolkien Elvish might have a
better grammar system, but really come on now.

~~~
throwanem
Natural languages and programming languages aren't really comparable. The
former evolve and are beholden to their history, because you can't tell people
how to speak. The latter are user interfaces, and to suggest that the only
meaningful criterion in user interface design is that it should match what
previous designers have done is just self-evidently silly.

~~~
devnonymous
> The former evolve and are beholden to their history, because you can't tell
> people how to speak

Think about that for a moment in the light of Python 2 vs. Python 3, K&R C vs.
C99, or original C++ vs. C++14.

~~~
throwanem
"The former" refers to natural languages, not programming languages. What you
cite is one of the primary reasons I called the two classes incomparable. If
programming languages worked like natural languages do, Python 3 wouldn't have
been controversial, or exist.

------
_petronius
From the original article: " Which is why we still, today, have extremely
popular languages maintaining compatibility with a language from 1969 — so old
that it probably couldn’t get a programming job."

Yikes! How about "old enough to have a crapton of experience I'd really like
to learn from/integrate with my company"? I know there is ageism in this
industry (towards both people and tools), but sometimes the cavalierness with
which people throw out the idea that "old == bad" and "good == new" comes as a
shock.

And besides that, shouldn't we be _proud_ of half a century of stability and
compatibility? Isn't "well-designed systems that stand the test of time" a
holy grail to reach for, rather than poke fun at?

------
antiquark
Are people actually boggled by the "++" operator? If they can barely wrap
their minds around that, why are they even in programming? Seriously!

~~~
SlySherZ
There are two typical use cases for `++` and they should be considered
separately.

Firstly, there is simply `i++`, which is redundant with `i+=1` but is
also very easy to understand as syntactic sugar. Keep it or leave it, no big
deal.

Then there is the more advanced and very subtle use of the return value:

    
    
      int i = 3;
      while (i--) {
          printf("%d, ", i);    // What does this print?
      }
    

This one is very controversial. On one side, it simplifies and shortens code
by a lot, but it also makes the code say too much at the same time. Can you
tell at a glance what this code would do? Personally, I prefer longer, more
expressive code, but I can also cut corners with the short version from
time to time, when quickly testing something out.

EDIT: Fixed code formatting

~~~
user5994461
I'm glad this is the first time I'm seeing that code, instead of the usual for
loop.
It doesn't compile in C++. Conditions have to be booleans. i-- is an integer
:D

In C, 'true' is defined as anything other than '0'; there is no proper boolean
type; i-- is an integer, which is perfectly okay for a condition; the condition
is equivalent to "while (i != 0)".

There might be some variations, errors and warnings depending on the
compilers, the strictness level and the revision of the language chosen.

~~~
mining
This is completely false, it compiles in C++ fine. Are you thinking of Java?

~~~
user5994461
Well, it shouldn't. It's possible that you have to give a flag to the compiler
to error on that kind of implicit casting. It should at least give a warning
for sure.

I am definitely talking about C++

~~~
tomlx
Compiles fine with -Wall -Werror. No warnings.

~~~
user5994461

        int x = 1;
        bool y = x;
    
        # cl.exe /W3 main.c
    

This code gives a warning on VS2012. But it doesn't give one when the cast is
in a loop condition. That is weird.

[http://stackoverflow.com/a/31552168/5994461](http://stackoverflow.com/a/31552168/5994461)

This stackoverflow message talks about the specs for C11, and the first
comment adds information on the C++03 spec. It seems that implicit cast from
integer to boolean is allowed... under all circumstances... depending on what
specification the compiler is following :D

For future reference, I'll just summarize this as "C and C++ are minefields".
We'll just add that to the list of WTF behaviors.

By the way, if you think "C has had bool for 19 years" [the C99 spec
specifically], you clearly haven't worked in C long enough with a large
variety of tools. The world is bigger than just GCC.

~~~
mining
> But it doesn't give one when the cast is in a loop condition. That is weird.

I believe that the justification for that is that you'll often want to do e.g.

    
    
        while (node) {
          node->val += 3;
          node = node->next;
        }
    

Implicit conversion of a type into a bool is pretty useful here, or for e.g.

    
    
        while (std::cin >> x >> y) { ... }

------
rtpg
Some of the issues defended here could be resolved if operator overloading
weren't so prevalent.

"++ is also used for iterators" Python has "next". I understand there's
"*p++", but the semantics of ++'ing an integer or ++'ing an iterator are very
different! Why should they share the same mechanism? (There's an argument that
++ on integers is iterating over N. Pedantically true)

"Integer math is useful!" Floating point semantics are different, so why not
use different operators for such? Caml has + for its, +. for floats.
Impossible to mix different semantics together implicitly.

\--

I find saying "whitespace sensitivity makes auto-indentation impossible" to be
a bit silly. The C equivalent of auto-indentation in Python would be auto-brace
insertion! Just as impossible.

There are definitely parsing difficulties with whitespace-sensitive languages
(technically, lexing difficulties). If you're willing to go for parenthesized
function calls like in Python, it's not too hard. But go for Haskell-like
function application and you're in for a treat! Would not want to have to
parse Scala.

I can see the argument for braces being better, though. It's such a matter of
taste that if you don't want to be opinionated, going for explicit blocks is
really your only choice.

------
Manishearth
> What’s Wrong with Increment/Decrement?

This sort of misses the point there. What eevee was saying was that "Usually
you use ++ to mean +=1". It _is_ equivalent to +=1 in those cases.

Eevee then goes on to say "The only difference is that people can do stupid
unreadable tricks with ++.". She _is_ talking about postfix/prefix there. She
never said that ++ and += are _exactly_ the same, she's saying that they're
mostly the same, except for some cases which often read to more unreadable
code. Using i++ or ++i as an expression within a larger expression often leads
to unreadable code.

The author here then goes on to talk of off by one errors, but ++-as-
expression is what _causes_ a lot of off by one errors.

[https://news.ycombinator.com/item?id=13089663](https://news.ycombinator.com/item?id=13089663)
explains this a bit too.

\---

The other two points also miss the point a bit. The post is not "why C is
bad". The post is "Let's stop copying C". There are good reasons behind C
having many of these features. These reasons do not necessarily port to other
languages; yet they copy the feature.

------
agentgt
In college (over 10 years ago) I used to think C was very viable. Sure, it
took longer to develop, but the reward was fast execution and generally KISS-
like code (just an anecdotal observation, but in C the tedium of writing
complex things forces simpler solutions... I mean, who wants to malloc over
and over).

But near the end of college I had to write code using pthreads. It was awful.
It was exceedingly painful. Banging on the keyboard cussing continuously
painful.

Maybe it was just pthreads (I'm sure there are nicer libraries) or my
stupidity but that exercise killed my mild liking of C.

Languages I like are heavily expression-based but don't require braces (as in
Algol-like languages) or parentheses (as in Lisp-like ones). As much as I
dislike braces and parentheses (for blocks), I despise statement-heavy
languages more (sadly, Python). I wish the "Stop Copying C" article had
mentioned that (or maybe only I hold that opinion).

~~~
module0000
> But near the end of college I had to write code using pthreads. It was
> awful.

For what it's worth, I don't think it's C's pthread library that is painful,
but POSIX threading in general. Languages and libraries try to "simplify"
threading by re-inventing pthreads. E.g.: Python's 'threading' module, Erlang
in general, Java's 'Runnable' interface, golang's goroutine runtime, etc.

The difference between all those language-specific threading implementations
and POSIX threads is that POSIX threads work across _every_ language, while
all those implementations are only relevant to the language itself. Working
with a "one size fits all" (pthread) tool is inherently more complex than a
simplified tool specific to one language.

Summary: once you learn POSIX threading [well] in C, using other pthread
abstractions in other languages becomes much easier.
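
For reference, a minimal sketch of the raw API being discussed (error
handling omitted; compile with -pthread):

    
    
      #include <pthread.h>
      #include <stdio.h>
      
      /* A thread body takes and returns opaque pointers. */
      static void *worker(void *arg) {
          printf("hello from thread %d\n", *(int *)arg);
          return NULL;
      }
      
      int main(void) {
          pthread_t t;
          int id = 1;
          pthread_create(&t, NULL, worker, &id);   /* spawn          */
          pthread_join(t, NULL);                   /* wait for exit  */
          return 0;
      }
    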

~~~
agentgt
If I recall, it wasn't that I didn't understand the concepts, but rather the
difficulty of debugging. At the time I was ignorant and inexperienced with
proper tooling. Today, when dealing with C, I would probably not even bother
with threads and might just use multiple processes.

I agree about the re-invention of POSIX threading, and despite my rough
experience, pthreads was worthwhile learning about.

------
bArray
I 100% agree with this article. I'm a student who has learned Python, Java
and C - using each depending on what I need to do.

Having taught as well, I'd say Python is not a language for beginners. They
are punished for getting their spacing wrong, often confuse types, OO seems to
have been an afterthought, and versions 2 & 3 are completely different
languages.

C can also be confusing for beginners, but not for the reasons mentioned.
Teaching people about pointers for the first time spins heads.

Java for me seems to be the middle ground. Good understanding of OO, portable
code, solid types, well-formed errors, beautiful garbage collection, and well
thought out (libraries, types, access, concurrency, etc.).

C is obviously an advanced language, but like PHP there is a good reason why
it's not ready to be buried yet.

~~~
defgeneric
One of the things (among many) that _does_ make python a good language for
beginners is its short developer feedback loop. The pleasure of programming is
important, so even if they're writing terrible code at first, there's a strong
motivation to keep going. Along the way their skills develop.

~~~
bArray
I agree, but this is achievable in other languages in the same way it is for
Python - simply output something interesting via a package or library. In the
first practical we have students drawing shapes in Java and crudely animating
them.

------
junke
Regarding integer division, the Wikipedia article about modulo operation has
nice graphics.

[https://en.wikipedia.org/wiki/Modulo_operation](https://en.wikipedia.org/wiki/Modulo_operation)

------
pmarreck
Is there any room for a form of C with immutable semantics, which forces you to
jump through special hoops to get mutable data access (solely for performance
reasons)... Or does Rust basically have this design goal already?

The limitation of programming (from a bug-minimization perspective) is the
programmer's ability to understand all the possible states his code and data
can get into, which means that the key to eliminating bugs and security holes
is any pattern that reduces cognitive load.

~~~
sidlls
I'm struggling to understand how requiring explicit mutability has anything to
do with performance or "hoops."

I mean, getting mutable access is trivial:

    
    
      let mut x = ...
      fn frobnicate(x: &mut...)
    

The performance "problems" in Rust requiring "hoops" are in very narrow
applications (e.g. certain kinds of high performance numerical computing). I'm
not a fan of the "unsafe-offset" idiom required so that every access into a
vector doesn't waste cycles on a pointless bounds check (for example).

But I must say for a given chunk of Rust code the equivalent C++ code is
generally more bloated, because unless you're writing code begging for abuse
you're going to have "const" and move operations everywhere, which is sort of
inverted from Rust's model.

~~~
steveklabnik
Bounds checks are often elided, so many accesses of vectors won't waste cycles
on pointless checks. Or at least, if they are truly pointless, and they're not
elided, that's an LLVM bug.

------
hellofunk
> The only thing I can suggest you to do is to actually go program in C for
> some years, write some good software, and you will see what I mean.

(from the OP's comments to another reader)

Allow me to translate: Ok, let's stop bashing C. Let's start bashing people
who have legit complaints about C.

I don't think the above comment was warranted. Someone took the time to read
your arbitrary article and attempted to offer a meaningful point, and your
response was to demean the fellow.

------
Pica_soO
And of course engineers and architects are blameless on this - we would not
allow for gradual replacement, no, Sir, we need a revolution, tear down this
Wall and build a brand new Castle in the sky. Nobody wants to weld new things
to a rusty Compiler, until the Compiler is all rusty, but still fully able to
bind all the legacy inertia to his will.

------
OskarS
I've always been a huge fan of the increment and decrement operators. I don't
use them all the time, but they're very nice to have on occasion, and with
operator overloading they can be made incredibly useful. I thought it was a
very silly decision for Swift to remove them.

~~~
Tempest1981
Here is some discussion of why Python never added them:
[http://stackoverflow.com/questions/1485841/behaviour-of-
incr...](http://stackoverflow.com/questions/1485841/behaviour-of-increment-
and-decrement-operators-in-python)

Edit: a nice write up by Chris Lattner: [https://github.com/apple/swift-
evolution/blob/master/proposa...](https://github.com/apple/swift-
evolution/blob/master/proposals/0004-remove-pre-post-inc-decrement.md)

~~~
dap
It's very unfortunate that attempting to use the preincrement operator in
Python results not in a syntax error, but in code that silently does something
very unexpected: ++x parses as two unary plus operators, a silent no-op.

------
autognosis
I've seen these pixels before. C is a permanent fixture. It is not going
anywhere in your lifetime. You can use it, or not.

[http://www.joshianlindsay.com/index.php?id=170](http://www.joshianlindsay.com/index.php?id=170)

------
MaulingMonkey
Given that the original article that this replies to starts with "It [C] works
well for what it is, and what it is is a relatively simple layer of
indirection atop assembly," I'm not sure "bashing" is exactly accurate.

Given that this reply also starts with "To begin with, I agree with most of
the things Eevee wrote", I'm not sure "Let's stop X" is the right title to
use, even if it matches the theme of the original article.

The reply itself is... a decent alternative viewpoint on a few of the points
made, but I don't think it quite matches the title. And now some counter-
counterpoints:

> What’s Wrong with Integer Division?

This is the wrong question. The right question is: what's wrong with integer
division _if and only if both arguments are integers, and ambiguously
conflated with floating point division in syntax_? I'm sufficiently used to it
not to find it a big deal, but that's a poor argument for its merits in new
languages.

Nobody thinks integer division shouldn't be _an option_.

> However, the behavior of integer division is very useful. I would argue that
> in most cases, one expects it to behave just as it does.

I've seen even professionals trip up over the semantics of this just regularly
enough to be annoying - you read float a = x/y; and assume terrible things
about x and y, things that were perhaps even once true. Even if you really did
intend to round, you probably assumed 'round down' semantics and will have a
bug to contend with when x<0, where the result is instead rounded up, toward
zero.
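
Concretely (C99 pinned this behavior down in the standard):

    
    
      #include <stdio.h>
      
      int main(void) {
          printf("%d %d\n",  7 / 2,  7 % 2);   /*  3  1                  */
          printf("%d %d\n", -7 / 2, -7 % 2);   /* -3 -1: truncated toward
                                                  zero, where floor(-3.5)
                                                  would give -4          */
          return 0;
      }
    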

Personally, I haven't found it that difficult to adjust to languages that have
a separate integer division operator, or languages requiring an explicit
rounding operation, despite a long history of only using languages with C's
semantics before them.

> Unfortunately, Eevee seems to fall for the extremely common “++ is
> equivalent with += 1” fallacy. You can see it’s a fallacious statement when,
> in the end, even the author herself admits that there are things that can’t
> be implemented in terms of “+= 1“; for instance, incrementing non-random-
> access iterators.

That Eevee herself points out that "++" is not always "+= 1" makes it quite
clear that she hasn't "fallen" for any such fallacy. The point is that even
including other unmentioned (ab)uses of overloading, there _is_ always a way
to rewrite things in an equivalent manner in terms of "+=", "-=", or custom
functions - _combined with_ either the abuse of the comma operator, or
multiple statements. For the mentioned case of iteration - plenty of other
languages have a separate, named function for iteration, even if they allow
"++" and "\--" to be overloaded.

The only real point of debate here is how often the "++" and "\--" operators
produce _concise_ code, versus how often they produce _write-only_ code.

------
drivingmenuts
It seems to me that a lot of issues could be better handled if we weren't
limited to programming in a subset of the ASCII char set. Unfortunately, the
alternative is to completely retool the modern keyboard.

~~~
dagw
We've already been down that road:

[https://en.wikipedia.org/wiki/APL_(programming_language)](https://en.wikipedia.org/wiki/APL_\(programming_language\))

------
Pica_soO
Only if C stops bashing us first- it started it.

------
VT_Drew
I will stop bashing C when C code looks as beautiful as Python code.

~~~
smitherfield
Of all the problems C has, looking ugly isn't one of them, and most of the
ugliness would be done away with if the language adopted a proper module
system and C++ "constexpr". The Linux kernel in particular is quite nice to
look at:
[https://git.kernel.org/cgit/linux/kernel/git/torvalds/linux....](https://git.kernel.org/cgit/linux/kernel/git/torvalds/linux.git/tree/kernel?id=v4.9-rc7)

------
piyush_soni
Let's stop bashing bashing.

~~~
paulddraper
I'd prefer to keep bashing bash, if possible.

~~~
Pica_soO
Hey, don't mock the wailing parade around the C-monument. This must be its
300th cycle and it's still going strong.

Helping the inertia by self-sabotaging with over-idealistic, non-backwards-
compatible languages, the software architects and computer scientists lead the
procession.

Followed they are by self-flagellating programmers, trying to figure out who
beat them to it again and again.

Followed they are by the silent businessman parade, who search for the perfect
compromise between throw-away code and reusability on hardware where the
solder-lead is not stable yet.

All hail the holy C, all hail the mighty procession of pain.

