
Those Who Say Code Does Not Matter - swannodette
http://cacm.acm.org/blogs/blog-cacm/173827-those-who-say-code-does-not-matter/fulltext
======
eldude
This is fundamentally what I call the Tower Defense[1] model, borrowed from my
old manager here at LinkedIn, Rino Jose[2].

The Tower Defense model is an approach to software reliability that focuses on
catching bugs at the lowest level, to avoid the inevitable combinatorial
explosion in test coverage surface area. In other words, deal with problems
like these at the language level, so there is NO NEED to deal with them at a
higher, process-oriented level.

No one disputes that process, QA, or devops could or should hypothetically
catch these bugs before they enter production. The problem, of course, is that
they usually don't: the lower-level defenses let through too many bugs that
they shouldn't, so the higher-level processes become overwhelmed, fail, and
allow bugs to cross the critical production threshold.

This means always giving higher priority to lower-level methods of
reliability. For example,

* Language rules _are more important than_

* Unit tests _are more important than_

* Integration tests _are more important than_

* Code reviews _are more important than_

* QA _is more important than_

* Monitoring _is more important than_

* Bug reports

[1]
[http://en.wikipedia.org/wiki/Tower_defense](http://en.wikipedia.org/wiki/Tower_defense)

[2] [https://www.linkedin.com/in/rinoj](https://www.linkedin.com/in/rinoj)

~~~
blueblob
Can't you make the argument that even in C/C++ they could have enforced
different "Language rules" like forcing no warnings? When compiling with gcc
you could use -Wunreachable-code. Then it is part of the process.

EDIT: -Werror would turn this particular bug into a compile error, but not
necessarily the whole class of bugs.

~~~
gweinberg
An unreachable code warning would have caught this particular defect, but it
wouldn't have helped if the duplicated line were something other than a goto.
I think the policy of always using curly braces for conditionals (even if only
one line) in C-like languages is a good one.

~~~
AnthonyMouse
> an unreachable code warning would have caught this particular defect, but it
> wouldn't have helped if the duplicated line were something other than a
> goto.

In that case always using curly braces or using a language that requires an
"end" statement after the conditionally executed code may not have helped
either. Imagine the incorrectly repeated statement was "a = a + 1" or
"error_mask ^= error_x" etc. Putting the erroneous line inside the conditional
doesn't erase the error, it just modifies the conditions under which it
executes. That's about as likely to hang you as save you.

~~~
lukeschlather
Is there any language rule that can save you from incorrectly repeating an
operation that is not idempotent?

------
loumf
He's right, but he's disingenuous in saying that random line duplication can't
cause catastrophic problems in Eiffel. This very specific bug can't happen in
Eiffel, but the class of bug can (bug caused by bad merge or accidental,
unnoticed line duplication).

If most code were idempotent, functional, immutable, etc -- then we'd start to
get there, but usually randomly duplicating lines is going to be an issue
unless it's always a syntax error.

I'd say clojure has more of a chance. (1) lots of immutable data and
functional style (2) duplicating code lines is likely to result in unbalanced
parens -- the unit of atomicity is the form, not a line. Many forms span lines
in real code, and many lines contain partial forms (because of nesting).

Still there is plenty of clojure code that is line oriented (e.g. data
declarations)

~~~
coolsunglasses
Clojure isn't particularly well suited to avoiding problems like this. I've
written a lot of Clojure for work and for open source.

We need pure, typed FP langs like Haskell/Agda/Idris.

To boot, I'm having a much more enjoyable and relaxing time in a Haskell REPL
than I was in my Clojure REPL.

Somebody I follow on Twitter _just_ said something apropos:

"girl are you Clojure because when I'm with you I have a misplaced sense of
[optimism] about my abilities until I enter the real world"

~~~
loumf
I don't think clojure (or lisp) was designed to avoid line-duplication errors.
It's mostly an accident of it being not very line oriented.

I just randomly picked a function from core to illustrate this, but a lot of
clojure code looks similar

    
    
         (defn filter
           "Returns a lazy sequence of the items in coll for which
           (pred item) returns true. pred must be free of side-effects."
           {:added "1.0"
            :static true}
           ([pred coll]
            (lazy-seq
             (when-let [s (seq coll)]
               (if (chunked-seq? s)
                 (let [c (chunk-first s)
                       size (count c)
                       b (chunk-buffer size)]
                   (dotimes [i size]
                       (when (pred (.nth c i))
                         (chunk-append b (.nth c i))))
                   (chunk-cons (chunk b) (filter pred (chunk-rest s))))
                 (let [f (first s) r (rest s)]
                   (if (pred f)
                     (cons f (filter pred r))
                     (filter pred r))))))))
    

There are very few lines of this function that can be duplicated without
causing a syntax error because parens will be unbalanced.

I see two:

In a let with more than two bindings, you could repeat the middle ones. In
clojure, this is very likely to be idempotent. In this code, it's the

    
    
            size (count c)
    

In any function call with more than two arguments, if the middle ones are put
on their own line, they could be repeated, like this in the final 'if'

    
    
            (cons f (filter pred r))
    

In many cases, you will fail the arity check (for example in this case). If
not, the function should fail spectacularly if you run it.

So I think it's accidentally less likely to have problems with bad merges and
accidental edits (it wasn't designed to have that property).

~~~
yohanatan
Your analysis only holds if closing parentheses are all gathered on the same
line as final expressions (which may not be true for some styles).

~~~
loumf
I'm not thinking this through completely, but it seems resilient to a lot of
styles.

Function calls (which is a lot of what clojure is) are an open parens to start
and very likely not to have that close on the same line (because you are
building a tree of subexpressions).

Wherever you put the close (bunched or one per line), if you don't put it on
the line with the original open, it will be unbalanced in both spots (meaning
the first line and the last line can't be duplicated without causing a syntax
error).

~~~
yohanatan
True. But consider the form:

    
    
        (if (someExpr)
          (doTrueStuff)
        )
    

Then a duplication of the `doTrueStuff` line would lead to true stuff being
done regardless of the truthiness of someExpr (as the third [optional]
argument to `if` is the else branch).

This form is not entirely unheard of either. The overtone library for example
assigns labels to its event handlers like such:

    
    
       (defn eventHandler ( ...
          stuff
       ) :: event_handler_label)

~~~
Jtsummers
This is actually why I really like `cond` in Common Lisp (and other lisps and
languages). You have to make explicit what should happen if your desired
expression is true, and the only way to have an `else` clause is `(t ...)` so
you have to intentionally create that last wildcard spot.

------
ChuckMcM
That was a long way to go to insult C and brag about Eiffel. Ada doesn't have
this problem either, but nobody is jumping up and down saying it's the one
true language. Back when I was looking at PhD topics (I ended up jumping into
work instead) "provably correct" code was all the rage. Lots of folks at USC-
ISI were looking into proving the code expressed the specification, and the
resulting executable faithfully expressed the intent of the code. End to end
correctness as it were.

What struck me about that work was that invariably there was some tool you ran
over the specification and the code and it implemented some algorithm for
doing the work. And yet if you went that far, then you should at least be
willing to run something like lint(1) and had anyone at Apple run it, or made
warnings fatal (always good practice), the repeated goto would never have
escaped into the wild (useless code is useless; it's a warning in both GCC and
Clang, and always flagged by lint).

So is the challenge the language? Or the processes? I tend to favor the
latter.

~~~
kerkeslager
> So is the challenge the language? Or the processes?

Did you finish the article? This is exactly the false dichotomy the article
was about.

It's the language AND the processes.

~~~
joe_the_user
I finished the article.

It went on and on with its point but I didn't notice anything more
illuminating than the rhetoric he was peppering into his arguments half-way
through.

I think Chuck's argument still just trumps this guy: Language will always have
problems. You always need good process no matter what language and you can
find a tool to help you get over whatever "holes" some older language might
have. No amount of cursing lazy-people-unwilling-to-enter-the-21st-century or
referencing WWI will make the "you need ze one language zhat does zit all"
argument that much better.

~~~
kerkeslager
Process and tools will always have problems. You will always need a good
language no matter what process or tools you use. No amount of touting of
process will make the "you need ze one process zhat does zit all" argument
that much better.

Believe me, I know there will never be a perfect language. But this idea that
we only need process is tantamount to a claim that there will be a perfect
process which will solve all the problems that a better language could solve.
That won't happen.

Why would you choose a crappy language and a good process when you can have a
good language AND a good process? Is there something about having a good
language that prevents you from having a good process?

------
freyrs3
Whenever "language doesn't matter" or "use the right tool for the job" is used
in an argument it's quite often as a thought-terminating cliche used as a
post-hoc justification for personal prejudices. I think almost everyone
intuits that there is at least a partial ordering to language quality and
safety, we just often disagree about how that ordering is defined.

~~~
michaelochurch
"Use the right tool for the job" != "language doesn't matter".

There isn't one language to rule them all. If there were, we'd be able to stop
discussing them. But sometimes you need explicit memory management, sometimes
you need rapid development, sometimes you need high performance, and sometimes
you need a strong library ecosystem. "Right tool for the job" means that
getting the language correct is important, but that it's not always going to
be the same language.

~~~
shasta
This "use the right tool for the job" reasoning is just as flawed, though.

Switching between programming languages when you need explicit memory
management vs. garbage collection or high performance vs. rapid development
incurs a huge cost in interoperability. Should an English speaking person
switch to French when they want to discuss love, since that's the right tool
for the job? Of course not. If one (natural) language has words or idioms that
are useful, they get incorporated into the languages that don't have them.

As the field of programming languages matures, we most certainly will want to
pick languages that can rule them all. Languages will be adaptable to support
different tradeoffs in the various dimensions you mentioned, without making a
bunch of arbitrary ad hoc changes to all of the other dimensions.

Whether there will actually be just one is another question. I'd guess
"probably not" for the same reasons that there's not a single natural
language. Probably we'll have fragmentation, instead. But it won't be because
people are choosing "the right tool for the job" and it won't be a good thing.

~~~
SoftwareMaven
There are idioms in spoken languages that do not translate to other languages.
There may be a literal translation, but it doesn't make sense or is
incredibly awkward in the other language. Programming languages share this
behavior.

~~~
shasta
Most current ones do, but this will change as new languages gain the ability
to express most of what the others can express.

~~~
tel
There are a couple limitations on that. For instance, you cannot have a
language that's both total and general recursive.

~~~
shasta
So have a total fragment in your general recursive language, or encode
recursion in your total language. Either is preferable to using an FFI to
combine markedly different languages.

~~~
tel
That's fine, but you're beginning to talk about a lot of languages glued
together instead of one "master language", I think.

------
habosa
PSA: Always put brackets after your conditionals (in languages where you
can). You never know when a one-line conditional will become a ten-liner, and
then you can get this sort of bug. It's not worth the two saved keystrokes now
to have the NSA in your data later.

I think the most readable code has no shortcuts and no tricks. I'll take
unambiguous over concise or 'beautiful' any day.

~~~
jobu
It's amazing how following good coding standards can make bugs go away. If
anyone disagrees, I would suggest reading Joel Spolsky's article "Making Wrong
Code Look Wrong":
[http://www.joelonsoftware.com/articles/Wrong.html](http://www.joelonsoftware.com/articles/Wrong.html)
That article changed my perspective on coding standards.

To your PSA, I would like to add: Never use the '!' operator at the beginning
of a conditional. It's too easy to miss when reviewing or changing code,
especially next to certain characters:

    
    
        if(!llama) {
            ...
        }
    

This is a tiny bit more text, and so much safer for maintainers:

    
    
        if(llama == false) {
            ...
        }

~~~
wting
Now you have a bug if someone forgets an equals sign, since this is always
false (the assignment evaluates to false):

    
    
        if (llama = false)
    

So use Yoda conditionals instead:

    
    
        if (false == llama)
    

Is that really an improvement over the original? As I've stated in my sibling
reply, you are decreasing human readability for machine readability.

~~~
wtetzner
Maybe

    
    
        if (llama != true)
    

instead?

Or maybe you just use a not function:

    
    
        if (not(llama))

------
pjungwir
Paleographers have a whole catalog of scribal errors, which can be useful when
trying to reconstruct a text from conflicting extant copies. Perhaps it would
be helpful to compile such a list of common programming errors, and consider
that list when designing a new language. It would include "scribal" errors
like Apple's goto or = vs ==, and also low-level logical errors. It seems like
this could make a fun paper for any CS grad students out there.

~~~
marcosdumay
There are such lists, and compilers do generate warnings when you use a
construct in it. Most of the problems is with people ignoring warnings.

But then, once in a while you have an example of bugs being inserted because
the right construct was one that generated warnings. The Debian SSH bug is
quite a clear example.

~~~
pjungwir
That's a good point. I'm thinking more about 10-20 categories of language-
agnostic error rather than hundreds of language-specific errors. And there may
still be scope for a paper that explicitly examines the paleography lists for
inspiration. Yay multidisciplinary research! :-) But it's nice to think that
such a paper might have applications not just in language design but also
compiler warnings, which certainly seems more practical/realistic.

------
AnimalMuppet
"My pet language renders that problem impossible."

Um... OK.

"Therefore you should use my pet language rather than one written in 1968 for
a PDP-11."

Not so fast.

First, when the language was written has nothing whatsoever to do with how
useful it is today. (Cue the Lisp advocates.) It's just a gratuitous slam, and
it comes off as being petty.

Second, even if Eiffel does completely prevent this class of problem, what
about the reverse situation? What classes of problems does Eiffel allow that
other languages prevent? (Don't bother claiming "None". That's not realistic.
It just means that either you don't see the flaws, or you're a propagandist.)

It's about the best tool for the job. Now, it's fine to argue that another
tool would have been better for that particular job, but "avoiding one
particular class of bug" is nowhere near good enough.

One point for the original article, though: Code does matter. Choice of
programming language matters. Code reviews matter. Testing matters. Code
review policies matter. Developer training matters. Developer culture matters.
It all matters.

------
guelo
Since he lumps in Java with C and C++ it's worth pointing out that this
specific bug is not possible in Java since unreachable code is a compiler
error. I assume the same for C#.

Also, many style guides such as Josh Bloch's highly influential 'Effective
Java' recommend against the 2-line if statement without curly braces since
it's known to be prone to this type of error. His argument that keywords are
better than braces for ending blocks is weak.

~~~
mikestew
> it's worth pointing out that this specific bug is not possible in Java since
> unreachable code is a compiler error

Not on my Java compiler (Android toolkit and Eclipse). I see it as a warning
throughout the code base I work on. :-(

~~~
josephschmoe
Use IntelliJ (Google's favorite Android IDE - they make a mod of it called
Android Studio but the good stuff makes it back to IntelliJ)

------
rguldener
Well, the ironic part is that the official Eiffel compiler compiles the Eiffel
code down to C, which is then compiled again into assembly. So technically
speaking, Eiffel still relies on C... Note that it's not very optimized C
either; it was much slower than Java for most of the stuff I tried (with
contracts disabled).

Said compiler also happens to be terribly buggy and unreliable: The author
still teaches the CS "Introduction to programming" class at my university with
this language and every year students struggle with the language and the
obscure IDE. I also don't know anybody who ever wrote a line of Eiffel again
after that class, even though the idea of contracts is kind of interesting.

Summa summarum: the best language constructs don't help if your basic tools
are broken and make it a pain to write in that language.

------
byuu
I really don't see how enforcing {} syntax on all conditionals is going to
make us so much safer.

Yes, people make mistakes, but this is a pretty huge screw-up. If you are
modifying an unbraced if-statement and aren't paying attention to scoping,
then you are being woefully negligent at your job. Especially when you are
working on cryptographic code used by millions of people to protect their most
valuable information.

So let's say we force more red tape to make sure this doesn't happen. Those of
us who pay attention to scoping probably won't mind too much, it's good
practice to do this anyway.

But what about the mediocre programmer? He may decide that now his if/else
if/else three-liner, when adding new lines for {}, should really just turn
into a switch/case. And now he neglects a fall-through case, or adds an
unconditional break; before important code. And we're right back where we
started.

It doesn't matter how much we safeguard and dumb down languages. We can load
our languages full of red-tape: extra braces, no jumping or breaking, no fall-
throughs, always requiring explicit conversions, no pointers, no null types
... all we'll end up with is code that is much harder to read (and possibly
write), while the mediocre programmers will find new and inventive ways to
screw things up. It's just as likely to make them even more lax, and attract
even less disciplined programmers into important development roles. You know,
since it's presumed to be so much safer now.

The real problem is the number of poor programmers out there, and the lack of
repercussions for these sorts of things. A doctor that leaves a scalpel in a
patient is (rightly) ruined for negligence. Do you think the "goto fail;"
writer even received a written warning? Why not?

I'm not saying people can't make mistakes, but I think your pay scale and the
importance of what you do should come with some actual responsibility for your
errors. Just like in every other profession out there.

Yes, sometimes you can blame the tool. But there are also times when you need
to blame the user.

~~~
konstruktor
Ruined for negligence - medieval thinking combined with the US legal system.
Like that's going to help the patient in whose body the scalpel was forgotten.

But I like your example. Because in a well organised OR, the doctor cannot
leave a scalpel inside the patient all by herself. The nurse would have to
fail to properly count the instruments at the same time. BTW: all non-metallic
objects have embedded pieces of metal that will show up on X-ray, so errors
can at least be detected. Do they let plumbers operate on patients because
it's now much safer? Not where I live, the laws are pretty strict with respect
to who can practice medicine.

If you want safety, stop thinking in terms of blame and vengeance and design
systems that avoid errors, and reduce their impact if they occur. This
includes culture, processes and tools to protect against errors by those who
do the work, and some regulation to stop management from putting employees in
situations where they are likely to cause harm.

Those measures have made aviation safe, and medicine is catching up. Time for
the software industry to mature.

~~~
byuu
> If you want safety, stop thinking in terms of blame and vengeance

But that's the thing, I don't think anyone is really doing that in the
industry. It's not the solution to start blaming the programmer, but it's a
part of it. The other part is like you said, better accountability. It should
be every bit as concerning that the compiler didn't catch the dead code, that
there was no code reviewer, that there was no static code analysis, that there
was no test suite to ensure bad SSLs weren't passing validation, etc.

> Time for the software industry to mature.

Exactly! The way I see it now, there's no real accountability. We go, "Oh
well, it's the fault of the language. It shouldn't have let me screw up. If
only we had a new language without unbraced-ifs ... and it somehow caught on
and replaced all 40-years of legacy C code. Whelp, back to business as usual."

I don't see unbraced-ifs as this great security flaw, and I don't see 'fixing'
it as curing some endemic problem with language design that's going to lead us
to not have bugs like this again. It's too reactionary.

It may not be best-practice to do this, but I'll admit there are times I want
to add a quick one-liner check: "if(already_initialized) return;", and it's
nice not having to put the extra braces there just because an Apple engineer
once made a mistake.

For better or worse, the nature of technology is pragmatism, and not idealism.
C let you use unbraced-ifs, and now it's the most used language in the world.
We can argue about how this should change, but it's never going to. We can
design a new language and maybe one day it'll overtake C. But until then,
let's stop blaming our tools for things we should be taught on the first day
we start using them.

------
haberman
"When people tell you that code does not matter or that language does not
matter, just understand the comment for what it really means, "I am ashamed of
the programming language and techniques I use but do not want to admit it so I
prefer to blame problems on the rest of the world", and make the correct
deduction: use a good programming language."

As emotionally satisfying as it can be to stick it to people we disagree with,
I think we as an industry could do with a lot less of this black and white
thinking.

Programming languages do not fall into a neat good/bad dichotomy. Tell me your
favorite programming language and I will tell you three things that absolutely
suck about it (even if I like it overall).

Yes, if C could do it all over again it would probably mandate that brace-less
blocks go on the same line as the "if" (or are disallowed completely). So I
agree with the author that certain features of programming languages can make
it more or less error-prone.

But people still use C for a reason. That reason is that C has real
advantages. If you really want to improve software engineering, then help the
Rust guys out, but don't just tell C users to "use a good programming
language."

------
darrencauthon
His code example is:

    
    
        if (error_of_fifth_kind)
            goto fail;
            goto fail;
        if (error_of_sixth_kind)
            goto fail;
        The_truly_important_code_handling_non_erroneous_case
    

My question: If the "truly important code" is really that important, where are
the unit tests to verify that it "handles" the "non erroneous case????"

Test. Your. Code.

~~~
sparkie
Also: just because you can omit braces doesn't mean you should. Stick to 1TBS
and this kind of mistake shouldn't happen.

~~~
darrencauthon
This mistake shouldn't happen?

Mistakes always happen. Even if I dedicated myself to using braces
_everywhere_ , my mistake might be that a) I put the braces in the wrong
place, or b) I forgot to put the braces.

~~~
Jtsummers
Shouldn't != couldn't. In Haskell and the MLs, for instance, certain classes
of mistakes _shouldn't_ happen because of the type system and pattern
matching, but a single wildcard could throw that off.

------
john_b
To address only the "goto fail" example the author uses, I don't see how the
proposed Eiffel solutions are conceptually any different than always using
brackets in your C/C++ constructs. Brackets are the mathematical notation for
a set, and having a set of instructions inside them makes perfect sense even
if the set only has a single element.

Since

    
    
      if(condition){ instruction; }
    

instead of

    
    
      if(condition) instruction;
    

is already considered good practice, couldn't it also be enforced via compiler
pragma?

~~~
zackmorris
This is one of my most loathed developments in "best practices" lately. I see
it everywhere and it just adds noise, since it doesn't change the
functionality of the code. It's like adding a comment on every flow control
statement. A concrete example is try/catch, which forces the user to add
superfluous curly brackets.

In fairness, I have hit the "goto" style of bug he talks about in switch
statements, when I forgot to add a break statement. But I can't remember a
single time in my programming career when I added a line to an if/for/while
and broke something.

I think people are wasting their energies on subjective topics like telling
people to use spaces instead of tabs, or putting { on the same line as the
flow control statement, or even telling people how much whitespace to use
between () and where.

What we really need is a standardized formatter for any language that works
like Google's Go Format. Programmers should not be spending time refactoring
code just for looks. If anyone knows of a utility that does this and is easily
configurable for personal taste, I would sure appreciate it. Being able to
convert /* */ comments to // and back again would be a plus.

~~~
benihana
> _I see it everywhere and it just adds noise, since it doesn't change the
> functionality of the code._

I'm confused. It sounds like you're saying

    
    
        if (condition) { stuff(); other_stuff(); }
    

is the same as

    
    
        if (condition) stuff(); other_stuff();
    

because the brackets don't change the functionality, but they do. I might be
misreading what you're saying though.

~~~
detrino
What he was saying should be obvious, since there is actual code in the
comment he was replying to, and it's not what you wrote.

------
ryanobjc
It's about both!

From a programmer point of view, the ideal is a language that doesn't let you
make simple mistakes like the goto fail; bug.

From an engineering point of view, it's having the processes in place to make
sure that when such bugs inevitably happen, they don't end up in the final
product.

The reality is having both of these things would be ideal.

~~~
calinet6
Yes! It's about creating systems to reduce defects.

Processes are systems; language and its constructs are systems. Both may
contain methods to control variation and improve quality.

The point is to control variation. Nothing more. Any complete system that
connects what we intend, with what we tell the computer to do, with what the
computer actually does, is good. We need to move in that direction.

Blaming an individual is pointless. Never was there a more consistent or
uncontrollable source of variation than the human. We need to surround
ourselves with systems that enforce quality, and that do not let us choose to
introduce defects.

For this to take place, we have to first and foremost understand that this is
the goal. That quality comes from the system and not the individual. Until
then we will see only human error translated to computer error continually and
unstoppably.

[http://en.wikipedia.org/wiki/W._Edwards_Deming](http://en.wikipedia.org/wiki/W._Edwards_Deming)
had it.

------
jw2013
But the problem the author described in the article can be solved just by a
better programming habit: always add braces, even for a one-line conditional
statement.

Or put the one-line statement on the same line as the conditional, such as:
if(condition) statement; so the next time you try to add a line, you will
notice it was a one-line if statement.

But yeah, not explicitly requiring braces for one-line conditional statements
gives us more succinct code, but it does require better programming practice.

~~~
loumf
Exactly. And these things can be enforced by style checkers, so you don't even
need to have good habits.

~~~
pdpi
Style checkers should be exactly that: tools that check code for correct
style. If you have "stylistic" choices that are so important that you disallow
their usage altogether because they might lead to critical bugs, then what
you're really saying is that the language design is broken.

In fact, you're saying that you think that certain parts of the language
design are so broken you'd rather work with a subset of the language that
doesn't include those features rather than risk having them be misused. This
very much puts the choice of language in question.

~~~
AnimalMuppet
Not necessarily. "This language is the best available tool for the job. It
would be even better if..." That doesn't necessarily make it a bad choice. (It
may in fact be a bad choice, but more data is needed.)

------
nshepperd
The "improving our tools is no silver bullet, so let's keep using what we
already have" thinking that this guy is criticizing is an excellent example of
the fallacy of gray
([http://lesswrong.com/lw/mm/the_fallacy_of_gray/](http://lesswrong.com/lw/mm/the_fallacy_of_gray/)).
Of course, using Haskell doesn't prevent all bugs, and PHP doesn't _always_
cause disasters. But it's easy to see which of these is the darker shade of
gray. And there _is_ a point where switching to safer (but not perfectly safe)
tools is the right thing to do.

------
acbart
A lot of the Computational Thinking movements seem to stress that "Computer
Science is more than just computers!" And that's true, we have a lot more to
offer! But at the same time, it's so misleading because so much of our really
cool stuff is wrapped up in being able to program. I mean, CS is about solving
problems at scale, and computers are how we best solve problems at scale. We
can teach lots of carefully crafted things without them, but it's always going
to ring a little false, and we won't get students more than an inch-deep into
why we're truly cool.

------
pron
_Of course_ the programming language matters. The problem is that there are
many ways in which it matters, and some of those come at the expense of
others.

A language may be more concise, leading to shorter code (which is good), but
do so using tricks and "magic" that are hard to follow, eventually making it
less amenable to analysis by others (which is bad). A language could be very
declarative, thus clearly communicating intent (which is good), but do so with
leaky abstractions that distance it from the computer's view of things and
introduce subtle, severe bugs that require expertise in exactly _how_ the
language is implemented to catch (which is bad).

So while there are certainly languages which are categorically better than
others (at least for some domains), there is no consensus on the right path to
choose among the "better languages", and they all take completely opposing
approaches. I'd even say that most languages used in new code written today
are among those "better languages". So while in theory the choice of the
programming language matters a lot, in practice -- not so much (as long as you
choose one of the many good languages). I don't think that we have a language
(or several) that is _that much_ better at preventing bugs as to offset any
other advantages other languages may have.

------
PavlovsCat
> "If you want the branch to consist of an atomic instruction, you write that
> instruction by itself"

No, I don't; I generally use curly braces for those too. So for me, the
solution would be throwing a compile error when those are missing. Is that
really all it takes to make a language "modern"?

I also don't understand the jab at semicolons, which I like, nor do I see how
getting rid of brackets around the condition is really a net saving when you
have to write "then" or "loop" instead. Apart from being twice as many
characters to type, words distract me (much) more than symbols. Now that I
think of it, I wonder what programming would be like with symbols for reserved
keywords like "if" and "function", so the only text you'd see anywhere in your
code would be quoted strings or variable/function/class names...

Anyway, I think when talking about distractions, one should be careful with
claiming to speak for everybody (at least unless you did a whole lot of
studying test subjects or something).

~~~
Roboprog
Having worked on both line oriented and free-form languages over the last 30
years, there's less opportunity for foolishness when 1 line = 1 statement. (so
long as you can continue a long line with an obvious convention such as a
trailing backslash, ampersand or perhaps a comma)

Line oriented syntax need not be limiting: Ruby, for example, does a good job
of treating statements such as "if", "while", "case" as expressions that can
return values.

Also, there's only one correct way to indent with "END" type keywords: the
body is indented to the right of the start/end lines, and the start
(FOR/IF/...) and end (END/DONE/...) lines are aligned.

Now if we could just get an "AGAIN (parameters...)" keyword for tail call
elimination, coupled with _almost_ mandatory immutability, in more
languages... :-)

~~~
PavlovsCat
Even though I already only put one statement in each line, and indent as if
indentation mattered, I still like semicolons and curly braces... just to have
the option to make the occasional very long freak line more readable without
having to change characters because of whitespace changes.

------
wellpast
This is terribly wrong.

Correctness is brought about by ALL of your tools in hand. These tools include
unit testing, processes like continuous integration and code review, and so
on, _in addition_ to language features such as its syntax and static analysis
capabilities.

The job of the programmer is to understand all of these tools and then to use
them conscientiously and well. There is NO tool a programmer can't shoot
themselves in the foot with. There's no prima facie perfect tool. And the
combination of your tools is a better thing to evaluate anyway. A nail isn't
universally useful. With a Phillips-head screwdriver, things get a little
better; but with a hammer, you'll start moving.

A good architecture and intelligent, disciplined execution are WAY WAY WAY
more important than the specific tools we use. Arguments like this one are
bikeshedding.

~~~
marcosdumay
Yet some tools are objectively worse than others. And there is such a thing as
the right tool for the job.

You can't just pin the problem on lack of discipline. People make mistakes;
if you stubbornly ignore that fact, you'll get defective products.

~~~
TeMPOraL
Agree 100%. If there's one rule history teaches us over and over again, in
every aspect of our lives, it's that you should never count on human
discipline. You build your systems to work in spite of, not thanks to, human
behaviour.

------
aetherson
It's clearly true that having a language which forced an explicit ending to an
if block would have prevented the goto fail bug.

But is there any actual evidence that code written in modern languages has
fewer bugs overall? Or is it all "let's focus on this one example"?

As another commenter mentioned, the goto fail bug would have been utterly
trivially caught by any unit test that even just tested that it reached the
non-error case in any way (you don't even need to test the correctness of the
non-error code).

I would like to see data before I believe that "errors that would have been
prevented by non-C semantics" constitutes a significant fraction of all bugs,
or that they aren't just replaced by bugs unique to the semantics of whatever
replacement language you're using.

~~~
pjmlp
I think it is also a matter of culture.

The modern language communities tend to be more open to static code analysers
and unit tests than the C community, even though C has had lint since the
early days.

------
TheLoneWolfling
How about forbidding newlines in a single-statement if and requiring a newline
after the first statement after the if?

So this is allowed:

      if (foo) bar();
      baz()

But this isn't:

      if (foo) bar(); baz()

And this isn't:

      if (foo)
      bar();
      baz()

Edit: formatting

~~~
jloughry
The formatter ate your indentation. Is this what you meant?

So this is allowed:

        if (foo) bar();
        baz()

But this isn't:

        if (foo)
            bar();
        baz()

And this isn't:

        if (foo) bar(); baz()

~~~
TheLoneWolfling
Yep.

------
AdrianRossouw
Code doesn't matter, what matters is what it does.

How usable, secure, stable, or fast it is are properties of how well it
accomplishes its task.

There's an amazing presentation by the author of Clojure called Simple Made
Easy. Since I can't just link people to a 1-hour presentation, I made some
notes on it:

[http://daemon.co.za/2014/03/simple-and-easy-vocabulary-to-
de...](http://daemon.co.za/2014/03/simple-and-easy-vocabulary-to-describe-
software-complexity)

The code that we write he calls a construct, and our application is what he
calls the artifact.

We need to evaluate constructs based on the complexity they create in the
artifacts.

Using C, for instance, affects the complexity of our applications by
introducing many more opportunities for human error.

------
dasil003
I absolutely believe that languages matter and are the best hope for improving
code quality and productivity overall. We need better languages.

But the way we get there is not to pick some small but critical bug that could
be avoided by an arbitrary style change and declare that a language which does
not suffer that stylistic pitfall is superior. The new language may have much
worse flaws. You're just playing language flaw whack-a-mole at that point.

If we want to improve we have to get a sense of what types of bugs are most
common across all programs and reason from a logical standpoint about how they
may be prevented. This will solve orders of magnitude more problems than
fiddling around with the syntax to minimize the number of typos people make.

------
wpaprocki
There are other reasons why choice of language matters. If you need a simple
web app, you're probably writing it in PHP or Ruby instead of C. But you'll
likely use C if you're interfacing with hardware. A lot of apps that need high
concurrency use Erlang. If you can write a quick Python script that solves a
problem before you can even get Eclipse to load, then why would you even
bother trying to write the solution in Java?

Language errors aside, it's pretty obvious that at least in some cases, the
choice of programming language does matter.

------
MortenK
Some types of errors might be easier to make in one language than another,
but a language that through syntax eliminates the possibility of all errors
is of course a ridiculous notion. Cherry-picking particular error types that
are avoidable in the author's language of choice does not prove anything.

The notion that code does not matter comes from development management
literature. It's not a case of actually meaning "I'm ashamed of the language
and the techniques I use". That's an awfully developer-centered point of view.

The influential factors in a successful software project are mainly the
quality of the people involved, then product scope, and from there a huge
drop down to development process, and finally technology, i.e. language.

It's been statistically shown that barring a crazily bad technology choice
(Visual Basic for the space shuttle kind of bad), language has very little
influence on the success of a project.

That's of course not a nice thing to hear for a developer who's convinced his
language of choice is the one true language. Regardless, it's well established
knowledge, gained years and years ago through statistical analysis of
thousands of projects.

------
miscreant
On the surface, this reasoning makes sense. Unfortunately, human coders are
the ones writing code. This means the code might have bugs in it, regardless
of the language you choose. While it would be great to invent a new language
n+1 whenever the blame for bad code can be directed at programming language n,
it is not likely that you will find coders that are willing to repeatedly take
that ride.

------
ebbv
Just use curly braces all the time. There's nothing in C or C++ stopping you
from using curly braces around single line statements. It's just that you CAN
omit them.

But having been bitten by this issue early on, I started always using curly
braces. I think it's the better way to write C/C++. Frankly I think those who
omit them are just lazy.

------
scotty79
Another thing that enabled this bug is that it was a language that allows
misleading indentation.

Programmers indent all code. By making indentation not meaningful in your
language, you are ignoring a large part of how programmers read and write
code and allowing for many misunderstandings between programmer and compiler.

~~~
clarry
Is there evidence that misleading indentation had a role in enabling the bug?

You can make a typo, you can duplicate a line, you can misindent a line, you
can put a brace in the wrong spot. I believe different languages simply trade
forms of potential mistakes for other forms. Ultimately no language will
magically fix or make bad code known; it has to be detected somehow. Tests,
static analysis, manual review, whatever.

~~~
scotty79
You are right. It wasn't the case for this exact bug. The source of this bug
was probably the design of the code editor, which had a duplicate-line
function bound to a single keyboard shortcut right next to the save shortcut
that every programmer uses roughly 1000 times a day. (What for? What's wrong
with Home, Home, Shift+Down, Ctrl+C, Ctrl+V, Ctrl+V? Do you really need to
duplicate a single line in place so often that it needs its own key
combination?)

However, a language that doesn't ignore indentation would make such an error
benign, as duplicating the line wouldn't place code at a completely different
level of the AST.

All programmers indent. Why so many languages happily ignore that?

~~~
mattgreenrocks
That's a great question.

I'm a huge fan of the Go compiler's general stubbornness. Have unused imports?
Compile error. Have unreferenced vars? Compile error. Perhaps misleading
indentation should be another compile error.

Pain in the ass? You bet; it's a feature. Ignore the whiny kids who insist
their 'flow' is broken by having to insert semicolons. Most software work is
maintenance, not new code.

------
maninalift
Surely the key point is that most of us read indentation first. It doesn't
matter whether you are writing C with semicolons and curlies, Ruby with no
semicolons and "end"s, or Lisp with parens: what the programmer really reads
first is the indentation. Those other things are sometimes checked afterwards.

Therefore there are two reasonable courses of action to prevent this kind of
problem:

    
    
      * use automatic code indentation
      * use a whitespace significant language

The second is absolutely the better choice. You may disagree but you are
wrong. This is not a matter of taste, it is a matter of programming language
syntax actually reflecting what the programmer reads.

~~~
matthewmacleod
That's just not the whole story. Significant whitespace can itself cause
issues - for example, it can be really easy to accidentally indent to the
wrong depth when merging or copying and pasting Python code.

Ultimately, well-formed indentation and whitespace is important, as you say.
But whitespace significance is not a panacea.

~~~
marcosdumay
Well, personally I don't like whitespace significance... But then, I wonder if
forbidding spaces* at the beginning of the lines wouldn't correct all the
problems.

* Yes, mandatory tabs. I know lots and lots of people will disagree; those lots and lots of people are wrong. Spaces are ambiguous, and have different semantics in different contexts. In a flexible language that does not make much difference, but when you want whitespace to have a specific meaning, it does.

------
bowlofpetunias
As far as I can tell (and I may not be the best judge of that, because I've
used a dozen languages over the past 25 years and still fail to see any really
significant difference other than language philosophy as in OO, functional,
etc), this is not about language quality but language _safety_.

Those two are not necessarily the same, and some of the most elegant languages
aren't particularly safe. Conversely, contrary to what people like to claim
based on hearsay and long-gone features, a badly designed language like PHP is
definitely not less safe than Python or Ruby.

By the standard for "good" set by the author, no dynamically typed language
would make the cut.

------
abshack

        if( DoSomething() )
            goto fail;
        else if(DoSomethingElse())
            goto fail;
            goto fail;
        else if(DoSomethingOtherThanElse())
            goto fail;
    

You get a syntax error at the final else-if. A better way would probably be:

    
    
        int err = 0;
        if(!err)
            err = DoSomething();
        if(!err)
            err = DoSomethingElse();
        if(!err)
            err = DoSomethingOtherThanElse();
    
        if(err) goto fail;
        

I would prefer chainable commands that abstract out the error checking,
though.

    
    
        err = Chain(DoSomething).Then(DoSomethingElse).Then(DoSomethingOtherThanElse);

------
hyperion2010
My takeaway is this: if we are going to do everything in our power to make
these systems work better, then choosing or developing a language that is
intentionally designed to draw attention to common security mistakes, or to
prevent them structurally, is a damned good thing to look into. We will also
do everything else in our power, but we had better put choosing or making
that language on the list.

------
Guvante
Alternatively use a whitespace sensitive language. Defining how to handle tab
vs space is irritating, but there are plenty of solutions to that.

------
ww520
But Eiffel can't prevent the bug of swapping the order of two statements
accidentally; it must be a bad language!

Come on. Cherry-picking a weak feature of a language to invalidate the whole
language is just disingenuous. All languages have strengths and weaknesses.
One has to weigh the positives and the negatives and decide whether it would
work.

------
anigbrowl
I found the article interesting, but I wonder why he didn't also discuss using
'switch/case' which would surely have been more appropriate than a succession
of IF statements.

Of course you can screw things up with switch/case too, but in my limited
experience that usually involves a design flaw rather than just a typo.

------
joesmo
The article's suggestion is: "make the correct deduction: use a good
programming language."

It would be enlightening to hear what Mr. Meyer or anyone else thinks would
fit that bill on Apple's platforms. Until the article actually provides a
real solution, it isn't making a point at all.

------
peterashford
Calling out Java for this issue is BS. Java doesn't allow unreachable code. If
Java had a 'goto' instruction, that code wouldn't compile, because the later
code would be unreachable.

The article is poorly researched rah-rah for Eiffel. Eiffel is a good
language, but not for the reasons the author states.

------
ef47d35620c1
What language is he implying to be "good"?

It's obvious that he thinks C, C++, C# and Java are bad (due to syntax). The
world mostly runs on those, so I guess we're all doomed. But if they are so
"bad", then what does he consider "good"? I read it, but must have missed that
part.

~~~
pdw
He's well known as the designer of Eiffel.

~~~
ef47d35620c1
Thanks. I did not know that.

------
jheriko
Yet nobody is making a better C, and we are stuck with performance vs. safety.

He hit the nail on the head about if-statements, although I disagree about the
fancy syntax examples: enforcing scopes is enough. Every decent coding
standard I've worked with forbids one-line ifs... as does common sense
acquired 15 years ago...

------
al2o3cr
I think you wind up with problems no matter what the tooling - for instance, a
language that required that every line be provably terminating would never
suffer from infinite loops, but whether the _project_ using said language
would ever halt remains to be proven. :)

------
gweinberg
BTW, am I the only one who thinks that duplicating a line of code is 1) not
all that common in the first place and 2) something that should FUCKING JUMP
OUT AT YOU IN CODE INSPECTION LIKE A BLOCK OF ALL CAPS FUCKING FILLED WITH
PROFANITY?

srsly, this was not a subtle hard-to-find error.

------
hyp0
That font - tiny and grey - was by far the hardest to read on Android
(increasing the minimum font size to 20 pt fixed it).

It's considered good practice to brace if/loop clauses (unless on the same
line) for this very reason. Not enforced, though I expect lint picks it up.

------
boomlinde
> Often, you will be told that programming languages do not matter much. What
> actually matters more is not clear; maybe tools, maybe methodology, maybe
> process.

In this case it could easily have been caught if they had full test coverage,
whatever language was used, so yes.

------
jds375
About the single- and multiple-expression if-statements, I couldn't agree
more. Everyone says I'm an idiot for always assuming multiple expressions (it
is ugly), but in the end it is safer.

------
pjmlp
I wonder if Eiffel could have gotten more widespread use if Bertrand Meyer
had gotten a major OS vendor behind it, instead of trying to cater to the
enterprise.

~~~
loumf
I looked at Eiffel seriously in the late nineties. My issue was
interoperability and libraries. A little later, he tried Eiffel.NET, but I had
moved on by then.

Eiffel's main selling points (safer, purer OO) were eventually credibly
implemented in Java and C#, and people who wanted something else, wanted
something really different.

Not sure anyone really wanted Eiffel's extensive (and sane) multiple
inheritance support. I certainly wanted pre/post conditions and thread-safety.

~~~
phpnode
Eiffel's contracts are really compelling, I wish there was a reasonable, non-
hacky way to implement them in more mainstream languages.

~~~
loumf
I wouldn't say that clojure is mainstream, but their implementation is
reasonable and makes sense in the clojure idiom (just a meta-data map on the
function)

~~~
phpnode
Turns out there's a sort of reasonable way to implement them in JavaScript,
it's definitely hacky though -
[http://codemix.github.io/contractual/try.html](http://codemix.github.io/contractual/try.html)

------
sfk
Is it possible to prevent timing attacks, control secure wiping of memory etc.
in Eiffel?

~~~
loumf
My recollection of Eiffel (from 10+ years ago) is that it's a more pure OO
system where memory allocation isn't usually done raw, but more like Java or
C# -- in the context of constructing an object.

If you were writing OpenSSL in it, and you decided to use system calls then
you'd be in the same boat. If you OO modeled it with classes, then you might
be more safe. Not sure what it does if you model a bunch of bytes as an array,
allocate it and don't manually wipe.

What do you mean by timing attacks? If you mean timing the string comparison
on a login attempt, then no way -- no language forces you not to compare byte-
by-byte, right? I doubt they even discourage it.

------
brianbarker
I agree that distinguishing between compound and non-compound statements is
terrible. However, so is using whitespace-delimited blocks (sorry, Python). I
wish he'd proposed the simple solution Go uses: require braces.

~~~
brianbarker
So I offended a Python enthusiast. This is why I hate commenting on these
types of articles.

------
autokad
I thought it was bad practice to not insert the {} even if the statement only
had one line of code. Is this not true?

------
Dewie
With all the pitfalls of C, the unbraced ifs feel like a very impotent
example. Are pretty-printers never used? Maybe a good linter would be able to
warn about this, too.

------
justizin
blah blah blah acm blah

~~~
stephenmm
I am surprised I have not seen anyone mention this yet but a simple regression
with line coverage enabled would have caught this. Is it not common in the
software community (I do mainly Hardware) to run regularly with line coverage?
This seems like a basic problem with process in my opinion.

~~~
mikestew
My experience says it's uncommon. Microsoft seems pretty good about it, and
has good tools available, though it might vary a lot from team to team.
Smaller companies? I've at times been revered as a testing deity when showing
someone code coverage results. Other companies know what the word "coverage"
means, but seldom know about or utilize the tools available. They just know
they probably ought to do it, which is a step ahead, I guess.

Where I'm currently at, my manager wanted code coverage but needed someone to
do it. Since I'm just that guy, I set up the infrastructure for iOS. Android,
you're such a pain in the ass to set up for (rooted device, or the dog-slow
emulator, and it still doesn't give me results) I'm surprised anyone does code
coverage for Android. I can't remember the name of the python tool, but holy
smokes it's slick and painless.

So tooling quality varies, which might have a lot to do with the lack of
coverage testing. I long for the days of the coverage tool Tim Paterson wrote
(yeah, that Tim Paterson). No profiling, it would inject the coverage DLL at
runtime. Then, as you used the program, it would log coverage on the fly. So
program running on one monitor, coverage log on the other, and start clicking.
Can't remember the name of it (it's no longer sold by Paterson Technologies),
but damn it was slick.

------
usumoio
I think this guy might be confusing code with style. This is why I never ever
let the interns not use braces. We built a style guide that is good at
blocking these types of errors. Not to mention that Apple should have tested
enough to catch that error... It is about the code, but maybe not in the way
this guy thinks.

~~~
jerf
Few languages that have copied their if statement from C have adopted C's
exact syntax. Most, if not all, make the braces mandatory. It is a flaw in the
language that it _permits_ such a dangerous style.

~~~
mercurial
Go and Rust make the braces mandatory. Hugely more popular curly-brace
languages such as Java, JavaScript, or C# don't.

