Hacker News

I'm new to JavaScript, but from what I can tell, the no-semicolons style

- is incompatible with existing tools

- is incompatible in places with upcoming versions of JavaScript

- is harder to refactor, since you may need to add tokens to the beginning of a line depending on what the previous line ends with

- is confusing to new JavaScript programmers, since

  + it goes against the recommendations of many of the most popular books

  + it requires adding tokens to the _beginning_ of lines in places, making the entire enterprise more challenging to learn and of dubious utility

  + it discards one of the attractive qualities of JavaScript for programmers of other languages: familiar syntax
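
To make that "tokens at the beginning of lines" point concrete, here is a small sketch (the variable names are made up): without a semicolon, a line that begins with "[" is parsed as a subscript on the previous line's value, which is why the no-semicolons style needs a defensive leading semicolon in places.

```javascript
var s = "abc"
var x = s
["a", "b"]          // parsed as: var x = s["a", "b"], i.e. s["b"]
console.log(x)      // undefined -- the array literal never existed

var y = s
;["a", "b"]         // the defensive leading semicolon keeps the lines apart
console.log(y)      // "abc"
```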
And further, the use of constructs like

    !somethingHappened && otherwiseDo(x)

instead of

    if (!somethingHappened) {
        otherwiseDo(x)
    }

is bad practice in any language.
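
To be clear, the two forms do the same thing at runtime; the objection is about readability, not behavior. A minimal illustration (the names are from the example above, not any real codebase):

```javascript
var log = []
function otherwiseDo(x) { log.push(x) }
var somethingHappened = false

// short-circuit form: otherwiseDo runs only when the left side is truthy
!somethingHappened && otherwiseDo(1)

// explicit form: identical effect, but the intent is stated outright
if (!somethingHappened) {
  otherwiseDo(2)
}

console.log(log)  // [1, 2]
```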

"Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?" - Brian Kernighan, "The Elements of Programming Style"

Although everyone has the right to maintain their projects in whichever style they prefer, given the above, I wonder why this style was chosen.




Actually, constructs like:

    somethingHappened or otherwiseDo(x)

used to be idiomatic Perl (and may well still be, for all I know).

Edit: Just for amusement, it would probably be:

    somethingHappened not { x otherwiseDo } if

in PostScript.


And it would allow for constructions like

    open FILE, "filename.txt" or die

which is nice enough to read (and can be seen in actual examples around the web: http://www.perlfect.com/articles/perlfile.shtml ).
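
The same "or die" shape carries over to JavaScript via ||'s short-circuiting. A rough sketch, assuming made-up helpers (die and lookup are not standard functions):

```javascript
function die(msg) { throw new Error(msg) }

// returns the value for key, or dies if the entry is missing/falsy
function lookup(table, key) {
  return table[key] || die("no entry for " + key)
}

var config = { host: "localhost" }
console.log(lookup(config, "host"))  // "localhost"
// lookup(config, "port") would throw: Error: no entry for port
```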

The semicolon wars frustrate me. We already have an arbiter of what's correct/valid syntax and what isn't: the freaking lexer. Heaping additional prescriptivism on top of it for something as trivial as semicolons is irritating. If your code has ambiguity problems, fix them by writing unambiguous code. Semicolons are not the only way.
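
And semicolons alone don't settle every case: ASI inserts one after a bare `return` whether you wanted it or not, so layout still matters. The standard illustration:

```javascript
function broken() {
  return          // ASI inserts a semicolon here...
  { value: 42 }   // ...so this is an unreachable block, not a return value
}

function fixed() {
  return { value: 42 }  // brace on the same line: no insertion
}

console.log(broken())       // undefined
console.log(fixed().value)  // 42
```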


This boggles my mind. Why would someone use a convention that introduces ambiguity, when they could just type a semicolon and avoid the parsing ambiguity entirely?




