Hacker News
Stack Overflow answer explaining JS in the "Wat" talk (stackoverflow.com)
152 points by sblom on Jan 29, 2012 | 23 comments



That's all quite straightforward, but I think the real takeaway is that it's important for operations that don't make any sense to complain loudly.

    >>> "wat" + 1
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: cannot concatenate 'str' and 'int' objects
    >>> "wat" - 1
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: unsupported operand type(s) for -: 'str' and 'int'


There are two separate issues at play here. One is NaN--you can either have NaN values or have errors from invalid mathematical operations. Since JavaScript only has floats (which contain NaN values), NaNs make just as much sense as errors.

Another, more general, issue is type coercion. You can either have a "strongly typed" or "weakly typed" paradigm (although both terms are somewhat nebulous). I rather like JavaScript's coercing everything--after all, a 1 is a 1 even if it happens to be a string. Thus, I think concatenating a string with a number or trying to do arithmetic with strings makes perfect sense.

A different option--this is what Python does--is to not coerce types. So trying to concatenate a string with a number would give you a type error. However, this also has an advantage: you can now reasonably overload your operators. You can have + add numbers, concatenate strings, append lists, etc.

The real problem comes from trying to have both type coercion and an overloaded + operator; either option by itself is reasonable. JavaScript--probably in an unfortunate attempt to ape Java--has an overloaded + operator that does both concatenation and addition. If it instead had just a separate concatenation operator, the endless problems with + would not arise.
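A quick sketch of that collision (any JS console; the results follow from the usual ToNumber coercion):

```javascript
// Purely arithmetic operators coerce both operands to numbers,
// so coercion by itself is consistent:
console.log("2" * 3);  // 6
console.log("2" - 3);  // -1

// The overloaded + is where coercion and concatenation collide:
// if either operand is a string, + concatenates instead of adding.
console.log("2" + 3);  // "23"
console.log(2 + 3);    // 5
```

Languages that keep coercion but give concatenation its own operator (Lua's `..`, PHP's `.`) avoid exactly this ambiguity.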


You're quite right in that this is a design choice rather than a defect. I do, however, have a preference for strongly typed languages (and there are very different flavors thereof, from Python to C++ to SML). I think weak typing is great for getting prototypes out the door in a hurry (or even just for getting up to speed in a language in a hurry), but dynamic typing really helps in the later stages of an application.


First, let's be clear on terminology: dynamic is the opposite of static; weak is the opposite of strong; the two are orthogonal. "Weak" and "strong" typing are about coercion; "static" and "dynamic" are about compile-time verification.

I personally think that if your language is dynamically typed, it should be weakly typed; if your language is statically typed, it should also be strongly typed. So I would (and do) prefer JavaScript/Haskell to Python.

I figure if you're going to be dynamically typed, you may as well be as flexible as possible; it also makes sense from a semantic point of view (my whole a 1 is a 1 is a "1" spiel). On the other hand, if you're enforcing types at compile time, having type coercion just seems weird. (Of course, you can have ad hoc polymorphism like Haskell's type classes that could do something similar, but that's different.)

I think the practical difference between "weak" and "strong" typing is much smaller than the difference between dynamic and static typing.

Finally, I do think the behavior of + is a defect, but not because of coercion. Rather, I don't like how it does two different things. If it only added numbers, it would work perfectly; the other arithmetic operators all work as expected. (Yes, I think the NaN behavior makes sense.)


> after all, a 1 is a 1 even if it happens to be a string

Is it? In JS: "1" + 1 == "11"; 1 + 1 == 2


The problem there isn't with coercion, it's with + being both addition and concatenation, as I mentioned at the end of my comment.

If you use any other operator (any reasonable operator), it works: "1" - 1 === 0, "1" - "1" === 0 and 1 - 1 === 0.


Was an initial goal for Javascript that it be extremely syntactically accepting? It seems like a lot of these oddities are the result of having coercions that, while certainly not crazy when looked at in isolation, make for strangeness when combined.


Nitpicking, but it would be more accurate to say JS is semantically accepting. Eich tried to make as many syntactically valid constructs executable as possible.


I may be wrong, but I believe that the first version of JavaScript didn't have exception handling. This might explain why it was decided to use implicit coercions rather than throw an exception, since there was no way to catch it.


True! This is why people invented type checking.


And then implemented dynamic typing, promoting it as a breaking feature of the language.


Stack Overflow is being stupid and not allowing me to post comments there. What I wanted to post is that the JSFiddle code the original asker links at http://jsfiddle.net/fe479/1/ runs into the problem that ({}) (opening braces inside of parentheses) is not the same as {} (opening braces as the start of a line of code).

So when you type out({} + []), you force the {} to be parsed as an expression (an empty object literal), which it is not when {} + [] starts a line of code. This is part of the 'wat'-ness.

The basic idea is half-hearted: JavaScript wants to allow both of these forms:

    if (u)
        v;
    
    if (x) {
        y;
        z;
    }
    
And to do so, two interpretations were made of the opening brace: 1. it is not required and 2. it can appear anywhere.

This was a wrong move. Real code doesn't have an opening brace appearing in the middle of nowhere, and real code also tends to be more fragile when it uses the first form rather than the second. (About once every other month at my last job, I'd get called to a coworker's desk when their modifications to my code weren't working, and the problem was that they'd added a line to the "if" without adding curly braces. I eventually just adopted the habit that the curly braces are always required, even when you're only writing one line.)
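The ambiguity is easy to check directly; eval makes the statement-position parse visible (a small sketch for a JS console):

```javascript
// At statement position, {} parses as an empty block, so the code that
// actually runs is the expression statement +[], which coerces [] to 0.
console.log(eval("{} + []"));   // 0

// Parenthesized, {} is an object literal; + then concatenates the
// string forms of both operands ("[object Object]" and "").
console.log(eval("({} + [])")); // "[object Object]"
```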

Fortunately in many cases eval() will replicate the full wat-ness of JavaScript. The JSFiddle code should read:

    function out(code) {
        function format(x) {
            return typeof x === "string" ?
                JSON.stringify(x) : x;
        }   
        document.writeln('&gt;&gt;&gt; ' + code);
        document.writeln(format(eval(code)));
    }
    document.writeln("<pre>");
    out('[] + []');
    out('[] + {}');
    out('{} + []');
    out('{} + {}');
    out('Array(16).join("wat" + 1)');
    out('Array(16).join("wat" - 1)');
    out('Array(16).join("wat" - 1) + " Batman!"');
    document.writeln("</pre>");
[Also that is the first time I have written document.writeln in many many many years, and I feel a little dirty writing anything involving both document.writeln() and eval().]


Please post this as an answer to the question.


Here's some more JS Wat-ness:

    var a = undefined = 1
    //vs 
    b = undefined = 1
what happens there?


There's a lot of nuisance because JavaScript has the same reserved words as Java. So for example you may sometimes accidentally find that this code barks an error at you:

   var x = new CustomObject();
   x.property.class = "some new class";
Why? Because 'class', even though it doesn't do anything in vanilla JavaScript, is a reserved word that cannot appear in this location. When you want to modify CSS classes, for example, you must use element.className to get that sort of access.

I mention this because in reverse, it is the same problem as your example. There are a bunch of default names in JavaScript which you'd expect to have but whose names aren't reserved words and can therefore be used just like variables.

One of these is 'undefined', another is 'Array'. These are both not so bad: if some fool misdefines 'undefined' globally you can just preface your own code with:

    undefined = void(0);
...to reset it. Here, 'void' is a JavaScript keyword and so it will never be overwritten. Or if they redefine Array, you have the [] syntax to construct a new array and append elements to it; nobody really writes new Array() for anything anyways.
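A related defensive idiom from that era (jQuery's source used it): shadow undefined with a function parameter that is guaranteed to be undefined, so a global redefinition can't reach your scope:

```javascript
// The call below passes no argument, so inside this scope the
// parameter `undefined` really is undefined, whatever the global says.
(function (undefined) {
    var x;
    console.log(x === undefined); // true
})();
```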

But suppose someone globally redefines 'isNaN' or 'isFinite' or both 'eval' and 'Function'. You're going to have a tremendously difficult time reimplementing those if you don't have them.


If you are in a browser environment you can get fresh window objects by building some iframes. Of course, this is really inefficient and annoying, and in an ideal world no one would ever need to know this is possible.


Nothing interesting, per current spec. a and b both get 1 assigned to them, in V8, JSC, and current Spidermonkey, and the value of undefined doesn't change. Carakan seems to still implement the old spec version, so it assigns 1 to all three. The presence or absence of var is not relevant.

The only reason a and b get 1 assigned to them is that the value of an assignment expression is always the RHS, no matter what actually happens to the assignment. Since "undefined" on the global is readonly, the assignment itself is just silently ignored.

All this without strict mode, of course; in strict mode the assignment to undefined throws.


those are all great explanations. unfortunately they fail to address the "why". i would expect

[] + [] to behave like [].concat([])

{} + {} to behave like jquery's $.extend({}, {})

anything else should just barf. IMO.


I think having both type coercion and overloaded operators is not a good idea. Since JavaScript uses type coercion widely--and I agree with this myself--a better solution would be to have + only do addition. As it is, it tries to do too much: both numerical addition and concatenation.

If all it did was addition, you could simply coerce both sides to numbers and return NaN if the conversion did not make sense. Of course, this would mean needing an extra operator for concatenation, but I think that would be fine.
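A sketch of that hypothetical numbers-only + (numAdd is a made-up helper for illustration, not a real operator or API):

```javascript
// What + could do if it only meant addition: coerce both sides to
// numbers, and let a nonsensical conversion surface as NaN.
function numAdd(a, b) {
    return Number(a) + Number(b);
}

console.log(numAdd("1", 1));   // 2
console.log(numAdd(1, 1));     // 2
console.log(numAdd("wat", 1)); // NaN
```

Concatenation would then need its own operator, the way Lua uses `..` and PHP uses `.`.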

On the other hand, your approach would make sense if JavaScript did not try to coerce everything. In that case, having + act differently for different types would be a good option, and your ideas would fit right in.

In short: the problem is not type coercion and not overloaded operators; the problem is combining the two in one language.


You could have ++ do concats like in Haskell.


No, ++ is already taken. So is . (dot operator). Maybe # would do, but they're thinking of using it for a bunch of other syntax in the future.

Anyhow, there's no way to change it without breaking backwards compatibility :( And just adding a concatenation operator alone would not do, because + would still behave in an annoying way unless it was changed to only do math.


In JS, operators only work on scalar types (number, string, or boolean). Operations on objects (including Array) must be methods (like concat or extend), since JS doesn't support operator overloading.

Some operators work on numbers (e.g. *, /, ++, prefix +), some work on booleans (e.g. && and ?:). Infix + is special-cased because it works on either strings or numbers.

JS uses implicit coercion with operators. An operator will never throw a type error. Instead the operands are converted into the expected type, e.g. `a * b` is equivalent to `Number(a) * Number(b)`. The rules for how objects are converted into scalar values are non-obvious, but then again, you shouldn't use numeric operators on objects in the first place.
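A few illustrative conversions (the object cases go through toString, per the usual ToPrimitive rules):

```javascript
// Numeric operators apply ToNumber to both operands:
console.log("6" * "7");   // 42
console.log(true + true); // 2   (booleans coerce to 1)
console.log([] * 1);      // 0   ([] -> "" -> 0)
console.log({} * 1);      // NaN ({} -> "[object Object]" -> NaN)

// Infix + is the special case: a string operand switches it
// to concatenation.
console.log(1 + "1");     // "11"
console.log(1 + {});      // "1[object Object]"
```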


Isn't it sad that the lingua franca of client-side development is a language that constantly surprises you in a bad way instead of one that adheres to the principle of least surprise?

Yes, there are always good reasons for why a piece of JS code behaves in the strange way it behaves, but I wonder how many hours have been wasted around the world just because of some of these quirks. All the technologies out there have their quirks, but the JS case is pathological.



