

In JavaScript, why does (null + null === 0) return true? - swozniak

I could be convinced that == would return true if null were treated as 0 (so 0 + 0 = 0), but with ===, this doesn't seem right at all, especially considering:

typeof null === 'object'

and

typeof 0 === 'number'

Can someone explain?
======
singold
I don't really know (I don't have that in-depth a knowledge of JavaScript),
but it is probably as you said: because you are doing a sum, it treats null
as 0, so (0 + 0 === 0) is true.

You can do the same with false: (false + false === 0)

And also (true + true === 2) is true

Edit: In the same way, if you do "" + 1 you get "1", because + is treated as
string concatenation when one operand is a string.
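The examples above are easy to check in Node or a browser console. A minimal sketch, assuming the usual ToNumber/ToString coercion rules for +:

```javascript
// When neither operand is a string, + converts both to numbers:
console.log(null + null);    // 0  (null -> 0)
console.log(false + false);  // 0  (false -> 0)
console.log(true + true);    // 2  (true -> 1)

// But if either operand is a string, + concatenates instead:
console.log("" + 1);         // "1"    (1 -> "1")
console.log(null + "");      // "null" (null -> "null")
```

Note that the same value (null) goes to 0 in one case and to "null" in the other; which conversion wins depends on whether a string is involved.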

------
joelbirchler
The + operation is evaluated first, and + coerces nulls to zeros. This looks
like:

(null + null) === 0

(0) === 0

true

~~~
swozniak
Ah, it looks like the + operator coerces null to a number, so by the time
=== runs, it is just comparing 0 to 0.
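A quick check shows the coercion happens in the + step, not in the comparison. Even ==, which does coerce, treats null specially: it loosely equals only undefined, not 0.

```javascript
// === performs no coercion; by the time it runs here,
// both sides are already the number 0.
console.log((null + null) === 0); // true -- comparing 0 === 0

// == does coerce, but null is a special case in its rules:
console.log(null == 0);           // false
console.log(null == undefined);   // true
```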

Thanks!

------
zamfi
> null + null

0

> +null

0

Looks like JS converts null to 0 when it's used in a number context.
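That pattern holds across numeric contexts, not just +. A sketch, also contrasting undefined, which converts to NaN rather than 0:

```javascript
// null becomes 0 in any numeric context:
console.log(+null);            // 0 (unary plus)
console.log(null * 2);         // 0
console.log(Number(null));     // 0

// undefined, by contrast, becomes NaN:
console.log(+undefined);       // NaN
console.log(null + undefined); // NaN (0 + NaN)
```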

