This is exactly why, whenever I need to do any sort of math in a program, I make sure to put parens around every single operation. Even 2*2+2 gets the treatment, because god knows there have been people who screwed that up.
2+2x2 I can see (Smalltalk has strict left-to-right evaluation of binary operators, so it parses as (2+2)x2), but for 2x2+2 to give the "wrong" answer would require `+` to have a higher precedence than `*`, something I've yet to see (short of abusing Haskell to redefine the operators with fucked-up precedences)
edit: god fucking dammit HN can't you get any markup right?
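For the record, that Haskell abuse is a one-liner kind of thing. A minimal sketch (module names and everything else here are just illustrative): hide the Prelude operators and give local replacements swapped fixities, so `+` binds tighter than `*`.

    -- Swap the usual precedences: normally (*) is infixl 7 and (+) is infixl 6.
    import Prelude hiding ((+), (*))
    import qualified Prelude as P

    infixl 6 *   -- (*) demoted
    infixl 7 +   -- (+) promoted

    (+), (*) :: Int -> Int -> Int
    (+) = (P.+)
    (*) = (P.*)

    main :: IO ()
    main = print (2 * 2 + 2)   -- now parses as 2 * (2 + 2) = 8, not (2 * 2) + 2 = 6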
My point exactly. If I'd had to do something in Smalltalk without knowing that, it would have been a fun bug to find, had I decided to be frisky and written it as 2+2*2.
It makes more sense for 2^3^4 to mean 2^(3^4). If you meant (2^3)^4, it would be more efficient to calculate 2^(3*4) instead, because exponentiation takes much longer than multiplication.
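A quick check in GHCi (where ^ happens to be right-associative) makes the difference concrete:

    ghci> 2^3^4      -- right-associative: 2^(3^4) = 2^81
    2417851639229258349412352
    ghci> (2^3)^4    -- the left-associative reading
    4096
    ghci> 2^(3*4)    -- same value as (2^3)^4, but only one exponentiation
    4096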
On a related note, in Excel, -1^2 = 1 because it calculates (-1)^2 instead of the more standard interpretation, -(1^2).
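Haskell happens to go the other way: its unary minus binds looser than ^, so in GHCi you get the "standard" reading:

    ghci> -1^2     -- parsed as -(1^2)
    -1
    ghci> (-1)^2
    1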
Incidentally, I mentally parse -1^2 as (-1)^2, but I parse -1² as -(1²). Reading your post I was thinking "well yeah, minus one squared is one, what else would it be?"
All of the "4096" results treat exponentiation as left-associative, despite the common convention that exponentiation is right-associative.
Seeing the example really makes me wonder why that convention exists, and also makes me ponder whether having varying associativity is actually a good idea after all. Especially in languages like Scala, where you can change the associativity of your binary functions just by how you name them (methods whose names end in ":" become right-associative).
Though I can see why right associativity makes a lot of sense for things like the cons operator "::".
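For what it's worth, Haskell's cons works the same way; its (:) is declared infixr 5, so chains nest to the right without any parentheses:

    ghci> 1 : 2 : 3 : []    -- parsed as 1 : (2 : (3 : []))
    [1,2,3]

A left-associative reading like (1 : 2) : 3 : [] wouldn't even make sense, since the right operand of (:) has to be a list.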
It's interesting to think that the whole notion of associativity and order of operations only exists because for some reason humans seem to find infix notation more "natural" for binary (or fixed-arity) operators than more easily parseable and less ambiguous alternatives like prefix.
Prefix notation doesn't automatically solve the problem; you need to assume an evaluation order. More specifically, you need to not assume associativity.
Isn't the lack of associativity explicit in prefix notation? Prefix notation doesn't even contain the notion of order of operations. You would convert the infix "2^3^4" to prefix in different ways depending on the associativity of exponentiation:
For left associativity (uncommon):
^ ^ 2 3 4
For right associativity (common):
^ 2 ^ 3 4
Of course, all of this assumes all operators are fixed-arity; otherwise you need parentheses no matter what.
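To make that concrete, here's a throwaway sketch (Haskell, names made up) of a fixed-arity prefix evaluator: it never needs parentheses or an associativity rule, and the two spellings above simply denote different trees.

    -- Minimal prefix-notation evaluator; every operator takes exactly two operands.
    eval :: [String] -> (Integer, [String])
    eval (tok:rest)
      | tok `elem` ["+", "*", "^"] =
          let (a, rest1) = eval rest     -- first operand
              (b, rest2) = eval rest1    -- second operand
          in (op tok a b, rest2)
      | otherwise = (read tok, rest)     -- a plain number
      where
        op "+" = (+)
        op "*" = (*)
        op "^" = (^)

    main :: IO ()
    main = do
      print (fst (eval (words "^ ^ 2 3 4")))   -- (2^3)^4 = 4096
      print (fst (eval (words "^ 2 ^ 3 4")))   -- 2^(3^4) = 2^81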
At first glance I thought 3 is smaller than 2, and 4 is smaller than 3. But after reading your note, I realized they're all the same size. That's what "conventional" means for me. :-p
That's exactly the difference between binding to the right and binding to the left (aka right- and left-associativity). A left-associative exponent will parse as `(a^b)^c`; a right-associative one will parse as `a^(b^c)`.
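In Haskell terms, `^` is declared infixr 8; a hypothetical left-associative sibling (the name `^<` is made up here) gives the other parse:

    infixl 8 ^<
    (^<) :: Integer -> Integer -> Integer
    (^<) = (^)

    -- 2 ^  3 ^  4  ==  2 ^ (3 ^ 4)   = 2^81
    -- 2 ^< 3 ^< 4  == (2 ^< 3) ^< 4  = 4096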
The fact that infix notation removes the need to think about that is a bad thing, because infix notation is inherently ambiguous unless you litter the expression with parentheses or explicitly think about operator precedence and associativity. Infix notation just appears to be easier on the mind, because we're first taught arithmetic using it.