
What if we had just 4 digits on each limb, or even 6?

How would those numbers look in base 8 or 12?




I never bought this argument, but I'm not confident about it. Isn't base 10 inherently intuitive for the obvious reason, i.e. that an order of magnitude is just another 0?

Since I learned about base 2 and the rest long ago, I've always thought there was something magically elegant about base 10, and I never understood why. The explanation I've always heard, that we have 10 fingers, doesn't seem to account for all the elegance that makes base 10 so easy to work with.


You are correct that "10" is a very special number, as long as you don't assume that it can only mean "ten".

In fact, every number base is base "10" when you interpret the "10" in that base.

Try it:

10 binary is two.

10 octal is eight.

10 hexadecimal is sixteen.

This is the very definition of a number base: it is the multiplier that you represent by appending "0" to a string of numeric characters in that base.

So that is where the fundamental and special nature of "10" comes from; it's not because it happens to mean "ten" in our customary number base.

Ten is nothing special, "10" is. "10" is simply the way you write N where N is whatever number base you're working in. It's just as special in every number base!
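Here's a quick Python sketch of that, using nothing but the built-in int(), which parses a string as a numeral in a given base:

    # "10" read in base b is always b itself.
    for b in (2, 8, 10, 16):
        assert int("10", b) == b  # int() parses the string as a base-b numeral
    print("'10' means the base, in every base")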

p.s. I'm sorry you were downvoted so heavily for asking an honest question. You can't be the only one who has wondered about this, and your question led to an interesting discussion.


That's kind of you to provide such a well-written answer, with a human touch at the end.

Your reasoning seems quite obvious in hindsight, now that I've given it some thought; and yet I still have such a strong inclination that ".........." is a special number. Why? I guess it really is entirely my cognitive bias, because I can't find a reason for it.

But I should have known better, as (probably obviously, being a user here) I've encountered binary more than just a few times. And even so, it never occurred to me that 10 in binary is just "..". Then 100 in binary is one order of magnitude above "..", i.e. 4, and 1,000 is two orders of magnitude, i.e. 8. But intuitively this still doesn't seem as natural as 10, and I guess that is entirely cognitive bias.

Was I slow not to have realized this? Maybe, but I was actually so curious about it that I tried to quiz some colleagues by asking what 1000 and 1001 are in binary, and only one person got it right immediately, probably by understanding orders of magnitude rather than by rote memorization. The others got it by counting in binary, and one final person was annoyed and questioned why I was asking about binary at all (oops, sometimes being inquisitive is not socially acceptable). By the way, I work with app developers, most of whom do not have backgrounds in computer science, same as myself.


It seems you are intuitively converting everything to the decimal system and taking that as "the way" to think about numbers. That wouldn't be surprising, because we are brought up this way and even our language focuses on the decimal system. Not having good words for the binary number 1101001 makes it difficult to think about it without converting it first. Two (I'm decimal again!) isn't a good base for human communication because there is a lot of repetition of simple symbols. Maybe a new way to pronounce hex numbers like a17c03 could replace the decimal system.
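That's arguably what hex already does for binary: each hex digit names a group of four bits. A rough Python illustration, using only built-in conversions:

    n = 0b1101001             # the hard-to-pronounce binary number from above
    print(n)                  # 105 in decimal
    print(format(n, 'x'))     # '69': two hex digits, one per group of four bits
    print(int('a17c03', 16))  # 10583043: hex parses straight to an integer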


This illustrates the idea well: https://i.imgur.com/II5W6Pl.png


Amusingly, the alien and astronaut would be referring to the same thing when they say 'Base 3'


Base 10 is intuitive because we are taught to work in base 10. If we worked in base 7, then multiplication by seven would be just another 0. (And if we worked in base 7, we would probably have defined “an order of magnitude” to be a multiplication by 7, rather than 10).
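A small Python sketch of that (to_base is just an illustrative helper, not a library function):

    def to_base(n, b):
        # Render a non-negative integer n as a digit string in base b (b <= 10).
        digits = ""
        while True:
            digits = str(n % b) + digits
            n //= b
            if n == 0:
                return digits

    print(to_base(6, 7))      # '6'
    print(to_base(6 * 7, 7))  # '60': multiplying by seven just appended a 0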


A base with a couple of convenient small factors is probably useful, though. Especially 2, since parity (even/odd) comes up so often.

Past cultures thought even more factors were good, e.g. sexagesimal, whose prime factorization is 2, 2, 3, and 5. It means that, for example, 1/3 and 1/6 don't have repeating expansions in sexagesimal notation.


I'm convinced base 12 would be far superior to base 10. It has four useful factors, 2, 3, 4, and 6, compared with base 10's 2 and 5. This would make handling common whole-number fractions in place-value form much easier, as the sketch below shows. It's also easy to count to 12 on one hand: just point to your finger bones with your thumb. That way, with two hands, you can count all the way up to 24 (in base-10 terms).
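To make the fraction claim concrete, here's a Python sketch (expand is my own helper, nothing standard) that prints the first fractional digits of a fraction in a given base:

    from fractions import Fraction

    def expand(frac, base, places=6):
        # First few fractional digits of frac (0 <= frac < 1) in the given base.
        out = []
        for _ in range(places):
            frac *= base
            digit = int(frac)
            out.append("0123456789AB"[digit])  # A = ten, B = eleven in base 12
            frac -= digit
            if frac == 0:
                break
        return "0." + "".join(out)

    for d in (2, 3, 4, 6):
        print(f"1/{d} in base 12 =", expand(Fraction(1, d), 12))
    # 1/2 = 0.6, 1/3 = 0.4, 1/4 = 0.3, 1/6 = 0.2 -- all terminate
    print("1/3 in base 10 =", expand(Fraction(1, 3), 10))  # 0.333333, repeating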


Me too! And telling whether a large number is a multiple of 2, 3, 4, or 6 would be trivial: just check the last digit!


Still only two prime factors. I really want base thirty: negative powers of 2, 3, and 5 all terminate.


In base 8 (if we'd had 8 fingers), an "order of magnitude" would have been defined as "times 8" instead of "times 10", so it would also just add another 0. Same with base 12. Base 16 would have the further advantage that we could halve, quarter, eighth, or sixteenth any number ending in 0 and still get a whole number (in base 10, we can only halve, fifth, or tenth).


Yeah, you are obviously right now that I think about it, and yet I still have such a strong inclination to think there is something special about the number 10.


Meanwhile, there is something special about base 12, namely that the log base 2 of 3 has a really good rational approximation, 19/12, and the log base 2 of 5 has a pretty good one, 7/3 (you can do better with 28ths).

This is the basis for the 12-tone equal temperament scale in music, and the notation only lines up neatly if you use base 12. So if we used base 12 for our numbers, someone would have the bright idea to name all of our musical notes with digits, and we could do a key change (or form a chord) by simple addition.
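A quick Python check of both claims (naming the notes 0 through 11 is a made-up convention here):

    import math

    # log2(3) is close to 19/12, which is why 12 equal divisions of the
    # octave land a near-perfect fifth: 2**(7/12) is almost 3/2.
    print(math.log2(3), 19 / 12)   # 1.58496... vs 1.58333...
    print(2 ** (7 / 12), 3 / 2)    # 1.49831... vs 1.5

    # With notes named 0..11, a key change is just addition mod 12.
    c_major = [0, 4, 7]                     # C, E, G
    print([(n + 7) % 12 for n in c_major])  # [7, 11, 2] -> G, B, D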


There's an argument against intelligent design right there (4 or 6 fingers per hand are obviously better).


Eh, evolution is a pretty nifty mix of OO class extensions, recursion, brute force, and bias weightings.

I'd wager Gawsh made the best system S/He could given product constraints (completely unfocused if you ask me [which I know no one did]) and the real need to deliver (take it easy over there Leibniz, the world is still crap as evidenced everywhere).

Anyway, can't knock it 'til you've built it.

This is an interesting article: https://www.scientificamerican.com/article/why-do-most-speci...


Well, my comment was partly in jest (though I do think it's by no means clear that 5 is a local optimum, so it's quite possible that 4 or 6 would be better, and twice either would give us a better base for counting), but I'm amazed that there's actual scientific discussion of the issue. I'll quote the most pertinent part of the article, though:

> Is there really any good evidence that five, rather than, say, four or six, digits was biomechanically preferable for the common ancestor of modern tetrapods? The answer has to be "No,"


Sure, if you assume the only purpose of fingers is for counting.


> In base 8 (if we'd had 8 fingers)

Seven fingers.

Base 11 is the natural base for a ten fingered person: Base 11 has a distinct symbol for ten, base 10 does not.

[A prime base has quite a few practical disadvantages... and its advantages are fairly esoteric...]


I mean, it's not so bad. If we meet aliens, there's a decent chance they'll have a base-9 counting system with balanced digits (balanced ternary, grouped two trits to a digit): their digits would be -4, -3, -2, -1, 0, 1, 2, 3, 4. Rather than prefixing negatives with a minus sign, maybe negation would flip a numeral top-to-bottom.
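For anyone who wants to play with that, here's a sketch of a converter to balanced base-9 digits (my own helper, least significant digit first):

    def to_balanced_nonary(n):
        # Integer n -> list of digits in -4..4, least significant first.
        digits = []
        while n != 0:
            r = n % 9           # Python's % gives 0..8 here
            if r > 4:           # fold 5..8 down to -4..-1 and carry
                r -= 9
            digits.append(r)
            n = (n - r) // 9
        return digits or [0]

    print(to_balanced_nonary(10))   # [1, 1]: 1 + 1*9
    print(to_balanced_nonary(-10))  # [-1, -1]: negation just flips digit signs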


So you know how you can tell whether something is a multiple of 2 or 5 from the last digit, and all those shortcuts you get when multiplying by 5? In base 12 you'd lose the tricks for 5, but you'd gain them for multiples of 2, 3, 4, and 6.
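That works because the last digit of n in base b is n % b, so the last-digit test is exact for any divisor of b. A quick Python sanity check:

    n = 93724
    last_digit = n % 12           # n's final digit written in base 12
    for d in (2, 3, 4, 6):
        assert (n % d == 0) == (last_digit % d == 0)
    print("last-digit test agrees with real divisibility for 2, 3, 4, 6")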


Yeah, I guess in base 4, 5 would be 11.



