That is nice of you to provide such a well-written answer, and a human response at the end. Your reasoning seems quite obvious now with hindsight, after having given it some thought; and yet I still have such a strong inclination that ".........." is a special number. Why? I guess it really is entirely my cognitive bias, because I can't find a reason for it.

I should have known better, as I've (probably obviously, being a user here) encountered binary more than just a few times. And even still, it never occurred to me that 10 in binary is just "..", that 100 in binary is one order of magnitude up from "..", 4, and that 1,000 is just two orders of magnitude up, 8. But intuitively this still does not seem as natural as 10, and I guess that is completely cognitive bias.

Am I dense for not having realized this? Maybe, but I was actually so curious about this that I tried to quiz some colleagues by asking what 1000 and 1001 are in binary, and only one person got it right immediately, probably by understanding orders of magnitude rather than by rote memorization. All the others got it by counting in binary, and one final person was annoyed and questioned why I was asking about binary (oops, sometimes being inquisitive is not socially acceptable). By the way, I work with app developers, most of whom do not have backgrounds in computer science, same as myself.
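The "orders of magnitude" intuition above can be checked directly: appending a zero in binary doubles the value, just as appending a zero in decimal multiplies by ten. A quick sketch in Python (using the built-in `int` with an explicit base):

```python
# Each appended zero in binary is one binary "order of magnitude", i.e. a doubling.
for bits in ["10", "100", "1000", "1001"]:
    print(bits, "in binary =", int(bits, 2))
# 10 -> 2, 100 -> 4, 1000 -> 8, 1001 -> 9
```

So 1000 and 1001, the quiz questions from the comment, come out to 8 and 9.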

It seems you are intuitively always converting everything to the decimal system and taking that as "the way" to think about numbers. That wouldn't be surprising, because we are brought up this way and even our language focuses on the decimal system. Not having good words to speak about the binary number 1101001 makes it difficult to think about it without converting it first. Two (I'm decimal again!) isn't a good base for human communication because there is a lot of repetition of simple symbols. Maybe a new way to pronounce hex numbers like a17c03 would be able to replace the decimal system.
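For reference, the two example numbers mentioned above convert like this (again just `int` with a base argument; the values themselves are not given in the comment, so they are computed here):

```python
# Binary 1101001 = 64 + 32 + 8 + 1
print(int("1101001", 2))   # 105

# Hex a17c03: each digit is a power of 16 (a = 10, c = 12)
print(int("a17c03", 16))
```

Hex packs four bits per symbol, which is why "a17c03" is so much shorter than its 24-bit binary form, and part of why base two alone makes for awkward spoken numbers.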
