The Land Before Binary (medium.com/bellmar)
132 points by mbellotti on June 9, 2018 | 29 comments

Various corrections:

The article says that ternary uses exponentially fewer symbols (actually, in further nitpicking, it says "bits" rather than "symbols") than binary, but that's not correct. The decrease is only by a constant factor of log_2(3) ≈ 1.585.
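To make the correction concrete, here is a quick sketch (my own illustration, not from the article) comparing how many digits a number needs in each base:

```python
import math

def digits_needed(n: int, base: int) -> int:
    """Number of base-`base` digits needed to represent n (for n >= 1)."""
    return math.floor(math.log(n, base)) + 1

n = 10**9
b2 = digits_needed(n, 2)   # binary digits: 30
b3 = digits_needed(n, 3)   # ternary digits: 19
# The ratio approaches log_2(3) ≈ 1.585 — a constant factor, not exponential.
print(b2, b3, b2 / b3)
```

The savings ratio stays near log_2(3) no matter how large n gets, which is what "linear, not exponential" means here.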

The comment about subtraction being a lot easier doesn't really make a lot of sense in light of 2's-complement notation; yes, it's a tiny bit easier, but... (the comments about sign bits also seem a bit out of place in this light).

The 2 of 5 encoding reminded me of "8 to 10" codes [1], which (as far as I know) are still in use by disk and tape drives. They exist for a slightly different reason than the 2 of 5 code, but it's roughly the same idea.

[1] https://en.wikipedia.org/wiki/8b/10b_encoding

...and those have been expanded to even bigger units: https://en.wikipedia.org/wiki/64b/66b_encoding

Disk drives use a variant of RLL:


Computers used to run on food...

And run for it on foot

> (FYI I’m going to reverse the conventional order so that the 2⁰ is the left most throughout this post)

Um, why would you break (or rather worse, invert) such a common convention?

> So 11 is 9+3–1.

...and not follow through, just two sentences later?

> [...] at which point the number 11 would be 2-0-1 (1+1+9).

Surely you mean 2+0+9 or 9+0+2?
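For reference, 11 really is 9+3−1 in balanced ternary. A minimal converter (my own sketch; digits are written most-significant first, i.e. the conventional order the article inverts):

```python
def balanced_ternary(n: int) -> list[int]:
    """Balanced-ternary digits of n (each in {-1, 0, 1}), most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:                 # represent 2 as 3 - 1: emit -1, carry 1
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits[::-1]

print(balanced_ternary(11))  # [1, 1, -1], i.e. 9 + 3 - 1
```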

> POSTNET (the old barcode system the Post Office used to route mail up until a few years ago)

Oh, are these not in use anymore?

> The first group had two bits, one representing the number 0 and the other representing the number 5. The second group had five bits representing the numbers 0–4.

Sounds like an abacus to me…
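The grouping described in the quote can be sketched as a bi-quinary encoder (my own illustration; the bit ordering within each group is an assumption):

```python
def biquinary(digit: int) -> tuple[tuple[int, int], tuple[int, ...]]:
    """Encode a decimal digit as bi-quinary: a 2-bit group (one bit meaning
    0, the other meaning 5) plus a 5-bit one-hot group for 0-4. Exactly one
    bit is set in each group, so any single-bit error is detectable."""
    if not 0 <= digit <= 9:
        raise ValueError("decimal digit expected")
    bi = (1, 0) if digit < 5 else (0, 1)                  # the "0" vs "5" bit
    quinary = tuple(1 if i == digit % 5 else 0 for i in range(5))
    return bi, quinary

# 7 -> the "5" bit plus the one-hot bit for 2 (since 5 + 2 = 7)
print(biquinary(7))  # ((0, 1), (0, 0, 1, 0, 0))
```

The abacus comparison holds up: the 2-bit group is the heaven bead (worth 5), the 5-bit group is the earth beads.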

The Harwell WITCH used Dekatrons, which are devices with 10 states, so each one can store a single decimal digit.


And let's not forget Charles Babbage's Difference and Analytical Engines, both of which were decimal-based.

And mechanical :)

This was a fun read, thanks for sharing!

Analog computers are even more interesting and I predict them to make a comeback.

Unfortunately doing so would be extremely difficult. Good-quality linear MOSFETs are decreasing in availability as everyone (including the analogue designers) switches to using them in saturation mode. There are exceptions, but by virtue of being exceptions they are expensive.

A lot of MOSFETs don't even spec their linear regions anymore. The trend is away from linear.

Can you elaborate on why and where you believe they will make a comeback?

They can solve some optimization problems better than digital computers. It's a wholly different paradigm; for example, there's an analog sorting algorithm, spaghetti sort, with a runtime of O(n).
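Spaghetti sort is a physical procedure: cut one rod per value, stand the bundle on a table, and repeatedly grab the tallest rod with a flat hand. A digital simulation (my own sketch) necessarily loses the O(n) claim, because the max-finding step the hand does in one physical motion costs O(n) in software:

```python
def spaghetti_sort(values):
    """Simulate spaghetti sort: 'cut' one rod per value, then repeatedly
    remove the tallest rod. The physical analog finds the tallest rod in
    one parallel step; this digital simulation pays O(n) per step."""
    rods = list(values)          # cutting the rods: O(n)
    result = []
    while rods:
        tallest = max(rods)      # lowering the hand onto the bundle
        rods.remove(tallest)
        result.append(tallest)
    return result[::-1]          # tallest was grabbed first; reverse for ascending

print(spaghetti_sort([3, 1, 4, 1, 5]))  # [1, 1, 3, 4, 5]
```

Which illustrates the dispute below: the O(n) bound only holds if you grant the physical world free parallelism.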


There used to be electric analog computers. I've been working on an analog photonic computer (very theoretically so far) that I think could work.

Do you know how many electrons you move when you execute an instruction? It's like on the order of 10^18. I might be able to get away with a couple of photons.

Considering the high cost of developing new analogue hardware, wouldn't it be cheaper to virtualize this on digital processors anyway?

Also there appears to be some dispute on the validity of "Spaghetti sort"


You won't get the gains from using only a few photons vs 10^18 electrons.

Is there a shortage of electrons I am not aware of?

You can pack 10^18 times more computation into the same size.

So the physical components could be smaller?

I guess my point is that unless you can take advantage of traditional chip architectures or manufacturing, it may take an immense amount of effort to develop something close to their component density.

Amongst others, yes. The main advantage is that you aren't dealing with metals but with photonic crystals. These don't heat up, so theoretically you could build a cubic-metre-sized CPU.

Please keep us all updated on your work. Would love to see some blog posts or something.

Lol you can bet that if I manage to pull this off you'll hear about it.

The problem with blogging about this is that it diverges from current status quite a bit and as a result I've been called a crackpot. But I'll show them all lol.

Oh I wouldn’t worry about it. Pretty much 100% of all successful inventors have, at one stage or another, been considered crackpots. Not 100% of crackpots are successful inventors though. Just blog with a pseudonym, so your real life credibility isn’t compromised.

For semi-related (?) work IBM did some research on photonics inside a CPU.

I wonder if bi-quinary decimal influenced the later Packed Decimal (known as COMP-3 in COBOL), which stored numbers in just over half the space of the text equivalent (e.g. a 7-digit number would need 4 bytes).
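Packed decimal stores one digit per nibble plus a trailing sign nibble, so an n-digit number fits in ceil((n+1)/2) bytes. A sketch of the layout (my own illustration; 0xC for positive and 0xD for negative, per IBM's convention):

```python
def pack_decimal(n: int, digits: int) -> bytes:
    """Pack n into COMP-3-style packed decimal: one digit per nibble,
    with a trailing sign nibble (0xC positive, 0xD negative)."""
    sign = 0xC if n >= 0 else 0xD
    nibbles = [int(d) for d in str(abs(n)).zfill(digits)] + [sign]
    if len(nibbles) % 2:                  # pad to a whole number of bytes
        nibbles.insert(0, 0)
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

packed = pack_decimal(1234567, 7)
print(len(packed), packed.hex())  # 4 bytes: "1234567c"
```

A 7-digit number is 7 nibbles plus the sign nibble, i.e. exactly 4 bytes, matching the figure above (versus 7 bytes as EBCDIC/ASCII text).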

What's up with the binary representation of 11?

They included 6 in the decimal representation of binary bits for some reason. Outside of that, it's correct. This is a surprisingly common mistake, since (2, 4, 6) is such a common pattern.

Well shit, thanks for noticing that. I've fixed it.

Telephony used 2 out of 5 code extensively when passing dial digits around within the system.

