
Third Base – Ternary Notation (2001) - sidcool
http://web.williams.edu/Mathematics/sjmiller/public_html/105Sp10/addcomments/Hayes_ThirdBase.htm
======
aaronchall
In case you were wondering: "As in ordinary ternary numbers, the digits of a
balanced ternary numeral are coefficients of powers of 3, but instead of
coming from the set {0, 1, 2}, the digits are –1, 0 and 1."

e.g.

1 x 3^3 - 1 x 3^2 + 0 x 3^1 + 1 x 3^0 = 19

Imagine: trits instead of bits. I like the idea of base e, even if it's
unrealistic.

(BTW: I have an old answer on Math.stackexchange that cheats by using base 3:
[http://math.stackexchange.com/a/734723/108109](http://math.stackexchange.com/a/734723/108109)
)
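The digit-by-digit conversion is easy to sketch in code. Here's a minimal Python version (function name is my own) of the standard method: repeated division by 3, turning a remainder of 2 into a -1 digit with a carry:

```python
def to_balanced_ternary(n):
    """Return the balanced-ternary digits of n, most significant first,
    using the digit set {-1, 0, 1}."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3          # remainder in {0, 1, 2} (Python's % floors)
        if r == 2:         # a 2 becomes -1, with a carry into the next trit
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits[::-1]

print(to_balanced_ternary(19))  # [1, -1, 0, 1], matching the example above
```

Note that negative integers come out for free (e.g. -5 is [-1, 1, 1]): balanced ternary needs no separate sign bit, which is one of its nicest properties.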

~~~
mathattack
We'd all be more attuned to power laws if we thought in base 3. :-)

I like the -1, 0 and 1 concept, but is there any way to pull it off at the
electrical-engineering level? Or would it all have to be overlaid in
software?

~~~
nitrogen
[https://en.wikipedia.org/wiki/Ternary_computer](https://en.wikipedia.org/wiki/Ternary_computer)
-- it has been done before.

------
SagelyGuru
There is a Texas Instruments 1962 patent for a tristable circuit (instead of a
bistable flip-flop), using vacuum tubes, so it would be possible to build
similar trinary computers in silicon. Though the same problem as with the old
Russian Setun seems to remain: you need four transistors, which would
ordinarily suffice for two binary bits, so any advantage is lost.

If you can design a tristable "flip-flop-flap" circuit with just three
transistors, you might be in business!

[https://encrypted.google.com/patents/US3051907](https://encrypted.google.com/patents/US3051907)

~~~
jonsen
Seems as if _one_ memristor can do it:

[http://spectrum.ieee.org/semiconductors/memory/sixstate-memr...](http://spectrum.ieee.org/semiconductors/memory/sixstate-memristor-opens-door-to-weird-computing)

------
plesner
While balanced ternary is nice in all sorts of ways, there's an argument
against it being the most efficient representation. You might consider 2
better in practice, in the sense that it makes it easier to pick the optimal
amount of state space for a word when designing hardware.

On any practical architecture a word has some fixed length, N, and by far the
most efficient case is when your number fits within that space. That is, it is
more efficient if the number can be represented within the r^N distinct values
a word can hold. The amount of state an architecture gives you is dictated by
many different considerations, but the radix that offers the most control
(that is, the most possible values of r^N) is 2. For instance, say you want a
word that can store numbers up to 500; your options are

2^N: 2, 4, 8, 16, 32, 64, 128, 256, 512

3^N: 3, 9, 27, 81, 243, 729

So 2 gives you 9 choices for N whereas 3 gives you 6, and the smallest
sufficient size overshoots 500 by only 12 values in base 2 versus 229 in base
3. That extra granularity gives you more design space to choose the most
efficient representation in practice.
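To make the granularity argument concrete, here's a minimal Python sketch (helper name is my own) that lists the power-of-r state-space sizes up to and including the first one large enough to hold values up to 500:

```python
def size_choices(radix, target):
    """State-space sizes radix**N for N = 1, 2, ..., up to and
    including the first size that reaches the target capacity."""
    sizes, s = [], radix
    while s < target:
        sizes.append(s)
        s *= radix
    sizes.append(s)  # first power that can hold the target
    return sizes

print(size_choices(2, 500))  # [2, 4, 8, 16, 32, 64, 128, 256, 512]
print(size_choices(3, 500))  # [3, 9, 27, 81, 243, 729]
```

The base-2 ladder has more rungs, so the smallest sufficient word wastes less state (512 vs 729 for a target of 500).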

------
Normati
I don't understand what is being optimized by minimizing the product rw.
Reducing w reduces the number of digits, but how does reducing the radix r
help with component count? Surely a memory cell that could store 1,000,000
different values in a base-1-million system is just one component, even if it
is quite difficult to make. Are they talking about the components needed to
perform calculations?

~~~
lysium
I'd say your base-1-million system is virtually impossible to make...

But you are right: what counts is how much information you can store or
process per square inch / per watt, not really how many components you need
to achieve this.

------
breckinloggins
My favorite thing about "trits" is that they can directly encode the notion of
"true / false / unknown". This is attractive because a lot of computer science
(and type theory in particular) seems to gravitate toward constructivist /
intuitionistic logics.

------
todd8
After reading about ternary number notation, it shouldn't be hard to solve one
of my favorite brain teasers:

Given a simple balance scale with pans on both sides, how many weights does a
merchant need to be able to weigh any whole number of grams from 1 to 40?

Most people have trouble getting the right answer, but it's easy with balanced
ternary. Try it!
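(Spoiler ahead.) The balanced-ternary answer, that four weights suffice, can be brute-force checked in a few lines of Python:

```python
from itertools import product

weights = [1, 3, 9, 27]  # powers of 3; the signs below are balanced-ternary digits
reachable = set()
for signs in product((-1, 0, 1), repeat=len(weights)):
    # -1: weight sits with the goods; 0: unused; +1: on the opposite pan
    reachable.add(sum(s * w for s, w in zip(signs, weights)))

print(set(range(1, 41)) <= reachable)  # True: four weights cover 1..40
```

Each sign assignment is just a balanced-ternary numeral, which is why every integer from -40 to 40 is reachable, and why 40 = 1 + 3 + 9 + 27 is the exact ceiling.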

~~~
jonsen
This is actually explained in the article.

------
bit-player
Please note: There's a PDF available at:

[http://bit-player.org/bph-publications/AmSci-2001-11-Hayes-t...](http://bit-player.org/bph-publications/AmSci-2001-11-Hayes-ternary.pdf)

------
PhantomGremlin
Someone I worked for a while ago would have summed this up as: "a cure for no
known disease".

I know a little about CMOS transistor and logic design, but I'm certainly no
expert. Still, I'm not seeing the advantages of ternary numbers. I'd be
interested in hearing otherwise.

At today's geometries we have a hard enough time keeping _binary_ gates and
memory cells working reliably. Ternary just doesn't seem to be practical.
Perhaps with another supply voltage? E.g. +1.5, 0, -1.5? But today's CPUs very
very aggressively vary supply voltage in order to conserve power. Now they'd
have to fiddle with two voltages? Bah!

~~~
databass
Ternary (or even higher n-ary) storage is already widely used: multi-level
cell (MLC) SSDs store more than one bit per cell by using multiple voltage
levels (most commonly 4 levels, i.e. 2 bits per cell). Ethernet also uses 3
voltage levels in 100BASE-TX's MLT-3 line code.

~~~
darkmighty
Communication systems in general routinely use up to 1024 modulation symbols
(32 amplitude levels on each of the sine and cosine carriers).

[http://en.wikipedia.org/wiki/Quadrature_amplitude_modulation...](http://en.wikipedia.org/wiki/Quadrature_amplitude_modulation#Quantized_QAM)
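The arithmetic behind that symbol count, as a quick sanity check (variable names are my own):

```python
import math

levels = 32                      # amplitude levels on each carrier (I and Q)
symbols = levels ** 2            # 1024-QAM constellation points
bits_per_symbol = int(math.log2(symbols))
print(symbols, bits_per_symbol)  # 1024 symbols -> 10 bits per symbol
```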

------
DanielBMarkham
Makes you wonder what a ternary-based, memristor-built, functionally
programmed CPU with 243 cores and roughly 4G of on-board RAM would be like.

