
The Future of Computing Is Analog - zerogvt
https://medium.com/s/story/the-future-of-computing-is-analog-e758471fbfe1
======
twtw
With respect, I think most comments here are missing Dyson's point (perhaps
because it was somewhat poorly made).

I don't think his point is about whether the integrator and analog electronics
will be resurgent in the next century, or whether analog hardware will become
common.

I think Dyson is talking about the complex network of the modern world, where
humans interact with computing machines and with each other - humans influence
computers, and computers come back around and influence humans. I think his
"future of computing" is a future where human society and culture is decided
based on the interplay between humans and our machines, and this decision is
an "analog computation" made by a massive scale hybrid computer that no one
has intentionally designed or understands.

You can see examples of this already, with YouTube recommendation engines
influencing the belief systems of millions (billions?) of people across all
kinds of subjects, and with our thoughts frequently dominated by whatever
happens to show up on our phones.

~~~
patrickyeon
Oh absolutely you're correct here.

> In analog computing, complexity resides in network topology, not in code.
> Information is processed as continuous functions of values, such as voltage
> and relative pulse frequency, rather than by logical operations on discrete
> strings of bits.

...

> Individually deterministic finite-state processors, running finite codes,
> are forming large-scale, nondeterministic, non-finite-state metazoan
> organisms running wild in the real world. The resulting hybrid
> analog/digital systems treat streams of bits collectively, the way the flow
> of electrons is treated in a vacuum tube, rather than individually, as bits
> are treated by the discrete-state devices generating the flow. Bits are the
> new electrons. Analog is back, and its nature is to assume control.

> Say, for example, you build a system to map highway traffic in real time
> simply by giving cars access to the map in exchange for reporting their own
> speed and location at the time. The result is a fully decentralized control
> system. Nowhere is there any controlling model of the system except the
> system itself.

...

> Even in the age of all things digital, this cannot be defined in any
> strictly logical sense, because meaning, among humans, isn’t fundamentally
> logical. The best you can do, once you have collected all possible answers,
> is to invite well-defined questions and compile a pulse-frequency weighted
> map of how everything connects. Before you know it, your system will not
> only be observing and mapping the meaning of things, it will start
> constructing meaning as well. In time, it will control meaning, in the same
> way the traffic map starts to control the flow of traffic even though no one
> seems to be in control.

In these passages, "The Computer" that's running is the network: a
meta-computer made up of the individual "computing elements", if you will,
that are the physical pieces of hardware most anyone would point to and call
"a computer". And as you've recognized, that's Dyson's point: no matter what
those little elements are made of, the overall picture is analog. If we want
to understand how "Algorithms" are influencing the world, we can't think of
them as "discrete computing algorithms". And finally, the emergent behaviour
of all this can lead to large-scale control effected on our society without
being designed in, _even if_ nobody is thinking to effect control.
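
Dyson's traffic-map example can even be made concrete. Here's a toy
simulation (my own sketch, not from the article; every number in it is
invented) of that feedback loop: cars report their speeds, the shared map
aggregates them, and route choices made from the map change what the map
measures next.

    # Toy model of the traffic-map feedback loop: two routes, and every
    # step some cars defect to whichever route the map says is faster.
    import random

    load = [500, 500]                          # cars on each route
    for step in range(20):
        # "measured" speed drops as load rises, plus a little sensor noise
        speed = [1.0 / (1 + l / 500) + random.gauss(0, 0.01) for l in load]
        fast = 0 if speed[0] > speed[1] else 1
        movers = int(0.2 * load[1 - fast])     # 20% of the slow route defects
        load[fast] += movers
        load[1 - fast] -= movers
        print(step, load)

No driver is "in control", yet the map ends up steering the flow it was built
to observe - which is exactly the emergent, analog behaviour Dyson is
pointing at.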

~~~
TheOtherHobbes
That's no different to conventional history and culture. It just has computers
in it.

Dyson seems to have only just worked out that evolution, culture, and politics
are emergent.

Which is why the argument makes no sense. Either digital systems add something
new and unique to the mix, or they don't.

If they don't, then what's the point of the piece?

If they do, blather about analog metazoans is irrelevant. The problem becomes
one of understanding emergence without obfuscation and hand-waving.

In fact so-called emergent behaviour is not the problem. Not even slightly.

The biggest digital systems we have now have been consciously and deliberately
designed to monitor and manipulate human behaviour.

There's nothing emergent, surprising, obscure, or quasi-mystical about either
the broad goals or the specific techniques used to achieve them.

Instead of resorting to obscurantism, it would be far more useful to invent a
new kind of "digital democracy" where the goals of large software corporations
were open to democratic oversight.

Of course that's not going to happen while free market excuses and rhetoric
are used to keep democratic oversight well away from these corporations.

~~~
armitron
There is this concept of emergent control called a "metasystem transition"
[1], which is what Dyson is really talking about. Conventional history and
culture have never before been shaped by the - ever shortening, ever
accelerating - techonomic feedback loops that we see today. Cybernetics has
been summoned like Yog-Sothoth in "The Case of Charles Dexter Ward" [2].

[1] [http://pespmc1.vub.ac.be/MST.html](http://pespmc1.vub.ac.be/MST.html)

[2] [https://en.wikipedia.org/wiki/The_Case_of_Charles_Dexter_War...](https://en.wikipedia.org/wiki/The_Case_of_Charles_Dexter_Ward)

------
coupdejarnac
Too bad the author didn't substantiate his claims that analog computing will
make a comeback.

I found this article [0] with a real world application, but no performance
comparison to a digital computer. I also take issue with the claim that a
transistor has infinitely many possible states and that we're ignoring most
of them; that claim ignores real-world limits on component precision and
noise.

[0] [https://news.mit.edu/2016/analog-computing-organs-organisms-...](https://news.mit.edu/2016/analog-computing-organs-organisms-0620)

~~~
adamnemecek
> I found this article [0] with a real world application, but no performance
> comparison to a digital computer.

This is impossible right now; no one is really making analog computers, so
there's nothing to run the comparison against.

------
joe_the_user
The article seems a bit abstract.

I think the question that comes to mind is whether it's possible to take
GPU-style architectures and give them 100x more power or more by replacing
bits with approximate voltage levels that are "fuzzy" but come with
statistical guarantees on their performance, along with gates that allow the
values to be filtered back to zeros and ones.

The thing is, the "neural architectures" seem to have been more or less a
failure because they required the user to accept one particular neural net
structure, while the standard GPU seems to have succeeded by being generic
enough to use for a variety of tasks. So some sort of analog GPU should also
have a similarly generic model - but of course it seems likely that the
creators of this stuff will want to impose their special model.

Edit: More or less like the D-Wave "quantum computer", except not milking
quantum hype and instead being reconciled to being understood as "massively
analogue".

~~~
krastanov
The "fuzzy but statistical guarantees" is basically "error correction". Which
is (handwavily) the difference between digital and analog computing. What you
are describing is the engineer's definition of a digital computer. Admittedly,
there might be interesting performance gains if we use less stringent
statistical guarantees... which happens to be what is happening each time we
make the transistors smaller and more susceptible to noise.

Cf. the paper that introduced the distinction between analog and
"statistically guaranteed" digital in the case of classical computers (before
it, people were arguing that you could not build a scalable classical digital
computer because of noise), by von Neumann:
[http://www.sns.ias.edu/pitp2/2012files/Probabilistic_Logics....](http://www.sns.ias.edu/pitp2/2012files/Probabilistic_Logics.pdf)

The paper that did the same for quantum computers 50 years later, by Shor:
[http://www-math.mit.edu/~shor/papers/good-codes.pdf](http://www-math.mit.edu/~shor/papers/good-codes.pdf)

P.S. FYI, D-Wave is not a scalable quantum computer. They have another
"quantumy" word for what they do, so that they can keep the hype without
angering people who are trying to build an actual quantum computer.

~~~
kangnkodos
Yes. Concentrating on error correction reveals the difference between digital
and analog computing.

Digital computers have error correction. Analog computers don't.

Some small number of problems might be solved efficiently using analog
computers, but they will never take over the role of general purpose computers
because of error correction issues.
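
To make that concrete, here's a minimal simulation (my own sketch) of the
same noisy channel traversed 1000 times, with and without the digital trick
of restoring the signal to a clean 0 or 1 at every stage:

    # Why error correction separates digital from analog: identical noise,
    # but the digital path re-thresholds ("restores") the value each stage.
    import random

    analog = digital = 1.0
    for _ in range(1000):
        noise = random.gauss(0, 0.02)
        analog += noise                  # noise accumulates as a random walk
        digital = 1.0 if digital + noise > 0.5 else 0.0   # noise discarded
    print(analog)     # drifts; standard deviation ~0.02 * sqrt(1000) = 0.63
    print(digital)    # almost surely still exactly 1.0

The restoration step is the error correction. Without it, the drift
compounds, which is why an analog machine's precision is bounded by its
components rather than by the length of its word.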

------
vgoh1
Digital computation has been able to rise to this level of complexity because
we can precisely predict and repeat outcomes, thanks to exact numbers and
Boolean logic. I can't fathom building anything like what we have using
analog. I was reading the article, waiting for some kind of plan for how to
tackle analog computing, but it never came. A thought-provoking article, but
I'm not holding my breath for analog.

------
medius
Just a few thoughts I've been mulling for a while about this topic:

Machine learning is something that I believe can take advantage of analog
computing. A machine learning algorithm does not need highly precise or
accurate representations; most current implementations of ML processing
units already use fewer bits (usually 8).
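
For a concrete sense of what those 8 bits buy, here is a sketch (my own;
names and numbers invented) of the standard affine quantization used in such
inference units: floats are mapped onto 0..255 and back, trading a small
rounding error for much cheaper arithmetic.

    # Affine 8-bit quantization: x ~ q * scale + zero_point, q in 0..255.
    import numpy as np

    def quantize(x, bits=8):
        qmax = 2 ** bits - 1
        scale = (x.max() - x.min()) / qmax
        zero_point = x.min()
        q = np.round((x - zero_point) / scale).astype(np.uint8)
        return q, scale, zero_point

    weights = np.random.randn(1000).astype(np.float32)
    q, scale, zero_point = quantize(weights)
    restored = q.astype(np.float32) * scale + zero_point
    print(np.abs(restored - weights).max())   # ~scale/2: fine for inference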

However, even if we use fewer bits, the engineering effort (design, layout,
lithography, etc.) that goes into making the processing unit still assumes
that those few bits are error free. The manufacturing process treats it like
any other digital circuit: it assumes the data-processing part should be
fault free (e.g., it treats the MSB and the LSB the same). Digital circuits
also demand more power than analog versions.

If an analog circuit could be designed for such algorithms, not only could it
be much faster, it would probably consume far less power. With super-high
bandwidth at low power, an analog processing chip might give us a much better
playground for trying advanced algorithms. The materials could then be
optimized, and we might end up with something like a brain.

Brains (all animals) process far more information for the power they consume.

Digital circuits give us low-level reliability, so they are really good for
simple control. Analog/biology doesn't give us that, but it can give us
high-level reliability while delegating the low-level reliability to digital
counterparts.

~~~
usgroup
I think you're wrong about the ML precision. You need high precision for most
recursive machine learning tasks, because you're compounding errors otherwise.

Typically you can’t even use floating point representation: not accurate
enough.
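
A quick illustration of the compounding (my own sketch): iterate the same
recurrence in 32-bit and 64-bit floats and watch the trajectories part ways.

    # Chaotic recurrence (logistic map): rounding differences between
    # precisions compound until the two results are unrelated.
    import numpy as np

    x32, x64 = np.float32(0.4), np.float64(0.4)
    for _ in range(60):
        x32 = np.float32(3.9) * x32 * (np.float32(1) - x32)
        x64 = 3.9 * x64 * (1.0 - x64)
    print(x32, x64)   # uncorrelated after a few dozen iterations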

~~~
SomaticPirate
Disagree. [https://arxiv.org/abs/1805.08691](https://arxiv.org/abs/1805.08691)
demonstrates that an 8-bit architecture for a pre-trained CNN provides more
than acceptable results, with lower latency and higher throughput than a
higher-precision version.

~~~
usgroup
Sure, but we were talking about precision rather than throughput; tbh the
result is hardly surprising.

------
ineedasername
I guess I see the point being made, but it was all a bit long on rhetoric and
analogy, and short on concrete examples.

~~~
Animats
Yes. This is the same argument that audio nuts make for analog recording. It's
known to be bogus. (Yes, 16-bit CD audio has resolution problems for soft
passages, and early filters for the sampling rate were not too good. We're
past that.)

One of the few concrete examples of a complex analog computer system still in
use in recent decades was the F-16 flight control system: a four-channel,
fly-by-wire stabilization and control system, all analog. It was, at the
time, the most advanced flight control system, and it's still well thought
of. That's from the 1970s; modernized F-16s use a digital replacement.

For several decades, full-authority digital flight control systems were
disfavored in aerospace because there were no analysis techniques to be sure
they were bug-free. There are ways to analyze an analog computer system to be
sure that the test-case set is sufficient and that behavior will be smoothly
continuous between the test points. Eventually that problem was solved for
digital flight control systems, and now everybody goes digital.

~~~
Junk_Collector
Analog computers still crop up in all sorts of places these days, but they
have become very niche and are typically a small part of a larger system.
It's rare to see a general-purpose analog computer outside of a very small
number of research labs, and FPAAs exist but are expensive novelties. As
digital processors continue to get better and we develop better ways to work
with them, there just isn't the need for the cost and effort required to make
analog computers.

Some common places where you might see one are audio driver amplifiers, where
it is common to implement a bit of translinear logic at the output stage to
reduce distortion. The same goes for some high-quality power supplies.

Sometimes very high-performance sensor systems will have an analog
pre-processor that performs some calculation on the incoming signal before
handing it off to the digitizers and DSP. Think multi-microphone arrays.

------
_bxg1
A couple of interesting examples in this (loose) subject area:

[https://arstechnica.com/science/2018/07/neural-network-imple...](https://arstechnica.com/science/2018/07/neural-network-implemented-with-light-instead-of-electrons/)

[https://pruned.blogspot.com/2012/01/gardens-as-crypto-water-...](https://pruned.blogspot.com/2012/01/gardens-as-crypto-water-computers.html)

------
Junk_Collector
Perhaps I missed it, but the point of the article seems to be that eventually
a complex "neuro" computer will be too complicated to understand and will
produce unaccountable results. The author makes a few poor assumptions about
analog vs. digital computing and rambles a lot, but ultimately his main point
doesn't have much to do with either.

------
ubu7737
Perhaps I'm missing the more grandiose point, but I think his message is
simple: digital computing is a handy abstraction for humans who like to count,
but computing is essentially analog.

As a layman, I like to refer to some work on genetic algorithms and magnetic
flux as a reference for this.

I mean this:
[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50....](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.9691&rep=rep1&type=pdf)

------
unnouinceput
There is no such thing as "analog". Everything in this Universe is digitized,
down to the elementary particles inside atoms. What you experience as
"analog" is just digitization with a very fine grain or, if you like, with
better sampling. So no, the future of computing is not analog at all; it will
still be digital, just with better sampling, a.k.a. quantum computing.

------
peter_d_sherman
Quantum, analog, and fuzzy logic are different terms that I believe some
bright person in the future will prove to be describing _exactly_ the same
underlying phenomenon.

(Also, if this could be accomplished, the next step in human evolution might
be proving that the entire universe is a giant Analog computer, but that's
Sci-Fi at this point in time...)

------
ychen306
> It is entirely possible to build something without understanding it. [...]
> Our relationship with true A.I. will always be a matter of faith, not proof

WTF. I have no problem with building something without understanding how or
why it works, but I do have a problem with using something without at least
some sort of guarantee on its behavior.

~~~
jethro_tell
You may, but the vast majority of the population is already way past that,
with iPhones, search bubbles, apps...

And honestly, even if you're in tech, this field is so broad and so many
people are doing so many cool things that there's almost no way to keep up
with it unless you're a Luddite. If the case we're talking about involves
building a single giant computer that no one knows how to use, you won't have
much say in that.

------
etaioinshrdlu
I suspect that the "excess" precision of digital computers could be
retargeted toward some other use, negating any benefit that an analog
computer had.

------
jakeogh
Not really related, but there is an interesting use of analog effects in FPGAs:
[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50....](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.9691&rep=rep1&type=pdf)

Old paper, from 1996. Fragile.

------
reading-at-work
> even though vacuum tubes are commercially extinct

Sounds like someone's never shopped for guitar amps before.

~~~
tabtab
In art, sometimes you want "happy accidents", even if they are not always
reproducible.

------
nat8265639392
Can anyone recommend any good books or resources to learn more about analog
computing?

~~~
boomlinde
The Heathkit EC-1 operations manual seems like a good start for electronic
analog computers and has some examples of problem circuits:
[http://www.ccapitalia.net/descarga/docs/1959-ec-1-heathkit.p...](http://www.ccapitalia.net/descarga/docs/1959-ec-1-heathkit.pdf)

I also like to share an instructional video on mechanical computers on every
suitable occasion:
[https://www.youtube.com/watch?v=s1i-dnAH9Y4](https://www.youtube.com/watch?v=s1i-dnAH9Y4)
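
For a taste of what those problem circuits look like, the canonical example
is two integrators wired in a loop to solve a differential equation. A
numerical emulation (my own sketch) of the EC-1-style setup for y'' = -y,
whose solution is cos(t):

    # Two "integrators" in a loop with a sign inversion, solving y'' = -y.
    # On the EC-1 each integrator is an op-amp with a feedback capacitor,
    # and "programming" means patching the blocks together.
    dt = 0.001
    y, dy = 1.0, 0.0              # initial conditions = capacitor voltages
    for _ in range(int(6.2832 / dt)):   # run for one period, 2*pi seconds
        dy += -y * dt             # first integrator (with inversion)
        y += dy * dt              # second integrator
    print(y)                      # ~1.0 again, since cos(2*pi) = 1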

------
adamnemecek
Does anyone know how to contact the author?

------
agumonkey
Every time I see a large digital system, the deterministic nature fades and
it starts looking probabilistic, noisy... continuous.

------
adamnemecek
I've been saying this for a while.

[https://hn.algolia.com/?query=adamnemecek%20analog%20quantum...](https://hn.algolia.com/?query=adamnemecek%20analog%20quantum&type=comment&sort=byPopularity&prefix&page=0&dateRange=all)

The main problem with old-school analog computers is that they were based on
electricity, which ended up causing drift (imprecision that gets worse over
time).

A photonic, analog, quantum computer (also called a continuous-variable
quantum computer)
[https://en.wikipedia.org/wiki/Continuous-variable_quantum_information](https://en.wikipedia.org/wiki/Continuous-variable_quantum_information)
is possible and would run circles around discrete quantum computers (those
with qubits).

~~~
krastanov
The claim about the relative performance of continuous variable models versus
circuit models is completely unsubstantiated. Especially given that most uses
of the continuous variable systems are to encode discrete qubits on top of
them with GKP/cat/binomial codes. Is there any Complexity Theory work
published about continuous variable models?

And photonic systems do not magically fix the noise issue. Noise grows in a
fast non-linear fashion with the size of the system, so the "constant factor"
noise suppression you gain from switching to photonic systems is quickly
washed away.
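
A back-of-envelope version of that last point (my arithmetic, not a
citation): if each operation fails independently with probability p, a
depth-n computation succeeds with probability (1 - p)^n, roughly e^(-pn), so
a constant-factor improvement in p only buys a constant-factor increase in
feasible depth.

    import math

    for p in (1e-3, 1e-4):        # a 10x "constant factor" noise reduction
        for n in (10**4, 10**5):
            print(p, n, math.exp(-p * n))
    # p=1e-3: depth 10^4 already succeeds with probability ~4.5e-5.
    # p=1e-4: the wall moves to depth 10^5 - the decay stays exponential.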

~~~
adamnemecek
> Especially given that most uses of the continuous variable systems are to
> encode discrete qubits

Current uses, maybe.

> Is there any Complexity Theory work published about continuous variable
> models?

"Complexity and Real Computation".

> And photonic systems do not magically fix the noise issue. Noise grows in a
> fast non-linear fashion with the size of the system, so the "constant
> factor" noise suppression you gain from switching to photonic systems is
> quickly washed away.

Is there anything published on the fact that this cannot be overcome?

~~~
krastanov
Just looking at the wiki pages for Real Computation is enough to see it is
not a physically realizable model. For a more detailed discussion of the
problem, see the essay "NP-complete Problems and Physical Reality". To quote
from it: "The problem, of course, is that unlimited-precision real numbers
would violate the holographic entropy bound".

At every level of physics there is a bound on precision, from boring things
like classical macroscopic thermodynamics and noise, to quantum noise, to
bounds that emerge in speculative theoretical physics. Basically, anything
capable of encoding an infinitely precise real number in a finite amount of
space will collapse and form a black hole.
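
For reference, the bound being invoked (a sketch; the constants are standard,
the framing is mine): the holographic principle caps the entropy of a region
by its bounding area in Planck units, about 1.4e69 bits per square meter, so
no finite region can store an unlimited-precision real number.

    import math

    l_p = 1.616e-35                     # Planck length, meters
    bits_per_m2 = 1 / (4 * l_p**2 * math.log(2))  # A/(4*l_p^2) nats -> bits
    print(f"{bits_per_m2:.2e}")         # ~1.38e+69 bits per square meter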

In case this is not convincing enough, to your question about why this noise
cannot be overcome: if there were a method that could overcome the noise
asymptotically in photonic systems, then that method would work in electric
systems too. And there is actually such a method: turning the computer into a
digital computer thanks to error correction codes.

The claim that Real Computation can be realized in our universe is comparable
to the claim that one can construct a perpetual motion machine or some other
generator of free energy. They are both preposterous given our understanding
of physics. And yes, I would celebrate if either one turned out to actually
be possible, but incredible claims require incredible evidence.

~~~
adamnemecek
> Just looking at the wiki pages for Real Computation is enough to see it is
> not a physically realizable model.

This computer would be literally at the physical limit reality can take. That
limit is actually quite high, although not unlimited. If you could make a
computer faster by 10^30, OK, I might not need unlimited. What if that number
could be 10^300?

Furthermore, and this is important, this computer allows for a fundamentally
different type of computation.

> At every level of physics there is a bound on precision, from boring things
> like classical macroscopic thermodynamics and noise, to quantum noise, to
> bounds that emerge in speculative theoretical physics.

Correct, this can be accounted for. Don't press me on the details; you won't
be satisfied with the answers. Can we talk about what would be possible if
this computer were possible and work backwards? As a mental exercise.

Also, what if I'm fundamentally more interested in probabilistic computation,
and this error can actually be a foundation of my computation?

> then that method would work in electric systems too.

Electricity is fundamentally more "unstable". This error compounds. Think of
the difference in attenuation rate between electric and optical media. Where
does this attenuation come from?

> And there is actually such a method: turning the computer into a digital
> computer thanks to error correction codes.

Nope. Think of it as averaging unstable signals. The result is very much
continuous.

You are getting too hung up on the infinity aspect. Let's talk more about
what sort of programming model this would allow for.

A Turing machine technically has a tape of infinite length, while current
computers don't. Does that mean that no Turing machine has ever been
constructed? Does the answer to this question matter?

Also, and this is important: compared with a normal computer, this computer
would not heat up. Current CPUs can't get much larger because they can't
dissipate heat fast enough. What if you could have a CPU the size of a
one-cubic-meter cube?

~~~
krastanov
> Can we talk about what would be possible if this computer were possible and
> work backwards? As a mental exercise.

Certainly! This type of thought experiment is how much of physics progresses.
But its results should be contemplated seriously. Assuming it is possible to
prepare a system that works like your computer, it will immediately collapse
into a black hole, because it breaks the holographic principle (and plenty of
other bounds).

Similarly we can imagine what happens if FTL was possible: time paradoxes. Or
if we could measure both position and momentum: UV catastrophe.

> You are getting too hung up on the infinity aspect.

The infinity aspect is what makes Real Computation more powerful than other
types of computation. It is also what makes it impossible in our universe. It
has absolutely nothing to do with the type of (countable) infinity that is
the length of a Turing machine tape. You can make useful asymptotic
statements about Turing machines or logic circuits; the only useful thing you
get out of Real Computation comes _strictly after_ you take the limit to
infinity. Otherwise you have old, boring, less-powerful analog computers
(which are nonetheless marvels of engineering, and their creators deserve
praise).

> Electricity is fundamentally more "unstable".

This is just nonsense.

By the way, the majority of continuous variable hardware actually operates in
the microwave regime, which is closer to the fields of electrical and radio
engineering than to lasers or THz systems.

Lastly, yes, I completely agree that such hardware might be interesting in
many cases. But claiming that Real Computation is possible (as opposed to
just admitting that _small_ analog computers are occasionally useful) is
far-fetched. Claiming that such hardware can be built, when it flies in the
face of everything we know about physics, is like claiming you can build a
perpetual motion generator. As I said, extraordinary claims require
extraordinary evidence.

Please believe me, it is extremely rewarding to learn the details of these
arguments, much more rewarding than the existence of Real Computation would
be. In comparison with the real world, "Real Computation" is a boring
cop-out. The last strip here describes the idea well:
[http://calamitiesofnature.com/](http://calamitiesofnature.com/)

~~~
adamnemecek
> Certainly! This type of thought experiment is how much of physics
> progresses. But its results should be contemplated seriously. Assuming it
> is possible to prepare a system that works like your computer, it will
> immediately collapse into a black hole, because it breaks the holographic
> principle (and plenty of other bounds).

You are condescending.

> Similarly we can imagine what happens if FTL was possible: time paradoxes.
> Or if we could measure both position and momentum: UV catastrophe.

This has been done extensively and not much new can be brought to the table.
Not many people talk about photonic analog computation, so indulge me.

There are people who research this field extensively (continuous variable
quantum computation). Your whole argument just relies on a section from a
Scott Aaronson paper. Can I ask what other research you use as a foundation
of your worldview? Are you familiar with the work of, say, Alessio Serafini?

You read too much theory of complexity; it doesn't have all the answers. Can
I ask about your stance on, say, Lie theory as it relates to quantum physics,
and why exactly it is that Lie theory provides a good foundation for
reasoning about quantum phenomena?

Can we talk about anticommutativity? And infinitesimals?

> The infinity aspect is what makes Real Computation more powerful than other
> types of computation.

10^300 is infinity. 10^3000 is infinity. 10^300000000 is infinity. The point
is that you can make each atom do a lot more computation, and much more
valuable computation. I'm not interested in your arguments about infinity. I
legit don't care, stop trying, I find the premise of the question flawed.

And last but not least, it is "Real" computation. It's the limit of
computation you can "fit into reality". Are you legitimately saying that
current architectures are at the physical limit that reality can take?
There's nothing realer than reality.

Who has actually attempted to build this computer? Like, we haven't really
given this a good shot. Things are impossible until they are not.

~~~
krastanov
> You are condescending.

I certainly did not mean to, and I would blame the limits of this medium for
that false impression. I was serious in my comments about thought experiments.

> Can I ask what other research you use as a foundation of your worldview?

Aaronson is good at explaining things, so I use him as a reference, but look
at any quantum computing textbook and the same argument will be present
there. I am not familiar with Alessio Serafini, but there is plenty of other
work in the field, from Xanadu Inc. to GKPs to the encodings used at the
institute where I work.

> it doesn't have all the answers.

This is always true, but you have given fewer answers.

> Can I ask about your stance on, say, Lie theory as it relates to quantum
> physics, and why exactly it is that Lie theory provides a good foundation
> for reasoning about quantum phenomena? Can we talk about anticommutativity?
> And infinitesimals?

Sure? If you are doing a litmus test of my knowledge: yes, I do understand
those topics and would be happy to discuss them. You can find my email on my
profile page. I am sincerely always interested in talking about this with
curious people.

> 10^300 is infinity. 10^3000 is infinity. 10^300000000 is infinity.

This is really missing the point. When things grow exponentially (like
compounding errors) the numbers you are quoting are small.

> I'm not interested in your arguments about infinity. I legit don't care,
> stop trying, I find the premise of the question flawed.

This is simply intellectually dishonest. How do you expect a discussion to
reach the truth if you disregard arguments you dislike?

> Are you legitimately saying that current architectures are at the physical
> limit that reality can take?

No, I am just saying that any technique that might be used to improve
practical analog computers would already be of use in digital computers. I am
also saying that making a better real-world analog computer has nothing to do
with the theoretical model of Real Computing, which is unphysical. It is not
a continuous change - the gulf is discrete and qualitative.

> Who has actually attempted to build this computer?

Analog computers in various media have been built for a century, more
recently including light. Xanadu seems to be trying to commercialize them in
the case of quantum continuous variables, and they will probably produce some
interesting hardware, but claiming that their (potential) hardware is more
powerful than the usual model of quantum computing (also only potential for
now) is unscientific. My employer (the Yale Quantum Institute) also employs
continuous variable systems in other contexts.

------
tabtab
I agree that analog computing is probably more efficient, or can be more
efficient if predictability is sacrificed.

However, there are _societal implications_ to this. We prefer that processing
steps be traceable and dissectable, both for accountability and for managing
the distribution of tasks and parts. We may not accept a higher degree of
"rogue machines" to gain average efficiency.

However, I suppose a given country or group could accept the tradeoff to gain
a military advantage, which could spell chaos. They may accept a higher
degree of rogue battle bots in order to win via average efficiency, or at
least be willing to gamble that the theory is true. The chance of a
high-stakes or borderline-suicidal leader/dictator _eventually_ coming on the
scene is historically high, inadvertently leading to runaway human-flattening
bots. Maybe that's the answer to the Fermi Paradox.

~~~
tabtab
Okay, people, why the "-2"? Fess up.

