
Analog Computers - StreamBright
https://blog.degruyter.com/algorithms-suck-analog-computers-future/
======
cr0sh
A good summary of analog computers can be found on the wikipedia article:

[https://en.wikipedia.org/wiki/Analog_computer](https://en.wikipedia.org/wiki/Analog_computer)

Takeaway: Analog computers are limited in precision by "analog noise"; the
precision of the components used determines the precision of the output.
Usually no more than 3 or 4 decimal places are possible, at least with the
tech that was used in their heyday. I would say that is still close to the
case even today. Of course, one could do things like cryogenic cooling or
such, but that becomes a cost factor at that point.

Something else that wasn't mentioned:

The ADALINE/MADALINE memistor technology is an analog component, and is a
hardware equivalent of a McCulloch–Pitts neuron model (perceptron).

The memistor is NOT to be confused with the memristor, which is a different
technology; the memistor is a 3-terminal device ("memory transistor"):

[https://en.wikipedia.org/wiki/ADALINE](https://en.wikipedia.org/wiki/ADALINE)

...whereas a memristor is a two terminal device ("memory resistor"):

[https://en.wikipedia.org/wiki/Memristor](https://en.wikipedia.org/wiki/Memristor)
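Since each memistor cell essentially stores one adjustable weight, the software analogue of what an ADALINE unit computes is tiny. A minimal illustrative sketch (the learning-rate and epoch constants here are arbitrary, not Widrow's hardware scheme):

```python
# Minimal ADALINE-style unit: a weighted sum with a threshold output.
# In the original hardware, each memistor stored one weight as an
# electrochemically adjustable conductance.

def predict(weights, bias, x):
    """Threshold the linear activation to get a binary output."""
    s = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if s >= 0 else -1

def train(samples, lr=0.05, epochs=200):
    """LMS (Widrow-Hoff) rule: nudge weights against the raw error."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in samples:
            s = b + sum(wi * xi for wi, xi in zip(w, x))
            err = target - s
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Learn logical AND (linearly separable, so ADALINE can represent it).
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, b = train(data)
print([predict(w, b, x) for x, _ in data])  # [-1, -1, -1, 1]
```

The hardware version did the same multiply-accumulate with conductances and currents instead of floats.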

~~~
robotresearcher
> Usually no more than 3 or 4 decimal places are possible

By that do you mean accurate to 1 part in 100 (3dp) or 1000 (4dp) or what?
Since the scale of a representation is arbitrary, I'm not sure what dp means
here.

~~~
flavio81
Good question. I guess precision ultimately comes down to fractions of the
maximum voltage swing allowed by the computer. For example, if the voltage
goes from -5 to +5 volts, the swing is 10 V, and if noise allows 0.1 mV of
resolution, then the precision is 1/100,000 of the full swing.

This could be expressed, in the end, simply in decibels: signal-to-noise
ratio, as in classic analog systems.
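Plugging in those example numbers (10 V swing, 0.1 mV noise floor) shows how swing-to-noise translates into decibels and equivalent bits. The figures below are just the ones from this comment, not measurements:

```python
import math

swing_v = 10.0     # -5 V to +5 V full-scale swing
noise_v = 0.1e-3   # 0.1 mV noise floor

levels = swing_v / noise_v        # distinguishable levels, ~100000
snr_db = 20 * math.log10(levels)  # amplitude ratio in dB, ~100 dB

# Rough digital equivalent: each bit of resolution is ~6.02 dB.
bits = snr_db / (20 * math.log10(2))
print(round(snr_db, 1), round(bits, 1))  # 100.0 16.6
```

So that hypothetical machine would sit around a 16-bit digital equivalent, which is far better than real analog computers achieved in practice.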

~~~
kurthr
I think it usually refers to accuracy out of a range of 1: typically 3 decimal
places means 1000 ppm and 4 dp means 100 ppm.

The typical problems with analog computers are many... precision of components
(e.g. gain or attenuation) is limited to ~0.1% for resistors and ~1% for
capacitors (inductors aren't typically used). You can try to tune things
(ratiometrically) to get higher accuracy, but at the cost of increased noise
and temperature sensitivity. The more complex the system, the more things can
go wrong... so you end up needing simple systems or simple tools (digital).

Another typical problem is that if you build a filter (e.g. a transfer
function with a summer or differencer), then you will pretty quickly tend to
clip the dynamic range on either end: at a maximum voltage (integrators) or at
a minimum noise level (differentiators). You can play some games with log
converters, but accuracy really still matters, and drift or gain error over
time is rarely acceptable.

The best way to use analog computers is with negative feedback to null the
input. They do that amazingly well... so you can build a temperature
controller, missile tracker, or actuator that only minimizes an error so that
high gain corrects for any inaccuracy or offset.
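That last point, nulling an input with high loop gain, is easy to see in a toy simulation: a proportional controller around a first-order plant, where the residual error shrinks roughly as 1/gain. All constants here are illustrative, not from any real system:

```python
# Toy sketch of "negative feedback to null the input": a proportional
# controller wrapped around a first-order lag plant.

def run(gain, setpoint=1.0, steps=2000, dt=0.001, tau=0.1):
    """Return the residual error after the loop settles."""
    y = 0.0
    for _ in range(steps):
        error = setpoint - y      # the quantity the loop tries to null
        u = gain * error          # high gain amplifies tiny errors
        y += dt * (u - y) / tau   # plant: tau * dy/dt = u - y
    return setpoint - y

print(run(gain=1))    # residual ~0.5: low gain leaves a large offset
print(run(gain=100))  # residual ~0.01: high gain nulls the error
```

The steady-state residual is setpoint/(gain+1), so component inaccuracy inside the loop mostly washes out; exactly why analog feedback controllers work so well.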

~~~
VLM
A big technical EE problem for analog computers is interconnects and their
EMI/EMC interference issues and impedance issues. The analog specs for on-chip
digital circuitry are much more relaxed to develop around. You can work
around the interconnect issues on analog computers by dumping lots of power
into the driver and input circuits but eventually some joker is going to point
out that it would be electrically cheaper (in terms of current/power draw,
etc) to transmit that 0 to 5 volt signal using something like I2C or SPI and
then you're on a fast slippery slope to turning your analog computer into an
exercise in DSP programming. At some point of complexity, the interconnect
cable driver circuitry is going to be power hungry enough that it's cheaper to
emulate the whole thing in floating point on a digital computer.

If you make a graph of PITA vs. bit resolution: we're all pretty comfortable
emulating digital computers on analog real-world circuits using binary ones
and zeros. Surely the gain is very little and the PITA increases greatly if
you implement digital computers on ternary +/-/0 analog computers. Some think
the graph is U-shaped, and that at some resolution level the PITA of
high-resolution analog falls below digital's, so it makes sense. Many, like
me, think that graph never U-shapes such that anything is "better" at
emulating digital computers than analog physical circuits based on binary 0/1.
AFAIK no one has built a modern floating point accelerator using op-amps and
A/D and D/A converters, so I find it unlikely to be useful.

A two-transistor NAND gate is, after all, just an analog computer using simple
binary signals. All computers are analog; it's just that the popular digital
ones are only defined and well behaved when using binary analog signals.

There is some audiophile effect going on. Surely an mp3 codec running on a
vacuum tube op-amp would sound more mellow and all that.

~~~
nickpsecurity
I always hear such things from EEs. So, what are your thoughts on stuff like
this, in terms of analog "always" being more expensive or power hungry:

[http://www.cisl.columbia.edu/grads/gcowan/vlsianalog.pdf](http://www.cisl.columbia.edu/grads/gcowan/vlsianalog.pdf)

[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.325...](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.325.4142&rep=rep1&type=pdf)

Now, I won't argue with cheaper to _develop_ since analog is manual with a lot
of issues to contend with. I'm just wondering if there are more applications
that can get huge speedups at lower power or cost than digital. I know the
ASIC makers in power-sensitive spaces are already moving parts of their chips
to analog for power reduction. That's what mixed-signal people tell me anyway:
the specifics are often secret. So, I have to dig into CompSci looking for
what they've tried.

~~~
pjc50
The analogue neural net stuff is quite a reasonable example, because it's
specifically trying to mimic a real analogue system and tends to be noise-
tolerant.

Couple of points from your lower link:

- return of "wafer-scale"! Nice.

- "the average power consumption is expected to stay below 1 kW for a single
wafer"; not bad, but you're still going to need to cool that

- actually a hybrid system: long-range comms is digital and multiplexed to
save wiring, converted to analogue at the synapse

- "All analog parameters are stored in non-volatile single-poly floating-gate
analog storage cells developed for the FACETS project" => basically analogue
Flash? A development of MLC, I suppose

On reading the whole thing, it seems the magic is actually in choosing which
bits to make digital. The "long range" neural events are sent as differential
6-bit bursts, multiplexed, which they claim saves significant power.

------
ChuckMcM
I have fond memories of building an analog computer as a project from Popular
Electronics that simulated a lunar lander mission. At reset you had fuel,
altitude, horizontal and vertical velocity. Your input was an angle and a
thrust knob (two potentiometers), and a comparator that latched when altitude
reached 0, provided your velocities were less than 1 m/s. It was tremendous
fun to play, but there were no graphics; just some mA meters to tell you your
status.

~~~
JKCalhoun
Sounded cool, so searched for it. Page 41:

[http://www.americanradiohistory.com/Archive-Elementary-Electronics/1970/Elementary-Electronics-1975-11-12.pdf](http://www.americanradiohistory.com/Archive-Elementary-Electronics/1970/Elementary-Electronics-1975-11-12.pdf)

~~~
ChuckMcM
Great! That is definitely the thing I built, so other than the wrong magazine
name, and that it went up instead of down, it's exactly like I remember it :-)

~~~
JKCalhoun
I printed the article to (hopefully) build in the future.

------
mindcrime
If any of you guys are into analog computing, just as an FYI, there's a sub-
reddit dedicated to the topic (full disclosure, I started this particular
one). analogcomputing.reddit.com

There's not a ton of content there yet, but please feel free to add anything
you come across. I'm a fan of the idea and suspect, like the author of this
piece, that there is "something there".

~~~
smellf
How long have subreddits been accessible through a subdomain? Is this
something the subreddit mods have to turn on?

~~~
snuxoll
It's been this way since I started using the site in 2009.

~~~
smitherfield
til.reddit.com !

------
peterburkimsher
My boss at my previous job at Fisher & Paykel Healthcare in New Zealand is
using analogue computing today.

He developed Pertecs, which is a rudimentary analog computer paradigm written
in C.

[http://tcode.auckland.ac.nz/~mark/Signal%20Processing%3A%20P...](http://tcode.auckland.ac.nz/~mark/Signal%20Processing%3A%20PERTECS.html)

He got me to write some code to compile schematic diagrams into the XML config
files. He's also done something similar now to compile from LaTeX, and he
ported the controller from a Mac Mini to a Raspberry Pi.

Pertecs is being used to control an artificial lung, which is used for
research into obstructive sleep apnoea (snoring).

The problem is, he's retiring, and I'm probably the only other person in the
world who knows how to use his program. I would move back there, but the
immigration policy got more difficult (minimum salary of $75k), so I'm
seriously wondering whether I should stay in Taiwan longer and try to
naturalise here.

~~~
sleet
I believe the $75k minimum salary is for non-skilled jobs (the point being: if
you're in a non-skilled but 'high'-paying role, we're still interested in
you). The minimum salary for skilled jobs is $50k [1][2].

That and I doubt you would find many people working at F&P Healthcare that
earn less than $75k.

[1] [https://www.immigration.govt.nz/about-us/media-centre/news-notifications/skilled-migrant-category-changes](https://www.immigration.govt.nz/about-us/media-centre/news-notifications/skilled-migrant-category-changes)
[2] [https://www.immigration.govt.nz/employ-migrants/hire-a-candidate/support-a-candidates-visa-application/essential-skills-visa](https://www.immigration.govt.nz/employ-migrants/hire-a-candidate/support-a-candidates-visa-application/essential-skills-visa)

~~~
peterburkimsher
"we're" - it sounds like you're connected to NZ immigration! I can fill you
in with more details if you're interested.

$75k is 4x my current salary in Taiwan. Yes, I know the economy is totally
different in NZ, but I don't have high hopes that changing country will
suddenly make me become rich. My boss here pays me the minimum that the
government allows for a Masters graduate on a foreigner work visa.

The other consideration is my girlfriend. She applied for Working Holiday, but
wasn't one of the 600 lucky ones. We were in an internet café with the fastest
connection in Kaohsiung, but the site just wouldn't load in time. She's 30, so
she can't try again next year. If we wanted to get a partnership visa, we
would have to live together and share a bank account for 1 year. Getting
married doesn't even help, just living arrangements.

We're getting kind of sidetracked from the original topic of analog computers,
but if it's something you want to talk about more, then just search for my
name on Facebook and send me a message. It would be nice to personify the
immigration forms.

------
ramgorur
It's strange that the article does not mention anything about hydraulic
macroeconomics and MONIAC; they were once widely used to verify theories in
economics.

[https://en.wikipedia.org/wiki/MONIAC](https://en.wikipedia.org/wiki/MONIAC)
[https://en.wikipedia.org/wiki/Hydraulic_macroeconomics](https://en.wikipedia.org/wiki/Hydraulic_macroeconomics)

~~~
alimw
So this looks interesting, but my first thought is: how can a conserved
quantity like water model something like money, which is created and destroyed?

~~~
kybernetikos
I don't think there's any requirement that the amount of water in the model
remains constant. In principle you could drain or open valves to add more.

------
brian-armstrong
A related topic: running digital logic at subthreshold voltages, where
transitions usually (but not always) happen correctly. It can be useful if
you're attempting to measure something probabilistically anyway.

[https://pdfs.semanticscholar.org/7244/1c8377b1dfde1909d21463...](https://pdfs.semanticscholar.org/7244/1c8377b1dfde1909d21463a7c20ccf0cf38d.pdf)

------
11thEarlOfMar
So, Keith Emerson's Moog Synthesizer[0] was an analog computer, yes?

[0]
[http://i.telegraph.co.uk/multimedia/archive/03593/emerson6_3...](http://i.telegraph.co.uk/multimedia/archive/03593/emerson6_3593555b.jpg)

~~~
kitotik
Certainly. Most modular synths have all the basics - Sum, divide, multiply,
add, XOR, etc.

And don’t forget, analogue random is the shit. In your face entropy!

------
kitotik
Excellent. So maybe all that time I’ve wasted playing with analogue modular
synthesis will finally pay off!(?)

~~~
nixpulvis
My thoughts exactly. _beep boop_

------
hcarvalhoalves
The author certainly has a point, but I wonder if analog computers (in the
sense of DA/AD + ICs that can be plugged into general-purpose systems) suffer
from a lack of economies of scale? Maybe the energy + depreciation costs of
general-purpose computers are still lower than ordering a minimum of 10~100k
units of some custom analog computer with good enough quality control.

Apparently something similar to FPGA exists for analog signals [1], I wonder
how popular/practical it is.

[1]
[https://books.google.com.br/books?id=qjnnBwAAQBAJ&pg=PA93&lpg=PA93&dq=EPAC+analog&source=bl&ots=Z7MaNV9Wpb&sig=MG-KEukCW4NI7j-uqOzVzGowWW0&hl=pt-BR&sa=X&ved=0ahUKEwjkkeyO3vXUAhWHfZAKHd7ZB_4Q6AEITTAF#v=onepage&q=EPAC%20analog&f=false](https://books.google.com.br/books?id=qjnnBwAAQBAJ&pg=PA93&lpg=PA93&dq=EPAC+analog&source=bl&ots=Z7MaNV9Wpb&sig=MG-KEukCW4NI7j-uqOzVzGowWW0&hl=pt-BR&sa=X&ved=0ahUKEwjkkeyO3vXUAhWHfZAKHd7ZB_4Q6AEITTAF#v=onepage&q=EPAC%20analog&f=false)

~~~
anfractuosity
I've got some FPAA chips, which I assume are what you're referring to; the
ones I've got are Anadigm ones, and I need to get round to using them. The
downside is that they simply make use of switched capacitors for the creation
of filters, so there will be some form of discretization in the temporal
domain, I guess.

------
adamnemecek
I'm hoping that analog computing will come back in the form of photonic analog
computing. This would be more powerful than quantum digital computing (you
know the things that people are wasting time on).

Fun fact: with analog computing, one could imagine achieving Real Computation
([https://en.wikipedia.org/wiki/Real_computation](https://en.wikipedia.org/wiki/Real_computation)),
which is above and beyond Turing completeness.

Note the fun sentence: "If real computation were physically realizable, one
could use it to solve NP-complete problems, and even #P-complete problems, in
polynomial time."

We are on a wrong evolutionary branch of computing. Bits are lame-o-rama,
whereas differentiable signals are pure unadulterated flavortown.

~~~
mgraczyk
What you're describing is generally accepted to be physically unrealizable. In
fact, the sentence that follows your quoted sentence cites two commonly known
physical limitations that prevent the existence of your "computational class
above and beyond Turing".

Whether or not there exist physically realizable computations that are not
computable by a turing machine is an open question, but most physicists and
computational complexity theorists seem to believe there does not exist such a
class.

[https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis](https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis)

~~~
adamnemecek
Read unlimited as arbitrary precision.

I'm familiar with the Turing thesis but he's wrong.

~~~
shmageggy
> I'm familiar with the Turing thesis but he's wrong.

You're going to have to back up a statement like that with a whole lot of
supporting evidence if you want to be taken seriously.

~~~
adamnemecek
Look into the work of Lenore Blum. She wrote a book, "Complexity and Real
Computation".

------
dragontamer
> Luckily, with today’s electronic technology it is possible to build
> integrated circuits containing not only the basic computing elements but
> also a crossbar that can be programmed from an attached digital computer,
> thus eliminating the rat’s nest of wires altogether.

This is the most important paragraph in the entire article.

Analog computers can be made very small. It'd take an ASIC, but the 741 op-amp
was less than 100 transistors. A more modern op-amp might be under 1000
transistors... although noise issues would abound.

Bernd Ulmann has developed a methodology that performs non-trivial
computations (such as:
[http://analogparadigm.com/downloads/alpaca_4.pdf](http://analogparadigm.com/downloads/alpaca_4.pdf)),
but it's still hand-programmed by connecting wires together.

If it were digitally programmed with a digital crossbar switch (consisting of
CMOS Analog Gates instead), then it'd be controllable by a real computer.

I think what Ulmann is arguing here... is to use analog computers as a
"differential equation accelerator". Perform a lot of computations in the
digital world, but if you need to simulate a differential equation, then
simulate it on an analog circuit instead.

And there are a large number of interesting mathematical problems that are
described as differential equations.

-----------------

The main issues, as far as I can see, would be the multiplier, logarithm, and
exponential functions. IIRC, these are created using bipolar transistors...
and modern manufacturing doesn't really mix BJTs with MOSFETs.

I mean, IGBT transistors exist, but modern computers are basically MOSFET all
the way down. MOSFETs would be able to make a lot of things though: digital
potentiometers / variable resistors... the crossbar switch, capacitors,
resistors, and OpAmps.

And all of those can simulate addition, subtraction, derivatives and
integrals. More than enough to build a "differential equation accelerator"
that the author proposes.
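The "differential equation accelerator" idea is easiest to see on the textbook case: a damped oscillator, which an analog machine patches as a summer feeding two cascaded integrators. A digital sketch of that same signal flow, with each integrator emulated by a simple Euler step (constants and step size are illustrative):

```python
import math

# Damped harmonic oscillator x'' = -2*zeta*w*x' - w*w*x, wired the way
# an analog computer would patch it: a summer feeding two integrators.
w, zeta = 2 * math.pi, 0.1   # 1 Hz natural frequency, light damping
dt = 1e-4                    # integration step (illustrative)
x, v = 1.0, 0.0              # initial condition: displaced, at rest

for _ in range(int(1.0 / dt)):          # simulate one second
    a = -2 * zeta * w * v - w * w * x   # the "summer" output
    v += a * dt                          # first integrator:  a -> v
    x += v * dt                          # second integrator: v -> x

print(round(x, 3))  # amplitude has decayed from 1.0 after one second
```

On an analog machine the two `+=` lines are op-amp integrators and the solution appears in "real time" (or faster, with time scaling), with no stepping at all; that's the speedup being claimed.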

------
danmaz74
> Instead, you program it by changing the interconnection between its many
> computing elements – kind of like a brain

This is one of the worst "brain" analogies I've read lately.

------
supermdguy
>What causes this difference? First of all, the brain is a specialized
computer, so to speak, while systems like TaihuLight and TSUBAME3.0 are much
more general-purpose machines, capable of tackling a wide variety of problems.

That's why digital computers have so far been so much more successful than
analog computers. While analog computers may have the potential to be
"better", they'd require massive global hardware and software changes to
become the next big thing. People are lazy, and there's no way they'd want to
port everything over to a completely different paradigm just for better energy
efficiency and speed.

------
janekm
Analog computing makes a lot of sense in the context of genetic algorithms and
"deep learning", and I wouldn't be surprised if there are already some ASICs
under design using those principles. One big challenge is that the design kits
from the foundries aren't likely to include all the analog computer cells that
would be needed (but perhaps, for example, a current mirror into a MIM
capacitor could make for an integrator?).

~~~
read_only
At the ISCA conference last week, Yoshua Bengio, of DNN fame, talked about
their early efforts in analog computing. Here are some related slides of his:
[http://www.iro.umontreal.ca/~bengioy/talks/Brains+Bits-NIPS2016Workshop.pptx.pdf](http://www.iro.umontreal.ca/~bengioy/talks/Brains+Bits-NIPS2016Workshop.pptx.pdf)

------
smitherfield
_> The human brain is a great example – its processing power is estimated at
about 38 petaflops, about two-fifths of that of TaihuLight._

Huh? So we now have computers more powerful than the human brain? I thought
that was still some decades off. And how would one even measure such a thing?
In an apples-to-apples comparison, a "stupid human trick" floating-point
calculation savant might manage 1 flop/s.

~~~
oldandtired
In speed, yes; in terms of continuous parallel computing power at low energy
levels, technology hasn't come even close. In terms of sensory input and
processing, not even close.

I find it amusing that there is much hype about computer systems beating
humans in very specialised areas, such as go and chess.

But the missing piece here is that the human is still doing this while
continuously processing all the sensory input coming in, dealing with so much
more than the computer system is. The computer system is dealing with one and
only one subject matter, at a speed many orders of magnitude faster, and is
only just getting ahead.

~~~
dom0
> one and only one subject matter at a speed many magnitudes faster and is
> only just getting ahead.

+1

Kasparov didn't simulate 200 million moves per second to make his move.

------
jayd16
Something that might be interesting is some kind of analog fpga for neural
networks. Seems like NN weighting would translate well.

------
cctan
At last, we will have a truly random number generator! Not the fake rand()
seeded with a timestamp.
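For what it's worth, you don't need analog hardware to beat a timestamp seed in software today; mainstream OSes already pool hardware entropy. A small Python sketch contrasting the two:

```python
import random
import secrets

# Seeded PRNG: fully deterministic. Fine for simulations, bad for secrets.
rng = random.Random(42)
a = [rng.randrange(100) for _ in range(5)]
rng = random.Random(42)
b = [rng.randrange(100) for _ in range(5)]
print(a == b)  # True: same seed, same "random" stream

# OS entropy pool (mixes hardware noise, incl. RDRAND where available):
token = secrets.token_hex(16)
print(len(token))  # 32 hex characters, not reproducible from any seed
```

So "true" randomness is already exposed to programs; what an analog computer would add is raw physical noise as a first-class signal, not just a seed source.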

~~~
RachelF
Most modern Intel and AMD CPUs have a true random number generator that
derives its entropy in a non-digital way.

------
aj7
It would be interesting to simulate that airflow in LTSpice, given that that
analog contraption can do it.

------
gautam1168
this feels like forcibly renaming electronics engineering as programming

------
trophycase
First I've heard of analog computers, but they sound super interesting. So
basically the structure of the system defines the algorithm, and is therefore
very specialized and efficient?

------
aj7
The brain is not an analog computer.

~~~
chroem-
Well it's definitely not a Turing machine.

~~~
smitherfield
It is however definitely Turing-complete.

~~~
copperx
You say that as if it were a feat.

------
ankurdhama
>In analog computers there are no algorithms, no loops, nothing as they know
it. Instead there are a couple of basic, yet powerful computing elements that
have to be interconnected cleverly in order to set up an electronic analog of
some mathematically described problem.

This is exactly why the digital computer has won over the analog computer.

~~~
dom0
Uhm, no. Analogue vs. digital computers use fundamentally different kinds of
computation. Analogue computers were not used for "programming", nor was it
really necessary for the simulations done on them.

Essentially, with an analogue computer you have a rack full of analogue
building blocks and you build an electronic system equivalent to your real-
world system from them. Then you can apply inputs and observe outputs. Often,
the inputs were connected to sensors in a device, and the outputs were
connected to actuators or recorders.

When analogue computers were already in wide use, there were maybe three
digital computers on the whole planet. Later still, from the 60s to perhaps
the early 80s, "analogue computers" could (to varying degrees) perform some
simulations _orders of magnitude faster_ than contemporary digital computers.
Only when digital computers became fast, cheap, and easy enough did they
become a replacement; even then, moving from an analogue computer to a digital
program could be quite difficult, since the two operate in vastly different
ways.

Large systems have often used digital computers since the ~70s, e.g. for
recording and analyzing outputs: a company in my home town developed test rigs
for performance and crash testing of cars (and also did the testing itself to
some extent); they still had a _massive_ hybrid computer in the 80s (multiple
analogue racks plus, I think, two DG Nova systems).

------
dphov
Stop comparing the brain with a computer.

------
xfer
The author's so-called analog computers can be built using FPGAs. And the term
he might be reaching for is data-flow programming.

~~~
mindcrime
_The author 's so called analog computers_

Why do you say "so-called"?

 _can be built by using FPGAs._

Also, FPAA's (Field Programmable Analog Array)[1]

[1]: [https://en.wikipedia.org/wiki/Field-programmable_analog_array](https://en.wikipedia.org/wiki/Field-programmable_analog_array)

~~~
xfer
Maybe it was the wrong term. My point was that we have them and they are used,
but they are not cost-efficient in general. So saying that they will be the
future of computing is a bit ridiculous, imo.

~~~
mindcrime
Fair enough. I think they will be _part_ of the future of computing, but
saying they are "the future of computing" is probably hyperbole.

