
The Comdyna GP-6 analogue computer that I use daily at work - pmarin
https://www.reddit.com/r/electronics/comments/b1h0o0/the_inside_of_the_comdyna_gp6_analogue_computer/
======
jonfw
This thread had a very good OP. I don't buy that this is at all useful, though.
It would be trivial to get much more precision on a modern computer. I feel
like his arguments about conversion from analog to binary would have merit if
modern computers didn't operate at such an extreme scale. Yes, conversion from
analog to binary can be lossy, but the losses shrink as you throw more bits at
the problem, and these days we have access to a lot of memory.
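
To put numbers on that point: the worst-case quantization error of an N-bit
conversion halves with every extra bit, so precision is cheap digitally. A
minimal sketch (illustrative only):

```python
# Quantization error of an N-bit digital representation of a +/-1 V
# analog signal: the worst-case error is half of one step, and it
# halves with every additional bit.
def quantization_error(bits, full_scale=2.0):
    step = full_scale / (2 ** bits)
    return step / 2

# 8 bits  -> ~3.9 mV worst-case error
# 16 bits -> ~15 uV
# 24 bits -> ~60 nV
```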

What's interesting is that our modern computational platforms have
capabilities that far exceed most of our use cases, and this is a good
demonstration of that.

~~~
halfeatenpie
From what I've gathered from OP's comments, their analog model makes sense and
is precise enough for their work. People talk about how much more powerful a
"32-core Xeon machine" is, and they're not wrong, but just because new
technology is available doesn't mean it's worth transitioning to.

I have an example from my daily work. We were hired to update a model a
company built 6 years ago: an Excel VBA model that we're rebuilding in Python
with increased resolution and accuracy. However, for our "more accurate" model
to be accepted, we also have to prove the following:

\- Show that our model's outputs match the VBA model's outputs for the same
inputs/starting conditions (this makes sense at a high level, but at a higher
resolution it means spending more time examining why a specific value comes
out the way it does and why it can differ from the lower-resolution model)

\- Train the new users on the new model (knowledge transfer)

\- Ensure they retain the ability to update and modify the new model wherever
they need to change or understand it

\- Understand the limitations and the constraints of the new model
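
The first of those requirements amounts to a regression harness over both
models. A minimal sketch, where `run_legacy_model` and `run_new_model` are
hypothetical stand-ins (not the actual models) and the tolerance is
illustrative:

```python
import math

def run_legacy_model(x):
    # Stand-in for the 6-year-old VBA model (coarser resolution).
    return round(1.05 * x, 2)

def run_new_model(x):
    # Stand-in for the higher-resolution Python rebuild.
    return 1.05 * x

def outputs_agree(inputs, rel_tol=1e-2):
    """Return the inputs where the two models disagree beyond tolerance."""
    return [x for x in inputs
            if not math.isclose(run_legacy_model(x), run_new_model(x),
                                rel_tol=rel_tol)]

# Every mismatch is a case you have to sit down and explain to the board.
mismatches = outputs_agree([10.0, 123.45, 999.99])
```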

This basically boils down to: how much time is needed to train people on the
new model, how much money and time must be invested in building it, and
whether it's really worth building a new model when the old one has worked
well enough so far (and you understand its limitations and it meets your
needs). There's only so much time in the day, and you obviously have a ton of
other tasks that also need your attention. Getting these newer models verified
and accepted by the board is great, but it also means additional meetings you
need to set up with them to get everything OK-ed... doubly hard when everyone
else is also super busy.

I'd say modern computational platforms are better when it comes to raw power,
but the human resources required to build, verify, accept, and train people on
the new model might just not be worth it.

Remember, there are industries other than tech that use computer hardware to
get jobs done.

~~~
jonfw
I agree that updating a tech stack can be expensive and sometimes not worth
it. Being on the bleeding edge isn't a great idea for a business use case; you
have to draw the line somewhere. I do think that line falls well ahead of
analog computers, though.

I think that this guy is using the analog computer because he's an enthusiast.
That's fine. It's not better by any means other than making an employee happy.

------
basementcat
Analog computers aren’t all that unusual. For example, many airplane pilots
use a Jeppesen circular computer to calculate headings and other parameters.

[https://www.amazon.com/Jeppesen-Circular-Computer-diameter-JS514237/dp/B003VSCID0](https://www.amazon.com/Jeppesen-Circular-Computer-diameter-JS514237/dp/B003VSCID0)

Even Spock used it on the Enterprise.
[https://www.flickr.com/photos/10901121@N06/1196208122](https://www.flickr.com/photos/10901121@N06/1196208122)

~~~
Koshkin
Well, a slide rule (even a circular one) is not an "analog computer"...

~~~
howard941
How about the E6B's wind vector calculating elements? Those pieces are on the
reverse side of the slide rule bits.

[https://en.wikipedia.org/wiki/E6B](https://en.wikipedia.org/wiki/E6B)

~~~
zackbloom
Still no, an analog computer is an electronic computer which uses varying
voltages and currents to calculate results. It's not the same as a manual
computer.

~~~
Someone
Analog computers need not be electronic. They can be mechanical (e.g.
[https://en.wikipedia.org/wiki/Differential_analyser](https://en.wikipedia.org/wiki/Differential_analyser))
or hydraulic (e.g.
[https://en.wikipedia.org/wiki/MONIAC](https://en.wikipedia.org/wiki/MONIAC)),
too.

(Generic entry point:
[https://en.wikipedia.org/wiki/Analog_computer](https://en.wikipedia.org/wiki/Analog_computer))

------
djhworld
I appreciate that the OP (of the reddit post) wants to keep his/her identity
secret, but I remain completely unsatisfied with the response to someone
asking exactly what they need this analogue computer for:

> _My job is to make and maintain models of economic trends in my company’s
> industry. I find analogue computing to be the best way for me to easily and
> effectively model dynamic real world systems._

I don't have much of an understanding of analogue computers, so I can't grasp
why this computer is better at solving this sort of problem than a digital
one.

~~~
leggomylibro
I don't have a good answer, but there does appear to be a history of using
analog computing for financial modeling.

One famous example is the MONIAC, which used fluids to simulate the flow of
money:
[https://en.wikipedia.org/wiki/MONIAC](https://en.wikipedia.org/wiki/MONIAC)

I wonder if 'fuzzy' or uncertain models like economic ones can tolerate or
benefit from slightly less deterministic algorithms.

~~~
floatingatoll
My favorite “impossibly difficult” computing problem, one that digital
computers tackle by brute-force search but analog solves directly, is finding
the shortest path from A to B: you just apply electricity to the start and end
points and let the neon glow trace out the optimal solution. For example:

[https://www.nature.com/news/2002/020520/full/news020520-12.h...](https://www.nature.com/news/2002/020520/full/news020520-12.html)

~~~
leggomylibro
Apparently the Navy also used mechanical 'computers' on old battleships to
calculate firing solutions.

Sailors would adjust cranks and shafts to set the input values, and those
inputs were connected to cams which were machined in the shape of solutions to
the equations that needed to be solved. You can watch an old training video
about it from the '50s:

[https://youtu.be/s1i-dnAH9Y4](https://youtu.be/s1i-dnAH9Y4)

~~~
benj111
Bombers too.

[https://en.m.wikipedia.org/wiki/Norden_bombsight](https://en.m.wikipedia.org/wiki/Norden_bombsight)

------
driverdan
To me, OP comes off as a curmudgeon who won't learn how to do this more
efficiently in software. There's no reason they couldn't do it faster and more
accurately in software.

~~~
gh02t
I kinda feel the same, but I'd probably put a bit more of a charitable spin on
it. I can certainly see how it might be more intuitive to the OP to use an
analog computer and how it might not be worth the cost of moving to a software
system when this is good enough.

What I would disagree with is any assertion that this is objectively the
_best_ way, and OP does acknowledge in the original thread that this isn't
necessarily the overall optimal setup.

------
madengr
That's a fascinating read.

Analog computing may make a comeback:

[https://spectrum.ieee.org/computing/hardware/not-your-fathers-analog-computer](https://spectrum.ieee.org/computing/hardware/not-your-fathers-analog-computer)

------
rotexo
As some of the commenters on this reddit thread mentioned, any electronic
musician who has a Eurorack rig containing, for instance, a Make Noise Maths,
technically uses an analog computer regularly at work.

------
stillworks
How fascinating. I wonder what the precision of such computers is for real
numbers, and whether they can be used for integer math at all?

~~~
brandmeyer
The closest I've come is working with op-amp circuits. Generally speaking:
Doing almost anything in analog is more expensive than doing it digitally.
These days, the only things you do in analog are things that are impossible to
do digitally, like anti-aliasing filters. RF is another story, but even there
the economics are changing with direct-sampling ADCs.

1% resistors are easy to find. 0.1% can be found at a high price. 0.01% can
also be found, for roughly 1 USD/ea, and only in the E12 series. Capacitors
are worse by a factor of 5 or so. Those tolerances turn into scale-factor
error, common-mode rejection ratio, DC bias, and so on. Without breaking the
bank, you can easily build a circuit to maybe 7 bits of accuracy, and as a
small-timer you can do maybe 12 bits or so. But in volume, it's almost always
better economically to use a lower-accuracy circuit and perform calibration at
the factory or in the field.
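
The tolerance-to-bits arithmetic behind those figures is just a base-2
logarithm; a quick sketch (a rough rule of thumb, not a full error budget):

```python
import math

# A part that is wrong by a fraction `tol` limits an uncalibrated
# analog circuit to about log2(1/tol) bits of accuracy.
def tolerance_to_bits(tol):
    return math.log2(1.0 / tol)

# 1% parts    -> ~6.6 bits  (the "maybe 7 bits" figure)
# 0.1% parts  -> ~10 bits
# 0.01% parts -> ~13.3 bits (so ~12 bits in practice is plausible)
```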

Aside: I fact-checked myself on Digikey and was pleasantly surprised to find
that you can get 0.01% resistors without "calling for quote." A few years ago
the limit for small-timers was effectively only about 0.1%. Even still, 1
USD/ea is a BOM budget-buster for all but the narrowest of uses.

~~~
esmi
Circuits with an ENOB > 12 definitely exist. See Table 1 in the app note
below for an example.
[https://www.maximintegrated.com/en/app-notes/index.mvp/id/5384](https://www.maximintegrated.com/en/app-notes/index.mvp/id/5384)

~~~
brandmeyer
ADCs with an ENOB well above 12 exist. 24-bit sigma-delta ADCs are pretty easy
to come by without breaking the budget.

But analog computation circuits (subtraction, gain, addition) are another
matter entirely.

~~~
esmi
Doesn’t one make a sigma-delta from subtraction, integration, and gain stages?
If one can make a sigma-delta, one can make other stuff too.

[https://en.m.wikipedia.org/wiki/Delta-sigma_modulation#/media/File%3ABlock_Diagram_Delta-Sigma_ADC\(minor_change\).svg](https://en.m.wikipedia.org/wiki/Delta-sigma_modulation#/media/File%3ABlock_Diagram_Delta-Sigma_ADC\(minor_change\).svg)
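
A first-order modulator really is just those stages in a loop: subtract the
fed-back output, integrate, quantize to one bit. A minimal simulation sketch
(illustrative, not a hardware model):

```python
def delta_sigma(x, n_samples=10000):
    """Modulate a constant input x in (-1, 1) into a +/-1 bitstream."""
    integrator = 0.0
    feedback = 0.0
    bits = []
    for _ in range(n_samples):
        integrator += x - feedback              # subtraction + integration
        feedback = 1.0 if integrator >= 0 else -1.0  # 1-bit quantizer
        bits.append(feedback)
    return bits

# Averaging (low-pass filtering) the bitstream recovers the input level.
stream = delta_sigma(0.25)
avg = sum(stream) / len(stream)  # approximates 0.25
```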

~~~
brandmeyer
Sigma-delta ADCs are built in such a way that component errors result in
changes in bandwidth instead of changes in scale factor.

