
How to Set Up an OpAmp Circuit to Do Complex Mathematics - mindcrime
http://www.dummies.com/education/science/science-electronics/how-to-prepare-an-op-amp-circuit-to-do-complex-mathematics/
======
drfuchs
My dad designed Frequency Response Analyzers in the 60's and 70's that used
OpAmps to compute Fourier Integrals and then "solve the right triangle". That
is, relative to a sine wave it generated, it would take a return analog
signal, multiply by sine and cosine waves, integrate both, and then take the
square-root of the sum of the squares and also the arctangent of their ratio,
to finally output amplitude and phase-shift of the return signal. All as
analog voltages; all with OpAmps (and a tiny bit of TTL to turn the
integrators on and off). It simply wasn't feasible to do this digitally; it's
not just that doing a real-time FFT in the digital sphere was cost-prohibitive
at the time, but also the A-to-D converters weren't fast enough to keep up (at
the required accuracy).
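For the digitally-minded, here's a rough sketch of that same multiply-integrate-and-solve-the-triangle computation in software (Python; the function name and the test signal are invented for illustration):

```python
import math

def measure(signal, t, freq, dt):
    """Correlate a sampled return signal against sin and cos at the
    test frequency, then "solve the right triangle"."""
    # Multiply by sine and cosine, then integrate (rectangular rule)
    s_int = sum(v * math.sin(2 * math.pi * freq * ti) for v, ti in zip(signal, t)) * dt
    c_int = sum(v * math.cos(2 * math.pi * freq * ti) for v, ti in zip(signal, t)) * dt
    # Square root of the sum of the squares, and arctangent of the ratio
    amplitude = 2.0 * math.hypot(s_int, c_int) / (len(t) * dt)
    phase = math.atan2(c_int, s_int)
    return amplitude, phase

# A made-up return signal: 1 kHz, amplitude 2, shifted 30 degrees
freq, dt = 1000.0, 1e-6
t = [i * dt for i in range(1000)]        # exactly one period
sig = [2.0 * math.sin(2 * math.pi * freq * ti + math.radians(30)) for ti in t]
amp, ph = measure(sig, t, freq, dt)      # recovers ~2.0 and ~30 degrees
```

The analyzer did all of this with integrators, multipliers, and function circuits instead of a `for` loop.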

Typical users went from aerospace and automotive ("will our wing or chassis
oscillate badly when we hit a thermal or pothole"?) to audio (that's how they
would make those Frequency Response charts for speakers) to hard disk drive
head servos ("if we send the head from track X to track Y, will it get there
directly, or "ring" -- slightly overshoot and then overcorrect for a
while?").

One fun detail: eventually they had to encase their OpAmps in special little
heaters that would keep them at a constant temperature, to avoid drift as the
ambient temperature changed.

You can still occasionally buy one surplus on eBay for a few hundred dollars;
they were about $10k new (in 70's dollars).

~~~
CapacitorSet
That sounds _extremely_ interesting for people who are into control theory! Do
you happen to have schematics?

~~~
drfuchs
I think so; I'll have to look through storage. Drop me a line at {myid} at
yahoo.com.

Let me point out that the description here is as understood by a pure-digital
person: I didn't seem to inherit the analog gene, and as a teen couldn't grok
his deeper explanations ("What do you mean 'the imaginary part of the signal'?
We're in the real world; the electrons don't have any imaginary components!").

~~~
dragontamer
Imaginary numbers are pretty easy to understand, once you realize a complex
number is just a pair of real numbers, and that all "imaginary number" math
is equivalent to simple linear algebra on those pairs (a 2x2 matrix of the
form [[a, -b], [b, a]], if you want multiplication to come out right).

Roughly speaking, you need two degrees of freedom to describe an AC signal:
its amplitude and its phase. Instead of throwing two numbers around all the
time, we pack them into a single complex number.

We call the two parts of the number "real" and "imaginary", but really
they line up with the "in-phase" and "90-degrees-out-of-phase" (quadrature)
parts of the signal. We just call them "real" and "imaginary" because that's
what all the math guys call them.

A "purely imaginary" impedance in phasor world means the current is +/-
90 degrees out of phase with the voltage. (180 degrees out of phase means the
current is simply negative, flowing backwards.)

A "purely real" impedance happens at 0 degrees (positive current) or
180 degrees (negative current). "Purely real" numbers line up exactly with
all of the DC-voltage experiments from beginner-level electronics.

As it turns out, all circuits made up of resistors, capacitors, and
inductors can be analyzed using "simple" addition, subtraction,
multiplication, and division with complex numbers.
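As a concrete (made-up) example, here's a series RC circuit worked out in Python, whose built-in complex type does all the phasor arithmetic:

```python
import math

# Series RC circuit driven at 1 kHz: all the phasor math is ordinary
# complex arithmetic. Component values are invented.
f = 1000.0              # drive frequency, Hz
w = 2 * math.pi * f     # angular frequency
R = 1000.0              # ohms ("purely real")
C = 100e-9              # farads

Z_R = complex(R, 0)     # resistor: real impedance
Z_C = 1 / (1j * w * C)  # capacitor: purely imaginary, -j/(wC)
Z = Z_R + Z_C           # series impedances simply add

V = 5.0                 # 5 V drive, taken as the phase reference
I = V / Z               # current phasor via complex division
phase_deg = math.degrees(math.atan2(I.imag, I.real))
# the current leads the voltage by roughly 58 degrees here
```

The "hard" part (how the capacitor shifts phase) disappears into one complex division.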

-------------

The above is a gross simplification of what's going on. More details can be
found here:
[https://en.wikipedia.org/wiki/Phasor](https://en.wikipedia.org/wiki/Phasor)

------
seiferteric
For my control systems class several years ago I created a PID controller with
just OpAmps. Each portion had a switch to enable/disable to see the results of
P+I, P+D, P+I+D etc. The output from each portion could be amplified
(controlled by a trimpot) and then was summed (also by an OpAmp) and then went
to a pair of transistors to drive a DC motor on a Plexiglas board with a
pointer and 0-180 degrees marked out. The input was a potentiometer with a
knob to set the desired angle. Pretty fun project. Others in the class decided
to do it digitally with a micro controller but seeing it all done with analog
components was really cool!
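The same controller in software is only a few lines; this is a hypothetical discrete-time sketch (gains and the toy "motor" are invented), not the circuit described above:

```python
def make_pid(kp, ki, kd, dt, use_p=True, use_i=True, use_d=True):
    """Discrete PID with a per-term enable switch, mirroring the
    board's toggle-each-stage setup. All gains are invented."""
    state = {"integral": 0.0, "prev_err": 0.0}

    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        out = 0.0
        if use_p:
            out += kp * err
        if use_i:
            out += ki * state["integral"]
        if use_d:
            out += kd * deriv
        return out

    return step

# Drive a toy integrator "motor" toward a 90-degree setpoint
pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
angle = 0.0
for _ in range(5000):                    # 50 simulated seconds
    angle += pid(90.0, angle) * 0.01     # crude plant: rate follows drive
```

Flipping the `use_*` switches plays the same P+I / P+D / P+I+D comparison game as the physical toggles.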

~~~
jcoffland
You made a servo.

~~~
seiferteric
Yes that's true, just wanted to explain the mechanism behind it. Just a note,
I don't think most hobby type servos implement full PID control, though many
industrial ones do I am sure.

~~~
snovv_crash
Hobby servos often have controllers far more complex and accurate than PID.
Something taking into account the gearbox and motor inertia, input voltage,
etc. And not just tuned with high pass, low pass and gain filters.

~~~
seiferteric
Is that true? I know there are more optimal control schemes than PID if you
know the parameters of the load, but with a servo you could connect anything
to it -- it's not just the gearbox and motor -- so how would it handle that? I
would love to see some implementation code or a diagram of a hobby servo
controller. Also, my understanding is that PID is optimal if you don't know
the load, but maybe that's wrong; it's been a while.

My impression was that hobby servos used basically just PI control and were
heavily damped so you don't really get overshoot, but again I don't really
know.

~~~
snovv_crash
The cheap, older 'analogue' servos were just PI, but the new 'digital' servos
have much more advanced control. They actively reverse power to the motor when
getting close to the target to extract regen power from the motor inertia
while still getting a faster response. Yes, you can get this from PID, but you
can't keep it at full reverse right up to the moment it hits the target
without getting nasty oscillations as the voltage changes.

------
westbywest
Analog electronic computers actually offered significant advantages in
precision over their digital equivalents for much of the mid-20th century, due
to limitations imposed by slow clock speeds and small word sizes. Calibrating
them was non-trivial, though.
[https://en.wikipedia.org/wiki/Analog_computer#Modern_era](https://en.wikipedia.org/wiki/Analog_computer#Modern_era)

~~~
mindcrime
Interesting, especially in light of the "modern" view where the prevailing
narrative is that analog computers are (in general) less precise, but faster.
And, of course, less flexible, but I think that goes without saying.

I am curious to what extent an analog/digital hybrid can blend the best of
both worlds, especially for something where the speed <-> precision tradeoff
may somewhat favor speed. Like training neural networks.

~~~
dragontamer
It appears that digital is king for modern calculations. However, digital
lives only in the ~millivolt-to-5V range; it's difficult to get a purely
digital circuit to "sense" microvolts or nanovolts.

That's where analog comes in. You can shape and calculate with microvolt and
even nanovolt-level signals before scaling them up to millivolts so that the
CPU can handle the rest.

In general, digital RAM will hold a value more reliably, for a longer period
of time, than an analog storage device (aka a capacitor). True, DRAM is
itself implemented with capacitors, which is why it needs to be "refreshed"
as those capacitors lose their charge.

It's easy to "refresh" a capacitor if you only care about two states, high
and low. If you're analog, however, and care about microvolts / nanovolts,
it's impossible to "refresh" capacitors.

After all, what does a capacitor at 5000uV really mean? In digital, you just
slam it back to the "high" rail every few milliseconds. In analog, you don't
know whether part of the information has leaked away or not.

~~~
brandmeyer
At the boundary between analog and digital, you typically need to provide an
anti-aliasing filter.

If you are measuring a differential signal with a large high-frequency common-
mode component, then it also makes sense to perform the subtraction in analog.
Otherwise the inevitable small phase shift between the sampling of the two
channels in your ADC will end up coupling the common-mode signal into your
difference.
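A quick numerical illustration of that coupling (Python; every number here is invented): even a 1 ns skew between channels turns a 1 V, 1 MHz common-mode component into several millivolts of error, swamping a 1 mV differential signal.

```python
import math

# Two ADC channels sampling the "same" instant, but channel B lags by 1 ns.
f_cm = 1e6        # 1 MHz common-mode component
a_cm = 1.0        # 1 V common-mode amplitude
v_diff = 1e-3     # the true 1 mV differential signal we want
skew = 1e-9       # 1 ns sampling skew between channels

def ch_a(t):
    return a_cm * math.sin(2 * math.pi * f_cm * t) + v_diff / 2

def ch_b(t):
    return a_cm * math.sin(2 * math.pi * f_cm * (t - skew)) - v_diff / 2

# Error in the digitally computed difference, over two common-mode periods
worst = max(abs(ch_a(n * 1e-9) - ch_b(n * 1e-9) - v_diff) for n in range(2000))
# worst comes out near 6.3 mV: the leaked common-mode dwarfs the 1 mV signal
```

Doing the subtraction in analog before the ADC avoids the skew entirely.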

~~~
dragontamer
While what you say is true, it's amazing how good ADCs have become in the
modern era.

3 Gigasamples / second:
[http://www.ti.com/lit/ds/symlink/adc07d1520.pdf](http://www.ti.com/lit/ds/symlink/adc07d1520.pdf)

The fastest OpAmp I can find is this 18GHz GBP:
[http://www.ti.com/lit/ds/symlink/ths4303.pdf](http://www.ti.com/lit/ds/symlink/ths4303.pdf)

At 3 GHz, that OpAmp would only offer a gain of 6 (18 GHz / 3 GHz), which
isn't really enough for much accuracy.

Granted, the OpAmp is like $5 and the ADC is hundreds of dollars (and the ADC
is only 7-bit accurate)... but the digital world has gotten scary fast and
scary good.

That's why Oscilloscopes, even GHz-Oscilloscopes are being made with digital
technology today.

Hell, this crazy product raises eyebrows: [http://www.digikey.com/product-detail/en/analog-devices-inc/HMCAD5831LP9BE/1127-3475-ND/5170112](http://www.digikey.com/product-detail/en/analog-devices-inc/HMCAD5831LP9BE/1127-3475-ND/5170112)

26 gigasamples (a Nyquist bandwidth of 13 GHz). I mean, 3 bits sucks, but
holy crap is that fast. A 26 GHz ADC would be able to perform digital-filter
analysis on 2.4 GHz Bluetooth and WiFi without any aliasing whatsoever.

~~~
brandmeyer
Like anything, it's a cost tradeoff. DSP can be _much_ cheaper than active
analog for many frequency ranges. You can certainly oversample your way to a
cheaper antialiasing filter in many applications. But you can't ever
eliminate it entirely.

------
spott
An interesting point when using Op-Amps: doing linear operations
(addition/subtraction/multiplication by a scalar/integration/etc) is pretty
easy (they are "linear devices").

Doing multiplication/division/exponentiation/etc is actually a little more
difficult. Typically, they are done through log amplifiers[0], of which there
are plenty of monolithic versions.

[0] [http://www.analog.com/media/en/training-seminars/tutorials/MT-077.pdf](http://www.analog.com/media/en/training-seminars/tutorials/MT-077.pdf)
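The trick those log-amp chains exploit is just a logarithm identity; numerically it looks like this (a sketch of the math, not a model of any particular part):

```python
import math

def log_amp_multiply(a, b):
    """Multiply two positive "voltages" the way a log-amp chain does:
    take the log of each, sum them, then antilog. Real log amps ride
    on the exponential V-I law of a junction; this is just the identity."""
    return math.exp(math.log(a) + math.log(b))

def log_amp_divide(a, b):
    # Division becomes subtraction of the logs
    return math.exp(math.log(a) - math.log(b))
```

Note the inputs must be positive; real single-quadrant log amps carry the same restriction unless extra circuitry handles the signs.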

------
tingletech
Back in the '80s my grandpa worked for Northrop, flying target drones he had
designed the guidance systems for back in the '70s. He hated digital computers
-- thought there was no way digital, with all its ones and zeros, could ever
be as accurate as the continuous values in the analog computers he built with
OpAmps. (He also didn't like seat belts or the UN.)

~~~
dragontamer
Well, back then analog circuits were likely faster and more accurate. There
are lots of textbooks that showed you how to get 6.5 digits (about 22 bits)
or 7.5 digits (about 25 bits) of accuracy using ordinary parts machined to 1%
precision. These circuits would operate all the way up to 100 MHz and react
to impulses within nanoseconds... even way back in the 80s... and could be
mass-produced for only a few dozen dollars.

You know, back in the era where computers were 33 MHz (That's 30-nanoseconds
per clock!) with 16-bit adders and cost tens-of-thousands of dollars.

Today, things are different. We have 2 GHz octa-core computers in our pockets
with 4GB of RAM. Analog circuits have improved, but not nearly as much as
digital circuits.

The primary issue with analog is that we've hit the noise floor. Getting
more than 7.5 digits of accuracy on every calculation means pushing the noise
roughly 150 dB below the signal. Case in point: 7.5 digits of analog accuracy
on a 5V circuit means being accurate to about +/- 160 nanovolts.
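Sanity-checking that arithmetic (plain Python):

```python
import math

full_scale = 5.0                      # volts
digits = 7.5                          # decimal digits of accuracy
lsb = full_scale / 10 ** digits       # smallest step you must resolve
db_below = 20 * math.log10(lsb / full_scale)  # relative to full scale
# lsb is about 158 nanovolts; db_below is about -150 dB
```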

At those minuscule levels of voltage and current, simple heat, airborne
static electricity, sound, and physical movement can all disturb your
calculations. (Crystal oscillators, for example, are piezoelectric: tiny
little microphones that pick up sound and movement and convert them into
voltages! Every PCB trace is a potential antenna that may pick up a stray
radio signal. Things get hard...)

Voltages are connected to the real world, after all. At some point, the
circuit can "feel" you breathing on it (the moisture in your breath changes
the humidity, which changes the resistance of the surrounding air; extra heat
changes the resistance of various components).

~~~
revelation
Of course, those 2 GHz octa-core computers with 4 GiB of RAM have no chance
in hell of reacting to impulses within nanoseconds, or of running anything at
100 MHz. All sacrificed at the altar of throughput.

~~~
dekhn
a dedicated 2-3GHz processor can definitely receive a signal, turn it into an
interrupt, and respond to it within nanoseconds. it's the OS that prevents
that from happening.

~~~
dragontamer
> it's the OS that prevents that from happening.

Not quite. It's more about the difficulty of writing the code. A lot of OSes
aren't realtime, which can be part of the problem, but a careful
administrator can tune Windows or Linux to be "more realtime".

Consider this: most hardware is set up to DMA to RAM through the northbridge.
It never touches the CPU, the hardware just writes the data directly to RAM
(indeed: this is a good thing. When the CPU is executing from L1 cache, it
isn't touching the RAM anyway).

Then, the CPU gets an interrupt, and then it can read from RAM.

It takes roughly 60 ns alone to access DRAM on a modern processor
([https://software.intel.com/sites/products/collateral/hpc/vtune/performance_analysis_guide.pdf](https://software.intel.com/sites/products/collateral/hpc/vtune/performance_analysis_guide.pdf)),
plus whatever time it took for the DMA to happen.

NOW the CPU finally has the data in its cache and can start calculating.

------------

To fix this issue, you DMA directly onto the CPU's cache itself.

[http://www.intel.com/content/dam/www/public/us/en/documents/white-papers/data-direct-i-o-technology-overview-paper.pdf](http://www.intel.com/content/dam/www/public/us/en/documents/white-papers/data-direct-i-o-technology-overview-paper.pdf)

That takes special device drivers, a lot of optimizing, a lot of thinking
about latency.

~~~
dekhn
Obviously I'm talking about code that is only using L1, and touching only
registers.

~~~
dragontamer
I see your point, but raise you this thought:

Exactly what "received signal" exists inside of a CPU's L1 cache and register?

In practice, any I/O of a modern CPU / Microprocessor is going to run through
the Northbridge and most likely be dumped into DDR3 / DDR4 RAM. That's already
60ns of delay minimum that you've introduced into the system and you haven't
even done any math yet!

Building a digital system that responds within nano-seconds is difficult,
while a purely passive LRC filter... taught in maybe ~2nd year or ~3rd year
Electrical Engineering classes, can indeed respond within nanoseconds!

--------------

I guess what I'm saying is... building digital systems that respond within
nanoseconds is certainly possible, but it's specialized knowledge and
requires specialized CPUs (aka a Digital Signal Processor). Most programmers
never need to do that, after all (write a damn login page again).

I'm not exactly an expert on the ways of the DSP; I just know people who
moved in that direction. It's an intriguing field, and it takes lots of study
to be effective.

~~~
dekhn
I don't think we were talking about DSPs... at least when I originally
replied, I was making the original point that a high-frequency CPU doing
single-clock-cycle reads of GPIOs can respond in nanoseconds. I think you are
talking about something different...

The GPIOs are pins that are directly attached to the CPU. They are accessible
by the CPU without requiring access to cache.

Not denying that it's hard to do on a general-purpose Intel CPU with its
absurdly high clock rate. But anybody with a fast oscilloscope can tell you
that fast microcontrollers have that kind of response time too (tens of ns),
because ops take a clock cycle, and at 100 MHz a cycle is 10 ns.

~~~
revelation
They tried this on an AM335x, a modern ARM Cortex-A8 used on the BeagleBone
Black, for example. Clocked at 1 GHz and doing nothing other than toggling a
GPIO pin, it took _200 ns_. This is for a fully integrated SoC, merely due to
the various levels of interconnect within the chip.

Here is the slide for reference:

[http://imgur.com/a/sWyH5](http://imgur.com/a/sWyH5)

~~~
dekhn
That's with an OS (right? You left out the necessary context), and it's not a
fast processor.

------
mindcrime
And on a related note.
[https://archive.org/details/anacomp](https://archive.org/details/anacomp)

Lately I've become really interested in analog computing and want to explore
the possibilities of analog/digital hybrid computers. If anybody has done / is
doing anything fun in this regard, I would love to hear about it.

Also, there's a subreddit for Analog Computing if anyone is into that sort of
thing. [http://analogcomputing.reddit.com](http://analogcomputing.reddit.com)

~~~
aswanson
Hear about it, courtesy of nickpsecurity:
[https://news.ycombinator.com/item?id=10614952](https://news.ycombinator.com/item?id=10614952)

~~~
nickpsecurity
Thanks for posting it. I wanted to, but my mobile didn't have it. Those were
the best papers I found showing the amazing results you can get with analog
computing.

~~~
aswanson
Thanks for originally sharing; hoping to do something with this info and will
share.

~~~
nickpsecurity
If you want some funding or impact, an accelerator for a proven deep-learning
architecture could be useful. Prior work in neural computation got plenty of
results; I just haven't seen many independents try it for deep learning. One
wacky idea I had was using analog accelerators for complex evaluation
functions in genetic algorithms: the digital chip would just feed each
solution to a bank of them, with the fitness result popping out the other
side. Another idea was attempting any of that with Field-Programmable Analog
Arrays to see what results you get. That lowers development costs vs. making
masks for custom circuits.

~~~
aswanson
Those are awesome ideas, ones I never would have thought of! I was thinking
more along the lines of using analog filtering for the convolutional layers
of deep neural networks; dropout layers and filter templates for 2-D features
are just bandpass filters, which analog filters can approximate easily. That
analog FPGA idea is a good one also; there needs to be a way to perform rapid
experimentation on these types of ideas. Are there any products/companies you
know of in this area?

~~~
nickpsecurity
Your ideas are possibly better just because you already have a chance of
building them if they're that close to existing analog schemes. I'd try them
first to get a feel for the whole thing.

As far as reprogrammable goes, type what I told you into Google:
Field-Programmable Analog Arrays (FPAAs). Or "reconfigurable analog". You'll
get companies and CompSci results. Also remember that analog works
best/easiest on the oldest process nodes. That, combined with multi-project
wafers, can get your ASIC prototypes down to _thousands_ of dollars rather
than tens to hundreds of thousands. A middle-ground option is a structured
ASIC, which is basically an FPGA programmed with 1-2 masks at fab time. Triad
Semiconductor does that for digital+analog with a claimed $400k per design
and a few weeks' turnaround. Idk if their analog cells are suitable, though.

Also, for good measure you might enjoy at least looking at the last general-
purpose (programmable), analog computer:

[http://www.comdyna.com/gp6intro.htm](http://www.comdyna.com/gp6intro.htm)

------
bsamuels
You don't even need electronics to make a computer! Here's a 1953 training
film for a mechanical fire-control computer:

[https://www.youtube.com/watch?v=s1i-dnAH9Y4](https://www.youtube.com/watch?v=s1i-dnAH9Y4)

~~~
gozur88
This kind of stuff fascinates me. There's an amazing amount of intelligence
you can put into oddly-shaped cams.

------
castratikron
Another cool example: solving differential equations with opamps

[https://www.youtube.com/watch?v=DBteowmSN8g](https://www.youtube.com/watch?v=DBteowmSN8g)
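In software terms, the circuit in that video is two integrators in a feedback loop; here's a minimal numerical analogue (semi-implicit Euler, constants made up), solving y'' = -w^2 y:

```python
import math

# Two cascaded integrators with negated feedback solve y'' = -w^2 * y,
# the classic analog-computer demo: the circuit "is" a sine wave.
w = 2 * math.pi   # 1 Hz oscillator
dt = 1e-4
y, dy = 1.0, 0.0  # initial conditions = the charge preset on the integrator caps
for _ in range(10000):        # run for one second = one full cycle
    ddy = -w * w * y          # the inverting summer provides -w^2 * y
    dy += ddy * dt            # first integrator
    y += dy * dt              # second integrator
# after one full period, y returns to ~1.0
```

The op-amp version does the same thing continuously, with capacitors doing the integration.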

------
nanxor
Is it possible that high precision/performance analog devices might perform
some graphics tasks (e.g. ray tracing) faster than digital computers?

~~~
xyzzyz
Sure -- if you cast light at physical objects and then collect the reflected
light with a photo-sensitive component, you can perform real-time ray tracing
with arbitrarily good accuracy.

------
b1gtuna
Sounds like a fun weekend project. Any guide on how to actually put this
together to check the math?

~~~
qntty
Get a breadboard, an opamp, some resistors/capacitors, and wires. Read the
data sheet for the opamp to connect it correctly (connect V+ and V- to +/- 15
V power supplies). Feed in a sawtooth wave and look at the output with an
oscilloscope (you won't see anything interesting with a constant voltage as
input). If you need a reference for understanding the principles that make it
work, use The Art of Electronics, 2nd or 3rd ed.

~~~
b1gtuna
Thank you. I had a hunch that this would require a scope and a signal
generator. And I have neither unfortunately.

