The coolest application of an FFT I implemented was for a custom oxygen sensor for a wearable breath-measuring device. The sensor would flash a blue light at about 40 kHz, and there was a phosphorescent dye which would flash orange with a phase lag that changed based on temperature and oxygen level.
Using an FFT we could get that phase lag at 40 kHz and back out the % oxygen in exhaled breath! Typical oxygen sensors at the time were not anywhere near fast enough to do that.
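To give a flavor of how that works (this isn't our actual firmware, and the sample rate, window length, and phase values below are made up for illustration), you can FFT both the LED drive signal and the detector signal and compare the phase angle at the excitation bin:

```python
import numpy as np

fs = 1_000_000      # sample rate in Hz (made up for this sketch)
f_exc = 40_000      # excitation frequency in Hz
n = 1000            # samples per window (chosen so 40 kHz lands exactly on a bin)

t = np.arange(n) / fs
excitation = np.cos(2 * np.pi * f_exc * t)                 # reference: blue LED drive
response = 0.3 * np.cos(2 * np.pi * f_exc * t - 0.7)       # simulated phosphorescence with 0.7 rad lag

k = round(f_exc * n / fs)                                   # index of the 40 kHz bin
phase_lag = np.angle(np.fft.rfft(excitation)[k]) - np.angle(np.fft.rfft(response)[k])
print(f"estimated phase lag: {phase_lag:.3f} rad")          # prints ~0.700
```

The measured lag then maps to oxygen concentration through the dye's calibration curve.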
Was the phosphorescent dye inhaled? What was the physical process responsible for the phase lag? Would love to read more about your device if you ever published your results!
It's in a similar style to 3blue1brown (uses the same visualisation library) but the video focuses specifically on the FFT algorithm, not on Fourier transforms in general. I'm pretty sure Grant himself recommended the Reducible video at some point.
If you’re new to signal processing, just remember that the basic algorithm of spectral analysis is the DFT, and the FFT is just a clever way to compute it by skipping all the redundant operations.
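To make that concrete, here's a rough sketch in Python/NumPy (the function name is just for illustration) showing that a textbook O(N^2) DFT and the library FFT give the same answer:

```python
import numpy as np

def naive_dft(x):
    """Textbook O(N^2) DFT: correlate x against every complex exponential."""
    n = len(x)
    k = np.arange(n)
    dft_matrix = np.exp(-2j * np.pi * np.outer(k, k) / n)   # entry [k, m] = exp(-2j*pi*k*m/n)
    return dft_matrix @ x

x = np.random.randn(512)
print(np.allclose(naive_dft(x), np.fft.fft(x)))   # True: same result, the FFT just skips redundant work
```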
The DFT is not trivial at all. The standard way of teaching it goes through calculus (including complex numbers), introduces the continuous Fourier transform, and only then discretization (numerical mathematics). Having a PhD in physics, let me assure you that we still have to think twice many times when we "just do a DFT" of something. This is also obvious if you look at the manuals, for instance the many variants of an FFT in https://docs.scipy.org/doc/scipy/reference/tutorial/fft.html or https://www.gnu.org/software/gsl/doc/html/fft.html . This is not trivial.
Yes I know, I used to do research in signal processing.
Part of that complexity is because the classic radix-2 FFT only works on power-of-two input lengths (e.g. 1024), so libraries have special tricks to handle arbitrary-length data.
The DFT has no such restriction.
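For what it's worth, modern FFT libraries handle non-power-of-two (and even prime) lengths transparently via mixed-radix and Bluestein-type algorithms; a quick sanity check:

```python
import numpy as np
from scipy import fft

print(len(fft.fft(np.random.randn(1000))))   # 1000, not a power of two, no padding needed
print(len(fft.fft(np.random.randn(997))))    # 997, a prime length, also fine
```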
I’ve read somewhere that if you apply dynamic programming to the DFT (recursive subproblems, memoization) then you end up with the FFT. One of these days I’ll try to derive that.
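For anyone who wants to try: the result of that exercise is essentially the radix-2 Cooley-Tukey recursion, where each half-size sub-DFT is computed once and reused for two output bins. A rough sketch (power-of-two lengths only, nothing production-grade):

```python
import numpy as np

def fft_recursive(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return x.astype(complex)
    even = fft_recursive(x[0::2])    # DFT of the even-indexed samples
    odd = fft_recursive(x[1::2])     # DFT of the odd-indexed samples
    twiddle = np.exp(-2j * np.pi * np.arange(n // 2) / n)
    # Each half-size result is reused for two output bins -- that's where the saving comes from
    return np.concatenate([even + twiddle * odd, even - twiddle * odd])

x = np.random.randn(1024)
print(np.allclose(fft_recursive(x), np.fft.fft(x)))   # True
```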
Sure, that’s the standard way of teaching and is useful background, but you don’t need calculus or the continuous Fourier transform to come up with or understand the DFT. Once you know what a basis is for a finite-dimensional vector space, the DFT is a straightforward change of basis to a particularly useful set of basis vectors which are easy to write down.
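Concretely, the basis vectors are just powers of a root of unity, and the change of basis is one matrix multiply. A small sketch:

```python
import numpy as np

n = 8
omega = np.exp(-2j * np.pi / n)                       # primitive n-th root of unity
F = omega ** np.outer(np.arange(n), np.arange(n))     # row k is the k-th basis vector

x = np.random.randn(n)
print(np.allclose(F @ x, np.fft.fft(x)))              # the DFT is exactly this change of basis
print(np.allclose(F.conj().T @ F, n * np.eye(n)))     # the basis vectors are orthogonal, so it's invertible
```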
The manuals you cite cover several different transforms (not just the DFT) as well as the complexities of the FFT.
Or more generally, it’s the correlation of your signal with a set of sine waves of different frequencies.
It doesn’t have to be a sine wave. You could correlate your signal to a fart sound (or a set of them). It’s a valid transform but it would not be invertible—given a vector of scalar values showing how much your signal resembled each type of fart, you could not reconstruct the original signal.
However, you can do that with the FT: it’s invertible, so you can recover the signal from the spectrum. In that sense it doesn’t lose information, and the energy in the signal equals the energy in the Fourier spectrum (Parseval’s theorem).
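Both claims are easy to check numerically (using NumPy's convention of putting the 1/N on the inverse transform):

```python
import numpy as np

x = np.random.randn(256)
X = np.fft.fft(x)

print(np.allclose(np.fft.ifft(X).real, x))                        # invertible: the signal comes back
print(np.allclose(np.sum(x**2), np.sum(np.abs(X)**2) / len(x)))   # Parseval: energy matches (up to 1/N)
```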
Trivial, or just straightforward? I don’t think that it’s trivial to know why multiplying by this powers-of-roots-of-unity matrix is interesting, and does the things that it does.
The correlation of two signals a and b is just multiplying them elementwise and summing (if they are vectors, multiply each element of a by the corresponding element of b and add up the products):
Corr(a,b) = sum(a .* b)
The correlation of a signal s with a sine wave of frequency f measures how much of that frequency is present in the signal (correlate with both a sine and a cosine, or a complex exponential, and you capture the phase as well).
Repeat for a set of regularly spaced frequency values from 0 to Fs/2 (the maximum frequency you can represent with data sampled at Fs) and you have the DFT. You can even clump the sine waves into a matrix, and your DFT gets reduced to a single matrix multiply.
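Here's that "clump into a matrix" idea spelled out (a toy sketch with made-up numbers, using cosine and sine rows so everything stays real until the end):

```python
import numpy as np

fs = 100                                   # sample rate
n = 200                                    # number of samples
t = np.arange(n) / fs
s = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)   # test signal

freqs = np.arange(n // 2 + 1) * fs / n                # 0 .. Fs/2 in steps of Fs/n
cos_rows = np.cos(2 * np.pi * np.outer(freqs, t))     # one cosine per frequency
sin_rows = np.sin(2 * np.pi * np.outer(freqs, t))     # one sine per frequency

real_part = cos_rows @ s                   # correlate with every cosine in one matrix multiply
imag_part = -(sin_rows @ s)                # ...and with every sine

print(np.allclose(real_part + 1j * imag_part, np.fft.rfft(s)))    # True: this is exactly the DFT
```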
This is much easier to reason about than worrying about how an FFT works, which is just a speed optimization (one that can be derived by applying dynamic programming to the DFT).
For someone doing signal processing the results are the same. Introducing the FFT without making reference to the DFT is wrapping the subject up in an air of mystery.
Physicist’s nitpick: this is the Fourier transform (or its discrete equivalent, the DFT, not to be confused with density functional theory). The FFT (fast Fourier transform) is just an algorithm for computing DFTs. The two are used interchangeably by computer people, but they really are not the same thing.
That aside, yes, it is amazing to play with this sort of thing and produce Fourier transforms by shining a light through a physical object.
I remember a lab assignment from undergrad where we de-noised a picture by blocking some parts of its transform before transforming it back. We could remove the noise or the image itself (keeping almost only the noise) by blocking different parts of the beam. It makes the concept of frequency filtering very intuitive.
I loved this explanation of Fourier transforms, full of interactive examples, and it also makes it easy to understand how JPEG compression uses them: https://www.jezzamon.com/fourier/ AKA how you can represent an image with Fourier transforms