Generalizations of Fourier analysis (2021) (gabarro.org)
130 points by mscharrer on April 15, 2023 | 26 comments



A couple of other things that AFAIK aren’t special cases of the ones in the list:

- The idempotent (“tropical”) Fourier transform turns out to be the Legendre transform (a small numeric sketch of this appears at the end of this comment);

- The fractional Fourier transform, known to physicists as the propagator of the quantum harmonic oscillator, is a pretty fun thing to consider;

- The Fourier-Laplace transform on Abelian groups seems like a fairly straightforward extension of the idea of plugging in a complex frequency, but I haven’t seen a textbook exposition (only an old article);

- The non-linear Fourier transform (with K_i x_i + L_ij x_i x_j + ..., finite or infinite sum) seems impressively obscure (I know of a total of one book reference) but occurs in quantum field theory as the “n-loop” or “∞-loop effective action”;

- The odd (in the super sense) Fourier transform turns out to underpin stuff like the Hodge star on differential forms;

- On a finite non-Abelian group, the duality splits into two: every function on conjugacy classes is a linear combination of irreducible characters; every function on the group is a linear combination of irreducible matrix elements; this is probably also doable on Lie groups but I’m too much of a wimp to learn the theory.

(Also, generating functions should by all appearances be a fairly elementary chapter of the Fourier story, as electronic engineers with their “Z-transform” also realize, but I haven’t seen that implemented convincingly in full.)

See as well Baez’s old issue of “This Week’s Finds” where he started with sound and went all the way to spectra of Banach algebras and rings—as in Gelfand duality, algebraic geometry etc. (Can’t seem to locate the specific issue now.) Of course there are also wavelets (there’s even a Fields Medal for those now), but I don’t know that they fit into the representation theory ideology (would be excited to be wrong!).
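
As promised, a minimal numeric sketch of the tropical point (plain NumPy; the grid, the test function f(x) = x^2/2, and the sample slopes are my own toy assumptions, not from the article): replacing (+, ×) with (max, +) turns the Fourier integral into the Legendre(-Fenchel) transform f*(p) = sup_x (p·x - f(x)), and for f(x) = x^2/2 the result should be p^2/2.

    import numpy as np

    # "Tropical Fourier transform" = Legendre transform: the integral becomes a sup,
    # and multiplication by e^{-px} becomes subtraction of f(x).
    x = np.linspace(-10, 10, 20001)       # discretization grid (toy choice)
    f = x ** 2 / 2                        # f(x) = x^2/2, whose Legendre transform is p^2/2

    p = np.linspace(-3, 3, 7)
    f_star = np.array([np.max(pk * x - f) for pk in p])   # sup_x (p*x - f(x))
    print(np.max(np.abs(f_star - p ** 2 / 2)))            # ~0 up to discretization error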


> See as well Baez’s old issue of “This Week’s Finds”

Well spotted! TFA is certainly inspired by that very old Baez post, where he explained to Oz the many viewpoints of Fourier analysis. I was dismayed to see that all such viewpoints were algebraic in nature, requiring special structure in the base space, thus neglecting the fundamental case of a general manifold without symmetries. Now it seems that there are still missing generalizations!


> The non-linear Fourier transform (..) seems impressively obscure (I know of a total of one book reference)

Are those talking about the same thing?

[Information Transmission using the Nonlinear Fourier Transform, Part I: Mathematical Tools](https://arxiv.org/abs/1202.3653)

[Nonlinear Fourier Analysis](https://arxiv.org/abs/1201.5129)


Not sure if you covered this, but I was super impressed by the general discrete Fourier transform: https://en.m.wikipedia.org/wiki/Discrete_Fourier_transform_o...


what is an intuition for complex frequency?


Signals estimated by the FFT have two parameters: magnitude and phase. FFT results evade intuition because they come back as complex numbers in Cartesian (real/imaginary) form. If you convert them to polar coordinates they make more sense, as magnitude and phase.

https://www.gaussianwaves.com/2015/11/interpreting-fft-resul...

Note that complex numbers are merely convenient for working with two-dimensional quantities. The square root of -1 is just math-geek shorthand for "orthogonal", and has nothing whatsoever to do with signals.
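
A minimal sketch of that Cartesian-to-polar conversion (NumPy; the 1 kHz sample rate and the 50 Hz test signal are my own assumptions for illustration):

    import numpy as np

    # Build a known test signal, take its FFT, and read the result in polar form.
    fs = 1000                                      # sample rate in Hz (assumed)
    t = np.arange(0, 1, 1 / fs)
    x = np.cos(2 * np.pi * 50 * t + np.pi / 4)     # 50 Hz cosine with a pi/4 phase offset

    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)

    magnitude = np.abs(X)      # polar radius: how strong each frequency is
    phase = np.angle(X)        # polar angle: where each frequency starts in its cycle

    k = np.argmax(magnitude)
    print(freqs[k], magnitude[k] / (len(x) / 2), phase[k])
    # roughly: 50.0, 1.0, 0.785 (= pi/4), i.e. the amplitude and phase we put in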


(Note: GGP, not GP.) I meant complex frequency and not complex amplitude though.


From a physics perspective, complex frequency results in "evanescent waves" - i.e., waves that decay rather than oscillate (technically a fully complex frequency of the form a+ib will both oscillate and decay).


Yep: e^(iωt) (with ω real) is an oscillation, but e^(σt) (with σ real) is an exponential decay when σ < 0, an exponential growth when σ > 0, and constant when σ = 0.

so e^((σ + iω)t) = e^(σt) * e^(iωt) is just an exponential growth or decay modulated by a sinusoid.. or, if σ is zero, is just a pure oscillation

ω is the usual frequency, but σ + iω is the complex frequency. The Fourier transform deals with functions that receive ω as input, and the Laplace transform deals with functions that receive σ + iω instead.

so the Fourier transform is just a special case of the Laplace transform with σ = 0
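
A quick numeric sanity check of that last statement (NumPy; the decaying test signal x(t) = e^{-t} and the grid are my own assumptions): for a causal, decaying signal, approximating the Fourier integral ∫ x(t) e^(-iωt) dt with an FFT reproduces the analytic Laplace transform evaluated on the imaginary axis s = iω.

    import numpy as np

    # x(t) = e^{-t} for t >= 0 has Laplace transform 1/(s+1); its Fourier transform
    # should therefore be 1/(i*omega + 1).
    dt = 1e-3
    t = np.arange(0, 50, dt)
    x = np.exp(-t)

    omega = 2 * np.pi * np.fft.rfftfreq(len(t), d=dt)
    X_fourier = np.fft.rfft(x) * dt        # Riemann-sum approximation of the Fourier integral
    X_laplace = 1 / (1j * omega + 1)       # Laplace transform 1/(s+1) at s = i*omega

    print(np.max(np.abs(X_fourier - X_laplace)))   # small; only discretization error remains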


> so the Fourier transform is just a special case of the Laplace transform with σ = 0

Another useful way of looking at it: the Laplace transform is doing many extra Fourier-transforms-but-with-decay, giving you a map of which "global decay timescale" fits your data best -- since each "slice" is itself sufficient to fully describe the time series

They are all cases of integral transforms with different choices of the set of "primitive fingerprints" -- see the chirp transform, wavelet transform, chirplet transform, etc. All of them take advantage of the fact that if you choose a set of basis "brushes" that are not redundant with each other (e.g. red-green-blue is an independent set, as is magenta-green-yellow, but red-green-blue-yellow is not), then you can describe your signal in terms of a composition of those kernels.
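
Here is a sketch of the "slices" picture (NumPy; the exponentially growing test signal and the particular σ values are just illustrative assumptions): each vertical line Re(s) = σ of the Laplace plane is the Fourier transform of the signal after pre-damping it by e^(-σt).

    import numpy as np

    # A signal that grows exponentially, so its plain Fourier integral diverges,
    # but every sufficiently damped slice is perfectly well behaved.
    dt = 1e-3
    t = np.arange(0, 20, dt)
    x = np.exp(0.5 * t) * np.cos(2 * np.pi * 5 * t)

    omega = 2 * np.pi * np.fft.rfftfreq(len(t), d=dt)
    for sigma in (1.0, 2.0, 4.0):                            # candidate "global decay timescales"
        slice_ = np.fft.rfft(x * np.exp(-sigma * t)) * dt    # ~ X(sigma + i*omega), one slice
        k = np.argmax(np.abs(slice_))
        print(f"sigma={sigma}: spectral peak near {omega[k] / (2 * np.pi):.1f} Hz")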


So, I'm going with a rusty knowledge of a computer engineering course from years ago,

> a map of which "global decay timescale" fits your data best

What do you mean by this?

> -- since each "slice" is itself sufficient to fully describe the time series

But each slice is multiplied by an exponential.. which, okay, becomes a convolution that lets you recover the original function


The baseline decay describes the global dissipation or amplification over time.

While the Laplace transform is most useful in more complicated systems, this concept is actually best illustrated by the damped/amplified harmonic oscillator model, as it serves as the primitive archetype that more complex systems are composed from.

In a nutshell, the general solution is a linear combination of two exponentials that are either pure-decay or oscillatory. If the signal is to be real, their imaginary parts must cancel each other out, which means the two must be "synchronized" in time against each other, i.e. have the same oscillation frequency but opposite rotation directions (so the imaginary parts cancel), and the same decay progression. This means the average of their complex frequencies must be real, i.e. <real mean> +/- <imag diff> for underdamped and <real mean> +/- <real diff> for overdamped, so you can split out the mean as a common decay function, giving you decay(t)*( a*clkwise(t) + b*ccwise(t) )

    a,b:real
    E,F:real->complex
    y:real->real
    y = aE + bF
      = sum(a,b)sum(E,F)/2 + diff(a,b)diff(E,F)/2
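
For concreteness, a tiny numeric check of that cancellation (NumPy; the decay rate, frequency, and coefficient below are arbitrary assumptions of mine):

    import numpy as np

    # A real damped oscillation is the sum of two complex exponentials at the
    # conjugate complex frequencies s and conj(s), with conjugate coefficients.
    sigma, omega = -0.3, 2.0                # decay rate and oscillation frequency (assumed)
    s = sigma + 1j * omega
    t = np.linspace(0, 10, 1000)

    a = 0.5 - 0.2j                          # any a paired with conj(a) gives a real signal
    y = a * np.exp(s * t) + np.conj(a) * np.exp(np.conj(s) * t)
    y_real = 2 * np.exp(sigma * t) * (a.real * np.cos(omega * t) - a.imag * np.sin(omega * t))

    print(np.max(np.abs(y.imag)))           # ~0: the imaginary parts cancel
    print(np.max(np.abs(y.real - y_real)))  # ~0: what remains is the familiar damped oscillation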


As I meant it—of, not for.

I referred to the idea that by plugging an imaginary frequency into the Fourier transform [ETA: the grown-up Fourier transform with the complex exponent, not the schoolboy cosine kludge], you get the Laplace transform, and while that changes the inverse Fourier transform in a different way, it’s not hard to work out how specifically and obtain the inverse Laplace transform.

Why you’d want to do that, I actually don’t know how to explain convincingly. The post hoc rationalization is simple and more or less the reason people prefer the Laplace transform in signal processing: you still get a convolution theorem, but are now allowed to work with exponentially increasing functions, which standard Fourier theory (even the tempered distributions version) can’t accommodate. But while that’s useful from a toolbox standpoint, it isn’t satisfying as motivation, I think.

This is not the only way looking at the complex frequency plane turns out to be useful—there’s a whole thing about doing complex analysis to response functions aka propagators—but there too I can’t really say why you’d guess to look in that direction in the first place.

What I mentioned was that this idea of Laplace as imaginary Fourier extends beyond the reals to the group setting at least to some extent, so it’s not entirely an R-specific accident. Again, dunno why, I’ve explored this stuff a bit but am far from an expert.


> but are now allowed to work with exponentially increasing functions, which standard Fourier theory (even the tempered distributions version) can’t accommodate.

So you can't take the Fourier transform of an exponential? But.. it seems you can? https://proofwiki.org/wiki/Fourier_Transform_of_Exponential_...


exp(-|x|) is not an exponential, it's just the easy, bounded, integrable half of it :)
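
A quick numeric version of both halves of that exchange (NumPy; the grid and sample frequencies are my own choices, and I'm assuming the convention F(ω) = ∫ f(x) e^(-iωx) dx, under which the transform of exp(-|x|) is 2/(1+ω²); conventions vary): the two-sided decay integrates happily and matches the formula, whereas a growing exponential isn't integrable at all, so its Fourier integral simply doesn't exist.

    import numpy as np

    # exp(-|x|) decays in both directions, so the Fourier integral converges;
    # with F(w) = ∫ f(x) e^{-iwx} dx it equals 2/(1+w^2).
    dx = 1e-3
    x = np.arange(-40, 40, dx)
    f = np.exp(-np.abs(x))

    w = np.array([0.0, 0.5, 1.0, 2.0])
    F = np.array([np.sum(f * np.exp(-1j * wi * x)) * dx for wi in w])
    print(np.max(np.abs(F - 2 / (1 + w ** 2))))    # tiny: the transform exists and matches

    # exp(x), by contrast, is not integrable, so no such limit exists for it.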


I am reminded somewhat of a line in Sanjeev Arora's lecture notes A Theorist's Toolkit:

"Sanjeev admits that he used to find Fourier transforms intimidating as a student. His fear vanished once he realized that it is a rewording of the following trivial idea:

If u_1, u_2, ..., u_n is an orthonormal basis of R^n then every vector v can be expressed as Sum_i alpha_i u_i where alpha_i = <v, u_i> and Sum_i alpha_i^2 = |v|^2"

https://www.cs.princeton.edu/~arora/pubs/toolkit.pdf
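
A sketch of that "trivial idea" in code (NumPy; the size n and the random vector are arbitrary): the columns of the normalized DFT matrix form an orthonormal basis of C^n, the coefficients are inner products, and their squared magnitudes sum to |v|^2.

    import numpy as np

    # The (normalized) DFT is just expansion in an orthonormal basis.
    n = 8
    rng = np.random.default_rng(0)
    v = rng.standard_normal(n)

    # u_k[j] = exp(2*pi*i*j*k/n) / sqrt(n): orthonormal Fourier basis vectors as columns
    U = np.exp(2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n) / np.sqrt(n)

    alpha = U.conj().T @ v                                 # alpha_k = <v, u_k>
    print(np.allclose(U @ alpha, v))                       # v = sum_k alpha_k u_k
    print(np.allclose(np.sum(np.abs(alpha) ** 2), v @ v))  # sum |alpha_k|^2 = |v|^2 (Parseval)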


That's not a good enough statement to really summarize Fourier transforms, I think? It really just summarizes the idea of an orthogonal basis.


Ugh, I need to learn more math. How do I even start? I know multivariable calculus, I know the basics of linear algebra, and I know Fourier transforms. Yet this article is half gibberish to me.


imo abstract algebra is pretty much the gateway to most modern math. it will feel at first like it's a lot of machinery without purpose, but understanding groups, rings, and fields well opens the door to topology, advanced number theory, and a bunch of the rest of math. the other option would be to learn some real and complex analysis, but imo the algebra side is where a lot more of the cool stuff is.


I wish I could help you.

If you are any sort of programmer, "Linear Algebra" is going to be the best bang for your buck. You can have a look at some of the books of Gilbert Strang or his online courses. Beyond that, the classic books in "Abstract Algebra" are those by Serge Lang or Jacobson. Either one might be too difficult and maybe not worth it.

I always tell people to study integration correctly. By that I mean measure theory, with the Riemann integral taught in schools you can only go so far.

With those two you are in a good position to tackle functional analysis. The classic book on the subject is that of Walter Rudin, but maybe it is too hard for self-learners. A better alternative might be the open course at MIT:

https://ocw.mit.edu/courses/18-102-introduction-to-functiona...

I also like the book by Vladimir Kadets.

Functional analysis is terribly important in practical areas like signal processing, quantum mechanics, or partial differential equations in general, so it might be worth the pain.

There are many ways you can go from there.

I can't leave without recommending one more thing. If you are curious about advanced mathematics as a tourist, the three books by Avner Ash and Robert Gross make a great read.


> Notice that, even if their formulas look quite similar, the Fourier series is not a particular case of the Fourier transform. For example, a periodic function is never integrable over the real line unless it is identically zero. Thus, you cannot compute the Fourier transform of a periodic function.

Someone correct me if I'm wrong, but I do think the latter does generalize the former. I vaguely remember seeing it derived as essentially linking +/- infinity so the function is "periodic" on the real line. But I could be misremembering

This is the class I took, it's incredible: https://see.stanford.edu/Course/EE261/137
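
As a discrete shadow of that "linking +/- infinity" picture (NumPy; the signal and sample rate are my own toy choices): sample a periodic function over a whole number of periods and its DFT is nonzero only at the harmonics, mirroring the distributional statement that the Fourier transform of a periodic function is a train of deltas weighted by its Fourier series coefficients.

    import numpy as np

    # 4 full periods of a 1 Hz + 3 Hz signal, sampled at 100 Hz.
    fs, T = 100, 4
    t = np.arange(0, T, 1 / fs)
    x = 1.5 * np.cos(2 * np.pi * t) + 0.5 * np.sin(2 * np.pi * 3 * t)

    X = np.fft.rfft(x) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    print(freqs[np.abs(X) > 1e-9])   # [1. 3.]: all the energy sits on the harmonics ("deltas")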


Let's not forget the graph Fourier transform: https://en.wikipedia.org/wiki/Graph_Fourier_transform


Yes. It is the discrete analogue of spectral geometry on manifolds. A graph is a discrete manifold.
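
A minimal sketch of the graph Fourier transform (NumPy; the cycle graph and the test signal are my own illustrative choices): eigenvectors of the graph Laplacian play the role of Fourier modes, and a signal on the vertices is expanded in that eigenbasis.

    import numpy as np

    # Cycle graph on n vertices; its Laplacian eigenvectors are (real combinations of)
    # the classical DFT modes.
    n = 8
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
    L = np.diag(A.sum(axis=1)) - A            # combinatorial Laplacian L = D - A

    eigvals, U = np.linalg.eigh(L)            # eigenvalues ~ "frequencies", columns of U ~ modes
    signal = np.sin(2 * np.pi * np.arange(n) / n)   # a smooth signal on the vertices
    spectrum = U.T @ signal                   # graph Fourier transform
    print(np.allclose(U @ spectrum, signal))  # inverse transform recovers the signal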


I asked ChatGPT to explain the intro section of this article like I'm five:

""" Fourier analysis is like playing with music notes. We have different ways of looking at these notes, like how high or low they are, or how long they last. In the same way, we have different ways of looking at signals and how they change over time or space.

Just like how we can make different types of music by combining different notes in different ways, we can make different types of signals by combining different types of Fourier analysis.

There are four classic ways to do Fourier analysis, and each one is like a different way of looking at the signal. They're like different types of glasses we can put on to see the signal more clearly.

But there are also other ways to do Fourier analysis, like looking at signals on different types of spaces or using different types of math. It's like having even more types of glasses to look at the signal!

So when you ask a mathematician what Fourier analysis is, they might give you a different answer depending on which type of glasses they like to wear. """


Yosida is a great reference for this kind of functional analysis.

For a much broader generalization, albeit with expensive concepts, cf. Tannaka-Krein duality.


This sounds like a great direction for an advanced seminar in university.



