
Deep Learning Meets DSP: OFDM Signal Detection - k3blu3
https://blog.kickview.com/deep-learning-meets-dsp-ofdm-signal-detection/
======
oliver_quartic
I'm sure this was fun to build, but it doesn't sound particularly practical.

In particular, it's unclear why one would eschew the high degree of structure
in OFDM (structure specifically designed to aid analytical approaches to
time/frequency/channel estimation) in favor of a general-purpose learning
technique.

I suspect this results in a much lower-performing, much more expensive
algorithm, relative to the state of the art. The lack of a relative performance
comparison is telling here. (Detection "below the noise floor" sounds
impressive, but in practice that's how many/most digital radio systems work.)

It's also unclear whether this approach provides fine time/frequency offset
estimates. Those are the numbers one actually needs, not just "was there a
signal?"
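
To make the parent's point concrete: the cyclic prefix alone already yields coarse symbol timing and a fractional carrier frequency offset estimate, no learning required. A minimal NumPy sketch of CP correlation (the FFT size, CP length, and signal layout here are assumed for illustration, not taken from the post):

```python
import numpy as np

def cp_sync(rx, n_fft=64, n_cp=16):
    """Coarse symbol timing and fractional CFO from cyclic-prefix correlation.

    rx    : complex baseband samples containing one OFDM symbol
    n_fft : FFT size (assumed for illustration)
    n_cp  : cyclic-prefix length (assumed for illustration)

    Returns (timing index, CFO in units of subcarrier spacing, |eps| < 0.5).
    """
    n_sym = n_fft + n_cp
    metric = np.empty(len(rx) - n_sym, dtype=complex)
    for d in range(len(metric)):
        # The CP repeats n_fft samples later; correlate the two copies.
        metric[d] = np.vdot(rx[d:d + n_cp], rx[d + n_fft:d + n_fft + n_cp])
    d_hat = int(np.argmax(np.abs(metric)))
    # The correlation peak's phase advances by 2*pi*eps over one FFT length.
    eps_hat = np.angle(metric[d_hat]) / (2 * np.pi)
    return d_hat, eps_hat
```

This is the classic CP-based estimator; preamble-based schemes in the same spirit refine it further, which is exactly the analytical structure the blog post sidesteps.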

------
AstralStorm
Nice, but did they even try a Maximum Likelihood detector first? Combined with
optimal control to sweep the frequency spectrum in just the right pattern?
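
For what it's worth, when the preamble is known and the noise is white Gaussian, the ML detector reduces to a matched filter, which makes it a very cheap baseline to compare against. A minimal sketch (the preamble and signal layout are made up for illustration):

```python
import numpy as np

def matched_filter_detect(rx, preamble):
    """ML detection of a known preamble in white Gaussian noise reduces to a
    matched filter: cross-correlate with the preamble and pick the peak."""
    taps = np.conj(preamble[::-1])        # matched-filter impulse response
    corr = np.abs(np.convolve(rx, taps, mode="valid"))
    d_hat = int(np.argmax(corr))          # most likely preamble start
    return d_hat, corr[d_hat]
```

In practice the peak is compared against a threshold set for a target false-alarm rate; the processing gain from coherent correlation is precisely what lets such detectors operate below the per-sample noise floor.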

