Hacker News
Deep Learning Meets DSP: OFDM Signal Detection (kickview.com)
33 points by k3blu3 5 months ago | 2 comments



I'm sure this was fun to build, but it doesn't sound particularly practical.

Specifically, it's unclear why one would eschew the high degree of structure in OFDM (designed precisely to aid analytical approaches to time/frequency/channel estimation) in favor of a general-purpose learning technique.

I suspect this results in a much lower-performing, much more expensive algorithm relative to the state of the art. The lack of any comparison against that state of the art is telling here. (Detection "below the noise floor" sounds impressive, but in practice that's how many, if not most, digital radio systems work.)

It's also unclear whether this provides fine time/frequency offset estimates. Those are the numbers one actually needs, not just "was there a signal".
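
For reference, here's a minimal sketch of the classical cyclic-prefix autocorrelation approach that exploits exactly this structure. It's Python/NumPy, and the FFT size, CP length, and threshold are illustrative assumptions (not from the article); one pass gives detection, coarse symbol timing, and a fractional frequency-offset estimate:

  # Sketch only: CP-autocorrelation detector for CP-OFDM.
  # n_fft, n_cp, and threshold are illustrative assumptions.
  import numpy as np

  def cp_detect(rx, n_fft=64, n_cp=16, threshold=0.6):
      # Correlate each candidate CP region against its copy one FFT
      # length later; the normalized peak flags a symbol, its index
      # gives coarse timing, and its phase gives the fractional CFO.
      n = len(rx) - (n_fft + n_cp)
      corr = np.empty(n, dtype=complex)
      energy = np.empty(n)
      for d in range(n):
          head = rx[d : d + n_cp]                   # candidate prefix
          tail = rx[d + n_fft : d + n_fft + n_cp]   # its repeat
          corr[d] = np.vdot(head, tail)             # sum conj(head)*tail
          energy[d] = 0.5 * (np.sum(np.abs(head)**2)
                             + np.sum(np.abs(tail)**2))
      rho = np.abs(corr) / np.maximum(energy, 1e-12)  # normalized, in [0, 1]
      d_hat = int(np.argmax(rho))
      detected = rho[d_hat] > threshold
      # CFO in units of subcarrier spacing, from the correlation phase
      eps_hat = np.angle(corr[d_hat]) / (2 * np.pi)
      return detected, d_hat, eps_hat

Because the correlation averages over the whole CP, this works well below the per-sample noise floor, which is why "sub-noise-floor" detection is routine rather than remarkable.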


Nice, but did they even try building a Maximum Likelihood detector first? Combined with optimal control to sweep the frequency spectrum in just the right pattern?
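
For what it's worth, for CP-OFDM in AWGN the joint ML timing/frequency estimator has a well-known closed form (van de Beek et al., 1997): the same CP correlation as in the sketch above, minus an SNR-weighted energy penalty. A hedged sketch, assuming the SNR and frame parameters are known (all values illustrative):

  # Sketch of a van de Beek-style ML timing/CFO estimator for
  # CP-OFDM in AWGN. snr_lin, n_fft, n_cp are assumed/illustrative.
  import numpy as np

  def ml_timing_cfo(rx, n_fft=64, n_cp=16, snr_lin=10.0):
      w = snr_lin / (snr_lin + 1.0)   # weight on the energy penalty
      n = len(rx) - (n_fft + n_cp)
      llf = np.empty(n)
      gamma = np.empty(n, dtype=complex)
      for th in range(n):
          head = rx[th : th + n_cp]
          tail = rx[th + n_fft : th + n_fft + n_cp]
          gamma[th] = np.vdot(tail, head)   # sum head * conj(tail)
          phi = 0.5 * (np.sum(np.abs(head)**2) + np.sum(np.abs(tail)**2))
          llf[th] = np.abs(gamma[th]) - w * phi   # log-likelihood at th
      th_hat = int(np.argmax(llf))                      # ML symbol timing
      eps_hat = -np.angle(gamma[th_hat]) / (2 * np.pi)  # ML fractional CFO
      return th_hat, eps_hat

A learned detector would need to be benchmarked against a baseline like this before any performance claim means much.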



