
"Google Replaces BERT Self-Attention with Fourier Transform: 92% Accuracy, 7 Times Faster on GPUs" (2021) https://syncedreview.com/2021/05/14/deepmind-podracer-tpu-ba...

From https://news.ycombinator.com/item?id=40519828 :

> Because self-attention can be replaced with an FFT at a modest loss in accuracy and a reduction in kWh [1], I suspect that the Quantum Fourier Transform could also be substituted for attention in LLMs.
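
For concreteness, a minimal NumPy sketch of the FNet-style Fourier token mixing described in the linked article (the toy shapes below are illustrative, not the paper's configuration; in FNet this sublayer replaces self-attention and feeds into the usual residual and feed-forward stack):

    import numpy as np

    def fourier_mixing(x):
        # FNet-style token-mixing sublayer: a 2D DFT across the
        # sequence and hidden dimensions, keeping only the real part.
        # There are no learned weights, so it stands in for the
        # O(n^2) self-attention matrix entirely.
        # x: (seq_len, d_model) array of token embeddings
        return np.real(np.fft.fft2(x))

    # Toy input: 8 tokens with 16-dimensional embeddings.
    x = np.random.randn(8, 16)
    print(fourier_mixing(x).shape)  # (8, 16)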

From https://news.ycombinator.com/item?id=42957785 :

> How does prime emergence relate to harmonics and [Fourier] convolution, with and without superposition?

From https://news.ycombinator.com/item?id=40580049 :

> From https://news.ycombinator.com/item?id=25190770#25194040 :

>> Convolution is multiplication in Fourier space; this is the convolution theorem [1], which says that Fourier transforms convert convolutions into pointwise products.

>> 1. https://en.wikipedia.org/wiki/Convolution_theorem :

>>> In mathematics, the convolution theorem states that under suitable conditions the Fourier transform of a convolution of two functions (or signals) is the pointwise product of their Fourier transforms. More generally, convolution in one domain (e.g., time domain) equals point-wise multiplication in the other domain (e.g., frequency domain). Other versions of the convolution theorem are applicable to various Fourier-related transforms.
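
A quick numerical check of the theorem with NumPy (using circular convolution, for which the DFT identity holds exactly up to floating-point error; sizes and seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.standard_normal(64)
    b = rng.standard_normal(64)

    # Direct circular convolution:
    # (a * b)[n] = sum_m a[m] * b[(n - m) mod N]
    direct = np.array([
        sum(a[m] * b[(n - m) % 64] for m in range(64))
        for n in range(64)
    ])

    # Convolution theorem: the Fourier transform of the convolution
    # equals the pointwise product of the Fourier transforms.
    via_fft = np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

    print(np.allclose(direct, via_fft))  # True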


