No he didn't, he just proved the well-known Schoenhage–Strassen conjecture from 1971 to be correct — the conjecture behind a widely implemented algorithm.
Their algorithm ran in O(n log n log log n); the new result achieves the conjectured O(n log n).
Schoenhage–Strassen is used for multiplying large numbers with more than ~30,000 digits. Below that, Karatsuba is used, and for more than a trillion digits Fürer's algorithm could be used. What he did was improve on Fürer's method from 2007, which nobody is using yet.
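For context, here is a toy sketch of Karatsuba's trick — replacing four recursive multiplications with three. This is illustrative only; real bignum libraries such as GMP switch between schoolbook, Karatsuba, Toom–Cook, and FFT-based methods at thresholds tuned per platform.

```python
def karatsuba(x, y):
    # Base case: fall back to direct multiplication for small operands.
    if x < 10 or y < 10:
        return x * y
    # Split both operands at roughly half their bit length.
    m = max(x.bit_length(), y.bit_length()) // 2
    hi_x, lo_x = x >> m, x & ((1 << m) - 1)
    hi_y, lo_y = y >> m, y & ((1 << m) - 1)
    z2 = karatsuba(hi_x, hi_y)
    z0 = karatsuba(lo_x, lo_y)
    # Gauss's observation: the middle term costs only one extra multiply.
    z1 = karatsuba(hi_x + lo_x, hi_y + lo_y) - z2 - z0
    return (z2 << (2 * m)) + (z1 << m) + z0
```

Three recursive calls on half-size inputs give the O(n^log2(3)) ≈ O(n^1.585) running time that beats schoolbook O(n²) once operands are large enough.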
From my understanding, this won't have any implications for the way ALUs are designed, correct? I was taught that Russian Peasant multiplication[1] was the most straightforward method for implementing multiplication efficiently in hardware. I would have to imagine that performing a Fast Fourier Transform (FFT, from the paper) would add significant overhead. Can anyone more familiar with ALU design confirm or deny this?
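Russian Peasant multiplication is just shift-and-add in binary — the same idea as binary long multiplication in a simple ALU. A minimal sketch (software illustration, not a hardware description):

```python
def russian_peasant(a, b):
    # Halve b and double a; add a whenever b's low bit is set.
    result = 0
    while b > 0:
        if b & 1:        # low bit set: this power of two contributes
            result += a
        a <<= 1          # double a
        b >>= 1          # halve b
    return result
```

Each iteration handles one bit of `b`, so the loop runs O(log b) times — one add per set bit, which is why it maps so naturally onto hardware.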
Edit: I see my error. n is the number of digits, not the size of the number. By our usual definition of n, schoolbook multiplication is (log10 n)², which is much smaller.
https://en.wikipedia.org/wiki/F%C3%BCrer%27s_algorithm