O(k log n log(n/k)) complexity for the general case.
Has anyone else been turned off by the sensationalism of MIT TR articles?
Granted, this doesn't apply to every single situation. However, according to the article, in the case of video signals you have on average 7 "non-negligible" coefficients (k ~ 7). So there are certainly applications that will benefit dramatically from this.
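If you want to sanity-check that kind of sparsity yourself, here is a rough sketch in Python (my own illustration, not from the article; it assumes the claim refers to block transforms like the 8x8 DCT that typical codecs use, and the toy block and threshold are made up):

    # Count "non-negligible" coefficients in an 8x8 block after a 2-D DCT.
    import numpy as np
    from scipy.fftpack import dct

    def dct2(block):
        # separable 2-D DCT-II with orthonormal scaling, applied along rows then columns
        return dct(dct(block, norm='ortho', axis=0), norm='ortho', axis=1)

    x, y = np.meshgrid(np.arange(8), np.arange(8))
    block = 128 + 20 * np.sin(x / 3.0) + 10 * (y / 7.0)  # a smooth toy block
    coeffs = dct2(block)
    k = np.sum(np.abs(coeffs) > 1.0)                     # made-up threshold
    print("non-negligible coefficients:", k, "out of", coeffs.size)

For a smooth block you typically see a handful of large coefficients and many near-zero ones, which is the kind of sparsity the algorithm exploits.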
The big-O results of the paper are interesting and, with more work, could prove to be a starting point for improvements in very long FFTs. Why long? Because, for big-O asymptotics to kick in, you'll need large n.
But the domain of applicability seems to be a niche. The "7 non-negligible coefficients" result you refer to is for 8x8 DCTs (as in JPEG). That's n = 8. There are optimized implementations for the small-n cases that will utterly dominate a big-O-optimal algorithm like this. Compare Strassen's algorithm for matrix multiplication.
Have you looked at the code complexity of the algorithm in the paper? There are several interlocking components (hash functions, rebinning) that have their own big-O optimality claims and, in some cases, randomization. Getting it all to work together efficiently, given cache issues, even for large n (say, 10^6), would be a serious engineering challenge. People have been working on optimizing data flow for the FFT for decades now, and the state of the art is very advanced.
I agree that the result is interesting in principle. Runtime is actually sub-linear in n, the sequence length.
I guess if you use a smaller k it is useful.
However, be wary of algorithms that expect k as input, as they are asking you to classify your signal before analyzing it.
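To get a feel for what "sub-linear in n" buys you, here's a back-of-the-envelope comparison of the operation counts (constant factors ignored, so this is only indicative, and the n and k values are arbitrary):

    import math

    def fft_cost(n):
        # classic FFT: ~ n log2(n) operations
        return n * math.log2(n)

    def sfft_cost(n, k):
        # the paper's general-case bound: ~ k log2(n) log2(n/k)
        return k * math.log2(n) * math.log2(n / k)

    for n in (2**10, 2**20, 2**26):
        for k in (8, 64, 1024):
            print(f"n={n:>9} k={k:>5}  fft~{fft_cost(n):>12.0f}  sfft~{sfft_cost(n, k):>10.0f}")

For fixed k the sparse bound grows only polylogarithmically in n, but for small n (or k close to n) the plain FFT's constant factors will win.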
High-res Dirac video encoding is definitely possible in real time; you just need a very optimized encoder, which doesn't really exist. Dirac's main speed cost comes from the overlapped-block motion compensation, not the transform.
Implementation of an FFT on a chip has two components: the logic/computing elements (governed by the O(n log n) operation count) and the routing of signals between those elements. It turns out the size and speed of the FFT are mainly determined by the routing, not by the logic, and there is a tractable routing solution up to a reasonable number of points. The computational complexity becomes secondary if the complexity of the implementation is determined by the non-computational aspects.
Based on experience in 1995.
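For anyone who hasn't looked at FFT dataflow, here is a small sketch of where that routing comes from (my own illustration, for a generic radix-2 decimation-in-time FFT; not tied to any particular hardware):

    # For a radix-2 FFT of length n there are log2(n) butterfly stages, and in
    # each stage a butterfly connects indices that are "span" apart. The wiring
    # span doubles every stage, which is why signal routing between stages,
    # not the butterfly arithmetic itself, tends to dominate a hardware layout.
    def butterfly_pairs(n):
        stages = []
        span = 1
        while span < n:
            pairs = [(i, i + span)
                     for block in range(0, n, 2 * span)
                     for i in range(block, block + span)]
            stages.append((span, pairs))
            span *= 2
        return stages

    for span, pairs in butterfly_pairs(16):
        print("span", span, "first few pairs:", pairs[:4])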
- Jonathan Tilly (stem cell research)
- John A. Rogers, Ralph Nuzzo, George M. Whitesides, Etienne Menard (Semprius founders)
- Ren Ng (light field photography, Lytro founder)
- Nikhil Jaisinghani, Brian Shaad (solar-powered microgrids, Mera Gao Power founders)
- Mark Bohr (3D transistors, head of Intel's process technology)
- Piotr Indyk, Dina Katabi, Eric Price, Haitham Hassanieh (Sparse Fourier transform)
- Gordon Sanghera, Spike Willcocks, Hagan Bayley (DNA sequencing, Oxford Nanopore founders)
- Perry Chen, Yancey Strickler, Charles Adler (Kickstarter founders)
- Peter Schultz, Robert Downs, Donald Murphy (Wildcat Discovery Technologies founders)
- Mark Zuckerberg (Facebook founder)
This is the list of the people behind all ten emerging technologies, as listed on http://www.technologyreview.com/tr10/
Number of people on the list: 23
Number of women on the list: 1
A while ago, there were a few front-page posts about the role of women in IT, and about how they find themselves in their work environments. The MIT list is not restricted to computer science, but perhaps it can be extrapolated to the relevant fields here, too?
I think it stands out that women are heavily under-represented on this list, and although explanations for that fact might be manifold, I personally just find it a bit sad to look at that outcome.
Computational resources are always the bottleneck in bioinformatics, quantum chemistry, or any sort of high-data-volume analysis or simulation, and the FFT is a fundamental and commonly used transform in all of these fields.
At least for people who still use computers to, well, compute things, a faster FFT is a huge deal.
I'm not sure what you mean by "on a small window such that this isn't really applicable"; can you give an example? As long as you accept the inherent time-frequency resolution trade-offs, there's no obstacle to using FFT on a small window. It's called the short-time Fourier transform (STFT), and it's used everywhere; it's probably used more than analyzing an entire signal, since we frequently want to know how power and phase change over time in a signal, and a full-signal FT can't tell you that.
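If it helps to see it concretely, here is a minimal hand-rolled STFT sketch in Python (the window length, hop, and test chirp are made-up parameters, just to illustrate FFTs over short windows; real code would usually call a library routine instead):

    import numpy as np

    def stft(x, win_len=256, hop=128):
        window = np.hanning(win_len)
        frames = []
        for start in range(0, len(x) - win_len + 1, hop):
            frame = x[start:start + win_len] * window
            frames.append(np.fft.rfft(frame))
        # rows: time frames, columns: frequency bins
        return np.array(frames)

    # A chirp whose frequency rises over time: a full-signal FFT smears this
    # across bins, but the STFT shows the power moving upward frame by frame.
    fs = 8000
    t = np.arange(0, 1.0, 1.0 / fs)
    x = np.sin(2 * np.pi * (200 + 300 * t) * t)
    spec = stft(x)
    print(spec.shape)  # (number of frames, win_len // 2 + 1)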
FFT is very useful on a small window, but algorithms that improve the asymptotic efficiency are unlikely to be useful at that scale; with n = 100 the asymptote doesn't matter. We'll still be using the FFT forever. I'm just skeptical that the frontiers of technology will be advanced by a faster FFT, since it seems like the coolest stuff is happening elsewhere; it's increasingly becoming the quick-and-dirty counterpart to the sophisticated-but-slow methods.
This is a useless post except that I wanted to highlight that wonderful little bon mot. Well done.
It'll sure be nifty when 2029 rolls around and the patent expires, so applications can actually use the algorithm. (No, I don't have any information that an sFFT patent has been filed, but this would be standard practice at MIT.)
Tornado/fountain codes are a similar case. Pardon my bitterness, but it's worth wondering whether, by funding researchers to invent algorithms of this type and lock them away behind a patent wall for two decades, the US government is advancing the progress of technology or in fact retarding it.
Mathematical facts aren't patentable. That much has been established. Perhaps a "pure" algorithm isn't patentable; but all software patents start out with something like "a general purpose computer which...", tying it to hardware and making it patentable. Good luck implementing that algorithm without a general purpose computer.
For instance, the i4i patent is a patent purely on an algorithm, and Microsoft lost that case and had to remove the feature from its software: http://en.swpat.org/wiki/I4i_v._Microsoft . I'm not sure how you can square that with algorithms not being patentable.
It's commonly supposed that the Supreme Court could void these illegal patents - could dissolve this entire illegal industry - with a swish of their shiny polyester robes. Actually they already tried that, in Flook. Jedi mind tricks, which in the end is all the Court really has, worked about as well on Andrew Jackson.
I think this explains the disappointing result of Bilski. If you are supposed to have a magic power which in reality doesn't exist, make every excuse to avoid using it. You may still be suspected of impotence, but at least the suspicion is not confirmed.
Unfortunately there isn't really a solution to the patent problem that's compatible with the rule of law, mostly because the rule of law was so long ago abandoned for the rule of lawyers. Perhaps Andrew Jackson could round them all up and march them to Oklahoma - or at least, Marshall, Texas.
A large fraction of the native population that was rounded up and marched in that incident died. Accurate records were not kept, but according to estimates about a quarter of those evicted died as a direct result. Including some of my relatives.
Some of them died justly, others not. I'm not an Albanian and have no interest in investing emotional energy in multi-generational blood feuds. E.g., some of my relatives were killed by the Germans, who make a really fine motorcycle.
Also, life was pretty cheap before the 20th century. It's easy to appall kids these days with the death rate among chained slaves crossing the Atlantic. In fact the death rate among Europeans voluntarily crossing the Atlantic was quite similar. It also wasn't easy or safe to walk across half of North America, whether voluntarily or not. Basically, to anyone who lived in this era - regardless of race, color or creed - we're all a bunch of whining pussies.
Really the main difference between the expulsion of the Cherokees and my proposed ethnic cleansing of the Palo Alto patent lawyers is that the Cherokees didn't deserve it, whereas the patent lawyers do. Lawless force can work good as well as evil. And when we've already left law behind, what's the alternative?
Transparently offload the "patent sin" to a system's users. (With plausible deniability.)
"Because the SFT algorithm isn't intended to work with all possible streams of data, it can take certain shortcuts not otherwise available. In theory, an algorithm that can handle only sparse signals is much more limited than the FFT."
It may well be a very useful algorithm that does FFT-like operations, but the title is marketing hype and it's impossible to judge the true range of applicability of this algorithm from the article alone.