If a problem can be reduced to efficiently sampling from the summation of an exponentially large number of FFTs, then a quantum computer will destroy a classical computer.
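
Concretely (the numbers and function names below are purely illustrative): the quantum Fourier transform on n qubits is a DFT over a 2^n-entry amplitude vector carried out in poly(n) gates, and measuring afterwards draws an index y with probability |FFT(psi)[y]|^2. The naive classical equivalent has to materialise all 2^n amplitudes before it can draw a single sample:

    # Naive classical sampling from a "QFT then measure" distribution.
    # The cost is dominated by holding and transforming all 2**n amplitudes,
    # which is exactly what the quantum computer sidesteps.
    import numpy as np

    def sample_fft_distribution(psi):
        amps = np.fft.fft(psi) / np.sqrt(len(psi))  # unitary-normalised DFT
        probs = np.abs(amps) ** 2
        probs /= probs.sum()                        # guard against rounding drift
        return np.random.default_rng().choice(len(psi), p=probs)

    n = 20                                          # 2**20 amplitudes is fine; 2**60 is not
    psi = np.random.randn(2**n) + 1j * np.random.randn(2**n)
    psi /= np.linalg.norm(psi)                      # normalise to a valid quantum state
    print(sample_fft_distribution(psi))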

If a task can't be efficiently reduced to such a problem, then a QC probably won't ever help at all; the square-root time advantage from Grover's algorithm is too easily overwhelmed by simple engineering factors (rough arithmetic below).
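
Back-of-envelope for the "engineering factors" point, with every rate below an assumption chosen only to make the arithmetic concrete: classical brute force costs about N oracle evaluations, Grover costs about sqrt(N) iterations, so the crossover sits where N / r_classical = sqrt(N) / r_quantum.

    # Illustrative Grover crossover; both rates are assumptions, not benchmarks.
    r_classical = 1e15   # assumed: oracle evaluations/s across a large classical cluster
    r_quantum = 1e3      # assumed: error-corrected Grover iterations/s

    n_star = (r_classical / r_quantum) ** 2          # N where sqrt(N)/r_quantum == N/r_classical
    years = (n_star / r_classical) / (3600 * 24 * 365)
    print(f"crossover at N ~ {n_star:.0e}, about {years:.0f} years of runtime")

Under those made-up rates the crossover only arrives around N ~ 10^24, at which point either machine needs roughly three decades of runtime. That is the sense in which plain engineering factors eat a square-root advantage.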




What's stopping classical computers from doing this sampling?

If it's just sampling, you shouldn't have to deal with the exponential sum explicitly.


See https://www.scottaaronson.com/papers/optics.pdf

“We give new evidence that quantum computers—moreover, rudimentary quantum computers built entirely out of linear-optical elements—cannot be efficiently simulated by classical computers. In particular, we define a model of computation in which identical photons are generated, sent through a linear-optical network, then nonadaptively measured to count the number of photons in each mode. This model is not known or believed to be universal for quantum computation, and indeed, we discuss the prospects for realizing the model using current technology. On the other hand, we prove that the model is able to solve sampling problems and search problems that are classically intractable under plausible assumptions.”

Which is the basis for the experiment discussed here: https://scottaaronson.blog/?p=5122
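
The hardness in that model comes down to matrix permanents: each measurement outcome has probability proportional to |Perm(A)|^2 for an n x n matrix A built from the interferometer's unitary, and the permanent, unlike the determinant, has no known polynomial-time algorithm. A quick sketch of Ryser's formula (illustrative code, not from the paper) shows where the 2^n blowup lives:

    # Permanent via Ryser's inclusion-exclusion formula: the loop visits
    # every nonempty subset of columns, so the cost grows like 2**n.
    import itertools
    import numpy as np

    def permanent_ryser(A):
        n = A.shape[0]
        total = 0.0
        for r in range(1, n + 1):
            for cols in itertools.combinations(range(n), r):
                row_sums = A[:, list(cols)].sum(axis=1)
                total += (-1) ** r * np.prod(row_sums)
        return (-1) ** n * total

    A = np.random.rand(10, 10)   # instant at n=10, hopeless long before n=50
    print(permanent_ryser(A))

Ryser's formula is still essentially the best exact method known, up to polynomial factors, which is why even a modest number of photons is expected to put this sampling task out of classical reach.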



