
Machine Learning using Quantum Algorithms - vikram360
http://googleresearch.blogspot.com/2009/12/machine-learning-with-quantum.html
======
esk
Could someone please give a layman's explanation for this bombshell?

> Assume I hide a ball in a cabinet with a million drawers. How many drawers
> do you have to open to find the ball? Sometimes you may get lucky and find
> the ball in the first few drawers but at other times you have to inspect
> almost all of them. So on average it will take you 500,000 peeks to find the
> ball. _Now a quantum computer can perform such a search looking only into
> 1000 drawers._ This mind boggling feat is known as Grover’s algorithm.

If 999,000 drawers are left unopened, how can the algorithm guarantee that the
ball will be found?

~~~
mynegation
Disclaimer: I do not know much about quantum computation; I only got interested
in it very recently. I'll give you my understanding; take it with a grain of
salt.

AFAIU, Grover's algorithm is not really an algorithm for searching unstructured
data, like a phone book, but rather a quantum algorithm for inverting a
function. The way it relates to search is that you have to have a computable
function f such that f(x)=1 if 'x' is the one of the million 'boxes' you want
and f(x)=0 otherwise. Your goal is to find that one 'x' (the assumption being,
of course, that 'f' is not easily invertible).

How would a traditional computer solve the problem? By going through all x's;
once f(x)=1, you have your x.
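For concreteness, here is what that classical brute-force inversion looks like (a toy sketch; `N` and the marked index `target` are made-up values, not anything from the article):

```python
# Toy classical inversion of f: scan all x until f(x) = 1.
# N and 'target' are made-up values for illustration.
N = 1_000_000
target = 123_456          # the hidden box, unknown to the searcher

def f(x):
    """Oracle: 1 for the marked box, 0 otherwise."""
    return 1 if x == target else 0

# A classical computer must call f up to N times (~N/2 on average).
found = next(x for x in range(N) if f(x) == 1)
```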

The power of a quantum computer is that its state is a quantum superposition
of many classical states. As such, a quantum computer can evaluate 'f' on a
superposition of many 'x's at once and get a superposition of function results
(which can be measured with a certain degree of uncertainty).

The algorithm essentially iterates over superpositions, narrowing the set of
candidate x's pretty fast. The algorithm is inherently probabilistic, but the
point is that you can converge to the correct x with a high degree of
certainty in a (relatively) small number of steps.
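As a rough illustration of that convergence, one can simulate Grover's amplitude amplification classically on a small state vector (a sketch only; N = 64 and the marked index are arbitrary choices):

```python
import numpy as np

# Classical simulation of Grover iterations over N = 64 "boxes".
# N and the marked index are arbitrary choices for illustration.
N, marked = 64, 37
amp = np.full(N, 1 / np.sqrt(N))              # uniform superposition

for _ in range(int(np.pi / 4 * np.sqrt(N))):  # ~(pi/4)*sqrt(N) iterations
    amp[marked] *= -1                         # oracle: flip marked amplitude
    amp = 2 * amp.mean() - amp                # diffusion: invert about the mean

prob = amp[marked] ** 2                       # probability of measuring 'marked'
```

After roughly (pi/4)*sqrt(N) ≈ 6 iterations the marked state carries over 99% of the measurement probability; that sqrt(N) query count is exactly where the "1000 drawers out of a million" claim comes from.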

If anyone has a more qualified explanation to offer, I will be happy to stand
corrected.

~~~
greeneggs
"AFAIU, Grover's algorithm is not really an algorithm for searching
unstructured data, like a phone book, but rather a quantum algorithm for
inverting a function."

Grover's search algorithm works both for inverting a function and for database
search. But if you want to use it for database search, then your database
needs to be accessible in coherent superposition, i.e., in some sort of
"quantum RAM." For many problems, this probably does not make sense.

------
benhamner
This is an old article (from 2009). Hartmut Neven provided an update at ICML
2011 in the latter part of his keynote talk -
<http://techtalks.tv/talks/54457/>

~~~
mangala
You know, it's funny how ridiculous D-wave sounded when they insinuated they
were scaling qubits about as fast as Moore's law, but here they are in 2011
with a 512 qubit chip that clearly works. That's nuts.

It's hard to believe our technology has advanced this far so fast.

Combined with technologies like IBM Watson, at this rate in 10 years we'll
have no idea what the fuck our computers are even doing anymore.

~~~
da-bacon
Though one should be careful as the D-wave qubits aren't of the kind that
produce a universal quantum computer AND I don't think anyone knows whether
their current machines are scaling better than classical analogues.

~~~
rfurlan
You are correct, their QC is more of a hardware solver for a very specific
problem than a general-purpose computer. On scaling: I was at D-Wave a few
weeks ago, and they seem to be prepared to scale up the chip aggressively in
2012.

~~~
da-bacon
I think one has to be careful about the engineering challenge of scaling
(which I think D-Wave has heroically succeeded at so far) versus the question
of whether the adiabatic algorithms on their system (finite temperature,
imperfections, etc.) will scale. I think they can pull off the first challenge
(again, an amazing accomplishment), but the second is a question of whether
the physics and computer science of their device will scale. I think the good
news is that they will probably be able to answer this question in the next
few years (thanks to the purchase of one of their devices by Lockheed Martin),
but I do worry that the answer they get won't be the one they are hoping for.

------
Wilya
Non js-heavy version, for those who need/want it:

<http://googleresearch.blogspot.com/2009/12/machine-learning-with-quantum.html?v=0>

------
temphn
Someone should modify the post to note that this is from Dec. 2009. Not sure
what to think here. Google is endorsing, but IEEE is slamming. D-Wave actually
has a reasonable-looking dev kit on this page, but Scott Aaronson is quite
critical.

<http://spectrum.ieee.org/computing/hardware/loser-dwave-does-not-quantum-compute>

<http://www.dwavesys.com/en/dev-tutorials.html>

<http://www.scottaaronson.com/blog/?p=306>

<http://www.scottaaronson.com/blog/?p=291>

Would be nice if a real expert weighed in on this thread.

------
rfurlan
D-Wave's computer is basically a hardware solver for Ising/QUBO models. It is
not programmable in the traditional sense, and you need to find a way to
express the problem you want to solve in a way that maps well onto the
hardware.

~~~
rfurlan
To clarify: you express your problem as an Ising model, load the model into
the D-Wave One QC, and it will solve it for you. The interesting thing is that
solving Ising models is an NP-hard problem.
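To make "express your problem as a QUBO/Ising model" concrete, here is a toy brute-force QUBO solver (the coefficient matrix `Q` is made up for illustration; the point of the hardware is handling models far too large for this kind of enumeration):

```python
import itertools

# QUBO: minimize sum over (i, j) of Q[(i, j)] * x_i * x_j with x_i in {0, 1}.
# This tiny upper-triangular Q is a made-up two-variable example.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

def energy(x):
    """Energy of a binary configuration x under the QUBO coefficients Q."""
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

# Exhaustive scan over all 2^n configurations; this blows up exponentially,
# which is why large QUBO/Ising instances are NP-hard in general.
best = min(itertools.product([0, 1], repeat=2), key=energy)
```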

D-Wave's current chip has only 128 qubits and it is somewhat limited. As they
scale it up (and they are prepared to do so) you will be able to use their
computer to solve larger and larger models that could not be solved with
standard silicon in a reasonable amount of time.

------
zeratul
In my line of work I see RAM limitation rather than CPU clock speed
limitation. Will this technology also solve the big data problem?

~~~
mvzink
This isn't about RAM or CPU limitations; it's about entirely different classes
of algorithms that aren't feasible on classical computers.

------
viscanti
Quantum algorithms (run on quantum computers) are the future of machine
learning. This article is from 2009, though, and I haven't seen a whole lot of
progress in the field since. Artificial neural networks that can model every
possible value of every node at the same time would be very powerful.

~~~
arido
I don't buy it. Quantum computing is extremely important, groundbreaking even,
for complexity theory and theoretical computer science in general. Quantum
computing is to complexity theory what the real numbers are to number theory.

From a practical point of view, though, there are only two interesting
families of algorithms: Shor's quantum Fourier transform and Grover's
square-root speedup for searching an unstructured list (give or take). The
former doesn't seem very applicable to machine learning (yet?). The latter is,
as far as I am concerned, useless for machine learning: if we really want to
speed up searching some unordered structure, then I think we will rely on
sorting for a very long time to come - its advantage is much more impressive
than Grover's square-root speedup.
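For a sense of scale, the query counts for the million-item example work out roughly like this (illustrative arithmetic, matching the numbers quoted upthread):

```python
import math

# Rough lookup costs for finding one item among N = 10^6.
N = 1_000_000
linear_avg = N // 2              # unstructured scan: ~500,000 looks on average
grover = round(math.sqrt(N))     # Grover: ~sqrt(N) = 1000 oracle queries
binary = math.ceil(math.log2(N)) # binary search after sorting: ~20 probes
```

Once you can afford a one-time sort, the ~log N probes of binary search beat Grover's sqrt(N) queries handily, which is the point being made above.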

May I remind you that in the past 20 years there hasn't been much progress on
"practical quantum algorithms" (that is not to say quantum complexity
results). I think until something brand new comes out of the quantum computing
community, these quantum claims are great PR, but that's where it ends ...

