
Quantum Computers as Bayesian Networks - psakkaris
http://arxiv.org/abs/1607.07887
======
Strilanc
My opinion of this paper is not very high. It's not totally confused or
crankery or anything like that, but the author doesn't show the data structure
they propose doing anything well. Also, they seem to think that it will somehow
lead to a polynomial-time classical simulation of quantum mechanics, for no
good reason.

I'll just list three quotes that bugged me:

> _The X, H, R(k), M and SWAP gates are discussed in detail and results show
> linear scaling as the number of qubits are increased._

How did they manage to spend linear time per operation when you can't even
make two qubits interact with the given operations? Just working on the qubits
individually would be constant time per gate instead of linear time.
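To make that concrete, here's a rough sketch (my own illustration in Python, not anything from the paper): with only single-qubit gates and SWAP, a product state stays a product state, so you can track each qubit as its own pair of amplitudes and pay constant time per gate.

```python
# A minimal sketch (not from the paper): single-qubit gates on a product
# state never create entanglement, so each qubit can be tracked as its own
# pair of amplitudes. Each gate is O(1) work; 2n numbers replace 2^n.
import cmath

X = [[0, 1], [1, 0]]
H = [[1 / 2 ** 0.5, 1 / 2 ** 0.5], [1 / 2 ** 0.5, -1 / 2 ** 0.5]]

def R(k):
    # Phase gate R(k) = diag(1, exp(2*pi*i / 2^k)), the rotation used in the QFT.
    return [[1, 0], [0, cmath.exp(2j * cmath.pi / 2 ** k)]]

def apply(gate, qubit):
    a, b = qubit
    return [gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b]

n = 50                                        # 2^50 amplitudes never materialize
qubits = [[1, 0] for _ in range(n)]           # |00...0>
qubits[0] = apply(H, qubits[0])               # constant time per gate
qubits[1] = apply(X, qubits[1])
qubits[2] = apply(R(3), qubits[2])
qubits[0], qubits[1] = qubits[1], qubits[0]   # SWAP is just relabeling wires
print(qubits[0], qubits[1], qubits[2])
```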

> _The efficiency is still unresolved for these coherent control gates. The
> number of edges can still grow exponentially for coherent control gates,
> however, it remains to be seen if duplicate edges can be efficiently
> merged._

They didn't do the hard part yet, but the easy part was easy, so they expect
the hard part to be easy.

> _if BPP = BQP then it is possible that quantum mechanics can be de-
> randomized by a deterministic theory thus confirming Einstein’s conviction
> that nature does not play dice_

Confusing "can be simulated classically in polynomial time" with "is not
random". The existence of merge sort doesn't prove that quick sort isn't using
randomness.

The arguments for randomness in quantum mechanics aren't based on complexity
in the first place (e.g. we can trivially simulate Bell tests).
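For instance, here's a toy sampler (mine, using the textbook singlet probabilities and the standard CHSH angles, nothing to do with the paper's construction) that classically reproduces the Bell violation:

```python
# A toy classical sampler (textbook singlet probabilities, standard CHSH
# angles; nothing here is specific to the paper). It reproduces the
# quantum-predicted Bell violation |S| = 2*sqrt(2) > 2 on a classical machine.
import math
import random

def sample_pair(a, b):
    # For the singlet state, P(outcomes anti-correlated) = (1 + cos(a - b)) / 2.
    x = random.choice([+1, -1])
    anti = random.random() < (1 + math.cos(a - b)) / 2
    return x, -x if anti else x

def E(a, b, shots=200_000):
    return sum(x * y for x, y in (sample_pair(a, b) for _ in range(shots))) / shots

a1, a2, b1, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ~2.83, beyond the classical CHSH bound of 2
```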

~~~
psakkaris
Thank you for the criticisms; they will really help with the revision so I can
explain the work better:

1. The work is "easy" or "trivial"

I disagree. It does not yet scale the way we want for coherent control gates,
but we were able to scale the terminal QFT accurately by using graph theory on
a classical computer instead of matrix mechanics. That is an alternate
algorithm at the very least, so not very easy. For example, if someone told you
to implement the terminal QFT on a classical computer using graph theory, how
quickly do you think you could have done it? Or even better, what if I tell you
now "implement an alternate scalable terminal QFT algorithm on a classical
computer without using QuDot Nets", how easy do you view that question?
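To give a flavor of what the terminal QFT buys you, here is a generic Python sketch (not the QuDot Net construction from the paper, and assuming one standard bit-ordering convention): applied to a computational basis state, the QFT factors into a product of single-qubit phases, so the result can be written down with n numbers instead of 2^n amplitudes.

```python
# A generic sketch (not the paper's QuDot Net construction; one standard
# bit-ordering convention assumed): the QFT of a basis state |x> is a
# product state, so the "terminal" QFT needs only n phases, not 2^n amplitudes.
import cmath

def terminal_qft_phases(x, n):
    # Output qubit j (most significant first) ends up in
    # (|0> + exp(2*pi*i * x / 2^j) |1>) / sqrt(2).
    return [cmath.exp(2j * cmath.pi * x / 2 ** j) for j in range(1, n + 1)]

for j, p in enumerate(terminal_qft_phases(x=13, n=5), start=1):
    print(f"qubit {j}: (|0> + ({p:.3f})|1>)/sqrt(2)")
```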

2. Maybe I did not explain the IF BPP=BQP argument well

IF BPP=BQP, and that is a BIG IF, then it is very likely that a different
deterministic theory could reproduce the randomness in quantum mechanics. We
are not saying that quick sort isn't random; we are saying that quick sort can
be replaced by a non-random algorithm. Why? Because there is much evidence
that P=BPP, which implies that randomized algorithms can have equivalent
non-random algorithms, similar to how randomized primality testing was shown
to have a deterministic equivalent. "Quantum Computing since Democritus" does
a nice job of explaining this.
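For a concrete version of the primality analogy (just an illustration, not from the paper): the randomized Miller-Rabin test has deterministic stand-ins, for example a fixed witness set known to suffice below 3,215,031,751, and AKS in general.

```python
# An illustration of the derandomization analogy (not from the paper): a
# randomized Miller-Rabin primality test next to a deterministic variant
# using a fixed witness set known to suffice for n < 3,215,031,751.
# (AKS gives a fully general deterministic test; this is just a toy.)
import random

def _witness(a, n):
    # True if a proves n composite.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    x = pow(a, d, n)
    if x in (1, n - 1):
        return False
    for _ in range(r - 1):
        x = pow(x, 2, n)
        if x == n - 1:
            return False
    return True

def is_prime_randomized(n, rounds=20):
    if n < 4:
        return n in (2, 3)
    return not any(_witness(random.randrange(2, n - 1), n) for _ in range(rounds))

def is_prime_deterministic(n):
    if n < 4:
        return n in (2, 3)
    return not any(_witness(a, n) for a in (2, 3, 5, 7) if a < n - 1)

print(is_prime_randomized(104729), is_prime_deterministic(104729))  # True True
```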

Lastly, it could be that this graph algorithm for quantum computing will never
be able to scale coherent control gates. But at the very least we came up with
an alternate algorithm for doing quantum computing on a classical device. Not
completely worthless, right? What else are you going to do on a Saturday night?
:-)

Thanks again for the feedback, even though it was negative.

~~~
Strilanc
For 1: I didn't mean to imply anything about the work being easy or hard, by
the way. I was just contrasting against the difficulty of proving BQP=BPP.
(e.g. I might refer to Terence Tao's work on Navier-Stokes [1] as 'solving an
easier version', but certainly no connotation about the difficulty of the work
would be intended).

But I do think you should avoid making such large claims based on what was
actually done. It comes off as naive, unaware of how much work has gone into
trying to make these things work. Of course people have tried hierarchical
models!

For 2: You're still confusing "I have an efficient deterministic algorithm to
make predictions about X" with "X must be deterministic". The efficiency of
the algorithm is totally independent of whether the other system is "really"
random or not! Some random processes are downright trivial to make
predictions about, even with really truly literally random inputs. And some
are arbitrarily hard to predict, even when we replace the randomness with a
cryptographic pseudo-random number generator (but don't tell you the seed).
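A toy contrast, my own and nothing to do with the paper:

```python
# A toy contrast (not from the paper): a fair coin is genuinely random yet
# its statistics are trivial to predict; a keyed hash stream is fully
# deterministic yet its next bit is practically unpredictable without the key.
# (Toy construction for illustration, not a vetted CSPRNG.)
import hashlib
import os
import random

# Easy to predict in aggregate, even if each flip were truly random:
flips = [random.randint(0, 1) for _ in range(100_000)]
print("predicted mean 0.5, observed", sum(flips) / len(flips))

# Deterministic but opaque without the seed:
secret_seed = os.urandom(32)

def next_bit(counter):
    digest = hashlib.sha256(secret_seed + counter.to_bytes(8, "big")).digest()
    return digest[0] & 1

print([next_bit(i) for i in range(16)])  # hopeless to guess without secret_seed
```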

Consider that we _already_ have deterministic algorithms that predict quantum
mechanics. The fact that the algorithms take a long time to run has little
bearing on _the predictions they make_. Finding a more efficient algorithm for
quantum mechanics won't turn it into a deterministic process. You'll still be
able to pass Bell tests, and the algorithm had better predict that too, or it's
demonstrably wrong!
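Here's a minimal version of that point (again my own sketch, not the paper's): compute the singlet's CHSH value exactly from its state vector. There is no randomness anywhere in the calculation, yet it predicts the Bell violation.

```python
# A minimal deterministic sketch (not the paper's): compute the singlet
# state's CHSH value exactly from its state vector. No randomness anywhere
# in the calculation, yet it predicts a Bell violation.
import math

def kron(A, B):
    # Kronecker product of two 2x2 matrices -> 4x4.
    return [[A[i // 2][j // 2] * B[i % 2][j % 2] for j in range(4)] for i in range(4)]

def spin(theta):
    # Spin observable cos(theta) Z + sin(theta) X.
    return [[math.cos(theta), math.sin(theta)],
            [math.sin(theta), -math.cos(theta)]]

def E(a, b):
    # <psi| A(a) (x) B(b) |psi> for the singlet (|01> - |10>) / sqrt(2).
    psi = [0.0, 1 / math.sqrt(2), -1 / math.sqrt(2), 0.0]
    O = kron(spin(a), spin(b))
    return sum(psi[i] * O[i][j] * psi[j] for i in range(4) for j in range(4))

a1, a2, b1, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(S)  # -2*sqrt(2): a deterministic calculation, still Bell-violating
```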

1: https://terrytao.wordpress.com/2014/02/04/finite-time-blowup-for-an-averaged-three-dimensional-navier-stokes-equation/

------
tdaltonc
All kinds of architectures and algorithms can be implemented at the chip level
in silicon. But, until recently, the "just make a faster von Neumann machine"
school of chip design has always beaten the "make a task-optimized chip"
school.

Is there a similar tension in the design of Q-computers? Can we make an
educated guess about which design school will dominate?

