
IBM has simulated a quantum computer with 56 qubits on a classical computer - rbanffy
http://www.alphr.com/technology/1007446/ibm-has-just-achieved-an-impossible-step-in-quantum-computing
======
davidkuhta
I always enjoy the Eureka moments:

>“My seemingly inconsequential moment came one night while washing dishes and
using a bristle brush to clean a tall glass,” said IBM’s Edwin Pednault. “It
suddenly occurred to me that if one looks at the gates applied to a given
qubit in a grid circuit, the gates form a bristle-brush pattern where the
bristles are the entangling gates that are being applied to that qubit.” He
then took this idea and applied it to simulating qubits.

So let us all applaud Louis F. Kutik, who, after inventing the plastic bristle
brush in 1964, could not have surmised that it would one day inspire
advancements in the quantum computing domain.

~~~
matt_wulfeck
Let us also remember the importance of taking a break and “walking away” from
hard problems. Sometimes the best thing to do is to simply stop working.

~~~
jesseb
Rich Hickey's "Hammock Driven Development" is a great talk regarding just
this, for anybody who hasn't seen it.

I personally find this is one of the most effective ways of cracking a
difficult problem. It's amazing what your subconscious is capable of.

------
Strilanc
The work is about simulating wide-but-shallow quantum circuits (e.g. 49 qubits
with 27 layers of operations, or 56 qubits x 23 layers [edit: HN title much
better now]). The article entirely fails to mention the depth restriction.
(The paper is very upfront about it: [1].)

Personally, I happen to agree with Scott Aaronson's take [2]:

> _The goal, with sampling-based quantum supremacy, was always to target the
> “sweet spot,” which we estimated at around 50 qubits, where classical
> simulation is still possible, but it’s clearly orders of magnitude more
> expensive than doing the experiment itself. If you like, the goal is to get
> as far as you can up the mountain of exponentiality, conditioned on people
> still being able to see you from the base. Why? Because you can. Because
> it’s there. Because it challenges those who think quantum computing will
> never scale: explain this, punks! But there’s no point unless you can verify
> the result._

Although doing that _and_ pushing into a regime that no one can simulate until
a couple years later sure wouldn't hurt.

1: [https://arxiv.org/abs/1710.05867](https://arxiv.org/abs/1710.05867)

2:
[https://www.scottaaronson.com/blog/?p=3512](https://www.scottaaronson.com/blog/?p=3512)

~~~
taneq
> But there’s no point unless you can verify the result.

I thought there were quite a few problems (like factoring) for which it's hard
to generate a correct answer, but easy to check whether a given answer is
correct?
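For factoring specifically, the asymmetry is easy to demonstrate: multiplying claimed factors back together is cheap, while recovering them from the product is believed to be hard. A minimal sketch (the primes here are illustrative toys; real cryptographic moduli are hundreds of digits):

```python
# Checking a claimed factorization is cheap, even when finding the
# factors in the first place is believed to be hard.

def verify_factorization(n, factors):
    """Return True if the claimed nontrivial factors multiply back to n."""
    if not factors or any(f < 2 for f in factors):
        return False
    product = 1
    for f in factors:
        product *= f
    return product == n

# Multiplying two primes is easy; recovering them from n is the hard direction.
p, q = 104729, 1299709                       # two (small, toy) primes
n = p * q
print(verify_factorization(n, [p, q]))       # True
print(verify_factorization(n, [p, q + 2]))   # False
```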

~~~
jdironman
A bit off topic to your question, but this reminds me: where exactly are we,
progress-wise, on the P vs. NP problem? I haven't thought to track the
Millennium Prize Problems in a while, though I'm sure I would have heard if
something substantial had arisen. At one point I worked on a specific
variation of the problem, the dorm room housing problem [1]. That particular
scenario always stood out to me as something that should be relatively easy to
solve, and I worked on it for a few years. Anyways, I'm way off topic now.

[1] [http://www.claymath.org/millennium-problems/p-vs-np-problem](http://www.claymath.org/millennium-problems/p-vs-np-problem)

~~~
Aardwolf
Every now and then there is a paper that attempts to prove P=NP or P!=NP; I've
seen several such papers on Hacker News. The most interesting link someone
posted was this one, which lists many such papers and nicely labels each
[equal], [not equal], [undecidable] or [unprovable]:

[http://web.archive.org/web/20171006224902/https://www.win.tu...](http://web.archive.org/web/20171006224902/https://www.win.tue.nl/~gwoegi/P-versus-NP.htm)

Archive link because the original page has a certificate error :(

~~~
jdironman
I am especially intrigued by the argument made by Nicholas Argall: that the
question of whether P = NP is incomplete in and of itself. [1] This rings true
to me, on the basis that not all of these problems are the same or present the
same criteria to solve. I think understanding them is going to require
thinking at a different depth, though. It's fun to try to wrap my head around;
I'm no expert, but I have a lot of fun with it.

[http://web.archive.org/web/20170817220728/http://www.win.tue...](http://web.archive.org/web/20170817220728/http://www.win.tue.nl/~gwoegi/P-versus-NP/argall.txt)

~~~
thethirdone
The argument put forward in the link misrepresents Gödel's incompleteness
theorem.

> 6) Therefore, no consistent and complete definition of the set NP is
> possible Rationale: If we accept that the set NP can only be rigorously
> defined via a language, this conclusion follows from the premises above.

This doesn't even make sense. Definitions do not change a language's
completeness or consistency; they simply relate a new concept to previously
defined concepts.

> 7) Therefore, no consistent and complete statement of the problem of P=NP is
> possible Comment: A conclusion which is not only proven in this paper, but
> supported by the years of argument between mathematicians regarding the
> relevance of proposed answers to the problem.

This does not follow from Gödel's theorem. The fact that you need an
incomplete system to reason about something does not mean the theorem you want
to prove is unprovable.

The main problem with the argument is that it applies equally well to other
theorems that have in fact been formally proven.

------
dracodoc
This article raises many questions that it doesn't answer, and provides little
information about the issue.

This link is much better:

[https://www.newscientist.com/article/2151032-googles-quantum...](https://www.newscientist.com/article/2151032-googles-quantum-computing-plans-threatened-by-ibm-curveball/)

------
wyager
The article’s description of whatever they’re theoretically doing is absolute
fucking garbage, but it sounds like they’re probably doing some sort of
compressed representation like Young Tableaux for efficiently representing a
subset of quantum circuits. Unfortunately, it’s usually not the subset you’re
interested in.

~~~
xiphias
If they found a method to make elliptic curve cryptography less secure, I
would start to be worried, but I guess that's not the case.

------
adamnemecek
a.) I take most news coming from IBM with a grain of salt; there was so much
commotion around Watson and nothing came out of it

b.) quantum digital is boring, quantum analog, now that’s flavortown

~~~
marcosdumay
> quantum digital is boring, quantum analog, now that’s flavortown

What do you mean by that? Simulating on analog classical computers? Or quantum
computers without qubits?

~~~
adamnemecek
The latter: [https://en.m.wikipedia.org/wiki/Continuous-variable_quantum_...](https://en.m.wikipedia.org/wiki/Continuous-variable_quantum_information)

~~~
marcosdumay
Are there expected advantages from using continuous values?

I imagine error resistance would become incredibly harder. Does information
density grow fast enough to compensate it?

~~~
adamnemecek
This really deserves a long answer, but to give you a flavor: on an analog
computer you can sort in O(n)
([https://en.m.wikipedia.org/wiki/Spaghetti_sort](https://en.m.wikipedia.org/wiki/Spaghetti_sort)),
and you also get integration and differentiation as primitives.
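For anyone curious, the idea is easy to mimic classically, though the simulation necessarily loses the analog advantage. A rough sketch, modeling "lower a hand and grab the tallest rod" with `max()`:

```python
# "Spaghetti sort": cut one rod per value, stand them on a table, and
# repeatedly grab the tallest rod. On the analog machine a flat hand
# "compares" every rod simultaneously, so each grab is O(1) and the whole
# sort is O(n); this classical simulation degrades to O(n^2).

def spaghetti_sort(values):
    rods = list(values)          # cut a rod to each value's length
    result = []
    while rods:
        tallest = max(rods)      # the physical step that costs O(1) on analog
        rods.remove(tallest)     # set the tallest rod aside
        result.append(tallest)
    result.reverse()             # we collected tallest-first
    return result

print(spaghetti_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```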

------
ckastner
It was unclear to me what impossibility the article is referring to, but it
appears to be the exponential growth of the storage required for the
simulation:

> _Simulating anything more than 49 qubits on a classical computer was thought
> impossible because of the limited memory classical computers have compared
> to their quantum counterparts. Because of the difference between classical
> and quantum computing, each time you add a qubit to a simulation it
> increases the necessary memory exponentially in the classical simulation._

If true, then the jump from a simulation of 49 qubits to 56 was clearly not
impossible in the first place.

Following some links, the term "impossibility" is first introduced by the _New
Scientist_, but as far as I can tell it's not backed up by anything
substantial.

~~~
zellyn
Well, they explain it a couple of paragraphs later.

The previous record, a 45-qubit simulation, took 500 terabytes of memory. I
assume there's some sort of terrible exponential increase: each extra qubit
doubles the memory, so a naive 49-qubit simulation would need roughly 16 times
more, an impractical ~8 petabytes. But this one did 56 qubits with just 4.5
terabytes of memory.

All seems pretty clear.
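The exponential growth is easy to check with back-of-envelope arithmetic, assuming a naive state-vector simulation that stores 2^n complex amplitudes at 16 bytes each (two float64s; real simulators vary):

```python
# Naive state-vector simulation: n qubits -> 2**n complex amplitudes.

def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (45, 49, 56):
    print(f"{n} qubits: {statevector_bytes(n) / 1e12:,.0f} TB")
# 45 qubits is ~563 TB (matching the ~500 TB previous record),
# 49 qubits is ~9,000 TB, and 56 qubits is over an exabyte,
# which is what makes the 4.5 TB tensor-based figure remarkable.
```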

~~~
ckastner
That's my point: impractical as it may be at the current time, I could not
find an argument as to why it should generally be impossible.

------
donkeyd
So, what does this actually mean for quantum computing? I have hardly any
understanding of the subject, so it seems odd to me that a breakthrough in
simulation is so relevant. I also don't have a research background, so that
may be why.

~~~
_rpd
From the paper ...

> With the current rate of progress in quantum computing technologies,
> 50-qubit systems will soon become a reality. To assess, refine and advance
> the design and control of these devices, one needs a means to test and
> evaluate their fidelity.

[https://arxiv.org/abs/1710.05867](https://arxiv.org/abs/1710.05867)

It is for testing new quantum computers. We can now use classical computers to
simulate larger ( > 49-qubit ) quantum computers. Previously ...

> Such calculations were previously thought to be impossible due to
> impracticable memory requirements.

------
EA
Google cache:
[http://webcache.googleusercontent.com/search?q=cache:TF06flI...](http://webcache.googleusercontent.com/search?q=cache:TF06flIYfa8J:www.alphr.com/technology/1007446/ibm-has-just-achieved-an-impossible-step-in-quantum-computing+&cd=1&hl=en&ct=clnk&gl=us)

------
andyjohnson0
Does anyone know if this new approach, using tensors, also affects simulation
speed?

~~~
mfukar
It's not a new approach; the equivalence was published around 1999, IIRC. It
does affect speed: tensor slicing can now be fully parallelised.

------
jlebrech
Can graphics be done with a quantum computer?

I feel like AI would be its best field, i.e. computing every possible move at
once?

~~~
sp332
It doesn't really calculate all the moves. But you could probably use Grover's
search algorithm to improve your odds of finding a good move.
[http://twistedoakstudios.com/blog/Post2644_grovers-quantum-s...](http://twistedoakstudios.com/blog/Post2644_grovers-quantum-search-algorithm)
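To make the "improve your odds" point concrete, here's a tiny classical simulation of Grover-style amplitude amplification over 8 items (pure Python, no quantum hardware; applying it to game moves would additionally need an oracle that recognizes good moves, which this sketch just hardcodes as a marked index):

```python
import math

# Classical simulation of Grover's search over N = 2**n items. The
# "oracle" flips the sign of the marked item's amplitude; the diffusion
# step reflects every amplitude about the mean. After ~(pi/4)*sqrt(N)
# iterations the marked item dominates the measurement distribution.

def grover(n_qubits, marked):
    n = 2 ** n_qubits
    amps = [1 / math.sqrt(n)] * n            # uniform superposition
    iterations = round(math.pi / 4 * math.sqrt(n))
    for _ in range(iterations):
        amps[marked] = -amps[marked]         # oracle: phase flip
        mean = sum(amps) / n
        amps = [2 * mean - a for a in amps]  # diffusion: invert about mean
    return [a * a for a in amps]             # measurement probabilities

probs = grover(3, marked=5)
print(max(range(8), key=probs.__getitem__))  # most likely outcome: 5
```

With 8 items, two iterations boost the marked item's probability from 1/8 to about 0.95, versus the ~1/8 per trial a classical random guess would give.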

~~~
jlebrech
Could quantum-enabled graphics cards find better algorithms and speed up their
rendering of a game, for example?

