
Rigetti Forest 1.0 – programming environment for quantum/classical computing - reikonomusha
https://medium.com/rigetti/introducing-forest-f2c806537c6d
======
reikonomusha
Hello HN! We at Rigetti are really excited to announce and release Forest. My
colleague @dangirsh and I will be here today to answer any questions that
might come up about Forest and quantum computing!

Some potentially HN-interesting links:

pyQuil, a Python library for quantum programming:
[https://github.com/rigetticomputing/pyquil](https://github.com/rigetticomputing/pyquil)

pyQuil on RTD:
[https://pyquil.readthedocs.io/en/latest/](https://pyquil.readthedocs.io/en/latest/)

Grove, a collection of quantum algorithms in Python:
[https://github.com/rigetticomputing/grove](https://github.com/rigetticomputing/grove)

------
jvns
> We’ve gone from single-qubit devices to fully-functional 8-qubit systems
> that are now in the final stages of validation.

How hard of a problem can you solve with 8 qubits? For example, if you're
implementing shor's algorithm -- it looks (very naively) from the wikipedia
article that to efficiently factor a number of size N, you need about log(2N)
qubits. So with 8 qubits you could factor the number 32 efficiently (is that
calculation right?). Can you do things more difficult than 'factor 32' with 8
qubits?

(not intended as an attack at all, i genuinely just do not understand how to
reason about the power of quantum computing devices with X qubits)

~~~
reikonomusha
This is a good question. 8 qubits will not solve any problems that a classical
computer can't. In fact, considering just how blazing fast modern CPUs are,
you can _simulate_ an 8-qubit chip faster and with more fidelity than you can
with 8 homegrown qubits.

Shor's algorithm isn't a good candidate for near-term use of a quantum
computer. Other algorithms, such as the variational quantum eigensolver or the
quantum approximate optimization algorithm are better. These can be used to
solve chemistry and optimization problems, for example.

We actually made a small demo game you can play with 8 qubits. It may not be
able to handle the load of HN if it finds its way to the masses, but check it
out! [0]

[0] [http://demo.rigetti.com](http://demo.rigetti.com)

~~~
jvns
> you can simulate an 8-qubit chip faster and with more fidelity than you can
> with 8 homegrown qubits.

That is really useful to know! At what point does that stop being true? like,
if I had 100 qubits could I still simulate them faster on a classical computer
than with actual quantum qubits? (I imagine this is a hard question to answer
because it's a complexity theory question, and I guess quantum computers with
100 qubits don't actually exist in real life, but I'd be curious to know what
known bounds there are on the answer)

~~~
reikonomusha
Right now, if you want to simulate 40 qubits, you need a computer with at
least 16 _terabytes_ of RAM. 41 qubits needs 32 _terabytes_. You can't do this
in a single machine, but you could get these with enough computers linked
together. But big iron doesn't scale exponentially, so at some point, you run
out of atoms in the universe to use as transistors.

And 100 qubits. That needs more than 18 _quintillion_ _terabytes_ of RAM. In
other words, it's not going to happen.

With more memory, you need more time. Every quantum operation [0] needs to
touch all of that memory. And if it takes 10 ns to touch one byte of memory,
then it will take hours to even do one simple operation on 36 qubits. [1] On a
quantum computer, an operation would be on the order of _nanoseconds_.

I would say this, as a rough answer to your question: a quantum computer is
practically faster and more useful when you can no longer touch the RAM
necessary to simulate it in 1 millisecond. Going off of our 10 ns rule, 1 ms
= 10^6 ns. So if 10 ns lets us touch 1 byte, in 1 ms we can touch 10^5
bytes, or about 100 KB.

And, _drum roll_ , at 16 bytes per amplitude that's only about 12 qubits [2].

[0] There are some very special cases where this isn't true, but these special
cases also don't give you any sort of quantum advantage.

[1] There are some tricks you can play to speed things up, like parallelizing
across processors. But now you'll have to factor in communication time,
memory, and bandwidth.

[2] One factor I'm not considering is the fidelity. Superconducting qubits can
be thought of as analog devices with analog noise.
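
The arithmetic above is easy to check with a few lines of Python (a quick sketch, assuming 16 bytes per complex amplitude and the 10 ns/byte figure from the comment):

```python
BYTES_PER_AMPLITUDE = 16   # one complex double: two 8-byte floats

def state_memory_bytes(n_qubits):
    """RAM needed to hold the full state vector of n qubits."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

def sweep_time_seconds(n_qubits, ns_per_byte=10):
    """Time for one full pass over the state at 10 ns per byte."""
    return state_memory_bytes(n_qubits) * ns_per_byte * 1e-9

print(state_memory_bytes(40) / 2 ** 40)   # TiB for 40 qubits -> 16.0
print(state_memory_bytes(41) / 2 ** 40)   # TiB for 41 qubits -> 32.0
print(sweep_time_seconds(36) / 3600)      # hours per operation on 36 qubits, ~3
```

Every added qubit doubles both numbers, which is why the wall arrives so abruptly.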

------
pieteradejong
As a software engineer looking for a highly marketable and differentiated
skill set, given your projections for quantum computing roadmap, when should I
start exploring this area? (or: when should I start writing code and doing
side projects)

~~~
reikonomusha
Now. "Exponential" grows faster than most people, including myself, can
believe. When each additional unit of resource doubles your computational
capacity, it doesn't take many units.

I like to use this analogy: adding 1 GB of RAM these days isn't that big of a
deal. You can maybe open two more tabs in Chrome. :) Adding 1 giga-qubit to
your computer would make it 4.6 x 10^301029995 times better. That's
unimaginably more powerful than anything any human can think of.

We don't have quantum software engineering figured out. And it's not going to
be figured out by a few academics, although they may lay some good
foundations. It's going to be figured out by the same folks who figured out
traditional computing: people who try stuff, break stuff, and experiment.

~~~
algorias
> Now. "Exponential" is faster than most people, including myself, can
> believe. When one unit of resource doubles your computational capacity, it
> doesn't take many units.

Please don't make bullshit claims about exponential speedups. I don't know
exactly what technology you are claiming to have, but statements like this
cause me to believe less in your technology, not more.

We've been through the cycle of unfounded hype many times (with D-WAVE and
others). Scott Aaronson has an entire category on his blog filled with
depressingly many posts debunking the same bullshit over and over [0].

[0]
[http://www.scottaaronson.com/blog/?cat=17](http://www.scottaaronson.com/blog/?cat=17)

~~~
reikonomusha
The size of the state space in which the qubits live is exponential in the
number of qubits. This is because the qubits live in an n-fold tensor product
of two-dimensional Hilbert spaces. Performing an operation on even a single
qubit is a 2^n-dimensional unitary transformation on the state of the system
(one that acts as the identity on the remaining qubits).
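
To make that concrete, here's a minimal NumPy sketch (not Forest code; qubit 0 is taken as the leftmost tensor factor, which is just a convention):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

# "H on qubit 0" of a 3-qubit register is really the 8x8 unitary
# H (x) I (x) I -- it acts on the whole 2^3-dimensional state space.
U = np.kron(np.kron(H, I), I)

state = np.zeros(8)
state[0] = 1.0                                 # |000>
print(U @ state)   # amplitude 1/sqrt(2) on |000> and |100>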

Experts in the field of quantum computing, including Scott, don't dispute
this.

~~~
algorias
I know what a Hilbert space is, and I also know that this 2^n-dimensional
space cannot be accessed except through a destructive measurement operation.
An exponential state space does not imply that there is exponential computing
power to be harnessed there.

As an analogy, when you execute a randomized classical algorithm, the size of
the state space in which the bits live is also exponential (and at the end you
observe the result, and your uncertainty collapses from a probability
distribution to one of its possible outcomes). Yet you would look at me like
I'm crazy (or a fraud) if I claimed that randomized algorithms have
exponentially more computing power than deterministic ones.

The only way the quantum case differs from the classical picture above is
that amplitudes have a phase and can thus interfere (constructively or
destructively). The art of creating quantum algorithms lies entirely in
orchestrating favorable interference patterns.
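
That distinction is easy to demonstrate in a few lines of NumPy (purely illustrative): applying a Hadamard twice returns |0> with certainty, because the two paths leading to |1> carry opposite phases and cancel, while the classical doubly-stochastic analogue just stays uniform.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Quantum: H twice on |0>. The two amplitudes leading to |1> have
# opposite signs and cancel -- destructive interference.
amps = H @ (H @ np.array([1.0, 0.0]))
print(np.abs(amps) ** 2)   # ~ [1, 0]: always measures 0

# Classical analogue: a stochastic "coin flip" matrix applied twice.
# Probabilities can't be negative, so nothing ever cancels.
C = np.array([[0.5, 0.5], [0.5, 0.5]])
print(C @ (C @ np.array([1.0, 0.0])))   # ~ [0.5, 0.5]: stays uniform
```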

~~~
dangirsh
It seems like we should be more careful when saying "exponential" increase in
computational performance. For many quantum algorithms, the speedup is
actually _superpolynomial_ [1] [2]. In some sense, this is due to the fact
that the _state space_ grows exponentially but, as you correctly pointed out,
it can only be accessed in a destructive manner. For many algorithms (e.g.
Shor's), the net result is a superpolynomial improvement in the resources
required for solving a practically important problem (factoring).

Unfortunately, the nuance of superpolynomial vs exponential is lost in many
high-level discussions about quantum computing. Maybe we should just say
"much, much faster" ;) To make matters worse, quantum computing textbooks
often present Simon's Problem [3] as a showcase for truly exponential speedup.
It turns out this is misleading, as I've never heard of a practically relevant
algorithm with truly exponential speedup.

[1]: [http://math.nist.gov/quantum/zoo/](http://math.nist.gov/quantum/zoo/)

[2]:
[https://en.wikipedia.org/wiki/Time_complexity#Superpolynomia...](https://en.wikipedia.org/wiki/Time_complexity#Superpolynomial_time)

[3]:
[https://en.wikipedia.org/wiki/Simon%27s_problem](https://en.wikipedia.org/wiki/Simon%27s_problem)

~~~
algorias
> In some sense, this is due to the fact that the state space grows
> exponentially

What I take issue with is precisely the conflation of the size of the state
space with the quantum speedup. Shor's algorithm is fast because QFT (quantum
fourier transform) creates an interference pattern that can reveal the period
of certain functions, and QFT can be implemented efficiently because of its
specific structure. As I said before, the size of a classical state space of a
probability distribution is also exponentially large, so no, the root cause is
emphatically not the size of the state space, but the way in which that space
can be manipulated and the fact that amplitudes add up in a way that's not
linear (when looking at the resulting probabilities).
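
The period-revealing structure described above can be previewed with a classical FFT (an illustrative NumPy sketch only; Shor's algorithm gets this interference from an efficiently implementable QFT over amplitudes, not from sampling f at exponentially many points):

```python
import numpy as np

a, N = 2, 15   # 2^x mod 15 has period r = 4: 1, 2, 4, 8, 1, ...
M = 256        # number of samples (a power of two divisible by r)

f = np.array([pow(a, x, N) for x in range(M)], dtype=float)
spectrum = np.abs(np.fft.fft(f))

# All the non-DC weight concentrates at multiples of M/r = 64,
# which is exactly how the period r = 4 is revealed.
peaks = np.argsort(spectrum[1:])[::-1][:3] + 1
print(sorted(int(k) for k in peaks))   # -> [64, 128, 192]
```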

Note that Grover's algorithm achieves only a quadratic speedup with the same
size of state space as Shor's. Your explanation doesn't add up, it just adds
to the confusion.

I just think that it's very important to stay far away from the (wrong, but
pervasive in pop science) idea that quantum computers are fast because they
"try exponentially many solutions in parallel". Excessively highlighting the
size of the state space is already a step too far in that direction for my
taste.

My words are a bit harsh, but I do appreciate the fact that you are engaging
honestly, and please don't take my skepticism personally. I would like to hear
what your technology brings to the table, how it differs from competing
approaches, etc.

~~~
dangirsh
> As I said before, the size of a classical state space of a probability
> distribution is also exponentially large...

This is true, but a single state in a classical probability distribution is
_not_ exponentially large. Because of superposition, a single quantum state
can be associated with an exponentially large number of amplitudes. As you
mentioned, quantum algorithms rely on the interference of these amplitudes.
However, if you could somehow assign a complex amplitude to each state in a
classical probability distribution, you would still be limited to manipulating
only one amplitude at a time. It is in this sense that the exponential scaling
is important.

> Note that Grover's algorithm achieves only a quadratic speedup with the same
> size of state space as Shor's. Your explanation doesn't add up, it just adds
> to the confusion.

I didn't mean to imply that _all_ quantum algorithms have superpolynomial
speedups. But (especially) for the ones that do, I was talking about the
exponentially large set of amplitudes being manipulated in parallel.

> I just think that it's very important to stay far away from the (wrong, but
> pervasive in pop science) idea that quantum computers are fast because they
> "try exponentially many solutions in parallel".

100% agreed.

~~~
dangirsh
Ah, looks like I botched parts of this:

> However, if you could somehow assign a complex amplitude to each state in a
> classical probability distribution, you would still be limited to
> manipulating only one amplitude at a time.

This is probably just more confusing. What I should have said is that
classical probabilities have no physical manifestation that you can directly
manipulate - they just denote our lack of information about a system.
Amplitudes in quantum systems can be related to probabilities, but they
_don't represent lack of information_. The probabilistic nature of quantum
systems is deeper than that: measurements project superposition states onto
classical states in a probabilistic way.

For exponentially large superposition states, there are an exponential number
of amplitudes. When we act on the state in certain ways, we update _all_ of
the amplitudes in parallel. There is no counterpart to this when acting on
classical states, even when you have incomplete information about the state
(and thus an exponentially large probability distribution).

> But (especially) for the ones that do, I was talking about the
> exponentially large set of amplitudes being manipulated in parallel.

Let's try again: for the algorithms that do achieve superpolynomial speedups,
the point is that exponentially many amplitudes are updated in parallel and
steered into favorable interference.

------
teafaerie
Also what percentage of the engineers at Rigetti actually think that you are
in any sense "leveraging the multiverse" with this sort of technology, and how
many of you prefer an alternative explanation?

I understand if you don't want to make a public statement about such divisive
(and perhaps, more importantly, ill-defined) matters in this context. I'm just
curious about the way that the people who are actually building these things
tend to view them...

~~~
dangirsh
I think this is the best question so far :)

After an informal poll of ~30 of our scientists/engineers, ~8 said they
subscribe to the many-worlds interpretation [1] [2] of quantum mechanics. To
be honest, this is a discussion that comes up surprisingly infrequently at the
office!

This result reminds me of Sean Carroll's "Most Embarrassing Graph in Modern
Physics" [3]. When top theoretical physicists are polled, it seems like no
interpretation of quantum mechanics even takes the majority! In my view, the
lack of consensus around this (after ~100 years) underscores the strangeness
of the theory.

I should mention that David Deutsch (one of the pioneers of quantum
computing), Stephen Hawking, Max Tegmark, Sean Carroll, John Preskill, and
many other prominent physicists prefer the many-worlds interpretation
(citation needed).

If you like the "leveraging the multiverse" view of quantum computing, be sure
to watch David Deutsch's video lectures [4]! In his view, multiple universes
are _the_ way to explain the power of a quantum computer.

[1]: [https://en.wikipedia.org/wiki/Many-worlds_interpretation](https://en.wikipedia.org/wiki/Many-worlds_interpretation)

[2]: [https://plato.stanford.edu/entries/qm-everett/](https://plato.stanford.edu/entries/qm-everett/)

[3]: [http://www.preposterousuniverse.com/blog/2013/01/17/the-most...](http://www.preposterousuniverse.com/blog/2013/01/17/the-most-embarrassing-graph-in-modern-physics/)

[4]: [https://www.youtube.com/watch?v=24YxS9lo9so](https://www.youtube.com/watch?v=24YxS9lo9so)

------
drdre2001
Your GitHub account [0] gives Common Lisp some love. Why did you choose to use
this language? Specifically, for the implementation of your Quantum Virtual
Machine?

[0][https://github.com/rigetticomputing](https://github.com/rigetticomputing)

~~~
reikonomusha
That's my doing.

When we started thinking about quantum programming languages, we didn't know
what they should look like. Experimenting with different languages efficiently
requires a language that makes language-building easy. I find that there are
two classes of languages that provide that: Lisp-likes with metaprogramming
and ML-likes with a good type system including algebraic data types.

Quil [0], our quantum instruction language, came out of language
experimentation in Lisp. In fact, the Quil code:

    
    
        H 0
        CNOT 0 1
        MEASURE 0 [0]
    

used to look like this:

    
    
        ((H 0)
         (CNOT 0 1)
         (MEASURE 0 (ADDRESS 0)))
    

This was a no-hassle way to play around without thinking about how to wrangle
lex and yacc with shift/reduce and reduce/reduce conflicts. If you've ever
used them, you know it takes a while to get what you want and to integrate it
into the product you're trying to create.
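
For readers following along, the three-instruction Quil program above can be checked with a tiny NumPy simulation (a toy sketch, not the QVM; qubit 0 is taken as the most significant bit): it prepares a Bell state and then measures qubit 0.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = np.zeros(4)
state[0] = 1.0                                # |00>

# H 0, then CNOT 0 1 (control on the most significant bit)
state = CNOT @ (np.kron(H, np.eye(2)) @ state)
print(state)                                  # 1/sqrt(2) on |00> and |11>

# MEASURE 0 [0]: qubit 0 reads 0 or 1 with probability 1/2 each
p1 = np.sum(np.abs(state[2:]) ** 2)
print(p1)                                     # ~ 0.5
```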

In addition to the need to construct languages, we also needed to
simulate/interpret the language. Simulating the evolution of a quantum state
is very expensive (which is why we are building quantum computers!), and is
usually relegated to some closer-to-metal language like C.

Unfortunately, in C, high-speed numerical code is very difficult to experiment
with, extend, and maintain.

Fortunately, over the past 30 years, Lisp compilers have become excellent at
producing native machine code. With a compiler like SBCL [1], you can produce
code which is very nearly optimal. For example, we have a function to compute
the probability from a wavefunction amplitude. Here it is:

    
    
        > (disassemble #'probability)
        ; disassembly for PROBABILITY
        ; Size: 90 bytes. Origin: #x22B67F5A
        ; 5A:       F20F104901       MOVSD XMM1, [RCX+1]
        ; 5F:       F20F105109       MOVSD XMM2, [RCX+9]
        ; 64:       F20F59C9         MULSD XMM1, XMM1
        ; 68:       F20F59D2         MULSD XMM2, XMM2
        ; 6C:       F20F58D1         ADDSD XMM2, XMM1
        ; ...
    

There's some additional code that follows, but it disappears because this
function is also inlined at the call sites.

Writing the QVM and associated compiler in Lisp has allowed us to move fast.
The code is compact, extremely robust, and extremely fast.

I could say a lot more about this, but that's the gist of why we thought it
was a sensible decision.

[0] [https://arxiv.org/abs/1608.03355](https://arxiv.org/abs/1608.03355)

[1] [http://www.sbcl.org/](http://www.sbcl.org/)

------
teafaerie
So if I understand it right this process requires some pretty specialised and
delicate equipment... for instance you have to be able to get Helium3 down to
something like as cold (compared to us) as the sun is hot. For what I'm asking
it doesn't matter if that is technically true... point is you need some pretty
fancy technology.

Now it's true that classical computers used to take up whole buildings. Living
people remember this. And progress is supposed to be getting faster and
faster. But given the particularly arcane constraints... how long if ever
before this kind of technology can be a part of the daily lives of most Teran
Citizens? Will it ever be possible for us to have it at home? Or will we
always have to send out requests to more centralised machines that will then
send us back answers?

~~~
reikonomusha
I don't know whether we will have the technology in the future to eliminate
the need for such environments. Different kinds of qubits are an active area
of research: trapped ions, quantum dots, diamond vacancies, etc.

The bigger point to realize, I think, is that most people's computing doesn't
even happen at their home anymore. Much of it happens on some blade in a
server rack. I think, for the time being, quantum computing will be just like
that.

Looking back at the history of computing, it was pretty inconceivable that
vacuum-tube machines would be miniaturized to fit in little boxes in bedrooms.
You really needed the invention of the transistor and the integrated circuit
to let that happen. That hasn't happened yet with qubits, and I don't think
one can predict it with any certainty.

~~~
teafaerie
Thanks. That helps me to think about it in a more realistic and pragmatic way.
Good analogy!

------
jcccc
How is Forest different from IBM's quantum experience?

~~~
dangirsh
Great question. Both the IBM Q experience and Rigetti Forest allow users to
write quantum algorithms with Python that can execute on real quantum
hardware. Forest is different in 3 main ways:

1. Forest was designed with near-term applications in mind. Specifically, it
uses our quantum instruction set (Quil) [1], which was designed for
implementing classical/quantum hybrid algorithms [2]. These algorithms can
leverage near-term quantum devices significantly more than "textbook" quantum
algorithms (like Shor's). IBM Q places much less emphasis on hybrid
computation.

2. Forest provides raw access to the quantum hardware. There's an API [3] for
users to run "analog" experiments to understand the performance and noise
characteristics of our qubits. If you're developing near-term applications for
quantum computers, having access to this physical layer of quantum devices is
crucial. IBM Q doesn't provide a similar API to my knowledge.

3. Programs written with Forest can execute on up to 30 virtual qubits on the
Rigetti QVM [4]. This allows users to develop quantum algorithms ahead of any
physical device that can run them. Especially if you include noise modeling
(we do), 30 qubits is well beyond what you could simulate on your laptop.
IBM Q offers a 20-qubit simulator, whose state space is 2^10 (roughly 1000)
times smaller.

I must mention that IBM recently announced their experience will have up to 17
real qubits! This is larger than any physical device Forest is currently
connected to, and represents exciting progress.

[1] [https://medium.com/@rigetticomputing/introducing-quil-a-prac...](https://medium.com/@rigetticomputing/introducing-quil-a-practical-quantum-instruction-set-architecture-a684f0590a0c)

[2] [https://arxiv.org/abs/1509.04279](https://arxiv.org/abs/1509.04279)

[3]
[http://pyquil.readthedocs.io/en/latest/qpu.html](http://pyquil.readthedocs.io/en/latest/qpu.html)

[4]
[http://pyquil.readthedocs.io/en/latest/qvm_overview.html](http://pyquil.readthedocs.io/en/latest/qvm_overview.html)

~~~
greeneggs
Can you measure and reinitialize qubits? Testing fault-tolerant error
correction with eight qubits would be very exciting.

Is there a chance that you can roughly summarize noise levels, to give an idea
of what to expect? Something along the lines of Table 2 (page 6) in
arXiv:1705.02771 would be helpful.

A. Bermudez et al. "Assessing the progress of trapped-ion processors towards
fault-tolerant quantum computation", arXiv:1705.02771
[https://arxiv.org/abs/1705.02771](https://arxiv.org/abs/1705.02771)

~~~
reikonomusha
The computing model is given by Quil [0]. Section III-F talks about exactly
this idea of "measurement-for-effect". You can use measurement in Quil as a
way to project into a state that you want. (You can even use a conditional
instruction to get feedback and flip it with an X if it measures into an
undesired state.)
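
That reset-by-measurement loop can be sketched in plain Python/NumPy (a toy single-qubit model, illustrating the idea rather than Quil itself; the function name here is made up):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 1], [1, 0]], dtype=float)

def measure_and_reset(state):
    """Projectively measure a normalized single-qubit state, then flip it
    back to |0> with an X gate if the outcome was 1 (measurement-for-effect)."""
    p1 = abs(state[1]) ** 2
    outcome = rng.random() < p1
    # collapse onto the measured basis state
    state = np.array([0.0, 1.0]) if outcome else np.array([1.0, 0.0])
    if outcome:
        state = X @ state        # conditional correction
    return outcome, state

# whichever outcome we get, we always end in |0>
plus = np.array([1.0, 1.0]) / np.sqrt(2)
outcome, state = measure_and_reset(plus)
print(state)                     # -> [1. 0.]
```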

[0] [https://arxiv.org/abs/1608.03355](https://arxiv.org/abs/1608.03355)

------
pierre_d528
I came across this video while trying to understand what quantum computers are
about... After all, Feynman himself said: "If you think you understand quantum
mechanics then you don't understand quantum mechanics." and "If you cannot
build it, you do not understand it."

I wonder if anyone could link to something that makes the stuff clear.

[https://www.youtube.com/watch?v=dKAF9OCQtIo](https://www.youtube.com/watch?v=dKAF9OCQtIo)

"QBism is NOT NEW but at least people are reviving what Bohr thought. QM just
involves expectations of observables and the Born rule is just "metaphysical
fluff." The confusions are all about false counterfactuals."

~~~
wzeng
If you're looking for a short, and practical introduction using Python, then
one is included as part of the documentation for pyQuil (part of the Forest
toolkit):

[http://pyquil.readthedocs.io/en/latest/intro_to_qc.html](http://pyquil.readthedocs.io/en/latest/intro_to_qc.html)

------
maimaiml
Hi there! I see CL repositories on GitHub too? Do you guys also actively use
Lisp at Rigetti?

~~~
reikonomusha
Answered here:
[https://news.ycombinator.com/item?id=14598948](https://news.ycombinator.com/item?id=14598948)

------
juliangoldsmith
What is the state of languages for programming quantum computers?

I noticed you seem to be using assembly at the moment. Are there higher-level
languages out there, or are those still a ways off?

~~~
dangirsh
We have a Python library called pyQuil [1] for writing high-level programs for
quantum computers. The "assembly" you mentioned is Quil [2], which we don't
expect most users to use directly.

Other high-level languages for quantum computing include LIQUi|> [3] and
Quipper [4]. Each approach has its strengths, but we believe Quil/pyQuil is
the best choice for near-term applications. See the Quil paper [5] for more
details.

[1]:
[https://pyquil.readthedocs.io/en/latest/](https://pyquil.readthedocs.io/en/latest/)

[2]: [https://medium.com/@rigetticomputing/introducing-quil-a-prac...](https://medium.com/@rigetticomputing/introducing-quil-a-practical-quantum-instruction-set-architecture-a684f0590a0c)

[3]: [https://www.microsoft.com/en-us/research/project/language-in...](https://www.microsoft.com/en-us/research/project/language-integrated-quantum-operations-liqui/)

[4]:
[http://www.mathstat.dal.ca/~selinger/quipper/](http://www.mathstat.dal.ca/~selinger/quipper/)

[5]:
[https://arxiv.org/pdf/1608.03355.pdf](https://arxiv.org/pdf/1608.03355.pdf)

------
Karrot_Kream
Do you guys provide access to actual quantum/hybrid computers now or is it all
simulated?

~~~
wzeng
Signing up automatically gives you access to run on the simulator. For select
users, we are providing some limited access to one of our prototype quantum
processors. It's the one discussed in this paper [0]. This initial access
doesn't allow a user to run full quantum programs like on the simulator, but
one can test out a few interesting experiments. You can read about what
experiments are available here [1]. If you'd like to apply to run some
experiments on our hardware, email us with some information about you and
your interests at support@rigetti.com

The access to quantum hardware is limited at this point, but we'll be adding
new features over the coming months.

[0] [http://www.rigetti.com/papers/Demonstration_of_Universal_Par...](http://www.rigetti.com/papers/Demonstration_of_Universal_Parametric_Entangling_Gates_on_a_Multi-Qubit_Lattice.pdf)

[1] [http://pyquil.readthedocs.io/en/latest/qpu.html](http://pyquil.readthedocs.io/en/latest/qpu.html)

~~~
Karrot_Kream
How many issues are there going from the simulator to the practical circuit?
I've had issues with the IBM computers.

Thanks for all the cool stuff to play with!

~~~
wzeng
We haven't integrated arbitrary circuit execution features in yet, but when we
do we'll make the transition as smooth as possible.

It's important to remember though that noise is a fact of life for this
generation of small prototype quantum processors. You'll definitely see a
difference between a perfect simulation and true hardware.

------
andreyf
But does it run Linux?

~~~
dangirsh
We couldn't make the binaries small enough for our current device ;)

We use Linux heavily to develop our own quantum OS. Programs written with
Forest targeting the QPU [1] pass through the quantum OS, which runs on the
control systems at our facilities. Thanks to our cloud API, users don't need
to know these details.

[1]:
[http://pyquil.readthedocs.io/en/latest/qpu.html](http://pyquil.readthedocs.io/en/latest/qpu.html)

------
negativ0
too bad "rigetti" in Italian means literally "you puke"

~~~
reikonomusha
Unlike in Italian, the company name is pronounced with a hard G. And as far as
I know, "rigetti" in Italian is "you discard".

Take it as discarding the transistor era of computing for the wonderful qubit
era. :D

~~~
tomdre
It can mean "discard", "puke", or "throw again".

------
ConAntonakos
I read this as "Bighetti" from HBO's Silicon Valley.

