
Amazon Braket – Get Started with Quantum Computing - aloknnikhil
https://aws.amazon.com/blogs/aws/amazon-braket-get-started-with-quantum-computing/
======
tdhttt
Summary:

+ Amazon didn't build any quantum devices.[0]

+ AWS provides a software stack called Braket.[0]

+ AWS provides classical simulation or access to quantum computers from
D-Wave, IonQ, and Rigetti.[0]

+ Many companies (e.g. IBM, Rigetti) have already provided similar software
stacks/cloud quantum devices. For a more comprehensive listing, please see [1].

[0]: [https://aws.amazon.com/blogs/aws/amazon-braket-get-started-w...](https://aws.amazon.com/blogs/aws/amazon-braket-get-started-with-quantum-computing/)

[1]: [https://qbnets.wordpress.com/2019/11/23/list-of-quantum-clou...](https://qbnets.wordpress.com/2019/11/23/list-of-quantum-clouds/)

~~~
LeftHandPath
I would enthusiastically recommend IBM Q [0] and the command-line variant of
Hello Quantum [1] to anyone who wants to investigate quantum circuits.

If you start googling (or searching with your engine of choice) about quantum
logic gates, Bell states, etc., you'll pick up the basic concepts pretty
quickly. Then you can follow the rabbit hole from there.

[0]: [https://quantum-computing.ibm.com](https://quantum-computing.ibm.com)

[1]: [https://www.pythonanywhere.com/gists/a5d885816f7dc042a78df11...](https://www.pythonanywhere.com/gists/a5d885816f7dc042a78df11ce6cf9652/main.py/ipython3/)
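
For a quick taste, here's what a Bell-state circuit looks like as a minimal
Qiskit sketch (my own example, assuming a reasonably recent Qiskit install):

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    # Bell state: Hadamard puts qubit 0 into superposition,
    # then CNOT entangles it with qubit 1.
    qc = QuantumCircuit(2)
    qc.h(0)
    qc.cx(0, 1)

    # Exact simulation: amplitudes are 1/sqrt(2) on |00> and |11>.
    print(Statevector.from_instruction(qc))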

~~~
cgb223
Are these quantum computers currently powerful enough to tackle things like
breaking non-quantum-proof encryption?

~~~
fhfhdhdbdjx
Short answer is no. The longer answer requires understanding quantum error
correction and how long it will be before we get quantum computers with
thousands of qubits.

~~~
sampo
Short answer: no. Longer answer: currently no.

------
sixdimensional
Does anyone else feel that quantum computing via the cloud is sort of the
ultimate test of the "-aaS" (as a service) model?

I mean, quantum computing is something that today seems unlikely to be widely
available or achievable in hardware to anyone but a handful of specialists and
companies/organizations.

I wonder if someday, in the future, people will look back and see that
classical computing spread because it was made accessible and ubiquitous, and
we did not have the lust for centralization which we seem to have now. I
wonder what kind of future this spells for quantum computing - will it
continue to spread, or will it be limited/stunted by being controlled by only
the few?

This feels like it has the potential to be the ultimate kind of lock-in. If
the way that the system and the hardware/software is exposed to the world is
through cloud services, and the knowledge of how to build/operate/use quantum
computers stays locked-in to only the privileged who can afford to have access
to and utilize it...

Imagine if you started building on quantum computing technology, but then
decided you wanted to change... to what other option!?

I'm not trying to be a Luddite here; I think it's pretty amazing you can even
access a quantum computer as a service. But I am being "that person" who asks
the question... "hmm, where is this going?"

~~~
boothby
One of the questions folks tend to ask me is some form of, "when can I expect
my cellphone to contain a quantum co-processor?" That gives me an excellent
opportunity to tell them everything that I know about cryogenic refrigerators
(which only takes a minute).

The chips we make (speaking directly of D-Wave, but afaik this is true of all
superconducting QC efforts) would cost pennies if we produced them at scale.
But the surrounding machinery is extremely complex, and very expensive to
manufacture -- and scale would only get you so far. My rough understanding of
refrigeration is that the temperature differential strongly depends on the
length of the heat exchanger. Qubits are famously sensitive to noise, and
blackbody radiation gives an inescapable dependence between noise and
temperature. In short, a miniaturized fridge would necessarily be hot, and
therefore too noisy to perform quantum computation!

So the sad news is that, barring some major developments, we may never have
miniature quantum computers. In the foreseeable future, hardware costs will be
measured in millions of dollars. So even if you're a millionaire, you probably
don't want to buy a quantum computer. If you work for a university, a national
laboratory, or a major corporation, you might try to convince your
organization to purchase a quantum computer. If you succeed in that pitch,
you'd almost certainly need to share it with your colleagues over a network.

So to me, an industry insider, it feels like public access to quantum
computing is almost necessarily QCaaS.

~~~
hypewatch
Forgive my ignorance with this question. Would it be possible to run these
quantum chips in space? Space is cold and quiet, so maybe that's cheaper at
scale.

~~~
tsimionescu
In general, you shouldn't think of space as being cold in the intuitive
hot-to-cold heat transfer sense. For example, the metallic hull of a
spaceship would not be anywhere near 0 K, whereas a metal plate between a
near-0 K liquid and your hand would be.

It is very hard to dissipate heat from a solid object into space, since
radiation is the only mechanism available. Our bodies are a different story,
but that has more to do with pressure: if you expose cells to the void of
space, most liquids inside will quickly expand and essentially boil,
consuming large amounts of heat in the phase transition from liquid to gas,
and thus rapidly cooling the surrounding tissue. You could theoretically use
this to build evaporation-based cooling, but you would have to carry vast
quantities of water that would quickly be used up, since most likely there is
no hope of collecting it back.

------
dvt
Very cool, looks like they'll actually let you run stuff on _real_ quantum
hardware[1]! Of course, the hard part is building algorithms that take
advantage of that kind of hardware. I must've read a dozen papers on Shor's
algorithm and my understanding still leaves a lot to be desired.

[1] [https://aws.amazon.com/blogs/aws/amazon-braket-get-started-w...](https://aws.amazon.com/blogs/aws/amazon-braket-get-started-with-quantum-computing/)
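
For the curious, here's a minimal sketch of what running a circuit looks like
with the Braket SDK's local simulator (my own example, assuming
pip install amazon-braket-sdk; the same Circuit object is what you'd submit
to the managed simulator or the hardware providers):

    from braket.circuits import Circuit
    from braket.devices import LocalSimulator

    # Bell pair: Hadamard then CNOT, sampled 1000 times locally.
    bell = Circuit().h(0).cnot(0, 1)
    result = LocalSimulator().run(bell, shots=1000).result()
    print(result.measurement_counts)  # roughly half '00', half '11'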

~~~
52-6F-62
Completely tangential, but on the subject of the actual quantum hardware:

I swear that if our entire civilization went under and some future
archeologist found this:

[https://media.amazonwebservices.com/blog/2019/qc_rigetti_400...](https://media.amazonwebservices.com/blog/2019/qc_rigetti_400_1.jpg)

They'd surely assume [at least at first] it was a religious artifact (and
they'd only be half wrong?).

~~~
jsty
That pattern reminds me quite strikingly of Aboriginal art.

~~~
perl4ever
The part that would be particularly interesting to me, without context, would
be the deviations from symmetry. At first glance, the holes around the white
circles look pretty regular. But the sparser lines of holes in between do not
match each other, particularly the upper right.

------
sjy
Many commenters are comparing this to the time-sharing era of classical
computing. There is an important difference here: we knew at the time that
classical computers were useful, even though many people failed to predict
that they would become millions of times cheaper and smaller.

I’m not a physicist, but it seems to me that it’s still too early to say
things like “when a quantum computer with enough qubits is available,
factoring large integers will become instant and trivial.” Some experts still
doubt whether quantum computing is possible at all [1]. You could not say that
about classical computing before the transistor was discovered. And silicon-
based computers weren’t preceded by a decade of hype about how they were about
to blow vacuum tubes out of the water, just as soon as the remaining
engineering problems were worked out.

[1]: [https://arxiv.org/abs/1908.02499](https://arxiv.org/abs/1908.02499)

I think P ≠ NP is a better analogy. Is it true? Well, most people with an
interest in the problem think so, but we don’t know. I think this announcement
is about as significant as Amazon releasing some tutorials on computational
complexity and giving out a few PhD scholarships for people working on P ≠ NP.
Maybe this is what leads to the invention of the “quantum transistor,” but
it’s too early to say that integer factorisation _will_ become trivial.

------
blt
This is awesome. I didn't see any mention of how much real quantum hardware is
available. I guess it must be extremely expensive, or else it would quickly
become unavailable due to curious people tinkering.

(Also, how do you know that they are actually running your code on quantum
hardware instead of a simulation?)

~~~
dghughes
>I guess it must be extremely expensive

Any quantum computing hardware I see is covered in gold, so I'd say yes.

------
vtomole
> This new service is designed to let you get some hands-on experience with
> qubits and quantum circuits.

Lots of open-source libraries do this[0]. Is Amazon Braket going to be open
source?

[0]: [https://github.com/desireevl/awesome-quantum-computing#devel...](https://github.com/desireevl/awesome-quantum-computing#development-tools)

~~~
boothby
I can't speak to Braket being open source, but at least two of the hardware
providers, D-Wave [1] and Rigetti, have open-source stacks. Disclosure: I work
for D-Wave, and I'm not terribly familiar with the other providers.

[1] [https://github.com/dwavesystems](https://github.com/dwavesystems)
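
For a quick flavor of the D-Wave stack, here's a toy Ising problem solved
with dimod's local brute-force sampler (my own sketch, assuming
pip install dimod; no QPU involved):

    import dimod

    # Two-spin Ising model: the -1 coupling makes aligned spins lower energy.
    bqm = dimod.BinaryQuadraticModel.from_ising({}, {('a', 'b'): -1})

    # ExactSolver enumerates all four spin assignments classically.
    sampleset = dimod.ExactSolver().sample(bqm)
    print(sampleset.first)  # one of the two aligned ground states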

~~~
edgyquant
Last time I checked, D-Wave's systems didn't have quantum speedup and thus
weren't considered true quantum computers.

~~~
codekilla
This is a tricky point, but I would say that D-Wave systems are quantum
computers. However, they rely on coherence, not entanglement (coherence is
necessary for entanglement, but you don't get entanglement automatically from
coherence), and thus do not have a universal set of quantum gates. D-Wave
systems can therefore gain a quadratic advantage over classical systems, but
will not see exponential speedups (entanglement is likely needed for those).

------
anon1m0us
Acknowledging my very primitive understanding of quantum computing, would it
be possible to _simulate_ quantum computing?

~~~
Strilanc
Yes. Here's an online drag-and-drop simulator:
[https://algassert.com/quirk](https://algassert.com/quirk)

Until recently [1], classical simulation was faster, cheaper, and more
accurate than any existing quantum hardware. But the hardware has been
improving, and all classical simulation of quantum computation takes
exponential time with respect to some important property, such as the number
of qubits, the depth of the computation, or the number of non-trivial gates.

[1]: [https://ai.googleblog.com/2019/10/quantum-supremacy-using-pr...](https://ai.googleblog.com/2019/10/quantum-supremacy-using-programmable.html)
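
The memory side of that blowup is easy to see with a back-of-the-envelope
script (my own illustration, assuming 16 bytes per complex128 amplitude):

    # Memory needed to store a full n-qubit state vector.
    for n in (20, 30, 40, 53):
        amps = 2 ** n
        print(f"{n} qubits: {amps:.2e} amplitudes, "
              f"{amps * 16 / 1e15:.4g} petabytes")

At 20 qubits this fits in a laptop's RAM; at 53 qubits it is already about
144 petabytes.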

~~~
jhallenworld
I wonder how optimized the existing simulators are. I suppose any program
with constant input could be reduced to no work, but with potentially
exponentially long compile time. But there must be simpler optimizations...

Before we declare quantum supremacy, we should make sure the simulator we are
comparing with is a good one, not a straw-man one.

I've seen this problem time and again with hardware accelerators. The
accelerator is faster than some crappy software, but with a little work with a
profiler, the software beats the hardware. Of course, optimizing will not make
a fundamentally exponential problem polynomial, but it can help a lot.

~~~
Strilanc
> _Before we declare quantum supremacy, we should make sure the simulator we
> are comparing with is a good one, not a straw-man one._

A big part of writing the supremacy paper was optimizing the simulators. We
did three different styles of simulation:

1) A state vector simulator with hand-rolled SIMD assembly (called qSim; not
yet publicly released). This required too much space at 53 qubits. (Well,
unless you're going to use the majority of all disk space on Summit, which
brings its own obstacles. IBM says they can run it that way in a few days, but
we'll see.)

2) Treating the quantum circuit as a tensor network and doing optimized
contraction to avoid the space blowup (called qFlex
[https://github.com/ngnrsaa/qflex](https://github.com/ngnrsaa/qflex); see the
toy sketch after this list). This required too much time at 53 qubits.

3) Custom code written to run on a supercomputer instead of distributed
computers.
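
To illustrate the tensor-contraction idea in (2) on a toy scale, here's my
own NumPy sketch (nothing to do with qFlex's actual optimizations):

    import numpy as np

    # Gates as tensors: H is 2x2; CNOT is reshaped to a rank-4 tensor
    # with indices (out0, out1, in0, in1).
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]]).reshape(2, 2, 2, 2)

    # |00> as a rank-2 tensor, one index per qubit.
    state = np.zeros((2, 2))
    state[0, 0] = 1.0

    # Contract gate indices against state indices instead of ever
    # building a full 4x4 matrix per step.
    state = np.einsum('ab,bc->ac', H, state)       # H on qubit 0
    state = np.einsum('abcd,cd->ab', CNOT, state)  # CNOT on qubits 0, 1
    print(state.reshape(4))  # Bell state: [0.707, 0, 0, 0.707]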

There's only so much effort you can put in before you have to call it. I think
it's more likely for an algorithmic break to save a factor of 10 than for
optimization to save a factor of 10 at this point, although someone should
probably try using GPUs or FPGAs.

I also take the view that if it takes a month to produce a new optimized
implementation of a classical simulator that beats the quantum hardware, then
the quantum hardware is still outperforming classical hardware for that month.
The theoretical bounds are important, but in the day-to-day context of a race
they aren't directly relevant.

------
hacker322234
Are there any quantum programming tutorials to get started with this stuff?

~~~
Pandabob
Andy Matuschak and Michael Nielsen have a pretty interesting tutorial online
about quantum computing:
[https://quantum.country/qcvc](https://quantum.country/qcvc)

~~~
nestorD
I can recommend it; it is a great primer on the subject.

------
eduren
Interesting choice for the name:
[https://en.wikipedia.org/wiki/Bra%E2%80%93ket_notation](https://en.wikipedia.org/wiki/Bra%E2%80%93ket_notation)

~~~
bentcorner
I was thinking that this name would make hallway conversations tougher (no,
it's braket, without the "c"), but I'm guessing (and it's just a guess, I know
nothing about this field) that the people actually interested in this service
know what braket means.

~~~
ajkjk
They certainly would. Also, bra-ket would be pronounced "brocket", rather than
"bracket".

------
cryptozeus
This is exciting and anxiety-provoking at the same time. Just when I am
starting to grasp the inner workings of cloud/Kubernetes infrastructure, this
is on the horizon. What a world we are in!

~~~
bamboozled
What does this have to do with Kubernetes? You should read the article.

~~~
cryptozeus
You should read my comment again.

------
brookhaven_dude
Are there any quantum simulation frameworks that I can download on my high-
performance gaming PC and start playing with simple quantum programs?

------
welder
Competition: [https://www.rigetti.com/](https://www.rigetti.com/)

~~~
FanaHOVA
Funny that their homepage says "Rigetti Quantum Computers Are Now Available On
AWS" :)

~~~
dmix
Rigetti is one of the three hardware providers for Amazon Braket, along with
D-Wave and IonQ.

[https://aws.amazon.com/braket/hardware-providers/](https://aws.amazon.com/braket/hardware-providers/)

------
CodeWriter23
Serious question: does this mean any developer with a credit card can now
break our strongest crypto?

~~~
dshields1
No. We’re still a long way from quantum computers powerful enough for that.

~~~
randomsearch
Indeed. We currently have machines with ~50 qubits, and we’d likely need tens
of millions of qubits.

------
ksnieck
What happens to the information from your explorations and experiments?

~~~
sputknick
From the FAQ, it looks like it will be dumped into an S3 bucket.

------
pizzaparty2
What types of problems are quantum computers used for?

~~~
fsh
None. Current hardware is way too small and noisy to run any known useful
algorithm.

------
BenoitP
The shiny apparatus surrounding the hardware makes me think of The Talk, by
SMBC Comics:

[https://www.smbc-comics.com/comic/the-talk-3](https://www.smbc-comics.com/comic/the-talk-3)

------
appwiz
Jeff Barr's post - [https://aws.amazon.com/blogs/aws/amazon-braket-get-started-w...](https://aws.amazon.com/blogs/aws/amazon-braket-get-started-with-quantum-computing/)

~~~
dang
That seems to have more information, so we've switched to it from
[https://aws.amazon.com/braket/](https://aws.amazon.com/braket/).

------
Upvoter33
Amazon seems to have missed the opportunity to build quantum hardware in-
house, so they are partnering widely. It'll be interesting to see how Google
squanders their tech lead in this space too...

------
jostmey
So quantum computing is expected to be based on bra-ket notation? In that
case, is it simply based on linear algebra and statistical sampling? I assume
you can get that experience with low-level TensorFlow and a GPU for parallel
computing.

~~~
archgoon
> So quantum computing is expected to be based on bra-ket notation? In that
> case, is it simply based on linear algebra and statistical sampling?

Yes. You can simulate a 53-qubit quantum computer with a 2^53-element
complex-valued state vector as input (about 144 petabytes at double
precision), and a 2^53 x 2^53 complex-valued unitary matrix, which would take
on the order of a quadrillion exabytes to represent exactly. However, that is
for a generic quantum operation on all 53 qubits, and some programs can be
represented significantly more compactly.

> I assume you can get that experience with low-level TensorFlow and a GPU
> for parallel computing.

No. Most GPUs have at most about 13 gigabytes of onboard memory and would not
be able to hold the matrix in memory. Also, GPUs do not reduce the
computational complexity of matrix multiplication; you still have to perform
the full ~n^2.37 operations (using Coppersmith-Winograd). Though again, this
is for a general unitary transformation; reductions can sometimes be made,
which is how IBM was able to validate results computed by Google.

However, yes, this _does_ mean that you can simulate a smaller quantum
computer using just linear algebra and no particularly fancy tricks.
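
To make that concrete, here's a tiny state-vector trick in NumPy (my own
sketch) that applies a single-qubit gate to an n-qubit register without ever
forming the 2^n x 2^n matrix:

    import numpy as np

    def apply_1q_gate(state, gate, target, n):
        # View the 2^n vector as an n-dimensional tensor, one axis per
        # qubit, and contract the 2x2 gate with the target axis only.
        psi = state.reshape([2] * n)
        psi = np.moveaxis(psi, target, 0)
        psi = np.tensordot(gate, psi, axes=([1], [0]))
        psi = np.moveaxis(psi, 0, target)
        return psi.reshape(-1)

    n = 10  # 2^10 amplitudes: trivial for a laptop, hopeless at n = 53+
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0  # |00...0>
    state = apply_1q_gate(state, H, target=0, n=n)
    nz = np.flatnonzero(np.abs(state) > 1e-12)
    print(nz, state[nz])  # 1/sqrt(2) each on |00...0> and |10...0>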

------
legitster
> Taken together, I think it is safe to say that most organizations will never
> own a quantum computer, and will find the cloud-based on-demand model a
> better fit. It may well be the case that production-scale quantum computers
> are the first cloud-only technology.

There was a magical moment when an automobile went from being a gimmicky horse
replacement to an actual innovation.

I feel like this is a possible glimpse at something similar for the cloud.

~~~
filoleg
I think the trend is reversed here. A lot of things are moving away from
personal on-device computing to cloud, so it makes sense that for quantum we
can jump straight to cloud, especially considering both the cost and
practicality. Like, I don't remember any major trend in computing moving in
the opposite direction in the past 10 years.

