Hacker News
Amazon Braket – Get Started with Quantum Computing (amazon.com)
451 points by aloknnikhil on Dec 2, 2019 | 137 comments



Summary:

+ Amazon didn't build any quantum devices.[0]

+ AWS provides a software stack called Braket. [0]

+ AWS provides classical simulation or access to quantum computers from D-Wave, IonQ, and Rigetti.[0]

+ Many companies (e.g. IBM, Rigetti) have already provided similar software stack/cloud quantum devices. For a more comprehensive listing, please see [1].

[0]: https://aws.amazon.com/blogs/aws/amazon-braket-get-started-w...

[1]: https://qbnets.wordpress.com/2019/11/23/list-of-quantum-clou...


I would enthusiastically recommend IBM Q [0] and the command-line variant of Hello Quantum [1] to anyone who wants to investigate quantum circuits.

If you start googling (or searching with your engine of choice) about Quantum Logic Gates, Bell States, etc., you'll pick up the basic concepts pretty quickly. Then you can follow the rabbit hole from there (a minimal Bell-state circuit is sketched after the links below).

[0]: https://quantum-computing.ibm.com

[1]: https://www.pythonanywhere.com/gists/a5d885816f7dc042a78df11...
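
As a quick illustration of the Bell state mentioned above (a minimal sketch, not part of the original comment; it assumes a Qiskit installation of roughly this era, with the Aer/execute API):

    # Prepare and sample the Bell state (|00> + |11>)/sqrt(2) on the local simulator.
    from qiskit import QuantumCircuit, Aer, execute

    qc = QuantumCircuit(2, 2)
    qc.h(0)                      # put qubit 0 into superposition
    qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])   # measure both qubits into classical bits

    backend = Aer.get_backend('qasm_simulator')
    counts = execute(qc, backend, shots=1000).result().get_counts()
    print(counts)                # roughly half '00' and half '11', never '01' or '10'

The same circuit can be pointed at real IBM Q hardware by swapping the backend for one obtained through an IBMQ account.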


Qiskit[0], the framework that IBM Q uses, also has an awesome community and an introductory textbook called "Learn Quantum Computing using Qiskit" [1]. The "Quantum Algorithms" chapter is thoroughly documented from both theoretical and coding perspectives.

[0]: https://qiskit.org/

[1]: https://delapuente.github.io/qiskit-textbook/preface


Are these quantum computers currently powerful enough to tackle things like breaking non-quantum proof encryption?


Short answer is no. Longer answer requires understanding quantum error correction and how long before we get quantum computers with 1000s of qubits.


Short answer: no. Longer answer: currently no.


As long as you don't need to factor a number greater than 4088459, absolutely. More seriously, my take is that a lot of the (modest) work on quantum algorithms in industry is to provide financial and some engineering and aerospace domains with potential "off the shelf" algorithms if and when quantum supremacy is achieved for those problems. Right now it's more of an intellectual exercise, excepting some specific work that can take advantage of quantum annealing approaches.


Also, for a more comprehensive listing of quantum software, see the awesome-quantum-software list [0] on GitHub.

Personally, I have used Cirq [1] and Qiskit [2] to implement some classic algorithms like Deutsch-Jozsa [3] and Shor's [4] algorithm, and I would say the two frameworks are very similar to use for those implementations (a small example is sketched after the links below).

So, it would be interesting to get some insights/opinions from someone who has used multiple frameworks. Are there any clear wins/losses among these frameworks?

[0]: https://github.com/qosf/awesome-quantum-software#experimenta...

[1]: https://github.com/quantumlib/Cirq

[2]: https://qiskit.org/

[3]: https://en.wikipedia.org/wiki/Deutsch%E2%80%93Jozsa_algorith...

[4]: https://en.wikipedia.org/wiki/Shor%27s_algorithm
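
For a flavor of what these implementations look like, here is a minimal sketch (mine, not the parent's) of Deutsch's algorithm, the n=1 special case of Deutsch-Jozsa, in Cirq; the oracle is a hypothetical example encoding the balanced function f(x) = x as a CNOT:

    import cirq

    q0, q1 = cirq.LineQubit.range(2)
    circuit = cirq.Circuit([
        cirq.X(q1),                  # ancilla starts in |1>
        cirq.H(q0), cirq.H(q1),      # superpose both qubits
        cirq.CNOT(q0, q1),           # oracle for the balanced function f(x) = x
        cirq.H(q0),                  # interference step
        cirq.measure(q0, key='m'),   # 1 => f is balanced, 0 => f is constant
    ])

    result = cirq.Simulator().run(circuit, repetitions=100)
    print(result.histogram(key='m'))  # every shot should read 1 for this oracle

The Qiskit version is nearly line-for-line the same, which matches the parent's impression that the two frameworks feel very similar at this level.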


"This new service is designed to let you get some hands-on experience with qubits and quantum circuits. You can build and test your circuits in a simulated environment and then run them on an actual quantum computer. Amazon Braket is a fully managed AWS service, with security & encryption baked in at each level."

I'm confused.


Does anyone else feel that quantum computing via the cloud is sort of the ultimate test of the "-aaS" (as a service) model?

I mean, quantum computing is something that today seems unlikely to be widely available or achievable in hardware to anyone but a handful of specialists and companies/organizations.

I wonder if someday, in the future, people will look back and see that classical computing spread because it was made accessible and ubiquitous, and because we did not have the lust for centralization which we seem to have now. I wonder what kind of future this spells for quantum computing - will it continue to spread or will it be limited/stunted by being controlled by only the few?

This feels like it has the potential to be the ultimate kind of lock-in. If the way that the system and the hardware/software is exposed to the world is through cloud services, and the knowledge of how to build/operate/use quantum computers stays locked-in to only the privileged who can afford to have access to and utilize it...

Imagine if you started building on quantum computing technology, but then decided you wanted to change.. to what other option!?

I'm not trying to be a Luddite here, I think it's pretty amazing you can even access a quantum computer as a service. But, I am being "that person" who asks the question.. "hmm, where is this going"?


One of the questions folks tend to ask me is some form of, "when can I expect my cellphone to contain a quantum co-processor?" That gives me an excellent opportunity to tell them everything that I know about cryogenic refrigerators (which only takes a minute).

The chips we make (speaking directly of D-Wave, but afaik this is true of all superconducting QC efforts) would cost pennies if we produced them at scale. But the surrounding machinery is extremely complex, and very expensive to manufacture -- and scale would only get you so far. My rough understanding of refrigeration is that the temperature differential strongly depends on the length of the heat exchanger. Qubits are famously sensitive to noise; and blackbody radiation gives an inescapable dependence between noise and temperature. In short, a miniaturized fridge would be necessarily hot, and therefore too noisy to perform quantum computation!

So the sad news is that, barring some major developments, we may never have miniature quantum computers. In the foreseeable future, hardware costs will be measured in millions of dollars. So even if you're a millionaire, you probably don't want to buy a quantum computer. If you work for a university, a national laboratory, or a major corporation, you might try to convince your organization to purchase a quantum computer. If you succeed in that pitch, you'd almost certainly need to share it with your colleagues over a network.

So to me, an industry insider, it feels that public access to quantum computing is almost necessarily QCaaS.


I wonder if we had the same perspectives when computers were first established in the '50s/'60s.

We couldn't possibly foresee how those monsters of machines could possibly fit in the palm of our hands, and yet now it's hard to see how we couldn't see that far in front of us.

We had this famous quote from IBM, but I wonder whether this was a common perspective across the industry, or whether those who were building these machines could foresee where we'd be now. Is there someone in quantum computing with the intelligence and creativity to think: nah, we'll have giant refrigerated quantum computing mobile phones five or so decades from now? (I understand quantum computers wouldn't make great phones; I'm just drawing baseless comparisons :) )

"I think there is a world market for maybe five computers."

Thomas Watson, president of IBM, 1943

= = = =

I only found out about wolframalpha a month ago and I'm still in awe of it. Quantum computing as a service? No idea what I'm going to want to do with it. But I re-watch this video by Veritasium and Andrea Morello from UNSW a couple of times a year to just remind myself how much I don't know.

https://www.youtube.com/watch?v=Auha-gXTiqU


Just as an aside, that quote is likely apocryphal. No one has ever been able to find actual evidence that he said this, and those close to him have rejected it as a false attribution. It was being recognized as a myth way back in the 1970s.

Even if he had said it, it was a very accurate statement at the time. Gordon Bell has noted that at the time he is claimed to have made that statement, it would have held essentially true for a decade. As something that would have likely been said (if said at all) in discussions around IBM's near-future business plans, or a sort of market analysis of the conditions at the time, it makes perfect sense.


See, this is why I would suck as a contestant on QI. Thank you. :)


Cooling chips to around room temp was an evolutionary process. Cooling chips to near absolute zero is another ball of physics entirely. I suspect it is like trying to have a pocket sized NMR machine, which needs liquid helium cooling etc. Those things can’t just be scaled down to pocket size.


What about photonic quantum computers? You can access quantum mechanical effects at room temperature with photonics. I don’t know if it’s more difficult to use them for computing though


Yes, because photons don't interact with each other


Are you supporting or refuting my thesis? Assuming the latter, just because photons as particles don’t “interact” (by which I presume you mean they are bosons) doesn’t mean you can’t get interesting multi particle quantum results with photons. See for example https://en.m.wikipedia.org/wiki/Hong–Ou–Mandel_effect


I'm not sure if it's up to the task (yet?), but thermoelectric coolers are routinely used for CCDs that operate below -80 °C (an overview: https://www.azom.com/article.aspx?ArticleID=14681). I suspect someone will come up with a solution once the technology starts to mature.


At those temperatures things get weird. Cryocoolers are established technology, but anything colder than the tens-of-kelvin range very quickly stops being easy in any way.

That said, LH2 temperatures aren't really hard to reach and the equipment can easily fit in a 2U rackmount device, enabling power/classical-RF uses of type-II superconductors. Think EMI shielding, power conditioning, ~50 GHz traces that can span a full backplane without fancy signal conditioning, etc.


Thermoelectric coolers are not even able to keep up with modern CPUs. Or rather, they can, but they are very energy hungry: as much as or more than the CPU itself.


Mostly: thermoelectric coolers are bad for >50 K differences and >10 W heat flux on the cold side. Most uses are better off with a Stirling cooler or maybe even an absorption refrigerator, which can have no mechanical parts (just fluids, plumbing, and heat exchangers) and could theoretically provide human-centric AC/refrigeration based on server waste heat.


Forgive my ignorance with this question. Would it be possible to run these quantum chips in space? Space is cold and quiet so maybe that’s cheaper at scale.


In general, you shouldn't think of space as being cold in the intuitive hot-to-cold heat transfer sense. For example, the metallic side of a spaceship would not be anywhere near 0 K, whereas a metal plate between a liquid at around 0 K and your hand would be.

It is very hard to dissipate heat from a solid object into space. This is not true for our bodies, on the other hand, but that has more to do with pressure: if you expose cells to the void of space, most of the liquid inside would quickly expand and essentially boil, consuming large amounts of heat in the phase transition from liquid to gas, and thus quickly cooling the surrounding tissue. You could theoretically use this for evaporation-based cooling, but you would have to transport vast quantities of water that would quickly be used up, since there is most likely no hope of collecting it back.


Shedding heat in space is a huge problem because you can only radiate it. Depending on the refrigeration requirements there may be too much excess heat to make it feasible.


You'd probably lose a lot of the advantages, because you'd lose the naturally extremely effective radiation shielding of the Earth's magnetic field and atmosphere.


You'd probably always have to stay in the shadow of some celestial body and the 3K microwave background would be an issue as well that would have to be solved.


Maybe something a tiny little black hole could fix?


There’s just got to be a way, there’s no way we can just give up.

I’m fairly sure decades ago people assumed we may never have tiny computers in our pockets with the same certainty you have now.

This is what it means to be crazy enough to change the world.


On the other hand, many people assumed we'd have flying cars and colonies on Venus.

Oh, and fusion reactors.

Edit:

Now that I think of it, there was a lot of variation in predictions of computing. But I would say that it's been pretty common for science fiction to describe technology 30 years out somewhat accurately, probably because it has inspired the actual tech in a self-fulfilling way.

So, in the 40s, a spaceship was envisioned able to carry only calculators and slide rules, with a radio link to a big central computer. That wasn't far off of how things developed in the 60s and 70s. But I think by the 60s and 70s, people were imagining pocket computers and tablets and such and that had a huge effect on people actually designing them when it was possible.


If people’s assumptions tend to be wrong then we can expect to someday have mini quantum computers since people assume they are impossible.


See my addition. My point is that some predictions are right and some are wrong.


> There’s just got to be a way, there’s no way we can just give up.

By no means do I intend to discourage progress! I wouldn't do the work that I do if it wasn't so difficult. I did hedge, a bit: "may never," "foreseeable future," "without major developments."

The fridge is just one major obstacle. There's a plethora of physics, engineering, and mathematical challenges out there impeding progress. Get to work!


> This feels like it has the potential to be the ultimate kind of lock-in. If the way that the system and the hardware/software is exposed to the world is through cloud services, and the knowledge of how to build/operate/use quantum computers stays locked-in to only the privileged who can afford to have access to and utilize it...

Well, consider the opposite suggestion: right now, quantum computer access is fairly constrained. Making these systems available to anyone who has a credit card is a wider democratization of access to them, not a constraint.

After all, cryogenics and such isn't exactly free or easy to maintain - access is going to be somewhat controlled unless/until these things are to the point where you have one in your phone.


Yes, I agree with your point, I was kind of vacillating back and forth.

I think it's incredible that the inventors of this technology are pushing so hard to get it in the hands of people who can apply it. So in that regard, I completely agree, it is a better situation to have the technology available for a reasonable price and payment option.

I think part of my hang-up with this is thinking about and remembering how much I was amazed by what we could do independently before the cloud vendors. It seems like we are in the part of the cycle that encourages centralization, but I don't know how or if we'll ever be able to exit this phase (or maybe we won't have to).

The scale and complexity of what the cloud vendors offer, compared to what independent organizations can do, is truly mind-blowing, and becomes more so every day. How does one even compete (or why would one want to) against these "utility" technology companies?


Totally agree that there's some loss of control when we allow cloud vendors to be the middlemen in all things; but honestly, this is probably one of the few cases where it's the perfect missing ingredient to be able to give access to what are fundamentally time-sharing systems to the widest number of people possible.

This is like dialing up to use the university PDP-11 in the 70s, basically. Big things are coming.


We were in a similar situation when computers were built with vacuum tubes. The anomaly was the transistor. Maybe there is a quantum equivalent, but we haven't found it yet.


The Quantristor?


Yeah, but at the same time, how easy is it today to build something with a Raspberry Pi or deploy software to thousands of IoT devices?


>cryogenics and such isn't exactly free or easy to maintain

Room-temperature superconductors and other technologies using strange edge conditions are theoretically possible, but quantum computers have only a few specialties.


Hard to say. In 1960, classical computing would have seemed totally centralized and unattainable in distributed form for the masses. Who knows what the future may bring for quantum computing, assuming there is actually a conventional use case for the masses (most people probably don't have a burning urge to factor large numbers at home the way they do to, say, play video games).


I agree. I'm trying to see this through that perspective when we were at that point in the cycle where everything was centralized. I'm wondering if there will ever be a time (or even a reason) for things to go back the other way again.

Also, agreed about the use case - sometimes I get the feeling that quantum computing is a solution looking for a problem (but I am sure that must not be the case). That said, I think things are partially that way because quantum computing is just such a different paradigm, so truly taking advantage of it requires a pivot in thinking, but great dividends may be possible as a result.

My thought is, it's kind of like how we learned about what FPGAs could do. Different paradigm, incredible opportunity.


Privacy, offline access, low latency - these are all excellent use cases for edge computing. Once it's time to do some heavy lifting, though, it makes a lot more sense to centralize. Decentralization gives you control along with responsibility, so the cycle goes something like this:

* Decentralized as a part of early development

* Centralized for ease of early deployment

* Decentralized once it becomes simple / commodity enough that everyone can just have one

* Recentralized once it's cheaper to run them all centrally again

And then you only break back out once the thing you're doing fundamentally changes for some reason.


Disclaimer: I work at AWS, but this post isn't being made in any sort of official capacity, I have no relation to the team in question (this is the first I've even heard of the service), and the opinions here are entirely my own and not necessarily a reflection of that of my employer.

> I wonder what kind of future this spells for quantum computing - will it continue to spread or will it be limited/stunted by being controlled by only the few?

I feel like this is a step in the right direction, though. Right now using quantum computers is totally outside of the realm of possibility for the vast majority of people - they're simply too expensive in materials cost, expertise to create, conditions for operation, etc. etc. etc. - without services like this one. The only chance an "everyday" person has to try out a quantum computer is to rent time on someone's else's.

I don't think at a similar point in the life of classical computers we had options like this that were readily available - you could rent time on the computers, but I can't imagine that getting access to them was as easy as it will be today with the internet being a thing and service providers offering high granularity on billing.

My understanding (and I'm not even remotely an expert, so I could be totally off base here!) is that it's an open question on whether or not quantum computing will ever even be doable in environments where classical computing works - it might not be within the realm of what physics allows for it ever to be possible to have a quantum computer powered smartphone.

I hope access is ubiquitous someday for people, but in general I feel like this is a good step while that's not practical.


> an "everyday" person has to try out a quantum computer

what would an everyday person do on a qc?


Well, practical QC seems to be involved with optimization problems. D-Wave recently demonstrated doing something with bus routing for Volkswagen. I could imagine, say, a map service scaling that out by integrating QC into their route-finding for drivers to cooperatively improve traffic flow by finding optimal solutions to problems of a scale that is intractable with classical systems.

The everyday person will use QC like they "use" machine learning today: from a very high level abstract viewpoint, where services they consume have a little bit of intelligence that makes interacting with them more efficient.


Yeah, but what kind of optimization problem where an exact solution is intractable also doesn't have approximate algorithms that are good enough?


D-Wave has never demonstrated quantum speedup. Many doubt that their approach can be useful, even in theory.


Meh, sounds like overly negative propaganda to me. Clearly they're building up a big body of knowledge and have a lot of potential, as long as someone can figure out a practical application for the kinds of optimization problems their machine is good at.

It seems like neural networks should map to it well. Once the degree of connectivity and the number of qubits approaches the millions, there's no way any normal software solver is going to be able to keep up with it.


Facebook


It's an invitation from Amazon to quantum computer makers to show their tech's potential. In the end they'll buy the winner and the others will fail. They'll also have first dibs in case someone comes up with another useful quantum algorithm in their cloud. It feels like a defensive land grab of sorts (like neuromorphic computing).

Amazon seems to be going -aaS on everything... but how long can this last? Despite computing hardware evolution having slowed, it hasn't halted, and eventually robust hardware will become cheap enough for competitors to commoditize servers once again (as should be the norm).


This is a reasonable take. Consider that IBM, Microsoft, and Google - the credible cloud competitors to AWS - all have their own in-house QC efforts. AWS is outsourcing theirs... they make it sound like it's going to be a marketplace model, but you're right, if/once the profit motive is in place they'll snap up the supplier.

Of course, if nothing productive materializes, they're also out nothing. Sort of a win-win for them.

The question I have is whether we'll see things like SageMaker and other higher level machine learning features use the quantum computers on the back end.


Technically, it is similar to what happened with computers — computers of old were warehouse-sized, in tightly controlled environments. The term "debugging" comes from the physical process of actually removing bugs from a computer. Lol. As computers became more affordable and smaller, they showed up in more places and allowed more people to use them. It appears the quantum computers of today are following the traditional evolution of computers. 40 years ago, people were able to rent time on university/business computers, similar to what AWS Braket is offering. The real question for me is what the "PC" of the quantum realm will be. What will a quantum computer look like 40 years from now? I can't even imagine what it would look like, or what programming a quantum machine will even look like for the masses (AKA the Python/Ruby/Java/C++, etc.).


It won't hit PC levels until either cryogenics are commodity sized and priced for the home market, or room-temperature superconductors arrive and prove themselves still able to exhibit the Weird Quantum Effects that cold ones have. Look up "Josephson junctions", they're basically the transistors of QC.


My ideas are very fuzzy, but when people say that you can't run quantum computations at high temperature because of noise, it reminds me of something I read not too long ago about how scientists are starting to have insight into how biological systems operate in the presence of large amounts of noise from heat. So I wonder if there's a trick to it that nature already has found.


I'm no scientist but I would be very surprised if the brain is not harnessing similar effects or processes somehow. Consider microtubules, for example.


Quantum on Rails


> people will look back and see that classical computing spread because it was made accessible and ubiquitous,

I don't know if that is true. I mean, it was true eventually, but in the beginning, it was quite limited to only well capitalized businesses.

I mean, even in the "ubiquitous" period, it wasn't that accessible. In 1984, an IBM PC cost about $5,000 (in 1984 dollars). That would be about $12,300 in today's dollars. Not out of reach for all, but certainly only for the upper middle class at best.


We had 3-4 decades (the '40s-'70s) of 'lock-in' when conventional computers were essentially 'big iron' that only large organizations had access to. If quantum computing ends up having useful general-purpose applications, I'd expect it to start going down-market once some combination of the implementation issues have been worked out, the big/easy money has been mined from government/corporate customers, and patents expire.


Very cool, looks like they'll actually let you run stuff on real quantum hardware[1]! Of course, the hard part is building algorithms that take advantage of that kind of hardware. I must've read a dozen papers on Shor's algorithm and my understanding still leaves a lot to be desired.

[1] https://aws.amazon.com/blogs/aws/amazon-braket-get-started-w...


Completely tangential, but on the subject of the actual quantum hardware:

I swear that if our entire civilization went under and some future archeologist found this:

https://media.amazonwebservices.com/blog/2019/qc_rigetti_400...

They'd surely assume [at least at first] it was a religious artifact (and they'd only be half wrong?).


https://www.amazon.com/Motel-Mysteries-David-Macaulay/dp/039... (specifically, the section on excavating the bathroom).


That pattern reminds me quite strikingly of aboriginal art


The part that would be particularly interesting to me, without context, would be the deviations from symmetry. At first glance, the holes around the white circles look pretty regular. But the sparser lines of holes in between do not match each other, particularly the upper right.


I'm pretty sure the hard part is still actually building said computers to run on.


Many commenters are comparing this to the time sharing era of classical computing. There is an important difference here: we knew that classical computers were useful at the time, even though many people failed to predict that they would become millions of times cheaper and smaller.

I’m not a physicist, but it seems to me that it’s still too early to say things like “when a quantum computer with enough qubits is available, factoring large integers will become instant and trivial.” Some experts still doubt whether quantum computing is possible at all [1]. You could not say that about classical computing before the transistor was discovered. And silicon-based computers weren’t preceded by a decade of hype about how they were about to blow vacuum tubes out of the water, just as soon as we work out the remaining engineering problems.

[1]: https://arxiv.org/abs/1908.02499

I think P ≠ NP is a better analogy. Is it true? Well, most people with an interest in the problem think so, but we don’t know. I think this announcement is about as significant as Amazon releasing some tutorials on computational complexity and giving out a few PhD scholarships for people working on P ≠ NP. Maybe this is what leads to the invention of the “quantum transistor,” but it’s too early to say that integer factorisation will become trivial.


This is awesome. I didn't see any mention of how much real quantum hardware is available. I guess it must be extremely expensive, or else they would quickly become unavailable due to curious people tinkering.

(Also, how do you know that they are actually running your code on quantum hardware instead of a simulation?)


>I guess it must be extremely expensive

Any quantum computing hardware I see is covered in gold, so I'd say yes.


There are simulation codes, but IBM makes their old machines publicly available. I wish I had time to look it up for you, but if you are interested that should give you a lead.


It stands to reason that the D-Wave offering will be running on real hardware, since you can sign up to run stuff on a D-Wave directly from their own site already.


> This new service is designed to let you get some hands-on experience with qubits and quantum circuits.

Lots of open-source libraries do this[0]. Is Amazon Braket going to be open-source?

[0]: https://github.com/desireevl/awesome-quantum-computing#devel...


I can't speak to braket being open source, but at least two of the hardware providers, D-Wave [1] and Rigetti, have open source stacks. Disclosure, I work for D-Wave and I'm not terribly familiar with the other providers.

[1] https://github.com/dwavesystems


Last time I checked, D-Wave's systems hadn't demonstrated quantum speedup and thus weren't considered true quantum computers.


This is a tricky point, but I would say that D-Wave systems are quantum computers; however, they rely on coherence and not entanglement (coherence is necessary for entanglement, but you don't get entanglement automatically from coherence), and thus do not have a universal set of quantum gates. D-Wave systems can thus gain a quadratic advantage over classical systems, but will not see exponential speedups (entanglement is likely needed for that).


Years ago I first encountered quantum computing through https://www.dwavesys.com/tutorials/background-reading-series..., back when there was a Python simulator on the website and a call for beta testers (the slots filled up quickly). Kudos for a great tutorial!


Thanks, that is a good list of learning materials and tools.

Several years ago I spent a few evenings with a quantum computer simulator and a Shor's algorithm example.


Amazon Braket seems to allow you to run on actual quantum hardware, which I'm guessing is the difficult bit, rather than the software.


I was asking if the software that allows one to program a quantum computer is going to be open-source. Pyquil and Qiskit allow you to run on actual quantum hardware and they are open-source.


My point is that if Amazon is providing the hardware then that's the selling point since quantum hardware is I'm guessing not easy to get a hold of.


If quantum hardware is the selling point, I'm trying to understand the gain in keeping their SDK closed (If that is what they plan on doing).


To lock you in as a measure to protect their revenue if another company offers quantum hardware as a service. The last thing amazon wants in any of its aas solutions is compatibility with competitors.


Acknowledging my very primitive understanding of quantum computing, would it be possible to simulate quantum computing?


Yes. Here's an online drag and drop simulator: https://algassert.com/quirk

Until recently [1], classical simulation was faster/cheaper/more accurate than any existing quantum hardware. But the hardware has been improving, and all classical simulation of quantum computation takes exponential time with respect to some important property such as the number of qubits, the depth of the computation, or the number of non-trivial gates.

[1]: https://ai.googleblog.com/2019/10/quantum-supremacy-using-pr...


I wonder how optimized the existing simulators are. I suppose any program with constant input could be reduced to no work, but with a potentially exponentially long compile time. But there must be simpler optimizations...

Before we declare quantum supremacy, we should make sure the simulator we are comparing with is a good one, not a straw-man one.

I've seen this problem time and again with hardware accelerators. The accelerator is faster than some crappy software, but with a little work with a profiler, the software beats the hardware. Of course, optimizing will not make a fundamentally exponential problem polynomial, but it can help a lot.


> Before we declare quantum supremacy, we should make sure the simulator we are comparing with is a good one, not a straw-man one.

A big part of writing the supremacy paper was optimizing the simulators. We did three different styles of simulation:

1) A state vector simulator with hand-rolled SIMD assembly (called qSim; not yet publicly released). This required too much space at 53 qubits. (Well, unless you're going to use the majority of all disk space on Summit, which brings its own obstacles. IBM says they can run it that way in a few days, but we'll see.)

2) Treating the quantum circuit as a tensor network and doing optimized contraction to avoid the space blowup (called qFlex https://github.com/ngnrsaa/qflex). This required too much time at 53 qubits.

3) Custom code written to run on a supercomputer instead of distributed computers.

There's only so much effort you can put in before you have to call it. I think it's more likely for an algorithmic break to save a factor of 10 than for optimization to save a factor of 10 at this point, although someone should probably try using GPUs or FPGAs.

I also take the view that if it takes a month to produce a new optimized implementation of a classical simulator that beats the quantum hardware, then the quantum hardware is still outperforming classical hardware for that month. The theoretical bounds are important, but in the day-to-day context of a race they aren't directly relevant.


I'm not really an expert (I've just been playing with quantum computer simulations since the supremacy paper came out), but yours is basically IBM's criticism of that paper.

Part of the problem is that comparing a quantum computer simulation to the state of an actual quantum computer is almost inherently an apples-to-oranges thing.

We do not know of any way to compactly represent the state of a quantum computer in a classical computer; it appears to require an exponentially large state vector. So it requires an exponential number of operations just to evolve the full state vector. But it also produces the exact and full probability distribution describing the quantum computer state. Running a physical QC produces one _sample_ from that distribution, and you have to run it completely again to get another sample.

If we pretend that the simplest quantum computer state is equivalent to a coin toss, then classically simulating a fair coin would produce a distribution like {H: 0.5, T: 0.5}, while running the quantum computer would produce “H” half the time and “T” half the time. You could run the QC a bunch of times to estimate the actual probability, but there’s a fundamental difference between what the QC does and the simulation does.

There are some ways to approximately solve the classical simulation problem, but we haven't found a way to compactly represent and manipulate the quantum state vector that is as fast as physics/nature seems to be able to do it in hardware. The Extended Church-Turing thesis (ECT) basically says that such an encoding and simulator should exist, and the basic premise of QC is that such an encoding and evolution process is impossible.

We don’t expect to ever conclusively prove that ECT is false (someone could always find an efficient simulator for QCs) but that doesn’t mean QCs can’t be useful in the interim, or even potentially after such a simulator is found (if it is a polynomial with a very large constant factor, for example).

The supremacy paper claimed that classically simulating their largest circuit would take 10,000 years. IBM said it would take them 2 days. The Sycamore processor did it in 5 minutes. That's still a big improvement, even if the classical simulator can be further optimized—and even IBM admits their simulator wouldn't work if the quantum processor had even one more qubit.


Yes. Right now simulators tend to be faster, cheaper, and more accurate than real quantum computers.


How can a simulation be more accurate than what it's simulating?


Physical quantum computers have noise. Let's take a (simplified) scenario. You've set up your quantum circuit with one qubit, and put it in a position where it will measure as either 0 or 1 with equal chance. In the simulation it'll come out as 0 or 1 with actual equal chance. In the real world, other factors will create a bias one way or the other (this may not be consistent, either) so that it comes out more like 60% 0 to 40% 1, even over 1000s of trials.

If you set up a circuit where you've entangled two qubits so that they should come out as the same value (00 or 11) and the configuration says they should come out with 50% chance of either, the simulation will show that. The outputs of 01 and 10 will never show up in the simulation. But in the real world, there's still a chance that you get those. You'll likely (on IBM's quantum computers) get something like 1-5% 01, 1-5% 10, 45-50% 11, 45-50% 00 (again, over thousands of runs).

If you want to see how this plays out with simulations and real quantum computers, IBM [0] has free access (constrained by credits when you want to run on real quantum computers, they reset each day).

[0] https://quantum-computing.ibm.com/
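
The 01/10 leakage described above can also be played with locally. A minimal sketch using Qiskit Aer's noise model (the error rates here are made-up illustrative numbers, and the import paths assume an older qiskit-aer release):

    from qiskit import QuantumCircuit, Aer, execute
    from qiskit.providers.aer.noise import NoiseModel, depolarizing_error

    # Entangle two qubits so that, ideally, only '00' and '11' ever appear.
    qc = QuantumCircuit(2, 2)
    qc.h(0)
    qc.cx(0, 1)
    qc.measure([0, 1], [0, 1])

    # Toy noise: depolarize single-qubit H gates at 2% and two-qubit CX gates at 5%.
    noise = NoiseModel()
    noise.add_all_qubit_quantum_error(depolarizing_error(0.02, 1), ['h'])
    noise.add_all_qubit_quantum_error(depolarizing_error(0.05, 2), ['cx'])

    backend = Aer.get_backend('qasm_simulator')
    ideal = execute(qc, backend, shots=4000).result().get_counts()
    noisy = execute(qc, backend, shots=4000, noise_model=noise).result().get_counts()
    print(ideal)   # only '00' and '11'
    print(noisy)   # now with a small fraction of '01' and '10' as well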


As far as I understand, it's because what it's simulating is a logical qubit which is different from the very noisy, almost instantaneously collapsing physical qubits present in current quantum computers.

Software simulates what's supposed to happen while the hardware only approaches it through many repeated trials.


Current quantum computers are noisy. Gates aren’t perfectly implemented. Qubits are prone to dephasing and decoherence.


I think he means it returns correct results to known problems more reliably.


Quantum computers can be used to simulate digital computers, so a digital computer simulating a quantum computer simulating a digital computer can cheat.


Yes it is! But incredibly slow and resource intensive.

A few days ago I shared a blog post here on HN about a simple quantum computing simulator I built as a weekend project, for learning purposes:

- https://thomasvilhena.com/2019/11/quantum-computing-for-prog...

Basically it is just linear algebra ;)
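
For anyone curious what "just linear algebra" cashes out to, here is a minimal sketch (my own, not taken from the blog post above): a two-qubit state is a length-4 complex vector, and gates are unitary matrices applied to it.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.zeros(4, dtype=complex)
    state[0] = 1.0                       # start in |00>
    state = np.kron(H, I) @ state        # Hadamard on the first qubit
    state = CNOT @ state                 # entangle -> Bell state (|00> + |11>)/sqrt(2)
    print(np.abs(state) ** 2)            # [0.5, 0, 0, 0.5]: measurement probabilities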


Yeah, up to a threshold which you could roughly say is somewhere in the 40-100 qubit range depending on what type of (fairly useful) computation you’re doing. It’s linear algebra with exponentially large matrices, but there are lots of tricks you can do to optimise simulation - up to a point.


Yes. Slowly.


Currently, yes. Eventually, no.


Are there any quantum programming tutorials to get started with this stuff?


Disclosure: I'm a D-Wave employee; we're one of the hardware vendors. Here's a handful of resources: an introduction to our software stack [1], the SDK [2], a repository of demos [3], a webinar [4], and, if you're willing to sign up for a free minute of QPU time (using our cloud service directly rather than AWS), some live demos and interactive Jupyter notebooks on the Leap service [5]. A tiny code example follows the links below.

[1] https://docs.ocean.dwavesys.com/en/latest/getting_started.ht...

[2] https://github.com/dwavesystems/dwave-ocean-sdk

[3] https://github.com/dwavesystems/demos

[4] https://dwavesys.zoom.us/webinar/register/WN_WRHBYy1eQB-20oT...

[5] https://cloud.dwavesys.com/
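
To give a concrete feel for the programming model, here is a minimal sketch using the Ocean dimod package from [2]: a toy two-variable QUBO solved with the local brute-force sampler (submitting the same BQM to the actual QPU would use a sampler from dwave.system instead).

    import dimod

    # "Pick exactly one of a, b": reward each variable, penalize picking both.
    bqm = dimod.BinaryQuadraticModel({'a': -1, 'b': -1},   # linear biases
                                     {('a', 'b'): 2},      # quadratic coupling
                                     0.0, dimod.BINARY)

    sampleset = dimod.ExactSolver().sample(bqm)   # enumerates all 4 assignments
    print(sampleset.first)  # lowest-energy sample: a=1,b=0 or a=0,b=1 at energy -1.0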


Yes. Microsoft has the Quantum Development Kit [1], with its language Q# (written in F#, actually), which comes with a local simulator. They also have the Q# Katas [2] for practice.

On my 16GB RAM laptop, I can simulate 29 qubits.

[1]: https://www.microsoft.com/en-us/quantum/development-kit

[2]: https://docs.microsoft.com/en-us/quantum/intro-to-katas?view...
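
A quick back-of-the-envelope check of that 29-qubit figure (my own sketch, assuming one dense complex128 amplitude, i.e. 16 bytes, per basis state; real simulators add some overhead):

    for n in (20, 29, 30, 40):
        gib = 16 * 2 ** n / 2 ** 30
        print(f"{n} qubits -> {gib:g} GiB state vector")
    # 29 qubits -> 8 GiB, which indeed just about fits in a 16 GB laptop;
    # each extra qubit doubles the memory, so 40 qubits already needs ~16 TiB.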


Andy Matuschak and Michael Nielsen have a pretty interesting online tutorial about quantum computing: https://quantum.country/qcvc


I can recommend it; it is a great primer on the subject.


I like the Qiskit community textbook: https://community.qiskit.org/textbook/



From appwiz's link in this thread:

Amazon Braket – A fully managed service that allows scientists, researchers, and developers to begin experimenting with computers from multiple quantum hardware providers in a single place. Bra-ket notation is commonly used to denote quantum mechanical states, and inspired the name of the service.
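
For readers who haven't seen it, bra-ket (Dirac) notation looks like this:

    |0>, |1>                     kets: the two basis states, written as column vectors
    <0|, <1|                     bras: the corresponding conjugate-transpose row vectors
    <phi|psi>                    a "bra-ket": the inner product of two states
    (|00> + |11>) / sqrt(2)      for example, a Bell state written in ket notation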


I was thinking that this name would make hallway conversations tougher (no, it's braket, without the "c"), but I'm guessing (and it's just a guess, I know nothing about this field) that the people actually interested in this service know what braket means.


They certainly would. Also, bra-ket would be pronounced "brocket", rather than "bracket".


I thought the same. Good job product marketing.


This is exciting and anxiety provoking at the same time. Just when I am starting to grasp the inner workings of cloud / kubernetes infrastructures, this is on the horizon. What a world we are in !


What does this have to do with Kubernetes? You should read the article.


You should read my comment again.


Are there any quantum simulating frameworks that I can download on my high performance gaming PC and start playing with simple quantum programs?



Funny that their homepage says "Rigetti Quantum Computers Are Now Available On AWS" :)


Rigetti is one of the 3 hardware providers for Amazon Braket, along with D-Wave and IonQ

https://aws.amazon.com/braket/hardware-providers/


Get ready for my start up LightningBolt. It has Quantum Computers available on Rigetti.


Serious question: does this mean any developer with a credit card can now break our strongest crypto?


No.

Even if Amazon had worked out an agreement with Google to run arbitrary algorithms on a 53-qubit computer, that would not be enough to run Shor's algorithm (which only breaks RSA and elliptic curve crypto) on normal keys. This is also pretending that Google's 53 qubits were full, error-corrected, logical qubits.


No. We’re still a long way from quantum computers powerful enough for that.


Indeed. We currently have machines with ~50 qubits, and we'd likely need tens of millions of qubits.


No, and there's no evidence so far that anyone on the planet has a quantum computer capable of running Shor's algorithm in any useful capacity.


. . . because if they had the necessary hardware, and expertise, they'd totally be dumb enough to leak evidence.


There is no evidence of quantum computers breaking ciphers en masse.


You're simulating a quantum computer with a classical computer. So if you can break crypto with a classical computer, then yes.


What happens to the information from your explorations and experiments?


From the FAQ it looks like it will be dumped into an S3 bucket


What types of problems are quantum computers used for?


None. Current hardware is way too small and noisy to run any known useful algorithm.


The shiny apparatus surrounding the hardware makes me think of The Talk, by smbc comics:

https://www.smbc-comics.com/comic/the-talk-3



That seems to have more information, so we've switched to it from https://aws.amazon.com/braket/.


Amazon seems to have missed the in-house opportunity to build quantum, so are partnering widely. It'll be interesting to see how Google squanders their tech lead in this space too...


So quantum computing is expected to be based on bra-ket notation? In that case, is it simply based on linear algebra and statistical sampling? I assume you can get that experience with low-level TensorFlow and a GPU for parallel computing.


> So quantum computing is expected to be based on bra-ket notation? In that case, is it simply based on linear algebra and statistical sampling?

Yes. You can simulate a 53-qubit quantum computer with a 2^53 complex-valued vector as input (about 20 petabytes), and a 2^53 x 2^53 complex-valued unitary matrix, which would take about a trillion exabytes to represent exactly. However, that is for a generic quantum operation on all 53 qubits, and some programs can be represented significantly more compactly.

> I assume you can get that experience with low-level Tensorflow and a GPU for parallel computing.

No. Most GPUs have at most about 13 gigabytes of onboard memory and would not be able to hold the matrix in memory. Also, GPUs still do not reduce the computational complexity of matrix multiplication; you still have to perform the full ~n^2.37 operations (using Coppersmith-Winograd). Though again, this is for a general unitary transformation; reductions can sometimes be made; this is how IBM was able to validate results computed by Google.

However, yes, this does mean that you can simulate a smaller quantum computer using just linear algebra and no particularly fancy tricks.
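
To make the "reductions can sometimes be made" point concrete: for a gate that touches only one qubit, simulators never form the full 2^n x 2^n matrix; they contract a 2x2 matrix against one index of the state vector. A minimal sketch of that trick (my own, assuming complex128 amplitudes and the convention that axis 0 of the reshaped tensor is the first qubit):

    import numpy as np

    def apply_1q_gate(state, gate, target, n):
        # View the 2**n vector as an n-index tensor and contract one index with the gate.
        psi = state.reshape((2,) * n)
        psi = np.tensordot(gate, psi, axes=([1], [target]))
        psi = np.moveaxis(psi, 0, target)
        return psi.reshape(-1)

    n = 20                                   # 2**20 amplitudes: ~16 MB, not petabytes
    state = np.zeros(2 ** n, dtype=np.complex128)
    state[0] = 1.0                           # |00...0>
    H = np.array([[1, 1], [1, -1]], dtype=np.complex128) / np.sqrt(2)
    state = apply_1q_gate(state, H, target=0, n=n)

    probs = np.abs(state) ** 2
    print(probs[probs > 0])                  # two nonzero outcomes, probability 0.5 each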


Yes, quantum computing is mostly linear algebra.


> Taken together, I think it is safe to say that most organizations will never own a quantum computer, and will find the cloud-based on-demand model a better fit. It may well be the case that production-scale quantum computers are the first cloud-only technology.

There was a magical moment when an automobile went from being a gimmicky horse replacement to an actual innovation.

I feel like this is a possible glimpse at something similar for the cloud.


I think the trend is reversed here. A lot of things are moving away from personal on-device computing to cloud, so it makes sense that for quantum we can jump straight to cloud, especially considering both the cost and practicality. Like, I don't remember any major trend in computing moving in the opposite direction in the past 10 years.



