
Does anyone else feel that quantum computing via the cloud is sort of the ultimate test of the "-aaS" (as a service) model?

I mean, quantum computing is something that today seems unlikely to be widely available or attainable in hardware for anyone but a handful of specialists and companies/organizations.

I wonder if someday, in the future, people will look back and see that classical computing spread because it was made accessible and ubiquitous, and we did not have the lust for centralization which we seem to have now. I wonder what kind of future this spells for quantum computing - will it continue to spread or will it be limited/stunted by being controlled by only the few?

This feels like it has the potential to be the ultimate kind of lock-in. If the way that the system and the hardware/software is exposed to the world is through cloud services, and the knowledge of how to build/operate/use quantum computers stays locked-in to only the privileged who can afford to have access to and utilize it...

Imagine if you started building on quantum computing technology, but then decided you wanted to switch... to what other option!?

I'm not trying to be a Luddite here; I think it's pretty amazing you can even access a quantum computer as a service. But I am being "that person" who asks the question... "hmm, where is this going?"






One of the questions folks tend to ask me is some form of, "when can I expect my cellphone to contain a quantum co-processor?" That gives me an excellent opportunity to tell them everything that I know about cryogenic refrigerators (which only takes a minute).

The chips we make (speaking directly of D-Wave, but afaik this is true of all superconducting QC efforts) would cost pennies if we produced them at scale. But the surrounding machinery is extremely complex, and very expensive to manufacture -- and scale would only get you so far. My rough understanding of refrigeration is that the temperature differential strongly depends on the length of the heat exchanger. Qubits are famously sensitive to noise; and blackbody radiation gives an inescapable dependence between noise and temperature. In short, a miniaturized fridge would be necessarily hot, and therefore too noisy to perform quantum computation!
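To put rough numbers on that noise/temperature dependence, here's a back-of-the-envelope sketch (illustrative textbook physics, not our production numbers): the mean number of blackbody photons at a typical superconducting-qubit frequency, from the Bose-Einstein occupation formula.

    # Mean thermal (blackbody) photon count at a superconducting-qubit
    # frequency, vs. temperature: n = 1 / (exp(h*f/(k*T)) - 1).
    # The ~5 GHz value is an assumption typical of superconducting qubits.
    import math

    h = 6.626e-34   # Planck constant, J*s
    k = 1.381e-23   # Boltzmann constant, J/K
    f = 5e9         # assumed qubit frequency, ~5 GHz

    for T in (0.02, 4.0, 77.0, 300.0):  # dilution fridge, LHe, LN2, room temp
        n = 1.0 / math.expm1(h * f / (k * T))
        print(f"T = {T:7.2f} K  ->  mean thermal photons ~ {n:.2e}")

At 20 mK the qubit sits essentially in its ground state (~1e-5 thermal photons); already at 4 K it would be bathed in them. That's the sense in which a warm fridge is a noisy fridge.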

So the sad news is that, barring some major developments, we may never have miniature quantum computers. In the foreseeable future, hardware costs will be measured in millions of dollars. So even if you're a millionaire, you probably don't want to buy a quantum computer. If you work for a university, a national laboratory, or a major corporation, you might try to convince your organization to purchase a quantum computer. If you succeed in that pitch, you'd almost certainly need to share it with your colleagues over a network.

So to me, an industry insider, it feels that public access to quantum computing is almost necessarily QCaaS.


I wonder if we had the same perspectives when computers were first established in the '50s/'60s.

We couldn't possibly have foreseen how those monsters of machines would one day fit in the palm of our hands, and yet, now, it's hard to believe we couldn't see that far in front of us.

We had this famous quote from IBM, but I wonder if this was a common perspective across the industry, or whether those who were building these machines could foresee where we'd be now. Is there someone in quantum computing who has the intelligence and creativity to think, nah, we'll have giant refrigerated quantum computing mobile phones in five decades' time? (I understand quantum computers wouldn't make great phones; I'm just drawing baseless comparisons :) )

"I think there is a world market for maybe five computers."

Thomas Watson, president of IBM, 1943


I only found out about wolframalpha a month ago and I'm still in awe of it. Quantum computing as a service? No idea what I'm going to want to do with it. But I re-watch this video by Veritasium and Andrea Morello from UNSW a couple of times a year to just remind myself how much I don't know.

https://www.youtube.com/watch?v=Auha-gXTiqU


Just as an aside, that quote is likely apocryphal. No one has ever been able to find actual evidence that he said this, and those close to him have rejected it as a false attribution. It was being recognized as a myth way back in the 1970s.

Even if he had said it, it would have been a very accurate statement at the time. Gordon Bell has noted that, at the time Watson is claimed to have made that statement, it would have held essentially true for a decade. As something likely to have been said (if said at all) in discussions of IBM's near-future business plans, or as a sort of market analysis of the conditions at the time, it makes perfect sense.


See, this is why I would suck as a contestant on QI. Thank you. :)

Cooling chips to around room temp was an evolutionary process. Cooling chips to near absolute zero is another ball of physics entirely. I suspect it is like trying to have a pocket-sized NMR machine, which needs liquid helium cooling, etc. Those things can't just be scaled down to pocket size.

What about photonic quantum computers? You can access quantum mechanical effects at room temperature with photonics. I don’t know if it’s more difficult to use them for computing though

Yes, because photons don't interact with each other

Are you supporting or refuting my thesis? Assuming the latter, just because photons as particles don’t “interact” (by which I presume you mean they are bosons) doesn’t mean you can’t get interesting multi particle quantum results with photons. See for example https://en.m.wikipedia.org/wiki/Hong–Ou–Mandel_effect
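For concreteness, here's a toy amplitude calculation of that effect (my own sketch, using the standard beamsplitter conventions): two indistinguishable photons entering opposite ports of a 50:50 beamsplitter never exit from different ports, because the two coincidence paths interfere destructively.

    # Hong-Ou-Mandel in a few lines of amplitude bookkeeping.
    import math

    t = 1 / math.sqrt(2)    # transmission amplitude
    r = 1j / math.sqrt(2)   # reflection amplitude (90-degree phase shift)

    # Indistinguishable photons: the "both transmit" and "both reflect"
    # coincidence paths add coherently and cancel.
    amp = t * t + r * r
    print(abs(amp) ** 2)    # 0.0 -- the photons always bunch

    # Distinguishable photons: the paths add as probabilities instead.
    print(abs(t * t) ** 2 + abs(r * r) ** 2)   # 0.5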

I'm not sure if it's up to the task (yet?), but thermoelectric coolers are routinely used for CCDs that operate below -80 °C (an overview: https://www.azom.com/article.aspx?ArticleID=14681). I suspect someone will come up with a solution once the technology starts to mature.

At those temperatures things get weird. Cryocoolers are established technology, but anything colder than the xx K range very quickly stops being easy in any way.

That said, LH2 temperatures aren't really hard and can easily fit in a 2U rackmount device, enabling power and classical-RF uses of type-II superconductors. Think EMI shielding, power conditioning, ~50 GHz traces that can span a full backplane without fancy signal conditioning, etc.


Thermoelectric coolers are not even able to keep up with modern CPUs. Or rather, they can, but they are very energy-hungry: as much as or more than the CPU itself.

Mostly: thermoelectric coolers are bad for >50 K differentials and >10 W heat flux on the cold side. Most uses are better off with a Stirling cooler or maybe even an absorption refrigerator, which can have no mechanical parts (just fluids, plumbing, and heat exchangers) and could theoretically provide human-centric AC/refrigeration based on server waste heat.
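To put a floor under "energy hungry": any cooler, thermoelectric or otherwise, is bounded by the Carnot coefficient of performance, and the bound gets brutal as the cold side approaches absolute zero. A rough sketch (ideal limit; real cryocoolers typically achieve 1-10% of this):

    # Minimum input power to pump Q_cold watts of heat from T_cold to T_hot:
    #   P_min = Q_cold * (T_hot - T_cold) / T_cold   (Carnot limit)
    T_HOT = 300.0   # K, room-temperature heat sink

    for t_cold, q_cold in ((250.0, 100.0), (77.0, 10.0), (4.0, 1.0)):
        p_min = q_cold * (T_HOT - t_cold) / t_cold
        print(f"extract {q_cold:g} W at {t_cold:g} K -> >= {p_min:,.0f} W input")

Extracting 1 W at 4 K already costs at least 74 W at the wall in the ideal case, so hundreds to thousands of watts in practice; and millikelvin stages move only microwatts.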

Forgive my ignorance with this question. Would it be possible to run these quantum chips in space? Space is cold and quiet so maybe that’s cheaper at scale.

In general, you shouldn't think of space as being cold in the intuitive hot-to-cold heat-transfer sense. For example, the metallic side of a spaceship would not be anywhere near 0 K, whereas a metal plate between a liquid at around 0 K and your hand would be.

It is very hard to dissipate heat from a solid object into space. Our bodies are another matter, but that has more to do with pressure: if you expose cells to the void of space, most of the liquids inside would quickly expand and essentially boil, consuming large amounts of heat in the phase transition from liquid to gas and thus rapidly cooling the surrounding tissue. You could theoretically use this for evaporative cooling, but you would have to transport vast quantities of water that would quickly be used up, since there is most likely no hope of collecting it back.


Shedding heat in space is a huge problem because you can only radiate it. Depending on the refrigeration requirements there may be too much excess heat to make it feasible.
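Rough numbers, assuming an idealized panel that can only radiate (Stefan-Boltzmann law, ignoring absorbed sunlight and view-factor losses):

    # Radiator area needed to reject waste heat in vacuum:
    #   P = emissivity * sigma * A * T^4
    SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
    EPS = 0.9         # assumed emissivity of the radiator coating

    def radiator_area(p_watts, t_kelvin):
        """Panel area (m^2) to radiate p_watts at surface temp t_kelvin."""
        return p_watts / (EPS * SIGMA * t_kelvin ** 4)

    # e.g. a cryocooler rejecting 10 kW of waste heat from a 300 K panel:
    print(f"{radiator_area(10_000, 300):.0f} m^2")  # ~24 m^2, before sun loading

And it scales inversely with T^4, so if you want the radiator itself to run cold, the required area explodes.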

You'd probably lose a lot of the advantages because you lose the naturally extremely effective radiation shielding of the Earth's magnetic field and atmosphere.

You'd probably always have to stay in the shadow of some celestial body, and the 3 K microwave background would also be an issue that would have to be solved.

Maybe something a tiny little black hole could fix?

There’s just got to be a way, there’s no way we can just give up.

I'm fairly sure that decades ago, people assumed we might never have tiny computers in our pockets, with the same certainty you have now.

This is what it means to be crazy enough to change the world.


On the other hand, many people assumed we'd have flying cars and colonies on Venus.

Oh, and fusion reactors.

Edit:

Now that I think of it, there was a lot of variation in predictions of computing. But I would say that it's been pretty common for science fiction to describe technology 30 years out somewhat accurately, probably because it has inspired the actual tech in a self-fulfilling way.

So, in the 40s, a spaceship was envisioned able to carry only calculators and slide rules, with a radio link to a big central computer. That wasn't far off of how things developed in the 60s and 70s. But I think by the 60s and 70s, people were imagining pocket computers and tablets and such and that had a huge effect on people actually designing them when it was possible.


If people’s assumptions tend to be wrong then we can expect to someday have mini quantum computers since people assume they are impossible.

See my addition. My point is that some predictions are right and some are wrong.

> There’s just got to be a way, there’s no way we can just give up.

By no means do I intend to discourage progress! I wouldn't do the work that I do if it wasn't so difficult. I did hedge, a bit: "may never," "foreseeable future," "barring some major developments."

The fridge is just one major obstacle. There's a plethora of physics, engineering, and mathematical challenges out there impeding progress. Get to work!


> This feels like it has the potential to be the ultimate kind of lock-in. If the way that the system and the hardware/software is exposed to the world is through cloud services, and the knowledge of how to build/operate/use quantum computers stays locked-in to only the privileged who can afford to have access to and utilize it...

Well, consider the opposite suggestion: right now, quantum computer access is fairly constrained. Making these systems available to anyone who has a credit card is a wider democratization of access to them, not a constraint.

After all, cryogenics and such isn't exactly free or easy to maintain - access is going to be somewhat controlled unless/until these things are to the point where you have one in your phone.


Yes, I agree with your point, I was kind of vacillating back and forth.

I think it's incredible that the inventors of this technology are pushing so hard to get it in the hands of people who can apply it. So in that regard, I completely agree, it is a better situation to have the technology available for a reasonable price and payment option.

I think part of my hang-up with this is thinking about and remembering how much I was amazed by what we could do independently, before the cloud vendors. It seems like we are in the part of the cycle that is encouraging centralization, but I don't know how or if we'll ever be able to exit this phase (or maybe we won't have to).

The scale and complexity of what the cloud vendors offer, compared to what independent organizations can do, is truly mind-blowing, and becomes more so every day. How does one even compete (or why would one want to) against these "utility" technology companies?


Totally agree that there's some loss of control when we allow cloud vendors to be the middlemen in all things; but honestly, this is probably one of the few cases where it's the perfect missing ingredient to be able to give access to what are fundamentally time-sharing systems to the widest number of people possible.

This is like dialing up to use the university PDP-11 in the 70s, basically. Big things are coming.


We were in a similar situation when computers were built with vacuum tubes. The anomaly was the transistor. Maybe there is a quantum equivalent, but we haven't found it yet.

The Quantristor?

Yeah, but at the same time, how easy is it today to build something with a Raspberry Pi or deploy software onto thousands of IoT devices?

> cryogenics and such isn't exactly free or easy to maintain

Room-temperature superconductors and other technologies exploiting strange edge conditions are theoretically possible, but quantum computers are useful for only a few specialized tasks.


Hard to say. In 1960 classical computing would have seemed totally centralized and unattainable in distributed form for the masses. Who knows what the future may bring for quantum computing, assuming there is actually a conventional use case for the masses (most people probably don't have a burning urge to factor large numbers at home the way they do to, say, play video games).

I agree. I'm trying to see this through that perspective when we were at that point in the cycle where everything was centralized. I'm wondering if there will ever be a time (or even a reason) for things to go back the other way again.

Also, agreed about the use case - sometimes I get the feeling that quantum computing is a solution looking for a problem (but I am sure that must not be the case). That said, I think things are partially that way because quantum computing is just such a different paradigm; truly taking advantage of it requires a pivot in thinking, but great dividends may be possible as a result.

My thought is, it's kind of like how we learned about what FPGAs could do. Different paradigm, incredible opportunity.


Privacy, offline access, low latency - these are all excellent use cases for edge computing. Once it's time to do some heavy lifting, though, it makes a lot more sense to centralize. Decentralization gives you control along with responsibility, so the cycle goes something like this:

* Decentralized as a part of early development

* Centralized for ease of early deployment

* Decentralized once it becomes simple / commodity enough that everyone can just have one

* Recentralized once it's cheaper to run them all centrally again

And then you only break back out once the thing you're doing fundamentally changes for some reason.


Disclaimer: I work at AWS, but this post isn't being made in any sort of official capacity, I have no relation to the team in question (this is the first I've even heard of the service), and the opinions here are entirely my own and not necessarily a reflection of that of my employer.

> I wonder what kind of future this spells for quantum computing - will it continue to spread or will it be limited/stunted by being controlled by only the few?

I feel like this is a step in the right direction, though. Right now using quantum computers is totally outside of the realm of possibility for the vast majority of people - they're simply too expensive in materials cost, expertise to create, conditions for operation, etc. etc. etc. - without services like this one. The only chance an "everyday" person has to try out a quantum computer is to rent time on someone's else's.

I don't think at a similar point in the life of classical computers we had options like this that were readily available - you could rent time on the computers, but I can't imagine that getting access to them was as easy as it will be today with the internet being a thing and service providers offering high granularity on billing.

My understanding (and I'm not even remotely an expert, so I could be totally off base here!) is that it's an open question on whether or not quantum computing will ever even be doable in environments where classical computing works - it might not be within the realm of what physics allows for it ever to be possible to have a quantum computer powered smartphone.

I hope access is ubiquitous someday for people, but in general I feel like this is a good step while that's not practical.


> an "everyday" person has to try out a quantum computer

what would an everyday person do on a qc?


Well, practical QC seems to be involved with optimization problems. D-Wave recently demonstrated doing something with bus routing for Volkswagen. I could imagine, say, a map service scaling that out by integrating QC into their route-finding for drivers to cooperatively improve traffic flow by finding optimal solutions to problems of a scale that is intractable with classical systems.

The everyday person will use QC like they "use" machine learning today: from a very high level abstract viewpoint, where services they consume have a little bit of intelligence that makes interacting with them more efficient.
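For anyone curious what such a problem looks like to the programmer: annealers like D-Wave's take problems in QUBO form (quadratic unconstrained binary optimization) -- minimize a quadratic function of binary variables. A brute-force toy example of the same form (my own illustration; real instances are far too large to enumerate like this):

    # Tiny QUBO: minimize sum of Q[i,j] * x[i] * x[j] over binary x.
    # This Q is a made-up instance: reward picked items, penalize adjacent pairs.
    from itertools import product

    Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1,   # reward each picked item
         (0, 1): 2, (1, 2): 2}                 # penalize adjacent picks

    def energy(x):
        return sum(c * x[i] * x[j] for (i, j), c in Q.items())

    best = min(product((0, 1), repeat=3), key=energy)
    print(best, energy(best))   # (1, 0, 1) -> -2: pick items 0 and 2

An annealer samples low-energy solutions of exactly this kind of objective in hardware, rather than enumerating 2^n assignments.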


Yeah, but what kind of optimization problem where an exact solution is intractable also doesn't have approximate algorithms that are good enough?

D-Wave has never demonstrated quantum speedup. Many doubt that their approach can be useful, even in theory.

Meh, sounds like overly negative propaganda to me. Clearly they're building up a big body of knowledge and have a lot of potential, as long as someone can figure out a practical application for the kinds of optimization problems their machine is good at.

It seems like neural networks should map to it well. Once the degree of connectivity and the number of qubits approach the millions, there's no way any normal software solver is going to be able to keep up with it.


Facebook

It's an invitation from Amazon to quantum computer makers to show their tech's potential. In the end they'll buy the winner and the others will fail. They'll also have first dibs in case someone comes up with another useful quantum algorithm in their cloud. It feels like a defensive land grab of sorts (like neuromorphic computing).

Amazon seems to be going -aaS on everything... but how long can this last? Computing hardware evolution has slowed, but it hasn't halted, and eventually robust hardware will become cheap enough for competitors to commoditize servers once again (as should be the norm).


This is a reasonable take. Consider that IBM, Microsoft, and Google - the credible cloud competitors to AWS - all have their own in-house QC efforts. AWS is outsourcing theirs... they make it sound like it's going to be a marketplace model, but you're right, if/once the profit motive is in place they'll snap up the supplier.

Of course, if nothing productive materializes, they're also out nothing. Sort of a win-win for them.

The question I have is whether we'll see things like SageMaker and other higher level machine learning features use the quantum computers on the back end.


Technically, it is similar to what happened with computers: computers of old were warehouse-sized, in tightly controlled environments. The term "debugging" originally referred to the physical process of actually removing bugs from a computer. Lol. As computers became more affordable and smaller, they showed up in more places and allowed more people to use them. It appears the quantum computers of today are following the traditional evolution of computers. 40 years ago, people were able to rent time on university/business computers, similar to what AWS Braket is offering. The real question for me is: what is the "PC" of the quantum realm? What will a quantum computer look like 40 years from now? I can't even imagine what it would look like, or what programming a quantum machine will even look like for the masses (AKA the Python/Ruby/Java/C++, etc.).
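As a taste of what gate-level quantum programming looks like today (a minimal hand-rolled statevector sketch, not any particular SDK's API; services like Braket have you build circuits gate by gate in much the same spirit):

    # Simulate a 2-qubit Bell-pair circuit with plain linear algebra.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    I2 = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],   # control = first qubit
                     [0, 0, 0, 1], [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
    state = np.kron(H, I2) @ state                 # put qubit 0 in superposition
    state = CNOT @ state                           # entangle the pair
    print(np.abs(state) ** 2)  # [0.5 0 0 0.5]: measure 00 or 11, never 01/10

Whether the mass-market version looks like this, or gets buried under a high-level abstraction the way GPUs did, is anyone's guess.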

It won't hit PC levels until either cryogenics is commodity-sized and priced for the home market, or room-temperature superconductors arrive and prove themselves still able to exhibit the Weird Quantum Effects that cold ones have. Look up "Josephson junctions"; they're basically the transistors of QC.

My ideas are very fuzzy, but when people say that you can't run quantum computations at high temperature because of noise, it reminds me of something I read not too long ago about how scientists are starting to have insight into how biological systems operate in the presence of large amounts of noise from heat. So I wonder if there's a trick to it that nature already has found.

I'm no scientist but I would be very surprised if the brain is not harnessing similar effects or processes somehow. Consider microtubules, for example.

Quantum on Rails

We had 3-4 decades ('40s-'70s) of 'lock-in' when conventional computers were essentially 'big iron' that only large organizations had access to. If quantum computing ends up having useful general-purpose applications, I'd expect it to start going down-market once some combination of the implementation issues have been worked out, the big/easy money has been mined from government/corporate customers, and the patents expire.

> people will look back and see that classical computing spread because it was made accessible and ubiquitous,

I don't know if that is true. I mean, it was true eventually, but in the beginning it was limited to well-capitalized businesses.

I mean, even in the "ubiquitous" period, it wasn't that accessible. In 1984, an IBM PC cost about $5,000 (in 1984 dollars). That would be about $12,300 in today's dollars. Not out of reach for all, but certainly attainable only by the upper middle class at best.




