The Machine of Tomorrow Today: Quantum Computing on the Verge (bloomberg.com)



Since questions about crypto came up, here's a summary paper by Daniel Bernstein about it:

https://eprint.iacr.org/2017/314.pdf

Symmetric cryptography would be safe even with quantum computers (Grover's algorithm merely halves the effective bit length of the cipher, which can be compensated for by doubling the key length). Classical asymmetric cryptography would be in danger, but there are alternative systems for which no quantum attacks are known. They can be a bit unwieldy to use and have much larger key sizes than what we currently use, but quantum-resistant methods do exist. The Lamport signature scheme is probably one of the easiest to understand.
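
To make Lamport signatures concrete, here's a toy Python sketch (illustration only, not hardened code): the private key is 256 pairs of random secrets, the public key is their hashes, and a signature reveals one secret of each pair according to the bits of the message digest. Security rests only on hash preimage resistance, which Grover's algorithm merely dents quadratically. Note that each key must sign exactly one message.

    import os
    import hashlib

    def keygen(bits=256):
        # Private key: one pair of random 32-byte secrets per digest bit.
        sk = [(os.urandom(32), os.urandom(32)) for _ in range(bits)]
        # Public key: the hash of every secret.
        pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest())
              for a, b in sk]
        return sk, pk

    def digest_bits(message):
        d = hashlib.sha256(message).digest()
        return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

    def sign(message, sk):
        # Reveal one secret from each pair, chosen by the message-digest bit.
        return [sk[i][bit] for i, bit in enumerate(digest_bits(message))]

    def verify(message, signature, pk):
        return all(hashlib.sha256(s).digest() == pk[i][bit]
                   for i, (s, bit) in enumerate(zip(signature, digest_bits(message))))

    sk, pk = keygen()
    sig = sign(b"hello", sk)
    assert verify(b"hello", sig, pk)
    assert not verify(b"goodbye", sig, pk)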

Not all hope is lost for cryptography even with quantum computers.


As someone who works in the field of quantum computation, it's actually a little unpleasant that crypto always comes up.

Crypto is interesting and quantum computers do have a part in it, but it's one of the least interesting things to talk about when it comes to what quantum computers can do.

It's as if, at the advent of the regular computer, the only application anyone talked about was accounting. That's a real application, but there are so many other interesting things a computer can do.

To offer some fodder for further discussion, how about molecular and atomic simulation? Wouldn't it be nice if we could just simulate the behavior of a molecule to predict what useful properties it has? One could imagine just exhaustively searching for molecules of interest on a quantum computer.

Or how about using a quantum computer to bootstrap itself? Using a QC to design a better QC is well within the realm of its utility.

Or how about solving some of these nasty computer science problems that rear their heads everywhere, like graph coloring (or any of its equivalent incarnations)?

Factoring integers and searching boxes are really interesting developments in the theory of quantum computation, but the above stuff, to me, sounds a lot cooler.


Good point. NIST maintains a list of 50 or so algorithms that provide speedups when implemented on a quantum computer: http://math.nist.gov/quantum/zoo/.


I enjoyed your post, but I have a disagreement. Graph coloring is NP-complete or NP-hard depending on what question you're asking, and Grover's algorithm for unstructured search only provides a quadratic boost. If you're pulling from the papers on adiabatic evolution, those are simulated small instances, with acceptance probabilities that would likely not be economical in real-world scenarios.

For quantum computers to solve NP-complete problems in polynomial time would be highly surprising and seems just too good an outcome, especially considering the work showing exponential slowdowns for simple problems that defeat local-search algorithms like quantum annealing.

That said, there will likely be many practical problems for which quantum annealing provides speedups that vastly outstrip classical computers.


Can we use QC for deep learning?


> Wouldn't it be nice if we could just simulate the behavior of a molecule to predict what useful properties it has?

We already do this with classical computers to a significant extent. It requires a lot of approximations, but it works fairly well for several useful applications, such as drug/protein binding prediction. I am really looking forward to quantum co-processors to help with this kind of calculation.


> Classical asymmetric cryptography would be in danger, but there are alternative systems for which no quantum based attacks are known.

What about asymmetric crypto schemes executed on a quantum computer? All the discussions I've heard so far assume the attackers have a quantum computer and everyone else has classical computers. To me, this seems like an unreasonable assumption. Won't there be a way to use a quantum computer to do asymmetric encryption in a way that makes it hard even for another quantum computer to reverse?


There will be a long period of time (probably at least a decade, maybe even several) where large government agencies have quantum computers, maybe large businesses, but most people (and many small businesses) don't.

(Remember that "computers" now includes your phone. I'm not sure how soon quantum chips come to phones...)


What about D-Wave computers?


If I understand correctly, D-Wave is not a general-purpose quantum computer. It only works on certain specialized problems. And even there, there's still a question of whether it actually works faster than a non-quantum computer.


The forecast for hash functions is also optimistic (same deal as symmetric encryption with respect to Grover's algorithm).

Not all asymmetric-key cryptography is in danger either. Lattice-based and McEliece encryption schemes have no known weakness to quantum attacks.


Besides scientific applications, I don't really see quantum computers being computationally useful. There's breaking cryptography, but then you're spending billions just to inconvenience people into moving to a different, quantum-resistant algorithm.

What about creating cryptography? Wikipedia has a few interesting applications:

https://en.wikipedia.org/wiki/Quantum_cryptography

I'm not a cryptographer - are security people excited about building all sorts of weird quantum protocols?

> “Then you let the computer check all possible solutions essentially—or a very large combination of them—and come back with an answer,” he says. In a quantum computer, there’s no mathematician cracking the problem, he says. “The laws of physics crack the problem for you.”

That sounds more like an NTM than a quantum computer. An NTM would turn its owner into a God.

> Superposition is the mind-bending observation that a particle can be in two states at the same time. Bring out your ruler to get a measurement, however, and the particle will collapse into one state or the other. And you won’t know which until you try, except in terms of probabilities.

It's not quite probabilities, because people think of probabilities as real numbers; the coefficients here are complex numbers. There's a probability monad, for example, but no quantum monad:

http://twistedoakstudios.com/blog/Post5485_what-isnt-a-monad

The article sort of mentions it later on when they quote Scott Aaronson, but I suspect the author didn't understand and mentally reverted back to a probability model.
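
A toy sketch of the distinction in plain Python: the amplitudes are complex, and probabilities only appear at measurement, via squared magnitudes (the Born rule).

    import cmath
    import random

    # A qubit is a pair of complex amplitudes, not a pair of probabilities.
    alpha = 1 / 2 ** 0.5
    beta = cmath.exp(1j * cmath.pi / 4) / 2 ** 0.5  # carries a complex phase

    # Probabilities appear only at measurement, as squared magnitudes:
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    print(p0, p1)                            # 0.5 0.5 -- the phase is invisible here
    print(0 if random.random() < p0 else 1)  # one simulated collapse

    # Unlike real probabilities, amplitudes can cancel (interference):
    a, b = 1 / 2 ** 0.5, -1 / 2 ** 0.5
    print(abs(a + b) ** 2)                   # 0.0: the branches destructively interfere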


I think one of the killer apps is the purpose Feynman originally envisioned – simulating quantum systems: https://people.eecs.berkeley.edu/~christos/classics/Feynman....

I wasn't able to dig up too much, but the introduction to a paper [1] by some Google folks provides a bit of insight into what might be possible:

> In particular, quantum simulation of molecular energies, which enables numerically exact prediction of chemical reaction rates, promises significant advances in our understanding of chemistry and could enable in silico design of new catalysts, pharmaceuticals, and materials.

[1] https://journals.aps.org/prx/pdf/10.1103/PhysRevX.6.031007


Seth Lloyd gave a talk at the Long Now Foundation in 2016 about his research into quantum computing. He made the point that his research is (partially?) funded by the NSA, which was great because the NSA's position on quantum computing/cryptography was basically "we would prefer that quantum computers be impossible, but if they are possible, then we want the first one." So every time he'd go to them and say "well... we haven't cracked the nut quite yet," they would turn around and say "great!", which is quite the unusual funding situation.

edit: link to talk

http://longnow.org/seminars/02016/aug/09/quantum-computer-re...


> "well... we haven't cracked the nut quite yet," they would turn around and say "great!", which is quite the unusual funding situation.

That's a bit like insurance: Your house hasn't burned down yet. Great, here is another insurance premium for you.


> I'm not a cryptographer - are security people excited about building all sorts of weird quantum protocols?

No.

In reality so-called quantum cryptography is completely impractical. Imagine an Internet where you can't send messages to people more than a few hundred kilometers away, with no wifi and no mobile Internet and which only works if you have pre-shared keys. That's the Quantum Internet (which apparently the EU just decided to fund with billions).


>> > I'm not a cryptographer - are security people excited about building all sorts of weird quantum protocols?

> No. In reality so-called quantum cryptography is completely impractical

What about building quantum safe protocols? Is that as a field any more interesting and/or viable?


> What about building quantum safe protocols? Is that as a field any more interesting and/or viable?

Yes. Plenty of stuff happening in that field.

The challenge is: right now we can choose between probably safe but impractical (McBits, SPHINCS) and practical but with big question marks about how safe they are (NewHope, SIDH). The problem is that we really only trust crypto algorithms once they have been tested and researched over a long time, and lots of these schemes are pretty new.


That's called post-quantum cryptography, not quantum cryptography.


>> What about building quantum safe protocols?

> That's called post-quantum cryptography, not quantum cryptography.

As I understand it, "quantum safe" and "post-quantum" describe the same thing.

I never used the term 'quantum cryptography'.


I think I meant to respond to the parent comment, not yours. But that was hours ago.


Isn't this IDEAL for secure computing?

Really, it's not that much different from how things work now. When we send a message around the world, it's going through dozens of computers to get there. And wifi can always be an endpoint extension.


What is an NTM? Google search turns up Nontuberculous mycobacteria and a French rap group.


Nondeterministic Turing Machine


That might actually be a good name for a rap group. Or maybe punk...

But: Nondeterministic Turing Machine? What? How can it be nondeterministic, and still be a Turing Machine? Aren't they pretty much deterministic by definition?


It's nondeterministic in the same sense as the "nondeterministic polynomial" of NP: the machine may branch into many computation paths at once and accepts if any path accepts. A deterministic Turing machine can simulate that, but (as far as we know) only with exponential slowdown; if P = NP, the slowdown would be merely polynomial.
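
To make that concrete: a deterministic simulation of an NTM's "guess" is just exhaustive branch exploration, which costs exponential time. A toy SAT search in Python (hypothetical clause encoding):

    from itertools import product

    # An NTM accepts if ANY branch accepts; a deterministic machine must try
    # them all. Clauses are lists of signed 1-indexed ints: -2 means "not x2".
    def sat_brute_force(clauses, n_vars):
        for assignment in product([False, True], repeat=n_vars):
            if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
                   for clause in clauses):
                return assignment  # some branch accepted
        return None                # no branch accepts

    # (x1 OR NOT x2) AND (x2 OR x3):
    print(sat_brute_force([[1, -2], [2, 3]], 3))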



Quantum computers have a lot more utility than factoring integers. How about simulating molecules, or solving classically hard optimization problems?


Which optimization problems are you thinking of?


Graph coloring, SAT problems, eigenvalue minimization. Almost any binary problem where you can write down a cost function.


Again, you're thinking of an NTM, not a quantum computer. Only some number-theoretic problems are known to get exponential speedups:

https://complexityzoo.uwaterloo.ca/Complexity_Zoo:B#bqp


Are you thinking of the adiabatic QC approach (i.e., the D-Wave machine)? I think they still haven't been able to show that it's faster than a classical approach for any problem.


Does anybody know of any good links / blog posts that explain the actual inner workings of a quantum computer in more detail?


Sure. Here are a few resources; they vary in technical complexity and depth of coverage, but they're all rigorous:

Scott Aaronson has a good introductory article in Scientific American[1], and he also wrote a short, accessible article aimed at high school students in 2002[2].

Then there's Berkeley's CS 191 Qubits, Quantum Mechanics and Computers syllabus and lectures[3].

Tel Aviv University has lecture notes for the 1997 Quantum Information Processing course online[4]. These are old, but the information is still fundamentally accurate if what you're looking for is a breadth-wise introduction to the topic.

There's the paper, From Cbits to Qbits: Teaching Computer Scientists Quantum Mechanics on arXiv[5].

I assume you have a computer science background; if that's the case, you might want to start with the first two for light reading on the relationship between classical and quantum computing, then dive into one of the lecture-note series. The first two resources are much more digestible; if you're looking for something to read in 10 minutes or so and come away with a basic understanding of the field, those are what you want to read.

_________

1. http://www.cs.virginia.edu/~robins/The_Limits_of_Quantum_Com...

2. http://www.scottaaronson.com/writings/highschool.html

3. http://www-inst.eecs.berkeley.edu/~cs191/sp05/

4. http://www.tau.ac.il/~tsirel/Courses/QuantInf/syllabus.html

5. https://arxiv.org/abs/quant-ph/0207118


Note that all of these links appear to concentrate on abstract computation principles analogous to learning about computation in a CS course. I don't think they actually describe the physical inner workings of a quantum computer.


That's a good point; I wasn't thinking of physical-systems details. In that case, better resources would be the Communications of the ACM article A Blueprint for Building a Quantum Computer[1] or D-Wave's Introduction to D-Wave Quantum Hardware article[2].

(I don't have a stance on D-Wave's legitimacy, but the article is illustrative either way).

_______

1. https://cacm.acm.org/magazines/2013/10/168172-a-blueprint-fo...

2. https://www.dwavesys.com/tutorials/background-reading-series...


I don't think the D-Wave article is very good. To pick one choice line, "The term 'Interference' refers to the electrons - which behave as waves inside a quantum waves, interference patterns which give rise to the quantum effects." If someone didn't know what interference was, that sentence certainly wouldn't help them! D-Wave's article also doesn't give a sense of how unsettled this field is, and how many plausible options there are for every element of a quantum computer. I detect the influence of marketing.

The ACM article is much better.


Preskill's course notes from Caltech are also very good.


Thanks, they're on my reading list!


For some great beginner/advanced-beginner level stuff, check out the Veritasium channel on YouTube (search 'quantum computing' within the videos list). The interviews with Andrea Morello of UNSW are especially good.

And PBS Infinite Series, also on YouTube, has some good mathematical explanations of quantum computation and relevant algorithms.


Every article I read focuses on how quantum computing differs from classical computing at a high level. I know these are probably asinine questions, but how are these computers actually used? How are experiments fed to the machine? Do these machines have I/O? Does the state of the qubits have to be measured with separate equipment?


These "computers" are just experiments in a lab right now. The qubits are prepared and read out by highly specialized equipment that differs depending on the substrate being pursued (e.g., lasers for trapped ions, cryogenic amplifiers for SQUIDs, etc.). To make an analogy with classical computation, imagine wiring the very first transistors together by hand and computing 2+2 using a voltage supply and an oscilloscope.


Rigetti's quantum computers can be operated by way of the means discussed by other commenters (toothpicks and tweezers at the oscilloscope), or can be loaded with a quantum instruction code called Quil [0], which can be written with Python using pyQuil [1,2]. The former is preferred by physicists doing physics experiments, and the latter by software engineers and application developers.

[0] https://arxiv.org/pdf/1608.03355.pdf

[1] http://forest.rigetti.com/

[2] https://github.com/rigetticomputing/pyQuil
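
To give a flavor, here's a minimal Bell-pair program written against the pyQuil API as it existed around the time of this thread (the connection class and method names below are era-specific and may differ in later releases):

    from pyquil.quil import Program
    from pyquil.gates import H, CNOT
    from pyquil.api import QVMConnection  # pyQuil 1.x-era simulator endpoint

    # Entangle qubits 0 and 1: H then CNOT gives the Bell state (|00> + |11>)/sqrt(2).
    p = Program(H(0), CNOT(0, 1))

    qvm = QVMConnection()
    # Each trial should return correlated bits, [0, 0] or [1, 1], roughly half the time each.
    print(qvm.run_and_measure(p, [0, 1], trials=10))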


This is a solid question and it's worth an answer. There are four things I want to get to: 1. the hardware is highly varied; 2. there are some fundamental operations which any hardware has to support in order to be a quantum computer; 3. the actual "programs" consist of precisely timed electrical signals turning those fundamental operations on or off in order, followed by (and occasionally interleaved with) destructive "blasts" which actually tell you what's going on inside; 4. it helps to revisit 2 and 1 once you know some specific examples.

1. You know how there's a lot of different hardware implementations for computers? There's Babbage engines, there's the silicon that we use every day, once upon a time there were vacuum tubes and such... we get a lot of freedom because the definition of computation, e.g. the definition of a Turing machine, just involves a sort of abstract manipulation of symbols but does not say anything about how those symbols are embodied in reality. The same bits that are high/low voltages in your CPU or copper cable can be flashes of light in a fiber-optic cable or aligned spin-domains in your spinning hard-drive or whether current can flow through a channel in your flash SSD. The oldest trick in the book is to represent one set of "logical" bits as a different pattern of "physical bits" (which then get represented their own way downstream) in order to correct errors or transmit heavily biased signals without net voltage bias or whatever.

Same with quantum computers. We don't know which hardware is going to scale so we've got spin qubits and superconducting flux qubits and cavities with super-high-quality mirrors pointing inward which trap photons, and all sorts of other qubits. But you can think of these as small little isolation chambers which must be shielded from each other and the rest of the world, maybe microscopic, perhaps all in a row. We have many candidates for what goes in the box and how the following operations are done.

There are some possibilities for alternative designs of "quantum computers" that are not the conventional design; for example, D-Wave is trying to do the quantum annealing thing, where you have lots of these things all connected. But researchers like the following approach because it is very concrete and has separable concerns which can be separately addressed.

2. There are some fundamental operations which these little rooms need to be able to do.

First off, we have the entanglement operations: we need to be able to shove these rooms together so that they affect each other. When shoved together, what matters is a set of 4 energy levels, E00, E01, E10, E11: how much interaction energy there would be if both bits were forced to 0, if one bit was forced to 0 and the other forced to 1, and so on. The absolute values of these energies (not just their differences) also matter, relative to the baseline energy B that all the other qubits experience while this happens.

The shoving operation is actually allowed to be pretty much anything that generates a "nontrivial" set of E00, E01, E10, E11, because it's the hardest thing to get right and yet only one little property of it ("quantum entanglement") is really needed.

The next operations are the room-phasing operations. These apply to any room by itself, and come in two forms: you can either raise the energy of the room relative to all of the other rooms (which only affects things if this room is entangled with the other rooms), or you might create an energy difference between the 0 state and the 1 state, or so. It turns out that you can think of all of these as a rotation of a sphere[1], so a very common model is to have "pulses" of light or magnetism or whatever in two different physical directions, with very precise control over how long that pulse lasts, to give full ability to twist these spheres that way.

The final operation is the readout operation, which requires "blasting" the qubit somehow into some really big system which can report and process the measurement. The nice thing about blasts is that there is a nice chunk of theory which says that readouts don't actually have to happen at any particular time, though you might need them to happen after some entanglement step or so. And if it is too hard to blast away a room to get a bit of classical information and feed that back to all of the lasers or whatever that are manipulating this system from these different directions, there are quantum versions of all the classical logic gates which can be implemented on the thing instead.

The "blasts" are certainly different from the "pulses", the "pulses" need to basically have no real input or output, they just need to change the local environment a little bit. The "blasts" need to hit the specific device slightly harder in some way. The technical difference is that examining the outcome of a pulse you should not have any idea whether the qubit is more 0 or more 1 or anything else about it; after the blast, you should have total knowledge about whether the qubit is 0 or 1. So usually these two need to exploit some sort of very different physics.

3. So, how the computers are actually used is: we get four or more of these boxes together and try to figure out a routine to start the first four rooms in the |0000> state, move them to the |1111> state representing the number 15, and then see whether we can run the quantum factoring algorithm to find out that fifteen is three times five. There is some fussing around with the thing as it's brought to ultra-cold temperatures, then there are some time-controlled pulses which do these bit rotations, some lower-finesse entanglement steps where the particles are somehow connected for some set times, then maybe some other schedule of more pulses... and then finally your readout lasers blast the thing, we get 4 bits out of it, and we write down what binary number we saw. We repeat a lot and we find out, whoa, 40% of the time this was 3, and 40% of the time this was 5, and 20% of the time it was something else. We then daydream of the day when this 20% becomes low enough that we can run a quantum error-correcting code on top of this thing, so that we use three times as many qubits but the error drops rather than increases.
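
As a cartoon of that programming model (numpy; purely illustrative, not how real control hardware is driven): prepare |0000>, apply a timed sequence of gates, then sample destructively.

    import numpy as np

    X = np.array([[0.0, 1.0], [1.0, 0.0]])  # a bit-flip "pulse"

    def apply(state, gate, qubit, n):
        # Lift a 1-qubit gate to the full n-qubit register via Kronecker products.
        op = np.array([[1.0]])
        for q in range(n):
            op = np.kron(op, gate if q == qubit else np.eye(2))
        return op @ state

    n = 4
    state = np.zeros(2 ** n); state[0] = 1.0  # start in |0000>
    for q in range(n):                        # the "program": flip every qubit
        state = apply(state, X, q, n)         # ends in |1111>, i.e. 15

    # The "blast": sample readouts according to the Born rule.
    probs = np.abs(state) ** 2
    print(np.random.choice(2 ** n, size=5, p=probs))  # all 15 here; real runs are noisy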

4. Just to get some ideas of what the hardware might look like inside this room, let me give two examples.

In flux qubits, which are maybe coming up on 20 years old or so, there is a little superconducting loop which holds the qubit as a superposition of currents going clockwise and counterclockwise around it. The single-qubit pulses are accomplished by firing microwaves at the thing, while the actual readout is done by another superconducting loop in close proximity called a SQUID, which is sensitive to electromagnetic induction: you measure your ability to fire a bunch of electrons through the SQUID in order to figure out whether the current in the flux qubit is going clockwise or counterclockwise. By contrast, the "bringing the rooms together" operation really needs to bring these two micrometer-sized devices into inductive contact so that they start influencing each other. And of course nothing is superconducting at room temperature, so this all happens in some big refrigerator.

Another promising thing in recent years has been the "NV qubits," which consist of a nitrogen atom sitting in one of the carbon locations of a diamond's crystal lattice, a "defect" in the otherwise perfect lattice. These can be made to 'fluoresce' or flash, so you can just see where they are and then deposit some structures on top of wherever they turned out, and the key is that you get a nice spin state in the qubit which has a nicely long lifetime even at room temperature, so you don't need the elaborate cooling setup that you need for superconductors -- and it interacts nicely with microwave pulses to do the single-qubit manipulation.

There are some tricky problems with readout just because diamond has a very high index of refraction, which means light moves very slowly in it and gets bent a lot. But the bigger problems occur with trying to get these things to interact with each other, since you basically just have this diamond lattice with some nitrogens in it, and you can't really move them relative to each other or insert some sort of spin-wires into the lattice that will connect the two spins or anything like that. So it's missing the most important property, the entangling operation.

My guess is that if one of these looked like a computer, it would have these little diamond qubits doing this process via some elaborate setup existing above the chip: some constant source of short-lived microwave photons which could be entangled with these qubits and then with each other, then immediately measured, with the measurement used to tweak the NV qubits, leaving only the residual entanglement between the NV qubits. But you would see the same pattern of a computer program as some timing of pulses fired at these systems, combined with some timing of these entanglements-by-photons or so.

[1] https://physics.stackexchange.com/questions/204090/understan...


> nitrogen atom sitting in one of the carbon locations of a diamond's crystal lattice

This reminds me of an experiment I saw on Sixty Symbols in which they had encased a single H2O molecule inside a carbon molecule's "shell" and pelted it with radiation to see how it behaved. Apparently the water molecule thinks it's all by itself and isn't affected by things on the "outside" of the carbon molecule.


A quantum computer as a physical device means a quantum circuit.

Each quantum computer is a special-purpose circuit implementing a single algorithm.

Because quantum computation works through destructive interference, there is no interaction during execution. You insert the input, run it through the circuit, and extract the result as classical output at some point. The whole setup is a science-experiment-looking gadget sitting in a lab (with the exception of D-Wave).

A universal quantum computer (aka a quantum Turing machine) is a concept used in computational complexity theory. I'm not aware of any suggested physical implementation of a universal programmable quantum computer. I don't even know if it makes sense.


Universal quantum computers do indeed exist. It's true that some quantum devices are special purpose (e.g., D-Wave's annealers), but most groups doing quantum computation are doing universal gate-based machines (e.g., Rigetti, Google, IBM, etc.). They're just being made at small scales right now (2-50 qubits).


These are still using the quantum circuit model. Universal quantum gates are universal because they can be used to build any circuit.

Programming the quantum computer means designing the circuit; the gates are reconfigured for each algorithm.

These models are computationally equivalent of course.


Is it possible that protecting superposition and entanglement will be a fundamental problem for scaling quantum computing? I could imagine that beyond some point, each time you add an extra qubit the cost of shielding the computation goes up by a multiplicative factor.

Say the observable universe has something like 10^100 subatomic particles total. I believe that by current estimates, this is an upper bound.

A quantum computer with 350 qubits would have 2^350 states, or more states than the total number of subatomic particles in the universe. A single "NOT" operation on one of those qubits would, in constant time, change half of those states. I.e., the quantum computer would do way more work than there are particles in the observable universe. For every single one of its operations.
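
For what it's worth, the arithmetic holds up:

    import math

    # 2^350 against the ~10^100-particle upper bound mentioned above:
    print(350 * math.log10(2))  # ~105.4, so 2^350 is roughly 10^105 > 10^100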

I just don't see how the universe will let nerds on some tiny little out-of-the-way planet get away with that.

QC demonstrably works for small numbers of qubits. I just wonder if it will ever scale to the point that large problems can be tackled with QC in a cost-effective manner.


Number of states in a superposition and number of particles in the universe aren't really related measurements at all (except in the most trivial way) -- at least so far as I can see.


There are ~10^50 atoms on Earth and 2^1024 states in a 1024-bit purely combinational circuit; I wouldn't be shocked if someone once said something similar about scaling silicon, with some different speed constraint.


"Quantum Computing Might Be Here Sooner Than You Think"

Haven't these stories had this same headline for at least 10 years now?


Exponential progress is still exponential progress. The quality of quantum devices has increased exponentially in a variety of ways over the past couple of decades.


Is that similar to the exponential progress in flying cars since the 50s?


Yep, just like fusion. I do think the money is on Martinis actually showing a basic working 49-qubit QC in the next year, though.


It's going to be fun working in cryptography soon.


It already is! There are a huge number of research opportunities.


I wonder, if and when we have working quantum computers, whether Bitcoin and crypto will be able to adapt, or will die. I mean, breaking secret keys with a quantum computer should be easy.


Existing crypto would be broken, but quantum computation also seems to provide much stronger encryption possibilities by leveraging entanglement. [1]

1: https://en.m.wikipedia.org/wiki/Quantum_cryptography


Not all existing crypto would be broken. Symmetric encryption algorithms like AES will be fine. Asymmetric (public-key) algorithms like RSA/DSA will be in trouble due to Shor's algorithm, but other public-key systems will be fine. Hash functions like SHA-2 and SHA-3 will also be mostly fine, because Grover's algorithm gives at best a quadratic speedup, not an exponential one.

Also, quantum cryptography mostly provides superior key establishment/management capabilities, not superior confidentiality guarantees. It's a misnomer to state that we can achieve "stronger encryption" by "leveraging entanglement states"; most of quantum computing is useful for cryptanalytic attacks right now, not cryptographic construction.
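
For intuition, here's the "mostly fine" claim in exponents (idealized query counts, ignoring real-world overheads):

    # Grover's quadratic speedup halves the exponent of a brute-force key
    # search, so doubling the key length restores the security margin.
    for name, bits in [("AES-128", 128), ("AES-256", 256)]:
        print(f"{name}: 2^{bits} classical guesses, ~2^{bits // 2} with Grover")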


Thanks, that's very enlightening!


The next version of Ethereum includes a "bring your own cryptography algo" feature that lets you substitute your preferred algorithm for the default choice. This gives it a safeguard against the outside risk that a direct attack on the core cryptography ever becomes feasible.


Merkle–Damgård algorithms are thought to be safe. I'm not sure about Keccak.


Keccak (SHA-3) is currently thought to be safe. The current state of the art would be Brassard-Hoyer-Tapp[1], but SHA3-512 should be secure against that. It's basically a birthday attack that generates a giant table, then uses Grover's algorithm to find a collision.

_______

1. https://link.springer.com/chapter/10.1007%2FBFb0054319
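
Back-of-the-envelope on why SHA3-512 holds up, again in idealized query counts:

    # Rough collision-search work factors for an n-bit hash:
    n = 512  # SHA3-512
    print(f"classical birthday bound: 2^{n // 2}")  # 2^256
    print(f"Brassard-Hoyer-Tapp:      2^{n // 3}")  # ~2^170, still far out of reach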


I'll believe in a quantum computer when I see one.


That's what I say about Turing machines! </sarcasm>


Same.



