Design of a silicon quantum computer chip (unsw.edu.au)
248 points by breck on Dec 18, 2017 | 145 comments



The important bit: "The UNSW team has struck a A$83 million deal between UNSW, Telstra, Commonwealth Bank and the Australian and New South Wales governments to develop, by 2022, a 10-qubit prototype silicon quantum integrated circuit – the first step in building the world’s first quantum computer in silicon."

So no, they haven't built a many-qubit general quantum computer.


There has been some slow but noticeable progress in building quantum computers using semiconductors. Researchers at Princeton were recently able to implement a two-qubit quantum gate in silicon [0].

[0]: https://arxiv.org/pdf/1708.03530.pdf


I'm pretty sure that's not slow progress.


It depends on what we mean by "slow progress". The physical implementation of a quantum computer in silicon was proposed in 1998 [0] and a two-qubit gate was recently performed on it. This is in contrast to the ion trap quantum computer, where a two-qubit gate was proposed and performed in 1995 [1].

On the other hand, superconducting quantum computers followed the same road that silicon is on right now. It took several years for superconducting qubits to become robust enough to perform universal quantum computation, but their progress has accelerated in the past couple of years. So this says nothing about how fast silicon quantum computers will progress in the future.

[0]: https://en.wikipedia.org/wiki/Loss%E2%80%93DiVincenzo_quantu...

[1]: https://en.wikipedia.org/wiki/Trapped_ion_quantum_computer#H...


Remarkable as they are, today’s computer chips cannot harness the quantum effects needed to solve the really important problems that quantum computers will. To solve problems that address major global challenges – like climate change or complex diseases like cancer – it’s generally accepted we will need millions of qubits working in tandem.

Are there non-classical algorithms with theoretical advantages for biological or climate simulations? Or is this just more hype-nonsense courtesy of the press office?


In drug discovery, my field, we do a lot of quantum mechanics calculations, which are formally exponential-scaling. Using a lot of approximations, we reduce this to a polynomial, at the cost of accuracy. But a quantum computer could do the original exponential problem with polynomial scaling (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2596249). Quantum superiority for this problem would require a few hundred qubits, make drug screening a lot more accurate, and make obsolete an enormous body of theory devoted to classical approximations of this problem.


This is not my area but I understand that material sciences see the same kind of potential for QCs - predicting how new alloys or ceramics will perform by accurately simulating atomic interactions.


Yes, that is true! During my PhD program I used the same methods to study metalorganic solar cells.

If anything, fast quantum calculations would make an even bigger difference in materials science than in drug discovery. That's because druglike molecules have been studied a lot, so classical approximations (though limited) at least exist.


"Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy." -- Richard Feynman

"Simulating Physics with Computers", International Journal of Theoretical Physics, volume 21, 1982, p. 467-488, at p. 486 (final words) [0]

[0] https://en.wikiquote.org/wiki/Richard_Feynman#Quotes


The ability to simulate small molecules and reactions at a quantum level will be truly profound.

The limiting step in being able to manufacture any small organic molecule from raw materials is algorithmic. Imagine a molecule printer that can compile a set of synthesis steps for an arbitrary molecule via simulation. It's not as far-fetched if we can simulate hypothetical reactions easily: http://www.nature.com/news/organic-synthesis-the-robo-chemis...


Somehow I don't think this is going to make drugs cheaper. The development pipeline will still take years, cost billions, and cost hundreds or thousands of dollars per year per consumer... and it probably won't cure anything, just be another lifelong pill.


I actually agree with this, more or less. Finding promising molecules is only step 1, and drug development would still be expensive even if molecule discovery were easy.

The other parts of what you're saying are true but don't depend on what happens in quantum computing, they're just part of the for-profit drug industry. There are government and philanthropic drug hunters today who are doing good work, and if they had quantum drug discovery, they would be able to do even more.


The idea of quantum computing arose out of an attempt to make a better quantum mechanical simulator, which would allow for more efficient large molecule energy analysis, directly applicable to molecular biology.

The new computing model was later found to be generally useful and applicable to things like climate modeling too.

The simplistic intuition is that there are a lot of exact algorithms which are exponential, so we use clumsy polynomial approximations instead. A quantum computer makes many categories of exponential algorithms polynomial and therefore computable if a big and fast enough quantum computer can be built.


I haven't pinned down the research yet, but I've read in pop-sci mentions and arXiv abstracts (which at least a couple of other people have cited) that training neural nets would be accelerated by a full-scale (i.e. mega-qubit) quantum computer.

Anyone with more expertise care to point me at a relevant paper?


Here's a ppt deck covering a relevant overview of the subject.

https://medium.com/@_NicT_/quantum-machine-learning-c5ab31f6...


There has been some effort to use quantum computers to solve machine learning problems. One for training neural networks [0]; and another for unsupervised machine learning [1].

[0]: https://arxiv.org/pdf/1712.05304.pdf

[1]: https://arxiv.org/pdf/1712.05771.pdf


I can't speak to a claim that there are proven theoretical advantages for biological or climate simulations; but researchers are looking for them.

With respect to biological simulations, researchers at IBM were able to implement artificial life algorithms on a quantum computer [0]. They discuss some potential advantages to simulating life on a quantum computer versus a classical one.

Researchers have also been trying to figure out if quantum computers can be used for weather predictions [1].

[0]: https://arxiv.org/abs/1711.09442

[1]: https://link.springer.com/article/10.3103/S1068373917090011


With respect to biological simulations, researchers at IBM were able to implement artificial life algorithms on a quantum computer [0]

Genetic algorithms are similar in some regards to simulated annealing. Apparently, D-Wave's approach is kind of like quantum simulated annealing.

https://arstechnica.com/science/2017/01/explaining-the-upsid...


Protein folding and fluid simulations theoretically rely on these sorts of quantum calculations. I don't think any directly relevant algorithms have been developed, because the hardware doesn't exist for them in the first place.


Good point about protein folding. Right now it's simulated with classical physics on conventional computers; quantum chemistry algorithms on classical computers scale too poorly for use with macromolecules. But a many-qubit computer could be excellent for that and other molecular-scale calculations. (Not that getting good protein folding models would cure cancer, but it's reasonably likely to aid biological research.)


Ab-initio quantum chemistry calculations certainly could scale much better on quantum computers -- I skimmed a paper once, which made it sound not easy but definitely worth working on -- and that's bound to help with bioscience once they're big enough. I'm surprised at protein folding or fluid dynamics, though -- what's the general idea on how it'd help with those?

Added: that is, I thought protein conformation energy was pretty well approximated with classical molecular-mechanics models, except during actual chemical reactions, and the difficulty was the sheer amount of time it takes this huge molecule to settle down to an energy minimum that's only a little below that of many alternative conformations. Is the biggest problem rather that we can't efficiently compute a consistent-enough approximation to the energy of a given conformation, so we can't distinguish the best conformation of the ones we simulated?


Based on what I read after I saw the protein folding suggestion, the idea is that with a Sufficiently Advanced quantum computer, you could use fully quantum calculations for small molecules and proteins alike. I'm puzzled about fluid dynamics too, though.


All current fluid simulations are iterative and assume the fluid is a continuous, indivisible mass (i.e. not composed of atoms or molecules) that we then divide into discrete regions to compute the simulation. Even without the common simplifying assumptions made to gain performance, this approach has limits. Theoretically, simulating the regions with a quantum computer would allow us to remove these limits and get the performance without the simplifying assumptions.
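For concreteness, here's a minimal sketch of the classical approach described above: treat the fluid as a continuum, chop it into discrete cells, and iterate. This is a toy 1-D diffusion step of my own, not a real CFD solver, which would solve the Navier-Stokes equations with far more machinery.

```python
def diffuse(u, alpha=0.1, steps=100):
    """Explicit finite-difference diffusion on a 1-D grid of cells.

    Toy illustration of discretizing a continuum: each cell is updated
    from its neighbors every step. Boundary cells are held fixed.
    (alpha must stay below 0.5 for this explicit scheme to be stable.)
    """
    for _ in range(steps):
        u = [u[0]] + [
            u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u

# An initial spike of fluid property (heat, concentration, ...) spreads out:
u = diffuse([0.0] * 10 + [1.0] + [0.0] * 10)
print(u)
```

The point of the comment above is that every number here is a coarse average over a region; a quantum simulation could in principle skip that averaging.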


That sounds super-ambitious, but if humans can make LIGO, there must be a lot we can do.


Sounds like a lot of the proponents cite improvements in machine learning, which if powerful enough might be able to make improvements.


Isn't climate simulation in and of itself a quantum calculation?


Are you thinking of chaotic systems? The time and length scales for climate simulations are such that it's one of the most-classical simulation domains I can imagine.


how can we know...


Shor's algorithm for integer factorization predates any quantum computers capable enough to do integer factorization faster than classical computers. I'm wondering if there are quantum-computer algorithms for improved biological/climate simulations that, likewise, just await large scale practical quantum computers before they become useful.
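To make the Shor connection concrete, here's a toy Python sketch (my own illustration, not from any paper) of the classical reduction Shor's algorithm relies on: factoring N reduces to finding the period of a^x mod N. The period-finding step, brute-forced below, is the only part that needs a quantum computer.

```python
from math import gcd

def factor_via_period(N, a):
    """Factor N using the period of a^x mod N (Shor's reduction).

    A quantum computer finds the period r in polynomial time;
    here we brute-force it classically, which is exponential.
    Requires gcd(a, N) == 1; returns None if this a doesn't work
    (odd period or trivial square root), in which case Shor's
    algorithm just retries with a different random a.
    """
    assert gcd(a, N) == 1
    r, x = 1, a % N
    while x != 1:          # brute-force the period (the quantum step)
        x = (x * a) % N
        r += 1
    if r % 2 != 0:
        return None        # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None        # trivial square root: retry with another a
    return sorted((gcd(y - 1, N), gcd(y + 1, N)))

print(factor_via_period(15, 7))  # period of 7 mod 15 is 4 -> [3, 5]
```

Everything except the period search runs fine on a classical machine, which is why the algorithm could be fully specified decades before the hardware existed.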


The same way Church and Turing knew what classical computers were capable of before the first computer was ever built.


The first computers actually predate them.


Does this require freezing the chip to near absolute zero like existing designs?


Yes it does. From the paper: "We assume that the complete structure is maintained at cryogenic temperatures (∼1 K or less) inside an electron spin resonance (ESR) system, which will be used to apply qubit control pulses."


Note that this is not as difficult as it sounds. Here is a video explaining how laser cooling works: six lasers point at a sample with a wavelength just longer than the absorption wavelength of its atoms, decreasing their momentum regardless of which way they're moving (details near the 2:20 mark):

https://www.youtube.com/watch?v=hFkiMWrA2Bc

So mainstream cryogenics could be a side benefit of technologies like this.

After rereading this, I'm not sure if the ESR itself can perform cooling.


People who design science fiction settings where humans mine the Moon always talk about how we'll need He-3 for advanced fusion in the future. But I swear, if quantum computing takes off we'll need to mine the Moon for He-3 for all the dilution refrigerators we'll want to build.

https://en.wikipedia.org/wiki/Dilution_refrigerator


Does that mean at all times (permanently, like whenever the chip is being used) or just during manufacturing?


That "using mostly standard silicon technology" phrase implies that cryogenic temperatures are not needed during manufacturing, only during operation.


'Maintained at / in' usually means continuous or ongoing. I have not read the paper though, so I cannot say for certain.


It means whenever it is being used. Manufacturing silicon chips involves temperatures from room temperature up to ~1300°C.


Previously on HN, Quantum Computing Explained: https://www.clerro.com/guide/580/quantum-computing-explained


The actual article ( shorn of the breathless "oh my gosh" university PR department hype ) is:

Silicon CMOS architecture for a spin-based quantum computer M. Veldhorst, H. G. J. Eenink, C. H. Yang & A. S. Dzurak Nature Communications 8, Article number: 1766 (2017) doi:10.1038/s41467-017-01905-6

It can be found free at: https://www.nature.com/articles/s41467-017-01905-6

ABSTRACT: "Recent advances in quantum error correction codes for fault-tolerant quantum computing and physical realizations of high-fidelity qubits in multiple platforms give promise for the construction of a quantum computer based on millions of interacting qubits.

However, the classical-quantum interface remains a nascent field of exploration. Here, we propose an architecture for a silicon-based quantum computer processor based on complementary metal-oxide-semiconductor (CMOS) technology. We show how a transistor-based control circuit together with charge-storage electrodes can be used to operate a dense and scalable two-dimensional qubit system.

The qubits are defined by the spin state of a single electron confined in quantum dots, coupled via exchange interactions, controlled using a microwave cavity, and measured via gate-based dispersive readout.

We implement a spin qubit surface code, showing the prospects for universal quantum computation. We discuss the challenges and focus areas that need to be addressed, providing a path for large-scale quantum computing."


Does anyone know if this is just marketing, or if they've actually solved the technical problem? If they have solved it, what's the ETA (months, years, decades)?


I am not an expert, but my assessment is that it's pure marketing. They talk about needing millions of qubits to do useful computations. This is flat out false. Thousands will already solve interesting problems that no classical computer can currently solve, e.g. factoring RSA moduli. They also say that many qubits operating together will be required to build a practical quantum computer (true), and that their technology allows this (false). The problems involved in maintaining coherence and correcting errors, at reasonable operating temperatures do not boil down to how to fit multiple qubits on a silicon wafer. This is like saying a new process for manufacturing nuts and bolts will allow the construction of Boeing 747's. The paper may contain more sensible statements, but as usual, the press release is probably nonsense.


You're probably confusing physical qubits (inside the actual physical device) and logical qubits (on which the platonic ideal calculation takes place). If we had 1000 logical qubits, we could indeed do useful things, but real-world implementations require huge amounts of error correction, which is much trickier on quantum computers than classical ones. Asymptotically the ratio of physical to logical qubits can approach 1 (just polylog overhead), but for early calculations the ratio will literally be more than a factor of a thousand. (This is true for all concrete proposals to date; people have speculated that there might be something useful to do with 1000 physical qubits for certain problems if the error could be small enough to be ignored and not corrected, but no one has actually shown explicitly how to do this, or produced the components with the quality that would be necessary.)
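To put rough numbers on that thousand-fold overhead: a common textbook estimate is that one surface-code logical qubit costs about 2*d^2 physical qubits at code distance d. A back-of-the-envelope sketch (the constant is an assumption; real overheads depend on physical error rates and the target logical error rate):

```python
def surface_code_overhead(logical_qubits, code_distance):
    # Rule of thumb: one surface-code logical qubit needs roughly
    # 2 * d^2 physical qubits at code distance d. This is a textbook
    # estimate, not a property of any specific hardware proposal.
    physical_per_logical = 2 * code_distance ** 2
    return logical_qubits * physical_per_logical

# ~1000 logical qubits at distance 25 already lands in the millions:
print(surface_code_overhead(1000, 25))  # 1250000
```

This is why press releases about "millions of qubits" and comments about "thousands of qubits" can both be right: they're counting different things.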


Yes, I was certainly confused. When IBM announced they are building a 7x7 = 49 qubit machine, I naturally assumed they meant logical qubits, not physical ones. Using surface codes, they can probably get one logical bit. But what confuses me are past records where they used Shor's algorithm to factor small integers on a quantum computer. How is this possible if we haven't even built a single logical qubit yet? It's surely necessary to have ~2N logical qubits to factor an integer of N bits.

I have an idea that might help though: if logical qubits are so difficult to build and require thousands of times more physical qubits, and physical qubits are also difficult to build, perhaps we can simulate physical qubits with thousands of times more "ultra-physical" qubits.


Two excellent questions.

Regarding claims of factoring small numbers: First note that error correction becomes exponentially more important for longer computations. Second, see this

https://arxiv.org/abs/1301.7007

Regarding building a tower of ever less noisy qubits: it turns out there is something called a threshold theorem which requires a certain minimal level of noise before concatenating error correction schemes makes things better rather than worse.

https://en.m.wikipedia.org/wiki/Quantum_threshold_theorem

Reaching this threshold for physical qubits in the laboratory is perhaps the primary goal of experimental quantum computing.


That first article you cite certainly puts the whole thing in perspective. I was tempted to put an article on Arxiv announcing a new architecture for billions of "super-physical" qubits, but you've convinced me that not only would the real experts see right through this, but that it might bring me down to the level of some of the qubit measuring contests currently on display from the likes of IBM, Google and Microsoft. Thanks for the genuinely interesting answers, especially to the not completely serious question!


When I took a course on quantum computing in 2007 with a Stanford prof, he said that things would get interesting after we get to 128 qubits :)


I guess someone will eventually introduce some kind of terminology like, "qubit equivalent". But I think that if you could build an error-free 128-qubit machine, this would be interesting. The main reason I see for that is if you can build one with 128 qubits, almost certainly someone with a lot more money than you, with a black budget, can build one much larger. The basic thrust of the article, and the paper, seems to be that if you can engineer a large enough quantum computer that doesn't really work, you can simulate one that doesn't exist. It's truly great if they are right, but I can't find a single statement about this paper on the entire web by an expert who is enthused about this announcement. The paper is short enough that there should be some comment by now.



That's a very specific number; do you know what the relevance of 128 is?


No I don't think it was specific. I believe he was just guesstimating the exponential power required to do anything more interesting than basic math. We were all rather young students at the time so I don't think he had a specific application in mind. It was all rather new back then anyways, hard to imagine all the applications.


He is probably referencing the 128 bit security threshold most crypto systems seek to achieve. However this doesn’t really line up with the number of qubits needed to break said crypto, which is typically much larger (thousands of qubits, plus error correction to survive millions of operations).


Edit: less than a doubling, e.g. from 128 to 192 or from 256 to 384. I meant to say a doubling is sufficient.


> Thousands will already solve interesting problems that no classical computer can currently solve, e.g. factoring RSA moduli.

With hundreds you can already speed up chemistry simulations, which will have a much larger impact than breaking RSA.


> With hundreds you can already speed-up chemistry simulations

You can? How?


Classically you need an exponential increase in performance (roughly 2^N) to simulate a growing quantum system, while a quantum computer may need only a polynomial increase (e.g. N^2).

So it's possible to simulate 25-qubit-level chemistry with a laptop and 50-qubit-level chemistry with a supercomputer, while 100-qubit-level simulation is impossible on any supercomputer and will remain so for a very long time.

But going from a 50-qubit quantum computer to a 100-qubit one may take just 2 years (or 4, it doesn't really matter), and then another 2 years to get to 200 qubits, and so on. Quantum computers will solve chemistry problems in minutes that would otherwise have taken 100 years to simulate.
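The classical memory wall behind these numbers is easy to check: a brute-force statevector simulator stores 2^n complex amplitudes. (This ignores cleverer tensor-network tricks such as IBM's 56-qubit result, which trade memory for structure.)

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    # A brute-force simulator stores 2^n complex amplitudes,
    # 16 bytes each for double-precision complex numbers.
    return (2 ** n_qubits) * bytes_per_amplitude

print(statevector_bytes(25) / 1e9)    # ~0.5 GB: fits on a laptop
print(statevector_bytes(50) / 1e15)   # ~18 PB: supercomputer territory
print(statevector_bytes(100) / 1e15)  # ~2e16 PB: hopeless classically
```

Every added qubit doubles the memory, which is the asymmetry the comment above is describing.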


So what I'm asking is how does a quantum computer solve quantum chemistry problems? To make it concrete, say I want to calculate the band structure of a crystal using a quantum computer? What do I do? What kind of algorithms can the QC run that solves this problem faster/better than, say, a DFT code?

I don't think it's a given that just because I want to simulate a quantum mechanical system in some way, that a quantum computer is a useful tool. Or it might be, I don't know. I just don't think it's enough justification that there happens to be the word "quantum" in both the problem description and in the name of the computer.


Simulating quantum chemistry was one of the original motivations for quantum computers.

See this paper for an overview https://arxiv.org/pdf/1203.1331.pdf


Thank you, exactly what I was looking for!


ETA 2022 until a 10-qubit silicon quantum computer prototype is made.

I doubt it's just marketing. UNSW is the leading university in researching silicon quantum computers and one of the best in researching quantum computers in general, too.


You cannot do anything interesting with 10 qubits. Anything you can do with 10 qubits will likely be faster on conventional computers. IBM says you need at least 50 before it gets interesting.


I realize that, but I think after a 10-qubit chip is made, it will be easier to scale up.

This tech is aiming to be "mainstream". Google, IBM and others' quantum computers will likely prove useful earlier, but we won't see them be paired with our personal computers. This on the other hand, offers that opportunity, but it will lag behind other types of quantum computers in terms of performance.


I don't think we will get devices cooled by liquid helium at our homes in any short time.

But this makes it possible to distribute them at more datacenters that can be much easier to reach than what Google and IBM are creating.


This is not my field, but IIRC dilution refrigerators capable of sub-K temperatures are nowadays available off the shelf from a number of commercial suppliers. Which is a pretty cool (har har!) advance considering not many decades ago labs had to create their own, at great expense.

So in principle, there might not be any really fundamental issues preventing such things in every home, with mass production economies making them cheaper and more reliable.

The real question IMHO is whether this will be useful in some way compared to just connecting over the internet to some data center running them.


Safety is a fundamental issue. Home appliances don't get the best maintenance available, and supercool and high-pressure containers aren't easy to keep around.


What interesting things happen at 50 qubits?


We can prove that quantum computers are faster than classical computers at solving certain problems (like simulating atom interactions). Although IBM recently moved that goalpost by showing that we can simulate 56 qubits (and probably a few more) with classical supercomputers.

https://spectrum.ieee.org/tech-talk/computing/hardware/ibms-...


It's debatable how generic this 56-qubit simulation is. We can already "simulate" millions of qubits, with entanglement, as long as they dance in a particular way...


I figured as much. So whoever develops a 50-qubit quantum computer first (Google?) will have to prove that the same simulation can't be done with a supercomputer. And they'd need to be serious about it and use the latest research in classical simulation, including IBM's 56-qubit research to try and do it on a supercomputer.

Otherwise, they may claim "quantum supremacy" and 3 months later someone else will prove that the same simulation can be done on a classical computer.


You can't do anything interesting with ENIAC, but you had to build those before you built anything else :^)


The people that built the ENIAC did it to solve actual problems.


Well, every new technology needs a proof of concept first.


Uhm, without even RTFA, it’s a .edu.au domain, so an educational institution (in Australia), not a commercial venture, so presumably pretty immune from marketing hype.

EDIT: sorry guys, I assumed the sarcasm would be transparent. Apologies.


I assume that was meant as a joke. Universities regularly put out utter nonsense in press releases (the marketing departments at universities are often ground zero for the most atrocious exaggerations that end up in the press). And I doubt universities in Australia are specially immune from this.


Yes it was meant as a joke but unfortunately my prose was a bit too deadpan, I edited it to add a warning, sorry.


It's from the University 'Newsroom' department, and AFAIK these are pretty notorious for over-hyping University research. It essentially is a marketing department for the University.


Yes, sorry, as I mentioned to somebody else I though my sarcasm would be self-evident, but maybe I just dripped with too much caustic fluid... sorry.


At some level, this is marketing hype. They have NOT "solved" quantum computing in the sense that a layman would expect: room temperature quantum integrated circuit able to give exponential speedup over classical algorithms.

I would say the biggest innovation here is that they have made a silicon-based integrated circuit with fault-tolerant qubits (using error-correction codes they invented).


Given the order-of-magnitude increase in speed (https://www.theregister.co.uk/2016/04/18/d_wave_demystifies_...) combined with the assumption that the price of these chips will go down as commercialization becomes more viable, isn't it naive to even consider the viability of cryptocurrency? I mean, if the core underlying value of this "asset class" is the fact that there is a scarcity of numbers (like collector baseball cards) combined with ease and anonymity of transactions (like diamonds), then unlimited computing power will demolish the value of cryptocurrencies as much as alchemy would diminish the value of Au.


There are several misconceptions about what universal quantum computers could do. One of them is talking about exponential increase of computing power with each new qubit. It just doesn't work like that. This brief talk from Scott Aaronson tackles this issue quite well: https://www.youtube.com/watch?v=JvIbrDR1G_c

While quantum computers threaten some cryptographic schemes, it will just be the end of those schemes, not the end of cryptography in general (see: post-quantum cryptography). As long as we still have problems that are hard to compute and easy to verify, we can have cryptocurrencies. Quantum computers do not give "unlimited computing power" and do not threaten the existence of cryptocurrencies in general.


As far as I know, there isn't yet a conclusively decent public-key algorithm that is quantum resistant.

We know that Diffie-Hellman, RSA and their elliptic-curve counterparts are broken by quantum computers. That was essentially all we had. I know that lattice-based cryptography is a potential option, and I believe there are more. However, as far as I know it is not even clear whether these are safe against classical computers.

I should note that I don't know much more about lattice-based crypto than the name and the hope that it is quantum resistant.


Merkle-Damgard hash functions (such as SHA256) are merely weakened by QC. SHA256 is considered good enough and as such, BTC mining should be able to weather QC. There is a variant of Shor's Algorithm that can solve ECDSA, so you might have problems with your BTC wallet; a fork using a new resistant asymmetric algorithm would be needed.

Memory-constrained algorithms, such as scrypt, should be quantum resistant based on what I know (which isn't much). If I'm not horribly misinformed, LTC mining should therefore be resistant. Again, LTC wallets would face problems.
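The "merely weakened" part comes from Grover's algorithm, which searches an unstructured space of size 2^n in about 2^(n/2) quantum queries. A sketch of the usual rule of thumb (it ignores constant factors and the very real cost of running Grover on physical hardware):

```python
def quantum_preimage_bits(hash_output_bits):
    # Grover's algorithm finds a preimage in ~2^(n/2) quantum queries
    # instead of ~2^n classical ones, so n-bit preimage resistance
    # drops to roughly n/2 bits. A back-of-the-envelope estimate only.
    return hash_output_bits // 2

print(quantum_preimage_bits(256))  # SHA-256 keeps ~128-bit security
```

This is why hash-based schemes survive with larger parameters, while Shor's algorithm breaks RSA and ECDSA outright rather than just halving their strength.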


Does the same hold for hashes based not on Merkle-Damgard but on the sponge construction?


It looks like we don't know, excluding some very specific sponge functions, based on some quick research.[1][2]

[1]: https://eprint.iacr.org/2017/771.pdf [2]: https://crypto.stackexchange.com/questions/419/what-security... (NB: comments)


Order of magnitude increase =/= unlimited, right? Also, it's my understanding that there is already plenty of work done on post-quantum cryptography. I don't know if it's entirely a solved problem, but my understanding is that it's at least solvable. Perhaps someone with more domain experience can weigh in.


There are numerous options for quantum-resistant algorithms. Existing cryptographic currencies can fork to whatever seems most suitable. This wouldn't be a trivial change, but it can certainly be done. https://en.wikipedia.org/wiki/Post-quantum_cryptography

There is also no such thing as 'unlimited' computing power with regard to proof of work. Difficulty simply adjusts, and the cost of a network attack will always be related to the rewards of mining.


Cryptocurrency is not about scarcity of numbers; it's about proof of work. Bitcoin started with relatively little computational power and has now scaled to where the mining operation uses more power than a small country. It's infeasible to mine Bitcoin on a CPU or GPU because ASICs are just so much more efficient. None of this scaling in hash rate really changes the underlying concept, which is that you prove you did some work in order to add blocks. If quantum computers do in fact result in a couple of thousand orders of magnitude raise in the hashing rate... well then miners will use different hardware, the difficulty of the network will adjust every 10 minutes as new hardware gets added, and after the change you'll just have a large army of quantum miners instead of ASIC miners.

Remember, the existence of quantum computers does not imply P=NP, which is to say, even with quantum computers you can still have the class of problems which are difficult to solve but easy to verify, which is the essential component of proof of work.
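The "hard to solve, easy to verify" asymmetry is the whole scheme. Here's a minimal hashcash-style toy in Python; the function names and the 8-byte nonce encoding are my own choices for illustration, not Bitcoin's actual block format:

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int):
    """Toy proof of work: find a nonce whose SHA-256 digest has
    `difficulty_bits` leading zero bits. Finding the nonce takes
    ~2^difficulty_bits hash attempts on average; verifying it takes
    one hash. Faster hardware just means the network raises
    difficulty_bits, leaving the asymmetry intact."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(
            block_data + nonce.to_bytes(8, "big")
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1

nonce, digest = mine(b"block payload", 12)
print(nonce, digest)
```

Verification is a single `sha256` call against the target, which is why difficulty can keep rising without making it any harder for the network to check blocks.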


> If quantum computers do in fact result in a couple of thousand orders of magnitude raise in the hashing rate... well then miners will use different hardware the difficulty of the network will adjust each 10 minutes as new hardware gets added, and after the change you'll just have a large army of quantum miners instead of ASIC miners.

If it's a couple of thousand orders of magnitude, the first person with a working quantum chip might entirely dominate the hashpower, long before the chip is available commercially. So you'd end up concentrating hashpower into a couple of BigCorps's hands. Like, if I went back in time to the beginning of BTC with an ASIC I'd probably completely dominate the baby blockchain and outcompete the entire network.


No, there are quantum resistant protocols that could be switched to instead when the gap starts to close.


There are several quantum resistant crypto currencies. I.e. IOTA


IOTA isn't even resistant to differential cryptanalysis.


Care to add sources?


A flaw was found in the hash function they rolled themselves which doesn't build confidence there aren't more vulnerabilities to be found:

https://medium.com/@neha/cryptographic-vulnerabilities-in-io...

https://www.forbes.com/sites/amycastor/2017/09/07/mit-and-bu...

To make things even worse than rolling their own hash function, they're claiming the flaw was intentional! If true, this makes them hostile and untrustworthy to the open-source community; and if this is their attempt at PR spin, it further erodes confidence that they know what they're doing:

https://hackernoon.com/why-i-find-iota-deeply-alarming-934f1...

> Next, and in my mind most damningly, Sergey Ivancheglo, Iota’s cofounder, claims that the flaws in the Curl hash function were in fact deliberate; that they were inserted as ‘copy protection’, to prevent copycat projects, and to allow the Iota team to compromise those projects if they sprang up.


These points have all been talked over extensively and are not a concern: https://medium.com/@comefrombeyond/cfbs-comments-on-https-ww...


That blog response is completely unconvincing and makes me trust them even less. They have no proof of the extraordinary claim that the flawed hash function was intended (which is bad whether this is true or false) and their network going down for three days is a huge red flag that the protocol cannot be used in a decentralised way. He talks a good game but if you have any technical background you can see he isn't addressing what's being asked and is avoiding valid criticism via hand waving and character attacks.


I disagree with you entirely.

The creator requested an open, live debate to put these claims to rest. They haven't accepted and probably won't.


Can you please explain what makes it quantum resistant?


Not all crypto becomes easy to crack with a quantum computer: https://en.wikipedia.org/wiki/Post-quantum_cryptography

With its use, the scarcity of numbers remains; that is, finding a collision that would allow forging a transaction stays infeasible, and mining stays hard.

BTW while mining is so important with Bitcoin and Ethereum, it's not a necessary part of a cryptocurrency. Mining is an incentive to keep doing blockchain validation. Some currencies exist without it (e.g. NXT).


To date the only practical replacements for RSA and EC achieve key derivation but not signatures. Without signatures we're not really 'complete' since certificates are hard to do with only key derivation.

Post-quantum cryptography research is a very active area right now. If these researchers do not find a practical signature algorithm, PKI as we know it will change very significantly.

Ironically, blockchains might elevate in importance as a result, since the evolving blockchain is a useful construct for quantum resistance. It has cryptographic agility built in.


All crypto becomes easier (well, except for the one-time pad). But some algorithms (AES, SHA) only require doubling the number of bits to achieve the same security level, which is rather trivial to do.
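The doubling rule comes from Grover's quadratic speedup: searching a keyspace of 2^n candidates takes on the order of 2^(n/2) quantum steps, so n key bits give roughly n/2 bits of post-quantum security. A back-of-the-envelope sketch (the function name is just illustrative):

```python
# Grover's algorithm searches N candidates in ~sqrt(N) steps, so an n-bit
# symmetric key offers roughly n/2 bits of security against a quantum attacker.
def effective_security_bits(key_bits: int, quantum_attacker: bool) -> int:
    return key_bits // 2 if quantum_attacker else key_bits

assert effective_security_bits(128, quantum_attacker=True) == 64   # borderline
assert effective_security_bits(256, quantum_attacker=True) == 128  # still strong
```

Hence the common advice to move from AES-128 to AES-256 rather than abandon symmetric crypto.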


"IOTA uses hash-based signatures (https://www.imperialviolet.org/2013/07/18/hashsig.html) instead of elliptic curve cryptography (ECC). Not only is hash-based signatures a lot faster than ECC, but it also greatly simplifies the overall protocol (signing and verification). What actually makes IOTA quantum-secure is the fact that we use Winternitz signatures. IOTA's ternary hash function is called Curl."
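For intuition, here is a minimal Lamport one-time signature, the simplest ancestor of the Winternitz scheme mentioned in the quote (Winternitz trades larger hash-chain work for much smaller signatures). Security rests only on the hash function's preimage resistance, which Grover weakens but does not break, hence "quantum-resistant". This is an illustrative sketch, not IOTA's actual Curl-based construction:

```python
import hashlib, os

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # Two random 32-byte preimages per message bit; public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(x0), H(x1)) for x0, x1 in sk]
    return sk, pk

def sign(msg: bytes, sk):
    # Reveal one preimage per bit of the message digest.
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(msg: bytes, sig, pk) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][bits[i]] for i in range(256))

sk, pk = keygen()           # each key pair must be used for ONE signature only
sig = sign(b"tx: alice -> bob", sk)
assert verify(b"tx: alice -> bob", sig, pk)
```

The one-time restriction is why IOTA warns against reusing addresses after spending from them.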


Before anyone gets excited about getting a practical quantum computer soon, it should be emphasized that while they're experimentalists, this news is only about a design they have in mind to use with their SiMOS quantum dots. There's a gap between that design and the status quo of their physical quantum gates in their lab.

Their gates (both single- and two-qubit gates) are slow, and their fidelities are currently below the threshold required for surface codes. There are also big questions about the readout fidelities (which are quite low at the moment) and about suppressing crosstalk.

That being said, quantum dots in Si/SiGe and SiMOS are a very exciting field these days.


Are there any good introductory pieces on quantum computing? I feel like I've heard so much about the subject, yet have not found a good paper to read that outlines the essential concepts/topics involved with it.


Michael Nielsen also has a really nice, accessible series of intro videos: http://michaelnielsen.org/blog/quantum-computing-for-the-det...


The standard text for the field is Nielsen and Chuang: https://www.amazon.com/Quantum-Computation-Information-10th-...


I think it is worth mentioning that their team at UNSW is hiring and has PhD scholarships available. About 30 positions are open.

I am not affiliated with them, but noticed the listings at http://www.cqc2t.org/employment


This is interesting - they plan to deliver a revolutionary device in 5 years and are about to hire 30 new people. Is that really feasible? I suppose the funding cycles mean you have no choice, because it is not possible to maintain a large dedicated research workforce for prolonged periods.


Would somebody be able to compare this design with Microsoft's topological qubit design? Which one has better potential for error correction and suppressing unwanted interference? Also, are there any foundational operations that one design could do but the other will not be able to?


Why silicon and not something like graphene?


uh so what does this mean?


That we could have quantum computing "accelerators" in our computers and maybe even smartphones by 2030, just like we're starting to see machine learning accelerators now (or GPUs for graphics processing in the 90's).


You can't just casually assume we'll suddenly figure out how to run qubits at room temperature in 13 years.


For clarification for other readers, the authors of the paper here do not make any claim towards that. They assume a setup where

> the complete structure is maintained at cryogenic temperatures (∼1 K or less) inside an electron spin resonance (ESR) system, which will be used to apply qubit control pulses.


The authors don’t, but the previous post (apparently) implicitly assumes it (or fails to understand what maintaining quantum coherence entails, or both).


Uh finally! Thanks for explicating this so succinctly and adroitly. I'm driven spare by the “quantum computers everywhere [because] Moore's Law bla bla bla” attitude.


So what kind of stuff can we do with that?


Nothing significant until we have many more qubits. With 128 qubits you could crack 128 bit RSA encryption in a few mins.


I've thought for a long time that this explains why the NSA is recording so much encrypted data. When QCs are available they will be able to decrypt all the things. Imagine what they'll learn.


That’s a bit conspiracy-theorist-ish, but unfortunately, in the present climate and given what has transpired from the Snowden leaks, not at all implausible.

Either they have made a breakthrough in number theory that makes factorisation (or discrete logarithms over finite fields) tractable, or they have a quantum computer already, or they are banking on having one soon. In any case, storage is so cheap that it makes sense to just record the traffic (which is useful for sigint anyway) and refer back to it “if and when”.

Of course, much of what we consider “public-key encrypted” traffic is actually encrypted with a symmetric cipher whose encryption is impervious to Shor’s Algorithm… it is the key-exchange process that really uses the textbook public-key encryption algorithms. So much of the traffic is technically impervious to whatever advances they have made, but of course once you have captured the key-exchange “handshake” that sets up the session and exchanges the symmetric cipher keys, you’re all set to retrieve the plaintext from the intercept.
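The point can be made concrete with a toy example: record the Diffie-Hellman handshake values and the symmetric ciphertext today; once discrete logs become tractable (Shor's algorithm), re-derive the session key offline. Everything below is deliberately tiny and insecure, with brute force standing in for the quantum step:

```python
import hashlib

# --- what travels on the wire (and into the intercept archive) ---
p, g = 2147483647, 7                 # toy group; real DH uses ~2048-bit primes
a, b = 12345, 67890                  # the two parties' secret exponents
A, B = pow(g, a, p), pow(g, b, p)    # recorded key-exchange handshake
key = hashlib.sha256(str(pow(B, a, p)).encode()).digest()   # session key
ciphertext = bytes(m ^ k for m, k in zip(b"attack at dawn", key))  # recorded traffic

# --- years later: attacker solves the discrete log from the recorded A ---
a_recovered = next(x for x in range(1, p) if pow(g, x, p) == A)  # stand-in for Shor
key_recovered = hashlib.sha256(str(pow(B, a_recovered, p)).encode()).digest()
plaintext = bytes(c ^ k for c, k in zip(ciphertext, key_recovered))
assert plaintext == b"attack at dawn"
```

Nothing about the symmetric cipher needs to be broken; capturing the handshake is enough.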

There are public-key algorithms (particularly those based on lattices) that we currently hold to be secure against attack by quantum computers. I am surprised these are not seeing more adoption amongst the “enlightened paranoid cypherpunk elite”. Here is a primer on such “post-quantum cryptography” approaches, courtesy of the good folks at Wikipedia: https://en.wikipedia.org/wiki/Post-quantum_cryptography


In the process they'd get key exchanges too and it's easy to correlate traffic.

I thought it was a really straightforward obvious idea. I'd do it if I were them and had such deep pockets.

QC is clearly coming. Lots of news lately.


I was agreeing with you, of course if they have intercepted all traffic they have both public-key encryption key exchange and the resulting symmetrically encrypted traffic and that they could correlate it... I’m sorry if I was a bit longwinded and pedantic in my reply, I didn’t realise I was talking to virtual Hacker News nobility (with ten times the karma I do!).


Hacker News nobility is a dubious title haha.

Btw on post quantum crypto: the problem is that most of it has not yet had enough conventional cryptanalysis. Makes no sense to use an algorithm immune to quantum speedup if it's conventionally vulnerable.


Yeah I know, I’m no expert, but I studied applied mathematics.


What would be nice book / video / web link that will give a decent intro to a novice like me?



Citation? A theoretical understanding of bits in quantum superposition would support the notion of traversing all combinatorially related possibilities, but that is not the case for any realizable quantum computer. Realizable machines are based on specific quantum logic gates; otherwise you would have to design, at the electron level, a new processor for every problem.


Here's the citation: https://en.wikipedia.org/wiki/Shor%27s_algorithm

You do need to update yourself. There are standard gates, complete computation theories, and algorithms for lots of interesting problems. Those aren't even new.
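For reference, the only step in Shor's algorithm that needs a quantum computer is finding the period of a^x mod N; the rest is classical number theory. A toy walkthrough with the quantum step simulated by brute force:

```python
from math import gcd

def factor_via_period(N: int, a: int) -> int:
    # Quantum step (simulated by brute force here): find the smallest r > 0
    # with a^r = 1 (mod N). This is where Shor's algorithm gives the speedup.
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    # Classical step: for even r, gcd(a^(r/2) - 1, N) is a nontrivial factor
    # with good probability over random choices of a.
    assert r % 2 == 0, "retry with a different a"
    return gcd(pow(a, r // 2, N) - 1, N)

assert factor_via_period(15, 7) == 3   # 15 = 3 * 5
```

This is only a sketch of the reduction, not the quantum circuit itself.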


AES isn't broken by factorization of composites of primes. If the OP had said RSA, sure, but they said AES (it was changed; you can tell because there was never such a thing as 128-bit RSA). But thanks for telling me about that "new" Shor's algorithm devised in 1994.


Yes, comment editing sometimes leads to misunderstanding. But there was 128-bit RSA; it was in use up to the mid-00s.


One claim is the near instant factoring of large integers...breaking a class of encryption that's widely used currently.


Quantum computers can be used to simulate quantum physics and approximate solutions to optimization problems.


Honest question: Why would the average person want/need to do that with their smartphone or computer?


Why would someone want more than 640k of RAM?


My thoughts exactly. We don't know we need it till it's too indispensable to remove. Only god knows what we could do with quantum computers. There is some low-hanging fruit with encryption and quantum simulation... but hey, maybe it can be used in random world generation for games like Minecraft.


The average person would probably not find a lot of use from those applications; but researchers will.


They wouldn't.


That's what Ballmer thought about iPhone.


not all that much unless you're in the field, in which case it's pretty cool.


[flagged]


Please comment civilly and substantively or not at all.

https://news.ycombinator.com/newsguidelines.html


I, too, have a complete design for the warp engine in my basement.


About the only thing I know about quantum computing is that it will make current encryption techniques redundant. Is anyone thinking seriously about what that will mean for global e-commerce? There will be a point in time when these machines are not affordable on the mass market, but affordable by governments. At that point, presumably, we are all going to be pwned?


I think you mean "obsolete" and not "redundant" but yes, the common asymmetric encryption methods used for things like HTTPS are trivially cracked with a powerful enough quantum computer.

The good news is that people have been anticipating this for a while, and there's a lot of study in the field of post-quantum encryption. From what I understand, people have come up with techniques that work (meaning nobody's discovered a reason why they wouldn't work).

The bad news is that this does nothing for the data that we're currently encrypting. There's nothing to stop someone from storing today's encrypted web traffic and decoding it in the future once capable quantum computers exist.

Thanks to Edward Snowden and others, we know that the US government is already hoarding and sifting through data, so I have no doubt that there are plenty of agencies planning on doing this exact thing.


Here's a good starting point to learn more: https://en.wikipedia.org/wiki/Post-quantum_cryptography

Also, there's some related discussion in the amazing cinematographic masterpiece that is Sneakers: http://www.imdb.com/title/tt0105435/


not all encryption techniques-- off the top of my head, RSA and elliptic-curve algorithms (those that depend on prime factorization or discrete logarithms)



