If you go back far enough, I'm willing to bet at least some of the innovation-heavy castles, cathedrals, and pyramids of history were completed "on schedule," with it known up front that they would take many years to finish.
I suspect a lot of projects are accurately classified as 10-30 years away, given a certain level of funding and a dedicated team. Absent that, they may very well take twice as long or more. I always like to see what long-term-oriented institutions like SENS and MIRI say they would do if they had x times more money, to check that they've got a plan -- fortunately, those two do publish that information.
Precisely. Take fusion for instance:
This is the source of the "joke" that fusion will always be 20 years away. It's because it's been grossly under-funded ever since that prediction was made.
What are these three experiments? Which of them have already been done in the existing plan? And what others might need to be added?
Also curious is the expected total cost. Eyeballing each plan's average annual cost and multiplying by its duration, the totals are 6x14 = 84, 4x17 = 68, 3x22 = 66, and 2.5x29 = 72.5 billion dollars. The accelerated and aggressive plans are more expensive in total than the moderate plan!
I suppose there's some additional overhead costs associated with keeping the program running for 29 years instead of 17, but I would have expected all the rush orders, overtime, and extra staff to make the faster programs much more expensive.
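Those back-of-the-envelope totals are easy to check in a few lines (the plan labels below are my own shorthand; only "accelerated", "aggressive", and "moderate" are named above):

```python
# Rough total cost of each fusion funding plan: eyeballed average annual
# spend ($B/yr) times duration (years). Labels are my own shorthand.
plans = {
    "accelerated": (6.0, 14),
    "aggressive": (4.0, 17),
    "moderate": (3.0, 22),
    "slowest": (2.5, 29),
}
totals = {name: rate * years for name, (rate, years) in plans.items()}
for name, total in sorted(totals.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${total:g}B total")
# The "moderate" plan comes out cheapest overall, despite running 22 years.
```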
The LHC was approved for construction start in 1995. The wiki article doesn't capture it but you can bet the science behind the design of the LHC wasn't all done the year before. It didn't achieve its first real goals until 2015.
LIGO was dreamed up in the 1960s to detect gravitational waves. It took until 2002 to get the first detector built big enough to theoretically detect them, and until 2015 to finally get measurements demonstrating the existence of the waves.
Real science is slow. This "20 years means I don't know" business is indicative of perhaps not appreciating the difference between basic science and applied science/engineering. I know we're used to engineering projects pumping out new awesome tech every couple years, but basic science is hard and takes a long time.
I picked the word "technology" instead of "science" on purpose. Science isn't slow or fast; it can't be predicted at all. Technologies are a different story. If someone says "I'm going to build you a piece of technology and it will take 20 years," that means they don't know how to do it. There are no exceptions.
The problem is that for every person who says, "This is 20 years away", there are unrealistic people who are saying, "This is 5 years away". And there are pessimistic people who are saying, "Oh, this is just going to go on for ever. They don't really know how long it's going to take".
Will you ever be able to read popular news articles and reasonably expect the predictions to be accurate? I doubt it. Are there people in the world who have a pretty good idea how long it will actually take? For a great many things, yes.
Slightly more on topic, it's the thing in my home that most makes me feel like I'm living in the future :-)
I'm very skeptical when people make 20-year predictions from the current state of the art to something that is impossible today for any amount of money, such as this quantum-computing-based RSA breaker.
Short answer: starting in 1714, Britain sought and, over the following 114 years, paid out rewards for accurate measurement of longitude. Resulting, of course, in the invention of 'modern' clocks.
For sure. Government-promised highways, for example.
Oh, uh, wait a minute...
"If one counts demonstrations not based on quantum computing, some people have claimed even earlier precedents for the 3 x 5 = 15 theorem."
- Safari: certificate is fine
- Chrome: certificate error due to SHA-1 signatures
- Firefox: unknown issuer
Why is the DoD root certificate in OSX but not in Firefox?
I'm all for having principles, but you have to accept that it'll break for your users.
Based on that I would consider this an "intranet" CA (albeit for a very large intranet) and based on my previous "meta-policy" comments I would recommend not including this in Mozilla et al. I'll leave this bug open for a period of public comments, and then I'll close it with "WONTFIX" unless someone can provide compelling reasons why I should do otherwise.
Seems reasonable to me.
I also agree with the weaker argument that the cert in question is essentially for a local intranet, and that the DoD can (for as long as it continues to exist, which I find politically disagreeable) install the cert locally on its own resources.
If it wants to publish material for broader consumption, it can get a cert like everyone else.
So yes, if you trust the DoD root certificate, then the DoD, like every certificate authority in the world, could in theory generate a valid certificate impersonating www.google.com or any other website. With a sophisticated enough attack, they could do this just for your one visit to one particular website, in such a way that it would be difficult for anyone to realize it's happening. However, while this is difficult to notice if you're not looking for it, it's actually quite easy to notice if you are looking for it. If you use Chrome, then Chrome reports the certificates it sees back to Google, which tracks what certs CAs issue. This is how Google noticed that Symantec had issued fake certificates for Google domains in Symantec's test environment: https://googleonlinesecurity.blogspot.com/2015/10/sustaining...
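To make "looking for it" concrete, here is a small sketch that pulls the leaf certificate a server presents and reads off the issuer; a DoD-signed cert for www.google.com would stand out immediately. (The hostname is illustrative; real monitoring uses Certificate Transparency logs rather than spot checks like this.)

```python
# Fetch a server's TLS certificate and inspect who issued it.
import socket
import ssl

def issuer_fields(cert: dict) -> dict:
    # ssl.SSLSocket.getpeercert() encodes the issuer as a tuple of RDN tuples;
    # flatten it into a plain dict of field name -> value.
    return dict(rdn[0] for rdn in cert["issuer"])

def get_issuer(hostname: str, port: int = 443) -> dict:
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return issuer_fields(tls.getpeercert())

# Example (requires network access):
# print(get_issuer("www.google.com"))
```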
Anyway, the practical risk of trusting a DoD certificate is pretty low. To decrypt your traffic, they'd have to man-in-the-middle your connection to a web server and replace the site's valid certificate with their own, which would leave an obvious and flagrant trail for anyone who is looking. This would very obviously tip their hand, and any evidence of being attacked that way by a first-world government would be immediate worldwide news in the security community. If they did this even once, they'd need to be extremely careful not to be caught by any of the countermeasures that detect this kind of surveillance.
That kind of attack would be a one time thing, because evidence of being attacked through the DoD cert would cause all browser vendors and OSes to yank support for it. CAs have been revoked for far less justified reasons than explicitly attacking someone.
I find it much, much, much more likely that targets of interest will simply be attacked and exploited through regular known security mechanisms - such as software vulnerabilities or built-in back-doors. Those don't leave an obvious trail and smoking gun pointing back to the perpetrator. Someone MITMing your website visits with the DoD root certificate would stir up a shitstorm; "some anonymous IP broke into my computer with a 0-day and installed a rootkit" is not particularly newsworthy by comparison. Even the recent news of backdoors in networking product codebases, while newsworthy, isn't really that surprising these days. Active evidence of DoD interception of someone's network traffic, followed by evidence of CA certificate misuse, would drop like a nuclear bomb in the security community, especially absent an extremely well justified reason. It would be the proverbial straw that broke the camel's back in terms of government interference with Internet security, and would lead to a digital revolt even more severe than what Snowden's disclosures caused. Government technical experts will be aware of this and will use other methods, at least in any context where it could plausibly be noticed.
To be clear, I completely believe that the government is, or could be, passively conducting surveillance on virtually all electronic communications. I just don't think they'll go as far as actively intercepting and modifying a connection and inserting a fake certificate, within the borders of the country or in any normal circumstance. Maybe they would do such a thing inside the private networks of North Korea, but I'd be highly skeptical of them doing it within the US, and with the DoD certificate of all things. It would be too obvious, and has too poor a risk/reward payoff compared to other methods. If they were going to do this, they'd steal the private key of another CA and use that instead - and because the capability could be noticed and "burned", they'd save it for high-value targets only.
So, on any practical analysis, I think it's extremely unlikely that the DoD will attack people through their root certificate, though I concede it's plausible. I'd be interested in feedback from others on this reasoning.
My question to HN: Have any of you guys attempted to come up with a plan for if/when this happens? Are there any algorithms in suites like OpenSSL that are quantum-resistant? Is SSL/TLS even compatible with a post-quantum world?
The trick with all of this is that there isn't all that much cryptanalysis work on some of the more promising PQ schemes, so trying to preemptively adopt them just in case quantum computers learn to factor numbers bigger than 15 is likely to do more harm, in the short term, than good.
An EU research project has published a couple of recommendations for what you can do if you need post-quantum crypto today (none of which you'd want to use for TLS, because the keys or signatures are too big). NIST is planning a standardization process. A draft for a stateful hash-based signature scheme (XMSS) will probably become an IETF document soon.
Small bits and pieces. People are working on it, but it's still a long way until you will be able to use your browser to establish a quantum-safe https connection.
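To give a feel for the hash-based direction, here is a toy Lamport one-time signature - the kind of primitive a stateful scheme like XMSS builds a Merkle tree over. Its security rests only on the hash function, which is why this family is considered quantum-resistant. This is a sketch for intuition, not a real implementation of XMSS:

```python
# Toy Lamport one-time signature over SHA-256. Each key signs ONE message.
import hashlib
import os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # One secret pair per bit of the 256-bit message digest (one value per bit value).
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
    pk = [[H(a), H(b)] for a, b in sk]
    return sk, pk

def bits(msg: bytes):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal the secret value matching each digest bit.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(s) == pk[i][b] for i, (b, s) in enumerate(zip(bits(msg), sig)))

sk, pk = keygen()
sig = sign(sk, b"hello")
assert verify(pk, b"hello", sig)
assert not verify(pk, b"goodbye", sig)
```

The "stateful" caveat for XMSS comes from exactly this structure: each one-time key must never be reused, so the signer has to track which leaves of the tree have been spent.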
("...Chuang expects to see quantum encryption methods that will inscribe sensitive data into the very states of atoms")
Very curious about current state of this research (relative to the current state of quantum decryption)--any experts in the room?
Since the intended use is key distribution, a MITM is fine as long as you can detect it reliably: you can keep sending new keys until one isn't eavesdropped upon, and then use that key.
Cryptographers are interested in encryption schemes built on mathematical structures that are not amenable to any known quantum algorithm: lattices, Ring Learning With Errors, and so on.
Cryptographers are also interested in how quantum computers will scale to large sizes; it will be important to understand the largest quantum computers that can practically be built.
Which is a pretty big "oh crap" that will catch a lot of people by surprise if quantum computers ever really happen.
What about other legacy systems?
Actual, physical quantum computing is still in its infancy, so sure, it's hard to conjure up the wherewithal to worry. But nonetheless we have a long way to go before we can say the world is ready for real quantum attacks.
QKD allows you to distribute a one-time pad while sharing only an authentication key. It is (on paper, if you don't count experimental flaws) information-theoretically secure, meaning you can't break it or man-in-the-middle it even with infinite computational power (with probability 1 - epsilon, epsilon as close to 0 as we decide). In practice, most QKD systems can be hacked through hardware flaws.
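The payoff of QKD is that once both ends share secret random bits, a plain one-time pad gives unconditional secrecy. A minimal sketch, with os.urandom standing in for the quantum channel:

```python
# One-time pad: XOR the message with a truly random key of equal length.
# Secrecy is unconditional only if the key is random, secret, and never reused.
import os

def otp(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "pad must be at least as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"attack at dawn"
key = os.urandom(len(msg))   # in QKD, this is what the quantum channel delivers
ct = otp(msg, key)
assert otp(ct, key) == msg   # XOR with the same key decrypts
```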
I have found this: https://en.wikipedia.org/wiki/Post-quantum_cryptography
But am not sure what is considered the state of the art. Stehle-Steinfeld seems like the most promising option from my very uninformed position. Supersingular elliptic curves also seem to have nice properties but I have to admit I don't even fully comprehend them.
Would love for a crypto expert to chime in :)
The article has a broken link to the paper abstract, which is "Realization of a scalable Shor algorithm" available here:
I've yet to find a full copy of the paper anywhere, but it sure looks interesting...
Edit: Ah, here it is:
There's also "Compiling quantum algorithms for architectures with multi-qubit gates" which cites the above:
"Briefly, the new work uses Kitaev’s version of Shor’s factoring algorithm, running on an ion-trap quantum computer with five calcium ions, to prove that, with at least 90% confidence, 15 equals 3×5..... So, what’s new is that a QC has now factored 15 “scalably”: that is, with much less cheating than before." - (Scott Aaronson).
Btw, http://www.scottaaronson.com/blog/?p=2673 (via https://news.ycombinator.com/item?id=11235186) is also about this.
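For intuition about what was demonstrated: the quantum computer's only job in Shor's algorithm is the order-finding step; everything around it is classical post-processing. Here is the whole pipeline for N = 15 with the order-finding brute-forced classically (a sketch of the math, not what the ion-trap experiment ran):

```python
# Shor's reduction: factoring N reduces to finding the multiplicative order
# of a random base a mod N, then taking gcds.
from math import gcd

def order(a: int, n: int) -> int:
    # Smallest r > 0 with a^r = 1 (mod n). This is the quantum-hard part;
    # here we just brute-force it, which is fine for n = 15.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_postprocess(a: int, n: int):
    r = order(a, n)
    if r % 2:
        return None  # unlucky base, retry with another a
    # gcd(a^(r/2) +- 1, n) yields the factors (possibly trivial for bad bases).
    f1 = gcd(pow(a, r // 2) - 1, n)
    f2 = gcd(pow(a, r // 2) + 1, n)
    return tuple(sorted((f1, f2)))

print(shor_postprocess(7, 15))  # (3, 5)
```

For a = 7, the order is 4 (7, 4, 13, 1 mod 15), so the factors are gcd(48, 15) = 3 and gcd(50, 15) = 5.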