>According to WikiTribune’s source, experts in the delegations have clashed over recent weeks and the NSA has not provided the technical detail on the algorithms that is usual for these processes. The U.S. delegation’s refusal to provide a “convincing design rationale is a main concern for many countries,” the source said.
So it's not just "We don't trust anything the NSA puts out." It's "The NSA is refusing to explain their algorithms beyond saying 'Trust us,' and we don't."
But in real standards competitions, academic cryptographers bundle their designs with rationale essays and point-by-point explanations of how the designer mitigated attacks, like differential and linear trails. Standards groups didn't get that from NSA, and when academic cryptographers poked at the ciphers and asked questions about linear trails, the NSA designers got standoffish.
I think there's a subtext to all of this where the NSA is dismissive of, well, basically all academic cryptanalytic work. The converse (academics being dismissive of the NSA) didn't, I think, use to be true, but the relationship might gradually be taking that shape, so that the two groups are just mutually dismissive of each other.
So, where in the past the NSA got some deference that enabled them to submit standards proposals that didn't follow process, now the opposite is true, and academic cryptographers expect deference.
It's no tragedy. NSA brought this on themselves, and really, what we're "losing" here is kind of a marginal design anyways, right?
(I write this in the hopes that someone better connected to these issues will correct me on lots of it!)
This isn't new. My paper about exploiting shared caches in Intel Hyperthreading as a side channel to steal an RSA key was rejected by the Cryptology ePrint archive "because it wasn't about cryptology", while some people in the computer security community dismissed it as "just a theoretical cryptography thing".
Why shouldn’t it have been rejected?
I can’t see how it was about cryptography either as they seem to define it based on the center of gravity of their papers: https://eprint.iacr.org/2004
Separately, if what you said about the security community downplaying your results as too theoretical was not just the occasional opinion of a maverick, then clearly that was incorrect and unfortunate in multiple ways.
Finally regardless of any of that, great work on your contributions. Nice insights and efforts, coming so early on in the lifespan of an important problem.
A quick search shows eight papers which have "side channel" in their titles, so I think it's a bit of a stretch to say that they don't consider side channel attacks to be cryptography...
Are you saying any of these papers make a significant argument about side channel attacks? Or are you saying there are eight papers that make some reference to it? If it’s the latter, that’s quite a big difference, and it’s easy to see the logic of rejecting your paper based on its central theme.
I didn’t notice any of the papers making a significant argument about side channel attacks. Maybe 2004 wasn’t the most representative year to sample? Or maybe I’m just overlooking the eight you’re referring to?
Btw I wouldn’t begrudge you any wtf thinking if you had any. It would definitely suck to do good work and not get proper and timely recognition for it, especially when it could have sped a solution or helped mitigate a real life problem.
It’s just that to whatever degree this suckness happened, I can’t see how it was due to irrational or biased reasoning on the part of the Cryptology ePrint archive.
A 2005 paper he presented is linked, in which he demonstrated such an attack and worked with the usual people to implement fixes.
In fairness to cperciva he clearly distinguished his work from Meltdown/Spectre - "These new attacks use the same basic mechanism, but exploit an entirely new angle."
I think that since the world was surprised by how bad it really was in practice, it's fair to say cperciva (as well as others) predicted the explosion, but not necessarily the timing or the blast radius.
There are I am sure many other papers in corners of the net that explain the next one to come bite us.
PS cperciva was the Security Officer for FreeBSD and tends to know more about this stuff than the average bear.
(Again HN shows its ability to have someone with truly detailed knowledge just one comment away.)
NB _ I may have some details wrong, please correct if needed.
Simplicity is useful. I've seen on multiple occasions bugs in more complex algorithms, like ChaCha20. Test vectors don't help as much--or at all--when you're creating a bespoke CSPRNG as in the OpenBSD and Linux kernels that repurposes the core round functions.
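That said, even when the surrounding construction is bespoke, the core round function itself can still be checked in isolation against published vectors. Here's a sketch of the ChaCha20 quarter-round checked against the test vector from RFC 8439; this is my own illustrative Python, not a vetted implementation:

```python
# Sketch of the ChaCha20 quarter-round (RFC 8439, section 2.1).
# Even when a kernel repurposes the core rounds in a bespoke CSPRNG,
# the round function itself can be checked against the RFC's
# published test vector. Illustrative only.

MASK32 = 0xFFFFFFFF

def rotl32(v, n):
    """Rotate a 32-bit word left by n bits."""
    return ((v << n) & MASK32) | (v >> (32 - n))

def quarter_round(a, b, c, d):
    """One ChaCha20 quarter-round on four 32-bit words."""
    a = (a + b) & MASK32; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK32; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 8439, section 2.1.1:
out = quarter_round(0x11111111, 0x01020304, 0x9B8D6F43, 0x01234567)
assert out == (0xEA2A92F4, 0xCB1CF8CE, 0x4581472E, 0x5881C4BB)
```

A sign flip or swapped rotation amount in any of those four steps fails the assertion immediately, which is the kind of bug that slips through when a bespoke construction has no vectors of its own.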
Moreover, if we're talking about backdoors, then code complexity--even just sheer number of lines of code--is the spy's friend. For more complex algorithms it would be more practical to trojan COTS and FOSS software to, e.g., substitute an operation so you'd still get the same logical output but lose side-channel resistance.
I'm not an EE, but assuming all the standard reviews happen, I'd much prefer that hardware vendors use something like Simon. Hardware acceleration is the very definition of a blackbox. Hardware developers can copy+paste+munge as well as any software programmer, but there's rarely any subsequent external review. The value of simplicity just can't be overestimated here. Because hardware products lack the extra layers of transparent, open review, we really want to minimize the potential for accidental screwups. The simpler the algorithm, the fewer degrees of freedom they have to be creative.
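To make the simplicity point concrete, here's a sketch of Speck's round structure, using the Speck64/128 parameters from the published design (32-bit words, rotations 8 and 3, 27 rounds). This is my own illustrative Python, not a reference implementation; the only check below is that decryption inverts encryption:

```python
# Sketch of Speck64/128 (32-bit words, rotation amounts 8 and 3,
# 27 rounds), following the published round structure.
# Illustrative only -- not a vetted implementation.

MASK = 0xFFFFFFFF  # 32-bit word mask

def ror(v, n):
    """Rotate a 32-bit word right by n bits."""
    return (v >> n) | ((v << (32 - n)) & MASK)

def rol(v, n):
    """Rotate a 32-bit word left by n bits."""
    return ((v << n) & MASK) | (v >> (32 - n))

def expand_key(k0, l0, l1, l2, rounds=27):
    """Key schedule: reuses the round function on the key words."""
    keys, l = [k0], [l0, l1, l2]
    for i in range(rounds - 1):
        l.append(((keys[i] + ror(l[i], 8)) & MASK) ^ i)
        keys.append(rol(keys[i], 3) ^ l[-1])
    return keys

def encrypt(x, y, keys):
    for k in keys:
        x = ((ror(x, 8) + y) & MASK) ^ k
        y = rol(y, 3) ^ x
    return x, y

def decrypt(x, y, keys):
    for k in reversed(keys):
        y = ror(y ^ x, 3)
        x = rol(((x ^ k) - y) & MASK, 8)
    return x, y
```

The entire cipher is add-rotate-xor on two words, and the key schedule just reuses the round function. There's very little surface for hiding anything, but also very little from which reviewers could infer a rationale for the chosen constants, which is exactly what they asked the NSA to supply.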
The smaller block size and smaller key size profiles were dubious, but that's a judgment call. The NSA probably sees so much bad crypto out there that the Simon & Speck designers could have earnestly considered them a step up. Note that the debate about these weaker profiles was never about choosing them over some stronger algorithm. Rather, the alternative argument was that if a hardware design was so low-power and so low-bandwidth that those profiles were useful, it would be better to not have any crypto at all so nobody would have a false sense of security. From an engineering perspective I think most would agree with the latter; but as a practical matter commercial vendors no doubt will sell half-baked crypto in such tiny devices, and without a known quantity we'll probably be worse off. In any event, those weaker profiles have already been ceded.
Assuming the algorithms continue to hold up to review, I think it would be a net loss to lose Simon & Speck. And, frankly, I'm more suspicious of the motivations of, e.g., Chinese and Russian security services.
As for why the NSA designers haven't been cooperating as much as the community has desired, it's anybody's guess. AFAIU, these designs were something of a 20% project for these engineers and they're probably not getting much support from management for pushing these designs. I don't even think they work in one of the departments these designs normally come from; IIRC they've claimed they tossed it over the fence to one of those secret departments and got a thumbs up. But who knows. And it shouldn't matter, especially for something so simple. All evidence suggests that the NSA no longer possesses extraordinary skill when it comes to cipher and hash design, so provenance shouldn't color anyone's judgment. Academic and private industry designers can and have worked for security services, too.
With today's speed, I'd go with a traditional Feistel cipher with moderately high security margins any time.
It's annoying that NIST approves standards with very low security margins - AES, for instance, has unnecessarily low security margins. Speck and Simon are even worse in that respect.
That would mean a shorter battery life. Not all of IoT is mains-powered.
The root problem is they have two conflicting missions: one is to help USA secure itself and the other is to read all comms worldwide.
If you ever hear that somebody has a plan to have the best offense and make sure that everyone else has the second-best defense so they're the only ones who can do something bad, the first thing that should come to your mind is that they are planning to do something bad. It's not a "will they" situation. It's a "can they" situation. If they have the capability, they will do it.
That is a terrible plan. And don't tell me that they are on our side so it's okay. Secretive, crypto-state actors are on their own side. Ask yourself: if this power eventually falls under the unilateral control of the executive branch, will that always be a good thing under any conceivable future administration? Believing the justifications is extremely dangerous.
This is also, in large part, why these cipher suites are classified: not because declassifying them would make them easier to cryptanalyze, but because it would tip off foreign powers that the US knows how their crypto works!
(And this is, of course, just as true of other weapons-systems development projects as it is of state-run cryptosystem developments.)
I mean it's not like division of intelligence agencies is anything new (FBI for counter-intelligence, CIA for foreign-intelligence etc).
Of course, to make it work you'd need a watchdog with actual teeth, and historically the NSA has regarded oversight as, well, something it neither needs nor wants.
Those include statements about how far cryptanalysis has weakened the ciphers, which the NSA claims was roughly what they had expected during design.
If the NSA published its own cryptanalysis, would you believe it, or would you assume they had told less than the whole story? What if they paid an academic to publish cryptanalysis (“of course he would say that, he was paid $X by the NSA!”)? The NSA appears to be in a catch-22 here.
I’d opt for not trusting them, but even if they did provide some details elsewhere I’d imagine ISO had some questions the NSA didn’t feel like answering...
There's a great dad joke that relates: if you have a boy, you only have to worry about one little prick, but if you have a girl, you have to worry about all the little pricks out there.
Too simple, as some cryptographers would say...
>the NSA's behavior was outrageously adversarial to the process. They refused to motivate design choices they made such as the choice of matrices U, V, and W in Simon's key schedule. Instead, they chose to personally attack some of the experts (including @hashbreaker, Orr Dunkelman and myself) as incompetent.
>This is yet another example as to how the NSA's surveillance program is bad for global security. If they had been more trustworthy, or at least more cooperative, different alliances would have probably been formed. But instead, they chose to try to bully their way into the standards which almost worked but eventually backfired.
I took a look at the performance tests of AES vs NSA's Simon/Speck done by the CryptoLUX group at the University of Luxembourg. They have so much data comparing different scenarios, processors, and implementation versions that it's difficult to summarize the trade-offs. But my brief look at AES vs Simon/Speck on an 8-bit Atmel AVR processor suggests that the differences in code size and RAM are in the hundreds of bytes (bytes, not megabytes), and that AES performance might be approximately equal (if AES is implemented with large code size and RAM) or up to 10-15 times slower (if implemented with small code size and RAM).
Seriously, embedded software these days is so bloated (just like in web development), and processors and RAM are so over-provisioned, and encryption is such a minuscule part of the tasks of a system that I wonder if using a standard algorithm like AES would make a perceptible difference to anybody.
code: 2588 bytes; RAM: 208 bytes; encrypt: 2835 cycles
code: 972 bytes; RAM: 200 bytes; encrypt: 1793 cycles
code: 1426 bytes; RAM: 132 bytes; encrypt: 997 cycles
The differences look insignificant. Code size and RAM requirements are in bytes (not megabytes). In fact, AES might be faster than Simon because AES encrypts a 128-bit block whereas both Simon and Speck (in these profiles) encrypt a 64-bit block; therefore you'd have to encrypt two Simon blocks to get the equivalent of one AES encryption.
If companies want every toaster and hair dryer connected to the Internet, they better damn well be sure they are secured. Any performance hit is a small price to pay.
I'd have to go look at their data myself, because AES is slow even compared to ChaCha20, and these algorithms are significantly more lightweight than that.
Also AVRs are cheap and very easy to use due to their simplicity.
AVRs are pretty hard to use when you hit one of the numerous hardware and peripheral walls. Synchronous timers are killing me this week, which led me to switch to a PIC part.
Try $0.30 or cheaper, with only 1KB ROM and 32 bytes of RAM. (Not kilobytes. I said bytes.)
The "smallest processors" and "most trivial embedded devices" are just that: incredibly trivial and incredibly small. These are still useful, especially because they use less power than the passive-draw of most components.
When you need basically a voltage-monitor, an inaccurate clock, and a tiny bit of logic, these tiny chips are quite useful.
Besides, that microcontroller has either 512 or 1024 bytes of flash program memory. That's not enough space to run any of the lightweight encryption algorithms (well, maybe, Simon, but you wouldn't have space to do anything else!).
As others have mentioned, Simon and Speck are very straightforward; there really isn't much room for obscuring anything there. On the other hand, when a group within a standards org begins to feel so privileged that it can operate under its own rules, without truly cooperating with the others or sharing information in the way people are asking for it, that's going to breed further mistrust, given the environment already made tense by the history here.
So the people who are suspicious of NSAs motives and actions are "tin foil hats", but your position is somehow "the reality"?
> From someone who has been following this saga
What does following mean here? Who are you? What are your credentials that somehow make your opinion more authoritative than that of others?
> Most likely due to their own internal self-conflict.
So you are just guessing? Is it at least an educated guess? If so, based on what?
> There really isn't much room for obscuring anything there.
Is there a publicly verifiable analysis by cryptographers to back this claim?
The standardization of Simon and Speck has been an ongoing fight within ISO/IEC JTC1 SC27 WG2 since 2014 or so, but looks like it's finally game over for now.
What happened here seems like a combination of two things: first, a general statement that the community is skeptical of NSA-related standards after the Dual EC fiasco, just on principles, and, second, process concerns about the way NSA interacts with standards bodies --- their work is considered poorly documented and their engagement with the academic research community (for instance, to answer concerns about flaws in their designs) is poor.
Slap NSA's hand for being abusive to the privacy of everyone, including their own citizens? We need more of that.
They should be going out of business because all their customers left in droves.
But they didn't and RSA is still an esteemed security company.
Look at what happened when Juniper firewalls were outed by Snowden.
Did we ever hear the name of the employee who backdoored their product?
Surely they use revision control and can tell who contributed what.
I have to wonder if the NSA mole still works there too.
Zero transparency from these "Security Companies".
Casting political problems against technical problems is a tough endeavor.
It would be a very blatant move because it'd be rather suspicious if the NSA chose not to use these ciphers. Still, the possibility might in small part contribute to this failure.
If US federal defensive cybersecurity (especially on cryptographic matters) is going to have credibility in its recommendations outside of the US government, especially given the long history of the NSA compromising defensive recommendations in service of its offensive mission, it needs to be both visibly and effectively distanced from the signals intelligence mission of NSA/CSS.
This is the NSA. They're no fools. And they know that no one is going to trust them, especially if they try to bully their way and not reveal details.
What if the next-best competitor for this encryption is actually something they've broken? Could be that they got clever and lucky and figured it out, could be that they planted it with someone secretly working for the NSA. Then it would be in their interest to loudly lose in such a way that the standards committee picks the secretly-broken encryption rather than the one the NSA was pushing.
Fun tinfoil hat ideas, naturally, but it would sure make a better story than the NSA trying to backdoor an encryption standard again.
The NSA has eroded much of the trust it once had. This reduces its effectiveness as an organization and puts all Americans and American companies at increased risk.
Those who committed the crimes revealed by Snowden should be brought to justice, the program dismantled, the hardware auctioned off, and the money returned to taxpayers.
One does not have to be a privacy zealot or an anarchist to believe that the NSA should act within the law.
Both yes and no.
No, because many startups are just looking for the shortest path to market and that means they absolutely will go for US cloud storage and computing if it best serves their initial business plans.
Yes, because there are many ethical businesses (albeit smaller) -- also in light of the GDPR -- who have a clear business model that doesn't involve selling personally identifiable information. And they now would go the extra mile to ensure they don't use USA-hosted services. I know business owners who did this, and I'd do it as well if I were one.
Post Snowden there were a lot of companies offering secure email hosting in Europe. Not sure if that really amounted to a big loss for the USA email hosting market though. Many people are too dependent on Gmail to ever replace it, for example.
For example, if you use "voice assistant" from some major company, what prevents it from voluntarily sharing all the records with the government for the sake of national security? What prevents its employee from secretly sharing the data under some legal obligations?
The location of the server is really important here.
Also keep in mind that the US government itself wants to use these algorithms. Why would they use a broken symmetric cipher?
Those pesky Belgians!
Also, seeing the overall good treatment the US reserves for Israel, I'm always surprised when someone implies that Israel wants to damage the US.
Imagine how actually fucked you have to be to believe this shit.
One of the ideas I've tinkered with along the way is hiding encrypted emails in content that's markov generated from unencrypted communications with the same peer. It doesn't have to be perfect, just good enough to not get caught in the slime squad's algorithms.
Then for each new message subtract 1 from the number of leading zeros.
Now at some point Eve has to decide for herself when it's no longer worth it to brute force your messages.
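For what it's worth, the leading-zeros mechanic described above is essentially hashcash-style proof of work. A minimal sketch, where SHA-256 and all the function names are my own illustrative choices rather than anything from the comment:

```python
# Minimal hashcash-style sketch of the "leading zeros" idea.
# SHA-256 and all names here are illustrative choices, not a spec.
import hashlib

def leading_zero_bits(data: bytes) -> int:
    """Count leading zero bits of SHA-256(data)."""
    digest = hashlib.sha256(data).digest()
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        # Count leading zeros within the first nonzero byte.
        bits += 8 - byte.bit_length()
        break
    return bits

def stamp(message: bytes, difficulty: int) -> int:
    """Find a nonce so the hash has at least `difficulty` leading zero bits."""
    nonce = 0
    while leading_zero_bits(message + nonce.to_bytes(8, "big")) < difficulty:
        nonce += 1
    return nonce
```

Each subsequent message could then demand one fewer zero bit, as the comment suggests, so the expected work per message halves for the sender while Eve still has to decide how many candidate messages are worth brute-forcing.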
That's pretty interesting, one way or the other.
This was proximately caused by Snowden disclosing what the NSA was doing. It was ultimately caused by NSA monitoring major swaths of the population.
ETA: decent summary: https://en.wikipedia.org/wiki/Proximate_and_ultimate_causati...
I’ll still argue for the ultimate cause, because if it wasn’t Snowden, it seems likely it would have been someone else.
I’m a firm believer in don’t shoot the messenger, even if they are flawed.
No. NSA's reputation is its own. They were once trusted as experts, but that stopped long ago. Clipper chips, Lotus Notes, ECHELON, RSA... the decline in NSA's trustworthiness started decades ago and has been steady ever since. Snowden gave us shiny PowerPoints and codenames, but the overall programs have been common knowledge in the community for generations.
NSA's recommendations were also one of the reasons that the Rijndael cipher was chosen as AES. So at least in the late 90s they had some positive reputation. As far as I can tell, the tipping point is 2001. After that, the spy part of the NSA seems to have all the leverage, and actively proposing backdoored standards is not beneath their dignity.
Ask the old hats about "no such agency". If you were working with the US government in the 80s/90s you knew the term.
They have always been a spy agency.
They were a spy agency as long as they were the NSA, and inherited that function from prior, WWII spy agencies.
> That was the CIA's job.
CIA is a civilian spy agency focused on HUMINT, analysis, and open source intelligence, and also the lead US spy agency (its head was essentially the overall head of intelligence prior to the separate DNI), while NSA has always been a defense department spy agency and the lead SIGINT spy agency.
> NSA were the security experts.
They have that function, too, but they have been a spy agency going back to 1945 (before becoming public in 1952) and inherited duties from prior spy agencies when they were founded.
As linked elsewhere in this thread,
Snowden certainly didn't help the NSA plan get traction, but reducing the rejection to Snowden is overly simplistic.
If they were going to do this, wouldn't they submit it under a pseudonym?
Let's not pretend it's some impossible or unheard-of feat to push something through, like getting someone else to publish it.
You'd have to be naive to suggest the NSA hasn't compromised any individuals in the security community.
You simply can't be anonymous and trusted at the same time.
As I understand it, he applied a bunch of well-known algorithms to solve a long-standing problem. He's not a cryptologist but, as far as I can tell, he never claimed to be.
As it currently stands, the IA mission of the NSA is solely to secure National Security Systems, which are systems that handle classified information or are critical to military or intelligence activities.
"National Security Directive (NSD) 42 authorizes NSA to secure National Security Systems, which includes systems that handle classified information or are otherwise critical to military or intelligence activities." https://www.nsa.gov/what-we-do/information-assurance/
I agree completely that our rights shouldn't be trampled on by governing agencies. Yet, I don't know how they're going to work in the old fashioned way. Perhaps it's not required, but I can't blame them for seeking that option, even if well meaning.
(By old fashioned, I mean back in the days of easily monitoring everyone's phones and calling it good enough. Things are becoming increasingly difficult to monitor for even a well-meaning government.)
edit: Not sure why I'm being downvoted, so let me elaborate. We all know it's stupidly common for the government to monitor its civilians. Those days are being altered heavily due to cryptography. In some ways it will still be easier to track people; in other ways it will be harder.
Is this news to any of you? I don't get the push back lol.
The idea isn’t that you can’t surveil within the US. The idea is that routine traffic snooping is off limits. A warrant and a fair bit of effort should be required. The less effort required to wiretap, the closer we get to a surveillance state. Which is why trying to nerf IoT encryption is deeply wrong. It weakens freedom within the country, and makes the country weak to outside attacks. Lose-lose.
However, crypto doesn't abide by warrants. So many people read into my comment thinking that I'm supporting a surveillance state. I am not. I simply am talking about how it is, indeed, complicated.
Crypto doesn't care about warrants. The problem is complicated.
The fact that you can even talk about warrants in a crypto conversation sort of irks me to be honest. Warrants are meaningless in crypto conversations. Not just meaningless, but incompatible.
Don't argue points I'm not trying to make please. I'd delete my original comment if I could -_-
Warrants aren't relevant if we're talking about crypto, but warrants are relevant if we're talking about surveillance.
It's easier today than it has ever been for surveillance services to monitor large amounts of people. Not just because they have the access and computational ability to automatically search the content of messages, but also because more of our communications are long-distance or otherwise digitised than ever before.
Oh, I thought it was far easier to tap a phone than to break modern cryptography. Why was it not?
> It's easier today than it has ever been for surveillance services to monitor large amounts of people. Not just because they have the access and computational ability to automatically search the content of messages, but also because more of our communications are long-distance or otherwise digitised than ever before.
I would argue that this is changing though, hence the entire point of this Post. In the 90s no one encrypted anything, even https was laughable. More and more things are being encrypted.
That's what this post is effectively about, no? The government is trying to stay ahead of the ball on cryptography and hoping to keep their ability to monitor everyone.
Do you somehow disagree? I'm confused.
Here's your problem. The fact that many people do something does not make it right -- it's a logical fallacy / conflation and it seems our brains are very vulnerable to it. Is it really so hard to believe that many people are wrong? Why does quantity equal quality?
I mean, it was very common to have slaves as well. Did that make it okay for the slaves?
I didn't once say it was right. In fact, I alluded to me being against it.
However, that doesn't change the fact that if you are tasked with protecting a state, it's the effective (but morally wrong) choice to monitor them.
My comment was simply that, like it or not, if you truly had 100% vision over the population, you'd be able to protect them. In the same way, a dictator could be far more effective than a democracy. Again, since y'all will love to jump on me over this, I'm not supporting this.
I'm saying it's simply complicated. If the NSA is a good actor (I already said I don't believe them to be), then they are being asked to "protect" the country while flying blind. At least, that's what some people want.
.. again, I'm not defending them. I am in full support of all things crypto, and do not want any rights on any civilian to be trampled on.
Y'all are touchy.
> Is it really so hard to believe that many people are wrong? Why does quantity equal quality?
I don't know, perhaps you should ask someone who said that? Ie, not me, because I didn't say that. Again.
I was also way too vague. A better thing to talk about would have been warrants, and how our legal system is designed to ensure the government can invade your privacy assuming reasonable requirements are met... yet crypto changes that game entirely.
I definitely don't want the government invading my privacy, and I want truly secure crypto in all things. I just can sympathize with how much this is going to change things... some for better, and some for worse.
Do you also mistrust Wikipedia? Besides, it is not like every reputed journalist these days maintains basic journalistic standards or integrity. Case in point, article by Reuters from two days ago: https://news.ycombinator.com/item?id=17057700
You're just mad I didn't trust your article
Would they allow your kids to use Wikipedia's own sources — as long as they're legitimate — as their sources?
I do not trust traditional journalism as a source either (unlike Wikipedia and most schools). I try to find their sources; and when I can't, I regard it with suspicion.
> You're just mad I didn't trust your article.
This is either incredibly childish or too subtly tongue-in-cheek for me. Neither is the article "mine" in any way, nor was I "mad". I was genuinely curious.