NSA encryption plan for ‘internet of things’ rejected by ISO (wikitribune.com)
405 points by Jerry2 33 days ago | 195 comments



Key point:

>According to WikiTribune’s source, experts in the delegations have clashed over recent weeks and the NSA has not provided the technical detail on the algorithms that is usual for these processes. The U.S. delegation’s refusal to provide a “convincing design rationale is a main concern for many countries,” the source said.

So it's not just "We don't trust anything the NSA puts out." It's "The NSA refuses to explain its algorithms beyond saying 'Trust us,' and we don't."


It's worth bearing in mind that the documentation issues here are basically process concerns more than they are substantive concerns. Both Simon and Speck are straightforward designs. Cryptographers are capable of evaluating a deliberately-simple lightweight ARX cipher!

But in real standards competitions, academic cryptographers bundle their designs with rationale essays and point-by-point explanations of how the designer mitigated attacks, like differential and linear trails. Standards groups didn't get that from NSA, and when academic cryptographers poked at the ciphers and asked questions about linear trails, the NSA designers got standoffish.

I think there's a subtext to all of this where the NSA is dismissive of, well, basically all academic cryptanalytic work. The converse (academics being dismissive of the NSA) didn't, I think, use to be true, but it might gradually be taking shape, so that the two groups are just mutually dismissive of each other.

So, where in the past the NSA got some deference that enabled them to submit standards proposals that didn't follow process, now the opposite is true, and academic cryptographers expect deference.

It's no tragedy. NSA brought this on themselves, and really, what we're "losing" here is kind of a marginal design anyways, right?

(I write this in the hopes that someone better connected to these issues will correct me on lots of it!)


>I think there's a subtext to all of this where the NSA is dismissive of, well, basically all academic cryptanalytic work. The converse of that, of academics and the NSA, didn't (I think?) used to be true, but might gradually be taking this shape, so that the two groups are just mutually dismissive of each other.

This isn't new. My paper about exploiting shared caches in Intel Hyperthreading as a side channel to steal an RSA key was rejected by the Cryptology ePrint archive "because it wasn't about cryptology", while some people in the computer security community dismissed it as "just a theoretical cryptography thing".


[My paper about exploiting shared caches in Intel Hyperthreading to steal an RSA key was rejected by the Cryptology ePrint archive "because it wasn't about cryptology"]

Why shouldn’t it have been rejected?

I can’t see how it was about cryptography either as they seem to define it based on the center of gravity of their papers: https://eprint.iacr.org/2004

Separately, if what you said about the security community downplaying your results as too theoretical was not just the occasional opinion of a maverick, then clearly that was incorrect and unfortunate in multiple ways.

Finally regardless of any of that, great work on your contributions. Nice insights and efforts, coming so early on in the lifespan of an important problem.


>I can’t see how it was about cryptography either as they seem to define it based on the center of gravity of their papers: https://eprint.iacr.org/2004

A quick search shows eight papers which have "side channel" in their titles, so I think it's a bit of a stretch to say that they don't consider side channel attacks to be cryptography...


I was trying to guess that 2004 was the most proximate archive year of their papers prior to when yours was rejected, hence that list: https://eprint.iacr.org/2004

Are you saying any of these papers make a significant argument about side channel attacks? Or are you saying there are eight papers that make some reference to it? If it’s the latter, that’s quite a big difference, and it’s easy to see the logic of rejecting your paper based on its central theme.

I didn’t notice any of the papers making a significant argument about side channel attacks. Maybe 2004 was not the most proximate prior year to take as sample data? Or maybe I’m just overlooking the eight you’re referring to?

Btw I wouldn’t begrudge you any wtf thinking if you had any. It would definitely suck to do good work and not get proper and timely recognition for it, especially when it could have sped a solution or helped mitigate a real life problem.

It’s just that to whatever degree this suckness happened, I can’t see how it was due to irrational or biased reasoning on the part of the Cryptology ePrint archive.


I'm saying there are eight papers on that list which have "side channel" in their titles. I assume, based on those titles, that the papers have something to do with side channels...


Are you claiming to have prior knowledge and academic precedent for Meltdown?


Basically, yes :-)

https://www.daemonology.net/hyperthreading-considered-harmfu...

http://www.daemonology.net/blog/2018-01-17-some-thoughts-on-...

A 2005 paper is linked there, in which he demonstrated such an attack and worked with the usual people to implement fixes.

In fairness to cperciva he clearly distinguished his work from Meltdown/Spectre - "These new attacks use the same basic mechanism, but exploit an entirely new angle."

I think that since the world was surprised by how bad it really was in practice, it's fair to say cperciva (as well as others) predicted the explosion, but not necessarily the timing or the blast radius.

There are I am sure many other papers in corners of the net that explain the next one to come bite us.

PS cperciva was the Security Officer for FreeBSD and tends to know more about this stuff than the average bear.

(Again HN shows its ability to have someone with truly detailed knowledge just one comment away.)

NB _ I may have some details wrong, please correct if needed.


A couple of people do, if I recall correctly.


I definitely remember rumblings about "Branch prediction runs code outside normal execution? There's got to be a security hole there somewhere." That sentiment was common enough that it's certainly not hard to imagine someone sketching the shape of an actual attack with it before the detailed proof came down the line.


Speck and Simon have the benefit of simplicity. Like RC4, Speck could even be memorized. It's like 30 years of block cipher design has been condensed into the smallest possible algorithm.
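To underline how little there is to memorize: the entire cipher fits in a few lines. Here is a minimal, illustrative Python sketch of Speck128/128 taken from the public specification (no side-channel hardening, no mode of operation), checked against the designers' published test vector:

```python
# Speck128/128: 64-bit words, 128-bit key, 32 rounds. Illustrative sketch only.
MASK = (1 << 64) - 1

def ror(x, r):  # rotate a 64-bit word right by r bits
    return ((x >> r) | (x << (64 - r))) & MASK

def rol(x, r):  # rotate a 64-bit word left by r bits
    return ((x << r) | (x >> (64 - r))) & MASK

def speck128_128_encrypt(x, y, k, l):
    """Encrypt block words (x, y) under key words (l, k), deriving
    round keys on the fly -- the key schedule reuses the round function."""
    for i in range(32):
        x = ((ror(x, 8) + y) & MASK) ^ k   # round function: rotate, add, xor
        y = rol(y, 3) ^ x
        l = ((k + ror(l, 8)) & MASK) ^ i   # key schedule: same shape, round counter as "key"
        k = rol(k, 3) ^ l
    return x, y

# Test vector from the Simon/Speck design paper:
ct = speck128_128_encrypt(0x6c61766975716520, 0x7469206564616d20,
                          0x0706050403020100, 0x0f0e0d0c0b0a0908)
print([hex(w) for w in ct])  # paper lists ciphertext a65d985179783265 7860fedf5c570d18
```

That really is the whole algorithm: one add, two rotates, two xors per round.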

Simplicity is useful. I've seen on multiple occasions bugs in more complex algorithms, like ChaCha20. Test vectors don't help as much--or at all--when you're creating a bespoke CSPRNG as in the OpenBSD and Linux kernels that repurposes the core round functions.

Moreover, if we're talking about backdoors, then code complexity--even just sheer number of lines of code--is the spy's friend. For more complex algorithms it would be more practical to trojan COTS and FOSS software to, e.g., substitute an operation so you'd still get the same logical output but lose side-channel resistance.

I'm not an EE, but assuming all the standard reviews happen, I'd much prefer that hardware vendors use something like Simon. Hardware acceleration is the very definition of a blackbox. Hardware developers can copy+paste+munge as well as any software programmer, but there's rarely any subsequent external review. The value of simplicity just can't be overestimated here. Because hardware products lack the extra layers of transparent, open review, we really want to minimize the potential for accidental screwups. The simpler the algorithm, the fewer degrees of freedom they have to be creative.

The smaller block size and smaller key size profiles were dubious, but that's a judgment call. The NSA probably sees so much bad crypto out there that the Simon & Speck designers could have earnestly considered them a step up. Note that the debate about these weaker profiles was never about choosing them over some stronger algorithm. Rather, the alternative argument was that if a hardware design was so low-power and so low-bandwidth that those profiles were useful, it would be better to not have any crypto at all so nobody would have a false sense of security. From an engineering perspective I think most would agree with the latter; but as a practical matter commercial vendors no doubt will sell half-baked crypto in such tiny devices, and without a known quantity we'll probably be worse off. In any event, those weaker profiles have already been ceded.

Assuming the algorithms continue to hold up to review, I think it would be a net loss to lose Simon & Speck. And, frankly, I'm more suspicious of the motivations of, e.g., Chinese and Russian security services.

As for why the NSA designers haven't been cooperating as much as the community has desired, it's anybody's guess. AFAIU, these designs were something of a 20% project for these engineers and they're probably not getting much support from management for pushing these designs. I don't even think they work in one of the departments these designs normally come from; IIRC they've claimed they tossed it over the fence to one of those secret departments and got a thumbs up. But who knows. And it shouldn't matter, especially for something so simple. All evidence suggests that the NSA no longer possesses extraordinary skill when it comes to cipher and hash design, so provenance shouldn't color anyone's judgment. Academic and private industry designers can and have worked for security services, too.


Low-powered hardware? You can give it more power. Why should security be sacrificed at all by IoT devices just to make them cheaper? Make them secure in the first place, period.

With today's speed, I'd go with a traditional Feistel cipher with moderately high security margins any time.

It's annoying that NIST approves standards with very low security margins; AES, for instance, has unnecessarily low margins. Speck and Simon are even worse in that respect.


> Low powered hardware? You can give it more power.

That would mean a shorter battery life. Not all of IoT is mains-powered.


Simplicity also reduces defense in depth...


That they think this has even a tiny chance of happening after Snowden is really surprising. I guess we really need to make sure people don't forget about that going forward.


They brought this on themselves.

The root problem is they have two conflicting missions: one is to help USA secure itself and the other is to read all comms worldwide.


The way it's supposed to work is that they try to have the best code cracking, and then they want everyone else to use the second-best codes so no one else can crack them. If they aren't the best, the plan can't work, and you get junk like the Clipper chip (or whatever it was called).


Anyone who falls for this kind of reasoning is incredibly naive.

If you ever hear that somebody has a plan to have the best offense and make sure that everyone else has the second-best defense so they're the only ones who can do something bad, the first thing that should come to your mind is that they are planning to do something bad. It's not a "will they" situation. It's a "can they" situation. If they have the capability, they will do it.

That is a terrible plan. And don't tell me that they are on our side so it's okay. Secretive, crypto-state actors are on their own side. Ask yourself: if this power eventually falls under the unilateral control of the executive branch, will that always be a good thing under any conceivable future administration? Believing the justifications is extremely dangerous.


This seems like a very easy problem to fix. We can split NSA into two and now the defensive people are not tainted by the people trying to break into everything. I mean the defensive people can focus on trying to secure everything, everywhere and against everyone including their former coworkers. Is this too naive?


I think the practical barrier in the way of this is that one of the main ways the NSA “invents” new Suite A ciphers is by the cryptanalysis of cryptosystems of foreign militaries. The “defending the US” job is a consequence of the “attacking others” job; the crypto experts got to be crypto experts by breaking rival crypto.

This is also, in large part, why these cipher suites are classified—not because declassifying them would make them easier to cryptanalyze; but because it would tip off foreign powers that the US knows how their crypto works!

(And this is, of course, just as true of other weapons-systems development projects as it is of state-run cryptosystem developments.)


I don't think it's naive and would likely be better than the current system where the poachers are way more heavily supported than the gamekeepers.

I mean it's not like division of intelligence agencies is anything new (FBI for counter-intelligence, CIA for foreign-intelligence etc).

Of course, to make it work you'd need a watchdog with actual teeth, and historically the NSA has regarded oversight as, well, something they think they don't need or want.


It is a little more complex than that - they also need to figure out what codes the US government uses for classified secrets that we don't want other countries to find. It is not clear how they resolve the inherent conflict in these goals.


For these ciphers, it seems less likely that NSA has a backdoor that no-one else could find. Notably in the case of dual-EC there was a recommended curve chosen by the NSA. That was easy to backdoor by knowing how the curve was generated.


More importantly, pretty much the whole point of a PKRNG is to make the random state recoverable. It's not as if competing RNGs have designs that enable the kind of backdoor Dual EC does. That's what was so weird about it, and why there was some doubt about what it was --- not doubt that people should use Dual EC (of course they shouldn't, and it's been amazing to see companies like RSA and Juniper actually adopt it; the cryptographic incompetence behind those decisions was shocking), but that NSA could really be using such blatantly awful tradecraft.
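To make the "recoverable state" point concrete, here is a toy analogue of the Dual EC trapdoor, substituting modular exponentiation for elliptic-curve point multiplication and skipping the real generator's output truncation; all constants and names are invented for illustration:

```python
# Toy Dual-EC-style PKRNG backdoor in a multiplicative group mod p.
# In the real Dual EC, P and Q are curve points with a secret d s.t. P = d*Q;
# here g1 and g2 play those roles.
p = 2**127 - 1          # toy group modulus (a Mersenne prime)
e = 0xC0FFEE            # the designer's secret trapdoor exponent
g2 = 3                  # plays the role of Q
g1 = pow(g2, e, p)      # plays the role of P; the relation to g2 is not published

def prng_step(state):
    """One generator step: advance the state with g1, emit output with g2."""
    state = pow(g1, state, p)
    return state, pow(g2, state, p)

seed = 123456789
s1, out1 = prng_step(seed)   # attacker observes out1 only
s2, out2 = prng_step(s1)

# With the trapdoor: out1**e = (g2**s1)**e = (g2**e)**s1 = g1**s1 = next state.
recovered = pow(out1, e, p)
assert recovered == s2
predicted_out2 = pow(g2, recovered, p)
assert predicted_out2 == out2    # every future output is now predictable
```

The design choice that made Dual EC suspicious is exactly this: anyone holding the hidden relation between the two public constants can rewind one output into the full internal state.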


I don’t know anything about how the NSA interacted with ISO, but it is worth mentioning that the NSA has material explaining these ciphers:

* https://csrc.nist.gov/csrc/media/events/lightweight-cryptogr...

* https://eprint.iacr.org/2017/560.pdf

* https://www.nist.gov/sites/default/files/documents/2016/10/1...

Those include statements about how far cryptanalysis has weakened the ciphers, which the NSA claims was roughly what they had expected during design.

If the NSA published its own cryptanalysis, would you believe it, or would you assume they had told less than the whole story? What if they paid an academic to publish cryptanalysis (“of course he would say that, he was paid $X by the NSA!”)? The NSA appears to be in a catch-22 here.


I think when someone like the NSA provides you an algo you either decide you can’t trust them or need to ask some heavy hitting questions to make sure it’s not broken somewhere along the line for their benefit.

I’d opt for not trusting them, but even if they did provide some details elsewhere I’d imagine ISO had some questions the NSA didn’t feel like answering...


But we can also assume that everyone else working in security has their own other bias: Chinese and Russian services must be at work too.


The nice thing about mathematics is that truth and falsity don’t derive from trust or authority.


Though, the NSA did lose the keys to their van full of fun toys to the Shadow Brokers just a couple years ago. In my mind that adds 'incompetent' right there next to 'evil/criminal' on the list of reasons they are untrustworthy.


I think that speaks more to the vastly more difficult task of playing defense than offense.

There's a great dad joke that relates: if you have a boy, you only have to worry about one little prick, but if you have a girl, you have to worry about all the little pricks out there.


One could reverse genders in that joke, but many would call that version sexist ;) And both versions are, in truth.


It's like a burglar offering to change your locks. And promising he doesn't have a copy of the key. But you aren't allowed to check. Just trust him.


On the other hand, an honest burglar is probably the most reliable source for all the known exploits and can judge a good lock when they see it.


Yes, but the NSA is hardly honest.


However, Speck is out in the open, the specification is public. It's surprisingly simple to implement.

Too simple, as some cryptographers would say...


What's also interesting is how the NSA admonished and personally attacked three cryptographers (including Daniel J. Bernstein aka djb) and called them incompetent:

>the NSA's behavior was outrageously adversarial to the process. They refused to motivate design choices they made such as the choice of matrices U, V, and W in Simon's key schedule. Instead, they chose to personally attack some of the experts (including @hashbreaker, Orr Dunkelman and myself) as incompetent.

>This is yet another example as to how the NSA's surveillance program is bad for global security. If they had been more trustworthy, or at least more cooperative, different alliances would have probably been formed. But instead, they chose to try to bully their way into the standards which almost worked but eventually backfired.

Rest: https://twitter.com/TomerAshur/status/988696306674630656


I wonder how they personally attacked those experts. Was it public?


"Personal attack" is a pretty big stretch here. The NSA is generally dismissive of academic cryptographers, and was dismissive here.


Okay, that's what the article says, but the tweets specifically say that they attacked the credibility of some very well-respected security experts.


If they're talking about what I think they're talking about, they're referring to a technical argument in which pretty much everyone was dismissive. It was still more civil than a typical HN thread, which is in turn more civil than a typical Reddit thread, which in turn is... my point being: outrageous personal attack is a bit of a stretch.


I take it you're referencing a discussion that isn't public?


So, the stretch is loosely defined. I concur - please reference source material.


Does anyone who's been following these IoT encryption standards think that new algorithms are truly needed? Considering that even the most trivial embedded devices these days get powerful microcontrollers and megabytes of RAM -- often running a full operating system! -- is there any perceptible gain with one of these unknown lightweight algorithms compared to using a well-known and standard algorithm like AES?

I took a look at the performance tests of AES vs NSA's Simon/Speck done by the CryptoLUX group at the University of Luxembourg[1]. They have so much data comparing different scenarios, processors, and implementation versions that it's difficult to summarize the trade-offs. But my brief look at AES vs Simon/Speck on an 8-bit Atmel AVR processor is that the difference in code size and RAM are in the hundreds of bytes (bytes, not megabytes) and AES performance might be approximately equal (if AES is implemented with large code size and RAM) or up to 10-15 times slower (if implemented with small code size and RAM).

Seriously, embedded software these days is so bloated (just like in web development), and processors and RAM are so over-provisioned, and encryption is such a minuscule part of the tasks of a system that I wonder if using a standard algorithm like AES would make a perceptible difference to anybody.

[1] https://www.cryptolux.org/index.php/FELICS_Block_Ciphers_Det...


It's too late to update my original comment, but here are the fastest 128-bit key implementations of AES, Simon, and Speck for the AVR processor according to the CryptoLUX group at the University of Luxembourg:

AES> code: 2588 bytes; RAM: 208 bytes; encrypt: 2835 cycles

Simon> code: 972 bytes; RAM: 200 bytes; encrypt: 1793 cycles

Speck> code: 1426 bytes; RAM: 132 bytes; encrypt: 997 cycles

The differences look insignificant. Code size and RAM requirements are in bytes (not megabytes). In fact, AES might come out ahead of Simon per byte processed, because AES encrypts a 128-bit block whereas both Simon and Speck here encrypt a 64-bit block; therefore you'd have to encrypt two Simon blocks to get the equivalent of one AES encryption.
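Normalizing those quoted cycle counts per byte makes the comparison easier to see (a rough back-of-the-envelope only; it ignores key-schedule amortization and mode-of-operation overhead):

```python
# Cycles per byte from the FELICS numbers quoted above.
# Block sizes: AES = 128-bit (16 bytes); Simon/Speck here = 64-bit (8 bytes).
results = {"AES": (2835, 16), "Simon": (1793, 8), "Speck": (997, 8)}
for name, (cycles, block_bytes) in results.items():
    print(f"{name}: {cycles / block_bytes:.0f} cycles/byte")
```

On these numbers AES lands around 177 cycles/byte, Simon around 224, and Speck around 125, which supports the point that the per-byte gap is modest.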


It depends on the application. If your IoT thing is plugged into wall power then yeah, you can just use a more powerful chip. If your device is running off of a button cell and needs to last for at least a year then you have to be a lot more conscious of cpu cycles and memory size.


I agree it should, but realistically does any IoT device actually worry about such a thing?


Sensor nets often operate in conditions like that. You don't want to have to change the batteries on something attached to a Zebra any more than absolutely necessary.


You also likely don't need to encrypt the data from a Zebra-attached sensor.


Concur completely. My co-worker located a little package called Tiny-AES * that weighs in at 1848 bytes of code and under 300 bytes of RAM for AES-128 CBC with 128-bit blocks targeting a Cortex M0+ (IAR 8.20.2 EWARM and high optimization).

* https://github.com/kokke/tiny-AES-c


Agreed.

If companies want every toaster and hair dryer connected to the Internet, they better damn well be sure they are secured. Any performance hit is a small price to pay.


Hundreds of bytes is a lot on AVRs. A millisecond is a lot on AVRs.

I'd have to go look at their data myself, because AES is slow even compared to ChaCha20, and these algorithms are significantly more lightweight than that.


Why the hell would you use an AVR when there are better-resourced and faster ARM cores for the same power drain?


Perhaps there are now, but these algorithms were developed several years ago.

Also AVRs are cheap and very easy to use due to their simplicity.


STM32 unit cost is lower than most AVRs. And dev is just C as well. Also NXP do Cortex M0 parts for $0.45!

AVRs are pretty hard to use when you hit one of the numerous hardware and peripheral walls. Synchronous timers are killing me this week, which led me to switch to a PIC part.


ChaCha20 is a stream cipher; of course it would be faster.


> Considering that even the most trivial embedded devices these days get powerful microcontrollers and megabytes of RAM -- often running a full operating system!

Nope.

https://www.digikey.com/product-detail/en/microchip-technolo...

Try $0.30 or cheaper, with only 1KB ROM and 32 bytes of RAM. (Not kbytes. I said bytes.)

----------

The "smallest processors" and "most trivial embedded devices" are just that: incredibly trivial and incredibly small. These are still useful, especially because they use less power than the passive-draw of most components.

When you need basically a voltage-monitor, an inaccurate clock, and a tiny bit of logic, these tiny chips are quite useful.


Sure, but in the context of the conversation, we were talking about Internet of Things devices. How likely is it that you'll choose that 6- or 8-pin IC to make an Internet connection? I'm not saying it's impossible, but you're going to need a lot of support ICs; if you're making an Internet-enabled device you're probably not going to pick the tiniest microcontroller.

Besides, that microcontroller has either 512 or 1024 bytes of flash program memory. That's not enough space to run any of the lightweight encryption algorithms (well, maybe, Simon, but you wouldn't have space to do anything else!).


Lots of low power/radios/IoT whatever chips already come with hardware AES support. If a new standard emerges, that will get a hardware implementation, too.


More than that, many embedded processor lines include an AES hardware block in at least some of their devices already.


I see the tin foil hats are out in full swing, but the reality is far from what people assume. As someone who has been following this saga: this is more a fight about how the NSA cooperates (or doesn't) with ISO and other standards orgs. Most likely due to their own internal self-conflict.

As others have mentioned, Simon and Speck are very straightforward. There really isn't much room for obscuring anything there. On the other hand, when any group that's part of a standards org begins to feel so privileged that it can operate under its own rules, without truly cooperating with the others or sharing information in the way that people are asking for it, that's going to breed further mistrust given the already tense environment due to the history there.


I don't know anything of the technical details of Simon and Speck, but distrusting the NSA hardly entails a tinfoil hat.


"Distrusting" one of the simplest "mainstream" ARX ciphers entails a little bit of tin foil. That's not really why ISO isn't moving forward with them.


That seems to be exactly why ISO isn't moving forward, or at least has been in the past.

https://www.reuters.com/article/us-cyber-standards-insight/d...


Distrust of the NSA has cost it the deference it was given by academic cryptographers to basically ignore the process norms of public crypto standards. That's not the same as saying the academic cryptography community distrusts a simple ARX design.


This is a pre-2013 discussion.


> I see the tin foil hats are out in full swing, but the reality is, far from what people assume.

So the people who are suspicious of the NSA's motives and actions are "tin foil hats", but your position is somehow "the reality"?

> From someone who has been following this saga

What does following mean here? Who are you? What are your credentials that somehow make your opinion more authoritative than that of others?

> Most likely due to their own internal self-conflict.

So you are just guessing? Is it at least an educated guess? If so, based on what?

> There really isn't much room for obscuring anything there.

Is there a publicly verifiable analysis by cryptographers to back this claim?


Better article from April when this story first broke. https://www.theregister.co.uk/2018/04/25/nsa_iot_encryption/

The standardization of Simon and Speck has been an ongoing fight within ISO/IEC JTC1 SC27 WG2 since 2014 or so, but looks like it's finally game over for now.


That's not a great article. In reality, nobody thinks Speck and Simon are backdoored --- they're extremely straightforward block cipher designs with well-understood components. Unless the NSA knows something that breaks all modern block cipher designs --- in which case, why tip your hand? --- there's no place to hide a backdoor in either of these standards.

What happened here seems like a combination of two things: first, a general statement that the community is skeptical of NSA-related standards after the Dual EC fiasco, just on principles, and, second, process concerns about the way NSA interacts with standards bodies --- their work is considered poorly documented and their engagement with the academic research community (for instance, to answer concerns about flaws in their designs) is poor.


I don't see why any non-American company would accept any of their thoughts/designs after their sneaky backdoored PRNG.


They shouldn't, but at the same time, these non-American companies should at least be honest about why they're rejecting the NSA's thoughts/designs rather than hype up some vapid fear of a backdoor.

Slap NSA's hand for being abusive to the privacy of everyone, including their own citizens? We need more of that.


The backdoored PRNG wasn't all that sneaky? I would assess "don't look behind the curtain" and "nothing up my sleeves because I'm not wearing sleeves" quite differently.


So true. How about the credibility of RSA.

They should be going out of business because all their customers left in droves.

But they didn't and RSA is still an esteemed security company.

What happened when Juniper firewalls were outed by Snowden? Did we ever hear the name of the employee who backdoored their product?

Surely they use revision control and can tell who contributed what. I have to wonder if the NSA mole still works there too. Zero transparency from these "Security Companies".


What or who are the trusted entities whose thoughts or designs are acceptable?

Casting political problems against technical problems is a tough endeavor.


Because we live in a global economy where non-American companies sometimes have the US government as their customer.


Would it perhaps make sense for the NSA to have a publicly discoverable weakness here, and had no plans of using these ciphers in the US? Then the NSA could essentially get more people to use a weak cipher.

It would be a very blatant move because it'd be rather suspicious if the NSA chose not to use these ciphers. Still, the possibility might in small part contribute to this failure.


The NSA trying to propose an encryption plan is like letting wolves decide how to secure sheep. Total conflict of interest, especially after Snowden.


But isn't it the same argument that if you wish to secure your system you really need to get a whitehat hacker? The NSA looks at itself in the same light as a whitehat.


The NSA are overtly, by a central part of their official mission, black hats, not white hats.

If US federal defensive cybersecurity (especially on cryptographic matters) is going to have credibility in its recommendations outside of the US government, especially given the long history of the NSA compromising defensive recommendations in service of its offensive mission, it needs to be both visibly and effectively distanced from the signals intelligence mission of NSA/CSS.


No one else but the NSA sees the NSA as whitehats.


Nation-State Adversary is my favorite backronym for NSA.


Yeah, the NSA's trust is down the drain. I wouldn't call them whitehat. Huge conflict of interest to accept anything they propose.


Hiring a whitehat is one thing. The NSA is gray at best.


Here's a weird/fun thought: what if the NSA was trying to lose?

This is the NSA. They're no fools. And they know that no one is going to trust them, especially if they try to bully their way and not reveal details.

What if the next-best competitor for this encryption is actually something they've broken? Could be that they got clever and lucky and figured it out, could be that they planted it with someone secretly working for the NSA. Then it would be in their interest to loudly lose in such a way that the standards committee picks the secretly-broken encryption rather than the one the NSA was pushing.

Fun tinfoil hat ideas, naturally, but it would sure make a better story than the NSA trying to backdoor an encryption standard again.


Why would they do that rather than just not submit anything and let the already broken next-best competitor win?


Where's the conspiracy theory then?


When we think about what actually matters when it comes to institutions (public or private) one very important thing is trust.

The NSA has eroded much of the trust it once had. This reduces its effectiveness as an organization and puts all Americans and American companies at increased risk.

Those who committed the crimes revealed by Snowden should be brought to justice, the program dismantled, the hardware auctioned off, and the money returned to taxpayers.

One does not have to be a privacy zealot or an anarchist to believe that the NSA should act within the law.


> puts all Americans and American companies at increased risk.

Both yes and no.

No, because many startups are just looking for the shortest path to market and that means they absolutely will go for US cloud storage and computing if it best serves their initial business plans.

Yes, because there are many ethical businesses (albeit smaller) -- and also in light of the GDPR -- who have a clear business model that doesn't involve selling personally identifiable information. They would now go the extra mile to ensure they don't use USA-hosted services. I know business owners who did this and I'd do it as well if I were one.

Post Snowden there were a lot of companies offering secure email hosting in Europe. Not sure if that really amounted to a big loss for the USA email hosting market though. Many people are too dependent on Gmail to ever replace it, for example.


The NSA has lost an insane amount of credibility for acting like a three-letter agency instead of a security administration. Turns out it's pretty hard to be both.


Remember, the S in IoT is for security.


IoOPT — the Internet of Other People’s Things.


This is the first WikiTribune story I've personally noticed on HN. The story is not as juicy as the headline makes it out to be, but it's good to see WikiTribune as a source of news nonetheless. I'm excited to see more from this site in the future.


This is the first story that's got traction, apart from the meta-stories: https://news.ycombinator.com/from?site=wikitribune.com


Do you read WikiTribune? I am always hoping to find better sources of news, but it doesn't appeal to me enough to try.


Not at the moment. I'll give it a shot though and see how things go.


Encryption scheme choice doesn't mean much if the devices send all the data to US-controlled "clouds". NSA will be able to read everything anyway.


Only if you're operating under the assumption that the NSA has broken all modern encryption algorithms


Why do they need to break encryption if they can come to any datacenter and copy data from the server? Or install a backdoor. Or get a subpoena for the data?


The question isn't has the NSA broken all modern encryption algorithms, but rather will the NSA break all modern encryption algorithms and when.


The data is only encrypted during network transit; the "clouds" decrypt it in order to perform requested data-munging and unrequested snooping.


If your encryption is strong and end-to-end, it doesn't matter where your data goes.


It does matter. Because there are many people who have access to the servers and they can cooperate with government agencies voluntarily or involuntarily.

For example, if you use "voice assistant" from some major company, what prevents it from voluntarily sharing all the records with the government for the sake of national security? What prevents its employee from secretly sharing the data under some legal obligations?

The location of the server is really important here.


The point of E2E is that it doesn't matter who owns the server. "End to End" means end to end, not 'end to middle then decrypted then to end'. If it's ever decrypted anywhere but an endpoint it's not E2E, by definition.


In the voice assistant example, the server is one of the endpoints. The point they're making is that E2E doesn't really help there if you can't be sure who is looking over shoulders on the other end. If the two ends are both under your control or scrutiny, then it doesn't matter who's hiding in the clouds... Except for metadata of course.


A typical IoT device talks to a server where the received data is often stored for future use. If they used E2E, the server wouldn't be necessary; it would make more sense to connect directly inside the home WiFi network, for example.


This is what a loss of soft power looks like. The US has burned its reputation and is losing influence in all areas of the world.


I find this kinda sad since both Speck and Simon are really nice algorithms. The analysis that has been done so far also shows they seem secure. Moreover, they are simple; I have made several implementations, including one that runs Simon on an FPGA. It would honestly be hard to sneak in a back door, in my opinion.
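A sense of that simplicity: Speck32/64 is a few lines of ARX code. Below is a sketch written from the parameters in the public Simon and Speck paper (educational only — not constant-time and not production crypto):

```python
MASK = 0xFFFF  # Speck32 works on 16-bit words

def ror(x, r):  # rotate right within 16 bits
    return ((x >> r) | (x << (16 - r))) & MASK

def rol(x, r):  # rotate left within 16 bits
    return ((x << r) | (x >> (16 - r))) & MASK

def speck32_64_encrypt(x, y, key):
    """Encrypt the word pair (x, y) under key words (l2, l1, l0, k0),
    in the order the spec's test vectors list them."""
    l = [key[2], key[1], key[0]]  # l0, l1, l2
    k = key[3]                    # k0
    for i in range(22):           # 22 rounds for Speck32/64
        # round function: add, rotate, xor
        x = ((ror(x, 7) + y) & MASK) ^ k
        y = rol(y, 2) ^ x
        # the key schedule reuses the same ARX structure
        l.append(((ror(l[i], 7) + k) & MASK) ^ i)
        k = rol(k, 2) ^ l[-1]
    return x, y
```

The paper's test vector (key 1918 1110 0908 0100, plaintext 6574 694c, ciphertext a868 42f2) makes it easy to check an implementation like this one.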

Also keep in mind that the US government wants to use these algorithms too. Why would they use a broken symmetric cipher?


Well, I doubt the NSA is stupid. The question is more what their plan is for getting the standards they want accepted. Maybe by exposing themselves and thus creating a better chance for secret allies? We'll need to wait for the next Snowden to find out, I guess, if ever.


So happy to see a Wiki Tribune link here. I think the project is super cool, I hope it stays around.


Some more suspicious elliptic curves no doubt.


Don’t cryptographers still have concerns about an AES backdoor?


No.


Unrelated:

Has anyone ever not clicked on the "this site uses cookies" button on any website since they started being introduced? Does anyone even look at them before clicking them?


That button is just a CYA measure. The idea seems to be to prevent users from claiming they didn't know they were being monitored. It solves nothing and gives users absolutely no control over tracking behavior. I doubt web sites wait for the user's decision to "accept" the use of cookies: since they normally travel via HTTP headers, they would already be in the browser's cookie jar before the page even rendered.
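The mechanics are easy to see in a toy example (the header names are real HTTP; the cookie value and page content are made up for illustration):

```python
# An illustrative raw HTTP response: the Set-Cookie header arrives with
# the very first response, before any consent banner is rendered.
raw_response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "Set-Cookie: tracking_id=abc123; Path=/; HttpOnly\r\n"
    "\r\n"
    "<html>...consent banner markup here...</html>"
)

# Split headers from body, then pull out any Set-Cookie lines.
header_block = raw_response.split("\r\n\r\n")[0]
cookies = [line for line in header_block.split("\r\n")
           if line.lower().startswith("set-cookie:")]
# By the time the user sees the banner, `cookies` is already non-empty.
```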


[dead]


Explain how not using the NSA proposal starves the NSA of talent?


> Two delegates told WikiTribune that the opposition to adding these algorithms was led by Dr. Tomer Ashur from KU Leuven University, representing the Belgian delegation and it was supported by a large group of countries.

Those pesky Belgians!


And an Israeli!


[flagged]


Linking to a Wikipedia article about a biblical figure helps how, in this context?


He points out that one is Israeli and the other has a surname of Hebrew origins. I'm no dog, but even I can hear that whistling.


I didn't want to imply anything, but I too thought that was the point.

Also, seeing the overall good treatment the US reserves for Israel, I'm always surprised when someone implies that Israel wants to damage the US.


They're using anti-Semitic dogwhistling to bring doubt.

Imagine how actually fucked you have to be to believe this shit.


It shows that Ashur is no Nimrod.


The NSA continues to work against the US national interest by illegally spying on its entire populace, so... maybe let's not start throwing stones from inside a glass house, ya know?


This is an international standard. Pretty much any non-American country told them to fuck off.


Good.


What kind of bullshit argument is this? Manipulation? To not allow the NSA to choose what encryption people use? Is it manipulation to prevent the NSA from manipulating us? Give me a fucking break.


Telnet passwords ought to be good enough, right?


No one expects anyone to pass sensitive information in plaintext!


Jedi encryption :)

One of the ideas I've tinkered with along the way is hiding encrypted emails in content that's Markov-generated from unencrypted communications with the same peer. It doesn't have to be perfect, just good enough to not get caught in the slime squad's algorithms.
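The cover-text half of that idea fits in a few lines. Here is a hedged sketch (the step that actually embeds the ciphertext into the generated text is left out, and the function names are mine):

```python
import random
from collections import defaultdict

def build_chain(corpus, order=2):
    """Map each `order`-word prefix to the words that follow it in the corpus."""
    words = corpus.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=30, seed=None):
    """Random-walk the chain to produce filler that mimics the corpus's style."""
    rng = random.Random(seed)
    prefix = rng.choice(list(chain))
    out = list(prefix)
    while len(out) < length:
        followers = chain.get(tuple(out[-len(prefix):]))
        if not followers:
            break  # dead end: no observed continuation for this prefix
        out.append(rng.choice(followers))
    return " ".join(out)
```

Trained on your real past emails to that peer, the output statistically resembles your normal traffic, which is the property the comment is after.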


Or you could send encrypted texts with a subject line that tells how many leading zero bits are in your GPG private key. Perhaps start with keysize - 1.

Then for each new message subtract 1 from the number of leading zeros.

Now at some point Eve has to decide for herself when it's no longer worth it to brute force your messages.

Edit: clarification
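Sketching the arithmetic behind that scheme (a toy illustration of the joke, not a real protocol; function names are made up):

```python
def leading_zero_bits(key_int, keysize):
    """Actual count of zero bits before the first set bit of a keysize-bit key."""
    return keysize if key_int == 0 else keysize - key_int.bit_length()

def claimed_search_space(claimed_zeros, keysize):
    """If the subject line's claim were true, Eve's brute-force space
    shrinks to 2^(keysize - claimed_zeros) candidate keys."""
    return 2 ** (keysize - claimed_zeros)
```

Starting the claim at keysize - 1 advertises a keyspace of just 2 candidates; since the claims are false and only ever shrink, Eve can never tell when brute force would actually pay off.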


Hiding in plain sight.


Snowden caused this.

That's pretty interesting, one way or the other.


Interesting way to put the blame on someone exposing the crimes, instead of those actually doing the crimes.


The parent comment can only be read as assigning blame if you think that ISO rejecting the NSA's proposal is a bad thing. I'd say it's a good thing since we know that the NSA's proposals are sometimes backdoored.


Irrespective of the proposal being backdoored, it was clear that the NSA refused to follow standard procedure by providing the necessary supporting materials for their proposal, so it's not difficult to see why the ISO rejected the proposal.


Your rush to judgement here is unwarranted. OP said nothing about whether what Snowden did was good or bad, just that it was a factor in the ISO's decision.


Still off. The NSA’s behaviours made this decision happen. Snowden exposing those behaviours isn’t really relevant.


You're arguing over proximate vs. ultimate causes, when both are causes.

This was proximately caused by Snowden disclosing what the NSA was doing. It was ultimately caused by NSA monitoring major swaths of the population.

ETA: decent summary: https://en.wikipedia.org/wiki/Proximate_and_ultimate_causati...


I agree with the other guy; that is a useful distinction and I didn’t know that’s what it was called.

I’ll still argue for the ultimate cause, because if it wasn’t Snowden it seems likely it would have been someone else.

I’m a firm believer in don’t shoot the messenger, even if they are flawed.


Thanks for that useful distinction.


Snowden helped bring those behaviors to light by being someone people could 'relate' to. Thus, calling attention to past, present, and future actions by certain agencies.


Unless you're trying to thank him.


True. Snowden made it happen :)


"Snowden causing this" is not a bad thing, it's a good thing. Personally, I like to be blamed for good things...


>> Snowden caused this.

No. NSA's reputation is its own. They were once trusted as experts but that stopped long ago. Clipper chips, Lotus Notes, Echelon, RSA ... the decline in NSA's trustworthiness started decades ago and has been steady ever since. Snowden gave us shiny PowerPoints and codenames, but the overall programs have been common knowledge in the community for generations.


In other words the NSA had a great reputation up until anyone knew it existed? They aren't NIST, they are a spy agency.


While the NSA has never given reasons for its recommendations, it did at least to some extent contribute to securing standards in the past. One example is DES: around 1976 the NSA changed the key schedule, at the same time reducing the key from 128 bits to 56 bits. They never said why, but 15 years later, in 1991, the academic crypto community discovered differential cryptanalysis and found that the changes suggested by the NSA had in fact strengthened DES, not weakened it. The attack made possible by differential cryptanalysis requires 2^55 steps, which is about the same as a brute-force attack against a 56-bit key. That indicates the NSA already knew about differential cryptanalysis in 1976, but only modified DES to resist it, without disclosing the attack. The reduced key length of the new key schedule indicated the actual strength of the cipher without revealing the attack.
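The cost comparison in that last point checks out as simple arithmetic, assuming the brute-force attacker expects to search half the keyspace on average:

```python
# Differential cryptanalysis of DES needs about 2^55 operations;
# exhaustive search of a 56-bit key needs 2^56 tries in the worst
# case, or half that on average.
differential_attack = 2 ** 55
brute_force_average = 2 ** 56 // 2

# The two costs match, so differential cryptanalysis gives the
# attacker no real advantage over brute force against DES.
```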

NSA recommendations were also one of the reasons the Rijndael cipher was chosen as AES, so at least in the late 90s they had some positive reputation. As far as I can tell, the tipping point was 2001. After that, the spy side of the NSA seems to have had all the leverage, and actively proposing backdoored standards is not below their dignity.


They didn't start out as a spy agency. That was the CIA's job. NSA were the security experts. They built the tools that protected government systems, and the systems of government contractors (aerospace). That they are now called a "spy agency" shows that reputation is long gone. Of course now the CIA is flying armed drones and the NYPD has a foreign intelligence branch. The old job descriptions are irrelevant. Everyone can do everything these days.

Ask the old hats about "no such agency". If you were working with the US government in the 80s/90s you knew the term.


SIGINT, that is spying on electronic communications, has always been a core part of the NSA's mission.

They have always been a spy agency.

See https://en.wikipedia.org/wiki/ECHELON


> They didn't start out as a spy agency.

They were a spy agency as long as they were the NSA, and inherited that function from prior, WWII spy agencies.

> That was the CIA's job.

CIA is a civilian spy agency focused on HUMINT, analysis, and open source intelligence, and also the lead US spy agency (and its head was essentially the overall head of intelligence prior to the separate DNI), while NSA has always been a defense department spy agency and the lead SIGINT spy agency.

> NSA were the security experts.

They have that function, too, but they have been a spy agency back to 1945 (before being public in 1952) and inherited duties from prior spy agencies when they were founded.


This is hardly the first time suspicion has fallen on NSA proposals.

As linked elsewhere in this thread, https://en.wikipedia.org/wiki/Dual_EC_DRBG


True, but it's one of the first with hard evidence to use against the NSA.


Trevor Perrin also asked to have an NSA co-chair removed from an IETF group. I'm on my phone and can't find the thread.



Yes, because Snowden taught us not to trust the NSA because they're shysters and crooks. How dare he!


People that aren't the US government have been suspicious of, and reluctant to adopt, NSA backed encryption proposals at least as far back as the Clipper chip, because of the NSA’s overt signals intelligence mission.

Snowden certainly didn't help the NSA plan get traction, but reducing the rejection to Snowden is overly simplistic.


Not really. What Snowden did cause is a lot of people thinking that everything ever leaked about the NSA, and every change in perception of the NSA, was caused by Snowden. But that is a somewhat different thing.


> concerns that the U.S. agency could be promoting encryption technology it knew how to break, rather than the most secure.

If they were going to do this, wouldn't they submit it under a pseudonym?


How would that even work? Cryptography is a small world. A submission from someone nobody's heard of would have no chance.


NSA has a budget of $12 billion.

Let's not pretend it's some impossible or unheard-of feat to push something through, like getting someone to publish it.

You'd have to be naive to suggest the NSA hasn't compromised any individuals in the security community.


The suggestion was that NSA would get someone to submit under a pseudonym, not that they would bribe Orr Dunkelman.


Because no one would consider it, or because careful expert analysis isn't enough and no one would trust it? In the latter case, how does one get on the trusted list, and if NSA is on it how on earth does one get off?


Probably nobody would even consider it. It would be like standardizing a cipher proposed on reddit r/crypto.


In that case they could just pay some reputable experts to consider it. They would pay anonymously, of course.


I think you're seriously underestimating how insular this community is. The phenomenon you're describing really just doesn't happen. You build a reputation among cryptographers until your designs are seriously considered as standards.


Yes I probably am. The reputation shortcut seems to have worked for the community so far, so anonymous crypto designers don't really have anything to complain about.


Why? Doesn't the math stand on its own? Not accepting a sound proposal from a "nobody" is just as bad as accepting the NSA's because people think they're an authority on the matter. Isn't math supposed to be better than this?


The math needs explanation, explanation needs cooperation and coordination, cooperation and coordination breed familiarity, familiarity negates anonymity.

You simply can't be anonymous and trusted at the same time.


Satoshi Nakamoto sympathizes.


In serious academic crypto circles, "Satoshi Nakamoto" is a clown. I don't think a "Satoshi" algorithm would stand much of a chance either.


Can you elaborate? In what way is he a clown? (honest question)

As I understand it, he applied a bunch of well-known algorithms to solve a long-standing problem. He's not a cryptologist but, as far as I can tell, he never claimed to be.


Why would they? A national security organization should be concerned with keeping their citizens’ traffic from being snooped - not snooping their citizens’ traffic.


If the glass is half full of citizen traffic and half full of non-citizen traffic and the NSA drinks ALL of it and I don't think they should be drinking ANY of it does that make me an optimist or a pessimist?


It makes you a cynic, but rightly so.


Considering the nature of comms that happen on line these days, it might make you a septic.


Optimized pessimist. You are part of the majority!


Please look past the name of the Agency to understand its mission. If you think the security of US citizens' communications should be a mission of the NSA, go advocate with your representatives to make it so.

As it currently stands, the IA mission of the NSA is solely to secure National Security Systems, which are systems that handle classified information or are critical to military or intelligence activities.

"National Security Directive (NSD) 42 authorizes NSA to secure National Security Systems, which includes systems that handle classified information or are otherwise critical to military or intelligence activities." https://www.nsa.gov/what-we-do/information-assurance/


Eh, I mean, it's complicated - isn't it? If anyone inside US borders is off limits, wouldn't it be a great place for terrorists to set up shop? No surveillance at all inside US borders, win-win!

I agree completely that our rights shouldn't be trampled on by governing agencies. Yet, I don't know how they're going to work in the old fashioned way. Perhaps it's not required, but I can't blame them for seeking that option, even if well meaning.

(By old fashioned, I mean back in the days of easily monitoring everyone's phones and calling it good enough. Things are becoming increasingly difficult to monitor for even a well-meaning government.)

edit: Not sure why I'm being downvoted, so let me elaborate. We all know it's stupidly common for the government to monitor its civilians. Those days are being altered heavily due to cryptography. In some ways it will still be easier to track people, in other ways it will be harder.

Is this news to any of you? I don't get the push back lol.


> If anyone inside US borders is off limits, wouldn't it be a great place for terrorists to set up shop? Completely no surveillance in US borders, win win!

The idea isn’t that you can’t surveil within the US. The idea is that routine traffic snooping is off limits. A warrant and a fair bit of effort should be required. The less effort required to wiretap, the closer we get to a surveillance state. Which is why trying to nerf IoT encryption is deeply wrong. It weakens freedom within the country, and makes the country weak to outside attacks. Lose-lose.


I completely agree.

However, crypto doesn't abide by warrants. So many people read into my comment thinking that I'm supporting a surveillance state. I am not. I simply am talking about how it is, indeed, complicated.

Crypto doesn't care about warrants. The problem is complicated.

The fact that you can even talk about warrants in a crypto conversation sort of irks me to be honest. Warrants are meaningless in crypto conversations. Not just meaningless, but incompatible.

Don't argue points I'm not trying to make please. I'd delete my original comment if I could -_-


Crypto doesn't respect warrants, but if the government can't break your crypto, they can still install a physical keylogger - provided they go get a warrant.

Warrants aren't relevant if we're talking about crypto, but warrants are relevant if we're talking about surveillance.


There were never days of "easily monitoring everyone's phones". If they wanted to monitor somebody's phone they had to go to great effort to do so.

It's easier today than it has ever been for surveillance services to monitor large amounts of people. Not just because they have the access and computational ability to automatically search the content of messages, but also because more of our communications are long-distance or otherwise digitised than ever before.


> There were never days of "easily monitoring everyone's phones". If they wanted to monitor somebody's phone they had to go to great effort to do so.

Oh, I thought it was far easier to tap a phone than to break modern cryptography. Why was it not?

> It's easier today than it has ever been for surveillance services to monitor large amounts of people. Not just because they have the access and computational ability to automatically search the content of messages, but also because more of our communications are long-distance or otherwise digitised than ever before.

I would argue that this is changing though, hence the entire point of this Post. In the 90s no one encrypted anything, even https was laughable. More and more things are being encrypted.

That's what this post is effectively about, no? The government is trying to stay ahead of the ball on cryptography and hoping to keep their ability to monitor everyone.

Do you somehow disagree? I'm confused.


> We all know it's stupidly common for the government to monitor its civilians.

Here's your problem. The fact that many people do something does not make it right -- it's a logical fallacy / conflation and it seems our brains are very vulnerable to it. Is it really so hard to believe that many people are wrong? Why does quantity equal quality?

I mean, it was very common to have slaves as well. Did that make it okay for the slaves?


> Here's your problem. The fact that many people do something does not make it right

I didn't once say it was right. In fact, I alluded to me being against it.

However, that doesn't change the fact that if you are tasked with protecting a state, it's the effective (but morally wrong) choice to monitor them.

My comment was simply that, like it or not, if you truly had 100% vision over the population you'd be able to protect them. In the same way a dictator could be far more effective than a democracy. Again, since y'all will love to jump on me over this, I'm not supporting this.

I'm saying it's simply complicated. If the NSA is a good actor (I already said I don't believe them to be), then they are being asked to "protect" the country while flying blind. At least, that's what some people want.

.. again, I'm not defending them. I am in full support of all things crypto, and do not want any rights on any civilian to be trampled on.

Y'all are touchy.

edit:

> Is it really so hard to believe that many people are wrong? Why does quantity equal quality?

I don't know; perhaps you should ask someone who said that? I.e., not me, because I didn't say that. Again.


Sorry for misunderstanding. And nope, not touchy in this case. More like -- I get where they are coming from, but apparently people don't share the sentiment, so find another way; we pay taxes for them to figure out stuff and make it happen, not to play an eternal game of cat and mouse with their own citizens.


Yea, I agree. I guess it's just that I can sympathize with their position (again, for the sake of argument assuming they are good actors..).

I was also way too vague. A better thing to talk about would have been warrants, and how our legal system is designed to let the government invade your privacy given reasonable requirements... yet crypto changes that game entirely.

I definitely don't want the government invading my privacy, and I want true secure crypto in all things. I just can sympathize with how much this is going to change things... some for the better, and some for the worse.


Could you elaborate on what you mean by "easily monitoring everyone's phones"? Physically tapping the phones is easy, but doing anything meaningful with that amount of audio is anything but.


I suppose I should have said "anyone's" vs. "everyone's", but that's an implementation issue, isn't it? The problem I was talking about was proper encryption being impossible to monitor, meaning that a well-meaning actor (which I don't believe exists, but ignoring that) finds their job more difficult.



Pseudonymous submission would also be a reason for suspicion.


My problem with this link is that anyone can edit it and it looks like anyone can be an author on the site, so I question the validity of the article.


Interesting.

Do you also mistrust Wikipedia? Besides, it is not like every reputable journalist these days maintains basic journalistic standards or integrity. Case in point, an article by Reuters from two days ago: https://news.ycombinator.com/item?id=17057700


The schools won't allow my kids to use Wikipedia as a source - so yeah. (This is despite the fact that Wikipedia has gotten stricter about who can edit and that they require references.)

You're just mad I didn't trust your article


> The schools won't allow my kids to use Wikipedia as a source

Would they allow your kids to use Wikipedia's own sources — as long as they're legitimate — as their sources?

I do not trust traditional journalism as a source either (unlike Wikipedia and most schools). I try to find their sources; and when I can't, I regard it with suspicion.

> You're just mad I didn't trust your article.

This is either incredibly childish or too subtly tongue-in-cheek for me. Neither is the article "mine" in any way, nor was I "mad". I was genuinely curious.



