Intel subsidiary fined for unlawful export of software that enables encryption (goodwinprocter.com)
314 points by aburan28 on Nov 3, 2014 | 120 comments


How utterly pathetic.

Encryption export restrictions are an absolute joke. I can go read a Wikipedia article right now that details the inner workings of most block ciphers in depth. They're still fining people for exporting anything over 64 bits (which is most things), even though almost every individual, let alone government, has access to the technological knowledge to reproduce such encryption at home.

Seems like this fine is about them exporting a custom Linux distro with all the normal encryption libraries still in place (e.g. AES-256).

Why do I get the strong sense that this fine isn't really about exporting encryption and is really about Wind River failing to place backdoors into this equipment? Because frankly that makes a lot more sense than what it appears to be on the surface.

Wind River got fined for "something" to do with encryption. Now you can either take the government at their word and assume that that thing is just exporting AES-256 within a standard OS, or consider it a little deeper and wonder if it was punishment for something else Wind River did or didn't do.


>Encryption export restrictions are an absolute joke.

I don't think you give BIS & NSA the credit they deserve. Experto crede. The restrictions may serve a useful purpose and if you can't see it, that doesn't mean they are the ones missing the point.

Having myself had to pass export control for one of my apps, and so having had the opportunity to pore over these regulations in great detail, I came to the conclusion that the actual purpose of the crypto export regulation is not to control export of the technology. That sort of restriction is impossible to implement, as you noted, and it's also quite unnecessary.

The actual purpose of the law is to prevent export of expertise. Implementing a crypto system that would be secure end-to-end is pretty much impossible without being an expert, and even then it's exceptionally hard. The best chances belong to a team of experts with a track record of fruitful collaboration and rich history in a particular area. Unlike bits and pieces of code, cohesive teams of experts are relatively easy to keep track of.

The way this control regime works is that the entirety of the regulation revolves around two questions: 1) Are you supplying a turn-key end-to-end system to the client? 2) Are you supplying services to set up a complete, tailored system for the client? If the answer is "yes" to either question, that raises a red flag for the BIS/NSA, and they want to see what's going on there. Otherwise it's just a minor bureaucratic hoop to jump through. You are certainly welcome to export a bunch of crypto code, and they even have a special open-source exception to the paperwork requirements.

So, you may disagree with the goals of this regulation, but it's certainly not the joke you make it out to be.


> 1) Are you supplying a turn-key end-to-end system to the client?

> So, you may disagree with the goals of this regulation, but it's certainly not the joke you make it out to be.

It remains a joke when you consider how many really good open-source implementations are already in the wild. Think about the crypto in Debian and OpenBSD, including GPG, LUKS, OpenVPN, OTR, and OpenSSL/LibreSSL with their support in Apache, nginx, etc. Even for Android, the best end-to-end encryption software (security-wise, not interface-wise) is open source. Let's face it, we won this crypto "war" long ago. (This is not to say we don't have other problems to deal with, regarding the immense power the NSA etc. have.)

And even if all sufficiently secure software were closed source, as long as it is reasonably widespread, it would be easy to pirate the software instead of importing it from the US.

> 2) Are you supplying services to set up a complete, tailored system for the client?

What kind of systems do you have in mind? AFAIK, the case referred to in the article does not fall into this category.


These are all components, not systems. A system would be something akin to a full-bank installation with all the computers, cabling, routers, crypto protocols, key generation/escrow/rotation/destruction, fall-back procedures, firewalls, hotglued USB ports, etc.

You can use those components to build a system like this, but you have to be an expert to do it. This is why component export is restricted - all they want is to look into your design to see if you are an expert capable enough to design and implement a secure system. What happens if they take an interest in you is beyond my experience. I know, it speaks poorly of my crypto skills, huh?.. :) I imagine they would start looking into the identity of the client(s), and see if they are connected to embargoed entities. Or something...


Fair point. I didn't recognize you had such a high-level view of "system". But this raises the question whether one can really call these laws "cryptography export restrictions". Because, sure, cryptography is involved, yet the restrictions only apply to (or are only meaningful in regard to) the procedures involved in secure IT systems in general. I'd argue that this expertise is somewhat independent of cryptography, as you could swap the cryptography implementation for any other and the procedures would stay the same.

Even better, leave the cryptography out of the package entirely and just include an "apt-get install" line or a list of open source projects you have to install. Instructions for "cabling, routers, crypto protocols, key generation/escrow/rotation/destruction, fall-back procedures, firewalls, hotglued USB ports, etc." aren't cryptography in themselves, are they?


First, it adds more moving parts, making the system more likely to contain a hole, which might be just enough for the NSA. Second, consider that the typical buyer is a bureaucracy, and they are either buying a crypto system or they are not. The upgrade might be trivial for you, but a bureaucrat has no way of knowing that, so he has to play by the rules.

I think the key to the scale of the word "system" is: however big it needs to be to establish actual security. You can have perfectly good crypto, but if you're relying on certificate authorities, you may be safe from street hackers while, as far as the NSA is concerned, it's wide open.

It's just reading tea leaves, of course. I imagine the NSA is not enjoying reading thousands of applications from iOS app devs like me, so if they keep doing it they must be getting something for their effort.


> The actual purpose of the law is to prevent export of expertise.

You're both right. It can serve dual purposes.

To prevent the export of expertise AND give the government a convenient stick to smack anyone not doing what they want.


BTW, other countries can and do start to apply the same rules against US companies.

China has started to impose a lot more rules and regulations regarding the importation of products with any kind of crypto.

They are becoming a bigger IT/mobile product market than the US.

I can see that in 2-3 years - as soon as the Chinese home-grown mobile SoCs are mature - they will start to require Intel/Qualcomm/Apple to disclose everything they do at the chip/software level related to crypto before any of the next-gen products are allowed to be sold in the Chinese market.

The choice for Apple/Qualcomm/Intel will be to do what the Chinese government wants or lose access to 30-50% of their customers on principle - like Google did a few years ago.


> 1) Are you supplying a turn-key end-to-end system to the client?

Apple's iPhones and their iMessage system are end-to-end systems (or so they claim). How can they do this?


It's not illegal to export the system, you just have to get BIS/NSA permission to do that.

I see a few possible explanations for why they allowed it: 1) the NSA was satisfied with their intercept capability (likely around key generation/exchange being based on the cellular network/SMS); 2) they decided that the technology was a commodity and it's best to have Apple dominate it (where they can still attack the data center or use legal means) rather than cede it to a foreign power.

I'm just guessing though.


Whilst the ability to implement secure systems is certainly not commonplace, the United States doesn't have a monopoly on encryption and security expertise.


If the USG wanted to coerce Wind River into backdooring vxWorks, they have better ways to do it than a newsworthy enforcement action regarding cryptography exports.

Export controls for crypto are stupid and counterproductive, though; I agree with you there.


> I can go read a Wikipedia article right now that heavily details the inner workings of most block ciphers.

Yes, you can. And from that, you can go implement those cyphers. And the NSA is probably OK with that, because it's really hard to implement crypto correctly. If I understand correctly, more crypto systems are broken because of implementation flaws than because of protocol flaws. So if you have to implement it yourself, odds are that you will make a mistake that makes it insecure, especially if you aren't an expert on crypto.


> And from that, you can go implement those cyphers [...] If I understand correctly, more crypto systems are broken because of implementation flaws than because of protocol flaws.

That is comparing home-grown implementation flaws with highly professional, flawless implementations that have highly professional back doors built in, like the RSA products.

>So if you have to implement it yourself, odds are that you will make a mistake that makes it insecure, especially if you aren't an expert on crypto.

It of course depends on who you are: whether somebody would go to the lengths of custom-cracking your custom implementation, versus buying an "off-the-shelf" crack/backdoor if you use off-the-shelf software.


I know this is a common belief, but I'm not completely sure it's true. I mean, crypto is an algorithm, and there are reference test vectors for most common types, so you can be pretty sure you got the basics right.
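For what it's worth, the "reference vectors" point is checkable: standards really do publish known-answer tests. A minimal sketch using Python's stdlib and the FIPS 180-2 vector for SHA-256 of "abc":

```python
import hashlib

# FIPS 180-2 publishes known-answer test vectors for SHA-256. Checking an
# implementation against them catches "got the basics wrong" bugs --
# though not side channels, bad randomness, or protocol misuse.
expected = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
assert hashlib.sha256(b"abc").hexdigest() == expected
```

Of course, passing known-answer tests only establishes functional correctness of the primitive, not the safety of the system built around it.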

Now sure, you need to worry about other things, like the entropy of random number generators and so on, but this stuff isn't exactly rocket science.

So does experience help? Sure. And maybe you need to be in the upper quartile to get it right, but programming is programming, and yes, it's going to take some time and effort, but clearly it's doable...

Now I'm not recommending you do your own - I'm just not sure I buy the idea that it's impossible for mere mortals to pull it off...

Then again, maybe I'm just an NSA employee wanting to make you think it's safe...


You should take a look at the [encryption] tag on stack overflow. There is a lot of really depressing stuff there.

Implementing low-level cryptographic algorithms is hard - I'd say near impossible to get right - if you apply high standards like side-channel resistance. There are a lot of use cases, though, where side-channel resistance is not necessary to be secure "enough" against the slightly-above-average attacker. But the next level to get wrong is using the low-level crypto correctly. E.g. by assuming encrypting data is enough to protect the content, until you find out about CBC padding oracles and how your service involuntarily decrypts the sensitive data for the attacker (ASP.net: [1]). Or you realize that your ad-hoc message authentication code does not protect your API at all, because MD5, SHA1 and the SHA2 family are all subject to length extension attacks (Flickr: [2]).
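The MAC pitfall can be sketched in a few lines (a hedged illustration; the key and message are invented). A naive `hash(key || message)` with MD5/SHA-1/SHA-2 is length-extendable; HMAC is the standard construction that closes the hole:

```python
import hashlib
import hmac

key = b"server-secret"            # hypothetical key
msg = b"user=alice&role=user"     # hypothetical API request

# Naive MAC: sha256(key || msg). Because SHA-256 is a Merkle-Damgard
# hash, an attacker who sees this tag can compute a valid tag for
# msg + padding + "&role=admin" without ever learning the key
# (a length extension attack).
naive_tag = hashlib.sha256(key + msg).hexdigest()

# HMAC wraps the hash in two keyed passes, so length extension no
# longer applies. Verify with a constant-time comparison.
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).hexdigest())
```

Note the two tags differ: HMAC is not just `sha256(key + msg)` with extra steps, it is a different (and provably safer) construction.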

The subtleties are endless: in RSA, you can calculate the `e`th root over the real numbers if `e` and the message are small; RSA has its own padding problems [3]; stream ciphers provide no authenticity whatsoever, as you can just XOR the ciphertext with any value to manipulate the plaintext; password hashing is not at all about what most people believe it to be; comparing passwords should be done using a PAKE [4] (otherwise you can brute-force passwords, like in WPA)... And then there is stuff that even the experts get wrong - think TLS Triple Handshake [5].
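The small-`e` root point takes only a few lines to demonstrate (a toy sketch; the message is invented and no padding is applied, which is exactly the mistake):

```python
# Textbook RSA with e=3 and no padding: if m**3 is smaller than the
# modulus n, then pow(m, 3, n) equals m**3 over the plain integers,
# so an attacker recovers m with an integer cube root -- no factoring
# of n required.

def icbrt(n):
    """Integer cube root via binary search: smallest x with x**3 >= n."""
    lo, hi = 0, 1 << (n.bit_length() // 3 + 2)
    while lo < hi:
        mid = (lo + hi) // 2
        if mid ** 3 < n:
            lo = mid + 1
        else:
            hi = mid
    return lo

m = int.from_bytes(b"attack at dawn", "big")   # short (~112-bit) message
c = m ** 3   # same value as pow(m, 3, n) for any modulus n > m**3
assert icbrt(c) == m
```

This is why real RSA encryption uses randomized padding (OAEP) that makes the padded message comparable in size to the modulus.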

You can learn about this kind of stuff by reading the right books (I like to recommend "Cryptography Engineering" [6]) and keeping an eye open for vulnerabilities and how they work, so it is not obscure or hidden knowledge. But building cryptographic systems involves a lot more work and expert knowledge than the average programmer in the upper quartile imagines. And, crucially, you won't know when you get it wrong.

[1] http://weblogs.asp.net/scottgu/important-asp-net-security-vu...

[2] http://vnhacker.blogspot.de/2009/09/flickrs-api-signature-fo...

[3] http://archiv.infsec.ethz.ch/education/fs08/secsem/Bleichenb...

[4] https://en.wikipedia.org/wiki/Password-authenticated_key_agr...

[5] http://blog.cryptographyengineering.com/2014/04/attack-of-we...

[6] https://www.schneier.com/book-ce.html


Sure, crypto is hard, but I don't think that for the last 20 years or so we have been living in a world where the five countries (and one specially-administered territory) mentioned in the article as having received the software (China, Hong Kong, Russia, Israel, South Africa, and South Korea) would have significantly more trouble than a private company in the U.S. producing secure cryptographic software based on known algorithms and protocols. The only consequence of export restrictions on crypto, beyond the things that live in the classified/military world, is reducing the competitiveness of American software firms and decreasing economic output by having every piece of security software developed twice: once in the U.S. and once in, say, Switzerland or India or whichever country doesn't have insane export restrictions on crypto.


And those references are either well-studied by the NSA to learn the implementation failures, or they've actively worked on the reference to insert a weakness.


Side channel attacks are more common.


Especially timing attacks. Designing security-related software that is immune to those really is challenging.
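A concrete instance of the timing problem (a hedged sketch; the token is made up): comparing secrets with an early-exit loop, or plain `==`, leaks how much of the guess was right, while `hmac.compare_digest` does not:

```python
import hmac

SECRET = b"s3cr3t-api-token"      # hypothetical credential

def leaky_equal(a, b):
    # Returns on the first mismatching byte, so the runtime reveals how
    # many leading bytes of the guess were correct -- measurable over a
    # network with enough samples, and exploitable byte by byte.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a, b):
    # hmac.compare_digest examines every byte regardless of mismatches.
    return hmac.compare_digest(a, b)

assert leaky_equal(SECRET, SECRET)
assert constant_time_equal(SECRET, SECRET)
assert not constant_time_equal(SECRET, b"wrong-api-token!")
```

Both functions compute the same boolean; the vulnerability lives entirely in *how long* the computation takes, which is what makes this class of bug so easy to ship.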


Dude. $750K for Intel. They spend more per day on psychological counseling for their chip geek employees who think they're living in the Matrix. Ain't gonna force them to backdoor nothin'... ain't gonna be punishment either for not doing a backdoor job when they were told nicely to do a nice backdoor job.

Sure, my tinfoil is as good as anybody's... REYNOLDS WRAP HEAVY DUTY... best there is... but this seems to be a clear case of some bureaucrat applying the rules he or she has been given to the case he or she has been given... Nothing else...


I laughed too hard. I'll get downvoted?


Not only that, but anyone in any embargoed country can trivially redirect through a VPS or Tor and bypass any controls.

It's sort of like the war on drugs if pot could be sent electronically. Hilarious security theater...


> Encryption export restrictions are an absolute joke. I can go read a Wikipedia article right now that heavily details the inner workings of most block ciphers.

You are right. Encryption exports are a joke. Not because of Wikipedia, however, but rather because of open source software.

It's trivial to find instructions on how to build OpenSSL from source, especially for a foreign government with all the power and money they have. This makes the law in question nonsensical in theory and ineffective in practice.


As a public policy matter, and if I had to guess, these controls have more to do with retarding the flow of any high-end technology to the USG blacklist. The policy goal isn't to prevent organizations on the blacklist from being able to deploy RSA encryption, but rather to prevent them from sourcing technology of any sort from US companies.

Blacklisted organizations can obviously still source RSA, along with whatever platform they want to run it on, but it's presumably more expensive for them to do so.


I'm not even sure the object is to make it significantly more expensive, at least not in the financial sense. If I were a securocrat, I'd be monitoring entities forbidden from purchasing this stuff from the US in order to compile intelligence on them and their vendors - do they prefer to use TOR, or just hit a supplier in .xyz domain, or does a particular individual reliably purchase a plane ticket after a legal rejection?


This could also be one of those things where everyone involved recognizes that the policy is incoherent, but any time someone makes a serious move towards reforming it, they're informed by DoD or DHS that aspects of the policy have convenient knock-on effects that they don't want to eliminate.

If the policy isn't actively harming industry (and beyond optics it may not really be doing that much direct harm), it may seem like poor risk/reward to change it.

This isn't a normative argument.


> is really about Wind River failing to place backdoors into this equipment?

I also felt some bad vibes from this story. There are probably hundreds of US-based companies exporting embedded crypto, but how many are exporting an RTOS for the aerospace and defense industries to China and Russia?


Careful: vxWorks applications to military/industrial work might be what got BIS's attention, but that's not what vxWorks is "for"; it's just a very popular RTOS.


Sorry English is not my first language, I didn't mean that those were exclusively the only industries.


The amount in question doesn't make sense for that. Nor does the voluntary disclosure. 750k isn't a scary and punishing fine when you have Intel backing you.


This could be just meant as a message to other firms. Intel might even be getting the money back through routes like tax breaks and just be playing along.


Then the message is "Obey the law, or you will be slapped with trivial and meaningless fines".


The contracts the fine was over were valued at less than 3 million dollars, so it was pretty sizable in those terms.


This idiocy also gives you the feeling that the Federal government is on such a roll that every agency that ever lost a battle on security wants to come back and re-win it.


Wind River is the vendor behind vxWorks, which is one of the most widely-used embedded operating systems in the world. vxWorks isn't especially common in consumer products (although you can find SOHO-type routers running it) but it's very important in industry, aerospace, manufacturing, RF and telecom, and things like that.

The "sharp departure" Goodwin refers to here is the detente between industry and the USG for the past ~12 years. In the late '90s, crypto export controls were vigorously enforced. When you went to download Netscape, you had to select the US-specific installer to get cryptography that wasn't trivially brute-forced. If you shipped products that included SSL/TLS, or even just MD5, you had forms to fill out.

It seems like there'd need to be a backstory on this Wind River enforcement action. Surely there's quite a bit of crypto code in various vxWorks distributions, but it pales in comparison to the cryptography shipped by Mozilla and the Chromium projects. Maybe the issue is "enabling technology" versus consumer products?

It's also unclear to me whether this is a return to form or the BIS being especially aggressive about deliberate exports to specific countries.


The BIS press release seems to narrow the scope a little:

http://www.bis.doc.gov/index.php/component/content/article/1...

It talks about the software in question being on some list and says some of the export recipients in China are on the BIS Entity List.

Edit: "on some list" is a dumb way of saying that they decided some of the software was subject to "Export Administration Regulations".


Incorrect... You sometimes had to download binaries from a source outside the US. Usually not, since the user was simply expected to click the link for the non-US version.

In cases where the server actually looked at your source IP you could either proxy or just download the binary from a source in Europe. The dev teams were and still are seldom located in the US alone, but distributed all over the world. The authors of encryption packages were often not US citizens, either.

The rest of the world looked at the US policy on encryption with wide-eyed disbelief due to the horrifying ignorance that just had to lie behind it.


The US is far from the only western country that had laws restricting the use of cryptography, and export rules governing crypto were common throughout Europe as well.


> The rest of the world looked at the US policy on encryption with wide-eyed disbelief due to the horrifying ignorance that just had to lie behind it.

http://en.wikipedia.org/wiki/Authoritarianism


Back in Australia in 1994 or so, I clicked on a download link for PGP without really reading the warnings, read them during the download, and then wondered when the US SWAT team was going to swing through my window.


It could be that the crypto part is just incidental, the BIS press release doesn't really make much of a fuss about it. They focus on the duration and repeated non-compliant actions.

It sounds almost like they're saying 'if you screw up once or twice but self-report quickly, we'll send you a sternly worded note, but if you tell us you've been violating the regulation for years, we're obliged to give you a firm but not too painful slap on the wrist'.


If this type of thing is interesting to you, here's some lovely historical perspective about government restrictions on crypto: http://www.foia.cia.gov/sites/default/files/DOC_0006231614.p...

In particular, related to this $750k Wind River violation:

Bidzos realized that there was a simple way to skirt the export limitation laws. Both the AECA and the ITAR are US laws, and, as such, they pertain only to US companies. Bidzos went to Japan and established a Japanese subsidiary, Nihon RSA. Nihon RSA qualified as a distinct Japanese company, subject only to Japanese law, and Bidzos was able to do whatever he wanted with his strong encryption technologies. Knowing that his encryption technology was a desired commodity, he shopped it around the Japanese marketplace. He promptly struck a deal to produce encryption chips for Nippon Telephone and Telegraph Corporation.

If the US government continues to fine Wind River, then couldn't they just set up a foreign corporation?


That is likely going to be a consequence of these actions. Digital security is a huge industry and is going to continue being a huge industry for a long time. US companies are decreasingly in a position to market themselves as genuinely secure companies.

Companies that want to have credibility in the industry are going to need to bring themselves out of the jurisdiction of restrictive laws, or lose business to companies that have set themselves up in more friendly jurisdictions.


This is no longer possible with all the international trade agreements.

Let's take, for example, a UK company selling their stuff, manufactured in Asia, on the Australian market - secret US blacklists still apply:

http://www.eevblog.com/2014/02/24/element-14-holding-orders-...


US Company Fined for Exporting Math.

We ain't got enough Math in the US to be sending it abroad!


Isn't that like saying someone fined for exporting missiles was fined for "exporting physics", or someone fined for exporting chemical weapons was fined for "exporting chemistry"? If not, how do you distinguish them?

Note: I'm not suggesting that the cryptography export restrictions are useful or make any sense it all. They don't, as far as I can tell. I'm just trying to get some discussion going as to whether reductionist arguments like "exporting math" make sense.


Missiles and chemical weapons are physical objects which one can reasonably presume to maintain at least some degree of control over.

Encryption algorithms are math, theory, information. They can be reproduced, copied, and transmitted at-will, for no cost.

If the difference is still non-obvious, google around for instructions to make ricin. Easy to find, and it's not illegal to share that information. Why should it be illegal to share any other data? And how could one reasonably expect to enforce it?


It's definitely illegal to commercially export information in those areas as well, at least past some level of detail. For example it's illegal to sell missile or reactor blueprints without approval. It's not just selling the physical objects (missiles or reactors) that's illegal, but also selling the information on how to construct them.


There's a difference between an algorithm and a piece of working software, as most programmers are painfully aware.


Neither missiles nor chemical weapons are ideas. They are not information; they are tangible.

A better analogy would be someone exporting a piece of paper with the reactions involved in a chemical weapon on it, or the formulas that describe where a missile will land given certain parameters.


We have enough math, but the bureaucracy keeps us from teaching it effectively.


You've already lost the 's'. Any more would be a disaster.


I like the word "puzzles" instead of "math". Makes it seem even more ridiculous.


"US Company Fined for Exporting Math."

This.


Time for DJB and the EFF to bring their old case back?

https://en.wikipedia.org/wiki/Bernstein_v._United_States

"On October 15, 2003, almost nine years after Bernstein first brought the case, the judge dismissed it and asked Bernstein to come back when the government made a "concrete threat"."

This sure sounds like a concrete threat.


It needs to be a concrete threat to Daniel J. Bernstein.


Yeah, you're likely correct. I didn't catch this at first due to my immediate feeling of threat simply for having published a few tools which use high-level cryptography libraries.


Wind River seems to have been dinged for completing commercial transactions with organizations that happened to be on the BIS export blacklist. You're not going to get fined for posting code to Github.


But Wind River is only guilty of exporting the same already-public information that I might export, and it's quite possible that anyone on that "BIS export blacklist" could feel free to obtain my code.


No, I do not think that accurately captures what the USG is fining Wind River for.


Sorry, my time machine appears to be on the blink. I appear to be in the 1990s.


Seems like they are trying to reignite the crypto wars, what with the FBI talking about mandatory backdoors.

It's just utterly confusing. You would think their reaction to Snowden would be appeasement, not further repression and surveillance. Can't make it too obvious this ain't a democracy.


I can actually see their reasoning: "you thought we were being hard on you? So the gloves are off now? Ok, so we'll show you what happens when we really play hard".


I'm not sure if it was intended, but using that idiom here looks to me as a clever reference to <blink>.


It was an Intel subsidiary. The article is not very informative. It appears as if Wind River didn't file for an export permit and the BIS fined them because of it; moreover, they voluntarily disclosed it [0].

[0]: http://www.theregister.co.uk/2014/10/17/intel_subsidiary_cry...


Yes, because one should have to file for a permit to export math (or anything else).


The law says you do. If you want to change that law, fine. But if you choose not to comply, don't complain when you get fined for it.


Why shouldn't someone complain about a law, even if they already knew about it? The false dichotomy between "obey and complain" vs. "get caught and shut up" isn't necessary.


Because sometimes it's annoying and simply stupid.

Case in point: posting stupid reductionist arguments about "exporting math" is obnoxious and completely frivolous, especially when it ignores everything substantial about the law.

After all, if I give North Korea blueprints for nuclear reactors, that's just paper, right?! It's just math on paper! What, that's not the same? Sure it is - quit trying to stop me from exporting paper!

You do see how the grandparent's point is obnoxious and stupid, right? Or do you not, because it simply aligns with your worldview?


Because it's obnoxious?


This is exactly what I fear happens when we allow bad laws to stay in a "well it's there but we don't enforce it" state.

Is the way around this to develop crypto systems outside the US?


Laws that aren't actively enforced by the state exist solely so that they can be used to persecute undesirables. It creates the illusion of a free society while keeping an implied state of control over the public. The only just thing to do is repeal or nullify the law.


If I remember correctly, someone published a book which contained the source code for PGP in monospace, so that it would fall under freedom of speech and be allowed to leave the US. But that's from memory; I'm not sure it really did circumvent that law, or that it was PGP.

Edit: Ah, AlyssaRowan actually mentions[1] this further down the comment thread:

> PGP 2.6.2 was exported as a book to call the bluff of the whole thing

[1] https://news.ycombinator.com/item?id=8552169


Yes, it was PGP, and yes, it got around the law. The issue was indeed that exporting it as software made it subject to ITAR, but published as a book, it was considered speech.

The tactic was never tested in court with PGP, but it was upheld in at least one other case I'm aware of, Bernstein v. United States. Thank you yet again, DJB.


That doesn't prove anything. You can also export a physics textbook without the issues you would run into exporting a nuclear weapon.


Except that you don't really need to buy expensive and probably hard-to-come-by stuff to turn the book with code into something useful (or a munition, in the US government's eyes), whereas with nuclear weapons you somehow need to get a radioactive warhead or something.


No, you need something even harder to get - expert knowledge.


The damage to the US economy and tech industry caused by this sort of thing seems likely to outweigh any supposed security benefit. Celine's First Law: "National Security is the chief cause of national insecurity." [1] Related stories on HN recently involving Twitter et al [2] are enough to make one wonder how long before some of them relocate overseas.

[1] http://en.wikipedia.org/wiki/Celine%27s_laws#Celine.27s_Firs...

[2] https://news.ycombinator.com/item?id=8422581


So if you're a U.S. software developer, perhaps opensource, working on something that uses encryption, how do you stay out of trouble?

Are you ok if you use standard libraries, rather than implementing your own crypto algorithms?


Publicly available encryption source code is exempt from the ECCN[1] according to the License Exception[2], section 740.13(e), provided you notify the BIS. But do read the documents; IANAL. Interestingly, though I'm EU-based, the ECCN list is literally the same over here. I have been wading through these documents countless times.

[1] http://www.bis.doc.gov/index.php/forms-documents/doc_downloa...

[2] http://www.bis.doc.gov/index.php/forms-documents/doc_downloa...


And I think it is fairly easy to "self-classify" many kinds of encryption software, though you may have to file an annual self-classification report.


American tech companies still in America are becoming the unluckiest bunch. Their competitors pay a third of the tax, are free from having to answer to the NSA, and generally don't have to put up with cold-war nonsense from the likes of the BIS. If it weren't for American talent, many, many more companies would make the move.


I agree with your comment. There are also a lot more reasons than just talent why the US tech industry does so well.

It's the largest singular market in the world. A very technologically advanced economy in general. It's a very diverse economy, with the second-largest manufacturing sector, the largest agriculture sector, the largest biotech and pharma sectors, as well as the largest space industry and service sectors - which provides for a very diverse tech demand and stimulant. Businesses in the US have very high productivity rates, so they seek out productivity gains as a matter of course, which incubates a higher demand for technology. The US tends to adopt technology rapidly; the culture is mostly accepting of new tech, which plays into that huge market. Businesses can change state locations fairly easily, shifting to jurisdictions that are more friendly in various ways (whether for taxes, regulation, weather or talent). In the US the tech industry is regulated at a mostly low level, leaving the market free to innovate and change rapidly. There are a lot of other good reasons.


Could this all be a positive thing if it promotes the development of open source libraries and tools?

Not only that, but I wonder how feasible it would be to create binary APIs for most operating systems and CPU architectures, where you could download, verify (that they were built from a commit ID of some known open source implementation), and plug in crypto modules as binaries.

So you buy your software from any vendor, in any country. If they implement this API, you fetch your verified crypto modules and plug them in. Or you build your own in house from source.

I can see how that mechanism would of course be a massive target for all kinds of attackers, but is it possible in theory? Maybe decentralize it as much as possible.
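The verification step described above could be sketched roughly like this, assuming a hypothetical registry that publishes known-good digests for reproducible builds of specific commits (the registry and the `expected_sha256` trust anchor are assumptions for illustration, not an existing service):

```python
import hashlib

def verify_module(module_bytes: bytes, expected_sha256: str) -> bool:
    """Return True if the downloaded binary matches the digest published
    for a reproducible build of a known open-source commit."""
    return hashlib.sha256(module_bytes).hexdigest() == expected_sha256

# Stand-in for a real compiled crypto module and its published digest.
module_bytes = b"...compiled crypto module..."
known_good = hashlib.sha256(module_bytes).hexdigest()

print(verify_module(module_bytes, known_good))                # matches
print(verify_module(module_bytes + b"tampered", known_good))  # does not
```

A real scheme would need signed digest metadata and a way to reproduce builds deterministically, but the core check is just comparing hashes before the module is loaded.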


This might be an attempt to use existing means to pressure mobile device manufacturers not to enable full device encryption by default. I know the executive branch hasn't been too happy with the latest announcements from Google and Apple.


I would speculate they did something closer to providing source code to Chinese defense contractors.

(Which probably doesn't really matter in the scheme of things, but it's the sort of thing that will really piss off the U.S. Government.)


If US tech companies cannot legally offer encryption to international clients then we international clients shouldn't be using US tech companies.


The country classifications are interesting https://www.bis.doc.gov/index.php/forms-documents/doc_downlo...

A:1 countries are our friends. E:1 countries are "terrorists". The rest are in between. There is also the list of "Supplement 3" exception countries (friends): see last page of http://www.bis.doc.gov/index.php/forms-documents/doc_downloa...

This charts shows what's allowed if you have crypto in your product: https://www.bis.doc.gov/index.php/forms-documents/doc_view/9...

I assume vxWorks is 5E002 (tools to create non-standard crypto systems), which is banned for D:1 countries. Well there is an exception "‐ 5E002: no D:1 countries (unless HQ'd in Supp. 3)".


Does this mean that if I have developed a product based on encryption it is better to incorporate outside the USA? And perhaps register a satellite office in the US only to import said products?


It is better to not be in the US as a crypto developer, yes (especially since there are some concerns that CALEA, National Security Letters and other provisions have been used to strongarm people into backdooring crypto products as described in the classified "SIGINT Enabling Project" budget, although so far I have not seen any backdoors that were not put in willingly).

That has pretty much always been true. The crypto export laws are weaker than they used to be back in the PGP days, when PGP 2.6.2 was exported as a book to call the bluff of the whole thing, but they're still there and of some concern (especially if you intend your software to be used by citizens inside oppressive regimes which might be on The Export List).

I don't see why you'd need a 'satellite' to 'import'. There are no import restrictions I'm aware of in the US?

For a brief, non-normative summary of crypto law, please see: http://www.cryptolaw.org/


Correction: "against Wind River Systems, an Intel subsidiary" not Intel.


What is the justification for this law? It seems arbitrary to me. Another question: is not providing ample security the same as explicitly granting a backdoor?


The justification is that we don't help our enemies, even if it doesn't make a difference in the grand scheme.

Edit: looks like one of the countries is our ally... so that's weird. For the rest of them it makes sense from a geopolitical standpoint.


It can get weird. I remember when the export list was significantly expanded for high-bit crypto (2000? 2001?), and France was still on the no-go list - not because the US prohibited export as such, but because France prohibited import (except through authorized channels). Israel may be in the same camp.


To be more precise, France allowed import of 128-bit crypto before the US allowed export of it, and only months after the US allowed 56-bit crypto to be exported. I also found out that it took until 2004 for France to formally allow import of 256-bit crypto.


What better way to prove that your chips aren't NSA backdoors than to pay a hefty fine to the people you're supposedly in bed with?


Prove?


Do we need to go back to Zimmerman's old trick and publish the code in books? What happens if we put the code in non-DRM'd ebooks?


A few years ago, I was on a team implementing the IPSec/IKE protocols for a US based company in Pakistan. Though implemented in Pakistan, the product could not be sold there :)


So vxWorks on the Curiosity Rover is - presumably - exporting strong encryption to Martian terrorists, right?


Would this include the export of Navajo?


What do you think the reservations are for?


All this will eventually amount to American companies losing credibility around the world.


Can anyone provide details on what is forbidden to distribute and to where?


http://www.bis.doc.gov/index.php/regulations/export-administ...

I guess you have to read it all before you can go back and try to understand part of it (that is, I got a headache looking at a few paragraphs, and don't think I learned anything).


Is open sourcing the sensitive bits a possible defense against this?


I feel like the title should be updated; it's a tad misleading IMO.


Is it illegal to opensource encryption implementations?


No, however, the export of it is still punishable. The GNU gcrypt tools, for example, are hosted exclusively on European servers for this reason.

http://www.gnu.org/software/libgcrypt/


What do words like "import," "export," and "country" mean in an Internet context?


They mean whatever the regulator wants them to mean, when you're handcuffed to his table.

Prudence dictates that you not expose yourself or your business to unnecessary risks. So by that standard, the "country" is wherever the enforcer has the power to hurt you, an "import" is bringing something within reach of the enforcer, and an "export" is moving something beyond his reach.

Just stay out of the "country", and you never have to worry about being punished for "exports". If you look at the Byzantine import/export regulations, you don't have to be physically located outside of the geographical boundaries of the U.S. to be considered a foreign entity for the purpose of importing and exporting technical information.

Just feed in this new input to your lawyer/accountant tax avoidance machine, and everything should be all sorted out by next fiscal year.


Incorrect, the USA is known for messing with money, goods, people (CIA abductions) and even the airplanes of foreign heads of state (Snowden), even when it has no applicable jurisdiction.

Behave like a bully everywhere, and some day resistance will rise against you...


I put "country" in scare quotes for precisely this reason.

If what you do impedes the state, even if it is outwardly legal, it will impede you right back, with whatever means that seem most convenient. If you want to develop strong encryption and distribute it worldwide, your operational security had better be impeccable.

There's good reason why Bitcoin developer Satoshi is a secret pseudonym, and why the real person or people behind it should be reluctant to link their public identities with it. There would certainly be either a character assassination in the media, or an "unfortunate accident" on the streets.

There have always been men in this world more willing to serve power than to uphold principles. It hardly matters what flag patches they wear on their uniforms.


Can I create a decentralized autonomous corporation with an incorporeal geographic location yet?


I think it is called the International Olympic Committee.



So what better software or hardware encryption?


1 goal: Destroy Mid-size and Small USA business
2 goal: Destroy Software, IoT, embedded software
3 strategy: 2.3.1 Stomp the grass to scare the snake
4 scope: ALL business uses encryption
5 past: destroy short term crypto by brute force
6 present: destroy mid term crypto by Quantum Computing
7 future: destroy long term crypto security aka NTRU
8 open source: sel4 proved secure and safe
9 open source: Post-Quantum Cryptography NTRU
10 open source: secure against D-Wwaavve Goooogle cracking.
11 crime UNnatural: Nature, internet and Cybersecurity
12 crime UNnatural: Big banksters too big to fail.
13 crime UNnatural: Scapegoat the small business software
14 tactic: firing a drone is expensive.
15 tactic: firing malware at Michael Hastings car is easy
16 pretext: Wind River Fined for Encryption Software
17 bibliography
18 --books many titles destruction of middle class USA
19 en.wikipedia.org/wiki/Thirty-Six_Stratagems
20 --[PDF]On the Cryptographic Long Term Security
21 www.scienpress.com/Upload/JAMB/Vol%203_1_1.pdf
22 http://sel4.systems/
23 http://arxiv.org/abs/1405.2034
24 http://arxiv.org/abs/1410.8317
25 --Insights from the Nature for Cybersecurity
26 --[PDF]Provably Secure LWE Encryption with Smallish Uniform ...
27 eprint.iacr.org/.../164....
28 https://pqcrypto2014.uwaterloo.ca/.../post-quantu...
29 --[PDF]Making NTRU as Secure as Worst-Case Problems over ...
30 www.iacr.org/.../6632...
31 --Michael Hasting's Car Allegedly hacked?
32 http://www.huffingtonpost.com/2013/06/24/michael-hastings-ca...


$750,000? That'll show them!



