DO BOTH. This is the real world. People have to compromise to send a unified message. Don't refuse to help one group who shares your goals because they have a different idea of how to achieve it. If you are in a position where you can boycott and voice your opinion to their faces, do it. Maybe you're right and they don't give a shit about what you say. Who cares? Let the other people there know, and let them know that there are more of you out there.
They had a huge brain drain as soon as the market picked up at all in the post-dotcom period, too.
"II. The 2008 Amendments to the FISA
While the underlying actions were pending in district court,
and partially in response to these suits, Congress enacted the FISA Amendments Act of 2008, Pub. L. No. 110-261, 122
Stat. 2435, codified at 50 U.S.C. § 1885a. Among the amendments is § 802, an immunity provision and related procedures that are triggered if the United States Attorney General certifies to one or more of five conditions. In such case, no civil action may be maintained “against any person for providing assistance to an element of the intelligence community.” § 802(a)."
To me this says such an action would never even get off the ground, let alone force them to answer the "what did you think the $10m was buying" question.
1. any assistance by that person was provided pursuant to an order of the court established under section 103(a) directing such assistance;
2. any assistance by that person was provided pursuant to a certification in writing under section 2511(2)(a)(ii)(B) or 2709(b) of title 18, United States Code;
3. any assistance by that person was provided pursuant to a directive under section 102(a)(4), 105B(e), as added by section 2 of the Protect America Act of 2007 (Public Law 110–55), or 702(h) directing such assistance;
4. in the case of a covered civil action, the assistance alleged to have been provided by the electronic communication service provider was—
A) in connection with an intelligence activity involving communications that was—
i) authorized by the President during the period beginning on September 11, 2001, and ending on January 17, 2007; and
ii) designed to detect or prevent a terrorist attack, or activities in preparation for a terrorist attack, against the United States; and
B) the subject of a written request or directive, or a series of written requests or directives, from the Attorney General or the head of an element of the intelligence community (or the deputy of such person) to the electronic communication service provider indicating that the activity was—
i) authorized by the President; and
ii) determined to be lawful; or
5. the person did not provide the alleged assistance.
To argue they were actively working with the NSA on an intelligence operation, while possibly granting them immunity there, would likely cost them even more customers.
I just don't see an effective boycott of this scale happening -- especially when most of their customers just care about the product cost and benefit. Also, it can probably be argued that trying to secure your systems against a targeted intrusion from the NSA using technical means is pointless and a waste of money (throwing money at the EFF might be more effective).
Having said that, is there a good alternative to SecurID? The only thing that seems to come close is CRYPTOCard, but it looks like they have closer ties to the NSA than RSA does. A YubiKey also looks nice, but I don't like how it needs to be plugged in as a keyboard -- a device that is kept physically separate from the login machine would be ideal. OTP apps on a multi-purpose device (a mobile phone) also aren't something I consider secure.
RSA is no longer regarded as invincible or infallible.
I met some cocksure RSA shirts (sales engineers?) waiting for their espresso. I cheerfully congratulated them on their employer's security fobs getting pwned. I'm sure they were visiting our company for damage control. They tried to play it off with "Gee, whatever are you talking about?"; I'm sure everyone got the memo.
I just looked them in the eye and said "Don't play stupid. I read the news." Their demeanor completely changed. Real contrition.
Like everyone, I have some skin in this game. As an election integrity enthusiast, I'm exhausted by the techno-utopian stooges promoting crypto as a cure-all.
The more the public understands the emperor has no clothes, the safer we'll be.
Twitter, Tumblr, Facebook, MIT, Stanford, Sony, Arbor Networks, 37 Signals, Twilio, (and many more) all use Duo Security as an alternative to RSA SecurID. https://www.duosecurity.com/success-stories
Duo 2FA is easily the most secure, easiest to use, and most developer friendly multi-factor solution.
[3a] Almost all of Duo is open source. https://github.com/duosecurity
[3b] Duo's C development libraries and SSH/PAM packages are available in the official repos for major distributions like Debian/Ubuntu, RHEL/CentOS/Fedora, SUSE/SLES, etc. http://packages.debian.org/search?keywords=duo+security&sear...
[3c] Duo's REST APIs kick ass: https://www.duosecurity.com/api
Minor nitpick, but if we're shaming companies in the security community, I think it's worth calling out some of the security celebrities whose stances contribute to the privacy-destroying activities of the NSA. They, and people like the grugq, are enablers of the government's destruction of our privacy.
I agree with you though. Duo is one of the best alternatives out there. Jon's collaborations w/ Miller were years ago. Perhaps I'm being a bit too grudgy.
Boycotting all EMC subsidiaries > boycotting just RSA > doing nothing
Nothing wrong with keeping it small scale and only boycotting RSA, if the alternative is doing nothing at all.
Framing this as an all-or-nothing boycott is a false dichotomy, and inside the "all" option sits another false dichotomy of "limited alternatives." Sounds like rationalization BS for doing nothing to me.
If the options were so "limited," why isn't there more crowdfunding of open source hardware &| software security modules?
A device with a broken usage model (USB or writable SmartCard interface) isn't acceptable since it defeats the entire purpose of using it in the first place.
You're assuming that the NSA is unwilling to do things that are illegal. The (legitimate) fear is that your competitors may have ties to the NSA, who can then target you illegally, which can greatly benefit your competitor and damage your business.
Good security has now been made vastly more expensive because the government has to be assumed to have built nearly every high-budget attack vector they could. The cost of defending your business from this is very high, and we're all going to end up paying for it.
> Good security has now been made vastly more expensive
Most (all?) companies won't be able to outspend the NSA when it comes to security. So, my point is that trying to compete with them by spending more is pointless. We won't pay for it, because we can't afford it.
Lobbying seems to be the only course of action left which might be effective.
You don't need to outspend the NSA to have good security. You just need to spend more than you used to. For instance, Google now has to encrypt all traffic on its own internal lines.
The choice is to make an effort instead of making excuses, because doing nothing has the appearance and effect of condoning the behavior. Silence gives consent.
A wise (wo|)man will make more opportunities than (s|)he finds. - Francis Bacon
However, it's the same company that makes CRYPTOCard (do a search for "fortezza" to read up on the history).
Edit: The parent to this post originally said "Aladdin Etoken" instead of "open source soft tokens".
Open source software tokens aren't really acceptable since you need to run them on a multi-purpose device which has several paths to exploit. It's not a "false dichotomy" if there aren't any reasonable options. I'm not willing to settle for a device that doesn't solve the only problem it exists for.
The approach leverages the smartphone for what it is: a connected device with a rich, interactive user experience. https://www.logintc.com/docs/platform/multi-factor-flow.html
Full disclosure: I'm a co-founder at Cyphercor, the builders of LoginTC.
Should we boycott Redis and RabbitMQ?
And no, I don't think a boycott of EMC is practical.
It's practical for me.
However, it is not a matter of debate that the backdoor in RSA's BSAFE is not, and never was, open merely to the NSA. It is an objective fact that anyone can take advantage of a backdoor like this. As such, even if you think the NSA is right, even if you think cooperating with the NSA is correct, this is not the way to do it.
It might make business sense to do business with a security company that cooperates with the NSA. It does not make business sense to do business with a security company which is proven to produce vulnerable software.
Whether or not it's an ethical problem is subjective. The fact that it's a business problem is objective.
This comment misses the mark:
> Also, it can probably be argued that trying to secure your systems against a targeted intrusion from the NSA using technical means is pointless and a waste of money
The BSAFE backdoor does not simply make companies vulnerable to targeted intrusion by the NSA. It makes every technology which uses Dual_EC_DRBG vulnerable to any hacker who knows how to use the vulnerability. This is a pseudorandom number generator, which means it affects almost every primitive cryptographic operation.
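To make that last point concrete, here's a minimal sketch. It uses Python's `random` module (a non-cryptographic Mersenne Twister, purely as a stand-in for any compromised generator): once an attacker knows or can predict the generator's state, every "secret" derived from it falls at once.

```python
import random

# Stand-in demo: Python's Mersenne Twister is NOT a CSPRNG, but it
# illustrates the point. If the attacker knows (or can predict) the
# generator's internal state, every secret derived from it is theirs too:
# session keys, nonces, IVs, the primes behind an RSA key, all of it.
victim = random.Random(1234)           # state the attacker has recovered
session_key = victim.getrandbits(128)  # victim's "secret" key material
nonce = victim.getrandbits(96)         # and their "unpredictable" nonce

attacker = random.Random(1234)         # attacker replays the same state
assert attacker.getrandbits(128) == session_key
assert attacker.getrandbits(96) == nonce
```

The seed `1234` and the bit lengths are arbitrary; the point is only that a PRNG compromise is upstream of everything built on it.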
A company which would introduce such a vulnerability for the NSA may or may not be an ethical company, but it certainly is not a company qualified to provide security.
EDIT: It looks like I messed up my understanding of the way in which Dual_EC_DRBG was broken. See the responses to my post for details.
This is not accurate. You need to know the private key for the generator, and this is not publicly known.
And what about further down the road? 10 years, maybe 20, when this new type of key is predictably breakable with large enough resources? A 1024-bit RSA key is breakable for about $10 million today, according to a study that was linked in a previous discussion about the state of quantum computing a couple of days ago.
There are too many ways this one key could end up compromising potentially millions of locks.
This is the same NSA that has no idea what or how many documents Snowden exfiltrated as a contractor sysadmin?
Would you bet your company's confidential data (and possibly future existence) on the assertion that Snowden didn't have access to that private key? Or that other less politically motivated NSA contractors didn't have access to that private key, and which they could have sold for profit instead of publicly whistleblowing for ethical reasons?
While I've never seen it spelled out this way, I've always assumed the NSA had so many outside contractors doing the particularly dirty work because they knew it was illegal and unconstitutional, and wanted it to happen outside the agency itself. But I may be misinterpreting. It may simply have been a cost-cutting measure, one that failed to account for contractors having less loyalty to the state, and more loyalty to the constitution and individual rights, than the "company men" they were accustomed to.
It has nothing to do with "doing the dirty work" as the contractors are still working as agents of the government and are therefore held to the same limitations.
Rather, there are a ton of seemingly good reasons.
HR reasons: you can't hire enough of the types of geniuses you need from the free market on government pay (it's hard, though not impossible, to justify paying a civil servant gobs and gobs of extra cash).
Additionally though there's the political reason: NSA can seem "smaller" by shifting headcount from government employees to contractors. It's not true, of course, but it doesn't have to be true for most of the people who would care.
Plus, it's just hard to scale government org. structure up and down as needed. Where you need to be flexible and the mission is not "inherently governmental" then turning to contractors is a popular way to adapt to changing situations.
But in no case are contractors held to more lax rules. In fact it's the opposite: they legally must comply with all restrictions on government action since they are (contractual) agents of the government, but they also have to comply with government ethics rules pertaining to the fact that they are not civil servants.
These rules are often annoying in their own right (e.g. a contractor is technically required to include the fact that they are a contractor and their company in any email signatures, must announce on the phone that they are a contractor and don't speak for the government, on and on).
You're _probably_ right.
A year ago I would have said you were "probably right" if you told me the NSA wasn't recording metadata for almost every phone call, email, and website visit.
I think we both agree that any company willing to compromise its users to any entity, for money or otherwise, is not a company that should be entrusted with security. I will never deploy an RSA product, and will encourage my customers to choose other options. (We added 2FA support to our products a couple of months ago, so we have the ability to influence what potentially millions of users choose, though realistically only a few hundred of our users have enabled 2FA thus far; we don't support RSA.)
So, yeah, it's also possible that the NSA's super secret input data they used for this RNG will be revealed or will be compromised by some powerful attacker (China, for instance, who would have very high incentive to compromise a large percentage of major corporations in the US in one fell swoop).
I must have missed this part of the news cycle. Can you explain how anyone other than the NSA can take advantage of Dual_EC_DRBG? Did P and Q or whatever it was get leaked as well?
Dual_EC_DRBG is a random number generator. If you use any random number generator long enough without re-seeding, it will eventually cycle (start producing the same sequence of bits again), and a discernible pattern will probably emerge long before that. If you can identify that pattern, then you can predict what will be generated.
To understand this: a really crappy random number generator simply takes the seed and increments it to produce the next random number, wrapping around to 0 when it hits 32 bits. So if I give you the numbers 56, 57, 58, produced by this generator, you can predict that the number 59 will be generated next (and that 55 was generated before). If these numbers were, say, the raw randomness used to generate an RSA key pair, you could very easily reproduce the factors behind the private key. Obviously nobody is using such an obvious "random" number generator.
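That toy counter generator, as runnable code (a made-up illustration, not any real generator):

```python
# The "crappy" generator described above: a 32-bit counter.
# One observed output is enough to predict every future output.
class CountingPRNG:
    def __init__(self, seed):
        self.state = seed % 2**32

    def next(self):
        self.state = (self.state + 1) % 2**32  # wrap around at 32 bits
        return self.state

gen = CountingPRNG(55)
observed = [gen.next() for _ in range(3)]  # attacker sees these outputs
assert observed == [56, 57, 58]
assert gen.next() == observed[-1] + 1      # and trivially predicts 59
```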
In practice, most DRBG cycles are very large, and it takes many generation cycles for a pattern to emerge. This provides security: even if you get a few randomly generated bits, you won't be able to predict which bits come next. Even if you know the cycle, it would take many, many bits to know where you are on the cycle and predict what comes next.
Bruce Schneier explains, much more authoritatively than I can, how Dual_EC_DRBG can be broken: after collecting only 32 bytes of generated output, you can predict what the generator will produce next.
HOWEVER, what I didn't know before re-reading Schneier's article is that in order to predict future numbers from those 32 bytes, you need to know a secret set of constants (unknown) corresponding to the constants set forth in NIST's Dual_EC_DRBG. So only the person who generated those constants (together with the unknown "key" relating them) can break the standard. So when I said that anyone can break the standard, I was wrong. Color me embarrassed.
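For the curious, the trapdoor structure can be sketched on a toy curve. Everything below is made up for illustration (a 97-element field instead of the real P-256 curve, no output truncation, and invented points and scalars), but the mechanism matches the one Schneier describes: whoever chose the public points so that P = e*Q can recover the generator's internal state from a single output, and everyone else is stuck.

```python
import math

# Toy model of the Dual_EC_DRBG trapdoor. All parameters are invented:
# a tiny curve y^2 = x^3 + 2x + 3 over GF(97), not the real NIST curve.
p = 97
a, b = 2, 3

def ec_add(P1, P2):
    """Affine point addition; None represents the point at infinity."""
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p)
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p)
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def ec_mul(k, P1):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P1)
        P1 = ec_add(P1, P1)
        k >>= 1
    return R

def order(P1):
    n, R = 1, P1
    while R is not None:
        R, n = ec_add(R, P1), n + 1
    return n

# The curve is tiny, so enumerate it and pick a maximal-order point as Q.
points = [(x, y) for x in range(p) for y in range(p)
          if (y * y - x**3 - a * x - b) % p == 0]
Q = max(points, key=order)
n = order(Q)

e = next(k for k in range(2, n) if math.gcd(k, n) == 1)  # backdoor scalar
P = ec_mul(e, Q)  # published "nothing up my sleeve" point: secretly P = e*Q

def step(s):
    """One round: output r = x(s*Q), next state s' = x(s*P)."""
    t = s % n or 1  # avoid the degenerate zero scalar on this toy curve
    return ec_mul(t, Q)[0], ec_mul(t, P)[0]

s0 = 42            # victim's secret internal state
r0, s1 = step(s0)  # r0 is all the attacker ever observes
r1, _ = step(s1)   # the victim's *next* output

# Attacker: lift r0 back to a curve point (brute-forcing y is trivial in a
# tiny field), multiply by the backdoor scalar e, and since e*(s0*Q) = s0*P,
# the x-coordinate is exactly the victim's next internal state.
y0 = next(y for y in range(p) if (y * y - r0**3 - a * r0 - b) % p == 0)
s1_guess = ec_mul(e, (r0, y0))[0]
assert s1_guess == s1
assert step(s1_guess)[0] == r1  # next output predicted exactly
```

Without e, recovering the state from r0 would require solving a discrete log; with e, it's one scalar multiplication, which is why the unexplained provenance of the NIST constants was the whole controversy.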
The political debate over "working inside the system" is certainly important to have. But the organization that makes those hardware tokens used all over the place could vanish, and it would be a minor systems integration inconvenience.
The reputation hit to a fundamental algorithm is going to be confusing programmers for a long time. I don't even know how to start measuring the cost of that.
Of course that probably won't happen, since programmers who don't know what they're doing are implementing crypto as enthusiastically as ever.
And then, the degree of "knowing what you're doing" is important too, because I'm pretty sure I have a better background in algebra than some professional cryptographers, but human blind spots can get pretty subtle.
The difference between a PRNG and an asymmetric cipher is easy to understand. The cognitive load of associating RSA the company with RSA the algorithm (and Dual_EC_DRBG the PRNG with ECC the public-key family, for that matter) is difficult to overcome even when you're aware of the potential bias.
This is how wrong so many people are. Verizon's CEO has flat out said "they are our largest customer" (i.e - go fuck yourself).
"Sadly, I haven't spoken at RSA in many years. Had I been accepted to talk this year, I'd certainly be canceling it."
Instead, I propose that it be unlawful for companies which have been thoroughly hacked to bid on government cybersecurity contracts, at least for some period of time. After the SecurID hack, RSA should have been blacklisted for, say, a year. BSAFE should not be anywhere near a government or defence network.
PS: The analogy to Vichy France isn't great. It was not a matter of French technocrats collaborating just to save their jobs; it was real counter-revolutionaries fighting to bring down the Third Republic from within.
"Complicit" would be Verizon or AT&T, who to this day still sell phone call metadata to the NSA.
Until they deliver open source client software that uses end-user-controlled strong encryption, they are not making their users secure. But that means putting their users out of the reach of law enforcement, too, and they are scared to do that.
They need to face the choice: Enable real privacy, or lose your customers.
Google does an excellent job keeping everyone's data out of the hands of black hats, but that's not quite the same thing as making it impossible for governments to snoop.
Instead of bitching about the NSA, how about a Gmail client I can audit, a key exchange I can trust, and routine encryption of all email?
How about PFS and strong payload crypto for hangouts?
I suspect they are hoping for a scenario where customers will just trust them about government snooping. That would enable their security people to cooperate in situations they deem worthy of cooperation. But I think that trust is gone forever. After all, the courts have ruled metadata collection is legal, so why wouldn't Google cooperate with a legal operation?
Aside from their secure ID products, do people use many RSA products?
Should they bear any of the burden or only the subsidiary?
What about those companies that use RSA products and services?
These are just questions.
411 - http://rsaconference.com