* The IETF has a dedicated crypto review board, the CFRG, which approves or pokes holes in the cryptography used by other IETF standards.
* The chair of the IETF CFRG is an NSA employee (Kevin Igoe, one of the authors of the SHA1 hash standard).
I just learned these things a couple weeks ago. I am not generally a believer in the theory that NSA actively subverts Internet standards†. But even I think that it's crazy for an NSA employee to chair the CFRG.
In case you're wondering: Trevor Perrin is a widely respected professional cryptographer. Most cryptographers work for university math departments. Perrin worked for years as a staffer for Paul Kocher, the godfather of side channel attacks, at Cryptography Research. He's the designer of the new forward secrecy ratchet for OTR (Axolotl) and the TACK TLS extension, and a behind-the-scenes contributor to other IETF crypto standards. Perrin wrote the pure-Python "tlslite" TLS implementation. If you were to draw a "family tree" of crypto know-how in the software security profession, a surprisingly huge chunk of it would be rooted in Perrin (and Nate Lawson and Kocher); for instance, virtually every modern TLS break came from ideas that Perrin popularized. Of the 64 current Matasano Crypto Challenges, I can probably trace 50 to Perrin and Lawson. Trevor Perrin is someone you should pay attention to.
† (my best guess is that the standards NSA was actively subverting were about international telephony; subverting the IETF is a little like subverting the Linux kernel --- doable, but bad tradecraft)
This is a great point. The public, mailing-list-driven nature of the standards process makes it very difficult to subvert without a high risk of getting caught and breaking the community's trust. These agencies need to keep hiring good cryptographers, and ideally keep people working on standards.
Shows the importance of OSS in security and having people like Trevor Perrin keeping watch.
But at the same time, if the NSA were going to subvert encryption standards, I doubt they would subvert the process with someone who is known to work at the NSA. Intelligence agencies operate covertly: most likely by converting someone trusted in the community into an agent, or by grooming their own agent straight out of high school or university, getting them to a position of influence in the community over a long period of time, and only then having them damage crypto standards. This is standard tradecraft.
Kernel contributors aren't background-checked, so all you need to do is pay someone to do some legitimate kernel hacking in a sensitive area. Then, in one of their commits, slip in a backdoor.
"But noooo the many eyes will see the backdoor!" you say.
This is clearly false. If it were true Linux would have no security bugs at all. Since old security bugs continue to be found, it follows that it is possible to have a security bug that goes unnoticed for many years. See also the underhanded C contest.
Hell maybe it has already happened. Who is to say the latest Linux security bug wasn't deliberately introduced by the NSA?
I don't think this is paranoia - it would be fairly easy for the NSA to do, very useful, and almost completely deniable. I would do it if I were them. They certainly wouldn't refrain for moral reasons, because they've shown they don't really have any.
People vastly overestimate the worth of background checks, and underestimate the effect of current hardship.
Furthermore, before the Snowden revelations, they weren't under suspicion of trying to do that.
PAKEs ("password authenticated key exchanges") are a kind of public-key algorithm.
When two people use public key crypto to exchange a key in public, they need a way to "break the tie" in case of a MITM attacker. The conventional way to do this is with a certificate; that's how TLS works.
A PAKE instead uses a password; it does this by baking the password into the public/ephemeral key-generation process. Alice and Bob can't agree on a MITM'd key if the MITM doesn't know the password; they just won't come up with the same numbers.
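For intuition, here's a minimal SPEKE-style sketch of that "bake the password in" idea. This is my own toy illustration, not Dragonfly or any deployable protocol; the prime is tiny and the construction is simplified for readability:

```python
import hashlib
import secrets

# Toy SPEKE-style sketch: derive the Diffie-Hellman generator from the
# password, so two parties only land on the same shared secret if they
# hashed the same password. Parameters are tiny for illustration; real
# PAKEs use vetted groups and carefully analyzed protocols.
P = 2039            # small safe prime (P = 2q + 1 with q = 1019 prime)
Q = (P - 1) // 2

def pw_generator(password: bytes) -> int:
    # Hash the password into [2, P-2], then square it so the generator
    # lands in the prime-order subgroup.
    h = int.from_bytes(hashlib.sha256(password).digest(), "big") % (P - 3) + 2
    return pow(h, 2, P)

def keypair(password: bytes):
    g = pw_generator(password)
    priv = secrets.randbelow(Q - 1) + 1
    return priv, pow(g, priv, P)

# Alice and Bob share the password, so they derive the same generator
# and compute the same shared secret.
a_priv, a_pub = keypair(b"hunter2")
b_priv, b_pub = keypair(b"hunter2")
assert pow(b_pub, a_priv, P) == pow(a_pub, b_priv, P)

# A MITM who doesn't know the password derives a different generator,
# so the numbers simply won't match on both sides.
```

The point is only the structural one from the paragraph above: the password is an input to the key exchange itself, not a value sent over the wire, so there is nothing for a MITM to replay or forward without knowing it.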
Dragonfly is a PAKE designed by Dan Harkins a bunch of years ago. Harkins got Dragonfly added to EAP, the extensible authentication framework for 802.11 networks. He would very much like it added to TLS as well.
The trouble with that is, nobody knows why we would want to add Dragonfly to TLS. TLS already has a PAKE option, SRP. SRP is better than Dragonfly; it's designed to make it hard for an attacker who seizes a server to collect a database of passwords. SRP is also well-studied and well-understood; if it isn't particularly beloved of academic cryptographers, well, Dragonfly doesn't score any better. But more importantly, nobody uses TLS-SRP. No browser implements it. The demand for PAKE authentication in TLS is not strong.
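The "hard to collect a database of passwords" property comes from SRP's verifier design: the server stores a salted, one-way value derived from the password, never the password itself. Here's a loose sketch of that idea with toy numbers and a simplified hash layout (not the actual RFC 5054 computation):

```python
import hashlib
import os

# Toy illustration of SRP's verifier idea. The server stores only
# (salt, v) with v = g^x mod N, where x = H(salt || password). An
# attacker who seizes the database gets no passwords directly, only
# material for a per-user dictionary attack. Real SRP uses >= 1024-bit
# (typically 2048-bit) groups and the exact layout in RFC 5054.
N = 2039   # tiny prime for demonstration only
g = 2

def enroll(password: bytes):
    """Run once at registration; the server keeps (salt, v)."""
    salt = os.urandom(16)
    x = int.from_bytes(hashlib.sha256(salt + password).digest(), "big")
    return salt, pow(g, x, N)

def verifier_for(salt: bytes, password: bytes) -> int:
    """What a client (or a dictionary attacker) must recompute."""
    x = int.from_bytes(hashlib.sha256(salt + password).digest(), "big")
    return pow(g, x, N)

salt, v = enroll(b"correct horse")
# The legitimate client can re-derive v from the password; an attacker
# holding only v must guess passwords one salt at a time.
assert verifier_for(salt, b"correct horse") == v
```

Recovering the password from `v` means either inverting the hash or solving a discrete log, which is exactly the property Dragonfly's critics noted it doesn't improve upon.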
It gets worse for Dragonfly. The protocol has problems. I'm going to get myself into a little bit of trouble summarizing, but here goes: Dragonfly can be made to work with elliptic curve key exchange. The process for mixing a password into an ECC key exchange involves a trial-and-error process for finding a valid curve point; a loop runs conducting these trials. A passive attacker could, in an earlier version of the Dragonfly protocol, discern how many iterations through the loop had happened to find a valid point given a password. Dragonfly is randomized: a nonce is mixed into every exchange. As a result, the same password might take a different number of loop iterations on every login attempt.
Net result: given a corpus of 2^32 passwords, 32-and-change passive observations of the key exchange might uniquely identify the password used!
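A toy model of that hunting-and-pecking loop shows where the leak lives. This is my own illustration with a made-up mini-curve, not Harkins's actual construction or parameters; the point is only that the iteration count is a function of the password and nonce:

```python
import hashlib

# Toy curve y^2 = x^3 + A*x + B over GF(P); parameters are invented
# for illustration and have no cryptographic meaning.
P, A, B = 2003, 1, 7

def on_curve(x: int) -> bool:
    # x is a valid x-coordinate iff x^3 + A*x + B is a square mod P,
    # checked with Euler's criterion.
    rhs = (pow(x, 3, P) + A * x + B) % P
    return rhs == 0 or pow(rhs, (P - 1) // 2, P) == 1

def hunt_and_peck(password: bytes, nonce: bytes):
    """Dragonfly-style trial and error: hash (password, nonce, counter)
    until the digest maps to a valid curve point. Returns the point's
    x-coordinate and the number of iterations it took."""
    for counter in range(1, 256):
        digest = hashlib.sha256(password + nonce + bytes([counter])).digest()
        x = int.from_bytes(digest, "big") % P
        if on_curve(x):
            return x, counter
    raise RuntimeError("no candidate found")

# Because a fresh nonce is mixed in per exchange, the SAME password
# takes a varying number of iterations each time -- and each observed
# count statistically narrows the set of candidate passwords.
counts = [hunt_and_peck(b"hunter2", bytes([n]))[1] for n in range(16)]
```

Each observation leaks roughly a bit or so of information about which passwords are consistent with the observed count, which is where the "32-and-change observations against 2^32 candidates" arithmetic comes from.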
Modifications to the protocol were proposed, some in the CFRG process, but they in turn resulted in a protocol with a side-channel weakness: the process of computing parameters for the key exchange leaks timing information.
Dragonfly has one sponsor that we know of: Kevin Igoe on the CFRG.
Trevor Perrin broke this down in amazing detail, mining posts out of several IETF mailing lists. Probably nobody is ever going to use Dragonfly in the real world, but watching people pick apart a new crypto protocol in public is amazing and hugely educational. I highly recommend reading this post:
I'd argue that there's no demand for TLS-SRP because most of the people who could make use of it don't even know it exists. (Conventional thinking is that the only way to do authentication in TLS is with X.509. TLS-PSK has the same problem.)
But applications can't reasonably support TLS things that the TLS libraries they use don't support. Two of the three elephants in the TLS room, SChannel and NSS, don't support TLS-SRP. The other elephant, OpenSSL, has supported it for less than two years.
There's interest in (and patches for) adding TLS-SRP support to NSS, but they don't seem to want to implement anything that Firefox won't use:
Could someone shed a bit more light on who Dan Harkins is? He seems to be quite arrogant and does not address Trevor Perrin's criticism very constructively. See the other two threads started by Trevor:
http://www.ietf.org/mail-archive/web/cfrg/current/msg03545.h... (Question regarding CFRG process)
http://www.ietf.org/mail-archive/web/cfrg/current/msg03537.h... (Review of Dragonfly PAKE)
Believe it or not, I've seen many people who think that key exchange alone is enough to build a secure channel. This misconception is probably as widespread as the belief that encryption provides message integrity. One needs to know one is talking to the right person; otherwise one is vulnerable to a man-in-the-middle attack.
There are many ways to solve this authentication problem. A large number of TLS ciphersuites use PKI, but there are also ciphersuites using pre-shared keys or passwords, such as TLS-PSK and TLS-SRP. PAKE protocols use a password for mutual authentication; e.g., if only Bob knows the password, he's the only one who can finish the protocol with Alice.
Recently I've worked on some authenticated key exchange problems not using a password, which could be fun to think about:
1) How to build a secure inter-app channel on iOS.
2) How to bootstrap keyboard-less devices onto a password-protected WiFi network.
And it is sad, even if it would be an interesting UI challenge to make it usable. However, I've seen TLS-SRP used in machine-to-machine contexts (specifically home automation), where e.g. it would be too difficult to provision certificates, or where an X.509 library is just too big.
SRP exists under a cloud of patent uncertainty. A lot of people would like to use SRP (perhaps not with TLS, but for SSH, for example), but deployment has been deterred by patent concerns.
Though IIRC only the limited form of SRP, which doesn't provide bidirectional authentication (i.e. the guarantee that you can't be MITMed even by a server that knows your password or by someone who has stolen the server's 'hash' database), was covered under the royalty-free grant. Implementing the full protocol required a license from Stanford: https://ietf.org/ipr/31/
I've never seen it used by humans, but there are contexts where a secure channel for establishing the secret exists; e.g., most banks still have large brick-and-mortar infrastructure, and so do tax authorities...
SRP used to authenticate sites and users would make phishing as we know it obsolete, but it would have to be implemented very carefully in the browser so there is no way whatsoever for a website to mimic its password entry UI...
But given the abysmal security record of the PKI model (where a few bad apples really ruin it for everyone else), the world could use a lot more SRP deployment...
I am uncomfortable with the NSA / GCHQ being that closely tied to the standards process.
I'm much happier when they're noodling away with research in the background and providing support to universities.
An example: GCHQ invented public-key cryptography before Diffie and Hellman. They invented RSA before RSA did. They kept both secret for many years; GCHQ's RSA was not revealed until 24 years later (about 20 years after RSA had come into use).
So, secret government spy agencies keep secrets. I think this is about as alarming as "secret government spy agencies spy." But while they might not actively subvert crypto standards, would they allow weaknesses to be implemented without comment?
No, I don't think NSA tried to subvert IPSEC, and especially not in the way Gilmore describes.
Edit: I may have misunderstood who was being accused of incompetence. My apologies.
That is not what he said.
Your statement was and is unclear. Your follow-up to which I'm responding doesn't change that.
... and what kind of tradecraft, pray tell, is subverting a random number generator and planting it inside the BSAFE library after paying off RSA DSI with a $10 million contract?
If the NSA is willing to do something like this, what would it consider too unethical, immoral, or bad tradecraft to do?
+1. Fine for him to provide individual contributions, not fine at all for him to have a position of responsibility.
Unfortunately, the fact that something might be "bad tradecraft" doesn't seem to be much of an impediment for U.S. intelligence agencies. They seem to have a knack for getting caught with their pants down over stunts that one would think cooler heads would have prevailed against, as being not only reckless but of, at best, marginal tactical benefit.
> virtually every modern TLS break came from ideas that Perrin popularized
Oh, I didn't know this. What are the ideas?
If the NSA wishes to change that rule in the future, it can publicly ask Congress to enact a law making it a federal felony for a government employee or contractor to try to subvert, compromise, or weaken public encryption standards. (That would still allow the NSA to subvert, compromise, or weaken proprietary Chinese or Russian military encryption standards, if it is capable of doing so.)
Until the NSA requests such a federal law -- and it's duly enacted -- it seems folly to encourage the participation of its employees in the IETF process, let alone granting them a position as chair of an encryption working group. Put another way, the NSA's signals intelligence mission has eclipsed its information assurance mission.
Even President Obama's NSA review group that came out with a report this week recommended that the agency "should not" weaken commercial encryption software. Why not a "must not?" p36: http://www.whitehouse.gov/sites/default/files/docs/2013-12-1...
This makes me think: What is the basis of trusting any organization or person not to have their own agenda, possibly contrary to the group’s ostensible agenda?
The basis is this: We have a tacit assumption that all participants have realized that better standards (and strong crypto, more secure systems) will lead to the betterment of all. This is the default assumption.
However, now that the U.S. government, and the NSA and its collaborators in particular, have been shown to explicitly not have this goal – in fact, their goal has been to strive for less secure systems and more difficult standards – what should be done? The logical thing to do is to exclude any person or organization revealed to have an agenda explicitly contrary to the group.
The same argument could be made (and has been made many times in the past) for Microsoft to be excluded from any and all standardization committees like ISO, IEEE, IETF, etc. for the same reason – their repeated practice of Embrace, Extend & Extinguish among other things shows them to have an agenda contrary to the group, and their participation would therefore be a detriment, not an asset.
> their goal has been to strive for less secure systems and more difficult standards
I don't think, specifically, that they're looking for weaker standards. Weaker standards would allow for competing governments to have just as much access as the NSA does. I think they'd prefer stronger standards, but that they _still have the key to_.
In short, I don't think they want cheaper locks, they want better locks, and master keys.
Edit: Everything else you said is spot-on.
But would it? How would you distinguish? Why wouldn't they just use whatever is known to be off-limits?
It appears that any article with NSA in the title gets an automatic penalty of .4. I looked for other words causing automatic penalties, such as awesome, bitcoin, and bubble but they do not seem to get penalized.
I observed that many websites appear to automatically get a penalty of .25 to .8: arstechnica.com, businessinsider.com, easypost.com, github.com, imgur.com, medium.com, quora.com, qz.com, reddit.com, rt.com, stackexchange.com, theguardian.com, theregister.com, theverge.com, torrentfreak.com, youtube.com. I'm sure the actual list is longer.
I rarely upvote, but I did this time.
A request to replace him with Bruce Schneier.
As IETF chair he may even bring us the elusive "secure by default Internet" with his CurveCP protocol or a future evolution of it. The secure-by-default Internet has actually been declared one of the main goals of the IETF at their recent meeting:
For what it's worth, Bernstein spends most of his time in Europe these days.
Of that list, looking at it now that I've written it, I think Kenny Paterson might be my favorite choice; he's done very important recent work on (breaking) TLS and is an eloquent and convincing writer and translator of the cryptographic literature.
Matt Green, I've been impressed with his writing.
Maybe agl? Probably more because he's got good perspective from the implementation side.
Well, no one expects the IETF to get anything done, anyway....
But I will state my position clearly: I do think the resignation is a good thing. I don't agree with the word "removal".
The biggest problem to me is not NSA involvement; it is how WE treat people who work at the NSA and other government intelligence agencies. If fear of a single man is what makes this issue hot, I beg to differ: you can disagree with him and simply not pass the standard. If the whole committee thinks something is fishy, I see no reason why the proposal would make it out of the internal draft stage. The real issue is that distrust.
My school, like many schools out there, sends out internship notices; at a public school, some of those are government internships, including ones at the NSA and FBI.
How do we treat these kids in the future? How should we treat our future or current co-workers who have worked as contractors or done internships at the NSA, FBI, or CIA?
Do we trust them?
Saying that "NSA [employees] (edit, response to http://www.ietf.org/mail-archive/web/cfrg/current/msg03556.h...) should not be in any position in the crypto committee" goes too far. In fact, he should resign to avoid a conflict of interest; people don't trust the NSA right now. But how are we treating these employees? Have we asked him privately? Should this email have been public in the first place? Did they ever have a private conversation about this? It feels more like an attack, and a warning to all NSA-titled employees that they should never reveal their affiliations, even on a resume.
Since everyone does things differently — some will never join the NSA, and some will, for money, technical development, or patriotism — how do we, as people, treat these employees?
It upsets me when people look down on them and think they are rats. This is a deeper ethical issue than most notice. The whole "removal" framing sounds like "one ought not be an NSA employee." As someone new to security who admires open standards and fears backdoors, I think it would have been nicer and more professional to raise this with Kevin Igoe privately first.
From the way the mail is phrased: it never happened.
Why? There's absolutely zero reason to allow people taking part in the NSA's antics into groups deciding public policy.
We need to make working for the NSA a big red 'untrustworthy' flag, regardless of alleged level of involvement.
The moral issue is whether someone's affiliation with the NSA makes him or her a spy, unworthy of the crypto community.
If people held different opinions when they were young (or, in the case of a student, just wanted a goddamn internship), it sounds like we will punish and discriminate against them in the future for their past involvement with the NSA.
If a year or two of NSA interns have to learn the hard way that they'll never find employment anywhere else afterwards, too bad. You can bet your ass that, by the third year, nobody will be accepting those positions without fully intending on being there forever, and committing to the degradation of privacy.
I don't think I can trust someone who worked at the NSA for many years either. I'll give you that. But how should companies deal with these applicants?
> If a year or two of NSA interns have to learn the hard way that they'll never find employment anywhere else afterwards, too bad.
And that's exactly why my question is important to raise: how do we choose not to "discriminate" against them?
Downvoting on a moral issue? Really?
If we started ostracizing only long-term hires, the NSA would just have more motivation to 'turn' their short-term hires.
I feel that without a sort of "zero tolerance" policy towards NSA employment, there will always be a way for them to get around imposed restrictions.
FYI, you shouldn't have been downvoted, your argument has merit even if I disagree with it. Upvoted.
NSA/GCHQ employees need to learn working for these agencies is a one-way street. No one on the outside will ever trust them or employ them again.
And I am interested in (1) how companies, particularly big companies, handle these issues, and (2) how you, as a co-worker, would evaluate such a new co-worker.
... but the "war" over privacy is yet to be won. So chasing individuals for collaboration may be premature.
If they want to be trusted, they need to stand apart from the people who are lying and deceiving us about their tactics and their data gathering.
Will you risk your company's integrity and reputation over hiring someone who has worked for the NSA? An undergraduate? A father with three kids to feed who just took the job to make a living?
Not trusting often turns out to be punishing. And in many cases we simply discriminate against them.
This is a serious issue, a real moral issue.
I cannot save others from the impact of the choices they make, and I would not want to.
You appear to be arguing that we should not judge people based on their actions - I am unsure where that leaves us?
So a driver who drove drunk and lost their legs should wear a tag saying "I killed fifteen people five years ago; don't build an accessible platform for me"? No, of course we don't do that.
Should we try to help those people with disabilities? Yes.
Should we go out of our way to help those who decided that they could drive under the influence, and kill 15 people because of their negligence? I think you will find that most people wouldn't even waste the spittle on that person's face.
I should add all of these to my CV just to avoid people who would irrationally discriminate (although I'd then be lying on my CV).
For the moment, until or unless there's a sea change of public accountability and trust with respect to the NSA (which would likely take years if not generations), it's safest and most sensible to treat these employees as potentially suspect in any area where their employer has been shown to be an adversary.
It is not that one 'ought not'. It is that the NSA is spying on us all. It is also lying about what it is doing, and being deceitful about its tactics.
People need to make a choice about where they stand. If they want to stand with the NSA, that is fine.
Unfortunately from a practical POV, it means I cannot trust them.
On the bright side, I cannot think of any reason an ex employee of the NSA would be honest about their previous employment.
One ought not be an NSA employee.
One could say such a thing about Google, Facebook, Adobe, or any company out there if one dislikes that company. Any NSA contractor who depends on government contracts to feed their family - is it wrong? If I can't find a job in private industry but the NSA hires me for 3 months, is it wrong that I did it for the money, for my family? How would you know that was my story? You wouldn't, and you would just penalize me for having been a contractor once.
And people have different degrees of tolerance and standards for patriotism. I don't endorse what they do, but that's exactly the false attitude we have toward NSA affiliations. We put the government's faults onto the workers, and that is wrong.
In what sense is it unprofessional?
"Any NSA contractor who depends on government contracts to feed their family - is it wrong?"
In a moral sense? Who knows. But in a practical sense it genuinely does mean I cannot trust you. Hey, I understand what you are saying - and I sympathise - but at the end of the day there are clear, solid reasons why I cannot trust an ex-NSA employee.
"And people have different degrees of tolerance and standards for patriotism. I don't endorse what they do, but that's exactly the false attitude we have toward NSA affiliations."
Working for the NSA is not patriotic. It is the opposite of patriotic. It is associating with sneaky people who deceive and lie.
So now every NSA employee is marked as "untrustworthy".
If I may be allowed to be radical in my own response: isn't this what the Red Scare was all about? You are a friend of X, who is a known communist, and now I cannot trust you. You were a secretary for this communist spy; I cannot trust you even though you didn't know. You just handled his daily accounting.
There are simply people who work for the NSA for reasons like employment.
> working for the NSA is not patriotic.
Yes, I will agree that jeopardizing democracy and liberty is not patriotic. But consider the guy who started at age 20 and thought it was everything he could do for his country, and who now realizes he was wrong. Do you trust him now?
The ultimate issue, again, is that we are equating "the NSA is untrustworthy" with "its employees are untrustworthy people."
That secretary is facilitating the erosion of our personal privacy and liberty by logistically supporting the people who are doing the actual dirty work, and now, she knows it. At this point, her choice is "Continue supporting people doing bad things and carry the stain that comes with it" or "Stop associating with people who are doing bad things".
Persons who engage or support the kind of behavior that the NSA is engaged in are untrustworthy and should be treated as such. You don't have to be the actual guy tapping data lines to be complicit. Shoving it off of "the employees" and onto "the NSA" is a copout.
No, it is not. You are making the mistake of "because I work for someone evil, I must be evil too." You are assuming every single person who has worked or is currently working for the NSA must be evil. Just because someone decided at some point that having a job was more important than the public good does not make them untrustworthy. You can have someone who donates to good causes all the time and does a lot of community work, and yet when he quits his job because he is sick of the NSA, or because he just wants a better job, he can't get one because he was with the NSA.
This assumption is wrong; you are equating a government with its workers. If so, should anti-war people consider American soldiers evil too, because they are following orders?
If I work for a Mafia boss - you know, take care of his house so that he can be comfortable and safe at home - am I supporting the cause? If a German citizen supplied raw materials to the Nazis, is that citizen a Nazi, and evil, when he was just making a living for his family while absolutely hating Hitler and the Nazis?
In an ethical/moral, if not a legal, sense: yes. Absolutely.
>If a German citizen was supplying raw materials to Nazi is that citizen a Nazi and evil when he was just making a living for his family while he absolutely hate Hitler and Nazi?
Does it mean they were Nazis? No, Nazis were members of the NAZI political party. Does it make them a Nazi supporter, though? Yes.
Does it make the person evil? That's a moral judgement I don't think we can have a sensible argument about, but unarguably it does make the person a supporter of evil.
This is the key takeaway from the Milgram experiment, and the key point of Hannah Arendt's concept of "the banality of evil". We think of evil acts as things perpetrated by conniving, malicious villains, but the reality is that evil acts like the holocaust were not perpetrated by cackling villains, but only happened because enough regular, otherwise perfectly nice people surrendered any personal responsibility for the larger implications of their work and contented themselves to "just follow orders".
No, I'm not at all. I'm assuming that some of the people working at the NSA are evil (that is, that there isn't some "The NSA" entity that is evil while the people employed there are not), and that the rest of the well-intentioned people working there now know about it, and have a choice to make about whether they are going to work with and enable these evil people to do their jobs or not.
A housekeeper with mob ties would be looked upon with distrust. A person who supplied material aid to the Nazis would be rightfully branded a "Nazi supporter". Doing something to make a living isn't an excuse.
Let's put this into a more Valley-friendly context: You can't really work for a porn company in some non-porny capacity (sysops, let's say) and not expect that stigma to follow you around. You know exactly who you're working for, what they represent, and how other people would perceive that line of employment. Would you expect to be exempt from people making any kind of judgement about the kind of people you choose to work for since you're just making a living?
That's a super softball variant on what we're talking about here. Porn is something that doesn't bother some people and deeply bothers others, but "I contributed to your government spying on you and systematically eradicating your privacy (but just a little bit)" isn't going to play well anywhere, as well it shouldn't.
By the way, about your Mafia/Nazi examples: a zealous prosecutor could easily charge the housekeeper, and the police and FBI would certainly lean on them a bit whilst investigating the boss. That's totally commonplace. Meantime, in the case of the literal Nazis, yes, there were in fact consequences for such people (leaving aside that you've constructed that scenario very poorly).
That's the issue I want to discuss. If we are striving for a better society, discrimination should be minimized. If we keep discriminating against them, penalizing them, the only job they will ever have is an NSA job.
I think it is now clear why it is important to look at the effects, and the moral issue, of how we treat them.
And to many, the logic is: the NSA is evil, therefore every NSA employee is evil. That's a big issue.
Bullshit. Discrimination is the foundation of a sane society. Unjust discrimination with no bearing on one's ability to do a job (such as on the basis of one's genital configuration or preference of who you like to smooch) should be minimized, but you absolutely must be discriminating about who you choose to work with. After all, if we really wanted to minimize discrimination, we'd just hire people for any job without any consideration of their capabilities, experience, or work history.
Being cautious about hiring potential employees because their work history demonstrates that they may be a threat to the security and reputation of your company is a damned good reason to discriminate.
I agree that it's not fully accurate to say "The NSA is untrustworthy, Bob worked for the NSA, therefore Bob is untrustworthy" - but because of how institutions work, if you want the institution to pay attention, then yes, there must be impact on its members. There is no way around this.
And what about those who left the NSA, either because they hated it or because they realized it was wrong?
So if the solution is to be Snowden - to declare publicly that you do not endorse the NSA - it is equivalent to saying "you must face a public trial; you must tell the world, upon exit, that you do not and will not endorse the NSA anymore." Everyone ought to be a hero.
I do understand the institutional constraint. If all NSA employees rage-quit today, would we just forgive them? But that is not going to happen, for various reasons.
When Bob just wants to introduce himself and people say "You worked for the NSA? Go away, spy" -- that is a sign that effective social punishment of the NSA - the institution - is taking place. I see you worrying about the scenario where no amount of explanation Bob can give is enough to get him in the door. That's fine. But I'm arguing that Bob should have to do some explaining, and I'm arguing that if you do want the NSA - the institution - to suffer effective consequences, then you must accept that individual members will also face some gnarly consequence, because that's how institutions in a social context work. You cannot meaningfully punish the institution without also punishing its members. It does raise moral issues, yes - but if you have already decided, "we must punish the institution," then you fatally compromise that goal by adding a lily-livered "but let's not make things too hard for the members."
> should bear the burden of explaining.
Yes. Indeed, but how many will accept that? How many of us here will bother to look at it and say "he sounds genuine"? We should think about what one can do to help people get out of the NSA and still receive some degree of respect and humane treatment.
If punishing means not giving them a chance - because, as some have declared, "I will not trust them at all anymore" - these poor souls will have no jobs but NSA jobs.
And let me restate my position: I don't endorse the NSA, and I believe people are morally responsible for the work they do (everyone should believe that, whether it's the NSA spying or Adobe leaking passwords). But let's focus on what one can do to make the transition easier. What makes them part of us again?
When Mr. Foo Bar, NSA cryptographer, submits a paper for his next talk, sure, the chair will just throw it away and say "fuck it, no one wants to hear bs from the NSA." As you can see, even though some people say "I have no problem with individual contributions," some probably left out the part "but I will remain skeptical and uncomfortable with them being around and making changes to the Linux kernel." And some just never want to see those poor souls on IRC anymore.
However, if you work in a place, there is the Buddhist question of "right livelihood." I'm sorry that I don't know any better way to put it. Given that the NSA has been doing some things that seem objectionable, shouldn't everyone who contributes to that effort begin examining their conscience? And yes, a secretary does contribute to that effort.
Now it might be that the latest report will prompt some much-needed changes. And heaven knows, countries need intelligence to combat things like terrorism. But still. If you were part of the NSA, one of two things would today be true:
a) You knew what went on, and agreed with it. Well, maybe your ethics are not what we would all like.
b) You didn't know what went on, and aren't impressed now you do. I suggest it's now time for you to consider whether this is an enterprise you should contribute time and effort towards.
And if that comes at odds with feeding your family, well, maybe you can't just resign. But you certainly can look for more ethical work, or take part in internal conversations about how the organisation can become more ethical.
And we continue to be scared because of their background and previous involvement.
> a) You knew what went on, and agreed with it. Well, maybe your ethics are not what we would all like.
Not everyone is a saint; some are utilitarian and some are not. People also change the way they perceive things.
I suggest reading and considering Hannah Arendt's famous work, "Eichmann in Jerusalem: A Report on the Banality of Evil" for a different perspective on this.
Our typical idea of a perpetrator of evil is this malicious, moustache-twirling, cackling super-villain. The reality of how very evil acts are perpetrated is typically much different, though. It has more in common with what we see with the Milgram experiment: lots of small cogs doing highly compartmentalized, abstract work that simply adds up to something ghastly. Paper-pushers who simply "do their jobs" without bothering to examine or take responsibility for the ethical implications of their work output, preferring instead to shift that ethical responsibility to the superiors/authorities who handed down their orders.
But the issue people keep missing, or where they're not in sync with me, is practical: how do we minimize discrimination, and how do we handle future employment? All I am hearing is "they are still morally responsible." If, for the sake of discussion, we assume they are morally responsible, my actual questions are still unresolved. And we just keep saying "yeah, it sucks to be them, but what can we do?" That's the exact attitude we always have and won't be eager to fix, only because we dislike what the NSA has done.
Why on earth should they get a free pass for that? Have you seen the scope of the damage they have done?
Anyone who chooses to work (directly or indirectly) for the Government should realize that the public good (not to be confused with "good for the government") must come above "feeding the family".
It is very easy to fall into atrocities and "following orders" when you do things in the name of "feeding the family". For example, spend a few hours playing "Papers, Please" (or, even better, watching other people play it) and you'd be surprised how easily people start looking at the characters crossing the border as adversaries.
 http://papersplea.se/ (It's on sale in the Humble store for $4.99 USD.)
Radical, certainly. Unprofessional? Professions usually have ethical codes of conduct. If you violate those ethics, you're not a professional. We don't have a unified professional ethical code, but ethics is absolutely an important part of professionalism.
Will draining the NSA of its ethical population improve it? Perhaps -- by hastening its demise -- but that is not the obvious result. Needs further analysis and argument.
A professor of mine who was active in the peace movement in the eighties was talking to me about employment at defense contractors like Boeing or Lockheed. He said "If ethical people always choose to not work for defense contractors, then only non-ethical people will be working for defense contractors. Do you want defense contractors to be composed solely of non-ethical people?"
Any employment-offering entity (EOE) $foo can be substituted in the place of Boeing or Lockheed in that argument: "the mob", "the Stasi", "revenge porn sites", etc.
So it really does boil down to intrinsics, after all. If we're going to commit ourselves to ending perennial abuses perpetrated by EOEs that seem entrenched and averse to wisdom and self-reflection (like large military contractors, or our so-called intelligence agencies)... then perhaps it should no longer be socially acceptable to work for them.
An organization is made up of people. If you don't like what the organization is doing, you start by holding the people accountable. I see nothing wrong with shunning people who, in their professional capacity, are part of a machine that uses its capabilities to undermine my rights and privacy.
The only stuff that doesn't upset me is genuine foreign intelligence. The NSA can listen in on the Israelis as much as they want, as far as I'm concerned; the Israelis sure as shit listen to us.
The rationalization from some posters in the thread of why he shouldn't be removed is scary.