Yahoo installed a backdoor for the NSA behind the back of the security team (diracdeltas.github.io)
1009 points by xenophonf on Dec 15, 2016 | hide | past | favorite | 280 comments

I know this is from October, but it warrants re-reading now.

Today, Yahoo announced a hack of 1B accounts. They say they don't know who it is, but we can conclude it's not the US government because Yahoo is willing and legally able to publicly disclose it.

Previously, Yahoo willingly assisted an attacker in compromising 1B accounts. In this case, they did not disclose the attack publicly, or even to their own chief information security officer, because in that instance the attacker was the US government itself.

US intelligence activities are actively harmful to American commercial interests because they destroy trust, particularly from customers elsewhere in the world.

[1] http://arstechnica.com/tech-policy/2016/10/report-fbi-andor-...

Given its government's track record, I would think that data centers in the US should be walled off in much the same way as data centers in China. It is frankly surprising that American companies are blind to their own government's track record for indiscriminately spying on its citizens and people around the world.

You mean American companies like Apple? Whether you trust them or not they've certainly brought it up. Though they might not label the spying indiscriminate; maybe still 'criminate.

Or are Chinese companies doing some interesting walling-off?

Google encrypting traffic flowing between data centers seems to reflect significant awareness of the issue too.

When I worked at Square we required mutual TLS for almost all service-to-service communication within a datacenter, so sniffing traffic within the datacenter wouldn't be fruitful either (assuming no weakness in the cipher or TLS stack). Is that not common elsewhere?


I've had to explain to clients on numerous occasions why I enabled TLS between things in the same data center. Security requires layers, and just because the data is on a network you own doesn't mean nobody can get in.
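For reference, the service-to-service setup described above is ordinary TLS with client-certificate verification turned on (mutual TLS). A minimal Python sketch of the server side — the file paths and the internal CA are hypothetical, and a real deployment would handle rotation and revocation too:

```python
import ssl

def make_mutual_tls_context(ca_file=None, cert_file=None, key_file=None):
    """Server-side context that refuses clients without a valid certificate."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # no legacy protocol versions
    ctx.verify_mode = ssl.CERT_REQUIRED            # reject unauthenticated peers
    if ca_file:
        ctx.load_verify_locations(cafile=ca_file)  # internal CA that signs service certs
    if cert_file:
        ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)  # this service's identity
    return ctx
```

With `CERT_REQUIRED` set, merely sniffing or even connecting inside the datacenter gets an attacker nothing without a cert signed by the internal CA.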

This is a really good point. There are plenty of articles out there suggesting it's fine to terminate SSL at your app proxy, since if you can't trust your own datacenter your security is pants anyway. These semi-informed claims about security, often delivered in set-up instructions, are annoyingly frequent too. The attitude seems to be that it's fine since everyone does it.

I get why somebody would have argued this before, but things like Let's Encrypt make it easy to manage the certificates for your internal hosts. I guess it gets more complex if you have your apps in containers that need their certificates managed, etc. Even that shouldn't be a big deal: renewal can be automated with some scripting, or managed properly with a backend CA like r509. Also, the performance argument no longer counts when the sales pitch could be "we guarantee you e2e encrypted services."
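The automation the parent describes is mostly a scheduling problem: a cron job checks each internal cert's expiry and triggers reissue when it's close. A minimal sketch of that check — the 30-day window mirrors common ACME-client defaults, and everything here is illustrative:

```python
from datetime import datetime, timedelta, timezone

RENEW_WINDOW = timedelta(days=30)  # reissue well before expiry

def needs_renewal(not_after, now=None):
    """True if a certificate's notAfter timestamp falls within the renewal window."""
    now = now or datetime.now(timezone.utc)
    return not_after - now <= RENEW_WINDOW
```

A nightly job could loop over the internal hosts, call `needs_renewal()` on each cert's notAfter timestamp, and request a reissue from the internal CA for anything that returns True.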

Also, data engineering jobs become a lot more complicated when certain data points are suddenly no longer available for visualization on "BI dashboards" consumed by wanna-be tech-savvy CxOs, so that too may be a factor.

(The opposite of indiscriminate is discriminate)


I don't get it...

The root word at play is CRIM

See also crime, criminal

Replace the "te" with "l"

I'm still lost?

It's a (bad IMHO) attempt at sardonic humor by punning on the word "criminal".

'criminate => criminal

The "te with l" tip was for the original "'criminate" and not very clear.

Duh I'm sorry I thought that that was an "i". I kept thinking is that some Latin form of pluralizing the word. Oops. Where's my coffee...

The US Government has far more legal leeway to spy in other countries than in its own.

But they have all the facilities to do it in their own.

I'd say if you believe there are any real checks or balances to the NSA's power you're naive.

> we can conclude it's not the US government because Yahoo is willing and legally able to publicly disclose it.

This implies a potentially fatal assumption that IC agencies perform clandestine activities with prior notice to mission targets.

That's a very good point actually. Though in this case they say the tip-off came from a law enforcement agency (I assume they mean an American one).

It's possible, however, that that agency doesn't realise it's a clandestine operation by another US agency. Or they do, but the operation is now complete (this was three years ago) and they want a way of informing Yahoo about the breach without admitting who was behind it in the first place.


PRISM (Yahoo joined 2nd in 2008 after Microsoft in 2007) would basically defeat the purpose of doing this without permission...

Did Yahoo Mail even use HTTPS? In that case a FISA warrant would just be an extra level of assurance that they got everything from that person's inbox (plus inboxes of 3 hops of everyone they ever emailed). Otherwise they were just an XKeyscore query, probably filtered by US geodata, away from getting whatever email they wanted (plus unlimited hops).

I feel like everyone has forgotten about PRISM. All the big companies denied they participated in PRISM. Is it real, and if so, why have Americans been so complacent about its existence?

This is not a mystery. PRISM == use of secret court orders to collect information from companies regarding certain targets. Can you clarify what you think it is and how it is relevant here?

What do you think PRISM is?

Americans have the attention span of a newt, that's why.

Your understanding of PRISM is entirely wrong. It processes emails these companies send to the FBI for specific accounts that the government has a court order for, not all the emails that these companies handle.

That's why I said FISA warrant... which includes 3 hops.

Otherwise they would collect any email ever sent unencrypted via submarine fiber wiretaps, which feed into XKeyscore.

FISA warrants do not include 3 hops. There is nothing in Snowden's leaks that suggests the NSA uses submarine wiretaps to collect your emails either. You seem to have ignored the leaked documents entirely and made up a scary all-seeing Boogeyman.

Wrong. I don't even need to cite Snowden slides, congressional testimony will be sufficient:

> Turns out the data collection is not so limited. In testimony yesterday before the House Judiciary Committee, National Security Agency Deputy Director Chris Inglis said that the NSA’s probing of data in search of terrorist activity extended “two to three hops” away from suspected terrorists. Previously, NSA leaders had said surveillance was limited to only two “hops” from a suspect.

> Inglis said that the NSA looks at two to three hops from a suspect. To determine how many hops you are from Osama, for example, the NSA’s data analysis engine software constantly plows through information and builds a model of all the relationships between every phone number on record and every IP address. Other software robots query the graph to discover which “nodes”—phone numbers, IP addresses and email accounts—fall within three degrees of separation from an established suspect.

> If you have a direct relationship with a suspected terrorist or target (you’ve called them, you’ve emailed them, you’ve visited their website) that’s a “one hop” relationship; there’s a solid line connecting you to that person in the NSA’s relationship graph. If you talk with, e-mail, or visit the Facebook page or website of someone who’s got a one-hop relationship, you’re two hops away. Add one more person in between in the graph, and you’re three hops away.

> Under the NSA’s FISA requests, Google, Microsoft, and other Internet services companies can be compelled to hand over relevant data from their servers on any account that falls within the three-hop range and is flagged as belonging to a person of interest. If you’ve won this lottery, the NSA will get access to your e-mails on Gmail or Outlook.com as well as your chats and Web-stored contacts, your documents, your synced data from computers and mobile devices, your backups, and anything else that can be handed over—at least, so the documents Snowden leaked imply.

> Your raw Internet traffic will get more attention as well. Your IP address will be watched more carefully by deep packet inspection hardware at the NSA’s 'Net taps, and what you do online will get extra scrutiny.
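The "hops" expansion the testimony describes is ordinary breadth-first search over a contact graph. A toy sketch, purely to illustrate why the hop count matters so much (nothing here reflects any real system):

```python
from collections import deque

def nodes_within_hops(graph, start, max_hops=3):
    """Return {node: hop distance} for every node within max_hops of start.

    graph maps a node (phone number, email address, IP) to the set of
    nodes it has directly contacted.
    """
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if dist[node] == max_hops:        # don't expand past the hop limit
            continue
        for neighbor in graph.get(node, ()):
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    dist.pop(start)                       # the suspect themselves isn't a "hop"
    return dist
```

With even 100 contacts per node, three hops from one suspect can reach on the order of 100^3 = a million accounts, which is why the difference between two and three hops is anything but academic.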


I'm not sure if you're a casual spectator, willfully spreading disinformation, or just inclined to ignorance, but the dismissive "boogeyman" posturing boat has long sailed. PRISM's only purpose is to fill in the gaps of passive collection by directly sourcing data; otherwise data comes in primarily from submarine wiretaps and a multitude of other passive collection sources, or associated Five Eyes programs.

And yes, FISA warrants include 3 hops, and if you know anything about the Internet you know that is a hell of a lot of data for a single warrant. And public data shows the FISA court denies only around 0.1% of warrant requests. Even rubber stamps have to pretend they are doing their job.

I see what you've misread now. To explain: they look two to three hops away using the metadata (phone records and until Obama cancelled the program before Snowden leaked anything, the email envelope data — from and to). They did not actually request the data of everybody three hops away, and nowhere in your quote is that claim made.

PRISM doesn't "fill in gaps." It is their main source of actionable intelligence according to the leaked slides, and it only contains the data of the person being watched, requested via court order, approved by the company, and then sent to the FBI.

Read the transcripts of the Yahoo court case where they challenged the FISA court order. Then tell me they weren't asking for 3 hops...

Further proof:

> 50 U.S.C. § 1861 (b)(2)(C). These call detail orders cannot last longer than 180 days. Additionally, in an application for call records “two hops” from target—call records from people in contact with the identified target—the government must base its request on “session-identifying information or a telephone calling card number identified by the specific selection term” used in its first request. In December of 2015, the FISC ruled that USA Freedom does not require the government to show that these “two hops” call records are relevant to an ongoing investigation.

The only development I've found has been that they promised to use 2 hops instead of 3. Which is good. But I'm not convinced the 2 (previously 3) hops are limited to simply metadata.

Additionally, this whole discussion of FISA warrants and limitations applies strictly to Americans. They can collect the full content and metadata of all foreign traffic they passively intercept.

The issue with metadata that was debated is primarily because they had unlimited warrantless access to American metadata since it's basically public data in their view. No one cared about non-Americans. The FISA orders are for granting analysts access to full content on Americans (most likely they already have most of this data; they just aren't allowed to query it without a warrant).

They don't need warrants to collect metadata.

Point me to anything in the Yahoo court case that says they were asking for three hops.

Your "further proof" also shows that they don't ask for three hops. It says that in order to request call records (phone call metadata, not the communications themselves: https://en.wikipedia.org/wiki/Call_detail_record), the investigator must show that the user is two hops by communication from a target. They determine this from the full-take phone metadata collection program that ended last year (https://www.washingtonpost.com/world/national-security/nsas-...) despite being ruled legal by the courts (http://www.reuters.com/article/us-usa-court-surveillance-idU...). According to Snowden's leaked documents, analysts have neither the authority nor the tools to look at anybody's call records in that full-take data but are only able to query it in specific ways (e.g., list the anonymized numbers that are 3 hops away from a particular number). The government can then apply for a court order to request the call records for a particular number according to the rules you quoted.

> They can collect the full content and metadata of all foreign traffic they passively intercept.

They can, but according to Snowden's leaks, they don't outside of a handful of hostile countries. The poster's friend is unlikely to live in one of those countries. This is not unique to the US -- Pretty much every country's laws allow the government to collect any data on foreigners.

> The issue with metadata that was debated is primarily because they had unlimited warrantless access to American metadata since it's basically public data in their view.

Also false, as I explained above. They have legal access to collect it, as I showed above, but the law allows them to query it in only a few restricted ways.

> The FISA orders are for granting analysts access to full content on Americans (most likely they already have most of this data; they just aren't allowed to query it without a warrant).

Completely wrong. FISA Section 702 orders can only be for non-Americans living outside of America. Data for a non-American living in the US cannot be requested, and data for an American living outside the US also cannot be requested. You're thinking of NSLs, which also must specify the particular user whose data is requested.

If you can't say more your words are useless. The only thing you need to comment on this website is an email address.

Unverifiable comments like this are harmful.

> US intelligence activities are actively harmful to American commercial interests because they destroy trust, particularly from customers elsewhere in the world.

I think they actively harm US corporations because they fundamentally destroy the trust of US citizens too.

Yea, but they're still going to buy the products.

Foreigners, especially foreign corporations, are the portion of the market whose buying decision is most sensitive towards these issues.

On one hand, I actually do see domestic harms. People are a bit less inclined to use Yahoo, and a bit more inclined to use Apple, because of their different levels of pushback.

I agree, though, that this is a much smaller issue than the loss of foreign sales. Surveillance won't kill US iPhone usage, but it could probably destroy Cisco's foreign markets.

> US intelligence activities are actively harmful to American commercial interests because they destroy trust, particularly from customers elsewhere in the world.

We already stand as the most powerful country on earth. It's a great testament to ineptitude in government that this is the current reality.

Yes. And dangerous in the long term, because the US won't be the most powerful country forever. High trees catch a lot of wind. People are more likely to hate the US. When the tides change and the power imbalance goes away, the hate and mistrust will still be there. But maybe society will be wiser this time around.

>> because the US won't be the most powerful country forever

I honestly find this hard to believe because:

- When there is conflict in the world, countries always come to the US first for military intervention

- When there is a serious disaster of some kind, countries always expect us to send billions in aid (both militarily and financially) to help them

- When a country is trying to obtain nuclear weapons or weapons of mass destruction, they come to us to stop them

Essentially, it comes down to everybody coming to the US first for everything. When that starts to change, maybe I would entertain the idea that we won't be the most powerful country. When so many countries and millions of people rely on us for so many things, our position as a global leader won't change in the near future.

And quite honestly, I know there are large chunks of our population that would welcome some other countries stepping up and taking the lead instead of the US. It would certainly save us the thousands of military personnel that have been lost through our involvement in questionable conflicts in the Middle East.

I'm not disagreeing with your assessment of the current situation, but forever is a long time.

> I know there's large chunks of our population that would welcome some other countries stepping up

> It would certainly save us thousands of military personnel that have been lost

The US spends more on its military than the next 30+ countries combined, of which only one is not a NATO ally.

We in Europe seem quite content with the situation. We get the fruits of the American sacrifice, but not the economic and cultural burden. We spend all that money on social issues in our little paradise. Then we point our fingers at the US and naively assume our freedom and security come from our being such darn nice people, not realizing it is the United States that walks the perimeter of our little paradise of moral superiority.

But outside of Europe, there are lots of countries who definitely want more power and standing in the world (China, Russia, India). And the economic imbalance from our history is wearing off.

So financially, the military budgets are going to look more similar. The West will only keep its dominance in terms of culture and values if we (Europe) and the other NATO allies start contributing our fair share. As a side bonus, this would give us more influence over the more questionable aspects of US military strategy. But like the spoiled child of rich parents, I doubt we will figure that out in time. The US might end up in a situation not that different from Germany after WWI or WWII.

I'm not arguing it's fair. But I don't have to explain to an American that everyone always roots for the underdog. And the US hasn't been the underdog for a long, long time.

You make some really excellent points, and I agree with all of them.

I will be curious to see what happens once the US does have to share the military and economic influence you reference. Will we be happy to hand it off? It seems lately that our population has grown weary of waging constant proxy wars and getting involved in long-standing regional and tribal disputes that go back thousands of years with no real end in sight.

I feel like there is the beginning of a vacuum and there are a lot of countries already queuing up to get a shot at that power and influence.

Aren't you assuming the implied change of power mentioned in the parent comment will come at the hands of a rational actor/state? That seems fraught with hindsight, no?

> we can conclude it's not the US government because Yahoo is willing and legally able to publicly disclose it.

That is the way to bet, but it is also possible (though not very likely) that Yahoo disclosed this publicly before the relevant USG agency could get an NSL out.

That doesn't make much sense though. If they could get an NSL for this with a gag order after the fact, then even a rubber stamp judge would no doubt be curious why they didn't get one before and notify Yahoo.

> They say they don't know who it is, but we can conclude it's not the US government because Yahoo is willing and legally able to publicly disclose it.

I would assume, like most announcements, the reason it's being announced is that the data is available out there and has been seen in the wild.

I seriously think that to get a CS, EE, or similar B.Sci. degree, you should be required to take at least one full-term-length ethics course. Same idea as the ethics courses taught to junior law students.

The internet is already fucked up enough with governments and rogue corporations messing with its AS-adjacency topology in non-free ways at OSI layers 1-3, before you even get into stuff like writing backdoors at layer 4+ to pass all email to the NSA.

When you get a legally-binding order from the government of the United States of America, and exhaust your legal appeals, you either comply or go to prison. An ethics course won't do you any good.


Options:

- Refuse to take action. They want engineering done; they can bloody well do it themselves. Don't type a single keystroke in the direction of helping them.

- Announce what is going on anonymously. Plenty of avenues for this.

- Announce what is going on, publicly. See if they do indeed want to take you to court.

- Quit.

- Take down the service. Much easier if the service is only a part of your company. Helps a lot if you don't retain the information they're looking for.

- Lie.

- Destroy records (this is by far the riskiest action here, above a simple public announcement).

- Delay. And delay more.

- Keep information outside the jurisdiction, possibly controlled by a third party who will not comply with orders.

- Misunderstand ("Is that a one, or an ell?")

Most of these will get you into trouble, a few won't. Most of these are really difficult roads.

I've given this some thought before. If I found a backdoor in a product, I would remove it with a tracking bug and a checkin, and send internal email about a really bad bug that I'd just fixed; the more internal people that know, the better. And if a VP showed up and berated me, I'd just tell them to fuck off, and quit if it came to that.

In a large company, a useful thing to do if something fishy is going on is to go see the company's general counsel.[1] If they didn't know about it, they should be told. Their job is to keep the company out of legal trouble. In many cases they have a legal obligation to do something about it. An attorney will rarely tell you to do something illegal; they can be disbarred for that. If they tell you it's OK, then they've given you legal advice as an employee, which gives you some protection when things come unglued. You're also unlikely to be fired for talking to the company's general counsel.

[1] http://www.slu.edu/Documents/law/Law%20Journal/Archives/Dugg...

I'd suggest following up any conversation of substance with general counsel (or even the likes of HR for that matter) with an email detailing the conversation you just had.

Don't do it covertly; let them know in advance that you will be sending it following the meeting, and maybe even get them to suggest the wording of it if you aren't convinced of the security of internal mail.

It's easy for either side to forget the exact points raised during a conversation like that, so good for everyone to have a written record.

You might as well just quit if you think HR is there to help you at all. HR is there to protect the company from you.

Corporate counsel != HR.

But wait! The emails are being intercepted by the NSA and CIA...

In the case of NSLs, or UK orders from the security services, it may be illegal to tell anyone who isn't mentioned in the order. Including the company's general counsel.

In the event that you are asked to do something illegal, it may be illegal or inadmissible to mention that you were ordered to do so by the government (Matrix-Churchill trial passim)

Would be interested to hear a legal opinion on this. In the US, you do have a constitutional right to counsel (Assistance of Counsel clause of the Sixth Amendment), generally of your choice.

From my reading of the case law, the crux would be that right to counsel can only be invoked "at or after the time that judicial proceedings have been initiated against him, whether by formal charge, preliminary hearing, indictment, information, or arraignment."

Does an NSL constitute adversarial proceedings? Also, how does right to counsel work with other legal gag orders (e.g. if I'm cooperating to provide information on a criminal case but have been ordered not to inform the targets)?

IMHO you have a right to counsel because as a normal citizen, you can't be expected to actually understand if the letter is valid, legal, etc.

That's actually exactly why we have counsellors.

If you're the CEO or some other exec in receipt of such an order, you're not going to be able to single-handedly implement a backdoor without anyone noticing, even if you possessed the skill to do so in the first place.

The nature of such an order requires you to be able to tell the people who need to do it what it is that you need them to do, even if you can't tell them why.

Additionally, anybody with the power to veto such a change also must be provided with a good reason why they can't veto this one. Your legal counsel needs to understand why the change must happen, so he can respond appropriately to questions from pissed off developers and ensure that your company is complying with the letter of the demand (and no more).

I wonder how they think this is supposed to work.

- CEO gets a Letter. Does the CEO start learning Python/C++/PHP and Cisco configuration? Or does he tell a worker bee "Shhh! And read this Letter" ?

- Worker bee starts making changes to production code and systems. Suddenly he starts needing automated code reviews, and reconfiguration alerts go out when he frobs the firewalls. These changes are indistinguishable from those of an infiltrator with the worker bee's credentials, and ideally things are set up so that changes are generally shared around (a normal review process) to catch out-of-control worker bees.

- The build lab scripts are modified (by who?) to insert bad code. Oh, but the build checkers catch this ("Hey, we found a compiler bug!" / "Umm, no you didn't..."). Everybody starts handing around links to "Reflections on Trusting Trust".

- Things get even more exciting when the internal monitoring systems discover (say) equipment attached to the network that ain't supposed to be there. "Wot's all this then," says the network engineer, and he yanks the cables to the SkankSec-1000 that someone hot-wired into a rack. "Oh yeah, blue fiber is for NSA, green is for CIA, yellow is for GCHQ, and black is for the Russians, what else?" He leaves it unplugged. Let's ignore the security camera footage in the datacenter, since this is a thought experiment.

In an environment with self-monitoring for health and intrusion detection, applying changes for user surveillance requires quite a lot of internal cooperation and communication. No wonder the Yahoo stuff looked like a Bad Guy who got in.

We can probably extend the internal defenses to alerting on odd access patterns to sensitive database rows, too . . .
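The "equipment that ain't supposed to be there" check boils down to diffing what the network actually sees against an asset inventory. A minimal sketch — the data sources (switch tables, ARP caches) and formats are assumptions, and real NAC systems do far more:

```python
def normalize_mac(mac):
    """Canonicalize 'AA-BB-CC-DD-EE-FF', 'aabb.ccdd.eeff', etc. to aa:bb:cc:dd:ee:ff."""
    digits = ''.join(c for c in mac.lower() if c in '0123456789abcdef')
    return ':'.join(digits[i:i + 2] for i in range(0, 12, 2))

def unknown_devices(observed, inventory):
    """MACs seen on switch/ARP tables that aren't in the asset inventory."""
    known = {normalize_mac(m) for m in inventory}
    return {normalize_mac(m) for m in observed} - known
```

Anything this returns is a candidate for the cable-yanking treatment described above; MAC addresses can of course be spoofed, so this is one layer among several, not a complete defense.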

The easiest way is probably to create a bullshit project with a few people. "We are only creating a new dashboard for X," "this is a cost-reduction project," etc.

I don't know how Yahoo is organized, but if teams work in silos, without any visibility into other teams, it is probably not that hard to introduce changes that go undetected.

In a place where everything is monitored, down to the MAC address of machines and their network traffic, ideally it would be difficult to sneak in a monitor.

Access to critical data should be similarly protected.

These are relatively tame intrusion detection systems that you would have to make changes to in order to remain undetected. That should be really hard to hide.

This is apparently pretty much what happened at Yahoo! when the security team found the rootkit.

My understanding is that US requests explicitly include a clause that exempts your counsel from the NDA requirement, in that you're permitted to show it to and discuss it with them.

(They, of course, are bound to not share it further.)

That almost has to be the case. Otherwise, I get this "legally binding" order that I don't understand but looks scary, but I can't discuss it with my counsel? How am I supposed to tell the difference between that and a con job?

I'm in an at-will employment state, so they'll just find something else to fire me for.

Edit: But seriously, I'd take the job loss in a heart beat if my company was doing this crap.

In other words, act the same way most gov't legislators would act when asked about any non-public-relations information... "but sir, I was just following your example..."

>Options: - Refuse to take action.

I realize this stuff is easier said than done, but there's a lot of tough talk about the ethics of many things on HackerNews. How many people here work for companies like Yahoo or Facebook or many smaller shops that are legitimately harming people with these sorts of things?

I've worked on products with tens of millions of customers, in a position to know if something nefarious was going on, and also in a position to be able to do something about it.

This is why I've given it some thought. This is something that you probably need to spend a little time thinking about if you're in a similar situation, because things are not going to get better.

It's a personal decision. I can't tell you it's worthwhile to quit your job or risk legal action. But if you're responsible for the privacy and rights of millions of people, you should consider what your actions will be.

I'm all for taking the high road in these situations, but I would strongly recommend against option 2. NSLs and the like are targeted narrowly and they will find you, so you might as well get some public kudos for speaking out. Speaking of which, regarding option 3: they definitely will take you to court, definitely win, and throw you in jail. Not to mention that in both cases they will likely cease-and-desist/gag/NSL whatever media you speak out in right quick, so unless you have legit media connections there's no guarantee your announcement will go viral and spread beyond their ability to douse it.

I think it's reasonable to expect a reasonable level of (legal) pushback from service providers against government warrants or mandates. It's unreasonable to expect them to risk their entire business.

But they're doing that anyway, in bits and bobs, when they erode the trust of their customers.

I think maybe this is something that can only be solved by elected officials being pressured by the public. Expecting companies to go up against the government is not reasonable.

Or...leave! Mine are Kerala and São Paulo.

Take it up with the people who thought it was a good idea. All the above is sneaky. Just talk with people up the chain if someone is following orders from someone else. If that means you end up in the white house, so be it.

Be found in contempt and go to jail... you first?

No - Nelson Mandela first. Martin Luther King first. There's a precedent of people going to jail for standing up to civil rights violations.

Mandela went to jail for acts of violence not for standing up for human rights.

All that would accomplish is that you and everyone you sent this letter to would now be subject to another NSL with a gag order, and your fix would be undone.

I don't see how a gag order can scale to hundreds of people without the subject matter becoming common knowledge.

Lots of secrets scale to hundreds of people just fine, and are enforced with weaker penalties than a prison sentence. As far as any of those people are concerned, they are now actively monitored in the name of national security (the same reasoning as why the NSL was issued).

If you want to quit to refuse to help, fine, but I take serious issue with what you're suggesting beyond that.

These are the legal actions of the United States government, executing the authority given to them by the people's elected representatives, and overseen by a judiciary duly selected according to constitutional procedures. Their legal and democratic authority is unassailable.

Further, you don't know what motivated these actions. Is it intelligence about a specific threat? Has the government identified how a specific adversary operates? You could be impeding the investigation into a deadly plot. You have no way of knowing this.

This isn't a game. We have real enemies who would gladly kill all of us. We have processes as a society for deciding how we defend ourselves. If you don't like our current policies, then exercise your vote, exercise your right to speak and publish and assemble. But don't usurp the sovereignty of the people of this country.

> You could be impeding the investigation into a deadly plot.

We have warrants for that. Subpoena Yahoo for specific records. I'm not willing to trade the freedom of an entire country for a single investigation. This isn't an episode of 24 and there is no Jack Bauer.

The fourth amendment was sold as follows:

> The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

I'm not a constitutional lawyer. I'm sure the actual amendment can be interpreted to mean whatever George W. and Co. want it to mean. But that's not how it was sold, and this government is not legitimate as far as I'm concerned. Not as long as it runs dragnet surveillance over the American people. Snowden is a patriot, and the U.S. gov't is betraying the public.

George W? He's been out of the picture for coming up on a decade. There's another guy in there now who could have fixed all this, if he wanted to.

"How do you know it's not really about a truly bad guy?"

Ans: Tens of thousands of NSLs and other secret actions make it really improbable that a small set of refusals to cooperate will result in great harm. Or that the game the government is playing is really about "bad guys" most of the time.

"No really, this is about an actual bad guy this time!"

The government has lied repeatedly about this stuff. It has covered up the fact that it performs unlawful surveillance on its "sovereign people". If this is about a bad guy this time, rather than all the other times when it has been lies or cover-ups, then that's just too damned bad, isn't it? See also: The Boy Who Cried Wolf.

I have a fundamental disgust with arguments that attempt to appeal to blind loyalty. That "sovereignty of the people" you wrote about has been invoked too many times by scoundrels to be respected.

Sure, there are indeed bad guys who are trying to kill us. But that doesn't justify a wholesale surveillance state, or tens of thousands of NSLs, or secret courts. That's just stuff the powers-that-be want, and have used the bad guys as an excuse to get.

I'm not so sure of that. For example, it's not clear that Congress has the authority to create NSLs, a sort of not-a-warrant. It's not clear that they can issue gag orders like this. Even if they could do these things with warrants, they're not---and that matters.

The judiciary is not meaningfully overseeing the parts of the state where law enforcement and national security intelligence mix. The FISC is a great example: it pisses off the agents and attorneys who have to deal with it, because it imposes an unholy amount of drag: they're trying to do their jobs, catch bad guys, etc.---and a FISC submission takes weeks of full-time work. But the FISC only exercises drag, not control. It can't steer---we see this because it never refuses a request. I suspect that's all one-sided proceedings can do, given the way we train our lawyers and judges. So it fails the needs of the citizens, and it doesn't protect liberty.

The FISC, and NSLs, will someday be looked at in the same light as Adams' Alien & Sedition Acts: as fundamental betrayals of the American promise by an executive too scared of what might go wrong to trust the country to go right.

Look, there's a lay case against their legal and democratic authority: NSLs are not obviously a legitimate exercise of federal authority. And for all that, if I were at the FBI and chasing a terrorist cell on American soil, I'd use one in an instant.

> It can't steer---we see this because it never refuses a request.

There are issues with FISC (like lack of transparency), but this specific point is unclear to me. Here are some possible scenarios where a low refusal rate wouldn't be a problem:

* The judge could informally tell the agent that the request needs to be amended before he'll grant it.

* FISC could publish clear guidelines on what is and is not allowed, so agents are never surprised and only submit legal requests.

* Agents only make requests in extraordinary circumstances, so every request comes with a lot of legwork and investigation done making it clear why it is necessary.

The US court system is an adversarial one. You make arguments in court. There are two sides.

FISC is a court in name only. There is no oversight (I don't count the political appointees in the Senate as oversight), there are no checks and balances, there are no appeals. Calling it a court beggars the term.

> You could be impeding the investigation into a deadly plot.

The problem is, that's the excuse that's always going to be given by the intelligence community for even the most obviously unconstitutional, intrusive, privacy-invading stuff, like GCHQ's "Optic Nerve" program.

Not an American, but the idea is that passive-aggressiveness / mild sabotage from citizens is just another form of expression, like voting. And it works, for its purpose and/or toward democracy!

This is IMHO the untold truth about why most communist/socialist regimes failed: people at all levels of society started doing their work gradually worse and worse, pissed off and stressed out by the lack of freedom and the constant interference of state organs, until everything collapsed, and people now live in better economies, and freer, because they destroyed their previous systems. Most of the people who performed such "low-level sabotage" won't even admit to themselves that they did it.

And yes, bad governments were also helped to fail by external interference and external sabotage, but IMHO a socialist regime whose people truly believed in socialism/communism could have worked and had good economic performance. The problem is that when "good economic performance" can only be had at too high a cost in individual personal freedom, people will start, even subconsciously motivated, without admitting even to themselves what they are doing, and with insect-like non-communicative coordination, to slowly and methodically weaken the system they live in from within, until it crumbles, sometimes even at the cost of their own lives.

This was America's secret weapon that helped "win the Cold War": people's inner "instinct for freedom or death". Thank God we all have it, and a race of subhumans lacking it has not yet been engineered! It might also be the reason why so many Americans "voted for Trump", though this was probably engineered by some really smart "puppet masters"; the guy is clearly a "man of the establishment" despite his clown persona...

Of course, there is no "monument to the lazy and drunk Soviet worker who helped take down communism". And there will be no monument to the Google employee who writes a subtle bug in the "government reporting module XYZ" :) Of course, if the government doesn't abuse its power and ask for too much, that poor programmer might not be so "overwhelmed" by his duties as to make stupid "mistakes" with dire consequences...

How does the old Soviet saying go: "We pretend to work, and they pretend to pay us."

That's an extremely rose-tinted narrative.

Nothing rose-tinted about it... yeah, the system got from state A to a less bad state B. And millions of people suffered and died in the process.

Probably one of the worst and most mentally damaging ways to enact social change. Now you have democracy, but also a society in which 50% of the seniors are infected with a toxic mentality that will stay with them until they die: once you do something over and over again it becomes a habit, and a habit of subtle systemic and self-sabotage is horrible. And some of them managed to pass it on to younger generations, hence extra laziness and crime. Almost like the fallout of a "mental nuclear war" that keeps lingering and "irradiating people's minds" long after the old problems have disappeared.

I'm just remarking that in its slow and lives-wrecking way, this... works!

Well, Stanislav Petrov won a Nobel Prize for refusing to follow orders ...

No he didn't. He won the Dresden prize, whatever that is.

> But don't usurp the sovereignty of the people of this country.

It's difficult to reconcile this idea of democratic sovereignty with the utter secrecy in which the operations under discussion are performed.

Their legal and democratic authority is absolutely assailable, within the laws of the United States and within its democratic processes. That's why the US system of governance contains checks and balances between the different institutions.

There absolutely should be mechanisms to challenge or appeal any action of government, and when there aren't, as seemed to be the case with NSLs, that's a very good indication that the system is being grossly abused by government agencies.

> You could be impeding the investigation into a deadly plot.

That's the "think of the children" of security agencies. Do you think they are so clueless that they need to read all of Yahoo's mail rather than just follow some accounts?

When your argument can also be used unchanged by the Nazis in the late 30s, it's probably not the best way to go.

>When your argument can also be used unchanged by the Nazis in the late 30s, it's probably not the best way to go.

You either missed the entire 'democratic legitimacy' part of my point, or don't actually know anything about the Nazis in the late 30s.

I don't see a whole lot of "democratic legitimacy" in FISA courts or black budgets in the tens to hundreds of billions of dollars.

Recall democratic legitimacy isn't necessarily moral authority. "The tyranny of the majority", etc.

People have the option to quit. The company is legally bound to obey, but individual employees can choose for themselves whether they want to be a party to it.

Yahoo may still be court-ordered to implement it, just hopefully not with their best and brightest developers. I also have a theory that a lot of the recent terrible news coming out of Yahoo is due to staff complying with the letter of the legal order but not its entire thorough spirit, intentionally doing a quick, shoddy, bug-ridden job.

Unfortunately it's not that simple. More often than not, someone high up will find someone low whose continued stay in the country is highly dependent on them keeping this job.

While I've never had to make such a choice, I can tell you that my personal attitude towards what I would and wouldn't do changed the day this restriction was removed.

The other point to be made is that you probably want your best developers implementing back doors (if you are compelled to do so), so at least the back door is as secure as possible (and not trivially bypassed).

All the more reason that you should take pause when your best developers refuse to do so.

It only works if resignations are numerous enough to matter to the company.

It only "works" in the sense of preventing it if resignations are that numerous. It "works" in the sense of "I, the individual employee, can still live with myself", even if only one quits.

This is why we (as a community/industry) need to have the equivalent of a prison/death threat.

The government wins by making it personal, by threatening prison and death to compel your obedience. It's not an abstract threat. It is directed at a specific person, not a company or security team.

The people who comply can quit those companies, but they don't. I'm not referring to the ones down the chain (e.g. the security team that was unaware) but the ones who obeyed and pushed something they believed was wrong.

Whether it is public shaming or having a permanent stain on their reputation next time they go job hunting, we need a way to retaliate.

Right now, they can fuck us and get away with it and what's their excuse? "the government told me to do it. Just doing my job. I had no choice."

There are no societal consequences to cooperating with the government even when we believe it's wrong.

Create a climate of "snitches get stitches" and maybe they will think twice before selling out the lives of a billion people. The same way these government officials hide behind "patriotism" when they mean blind obedience to their authority.

> Create a climate of "snitches get stitches" and maybe they will think twice before selling out the lives of a billion people.

Committing crimes against people who defect is obviously not going to work. They'll just arrest you for it.

But suppose we create a certification. To get certified all you have to do is promise not to work on a specific list of things: Mass surveillance, backdoors, etc. To lose certification forever all you have to do is work on one of those things.

Then we can have companies offer to put it in their contracts with their international customers that they'll only hire employees and contractors with that certification. Any companies that aren't actually working on mass surveillance etc. are happy to do this because it gives their customers more confidence that they aren't, and if it becomes popular then any companies that don't will become suspicious and lose business.

Then all employees at all companies will have to think twice before working on such things, because it would mean giving up any possibility to work at any of the companies who won't hire such people, which with success will constitute most of the industry.

> To lose certification forever all you have to do is work on one of those things and get caught.

There, fixed that for you. That small detail is why your attractive idea wouldn't work in real life. How many years did it take for the car-industry scandal about tampering with emissions tests to be revealed? And it only ruined the reputations of the few engineers involved in the forgery.

> There, fixed that for you. That small detail is why your attractive idea wouldn't work in real life. How many years did it take for the car-industry scandal about tampering with emissions tests to be revealed? And it only ruined the reputations of the few engineers involved in the forgery.

The same goes for theft and graft and murder. You don't get punished if you don't get caught. But you could get caught, and you don't know ahead of time whether you will or not.

> But suppose we create a certification. To get certified all you have to do is promise not to work on a specific list of things: Mass surveillance, backdoors, etc. To lose certification forever all you have to do is work on one of those things.

I fear you may be making a potentially dangerous assumption about how engineering works in a compartmentalized environment. Engineers do not always know the purpose of the systems on which they work.

Once upon a time in Texas I spent several years working on a system that ran a binary sample through a series of plugins, each producing some analysis of the sample. I was told that this was to help detect malware - and it could certainly do that.

Did I know that for certain? No. Were there other possible uses, such as to determine how detectable a given piece of experimental malware was? Yes. Did I have any way, shape, form, manner, or means of finding out what 100% of uses were? No.

A lot of software has more than one possible use.

You don't prohibit making or having knives, you prohibit stabbing people.

The problem with the certification model proposed is that it seeks to prohibit stabbings, but works on people who make knives.

CanaryCertified (TM).

Wow. This is probably the worst comment I've ever seen on HN. You're suggesting that people who follow the law, which you disagree with, get blacklisted, following the model of criminals killing other criminals who follow the law.

If you want to change the government, there are far better ways than retaliating against citizens unwilling to risk life and limb for your ideology. I think you need to learn to direct your anger. Not everyone is in so privileged of a position to be taking your specific moral high ground over their own well being.

1/ For the most part, this is not the "law". We're talking about instructions that have not been legally challenged and that are issued by a government agency, i.e., civil servants who are not answerable to the people.

2/ In any case, "the law" is only the result of a very imperfect process; it's not the word of God: it can be changed (it often is), and it doesn't have any special moral value. Your moral code is yours; it's dictated by your conscience and should not be taken wholesale from texts written by other people.

In the 20th century, many crimes have been committed by people following the law, and not just in Nazi Germany. French Jews were arrested and delivered to the German authorities by members of the French police who were simply following orders. The ones giving the orders have indeed been tried after the war, but not the ones following them; in fact they never had to suffer any consequence for their actions. But the truth is, had there been no cops willing to participate, there would have been no deportations.

"I'm just doing my job" or "I was just following orders" is NOT a valid excuse. It's a cop out (pun intended).

So now, quitting your job (because you're asked to do something unethical) is the equivalent of losing your life and limbs?

Let me guess, you believe all government actions are lawful. All lawful actions are righteous and justified. The only permissible way to change the system is through the system. Did I get the gist of your morality?

In which case, keep voting and I'm sure you will see the changes you want reflected in legislation, just so they can be ignored by secret courts and intelligence services.

In the real world, people respond to threats (e.g prison). That's what the government uses. Why are we expected to behave differently?

> So now, quitting your job (because you're asked to do something unethical) is the equivalent of losing your life and limbs?

It's an expression. How about "losing your livelihood"? That's a real thing for some people.

> Let me guess, you believe all government actions are lawful. All lawful actions are righteous and justified. The only permissible way to change the system is through the system. Did I get the gist of your morality?

You're way off on my morality. It's more centered on not causing harm to others, which is what you're suggesting people do to others who don't fit your sense of morality.

> In which case, keep voting and I'm sure you will see the changes you want reflected in legislation just so they can be ignored by secret courts and intelligence services.

> In the real world, people respond to threats (e.g. prison). That's what the government uses. Why are we expected to behave differently?

Do you really think voting or threats are the only two methods to enact change? Neither works particularly well. What does work well is changing the opinion of the majority, and until you do that, you must question whether your minority opinion is actually the correct one, and provide reasons for others to change their opinion if you believe it to be so. Taking the low road and suggesting we punish people who aren't the cause will get you nothing but negative comments on Hacker News.

> So now, quitting your job (because you're asked to do something unethical) is the equivalent of losing your life and limbs?

In a capitalist system, where your alternatives are "work or starve", yes.

Only if you can't get another job. That's not likely to be a problem for most people faced with this particular situation.

You don't seem to understand Nash equilibrium, game theory, and the Prisoner's Dilemma.

While I do think the concept of community ostracism is impractical (because there is never only one tech community to start with), "I just followed the law" is never a moral free pass. Given that what the law means is actually debated daily in courtrooms, it's not only morally hollow, it's uncomputable for an individual in the general sense.

"I just did what the authority figure told me to" is one of the oldest excuses in the book.

To be clear: I am not suggesting that everything on the books should be followed. I am suggesting that punishing people for following these specific laws in question, namely complying with national security letters and gag orders, does not fit what I think is the commonly accepted way we treat each other.

Yes. I agree with this sentiment fully.

The main connotation that invoked my response was equating the resentment of those unilaterally following the law with the mindset of a criminal organization.

It is legal, and perhaps moral, to refuse to work with unethical people who follow immoral/illegitimate orders. While the mention of "snitches get stitches" set things down the wrong path, the core idea is not unjust.

> Not everyone is in so privileged of a position to be taking your specific moral high ground over their own well being.

"Privilege" doesn't enter into it. Everybody has the option to do the right thing, if they're willing to accept the consequences. Not everybody will of course, but there's nothing truly preventing one from doing so.

The difference is that for some people the "consequence" for doing the right thing is that they'll have to get a new, but similar, job while for other people it's watching their family being tortured to death. Saying that these two people are facing the same or even similar options is disingenuous to say the least.

> Saying that these two people are facing the same or even similar options is disingenuous to say the least.

I didn't see anybody say that, so I'm not sure what your point is.

At the end of the day, sometimes doing "the right thing" has horrific consequences. Some people can and will accept that, some can't or won't. But the choice is always available.

Consider that the Founding Fathers of the US had "Give me Liberty or Give me Death" as a rallying cry, not "Give me Liberty, or erm, well no more than minor inconvenience". Sometimes you have to be willing to accept death, imprisonment, etc. for a cause you believe in. Look at Edward Snowden, who did what he thought to be "the right thing" even though he knew the most likely consequence involved death, imprisonment, or torture (or some combination of the above).

Where did I say anything remotely like that?

You're taking an ideological stance on privacy, one I generally agree with, and forgetting the motivator behind the overreaching laws, which is quite simply to ATTEMPT to do something to make this country safer. Does it work? I don't think so, but a lot of people do. It's pretty hard to quantify since the data isn't open, and by its nature can't be.

It's a pretty severe jump from disagreeing about whether or not this system works, and what liberties can challenge the system, to pulling out your Godwin card.

One can never know the motivations of others, full stop.

You can also "do it poorly", in such a mediocre way that what's going on leaks out. There's no way for them to prove that you did a bad job on purpose.

While we're on topic: if they threaten you with prison if you don't do their bidding (with no due process, etc.), doesn't that make them the terrorists (i.e., they're using terror/fear to get people to do what they want)?

Not true. The Lavabit founder still roams free. But that's how they get you: through fear. You self-censor and agree to comply before you even dare to speak out against them. That's how they win without even lifting a finger.

The ethics course will make you feel better about going to prison.

Save it for the MBAs instead, or even as an onboarding requirement along with other training for new management hires. It's not generally the engineers making these decisions.

Sometimes it's an active issue, and at the end of the day someone must implement something terrible (knowingly or not -- directing a junior engineer to do some complex task with the expectation that they'll leave behind security vulnerabilities is just as good as getting someone to intentionally leave an issue). Ethical engineers can and probably should quit -- but who knows how much a required, dull ethics course would influence that?

Other times, at the end of the day, it's just a lack of engineers doing something, typically because management didn't sign off or budget for it. Ethical management won't necessarily help here; the incentives don't change. Some sort of stronger corporate liability for negligence is probably needed, but the problem is not generally the engineers -- engineers, ethics course or not, are typically the only people who care about these sorts of things in the first place! What's the largest dip in stock price due to a password leak? How about shady government collusion? Have any groups of shareholders demanded more care to avoid such issues at any company?

I'll wrap up with a joke: "It should be noted that no ethically-trained software engineer would ever consent to write a "DestroyBaghdad" procedure. Basic professional ethics would instead require him to write a "DestroyCity" procedure, to which "Baghdad" could be given as a parameter." --Nathaniel Borenstein

Fun fact: I got an MBA and was one of the first cohorts of a new curriculum, which required an ethics course. Hilariously and sadly, some 20 students were put on probation for CHEATING in ETHICS class.

I think that was the moment I realized I made a $100,000 mistake.

SCU (my alma mater) does require MBA students to pass an ethics course.

Then again, it's not particularly surprising that a Jesuit institution would be strong on ethics. More institutions should take their lead, though.

I disagree strongly. If you, from an engineering perspective, are the only one who truly recognizes the implications of a management decision, you need to speak up about it.

Engineers do speak up, frequently. For a famous example, see the Challenger disaster. Again, engineers aren't generally the problem. Still disagree?

The NSA don't seem to lack the technical talent to wantonly shit all over the Constitution.

Plenty of HN posts laud people for working for the government, the entity which engages in war crimes, torture, and mass surveillance.

They are hurting for it, though.

There are still lots of people willing to join the TAO and the like, but the NSA has been pretty open about struggling to recruit top talent. Not all of that is ethical stuff, they lose people for reasons from salary to drug and felony screens, but some of it is.

Bear in mind that the NSA only needs good talent to compromise systems, not elite talent. They have some elite talent (Stuxnet, anyone?), but their domestic work is largely hacking theater. After all, you don't have to cover your tracks like a private hacker if you can just ship out an NSL to bury the matter. Hell, some of their projects involved a lawyer, a bunch of analysts, and no internal talent - they can just ask for what they want.

What's the TAO?

Tailored Access Operations. It's the "offense" branch of the NSA, responsible for gaining access to external computer systems: often technologically, sometimes through legal process when the targets are domestic. They built QUANTUM and FOXACID, among other access tools.

It's a major part of the NSA, and generally considered to be where the bulk of the "serious hackers" work. The Equation Group is (probably) tied to TAO - they're the access group that was recently affected by the Shadow Brokers leak.

This appears to be what OP was referring to (a cyber-espionage sub-unit of the NSA).

Bartweiss provided a better, short summary.

Thanks! In hindsight, I should have searched for 'tao nsa' rather than just 'tao'.

We know first hand what James Clapper thinks of the Constitution, today from the Intercept: https://theintercept.com/2016/12/15/james-clapper-has-a-clas...

To be fair, engineers aren't usually the ones who will have their careers judged by a delayed project that cost the organization large sums of money. There is a much stronger motivation for upper management to pretend everything is still fine and dandy.

I don't know how common this is in general, but an ethics course was a required part of the engineering core curricula at my school (Case Western Reserve).

Ultimately, though, an ethics course isn't going to make people stand up for their ethics. A professional organization along the lines of the American Medical Association (at least in terms of political strength) that stands up for its members and censures those who behave unethically would do far more for strengthening engineering ethics collectively.

When there are few professional consequences for unethical behavior at the behest of your employer, taking an ethical stand is ineffective and quixotic. You will be fired and another engineer will likely complete the job.

> When there are few professional consequences for unethical behavior at the behest of your employer, taking an ethical stand is ineffective and quixotic. You will be fired and another engineer will likely complete the job.

I disagree. I've seen this in action, where CCIE-level senior network engineers at one of the five largest ISPs in Turkey quit and got new jobs elsewhere (outside the country) rather than be active participants in the autocratic government's messing about with internet censorship and null-routing of IP blocks. These were not people who could be replaced just by advertising an open position.

The more junior staff members left behind had trouble with the government's diktat, with less clue and technical capability, and ultimately ended up implementing what was required in a less effective and shittier way than an unethical CCIE could have.

Having to leave your home country, friends, and community is a steep price to pay for ethics. And the dirty work still got done, albeit ineptly. I think you proved my point.

As morgante said below "If you can professionalize a certain set of ethics, you can make it impossible for your employer to find anyone to complete the job."

If you can professionalize a certain set of ethics, you can make it impossible for your employer to find anyone to complete the job.

For example, capital punishment is increasingly hard to carry out because anesthesiologists refuse to participate and drug companies won't supply the drug.

I'm not sure the capital punishment part is a good comparison... It's because North American governments are squeamish about solutions that look brutal but efficient. I'm sure that if a state wanted to implement capital punishment via firing squad and advertised for death-penalty-proponent rifle marksmen, they would have no lack of candidates.

Yes but how would that look? The death penalty relies on the "humane" veneer, or it would be abolished very quickly. If there were the slightest hint that the people killing other people in the name of the state are actually enjoying it, there would be a huge outcry.

Similarly for surveillance, if you're only able to implement crude solutions people will be disgusted. "So you're saying Yahoo was compromised because no good engineers wanted to work there due to government interference?" We don't know whether it's true, but the assumption is already damning.

> The death penalty relies on the "humane" veneer, or it would be abolished very quickly. If there were the slightest hint that the people killing other people in the name of the state are actually enjoying it, there would be a huge outcry.

It seems unlikely that anyone you know supports the death penalty. The death penalty has almost universally been abolished despite public support for it. I say "almost" only because I am too lazy to check whether or not it was in fact universal. Norway was among the first countries to abolish the death penalty, and they brought it back especially for Vidkun Quisling.

I'm not going to go looking through the Pew World Survey for attitudes toward the death penalty, but at a guess Sweden probably has the highest proportion of the population opposed. If you got fewer than 1,000 volunteers for a firing squad for the kinds of crimes that get the death penalty in the USA, I'd be very surprised.

Professional organizations have a lot of other, unintended negative consequences that make me want to stay the heck away from them. Also, nobody guarantees that the professional organizations themselves would be ethical. The AMA lobbies for things that I consider unethical, as a way to keep their leverage. The American Psychological Association was fine and happy with torture. I've seen an architects' association let a member amend their copy of signed documents (which I took to court, and won, because the editing ended up overlapping a little over my signature, and my own copy didn't have the last clause).

Collective action like this demands doing a lot of work to make it difficult to get a job without being a member, and then works on limiting the influx of members: for instance, a professional developer association in Spain advertises how it'll increase the value of your college degree and protect you from having to compete with intruders, like those with physics degrees, or those who learned programming at an accelerator or on their own. And professional organizations have to do this kind of thing, because otherwise they lack the power to get anything done.

So no matter how much I personally dislike mass surveillance, I'd not be caught dead supporting the creation of a professional organization. Instead of using force, how about growing the utility of our work so much that everyone knows they can change jobs the next day to a place that doesn't do mass surveillance? If enough great jobs exist, the awful can't retain talent. That's why developers in the US have far better working conditions than in my native Spain, where finding another job is not something you can do in a week.

I agree, I'm not sure that an ethics course will make people behave ethically.

As part of my former life, I had to take an Ethics course for my accounting license. It focused on topics like conflicts of interest. Even though many professional (licensed) accountants have taken the ethics course and passed an ethics exam, it doesn't mean that much. Ultimately, behaving ethically comes down to the individual's morals.

> When there are few professional consequences for unethical behavior at the behest of your employer, taking an ethical stand is ineffective and quixotic. You will be fired and another engineer will likely complete the job.

Well quitting a job as a concentration camp gas chamber constructor may merely slow down the operation a bit, but that could mean more lives saved until VE day arrives.

And yes I did just make that metaphor.

That assumes your job has a very low bus factor. Presuming a company of the scale required for a state to need its assistance, your absence (and that of however many people there are who agree with you) likely won't slow down the operation at all.

...which is a good argument to give up and continue building, how exactly?

Undergraduate ethics courses usually just go over the big historically important ethical analysis systems (utilitarianism, Kantian deontology, veil of ignorance, social contract, etc.). They're unlikely to touch on trickier issues such as why you ought to act ethically at all, or even what it means to act ethically - why, for instance, are you so certain writing backdoors for the NSA is unethical? This is certainly something which can be argued.

Even worse though, is that these sorts of courses are almost always derided by the majority of the student base as a waste of time, and "common sense stuff".

I actually really enjoyed the one I did, although it had just gone through a major rewrite to improve it a lot. And yet, I don't think I ever heard any of my peers mention the course with anything but contempt.

That's because a course on ethics doesn't provide students with greater ethical reasoning or values. Such courses simply provide information and test whether students were listening to the lectures or not.

My alma mater had just that. It was a course named "Ethics in Technology".

The description for the course:


TCH301 is designed to introduce students to essential concepts necessary to evaluate the ethical implications and potential impacts of the use of new technology within human society and culture. Students will explore modern ethical dilemmas in technology, looking at multiple aspects of how the introduction of technology redefines law and values.


From: http://majors.uat.edu/Tech-Studies/Pages/Core_Curriculum.asp...

It was and still is a required course to graduate with a degree from UAT.

I think tech people are actually far more likely to be against this stuff than the general population. Also, a significant portion of people in tech don't have many formal qualifications compared to (as a bit of an extreme example) something like medicine or accounting.

As an accounting major myself, I saw a big push to impress ethics upon students after the series of financial frauds in the 2000s. It involved a lot of telling us that Sarbanes-Oxley exists, occasionally quizzing us on some detail of the act, but never actually giving us ethically questionable situations to debate or covering cases of unethical behavior. Having taken an actual course in ethics, I'm not impressed by its ability to get people to behave ethically; it was very theoretical and, actually, not very prescriptivist, which is probably what you want.

And the struggle you're up against is that the NSA is a government agency giving you legally binding orders because, they argue, they need it for law-enforcement purposes. It's not even a black-and-white issue where you can expect everyone, or even most people, to agree on whether the article described ethical behavior.

For what it's worth, Ethics was a required course for an undergrad business degree in my college in the 90s. I'm not sure if it still is.

Ethics is easily thrown out the window once some executives falsely proclaim that they are required by law to do what's best for shareholders. Since shareholders have a minimal voice outside large holders, it's easy for those executives to claim, "the shareholders want profits, period." Of course, they don't mention that executives want bonuses too.

An interesting thing about ethics when you study history: when asked how they could indiscriminately kill Jews during the Holocaust, Nazis usually claimed the first few times were difficult, but the acts became a matter of course; as normal as anything else in their day. In that same war, the Soviets committed brutal atrocities in the German cities they conquered, as did the Japanese in China. Every country has committed heinous acts in its history. Once it starts, inhumanity can spread like wildfire.

We as developed-world citizens believe we are civilized, and for the most part we are; but we are only a stone's throw away from being capable of casually committing what we would think of as atrocities as the norm.

How would an ethics course solve this? There are people who think that this is the ethical thing to do. That it is OK for the government to see what the "criminals" are doing. The "I don't have anything to hide crowd."

I suspect the people arguing for this want to push their brand of ethics and ideology. But you're right, there isn't a single set of ethics.

Everybody at my school took one of those. It was all platitudes all the time. No substance at all.

Every once in a while I hear someone call for more people to have to sit in Platitude Class, and I can't figure out why.

And if there is any class of people known for their impeccable ethics it is lawyers.

I'm not sure that class is having the intended effect.

>I seriously think that to get a CS or EE degree (or similar) B.Sci degree, you should be required to take at least one full term length ethics course

Why limit to CS or EE? Why not ... everybody?

>> take at least one full term length ethics course

I did take one for my Masters in Computer Science. I think it was required. Unfortunately, for some reason I didn't understand most of it. Don't know why. My suspicion is that they didn't know what they were talking about or it was just a class full of BS.

The people who do this aren't wilful engineers.

Source: I'm a former member of Yahoo's Paranoid team.

I think that this is largely already the case. I studied at a post-polytechnic university in the UK (think community college), and we had a legal and ethics course, and IIRC this very subject was touched on.

> I seriously think that to get a CS or EE degree (or similar) B.Sci degree, you should be required to take at least one full term length ethics course.

That would be a nice idea, but short-lived.

For example, we had "free software." Now many say "open source." As long as we have greed for money or power, nothing is going to change.

For its EECS undergrads, Berkeley requires CS 195: "Social Implications of Computing".


You think a course will teach someone ethics who does not have any? Or they do have ethics, but theirs are simply different than yours. Plenty of people believe assisting with government surveillance is the ethical thing to do to 'keep people safe.'

I had to take a full term length ethics course to get my Software Engineering degree -- at what post-secondary institutions does this not hold true?

How do we make all the unregulated ways of learning how to code (e.g. coding bootcamps) include ethics in their curriculum?

University of Central Florida required this when I went there, but I think the issue is applying the learnings from the course vs getting the pay check from your employer. I think it's one of those things that is easier to say than do unfortunately.

Given that actual professors of ethics, professional ethicists, are no more moral than other professors, this would be nothing but a waste of time.


The Moral Behavior of Ethicists and the Power of Reason, Joshua Rust, Eric Schwitzgebel

Professional ethicists behave no morally better, on average, than do other professors. At least that’s what we have found in a series of empirical studies that we will summarize below.

During my time in college I took multiple ethics and related classes and in my opinion they aren't worth the paper I printed class assignments on.

None of these answer practical issues like what happens when the law is bad or when one's livelihood is on the line. None of these give practical advice on how to fix a society where ethical behavior is not inherently incentivized.

In Canada all engineering students are required to take an ethics course.

Granted since it is taught to all engineering disciplines the material is quite broad, mostly focusing on topics like bribery and negligence, but it does also cover whistle blowing.

At the ABET accredited engineering school I attended, we were all required to take both an engineering ethics class, and more specifically a computer ethics + law class (for those of us in the CECS track). Computer Engineer here.

I am a developer and I had to take a course similar to what you're talking about. So it's out there, but making it more common would definitely be a good thing.

Here in Ireland anyway for a university to be accredited at all ethics has to be part of the engineering curriculum.

First Semester of our first year we got a dedicated ethics class.

>messing with its AS-adjacency topology in non-free ways at OSI layers 1-3

Can you please explain how this works? I would like to understand.

example: all ISPs in Iran are required by law to be downstream of the single government-run ASN, which controls all paths for data in and out of the country: https://www.google.com/search?q=DCI+iran+ASN&ie=utf-8&oe=utf-...

example: the Turkish government orders ISPs to null route IP blocks of things it disagrees with (occasionally all of YouTube).
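To make the null-route example concrete, here is a rough Python sketch modeling the forwarding decision a blackhole route produces: any destination inside a blackholed prefix is dropped instead of forwarded. The prefix below is chosen arbitrarily for illustration and is not claimed to be any site's real allocation.

```python
import ipaddress

# Hypothetical list of prefixes an ISP has been ordered to null route.
# (Illustrative example prefix only.)
BLACKHOLED = [ipaddress.ip_network("208.65.152.0/22")]

def route(dst: str) -> str:
    """Return 'DROP' if dst falls inside a blackholed prefix, else 'FORWARD'."""
    addr = ipaddress.ip_address(dst)
    for net in BLACKHOLED:
        if addr in net:
            return "DROP"  # packets silently discarded; the site becomes unreachable
    return "FORWARD"

print(route("208.65.153.238"))  # inside the /22 -> DROP
print(route("93.184.216.34"))   # outside it    -> FORWARD
```

On a real router this is a one-line static route pointing the prefix at a discard interface, which is why such government orders can take effect across a whole country within minutes.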

It wasn't required but I did take a course on ethics and law. I'm pretty sure many programmes include such courses.

They do in Spain, it covered data protection laws & professional ethics and was in the third year of the four year degree

taking an ethics class is not a guarantee of ethical behavior. in fact, I have my doubts that it even improves the odds. and even if it did improve the odds, there's always _someone_ that won't care, or can be bought. it only takes a handful of unethical actors to make problems like this one happen on a regular basis.

can't speak for all CS students, but we certainly have an ethics course. Unfortunately, a lot of people see it as an insult to their intelligence, yet they need reminding of many of their professional obligations, and that their 'cool hackathon idea' might be illegal.

What engineering program doesn't require ethics as a standard freshmen engineering introduction?

At my school (Northern Arizona University), CS students are required to do exactly this.

San Francisco State University's CS program requires an ethics course.

With the way the execs of Yahoo handled the 2015 back door, there is a likelihood that the 1 billion + 500 million compromised accounts were due to an exec decision that no one knew about. That individual or group of individuals may no longer be at Yahoo, and kept everything quiet. Or the individuals may not even know that they did it!

Let's say you were on the security team and found it on the network. Would you somehow be bound by a gag order about it, given that you would never have seen said gag order?

I think it's irrelevant in practice. (In theory, that's an interesting question) You're not making the decision to make this public. If you're on the security team, you're going to notify the boss, and at that scale of the system compromise this goes all the way to the top. At that point someone who knows about the gag order is in the chain.

The only scenario where I think the question matters is if you do something really stupid that would get you fired if this was an actual exploited external access.

Let's say an unaffiliated third party (white-hat hacker) found the exploit and reported it to you under a Bug Bounty program. Let's also say that that third party was someone who followed "responsible disclosure" rules, and said that they'd publicize the vulnerability if you didn't do so yourself within a short time-frame. You investigate (by asking your team, your boss, looking at the bug tracker, etc.) and figure out it's an NSA backdoor. Now what do you do? Are you allowed to talk to the white-hat? Are you allowed to not talk to the white-hat, knowing that this would result in the white-hat reporting the vulnerability and thus compromising the investigation?

Whether or not the company is doing everything it can to resist the order, I think that NSLs are always accompanied by a clear communication channel between a company's counsel and the agency.

So, after someone under the gag realizes the situation, they get the company's lawyers in contact with the agency to see what to do. The agency would then gag the white hat.

IMO, that's a huge part of why NSLs are scary. You are in an absolute stranglehold and are at the mercy of the agency for your every move.

If I remember correctly, people even had to argue for the ability to talk to a lawyer about receiving an NSL. So, the feds are really not messing around here and will do absolutely everything to ruin you if you don't cooperate fully. Any perceived resistance is crushed.

> So, after someone under the gag realizes the situation, they get the company's lawyers in contact with the agency to see what to do. The agency would then gag the white hat.

He doesn't live in the US. Once he realizes this is going on, he'll disclose.

You talk to your boss. Your boss talks to the NSA. The NSA will find a way to silence the white-hat. Problem solved.

Philosophical dilemmas are fun to talk about, but only as long as you take the premises as granted. People who carry swords tend not to waste time trying to disentangle knots that they can simply cut in half. Most "technical" solutions to "human" problems suffer this vulnerability.

> You talk to your boss. Your boss talks to the NSA. The NSA will find a way to silence the white-hat. Problem solved.

Seems you're assuming the white hat hacker is from USA. I'm not so sure the NSA is going to be able to silence a white hat hacker from say Russia, or anywhere out of USA for that matter.

Silencing somebody doesn't need to involve sending him a legally binding gag order. Nor does it necessarily require killing him.

There are lots of carrots (e.g. job offer, lucrative contract) and whips (e.g. a threat to ruin his business or professional reputation) that a government agency can use to persuade someone, even a foreigner, to keep something a secret for a certain length of time.

This obviously won't work on someone who is under Putin's protection, for example, but then we're talking about cyberwar, not a lone white-hat.

Not necessarily, but you are for sure bound by the confidentiality agreement you signed on employment

This is an old story that was discussed extensively when it was new (in October):


First time I'm hearing about it. October is very recent so not old at all.

Good point.

I assumed it was a direct development in the other day's story re: the largest hacking to date for YHOO.

It was linked in the comments thread of that story, probably why it was re-submitted.

Personally, I missed this story when it was first posted, but reposting now definitely adds an interesting additional perspective to the recent announcement, especially as we all wait for more information about how the hack was executed and what the broader implications are. Combined, the two stories are more insightful than separate, IMO.

They are unrelated. The hack is from 2013. The malware scanner modification happened in 2015. http://mobile.nytimes.com/2016/10/06/technology/yahoo-email-...

> [Update (12/14/16): Reuters has specified that the rootkit was implemented as a Linux kernel module. Wow.]

Hm... one more reason to avoid using non-free binary blobs in the Linux kernel. Be safe. Use Debian GNU/Linux without the non-free repo, or any better[0] one.

[0] https://www.gnu.org/distros/free-distros.html

I don't think this has anything to do with non-free blobs. They knew exactly what it did and they installed it anyway. It wouldn't have mattered if they had the source or the rootkit was open.

What really upsets me about this is the idea that the security team was bypassed, effectively compromising security for Yahoo and every one of their customers. The idea that a company executive would knowingly bypass their own CSO, and take it upon themselves to understand the risks they are introducing, is mind-bogglingly stupid and egregious.

Marissa Mayer, if she approved this, should be deeply ashamed of herself.

>Marissa Mayer should be deeply ashamed of herself.

Fixed that for you.

But Yahoo definitely does not know how it got hacked, losing 1 billion accounts.

Seriously, why the title change? Sometimes the title tells you nothing - wouldn't it be reasonable HN policy to allow the title if it's an accurate summary of the first paragraph?

"Yesterday morning, Reuters dropped a news story revealing that Yahoo installed a backdoor on their own infrastructure in 2015 in compliance with a secret order from either the FBI or the NSA"

Edit: Looks like it's changed back now - great. For a brief period the title was set to "Surveillance, whistleblowing, and security engineering".

From a threat modeling POV, this is an interesting type of insider threat. A high-privileged faction of the company is hijacked (via extortion) by a malicious third party with legal leverage.

If they are doing this at Yahoo, what proof do we have they are not doing this at Google...right at this moment?

One would have to be really naive to believe that Google, Apple and the other big ones aren't backdoored by the three letter agencies by now.

You'd be surprised that there are still (technically competent) HN users who seem to be that naive. Or to promote gv't propaganda.

Our only solution is to go 100% open-source and 100% end-to-end encryption.

And the government has the power to ban open source and encrypted communications. It's not really all that difficult, if you eliminate any concern for the constitution. If all else fails, rubber hoses and bullets work well.

That won't work. People will just do it anyway. Just look at how well alcohol prohibition worked.

That's part of the beauty of open source - once the cat is out of the bag, there is no getting it back in.

Many decentralized projects really give me hope for a secure and private internet. Though there are still big problems with those systems (e.g. sybil attacks and what not) and general adoption rate.

But you do already see block-chain technology used by huge industries like for diamonds and projects like ZeroNet are surprisingly user friendly and visually pleasing (though sadly most zites use a centralized identity authority.)

What is a "backdoor" exactly? Another euphemism used to disguise something much darker I imagine.

Usually a "backdoor" is something you would be aware of in your home. You know you have a door round the back. This is more like your landlord giving the keys to a stranger who comes and stares at you every night while you're sleeping and rifles through your drawers and cabinets.

I'm wondering what will happen. Will more users start using decentralized and open source programs? Will they run their own mail servers? Or will they hate the NSA, but still push personal data to a few big corporations?

Nothing will happen. "Users" don't even know what the majority of these things are, and this concern certainly doesn't even fly on their radar.

I've spent 5 years now on this very topic and my conclusion is that the people just don't give a shit.

I feel the same. I've been building a new decentralized internet platform for 3 years. But when I release it next month I will present features other than decentralization, because people who hate spying but fear change are really not the market.

Whelp. Yahoo is officially over today. It was quite a fall.

Do you think?

Seems to me people are getting used to accounts being hacked; it's not such a big deal any more. In fact, it may even be an expectation.

And as for government NSA hacking, well that's just old news and a given isn't it?

I'm not sure that people are getting 'used' to it. I was talking to a non-techie over the weekend, and although they were aware of Snowden's NSA revelations, they were quite perturbed to think someone could be reading their email.

I don't think people have stopped caring, they just feel helpless. This means that normal people may be willing to adopt new protocols (end-to-end encryption), something they wouldn't do if they were accepting of NSA spying.

Nothing in Snowden's leaks suggests that the government has access to your friend's email, let alone is reading it. Stop exaggerating to your non-techie friends.

You must have been following a different leak than me.

I read the documents released by Snowden. Which leaks are you referring to? Can you point me to any document that suggests the NSA has his friend's emails?

I'll cite a single program, not even leaked by Snowden, which would allow any unencrypted email to be intercepted: https://en.m.wikipedia.org/wiki/Room_641A?wprov=sfla1

Snowden leaks showed that they get billions of hits each month from the various submarine cables as well as direct access from telco backbone fiber stations in the US, Europe, Middle East, and elsewhere.

> As this map shows that almost 3 billion data elements from inside the United States were captured by the NSA over a 30-day period ending in March 2013, Snowden stated that this tool was collecting more information on Americans located within the United States than on Russians in Russia


In addition, the MUSCULAR program involved tapping the data links between data centers of Google and Yahoo.


So I'd say there is an 80-90% chance the NSA has a good chunk of his friend's email. Closer to 95% if the friend is located outside of the US.

The only thing stopping them from collecting the full content of every American's communications (plus 3 hops) via passive collection (besides the 100% of metadata they get legally) is a FISA warrant. They have no such restriction for foreigners.

Maybe you need to reread some of those slides because you clearly missed the big picture.

Your single source does not actually collect his friend's data. According to Snowden's leaks, it was used to find a court-ordered monitored target's traffic leaving or entering the country. It does not actually siphon all data, including emails, to the NSA.

MUSCULAR provides similar filtering capability within Google's and Yahoo's networks, though not anymore because they encrypt all traffic. Again, only metadata. And again, the email envelope collection had already been shut down prior to the leaks according to the leaked documents. According to Snowden's leaks, the NSA is not allowed to keep communications from a US citizen or anybody even living inside the US without a court order, so no, his friend's emails don't reside with the US government.

Next, people on HN are going to deny the holocaust. 3... 2... 1...

It sounds like you've been reading too much fake news. Can you point me to any document that suggests the NSA has his friend's emails?

If only we could convince the millions of yahoo groups users to move their groups elsewhere.

Does anybody still doubt that all major closed-source software companies in the US (but probably elsewhere too) will put backdoors in their software products and therefore on your hardware?

Ever since I started using email, I thought law enforcement had continuous, uninterrupted access to my communication.

News like this is no news to me :)

I browse assuming that I always have someone over my shoulder.

Face it, the US government is not operating like it has the best interests of the people as its core motivation.

"That's not the way the world really works anymore," he continued. "We're an empire now, and when we act, we create our own reality. And while you're studying that reality—judiciously, as you will—we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors … and you, all of you, will be left to just study what we do."


I think we need to find ways to hurt big companies who do this in the pocketbook. Stage (IT) employee walkouts, boycott the products they sell, etc.

Somehow, _we_ need to get the upper hand over the surveillance monster the US is becoming.

It's a nice notion but are we going to see an exodus from Google? Facebook? Apple?

I left Google and Facebook but me and the other 3000 people don't actually matter. You'll never get a non-negligible number of people to forsake their comfort zone for anything so trivial as rights, privacy, etc.

Taking the headline literally, "Yahoo installed a backdoor for the NSA behind the back of the security team" is as much an indictment of then-Yahoo's security team as a reminder that it's a possibility for other companies to consider.

The OP does specify that Yahoo's Security Team caught the backdoor within a few weeks, so they can't be wholly incompetent.

Yahoo stock hasn't moved very much recently over these issues. I guess the market still thinks that the Verizon acquisition will still go through. For context, the price was ~$43 after the acquisition and closed at ~$41 today.

I guess the market doesn't care about the security and confidentiality of users who aren't paying customers.

Kinda makes you think of "marks" in the carny sense (http://grammar.yourdictionary.com/slang/carny-slang.html)

Most of the value of YHOO is from its holdings in Alibaba. So this is actually fairly irrelevant for the stock.

There was a big jump in the stock when the acquisition was announced, so it would follow that the acquisition premium is priced in.
