Wikileaks reveals CIA’s ‘Brutal Kangaroo’ toolkit for hacking air-gapped networks (wikileaks.org)
140 points by Sami_Lehtinen 12 months ago | 118 comments

With every leak of government-produced malware, I hope the issue gets pushed a bit further onto the political agenda so an international treaty can be reached on software-implemented weapons. There need to be defined limits on what agencies can do, and on what happens when the weapons sooner or later are discovered and turned into problems like WannaCry.

It's clear to most politicians that it's a problem if criminals use guns acquired from the military or police, and that it's partially the fault of those agencies when it happens. We are not there yet with malware.

Your diagnosis is at odds with the basis of open security.

The primary thing that needs to happen is an acceptance of responsibility by those who administer critical systems. Mathematically, what we call a developed "exploit" is really just an existence proof that something is already insecure (hence "PoC or GTFO"). The blame needs to be properly assigned to the developers/integrators of these systems for negligence (currently gross negligence - eg relying on Turing-complete languages) instead of scapegoating those who discover the emperor has no clothes, regardless of their underlying motivations and lack of full disclosure.

Nasty exploits have always existed - the worrisome development is the scale of execution brought on by entities with such resources, and the trend of governments becoming overtly adversarial against the people, both their own citizens and foreign individuals.

Even if it were practical (see: China's attitude to imaginary property), having these entities form treaties (ie collude) with one another is not going to resolve this. If anything, treaties will form gentleman's agreements between allies, while enshrining the attacking individuals "of interest" as standard procedure.

I would not object to the idea that companies should be held liable for critical faults which get exploited. That is the suggestion Schneier made several years ago as a way to make security funding financially sound. However, we are very far from that reality; it's questionable whether companies can be held liable for security faults even in extreme cases like voting machines, airplane systems, and medical devices. I recall reading that Schneier has more or less given up on it.

We need to get to a point where people in positions of power are held responsible for the damages caused by government-developed malware when it is reused in attacks like WannaCry. That is hard to do without the issue rising on the political agenda.

> its questionable if companies can be held liable for security faults in extreme cases like voting machines, airplane systems, and medical devices

> We need to get to a point where people in power positions are held responsible

You're contradicting yourself. If it's possible to hold people in power responsible for what a single agency under them does (not even requiring their knowledge!), then it's certainly possible to hold them responsible for approving public use of insecure voting machines! (especially at a more local political level)

And if it's not possible to hold people in power responsible (which is a reasonable assumption), then the philosophy of distributed defense becomes even more important!

> the damages caused by government developed malware

Attempts to silence messengers never end well. And I don't think that changes, even and especially when the messenger is a well funded government. But normalizing that philosophy certainly sets the stage to silence and demonize smaller messengers!

The problem with holding manufacturers liable is that I'm not sure it's the economically optimal solution. Fundamentally, let's break down exploits into two categories:

1) Shoddy programming creates exploit that is obvious to manufacturer (let's say Chinese Android TV stick makers)

2) Exploit exists despite manufacturer's efforts otherwise (let's say Microsoft)

Putting exploit risk on software companies solves #1. But, especially in critical industries, it balloons the cost of their systems due to #2, because they must now cover a risk they can't even model (an unknown unknown). Can you imagine what accountants and actuaries would do with "You may have a nation state targeting the firmware controller in an HDD to compromise your system"?

I think a more optimal situation would be the government mandating liability for known-problematic lapses in coding standards, but then providing a liability-shield provision if the manufacturer can deliver a fix for a critical vulnerability within X days.

Ideally, we'd want any legislation to do two things that solve both of the above problems. Increase adherence to generally accepted secure coding standards (helps #1) and increase ability to deliver a timely fix to customers (aka codebase maintenance and agility; helps #2 and especially important in mass-use IoT devices).

> The problem with holding manufacturers liable is that I'm not sure it's the economically optimal solution

Either you hold someone liable or the effect will just be hiding the risk.

How about just requiring that critical software systems be insured against malware/failure in general? Seems like we want that anyway, and if we can't find anyone to insure a piece of software, it probably shouldn't be used in a critical system in the first place.

Importantly, it's the users of software in critical systems alone that need to be insured. Neither software vendors nor regular users need insurance. The insurance company, alone, should handle the job of shielding a critical system from the mistakes of software vendors. We need to allow software vendors to be able to make mistakes, or nothing would get made, ever (source: I'm a programmer).

And the insurer would be wise to spend some of the premium on bug bounties for the software they're insuring (to minimize the cost of failure). In the end, all white hats would end up employed by insurance companies, helping assess software security.

Did you read the rest of my comment?

The difference between your approach and mine is that you propose to solve it by making rules (regulations), as opposed to adding a separate party that can absorb risk (insurance), thus shielding a creative industry -- software development -- from adhering to a list of rules, which surely will only grow in size.

Ah. Insurance doesn't act as a separate party to absorb risk in the way you're talking about.

They act as a party to amortize known risk, in exchange for a monetary premium set based on that known risk.

Without the government stepping in and limiting catastrophic liability to some degree (ideally in exchange for signaling the market to produce a social good), the premiums charged would be so large as to just suck money out of tech. There's no creativity shield if you're paying an onerous share of your profits in exchange.

Which is why I said any solution has to be two part: (1) require risk liability on a better-defined subset of risk & (2) provide a liability shield on the remaining less-defined risk iff a company demonstrates an ability to handle it (aka prompt patching). This creates a modelable insurance risk market, therefore reasonable premiums, and still does something about nation-state level attacks.

I don't think "economically optimal" is a goal to strive for, especially under the current inflationary busywork treadmill.

In fact, the current regime seems pretty close to economically optimal - 1. cheap/quick to develop because required diligence is not done 2. incremental cost to keep the zombie going via "patches", lobbying, and legal threats 3. planned obsolescence when the defects can no longer be ignored.

The downsides only show themselves as mortal risk accumulating outside the system, hence the spectre of "nation-state" attackers. But the economic puppeteers will easily move to another country if this one collapses!

"But, especially in critical industries, balloons the cost of their systems due to #2, because they must now cover a risk they can't even model (unknown unknown)."

This is where "best current practices" and "standards of care" enter the picture.

But to revisit the analogy about stolen police/military arms, are we to blame casualties/victims of these stolen arms for not being able to defend themselves?

Yes, those tasked with network and infrastructure security should be held to rigorous and constantly increasing standards. But there has to be some accountability from the state purveyors of this malware when it gets intercepted and weaponized. Publication of these tools is one way to help in lieu of that, but it becomes a toss-up as to whether the world will harden against published malware before bad actors, state-sponsored or otherwise, weaponize it for their own gain.

There is a world of difference between physical and informational security, so it is an improper analogy.

First, it is possible to perfectly defend (and conversely, a vulnerability means one is entirely vulnerable. Don't confuse mitigations that affect the probability of being vulnerable with actually being vulnerable. There is no middle ground as to whether something is secure or not, it is a binary proposition!)

Second, attacks are practically free and untraceable. And all the status-quo anonymity-destroying attempts cannot actually change this, as making "network identity" trusted would mean changing every system's attack surface into every other system - eg break a single person's credit card and use Starbucks wifi.

Even if a lot of the vulnerable machines still run insecure software there's just no reason why they all need to be networked as they are now. E.g. medical machines at the NHS running Windows recently got ransomware'd.

There's no intrinsic reason why these machines need to be connected to even an internal network, other than laziness prioritizing convenience over security.

I don't know anything about your background or area of expertise, but you sound completely disconnected from reality.

> Your diagnosis is at odds with the basis of open security.

> The primary thing that needs to happen is an accepting of responsibility by those who administer critical systems. Mathematically, what we call a developed "exploit" is really just an existence proof that something is already insecure (hence "PoC or GTFO"). The blame needs to be properly assigned to the developers/integrators of these systems for negligence (currently gross negligence - eg relying on Turing-complete languages) instead of scapegoating those who discover the emperor has no clothes, even despite their underlying motivations and lack of full disclosure.

To paraphrase:

"I assert that I am intelligent by using the word 'diagnosis' and demonstrating that I also know of another word for 'exploit'. Therefore, if I sound arrogant it's because I'm actually just intelligent. Moving on:

When malicious hackers take advantage of an exploit to hurt people, it's not fair to blame the CIA, NSA, or any other agency who knew the exploit existed but chose not to disclose it to the people who wrote the software. They wanted the option of using the exploit themselves--which of course is perfectly fine--so you see, it wouldn't make sense for them to disclose it.

It would be silly to blame these agencies for keeping these secrets from the public and from the people who wrote the software, and equally silly to blame them for being unable to keep these secrets from falling into the hands of malicious hackers. Do not scapegoat these people, for they have done nothing wrong.

No, it's the fault of software developers who write buggy code! We need to properly blame the people who try to write secure code but make mistakes. After all, malicious hackers have to maliciously hack innocent people, since that's what they do. The CIA has to keep secrets because that's what they do. The CIA also has to keep the secrets in the pocket of their coat which they lost at the bar. After all, the CIA is just a bunch of people, and people make mistakes.

People make mistakes, but software developers are not allowed to. Any good developer knows how to write code that has no mistakes in it. One of the easiest ways to write mistake-free code is to program in a language that isn't Turing-complete. I've been writing code my entire life and have never introduced a single security flaw into a system, because the only two programming languages I use are English and occasionally arithmetic. Any software developer who makes a mistake or uses a Turing-complete language is guilty of gross negligence and should be punished and blamed severely. I see no need to provide any sort of rationale for the things I have stated."

...Imagine a certain model of commercial airliner that's been in widespread use for well over a decade. The planes have their quirks and some parts wear out and have to be replaced, but they are frequently inspected and repaired. One day, the wings completely fall off of every single plane. Anyone unlucky enough to be on one of these planes while they were in the sky dies. The people are shocked and the government pledges to find out what happened.

Some of the best aeronautical engineers in the world had worked for years to design these planes, and the plans had been scrutinized and approved by many people. The manufacturing plants were known for their high standards. Nevertheless, it was discovered that a flaw in the design had indeed been the cause for the wings falling off. The enormous bolts used to attach the wings to the fuselage were incredibly sturdy, but if you blasted them with a specific ultrasonic frequency, they would resonate in a wildly unexpected fashion and quickly explode.

A terrorist group claims responsibility for the attacks, and upon closer inspection of the planes it is discovered that the seat-back TV screens near the wings of every plane had been replaced with ones that contained devices capable of emitting the exact frequency needed to cause the bolts to explode. They were designed with a clock-based system that had been set two years in advance to trigger simultaneously on every single plane on that day. The terrorists had spent years patiently buying flight tickets and performing the replacements en route. Since the devices looked like ordinary tablets, they had no problem getting through security even though it took ages to get everything in place.

The inspection also uncovers a second set of devices, very similar in nature to the screens. These are far more elaborate-- the entire seat base contains a powerful ultrasonic emitter and an antenna tuned to the same communication frequencies used by the plane itself. It's designed in such a way that a special signal from air traffic control could cause the wings to fall off a specifically-chosen plane.

Due to the advanced nature of the second device, it's clear that it had to have been installed by people with far more resources and access to the planes and an intricate understanding of the plane's communication systems. Before the speculation goes any further, the director of the CIA comes forward and admits that they are responsible for the second devices. Having known about the faulty bolts even as the planes were passing final approval for use in commercial flight well over a decade ago, the agency had sent teams to install the systems under the pretense of doing security sweeps. The grim purpose of installing these systems was to give the agency a last-resort method of stopping a hijacked plane from flying into a building or crowded area.

Finally, the director admits that there had been a data breach three years ago, and though they couldn't be sure, it appeared that documents relating to the purpose and design of these devices were among the stolen data.

And now you come along to share your expert opinion.

You say that we shouldn't blame the CIA for knowing about the faulty bolts and installing systems to take advantage of them, instead of reporting the flaw to the company that designed the plane so that the problem could be fixed. After all, they put those systems in place to reduce casualties in a catastrophe.

You agree that terrorists are bad, but hey-- that's what they do, right? They aren't the real cause, just an inevitable outcome. No, there's another party who's really to blame...

Blame the people who designed and built the plane! Simple as that! If they hadn't built a plane with bad bolts, the CIA wouldn't have been forced to take advantage of the mistake and design their secret remote kill system. Those documents wouldn't have existed when the data breach happened, so the terrorists wouldn't have been able to devise their own plan to take advantage of the bad bolts. No bad bolts means no horrific catastrophe, it's as plain as day.

Since you are a world-renowned expert in everything, you are interviewed and asked about how to prevent things like this from happening in the future. Should we put some laws in place to prevent the CIA from keeping such dangerous secrets to themselves? Do they have the right to make their own internal risk analysis of whether it's in the public's interest for them to be able to build a secret system to remotely drop the wings off a plane, even though it means that people have been flying around for a decade in planes where the wings can fall off? Is it worth talking about the bitter irony that the CIA kept the bolt flaw a secret from the public and the plane company, but couldn't keep it secret from the terrorists? Are there legislative steps that might be taken?

You reply, "Nope! Of course not, how silly and stupid of you to say that. Laws don't do anything. The plane company built a bad plane, and they are to blame. Specifically the stupid engineer who picked that dumb bolt. We'll prevent this in the future by building planes where it's impossible for the wings to fall off. Anyone who knows the first thing about plane building knows that it's actually very simple to build planes where the wings don't fall off. In fact, I've been doing it for years, and anyone who doesn't use my method is grossly negligent. You see, I build my planes without any wings! Now just take a moment to look at this excellent proof I've written. You can see that it's impossible for the wings to fall off of a plane that doesn't have any."

By the way, I'm curious: Where do you point your majestic finger of blame in what happened with OpenSSL and Heartbleed?

(Ignoring the continual ad-hominems based on particular word choices)

Your comment seems primarily motivated by anger/frustration at the NSA/CIA/etc - an anger which I greatly share. Politically, I think the entirety of the NSA deserves the firing squad as the bunch of traitors that they are, but alas until the public comes out of the spell of their disinfo games then no action will happen on that front.

Speaking of disinfo games, which do you see as the more likely outcome from this current scare story of the week - these citizen-hostile government agencies are reformed and actually become responsive to the people, OR they court this fear about how bad exploit-finders are to acquire more power, especially the power to go after competing hackers?

That's the crux of the matter - when one chooses the wrong philosophical analysis, one can only go down a path where any "solution" compounds the problem. Responsible disclosure is not the law or even the full extent of ethics - it's a gentleman's agreement as to what is prudent and polite. Regardless of how bugs are fixed, who finds them, or their motivations, the fundamental open-society truth is that responsibility actually rests on buggy software itself, as opposed to the people who point out the bugs. Never mix that up, unless you'd like to get back to the dark ages where even good-faith full disclosure results in draconian legal thuggery!

In the context of your plane example, the company who designed the plane and marketed it for passenger use didn't even bother using a CAE program. When previously informed that the tail easily falls off, they added duct tape and a redundant tail. I've said nothing absolving the CIA/foreign fighters - all bad actors are to blame for their parts. But where that blame is focused matters, and blaming the whole situation on one bad actor (the CIA) will guarantee that the company keeps right on selling the known-defective planes.

I hope this helps the politicians realize encryption back doors can never be made safe. If even the NSA can't keep their top secret tools safe, generic 'law enforcement' surely won't be able to either.

What is your proposed test? Detecting a nuclear explosion or traces of chemical-weapons manufacturing is easy compared to figuring out who made some malware component.

I don't disagree with you on any particular point. However, who do you suppose we trust to decide what is a software implemented weapon and what is not? Of course there are very clear black and white examples, but the in between is where we should be concerned. Consider the USA stance on export of encryption as a historical example of where it can go terribly wrong.

Shouldn't the Wikileaks page be linked instead? The original article doesn't add anything and the wl page has more technical details.

Ok, we changed to that from https://www.theinquirer.net/inquirer/news/3012499/-wikileaks..., which points to it.

There seems to be no evidence that public money is being spent on national cyber-defense, i.e. countermeasures - at least there have been no links suggesting the NSA, CIA or GCHQ are working on patches.

This leaves private industry, e.g. Microsoft et al, solely responsible for defending citizens.

As there is no disclosure by the spies, Microsoft cannot patch without knowledge of the vulnerabilities - at least until there is a leak or potential leak that prompts a reluctant disclosure.

This leaves one wondering: is tax money being spent solely to make us less safe?

Could WannaCry's devastation of the NHS (UK) have been averted if academic security researchers had responsibly disclosed the potential for this vulnerability? Surely the UK Health Minister should have been more properly briefed on the dangers his decision not to patch Windows posed to patients?

If these agencies come crying for more tax dollars we should seriously enquire if they have any plans to make us safer rather than continuing to backdoor their own citizens.

Surely, with laws being passed in the UK and USA to curtail security research, the danger to the public can only increase.

Without whistleblowers and journalists such as Wikileaks we would be far more at risk from cyber-attacks!

I say Wikileaks because we know journalists are being targeted by nation-state actors out to hack the identities of their sources, and Wikileaks is a fine exemplar of decent op-sec.

Why is it that only the US, but not China, Russia, the UK, Germany, etc., out of all the strong countries, has these leaks? The US needs to take a good look at itself and find the reason that its intelligence organizations are like a leaky bucket. It seems as if Americans don't care and don't even see it as a bad thing when traitors sabotage entire organizations' years of work.

In any other sane country, the outrage against such leakers would be far greater than the outrage against the government for doing its job. If a person were as big a traitor as Snowden in my country, you'd see his whole family and friends deny him; he would think twice before doing it not only because of the chance of getting caught and going to jail, but from the actual embarrassment and shame from all the people close to him.

Yet look at Snowden, acting like he's some hero. Look at all the other leakers. They are barely ashamed. Your society hates the government so much you're turning traitors into heroes. Your security screening process is probably failing hard with so many leaks. How can US allies trust it when everything leaks one way or the other? Leaking should be considered such horrible treason that only the most psychopathic person would do it, and the screening should screen those out. But when society views leakers as heroes, and even the president freely leaks other countries' intelligence to Russia, followed by even more leaks about what he actually said; when it's justified as long as 1 in the 1000 documents they leaked shows some government wrongdoing (and statistically this will ALWAYS happen), they aren't traitors, they are heroes. Then you don't need to be a huge psychopath to betray your country. You just need some nudge in the 'right' direction.

If any of you think the right approach is to have no secrets (a.k.a. no vulnerabilities - today it's vulnerabilities, tomorrow it's CIA agents' identities), you're naive. If you think your enemies will be as righteous as you, you are naive. If you think that if you disclose all vulnerabilities, nobody will have them, you are naive, because Russia and China will find their own different vulnerabilities.

This "all leaks are bad" mentality is fascist bullshit. If someone commits a crime and you know about, you have a moral obligation to report it.

Some leaks are beneficial and others are detrimental. Some leakers are heroes, some are traitors, and others are somewhere in between. It's not black and white.

Although I don't care for your rhetoric and I wish you could tone it down a bit, I think you raise interesting and important points throughout this thread which I hope people can discuss further.

I am someone who is clearly in the camp that you criticize (in fact, more than the average of HN commenters) and I think there's a lot to talk about, including

American anti-government and anti-military cultural and political traditions (which are extremely important for American society and history, but also very diverse, and not easy for Americans to find consensus about)

How are those currents different, differently expressed, or absent in other parts of the world?

Civilians and civil society defending themselves against state power on a national level (Americans vs. the U.S. government)

Civilians and civil society defending themselves against state power on an international level (people vs. governments in general)

All of these issues are interesting and can be much more complicated than the simple accounts that we might tend to give of them at first. I don't feel like I have time to do justice to these topics at the moment, but hopefully this thread or a future one will give an opportunity to go further into these things.

2 things you're missing:

1) The majority of what got people riled up about Snowden's revelations was that the US government was spying ON THEIR OWN CITIZENS.

In places like China and Russia nobody batted an eye because the expectation is obviously the government already does that.

Call it naive, but the majority of the US population believed (and still believes) that their government is above that. That regardless of whatever questionable evil the US military conducts abroad in the name of "democracy", domestically people have freedom and liberty. Sure, the post 9/11 world eroded it somewhat, but GENERALLY there is still the concept of privacy.

Snowden shattered this fantasy. Everyone is being watched. All those conspiracy whackjobs, all the hollywood "worst nightmares" turned out to be true.

That the government was doing this, and that it reacted to the leaks with extreme prejudice and desire to treat Snowden like a traitor rather than a patriot is what led to people considering him a hero.

I could go deeper into the other arguments like "I have nothing to hide so why should I care that the government is monitoring what I'm saying", but I'd like to know more about your stance first. Do you think that the domestic surveillance is justified for the greater good of international security? Or do you simply believe that it's obvious and expected of the government to take on this role, and fighting it is ridiculous.

I agree that Snowden's disclosures went too far. But to be honest, that was more on the journalists that released them than him. Ultimately I think it was important that Snowden disclose the existence of many of these programs. In so doing the journalists he entrusted with his materials unfortunately ended up disclosing more details of specific techniques and methods than I would have preferred. However, I think the basic idea that he informed the US public of the types of bulk collection that were going on is important, and ought to be praised.

The USG has traditionally not kept secrets from the populace (or not for very long). The idea that they are not only keeping secrets but that these secrets are evidence of corruption and violation of our trust means that it is, in fact, a good thing they were leaked.

This is exactly what I'm talking about. Look at you. You think your government is bad? Look at Russia. Look at China. You have the privilege of being in one of the least corrupt, most democratic countries out there, yet you hate the country and its government more than the people in worse countries do. A Russian civilian has 10 times more reasons than you to hate his government. It's a fact. But he's also much more loyal to his country than you. Corruption or not, he won't hurt his country as much as your leaks are doing right now. The only reasonable conclusion is that it has nothing at all to do with the quality of the government, and everything to do with the loyalty of its people.

>You have the privilege of being in one of the least corrupt, most democratic country out there

Why do you think that is? Governments that have the blind trust and loyalty of their populations are more vulnerable to corruption.

Criticism of our government's practices is an expression of love for our country.

I am not sure how you reach the conclusion that being critical of USG actions is somehow praising or supporting worse international actors such as Russia.

And the bad behavior of other countries does not excuse the bad behavior of our own. I am a US citizen, so I will criticize the USG because that is what affects my life most. It is also my right, and the US is the only country where I have any sort of voice in the matter (however small).

There's always room to improve. Leaks exposing corruption and to a lesser extent critical software vulnerabilities are a good thing. I don't think anyone would argue that exposing the identity of, for example, undercover agents is a good thing. As another commenter said, you seem to lack color vision and only see black and white.

Also, patriotism as a justification for not exposing a corrupt government is a load of shit.

Did any leakers care to leak only the information relevant to the corruption? No. I do have color vision. I see mostly blackish grey, and so should you. Yet you all cling to that little white spot in the middle of all the blackness and use it to justify everything. You're sending a very clear message to all would-be traitors: as long as you throw us a little bone of government wrongdoing, you're welcome to burn the whole house down. You think this makes any sense at all?

You know that there can be more than one "patriot" in any given scenario, right?

The guy who actually exfiltrates a DB dump on a USB stick is one guy, and he might be a sociopath. But he (usually) doesn't just post it to the Internet; instead, he gives it to a journalist.

Now, is the journalist also a sociopath? Probably not. Probably he cares about lives that would be lost if actual "actively sensitive" classified information was leaked. So he doesn't publish that information. He just looks for the stuff that works as "news": basically, things that hurt the government (as a bureaucratic entity) without hurting the state (as a body representing the people.) He takes the body-destroying toxin he was sent, and purifies it down until it's a chemotherapy treatment. And then he hands it to his editor, who also cares what happens to the country, and they talk about it with the publisher, who in turn advocates for the positions held by the boards of the companies behind the ads that run in the paper, who might also be patriots...

In short, to the degree that "leaks" are mostly something that happen through the media, not through lone vigilantes, there is a sieve of probabilistic patriotism reducing the "splash damage" of any leak. The media is not a "fifth column."

I believe the point is, no corruption has been shown in the Vault 7 leaks. They are getting cheered on by some folks simply because they are leaks, whereas all that has been shown so far are the methods the CIA uses to do its job.

The United States is a country founded on the principle that the mechanism of government is fundamentally evil and needs to be constantly scrutinized, distrusted, and criticized. That's why we have a free and open press, elections at every level from county to federal government, and branches of government that can overrule each other. Loyalty is to compatriots, not to the mechanisms of government. Those aspects of our country, to the extent that the citizens of the United States (or any other country) successfully maintain them, are worthy of respect.

It's incorrect to equate distrust of government with hatred of a country. The reason this country is lovable is that it was forged by distrust of power.

I actually find your equation of love of government with love of country to be mind-blowingly strange. What on earth?

An average Russian civilian gets his news from government-controlled media; the average Fox News viewer is loyal to his government but supremely misinformed. Both were, and still are, being manipulated. Is that what you want, dumb subservience?

And the myth of the US being better. Pah. China's elite are corrupt as fuck, but I don't think the US can take pride in saying "well we're very fucking corrupt, but China's worse at that!". Military-industrial complex, Halliburton, Citizens United, Wisconsin, Donald Trump and his GOP Congress still plowing through a bill that will strip health protection from millions... But oh no, look at the Chinese.

To quote Brandeis, "Decency, security, and liberty alike demand that government officials shall be subjected to the same rules of conduct that are commands to the citizen. In a government of laws, existence of the government will be imperiled if it fails to observe the law scrupulously. Our government is the potent, the omnipresent teacher. For good or for ill, it teaches the whole people by its example. Crime is contagious. If the government becomes a lawbreaker, it breeds contempt for law; it invites every man to become a law unto himself; it invites anarchy. To declare that in the administration of the criminal law the end justifies the means -- to declare that the government may commit crimes in order to secure the conviction of a private criminal -- would bring terrible retribution."

Hello? You are aware that leaking, a.k.a. treason, is actually illegal :O? Shocking, I know. Judging by the example of all the politicians in the US, who leak everything that politically fits their goals, you were led to believe there's some correlation between what you leak and the legality of it. Hint: there isn't.

You are the one inviting everyone to be a law unto himself. If any would-be leaker feels the government is doing any wrongdoing, he's free to take the law into his own hands and release that information to the public, along with whatever else fits on his USB key. There are proper channels to handle this which don't involve betraying your country and breaking one of the most fundamental laws in the country, and the fact that the government was acting illegally doesn't justify breaking the law.

> There are proper channels to handle this, which don't involve betraying your country

As Doge would say, "Much faith, wow." I see it more as a mafia organization. You try to speak out within the org? You'll sleep with the fishes, metaphorically speaking.

> If any would-be-leaker feels the government is doing any wrong doing, he's free to [...] release that information

So Snowden is justified in doing what he did?

I never did address the leak of the hacking tools. IMO that is less in the public's interest, and the leaker's motive is probably to damage the agency. But to re-use the mafia comparison: someone who can see clearly what that organization does (rather than just gulping propaganda from Fox News or Russia Today) probably doesn't feel that bad if his actions damage the org; that may well be what motivates him.

Leaking is not inherently treasonous in the United States. I understand that you don't live here and probably don't understand how we do things; I suggest that you research something we call "whistleblower protection laws."

You must live in a terrible, sad world where only brutal strength has value. I'm sorry.

OP's comment smacks of propaganda.

I'm not from the United States. I'm saying this from the perspective of an outsider, and most of the people here think the same. Americans have serious loyalty issues, stemming from self-hatred. I'm watching from the outside, seeing things like Snowden, and comparing how I would've reacted to a similar story in our country with how the US people reacted. A similar leak in my country would be condemned across the whole political spectrum, regardless of whether 1 document out of 1000 showed something bad. This is not the way to change things.

The news coverage of the Snowden leaks being taken at face value probably played a part in this. Non-technical journalists made assumptions which were either unsubstantiated or even sometimes refuted by the leaked material accompanying the coverage.

Examples which I still hear repeated: The claim that NSA considers TOR users to be extremists (Based purely off XKS DSL code which was used to filter connections), claims that there is a loophole allowing NSA to collect American information via GCHQ (In reality US Person Information protections apply regardless of who collected the data), and claims that companies are willing "partners" of the NSA via PRISM (Whereas PRISM is actually the term used to describe use of the FBI and a FISA warrant+gag to collect non-US Person data from US companies using an assumed-to-be automated process, the companies themselves not actually having a choice in the matter).

NSA seems to have decided not to push back on the untrue claims or assumptions being conveyed. There were responses along the lines of "the organization does everything within the bounds of US law for national security purposes," which is quite meaningless and only served to make people more upset. With the material already out there, NSA PR probably could have done much better if it had responded in an informed and blunt manner, instead of allowing the "blanks" to be filled in by assumptions.

I understand and fully agree with your main point, but it isn't too hard to see why this attitude is prevalent regarding the US IC these days, as most folks have not read the source material and are relying on what the press coverage either stated or implied.

There are certainly a number of Americans who feel the same way. Most of us, though, like to believe we value liberty above loyalty. Both are valuable, but where the two conflict, the former takes precedence (even if we sometimes end up in a messy debate over the interpretation of "liberty").

If that attitude has yielded "one of the least corrupt, most democratic country out there", as you called it elsewhere, I don't see that as a bad thing.

> A similar leak in my country would be condemned across all the political arc regardless of whether 1 document out of 1000 showed something bad.

That sounds like the result of a polity entirely disconnected from the citizenry. The US is the way it is mostly because the politicians' reactions are an extension of the identity-politics at play in the popular culture itself. If the people of the US left think secrecy-culture is bad, then the politicians of the US left will also condemn secrecy-culture to "score points" with their electorate, no matter how obviously necessary such a paradigm might be for their day-to-day lives. In this way, the US is a lot more of a "democracy" than it seems to be at times—both for better and for worse.

(The "worse" is that it's very hard to enact measurably-good-from-every-angle technocratic policies in the US if they have bad optics from a populist perspective. The politicians are just summaries of the people they represent, rather than technocratic experts derived from them. "The lunatics are running the asylum" and such.)

It shouldn't be surprising that we're not blindly loyal to our government; our country was founded when our ancestors kicked out a tyrannical government.

For many of us, our loyalty is to our country as a whole, and the ideals upon which it was founded, not to any regime that attempts to abuse those ideals without our knowledge.

> tommorow it's CIA agents identities

I spotted two names in the Vault 7 leaks which WL failed to redact. Metadata from Shadow Brokers leaks gave Equation Group member identities, and additionally the Shadow Brokers twitter account directly called out a former Equation Group operator who is a part of the Information Security community (currently private sector / pen-testing).

So this is already starting to happen. Dangerous path, especially considering the indifference you mention.

I'm mildly surprised you can't process why many in the United States (including myself) praise Snowden.

The United States is a country founded on distrust of power. Nevertheless, it is a state, and its function is to protect its citizens through monopoly of force. That tension is central to our political consciousness.

Sometimes people argue that "secrecy" is required to fulfill the duties of the state. Secrecy is required, sometimes, but secrecy requires trust, and trusting the powerful is not something we do. If we 'must', then we demand checks to prevent its abuse.

In the mythos of the United States, the government (county, state, federal, doesn't matter) is always viewed as a potential threat. America-worshiping morons notwithstanding, there is a deep moral undercurrent in this country that views the state as intrinsically corrupt, simply because it has power.

Many felt that the Snowden leaks showed that whatever mechanisms were supposed to be in place to check overreach by a potential enemy were not functioning. People who view Snowden as a hero believe that he was fulfilling his duty as a citizen to his fellow citizens.

If you wonder why someone might be paranoid about their own government in the 21st century, I think it's sufficient to point out that governments killed several orders of magnitude more of their own citizens in the 20th century than they killed foreign citizens in wars. I personally believe that a turn to an unchecked security state in the United States is a bigger threat to my children and children's children than a war with China (or wherever), and a vastly larger threat than "terrorism".

I hope that helps clarify why a reasonable citizen of the United States might be at least ambivalent towards leaks.

Also, I hope it's clear that this is a discussion about what is fundamentally a fiction: the people (myself in particular) have a bullshit narrative about governments, corruption, and heroes; you have some bullshit narrative about spies and the CIA. Both are very thinly connected to reality, because the reality we're describing is vast and overwhelmingly complicated. I'm embarrassed to say that I don't have any well-researched constructive suggestions about what I would do to fix the mechanisms of verification that should be in place to check secrecy in our security apparatus. But this isn't a conversation about mechanisms and policies, it's one about narrative, and one compelling narrative in the United States is that we owe our duty to our compatriots, not the government, and that government is always the first and most likely place to look for something evil.

I agree with your assumption that most US leakers being treated like heroes/celebs will put the US at a disadvantage against geopolitically rival countries, where leakers are considered traitors.

But I'm not American and neither are you. Why do we care?

Read the US Constitution so you actually know who the traitor is here.

My understanding of an air-gapped network/computer is that it gets no access to the Internet or external devices (like a thumb drive in this case). So is it accurate to call this "hacking an air-gapped network"?

In almost every system you need to exchange data, and using a medium to move data between the trusted host and the untrusted host is _usually reasonable_ and _common practice_ (traditionally you move data from black to red with fewer restrictions, while the reverse procedure is more rigorous).

Making a comparison, I’d say air gapping is to networking what galvanic isolation is to circuits: you don’t have direct contact, but there’s information exchange (be it bytes or em fields).

EDIT: I would call a (strictly) isolated computer a TEMPEST-compliant and physically isolated host.

Say you store your private PGP key on an air-gapped computer. When receiving a sensitive document, you put it on a USB drive, transfer it to the air-gapped machine, and then physically destroy the USB drive. The air-gapped machine then presents the decrypted contents directly on a monitor.

There definitely exist scenarios where the air-gapped machine does not have to communicate both out and in, but where only in is required.

As others said, you usually need to move data. So the typical solution for the defense sector was to use high-assurance guards or data diodes. The former are like firewalls, but with an implementation designed to actually stop state-sponsored hackers rather than fill bank accounts while keeping out only the least skilled hackers. (They do the latter, too, at the prices they charge.) Data diodes are devices that allow data to flow only one way. There are at least two use cases for those, depending on which way the data flows.
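A data diode is enforced in hardware (e.g. an optical link with the return path physically absent), but the software-visible contract can be sketched in a toy model. Everything here is illustrative, not any vendor's API:

```python
class DataDiode:
    """Toy model of a one-way link: data flows from the low (untrusted)
    side to the high (trusted) side only. A real diode enforces this
    physically; this class just mirrors the resulting interface."""

    def __init__(self):
        self._high_side_buffer = []

    def send_up(self, data: bytes) -> None:
        # The low side can only push; it gets no acknowledgement back,
        # so the protocol on top must tolerate loss (UDP-like).
        self._high_side_buffer.append(data)

    def receive_on_high(self) -> list:
        # Only code running on the high side ever calls this. There is
        # deliberately no method that moves data back down.
        drained, self._high_side_buffer = self._high_side_buffer, []
        return drained


diode = DataDiode()
diode.send_up(b"sensor reading 42")
print(diode.receive_on_high())  # [b'sensor reading 42']
```

The missing `send_down` method is the whole point: the return channel is absent by construction, not merely forbidden by policy.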


Indeed, A and B are not "gapped" if a thumb drive connects to B, and then later the same thumb drive connects to A.

That's the same as being able to send a datagram from B to A over a network.
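That sneakernet-equals-network equivalence can be made concrete with a toy model (the host names and filename are made up for illustration):

```python
class ThumbDrive:
    """Shared removable media modeled as a store-and-forward channel."""
    def __init__(self):
        self.files = {}


class Host:
    def __init__(self, name: str):
        self.name = name
        self.inbox = []

    def stage(self, drive: ThumbDrive, filename: str, data: bytes) -> None:
        # "Send": writing to the stick while it is mounted here.
        drive.files[filename] = data

    def mount(self, drive: ThumbDrive) -> None:
        # "Receive": the next host to mount the stick reads everything,
        # exactly as if the other host had sent it a (very slow) datagram.
        self.inbox.extend(drive.files.values())


stick = ThumbDrive()
a, b = Host("A"), Host("B")
b.stage(stick, "update.bin", b"payload")  # stick plugged into B
a.mount(stick)                            # same stick later plugged into A
print(a.inbox)  # [b'payload']
```

If a channel like this exists in either direction, the two hosts are networked in every sense that matters for an attacker; the "gap" only adds latency.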

"Air gapped" is an idiotic term to begin with.

Radio communication such as Wi-Fi is literally air-gapped.

The plates of a capacitor can use air as a dielectric, making them literally air-gapped, yet the cap will pass AC signal, and two adjacent air-gapped inductors can pass signal, as well as power with great efficiency.

It's a perfectly reasonable term. It's not used for its literal meaning here, but that's fine. We use "network" to describe things that are not literally a net-like arrangement of threads, wires, etc.

It is not idiotic.

It is anachronistic.

Doesn't this require two-way dataflow between the air-gapped computer(s) and the "primary host"?

Edit: in other words, you would have to 1) plug a USB drive into the "primary" host and then plug the same drive into an air-gapped computer, and 2) take a USB drive that was plugged into the air-gapped computer and plug it back into an internet-connected computer. Plus, all computers in the dataflow above must be running Windows, right?

>Plus, all computers in the dataflow above must be running Windows, right?

Linux surely has plenty of 0 days to exploit.

Sure. But "plenty" is not equivalent to "so cheap and so powerful that a well-funded state actor would automatically design every targeted exploit package to be cross-platform." At least this particular example suggests that's probably not so.

Keep in mind from the article that the user must browse the files in the GUI for the exploit to work. I doubt Windows and the set of the most commonly used Linux GUI file browsers all have "plenty" of 0 days to exploit for this same purpose. Or, if they do, it's going to cost substantially more money to find them, test them, and package them up.

On an unrelated note, I agree-- the Linux kernel probably has plenty of 0 days to exploit.

This doesn't describe an air gap jump in the sense I understand the term (compromising or extracting data from a system that isn't connected to anything external, perhaps by ultrasound via the speaker/mic). This is regular movie-style "plug the magic thumb drive in" trick, no?

Tomorrow unwary operatives or turncoats will turn up at work, submit to a surprise frisk-down to find whether they're carrying any USB drives, these will be seized and analysed and found to contain this malware, and they'll be tossed into the deepest dungeons with only a kangaroo court to look forward to.

Why would you expect an agent to be carrying one of these things? The typical vector is some innocent local with poor security hygiene.

Anybody care to explain why I was downvoted for explaining what the immediate consequences to operatives will be? Or à la Assange would we all rather ignore the risk people are being placed at?

So .. is Mac safe, relatively speaking?

We detached this subthread from https://news.ycombinator.com/item?id=14624908 and marked it off-topic.

America really does have an obsession with Russia. It wasn't just communism.

Probably because they have one of the largest nuclear arsenals on the planet? Who wouldn’t be concerned? The only reason Russia doesn’t own most of Europe right now is because of that American “obsession.” You think that Russia would have stopped at the Crimea if it weren’t for the heavy US presence in Poland?

Yes, you can't compare Crimea with western Europe. Crimea was mostly Russian to begin with.

Poland, by that argument, as well as swaths of Germany, were arguably once "mostly Russian to begin with". Almost every civilisation can look back into its history and find convenient boundaries.

The argument is that Crimea is mostly Russian _today_, not just 50, 100, or more years ago.

>According to the (2001 census), the ethnic makeup of Crimea's population consists of the following self-reported groups: Russians:1.18 million (58.3%), Ukrainians: 492,200 (24.3%), Crimean Tatars: 243,400 (12.0%) [...]

>According to the 2001 census, 77% of Crimean inhabitants named Russian as their native language, 11.4% – Crimean Tatar, and 10.1% – Ukrainian.

>Ukrainian was until 2014 the single official state language countrywide, but in Crimea government business was carried out mainly in Russian.

Source: https://en.wikipedia.org/wiki/Demographics_of_Crimea

So that makes it OK to take over parts of a country? If so, then Europe should take over parts of the US, Tibet should give up on independence, parts of Jordan should become Palestinian, etc.

Russia deliberately flooded the area with Russians when it last held it, much like China is doing now with Tibet.

You can't argue this justifies war by another country. If the people vote for (or otherwise self-select) independence and to join another nation, that's very different from a war where the soldiers hide their identity in order to take a region by force.

I never said that. I just provided facts to back up evook's claim that "you can't compare Crimea with western Europe. Crimea was mostly Russian to begin with". Even if you look at the 1897 census, it was 33% Russian.

How was Kosovo different from Crimea?

Good for the CIA having skills and tools like this. These are needed.

But is hacking really a "weapon"? I think that hacking is a technical capability or tool, but I wouldn't call it a weapon.

The flip side of this is that we have to realize that as long as vulnerabilities are put into protocols/products on behalf of governments, those governments can exploit them, sure, but other people who have the ability to read or view code will eventually find or figure out these weaknesses/exploits as well.

Like weakening a protocol that all systems, even your own, use. Then not even the government itself is safe or able to defend adequately against them.

I really believe in Security Karma.

Stuxnet was surely a weapon, right?

Stuxnet, as an entity, was just the method to deliver different payloads, right? I'm not exactly convinced I'd ever class that as a weapon rather than a clever hack. I mean, is the ability to read and find holes in code a weapon?

Was it a weapon, or a very clever hacking job that prevented and held back scientists in Iran from creating an actual weapon? Either way, it was peace through superior code and vulnerability-finding power.

It raises the question, though: is the text editor mightier than the sword?

And what does the world gain by Wikileaks leaking this for public usage? It will only become more ubiquitous. Microsoft already has patched the vulnerability.

When will the wares of Russia, China, Israel, Iran, France, North Korea, or other countries be leaked? Something tells me they wouldn't be, even if Wikileaks got them...

Usonians are so brainwashed by "american exceptionalism" that they can't even see how out of control their military/industrial complex has become.

I'm not sure why you're worried about our military/industrial complex and informatics capabilities. I say this as an American: We're increasingly not a global power.

1. Our drone program is garbage; cheap Chinese factories are offering $200-1000 solutions that are better than our $60,000 solutions (edit: cost of explosives not included, but for lower yields it's way cheaper). The same goes for our logistics: it's dated, and the civilian market has better for 1/100th the cost.

2. Our opsec is broken. Whoever's behind these attacks on the US's intelligence community is winning. For example, we know for a fact that Russia has been using a very similar tool outside embassies, but the only reason we know that is that actors have so much to gain, politically, by dumping it into the public sphere. The US increasingly cannot "keep a secret".

3. Our industrial capabilities are falling behind. America's pursuit of ultimate cost efficiency has ultimately outsourced the majority of the US's modern manufacturing capacity. This means that at a national level we're at a demonstrable disadvantage to nations with much more of a connection to their private sector, the classic example being China.

Worrying about America's military-industrial complex is reasonable if you're near a naval deployment because America still has an impressive supply of very powerful explosives. But beyond that... we're sort of getting our asses kicked and our government appears to be rapidly destabilizing. You can suggest this is from outside influence, and I think there's strong reason to believe that there is an ongoing attack. But it can't explain the entire phenomenon.

So I'm not sure what you're afraid of from us. Nothing in the WL page should be surprising. We've seen similar tools disclosed earlier this year. These tools are more sophisticated versions of attacks that have been ongoing for, gosh... I had to write a less awesome version of this tool as a demonstration I was ready to move up in rank in a security challenge forum, it was so easy. It's harder now, for sure, but...

If you don't think Europe has a military industrial complex then you aren't paying attention - check out the list of countries here (https://en.m.wikipedia.org/wiki/Companies_by_arms_sales).

During the Arab Spring there were multiple reports of European surveillance technology being fundamental to the police states in the Middle East/Africa (http://www.bbc.co.uk/news/world-middle-east-40276568). Small arms made by H&K, Beretta, FN, etc. are prevalent across the region. Advanced weapons like anti-aircraft systems also flood in from across the globe.

The UK, French, and Germans were also key supporters of Saddam Hussein during the Iran-Iraq War in the '80s and actively supported the WMD program there by supplying technology, etc.

Every large power is involved in military manufacturing and seeks markets to sell to. Americans are no less ignorant of this than citizens of Europe, Russia, or Asia.

Europe having an out of control military industrial complex doesn't negate the fact that the US has an out of control military industrial complex as well.

I don’t know what an “in control” military-industrial complex looks like, but I don’t think the current situation is unique to the US by any means.

I don't either. My point is that both can have out-of-control systems at the same time.

You have only subtly insulted the commenter without providing a rebuttal regarding why you believe he is wrong.

> Usonians

What's this?

I've seen it occasionally used to refer to Americans, because technically speaking, any citizen of any country in North or South America is an "American", yet citizens of the USA don't recognize this.

To many, the implicit assumption that USA strictly equals America and vice versa is just another artifact of the arrogance built into US culture.

Use of the term "American" to describe US citizens is quite common outside the US.

To me, the idea that this is somehow a manifestation of our cultural arrogance and not just a mundane example of the malleability of human language is just another artifact of how certain people really, really want to find more reasons to hate us.

Presumably it's a name for people from the US but I've never seen it before and it's not great.

It's the word or similar to the word for the USA in a few languages, like Esperanto ("Usono": the USA, "Usonano": a USA citizen, "Ameriko" already being used for the continent and "Amerikano" for people from that continent). It was a term introduced in the 1800s without a lot of success, then after its use in Esperanto it was mildly popularised by American architectural legend Frank Lloyd Wright, who embraced the term and described his work as "Usonian architecture" and wrote at length about "Usonian character."

Most people outside the security community don't believe that the government has the offensive capabilities it does. This is necessary proof.

No, it's unnecessary blabber. Unlike the NSA leaks, these show that the CIA actually does what it's supposed to.

Targeted capabilities, not mass surveillance.

You mean being able to do one cancels the ability to do the other? That makes no sense.

No, but we haven't seen mass surveillance in any of the leaks.

All we've seen is specifically targeted capabilities with expiration dates.

Penetrating air-gapped networks is exactly the capability one would expect from the CIA, and it's exactly the capability it should have under its mandate.

You have a short memory if you don't remember PRISM.

PRISM was an NSA operation, so it's not clear how that is evidence against his claim that the CIA is more responsible with their cyber offense/surveillance than the NSA. If anything it seems to lend even more credence to his point.

That would assume that the NSA and CIA do not share information, so that the actions of the first don't benefit the other. That would be foolish. No need to create two mass-surveillance systems if your brother has one and you can use it as much as you want.

Not fixing the underlying security issues makes us all more vulnerable.

Are you arguing "other countries are not experiencing leaks -> leaks are harmful"? Sorry, I don't follow...

It's counter to the stated mission when it becomes entirely one-sided and not beneficial. Wikileaks was meant to put an end to corruption, etc., by revealing leaks across the world that benefit its local people. The biggest leak to do so lately, the Panama Papers, wasn't even by Wikileaks. What value are they providing?

By publishing this, they increase the vendor's incentive to fix the security bugs.

It had already been fixed.

What is the gain? We civilians no longer have to be victims of our out-of-control governments, which are no longer working to protect us but rather their corporate interests.

I have no idea how leaking highly classified cyberwar toolkit helps that cause.

Full disclosure ensures that vulnerabilities are known, taken seriously and eventually patched.

It also levels the playing field.

It was already patched. This just makes the world worse.

It educates the masses as to the nature of the lies of their society.

This may not be good for "Americans", but it is good for "Humans".

Do you think I really want to do business with an American company if I know they are liable to secret, hidden manipulation by their own government, while facilitating a facade of 'freedom'?

Erm, they could just have these verified by trusted third parties, mention that they have them, and not release them to Russian hackers. That'd be the responsible thing to do.

I am very interested in knowing if there is an equally efficient alternative.

The alternative is a social system designed not to keep secrets, but reveal them.

I don't think that is realistically practical.

Currently you have several secrets you use to confirm your identity.

The idea of the individual would need to be eliminated to remove all secrets.

It's funny you say that. I've had a longstanding conclusion that my true individuality only arises from the secrets I keep.

Even if these are benign and even sometimes silly secrets, even if no one would find them interesting, they are the only thing I have that sets me apart from others.

This is another reason I feel anxious over creeping mass surveillance and the loss of privacy. It's a direct threat to my individuality.

The more I think about it, the more I come to one conclusion.

An organisation only knows who an individual is by the secrets they share with each other. It doesn't matter what the secret is, it comprises the identity.

It's why I get angry when I hear talk of backdooring/banning encryption.

It's why the phrase "I have nothing to hide" is a ridiculous fallacy.

Even the credit card in my wallet has multiple secrets to identify itself, and its ties to my account.

Without secrets, you become a non-person, because you are incapable of proving who you are.

>The idea of the individual would need to be eliminated to remove all secrets.

Well, this is good when it happens, on occasion, anyway. It's not a static value; sometimes such states as you propose are valuable/valued - other times, not so. This is not absolute, since you mention practicality.

More specifically, since we are discussing governance, we must absolutely remove the individual from the occasion if we are to maintain a stable social construction, while at the same time strengthening the individual's position within either a fluid or a rigid social structure.

I personally believe modern government is holding us back. Let's just get this out of the way now.

In the digital age, we don't need all the hierarchy; we simply need better apps. And I most certainly do not want my apps to have personality, if they are designed to ease the means by which I exchange, equitably, with everyone else using my app^W^W^Wwho is a member of my society...

>The idea of the individual would need to be eliminated to remove all secrets.

I believe this is an appeal-to-authority fallacy. You have not thought the original statement through, but rather acted as an individual unit of agency reacting to the pressure of the masses.

Of course we will still have individuals; human bodies are made that way, and hopefully will stay that way for a long time yet.

What we won't have is rock-star/evil-genius politicians, nor will we have much reason to keep secrets from each other, over who has rice and who has salt and who is on the way to Mars, and so on ...

I have thought this through. I'm happy to have new thoughts come my way, but you seem to miss some of the requirements of interaction.

Any organisational unit needs to be able to identify the state of any individual it interacts with, to ensure continued interaction, such as resource allocation. Because resources are finite, such allocations need to be fraud resistant in some way. That is achieved by proving identity - an exchange of "secrets".

Some examples from day-to-day life are license numbers, registration IDs, home address, and so on. How private such a secret needs to be is proportional to the ownership of the secret.

My address is usually fine to share, as a stranger may find it difficult to take possession of my house; they can intercept things on-property, but even that is more difficult.

My bank details are not as safe to share, as they tend to be the sum totality of how a bank identifies me. Thus fraudulently removing a primary resource is easy.

> Well, this is good when it happens, on occasion, anyway.

I rarely see secrets being removed at all. In fact, the only time I can see an individual no longer having any secrets would be when they are made a non-citizen and can no longer interact within society, or not without an enormous amount of effort.

Which is exactly the goal in communist systems. Individuality is discouraged.

Even there, you need some concept of isolating individuals, to assign work, allocate food and housing, etc.
