The thing I'm stuck on, though: isn't it still possible to do semantic analysis on the various permutations? Basically reading the permutations for cogent statements? So, some sort of a posteriori analysis.
Infeasible for a human to do, but assuming one could construct a sufficiently advanced parser (non-trivial, of course), wouldn't it be possible to brute force it anyway? What am I missing?
"The swallow flies at midnight"
may (with a one-time pad) be encrypted into ciphertext that, incidentally, is indistinguishable from random noise.
If you just brute-forced that by XOR'ing every character with every other possible character, you could derive every possible message of that length, such as:
"garfield hate lasagna someday"
"men are cats why even bother?"
"pocket knives go to space yay"
etc ad infinitum
No measure of semantic analysis will help you here!
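A minimal Python sketch of why that is (message strings borrowed from above): for any candidate plaintext of the same length, there exists a key that "decrypts" the ciphertext to it, so semantic analysis of the candidates cannot single out the real message.

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

msg = b"The swallow flies at midnight"
key = os.urandom(len(msg))   # the one-time pad: truly random, used once
ct = xor_bytes(msg, key)     # ciphertext, indistinguishable from noise

# For ANY candidate plaintext of the same length, some key maps the
# ciphertext to it, so every plausible 29-character sentence is an
# equally valid "decryption".
candidate = b"garfield hate lasagna someday"
fake_key = xor_bytes(ct, candidate)

assert xor_bytes(ct, key) == msg            # real key gives real message
assert xor_bytes(ct, fake_key) == candidate # fake key gives fake message
```

Since every key is equally likely, every equal-length plaintext is equally likely, which is exactly the perfect-secrecy property of the one-time pad.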
If you use the same one-time pad to encode two or more different messages, then all the sorts of attacks proposed here become plausible again.
The security provided by a one-time pad relies entirely on the fact that it is only ever used once.
Computerphile recently showed how this was done.
Imagine a one time pad made for encoding numbers that used a "MOD 10" operation on each digit.
Then imagine the same key encrypting two different plaintexts:
1234567890123 -> 7150027069897
1111111111111 -> 7037671380885
If this were English letters rather than numbers, and you know 'e' is very common, you still can't get anywhere because each 'e' is encoded with a unique character from the key.
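A quick sketch of that mod-10 pad in Python. The key digits here are recovered from the first example above by subtracting each plaintext digit from its ciphertext digit, mod 10:

```python
def mod10_encrypt(plaintext: str, key: str) -> str:
    # add corresponding digits, mod 10
    return "".join(str((int(p) + int(k)) % 10) for p, k in zip(plaintext, key))

def mod10_decrypt(ciphertext: str, key: str) -> str:
    # subtract key digits, mod 10
    return "".join(str((int(c) - int(k)) % 10) for c, k in zip(ciphertext, key))

# Key digits recovered from the 1234567890123 -> 7150027069897 example.
key = "6926560279774"

assert mod10_encrypt("1234567890123", key) == "7150027069897"
assert mod10_decrypt("7150027069897", key) == "1234567890123"
```

Note how repeated plaintext digits (the all-ones message) come out as different ciphertext digits, because each position uses a fresh key digit; that is exactly why frequency analysis on 'e' gets you nowhere.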
With public key crypto it's a lot more likely that something might be broken. But then again if you somehow solve the problem of swapping secret keys/OTPs with everyone you want to talk to, you don't need public key crypto.
>Pakistan secretly captures Bin Laden by bribing tribesmen. The US finds out by bribing Pakistani officials. Further bribes with foreign aid money get other Pakistani officials to issue a stand down order. The SEALS swoop in unopposed but somehow still lose a helicopter. They kill a captive Bin Laden as part of a deal to avoid exposing Saudi support for Al Qaeda. The media gets fed a cover story about the compound being a command center. Some doctor guy becomes a scapegoat and vaccination programs are derailed in all of Pakistan. The CIA fabricates documents from the compound and flirts with claiming credit for "enhanced interrogation" technique in the matter.
If you don't like that story then sure by all means stick with Story A: The CIA does brilliant investigative work. The commander-in-chief makes a gutsy call. The SEALs storm in and kill the bad guy in a firefight. He is buried at sea with full rituals. The 2012 presidential campaign starts a few days afterwards.
If you support Story A, then this would certainly make sense:
>It's like they simply forgot about the biggest reason it took so long to find Osama bin Laden: he was so security-concerned that he used couriers and relays (which, counter to the broad narrative about the terrorist shift to security, actually cannot be decrypted)
The claim comes from a previous HN article about the killing of Bin Laden: https://news.ycombinator.com/item?id=9520984
All the back doors accomplish in the end is harming the general public.
So imagine if Apple, upon condition of selling iPhones within China, must provide the Chinese government with a backdoor through Apple's encryption. Do you know how many federal employees use Apple technology? Including the President himself, who totes an iPad everywhere?
I don't think that federal intelligence and law enforcement officials calling for backdoors have fully thought through the consequences.
I think assuming that level of incompetence is a big claim.
It's simply much more likely that the actual plans/goals and the talking points and press releases about the plans/goals are mostly unrelated, as usual. You can infer that those calling for backdoors have decided that calling for backdoors is the best thing to say; inferring anything more requires more information.
Not really. Have you listened to the average member of Congress lately? Incompetence runs deep throughout all levels of government.
Instead, look at it this way: everyone has their area of responsibility. The head of the CIA is charged with providing the best possible situational awareness for the U.S. government. He's going to make proposals and requests that will help him do that.
He's not charged with balancing all possible consequences from his requests, and he's not going to do so.
But ... that seems like quite a level of incompetence for someone trusted with the position of Head of the CIA?
By which I mean: not charging him with that responsibility is a mistake (in government structure), but him actually not doing so is his own incompetence, is it not?
To a man with a hammer, everything looks like a nail. To a man who only has words, everything looks like a soapbox.
How is calling for a really bad idea because it's the best thing to say different from the level of incompetence you say is too big to assume?
edit: "not going to happen" could be read as "not going to be effective." I wouldn't actually be surprised if the US ended up passing some law restricting crypto to an approved list of backdoored schemes (surprised: no, dismayed: yes), forcing people into hiding their crypto in deniable ways. What some people don't seem to grasp is that no matter how much you outlaw certain math operations, whether or not the end users comply with those laws is ultimately up to them, and the terrorists simply won't comply.
The NSA is part of the Department of Defense, charged with foreign signals intelligence for national security purposes; it's not a law enforcement agency. (And many of the tools it uses would be flagrantly unconstitutional if used for domestic law enforcement.)
Generally, blurring of the military and law enforcement roles is not a healthy thing; it may be good for order, but rarely for liberty.
I get that some of the controversy lately is about how well they stick to their mandate, but that doesn't change what the mandate is.
It sounds impossible, of course, but that's just because we're not sure where these guys put their goals; some may be fighting for lack of another purpose in their life, while others may be motivated by personal loss or a thorough belief in the violent interpretation of Islam.
So what we really need to do is find a common thread among all the terrorists, and pull on it.
And it's pretty easy to see that this avenue of thought (that we will put our efforts toward finding a way to ban encryption) is totally ridiculous.
Read pretty much anything Amnesty International publishes, or even the right history books and you'll have all the evidence you ever need as to why privacy is important even when you think it isn't.
Once bad things start happening it's too late to change your mind. They already have what they need.
The government would just have the private keys so they could decrypt traffic easily.
Someone posted a photo of master keys, and the Internet was, as usual, the Internet.
...Unless you're intercepting and sniffing everything that's unencrypted. Then we're back to square one.
There is no truth asserted in the premise that would make the conclusion true. It's fairly clear that it was a statement from an original source; whether that source is trustworthy is where we would find (or not find) what's needed to show the conclusion true, not in the premise of the 'fantasy.'
"$noun did $verb, which resulted in $result, and $result is $something interesting, which begs the question $question."
And terrorists would be smart - they'd employ non-backdoored encryption, hide that in "legit" communications, and encrypt that with the proper back-doored government encryption. They'd appear to be a totally normal snapchat user, or whatever.
If we actually meant to give up all encryption, script kiddies would destroy civilization in three weeks.
Yes, measured data and unmarked compressed data have this same property, as does actual random data. But it does not look like nine nines of false positive rate are a concern to those people.
If you include steganography, then yes, it's certainly not easily recognizable. I don't think good steganography can be recognized at all, but that's not my area and I've had people contradict me on this (without further info), so I'm not sure.
I think you're straw-manning their position here.
The opponents of encryption aren't claiming terrorists will start using unencrypted communication, they're advocating for legally-mandated "back doors" to be built into supposedly-secure communication systems.
Terrorists won't use legally-mandated software for their illegal operations.
It may indeed be that this kind of signals intelligence isn't actually helpful, but I don't think this is a very good argument either way. If there was a massive disruption of planned terrorist attacks, it is not useless just because there wasn't a complete elimination of terrorist attacks.
The issue the security forces _did_ have was in identifying where the calls were originating. The actual content was not encrypted though.
It's unfortunate that there's no way [without a lot of money] to launch a campaign that highlights the fact that most sensitive online transactions rely on encryption - banking, purchasing, etc - to protect you from The Bad People who want to steal your critical financial information.
That would also highlight that it's not just about keeping secrets from the government - in fact, that's the smallest fraction of it.
In the end, it's more education, isn't it? Most things are. But in some cases, like this, it's a race between general education and political (and spy agency) advantage-taking.
That's Snowden's greatest legacy - a few more people became educated, and the public frustration and pouting spy agencies are doing regarding encrypted communications are a bit more transparent than they once were.
Google pins their public key fingerprints right into Chrome, and this feature is open:
Edit: Ah, if you were talking about the CA's private keys, then the parent is correct.
I don't know about that. I'd argue that most people don't think of themselves as bad, but it doesn't make them not-bad. And humanity definitely has base instincts - if we didn't, we wouldn't need governments and police.
I disagree. I've never found a coherent absolute ethical framework. Not even "Don't kill people."
When anyone is arguing that something is bad, that person has to appeal to an authority or belief. Often the reasoning leads to utilitarianism: "If we want society to continue, we should ban murder." But that's still an if/then statement. Beneath the if/then is an appeal that society is good or desired.
> And humanity definitely has base instincts - if we didn't, we wouldn't need governments and police.
This viewpoint becomes popular with Hobbes in the 1600s. I disagree with the term "base instincts," which negatively connotes those things. I'll change it to "randomness"; i.e. in a society of nondeterministic people, some will try to kill the others. Government and police try to reduce that kind of randomness. But from that same randomness we get music, science, justice. I'm speaking loosely of course.
I think of my American government as social contract, not as protective parent.
I think "randomness" is even worse, because it makes it sound like these actions could be expected from anyone at anytime. Statistically, a small portion of the population is responsible for most of the violence, through repeat offenses. Also, in many situations there are clear warning signs (e.g. mentally ill with clear homicidal ideation, member of a gang). Are you really arguing that a dice roll is a good fundamental model for human behavior?
Nondeterminism (assuming humans are truly nondeterministic) doesn't really matter here, except for the fact that we don't have a way of precisely predicting people's behavior (and may never have one). If it turns out that humans are just complicated, deterministic machines that we can not feasibly predict, the reasons for which we have developed societal structures do not disappear.
> But from that same randomness we get music, science, justice.
I think it's pretty easy to distinguish between the first two and violence. I agree that the third is a bit trickier. If what you mean is that the exercise of both these and the dispositions-formerly-known-as-base-instincts are a result of allowing for a significant measure of personal freedom, I'll agree with you. But writing human behavior off as having no more structure than a random number generator ignores a lot of predictive and explanatory power that we do actually have.
They can. A neurotypical person's brain could spontaneously generate a violent psychotic episode due to a stroke or adrenergic tumor or somesuch. Just like any area of the sky could spontaneously throw a lightning strike at you at any time. It's low probability, certainly, but murder is a low-probability event to begin with.
The important point is that a model with "non-deterministic" people in it has more predictive power, epidemiologically, than a model where it's impossible to become a murderer without "warning signs." It's not at all "writing human behavior off"; the fact that the model includes randomness can actually help you prevent murders more effectively, by leading you toward strategies to cope with unpredictable murders—e.g. building education toward methods of "de-escalation" for psychotic episodes, crimes of passion, etc. into your society—rather than simply trying to reinforce policing and social work.
I think if we eliminated all murders except these, we would be in excellent shape. My point is these are not the ones worth focusing on, because we don't have good tools to deal with "random, history free, psychotic break."
> The important point is that a model with "non-deterministic" people in it has more predictive power, epidemiologically, than a model where it's impossible to become a murderer without "warning signs."
Well, if your probabilistic model has no "warning signs", then how does it provide any information at all? If you don't have a method of using information to differentiate the probabilities when given a person/group of people/location etc. then you have no predictive power at all, except for the average murder per capita.
> building education toward methods of "de-escalation" for psychotic episodes, crimes of passion, etc.
De-escalation of psychotic episodes is an impossibly hard thing to teach without protracted work with a mental health professional. In addition, I seriously doubt that there would be any effectiveness when taught to people who have not experienced psychosis. Teaching this to everybody would be inhibited not only by cost, but by the fact that there is not likely enough people in a society that would be good enough therapists to do this on a large scale.
Sure, they did not intend to crash, but choosing to risk others' lives is considered a reasonable thing to do, assuming you're not overly blatant about it.
I would agree that most people don't think of themselves as bad, but since there's no objective 'bad', whether or not they're 'actually bad' is up to the observer. When a person is labeled as 'evil', it's not a description of them as a person but a description of what the describer thinks of them. Hitler wasn't evil because he killed millions of people; he was evil because the general consensus is that doing the stuff he did makes you an evil person.
If one loses sight of this and starts to think of evil as objective reality, they're taking their biases and opinions as objective reality, and down that path lies ruin.
But back to Hitler: if you're trying to make the case that his evilness is subjective, I mean no offense, but that's a really dumb argument. Hitler was in no small part evil because his actions were, on one hand, irrational, fueling and amplifying his people's potential for hatred and destruction, and on the other hand detrimental to the survival of our species and of Earth itself. And again, genocide is not natural. You see, in nature animals kill to eat, but that's to satisfy a basic necessity and not out of some wicked sense of justice, and animals definitely cannot kill on an industrial scale like we do. Whatever definition for "evil" you find, genocide on an industrial scale is pure evil by definition.
And if that doesn't sound objective enough, consider that culture is a part of who we are. We aren't DNA-coded to eat certain foods, or to live in a certain place, or in a certain way. Compared with rats, we can rely on the wisdom of our elders in order to survive. And we've survived this way for a long time. As an example, our rich culture, which includes preconceptions and taboos, is what prevents us from eating each other, or having sex with our siblings, or bringing human sacrifices to our gods. Actually, some preconceptions are subtler and newer than others - for example, the notion that children are fragile beings that need to be loved and protected, instead of someone's property, is pretty new, having been popularized by Christianity.
So you know, if the popular conception is that doing this or that is evil or toxic or taboo, there's a high probability that such judgments are correct, helping us to survive and thrive. Even with all the false positives (which tend to be a lack of tolerance towards people who are different from us), dismissing our heritage would not be wise. Plus, usually the guidelines are simple, like being a murdering maniac counts as evil, though somebody should tell those jihadists.
This is a really pedantic, subtle and (I think) important distinction. If one sees good and evil as objective truth, they're reliant on the source of that truth for their moral judgement. This explains the jihadists you mention. It's not that they enjoy being murdering maniacs (although I'm sure some do), it's that their source of moral truth tells them that the infidels are evil and must be destroyed (or whatever), so they see what they're doing as a good thing.
They are bad, we are bad. The only not bad involved in all of this is the little children who we both have hurt, and even then those children will grow up and become what the other side considers bad.
However when it comes to encryption, bad guys use it so it needs to be banned!
It's like they have it completely backwards.
However, even if the bad guys didn't have encryption, we would still want good guys to have encryption to prevent the bad guys from intercepting private and sensitive communications.
Ignoring the issues such as fending off bad guys (so what if they only have a knife, I'd rather bring a gun to a knife fight), hunting wildlife, shooting for fun, owning collections, and many other reasons. Outright dismissing all of those is like dismissing the need to protect private electronic communications.
Terrorism is pretty far down the list of things that frighten me.
The Europeans are fond of reminding Americans that we think 200 years is a long time. So, how long ago was the Stasi disbanded, again?
Of course, you could play the "Slippery Slope Fallacy" card against my argument... but we're not dealing with logical entities, are we? We're dealing with often-irrational human beings and governments composed of them.
The reality is, of course, encryption was used in these attacks. These terrorists aren't stupid. That doesn't mean it needs to be banned, the same way a random stabbing doesn't mean knives need to be banned. I think the people saying "No way, encryption has never been used for nefarious purposes" are just as bad as the other side. Having a sophisticated view of this stuff makes a lot more sense.
Encryption will be like the gun debate. It will flare up from time to time. It's an easy horse to beat because it's too technical for the layperson to really have a good understanding of, and clearly there are political elements that would love Clinton-era laws limiting key size and such. The reality is that this ship has long sailed. The genie is long out of the bottle; you can't control encryption like it's 1993.
Unfortunately, I'm already seeing similar hyperbolic arguments about cryptography - "let's outlaw math", etc.
As you implied, I'd like to see a rational discussion. Yes, cryptography can and will be used for nefarious purpose -- but it's used in much greater capacity for legitimate purposes. That's what needs to be emphasized - it's the only logical, supportable argument to make.
Pretty much all the British digital policies are silly. I think the simple thing they seem not to understand is that the Internet is smarter and moves faster than policy.
Quote from UK government advice to small business people, see
I suspect that GCHQ wants people using encryption, but encryption that they have a feasible way of breaking if needed. Yes, I know, there's no way of stopping the backdoors from being used by other agencies/bad actors.
I don't think that's a good idea either, but there are people who would probably see that as a reasonable middle ground.
The only paragraph discussing encryption is buried in the middle of the article.
The attackers are believed to have communicated using encryption technology, according to European officials
The lede did not mention encryption at all: "The attackers in Friday’s terrorist assault in Paris communicated at some point beforehand with known members of the Islamic State in Syria, officials on both sides of the Atlantic say."
After Paris Attacks, C.I.A. Director Rekindles Debate Over Surveillance
- Article launches with sensational, buzzfeed quality linkbait headline.
- Article gets lots of clicks
- The title is quietly changed to a more restrained version which often matches the print edition title.
At first I thought there must just be different editors overseeing the web and print editions, but now I think it's an intentional form of viral marketing or activist editing.
In some cases, the sensational headlines really diminish the journalistic quality/seriousness of the article, or create the impression that a minor point in the article was the main thrust of the article.
Similarly, there has occasionally been a headline that appears intentionally boring so that the paper can publish a story but effectively hide it from view.
So in a nutshell it's classic misdirection, enabled by the digital medium, since tracks can be fairly easily covered making it harder for those who care/notice to call attention to it. For a while I thought about creating a screenshot archive of changed titles but eventually just stopped reading the paper.
More specifically, the NYTimes public editor wrote about headlines earlier this year: http://www.nytimes.com/2015/04/19/public-editor/hey-google-c...
He made it easier to understand unicode!
To me, it's transcendental.
The implication is that it can be illegal in some cases to say "The government knows the factors of X" or even "The factors of X are Y and Z."
Targeting mainstream applications only hurts mainstream users. The NYT should write an article about that.
"If you outlaw encryption, only outlaws will have encryption"
I'm not on the side of reading messages but you missed the real argument being made.
Typical HN. How about someone answering the argument that will really be made by governments, instead of the pointless one the NSA etc. have already answered? You can downvote me, but you don't get a downvote in governments around the world when they outlaw real encryption.
I was ready to up-vote your argument because I agree with the first two paragraphs, but the snark turned me off from supporting you. Leave it out next time and you'll make a more convincing argument.
What does seem being argued for is to mandate adding a 'backdoor' for the government.
The counter argument to that is that adding a 'backdoor' makes the encryption pretty much worthless.
The weighing of these risks and the importance of privacy versus the anti-terrorism benefits is the only debate (not) happening.
Of course it doesn't, and that's not what I meant. I didn't mean wholesale outlaw, as in no one would be allowed to use it. Governments can mandate consumer communication technology, for example, like they are trying to do with Apple now.
Suspension of disbelief on security of escrow mechanism is a necessary pre-condition.
I'm pretty sure you can downvote them by voting for representatives that do not support such laws. At least if you're in some kind of democracy.
Voting is not a solution to a problem like this and is generally very ineffective in the US.
Also remember that crypto would still be legal in the rest of the world (China is not going to backdoor their crypto for the NSA's benefit) so you would still have heaps of encrypted traffic moving across the internet. I doubt that such a measure would do anything but make mass surveillance of western populations easier.
But then the NYT doubled down on the terrorists-and-encryption angle in a separate story here: http://www.nytimes.com/2015/11/17/world/europe/encrypted-mes...
Which says in the second sentence: "Obama administration officials say the Islamic State has used a range of encryption technologies over the past year and a half, many of which defy cracking by the National Security Agency..."
HN discussion of that second story is here: https://news.ycombinator.com/item?id=10579201
"The attackers are believed to have communicated using encryption technology, according to European officials who had been briefed on the investigation but were not authorized to speak publicly. It was not clear whether the encryption was part of widely used communications tools, like WhatsApp, which the authorities have a hard time monitoring, or something more elaborate. Intelligence officials have been pressing for more leeway to counter the growing use of encryption."
The NYT didn't seem to pass it off as anything other than what it is. What's your logic here? Should all anonymous sources be banned? Or only the sources with opinions that you don't like?
Should the NYT report that a UFO passed over because some official told them so? No.
When you receive an anonymous tip, it's your job as a journalist to try to confirm it, not report it directly.
Additionally, when a paper reports something with no evidence and screws up, they usually publish a retraction, not change the article and direct you to a new copy.
Here is the NY Times Standard and Ethics Statement, which describes the newspaper's "distaste" for anonymous sources:
In fact, that's probably the reason it was pulled: a complaint was made to the NYT Public Editor about the piece's sources.
I'm kinda ashamed I haven't thought of that.
And that just shows how little a mandatory backdoor policy would help: being able to read something doesn't mean you can understand it. It's always possible for someone to come up with a communication scheme that you didn't anticipate or can't interpret, and you can't legislate away that capability.
So a backdoor policy doesn't trade privacy for safety -- instead it trades privacy for a chance at safety, and pretty much just safety from careless, unprepared attackers.
Without encryption, the internet doesn't work. Without the internet, banking and e-commerce don't work. Really, not a single user-identifying service can function without encryption.
Also, encryption is based on math and logic. You can't really prohibit people from using it, or from having it, in any sensible way. Meaning that if we lived in a fantasy land where this became outlawed, you'd still have any two nodes using encryption should they wish to. I doubt terrorists have the necessary incentive to follow international laws.
In fact, a mathematically unbreakable encryption is simply generating random data (aka. a one-time pad https://en.wikipedia.org/wiki/One-time_pad), and having a copy of this data at each end of the communication. Want to transfer sensitive data across the border? Just xor your data with the random data, and take the random data with you across the border. If the microSD card didn't leave your butt-crack at any point, you can merrily download the xor-ed data through NSA's mainframe, and they can't do jack about it, even with quantum computing, or any imaginable alien technology.
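A sketch of that border-crossing scheme (the variable names and the example payload are hypothetical): the pad is what travels physically on the microSD card, and the XOR-ed data is what travels over the monitored network.

```python
import os

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    """XOR payload bytes against an equal-length pad."""
    return bytes(d ^ p for d, p in zip(data, pad))

secret = b"the sensitive data to move across the border"

# 1. Generate a pad as long as the data; this copy rides the microSD card.
pad = os.urandom(len(secret))

# 2. Send this over any monitored channel; without the pad it is
#    information-theoretically indistinguishable from random noise.
wire = xor_bytes(secret, pad)

# 3. The recipient, holding the physically-carried pad, recovers the data.
recovered = xor_bytes(wire, pad)
assert recovered == secret
```

The hard part, as the one-time-pad Wikipedia article notes, is that the pad must be as long as the data, truly random, kept secret in transit, and never reused.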
Sure, someone tech savvy can get around that, but not everyone. Just look at how many criminal cases end up finding emails explicitly talking about plans and crimes. So there are people nowadays not even using encryption when committing crimes. If you make encryption-by-default impossible, the amount of unencrypted messages will increase, and you'll have gained those.
Now whether that's good or bad is a separate discussion; I tend to think it's bad, but that's a more nuanced argument than "you can't ban it, so don't do anything about it".
"If you make encryption-by-default impossible, the amount of unencrypted messages will increase, and you'll have gained those."
That requires you to monitor every message in every form of data, and it rests on the false assumption that banning encryption makes finding messages simpler; you would probably only find more things unrelated to what you are looking for. And it assumes you control all data channels, which even the US doesn't.
On the other hand, this message would then be flagged: bomb, truck, explode. While you could better be looking at other things, like origins and communication contacts. These give a clearer view than the random words you will be picking up, even when they are encrypted.
Don't we assume the NSA is logging any US traffic anyway? Even if not, they could require a direct line into all messages sent via some apps.
Or someone who has a strong incentive to keep their communication hidden - like a terrorist or a drug enforcer. This leads to a situation where law abiding citizens are punished when their right to privacy is denied, while criminals aren't hindered. "When you outlaw X, X will only be used by outlaws"
Criminals aren't perfect.
 - http://nindalf.com/OverconfidentArowana
See also http://www.bloombergview.com/articles/2015-10-20/shh-credit-..., http://www.bloombergview.com/articles/2015-08-18/bny-mellon-... and http://www.bloombergview.com/articles/2015-10-20/accidental-...
Clearly neither being really smart nor having a lot at stake will prevent someone from sending sensitive things over email.
As for terrorists, a short search turns up https://books.google.com/books?id=VqY4Wr3T5K4C&pg=PA410&lpg=....
I think if you claim that major criminals' encryption usage would be unaffected by making it difficult to use, you need to support that at least as well as I've just argued for the opposite.
Does this mean we should outlaw the use of screwdrivers?
> Former CIA Deputy Director Michael Morell said he suspects the Paris attacks will weigh heavily on the encryption fight ongoing.
> “I think what we’re going to learn is that these guys are communicating via these encrypted apps, the commercial encryption, which is very difficult, if not impossible, for governments to break, and the producers of which don’t produce the keys necessary for law enforcement to read the encrypted messages,” Morell said on CBS’ “Face the Nation” Sunday.
> “We need to have a public debate about this,” he continued. “We have in a sense had a public debate — that debate was defined by Edward Snowden, and the concern about privacy. I think we’re now going to have another debate about that — it’s going to be defined by what happened in Paris.”
So, "they" want to misuse the attacks on Paris as an excuse to attract mind-share and screen time for a counter-debate about privacy and encryption. But Snowden got to "define the debate" because of the new information (as well as evidence for old information) he brought to the public; it's an entirely different thing if you're a politician grabbing screen time to "define the debate" for an audience shocked by a terror attack, which is exactly what the terrorists want. Any freedoms or security taken away from us as a response to a terror attack is a win for the terrorists.