The story title mentions Australia but this is relevant to all the Five Eyes nations, as they're obviously pre-briefing the media on what the agenda will be, and this is the first time we're getting detail on what they'll be proposing (the UK proposals were vague).
What they seem to be talking around is implementing an app-level CALEA-like capability.
How I think they envision it working: companies would be made to build lawful targeted intercept capability into their apps, in the same way telephony and other equipment is today. The app developer receives a warrant for an identifier and is required to split off that traffic and change the keys, or encrypt it twice (with the sender/recipient key and with an intercept key, one per warrant; this already happens with some net and telephony warrants now).
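To make that concrete, here is a minimal sketch (Python with PyNaCl; the function names and envelope format are hypothetical, not anything any vendor or government has actually proposed) of the "encrypt it twice" idea: the per-message session key is sealed to the recipient as normal, and only for a warranted identifier is it additionally sealed to a per-warrant intercept key held by the authority.

    # Sketch of per-warrant "encrypt it twice" interception (assumes PyNaCl).
    import nacl.utils
    from nacl.public import PrivateKey, SealedBox
    from nacl.secret import SecretBox

    def send_message(plaintext, recipient_pub, intercept_pub=None):
        # Fresh symmetric session key for this message.
        session_key = nacl.utils.random(SecretBox.KEY_SIZE)
        body = SecretBox(session_key).encrypt(plaintext)
        # Normal end-to-end path: session key sealed to the recipient.
        keys = {"recipient": SealedBox(recipient_pub).encrypt(session_key)}
        # Warranted path: same key also sealed to the per-warrant intercept key.
        if intercept_pub is not None:
            keys["intercept"] = SealedBox(intercept_pub).encrypt(session_key)
        return {"body": body, "keys": keys}

    def open_message(envelope, priv, slot="recipient"):
        # Either the recipient or the warrant holder can unwrap the session key.
        session_key = SealedBox(priv).decrypt(envelope["keys"][slot])
        return SecretBox(session_key).decrypt(envelope["body"])

    # Example: Alice -> Bob while a warrant key exists for Alice's identifier.
    bob, warrant = PrivateKey.generate(), PrivateKey.generate()
    env = send_message(b"hi", bob.public_key, warrant.public_key)
    assert open_message(env, bob) == open_message(env, warrant, "intercept")

The point of the sketch is only to show why this is an app-level change: the extra sealing happens in the client before anything leaves the device, which is also why enforcement, not the maths, is the hard part.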
We all know the downsides of this approach, but it isn't technically impossible. What would be far harder is enforcing it, since that is a regulatory hurdle rather than a technical one. Enforcement is more feasible today because vertically integrated walled gardens are used for most app distribution, backed by two of the largest companies in the world, which may be susceptible to compromise (especially with the large tax issues hanging over both their heads).
On a scale of how bad things can get - I think warranted targeted surveillance is better than device backdoors which is better than metadata retention which is better than the mass surveillance we have today (leading to cable splitting and DPI, or situations like Lavabit)
Even if you're OK with warranted targeted surveillance, I don't see how a compromise is made here that doesn't lead to a whack-a-mole game where legitimate users are inconvenienced while the 'bad guys' are pushed onto alternate Android distributions and unofficial apps.
I also don't see how a CALEA-like capability is kept secure and safe - especially with apps (we saw the NSA use CALEA intercept to surveil political targets). Clapper et al always vaguely answer "key escrow" to this question without spelling out how that would work.
With successive backdowns in the scope of what these governments want to do (and this latest proposal is again a minor backdown), we might be reaching the point where comms really do go dark, and the new reality is that, despite all of the tech we have, law enforcement mostly relies on human intelligence and will have to scale back up for that. 3,500 terror suspects in the UK, 4,000 employees at MI5 - and notably in the recent attacks there were HUMINT warnings.
Stored-program general purpose computers are fundamentally a threat to any entrenched power that relies on being able to control any potential risk with physical, legal, economic, or social force. The only real way to control software that is no longer scarce is to find a way to hobble the "universal computing machine" so it is no longer universal.
Cory Doctorow's warning[1] about the War On General Purpose Computing received a lot of attention, but I suspect his far more important followup[2] about the looming Civil War over General Purpose Computing had a much smaller audience. Dan Geer suggested[3] that this "Cold Civil War" has been ongoing for a long time already. With this new push by FVEY nations against crypto, it looks like the war is starting to heat up.
> where comms do go dark
That's just not true. Metadata is everywhere and will likely only get even more informative into the future. As Susan Landau explained[4] in her testimony to Congress, the only people "going dark" are the people trying to "preserve 20th century investigative techniques [while] our enemies are using 21st century technologies against us." Complaining about "going dark" is just misdirection away from a total failure to update investigative techniques to not just keep up with changing technology, but to take advantage of the new opportunities created by our growing sea of {,meta}data.
I rushed through the article but it didn't really say anything of value, other than justifications from the state for treating everyone as criminals (oh how convenient that would be).
You don't really need much for targeted surveillance, right? One only needs to tap into the distributor and push a specific trojan update.
Even without that Telegram and Signal already have vulnerabilities by tying key-pairs to phone numbers via OTPs. GSM is broken, ergo so are these. If these agencies wanted to do targeted surveillance there is very little in their way IMO.
The argument presented in the article is a specious one in that they use the premise of targeted surveillance for instituting the structures for mass on-demand targeting.
This is a very slippery slope, and as usual the morons that form our 'fourth pillar' have let us down badly.
It would mean outlawing any software implementing asymmetric encryption; it would essentially mean making a specific application of maths illegal. Because this would also imply outlawing TLS as we know it, a lot of resistance can be expected from citizens and (parts of) governments and corporations alike.
That doesn't rule out such silliness from happening, but it will be a tough fight, so for now they'll probably stick to going after the silos (e.g., WhatsApp) and simply get them to replace true end-to-end encryption with some backdoored solution — it's easier and more effective now.
I think that you're spot on. My only surprise would be that it hasn't happened already.
Many of the major telecom-ish apps that were not subject to interception added the capability later via regulation or circumstance. Nextel Direct Connect, Skype and the mysterious purchase by eBay, and FaceTime after the patent suit come to mind.
Thanks for your analysis, seems like a realistic reading of the situation.
I wonder if the threat of the net going dark is really anywhere near as bad as the intelligence agencies pretend, considering that there is more information available than ever before outside of those encrypted messaging apps.
> "I personally want to live in a world where reasonable people and companies would say, 'You know what? Under the rule of law, and with the right oversight and a warrant, communications can be listened to when it's needed to protect us.'"
Yes well, I don't. But hey – why not facilitate foreign actors spying on our companies so that we may or may not catch any terrorists?
This is a meme that is coming from the top. Expect to see this phrase a lot more in articles and from talking heads on the topic. They aren't even very subtle about it.
1. The rule of law has not been compromised (Snowden has shown it already has)
2. warrants are issued by a proper judiciary (not the likes of FISA)
3. Oversight that protects citizens' privacy rights (let's all laugh at this one, since it took Snowden to show us that oversight just doesn't exist)
- warrants should have explicit expiration dates...no more indefinite intercepts.
- cooperating companies should be able to publicly say anything as soon as the warrant expires. The gag orders are what have allowed the scope of monitoring to be hidden from public scrutiny and that scrutiny is what's needed to keep surveillance to reasonable levels.
> Attorney-General George Brandis said the government will not pursue the controversial "backdoor" access option by forcing firms to plant flaws in their encryption software that would allow it to be cracked by police or security agencies
Forcing firms not to implement end-to-end encryption is forcing firms to implement flaws in their encryption software.
> Given the difficulty of cracking end-to-end encrypted messages during transmission, one option would be to improve warrant-based access to communications at the sender or receiver ends, Senator Brandis said.
> "At one point or more of that process, access to the encrypted communication is essential for intelligence and law enforcement," he said.
> "If there are encryption keys then those encryption keys have to be put at the disposal of the authorities."
The last part of the quote muddies the water a bit. Maybe they are interested in cooperation from companies with control of endpoint software (Apple, Google, Microsoft) to extract the keys?
Building your crypto system with an ability to spit out the keys on demand to any podunk FBI Director or podunk US President, not to mention any podunk sheriff, means you have built broken crypto.
> The rapid proliferation of encrypted messaging by terrorist networks has prompted...
Giving governments the power to perform mass interception and decryption of communication doesn't seem like a sensible way to fight terrorists, even if they say it's only to be used on suspects. Terrorist attacks aren't increasing because the "bad guys" suddenly got their hands on a copy of OpenSSL.
In the case of the most recent attacks, these people were let into the country voluntarily.
It says specifically that the government will _not_ pursue the backdoor options; seems that they just want to have clearer international protocols around warrants for information. Seems sensible if you ask me.
How do you think the government will access end-to-end encrypted data without making use of a backdoor? (Or alerting the user.)
Brandis said warrants should be "sufficiently strong to require companies, if need be, to assist in response to a warrant to assist law enforcement or intelligence to decrypt a communication". A company which makes end-to-end encryption will not be able to assist law enforcement in this way unless they make a backdoor.
Conclusion: Brandis either doesn't know what a backdoor is, or he does know but realises that "backdoor" has negative connotations so he is pretending that that's not what it is. Both possibilities are pretty reprehensible in my opinion.
Or, they're just going to redefine the term "backdoor" so it doesn't include automated access to endpoints, which is what this attempt smells like it would require. They might not want to call such a thing a backdoor for marketing/political purposes, but it still would be one. It would not be backdooring the encryption, but backdooring the software driving the user-facing plaintext.
> In mid-2013, less than 3 per cent of counter-terrorism investigations intercepted communications that were encrypted. Today that figure was more than 40 per cent, Senator Brandis said.
I want to hear more on this, because so far as reporting has gone on terrorist attacks since 2013... The use of encrypted messaging systems seems conspicuously absent.
In fact, it is notorious that they didn't use any such system, just regular SMS, speaking sometimes in Arabic or using very weak coded language. In general, these guys are morons.
However, ISIS overseas is different. They or an allied group have offensive cyber capability and an appreciation of opsec. They are known to have taken advantage of weak opposition opsec for disinformation and tactical advantage (hacking opposition command cellular devices via phishing and social engineering to get tactical planning information). I don't know if they use good encrypted comms, but it seems likely.
Would these skills migrate back to be used by local wannabe terrorists? I doubt it.
So the bigger problem is not deliberate use but accidental. If they were all using iMessage by default it is going to be much harder. No easy metadata, no mass scanning of SMS. You're left with physical surveillance, phone calls, rough cell location data and HUMINT. If you can't get their Facebook Messenger calls or messages you are stumped.
Of course, this is as intended -- no effective mass surveillance. But how do we enable supervised targeted electronic surveillance without it getting out of control?
If FB/Google gave in to a CALEA-type enforcement regime there are no limits on how much government surveillance would occur, at a level the Stasi would drool over.
None of any of this ever makes any sense. There will always be communication styles that are inaccessible to authorities. And if we ever get "spooky action at a distance" style communication that does not rely on an interposing medium (regardless of speed), then all this becomes even more moot.
Actually, the most depressing part of this, is that they do understand. Turnbull is a noted user of Signal, Wickr, and other secure messaging services, and has pushed for other cabinet and parliament members to use them also.
They know what they're doing, and sooner or later they'll go for a "more equal than others" approach to encryption.
What do you mean? The article basically boils down to Brandis wanting international intelligence agency protocols for warrants to get access to info like this.
Well, they can if FB includes a copy of the session key, encrypted with the public key of the escrow authority, appended to the ciphertext. The crypto is done by the FB app, so it is within their ability.
Big companies are easier to coerce than e.g. the pgp developers. There is no way for you to wrap your own encryption layer around the one used by WhatsApp/etc. You can post pgp messages on those systems but that is something very few will do.
The developers of Signal and similar privacy-oriented apps will probably rather shut down than compromise the security of the app. As long as at least one secure app remains, the policy is pointless. And even then there's other ways to communicate securely. There's no viable way to enforce this.
OWS probably wouldn't comply, but have you noticed how hard it is to install something Apple/Google don't want on their platform? Side loading is feasible but improbable on Android, totally impractical on iOS. In place of well engineered solutions from principled developers there would be apps from collaborators. I'm thinking WhatsApp/Facebook will fold pretty soon. 99% of users won't care, and we'll be back to where we were with PGP: most messages are in the clear, the only encrypted ones are huge red flags for further tracking.
Obviously, either encryption works flawlessly for both legal and criminal purposes, or it works for neither.
What the proposal seems to concentrate on is endpoints, where plaintext inevitably exists, and legal protocols for accessing it.
OTOH any sane implementation would only generate plaintext for display purposes, and would clear the RAM as soon as display (or input) is done, so finding the plaintext anywhere may be honestly impossible. At least, without tampering with the software on either end.
For the phones, that can be pulled regardless of the app's security.
If the legal framework is laid out, government can tell Google or Apple (or phone vendor) to push a system-level update. It is trivial for both to push code that can run without any restrictions, have full access to screen, audio, camera and network.
Not sure why they bother, though. Wasn't it said many times almost every baseband module is already a black box with possibility of undetectable access to the main CPU/memory?
For those who are unfamiliar with the Attorney General George Brandis, this is one of the people instrumental in implementing two year mandatory data retention.
This is a famous interview he gave which shows how little he understands about the concept of metadata, and is mandatory viewing for all who are not familiar with him:
His utter inability to understand the issues that he is legislating is disturbing.
"What people are viewing on the internet is not going to get caught ... What people are viewing on the internet while they surf is not going to get caught. What will get caught is the web address".
The legislation ended up retaining the IP address that you visit, but not the host or URL. I suspect this is the distinction he was trying to make, but nevertheless, it is still disturbing.
Our governments appear to be pursuing mutually contradictory aims. On one hand there are increasingly frequent and powerful cyber attacks which can only be resisted through superior cyber-security and encryption. Then on the other hand we get this rubbish.
Is it even possible to solve both these problems at once in a way which preserves the freedom of the net and doesn't involve some crippling PRC style regulation?
I wonder if Facebook/WhatsApp have already been testing this type of access under this current "feature" that's supposed to make it more "convenient" for users who switch phones:
Most in the crypto community seem to have sided with WhatsApp at the time, but I wonder if they were taken for fools, too, by buying WhatsApp's argument.
If I were to implement a backdoor, then implementing it as a "feature" that "makes sense" is definitely the way I'd go, especially if my app were to get a lot of attention. That way I won't have to hide it (much) or worry about it getting discovered because I could just "explain away" the critiques.
One thing that seems to be left out of most discussions around this is that "proof of sender" would likely be compromised.
For example with PGP/GPG, if some "magical" approach was added so messages could be intercepted and then decrypted and read by intelligence/law-enforcement/(etc), it seems feasible those same people may be able to spoof the sender's signature.
eg create falsely signed, encrypted messages that verify as being from the real sender. Extremely good for blackmail/framing/similar. :(
It would depend upon the capabilities of the "magical" implementation approach of course, but it fits the scenario. PGP/GPG is regarded as pretty strong, but SSL/TLS certs already aren't, so they seem like they'd be much more prone to this.
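To illustrate the framing risk, here is a hypothetical sketch (Python with PyNaCl), assuming the worst case where the "magical" access includes an escrowed copy of the sender's signing key rather than just the encryption key: whoever holds that copy can mint messages that verify as the real sender.

    # Hypothetical worst case: an interceptor holds an escrowed copy of the
    # sender's Ed25519 signing key (PyNaCl), so forged messages verify cleanly.
    from nacl.signing import SigningKey

    alice = SigningKey.generate()                 # the real sender's key pair
    escrowed_copy = SigningKey(alice.encode())    # copy held by the interceptor

    forged = escrowed_copy.sign(b"meet me at midnight")
    # Verifies against Alice's public verify key: indistinguishable from a
    # genuine message, which is exactly the blackmail/framing concern above.
    assert alice.verify_key.verify(forged) == b"meet me at midnight"

If the access were limited to wrapping the encryption key only, signatures could remain intact; the point is just that "it depends on the implementation" cuts both ways.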
The 9/11 attackers discussed their plans through email. Good thing they didn't use encryption, or it would have been a tragedy.
Wikipedia says the United States Capitol/The White House was called "The Faculty of Law". The Pentagon was dubbed "The Faculty of Fine Arts". Atta codenamed the World Trade Center "The Faculty of Town Planning". I remember reading they had also used terms such as "birthday cake" and "candles".
I don't know if ASIO (and the US agencies pushing this agenda) are lazy or if they have some different agenda. Clearly this isn't a make-or-break issue in policing.
"I think we've got to take a common position [among the five eyes] on the extent of the legally imposed obligations on the device-makers and the social media companies to co-operate," Senator Brandis said.
He's got to realise that any such agreement will inevitably end up being the lowest common denominator of what each of the nations think they can reasonably get away with legislating, which in this case probably means that US (with the strongest device-maker and social-network lobby) will drive what is possible.
This is a brilliant comment except for the fact these weapons you presumably own will never, ever be able to overthrow the US government; a few hackers employed by Russia already did that anyway.
Secondly, if what you say is true I assume, given the refusal to follow large parts of the rest of the constitution, you are planning to overthrow them as we speak?
Finally, I'm concerned that your dream of overthrowing a bad/illegal government, while utterly hopeless, is causing tens of thousands of deaths and about 100,000 people being shot per year. But hey, that imaginary overthrowing-the-government thing is worth it...
yeah, we're in agreement. there will be no overthrow, instead the government (and all governments) will simply take away more and more of your rights, until they're all gone. both the ones you like, and the ones you don't like. doesn't matter; they're all going out the window, slowly but surely.
next on the list: encryption and data privacy. and there are plenty of useful idiots out there who think encryption somehow harms children, they will come out of the woodwork at exactly the right time, and scream the loudest and make the least sense, you can be sure of that.
While the parent made a partially correct comparison, this whole thread is turning into a giant contest on highlighting where it doesn't work and then trying logic fallacies of all sorts. What's wrong with you, people?
Surely, not every argument. I don't think arguments like "legally owned guns are frequently stolen and used by criminals" works for strong cryptography. However, things like "more ____ control laws would reduce deaths", "high-____ should be banned because they too often ____", "____ are rarely used in self-defense" or "a majority of adults, including ____, support common sense ____ control" can be tried. I just took those from the first search result for "gun control", picking few that I was able to adapt without rephrasing too much.
Heck, I can see even how "more ____ control leads to fewer suicides" can be pulled. In glorious Russia we already have this train of thought running at full-speed, just with "Internet censorship" instead of "strong cryptography".
So while parent comment is obviously biased, it has some valid point. Not sure if its validity means anything useful, though.
____
Note, I explicitly don't evaluate any claims validity here, and must remind that even if "A does X" is true, and even though "A is similar to B in some respects", that doesn't mean that "B does X" is true.
This violates the HN guideline against name-calling in arguments: https://news.ycombinator.com/newsguidelines.html. Please don't do that, regardless of how wrong someone else is. Besides frequently being uncivil, it leads to massively poorer comment threads, as demonstrated below.
If you encrypt something, isn't it because you're trying to hide child molestation and sexual exploitation footage?
"Think of the children" is one of the most powerful cards there is, and it will be played in full force by governments, only less elegantly than you just did.
I think the best way to counter this argument to people who don't know what they're talking about is to tell them that encryption is what keeps hackers from stealing their bank details, etc.
Insulting other nations as uncivilized is one of the worst things you can do in an internet flamewar, where by 'worse' I mean 'producing more of the same, only more so'. We ban accounts that comment like this, so please don't do it again.
More generally, please don't post unsubstantive comments. The combination of uncivil+informationless is particularly toxic.
Upvoters: if you upvote this kind of thing, eventually your upvotes will not count.
You and your fellow citizens are incapable of organizing against your government in the event that it becomes dangerous to its constituents. Congratulations!
I am glad to live in a country whose founders had the foresight that allowing a government to leverage weaponry / violence against its citizens without giving its citizens the same leverage was just a bad, bad idea.
That is the real reason for the 2nd amendment, not so I can defend myself against home intruders. That is a secondary argument.
Except that this time the difference between a regular army and a militia would be far beyond just the color of their coat.
Having weapons at home can barely counter law enforcement, let alone a professional military force.
This time there's more technology supporting each area of warfare: logistics, reconnaissance, equipment, vehicles...
A militia, however organized, has everything to lose against a regular army in the 21st century. In fact, chances are you would die before you are in range to fire your weapon.
Your only chance to win would be insurgency or guerrilla warfare.
You are absolutely correct and it's a horrible state of affairs.
There's still no need to starve the beast and remove one of the few weapons we have against the slaughtering of our rights and freedoms.
Just being able to purchase guns of reasonable firepower without scrutiny is extremely important to the freedom of our posterity and their ability to organize.
I know this almost makes me sound like some sort of radical terrorist, but these ideals are at the core of our Constitution and to not value these ideals is to be devoid of true patriotism.
He did. He said the children in Newtown's kindergarten and Columbine School were not killed with encryption. It is a pretty big difference. For a non-American this is a pretty good argument, IMO.
What nonsense. It's obvious what the parent comment meant. Law abiding gun owners by definition do not kill innocent children, just like law abiding people do not use encryption nefariously. Encryption and guns are both tools, with nothing intrinsically good or evil about them.
Yet law abiding gun owners were punished and had their rights stripped throughout most of the world on the pretense that it would somehow help stop crime, despite the quite glaringly obvious fact that criminals by definition do not care about following laws. Thus, only law abiding people are hindered. Criminals will still use encryption and guns, while law abiding citizens will be rendered unable to defend themselves, either electronically or physically.
I find it amazing that I need to spell this out for you.
Further to this, the reality is politicians who want to undermine information security don't give a damn about protecting their citizens/subjects. That's just the excuse to make the poison pill more palatable. What this is about, and what gun control is about, is right there in the name: control. It's about having a monopoly on security and force, so the populace will be less independent and easier to manage. At the very least, if you make enough things illegal you can jail whomever you like, because at some point it will become impossible to avoid breaking the law. The US is already there.
We've already seen government agencies in the US targeting specific political groups. I don't follow politics in other countries much but it would hardly surprise me if that happens elsewhere. It certainly has throughout history. How far will that go this time? No matter what side of the aisle you're on, it should frighten you, because the tool you use on your enemies when you're in power--and any tool you create to beat them up more effectively--is then available for your enemy to use on you when you inevitably lose power.
I find it shocking how many people who clamor for electronic freedom and an uninterfered-with internet are perfectly happy to cede their rights and independence in other areas to the government without so much as a grumble. Learn from history, FFS.
>I find it shocking how many people who clamor for electronic freedom and an uninterfered-with internet are perfectly happy to cede their rights and independence in other areas to the government without so much as a grumble...
This is absolutely the most shocking thing I have come to notice recently, which just shows that these people have reasons spoon-fed to them from somewhere (knowingly or unknowingly) and have not really thought it through.
What I feel is that while modern society believes it is not susceptible to the evils, oppressions and exploitations that were prevalent 100 years ago, when most people were poorly educated, this seems to be just a fallacy. Educated people can be manipulated just as easily by feeding them some kind of "reason" and selective "statistics" from certain "reputable" sources or authority.
Uh huh. Have you studied the genocides of the 20th century? How does a disarmed and helpless populace look in those circumstances?
And since you think the government should be trusted to mete out all force, I'm certain you'll have no trouble handing the government all your passwords and keys, right? Maybe mail them a copy of your car keys and house keys too, since they're so trustworthy?
You trust them to protect your physical safety, to protect your life, if you're willing to hand over your right to self protection. Surely you then trust them with your data and property?
> And since you think the government should be trusted to mete out all force, I'm certain you'll have no trouble handing the government all your passwords and keys, right? Maybe mail them a copy of your car keys and house keys too, since they're so trustworthy?
> You trust them to protect your physical safety, to protect your life, if you're willing to hand over your right to self protection. Surely you then trust them with your data and property?
I don't think that's the same thing at all. I trust my parents with my life, but I wouldn't want them to read my online conversations. They don't have a key to my house, nor the passwords to my computer and various accounts, and I'd like to keep it that way. I see no contradiction here.
> Have you studied the genocides of the 20th century?
Have you?
Before Hitler came to power, the Weimar government was destabilized by - among other things - a latent civil war between armed paramilitary extreme-left and extreme-right groups. Hitler himself relied on such groups (from the right) to stage his first attempt to come to power via a coup. Even after he got elected, the groups were a cornerstone of his power.
The only purpose of a gun - a handgun in particular - is to wound and kill people, something civilians have no business doing. That's not the case with encryption.
I've used a handgun to signal for help, to find lost people in the woods in Alaska, to hunt, to scare away bears, and for fun. And on a few occasions, I've been very glad to have a handgun with me stateside, because I came quite close to needing to use it to protect myself. Fortunately I was able to negotiate those situations without using force.
Are you a pacifist, incidentally? Can you really think of no valid reason why a citizen might need to wound or kill someone?
Guns used to be tools, but they aren't in much of the US. That doesn't mean they can't be, but they aren't used as tools, so they've lost a lot of their value.
Would you please not do flamewars on HN, regardless of how wrong or irritating some other comments are? We're really trying to avoid the downward spiral here.
The analogy is flawed, and the final sentence is condescending.
All things in your list have an inherent physical danger. Neither encryption nor cryptography are themselves dangerous, they're a means of encoding some other thing which may or may not be dangerous into something whose danger cannot be determined.
Imperfect analogy, but crypto is more like a safe than anything you list. An ensuing argument is, we need to regulate (crypto) safes not because they are themselves dangerous, but because their contents might be.
The means of encryption-decryption, code, is considered free speech (Bernstein v. U.S., 9th Circuit, and Junger v. Daley, 6th Circuit). Thus far crypto is itself protected, but the usage of crypto is an open question.
With a real safe, police with probable cause can get a warrant, and forcibly gain access to the safe and see the contents. This isn't possible with crypto if the key owner refuses to cooperate, and why there's the ensuing problem of sanctioning the person for contempt of court when ordering access and they don't cooperate.
The dispute has nothing to do with analogies though; it has to do with a power transfer. Crypto permits some transfer of power from the sovereign to the individual. And that has altered the social contract. It's done. And now, after the fact, we're trying to sort out the consequences of that power transfer, and whether or not the sovereign gets to rein it back in, and how. And that's unanswered.
> If we could restrict the use of encryption by the bad guys without compromising it's use by the good guys
oh, don't you worry, the politicians will do it anyway, with much public support, and then you will sound like the crazy, paranoid one arguing for military grade assault encryption in the hands of the dangerous, villainous public.
edit: i see you deleted the part i quoted. makes sense. cognitive dissonance is a hell of a thing. feels like razor blades in the mind!
This breaks the HN guidelines. Accounts that are uncivil and flame others eventually get banned, so would you please (re)-read the following, and abide by them? That means posting civilly and substantively, or not at all.
Another thing we're trying to avoid is the generic ideological tangent: that's when a topic with something specific in it (Australian government current encryption plans) gets diluted into topics like "pacifism" and "genocide", about which no HN thread has anything new to say, but plenty of people will get agitated and sucked into battle. This is the reason why the guidelines say: "Please avoid introducing classic flamewar topics unless you have something genuinely new to say about them."
I doubt encryption does anything all by itself. And apparently, encryption-using terrorists--who would be violating the law if such laws were in place--do kill children. Manchester, you know, it just happened. Nice. Berlin. Stockholm. Pretty soon, it'll just be a "name the city" game where the only question is how bad the most recent terrorist attack they suffered was compared to the rest.
And just as gun control laws do precisely dick to stop criminals from committing their crimes--because laws aren't magic, and they don't bend reality to make guns disappear as soon as the law is signed--encryption laws will be just as efficacious at stopping terrorists from using encryption. Like some wannabe martyr gives a shit about a fine he'll never pay and time he'll probably never serve in some resort prison?
We've banned this account for repeatedly violating the HN guidelines. Please don't create accounts to do this with.
If you want to comment here, you need to be a good community member. That means posting civilly and substantively, or not at all—and certainly avoiding ideologically inflammatory, nationalistically nasty comments.
The problem is the fallacy that owning a gun puts you on equal terms with the government. That may have been the case circa 1776, but it is no longer the case in 2017, 241 years later.
The government operates regular, professionally trained armed forces with modern, military-grade equipment. Your weapons can barely counter law enforcement, let alone an army.
Good luck solving a dispute with the government by firing a semi-automatic gun at an Apache helicopter firing back at you with 6,000 rounds per minute.
>"freedom-loving internet users will soon know how gun owners feel. every single bogus argument for gun control or "assault weapons" bans can, and will be, used against internet speech freedom and information privacy, now that the western political establishment has taken it upon themselves to strip you of your free speech rights after seeing how trivial it was stripping you of your guns, and how much enthusiasm their populaces had for it."
Wow what a total straw man. Firstly this is an article about Australia not the U.S. But I will get into why Australia is particularly relevant in your poor choice of straw man in a second.
Firstly, "Gun control" is not about wholesale elimination of gun ownerships but rather rather having some policy that permits its' ownership while still accounting for public safety. Your sentiment is typical FUD propaganda employed by lobbying groups like the NRA in the US - "they're coming to take your guns!" By the way the NRA is among the most powerful lobbying groups in the US[1], there is no such encryption/privacy lobby.
The central talking points in "gun control" are enforcing some common-sense provisions like background checks to prevent mentally unstable people from owning guns, or restricting civilian ownership of military-style assault rifles that don't have a compelling civilian use case in either hunting or self defense.
Australia has implemented both of these common-sense provisions - forbidding semi-automatic assault rifles and mandating background checks and waiting periods for gun purchases. And guess what? Australia doesn't have the mass shootings that are so common they start to overlap each other's news cycles. There is no epidemic city like Chicago, where 762 people in a single year were killed by guns[2]. And plenty of Australians own guns. Australia implemented these after the Port Arthur Massacre in 1996. And not only is there no shortage of Australians who own guns, gun ownership is on the rise[3]. You know what isn't on the rise, though? Mass shootings of innocent civilians; these went from 11 per year to zero after the changes in legislation proposed in the wake of the Port Arthur Massacre[3].
I am a strong proponent of E2E encryption and the right for people to be able to communicate privately, however I think Brandis is saying generally positive things. If Australia thinks someone is a criminal, and there is an agreed process to obtain a warrant (hopefully from a judge), I think that's fine. The NSA mass-surveilling Americans is entirely different, as are other similar tactics to spy on presumably innocent people. Warrants are good, especially with people actively making calls.
Sad to hear. Do you think he's actually going to hand-approve all the warrants? Hopefully it's still better than what the USA had with all the warrantless phone data aggregation.