That's quite the quote, especially given his history of employment.
The weirdest thing about this whole cell-phone saga, to me, is that the perps are dead, do not appear to have been part of any organized group, and that very little could be done to them that hasn't been done already based on evidence found on the phone.
Then there is the bit that a lot of the information on the phone is also already in the carriers' log files. It's as if that phone is somehow magically going to yield an entirely new class of information that may not even exist in the first place.
To me it has been evident from day one that this is not about this phone or the data on it, but about the legal precedent. Getting it in black and white from the former head of counterterrorism is quite an indictment of his successors.
A reasonable line of inquiry here is to verify whether they were part of some organized group. "Appear not to be" is not quite enough.
This is a valid reason to investigate even if they are dead. How did they get radicalized, etc.
Not that it is a good enough reason to force the breaking of encryption in the way proposed, but in a murder inquiry, the privacy of the perps has to give way.
For a country that kills based on metadata, it seems quite far-fetched that they couldn't map out potentially interesting connections using just metadata.
Also, it would be pretty dumb to put any revealing data on a work phone with iCloud backups enabled (which is just not accessible due to the FBI's mistakes).
At any rate, this discussion is quite beside the point. Permitting this phone to be unlocked (or the hundred or so other phones mentioned) will open the floodgates for questionable regimes and a hunt for Apple's private key (more automation to handle requests will reduce security).
If the feds get their way the entire idea of encryption is weakened, and not just for the lawful citizens. I suspect the reason for having this battle so publicly is propaganda based, just as it was with Zimmermann in the 90's.
The legality of the NSL (which does not require oversight; the FBI can self-sign them) hinges on the idea that the persons on whom the information is being gathered have or had "no reasonable expectation" of privacy of the information.
That said, if these back doors were to be instituted, could the argument then be made that we now have "no reasonable expectation" of privacy of anything on our cell phones? Could this set a precedent of unholy proportions?
I think it's reasonable for the FBI to want to access it, but let's not have any illusions: the chance that there is any evidence on the phone is basically 0%.
It has to be, because there is no perfect. Everything is a numbers game between ever-more unlikely hypothetical evidence that might yield a valuable unknown, and the cost of collecting it.
By such completist thinking, we should examine every image and video in the world because they might show up in the background of a birthday party video or vacation selfie, giving us another clue. Similarly, we should interview everyone in cities they've been known to be in...
But in reality, unlike in a game or a movie, there is no 100% complete. The phone isn't one of a finite list of clues to examine whereby you will know everything you need to know. By fixating on the phone, or any few issues, you miss the larger point.
There's very little likelihood of learning anything of value here, and there are far richer leads elsewhere.
This is just our security industry grandstanding for more money.
That doesn't mean that I agree that Apple should be forced to break the encryption. It means the police really have a duty to investigate.
The FBI have tried, and failed. How much more should they dredge before we stop harping on about this one piece of potential evidence?
In this case they have the phone physically in hand so they feel blocked and powerless, but from a cost-benefit point of view it might as well be at the bottom of the ocean.
Doesn't their duty compel them to investigate the leads most likely to pay off, not play political games over high-profile issues?
And when you consider the chance of there being worthwhile, actionable, data months later after their failures, finding the pelican is the safe part of the bet.
Which is to say they absolutely have vulnerabilities that work against certain versions of iOS, but probably not others. And the NSA likely has more than a few zero days, simply because it's a far cheaper way for them to do their job than any alternatives.
Just like with any piece of software, there is no "secret vulnerability that works all the time forever": it's discover, exploit, patch, repeat.
(Which, incidentally, is also the argument against key escrow schemes. Whereas it would be discover, exploit all devices that implement the key escrow code, wait until the government develops a new patch, wait until all device manufacturers incorporate the patch... repeat.)
Also: parallel construction (i.e.: the NSA doesn't want to reveal it has already done this)
It feels to me (an outsider) that it's the government that is out of control and is not accountable to the people.
A free country's first commitment is to personal liberty, and only secondly to democracy.
There's more than one issue concurrently, and many people are actually dumb and wrong.
Let me quote this piece of an anarchist pamphlet: "Governments are instituted among Men, deriving their just powers from the consent of the governed" - this means, amongst other things, that the governed do have the option to say "this is not okay anymore, stop."
What you seem to be saying is "you have a right to choose, once, and then just shut up and deal with it, the govt can do anything it wishes in the meantime." You're describing a dictatorship - "the State is always Right, by definition".
There's a good point. And who knows what limits they'd go to, to force Apple's hand? (Kidnapping, etc.)
Apple is really only safe if they actually can't break their own encryption. Anything less makes them a target.
The government is empowered to take action on behalf of the public. If it does something that people like, then great--keep doing it. If it does something that the public doesn't like, then there are multiple ways for the public to change how the government works.
What's happening right now is actually how it is supposed to work. Innovation results in new technology, the government tries what it thinks is best, and then there's a huge national public conversation about it.
This is the 2nd time we've had this particular conversation; the first was in the 1990s. And we'll keep on having it, basically forever. That's how government by the people works.
From among two choices, already bought by special interests, and already so habituated to our intelligence and law enforcement communities that they have zero scepticism about these kinds of travesties.
The founders and many other people have worked hard to create a system that protects the people from government action. For this reason they implemented and evolved a legal system.
We do not. Simple as that.
"What you can do about it:
-- You can contact the Obama White House online to comment on strong encryption.
-- You can contact your state Senators and Representatives via the contact information supplied by ContactingTheCongress.org.
-- You can specifically contact Senators Richard Burr (R-NC) and Dianne Feinstein (D-CA) to express concerns about their bill intended to force companies to weaken or work around encryption under court orders.
Express yourself with the honesty and clarity that the government's charm offensive is lacking."
* Tracking Congressional bills, Congressional representatives and voting history:
* General resource on local, state and federal elections:
* Data on lobbying and political contributions:
* Voting history, policy positions and public statement transcripts of politicians and political organizations:
I'm looking at you, Sweden.
Not that I disagree with this, but how is this supposed to help entrepreneurs? If anything, it will just increase the tax burden of running a business.
If you spend tax money on education, you get a positive effect for both startups and poor people.
You also even out the divide between new and established companies, since established companies have non-innovative advantages such as brand names and perhaps their own infrastructure. So you also get rid of companies that try to preserve the status quo and stall innovation.
The sad thing is that currently some governments, parties, and politicians create inhumane, even counterproductive competition between humans, while at the same time letting companies be cartels and monopolies and, more importantly, creating counterproductive anti-competition laws such as intellectual property, truly awful patent systems, brand protection, etc.
It's quite destructive with regard to innovation and progress.
I find it somewhat irritating that this needs to be stated, but helping the economically disadvantaged doesn't mean just raising taxes and giving everyone a welfare check. The narrative has been so well framed.
It could (and should IMO) mean creating a more just economic system while providing a real safety net. Ultimately, this would involve people being educated and able to obtain jobs that pay a decent wage.
There's some tax money involved in that, but I don't think we necessarily need to _raise_ taxes. Perhaps we just need to reallocate a bit from the war chest.
>It could (and should IMO) mean creating a more just economic system
What exactly does this mean? Don't just give me platitudes about better education.
It's so disappointing to me to hear a quote like that from the President.
Always remember that the FBI was against the civil rights movement and MLK. And consider that one of the leading presidential candidates is getting into the habit of inciting violence: http://talkingpointsmemo.com/edblog/the-rage-and-the-derp--2
I assume many of them are voting for Trump.
Hitler and his Fascist ally, Mussolini, were both far-right control freaks. Josef Stalin and his predecessors were far-left control freaks.
People in power like control, contrary to what they might say. Power corrupts, absolute power corrupts absolutely. People want to be elected because of their egos. Full stop. There is no other reason. No one does it because "they love America". They want to latch onto the gravy train and collect a check all the while dictating terms and or furthering an agenda. It's patently obvious to all.
It is no coincidence that Hitler's political party (NSDAP) was called the National Socialists. Hitler himself was a fan of Karl Marx.
It is worth remembering that effectively ALL totalitarian systems of 20th century were leftist.
Criminals have always tried to hide their activities and keep secrets. Sure with strong encryption maybe some information is hard to search, but that was always the case. (Did they ever find Jimmy Hoffa's body?)
I mostly agree with the idea that we have a lot more to lose by inserting backdoors than we would ever gain.
This is simply Godwin's Law for the modern era.
It seems likely that by far the most prolific creators of child pornography today are... children. What today's parent and grandparent generations used to do in relative privacy behind the metaphorical bike shed, surely today's teenagers are doing with their phones and laptops, perhaps unaware of how insecure and non-private their communications might be.
There are all sorts of difficult ethical questions about how that kind of behaviour should in itself be treated in law, but let's ignore those for now. In any case, if kids are carrying around that kind of material or sending it to each other, surely the best way to protect them is to make sure it stays as private as it should be? That means making everything as secure and safely encrypted as possible. Arguing for weaknesses in encryption or creating backdoor vulnerabilities has exactly the opposite effect.
No, of course we shouldn't. There really are bad people in the world, and some of them really do do very evil things, and there really are good reasons to have police and security services, and they really do need reasonable powers to enable them to do their jobs and deal with the bad people.
But what we should do is keep these things in perspective and ideally set public policy based on evidence and rational arguments. Creating security loopholes that might help the authorities to find and stop a small number of genuinely evil people but might also create millions of new vulnerabilities to be targeted by other evil people (or even the same ones) isn't necessarily a good trade-off.
What's the evidence that we need [abusive bill of the day] to fix that?
Why do we think the potential harm averted is greater than the harm we know comes from overreaching laws and abusive governments?
Human trafficking at all, sex or otherwise, is horrible, common, and relatively easily stopped without new laws - at least for the victims of the moment - so go spend time and effort helping rather than beating the panic drum as you are now.
If certain tales of the dark net are to be believed, it has enabled an unprecedented level of communication between abusers of children. Seems like an obvious "con" of widely-available secure communication.
If you believe the "pros" outweigh it, say so, but don't pretend it isn't relevant.
In addition, the "child pornography" that people want to use as justification is REALLY RARE. The biggest kiddie porn bust in the history of the US was Operation Delego which had 600 members worldwide. Most of whom were busted with good, old fashioned police work.
Now, let's talk about the common "child pornography"--Suzie has naked selfies on her phone. Do you really think that Johnny won't go to a dodgy Chinese site to get the iPhone crack to let him see those?
Also, there are so many more effective ways to catch molestation that we aren't pursuing.
It's not that molestation and exploitation isn't bad, but when you see someone ranting about it you can be pretty sure they're just doing it to push a political goal. It's an obvious call for censorship and more agency funding.
If the list of relevant things could only be five-thousand items long, this wouldn't even be on it.
And then we get to the cons, many of which are potentially Orwellian.
So child porn can be perfectly illegal without having an explicit law saying "Child Porn is illegal". Its creation violates so many fundamentals and its distribution violates a subset of those.
But the reason for the illegality is as important as the legality in the first place. Some of the most horrible nations commit acts we deem moral in consequence but often do them for still vile reasons (i.e., you don't execute the mafia boss for rampant murder and robbery, but because he's threatening your own power).
In other words, introducing "guilty until proven otherwise" introduces witch-trials pretty much by definition: if she floats, she's a witch (so far, so good; yay it works); if she drowns (or dies in notprisonnosirnotatall after years of not confessing nonexistent secrets), she was innocent. Of course, there is absolutely no way this might be abused, and certainly not for personal vengeance.
Welcome to Salem, MA.
That is the next frontier. A backdoor into your bank account. There is already precedent inside and outside the US.
Yes, there are bad people doing bad things. But we are enabling much worse people to do much worse things in trying to stop the bad people (also, if the low-hanging fruit of people who merely possess digital data can no longer be used to score political points and give the appearance of helping, it could force law enforcement to dedicate more resources to the producers, a win both for freedom and for the innocents needing protection).
If after 8 years you still think Obama is the Nobel peace prize wielding progressive philanthropist you've been fooled into thinking he is, joke's on you tbh.
I point out said comment is perfectly in line with the policies implemented by Obama and his administration during his two terms and OP's incredulity is unwarranted.
Not sure how that's OT or inappropriate but furry muff, HN works in mysterious ways.
Such as the fact that he voted for retroactive telecom immunity as a senator, supported increases in NSA surveillance as a President, and certainly signed off on the extraordinary measures taken to try and apprehend Snowden.
His stance on surveillance is clear. He has strongly supported it since 2008. (Before he was the Democratic nominee he spoke out against it.)
You're going to find, once you think it through, that you'll end up limiting the rights and freedoms of individuals... there's no getting around that.
Life sucks that way. People were never their own little countries. I just feel sorry for the middle-class suckers that thought they had freedom. Poor guys - they never figured out they were the tools of the wealthy and powerful telling them they had freedom, in order to make them work harder against the lower-class.
Suppose they meet in person to conspire. Would you suggest having every citizen carry unblockable microphones just in case?
This is not a "my freedom to swing my fist ends where your nose begins" kind of issue; this is an "are we willing to build a police state if it reduces the risk of certain crimes by X%, even at the cost of creating entry points into all of our communications that can be exploited by unauthorized and misauthorized actors alike?" issue. The ability to retrieve past conversations about conspiracies is the new capability here, not the ability to keep past private conversations secret (that was never as hard in pre-internet, pre-electronic times).
The interviewer in the article says:
"GREENE: But can you just explain why you would compare, you know, a company helping the government design a way to unlock an iPhone to something extreme as torture and ankle bracelets? I mean, that sounds like a very extreme jump."
But actually, an ankle bracelet that reports your location and audio might be less invasive of your private conversations in today's world than reporting the contents of your phone.
Not at all. I have the choice to carry / use a phone.
We have no idea what the political and technical landscapes will look like in 5, 15, or 25 years. It is not inconceivable that your governmental identification "card" will morph from the plastic of today into an embedded device in the phones of tomorrow.
People TALK about killing far more than they ever seriously consider it or have the personality to actually do it. Bad humour, frustration at work or with a partner, or even how to make a bomb or commit a terrorist act ("oh, al-Qaida are stupid, because it would make more sense to do x, y, then z") can all be topics of conversation among law-abiding people after a few beers.
So are we happy to move the bar to innocent until silly enough to have discussed robbing a bank? Innocent until found to have mentioned the current taboo issue? Plenty of people have found out the hard way that the TSA, Twitter and airports aren't a great mix for jokes.
I say again, there's good reasons the system sought the evidence first, not retrospective police fishing expeditions.
"It is better that ten guilty persons escape than that one innocent suffer". Sir William Blackstone 1765.
I see nothing in the new technologies to fault or change those principles. The only thing that has changed is scale.
Everything the Obama administration has proposed includes all of the above.
For some reason, the libertarians like to ignore the fact that this would all be done under a court order.
Why should we believe them?
What's wrong with conspiring to commit crimes? Also, I would imagine that the US First Amendment explicitly protects such speech (if I understand Brandenburg v. Ohio correctly), as long as it doesn't result in "imminent lawless action". So it's OK to conspire to commit crimes, as long as you don't actually commit them.
Edit: Reading Wikipedia, it looks like mere conspiracy is not enough for a conviction; an "act in furtherance of committing the crime" is necessary.
Yes, that's exactly what you do. I would much rather we let a few (more) crimes get through here and there. Weakening privacy for everyone isn't worth it.
That's just how society is.
Your choice on whether you want to live with the rest of society or not.
Just making conspiracy illegal doesn't mean you don't value privacy; it depends on what means are permissible to obtain evidence for the conspiracy.
The reality is that security also suffers when you weaken encryption. Do you want people to be able to track your children? Do you want criminals to know when you are home? Do you want stalkers reading the messages of their victims? Do you want rapists to know where you and your friends go camping?
And that's before considering that there is a multibillion-dollar industry on the internet that depends on encryption.
With that kind of reasoning, you'll favor research for medicine against rare 100% lethal diseases instead of 20%-lethal cancer...
No, because in free societies you shouldn't have absolute security.
Life sucks that way. People aren't their own little countries.
Is this where you rip on Libertarians again and talk about "your army".
Beyond that, there are meta-studies and books that ask what the studies have shown. To take an example, Freakonomics pointed out that of all the crime-reducing strategies examined, legalized abortion turned out to be one of the most effective. Unwanted children tend to be born into low social and economic status, which is a strong predictor of crime. None of the other strategies examined had any major effect on the crime rate.
Then we have surveillance. Government studies have shown that CCTV is effective at reducing vehicle crime, but no evidence was found that it prevents violent crime. One study concluded that better illumination could be a cheap way of cutting illegal activity to the same degree as CCTV.
Surveillance in stores and banks has shown a very unexpected result. It has proven useless at preventing robberies, and shoplifters are generally not bothered by it. However, studies have shown that employees account for 43% of revenue loss from shoplifting, and that is reduced by surveillance. Banks were also among the early adopters who noticed this.
Then we have studies such as last month's from Harvard, which said that the information available to police investigators is increasing at a very fast rate. "The trajectory of technological development points to a future abundant in unencrypted data," the study said. Compared with the days before encryption, today is a much easier time for a police investigator.
From the article:
CLARKE: No, the point I'm trying to make is there are limits. And what this is is a case where the federal government, using a 1789 law, is trying to compel speech. And courts have ruled in the past, appropriately, that the government cannot compel speech. What the FBI and the Justice Department are trying to do is to make code writers at Apple - to make them write code that they do not want to write that will make their systems less
Of course the FBI used a terrorist attack to try and get what it's always wanted, and it will abuse the unlock power in the future if it gets it now, but judges could easily cite this case as a defense for the government to compel other action from the people.
Clarke makes it sound like there is court precedent against this compulsion, but that would be overturned if the FBI wins.
Indeed, encryption and privacy are very important, but our very liberty is more important.
"The US spends more than $500 million per victim on anti-terrorism efforts. However, cancer research spending is only $10,000 per victim."
That's basic risk analysis.
320,000 people dead due to traffic fatalities over 10 years.
320,000 people dead due to traffic fatalities over 10 years causing little political fallout.
Bluntly, to shift the risk argument we'd either need to take politicians out of the decision-making and give it to some independent fact-based body, or get the media and people in general far more riled up about car crashes.
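The disparity behind the risk argument above can be made concrete with a quick back-of-envelope calculation. The figures are the ones quoted in this thread, not independently verified:

```python
# Back-of-envelope risk-spending comparison using the thread's figures
# (assumed, not verified: $500M per terrorism victim, $10K research
# spending per cancer victim, 320,000 traffic deaths over 10 years).
anti_terror_per_victim = 500_000_000  # USD spent per terrorism victim
cancer_per_victim = 10_000            # USD of research spending per cancer victim
traffic_deaths_10y = 320_000          # traffic fatalities over a decade

ratio = anti_terror_per_victim / cancer_per_victim
print(f"Anti-terrorism spending per victim is {ratio:,.0f}x cancer research spending")
print(f"Traffic deaths per year: {traffic_deaths_10y / 10:,.0f}")
```

If the quoted numbers are even roughly right, the per-victim spending gap is four to five orders of magnitude, which is the whole point of the comparison.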
What was revealed a few years ago was that big tech companies betrayed people's trust. So quite naturally they should attempt to regain that trust. Because if the majority of people stop trusting tech companies and start using end-to-end encryption, use of encryption stops working as a signifier of a higher likelihood that the user is doing something wrong. Thus it's crucial to keep ordinary people away from encryption, and to achieve that, it's important to make people trust big tech companies again.
In my opinion, this is how whoever wrote the plot of the FBI-versus-Apple dispute is thinking.
"You don't need encryption"
It's not the bill of needs. I was born with these rights. This is the danger of eroding the constitution, the arguments can be used against whatever issue you want. If we want it changed, do it the right way and pass an amendment. But please, protect the integrity of the most important document we have.
Time and time again, their arguments are not particularly persuasive.
I don't doubt the existence of terrorists, but it seems that they are more boogeymen than actual threats.
And when it came down to it, the power of terrorists is to inspire fear, rather than kill people. They can change us because we felt the need to change.
For some reason, especially in the U.S., "leaders" have to appear strong. When there's no war going on, they have to start one. Terrorism is easy. There's really nobody to fight, but you get to fight them anyway. Politicians LOVE this.
Law enforcement loves it too. They get to trot it out as an excuse to lengthen their leashes.
China, Russia, and Saudi Arabia all forced BlackBerry to turn over their encryption keys long ago.
US politicians should set an example and say we are NOT going to be like China, Russia, and other repressive regimes; that when people's lives are literally on their phones, they have a reasonable right to privacy and protection from search and seizure, as in our Constitution, which is ignored every day.
The next logical step is to outlaw phones and devices that can't be broken into. Then they'll make it so you need a license to employ cryptography.
It sounds crazy, but where we are right now would have sounded just as crazy 15 years ago.
Welcome to the 90s: https://en.wikipedia.org/wiki/Export_of_cryptography_from_th... 
> It sounds crazy, but where we are right now would have sounded just as crazy 15 years ago.
Hardly. 15 years ago, the US were just barely past their encryption export ban, and we had yet to deal with the (still ongoing) fallout from it.
Although back then the US only tried to backdoor or ban strong encryption for the international market, not the domestic one; for simplicity's sake, the domestic versions of exported products often used "export-grade" (i.e., weak) encryption.
The world was so different then that the analogy wears thin. It was mostly client-server, the web was just taking off, and vast cloud server farms weren't even on the horizon. As you noted, the laws back then weren't about creating crypto; they were about exporting it. At least in the States, we saw a healthy market for crypto tech: the AES competition ran in the late 90s, and RSA became freely usable when its patent expired in 2000 (DES dates back further, to the 70s).
Note that I'm talking from the viewpoint of the average developer making applications. The business side, the international side, and the exporting mess? Yes, it's very similar. My comment was about changes Joe Dev is seeing now. The 90s was "write it, but only sell it locally", the 2020s are likely to be "don't write it unless you have permission", which is a completely different can of worms.
Agreed that the development community as a whole is still recovering from the 90s. The damage we're doing right now will take as long or longer to recover from, if we ever do.
..but since we're referencing the 90's: If the feds succeed in gaining the IOS source and signing keys I would say it's more like Phiple Troenix 2.0.
I'm not from the US, so there may be a difference if you're a solely a domestic US developer, but from outside the US the distinction is pretty much entirely academic.
You already need one to export certain goods / to certain countries.
Ok so replace "ankle bracelets" with "GPS/cell triangulated device" and it's a ridiculous example because what, things that are already real aren't really "examples"?
Isn't it true that encryption legislation or policy is sort of irrelevant next to the very clear math that says encryption will always be ahead of decryption? Even in a (hopefully avoidable) dystopia where encryption is illegal, would that really stop technology companies from continuing to do what they've always done?
John Oliver has a great segment where he notes that the majority of cheap, available encryption applications aren't even US-based, and so it becomes nigh-impossible for our (or any) government to stop any pedestrian from encrypting.
And because of that, outlawing encryption is really outlawing math, which is ridiculous. Math is a universal API everyone has access to simply by existing. You can't outlaw math.
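The "math wins" intuition above comes down to an asymmetry of costs: adding a bit to the key roughly doubles the brute-force attacker's work while barely changing the defender's. A rough sketch, with an attacker speed that is an illustrative assumption:

```python
# Rough arithmetic behind "encryption stays ahead of decryption": the
# defender's cost grows linearly with key length, while a brute-force
# attacker's cost grows exponentially. The attacker speed below is an
# assumed, optimistic figure for illustration only.
keys_per_second = 10**12           # assume 1 trillion key trials per second
seconds_per_year = 60 * 60 * 24 * 365

for bits in (56, 128, 256):
    years = 2**bits / keys_per_second / seconds_per_year
    print(f"{bits}-bit key: ~{years:.2e} years to exhaust the keyspace")
```

At these assumed speeds a 56-bit key (DES-era) falls in under a day, while 128 bits is already far beyond the age of the universe, which is why legal restrictions rather than mathematics were the historical chokepoint.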
So, if Doctorow said that, he couldn't be further from the truth. The universe seems to do everything it can to make security difficult via physics itself. Throw in economics and biology (evolving malicious attackers) to top the argument off.
A priori there's only one correct plaintext, while there are limitless ciphertexts of any given plaintext (assuming arbitrary IV lengths and keys). You can't change that, and this is basically what makes encryption so much stronger than decryption.
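That asymmetry (one plaintext, many valid ciphertexts) can be demonstrated without any crypto library. This sketch uses a one-time pad (XOR with a fresh random key) purely because it fits in the standard library; the names are illustrative:

```python
import secrets

# One plaintext, many ciphertexts: each fresh random key produces a
# different ciphertext, yet each key recovers the single correct plaintext.
# A one-time pad is used here only because it needs no external library.
plaintext = b"attack at dawn"

def otp_encrypt(msg: bytes) -> tuple[bytes, bytes]:
    key = secrets.token_bytes(len(msg))           # fresh random key each call
    ct = bytes(m ^ k for m, k in zip(msg, key))   # XOR byte-by-byte
    return key, ct

def otp_decrypt(key: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, key))  # XOR is its own inverse

k1, c1 = otp_encrypt(plaintext)
k2, c2 = otp_encrypt(plaintext)
print(c1 != c2)                           # different ciphertexts (overwhelmingly likely)
print(otp_decrypt(k1, c1) == plaintext)   # True: each decrypts to the one plaintext
```

Every possible key maps the ciphertext to some plaintext, but only one key recovers the meaningful one, which is the "a priori only one correct plaintext" part of the argument.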
Add it all up to say that, outside a few products, your security mechanisms from the CPU to the crypto aren't secure. Physics and intrinsic complexity work together to ensure this. Systems that fight all of it have fewer features, are heavier, require more manual steps, get less battery life, and cost several times more. Economics takes over where physics leaves off.
"A priori there's only 1 correct plaintext, while there are limitless chipertexts of any given plain text (assuming arbitrary IV lengths and key)."
A priori there's electrical signals going through analog and digital circuitry that implements a form of it with malicious hardware, software, or networks connected to it. There's tons of ways to intercept or leak those secrets. These are not in the formal model of crypto. Once included, the picture changes considerably and leans my way.
The fact that our computers are too unreliable to be trusted with encryption does not mean that the universe does not favour encryption.
Unless you constantly keep inventing malicious hardware or hidden 'observers' in the paper-and-pencil scenario, there's no way you can say that decryption is easier than encryption.
re paper encryption
That was defeated regularly in the Cold War in a number of ways. Easy or not, the mathematical proof didn't translate directly into the real world, due to human issues and physical ones like interception or observation. The FBI's crypto unit has been defeating criminals' custom pencil-and-paper ciphers for a long time, too. So we can say the best, provable encryption makes the job more difficult if no observation of the act of encryption, the KEYMAT, or the decryption takes place. That's a lot more limited than mathematicians' pronouncements imply. ;)
"universe does not favor encryption"
Oh, I think it doesn't. For one, encryption has arisen only once in the universe, as far as we know. When it did, it screwed up more often than it worked. Then, even the best forms are defeated by the stuff above, thanks to other properties of the universe. The universe seems to favor plaintext to me. Its own codes are plain to observe, too; obfuscated at worst.
That was a nice dismissal, but computers are the whole point, right? We're talking about encryption that we're going to use on a computer, most likely. Then someone says we can trust the math. Then I have to point out we run electrical impulses representing machine instructions, not math. Then the conversation drifts to pencil and paper or arcane stuff.
At least you admitted we can't trust the math on a computer, because the machine doesn't faithfully implement it. Often not on pencil and paper either, or in speech, if under surveillance. So we can't trust the math alone. It's always math plus all kinds of circumstances and methods. Even then, we can only trust it with probability C, as in odds of Compromise.
One of the early sutras put it this way:
> "Discrimination is consciousness. Nondiscrimination is wisdom. Clinging to consciousness will bring disgrace but clinging to wisdom will bring purity. Disgrace leads to birth and death but purity leads to Nirvana."
Encryption gives the means by which we can enable privacy between ourselves, or what we think of as self. If we enable complete privacy from all others, we drop into a self-world. If we disable privacy, and join all the others disabling privacy, we drop into an isolated type of Nirvana, with the implication everything becomes quite boring. I have compared this in the past to the observed push and pull of public and private cloud business models.
One solution may come via virtual realities where we can arrive at consensus in a fair and measured way without centralized control. It is my belief that immutable data structures backed by encryption, such as a blockchain, are the path out of this mess.
Here's Alan Watts talking about this: https://www.youtube.com/watch?v=lBOcFwUzIIQ
How does everyone carrying phones not already make this the case?
Yes, whether to have these devices is technically a choice, but when the social cost of choosing not to have one is so high, the choice is made for you.
So, yes it's a "choice": I can have my privacy or I can have my job.
You do need a mobile phone these days because (in The Netherlands) the government requires SMS authentication for some services, but I don't feel the need to own and carry a smart phone with me all the time.
Real crypto needs to be more compartmented than that. A bank is not secure because of the massive door; it's safe because it would take a thief weeks to empty every safety deposit box.
It's also made even safer when the key is (more or less) thrown away for periods of time and nobody can get it. Even with manual over-ride. Literally somebody could be dying inside the safe and nobody could save them.
In properly implemented crypto nobody should hear you scream.
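The safety-deposit-box analogy above maps onto key compartmentation: give every box its own key, so opening one box tells the thief nothing about the others. A minimal sketch, assuming an HMAC-based derivation and illustrative names:

```python
import hashlib
import hmac

# Compartmented keys, per the bank analogy above: derive an independent
# key per "box" from a master secret, so leaking one box key opens only
# that box. A simple HMAC-SHA256 derivation stands in for a real KDF;
# all names here are illustrative.
def box_key(master_secret: bytes, box_id: str) -> bytes:
    return hmac.new(master_secret, box_id.encode(), hashlib.sha256).digest()

master = b"master secret (would be long and random in practice)"
k_box1 = box_key(master, "box-0001")
k_box2 = box_key(master, "box-0002")
print(k_box1 != k_box2)  # each box has its own 32-byte key
```

A thief who steals one box key still has to attack every other box separately; the master secret remains a single point of failure, which is why stronger designs keep per-box keys fully independent or split the master across parties.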
Weakening endpoint security is certainly not as bad as going after TLS (for example), but it's still a vital piece of our trust chain.
And the smartphone will grow in importance as an authentication factor, which makes it even more vital.
Just thinking it _should_ be much harder to compel individuals to do something like this than it is to compel a corporation.