The second major side effect will be essentially making writing software into a completely corporate thing. Either the official algorithm is a secret (Remember the Clipper Chip and/or Skipjack?), or writing implementations becomes work that only a very trusted few are allowed to do (see inevitability of snooping above). Currently it's very difficult for medium to small companies to do the paperwork for security clearances. It's not going to be any easier to get programmers certified or bonded or whatever to work on official algorithm implementations. This means slower introduction of any innovations, and any buggy implementations will be in use for a long time before getting fixed.
"Banning end-to-end encryption" implies banning/backdooring all serious encryption, reviving the long lines of failed attempts but in the context where serious encryption has wide, wide, wide adoption.
Earlier plans were quickly shown to be entirely impractical/counter-productive/corrosive. If you somehow enforced that, you can count on someone else besides the NSA getting the key. You would have to agree among all the world's states on who gets the key, or just create a "great US firewall" and discard all the infrastructure large companies have spent billions creating. You'd create a situation where "only outlaws have encryption", etc.
What other plans can we come up with? Maybe put software engineers in camps along with the refugees?
There's so much bad prose on the internet that it seems feasible to make this hard to distinguish from human-written text. Plus you could have multiple encodings for each value, allowing non-deterministic encoding but deterministic decoding.
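A toy sketch of that idea (my own illustration, not from the thread): give each bit several interchangeable word encodings, pick one at random when encoding, and map any of them back to the same bit when decoding. Encoding is non-deterministic; decoding is deterministic.

```python
import random

# Hypothetical codebook: each bit can be rendered as any of several words.
CODEBOOK = {0: ["big", "large", "huge"], 1: ["small", "tiny", "little"]}
REVERSE = {word: bit for bit, words in CODEBOOK.items() for word in words}

def encode(bits):
    # Non-deterministic: a random synonym is chosen for each bit.
    return " ".join(random.choice(CODEBOOK[b]) for b in bits)

def decode(text):
    # Deterministic: every synonym maps back to exactly one bit.
    return [REVERSE[w] for w in text.split()]

msg = [1, 0, 1, 1, 0]
assert decode(encode(msg)) == msg  # round-trips regardless of word choice
```

Real linguistic steganography would need far richer codebooks to look natural, but the encode-many/decode-one structure is the same.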
How do you prove that the giant integer which represents this text is a random result of my thoughts rather than encoded information?
In the end, it's impossible to police the use of strong encryption. Things like WASTE and many others saw this coming back in the "Clipper chip" and PGP trouble days. What would end up happening is people just start resorting to one-time pads (OTPs). A well-designed OTP is very resistant to attack.
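For reference, a one-time pad is just an XOR with a truly random key as long as the message, used once. This minimal sketch shows why it's so hard to attack: with the pad unknown, every plaintext of the same length is equally likely.

```python
import secrets

def otp_xor(data: bytes, pad: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts.
    assert len(pad) == len(data), "pad must be as long as the message"
    return bytes(a ^ b for a, b in zip(data, pad))

plaintext = b"meet at noon"
pad = secrets.token_bytes(len(plaintext))  # truly random, never reused
ciphertext = otp_xor(plaintext, pad)
assert otp_xor(ciphertext, pad) == plaintext
```

The catch, of course, is key distribution: both parties need the same random pad in advance, and reusing it breaks the scheme completely.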
This all reminds me of Neal Stephenson's seminal book, Cryptonomicon, one of the best IT fiction books ever written. If you have an interest in encryption, coding, data havens, currency, etc., this is the book for you. It's a veritable tome.
* Editing to say that despite Cryptonomicon being written in 1999, I still think it holds its own even now. Kind of a timeless classic as far as IT goes.
Not a single one of these was ever a stable democracy before becoming a dictatorship. Unless maybe you're referring to Vichy France, but that's more of an occupation.
Although, allegedly, Amazon is making Snow Crash into a miniseries, and again allegedly, Ron Howard is directing a Seveneves movie. Weird, that, because I didn't think Seveneves was "finished." All that just to set the stage for Elves, Dwarves, and Man, but no followup stories of Elves, Dwarves, and Man? Weird.
The US government can totally easily reverse that trend.
And it matters a lot.
People are allowed to freely associate in corporations and pursue their economic goals. Canada is smaller than California in both population and GDP, and most of your income comes from natural resources like copper, coal, oil, and gold.
Non-Americans can't stop visiting American websites. Hmm.
There's a lot packed into that statement. Without switching into an off-topic discussion, I'd suggest that it's a lot like GPL vs BSD licenses - there's more than one way to define "freedom", and one can focus on results vs options, on groups vs individuals, etc.
Only to people with an agenda. This garbage way of redefining words to suit your predetermined policy goals is ridiculous. My comment is severely downvoted, but nobody has supplied an example of a more free country.
All tech companies are headquartered in the US for a reason. Stable private property protection and individual freedom are most prominent in the US.
The collectivists lost the argument that had been ongoing since the 1920s when the USSR fell, and some folks want to have this argument again (even when they already lost).
How many times is Europe going to fine big tech for exercising their natural rights? We need at least one country with minimal rules in the world.
If people want to communicate using encryption, they will do so. Maybe not with today's algorithms and implementations, but they will continue on. OTPs, new algorithms, hiding stuff in plain sight, etc. I think steganography is too obvious and too easily discovered.
EDIT: I wholeheartedly disagree with restricting E2E and believe in freedom of speech as strongly as one can believe in any explicitly enumerated right, I just don't agree the text of the First Amendment covers this, and it's hard for me to imagine any of SCOTUS coming up with an argument supporting it.
But Citizens United doesn't rely on a broad interpretation of freedom of speech at all. It relies on a broad interpretation of freedom of association. Supporting various political views is "speech" by any possible understanding -- that's not broad at all. The main "innovation" of Citizens United is that when people form associations (like corporations) they retain certain rights, such as freedom of speech.
A better analogy would be if we already all agreed that individuals have the right to send encrypted messages, and we were discussing whether corporations retained the same rights. Then you'd have a point, bringing up Citizens United. But when discussing whether people have some underlying right at all, it doesn't seem relevant.
Of course, people will point to X law or precedent and say "see, that justifies it." They don't see that the law should be invalid by the very logic the state created for itself.
Would the Citizens United case apply here? I'm thinking about corporate personhood arguments.
The term chilling effect has been in use in the United States since as early as 1950. The United States Supreme Court first refers to the "chilling effect" in the context of the United States Constitution in Wieman v. Updegraff in 1952.
It would be like federally outlawing clothes to make sure no one in America has a gun. Also, and aside from the whole premise being unconstitutional, shallow, & generally kind of silly, enforcement would be just about impossible.
Different judges are looking at it in different ways. I say assume they can search it or hold you in contempt for the password if they intercept or find it. Then, you shift the goal post to them not being able to do that.
I think it's a bit presumptuous to go that far. You may believe that it is, but it's hardly an open and shut case, and I'm pretty sure the SC may have something to add to the discussion should it get that far.
Not sure I understand; of course it's fine to share your opinion. My opinion is that yours is far too simplistic and completely ignores the nuances that always appear in constitutional cases such as these.
>Also, the Supreme Court may very well weigh in and may very well be wrong on the law.
Sure... but I'd put more stock in their opinion vs your own.
I urge you to search for "sincerely held religious beliefs" as there are an abundance of educational documents and case law related to this subject and this format will probably not do it justice.
I am from the UK, but my understanding is that you can't be forced to incriminate yourself, which would likely be an issue here?
Trump certainly can't block Americans from selling cryptographic algorithms (or anything which implements those algorithms) if the consumer is also American.
But preventing the communication of those algorithms to foreign nationals is something that was done for several decades.
I think it's a terrible thing for him to try to do, but he would have some legal grounds to stand on, provided the restrictions are related to foreign nationals only.
Unfortunately, you can't put the cat back in the bag. It's no good putting export restrictions on encryption when the tightest encryption we have at present is already available all round the world.
Talking about judges as being "assigned" is both inaccurate and betrays an ignorance of the American system.
They ruled states have the freedom to design their own voting districts. State sovereignty has always been a thing in the US.
I would encourage you to read the historical documents of the country where many of these constitutional arguments were had: The Federalist Papers
I 100% agree freedom of speech protects encryption, plain as day; I'm justifiably skeptical that a court would see it that way.
It does increase the chances that your communications will get out to third parties besides the government, but not for the reasons given in that paragraph.
The way you would actually implement this is not by putting a "loophole" in or "watering down" the encryption between the parties. You'd do it by adding the government as another party. E.g., two-party end-to-end encrypted messaging channels become three-party end-to-end encrypted chats, and N-party channels become N+1-party channels.
To a third party, such as a stalker or abusive ex-spouse, who is trying to eavesdrop on your messages these N+1 party systems are as secure as the previous N party version.
The increased risk comes from the risk that the government won't be able to keep its copies of your messages secure after it securely receives them. Presumably they will go into a database somewhere, from which they will be made available to law enforcement and intelligence services. That means that there will be a bureaucracy around operating and accessing the database, and given the number of different jurisdictions involved and the likely frequency of access requests my guess is that this would need to be a large bureaucracy. A large bureaucracy dealing with a large amount of sensitive data is just asking for trouble.
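The N+1 structure is just multi-recipient key wrapping. A toy model (illustrative only; a real system would wrap the message key with each recipient's public key rather than XOR it with a shared secret): the sender encrypts once with a random message key, then wraps that key once per party, government included.

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def send(message: bytes, recipient_keys: dict) -> dict:
    # One random message key; one wrapped copy per recipient.
    msg_key = secrets.token_bytes(len(message))
    return {
        "ciphertext": xor(message, msg_key),
        "wrapped": {name: xor(msg_key, k) for name, k in recipient_keys.items()},
    }

def read(envelope: dict, name: str, key: bytes) -> bytes:
    msg_key = xor(envelope["wrapped"][name], key)
    return xor(envelope["ciphertext"], msg_key)

keys = {"alice": secrets.token_bytes(16),
        "bob": secrets.token_bytes(16),
        "gov": secrets.token_bytes(16)}  # the N+1th party
env = send(b"attack at dawn!!", keys)
assert read(env, "bob", keys["bob"]) == b"attack at dawn!!"
assert read(env, "gov", keys["gov"]) == b"attack at dawn!!"
```

To everyone outside the recipient list the math is unchanged; the new risk lives entirely in what happens to the government's copy afterward, which is the point of the paragraph above.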
Or how about simply not wanting the government, a third party, to be able to scrutinize your private conversations?
Adding the government as a passive 3rd party to an e2e encrypted messaging service is pretty hard to do technically
... and it wouldn't necessarily give the government access to the messages. To read the traffic the government would need to recognize that a targeted party is communicating and have the right infrastructure in the right parts of the internet to intercept the traffic.
1. What happens if the government starts recording after the session has started? Do you build the crypto so that the government can join at any moment, or only during session establishment?
2. All this intercept equipment would be expensive and unreliable. Networks may use encrypted tunnels between POPs, or traffic may change which ASes it routes through.
Using China as an example, the way this would probably work is that the company offering the messaging service would have all messages pass through its servers, read them in plaintext, and save them in a database (think Gmail). The government would be given API access to the database of all messages. Of course these datasets would then also be shared for marketing purposes, employees would read or steal the messages, and hackers/foreign governments would also get access.
Does "anti-circumvention clauses" ring a bell? Or illegal numbers?
> Also, how do you oversight that the government is not abusing the system to track political activists for example?
There's something called "separation of powers" for that. It means the executive power (the government or the cops) normally has to ask permission from the judicial power (the judges) for that kind of thing.
The implementation is more likely to be bungled because it now requires 3 parties (or 3+n vendors) to all avoid mistakes, instead of 2.
For example: the keys the government holds are compromised because it, or a vendor it used, had poor opsec.
The ebook is free (with suggested donation) on his website.
> For the purposes of this essay, I’ll call it ‘ambient privacy’—the understanding that there is value in having our everyday interactions with one another remain outside the reach of monitoring, and that the small details of our daily lives should pass by unremembered. What we do at home, work, church, school, or in our leisure time does not belong in a permanent record. Not every conversation needs to be a deposition.
The article is presented more in terms of JS/app trackers, but I believe the analogy still applies.
I also like the bathroom door analogy. Everyone knows what you are doing, but we still insist that there is a door.
There's nothing stopping the authorities from doing this now, and in fact, it's even easier now with the level of miniaturization we have now.
Obviously a business cannot survive by this logic. Rather, implement systems that permit end users to send whatever text they wish, potentially including obfuscated text.
An example of businesses that have done this for decades would be the amateur radio manufacturers. It is illegal to make a ham radio that operates outside of the ham bands. But... clip one diode, or hold down two buttons and power on the radio, presto, all frequencies unlocked, radio in "debug mode". Certainly similar logic could be implemented by clever people here.
"One has not only a legal but a moral responsibility to obey just laws. Conversely, one has a moral responsibility to disobey unjust laws." --Martin Luther King, Jr.
And everybody needs one of those, right? Yet another perfect excuse to spend a lot of money on CNC machines.
I've been tempted to get on a proxy, download the file, print out a few copies of the source in book form, and slip it in our local libraries.
"End-to-end encryption" isn't really some fancy new technology. It's just the combination of communication plus encryption. If you and I share a password, I encrypt some data, email you that data, and you decrypt it, that's end-to-end encryption.
So what would be banned? Would you not be allowed to email encrypted files to someone? That seems implausible. Would it be legal to email encrypted text files to your friends, but illegal to build software that automatically did both encryption plus email?
That seems pretty weird, making it legal to do a thing but making it illegal to build software that made that thing easier.
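The "shared password, encrypt, email, decrypt" scenario a few comments up can be sketched in a few lines. This is an illustration only: the keystream is derived from the password with PBKDF2 because the Python stdlib has no AES; a real tool would use a vetted authenticated cipher such as AES-GCM.

```python
import hashlib
import secrets

def seal(password: str, plaintext: bytes) -> bytes:
    # Derive a per-message keystream from the shared password and a fresh salt.
    salt = secrets.token_bytes(16)
    stream = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                                 100_000, dklen=len(plaintext))
    return salt + bytes(a ^ b for a, b in zip(plaintext, stream))

def unseal(password: str, blob: bytes) -> bytes:
    # Recompute the same keystream from the salt carried in the blob.
    salt, body = blob[:16], blob[16:]
    stream = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                                 100_000, dklen=len(body))
    return bytes(a ^ b for a, b in zip(body, stream))

blob = seal("hunter2", b"see you at 6")   # sender encrypts, then emails blob
assert unseal("hunter2", blob) == b"see you at 6"  # recipient decrypts
```

Nothing here touches a server that could be compelled to hand over plaintext, which is exactly why "banning" this is so hard to define: the only candidate for prohibition is the glue that automates it.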
They get blamed for sacrificing worker productivity in exchange for purported safety.
What if it stopped the next 9/11? What if it didn’t do anything at all?
How are you so certain of anything? I’m skeptical of everything. Read, verify, read counter arguments, formulate an opinion.
Maybe the intel agencies should publicize their accomplishments more if they want a better reputation. 10+ years after some terrorist plot is foiled should be enough time for it to be safe to let the public know.
It will be an executive order that requires companies with a communications product and over 500k users to implement lawful intercept protocols for communications between any two users.
Most likely companies will repurpose their GDPR tools to provide standardized exports to authorities with legal intercept requests. The regulation will be written in such a way that it precludes end to end encryption as an option, rather than forbidding it specifically. Failure to comply with lawful intercept requests will result in a fine with high interest.
I mean, not even just e2e, just the normal ubiquitous encryption we all use is a "gift" from relaxed export restrictions alongside non-enforcement. Some app stores still ask you if you obtained the export license or qualify for an exemption.
The US government could just start enforcing that, fund the agency to enforce it.
It doesn't matter what it enables for citizens; no organization operating in the US would offer it, because it would break their global business.
It doesn't actually work very well, since terrorists can hide, gangs can organize, drug cartels can flourish, and society is becoming more disorderly.
Weakening or breaking encryption privacy in order to continue a failed approach is not the answer. More physical security, more local law enforcement, and more local human intelligence to prevent terrorism and crime is a better alternative. It is better to be physically protected from bad actors than to have a technologically sophisticated process for apprehending perpetrators.
One basic implementation is that after some closed-door conversations, your specific instance of WhatsApp receives an update which either compromises your keys or performs some background E2E of your plaintext conversations (as stored on your device) with whichever government/law enforcement agency made the request.
Rather than adding the government as a third party to all communications at once, this is a nice easy first step for them to take.
What, if anything, is the government hoping to do to stop people from using GPG/PGP? I mean, I guess you could force companies in the US to not ship phones encrypted by default, but they could just sneak in a menu asking if you want encryption before the first power-on.
How does the cat go back in the bag after 2015? Snowden has basically put the word "encryption" in the mouth and ears of every American. We know the government spies on us. We know it uses this information to hold show trials of people it just doesn't like.
How do you crack down on math?
Free software can violate the law because it can be anonymous. Corporations have addresses, and employees, and owners. They're legal entities and can only break the law when the government allows it.
Our phones are built and sold by corporations, running software installed by corporations, and we download apps from corporate-controlled app stores. Do you think Google or Apple can just say "no, we're going to violate this law" and keep operating that way freely?
You can't stop a hacker from encrypting, because we can root our devices and install our own software, but encryption is only used by more than a couple percent of the population because it's easy. If you can't encrypt your communications without rooting your device, 99% of the population won't encrypt.
> How do you crack down on math?
Funny story: back in the 90s, whenever you downloaded a web browser, you had to promise that you lived in the US if you wanted it to support the state-of-the-art encryption, because encryption--math--was legally considered a weapon so you couldn't "export" it.
You crack down on math by arresting people using the math. It's pretty straightforward. Remember the illegal prime number that was DeCSS?
Same way you crack down on anything else. Round up people and throw them in prison if they break the law.
Sure, you cannot stop people from using encryption, but if it's a criminal offense to incorporate it into a consumer application, then you could easily go after the company, its executives, developers, or even users. The first three of those are already enough to have a chilling effect on the rest of the industry.
Law is all about interpreting and enforcing ambiguous rules.
1. The NSA probably has the ability, one way or another, to break E2E encryption. This is a symbolic move attempting to soothe the masses into thinking their communication is still secure.
2. This is vast government overreach trying to stifle personal privacy. Such a bill is idiotic not only because it runs counter to the US Constitution, but also because enforcing such a bill would be virtually impossible.
If Apple were somehow forced to drop e2e encryption I would honestly expect Tim Cook to step down and run for president. He has an incredibly hardcore political side.
As far as law enforcement goes, it reminds me of a favorite Futurama bit - Some cops can read minds... Some cops can see the past... And some cops get help from angels... But there's still one cop with no special abilities whatsoever. To solve this crime he'll have to FIND OUT.
* Honest Government Ad | Anti Encryption Law (YouTube): https://www.youtube.com/watch?v=eW-OMR-iWOE
It would be difficult to enforce such a law without implementing network filters.
End-to-end encryption is where the service provider has no ability to decrypt any of the data that passes through their servers. Only the endpoints can do that, which (usually) can't be tapped without the endpoints' knowledge.
Or maybe they'll only enforce the rules on "chat applications" or things the general public uses?
I guess they're talking about this right now.
Strong encryption was illegal for years. It didn't stop people having strong encryption. If they make it illegal for American companies to make software with unbreakable encryption then that will leave the door open for companies in other countries to fill the gap. The best algorithms are all public so it's not like it would be that difficult.
Problem is, I do not see any reasonable way to get around this, as a large chunk of the voter base is old, uneducated, tech-illiterate people... and politicians tend to cater to whatever they want (which is mostly law-and-order crap such as this).
This is becoming an increasing problem as legislators get older and technology becomes more important to society.
What I do think we might see is some kind of corporate incentive (secretive or not so secretive) to effectively push normal (non-computery) people into using non-E2E services. Google is already doing a fantastic job of this.
Maybe it's time we change the language and stop letting government officials label individual rights which are working as intended as "roadblocks" into government investigations.
> experts generally agree that Congress is unlikely to pass a bill requiring warrant-compatible encryption
Boo hoo. That's the system working as intended. Do your job correctly and stop making an enemy out of the People and you'll find requiring a warrant isn't a "roadblock".
Lawbreaking is one of the means that people have to fight against an unjust system.
I don't think most people have the innate sense of wanting it to be possible for everyone to commit crimes; I think it's much more selfish in that they just want themselves and their associates to get away with committing crimes.
Who are those people, exactly?
I run in mostly right-wing circles (online especially), and the vast majority of people on the right seem to be opposed to the "War on Drugs" on the grounds of individual liberty. I can't imagine many on the left are in support of it - am I wrong?
I don't think it's nearly as popular as it might seem.
I think the OP is calling some of these people out as hypocrites who demand "tough on crime" policies that disproportionately affect poorer communities, while they themselves break laws.
That aside, I find the hypocrisy elsewhere, namely the people who want small government but who want everyone to pay billions for mass incarceration (and raise taxes for more police) as a result of these policies.
You might run in right-wing libertarian circles, but I'd speculate that if your experience was based on the religious right you may perceive higher levels of support.
I do agree with you though, it's lost a lot of momentum over the years and now opinion seems to be that it's clung onto by law enforcement and the lobbyists of the companies who supply them because it allows for their continued militarisation and supports a significant portion of their budgets.
Sometimes things are illegal, and society changes and decides they shouldn't be illegal (interracial marriage, homosexuality, alcohol, marijuana, etc)... if people couldn't 'dabble' in the illegal thing at all, there would be no way for people to learn enough to decide that the laws are wrong and need to be changed.
What am I supposed to do if I ever come across a cache of information revealing the literal Holocaust 2.0 happening in China?
If the Holocaust had been discovered by a US citizen who broke intercepted German communication, would we have allowed Germany to persecute the civilian for hacking their infrastructure?
The ability to break the law, and get away with it is not just vital for the functioning of society, but for social progress to keep advancing.
Gay rights would never have been won if it were possible for law enforcement to jail all gay people for existing (sodomy laws, etc.), for example.
"Show me the man and I’ll show you the crime" - Lavrentiy Beria
So the right to privacy (and therefore the ability to break the law) is one of the most important checks against government.
It's said out loud all the time (or maybe my filter bubble is less authoritarian). The problem is that politicians seem to be using the cesspools of Twitter, Reddit, etc. as their primary data feed.
I'm not convinced a very dark future is preventable
The lie detector in the regular process is when the federal employee investigates whether the answers on your questionnaire match up with public records and in-person interviews.
The polygraph is security theater intended to intimidate the subject and possibly reveal previously undisclosed issues by provoking a stress response. That's why they keep using it.
Most of the time, comments about “lie detectors” are a reference to polygraph tests, which only apply to an extremely minor percentage of the overall cleared workforce; I just wanted to point that out, that it’s not quite as bleak as implied by the parent.
The lie detector in a polygraph test is always the human running it, and they're about as fallible and unreliable as anybody else, with respect to determining honesty. They could just chuck the machines in the trash and call it a "veracity interrogation", but selling the machines and training the people to use them is a better money sink, and gives more ass-cover when someone invariably deceives the investigators. "Trained to beat the machine" sounds better on paper than "really good liar".
Security theater needs its props.
As far as I know, only those working in secure compartmentalized facilities and with high-value assets ever get polygraphs.
If you hold a key and wait by a coded terminal in a nuclear missile silo, you get one. If you reduce and analyze anti-ballistic missile test telemetry, you don't. If you write systems code for submarines, you might get one. If you write route-planning software for in-flight refueling tankers, you don't. My guess is that it ultimately depends on how much Country X would probably pay you to borrow or copy your access. If it's above $Y, they do a little more to scare you into being a good little guardian of the nation, and hope you're not another Snowden.
They just have way too much need for cleared personnel to spend enough to actually make certain, for everybody. Doing it correctly always costs more, in time and money. Why do it right when you can make it look like you did it right, and get paid the same?
I really don't think there is much hope. He just thinks everything works out in the end when in reality it is people fighting tooth and nail and giving up their lives to fight for this shit.
Who's read The Truth Machine? https://en.wikipedia.org/wiki/The_Truth_Machine
See https://www.eff.org/deeplinks/2016/04/burr-feinstein-proposa... for the previous bill.
The concern is that law enforcement apparently can't understand that "Warrant-compatible encryption" is an oxymoron.
If encryption is a munition (if privacy is a weapon) then the 2nd Amendment is in play, eh?
(edit: WWII, Turing, Bombe, Enigma.)
It sure does, just like my front door requires a key :)
Finding the answer to law enforcement being able to make these warrants useful while also maintaining rights is complex. Seeing a terrible solution as terrible is simple.
The current state of affairs is a clear violation of one of the US’s founding principles: “innocent until proven guilty” (and also the 5th amendment).
The ability, power, and right to investigate crimes is certainly reserved to both the states and the Federal Government in the Constitution.
The fact that modern life typically entails recording and documenting almost every single thing you do during the day, and yet law enforcement complains about “going dark”? Utter nonsense.
People have never been more widely tracked, or more widely accountable. The absolute last thing we have is a privacy epidemic.
The government has the right to investigate crimes. It does not, however, have a specific right to require that defendants keep records of what they did wrong or provide that evidence, outside the context of business regulations which require specific paperwork be maintained.
And by the way, the Constitution specifically enumerates what powers the Federal government has. All else is reserved to the States, or to the People.
Which is why we weren’t supposed to need a Bill of Rights in the first place. The Constitution doesn’t enumerate Speech or Assembly as something the government can control, therefore it is not. However the Framers didn’t all agree we shouldn’t go and list a bunch of things just in case, except that it might make people think if it’s not listed then maybe it’s not a Right?
And the entire concept is backed by State violence. People are only compelled to testify under threat of violence in some form. This form used to be open torture, but we've found ways of outsourcing the torture to other criminals in order to wash the State's hands clean of blood.
Sometimes we need to catch a bomber. If you ask me, that problem is only getting worse and "sometimes" may turn into "often". However, we can't risk a journalist or political activist being caught in the jaws of the system while attempting to expose the State's secret crimes. And less severely, the State doesn't have the right to know what your shopping list was last week.
The entire crux of this argument revolves around the fact that at one time it was easier to do this stuff thanks to wiretapping laws and banned encryption, but now it's harder. However, the wild west of the 70-90's isn't the base line for sensible policies when it comes to digital intrusion of your life by the State without reasonable cause.
I say this below, though: when people refuse to provide access under a lawful warrant, there are now multiple cases where they are held in contempt of court essentially indefinitely, likely for life. The court is certain that the information necessary to convict them exists, but is being vexatiously withheld by the suspect.
I don't really think that's a good end. I think holding people in contempt that long is a system failure. What do we do about that? Just release people when they hide evidence that they can still have full access to once they're released?
I think it's been easier to do this stuff since the founding of the government, but our government does have certain powers, lawfully enacted by the wills of all of the states for the betterment of society. If some of those powers can't reasonably be enforced, and other, less lawful, more coercive powers arise in their absence, I think everyone loses.
So we must accept that a certain amount of State violence is necessary to keep the peace. Think of a schoolyard Bully going unchecked; it takes someone standing up to them and enforcing the threat of future violence in order to end the actual occurrence of violence.
But there is a clear legal framework here:
I don't have to produce any materials to the State without a warrant acquired by due process (and not parallel construction). This is my protection from unwarranted search and seizure as outlined by the US Bill of Rights.
However, I don't have to tell you where it is. I can't prevent you from taking it, but I have the right to not have to testify. Because the enforcement for my testimony comes from State violence, and we learned from the Dark Ages what happens when we allow the government the power to extract information with force.
It would be quite ironic if, in our quest for not "going dark", we enter the Digital Dark Ages.
This seems like a pretty clear distinction to me that could be argued in court. But, as I said... not a lawyer.
A great solution in many of these cases would be to work alongside companies like Apple on more products like Face ID. With Face ID, investigators can take a few pictures of a person, 3D-model their face, and open the phone. It's an easy thing with the right resources: a technological equivalent of interrogation. The same can be done in other contexts; they just need better hackers, better investigators, and more resources on their side.
Do you see what I'm getting at? You came up with warrant-breakable encryption, but didn't call it that. This is why it's not an easy question-- the solutions are nuanced, and some are a lot uglier than others.
Note that Face ID is intended to be resistant to this kind of attack.
People are acting as if unbreakable encryption and secure computing is a given. It isn’t. It is extraordinarily difficult and in nearly all use cases not foolproof.
The warrant excuse is fairly ludicrous anyway. We now have vast information pools of where people have been and what they were doing. We have DNA evidence. Law enforcement has never been easier in the history of human civilization.
Encryption is reversible: the encrypted information can be retrieved, changed, shared, and so on. It remains potentially useful to the person who would hide evidence.
Just as with burning records, that suspect can be charged with willful destruction of evidence or obstruction of justice, contingent on mens rea.
The judge may be unlikely to believe that the data is irretrievably encrypted, just as they may be unlikely to believe all records were burned.
Should it be illegal to burn a piece of paper?
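The contrast above can be made concrete: encrypted data is fully recoverable given the key, whereas burned paper is gone for good. A minimal toy sketch, using a simple XOR one-time pad purely for illustration (not a real-world cipher or protocol):

```python
import secrets


def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte.
    # Applying the same operation twice restores the original data.
    return bytes(b ^ k for b, k in zip(data, key))


plaintext = b"records the suspect wants hidden"
# One-time pad: a random key as long as the message.
key = secrets.token_bytes(len(plaintext))

ciphertext = xor_bytes(plaintext, key)   # "hidden" form
recovered = xor_bytes(ciphertext, key)   # reversible with the key

assert recovered == plaintext
```

The point of the sketch is only that the operation is invertible: anyone holding the key, including the suspect after release, can recover the data in full, which is exactly what destruction by fire rules out.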
The FBI and our police forces have been complaining about the 'going dark' problem for years. But in reality, it is far easier today to spy on people or collect their data than it has ever been before. If the argument was that our data collection was substantially lessening, maybe we could have a discussion about that.
But data collection is substantially expanding, and the FBI is essentially saying, "yeah, but it's not expanding fast enough." The FBI is phrasing this as a complicated question because they're saying it's a choice between the status quo and "the FBI can't listen in on anything." But in practice, if you look at the direction surveillance capabilities are heading, the real choice is between the status quo and "the FBI can listen in on everything."
And that's a really easy question to answer. Of course we shouldn't give up all of our freedoms just so the police can always access everything they want, all the time. Of course people should be able to exercise personal freedom even if it occasionally means a police investigation is hampered.
The FBI wants to phrase this as a choice between encryption and anarchy, but given the direction government surveillance is headed, encryption is the middle ground position. What the FBI wants is the ability to access anything they can get a warrant for, no matter what, for any reason.
And I guarantee once they got that, the next "complicated question" would be whether or not warrants were hindering their investigations too much.
I agree with you, this is a problem. What I advocate for to anyone who'll listen is to encourage more use of warrants, and sharply curtail the use of subpoenas. Subpoenas allow for bulk collection, and far more violation of the 4th Amendment happens under bulk subpoenas than under warrants.
That doesn't make sense to me. If your argument is that every warrant should be executable, no matter what, then both fire and encryption block warrants.
If anything, fire is worse, because at least encryption is reversible. If you burn something, we can never get it back -- that makes fire way more risky and dangerous to law enforcement investigations, so I would expect it to be much more highly regulated.
The canonical example here is child pornography. Consumption and distribution is a crime. If a person destroys it before the state sees it, well, that may be destruction of evidence, but the state will never know. If the person can retain full access and go back to consumption and distribution, then justice can't be done, and most people agree it should be.
I don't think that every warrant should be executable. I'm saying this problem is complex, people are coming up with ways to address it, and they're not utopian visions of personal privacy -- they involve unconvicted detention, coercion, etc. Miscarriages of justice.
This is sort of going back to what I was saying in my original post, though. Encryption isn't a utopian vision of privacy; it is the middle ground. It's not harder today to catch child pornographers than it was before the Internet existed -- it's easier. The fact that we have encryption is being balanced out by the fact that we also have video surveillance, and remote exploits on Internet-connected devices, and more payment tracking.
The FBI is not looking for a sensible, middle ground between privacy and bringing people to justice -- the argument they're making is, "if even one criminal escapes, that's too many."
Again, if we were in the opposite situation, if encryption did really mean that we couldn't catch criminals any more, then I'd be more inclined to say this was a complex problem. But the 'going dark' problem is largely a myth; with the advent of the Internet, it has never been easier to monitor citizens or access their data.
This is only a difficult question if it is genuinely a question between anarchy and encryption, but we're not even close to that being the choice we have to make.
You may not be of the opinion that every warrant should be executable no matter what, but when the question is put into the context of modern surveillance, it seems obvious to me that this is the FBI's stance.
This is always the example used to justify anything unreasonable to suspects. For me it's an automatic argument loser, like Godwin's Law.
(If I have a DRM encrypted file that I don't have the keys for, am I now committing two offences? Are law enforcement entitled to a free copy of the HDCP keys?)
Mob mentality doesn't make a very good government, especially when manipulation of information comes so easily. No matter the crime, we can't start the basis of infringing upon an individual's rights on "well most of us on the island agree we should do this". William Golding illustrates this well in Lord of the Flies.
In the case of hidden documents, warrants/subpoenas are used to compel their production. Certain actions (like holding someone in contempt) can also make cooperation more likely (though not guaranteed).
Why do similar procedures not suffice for encryption?