It should be resolved by the American people deciding how we want to govern ourselves in a world we have never seen before. We shouldn’t drift to a place—or be pushed to a place by the loudest voices—because finding the right place, the right balance, will matter to every American for a very long time.
So why are they asking the courts to stretch the 1789 All Writs Act beyond its breaking point, instead of going to Congress for a new law? The magistrate who signed off on this isn't even an Article III judge.
Except the American People also gave him the job of following any and every possible lead.
There is a fundamental conflict between how law enforcement and intelligence do their jobs and the way people communicate today. Period.
If people want more secure communications they are going to have to tell the FBI in no uncertain terms "we are ok with you dropping leads."
Apparently in this specific case that has happened, as the mother of one of the slain said:
“This is what separates us from communism, isn’t it? The fact we have the right to privacy,” Adams said to the Post. “This is what makes America great to begin with, that we abide by a constitution that gives us the right of privacy, the right to bear arms, and the right to vote.”
Now, is it up to this mother to make that case? Yeah, if the director is to be listened to. After all, that is who he invokes when justifying his reasoning. So based on this statement it seems like they should back off, say shit happens, let's mourn our dead and move on.
We need more of the victims to speak out and tell investigators their true thoughts.
>So based on this statement it seems like they should back off, say shit happens, let's mourn our dead and move on.
Agreed. It's not like the investigators are totally in the dark here, either.
In addition to the rest of the FBI investigation, given the terrorist/national security nature of the case, it's likely that anything the shooters did online during the past few years was pulled and analyzed—including internet history, e-mail, chat and phone conversations—rendering a pretty good picture of both their interests and associations.
If the FBI didn't do this (they can if it's a national security case), then the intelligence community certainly did.
There should be more discussion. Tech companies should seek to understand the government's job to keep people safe, and the government needs to understand that this is a game of whack-a-mole it can't win.
> Forcing him how? If he came back with nothing, that's his answer.
It's amazing how many people don't get this. Congress makes laws, which are the tools at the FBI's disposal. If it didn't give them this particular tool, then too bad: they have to figure out how to do their job without it.
And the FBI did not address the gross incompetence it showed when it asked the county to reset the phone's password. If they didn't even know the consequences of that request, how can they predict the (supposedly nonexistent) consequences of their request to Apple?
Exactly. From the view of the public and its representatives, the matter is settled. But the FBI keeps saying "we need more debate on this" because it didn't go the way it wanted.
This is why we have Rule of Law. There is no "we follow the law when it seems reasonable". Whatever the law is, it should be followed, and it does not matter at all what anyone's personal opinion is regarding what the law should be.
Perhaps it is more complex than this, for example in cases of civil disobedience. However, civil disobedience is for cases of conscience, when you cannot personally participate in something that is immoral, so you refuse to participate or you protest it. The key here is that this is ignoring the law to resist immoral uses of power, not ignoring the law to remove intentional restrictions on the application of allocated authority.
This applies doubly to the legal system itself. An appeal to the inability of legislatures to "get things done" has been the argument of every tyrant since Caesar.
I was suggesting following the rule of law. Congress passes laws when it gets around to it. I'm not in any way advocating that we should ban encryption or support backdoors (effectively banning encryption). I will not, however, pretend that I have exhaustively considered all legislation that would allow the FBI to open this phone without setting dangerous precedents.
I can't think of any at the moment but I don't think anyone in a position to do anything with an answer is trying. The issue is too good a political bludgeon to bother even seeking some creative solution to the problem (again, not that I am 100% sure there even is a legislative solution that wouldn't be a vastly undesirable and possibly unconstitutional blow to encryption).
It is not a good starting presumption that no reasonable answer can be reached, especially when the topic under discussion isn't a core issue like "backdoor all encryption" but rather "is it possible to pass reasonable and generally acceptable legislation such that Apple can legally be compelled to aid this decryption effort without setting a terrible precedent?" There may be no solution to that problem, which is fine, but don't suggest that I was endorsing some vaguely Orwellian shit because I don't presume as a given that our opponents on the general issue of encryption have malicious aspirations of tyranny.
It's not vaguely Orwellian. Given an ability to disregard the rule of law, and sufficient charisma, and some hard times, any country can become a dictatorship in 5 years.
We can't kill every charismatic person, and we can't prevent hard times. We can insist on the rule of law.
I suggested that the disagreement might have a non-specified solution that a) exists and b) would be achievable through legislation passed by an elected legislature. Nothing I suggested requires killing charismatic people (?) or advocates against the rule of law.
Normally when people intentionally misread me in the most uncharitable fashion possible I can at least see the deranged logic, but in your case I can't see what evil you claim I am advancing or how you arrived at that conclusion.
I understand your clarification; I was nitpicking because it sounded like you were saying my response to the original comment was 'crazy' or something. Without the context you gave in your reply, though, I think my original response was appropriate.
I agree with your larger point: there is probably a workable solution that balances the threat of government with the benefits of government. But I disagree that 'not getting things done' is what is keeping us from finding that compromise.
In general, I think that appeals to 'a lack of will' are almost always incorrect. People, even politicians, are generally good and want to find good solutions. There is a lot of disagreement on what is a good solution, and even more disagreement on what the long term consequences of choices are. This is why there is gridlock, and if you read the Federalist Papers (in the case of the US), you will see that this gridlock is a Feature of the system, not a Bug.
Every time I hear someone defend the recent state of affairs by bringing up the gridlock being a feature and not a bug, I feel like someone who owns a delivery truck that won't go more than 10 mph. I go to the dealer and ask what the fuck, and the dealer says that there's a speed governor on the truck that won't let it surpass 80mph and the governor is just working better than usual.
I think we are fine though, we are safer than ever before. We don't need to extend government snooping power, because there is no need to do so. If someone disagrees, that is fine, but that is why we have a speed governor.
People have always complained about decisions being made too slowly, but if anything they are made more quickly now than ever before. Congress used to meet only a few times a year, for a few weeks!
When they only met a few times for a few weeks they were passing laws that were readable in whole by individuals during those sessions. The scope of their responsibilities has increased and I expect them to apply a correspondingly high degree of effort.
I want the legislature to be passing laws rolling back the security state, I want them to cancel or rework programs that don't work across the board. They can't do those things if they spend all their time in some arcane ritual circle jerk whose only and ultimate goal is just to make some of the other participants ultimately look more ridiculous than other participants.
We've normalized and accepted parliamentary bullshit at the expense of governing (on both sides, but the Republicans are the undisputed masters of the dark arts) for a couple of decades now. Legislative lethargy (sorry, had to) is supposed to derive from lengthy debate and substantive disagreement, not from participants purposely tanking the process to score points in the cheap seats. Part of the reason they got shit done in a couple of weeks in ye-olden-Congress is that it was closer to a turn-based game: news traveled slower, so the results of the whole session constituted the news, not every little BS stunt.
The frustrated and vindictive part of me hopes that not a single senator or congressman pays a price for their cynical abandonment of their duty to govern. That way, next term we see months long shutdowns of the whole federal government and a collapse of federal services until [Bernie|Hillary] signs a budget that reallocates all non-entitlement social spending to some insane shit that polled well among likely voters suffering from bathtub-gin induced brain damage. The revolting (both senses) congressmen won't actually care about getting the spending reallocated, but them "standing up to" [Bernie|Hillary] will play well with their group so so be it.
Edit: to clarify the frustrated and vindictive part would be rooting for it in the sense that the worst part about democracy is that people get the government they deserve, and at the moment it doesn't seem like the process thinks we deserve much.
Again, you seem to think we are getting 'worse' government, but we have the best government we have ever had. Because you can see it close up (in a historical sense), you are aware of its warts. But I assure you, the governments of the past had all the problems we have today, and many, many, many more. If you were to get in a time machine and go back even 50 years, you would be shocked at how corrupt the government was then compared to now.
Stop with your short sighted, historically ignorant whining already.
Reasonable according to whom? And Congress passed a ton of stuff last year, it's not like it is totally deadlocked.
The FBI would love a federal law that forces Apple to crack phones for them. Since they don't have one, they are trying to get a court to invent one by stretching the All Writs Act.
I think that was his point. If he can't pass a law through the People's representatives, then he shouldn't be using mental gymnastics with one of US's oldest laws in existence to force a company to hack its own device...
And no, "Congress gridlock" is not a good response to abusing existing laws.
"(a) The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.
(b) An alternative writ or rule nisi may be issued by a justice or judge of a court which has jurisdiction."
This law is basically "The courts can enable discovery and describe implementation of rulings." Setting up the rules of the game.
Could you share why you think that this is a stretch of the AWA? Is it because of the amount of work asked?
The high profile nature of this court case will ultimately be leveraged into getting Congress to take it up if they lose. I think Apple has a very good case for arguing under AWA that the order is too burdensome, and if the court doesn't agree I think they've got a good case to argue that it infringes their First Amendment right.
I'll go ahead and say that it sounds extremely sincere and straightforward, much unlike most official statements I've seen from the FBI. He does suggest that his view is right, but he does the right thing, as Apple did, in supporting a public discourse about the so-called balance between privacy and safety.
I know many people seem to gravitate to the extremes on every issue, but both Cook and Comey are right in saying that we as a nation need to question our fundamental assumptions and think through what would be the best balance between the two, if there is one. As Comey says, and I personally agree with this, the "balance" shouldn't be decided "by corporations that sell stuff for a living" or by "the FBI, which investigates for a living"...or, in general, the government. We shouldn't be grass in a fight between two elephants. The American people need to decide how much involvement both government and large multinational corporations have in their lives. That's what this all comes down to, after all.
Security vs privacy is a false dichotomy here. Backdoors can help law enforcement achieve some goals but weaken overall security. The debate is security vs security, and framing it any other way reflects either willful deceit or ignorance.
Encryption could also help obstruct justice and let criminals get away with things. Most people won't understand until it impacts them personally, but imagine you are robbed of all your money and the answer to who did it lies in a computer you are able to get your hands on. Wouldn't you love it if the police had a backdoor then?
You are wrong to frame the debate as security vs. security. It's like everything else when it comes to liberty and protection. The question is how much liberty we are willing to give up for protection. We can't let a corporation or the government anchor our opinions to the left or right.
> Encryption could also help obstruct justice and let criminals get away with things.
That's fine with me. The actions of the few do not justify the loss of privacy for all.
> Most people won't understand till it impacts them personally but imagine if you are robbed of all your money and the answer of who did it lies in a computer you are able to get your hand on. Wouldn't you love it if the police had a backdoor now.
No, I don't. Because that means the police can get into my data at any given time without any oversight and without me ever knowing it. It also means that anyone could attack my system and get that money as well.
I've been in this situation many times in the past. I've lost a lot of sensitive data because I forgot the password to my encrypted DMG files, I've moved on, and I still encrypt my stuff in DMG files.
Do I think the police and others should have a backdoor into my DMG files? No, and there is absolutely nothing that will ever justify it, not even terrorism or even the life of my family.
More Americans have been killed by Americans in the past few years than in the entire history of terrorist attacks combined.
How about we take action to ban guns in the country and see how that works.
> The question is how much liberty are we willing to give up for protection.
None. I don't deserve protection if I don't care about my privacy.
> We can't let a corporation or the government anchor our opinions on the left or right.
Sure we can, by voting and with our money. People trust Apple because they do focus on security of their devices.
Of course criminals can use encryption. I didn't say they couldn't. I said it was security vs. security, because there are security issues on both sides. Duh?
Even more obvious is the notion that innocent civilians use encryption to protect themselves from mass surveillance and needless data collection.
Is the end goal no longer total information awareness? [0]
Side note: that was a rather scathing and slanted article you cited. It would have been nice to see the simple facts of the case presented in a neutral light.
My point is that encryption is both good and bad, but we can't simply allow it to be used for criminal activity. A resolution will need to be reached on who gets access to this information and in what situations, with perhaps even strict prosecution if the information is ever used incorrectly.
Encryption is either secure or it isn't. You can't have a middle ground. It's unfortunate that under rare and very particular circumstances criminals can perhaps evade justice because they have access to encryption, but it's also unfortunate that they can get away with crimes because they have access to guns, knives, vaults, cars, basements, gasoline, matches, duct tape, chain saws, shovels, etc...
Yet we don't cripple the effectiveness of these items for the sake of preventing crime. Doing so would only harm the quality of life of the vast majority of innocent people, whom we trust to use these things for their own benefit as free citizens of a free country.
>unfortunate that they can get away with crimes because they have access to guns, knives, vaults, cars, basements, gasoline, matches, duct tape, chain saws, shovels, etc...
But investigators can get access to basements, cars, knives and guns and use them to solve the crime. In this case, post-crime, there is no way to gather information that could be important.
> But investigators can get access to basements, cars, knives and guns and use them to solve the crime. In this case, post-crime, there is no way to gather information that could be important.
Which is irrelevant to how these individual items are used to actually prevent the crime from being solved.
While encryption prevents access to information that may or may not be useful, guns are used to kill witnesses that are never found, shovels are used to bury bodies that take decades to find, cars are impounded or destroyed after a getaway; I could go on. Often enough, even if any of these items are found, it doesn't help solve the crime because the criminal was simply too smart.
I fail to see why encryption is being treated differently than any other legal thing we, as free and innocent citizens, have access to. The cynic in me believes the only actual difference is in the level of familiarity the majority of the voting population has with this particular legal thing. They seem to have, unfortunately, been misinformed and poorly educated about this issue and technology in general.
Sounds extremely straightforward and sincere (many bad decisions are also taken with the utmost sincerity), but I think it would be a bad idea to provide the FBI this ability.
For one, it would forever weaken the security and privacy of your iPhone.
Bad guys who want to evade the police will do it anyway with encryption systems/software available from any non-US territory.
Considering that the FBI can already get significant metadata from telecom providers (who called whom, etc.), I think what the FBI is asking for offers little gain in security as the upside and comes with a significant downside (a huge loss of privacy).
The article author is incompetent; it's all about private keys, not 'source code'. Neither Android OEMs nor Apple allows the user to set their own root cert.
>Neither Android OEMs nor Apple allows the user to set their own root cert.
Where does the article imply it does? It states correctly that the FBI would have easily hacked it had it been an Android.
edit: I'm assuming you're going from this line:
>Although user content is encrypted on Android devices, too, Android is open-source software. Theoretically, the government can produce its own version of the system that would make it possible to hack the encryption.
This is poorly worded, as is the line about how all encryption can be hacked, but he gets the point about how Android would have been vulnerable. I think some of that can be chalked up to writing for the public and not a technical audience.
These people are either being intentionally deceitful or are simply incompetent in understanding the full implications of what they're asking. Either possibility is disturbing.
So they know this leads to an Orwellian state and they do not care? Maybe, but it would be political suicide to admit that.
Certainly Comey is being deceitful in his first sentence. Even Hillary called for a Manhattan-like project to circumvent encryption back in December [1]. That shows it's something that's being discussed in Washington quite frequently.
I doubt any will ever admit they're trying to lead us towards an Orwellian state. Also, encryption is part of our protection from that. Heads of state and encryption currently appear to be directly at odds.
Any time an American official or politician says the word "terrorist", my mind simply shuts off. I know every subsequent word will be rhetorical nonsense and bluster. That's pretty sad.
They necessitated this whole mess through their actions. I try not to ascribe malice to what can easily be explained through incompetence, but to not even mention it? Shameful.
You don't get it, Comey. The problem isn't that you want this information. Obviously you want this information as an investigator of a crime. The problem is, we can't fucking trust you or anyone enough to cross that bridge, because we know if it's there, you're just going to want more. You and every law enforcement agent in the United States have lost our trust. Call it overreach, or overzealous prosecution, high incarceration rates, crooked judges sending kids to jail for kickbacks... fuck, if I list any more, I'm going to want to drink some more.
I'd love to live in a world where I trusted a government, which is supposed to be a composition of its people, more than a profit-motivated, self-interested business, but in some weird turn of events, something vastly different has festered in the last 15 years. You blew it, you asshole, you and every other cop.
A few things. If you want to make your argument legitimate, refrain from calling the other person names. Second, besides drinking and making yourself essentially useless, what are you doing to improve this situation? One person may not be enough to make a difference, but it's better than zero.
America is entirely being driven by fear and anger. The current presidential race is a perfect reflection of the hate, anger, and fear that Americans believe exists. The system is a reflection of what the "good" people allow to happen.
The FBI has a job, and Apple has a commitment to its customers. Technology is going to get more advanced; does that mean we should let it be a tool to obstruct justice? You may not trust the people tasked with the investigation, but people should also look at the fact that the encryption here is obstructing justice.
Tomorrow, if a criminal kidnaps a bus of school children and the police are able to get hold of a locked phone that holds their location, would you be as opposed to the current situation?
The creation of a method to disable the phone's encryption is a slippery slope, but the option of not aiding the FBI is also a slippery slope. This is an organization created to protect the people.
We need to have an open discourse on how the information can be taken off this phone while minimizing the loss of public security and privacy. This is not the black-and-white issue that many people are treating it as.
And just to make it clear, I do agree that the government has done extremely questionable things as well as committed offenses against the public's privacy. We need to fix the issue of distrust, because a lack of trust in our system is like a cancer within a body.
Counterpoint: Just going out on a limb here, but I'm pretty sure James Comey isn't leafing through Hacker News reading our arguments. If he'd like to have a civil discussion, I'd be more than happy to bring the civility.
Second, our system is essentially set up to make me powerless. It's actually engineered that way. I do what I can to educate people around me, but when it comes to voting or having any control over who's in office or who officiates our most important government positions, the ball isn't just out of our court, it's on another planet.
Technology, as you said, will continue to get much more advanced. I reckon at some point, literally every moment of our lives will be perfectly catalogued, from birth to death. This is why what Mr. Comey wants scares me so much, because you can practically hear him salivating in that letter.
We have guarantees built into our constitutional freedoms, and one of the most important guarantees is that the government can't compel us to do things. Sure, there are laws to maintain the peace and common order, but you don't have to house or attend to the military in your home, you don't have to testify against yourself, you're not compelled by police to follow their wishes until you're under arrest (which they ostensibly can't do without probable cause), and you don't have to aid in an investigation if you're not involved. These are vital because the moment we're legally compelled to aid investigators, all bets are off. We are no longer able even to protect ourselves against undue search and seizure.
I actually wouldn't bet on that. As much as I dislike the FBI based on previous behaviour, I'm happy Comey is running it now, because he seems to have a fair amount of personal integrity. I wouldn't rule it out completely that on a matter this politically charged and visible he does some homework himself, and HN would be one of the first places to start.
I agree with you. The motives of those with access to this information are questionable, and being forced to do things impedes personal liberty, but the people who wrote the Constitution could never have imagined this type of situation. It quite literally requires an in-depth discussion by the people and those we trust in power to develop a solution, but the idea that no backdoor will ever exist is not realistic.
Look at history. The encryption the Nazis had created helped them commit terrible acts of war and it was the backdoor the Allies discovered that helped bring justice to them.
I'm playing devil's advocate here because I don't think people should view this as "us vs. them". Comey may seem like he is salivating to you, but he does raise the point that at the other end is a corporation, which also has an agenda.
> Look at history. The encryption the Nazis had created helped them commit terrible acts of war and it was the backdoor the Allies discovered that helped bring justice to them.
Are you suggesting the Nazis were the only ones to use encryption during World War II?
You forgot to mention at least 9 other nations (including the USA).
Cars, cellphones, fertilizer, kitchen knives, baseball bats, apple seeds, etc. Encryption is only special because of the power dynamic that shifts in the average citizen's favour.
No, it isn't. The phone they're trying to break into was the shooter's work phone, provided by his employer. He physically destroyed his personal phone before the acts in question. The idea that he kept relevant information on his work phone, which he left intact, is absurd.
I'm against the FBI's actions here, but your argument against it doesn't make sense.
Did the shooter take the phone with him when meeting with accomplices? If so, perhaps there's GPS info on the phone.
What are the gaps in standard behavior (i.e. phone turned off...). Are these times where the FBI should dig deeper to discover the actions of the shooter?
The idea that bulk collection of metadata is objectionable is something we're seeing in civilized technical circles. Individual collection of the same is even worse.
There is a probability that it may not contain anything, but this fact doesn't disprove the notion that encryption can be used to obstruct justice in future cases.
(What the probability is, though, is hard for me to predict, as I don't have enough information, but assuming he didn't have any slip-ups on his work phone is a bad assumption. If there is a 1 percent chance that future attacks of similar scope could be prevented by knowledge gathered in this case, is it not our obligation to find out? It's a complex issue that the courts and media will play out over the next few months.)
It is not a matter of trust. People should never simply trust their Government; rather, Government must be accountable to the People. This particular government has not merely done "extremely questionable" things, they have knowingly and willingly broken the law and violated the privacy of just about every American because they believed the ends justified the means.
Of course encryption will obstruct justice in some cases. The same applies equally to the 4th amendment. Sometimes criminals will escape justice because the means to identify them are simply not reasonable. Obstructing justice is not always considered a bad thing. Where we draw the line is crucially important.
The law demands that the particulars of the crime are irrelevant. We must accept that whatever capability we give police can be deployed, with probable cause, in any criminal investigation. So I'll completely ignore your hypothetical case. If bypassing encryption is possible, then bypassing encryption becomes routine. Apple was routinely tasked under the AWA to assist in recovering phone data prior to them deploying encryption which they believed they could not crack.
Worse still is the fact that all police powers will be, at times, abused by the police. So another important question is: are you comfortable with this capability being used disproportionately against the poor and marginalized, being used to harass and intimidate, and being used to suppress political dissent?
I believe the police must be capable of working with some limitations on their ability to access our personal electronic devices. Compelling manufacturers to create backdoors in our devices to bypass their security might help the police solve cases, but it would do so at an unacceptable risk to the security and privacy of everyone who uses those devices. Generally available and widely usable encryption is a vital national interest. It is the linchpin enabling trillions of dollars of electronic commerce. Even if you delude yourself into thinking the backdoor would only be lawfully accessed, its mere existence would inflict real economic damage by furthering global distrust of American technology. And it would set the stage for even more troubling backdoors in the future, because the All Writs Act is the wrong legal framework for granting them.
Under the AWA, how is compelling the creation of this backdoor any different from compelling creation of any software which might produce evidence of co-conspirators? What if tomorrow the FBI wants access to an iPhone 6s's camera and microphone, or the ability to remotely download the phone's contents? What if the FBI needs Mozilla to install a backdoor in the Tor Browser Bundle? How can the AWA grant one backdoor and not the other?
> We don’t want to break anyone’s encryption or set a master key loose on the land.
This directly contradicts Tim Cook's statement, in fact he used the same term "master key":
> In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes.
The FBI asked for a specialised OS image containing a check so that it could only run on this specific phone. Combined with Apple's encryption/signing of the update, this image could not be applied to any other device.
So the "work" asked of Apple would produce something that would only work on the phone named in the warrant.
Yes, the particular OS image they produce would only work on that phone. But all of the work they did to enable the hack would work on every iPhone in the world, and the work necessary to take this OS image and use it on another device is effectively zero. As such, you can be absolutely certain that the FBI would start demanding Apple do this for other phones. And it's not just the FBI, every repressive regime in the world would start demanding this same capability. And Apple would have no grounds to deny it at that point.
"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."
Yes, Tim Cook is lying (EDIT: perhaps not lying, but being creative with the meaning of "potential").
The software asked for by the FBI would only work on one phone, because it would both include a check for the specific phone and be signed by Apple (so it can't be modified).
The software would need to be modified and resigned by Apple to be used on another phone.
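A minimal sketch of that gating logic, in Python. Everything here is an illustrative assumption: Apple's real scheme uses asymmetric signatures personalized to a device's unique chip ID, not the symmetric HMAC stand-in below, and all names are invented.

```python
import hashlib
import hmac

# Illustrative only: names and the HMAC construction are invented stand-ins
# for Apple's real per-device asymmetric firmware signing.
TARGET_DEVICE_ID = "ECID-1A2B3C4D"   # the one phone named in the warrant
SIGNING_KEY = b"held-only-by-apple"  # stand-in for Apple's private signing key

def sign_image(image: bytes) -> bytes:
    """Apple-side: sign the firmware image, including its embedded target ID."""
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def boot_allows(image: bytes, signature: bytes, device_id: str) -> bool:
    """Device-side: refuse to run unless the signature is valid AND the
    image's embedded target ID matches this device."""
    if not hmac.compare_digest(sign_image(image), signature):
        return False                  # tampered image: signature check fails
    embedded_id = image.split(b"|")[0].decode()
    return embedded_id == device_id   # device check: one phone only

image = f"{TARGET_DEVICE_ID}|unlock-firmware-payload".encode()
sig = sign_image(image)
```

Changing the embedded device ID invalidates the signature, which is the sense in which the image "only works on one phone"; reusing it elsewhere requires Apple to re-sign, so the underlying capability lives entirely in Apple's possession of the key.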
You really, honestly, truly and genuinely believe that if Apple just does this for them, just this one time, just for this one phone, then that's the end of it? That the FBI and other law-enforcement agencies would never, not once, not for the rest of the lifetime of the universe (and certainly not, say, about 24 hours later) come back and say "Well, we're glad you helped us with that case, now here's this list of other cases we'd like help with, and now you've proven that you can do this we're going to be bringing you a lot of these"?
That's what Tim Cook is worried about: it won't stop with this one phone unless it's stopped before that one phone gets unlocked and becomes the precedent that makes this just another routine thing. Every now-routine violation of privacy and security was originally marketed as "just this one case, this one's unique and special and sensitive and we absolutely have to get it done, just this one, trust us" and eventually morphed into "eh, it's Tuesday, time to ship off another truckload of requests to have this done to people".
Of course the FBI will ask again and again. And Apple could be compelled to comply each and every time.
The slippery slope argument being applied here is that FBI search warrants will now work on phones older than the iPhone 5s (until Apple closes the iOS update backdoor, I guess). This is far from the "China will be able to mass-hack iPhones from now on"-style comments I've seen from others.
The fact is that Apple left a backdoor in their phones and are now being told to exploit it. An argument against having the backdoor in the first place.
Apple _will_ be compelled. You either don't understand the US legal doctrine of stare decisis or are being intentionally obtuse.
Furthermore, the US legal system encourages elaboration on stare decisis, applying prior case law to novel cases rather than hashing out new decisions. This means that creative applications of this ruling ("give me an uncontrolled backdoor") are simply a question of time once the landmark case is made.
Considering that the single-use nature of this exploit is a fundamental part of the ruling (and of the validity of the search warrant itself), you would not be able to use precedent to just get an uncontrolled backdoor.
Whether the exploit is single-use, about a phone, Apple, the 5c, or any other specific parameters is fundamentally irrelevant.
What this case is doing is setting precedent that the courts can compel a company to _create_, no matter how trivial one may think that creation may be today. _That_ is the precedent so many draw exception to, because stare decisis is also a doctrine of incrementalism, of gradual expansion of interpretations. Today Apple is compelled to create a very controlled firmware; who's to say in the figurative tomorrow that Samsung won't be compelled to create and send a firmware update to a specific Blu-Ray player that creates an air microphone out of the laser? Where does it stop?
To be complete (as was pointed out in another sibling thread) incrementalism may be curtailed if implications are carefully argued and acknowledged by the judge. I am skeptical of the value of that approach, but have no hard argument against it.
This is true but, in terms of security, is meaningless. If they can order Apple to unlock one device this way, they can do it again, and once they've got a precedent, the next time won't be national news, it'll just be routine.
Can an analogy be "wrong"? It's certainly a stretch to call it a master key. I'd call it more of a bump key. :P But discussing the rightness or wrongness of an analogy is probably beside the point.
I'm sure the FBI would like to minimize the "work asked of Apple". Ideally, they'd have a system to:
1) Identify target phone and obtain warrant to search.
2) Invoke an Apple<-->FBI-specific API to initiate OS image generation for the specified target phone. The OS image would be automatically downloaded to the target phone.
3) Once downloaded, the OS image would present an FBI-specific API to allow exfiltration of any desired data.
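The steps above could be sketched like this. To be clear, this is purely hypothetical: no such API exists, and every function name here is invented to illustrate the imagined flow.

```python
# Hypothetical sketch of the streamlined pipeline described above.
# None of these functions exist anywhere; they only illustrate the flow.

def obtain_warrant(phone_id: str, case_no: str) -> dict:
    # 1) Identify the target phone and obtain a warrant to search it.
    return {"phone": phone_id, "case": case_no, "approved": True}

def request_unlock_image(warrant: dict) -> dict:
    # 2) Apple-side API builds an OS image keyed to exactly one device.
    assert warrant["approved"]
    return {"target": warrant["phone"], "brute_force_limits_removed": True}

def exfiltrate(phone_id: str, image: dict) -> dict:
    # 3) Once installed, the image exposes an API to pull the desired data.
    if image["target"] != phone_id:
        raise PermissionError("image refuses to run on any other device")
    return {"phone": phone_id, "data": "<device contents>"}

warrant = obtain_warrant("ECID-123", "SB-2016-01")
image = request_unlock_image(warrant)
data = exfiltrate("ECID-123", image)
```

Even in this imagined design, the per-device keying in step 2 is the only thing standing between "one warrant, one phone" and a general-purpose unlock channel.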
This might not violate the 4th Amendment ("The right of the people to be secure in their persons, houses, papers, and effects ...") because they'll have a warrant, but IMHO it's a clear violation of the 5th ("... nor shall be compelled in any criminal case to be a witness against himself ...").
And next week they ask for them to sign the next thing to work on the next phone.
At this point, the only valid answer from Apple is that any future devices (if they aren't this way already) should either require the password to do a software update, or wipe the phone.
Right, if the FBI has a search warrant for a phone, Apple can be compelled to comply.
The core thing that makes this not a master key is that Apple has to make the decision to comply on each phone (and warrant). Which gives them the opportunity to push back if it chooses to. If the FBI has somebody's phone and secretly wants to unlock it, it can't without going through the courts and Apple.
What makes you think the US Gov't can't break in and take this once Apple has built it? They'll have to QA it, it'll be on at least a handful of machines, and the order requires that it be network-connected.
At that point we should expect more world governments to get the key. Where does that end? Maybe with more companies having to demonstrate that they defend against their own subversion.
I'm pretty certain you and rtpg don't understand stare decisis. This isn't about this specific phone, or even the methodology. This is about the FBI cherry-picking a nearly throwaway (in terms of probable data value) case to set precedent in what is obvious to practitioners as a continuation of their War on Cryptography.
The US legal system and education pertaining to it is largely composed of studying prior case law and its applications. Few cases (including this one) involve truly new decisions, and even those are usually novel applications of former decisions. That is what those who can see past their nose are concerned about. There is every probability that, in the near future, one or both of Apple or the judicial system will tire of the farce of one-phone-per-case firmware and request a new application of the law (because there would now be precedent).
To take any lawyer-moderated statement (including Apple's) at face value betrays a credulity that does not become the HN audience. As others have pointed out, Comey's appeals to emotion and terrorism should be enough, but the idea that an FBI lawyer would advise pursuing this particular case in this particular manner without expecting to set precedent? Positively silly. This is what they do.
All I did was give some technical information. How did you conclude that I'm ignorant of law?
If a new case is not similar to this case, then this case doesn't count as precedent. If a new case is similar to this case, then the arguments put forward for this case apply to the new case, and we shouldn't care about precedent-setting. If the new case is similar but different in important ways, then when that new case is fought, the lawyers point out the substantive differences, and it doesn't count as precedent.
The only concern would seem to be if the future court fails to analyse a case properly. But they can fail regardless of what precedent is or is not set.
I don't think individual cases should be ruled improperly because of concern about the precedent effect. You might want meta-level rules (e.g. Bill of Rights, free speech), but the individual actors should not be taking them into account except insofar as they have been legislated (although the Supreme Court perhaps should).
I conclude that you potentially misunderstand what is perhaps the core governing doctrine of US law because you keep re-presenting the same tired argument that this is about a single firmware tied to a single phone. If you truly believe it's about one phone, we don't have much to discuss.
Whatever you think they should do, judges absolutely consider the precedent effect when ruling on accepting arguments, and lawyers most certainly consider precedent and setting it when presenting those arguments. There may be some intentional offloading of those decisions from lower courts to the appeals process, but no judge makes their decision in a vacuum. Nor should they.
> you keep re-presenting the same tired argument that this is about a single firmware tied to a single phone
I mention that only when people assert incorrect facts about the case. Correcting someone when they're wrong about a matter of fact does not imply any legal opinions.
About your point: there's a difference between "If I agree with argument X in this case, I should agree with it in that case; however, X shouldn't apply in that case, and I can make no principled distinction between the two. Therefore, X can't apply even in this case" and "I agree with argument X in this case, but not in that case; here's a principled distinction between the two; however, since it might lead to someone misinterpreting it down the line, I'm going to rule against X."
The first is a concern for precedent that I'm fine with, the second is not. My understanding is that judges will lay out the reason X applies here but wouldn't apply in the other hypothetical, which makes the concern about it being used as precedent later unwarranted.
The form of the argument here seems to be the second. "Yeah, this one is mostly fine, but we're fighting because of precedent. If they can force us to sign software and install it on a single phone, who's to stop them from forcing us to put software on every device we sell?"
And the clear distinction between the two is that one is only being put on phones that have a warrant (and belong to the government, to boot), and the other would also harm innocents. So the proper response should be "comply with this order, but specifically because of the fact that no innocents are being caught up", and that way it doesn't set any harmful precedents.
Or if you think there's a different reason why this case is fine but the general case isn't, then that itself is a reason for the general case not to have this case as precedent. Whatever those reasons are, make them explicit. If Apple puts those arguments into the court record, and the judge explicitly says "it's ok for X but not for Y", that defeats the harmful precedent.
Am I misunderstanding anything, or do we just disagree?
It's possible that I missed any concern the judge may have had for precedent when I read the ruling. If you know of any (or any other document I may have missed), I'd appreciate a pointer.
I think you're applying some fairly strict high-mindedness to the US legal system while others (myself included) worry that creative extension of intent appears rampant in cases that touch on technology or terrorism and therefore fall into your second form. Many of us don't, as a general rule, trust the courts, law enforcement, or our government to do anything but what is politically expedient and beneficial to them at the moment.
While the wording of this ruling is about one phone and a particular method the FBI has laid out, it appears to set precedent that law enforcement can compel a product company to create a non-existent product (no matter how trivial) in order to exploit a known security vulnerability in one of their products. This is what concerns me and many others, because it brings us very short steps away from "make us a version that works against the Secure Enclave" to "make it work as an OTA update over WiFi" to then "make it work as an OTA update over cellular" and subsequently "make a version we can incorporate into a StingRay" and forward. None of these would be illogical steps to take in abject pursuit of stamping out terrorism, and might even be applauded by parts of society, but taken as what some perceive as an inevitable whole they paint a dim picture for personal privacy.
It's an improbable coincidence that the FBI has selected this particular charged case in which to plant their flag. Given their pleas with technology companies for cooperation over bypassing cryptography in recent months, this appears to be a logical continuation of that campaign.
In conclusion, we likely disagree, specifically due to my cynicism and your seeming lack thereof.
>I think you're applying some fairly strict high-mindedness to the US legal system while others (myself included) worry that creative extension of intent appears rampant in cases that touch on technology or terrorism and therefore fall into your second form.
My main point related to this was above:
>The only concern would seem to be if the future court fails to analyse a case properly. But they can fail regardless of what precedent is or is not set.
To argue against that, you'd need to claim that precedent makes it easier for the later court to fail. Do you have examples, where it should have been clear that precedent didn't apply, yet the court reached the conclusion that it did, incorrectly?
(Preferably in important cases.)
>While the wording of this ruling is about one phone and a particular method the FBI has laid out, it appears to set precedent that law enforcement can compel a product company create a non-existent product (no matter how trivial) in order to exploit a known security vulnerability in one of their products.
I've said elsewhere that this argument seems to be useless. If Apple says it's "unreasonable" to expect them to do this, then they might be forced to hand over the source code, and the FBI will create it themselves. The problem is
1. iOS is closed source and
2. iPhone requires a signature from Apple
If Apple doesn't help them, they could conceivably be forced to simply hand the keys and code over. It's a benefit to Apple to be able to create it themselves and maintain control over the keys.
>This is what concerns me and many others, because it brings us very short steps away from "make us a version that works against the Secure Enclave" to "make it work as an OTA update over WiFi" to then "make it work as an OTA update over cellular"
All of these seem fine, assuming that Apple is not modifying the phones to make it easier to hack. In other words, I agree that they follow as direct precedent from this case. (Although they can decide to only update a phone after a proper warrant.)
>make a version we can incorporate into a StingRay
This is the part that doesn't follow from precedent. Giving over control of the tool to the FBI who could use it without a warrant is novel, and would require a judge to justify it.
>In conclusion, we likely disagree, specifically due to my cynicism and your seeming lack thereof.
Maybe I'm just more cynical than you. You worry about a future court doing the wrong thing because they're misled by precedent here, I'm worried about a future court doing the wrong thing just because. I don't think the risk goes up significantly based on this precedent, because I think it could happen anyway.
>It's an improbable coincidence that the FBI has elected this particular charged case in which to stake their flag. Given their pleas with technology companies for cooperation over bypassing cryptography in recent months, this appears to be a logical continuation of that campaign.
Pure tinfoil material here. Do you know of any cases where the FBI had a phone but didn't try to unlock it, that could be said to be as urgent as this? This isn't the only court case with Apple going on, it's merely the one that got a lot of attention, and certainly much of that attention is Apple's fault (not all, but a lot).
> To argue against that, you'd need to claim that precedent makes it easier for the later court to fail.
No, I'm arguing that incrementalism means the court doesn't have to fail (as a court) - the follow-on steps that I outlined and you found perfectly acceptable will not stop wherever you personally think they will, nor have I outlined (or even imagined) all the incrementally creative applications of this case that will likely follow. I'm also arguing (which you didn't counter) that this case lays the groundwork for the FBI to request _new_ products from companies.
> This is the part that doesn't follow from precedent. Giving over control of the tool to the FBI who could use it without a warrant is novel, and would require a judge to justify it.
I was being brief to illustrate, not to set a literal expectation of progression; I'm not here to present an argument for court. Giving the FBI control over the tool could be argued at any point in the very long lifetime of this decision, be it tomorrow or years from now. Perhaps after the courts have decided to allow for multiple devices or when suspect X is holed up for weeks in their cabin eating pizza and playing candy crush and the FBI cannot reach them by normal investigative means.
> Pure tinfoil material here.
Perhaps. Consider the probable value of the data on the phone beyond what's already available externally. Consider that the FBI could have simply requested Apple provide the data on the phone instead of specifying the mechanism and thereby creating precedent that they can compel a company to create something new. Consider that, in spite of generally dim views of the FBI's technical capabilities, the FBI usually provides sound, conservative advice with regard to electronic evidence (as a forensic analyst I've been party to a lot). Consider that over the past year+ the FBI has been protesting with little effect in congressional hearings that $todays_crime_buzzword are "going dark", that technology companies need to cooperate with them, that "securely insecure" (my words) systems are possible if created "at the design level" (not my words). They were nearly shouted down in those hearings. Consider the subsequent public visit the FBI and others made to Silicon Valley, extending an olive branch and returning effectively empty-handed.
Then consider that this case, involving an older and still-vulnerable version of a technology the FBI has been warning about, suddenly falls into the FBI's lap. It's politically and emotionally charged, it involves terrorists - they can ask for the world and society will rubberstamp it "because terrorists."
So the FBI asks for something small but very specific, and in doing so "happens" to set a precedent that they can compel companies to create a new product to bypass the very security measures they've been arguing vehemently against. Do you really think that's accidental?
> I'm also arguing (which you didn't counter) that this case lays the groundwork for the FBI to request _new_ products from companies.
I don't really have an opinion on that. I haven't commented much on whether they should have that power.
I think in this case, Apple would prefer to make it themselves than be forced to hand over the code and keys to let the FBI make it. If Apple says "this is too hard for us", it's plausible that they'll need to do that instead. I have seen this point made elsewhere, but I'm not sure of its validity.
>Consider that the FBI could have simply requested Apple provide the data on the phone instead of specifying the mechanism and thereby creating precedent that they can compel a company to create something new.
My understanding was that the FBI asked for the data, Apple said "we can't do it", FBI countered with "Do X, Y, and Z, it is technically possible". Is my understanding incorrect? It sounded like the order needed to specify exactly what was to be done.
About the precedent point:
I think the proper response is to argue against specific details of the actual case (which you have with the point about forcing them to make a product). If you don't have specific objections against the actual case, but worry about precedent, then make explicit the difference between the specific case and the general one, and try to get that acknowledged by the judge. Introduce those arguments in briefs to the court, and try to make it very clear to a future court what the limits of the precedent set should be.
If you haven't imagined what could go wrong, then don't reject something because of unknown dangers. It seems to be born of a lack of trust in future courts, but there's no particular reason to distrust future courts over current ones.
(If you happen to know any philosophy of law articles that discuss this topic, I'd love to see them.)
Thanks for working with me, I understand your point about distrusting future courts over current ones.
I don't think it's too hard for Apple to do this; they could probably turn out the request in very short order. However, as soon as they let this go and do that very thing, their legal obligations skyrocket as thousands of cases pour in; in truth, my cynical conjecture is that this is the very reason Apple is fighting this fight. Not for the consumer, but to minimize the very real cost associated with satisfying this kind of request. The Secure Enclave is likely as much a legal defense for Apple as it is a technological one for the consumer. Whatever the motivation, in this case the consumer seems to win.
I must admit I don't disagree with the specific details of the actual case. What I draw exception to is the specificity of the ruling (dictating how Apple does business) and the precedent it sets of enabling courts to force a company to create. That exception is exacerbated by this case's proximity to the FBI's very recent and very real behavior regarding cryptographic systems.
Plus, even if it didn't have that check, every iOS update is signed by Apple specifically for the device being updated at the time it's being updated, so even if the image were released publicly with no restrictions, nobody could install it.
Even if it was open source, they couldn't install it on any iOS device without a signature from Apple, tailored specifically to the device being updated and the software version being signed.
You can't even downgrade to an old iOS version, because Apple won't sign them anymore.
This comment will never be a reply to any other comment. However, if I wrote it because the FBI told me to, why wouldn't I reply to absolutely any other comment the FBI told me to?
Reasonable engineering practice requires that the software be tested before it is used. If it is handed over untested and bricks the unique device, Tim Cook is going to get found in contempt. The order as written says to make only one specialized version, but it also requires delivery of a working version, and that means a general-purpose version that can be tested.
Similarly, you could reasonably argue that if there is a master key in this situation, it is Apple's firmware signing key, which the company already possess and has been using as a consequence of its inadequately examined security model.
Edit: LOL, downvotes because you don't like it. Apple made a bad design decision, and it's coming to bite them in the ass. Yeah, it sucks, but I'm not the one who designed the backdoor. Blame the corporate culture, not the messenger. Fix the problem and tell your customers about it. Don't pretend it doesn't exist.
There's nothing to argue. Apple's key is a designed-in backdoor, and this case is about whether they can be compelled to use it.
The right thing would have been for Apple to unlock the phone in question, and issue a security advisory for the longstanding but newly-relevant vulnerability. Then, if they want their devices to be secure against USG, actually fix their security model to remove their own privilege. Instead, they walked right into the FBI's trap on a completely unsympathetic case. They're setting us up for a horrible precedent simply to continue pretending they can provide a level of security that they didn't actually build.
It sounds like Cook wasn't talking about a "precedent" but about the actual OS:
"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."
Cook responded before there was much public discussion on the issue, and may have framed it in these terms thinking they would garner support (I never agree with those decisions, but it seems likely). He may also be under obligations to refrain from discussing certain aspects of the FBI's work or may for legal purposes not have been willing to make accusations (they only want this for precedent) that aren't possible to back up. He may even have his own motivations that diverge from those of the public (gasp). Regardless, we as tech-literate folk can understand that this isn't a master key but would be unprecedented (and therefore create new legal precedent).
It's beyond the capabilities of most iPhone users to lock their phone with a passphrase secure enough to withstand a trillion guesses per second, remember it, and deal with inputting it often. So yes, this is de facto a master key to iPhones (and of course some judge would expect this same type of software to be implemented for all iPhone versions).
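Rough arithmetic behind that claim, assuming an attacker who really can make a trillion guesses per second (the example passphrase schemes are my illustration, not from the comment):

```python
import math

GUESSES_PER_SECOND = 1e12  # the "trillion guesses per second" figure

def years_to_exhaust(entropy_bits: float) -> float:
    """Worst-case time to try every possibility, in years."""
    seconds = 2 ** entropy_bits / GUESSES_PER_SECOND
    return seconds / (365.25 * 24 * 3600)

pin_bits = math.log2(10 ** 6)         # 6-digit PIN: ~19.9 bits of entropy
diceware_bits = math.log2(7776 ** 6)  # 6-word Diceware passphrase: ~77.5 bits

# A 6-digit PIN falls in about a microsecond; the 6-word passphrase holds
# out for thousands of years, which is exactly the memorize-and-retype
# burden the comment says most users won't accept.
```

This is why the passcode retry limits and delays matter so much: without them, only passphrases far beyond typical usage survive.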
>The San Bernardino litigation isn’t about trying to set a precedent or send any kind of message. It is about the victims and justice.
This is a big fat lie.
The way the US legal system works makes any case like this about setting a precedent, be it intentional or not. The fact that the FBI director goes and straight-up lies about this just supports the theory that it may very well be intentional.
I don't believe this is tied to the San Bernardino case. The topic of weakening encryption has been brewing for the past 6-12 months, and the FBI is using the SB case as a convenient hammer to put a nail in the coffin of encryption.
>I don't believe this is tied to the San Bernardino case. The topic of weakening encryption has been brewing for the past 6-12 months, and the FBI is using the SB case as a convenient hammer to put a nail in the coffin of encryption.
This is about much more than just weakening encryption. I think Congressman Lieu put it well
>This FBI court order, by compelling a private sector company to write new software, is essentially making that company an arm of law-enforcement. Private sector companies are not—and should not be—an arm of government or law enforcement.
> ...it does highlight that we have awesome new technology that creates a serious tension between two values we all treasure — privacy and safety.
I hate this narrative so much. Just because we happen to be storing everything we do now, doesn't suddenly create tension between privacy and safety. We used to not store it at all! It's really just a question of keeping privacy with new tech.
Yes, it's a false dichotomy. Either we have a police state, or the terrorists win; take your choice! Don't forget the children, and the 0.0001% chance of being horribly, brutally murdered by terrorists!
I think it's reasonable to say that the HN crowd is predominantly more educated on this matter than the average American. We are not the target audience of this letter. Neither is Apple. Comey is playing on the emotions and technological ignorance of the bourgeois. He refers to brute-force hacking as "...try to guess the terrorist's passcode..." Then he plays on Americans' current distaste for large corporations and says that the "American people" need to decide whether this is right or wrong. No, we don't. Every cryptologist, cyber-security expert, and halfway decent IT person is saying that this is a catastrophic proposal with the power to mutilate the Fourth Amendment and send a whole Jenga tower of rights tumbling after it. It shouldn't even be a question.
The FBI appears to be using this tragedy as a trojan horse by which to commit atrocities upon our crucial freedoms. No, our thoughts and prayers are not enough, but our privacy and security is far too much.
Maybe, but it's fun to discuss, and there are enough of us spread throughout that can educate on this single issue. It's pretty easy for me to explain to my non-tech friends the implications of what the FBI is asking. Word of mouth travels fast and I would give pro-encryption the upper hand in this "debate". I have not heard anyone outside government protesting that Apple yield on this issue and I doubt I will.
In fact, the only ones who speak up against encryption are those who do not listen to the people. Since they're supposed to represent us, they will be very easy to not vote for.
Like you said though, it isn't even a question, encryption is here to stay whether the American government permits it or not. Ironically, by putting up such a fight, the government is simply telling criminals where the weak points are in law enforcement, and are thus empowering criminals.
I hope our government can have a good sit down with tech company leaders and experts in cryptology. Despite Comey's request to have an open discussion, they seem to be excluding this group. It's apparent from his discourse, Hillary's, and Obama's that they've spent no time sincerely listening to anyone with any knowledge about the benefits of encryption. The "conversation" he so desires has only happened in Washington among people with no tech background. Presumably there will be some public hearings coming up.
Could Cook even compel his employees to write the desired operating system? If I worked there and was asked to violate/reverse the core of my company's principles, I would quit.
Frankly, this type of order would deeply hurt morale at Apple. I imagine Cook knows that, and in addition to doing the right thing, he must double down on ensuring employee retention and future sales of products. End-to-end encryption is a feature many people buy the phone to get. If that disappeared, many techies would stop recommending it, and sales would slide.
There's a lot on the table here. I don't even think you could measure the impact for reimbursement by the government. sigh, at least you CA folks elected Ted Lieu, good job!
I wonder if they did or what would have happened if they had approached Apple in a non-compulsory manner regarding this.
He is right that the specific task they're seeking Apple to perform is more or less obsolete, and will be entirely so shortly, so I'm inclined to believe that they're not really going through all this to abuse that specific tool (though the precedent would be worth the fight in order to abuse it).
It would be sad, but entirely his fault, if his prior efforts to set bad faith precedents to end-run consumer encryption alienated industry to the point where productive conversations are no longer possible.
It seems that Apple got sick of servicing these requests, particularly as they started increasing in frequency. The NYTimes [0] reported earlier this week that Apple had been basically complying until last year, when they decided that they could no longer deal with the burden of evaluating the large amount of legal orders they were receiving and unlocking or retrieving data from phones. It seems unlikely, given their (new-ish) stance on privacy that they would comply with any request at this point.
Edit: I'm not trying to suggest that Apple is suddenly being arbitrarily obstreperous in cases where they may have once been compliant. Apple made a decision to change course and it wouldn't make any sense for them to go back on it just because the FBI asked "Pretty please?".
As far as I can tell Apple has never complied with a request for what the FBI is asking for here. It has complied in cases where the phone was not encrypted, or where a backup in iCloud could be made available to law enforcement, but Apple has never actually produced, signed and installed a custom "this phone only, and only so law enforcement can brute-force it" version of iOS.
And given that we're talking about a federal government that was perfectly happy with pictures of the TSA's master luggage key being published in newspapers, I don't fault Apple one bit for being scared of handing over what are basically the keys to the kingdom.
This request is materially different. The actions requested in the SB case and the NY case involve engineering around the way that older devices implement the newer iOS's security (they lack a physical Secure Enclave, so it's possible for Apple to engineer its way into the device). In previous instances Apple was either handing over copies of data or helping the FBI use a piece of software that could read unencrypted sections of the storage even if the phone was still locked.
I think Apple's got a strong position that this is a stretch of the AWA.
> We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That’s it.
Bull crap. The FBI has it within their power to break the encryption in less than a decade by spinning up tens of thousands of EC2 GPU instances. Forcing Apple to develop products that don't yet exist is simply a choice of economy, with the added benefit of being able to strong-arm any other business into creating new products in the future if it suits the FBI.
> Bull crap. The FBI has it within their power to break the encryption in less than a decade by spinning up tens of thousands of EC2 GPU instances.
How? They either need to break the PIN or they need to break the AES256 key that is used to encrypt file metadata.
They cannot use EC2 GPU instances to attack the PIN, because the function that derives the AES key from the PIN uses a key that is unique to each phone and not readable by software. They cannot get hold of it unless they resort to opening the crypto chip and trying to read the key by examining the hardware (and if they do that, they won't need a GPU; any desktop computer could brute-force the PIN quickly).
They can use EC2 GPU instances to go after the AES256 key for metadata encryption, but that will take a hell of a lot longer than a decade.
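The role of the device-unique key can be sketched in a few lines of Python. This is purely illustrative: Apple's actual derivation is a proprietary hardware function inside the crypto chip, not PBKDF2, and every name and parameter below is an assumption made up for the sketch.

```python
import hashlib
import os

# Stand-in for the key fused into the crypto chip at manufacture.
# Software on the phone (and the FBI) cannot read this value.
DEVICE_UID = os.urandom(32)

def derive_key(pin: str) -> bytes:
    # The PIN is tangled with the device-unique key, so the same PIN
    # yields a different AES key on every phone. (Real derivation and
    # iteration count are Apple's, not PBKDF2 with these parameters.)
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 100_000)

key_on_phone = derive_key("1234")

# An attacker with unlimited GPUs but without DEVICE_UID cannot test
# candidate PINs off-device: the same guess with a different UID gives
# an unrelated key.
key_guess_without_uid = hashlib.pbkdf2_hmac(
    "sha256", b"1234", os.urandom(32), 100_000
)
assert key_on_phone != key_guess_without_uid
```

This is why the only off-device attack left is the full AES-256 key itself, which is where the numbers below come in.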
AES256 has 115, 792, 089, 237, 316, 195, 423, 570, 985, 008, 687, 907, 853, 269, 984, 665, 640, 564, 039, 457, 584, 007, 913, 129, 639, 936 possible keys (I've put spaces after the commas to allow HN to wrap because HN can have trouble with long strings lacking whitespace).
Suppose Amazon has some kind of super GPU that can try 1 trillion (10^12) keys per clock cycle, and runs at 1 THz (10^12 Hz). Note that both of these far exceed any actual GPUs.
Suppose Amazon has a trillion of these available and you use them all. To search that entire key space would take 3669308250866118434801967410403995 years. On average you only have to search half the key space, which cuts this to 1834654125433059217400983705201998 years.
Assuming that this computation can be done super efficiently...so efficiently that for each key tested only a single bit actually has to irreversibly change state, and that we are doing this in a system cooled by liquid helium, it would need a minimum amount of energy equal to the entire energy output of the Sun over 3.6x10^20 years. (I don't remember if that is worst case or average, so let's be generous and call it only 1.8x10^20 years worth of total solar energy output).
Do you understand what "two raised to the 256th power" means? No, you can't brute-force AES-256 in years or decades, even if you owned Amazon and were willing to use all the servers for this.
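The arithmetic above checks out in a few lines of Python, using the same deliberately over-generous assumptions (10^12 GPUs, each testing 10^12 keys per cycle at 10^12 Hz):

```python
# Back-of-the-envelope check of the brute-force estimate above.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

keys = 2 ** 256                       # AES-256 key space
rate = 10**12 * 10**12 * 10**12       # GPUs x keys/cycle x clock (keys/sec)

full_search_years = keys / rate / SECONDS_PER_YEAR
avg_search_years = full_search_years / 2

print(f"{full_search_years:.2e}")     # ~3.67e+33 years for the full key space
print(f"{avg_search_years:.2e}")      # ~1.83e+33 years on average
```

Even with a trillion impossible GPUs, the search runs about 10^23 times longer than the current age of the universe.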
"The San Bernardino litigation isn’t about trying to set a precedent or send any kind of message. It is about the victims and justice."
When the government revokes the Gulf of Tonkin resolution I will again put faith in its ability to act responsibly with emergency powers and provisions that might, just might, set a precedent. Until then, fool me once, shame on you, fool me twice, fuck you.
I don't like it, but I think we're better off if the courts side with FBI on this case, because it should accelerate the inevitable: either Apple and other mobile device makers give up on privacy and governments around the world get whatever they want, or they make zero knowledge devices that can't expose their data unless the user complies.
In the meantime, FBI losing this case just means they try again. And again. And so will other governments, not just federal, but state and local. And foreign.
The reality is Apple can change the software to in effect cause the data to become accessible. Apple doesn't have keys to the front door. But they have the power to weaken the hinges. Saying they won't do that isn't the same as it not being possible.
There's a real media posturing war going on right now. I wish Apple would win on the merits of the argument alone, but it's obvious the FBI thinks it can rely on the terrorism angle alone.
Notice that Comey doesn't even bother to refute the technical and legal precedent arguments that Apple and other privacy advocates raise in the media.
Google or Facebook have to get involved and put something on their homepages; else, I think that the FBI wins this.
There's a certain hypocrisy I'm noticing in these forum discussions whereby many of you are distrusting the FBI, but apparently you're fine with Apple having the best chance of cracking your phone.
Everyone is going on about encryption, but notice it's not just encryption that protects us, but also vendor-controlled measures such as self-destruct.
I prefer a level playing field in these matters. I don't want the FBI to have a better chance at cracking my phone than anyone else, but I also don't want Apple to have the best chance either.
It's all or nothing. My vote is for strong encryption, but up until that point, everything else should be on the table for all parties such as crime investigators, not just exclusively controlled by tech giants.
The hypocrisy is only superficial. It seems at first thought that the government (being owned by the people) would be more trustworthy. But...
Based on incentives, the government is incentivized to put as many people in jail as possible. Apple is incentivized to make people's phone/laptop experience as good as possible.
Based on worst-case actions, Apple can use your data to 1) sell you more devices, 2) generally reduce your consumer purchasing power, or unlikely 3) sell/share your data and seriously compromise your privacy. The government, on the other hand, can throw you in jail, for life.
Based on history (Germany, Russia, China, Iran, North Korea, ...), governments have demonstrated real threats with user data in hand that corporations have never really come close to competing with.
Your idea that Apple are incentivized to make people's phone experience as good as possible sounds like a wishful proposition. There's plenty of examples of Apple features and restriction designed as eco-system lock-in, behavior manipulation, or control limiting for commercial reasons. "What's good for Apple is good for our customers" is what they want you to believe, but as a long time iOS device owner, I reject that on numerous grounds.
There's also an equivalence flaw in your argument that sounds like it's coming straight from infowars.com: "the government is incentivized to put as many people in jail as possible".
I'm with you on incentives, though. And one thing Apple has no commercial interest or incentive in is fighting crime.
We don't live in Star Wars. There's more than dark vs light. My position doesn't mean I am not aware of western government corruption, bungling, even war crimes. But I am not permanently polarized, forever holding "the government" in contempt for misguided actions and evil intentions.
On history... if we're talking tyrannical governments, then "protection of user data" I suggest would not have stopped any given government in your examples from unleashing hell on its people one way or another.
Well I think my main point was just to say that the "hypocrisy" you mention actually has a rational backing. Not that the backing is bulletproof, but it's certainly more than a hypocrisy.
You seem to most strongly disagree with the assertion that "the government is incentivized to put as many people in jail as possible". It's pretty clear from my original comment that "the government" is referring to law enforcement agencies, such as the FBI. Note that the FBI has thousands of agents whose performance is measured ultimately by the percent of cases they close. Thus, the claim that FBI agents "are incentivized to put as many people in jail as possible" is more an observation than some crackpot theory. It doesn't mean that we should change that - running a law enforcement agency any other way wouldn't make much sense. But it does provide the rational backing for someone to be more concerned about worst-case government abuse of data than worst-case corporate abuse.
> Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t.
Interesting admission that they don't really have "probable cause" here, rather they apparently want to go on a fishing expedition. I guess if the owner of the phone is already established as a terrorist then that is cause enough, but I still would have expected them to be arguing they have specific reason to believe the phone is necessary for their investigation.
The owner of the phone is San Bernardino County, which gave the FBI permission to search it. Pretty sure probable cause is a moot point when the owner of the phone grants you permission to search it.
Absolutely. I just think it significantly weakens their argument to admit they don't even know whether the phone data will be useful. I would have thought that when they go to court to compel a company to do something, they would assert something more than "who knows, it might be useful?" in support of their case.