
Your iPhone just got less secure. Blame the FBI - The Washington Post
https://www.washingtonpost.com/posteverything/wp/2016/03/29/your-iphone-just-got-a-lot-less-secure-and-the-fbi-is-to-blame/
======
studentrob
The FBI's refusal to detail the flaw will just add to the pile of
miscommunications between technologists and the government. That hurts the
government's ability to advance their own technological capabilities and
understanding. Every day, they're getting better at shooting themselves in the
foot and widening that communication gap.

I see nobody out there capable of bridging it. Not Tim Cook, not the EFF, not
Obama, and certainly not the DOJ.

Bruce Schneier's previous coverage from July 2015 [1] is what first got me
interested in and up to speed on the recent San Bernardino case. Even if
Apple isn't demanding the FBI's method at this moment, I respect what Bruce
has to say here.

[1]
[https://www.schneier.com/blog/archives/2015/07/back_doors_wo...](https://www.schneier.com/blog/archives/2015/07/back_doors_wont.html)

~~~
exelius
I agree with you, and I think this will eventually lead to a world where
governments are unable to exert meaningful influence on large corporations.
We're already starting to get there; I have a feeling that if the Supreme
Court had forced Apple to write a custom version of iOS, things could have
gotten really messy very quickly -- there were rumors that Apple's entire iOS
engineering team was ready to resign if the case went the wrong way. It's
plausible to see a scenario where Apple says "You know what? Fuck it, we're
based in Ireland now."

Ultimately, I don't think governments are designed to deal with corporations
that make as much money as a company like Apple does. These companies are the
size of governments -- if Apple decided it wanted to hire a bunch of
mercenaries and take over a small country, it could probably do so (if it
didn't mind getting embargoed by whoever was friendly to the country they took
over).

I wouldn't be surprised to see corporate sovereignty become a big
international issue in our lifetimes. International law is a huge grey area,
and I expect companies to exploit that to their advantage to avoid enforcement
actions by individual nations.

~~~
wyldfire
> Fuck it, we're based in Ireland now

Apple has an enormous investment in their design team in Cupertino. Starting
over somewhere else would severely impact their product development
capability.

It's not enough to say "HQ is over here bro," court orders still work in
California. Then again this whole All Writs effort to "build me a tool to help
my investigation" seems to break new ground. Maybe it wouldn't even be enough
to pack up and move all R&D out of the US (e.g. injunction on US sales until
the foreign Apple company complies).

~~~
studentrob
> Maybe it wouldn't even be enough to pack up and move all R&D out of the US
> (e.g. injunction on US sales until the foreign Apple company complies).

There are already state level bills proposing this in CA and NY [1]. Those
bills would fine manufacturers $2,500 for every phone sold in those states
that isn't capable of providing decrypted data. The language originally comes
from a white paper by Manhattan DA Cyrus Vance. Feinstein-Burr are working on
a similar federal bill which was supposed to be released late last year, then
this month [2]. It has obviously been delayed by the public's response to the
SB Apple case.

[1]
[https://www.techdirt.com/articles/20160122/06200833403/calif...](https://www.techdirt.com/articles/20160122/06200833403/california-
legislator-says-encryption-threatens-our-freedoms-calls-ban-encrypted-cell-
phones.shtml)

[2] [http://www.politico.com/tipsheets/morning-
cybersecurity/2016...](http://www.politico.com/tipsheets/morning-
cybersecurity/2016/02/march-is-encryption-bill-month-hackers-going-after-
japans-infrastructure-a-mixed-final-2015-tally-212865)

------
kazinator
This is awful reporting.

For weeks, security experts had been saying that the FBI does _not_ in fact
need Apple's help to get into that phone; that they are just posturing in
order to obtain a back door.

This was posted on the ACLU website:

[https://www.aclu.org/blog/free-future/one-fbis-major-
claims-...](https://www.aclu.org/blog/free-future/one-fbis-major-claims-
iphone-case-fraudulent)

 _The FBI can simply remove this chip from the circuit board (“desolder” it),
connect it to a device capable of reading and writing NAND flash, and copy all
of its data. It can then replace the chip, and start testing passcodes. If it
turns out that the auto-erase feature is on, and the Effaceable Storage gets
erased, they can remove the chip, copy the original information back in, and
replace it. If they plan to do this many times, they can attach a “test
socket” to the circuit board that makes it easy and fast to do this kind of
chip swapping._
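The procedure the ACLU describes can be sketched as a toy simulation. All
class and function names below are hypothetical stand-ins for real
desoldering rigs and NAND readers, not an actual tool; the point is just the
restore-and-retry loop:

```python
# Toy model of the ACLU's NAND-mirroring attack: copy the flash,
# brute-force the PIN, and restore the flash image whenever the
# auto-erase counter triggers.

class SimulatedPhone:
    def __init__(self, pin, wipe_after=10):
        self.pin = pin
        self.wipe_after = wipe_after
        self.failures = 0      # lives in Effaceable Storage on a real phone
        self.wiped = False

    def read_nand(self):
        return self.failures   # "copy all of its data" off-chip

    def write_nand(self, image):
        self.failures = image  # put the original flash image back
        self.wiped = False

    def try_passcode(self, guess):
        if self.wiped:
            return False
        if guess == self.pin:
            return True
        self.failures += 1
        if self.failures >= self.wipe_after:
            self.wiped = True  # auto-erase triggered
        return False

def brute_force(phone):
    clean_image = phone.read_nand()
    for pin in range(10_000):          # every 4-digit PIN, 0000..9999
        if phone.try_passcode(f"{pin:04d}"):
            return f"{pin:04d}"
        if phone.wiped:                # restore and keep going
            phone.write_nand(clean_image)
    return None
```

With only 10,000 possible PINs and roughly one restore per ten failures, the
attack is tedious but entirely mechanical -- which is the ACLU's point.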

Maybe the FBI has a secret new exploit --- or maybe they just did the above
method, getting the "third party" help with the desoldering and the attachment
of a socket, and hardware for reading NAND.

It's just speculation.

Even if the FBI are exploiting some hole, that is better than them having a
back door, which is essentially a security hole put in by design.

The article is speculating, and it's conflating security holes with back
doors.

~~~
gravypod
I don't know about anyone else, but iPhones seem to be the focus of all the
academics who do security research.

I'm always seeing X done to the iPhone.

Just these past few weeks I remember seeing a key extraction done via EM
emitted from the phone.

------
tptacek
Schneier has never had a strong intuition for how software vulnerabilities
work. In the 2000s, he wrote articles in his newsletter blaming eEye (a
security research firm then the home of Derek Soeder, Barnaby Jack, Ryan
Permeh, and the like) for _publishing_ their vulnerability research. He is at
turns anti-disclosure, pro-disclosure, and all points in between.

~~~
bertil
My understanding is that “disclosure” is a nuanced thing, time-wise:
responsible disclosure means mentioning the vulnerability first to someone
who can and intends to fix it, giving them the time to write, test and ship
a patch, and then publishing it. Publishing it earlier sounds very
unreasonable, especially before handing the details to the manufacturer.

I am not familiar with what Schneier said 15 years ago, though. He might have
chastised someone for doing that; he might have changed his mind. If he hasn’t
in a decade and a half, I’d be shocked.

I realised this morning that 15 years ago, I was proud of using SAS, “the most
powerful analytics software there [was]”. I changed.

~~~
tptacek
There is no such thing as "responsible disclosure". That's a term invented by
vendors to coerce independent researchers into doing free work for them.
Semantic drift has somewhat legitimized the term, but I think it's important
we remember why it was conjured in the first place.

~~~
wolf550e
What would you call Google Project Zero's 90 day policy?

~~~
tptacek
A decision they had every right to make for themselves, but that they have no
right to impose on anyone else. I'm pretty sure Tavis Ormandy agrees with me.

------
mikeash
I find this rather silly. iPhones didn't get less secure because the FBI used
a known vulnerability to break into one. iPhones were that insecure all along,
and the only thing that changed is that we now know it.

The article further states, "There’s no such thing as a vulnerability that
affects only one device." Except that I'm pretty sure that whatever attack the
FBI used relied on the fact that the phone in question had a short passcode
set. I'd bet dollars to donuts that whatever attack they used would _not_ work
against my phone with a secure passphrase.

I'm usually a fan of Schneier, but I think he really missed with this one.

~~~
terrywilcox
If phones are no less secure when the existence of a vulnerability is
disclosed, why do we prefer security researchers to notify the manufacturer
before disclosing the vulnerability?

And perhaps the vulnerability in question wouldn't work against your iPhone,
but how many millions of iPhones still use a short passcode? This
vulnerability could still affect all of those phones.

~~~
mikeash
Disclosing vulnerabilities makes us more secure, that's why we want people to
do it. That doesn't imply that failing to disclose them makes us _less_
secure. Not doing anything leaves us where we started.

------
AdmiralAsshat
To be clear, the FBI isn't _compelling_ Apple to leave the vulnerability open,
are they? My understanding is they simply found/were informed of a
vulnerability that existed in this model of the phone, and are not informing
Apple of it. If Apple themselves finds the vulnerability and patches it, the
FBI can't do anything about it.

~~~
wrsh07
Correct. And to be fair, the FBI could have received the vulnerability from a
foreign state that is also opposed to terrorism but did not want their
identity or methods known.

ie they may have reasons for not disclosing the vulnerability, and it's a
shame that there's so much mistrust of the government [caused by actions of
the government] so that we can't assume the best.

------
mgberlin
Without disclosure of the method used by the FBI, this sounds like nothing
more than political posturing and a non-story. Apple is no longer able to
fight the All Writs Act, and the FBI is no longer compelled to watch one of
their favorite misuse-able laws get taken away.

------
mortenjorck
Schneier knows this, and this is a particularly idealistic op-ed, but this is
just how the exploit market works and while it would be nice if law
enforcement would take the white-hat road, the hazard here is still vastly
better than some kind of legal precedent for requiring backdoors.

The good thing about the exploit market is that it is naturally self-limiting:
you don't burn a zero-day on a dragnet; you limit its use to high-value (and
ideally court-sanctioned) targets.

~~~
jfoster
There's a bit of irony to this, too. If Apple had been more cooperative
earlier, law enforcement probably would've taken the white hat approach. Apple
protected users from the FBI, but is now potentially unable to protect them
from organised crime.

~~~
laveur
There is absolutely no proof of that. What they were asking for were keys to
every lock in the world. It would have given them unprecedented power to do
whatever they wanted. Apple did the right thing. Users should always come
first, especially when privacy is at stake.

~~~
ikeboy
They were willing to hand over the phone to Apple and allow all work to take
place on Apple premises. How does that affect other phones?

~~~
woodman
So have you not heard that this would set a legal precedent? Apple would get
pestered until they finally created an automatic gateway for turnkey
security breaking, not unlike the wiretap portals set up by telecoms - or
the frequently abused YouTube DMCA mechanisms.

~~~
ikeboy
It only sets a precedent that they can give phones to Apple and have them
unlock them, as long as it's possible for them to do so. If Apple decides to
build a general backdoor and hand it to the government that's entirely their
decision.

~~~
jfoster
In fact, it also only sets precedent if they have to legally force Apple to do
it. Apple could've used their discretion to comply in this particular case,
avoiding courts, and avoiding legal precedents being set.

------
Karunamon
NPR was playing this story up as if it's a blow for Apple - is it really?
Isn't the vulnerability the fact that the phone has no secure enclave, and so
the timeout/wipe can be worked around by external access to the flash?

Isn't that the whole reason the newer phones were upgraded?

Older device fails, newer device with improved security doesn't. That's not a
blow to Apple, that's the way the world works.

~~~
mikeash
As far as I've been able to figure out, the Secure Enclave does not have its
own storage. The proposed attack of cloning the phone's flash memory would
work just as well on a new iPhone 6s. A lot of people are assuming that the
Secure Enclave would prevent this attack, but I've not yet been able to find
any basis for that assumption.

The main security advantage of newer phones in this context is that Touch ID
makes it practical to set a more complex passphrase. The flash cloning attack
only works if you have a short passcode set, because it relies on brute force.
If your passphrase is complex enough that it can't be brute forced even when
the software controls are removed, then you're still safe.
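Rough arithmetic backs this up. Assuming Apple's published figure of roughly
80 ms per on-device passcode-derivation attempt (the passcode key is
entangled with the hardware UID, so guessing can't easily be moved
off-device), worst-case brute-force times look like this:

```python
# Back-of-the-envelope worst-case brute-force times, assuming ~80 ms
# per passcode-derivation attempt.

ATTEMPT_SECONDS = 0.08

def worst_case_seconds(keyspace):
    """Seconds to try every possible passcode in the keyspace."""
    return keyspace * ATTEMPT_SECONDS

four_digit = worst_case_seconds(10**4)   # ~13 minutes
six_digit  = worst_case_seconds(10**6)   # ~22 hours
# 8 characters drawn from ~62 letters and digits:
passphrase = worst_case_seconds(62**8)   # hundreds of thousands of years
```

So the cloning attack is only interesting against short numeric PINs; a
reasonably long alphanumeric passphrase pushes the worst case far beyond any
practical timescale.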

I'd wager that when the iPhone 7 ships this fall, it will have an upgraded
Secure Enclave with internal storage that can prevent this attack entirely.

~~~
Karunamon
That's absolutely correct, come to find out. For some reason I thought the SE
was responsible for holding some of the keys and the wipe counter, but instead
it's a section of the NAND flash called "Effaceable Storage".

Their own whitepaper defines it:

 _A dedicated area of NAND storage, used to store cryptographic keys, that can
be addressed directly and wiped securely. While >>it doesn’t provide
protection if an attacker has physical possession of a device<<, keys held in
Effaceable Storage can be used as part of a key hierarchy to facilitate fast
wipe and forward security_

(emphasis mine)

However, it appears that the FBI attack only worked because the people in
question used the 4 digit pin. A strong passcode is the way to go and protects
you from these kinds of brute force attacks.

~~~
mikeash
I initially assumed the same as you, because it would just make so much sense.

And yes, it looks like all of this came down to the short PIN. It's likely a
six-digit PIN would still fall, but a proper passcode would have made this
whole affair moot. It looks like even the older phones are still totally
secure in that case, although the lack of Touch ID can make it somewhat
impractical.

This is a major problem I have with the "iPhones got less secure" idea. Apple
has gone through heroics to make short passcodes somewhat secure, but there's
only so much you can reasonably expect there.

------
dahart
Is this just FUD? Simply confirming the vulnerability seems likely to lead to
it being plugged, whether or not the FBI reveals their methods, in effect
doing the opposite of what the title suggests.

~~~
llamataboot
How can it be plugged, if Apple doesn't know what it is?

~~~
runjake
Because Apple is the OEM of the hardware and software involved, things tend
to get back to them, even if only in rumor form.

And knowing their software so well gives them intimate knowledge when
tracking down vulnerabilities.

------
koenigdavidmj
We'll find out based on how many of the similar cases are withdrawn.

If the answer is zero, they're probably lying.

------
eggbrain
I find myself divided on this article, as a person who values security very
strongly.

1. If the vulnerability the FBI used worked because the device was an iPhone
without a secure enclave, Apple probably knows how they did it, but they can't
really fix devices that have already shipped without the security features.
While this obviously hurts users of those phones, every phone going forward
won't have this issue, and this won't be replicable on a mass scale.

2. Because this was a vulnerability that was found, not intentionally
created, there is a high likelihood that the bug will be found again by
security researchers, or at the very least paid for handsomely by Apple. This
isn't necessarily true (Heartbleed existed for 2 years without being noticed),
but it means that the vulnerability has a timetable that rapidly closes. This
is far different from an _introduced_ backdoor/vulnerability, where Apple
knows exactly what can be used to get into a device, but has their hands tied
by the government, which would _unilaterally_ make our phones less secure. I
don't like buying a door lock if I know there's a master key that can open any
of the doors of that brand.

3. I feel the author is misleading when he says things like "A vulnerability
in Windows 10, for example, affects all of us who use Windows 10". Even if a
piece of software has a vulnerability, that vulnerability could perhaps only
be exploitable under certain conditions, like software running on certain
hardware (eg: without a secure enclave), or under certain conditions
(passcodes of less than 4 digits). It also can be highly theoretical or
impractical to do on any sort of scale -- if the vulnerability involves
reading data off a hard drive using an electron microscope to check for
magnetic signatures, I'm not going to be too worried that it will be abused,
as the man hours to have it work for a single case would be astronomical and
only feasible in the most extreme circumstances.

It's probably very frustrating on Apple's part that the FBI found a
vulnerability that they (likely) don't know about, and in an ideal world,
governments would disclose those vulnerabilities to make us more secure. But
as long as the software and hardware continue to get more secure and not
intentionally crippled, any benefit they derived feels short lived at best.

~~~
coldcode
I am sure Apple knows exactly what they did, it's their system and their own
hacking team likely finds stuff like this. The key is if anyone else in the US
is able to purchase this hack to unlock a phone: if they do and try to
reference evidence based on that a judge will require the process be
disclosed.

------
akerro
It always was insecure, now we just know for sure.

~~~
pazra
We do not know for sure.

------
S_A_P
I think this just feeds the myth that the iPhone was invulnerable to
exploit. The average Washington Post reader isn't going to know the
intricacies of mobile security. NPR, and to some extent this article, make
it seem like the iPhone was previously secure, and now less so. There is no
data supporting either hypothesis, but it's likely that the exploit the FBI
used was not newly discovered.

I am curious, however, as to whether this affects only the 5c or if it affects
the newer 6/6+/s/s+

~~~
deegles
I have a CS degree and I don't know the intricacies either.

------
julie1
Given how much corruption is revealed in WikiLeaks, maybe less privacy is
not a bad trade-off.

What amazes me is our society's tendency to call for more privacy, which
favors the more powerful, hence the more likely to be corrupted.

If we wanted to make it easy for our leaders to be corrupted, I would
definitely call for more cryptography.

Like common people with their normal "shameful" stuff, I want privacy too.
Like anyone, I have things I want to hide. But it is okay if I get
_caught_. I will not claim that everything I did was okay, because I have
still indirectly hurt people. There are things I want to hide to protect
myself from the intolerant crowd.

But guess what: as much as I could, I went and presented my apologies, and
took responsibility for my own shitty actions. And for the things I should
not be ashamed of, I don't see the need to hide them; in a democracy we have
the freedom to fight for our opinions.

None of my stupid stuff requiring privacy has led to blood being shed, to
exploitation, or to making the market noncompetitive.

With greater power and wealth should come greater transparency. And iPhones
are definitely more expensive than most phones.

So let's remember that it is often the gaze of others that makes us more
virtuous, and let us all accept living in a house made of glass (except for
the bathroom and the bedrooms).

In a fair competitive market, access to information is symmetrical. In a
real democracy, governments are expected to openly enforce the choices of
the voters.

In a world tending towards virtue, there is no need for more privacy.

------
ctdonath
The "vulnerabilities equities process", adopted in 2010, requires the FBI
disclose the exploit details to Apple on request.

More info can be found in an EFF case:
[https://www.eff.org/files/2014/07/01/eff_v_nsa_odni_-
_foia.p...](https://www.eff.org/files/2014/07/01/eff_v_nsa_odni_-_foia.pdf)

~~~
kodablah
Maybe I misread the redacted doc [0], but from what I see, it is sent to the
Equities Review Board (ERB), which then determines whether to disseminate. I
was unable to find the requirement to disclose. Can you point to it?

0 - [https://www.eff.org/document/vulnerabilities-equities-
proces...](https://www.eff.org/document/vulnerabilities-equities-process-
january-2016)

------
Zigurd
There are reasons to doubt the FBI is being honest about this matter: Firstly,
they must have known they could desolder and copy the chip containing the PIN-
protected key, replacing it if it were erased. This method was put forward by
multiple security experts. So the FBI started from a dishonest position
regarding Apple being the only source of assistance.

Now they claim that a mystery exploit emerged and was put to use between
Sunday night and Tuesday. That's not enough time to crack the phone using a
pile of chips and brute-forcing the PIN.

Since the FBI started from a dishonest position, and hasn't spent enough
time to plausibly have used the known approach, _and_ they have carefully
avoided saying they actually got keys and/or encrypted data out of the
phone - or that they did anything at all with the phone - there is a more
than small chance that the FBI has simply moved their own vague goalposts
and declared that they have everything they need.

Not honest or ethical.

------
wyldfire
There's a typo in the URL for the "process" link in "They even have a process
for deciding what to do when a vulnerability is discovered."

Should be [https://www.whitehouse.gov/blog/2014/04/28/heartbleed-
unders...](https://www.whitehouse.gov/blog/2014/04/28/heartbleed-
understanding-when-we-disclose-cyber-vulnerabilities)

It's interesting that the government (at least this part: "Special Assistant
to the President and Cybersecurity Coordinator") recognizes the pros/cons:

> Disclosing a vulnerability can mean that we forego an opportunity to collect
> crucial intelligence ... Building up a huge stockpile of undisclosed
> vulnerabilities while leaving the Internet vulnerable and the American
> people unprotected would not be in our national security interest.

------
allemagne
Ideally (for the FBI), the FBI would order Apple to build a backdoor into
their system. This is functionally the same as ordering Apple not to patch a
known vulnerability. In turn, this is functionally the same (in the short
term) as not disclosing a known vulnerability to Apple.

While the article's title is misleading, the FBI might as well be taking an
active role in making the iPhone less secure. In communicating to the non-
technical public, insisting on elaborating the difference is being pedantic.

------
wimagguc
Why did the FBI drop the lawsuit against Apple in the first place? They both
made it sound like the stake was much higher than unlocking the one device.

Or, the FBI managed to unlock this phone and so they can unlock any other
devices too, now-and-forever? They don't need a backdoor any more?

Wouldn't the logical next step from Apple's side be to protect their users in
their future devices? Wouldn't then the FBI have to sue them again?

Feels like we are missing key facts here, and I wonder why Apple is not
talking now.

~~~
XorNot
It's a writ. It is legally only valid if there is necessity and the court must
be informed if that changes.

------
frogpelt
Hilarious. Apple wouldn't give the FBI access but now the FBI should give
Apple access to their information?

This conversation has jumped the shark.

------
spriggan3
> We don’t know what the method is, or which iPhone models it applies to.

Where is the proof? there is no proof.

~~~
CaptSpify
How do you prove something you don't know?

~~~
spriggan3
Well, the burden of proof lies with the person who states something as a
fact. What is the title of that blog post again? Right. Now let the author
show you the proof. If he doesn't know, then he shouldn't talk about it like
it is a fact. The only fact in this whole debacle is that the FBI has been
caught lying again, again and again. So they were lying then, and people
should believe them now?

~~~
CaptSpify
> Well the burden of the proof lies with the person that states something as a
> fact.

Sure, but they aren't really stating a fact. They are saying they don't know
if it's fact or not.

I agree the title of the post is terrible and inaccurate, but the formatting
of your comment is focused on the quote, not the title.

------
sveit
Can we file a Freedom of Information Act request to have the method the FBI
used disclosed?

------
Mistri
TBH there have been so many security holes in the iOS software... it did not
"get less secure"; the holes have been there all along. If you're an iPhone
user, you notice the CONSTANT iOS security updates... most of them are
because someone found a hole and it was patched! I just hope that the FBI
lets Apple know of this hole once the case is over (but I doubt that will
happen), or that Apple figures out how they did it. If Apple can't find out,
it won't be very easy for others to get into iPhones either... but it's
still possible.

------
frade33
Changelog: iOS 9.4, 1st April 2016 - Fix FBI backdoor

------
HillaryBriss
Bruce singles out the FBI for blame in this case, but if that's right, then
whatever third party helped the FBI also deserves blame.

------
gigq
This is the company the FBI supposedly contracted
([http://www.cellebrite.com/Pages/cellebrite-solution-for-
lock...](http://www.cellebrite.com/Pages/cellebrite-solution-for-locked-apple-
devices-running-ios-8x)), if that's the case it's just older devices that
don't have the secure enclave which is not surprising.

------
darkseas
Did anyone else get the ad from Huawei? "Focus Perservere Breakthrough"
..delicious..

------
lmeunier
I don't know why I still haven't seen this here, but there are plenty of
videos on YouTube on how to bypass the 10-wrong-passcode limit. I don't
understand: it looks so easy, and still people are talking about encryption
modules etc... Am I missing something?

------
incepted
Blame the FBI?

How is it the FBI's fault that this particular iPhone was not secure?

It's mystifying to me how far Apple advocates will go to not blame their
favorite company.

------
exelius
This is bad reporting.

The iPhone did not get less secure. It has always had this security hole. I,
like many others here on HN, believe the vulnerability to be related to the
lack of a secure hardware biometric / encryption module. If this is the case,
then your iPhone probably did not get less secure -- such exploits would only
work on iPhones prior to the 5S (I think? The 6 series phones are covered for
sure). Basically, if your phone supports Apple Pay, you're good.

And if it's the case, it was a design decision rather than a bug. The secure
hardware wasn't ready for consumer use when those devices were designed (and
even then, the first generation of iPhone fingerprint readers was pretty bad).
Apple knew a desoldering attack could be successful, but such attacks are very
expensive, require long-term physical possession of the phone, and are
impossible to pull off covertly. There may be another way to pull off a direct
hardware access attack without actually needing to desolder the flash memory,
but that would still only be effective in the absence of a hardware security
module. The only defense against this type of attack is a secure hardware
encryption module like the one Apple included to support Apple Pay (because
the banks likely insisted on this level of security).

~~~
wrsh07
Funny to call it reporting when it's more of an editorial by the renowned
security researcher Bruce Schneier.

However, I'll defend his point: take the Monty Hall problem
[[https://en.wikipedia.org/wiki/Monty_Hall_problem](https://en.wikipedia.org/wiki/Monty_Hall_problem)].
The probabilities change, even when a door you didn't pick [and doesn't hold
the prize] is opened.

I think this is a fair analogy. We've now gained knowledge about the existence
of a vulnerability, and that knowledge makes everyone's device less secure
[for a variety of reasons -- eg others know to look for the vulnerability,
others know it can be bought, etc]. It doesn't matter that the vulnerability
"has always been there" if nobody knew about it.

~~~
adekok
> The probabilities change, even when a door you didn't pick [and doesn't hold
> the prize] is opened.

That's the common misunderstanding of the problem. Most people think that the
probabilities go from 1/3, 1/3, 1/3 to 1/2, 1/2 after choosing a door and
having Monty Hall open one of the others. The probabilities don't change.

The probabilities are 1/3, 1/3, 1/3 at the start. After you choose a door,
they're still 1/3, 1/3, 1/3. Or, put a different way: 1/3 (your choice), 2/3
(what you didn't choose). When Monty Hall opens one of the other doors, the
probabilities don't change. It's still 1/3 (your choice), 2/3 (what you
didn't choose).

However, because he's removed one of the doors from the problem, the 2/3
probability is now applied to the remaining door.

That can be viewed as "probabilities changing" for the remaining door, but
it's better described as the probabilities being shared between the doors you
didn't choose.

~~~
zodPod
This is a great clarification of how this problem works. I STILL can't grasp
why you'd switch. Since you have no idea which door it is, couldn't the 2/3
probability be applied to either his door or your door? For all you know,
the one that he removed was just a random one of the goat doors. Your chance
of picking the car was 1/3 before; if you could have the car already, why
would it be better to switch now that he removed whichever one would
definitely be a goat? It seems like you still have a 50/50 chance of getting
the car, and the remaining (unpicked/unremoved) door is just as likely to be
the other goat as it is to be the car.

That said, I made a simulation of it where it does the following: 1. Assign
random locations behind doors, 2. Player randomly chooses door, 3. Host
excludes door (random if both are goats, goat if one is a car), 4. Player
either switches, or doesn't switch based on what's chosen for that simulation
(the full set gets one or the other)

After running 50,000,000 trials and dividing successes by tries, my numbers
are nearly exactly the same between the two. Is my simulation working
incorrectly?
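For what it's worth, a straightforward simulation of exactly the steps
described above (random car placement, random pick, host opens a goat door
that isn't the player's pick) does show the ~1/3 vs ~2/3 split:

```python
import random

def play(switch, rng):
    """One round of Monty Hall; returns True if the player wins the car."""
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # Host opens a goat door that the player didn't pick
    # (random choice if both remaining doors hide goats).
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one door that is neither the pick nor the opened one.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(42)
trials = 100_000
stay = sum(play(False, rng) for _ in range(trials)) / trials
swap = sum(play(True, rng) for _ in range(trials)) / trials
# stay converges to ~0.33, swap to ~0.67
```

A simulation that comes out 50/50 usually has the host opening a door at
random (sometimes revealing the car) or re-randomizing the car between the
pick and the reveal; step 3 as written above, done correctly, gives 2/3 for
switching.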

~~~
wrsh07
I find it clearer if you have 100 doors. Pick one. The host eliminates 98
doors [with no prize behind them].

Do you switch? Or do you stick with your original guess? As the parent
mentioned, switching gives you a 99/100 chance of being correct, staying gives
you a mere 1/100 chance.

~~~
ricardobeat
I don't see why the other door has 99x more probability of being correct. It's
still 50/50, original door or remaining door, the other 98 don't change the
odds.

~~~
pklausler
Your initial pick has a 1/100 chance of being right. If you switch, and you
were right, you lose. On the other hand, your initial pick has a 99/100 chance
of being wrong, and if you were wrong, and you switch, you win. So switch.

------
known
Windows is less secure. Can we blame FBI?

------
intrasight
I don't blame either of them. But I do blame users who put confidential
information on their phone with the expectation that it will remain so.

------
artursapek
If Apple refused to comply with the FBI's request, why should the FBI owe
Apple a disclosure of this vulnerability they found?

Keep the downvotes coming, lads! They're meant for burying spam and junk
comments, not expressing disagreement, but I enjoy them anyway.

~~~
enlightenedfool
If I find a vulnerability and exploit someone's device, it's not okay under
law. Why is a government institution exempt from law in a supposedly exemplary
democracy?

~~~
lmz
Because it's not someone's device anymore. It's government evidence?

~~~
ceejayoz
It was the county's device to start with, too. Not the shooter's.

------
noobie
I am sure they just hired some interns to

    code = 0000
    while not code_is_valid:
        enter code
        reboot device
        code++

Edit: I am pretty sure there's a delay before code execution registers a
wrong PIN, hence by rebooting the phone the wrong PINs won't get detected.

~~~
hyperliner
Good one, though I can see there is a problem with this "algorithm"! haha :-)

~~~
noobie
What might it be?

I assumed the code is just 4 digits long but judging by the quick downvotes
people aren't even considering this to be a possibility.

~~~
cjbprime
There's a mechanism that wipes the device after too many incorrect code
attempts. Disabling this mechanism was the entire point of the FBI/Apple
lawsuit.

~~~
noobie
I am pretty sure a reboot bypasses that mechanism.

~~~
function_seven
Why would you be pretty sure of that? That would be totally braindead if a
simple reboot bypassed the limit. It doesn't. There was a flaw in an earlier
iOS version where you could reboot the phone a few milliseconds after an
incorrect PIN, before the phone recorded the failure, but that was fixed some
time ago.

------
Overtonwindow
I completely disagree with the premise here. This is Tim Cook's fault and it
should fall completely on Apple. I've been using Apple products my whole life,
and I think security and privacy are great, but I don't believe for one second
that Apple is the holy savior of our privacy. They fought the FBI because of
marketing and profits, not out of a sense of duty to protect our privacy.

I also don't buy the rhetorical come-back about those who would trade liberty
for security deserve neither. There is a line between the two, and each side
must give and take. If the FBI found a way to do it without Apple, bravo. Let
Apple figure it out if they want.

This is Apple's fault and they should not get a pass for spinning the PR in
their favor.

~~~
riprowan
You're certainly right about one thing: the people want privacy and will
reward a company handsomely for delivering.

In theory, government is supposed to be responsive to the will of the people
too, especially with regard to things like privacy from government searches.
So I guess it's up to the reader to decide for themselves which side they're
on here.

------
at-fates-hands
Your iPhone just got less secure - so don't use iPhones anymore.

It always amazes me when people get on their high horse and start
complaining about something when the remedy is quite simple: don't use the
iPhone. Get an Android phone, or an Ubuntu phone, or a Blackphone, or a
Windows phone. There are plenty of other devices that haven't been cracked
by the FBI.

There's irony in the fact that Apple resisted the FBI's attempts to crack
the phone, and now this vulnerability will go unpatched and the FBI has a
zero-day exploit they can use whenever they want.

~~~
gambiting
I don't believe even for a second that Android devices are any more secure
than iPhones. And I use an Android phone. But let's be honest: with dozens
of manufacturers, there isn't a single one who has dedicated the same amount
of work to securing their devices or engineering things like the iPhone's
secure enclave. I use a Sony phone, but I'm sure Sony wouldn't have the guts
to stand up to the FBI - or that the FBI would even need to ask; the
hardware is most definitely not on the same level of polish as Apple's.

~~~
neximo4
Hi! do you have contact details/email I can reach you on?

