I'd support companies with excellent privacy / data security records being exclusive brokers of my information. Apple is on that very short list.
Are they? Do people care that much? I think, except for those who actually have their credit stolen, the average person just sees it as another headline of the world today, and will move on, the way people still fly United.
Until people's lives are actually impacted by the lack of privacy, they'll continue doing what they always do. Most people won't take preemptive action until something requires action on their part (their credit being stolen, or something along those lines). And this would need to happen on a massive scale for giant corporations to change their ways.
Other than financial data I don't particularly value my privacy. I post on many online forums under my own name and even my alias here is a composition of elements of my name. I use Google services quite happily and have no material gripes with the company. But still as an iPhone user it's good to know that Apple has my back on this and I'd really hate to lose that feeling.
Also, I don't think this is entirely a commercial decision on Apple's part. They must know very well that taking an absolute stance on customer privacy puts them diametrically at odds with the interests of the Chinese government. China hasn't taken many overt steps against them (though it shut down iTunes Movies and iBooks, and is chipping away at Apple Pay and other services with onerous regulations), but does anyone really doubt that it's working away in the background to help Apple's competitors and slow down Apple's sales and service expansion in China? There's no way the Chinese government can be happy with Apple's privacy stance, and unlike in the West, Apple doesn't have the shield of the law to protect it in China. They stand naked against the Chinese government, but they're still doing it. That takes real conviction. I really don't think I'm exaggerating in saying that this stance on privacy may very well cost them any chance of a dominant position in the Chinese market. It's certainly running a real, material risk of that anyway.
Volkswagen is basically back to its pre-dieselgate sales levels. Admittedly its stock price is still quite a bit below its pre-dieselgate high, largely due to the still-unknown factors surrounding possible future fines.
We know that since Snowden, if you remember those revelations.
I'm not implying I agree with how far governments go to intrude into people's privacy. I'm just pointing out that I agree with OP that there's a big difference between, for example, Google's and Apple's stances toward privacy.
For many people, their only positive quality seems to be "Not Apple."
Is Apple good? Probably not. Is it less of a threat to your privacy and freedom than Google? Probably.
While I really like non-Google Android + PMP in theory, the reality is that this solution is extremely complicated, limiting, error prone, buggy and probably not worth it.
So I concede that Apple is the better choice, but nevertheless, persist in my futile efforts to make my 1+ somehow work securely.
Samsung, really? :D Does Weeping Angel ring a bell?
In practice I have very strong doubts this is a fact, especially since there are the '5 Eyes', '6 Eyes', '9 Eyes' and '14 Eyes' agreements. And any of the members are monitoring domestic terrorism, so I fail to see how they are not spying on their own citizens.
Maybe they offload it to the US as NSA is so good at it.
Really, Apple is anti-nothing and pro only one thing: money. Everything else is a corollary, including ease of use, freedom, and privacy. It's just that those things do matter to me (and GP).
Edit: It would massively increase the usability, allowing any apps to be installed, not just Apple approved apps, making the phone more usable to many people. Back in the day I had to buy a dev account just to load an emulator I wanted to use (it was open source) without rooting.
I know the risks, I want to do it anyway. I don't want a nanny over my head.
For that reason, societies tightly regulate car ownership and driving. More so than phones: I don’t need a license to use a phone, nor do I have to register my ownership or have it regularly inspected.
But technology these days has this same characteristic: Others bear the costs of your decision. Every device connected to the internet is a DDOS vector.
I don’t want you deciding whether to keep your device up to date with the latest security patches, because if you (and a few million others) don’t, GitHub is down for me.
Anyway, if you don't want to deal with how an iPhone works, how about buying a different phone?
If they ever do offer sideloading/gatekeeper, it should be turned off by default and turning it on should make the risks crystal clear with a scary alert and passcode prompt. Average folks should be heavily discouraged against using it. Gatekeeper works on the Mac, but iOS devices are both far more numerous and far more personal, so the stakes are much higher.
You can compile and sign (and, AFAIK, upload to the Google Play Store) an APK on an Android device, using AIDE and perhaps other dev toolkits. No computer needed.
I mean, the hardware is not open, so developers generally can't implement proper secure boot schemes (starting from a trusted bootloader), and generally can't control what goes on in the radio modules. Because there are no devices that have those parts open (or am I unaware of something I'd want to buy?), there's no real security/privacy possible.
Apple can provide privacy there because they're damn huge and they can purchase or design any hardware they want, fully documented down to the last logic gate. An average free software developer can't.
As for the userland - I believe there are AOSP derivatives + 3rd party apps that result in a reasonably good privacy and security experience.
The only part of their ecosystem that has major restrictions is the App Store. Which kind of makes sense, as that's one of the few ways in which someone other than Apple could completely trash their platform and make a lot of Apple's customers unhappy.
I'm not saying your "anti-freedom" point is completely invalid, but it's really not a major concern for most tech-savvy people, IMO.
More timely, the WTC attacks 16 years ago marked another notable event whereafter great freedom was lost in the name of safety.
As that old saying goes, "Those who sacrifice freedom for security deserve neither."
I see your point in the context of e.g. WannaCry and WTC attacks emboldening authoritarians, however this privacy move we're discussing is actually an anti-authoritarian move. It gives the user more privacy and more freedom.
Advocating more lax privacy/security in this case is siding with authoritarian tendencies.
I wish Google Fi would support iOS devices.
I know, I know, walled garden prevents users from opening their phone to vulnerabilities or guaranteeing a secure experience. I guess I wish I could have my cake and eat it too.
You can't set up Touch ID without a passcode. The "attack vector" only exists if you have zero security on your iPhone to begin with. (So, yes, technically the attack vector exists if you choose to use no lock at all.)
> Under iOS 11, this sequence has changed to also specifically require the passcode on the device after the "Trust This Computer?" prompt.
That's part of it. Another part is that the passcode has to be entered after the device has been connected to the computer, as opposed to starting with an unlocked phone (maybe you took it while it was in use, maybe you forced the user to unlock it) and connecting it to a computer later.
Once the device is unlocked, what is the obstacle to change the passcode?
Needing the old one?
Or - if you prefer - what happens if you have a TouchID and forget the passcode?
Do you need to reset the phone?
(No, aosp is unusable, and Google is working to make it more unusable).
Google = no privacy
Therefore, Android = no privacy
Privacy is achievable on Android. However, I do agree it is very involved.
By your standards, privacy is achievable with difficulty on Android and not at all on iOS.
That leaves app installs (Android allows side loading) and push messaging (I'm not sure about this one) as the ones that you can't disable on iOS.
You can argue until you're blue in the face but Apple have little to no incentive to exploit my private data while Google have every incentive to do so.
Usually that path is: boot the phone into bootloader mode, plug it into a computer, run `fastboot oem unlock`, accept the warning that this will void your warranty, then `fastboot flash <partition> <image file>`. An officially documented and supported way to do exactly what you want with the device that you own.
Yes, it voids your warranty, but it's not forbidden or illegal in any way.
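For what it's worth, the whole flow is only a few commands. A sketch, assuming a Pixel-style device and a hypothetical image file `system.img` (newer devices use `fastboot flashing unlock` in place of `oem unlock`):

```shell
# Reboot the connected phone into the bootloader (USB debugging enabled)
adb reboot bootloader

# Unlock the bootloader; this wipes the device and shows the warranty warning
fastboot oem unlock

# Flash a custom image to a named partition, e.g. the system partition
fastboot flash system system.img

# Reboot into the freshly flashed OS
fastboot reboot
```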
Please. Actions speak louder than words, I am sure you would agree with that.
I see you defending Google around here. That's fine but a grand-grand-grand...-parent of yours was saying that AOSP is unusable, and that point still stands and is true.
The officially supported experience with Google Play Services is no more privacy invasive than the officially supported iOS ROM, but in the case of Google devices, the device is yours, and you are not limited to officially supported ROMs.
There are no open source ios phones
Your comment looks like trolling. We were discussing a feature to prevent others from backing up your device.
I don't know what Google collects, but the simple answer to this is "virtually everything". Apple collects very little data from their users and does everything they possibly can on-device (and the stuff that requires the cloud is either encrypted or heavily anonymized). Nearly all data that Google collects, Apple doesn't.
Let's not get too hippy-feelgood about Apple's intentions regarding its users here.
AFAIK this is true
I'm curious - how can a carrier support and not support certain networks? Isn't that like saying I wish my ISP supported Linux?
So, I don't think it is physically possible to do the "two providers at the same time" thing without special hardware (you'd need two parallel radio circuits in your phone).
And with that, your analogy falls sadly on its face, as you seem to indicate an artificial "software" limitation has been set to prevent us from using Google Fi on our iPhones.
It's not that they're blocking iOS, but more that iPhones don't support the network.
They are not able to implement it to their satisfaction with iOS, so they stick with their own devices.
I understand that it would be needed to purchase apps, but just to download a free one?
For example Gab was rejected by the App Store because of the content people were posting. Of course, they won't ban Chrome despite being susceptible to the same content violation.
Unlike any of the tech giants, they have no incentive to pull you into their ecosystem, and they embrace open source, etc.
If only FirefoxOS had arrived earlier and was not tied to a catastrophic 'web on mobile' idea.
I'll trust Google's engineering prowess and rigor before I trust Apple, who couldn't even be bothered to verify email addresses before leaking Apple ID attributes tied to that email, and who allowed things to "fappen."
But I'll trust none of them to "guard my privates." Please.
Apple's security record is not better than, say, Adobe's.
Personally, I dropped a lot of Google products when the Snowden leaks revealed to shocked Google engineers that they shouldn't have been sending unencrypted traffic between data centers, assuming it was safe because it was on leased lines. There are bugs, and then there's just negligence.
Counter-Forensics: Pair-Lock Your Device with Apple’s Configurator:
Protecting Your Data at a Border Crossing:
By the way, the writer of this post now works for Apple on their Security Architecture team.
I would not be surprised if this change came from him.
"you have to start fresh, with a brand new install of iOS."
So, doesn't this just mean that border agents will force you to write down your password, key it in themselves to verify that it works, then walk away with the phone to image it?
My guess is this somehow foils or mitigates the workaround that the Israeli company sold to the FBI after the San Bernardino phone issue.
Edit: No seriously. See https://en.wikipedia.org/wiki/Making_false_statements. Refusing to unlock your device is one thing. Claiming that you did unlock it but in fact just used a "duress passcode" is a lie and can land you in jail.
On the other hand, a "duress fingerprint" that locks out TouchID wouldn't be misrepresenting anything. But it wouldn't surprise me if they could still get you in trouble some other way for knowingly locking out the device after it's been confiscated.
Refusing to unlock your phone would be equivalent. Unlocking fake data would not.
What does work is putting the wrong finger on your sensor until it tells you Touch ID is disabled and needs your passcode (five failed attempts), or rebooting your device without logging in.
"I just entered my duress pass code, the device is now wiped".
Can they charge you with destruction of evidence when they have no idea if/what evidence was on the phone?
If you go the US citizen route, be prepared to never get your phone back and many future border crossings to be 4+hr affairs where they confiscate most of your things.
As a US citizen they have to let you in.
True, and of course they can arrest you the instant they let you in.
Can't wait for iOS to have the deniability factor of having different passwords unlock different things :)
For the sensitive stuff, have periods of time that require several computers to solve cryptographic challenges in order to unlock the phone. Some of which may be your friends' devices. If they don't hear from you and your intended hosts in a certain amount of time, phone stays locked.
Or one of the devices can be an NFC or Wifi hotspot in a certain area, and one at home. If you don't reach it, phone stays locked.
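A toy sketch of that idea (my own illustration, not anything iOS actually supports): split the unlock key across n "beacon" devices, so the phone can reconstruct it only when every share is reachable. Here it's a simple n-of-n XOR split; a real design would want a k-of-n threshold scheme (e.g. Shamir's secret sharing) plus authenticated challenge/response with each beacon.

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split key into n XOR shares; all n are required to reconstruct."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    # The final share is chosen so the XOR of all shares equals the key
    shares.append(reduce(xor_bytes, shares, key))
    return shares

def recover_key(shares: list[bytes]) -> bytes:
    """Rebuild the key; only works if every beacon's share is present."""
    return reduce(xor_bytes, shares)

# The phone stores nothing; each beacon (a friend's device, a home
# hotspot) holds one share and hands it over only when contacted.
unlock_key = secrets.token_bytes(32)
shares = split_key(unlock_key, 3)
assert recover_key(shares) == unlock_key      # all beacons reachable
assert recover_key(shares[:2]) != unlock_key  # one beacon missing: locked
```

With an XOR split, missing even one share leaves you with uniformly random bytes, which is exactly the "phone stays locked" behavior described above.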
Even if you give up all your passwords they can still keep beating you to get the "real" pass.
In this case, a blank phone probably means no entry and not actually harming you. So, there's that, I guess.
And your "solution" is to share this experience with your friends by tying their devices to yours and setting up a device in such a manner that that actor would not be able to verify if they indeed have access to the real data stored on it.
This is a recipe for a one way ticket to a very dark place for you and your friends.
For the most part, the number one rule of actual data safety is to never implement protection that would put your well-being, or the well-being of others, at risk. Unless you are protecting the nuclear launch codes, no data is worth being physically harmed for.
What if you’re just protecting the location of your daughter/sister from her physically abusive ex? Not worth enduring some violence for?
The user controls their device to the exact same extent now. You have to trust the apps and OS you use. And iOS isn't exactly open source either.
I am saying the user can choose to select beacons to unlock their phone during a trip somewhere. The user CHOOSES to lock their phone in a way that requires things in addition to the password, things that represent having made it safely past the security, lol.
To be adequately effective, though, all back-up activities would require the full authentication credentials for all verification factors, which might only be possible with, say, a laptop equipped with a fingerprint scanner (Touch Bar), or additional external hardware peripherals for other types of systems.
Nothing beats a strong password.
Wow, that's really nice. I wish Google were so forward-thinking about things like this. I see no reason why fingerprint authentication should be forced upon someone any more than a password unlock would be. The only reason it works this way today is that it's much "easier" for the government to force your finger onto the phone, or take blood or hair from you, and so on; they can't really do that with passwords. But we can fight back with technology and ingenuity and ensure that fingerprint auth is "just as good" as a password, at least from this point of view (the government forcing you to give it away).
In Canada, there is a difference. A fingerprint is something you have. A password is something you know. Police can compel you to use your fingerprint to unlock the phone. They can't compel you to disclose the password.
There's a whole series that shows this. It's Canadian Border Guard, or similar. I see it when I cross the border, which I do with some frequency. I just unlock my phone for them.
What I meant was that inside of Canada, there are situations where the police don't need a search warrant for your phone. You can unlock it with your fingerprint, so it's legally "open" and searchable.
If you have a passphrase, then they need to know your mind in order to unlock it, and they can't force you to disclose something you know.
Curiously, citizens aren't immune in Canada. I have dual citizenship and they still sometimes want to flip through my phone. No, I'm not sure why. They have had me power on and unlock my laptop a couple of times, as well.
It has never left my sight. I can't speak for others and didn't watch the show that carefully. I live so close to the border that my neighbors get Canadian television. So, I've seen it there. I don't actually have TV hooked up, so I don't see it often.
Edit to add: I came across this one:
It was linked from an article in 2014, though this PDF document is dated March 2017.
The government has no right to interfere with your personal effects, this is fundamental to freedom and democracy, and the idea of the private individual.
Yet it seems this too is 'normalized' and citizens are more interested in technology workarounds to deal with this abuse from the state.
- All data encrypted by default
- A "dump" of the phone's memory or a MacBook's hard drive makes it look like the whole drive is full, i.e. the free space is populated with random data that is itself encrypted.
- The user can switch from their user profile to a fake user profile and import some data (like contacts/messages/photos)
I thought you were suggesting a way that meant phone backups all appear the same size as the disk 100% of the time. (meaning the true volume of content is hidden)
Then the second point is the user could potentially have two (or more) profiles on the device, and it is possible to unlock it into one or the other. Meaning a user under duress can unlock a device and not reveal the true content while the person trying to get into the device has no way of knowing if that is the true profile or not.
I figured that would be a pretty sweet feature. It would also tie neatly into allowing users to have multiple profiles on their device which is currently impossible on iOS...
Yeah, it isn't awesome that we feel we need these security features. However, I think even without the fear of government (etc.) this is still a great feature I would love to have.