In contrast, Google and Facebook are companies that sell advertising. The value they can offer to publishers and advertisers completely relies on how well they know their users. Their competitive moat involves collection of proprietary data to continuously improve their products. These are clear incentives to be sticky and greedy with your information, but also to keep it private and proprietary for their own sake.
With these incentives in mind, I more readily trust Apple when it says that it will not collect my data compared to data conglomerates, and Apple hasn't done anything to aggressively betray that idea in its history (to my knowledge). Combined with the technical security advantages of iOS, I'm inclined to believe Apple products to be the least-bad option for the security-minded today.
First: move out of Google and Facebook, in particular email. It's a good idea since they can lock you out for no reason and with no explanation. They've been doing this a lot recently. You are not famous, so a Twitter mob won't help you.
Second: since most pages out there have trackers from these companies, install the AdNauseam extension to not just block tracking but also click on everything in the background. This will drown the information they gather about you in noise.
And indeed, Apple seems to handle privacy well, in practice. I've managed to create pseudonymous accounts at Apple's online store, and fund them with anonymized Bitcoin. And I've managed to make digital purchases.
However, I haven't tested buying physical devices, for pickup in meatspace. I wonder if they'd require official ID, or just proof of purchase. Anyone know?
I appreciate the alert when an app tries to grab my contacts/calendar/location.
On the other hand, I DO NOT appreciate it when an app (e.g. British Airways) informs Facebook, Google Analytics, and tens of other trackers every time I use it, leaving me at the mercy of those folks, while Apple does nothing to help me protect my privacy.
This is why an iPhone is only good when it's jailbroken (replace the hosts file, install ProtectMyPrivacy, install Firewall IP; yes, it works on 10.2).
My guess is that they paid online for in-store pickup, and Apple asked for ID to confirm they're the one that made the purchase. In which case that's an Apple thing, not a government thing.
Bought my iPhone 6S from Apple in Australia: ID required under Australian law.
Very likely the person also bought a phone with a sim/service, hence being asked for ID (under Australian law.)
In Australia, do you need to show ID when you buy random electronics for cash? If you pay cash in an Apple Store, do you need to show ID?
Buying Apple hardware used, at swap meets etc, seems the best option. As long as it's not been stolen, anyway. But you can test for that, before paying.
The older tech companies like Apple and Microsoft have built business models with a different set of incentives around user data. Google's, and particularly Facebook's, business models rely entirely on monetising user data.
My personal 'ranking' of companies' incentives to respect privacy is something like:
Obviously this ranking is very debatable, but I agree business model incentives are a pretty good heuristic.
I would flip MS/Apple personally, but I understand the argument from the perspective of business incentives. Microsoft's incentives don't really care about collection of telemetry, but they have a strong incentive to keep customer data secure from outside entities (e.g., businesses on MS products losing corporate data to outsiders).
Microsoft derives most of its revenue from enterprise software, the others from consumers. Enterprises are generally more concerned with privacy of their data than consumers.
Arguably Apple have shown better practices of late. My ranking is more what one would expect 'in theory'.
Compare that to Linux, OS X, BeOS, any of the BSDs, etc., and the difference is night and day. Ok, Ubuntu had a stint of including Amazon results in their search, but they were relatively transparent about it, and it was quickly disabled by default after the backlash they received (deservedly, imo).
Windows has always been very "hands on" because it's designed for users who are very "hands off".
The pattern has repeated itself so many times throughout history. Technology developed with the best of intentions ends up getting used for bad.
For example, I don't really want to give most apps constant access to my photos, my camera, or my mic but I really don't have a whole lot of choice if I still want to use popular apps and services like Facebook, Instagram, Messenger, Hangouts, Line, WhatsApp etc..
I really wish that every time an app wanted a photo, it had to go through some OS-level UX and only got to see the photos I selected. As it is, an app gets to see all my photos the moment I want to give it a single photo. Similarly, if I take a photo in one of these apps, it requires permission to read all my photos when all I want it to be able to do is save that one photo.
As for the camera, I don't give any of those apps access to it because I don't trust them not to spy on me in some way (I assume camera access implies mic access, so they could be doing the ultrasonic ad-listening thing, etc.). Instead I take the picture with the built-in app, then access that picture from the app I want to use it in. Of course, that leads to the problem above.
Mic access is more problematic. I don't want any of them to have mic access when I'm not using the mic function directly but I can't talk to my friends who use all those services to call me if I don't give the apps mic access.
I feel like if Apple were more serious about privacy they'd handle these issues somehow. The photo one seems mostly straightforward for most use cases: don't let the app access photos at all, only the OS. The camera one is less straightforward. I get that there are innovative things apps can do by using the cameras directly. On the other hand, most current use cases could be handled by letting only the OS access the camera and then handing the result to the app.
Apps won’t need to ask for permission to your entire library - the Photo picker will just appear, and only user selected photos will be available to the app.
On the iOS 11 GM at the moment, Facebook still seems to need access to my entire library. So perhaps it is only for apps that link against iOS 11. Guess we’ll see soon.
They could allow an app in their app store to provide this functionality to users who want it.
I often asked myself what happens when I give an app access to my photos. Can that app upload all my photos to their servers? The same question for things like calendar, address book, health data etc.
I don't think this is true, though. From the code I've seen, you only get back the single image (a UIImage) the user chose.
also remember you can revoke permissions at any time via Settings.
Going to settings doesn't help
1) Giving an app permission to access my photos and then revoking it when I'm done doesn't stop the app from accessing as many photos as it pleases before I revoke access. In the time it takes me to grant access, select a photo, and revoke, the app could easily have looked at tens or hundreds of photos.
Add some ML in there for their ad targeting
2) With the Mic off I can't receive calls in any of the messaging apps. If iOS asked each time then that issue would go away but as it is I need to leave on mic support or miss calls. I suppose I could miss the first call, turn on mic support, return the call, turn off mic support.
Still, I'd argue if Apple wanted to be even more serious about privacy they wouldn't allow apps to have unrestricted access to the mic as they do because it makes the burden of trying to prevent spying pretty cumbersome.
I feel like I'm having a heart attack every time thinking I posted it already. That feature sucks.
Compared to that workflow, if you open the stock/official photos app and use the share sheet to share one or more photos with a specific app, then it would get only the selected photo(s).
Mark is definitely not sitting around thinking "ayyyyy what's Gregg up to?"
I guess I just don't understand why Apple needs to put things in place to allow you to work with companies you don't want to work with.
Why should the photo app get access to the rest of my library of photos, like the ones in "business receipts" or "my son" that I would never upload to said photo sharing site?
Sometimes you want to use the app, but you don't want it to have a world-view of your phone. In a perfect world the photo sharing app should only get access to the individual photos that I choose to share with it (either through the Photos app or an OS-level file selector)
What's not to understand? Apple is selling you the phone. It's in Apple's interest to put as many features on the phone as potential buyers might be interested in. Allowing the owners of a device, rather than 3rd-party software, to control access to different parts of that device is not just good business practice, it's common sense.
"A Local Chinese Government Will Oversee Apple’s New iCloud Data Center"
I went to the photo store not long ago and had dozens of rolls of negatives scanned, from a vacation to Europe 20 years ago. I uploaded them into Google Photos and it geotagged many of them automatically.
They have a public API that does the same, it detects landmarks in images and can give you a lat/long position for it as well as a confidence score.
They took 126 million geotagged images from the web, binned them into 26,000 squares, and trained a neural net to predict which square on the earth an image was taken in. That's very poor resolution, but if you see that 20 images taken around the same time are tagged as the square that contains San Francisco, you can be pretty sure they all happened during a trip to San Francisco.
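The binning and trip-inference idea above can be sketched with a toy fixed-size grid. Note this is only an illustration: the actual model used roughly 26,000 adaptively sized cells rather than a uniform grid, and all function names here are mine.

```python
from collections import Counter

def cell_id(lat: float, lon: float, deg: float = 2.5) -> tuple:
    # Quantize latitude/longitude into a fixed-size grid cell index.
    return (int((lat + 90) // deg), int((lon + 180) // deg))

def trip_location(predicted_cells):
    # If a majority of photos from one session land in the same cell,
    # infer that the whole batch came from a trip to that cell.
    (cell, count), = Counter(predicted_cells).most_common(1)
    return cell if count / len(predicted_cells) > 0.5 else None

sf = cell_id(37.77, -122.42)      # San Francisco
paris = cell_id(48.85, 2.35)      # Paris
batch = [sf] * 15 + [paris] * 5   # mostly-SF predictions, a few outliers
print(trip_location(batch) == sf)
```

Even with coarse cells, majority voting over a batch of photos taken around the same time recovers the trip location despite per-image noise.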
Reminds me of https://geoguessr.com
Privacy means you only have to care about communicating with the person with whom you're communicating. Organizations with incomprehensible motives are cut out of the loop. You can be yourself, not a self-censoring, corporate-approved robot.
We generally speak of organizations, companies, and governments that are seeking to penetrate the veil of individual privacy. And I think that leads people to forget that these entities are made up of people. And people can, and do, do bad things.
A soft example is Snowden stating that NSA operators regularly passed around intercepted nude photos and videos from people who had their privacy unjustly compromised: "These are seen as the fringe benefits of surveillance positions." But the particularly disconcerting issue is that this information can be used against individuals. This article from the EFF highlights the FBI's actions against MLK:
That letter was the FBI, posing as a disillusioned black supporter, detailing various embarrassing information that had been collected on MLK and encouraging him to commit suicide -- stating that it would all be published otherwise. And these couple of examples are only stuff that's being done by the "good guys." Information can, and will, be leaked, stolen, traded, and so on. One can only imagine the sort of things the "bad guys" could cook up.
I can't speak for Pol Pot, since I'm not very familiar with his regime, but with Hitler and the Danes, my understanding is that record keeping aided the German occupation. Not that Denmark went downhill after it started doing meticulous data gathering.
Can you give causal examples?
1. Though it's understandable that Apple earns money primarily by selling hardware, it's sort of amusing and alarming at the same time that a proprietary almost-closed-source software company is focusing on protecting and preserving privacy whereas partially open source platforms competing with Apple seem to be nowhere close on this aspect. Do any of the Android forks try to do as much as Apple does for privacy right out of the box (something a non-technical lay person could get)?
2. It's abundantly clear that a lot of thought has gone into the foundations of a design focused on protecting privacy and in creating silos of information in/with different SDKs and features.
3. It's a bit unclear to me as to why Health data is stored encrypted in iCloud whereas messages aren't. Is there a distinction here between iCloud sync and iCloud backups? The documentation on messages suggests turning off iCloud backup as a protective measure.
See here for details: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
A lot of the features would be very difficult to implement in Android without cooperating hardware, and hardware is notoriously expensive to get right and scale up. Projects like neo900 and Purism regularly encounter delays, unexpected costs, and pricing issues. It's really tough.
On a broader note, people are spending more and more time in data-hungry apps anyway, which can send almost anything they want to the network. This is sure to chip at any device-level security, pushing it towards irrelevance. I wish I had a log entry every time an app used the location service on my phone along with a database containing a history of Internet-transacted data.
> On a broader note, people are spending more and more time in data-hungry apps anyway, which can send almost anything they want to the network. This is sure to chip at any device-level security, pushing it towards irrelevance. I wish I had a log entry every time an app used the location service on my phone along with a database containing a history of Internet-transacted data.
I've long wished for network access permission on iOS, allowing the user to decide which apps can never connect to any networks. To reduce the total attack surface, I'd want to keep many apps (especially games) running only within their sandboxes and having access to only the data they create/generate on-device and no other external resource/server.
AFAIK, Android has had this even in the days of permission requests at app install time. I don't know if granular control is available on this from Android 6 onwards.
> Apple has no way to decrypt iMessage and FaceTime data when it’s in transit between devices. So unlike other companies’ messaging services, Apple doesn’t scan your communications, and we wouldn’t be able to comply with a wiretap order even if we wanted to. While we do back up iMessage and SMS messages for your convenience using iCloud Backup, you can turn it off whenever you want.
Compare that with what it says under the Health section of the same page (quoted excerpt):
> And any Health data backed up to iCloud is encrypted both in transit and on our servers.
Every story about Apple and privacy chooses to omit this huge piece of info.
Why should anybody trust them now? What has changed to make anybody believe they aren't still lying about privacy?
"Not" being a partner in PRISM doesn't mean much; it just means you're legally obligated to handle that paperwork by hand. Like every other company in the country, you're still required to comply with a valid 702 directive.
“We have never heard of PRISM. We do not provide any government agency with direct access to our servers, and any government agency requesting customer data must get a court order,” stated Apple spokesperson Steve Dowling.
Other investigative reporters have worked out what PRISM was. Now we're all pretty sure we know. But that doesn't mean the tech companies at the time had any clue. It is somewhat unlikely that the thing that sent them 702 directives was, to them, referred to as "PRISM".
Generally, I find, when PRISM gets introduced into a discussion about Internet privacy, the discussion gets rapidly dumber. I'm not saying that's happening here, but it's a risk. If you're going to talk about it, you should have better sourcing than whatever the hell "The Cheat Sheet" is.
One thing we're certain of, however, is that Apple has the signing keys. They also encrypt their firmware and even other apps to hide how they work.
I hope you're not expecting them to open source the secure enclave.
The calls and messages should be end-to-end encrypted but they are in control of the PKI so they could probably eavesdrop if they wanted.
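The eavesdropping risk here is the classic "ghost key" problem: the sender encrypts a copy of each message to every public key the provider's directory lists for the recipient, so whoever runs the directory can silently add an extra key. A toy model of that attack (no real cryptography; encryption is simulated and all names are mine):

```python
# Toy model of a key-directory "ghost key" attack. The directory maps a
# user to the list of public keys their messages should be encrypted to.
directory = {"alice": ["alice-phone-key"]}  # honest state

def send(recipient: str, plaintext: str) -> dict:
    # The sender trusts the directory and encrypts one copy per listed key.
    return {key: f"enc[{key}]({plaintext})" for key in directory[recipient]}

honest = send("alice", "hi")
print(list(honest))  # only Alice's real device key

# A malicious directory operator appends an eavesdropper's key. Neither
# sender nor recipient is shown the change.
directory["alice"].append("wiretap-key")
tapped = send("alice", "hi")
print(list(tapped))  # now includes a copy readable by the wiretap key
```

This is why end-to-end encryption is only as trustworthy as the key distribution: clients that can't independently verify the recipient's key list have to trust whoever operates the directory.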
On stage at Black Hat, Ivan Krstic claimed that the hash function in question was a Vitamix blender.
Almost always, when companies claim to encrypt things serverside, they're doing something shady. And it is always preferable that secrets be kept on devices and never touch servers. But Apple is taking the problem seriously.
Thank you for this kind of information. Very grateful to have this stuff at hand.
In the case of the FBI fiasco, for example, we know that the FBI later paid Cellebrite for an exploit that allowed them to unlock the device. We don't know the details of how that happened, but the FBI had to reach out to an independent company and exploit an older device to do it.
To be clear: if they were lying about the lengths they go to with encryption, on device security, and user trust, this story would have been different. The FBI would have already had a backdoor or been able to trivially break the device, or they would have complied and created an iOS variant to break in. All we know is what they stood their ground on, and the effort that was required for the FBI to get in.
iOS 10 security white paper: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
Talk about iOS security at Black Hat: https://www.youtube.com/watch?v=BLGFriOKz6U
"Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8."
After it became clear that it is technically feasible for Apple to assist with those data requests, they quietly removed that claim from their website.
The idea that iPhones were inaccessible to highly-resourced attackers like the U.S. government seems unbelievable. Everything has security holes, even if they are expensive to discover or exploit; the U.S. government's resources are immense and the payoff - access to iPhones when needed - is clearly worthwhile. And in the end, when the PR debate had run its course, the FBI found a way in.
There is no upside and potentially massive downside/fallout for Apple to just lie about something like this.
Maybe they will 'screw up' - but it would be completely stupid for them to say one thing, and then intentionally do another.
A 0.5% drop in sales would be a disaster.
Twitter can cause quite a storm these days, companies are generally afraid of it.
If it were Google, or FB, it would be another story, because their business is your personal information.
But Apple - not so much. They want to hustle hardware. For them 'your data' is mostly a hassle they'd rather wash their hands of.
1. Are there any statistics showing the ratio of security issues between Android and iOS?
2. What's the most common phone among security researchers?
To be honest, I'm wary of believing this, but I own a Nexus and if I had to give the benefit of the doubt to either Google or Apple, it would most certainly be Apple because they don't make business out of my data (or, not as much as Google, at least).
Not sure about which phone Schneier uses, but he argues that having a smartphone is a logical choice in today's society. However he also brings a nuance to that: that we have the pragmatic choice of leaving our smartphone at home instead of taking it with us, and that we should consider that.
Keep in mind a dumbphone also has big brother capabilities. Just likely less precise/severe (assuming no remote holes).
You can look at the CVEs reported on each platform though and the number of critical vulns is comparable at this point.
I sometimes wonder what would happen if I were to create a new Google identity and start fresh. Would they try to re-establish my old email's profile and data to the new one without my consent?
Probably a dumbphone, so neither Android nor iOS...
Wouldn't they be able to hold that privacy promise much better if they actually allowed people to keep iCloud backup on let's say for pictures, but still be able to disable iMessage messages? I think many people would like to use iCloud but without it backing up personal conversations, too.
Also, iMessage's end-to-end encryption was rather flawed last Matthew Green checked, compared to other end-to-end messaging apps.
As for their use of differential privacy: when they introduced it, it was essentially a hidden way of gathering more of your data than before, not less, while still being able to say "hey, we may gather more data than ever on you starting with the new iOS, but it's pretty private, so it's cool, don't worry about it".
All of that said, I know Apple is still miles ahead of Google on privacy. If anything, over the last 1-2 years, Google has become increasingly bolder and more shameless about tracking users without them realizing (except in the EU, where they are forced to make it a little easier for users to understand how they are being tracked, and even that happened because of the anti-trust lawsuit).
Here's just one example of Google's increasingly privacy-hostile behavior:
But "differential privacy" has a specific meaning that isn't just "sharing less data". It's trying to find provable methods to get the aggregate data needed for ML without exposing any single user's data. Kinda like the big data version of having one random soldier with a blank in his rifle in any firing squad.
What about when it's not in transit between devices?
> While we do back up iMessage and SMS messages for your convenience using iCloud Backup, you can turn it off whenever you want.
Sure, I can stop backing my data up - but presuming I don't (like the vast majority of people), does that mean if they got a wiretap order they could just read the data straight from their backup servers?
I'm not sure if this was just phrased poorly, or if backing up your iMessages effectively undoes any protection your messages once had.
Although in a link in this HN article Apple says backups _are_ encrypted: https://support.apple.com/en-us/HT202303
So, a little confusing...
bah, found this:
"Right now, although iCloud backups are encrypted, the keys for the encryption are also stored with Apple. "
Does anyone have suggestions for making that process easier?
> Apple's Terms of Service impose restrictive limits on use and distribution for any software distributed through the App Store, and the GPL doesn't allow that.
1. I can’t use your app ID
2. I can’t sign it with your cert
3. I can’t tie it to your website for deep links (because of #1)
It used to be the case that I couldn’t even get the binary to the device without paying $99/year, but that’s not necessary any more.
The dev program fee was definitely anti-GPL. I don’t know if the other things I mentioned are.
No, here's the deal:
GPL: You CAN modify and share this software, and if you share, you have to keep these terms for others so they have freedoms too.
iOS terms: You MAY NOT modify or share ANY software you get as a normal consumer through the app store OR install any software from outside the app store.
The problem is that Apple's unwillingness to ALLOW a user to exercise these basic freedoms means that there's no way to distribute in the App Store any software that requires these freedoms be maintained. It's incompatible terms. And the fault for the incompatibility lies 100% on Apple for their terms that are the ones that demand that freedoms not be allowed.
Is it? I know nothing of this subject and am speaking out of my ass, but that seems.... trivial to do. I'm curious as to why that is hard for them to do?
And the BSDs predate Linux. Yet Linux distros are the most popular free software OSes out there.
I did not say the GPL invented open source, but I don't think you can deny that it was the force that moved it out of obscurity, at least at the beginning.
I mean, more stuff encrypted without provider escrow keys is a generally good thing, but are your contacts specifically more important than, say, your calendar events?
What threat model are you worried about with your personal, curated contacts?
On one hand, users are told that they should never ever, ever install applications from untrusted sources. They should always use the play store because applications are scanned for vulnerabilities and whatnot.
On the other hand, we have people telling us that one of the great advantages of Android is that you can sideload apps - bypassing the store security model completely.
On one hand, a VPN needs root access for transparent proxying (or at least Tor does). Changing the hosts file needs root access. Changing gps.conf requires root. Lots of useful operations need root access, so if you are concerned about privacy and security, or just want more control, you probably want root.
On the other hand, root is stupidly hard to obtain, and users are strongly discouraged from doing it anyway because it opens all sorts of attack vectors. Unless the manufacturer provides a legitimate method to do it, the operation of obtaining root itself, like on iOS, often relies on an unpatched local privilege escalation vulnerability. Note that any application can exploit this, not just the rooting app.
I don't support Apple's walled garden approach, but it's hard to deny that they have a much clearer picture with regards to security, and the results speak for themselves. Malware on Apple devices is rare, whereas on Android it is rapidly becoming routine. Unfortunately, you sacrifice flexibility for enhanced security.
Also, one can sideload apps onto iOS if you have a Mac. Obviously, that's nowhere near as integrated, but maybe that's a good thing. Heck, maybe Apple even added it so people in China could sideload VPNs. Maybe iOS VPNs are good enough (no root, no Tor?)?
Don't conflate unpatched Linux systems with the Play Store. Anybody who uses Android and cares about the security of their device (like anybody who uses a Linux-based router and cares about the security of their network) uses vendors who deploy timely security updates.
But lately Google has decided that rooted phones are a security issue. So if you do choose to install some sort of su utility, some functions like Android Pay may cease to work, not because of technical reasons but because Google deliberately disables them on rooted phones.
For example: https://wiki.lineageos.org/devices/lux/install
What they've done could be said to be more authoritarian, and indeed if you do the analogous of throwing everyone in jail by default because they could be guilty, then you will basically have no crime. The question is whether that's actually a good idea... it's the old "freedom vs. security" argument.
And they support a lot of devices. I dug out an old Moto G (1st generation, falcon) last week and it runs Lineage 14 (based on Android 7/Nougat) smoothly.
There's also Tor with Orbot and Orfox.
But only explicitly configured apps will use this proxy. Proxying all traffic does require a rooted phone.
In general, Android gives you more options but iOS seems to offer a more privacy-minded configuration OOTB.
Same with Tor. You can use an unlisted bridge to get on the network.
It's not as good as auditing the code (and making sure the code you see is what actually runs), but it's a lot less effort, and depending on the extent to which you are ready to dedicate resources to improving your privacy, may be an acceptable tradeoff.
But, yes, it's mostly based on a trust rather than knowledge obtained during some audit. (I'd even say that trusting is sorta the opposite of knowing)
Disclaimer: never owned any Apple device, ever. But they're doing quite well (or, should I say, others are doing so damn terrible in regards to user security and privacy, Apple starts to look not so bad) so I may consider them as an option.
Also, it's still an improvement over cops borrowing your thumb. At least this way, you have less chance of getting hurt.
Once they have that warrant, it doesn't matter what biometric locking method you use.