Wonder if this will help to kill the meme about how much Apple cares about users and what great values they have, how they're going to stand up for the user, fight governments, etc.
While the iPhone itself is pretty secure as a device (and Apple makes sure to remind you of that in every ad, public talk, and attack on competitors), as an ecosystem it's not secure. And it's like that on purpose - there's no good and easy option to back up your phone other than iCloud.
You have to be a tech person to know how to keep an iPhone secure. Average Joe buys an iPhone and pays for iCloud. They pay Apple $1k up front to get (on top of other things) a secure device, and then they pay Apple a monthly fee to hand over all their data and make it insecure. Pretty genius business strategy.
> Wonder if this will help to kill the meme about how much Apple cares about users and what great values they have, how they're going to stand up for the user, fight governments, etc.
In this instance, Apple decided to continue to not encrypt iCloud backups because, according to one source,
> […] the company did not want to risk being attacked by public officials for protecting criminals, sued for moving previously accessible data out of reach of government agencies or used as an excuse for new legislation against encryption. [0]
Apple's stance on privacy is more than mere marketing and more than a meme, but their legal team decided that encrypting formerly unencrypted backups, which had already been used as evidence in previous cases, would be ill-advised.
Most people, including technically knowledgeable users here on HN, were unaware iCloud backups have always been unencrypted. Many of us concerned about privacy have avoided iCloud backups because they are subject to subpoena.
I wish Apple would (have) offered encrypted iCloud backup as an option, and I understand why they chose not to. However, I disagree that their stance on privacy is mere marketing. Apple has a balance to strike between the issues of encryption, privacy, and law enforcement, and their products are not perfect for either users or law enforcement.
That doesn't mean Apple doesn't care about its users and their privacy.
>Most people, including technically knowledgeable users here on HN, were unaware iCloud backups have always been unencrypted.
iCloud backups are definitely encrypted. They just aren't end-to-end encrypted.
That should have been plainly obvious to any technically knowledgeable user because you don't lose your data forever when you forget your Apple ID password. Or the fact that you can see your photos through a web browser on icloud.com
> That should have been plainly obvious to any technically knowledgeable user
As you can see from this discussion, there are a lot of technically knowledgeable people who did not find this "plainly obvious". It's disingenuous to blame users for Apple's misleading marketing.
Can you explain how this works? I'm assuming my iPhone itself is encrypted with my own password (or thumbprint) and that Apple could absolutely not get into it without that thumbprint.
But when you send a backup, it's decrypted, sent to Apple, and then re-encrypted with a key that Apple controls, and has nothing to do with any of the "things I personally know"? IOW, I couldn't myself decrypt my Apple backup stored on their servers?
Apple can't comply with a law enforcement request to get into your iPhone. They can comply with anything in your iCloud Backup except for the things under End-to-end encrypted data on this page [1]. Backups are encrypted "in transit" and "at rest", but Apple retains a key. That's how they can comply with legal requests and how you can restore your data if you forget your password.
>That should have been plainly obvious to any technically knowledgeable user because you don't lose your data forever when you forget your Apple ID password. Or the fact that you can see your photos through a web browser on icloud.com
You could build a system where there's a key stored on each of your devices, and your password also acts like a key. In that case, you could
- Lose all your devices, but unlock with password
- Forget your password, but unlock with any device
- Lose one device, but unlock with your password or any other device
Of course, most people's passwords would probably be way too weak, no matter how much time Apple spends stretching them into a key, but for people who use a decent password this would be fairly secure.
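The scheme above is basically key escrow under multiple "key-encrypting keys": one random data key protects the backup, and a wrapped copy of it is stored per unlock method. Here's a minimal stdlib-Python sketch of the idea; the XOR "wrap" stands in for a real key-wrapping cipher (production systems would use something like AES-KW), and all the names and parameters are mine, not any vendor's.

```python
# Sketch: one random data key, wrapped once per unlock method.
# Losing any single unlock method still leaves the others able to
# recover the data key. Illustrative only: real systems wrap keys
# with AES key wrapping, not XOR, and encrypt the payload with AEAD.
import hashlib
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def wrap(data_key: bytes, kek: bytes) -> bytes:
    # One-time-pad-style wrap; stand-in for a proper key-wrap cipher.
    return xor(data_key, kek)

unwrap = wrap  # XOR is its own inverse

# The key that actually encrypts the backup payload.
data_key = os.urandom(32)

# Unlock method 1: a key stretched from the user's password.
salt = os.urandom(16)
password_kek = hashlib.pbkdf2_hmac("sha256", b"correct horse battery", salt, 600_000)

# Unlock methods 2..n: a random key held in each device's secure hardware.
device_keks = [os.urandom(32) for _ in range(3)]

# The server stores only wrapped copies, never the data key itself.
escrow = [wrap(data_key, password_kek)] + [wrap(data_key, k) for k in device_keks]

# Forgot the password? Any surviving device key still recovers the data key.
assert unwrap(escrow[1], device_keks[0]) == data_key
# Lost every device? The password-derived key works too.
assert unwrap(escrow[0], password_kek) == data_key
```

As the comment says, the weak point is the password path: if the password-derived wrap is recoverable by the provider, the whole scheme is only as strong as the password plus the stretching.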
> Apple has a balance to strike between the issues of encryption, privacy, and law enforcement [...]
No, they do not. If Apple wants a reputation for privacy and respecting its customers, then it has to put them first. Don't apologize for them making this user-hostile choice.
We could be merely a generation away from the hell hole that is social credit. We can't afford to keep ceding ground on privacy. We have to engender a sense of importance and urgency.
I'll be very mad at the rest of you lot that choose convenience over privacy, rights, and autonomy if 2030 sees our freedoms and liberties eroded further.
Our collective choices matter, and that's why I'm calling you out.
1. Yes, they have to. In the US, at least, any company has to cooperate with law enforcement, as you might know. The only way to do business in the US, or on US-governed soil, is to comply.
2. Apple has at least put some effort into this matter, because otherwise there would not be so much media attention on breaking into iPhones. Getting data off an Android device, on the other hand, seems to be no problem at all.
3. We are already in hell but we do not yet see the flames surrounding us.
4. If collective choice matters so much, we are all doomed and you know it. Try telling the Joneses about encryption etc. - they will still use WhatsApp, Facebook and the like, because those are so convenient compared to the burden and hard work of switching, or of living without all these "magical devices and services".
Then please tell me why organizations use canaries [0] to tell people they were not forced to cooperate?
The funny thing is that I just read a bit of the linked Wikipedia article to find this:
Companies and organizations who no longer have warrant canaries
The following is a list of companies and organizations whose warrant canaries no longer appear in transparency reports:
* Apple
* Reddit
* Silent Circle
>I'll be very mad at the rest of you lot that choose convenience over privacy, rights, and autonomy if 2030 sees our freedoms and liberties eroded further.
> Our collective choices matter, and that's why I'm calling you out.
I'll be direct and say that your passion inspires me and your articulation of the issue is very close to my deep feelings about privacy, encryption, and law enforcement. Of course, human brains are complicated pieces of kit and the rational part of me tempers "my deep feelings" with the context I elaborated in the parent comment.
Still, I wanted to take the time to thank you for expressing this in precisely the way you did. Thank you.
I also want to take a moment to recommend an article by John Gruber (Daring Fireball), "Regarding Reuters’s Report That Apple Dropped Plan for Encrypting iCloud Backups", which critiques the basis of the Reuters article. [0] (I'd have made it a post except for being a bit overactive this morning with submitting Daring Fireball-sourced links to HN.)
You might feel less disillusioned considering Gruber's points, many of which are excellent.
> We could be merely a generation away from the hell hole that is social credit.
I fear we have one foot there already. If you are so inclined, you can dig up a lot of dox on most people, all freely given to social media, or via scrapers like Spokeo.
I guess it's all still optional, kinda. And there is no centralized clearinghouse. But it is scary.
>If Apple wants a reputation for privacy and respecting its customers, then it has to put them first.
"or used as an excuse for new legislation against encryption."
By not making iCloud backups end-to-end encrypted, they likely did exactly that.
Although I'd find it funny if by doing so they caused a backlash from governments with new laws that ruined encryption for everybody. It'd be dystopian but also funny.
> Privacy is a fundamental human right. At Apple, it’s also one of our core values. Your devices are important to so many parts of your life. What you share from those experiences, and who you share it with, should be up to you. We design Apple products to protect your privacy and give you control over your information. It’s not always easy. But that’s the kind of innovation we believe in.
When Apple markets themselves this way, people expect their actions to be in line with their words. Making iCloud device backups vulnerable to law enforcement demands is not in line with Apple's privacy-oriented marketing, and suggests that Apple's vocal defense of encryption is just a public relations strategy.
Somehow it seems like you have mistaken my comment for sarcasm. I wasn’t straw manning anyone, I was simply lauding a principled stand and articulating my desire to see that in more places.
It is mere marketing, if it is meant to convince consumers of one thing while in practice it's a different thing. The reasons why are irrelevant. Apple would prefer everyone just assumed iCloud backups were encrypted, and tries to keep quiet about it.
You need to be extremely technical to understand the difference between "Encryption: Yes" and not end-to-end encrypted. To the lay user, Apple is explicitly telling you that they're encrypted. See https://support.apple.com/en-us/HT202303.
Nice trick. The big table at the beginning has "Yes" almost everywhere, but in fact the data are accessible by Apple. And they don't even have warnings like "Your data might be decrypted and given to law enforcement agencies". An absolutely deceptive article, even though Apple hasn't written a single untrue word.
The FBI proposed that Apple could write a new operating system variant that would let the FBI robot-brute-force the password, for certain older models at the time (without a Secure Enclave).
I believe Apple changed how operating system updates were installed after this point so that they did not have the capability to upgrade the phone's operating system without access to the device passcode.
It's one thing for them to not encrypt the backups; it's another thing to go out of their way to include the iMessage private key in the backup, which completely undermines iMessage E2E for 99% of users.
If Google is able to introduce this then why can't Apple?
>Starting in Android Pie, devices can take advantage of a new capability where backed-up application data can only be decrypted by a key that is randomly generated at the client. This decryption key is encrypted using the user's lockscreen PIN/pattern/passcode, which isn’t known by Google.
>By design, this means that no one (including Google) can access a user's backed-up application data without specifically knowing their passcode.
I like it. If you want, you can make a local encrypted backup. If you are lazy (and lots of terrorists and drug dealers probably are) you just flip the iCloud switch on at the first prompt. :) I have read that airport security is also not 100% effective, but it is enough for these kinds of problematic people.
But you can't do so automatically. You have to open iTunes (or Finder) and explicitly click a button.
If Apple is privacy-oriented, they should've allowed this a long time ago. Unlike iCloud backups, this is certainly not a legal issue, as the "local" backups stored on one's computer are already fully encrypted.
As someone who has bought into that meme I will admit this feels like a pretty huge betrayal by Apple. So, yes, I think if Apple sticks with this, their whole privacy stance is going in the toilet now. And a very dirty toilet it is.
Beyond just the facts of not protecting data, there is also the deception. This is some really very, very nasty stuff for Apple's brand and the reputation of every person who works at Apple. I don't know how to state it strongly enough.
Huge Apple fan until today... see my comment history... this is devastating for them amongst the sliver of their users who pay attention to this stuff. And they should realize that even though we may be just a sliver, we can lead other customers away from them if we want to.
There is a plausible argument that Apple needed to give a little in order to avoid the creation of laws against any encryption, and/or laws that would require a backdoor to everything.
I know I'm going to be called a fanboy or too generous to Apple, but given that the government has used every opportunity to call out Apple for not helping (when they have helped where they could), there is a line here that Apple is tiptoeing around.
I also do not think this as bad as you are making it out to be. Apple has always been clear what is fully E2E encrypted and what is not. This article is about something Apple planned to do and decided against. The reasons are what's important and the article only speculates.
Sadly, I think you are absolutely correct. Lindsey Graham said outright that either the tech companies figure it out, or the Senate will do the figuring for them.
I am not sure Apple made the right move, but... the average person does not seem to care about and/or understand the implications. Now, Apple could make them care. They are big enough to make waves, and I am not certain the government could deal with the bad PR come election time.
Sure, but if iOS was open enough, users who cared could use some third party online backup that was actually secure. And it could rely on an app that users could obtain, regardless of whether it was legal or not.
There's no way to legislate backdoors now. They tried and failed with Clipper.
The current status quo is good enough for the spooks. Zero regulation of data privacy allows third-party aggregators to do the desired collection activities without explicit government involvement. When they want something they know who to ask, warrant optional. Enacting laws that expose what the government is doing would risk a public backlash like the mass mobilization to deploy HTTPS.
What you say makes sense. Still, if that’s the case, then when they decided not to go down the user-is-in-full-and-absolute-control path for encryption of iCloud backups, they should have publicized it loudly and with extreme clarity on what exactly was happening and where the lines were. So that users could make informed choices.
They have never hidden how iCloud backups or anything else related to iOS security works. This support document spells out clearly what data is end-to-end encrypted [1]. No one was actually misled into thinking all iCloud data was E2E. For one, most of Apple's customers don't know or care about the technical architecture of their products and services. The people who do would have known better when you can go to icloud.com and access your photos and files from a web browser.
When the very first line of that table tells people that iCloud Backups are encrypted on the server... to then have the last few lines add effectively "Oh, but not end to end!" is just taking the piss.
You're absolutely right. They could have definitely misled anyone that didn't read the entire support article, including the first paragraph under Data Security:
>iCloud secures your information by encrypting it when it's in transit, storing it in iCloud in an encrypted format, and using secure tokens for authentication. For certain sensitive information, Apple uses end-to-end encryption. This means that only you can access your information, and only on devices where you’re signed into iCloud. No one else, not even Apple, can access end-to-end encrypted information.
Technically telling the truth below the fold or in the fine print, while misleading consumers who only give the literature a glance. This is very typical behavior from a corporation, it shouldn't surprise us. Except that Apple's marketing team has managed to dupe a huge number of consumers into believing Apple 'thinks different.'
Typical users who may care about privacy were definitely misled by Apple's public pro-security and pro-privacy stances. I have family who fall into that category.
The difference between E2E and 'yup we're encrypted!' isn't understood by laypeople. Let's not do ourselves or the average folks out there a disservice by letting Apple off the hook for bad communication and the intentional misleading of users.
So it seems like the data are encrypted both in transit and on the server and it means that nobody is able to get unencrypted data even if they can intercept the traffic or access the server.
It has been known and talked about on HN for a long time that only certain things are E2E encrypted on iCloud. And, if full privacy was the goal, then either the user can only do local backups or no backups at all.
HN is among the most technologically-literate demographics in the world. Using HN as a control group to say that it's been 'known and talked about' is a bit disingenuous when we're the proverbial 1% who are in the know. Meanwhile, the other 99% are left trusting Apple's advertising.
Just to understand this better: is it that Apple is misleading, or that the public is uninformed? When trusting third parties, the default assumption should be "others have access, including law enforcement".
I don’t see how it’s a betrayal, as iCloud has never had E2EE, and Apple has never made any secret of this omission.
Did we really think that Apple was just so incompetent they could run online file services for 20+ years and have almost a billion users, and not have encryption only because they hadn’t figured it out yet?
Yeah, but what are the reasonable alternatives? Android, with its freewheeling stance on privacy and app permissions? Do they even let you disable location tracking any more?
Depending on what functionality you're willing to give up, an Android device can pretty easily be used as a more private option thanks to the latest iteration of Google's privacy settings.
If you're willing to root your device, you can have the best of both the functionality and privacy worlds.
That only applies to motivated and reasonably tech savvy users of course, out of the box iOS is still the better privacy option.
My favorite Apple deceptive practice: allowing you to think you have disabled Bluetooth, when all you have disabled is your own ability to use it. Merchant partners of Apple can still use it to finely track you.
> Do they even let you disable location tracking any more?
You're confusing iOS with Android. On iOS, every time you get your location, that location is also sent to Apple, and there is no way to disable this. Android's collection of this data is gated by a checkbox that is shown to every user on device setup.
Last time I used Android there was no option for disabling location tracking, as they had removed it a few major versions prior. Instead you had "enabled" and "kind of disabled, but not really" options.
I have used Android devices since 1.0. They have all had the ability to disable location history and the ability to disable Google Location Services (under various names). Apple doesn't even give you the option of not sending GPS locations to Apple. If any app requests your location, Apple gets it too. https://support.apple.com/en-us/HT207056
> That’s just not enough for disabling it on Android.
Citation needed.
> I have been working with Android since even before 1.0, but does this really matter?
It matters if I am claiming that there is no point at which GGP's statement was true, which I am. If I am not, GGP could say that the version of Android he used "last time [he] used Android" did not allow him to disable location collection and that it predates my experience with Android's location settings.
Practically speaking there is no privacy in the digital world, unless you're willing to go full Stallman, or abandon modern society and go live in a mountain shack somewhere, and even there you'll have dozens of Starlink satellites overhead pretty soon.
The best you can do if you're technically inclined is to use open source as much as possible (AOSP, postmarketOS, PureOS, etc.), minimize use of untrusted software and services (from all major corporations, no social media, no proprietary software in general), use network-level ad blockers, Tor if you think it helps, VPNs, encrypt everything, etc. And you'd still be tracked and profiled.
If you're not technically savvy, forget about it.
In either case if privacy is really a concern vote to elect politicians that are willing to enact laws that regulate the way companies can use personal data. Though considering both companies and governments benefit from the status quo, I don't foresee things improving in the near future, barring some kind of revolution where the majority wakes up, which is also unlikely. If the Snowden revelations didn't do it, I doubt anything will.
Honestly I didn’t realize icloud backups were encrypted by a key Apple controls. I assumed they leveraged the same key sharing tech as iMessage ...
I’m turning off iCloud backup — the only reason I have it on actually is to turn off the annoying backup nags. My photos are in iCloud, and so are my messages - I actually can’t think of anything valuable outside of those that I’d actually need iCloud backup for ...
I have not bought into the meme and am not a huge Apple fan, but I think you can continue to if you wish. "Reuters could not determine why exactly Apple dropped the plan." It's possible that Apple didn't want the support costs of dealing with customers who lost/forgot whatever secret was to be used for encryption, and thus would have been unable to use their backups.
> there's no good and easy option to backup your phone other than iCloud
This is false. You can back it up with Finder (since Catalina) or iTunes on Windows or previous macOS versions. You can even do wireless syncing over your local network.
All of the infrastructure that Apple built out before iCloud still exists. You can sync your photos with the Photos app locally. It’ll one-click import everything and you can have it automatically clear out storage on your phone after the import has completed as well. You can then back up your computer however you want - locally, encrypted with Time Machine is an easy option.
libimobiledevice works just fine on Linux. Your distribution's version may be out of date, but the upstream version gets updated to match any iOS changes pretty quickly.
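For anyone curious, the libimobiledevice route looks roughly like the sketch below. The commands are from my recollection of the tools' interfaces and the backup directory is a placeholder; check `idevicebackup2 --help` on your system before relying on the exact options.

```shell
# Pair the phone with this computer once (a trust prompt appears on the device).
idevicepair pair

# Turn on backup encryption so the archive on disk is password-protected.
idevicebackup2 encryption on

# Take a backup into a local directory; subsequent runs are incremental.
idevicebackup2 backup ~/iphone-backups
```

The result is the same encrypted backup format iTunes/Finder produces, just stored wherever you point it, so you can ship it to any storage you actually trust.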
> there's no good and easy option to backup your phone other than iCloud
Except plugging it into a PC or Mac with iTunes, toggling a checkbox, and setting a password. No, it's not effortless, but it's not bad and it's not difficult.
It's less about Apple here and more about the government forcing its control. Apple, Microsoft, and many other companies have gone to Congress to try to keep it from controlling encryption. At the end of the day they're a company, and they need a certain level of cooperation from the government to do business.
THIS is why open source matters. The government can't control people when money isn't what motivates them.
> Wonder if this will help to kill the meme about how much Apple cares about users and what great values they have, how they're going to stand up for the user, fight governments, etc.
In China, all your data belongs to China. That is simply the way it is if you choose to do business there. You might point fingers at Apple and other companies who operate there, but China is a huge market. The Chinese people know their data is all backdoored, so there is no coverup or obfuscation.
> The keyboards on their laptops are barely functional
This must be some definition of "barely functional" I'm unfamiliar with.
I've had a mid-2017 MBP since they were released. Yeah, I had to get the keyboard replaced when some keys failed after a year, but at least they did it for free. Actually, overall I prefer this keyboard to the 2013 model I had previously. I think their failure rate is unacceptable, but they are certainly not "barely functional" by even the remotest interpretation of the phrase.
> Yeah, I had to get the keyboard replaced when some keys failed after a year
As a person who had to replace a microsoft ergonomic keyboard after 14 years and has never had to replace a laptop keyboard I would consider a laptop keyboard that was replaced after a year due to failed keys to qualify as 'barely functional.'
It's the 21st century. Keyboards are a solved problem that should never fail. It's an abject failure on the part of Apple to release a flagship laptop with such a comparatively shitty keyboard.
No keyboard should be failing at this point. Keyboards are a mature technology.
The new keyboard design was meant to solve problems that earlier keyboards had not specifically addressed. It was a failed design, absolutely. But 99% of the time the keyboard works to my satisfaction, so I do not understand how you can label it "barely functional". "Barely functional" means "it rarely functions as intended", which is obviously not true in any case.
Does it fail more than the average keyboard? Yes. Am I pleased about its failure rate? Absolutely not. Would I rather have a different keyboard? Uhhh probably? I like the way the keys feel, such as the fact that they don't wobble in place like the previous generation, and I find that I do not type any slower or make more mistakes than I did previously. Probably this keyboard was designed for somebody who types like I do, and perhaps this is not the way most people type. (I say this because most people complain about the reduced travel, about the keys not registering presses all the time, and other issues.)
So my point is: the keyboard is bad as measured by multiple metrics, and Apple should feel bad about it, but there is no conceivable definition of "barely functional" that applies to this keyboard. Most of the time I have not had to worry about its failure rate because it does not affect me on a daily basis, even though I use it for typing every day.
It's heavily dependent on personal taste, but for example I make easily twice as many mistakes typing with the new keyboard than I did with the old keyboard, or any other keyboard I own, on other laptops or standalone.
I wouldn't call it "barely functional" either since I can clearly still type on that keyboard, but the combination of mechanical failure rate and personal accuracy issues means that I buy the MBP despite the keyboard.
Yes, this is absolutely a fair take. I've heard from multiple people that they seem to make more mistakes with this keyboard, and other people say they just don't like the way it feels.
Personally, I find it to be an improvement over the previous generation in almost all respects except for, obviously, the failure rate. I hate that I am expecting the keyboard to fail again within a year or two, but perhaps I will be surprised and this particular one will last long enough that it won't be a problem for me.
>While the iPhone itself is pretty secure as a device
I simply don't understand why people would blindly believe marketing material from a for-profit corporation. It's a device running closed source software. There is no way to prove this claim. If anything, Apple has been caught in the past sending very very personal sensitive information [1].
Not blindly. Outside of marketing material there have been more meaningful signs and signals indicating that they did care. Including principled people working there, threatening to quit if the FBI got its way on San Bernardino, etc. This is new.
I work at Apple in engineering. User data privacy is a VERY big deal and something which is taken really seriously. It is not just marketing lip service.
Is Apple perfect? No, of course not. Are there things I think they should do better? Yes. E2E encryption with user keys stored in Secure Enclave, etc, etc. However, for Joe Consumer, I think they make the most secure, easiest to use platform.
You can trust an organization to do some mix of pursuing its goals and filling someone's pockets. So while you can trust some non-profits, you can never trust a for-profit about anything you care about as a customer.
Taking this argument to its logical conclusion, you can't trust eating food someone grew for profit, riding in a car someone built for profit, wearing clothes someone made for profit…
Under capitalism, encouraging repeat business from a customer is a successful long-term strategy, and maintaining customer satisfaction by not selling them food that will make them sick, cars that have faulty brakes, clothes that deteriorate in the rain, or so on is part of that. So if I purchase a product from a company that's been around for a while, I can have a reasonable expectation - a trust - that that product will be of some quality.
> Taking this argument to its logical conclusion, you can't trust eating food someone grew for profit, riding in a car someone built for profit, wearing clothes someone made for profit…
> So if I purchase a product from a company that's been around for a while, I can have a reasonable expectation - a trust - that that product will be of some quality.
That's pretty naive. Companies that used to produce quality products often switch to producing garbage once they have established a brand people trust.
For-profit companies kinda work when they are being watched and it's easy to judge the quality of their products – i.e. when you don't need to trust them.
It's really a matter of market efficiency, where efficiency is dictated by how much information is available. In a market where information is accurate and timely, companies cannot abuse customer trust because doing so for anything the customer cares about will result in them valuing that company and its products less. Different industries have different levels of information available, making it easier or harder to depend on what they say or show.
Apple is specifically hard to reason about because some of their value is based on their products being a status symbol, and some is based on their functionality (this is true to a degree for most products, but for a tech product Apple's is skewed quite high toward the status-symbol side).
On the one hand, Apple has been fairly straightforward in their communication and not lied often, and also has a business model that has less conflicting interests when it comes to consumer privacy, but at the same time they could conceivably get away with more because of their position as a status symbol.
What this means for me is that I take Apple's claims about consumer protection fairly seriously in most cases, but for anything important I would not rely on them. They've banked a lot of good will, and at some point they might see it as worthwhile to make a withdrawal on it (if they hopefully haven't already), and I would rather not be in a poor position if/when that happens.
> would blindly believe marketing material from a for-profit corporation
I am equally likely to believe or disbelieve marketing material from non-profits. Look at the malfeasance and lies from the Red Cross [1] -- or the sky-is-falling proclamations from the net neutrality crowd -- predictions of doom that never came to pass -- not to mention the lied-about motivations around net neutrality (the real motivation was about who pays for bandwidth.) [2]
That meme should have been diminished earlier. People should understand that while Google keeps much of the data it collects to itself, Apple shares it with advertising partners.
By the way, people have come to hate Google for some advertising and for data collection that is easy to switch off, but one should at the same time hesitate to vote for Apple with their wallet: they are a monster company which patents rounded corners on phones and things like optional chaining in Swift. That just goes unnoticed in circles like HN. IMO Apple has always been more evil than Google.
You should look into the 'borg' backup tool - it has become the de facto standard for remote backups because it does everything that rsync does (efficient, changes-only backups) but also produces strongly encrypted remote backup sets that only you have a key to ... your cloud provider has no access to the data.
How does this help with the context we're dealing with here: the iPhone (and other iDevices) being heavily encrypted/secure and iCloud not being secure?
Quickly looking at Borg's website (thanks for the heads up - great tool/option) I see it doesn't support iOS or backing up an iOS device.
I assume you're just suggesting it as a general purpose option for general backups on the desktop?
After looking at a few alternatives (Borg, Duplicacy etc.), I set up Arq on my Mac yesterday.
One thing that irks me about these solutions is that they seem to scan my folders each time they want to back up. Are there tools that are smarter about this? For example, while running, they could keep a log of what's changing and only scan those paths while backing up.
> After looking at a few alternatives (Borg, Duplicacy etc.)
I may have looked at Duplicacy. I'm sure that I looked at one of Duplicacy [1], Duplicati [2], and Duplicity [3]. Whichever one that was, I kept getting it mixed up with the other two when looking for information online, and finally said "screw this" and bought Arq.
Arq for Mac is great for backing up, though the restore util could use a little work. It's not exactly user friendly.
I've used it on my Macs for years without any issues at all. I switched after Time Machine broke down for the nth time in a month saying it needed to recreate the backup, and not once in the 3-5 years I've been using it has it ever given me any problems with broken repositories, and every integrity check/restore has succeeded.
Arq on Windows is a different beast though. I'm sure it's technically solid, but the UI leaves a lot to be desired. On Windows boxes I default to Duplicati.
On my servers I use Borg like any sane person would, but the lack of a good client UI makes scheduling backups on a personal computer a lot more work than I'm willing to put in.
You would probably need the kernel / filesystem to keep track of changes for this to be efficient. And even then it is iffy whether you implement it as a kernel extension vs. something integrated into the filesystem code.
I've found Arq backup slow overall once you start hitting 0.5 TB. It's a design issue.
I am backing up ~130 GB of data (a combination of large and small files) every 12 hours to S3. The first upload took quite some time, but all the later ones take ~10 minutes in total.
Rclone simply copies data. If you `sync` `~/Documents` to your remote it will keep an exact copy.
This is a simple backup since you only have one version. Anything deleted, the next time it syncs, gets deleted.
Borg is a backup tool. Versioning is at its core. It does that efficiently by deduplicating file (chunks really) even if they’re not in the same location.
So with Borg, if you create a backup 1 of `~/Documents` today and a backup 2 tomorrow of `~/Documents` you can see both backups and work with each snapshot. The size it takes should be close to the amount of data changed in the whole source.
If you move directories or files around, rclone has to reupload them. Borg detects the move and doesn't have to store the data again.
With rclone some remotes have versioning (Google Drive, Dropbox). This could help in this case, but it depends on the remote. With Borg this is built in, and you can change the underlying storage and migrate without losing any data. Using versioning with crypt would probably be a pain too due to the file names. Not sure if rclone has commands/flags to help with this that I simply don't know about.
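Borg's chunk deduplication can be sketched in a few lines of Python. This is only a toy to show the idea: it uses fixed-size chunks and a plain dict as the chunk store, whereas Borg actually uses content-defined chunking and encrypts each chunk.

```python
import hashlib

CHUNK = 4  # toy chunk size; Borg uses variable chunks on the order of MiB

def backup(data: bytes, store: dict) -> list:
    """Split data into chunks, store each unique chunk once, return a recipe."""
    recipe = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # dedup: identical chunks stored once
        recipe.append(digest)
    return recipe

def restore(recipe: list, store: dict) -> bytes:
    """Reassemble a snapshot from its recipe of chunk digests."""
    return b"".join(store[d] for d in recipe)

store = {}
snap1 = backup(b"AAAABBBBCCCC", store)      # day 1: three chunks
snap2 = backup(b"AAAABBBBCCCCDDDD", store)  # day 2: only one new chunk stored

assert restore(snap1, store) == b"AAAABBBBCCCC"
assert restore(snap2, store) == b"AAAABBBBCCCCDDDD"
assert len(store) == 4  # 28 bytes of input, only 4 unique chunks kept
```

Both snapshots remain fully restorable, yet the second one costs only the size of the changed data, which is the property described above.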
"Rclone simply copies data. If you `sync` `~/Documents` to your remote it will keep an exact copy."
...
"This is a simple backup since you only have one version. Anything deleted, the next time it syncs, gets deleted."
This is correct. It is widely advised to not consider a "sync" like this a proper backup.
However, for what it's worth, rsync.net does support rclone[1] and because of the ZFS snapshots that are created and maintained[2] in your account, you can just do a dumb sync because the retention is handled by the snapshots.
I am not sure if rclone is really the right tool for plain old cloud backups - I think rclone distinguishes itself for the ability to transfer data between cloud providers.[3]
Why would you think it was end-to-end encrypted? Did you never use icloud.com, where you can simply access all your iCloud data with a username+password?
Well, you just have to “trust” the server to not serve a website that will phone home your password. Apple pinky swears they won’t do that, and all your browser extensions running all the time do, too.
As Mark Zuckerberg once opined: “They ‘trust’ me. Those dumbfucks.”
And it’s not just mere words; here is how he actually set up a honeypot site to get people’s passwords and break into their emails to satisfy his burning curiosity when he first launched Facebook:
Instead, he decided to access the email accounts of Crimson editors and review their emails. How did he do this? Here's how Mark described his hack to a friend:
Mark used his site, TheFacebook.com, to look up members of the site who identified themselves as members of the Crimson. Then he examined a log of failed logins to see if any of the Crimson members had ever entered an incorrect password into TheFacebook.com. In the cases where they had, Mark tried those passwords against the Crimson members' Harvard email accounts. He successfully accessed two of them.
In other words, Mark appears to have used private login data from TheFacebook to hack into the separate email accounts of some TheFacebook users.
In a world where we are sending our Alexa data to “the cloud” and the companies are admitting people are listening to it, in a world where Facebook secretly records everything it can, why would you assume your password isn’t being sent?
To the downvoters... you may say that this was only when Mark Z was a young man and now Facebook the company is far more responsible. But then we have this from just a few months ago:
I can understand trusting a browser or a software release that can be tested by many people, and was signed with a checksum. But a website and all your extensions on every website?? Those can ship new code at any time.
I think the truth of the dumbfucks argument is too painful for some people especially seeing how things are the same given recent developments. As an engineer I would be deeply ashamed if I were still working for Facebook.
First party backups from Apple (iCloud/iTunes) restore a perfect replica of the phone, including app icon locations, arrangement, notifications, offloaded storage etc. I'm honestly skeptical that anyone else would be able to pull that off.
Yes, Apple has private APIs that it uses to the benefit of its monopoly abuse. That is why Apple's "Music" app can't be deleted from your computer ("'Music.app' can’t be modified or deleted because it’s required by macOS.") but Spotify can be deleted.
The solution is for Spotify to sue them on this specific issue and for other people to sue them similarly.
Wikipedia says: "Created by DigiDNA, the software was initially released in 2008 as DiskAid, enabling users to transfer data and files from the iPhone or iPod Touch to Mac or Windows computers. DiskAid was renamed iMazing in 2014."
The point of key derivation is that it can use a key to encrypt that is in turn protected by another key/password. So the amount necessary to re-encrypt when your password changes is just the encryption applied to the key. A similar technique is used in local disk encryption, where you don’t need to spend hours re-encrypting your hard drive just because you’ve changed your local account password...
Then it must need my password to decrypt the key which was used to encrypt the raw data? What if I do not tell them my password, (assuming my password is one way hashed and stored) would that brick the key and in turn brick the data? Clearly I am missing something here..
Edit: or since it is "derived" and not really password which is used for encryption -- the derived thing could well be the hashed password. We are doomed. They might as well serial number their user and use that as key then. Never mind.
You’re overthinking it. Create a private key. Protect that key with a pass phrase. If you change your password, you’re really changing the pass phrase.
Does that clear it up at all?
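The key-wrapping idea above can be sketched with Python's standard library. This is purely illustrative, not Apple's actual scheme: the XOR "cipher" is a toy stand-in for a real cipher, and all names are hypothetical. The point is that changing the password re-wraps only the 32-byte data-encryption key, never the bulk data.

```python
import hashlib
import os

def kdf(password: bytes, salt: bytes) -> bytes:
    # Derive a key-encryption key (KEK) from a passphrase.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR data with a SHA-256 counter keystream.
    # Applying it twice with the same key returns the original data.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# One random data-encryption key (DEK) encrypts all the bulk data.
dek = os.urandom(32)
ciphertext = xor_stream(dek, b"entire disk contents ...")

# The DEK itself is stored wrapped under the password-derived KEK.
salt = os.urandom(16)
wrapped_dek = xor_stream(kdf(b"old password", salt), dek)

# Changing the password: unwrap with the old KEK, re-wrap with the new one.
dek_again = xor_stream(kdf(b"old password", salt), wrapped_dek)
new_salt = os.urandom(16)
wrapped_dek = xor_stream(kdf(b"new password", new_salt), dek_again)

# The bulk ciphertext is untouched and still decrypts with the same DEK.
assert xor_stream(dek_again, ciphertext) == b"entire disk contents ..."
```

Local disk encryption works the same way: the hours-long re-encryption is avoided because only the wrapped key changes.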
FYI these concepts are originated from military crypto. The foundations are solid. Implementation... well you know how that always is.... one CVE away from perfect!!
The local backups are encrypted with a key separate from your iPhone passcode. You can change it in iTunes, not sure if it re-encrypts or not.
But of course, we are talking about local backups so if you have full-disk encryption or back them up to an encrypted virtual drive, you don't even need whatever encryption comes with them.
They can’t be; users lose their devices, then forget their passwords and need to reset and restore.
Also, all your photos and notes and email and other stuff in iCloud are available to Apple (and by extension the FBI et al) as well. Even Apple CSRs have a ton of access to the contents of a lot of iCloud services.
The situation is a lot better if you have all recent devices and 2FA turned on, then it can use iCloud Keychain for some stuff which is trust-circle based. You can read more in Apple’s latest platform security doc released late last year.
Not only this, but your iMessage E2E is protected in flight by a key that gets stored in your backup. So anyone recording your iMessages can retroactively decrypt them if they get your backup.
I don’t understand. Am I just tired or misreading? The table on this page clearly shows backups are encrypted in transit and at rest...
https://support.apple.com/en-us/HT202303
If it rains, it gets wet outside. But wetness outside does not mean it has rained. This is a recurring case where snake-oil and dishonest companies exploit seemingly obvious logic, because people in general are bad at logic.
To answer your question directly: TLS encryption from your phone to Apple's servers means they terminate the encryption at the other end when they receive your data. That is, "they decrypt the information that was in transit". Then they explicitly apply another layer of encryption to the received, decrypted data before storing it on disk. Since these are two separate steps, you have no protection whatsoever: Apple holds a registry of all the decryption keys for the disk backups, which it will happily use for whatever reason when it wants to get hold of your data.
The only thing their disk encryption protects against is if someone were to walk away with the physical disks. It protects squat against the threats customers actually care about (unauthorized access to the data by someone other than the customer owning that data).
And seeing as they run on AWS, physical security means that the only way metal leaves the data center is as millimeter-sized shredded metal grain. So the threat model of concern here is exactly what Apple has decided not to provide customers any protection against.
"Encrypted" is a weasel word. There are many examples of "it's encrypted" even when encrypted refers to 8192-bit keys, where the system is not secure. Examples abound.
So claims of "encryption" are meaningless.
Instead, claims of "only X, Y and Z could access to this" are meaningful.
"Could" is a strong word because it includes unforeseen circumstances such as writs and court orders.
On MacOS, iTunes/Finder stores (optionally) encrypted backups at
~/Library/Application Support/MobileSync/Backup
I was thinking of creating a symlink to Dropbox, for instance and having my cloud backup there. I don't know if the backups are incremental which could be a storage problem, but that can be managed through some scripting.
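The "some scripting" could be as simple as a timestamp-based mirror of the MobileSync folder into a cloud-synced directory. A minimal sketch, assuming the standard macOS backup path and a hypothetical `~/Dropbox/iphone-backup` target; it copies only files that are new or changed, which keeps repeat runs cheap:

```python
import os
import shutil
from pathlib import Path

# macOS location of iTunes/Finder device backups; target path is hypothetical.
SRC = Path.home() / "Library/Application Support/MobileSync/Backup"
DST = Path.home() / "Dropbox/iphone-backup"

def mirror(src: Path, dst: Path) -> None:
    """Copy only files that are new, or whose mtime/size differ at the target."""
    for root, _dirs, files in os.walk(src):
        rel = Path(root).relative_to(src)
        (dst / rel).mkdir(parents=True, exist_ok=True)
        for name in files:
            s = Path(root) / name
            d = dst / rel / name
            if (not d.exists()
                    or s.stat().st_mtime > d.stat().st_mtime
                    or s.stat().st_size != d.stat().st_size):
                shutil.copy2(s, d)  # copy2 preserves timestamps

if __name__ == "__main__":
    mirror(SRC, DST)
```

A symlink would also work, but an explicit mirror avoids Dropbox fighting with iTunes while a backup is being written.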
Since we hold the key, it's a blob nobody can get at without brute-forcing it.
Surely in the context of "End to End backups" the user is the "End" on both ends. The servers shouldn't have the keys. They can be as compromised as they want, but the most they should be able to see is when, how much and where from you're backing up data, but not the actual data.
You ever read the reviews for the MEGA app? Everyone complains that there's no password reset feature. MEGA is fully encrypted, so you literally can't reset your password. It says this when you first create an account and get a recovery key.
I don't think the general public would understand end-to-end encrypted backups. It would probably hurt their company if all backups were totally unrecoverable.
You do have the option. Local device backup with a password.
You can’t even turn off the backup password on an existing device for a new backup without knowing the old password (protecting against Evil Maid problem).
I’ve had to reset a device when I forgot my local iPhone backup password to get it back to unencrypted backups.
> More than two years ago, Apple told the FBI that it planned to offer users end-to-end encryption when storing their phone data on iCloud, according to one current and three former FBI officials and one current and one former Apple employee.
> Under that plan, primarily designed to thwart hackers, Apple would no longer have a key to unlock the encrypted data, meaning it would not be able to turn material over to authorities in a readable form even under court order.
Mega has an Alexa rank of 372 and over 50 million Android installs. They're a good example of how end-to-end encryption can be popular among the general public when executed properly, and they've even released the source code for their clients.
What did suck on Apple's part was that they wouldn't allow you to disconnect the actually end-to-end encrypted iMessages from the iCloud backups for many years.
So since most people kept their iCloud enabled, that meant their "iMessage end-to-end encryption" was nothing of the sort, as all messages had a copy that Apple could read on its servers.
I actually don't know if this is still true since I haven't used an iPhone in some time, but I sure hope it isn't anymore.
It's a huge problem that you believed something that was never suggested by Apple and was never part of the threat model. Apple runs the closed-source OS on your phone. You obviously can't hide information on your phone from Apple.
Well, you can choose not to use iCloud for that reason (as some of us do).
A bigger problem is that Apple deliberately locks up iDevices so it's hard to get your data off them using only local means, particularly if you don't also want to buy an Apple laptop just to do it.
If only iTunes worked reliably on Windows and didn't have a long track record of bugs, subtle usability issues causing catastrophic data loss, connection problems where it doesn't detect the device properly...
Let me know when iPhones and iPads support standard plug and play protocols that work universally without relying on either Apple's proprietary and frequently broken software or someone else's commercial alternative that attempts to do what Apple should have been doing all along.
Is there a specialised protocol for all types of data stored on iOS devices? Probably not. But plenty of other models of phone, tablet, camera and other data-processing devices manage to communicate just fine with Windows (or Linux or macOS) using generic protocols as USB mass storage devices, there is little excuse for iOS devices not to. In fact, you actually can download your photos and videos from an iPhone to a Windows PC by just plugging it in and doing the same as you would with your camera, but ironically this only works if you haven't installed iTunes.
So while it is “communicating” with iPhones, what exactly is it supposed to communicate over standard protocols that would alleviate the need for an application and still perform all of the functions of iTunes?
What other devices perform all of the backup, restore, and OS upgrade functionality of iTunes?
The iPhone doesn’t use the standard “usb mass storage” protocol to allow you to download pictures. It uses the picture transfer protocol.
> What other devices perform all of the backup, restore, and os upgrade, functionality of iTunes.
I've never claimed anything about any other functions of iTunes. I just said it was a problem that Apple devices make it difficult to get your data off using only local transfer.
> The iPhone doesn’t use the standard “usb mass storage” protocol to allow you to download pictures. It uses the picture transfer protocol.
So it’s difficult to get data off of your device - even though you can get pictures and video off of your device like any camera and most other documents are stored in their own folders on iCloud - that you can get to either using the iCloud Drive app for Macs or Windows or by logging into iCloud.com.
Have you forgotten that the entire point of this discussion was that iCloud is not properly encrypted, which is why it is such a problem that local transfer of all data is so difficult?
So now we are going back to needing some type of app to back up and encrypt, since there are no “standard protocols” that can do encrypted backups and restores...
If everything is being transferred off your device locally, why can't you use whatever backup arrangements you normally make for your other data?
We already have a good, properly secured backup system that we use for all our workstations and servers. We just want to be able to export data from any mobile devices we use and manage that data using the same policies and tools. In most cases, that is straightforward. The one big exception is the iOS devices.
I'm not making any claim at all about "most people".
I'm saying that iCloud isn't properly encrypted, which for some people and organisations will be a problem, and that it is then a greater problem for those people and organisations that it is unusually difficult to transfer data between iOS devices and other systems through other means because of the inhibiting choices that Apple has made.
It's much like the argument that the default behaviour for consumer software should normally be to install security updates automatically, but installation of updates should still be configurable for those who do know what they're doing and need more control over their systems.
If you’re part of an organization where security is important, you would force all of your employees to register with your MDM solution and prohibit any iCloud backups, you would probably have them using Office for iOS and tell them to save their files to OneDrive for Business and enable encryption.
I doubt many businesses are using iWorks with iCloud.
1) There is no way Apple would be allowed to sell iPhones in China without the Chinese government having access to everything. So I assume that Apple users in China have exactly nothing e2e encrypted.
2) I have a strong suspicion that those 'enter your Apple ID password because your account needs it' messages really mean 'a government has requested your data and even though it's encrypted, we will nag you about entering a password, and if you give it, you're fair game'.
I don't blame Apple for this, I'm sure they're doing what they can, but when a government says 'give us this data', they can't not comply. Vote responsibly - companies can't protect us from a government we have put into power.
Regarding #1: iCloud in China is operated by a mainland Chinese company and subject to that company's terms and conditions. So you can pretty much assume iCloud data is completely accessible by the government.
Apple uses third party data centers, if it can't host encrypted data on a Chinese server without China having access to the data, there is something wrong with the encryption.
> On Wednesday, Apple officially handed over its iCloud operation in China to a local state-run company, along with all encryption keys to unlock local user data. The switch will give the Chinese government unfettered access to the photos, emails and contacts of over 240 million iPhone users in China.
> "The simple fact is that once the encryption keys are stored on Chinese servers, they will be easier for Chinese authorities to access — with or without legal requests," says Sharon Hom, executive director of Human Rights in China, a US-based NGO. "Since Apple has declared its willingness to 'comply with Chinese law,' its reassurance that it, not its Chinese partner, would control the encryption keys is not exactly reassuring. In addition, Chinese authorities could bypass Apple to address their requests directly to Apple’s Chinese partner, a state-owned enterprise that, of course, would have to cooperate with Chinese authorities."
The keys exist on machines managed by the third party, which is controlled by the state. Therefore, the state has access to the data.
Edit: Apple itself has stated that the keys are in China. The option of having Apple devices talk through the Great Firewall to servers in the US that then encrypt the data for storage in China (and on the querying end, request encrypted data from China to decrypt and process in the US to serve back to devices through the Great Firewall) that Apple apologists wishfully theorize is every bit as ridiculous as it sounds. https://www.reuters.com/article/china-apple-icloud/rpt-insig...
I have not changed the subject. Apple clearly delineates which data is e2e encrypted and which data is not. Those same standards apply in the US and China - unless you have evidence otherwise.
I no more trust my privacy to the US government than a Chinese citizen should trust China.
You started this thread by responding to somebody discussing the Chinese government's access to all iCloud data, but you changed the subject to talk about systems where the private key is on device, which does not apply to iCloud. You absolutely did change the subject.
> Those same standards apply in the US and China - unless you have evidence otherwise.
Those same standards don't actually protect your data from whoever controls the iCloud server or whoever controls the iMessage key server. In the US, that is Apple, so Apple has access to that data. In China, that is the Chinese government. Therefore, the Chinese government has access to all Chinese iCloud and iMessage data.
> I no more trust my privacy to the US government than a Chinese citizen should trust China.
Then you are unfamiliar with the laws of both countries.
The laws of the US say a lot of things. But the facts are that all the government has to do is scream “terrorism”, “drugs”, “or think about the children” and they can easily get a warrant. The law states that one branch of government has to ask another branch of government for a warrant. You have to believe that the judicial branch actually would safe guard privacy and keep law enforcement from overreaching.
> You have to believe that the judicial branch actually would safe guard privacy and keep law enforcement from overreaching.
These warrants become public record. I don't have to blindly believe it. I can look at the records and see that the US is not even close to China as far as government access to user data.
> You started this thread by responding to somebody discussing the Chinese government's access to all iCloud data
If some of the data is e2e encrypted using private keys,China doesn’t have access to “all data”
> Those same standards don't actually protect your data from whoever controls the iCloud server or whoever controls the iMessage key server.
If the private key is generated by the same entity or “key server” that generates the public key, and then transmitted to the client, that defeats the entire purpose of public/private key encryption.
I’ve never seen an implementation of public/private key encryption where the client device doesn’t create the key pair and send only the public key to encrypt data.
> If some of the data is e2e encrypted using private keys,China doesn’t have access to “all data”
You have two mistakes in this sentence.
1. None of the iCloud data (mail, docs, drive, etc.) is E2E encrypted. Some of the data stored in iCloud (like keychain backups) is encrypted prior to being sent to iCloud (using symmetric encryption, not with asymmetric key pairs). China has access to the data that was ultimately sent to iCloud.
2. The way Apple implements E2E encryption for services like iMessage that are E2E encrypted allows China access to that data.
> If the private key is generated by the same entity or “key server” that generates the public key, and then transmitted to the client.
That's the point. Since Apple's implementation relies on a key server to distribute public keys, it is straightforward for the key server to generate its own key pair and serve a fraudulent public key to the recipient, decrypting and re-encrypting messages that the iMessage servers relay. Apple relies on the technical illiteracy of its users to get away with its deceptive and often plain false marketing claims. Now you know better.
The “key server” does not in fact “generate public keys”. It distributes public keys. But you can’t decrypt a message with public keys - that’s kind of the point...
But after reading research from security experts you have found a citation where Apple is generating a key pair from its servers and sending the private key to the client?
> The “key server” does not in fact “generate public keys”.
That's the point. It should not, but the security model of iMessage allows the key server to get away with it, which is almost certainly happening in China right now. Try reading the article and following the example.
> But after reading research from security experts you have found a citation where Apple is generating a key pair from its servers and sending the private key to the client?
No, it sends the public key. Encrypting messages is done with the recipient's public key. Go read the Wikipedia article on asymmetric encryption. Because the owner of the keyserver can send its own public key, it can decrypt messages with its own private key before re-encrypting with the intended recipient's public key.
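The attack being described is a classic man-in-the-middle via key substitution. A toy sketch using textbook RSA (tiny primes, no padding, all keys and names hypothetical) shows why controlling the key-distribution server is enough; no private key ever has to leave a device:

```python
def rsa_keypair(p: int, q: int, e: int):
    """Textbook RSA keypair from two small primes (toy only, no padding)."""
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))  # modular inverse, Python 3.8+
    return (e, n), (d, n)              # (public key, private key)

def crypt(key, m: int) -> int:
    """Modular exponentiation: both encrypt and decrypt in textbook RSA."""
    k, n = key
    return pow(m, k, n)

alice_pub, alice_priv = rsa_keypair(61, 53, 17)
bob_pub, bob_priv = rsa_keypair(67, 71, 17)
server_pub, server_priv = rsa_keypair(73, 79, 17)  # keyserver's own pair

msg = 42

# Honest keyserver: Alice asks for Bob's public key and gets the real one.
assert crypt(bob_priv, crypt(bob_pub, msg)) == msg

# Malicious keyserver: it hands Alice ITS OWN public key instead of Bob's.
intercepted = crypt(server_pub, msg)         # Alice thinks this is for Bob
plaintext = crypt(server_priv, intercepted)  # server reads the message ...
relayed = crypt(bob_pub, plaintext)          # ... and re-encrypts for Bob
assert crypt(bob_priv, relayed) == msg       # Bob decrypts, none the wiser
```

The messages stay "end-to-end encrypted" at every hop, yet the relay read the plaintext, which is why a key-distribution server you cannot audit undermines the guarantee.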
Again, if Apple were in fact creating its own key pairs on its servers and sending users the key pair, don’t you think someone would have discovered it?
But since it’s in a Wikipedia article, I guess that kind of closes the case.
> [If] Apple were in fact creating their own key pairs on their server and sending users the key pair, don’t you think someone would have discovered.
You once again misunderstand the vulnerability. The vulnerability is that China does this because China controls the keyservers in China.
As far as anybody discovering this, that would be very difficult because Apple does not let you install your own apps on the device and would not approve an app designed to detect this.
But even more, why would they bother? People who care about their privacy will simply avoid closed source software and especially closed systems like Apple's instead of trying to use a known compromisable system safely.
>But since it’s in a Wikipedia article, I guess that kind of closes the case.
I was pointing you to a place where you could learn about cryptography because you seem not to understand the basic concepts. The Wikipedia article does not describe this particular vulnerability.
> Those same standards don't actually protect your data from whoever controls the iCloud server or whoever controls the iMessage key server. In the US, that is Apple, so Apple has access to that data. In China, that is the Chinese government. Therefore, the Chinese government has access to all Chinese iCloud and iMessage data.
Seeing as how Apple complies with FBI and law enforcement requests to get iCloud data, that is definitely not the case in the US.
And that doesn’t make the statement untrue. Apple clearly lists which data stored on its servers are e2e encrypted. Those are all considered “iCloud data”.
I would urge you to read up on the Chinese cryptography law [1] which took effect on the 1st of this year. Essentially all companies foreign or not must provide unencrypted access to data to the Chinese government and must do so in secrecy. Prior to this, companies were being compelled to give up their data anyways but this just makes things easier.
By the way, the source below is an official Chinese government media source.
I've just read the article you linked[1], as well as several others [2][3][4] and cannot find any information about this:
>all companies foreign or not must provide unencrypted access to data to the Chinese government and must do so in secrecy
either plainly stated or implied.
Can you provide a source for this claim? I don't doubt that this may occur, but I'd like to speak with _my own_ managers about my china & encryption concerns in an informed way.
Do you have any evidence that Apple rearchitected their system to have access to private keys that it doesn’t have access to anywhere, to have access to give it to China?
> It is the latest development in a pattern of Apple acquiescing to Beijing’s demands. Last July, Apple deleted VPN apps from the App Store that let mainland Chinese internet users evade censorship. Apple’s lawyers have also added a clause in the Chinese terms of service that states both Apple and GCBD may access all user data. Apple has not responded to requests for comment.
> Meanwhile, Chinese laws do not protect internet users’ privacy from government intrusion. In 2015, China passed a National Security Law, which included a provision to give police the authority to demand companies let them bypass encryption or other security tools to access personal data. The National People’s Congress was not available to comment.
Well, after actually reading the Reuters article that the Verge citation is based on...
> Apple says the joint venture does not mean that China has any kind of “backdoor” into user data and that Apple alone – not its Chinese partner – will control the encryption keys. But Chinese customers will notice some differences from the start: their iCloud accounts will now be co-branded with the name of the local partner, a first for Apple.
> And even though Chinese iPhones will retain the security features that can make it all but impossible for anyone, even Apple, to get access to the phone itself, that will not apply to the iCloud accounts. Any information in the iCloud account could be accessible to Chinese authorities who can present Apple with a legal order.
> Apple said it will only respond to valid legal requests in China, but China’s domestic legal process is very different than that in the U.S., lacking anything quite like an American “warrant” reviewed by an independent court, Chinese legal experts said. Court approval isn’t required under Chinese law and police can issue and execute warrants.
> That means Chinese authorities will no longer have to use the U.S. courts to seek information on iCloud users and can instead use their own legal system to ask Apple to hand over iCloud data for Chinese users, legal experts said.
U.S. courts are highly unlikely to order Apple to release iCloud data to Chinese officials. Any cases would be public and attract international media attention. For Chinese iCloud users, that makes all the difference.
How many countries have laws that state user data must not be in foreign data centers?
Every company in the US has to comply when it’s ordered by a court to give up user data. The US justice system is not exactly a shining city on a hill when it comes to requiring a high bar for search warrants. All someone has to do is say “terrorism”, “drugs”, or “protect the children” and courts will bend over backwards.
Also from the same article:
Until now, Apple appears to have handed over very little data about Chinese users. From mid-2013 to mid-2017, Apple said it did not give customer account content to Chinese authorities, despite having received 176 requests, according to transparency reports published by the company. By contrast, Apple has given the United States customer account content in response to 2,366 out of 8,475 government requests.
You have much more faith in the US justice system than I do.
> Until now, Apple appears to have handed over very little data about Chinese users. From mid-2013 to mid-2017, Apple said it did not give customer account content to Chinese authorities, despite having received 176 requests, according to transparency reports published by the company.
By moving iCloud data and keys to China, the amount of data Apple handed to Chinese authorities on Chinese iCloud users went from zero to a nonzero amount. Therefore, Apple degraded the security and privacy of Chinese iCloud users by making the switch to Chinese servers.
Due process is much more frequently ignored in China than in the United States, but that fact isn't even necessary to establish that Apple's switch to Chinese servers negatively affected Chinese iCloud users. The above is sufficient.
You need to brush up on your Cryptography 101 course before arguing with people about how asymmetric encryption keys work. There is nowhere that states that only one entity can have "control" of the keys. If you don't understand that, then I can see why you're so confused about this whole situation.
No. How the technology works is irrelevant when the end result is that the government wants the data.
If your boss asks you to build a machine that produces a widget, does he really care what your code looks like? Probably not. In the same vein, Apple can figure out whatever solution they want, whether it involves conventional use of encryption keys or not, to provide a system where the Chinese government can get access to their users' data.
> 2) I have a strong suspicion that those 'enter your Apple ID password because your account needs it' message really means 'a government has requested your data and even though it's encrypted, we will nag you about entering a password, and if you give it, you're a free game'.
Haha I hadn’t thought of that. If true, I must have every government requesting my data frequently as I constantly get bombarded to enter my iCloud password.
FISA warrants can go multiple hops from the target which with network effects can sweep up tons of unrelated data from random people who happened to interact with someone who interacted with x bad guy.
That’s just for domestic surveillance keep in mind.
The only thing in that list that would be of interest to a government is messages. Fortunately no one really uses iMessage in China. People use WeChat, and (you guessed it) the Chinese government has access to that.
As for the end-to-end encryption of iMessage, it is a bit overrated. Apple does the key management for you. So theoretically they could pretend that the key of your interlocutor recently changed (because of a new phone or something), and it would just work transparently. So if a "nefarious" entity were to gain access to iMessage servers, they could use that technique to decrypt, "on the fly", the messages of whoever they want to spy on, without the clients knowing that this even occurred.
When you use WhatsApp, you have the ability to get some kind of warning when the interlocutor's key has been updated. It's also possible to check each other's identity by scanning some kind of QR code. But the app does not really put any emphasis on which accounts have been verified. Signal is about as bad as WhatsApp. My guess is that a government that wants to spy on your WhatsApp/Signal messages probably could, because most people would neither notice the key change warning nor understand what it means.
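The QR-code verification mentioned above boils down to both parties computing a short digest of their public keys and comparing it out of band. A toy sketch of the idea (all key material and the rendering format here are made up, not any real app's scheme):

```python
import hashlib

def safety_number(pub_a: bytes, pub_b: bytes) -> str:
    """Toy 'safety number': a short digest both parties can compare out
    of band. Sorting makes it order-independent, so both sides compute
    the same value regardless of who runs it."""
    material = b"".join(sorted([pub_a, pub_b]))
    digest = hashlib.sha256(material).hexdigest()
    decimal = str(int(digest, 16))[:20]
    # Render as five groups of four digits, like messaging apps do
    return " ".join(decimal[i:i + 4] for i in range(0, 20, 4))

# Hypothetical key material for illustration
alice_key = b"alice-public-key-v1"
bob_key = b"bob-public-key-v1"

before = safety_number(alice_key, bob_key)
# If the server silently swaps Bob's key, the number changes,
# but only users who actually compare it will ever notice.
after = safety_number(alice_key, b"attacker-public-key")
print(before)
print(after)
```

The weakness the comment describes is exactly this last step: the substitution is detectable in principle, but detection depends on users comparing the digest, which almost nobody does.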
Only apps which make a big fuss about key management (Threema, for example) are properly end-to-end encrypted, with no possibility for Big Gov to hack into servers and spy on you by adding their keys to conversations. But then they would probably just hack the OS on your phone at that stage. In fact, that method is probably better than messing around with iMessage/WhatsApp servers: you bypass ALL forms of E2E encryption, and you get access to everything else, with one swift hack. I bet the NSA and its Chinese equivalents have such hacks in reserve for very juicy targets they want to spy on.
With the kind of unlimited budget the NSA has, it's hard to imagine something they cannot hack. That is why big A-list targets like Bin Laden went totally off-grid for communication.
>1) There is no way Apple would be allowed to sell iPhones in China, without China government having access to anything. So, I assume that Apple users in China have e2e encrypted exactly nothing.
E2E works exactly the same in China. You can read more in my comments here:
Please stop spreading disinformation. Your sources are outdated and the quotes that you are referencing are not legally binding. The fact is that Apple has clearly stated that iCloud data for Mainland Chinese users is stored on servers operated by a Chinese company, which must abide by the local laws and regulations. It is also a well known fact that all companies operating in China can be compelled by the Chinese government to provide data and do so in secret.
There's no disinformation in my comments. I seem to be one of the few people on the planet who has actually dug into this exact issue, while others only offer the typical FUD we've seen about how Apple's encryption works in China.
The fact is that Apple has said multiple times (and even under oath) that end-to-end encryption applies to iPhones and iMessage in China, the same as it does everywhere else.
And once again Erik Neuenschwander, an Apple privacy exec, told Congress in a hearing in December that this was still the case.
> In fact I seem to be one of the few people on the planet who seems to have actually dug into this exact issue while others only offer the typical FUD we've seen about how Apple's encryption works in China.
I think instead of researching how Apple works in China, you need to start doing some research on how the Chinese government works and their track record on legal matters and rule of law.
Also, the segment in the Senate hearing you referenced shows a senator who obviously does not have a good grasp on encryption technology asking bumbling questions about encryption. I have paraphrased the section here:
> Senator: Do you sell phones in China? Are they encrypted?
> Apple: The phones are the same and all of our phones are encrypted across the world
Yes, obviously all phones have encryption but the Senator did not clarify what was being encrypted here and Apple took advantage of this in the response.
> Senator: You're telling me that they [China] allows you to sell devices without you allowing them to breach the encryption and gain information about the users?
> Apple: You're 100% correct
Once again, the question posed was incoherent. Of course there is no "breaching of encryption" here - the Chinese government just asks for the keys or the data. It's all about language here.
If this Senate hearing is your case for why data is safe in China, I honestly fear for all the political and religious dissidents that are trusting Apple for their safety.
I think there might be a misunderstanding about what exactly is "encrypted" and how. The comment 3 or 4 levels above says:
> The same "vulnerability" of being able to respond to legal requests for iCloud data that exists in China exists everywhere else in the world.
And an article on Apple's site [1] confirms that most data in the cloud are "encrypted", but without E2E encryption, possibly in a reversible way. That article also notes that while messages are E2E encrypted, a cloud backup might contain a key to decrypt them:
> Messages in iCloud also uses end-to-end encryption. If you have iCloud Backup turned on, your backup includes a copy of the key protecting your Messages. This ensures you can recover your Messages if you lose access to iCloud Keychain and your trusted devices.
So it is possible that the data on the phone are encrypted, the data in transit are encrypted, and the data in the cloud are encrypted for every user in the world, but the cloud operator holds the encryption keys for some of the encrypted data: the Chinese operator for data of Chinese users, and Apple for everyone else. This contradicts neither Apple's statement, nor the article, nor that comment above.
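The distinction drawn above, encrypted at rest versus end-to-end encrypted, comes down to who holds the key. A toy sketch (the hash-based XOR stream construction is for illustration only, not real cryptography):

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Toy stream-cipher keystream derived from a key (illustration only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

# Model 1: "encrypted in the cloud", but the operator holds the key.
operator_key = os.urandom(32)
stored = encrypt(operator_key, b"user backup data")
# The operator, or anyone who can compel the operator, can decrypt at will:
print(decrypt(operator_key, stored))

# Model 2: end-to-end, key generated on and never leaving the device.
device_key = os.urandom(32)
stored_e2e = encrypt(device_key, b"user backup data")
# The server holds only ciphertext; without device_key it is opaque.
```

In both models the bytes at rest are "encrypted", which is why Apple's statements and the articles can all be literally true at once.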
The question and his answer were both completely clear. And most importantly, it's perfectly consistent with what Apple has said multiple times. You can believe they lied to a federal court and Congress if you want, but I certainly don't.
It was a reduction in security for Chinese iCloud users:
> The U.S. company is moving iCloud accounts registered in mainland China to state-run Chinese servers on Wednesday along with the digital keys needed to unlock them.
> In the past, if Chinese authorities wanted to access Apple's user data, they had to go through an international legal process and comply with U.S. laws on user rights, according to Ronald Deibert, director of the University of Toronto's Citizen Lab, which studies the intersection of digital policy and human rights.
> "They will no longer have to do so if iCloud and cryptographic keys are located in China's jurisdiction," he told CNNMoney.
Apple's decision to store the keys on Chinese servers was criticized because it effectively reduced security for Chinese iCloud users:
> Chinese users of Apple’s iCloud service will see their data–along with that data’s cryptographic keys–stored inside the country beginning Wednesday, Reuters reports. The move will mean that Chinese authorities will have easier access to Chinese users’ iCloud data than before when that data was stored in the U.S. The move is a contentious one, as human rights activists say Chinese authorities will now have an easier means of obtaining dissidents data since it no longer needs to go through the U.S. legal system to get Apple to hand over its cryptographic keys for Chinese users.
This hard line is too facile. If you are paranoid about malicious code updates, then making part of your stack open-source doesn’t matter. I could push an update to your OS that reads the keys out of your BitWarden.
Yeah, I always verify the hashes of updated binaries match what I compile myself in parallel. Also that takes too much time so I just never update anything and have a homebrew version of 'Damn Vulnerable Linux'.
Long-term, there may eventually come a solution to this problem in the form of [binary transparency](https://wiki.mozilla.org/Security/Binary_Transparency). However, we're obviously a long way away from that being the norm, and there's still the problem of supply-chain attacks on hardware to consider.
I doubt the hardware supply chain can ever be secured. Even if you were to open-source every single part of manufacturing, there is no reliable way to ensure that the chip you, as a customer, have obtained hasn't been backdoored. You'd have to delid it and put it under an X-ray, if that even resolves the tiny features in modern CPUs.
With an open hardware design, periodically de-lidding and examining a random sample of available consumer hardware would probably be sufficient to protect the general consumer population, and targeted attacks become very difficult if you purchase your hardware from a store rather than order it by mail.
Even so I agree that examining all hardware in that manner is impractical. A better approach might be having a small, simpler core of secure open-source hardware managing your root of trust, and trying our best to mitigate the impact of compromises in the more complicated components (like the motherboard, CPU, etc) with approaches such as requiring open source firmware, sandboxing individual components by filtering their external communications through open hardware, and limiting their access to sensitive data like encryption keys. Obviously there's only so far you can go with that, but I don't think it's an entirely hopeless battle either.
>With an open hardware design, periodically de-lidding and examining a random sample of available consumer hardware would probably be sufficient to protect the general consumer population, and targeted attacks become very difficult if you purchase your hardware from a store rather than order it by mail.
How do you trust the person that verifies the CPU? Can you trust the X-Ray imaging machine? Is the X-Ray Machine verified to be open source and not backdoored to hide backdoors (aka bootstrapping trust).
You'd have multiple trusted independent parties from multiple international jurisdictions reviewing the hardware design, not just one. And yes, obviously the X-Ray machines would need to be verified using similar techniques.
The same way you trust anybody? If you're so paranoid that you believe literally everyone is out to get you, then you're not going to be able to function in any society, let alone one as interconnected and interdependent as our own.
How do you trust your hash program? How do you trust the cryptographers who came up with the hash algorithm? How do you trust your compiler is faithfully interpreting the source code you're reading?
IMO if you're at the point where you believe you can't trust multiple decentralized, independent, multi-jurisdictional bodies all telling you the same thing: that the hardware they've tested matches the published design, you've reached a level of paranoia where no amount of reassurance, technological or otherwise, will satisfy you.
I suppose if you really wanted to you could build your own X-Ray machine from scratch and check the design yourself. That's probably not much more difficult than going line-by-line and manually verifying the source code of your entire software tool chain because you don't trust anyone else who's read the source code enough to believe them when they tell you they've already verified that everything looks correct and that your text editor probably isn't lying to you about the contents of your source files. Which is to say probably totally impractical, but again, that's kinda my point.
Yeah, try to do that on a mainstream Linux distro for example.
While I'm not saying maintainers & users are checking all changes in packages, all the work happens in the open & all the source is compiled on distro infrastructure.
So once you actually do an attack like this and it is discovered, you can be sure anything done by the maintainer will be combed through with a very fine brush and the account disabled.
Given that it can take years to build the trust needed to become maintainer of an important package, only to lose it all once your attack is known, I really can't see this being used for anything other than a very targeted, high-stakes, one-off attack, and definitely not for any long-term dragnet surveillance.
I meant that closed source gets even less safe when allowing auto-updates. When using open source, the auto-updates need to be trusted, yes. Nothing new.
I thought iMessage private keys are somehow based on data in the "secure enclave" chip, and thus not able to be stored in the cloud. It's my understanding that Apple could add new "devices" to listen in on future conversations, but it can't read iMessage conversations in transit between existing devices.
It can also read iCloud backups of conversation content, which are created by the client device after decrypting the message. But that's not the same as storing the private key itself in the backup.
If you lose your device, buy a new one, and restore it from a backup, all your messages will be returned.
There’s no way to accomplish this without having the private key in the backup.
EDIT: When I say there is no way to accomplish this, I’m talking specifically about the process that exists today where the user doesn’t have to remember a password other than their iCloud password (which today, can also be reset).
Maybe we're talking about different private keys? To clarify, here's how I think it works:
1) To send you a new iMessage, someone else's iPhone Z encrypts it with your public key and sends it through the iMessage network. Apple can't read this message since they don't have your iMessage private key, which is only on your device.
2) Your iPhone A receives the encrypted iMessages and decrypts them with your iMessage private key on your device.
3) iPhone A stores the decrypted iMessage content in its filesystem, which is encrypted locally with the device private key (derived from your device passcode).
4) Time to backup... iPhone A decrypts its filesystem with your device private key, and sends filesystem contents as plaintext, through an encrypted tunnel, to an iCloud backup server, which encrypts the backup locally with an iCloud server private key, and stores it.
5) You get a new iPhone, let's call it iPhone B. iPhone B asks iCloud for a restore. The iCloud server locally decrypts the backup, and sends it to iPhone B as plaintext through an encrypted tunnel. iPhone B receives the backup as plaintext and stores it locally, encrypting it locally with the device private key.
6) iPhone B iMessage client loads the plaintext old iMessages into the Messages client for you to read.
At no point in this process would Apple have or store your iMessage private key or your device private key.
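Steps 4 and 5 above can be sketched as a toy model. The class and attribute names here are illustrative, not Apple's actual implementation; the point is that even with encryption at rest and in transit, the operator handles plaintext during backup and restore:

```python
import os

class Device:
    def __init__(self):
        self.device_key = os.urandom(32)   # never leaves the device
        self.filesystem = {"messages": b"hello from iMessage"}

class CloudServer:
    def __init__(self):
        self.server_key = os.urandom(32)   # held by the operator
        self.backups = {}
        self.seen_plaintext = []           # what the operator could read

    def store_backup(self, user: str, plaintext: dict) -> None:
        # Step 4: server receives plaintext (inside a TLS tunnel) and
        # encrypts it at rest with a key *it* controls.
        self.seen_plaintext.append(dict(plaintext))
        self.backups[user] = ("encrypted-with-server-key", dict(plaintext))

    def restore_backup(self, user: str) -> dict:
        # Step 5: server decrypts with its own key and sends plaintext back.
        _, plaintext = self.backups[user]
        return dict(plaintext)

phone_a, cloud = Device(), CloudServer()
cloud.store_backup("alice", phone_a.filesystem)

phone_b = Device()
phone_b.filesystem = cloud.restore_backup("alice")

print(phone_b.filesystem["messages"])   # restore works without any user key...
print(len(cloud.seen_plaintext) > 0)    # ...because the operator saw plaintext
```

This is consistent with the claim in the parent comment: no private key of yours is stored, yet the operator can still produce your data.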
The encryption is a bit complex on the iPhone, because keys are typically held by the secure enclave which enforces policy on use. These keys are both used to encrypt the base filesystem, and can be applied by policy against individual files.
Example policies from the Secure Enclave would be that a private key is available on first unlock, only while unlocked, only while a PIN is set, and/or whether the key should be shared with other trusted devices.
The base filesystem is encrypted with a key released on boot, while individual files can be encrypted with some set of these policies. I believe this can be done either on individual files in an app's data, or as an entitlement to apply by default to the entire app. https://developer.apple.com/documentation/bundleresources/en...
My understanding is that files set with a policy that they are only available while the device is unlocked will still be backed up in the locally encrypted form. So, assuming Signal/WhatsApp/etc set a single flag their data is stored encrypted in iCloud.
Further, per your list I suspect that the backup data is sent to/from iCloud already encrypted by a secret - but that secret is shared with iCloud for recovery and shared further on official government request. The goal there is to limit the amount of unencrypted user data sent to third party servers (in this case in the US I believe Azure-hosted storage for backups).
The keys are separately stored on Apple-controlled servers in non-China countries. In China, I believe they were required by law to have the key storage instead hosted by a Chinese data center.
You are right; Apple isn't storing your private key, they simply have access to an unencrypted dump of your iPhone at the time of iCloud backup creation. My comment implies that once you take a backup, Apple would have the ability to read all your future messages as well.
> There’s no way to accomplish this without having the private key in the backup.
Uh, no. There's no way to accomplish that without some kind of user-managed escrow (even a pass phrase would be fine). It's maybe not the seamless experience Apple wants to offer, but it's certainly not impossible.
Frankly the kind of dummyproof restore being offered is fundamentally incompatible with private backups.
Your messages are returned via the backup, not via the "iMessage servers". Once the messages are at rest on your device, they're no longer encrypted using your "iMessage private key".
I would think that you could, however, encrypt that private key with a user-controlled password. (Granted, this should be optional and have big red warnings about being unable to recover without the password)
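A minimal sketch of that idea: a random per-device backup key wrapped under a passphrase-derived key. PBKDF2 and XOR wrapping are used here for illustration under stated assumptions; a real design would use an authenticated cipher and careful parameter choices. Note that losing the passphrase means losing the backup, which is exactly the warning the comment calls for:

```python
import hashlib
import os

def wrap_key(backup_key: bytes, passphrase: str, salt: bytes) -> bytes:
    """Wrap a random 32-byte backup key by XOR with a PBKDF2-derived
    key-encryption key. Sketch only; iteration count is illustrative."""
    kek = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    return bytes(a ^ b for a, b in zip(backup_key, kek))

unwrap_key = wrap_key  # XOR wrapping is its own inverse

backup_key = os.urandom(32)          # generated and kept on the device
salt = os.urandom(16)
wrapped = wrap_key(backup_key, "correct horse battery staple", salt)

# The cloud stores only (wrapped, salt). Without the passphrase, neither
# the provider nor a forgetful user can recover backup_key.
recovered = unwrap_key(wrapped, "correct horse battery staple", salt)
print(recovered == backup_key)
```

The tradeoff is exactly the one debated in this thread: this scheme is private even from the provider, but there is no "reset password" escape hatch.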
The secure enclave is a feature to secure the device and the data on it from casual threats. Apple gets a bit evasive any time the subject of security shifts to their cloud services. (i.e. they will talk about how a particular feature is e2e encrypted but their security talking points seem to mostly end at the device)
I think a better way to look at what they're selling you is a device that provides you pretty-to-very good protection from casual hacking and theft. But in the event of a government knocking on their door for more information, they'll quietly hand over what they can which probably is quite a bit more than the average consumer thinks it is. All bets are off as to what the full story is when the government/jurisdiction involved is not the United States.
> Messages in iCloud also uses end-to-end encryption. If you have iCloud Backup turned on, your backup includes a copy of the key protecting your Messages. This ensures you can recover your Messages if you lose access to iCloud Keychain and your trusted devices. When you turn off iCloud Backup, a new key is generated on your device to protect future messages and isn't stored by Apple.
But wait, does it mean that if you don't have iCloud Backup activated but use local backup, you can actually sync messages without Apple storing the private key? The wording here is important; it would be nice if someone could clarify.
It's not uncommon for software to claim to offer this feature. Windows does it for example, and it was a bug in such a feature for WebCrypto in Firefox that made the news recently here.
Invariably such features are weak and a sufficiently capable attacker can override them. In Windows for example you could reach into the opaque data structure and toggle the Boolean that forbids exporting keys...
One thing to keep in mind with iMessage: even if you back up locally, the receiving party is probably using iCloud. So I'm not sure how much it will really help.
Yes, this is unfortunately true and unfortunately complicated compared to iCloud backups.
I backup locally to my mac (encrypted), then I have my mac do time machine backups to my Synology NAS (encrypted) and then I have my NAS backup to BackBlaze (encrypted). I do that to satisfy the two pronged backup strategy: local (fast) and remote (slow, but useful in catastrophic local situations such as fire, flood, theft, etc).
There are a few possible approaches. I figured they're using b2.
I'm pretty sure Synology has builtin tools to mirror to it. Backblaze will backup external drives that are attached to the computer, but if they are disconnected for 30+ days the data is deleted. While network drives aren't backed up there are ways to have them appear as local drives (which I hear are a pain to deal with).
Yes exactly, this is what I'm using. There is a time machine folder on the NAS, the Synology tool mirrors that encrypted folder to backblaze. I have the backblaze sync set to run at 1am so it's not uploading and affecting my bandwidth while I'm (typically) awake. Yes, my remote backup is up to 24 hours behind my local time machine backup, but this is acceptable to me since it's only for catastrophic recovery.
That works for any service where you don't fully control the other endpoint. They are just being transparent. Although the wording re: website is peculiar. Could it be their form of a canary like warning?
As a developer, I expect that smaller shops' infrastructure isn't as thoroughly locked down, and things like passwords getting logged to splunk/ELK are tech debt and par for the course. However, that's a very specific exception: instead of putting work into adding it to their disclaimer, they could have made sure the password wasn't being logged in the first place.
They deleted all of my data when one of my payments didn't go through, without notifying me. They are impossible to contact outside of passive-aggressive email support. I deeply regret trying to trust this company with my data, which is now gone. Do yourself a favor before trusting them and give them a call to ask about their services.
So do you need to decrypt the backup of your iMessages to get to the private key inside of it, or can apple see your private iMessage keys in the backup without needing to decrypt anything?
Reminds me of WhatsApp claiming it had encryption everywhere and then this [0] dropped. Except, in this case, I'm actually surprised. Didn't Apple publicly claim that it wouldn't bow down to any demands from the agencies?
Apple and Microsoft both tried to build ad businesses, but when they weren't as successful as Google, they turned lemons into lemonade by launching data privacy PR campaigns against Google.
Meanwhile, Apple and Microsoft quietly censor their products in China, surrender data to Chinese authorities, and now we find Apple is intentionally leaving iCloud data insecure.
Presumably Google was under the same pressure from US law enforcement, but somehow Google delivered end-to-end encrypted Android backups in October, 2018. And Google did it without all of Apple's self-congratulatory media hoopla.
Just a reminder: Apple ceded control of its iCloud management in China to a state-controlled company, and in addition began storing its encryption keys in China in order to "comply with local regulations". So whether or not your backups are encrypted is almost a moot point, given that the government can submit a lawful demand for your data at any time.
"It's not happening in my country" is a naive argument. You might not care if human rights activists and HK protesters are affected. But Apple's actions in China set a precedent for other countries to follow. If a country demands that Apple "comply with local laws" by providing encryption keys or else risk losing access to that marketplace, Apple will comply, regardless of its effect on user privacy.
There was no need to misinterpret the comment and write things like '"It's not happening in my country" is a naive argument.'
I didn't say that Apple is right to do it in China or elsewhere. I merely pointed out that it's happening in one place and asked why that would make it moot elsewhere. I agree with everything you said except for the phrasing of your first sentence.
Don't know why this was downvoted. Even if someone doesn't care about individuals in other legal zones (sad), the extra injustice Apple is accepting in other jurisdictions is a clear display of what they are willing to do to you eventually.
> Injustice anywhere is a threat to justice everywhere.
Morality is defined by culture, so there is no "universal" morality as such. However, there are certain specific things that are included in almost all moral frameworks.
Apple says the joint venture does not mean that China has any kind of “backdoor” into user data and that Apple alone – not its Chinese partner – will control the encryption keys.
Incorrect. Please see the official Apple Support page [1] that debunks this. It specifically states:
"iCloud services and all the data you store with iCloud, including photos, videos, documents, and backups, will be subject to the new terms and conditions of iCloud operated by GCBD."
And since all Chinese companies are bound by local laws, you can be assured that your data is readily available for access by the government.
That still doesn’t make the original statement that “encryption keys are given to China” correct.
The data that is available in China is not end-to-end encrypted and would also be available to US authorities.
Can you quote the part of the article that states that Apple must give China private keys? Can you find a citation where a third party has found proof that Apple changed the iMessage architecture?
Apple may still be controlling the encryption keys but this says nothing about sharing the keys if compelled to do so.
> Can you quote the part of the article that states that Apple must give China private keys? Can you find a citation where a third party has found proof that Apple changed the iMessage architecture?
Apple is smarter than to put some text on their official website saying that the Chinese government has access to all your data. The key here is that their Terms and Conditions state that they operate "...in accordance to local laws". This is a cop-out legalese way of saying "We abide by whatever the Chinese government tells us to do".
You need to take a step back, take off your engineering hat, and realize that the issue is not about private keys. This is about a company (Apple) needing to follow the laws of the country that it operates in or else it is banned. It doesn't matter if Apple was selling bread or handbags, they MUST provide the government with data about their customers when compelled. This is the case with all companies operating in China, foreign or domestic.
The only way it could follow the laws of the country would be to rearchitect its entire system and somehow send the private keys to its servers and save them.
While technically they could do that, do you realize how much legal trouble they would be in in the US if they did so without disclosing it?
Alternatively, they would have to have a special build of iOS for China.
Also, none of the “citations” make mention that the Chinese law forces Apple to give private keys to China.
The questions you are posing make it clear that you don't understand the issue at hand, much less the broader context behind these data laws. Companies have invested much more $$$ and resources for much less reward. Also, everything is legal as long as your lawyers sign off.
I perfectly understand how public/private key encryption works. Can you find any citations to support your specific claims that Apple is sending user’s private keys from their devices and giving those keys to China?
It doesn’t by itself - but you have neither shown that Apple has surreptitiously uploaded user’s private keys in China or that it was required to do so.
I think the overlooked answer in this conversation is that Apple doesn't need to modify their service for China at all. In all countries, they hold the encryption keys for most user data. Only these things are E2E encrypted[1]:
Home data
Health data (requires iOS 12 or later)
iCloud Keychain (includes all of your saved accounts and passwords)
Payment information
QuickType Keyboard learned vocabulary (requires iOS 11 or later)
Screen Time
Siri information
Wi-Fi passwords
You might say "what about iMessage". The link has that answer, too:
>Messages in iCloud also uses end-to-end encryption. If you have iCloud Backup turned on, your backup includes a copy of the key protecting your Messages. This ensures you can recover your Messages if you lose access to iCloud Keychain and your trusted devices.
This means Apple can produce the data a government is looking for in virtually all cases, and that's probably good enough for China.
Maybe the fraction of users who do that is small enough that China won't push them on it. Or (this would be relatively easy to check) they could just hide that option in settings when the device region is China.
Another factor to consider is that SMS and iMessage are rarely used in China due to SMS historically being more expensive than email/data over there.
That brings up another point. How important is iMessage in China? The iPhone’s market share in China is small, and statistically, when you’re using iMessage you would probably be sending a message to a non-Apple device over unencrypted SMS.
Email is the least secure method of sending data and always has been.
I’ve never paid attention to it until now, but you can selectively disable iCloud backups for any of the built in apps and third party apps in settings.
A good way to figure out whether this kind of claim is marketing bullshit is to look for a PR claim that goes the other way: that Apple or whoever helps the FBI find uploaded images of child sexual abuse. If they are matching on your data, they can’t be encrypting it; if they’re encrypting it, they can’t be matching on it. It’s one or the other.
And, well, Apple have confirmed that they‘re matching on your data[1]. So, guess what?
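A sketch of why matching and E2E encryption are mutually exclusive (exact SHA-256 matching is assumed for illustration; real scanning systems use perceptual hashes like PhotoDNA, but the point is the same): the provider must be able to see plaintext bytes in order to hash and compare them.

```python
# Sketch (illustrative, not any provider's actual pipeline): matching an
# upload against a known-bad hash list requires plaintext access, which
# a truly end-to-end encrypted upload would deny the server.
import hashlib

known_bad_hashes = {hashlib.sha256(b"known abusive image bytes").hexdigest()}

def scan_upload(plaintext: bytes) -> bool:
    """Server-side scan: only possible because the server sees plaintext."""
    return hashlib.sha256(plaintext).hexdigest() in known_bad_hashes
```

If the server received only ciphertext under a key it doesn't hold, the hash of the ciphertext would match nothing and this scan would be impossible.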
"Apple is intentionally leaving iCloud data insecure" ... if you'd done some research you would know that iCloud backups are not end-to-end encrypted. That means you have a choice: backup to iCloud for the convenience and give up some privacy, or turn off the iCloud backup.
It would be nice if Apple was more forthcoming with that fact but there is some onus on the customer these days to understand what's private and what is not.
I don't think you are reading the list right. That is all the stuff that is encrypted both at-rest and in-transit (with keys known to Apple).
The list of E2E is further down, separate from the table, and includes: Home data, Health data (requires iOS 12 or later), iCloud Keychain (includes all of your saved accounts and passwords), payment information, QuickType Keyboard learned vocabulary (requires iOS 11 or later), Screen Time, Siri information, and Wi-Fi passwords. So virtually nothing, by comparison.
Messages, probably the most personal and relevant for legal cases, are end-to-end-encrypted as well, but if you have iCloud Backup enabled, the key is stored in the backup, making this useless.
Involved isn’t the same as guaranteed. I think commenters here are arguing that Apple isn’t being honest about what is available to law enforcement. My guess would be silent updates targeted at individual users who they have search warrants against.
The list of E2E above is quite transparent. The notion of building a special version of the OS to target an individual is just not how the infrastructure works, and totally goes against the entire spirit of privacy that pervades everything you do internally.
Sorry to be blunt (and I am a big fan of Apple's pro-privacy shift of late) but nobody outside Apple can know that with any certainty.
Even the 2016 blackhat talk on youtube, which describes an elaborate signing mechanism for updates, doesn't preclude shipping targeted OS updates to individual users. Maybe I missed something though, and in that case I'd appreciate you pointing it out.
I can tell you that most people inside of Apple would be shocked if such a thing occurred. I doubt the code pathway / infrastructure even exists to do such a thing. There’s always the possibility of strange things happening that only the right 1-2 people know about... although it’d probably have to be many more, given the number of changes that would need to be made to propagate a special one-off code-signed OS OTA. That would likely have a whistleblower somewhere.
The reality is it’s way easier to just exploit a weakness that you can text someone [1].
But if you’re dressed in tin foil hat to toe, then there’s nothing that I can say to convince you. At that point I’d suggest not using any computing technology that you don’t personally build yourself and watch 24/7.
I think I would have once considered this acceptable, that it agrees with the spirit of the law, except that warrants have lost their meaning thanks to FISA rubber stamps.
IIRC it said it wouldn't bow down on implementing back doors to unlock protected devices and encrypted content on them, which is specific enough not to cover this case.
Bowing down on implementing new security features doesn't go against the promise to not bow down on the security of existing ones, as written. It can be argued to go against the spirit of the earlier public statement of course, but that doesn't count for much in the eyes of a corporation being given a strong suggestion by a government agency.
It is not a backdoor, nor does it circumvent anything.
It is a front door convenience feature which has distinct privacy/security trade-offs.
There exists no magical way to provide a means of lost password/device recovery which doesn’t grant Apple access to decrypt your data. It turns out that a lot of users want to have a way to recover from a lost device/password and are willing to let Apple decrypt their data.
You do this by ticking the ‘iCloud Backups’ toggle on your iPhone.
A backdoor by definition is not a user facing and configurable feature which is thoroughly explained in end-user documentation.
I’m not sure about that. Face and fingers are typically authentication mechanisms. They can grant access to a key, but they cannot themselves be the key.
The thing doing the authentication can be your local device, or a cloud-device. That thing must necessarily store a validator for your face/fingerprints which it can use to decide your submitted capture is “close enough” to consider a match, after which it grants access to the key, usually indirectly, by allowing certain cryptographic operations with the key.
Apple takes pains to ensure the biometric validators never leave the Secure Enclave of a local device. Possibly they could allow syncing these validators between Secure Enclaves of paired devices but I think you have to re-enroll. Absolutely never do they transmit these biometric validators to the Cloud in a readable form.
So in a lost-device scenario, you are also losing the biometric validators as well as the keys which were unlocked by the validators.
I think storing decryptable biometric validators is worse than storing decryptable device backups. Such a fingerprint database would almost certainly be abused by a government (forced to match a terrorist’s fingerprint against their users).
The singular reason I am willing to use biometric authentication on my phone is because the authentication is done locally.
For example Amazon’s recently announced project to link Amazon Pay to a palm print in stores is a total non-starter for me. Besides the fact that it’s a clumsy and bad idea to begin with, no way I want them having my palm print validator sitting in the Cloud.
> They can grant access to a key, but they cannot themselves be the key.
My assumption is that device recovery is such a special case that it could use very different algorithms than those used in phones today; they could be very computationally expensive and turn fingerprints into usable keys. And of course there is no need for anyone to store them, to match them individually, or even to tie them to the identity of a person.
There are two things that make this problem “hard” if not “intractable”.
Encryption keys are precise integer values (or can be represented as such) and they gain a large part of their security from two facts: one, a key that is wrong by even one bit will appear totally wrong and disclose zero information, and two, the key space is unfathomably large.
To turn a fingerprint directly into an encryption key would require, first, some sort of mapping between the analog representation of the finger/face (which could be two- or three-dimensional) and a digital value, and second, for that value to be absolutely repeatable over time.
The biggest problem is that of course neither your face, nor your fingerprints, are absolutely unchanging over time.
So the first thing you would somehow need to accomplish is a way to map the biometric scan to a repeatable precise integer value. Such a mapping would require, by definition, a loss of precision.
How much precision? Well, it’s directly a result of how resilient you want the algorithm to be in the face of things like scanning error, micro-abrasions on the finger, body fat percentage, the temperature of your hand, swelling, hair growth, etc...
The less precise you make it, the more different fingers (or different scans of the same finger) must necessarily resolve to the same key.
This is the same thing as saying that we are reducing the key-space.
Once you have reduced the precision of the mapping from a biometric scan into a key that will reliably generate the same key over time, you have, by definition, reduced the key space to the point where the encryption is fundamentally unsound.
The only exception to this would be perhaps using DNA sequences, but even then, I believe DNA is not actually perfectly unchanging over time, and is also not at all random [1]. But assuming you could handle the minute coding changes that do occur, and reliably scan the same part of the genome, I think you could end up with enough entropy to generate a secure key. Assuming you are willing to precisely sequence a chunk of DNA in order to generate your key. This is rapidly becoming feasible, if somewhat dystopian and entirely impractical.
But you still have the fundamental problem that the key is not being generated as a uniformly random value in the key space. This happens to be extremely important to the security of encryption algorithms. You wouldn’t want, for example, a close relative to be able to cut your entropy from 512-bits down to 64-bits and into the realm of brute force.
In short, biometrics will remain an authentication method rather than a direct encryption method, likely indefinitely.
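A small sketch of the repeatability problem described above (toy byte strings stand in for fingerprint templates; the quantization step size is an arbitrary assumption): a hash of the raw scan changes completely when one sample drifts by one unit, and making the mapping repeatable by quantizing visibly shrinks the key space.

```python
# Toy illustration of the precision/key-space trade-off described above.
import hashlib

scan_monday  = bytes([10, 200, 35, 77, 91])   # toy "fingerprint" samples
scan_tuesday = bytes([10, 200, 36, 77, 91])   # same finger, one ridge read slightly off

# Hashing raw scans: the avalanche effect makes the keys totally different.
key1 = hashlib.sha256(scan_monday).digest()
key2 = hashlib.sha256(scan_tuesday).digest()
assert key1 != key2  # same finger, yet unusable as a repeatable key

def quantize(scan: bytes, step: int = 16) -> bytes:
    """Throw away precision so nearby scans map to the same value."""
    return bytes(b // step for b in scan)

# Now repeatable - but each byte has 16 possible values instead of 256,
# i.e. 4 bits of entropy per sample instead of 8: a smaller key space.
assert quantize(scan_monday) == quantize(scan_tuesday)
```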
I found some research on fingerprints [1]. At 512 dpi, fingerprint sensors capture 0.01 bits per pixel of information that is mutual between samples of the same finger yet still distinguishes individuals, meaning that a 160x160 sensor can give 256 bits of information usable for keys. And there are multiple fingers, so it seems enough to derive an encryption key from, with even some room for redundancy.
Refreshing it every few years isn't a big deal (as obviously none of it will be used directly as an encryption key for all of your data, but only to encrypt an actual encryption key).
That paper has absolutely nothing to do with generating keys directly from an image of a finger. They are discussing the lower bounds on how small a fingerprint sensor can get.
It doesn’t seem like you read my reply at all.
It’s not a question of raw entropy from the sensor, which is what the paper is discussing. It’s an issue of repeatability.
To quote Spock in ST:TWOK: "not a lie, an omission".
It isn't a deliberately implemented backdoor. It is a deliberate decision to not install doors at all, just empty frames. I know we are arguing semantics here, and it doesn't make it right, but it doesn't go against the letter of how they've claimed they'll behave.
Everything that happens internally has to be approved by privacy, it’s always part of every discussion (meaning it’s like you’d hope for). There’s no shortage of projects killed or altered because privacy said no.
But the marketing is nonetheless deceptive. Apple’s practices are much less stringent, and their cooperation with law enforcement much greater, than their marketing and even their privacy pages suggest.
This is a crappy place to turn around and tell your customers, “should’ve read the fine print!”
That was a coordinated PR move between Apple and the government. The US administration went into heavy damage control mode with the large tech companies after the Snowden revelations about PRISM et c.
Put yourself in their shoes. These companies stand to lose tens or perhaps hundreds of billions of dollars in foreign sales if there isn’t a counter-narrative to “well of course they spy for their national military, just look at these Snowden slides”. Making it seem like they are fighting for their customers at odds with the FBI is a perfect counterpoint.
Meanwhile, it’s business as usual for US military intelligence, as evidenced by TFA. Excellent reporting!
The ambiguous position on true end-to-end encryption shows once again that Apple is in for the marketing (both to consumers—predatory and dangerous, and to engineering talent—dishonest). Same hypocrisy as on the China issue. Not that there is an easy solution when you are one of the biggest companies on the planet and that shareholders essentially expect you to grow forever while playing nice with everyone.
If you delete your remote backups, they will likely be deleted, eventually. If you don't delete your remote backups, they won't be deleted.
There's no business case for keeping backups around for Apple, unless they suddenly became an ad company and started mining your backups for personalization data.
There is a business case - charge the FBI or any government agency for the cost of restoring/delivering it to them, or use the contents to improve any machine learning they are conducting, and I'm sure there are others.
For the longest time Facebook couldn't actually delete photos that you requested the deletion of. They could remove it from indexes so it couldn't be found, but if you had the link it would still be available (akamai cdn). Because, to them, either the cost of the hosting was miniscule compared to the cost of writing the software to ensure things actually got purged from the CDN.
In the EU, big tech companies actually delete your data within a short period of you clicking the delete button because they're scared of the GDPR requirements.
Outside the EU, small companies or non-tech companies might well keep it forever.
On the contrary, Apple has very decent device security. See their Platform Security guide: https://manuals.info.apple.com/MANUALS/1000/MA1902/en_US/app.... Some of their online services are end-to-end encrypted with very smart designs (see iCloud Keychain page 82, Find My page 103).
The iCloud security overview [1] says iCloud backups are encrypted "in transit" and "on server", but indeed doesn't say much about the encryption keys. There is "end-to-end encryption" on just a few items (iCloud keychain, WiFi passwords, etc.)
Maybe this is me being out of touch with modern deployment, but that is absolutely not my impression of what "on the server" means. My mental model is that of a client, whatever software is under my control, and a server, which is whatever my client connects to. "Encrypted on the server" then means that at no point is the plaintext data visible to any part of the server.
If Apple splits up the server into a web server and a storage server, then uses "encrypted on the server" to refer only to the storage server, that is entirely disingenuous.
There's a difference in encrypted data at rest vs. end-to-end client side encryption. Encrypted data at rest protects against stolen physical storage devices. Without access to decryption key stored on a separate machine, you're unable to read data on the storage device.
Encryption at rest doesn't protect users from the company, since the company has the decryption key. It protects your data if the company misplaces the storage drive.
It's common in corporate environments to check compliance boxes, which is why AWS offers encryption at rest:
True, but I read a big distinction between "encrypted at rest" and "encrypted on the server". Encrypted at rest has the implications that you state, being there to prevent somebody from walking off with a hard drive. Encrypted on the server implies that it is never unencrypted while on the server, and that any server-side computation is done solely through homomorphic encryption.
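A stdlib-only toy contrast of the two models (the XOR "cipher", key names, and passphrase are illustrative assumptions, not a real protocol): encryption at rest leaves the provider holding a key that decrypts everything, while client-side E2E derives the key from a secret the server never sees.

```python
# Toy contrast (NOT real crypto): "encrypted at rest" vs. client-side
# end-to-end encryption.
import hashlib
import os

def xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR 'cipher' keyed by SHA-256(key); symmetric by construction."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

data = b"user document"

# Encryption at rest: the provider generates and keeps the key, so it can
# always decrypt - this only protects against a stolen storage device.
provider_key = os.urandom(32)
at_rest_blob = xor(provider_key, data)
assert xor(provider_key, at_rest_blob) == data  # provider can still read it

# End-to-end: the key is derived client-side from the user's passphrase;
# the server stores only ciphertext and has nothing to decrypt it with.
client_key = hashlib.pbkdf2_hmac("sha256", b"user passphrase", b"salt", 100_000)
e2e_blob = xor(client_key, data)
assert xor(client_key, e2e_blob) == data  # only the passphrase holder can
```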
I will not be surprised if further reporting points a finger at the influence of China as well.
Apple already conceded to hosting Chinese iCloud data on Chinese servers, and that news came out about 2 years ago... which is also the timeframe reported for this decision to forego end-to-end encryption of iCloud backups.
I'm guessing here, but I think it's safe to assume that China was not going to permit such encryption (for the same reason they insist on hosting the data) and thus to provide it for the U.S., Apple would have had to fork iCloud. Add in the political risk in the U.S. and you have a recipe for "maybe not."
Offline attacks are trivial for a motivated adversary, like the NSA, against low-strength security measures such as a standard PIN, a numeric-only PIN, or a pattern. Only a strong passphrase would really help you.
This is false. This concern is addressed in the second paragraph of the linked article:
> this passcode-protected key material is encrypted to a Titan security chip on our datacenter floor. The Titan chip is configured to only release the backup decryption key when presented with a correct claim derived from the user's passcode. Because the Titan chip must authorize every access to the decryption key, it can permanently block access after too many incorrect attempts at guessing the user’s passcode, thus mitigating brute force attacks. The limited number of incorrect attempts is strictly enforced by a custom Titan firmware that cannot be updated without erasing the contents of the chip. By design, this means that no one (including Google) can access a user's backed-up application data without specifically knowing their passcode.
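For intuition, here's a sketch (assumed KDF parameters and a toy salt, not any vendor's actual scheme) of the offline brute force that a rate-limiting chip like Titan is designed to prevent: with no attempt limit, a four-digit PIN falls in at most 10,000 KDF evaluations.

```python
# Sketch: once an attacker holds the ciphertext and the KDF parameters,
# every four-digit PIN can be tried offline. A Titan-style chip blocks
# exactly this by gating key release and counting wrong guesses.
import hashlib

SALT = b"per-user-salt"
ITERS = 1_000  # kept low for the demo; real KDFs use far more iterations

# Key actually derived from the user's (secret) PIN.
true_key = hashlib.pbkdf2_hmac("sha256", b"4821", SALT, ITERS)

def brute_force_pin() -> str:
    """Exhaust all 10,000 four-digit PINs against the derived key."""
    for pin in range(10_000):
        guess = f"{pin:04d}".encode()
        if hashlib.pbkdf2_hmac("sha256", guess, SALT, ITERS) == true_key:
            return guess.decode()
    return ""
```

Running `brute_force_pin()` recovers the PIN after at most 10,000 cheap KDF calls; a hardware counter that locks or erases the key after a handful of wrong guesses is what makes the same search infeasible.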
This is true for iOS as well as every other widely used computing platform. Android is better; unlike iOS you can build AOSP yourself without the Google proprietary parts, and unlike iPhones, Pixel phones have unlockable bootloaders so you can install your own OS builds.
Even most PCs running Linux have plenty of nonfree binaries in various firmwares and common peripheral drivers. I support efforts to make devices with fully open firmware on which you could run Android's AOSP or other open source operating systems.
Between things like this, and the shenanigans Google pulls (with Android, the store, developers, and other things), I'm quickly going in a different direction.
My ultimate plan is to build my own phone; yes, I'll still be stuck with a carrier (I use t-mobile, and I haven't had a problem with them over 10+ years I've used them), and the hardware won't be completely "open source", but the software and OS will at least be what I make of it myself.
In the meantime, I'll be playing with one of the Pine64 phones; hopefully it will give me most if not all of everything I want and need, and maybe I can help with bug testing or perhaps software development? At any rate, it won't be Apple or Google.
There are times when I think to myself that going back to simple email on a text screen, and not much else, would be better than what the web has become. Maybe go back to BBSs over ssh or something? "Dial in" using my TRS-80 Model 100 "laptop" and move out to the boonies...
But it is nice that you have the option not to back up with iCloud. They are not storing the information whether you like it or not, as a lot of companies do.
While Apple doesn't force its users to use iCloud, they also don't provide an alternative way to do full backups of iOS/iPadOS devices over a network. Yes, you can plug an iPhone into a Mac, but that doesn't scale.
Yes, but that only works in local networks and after connecting via USB once – and the backup is always stored on a single computer. So it may be acceptable for personal usage, but not suited for an enterprise environment where you want to centrally manage hundreds of devices and provide some redundancy for your backup system.
I guess I don’t understand this point. This is exactly what iOS MDM backup is for. Relying on users backing up their personal iCloud accounts for work seems highly problematic. If they are work iCloud accounts... USB should not be a problem since you are probably provisioning the devices, and the iTunes backup approach over your intranet seems actually ideal.
Beyond HN and tech circles, is there any detectable groundswell of demand for privacy? When you talk with friends & family about privacy, does anyone care?
When average people care about privacy, the large players will respond. Until then, pressure from the state can be accommodated without irking customers, so Big Tech will play along.
Maybe they don't care about privacy because they're defining it as secrets that people have. Snowden's description of privacy is much more powerful:
"And if we actually think about it, it doesn’t make sense. Because privacy isn’t about something to hide. Privacy is about something to protect. That’s who you are. That's what you believe in. Privacy is the right to a self. Privacy is what gives you the ability to share with the world who you are on your own terms. For them to understand what you’re trying to be and to protect for yourself the parts of you you’re not sure about, that you’re still experimenting with.
"If we don’t have privacy, what we’re losing is the ability to make mistakes, we’re losing the ability to be ourselves. Privacy is the fountainhead of all other rights. Freedom of speech doesn’t have a lot of meaning if you can’t have a quiet space, a space within yourself, your mind, your community, your friends, your family, to decide what it is you actually want to say.
"Freedom of religion doesn’t mean that much if you can’t figure out what you actually believe without being influenced by the criticisms of outside direction and peer pressure. And it goes on and on.
"Privacy is baked into our language, our core concepts of government and self in every way. It’s why we call it 'private property.' Without privacy you don’t have anything for yourself."
Next time you encounter someone who claims they don't care about privacy, ask whether or not they close the bathroom door (or the stall in a public restroom) when they're taking a shit. And if they say yes, ask them why? What's going on in there isn't any big secret. It's not like they're in there plotting a terrorist attack. What are they hiding behind that door?
It turns out most people _do_ care about privacy. You just have to frame it in relatable terms.
I don't want anyone else watching me take a dump because that's private, and it's not any of their business. Likewise, I don't want other people knowing what articles I read on the internet, or what music I listen to, or reading the contents of my business plan, or scoping out my dick pics, or any of a thousand other things, because those things are also private and they aren't anyone else's business unless I choose to share them.
Restrooms have doors, and most people close them for privacy. Data has a privacy door, too, and it's called encryption.
> "ask whether or not they close the bathroom door (or the stall in a public restroom) when they're taking a shit"
This is a pretty bad question and hyperbolic. Most people would want no one (including people they are usually intimate with) to watch them defecate. And that is not the same thing as a government snooping on its own citizens. The arguments for mass surveillance given by governments are not so much about invading the personal privacy of people as about protecting national "security" or preventing "terrorism". For this reason, few people are going to be convinced if you equate the privacy of using the lavatory without anyone watching with the privacy of being able to communicate without the government monitoring you. The best argument against mass surveillance is the one Snowden gave during a Reddit AMA:
> "Some might say "I don't care if they violate my privacy; I've got nothing to hide." Help them understand that they are misunderstanding the fundamental nature of human rights. Nobody needs to justify why they "need" a right: the burden of justification falls on the one seeking to infringe upon the right. But even if they did, you can't give away the rights of others because they're not useful to you. More simply, the majority cannot vote away the natural rights of the minority.
> "But even if they could, help them think for a moment about what they're saying. Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.
> "A free press benefits more than just those who read the paper."
The general public is largely ignorant of the risks in the tech they use. That doesn't mean they don't care about their privacy.
There's an assumption that laws and safeguards are in place so technology in general can be trusted and transacted on.
In other words, they trust us, "the tech circle", to police ourselves and assert security and privacy. It's not a circle jerk about privacy. It's a duty we have by being on the front lines.
Exactly. The general public is never going to say “I demand end to end encryption and complete privacy” because they don’t know how all the tech works. But they’re surely going to expect that their private text messages are private and their private pictures are private. People expect privacy as a default and sharing as an option, and they rely on the “experts” (tech companies, lawmakers, etc) to help them.
Not exactly. Some people may not understand and also not care and I'm sure there are many many people who that fits. When you're talking about surveillance level vs. someone hacking your iPhone that's not really a comparison. Of course people don't want their private messages stolen, but that's not really the case with most surveillance. Not that I'm in favor of it necessarily but there are merits.
Surveillance and privacy aren't the same thing but this is the FBI and people are bringing up snowden. In general I think you're right privacy is just expected for personal messages, but at some point when it's just data I don't think most people care, and may in fact support some level of surveillance.
There's this popular conception that the average person doesn't care about privacy, but I think that's wrong. I just think that to really get a handle on what it means to be private in the modern digital age is too complicated for the average person. People are concerned but feel overwhelmed by the technical details and don't know where to start. You need to know the fundamentals of encryption, what a key is, backdoors, the difference between E2E and non-E2E, and so on. We, as the tech community, need to do a better job communicating and explaining.
>There's this popular conception that the average person doesn't care about privacy, but I think that's wrong. I just think that to really get a handle on what it means to be private in the modern digital age is too complicated for the average person.
Cypherpunks tried to sound the alarm around the time email got popular and had similar difficulty then - the barrier to adoption was too high and laypersons didn't really understand why they should care enough to overcome that barrier.
See Wired coverage from 1993 as one example of this view from the techno-savvy thinking things should be one way but acknowledging that reality is much different.
https://www.wired.com/1993/02/crypto-rebels/
From the article: "Crypto Anarchy, he believes, is inevitable, despite the forces marshaled against it. 'I don't see any chance that it will be done politically,' says the Cypherpunk. '[But] it will be done technologically. It's already happening.'"
Rather than some digital utopia where personal information is heavily protected and not linked, we have a generation of programmers working to persist tracking cookies across browsing sessions, whether anonymous or not and law enforcement leveraging a scraped database of three billion photos to identify people whether they choose to be identified or not.
By following the electronic links we make, one can piece together a depressingly detailed profile of who we are: Our health records, phone bills, credit histories, arrest records, and electronic mail all connect our actions and expressions to our physical selves. Crypto presents the possibility of severing these links. It is possible to use cryptography to actually limit the degree to which one can track the trail of a transaction.
Of course that didn't happen beyond https everywhere for point-to-point encryption and regular leaks of consumer data proves once collected data is often not protected from public scrutiny, much less encrypted at rest, anonymized, etc. So even if you do protect your data perfectly there's still a chance it'll be discovered elsewhere. Rather than trying to create backdoors, government should be trying to enforce much more stringent regulations on use and protection of consumer data. However given the political zeitgeist I don't really see that happening.
Obviously there is. Otherwise it wouldn't be advertised on billboards and television.
Privacy is just like your personal health everyone wants a convenient solution but no company can honestly offer it. (Doesn't stop then from pretending they do)
I seem to recall when Jennifer Lawrence (a famous actress you may have heard of) had her icloud storage hacked and leaked all over the internet, there was a bit of a murmur online and even offline about it.
Actually yes. Some even have bought into VPN services without me recommending it and without any missionary ambitions from my part. Generally these are also not people using services of the largest offenders too much though.
I would even say the majority in my circle cares about it. They just have no real clue how to minimize data exposure. There certainly is an effect that influences consumption, though.
Of the techies within my circle, everybody cares, most to a pretty large degree.
No there isn't, despite repeated high profile hacks.
I've spoken to many in security, selling E2E enablement for the enterprise, and even among CIOs there is no urgency to implement this. You can imagine the indifference among the less tech-savvy.
The youth are either ambivalent or just not at the point of understanding; have you seen how much they freely share with each other on messaging, dating, and other apps? Today's schools lean pretty heavily towards indoctrination for the most part, and with so many distractions at hand at their age, they don't see the issues we see.
As I posted before, once you get into the so-called adult realm, many of them will trade away privacy if it gets them money off their coffee, coupons for groceries, or just a chance to brag about their latest getaway. It does not matter if it's private or government channels; give them a reward and they want it.
Privacy also tends to be highly associated with identity theft; people don't realize that privacy matters not just for protecting your stuff, but because your person and your interests also need protecting, both from government and private parties.
Also, some here seem to think they have more to lose than they do. It seems more about wanting to be part of a victim class, as long as people and organizations they don't like are punished.
tl;dr no, most don't care, give them a cookie and you can get their email and more
It turns over data more often in response to secret U.S. intelligence court directives, which sought content from more than 18,000 accounts in the first half of 2019, the most recently reported six-month period.
When you think about it, that volume is staggering. 36,000 iDevice-using intelligence targets every year? Imagine the amount of analyst time required just to go through 36,000 iCloud backups every year!
Really makes you wonder what the criteria are for being investigated.
Maybe I’m naive, but I find it hard to believe that there are 36,000 yearly iCloud accounts with probable cause to be tied to terrorism activity and/or national security matters, especially if that’s only in the US.
As someone with a (half) Middle Eastern heritage and name (but born and raised in the US) I’ve experienced my fair share of nuanced discrimination at airports and one weird situation with what I assume was the FBI. There’s always the ignorant TSA agent who raises an eyebrow when you report coming back from the Middle East... like why would anyone ever travel there if it weren’t for terrorism?
I’m a pretty average techie so I’m not too worried about anyone going through my iCloud backups, but I feel like there should be some more transparency around this stuff. I feel like if you’re secretly investigated but discovered to be innocent, shouldn’t you deserve to know you were spied on? I guess that’s what FOIA requests are for.
The war on terrorism feels like a game of whack-a-mole sometimes.
This is disabled by default in iOS 13, to keep your messages secure (and because it often doesn't work properly). You have to go into iCloud settings to turn it on. And then if you aren't using iCloud Backup, the key to your Messages backup is not stored by Apple, so they can't read it anyway (see https://support.apple.com/en-us/HT202303).
I also have iMessage configured to delete messages after 30 days. I wish WhatsApp had this feature, but it does not, so I just periodically clear the history manually (which you can do in WhatsApp as well as Signal).
I don't think we understand yet how to model the impacts of breaches on products.
Cambridge Analytica didn't ruin Facebook, but it did enable the CCPA, and it probably changed how FB users think of and trust the product. The Ashley Madison and Equifax breaches completely ruined those companies.
Snowden disclosures were a weird middle case that meant all things to all people.
Weakening aapl's privacy claims may not matter to a post-truth public but I suspect it will further drive demand for actual consumer privacy products, when and if they enter the market.
As a big fan of the Mac and iPhone this is incredibly disappointing. I always assumed the privacy situation with Apple was candy-coated but I didn't think they were this spineless.
Once you enable it, it immediately begins uploading your contacts, photos, and passwords to Apple; you then need to disable those specific things. Even after you delete them, every app on your phone can silently, without your explicit permission, start loading data into iCloud.
Then, even in iCloud Backups, you'd still not have the key to decrypt specific app data without having unlocked the device.
I realize you could do this right now (in theory) with your own encryption solution and Keychain, but a first-party solution that's as easy to use as the Data Protection features/APIs would be really nice.
Apple has plenty of data that I wish was E2E encrypted, but if many/most of my 3rd party apps had their own data locked, that would go a long way in the right direction.
I believe this has been the case for some time, hasn't it? I vaguely remember reporting that indicated this during the San Bernardino case, and that Apple handed over that backup. Either way, I remember reading Apple's law enforcement guidelines a year or two ago, and this was the case: iCloud data is not secure from law enforcement.
My project list includes implementing a WiFi backup Windows/iTunes VM for this specific case. Does anyone know how iOS backups will be handled on personal PCs once iTunes is discontinued?
PSA: libimobiledevice exists and supports native encrypted backup/restores for iPhones using the idevicebackup2 utility. You can also disable iCloud backups from there for fun.
With the ifuse utility you can even mount a subset of your iPhone's storage as a FUSE filesystem. (If it's jailbroken you can mount all of it)
Doing so, I was able to read the SQLite database Photos uses on my girlfriend's iPhone and migrate only the photos she had favourited to her new phone; she hated the idea of moving them all over so much that she was ready to let the best ones perish.
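For anyone curious what that query looks like: below is a toy sketch in Python. The table and column names (`ZASSET`, `ZFAVORITE`, `ZFILENAME`) are assumptions modeled on what recent iOS versions of `Photos.sqlite` appear to use; they vary between releases, so inspect the schema of your own copy before relying on them.

```python
import sqlite3

def favourited_photos(con):
    """Return filenames of assets flagged as favourites.

    Assumes a Photos.sqlite-style schema: a ZASSET table with a
    ZFAVORITE flag column (hypothetical names; check your own copy).
    """
    rows = con.execute(
        "SELECT ZFILENAME FROM ZASSET WHERE ZFAVORITE = 1"
    ).fetchall()
    return [name for (name,) in rows]

# Demo against an in-memory database using the same (assumed) schema:
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE ZASSET (ZFILENAME TEXT, ZFAVORITE INTEGER)")
con.executemany(
    "INSERT INTO ZASSET VALUES (?, ?)",
    [("IMG_0001.JPG", 1), ("IMG_0002.JPG", 0), ("IMG_0003.JPG", 1)],
)
print(favourited_photos(con))  # ['IMG_0001.JPG', 'IMG_0003.JPG']
```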
Apple is not 100% secure against SIM swapping. There's an option to have your 2FA code sent by SMS. I know because I've done this before, and I've wondered when Apple will let you use a hardware token or virtual MFA as an option, like my Google account does.
Yes, but the key is stored in your iCloud backup if you use it. As soon as you disable iCloud backups it will roll the key for iMessage and they will be effectively E2E encrypted.
When Messages in iCloud is enabled, iMessage, Business Chat, text (SMS), and MMS messages are removed from the user’s existing iCloud Backup, and are instead stored in an end-to-end encrypted CloudKit container for Messages. The user’s iCloud Backup retains a key to that container. If the user subsequently disables iCloud Backup, that container’s key is rolled, the new key is stored only in iCloud Keychain (inaccessible to Apple and any third parties), and new data written to the container can’t be decrypted with the old container key.
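The key-rolling behaviour Apple describes can be sketched as a toy model. This is not Apple's implementation, just the logic of the passage above: the iCloud Backup holds a copy of the container key until backup is disabled, at which point the key is rolled and the new key lives only in the (Apple-inaccessible) keychain, so the old key can't decrypt newly written data.

```python
import secrets

class MessagesContainer:
    """Toy model of the CloudKit Messages container described above."""

    def __init__(self):
        self.key_id = secrets.token_hex(8)   # current container key
        self.backup_key_id = self.key_id     # iCloud Backup retains a copy
        self.records = []                    # (key_id, message) pairs

    def store(self, message):
        # New data is written under the current container key.
        self.records.append((self.key_id, message))

    def disable_icloud_backup(self):
        self.key_id = secrets.token_hex(8)   # roll the key
        self.backup_key_id = None            # backup's copy is now useless

    def readable_with(self, key_id):
        """Messages decryptable by the holder of `key_id`."""
        return [m for (k, m) in self.records if k == key_id]

c = MessagesContainer()
c.store("old message")          # written under the original key
old_key = c.backup_key_id       # what a backup (or a subpoena) would hold
c.disable_icloud_backup()
c.store("new message")          # written under the rolled key

print(c.readable_with(old_key))   # only pre-roll data: ['old message']
```

The upshot, per the quoted documentation: an attacker holding the old backup key still reads everything written before the roll, which is why the roll protects only new data.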
> Yes, but the key is stored in your iCloud backup if you use it. As soon as you disable iCloud backups it will roll the key for iMessage and they will be effectively E2E encrypted.
Assuming this is true, you still don't know what people on the other end will do, meaning it is never actually E2E encrypted.
E2E usually means from endpoint device 1 (my iPhone) to endpoint device 2 (my friend’s iPhone). What the other person will do with it doesn’t factor into the conventional definition of E2E.
No, the conventional definition actually covers both directions: from endpoint device 1 to endpoint device 2 and from endpoint device 2 to endpoint device 1. If device 2 has the backups in question enabled, there is no E2E anymore.
The fact that they started working on the problem then abandoned it after the FBI complained is disappointing, especially to Apple consumers. But all it means is the status quo marches on.
Headlines like this vindicate my decision to never purchase an Apple product.
Why do I have to use a phone in its place? Apple products aren't a basic human necessity. I like to think I can just use a phone for the sake of wanting a phone, not to replace the void that not being an Apple consumer leaves in my soul, or something.
What exactly do you mean by "basic aspects of modern life"?
I have a desktop PC, a laptop, a work laptop, a LineageOS (Android-based) phone, and a VR headset for gaming. Anything I want out of modern tech, I either already have or doesn't exist yet.
I tried to get by with Lineage without gapps for a year and a half (if you don't forego gapps then there's no point from a privacy perspective).
I couldn't get push notifications on Slack because they went through gapps.
I couldn't use several online dating services because they were only on mobile, and their mobile apps broke without gapps.
I couldn't check my bank account from my phone because I couldn't get a hold of its app outside of the Play store, and because its mobile site locked my account for suspicious activity because I roamed between cell towers while using it.
I couldn't find places because there was no reasonable mapping option (OSMAnd, at least at the time, was abysmal to the point of being almost useless).
I once bought a pair of headphones that I couldn't use at all because you had to use Bose's app to set them up, and - you guessed it - the app was broken without gapps.
Even Signal - the OSS encrypted messenger - was partially hampered without gapps.
We can talk all day about how we got to this status quo and what can or can't be done about it, but the reality is that if you want to live a real, modern, urban life in 2020, so many people and organizations just assume you have a fully functional smartphone that you will be actively hampered without one.
> We can talk all day about how we got to this status quo and what can or can't be done about it, but the reality is that if you want to live a real, modern, urban life in 2020, so many people and organizations just assume you have a fully functional smartphone that you will be actively hampered without one.
And for that, you can also get a stock Android phone like 80% of the world and still not be an Apple consumer. :)
And be saddled with Google's services, which are 10x worse with regards to privacy than even this new revelation about Apple? I don't really understand your reasoning.
1. This isn't news about Apple's encryption, it's just that the status quo is here to stay. Your Apple device is as secure/private as it has been, and will probably not get any better. It's disappointing, but not really much of a revelation to anyone who's been paying attention since 2012 (or earlier).
2. The way the headline is phrased vindicates an orthogonal personal decision I made to not purchase hardware or software from Apple.
Given your most recent comment, at some point, you must have assumed a lot of things that a) aren't true and b) I never stated or implied.
We users are better off if Apple is "compromising" on something like this at the stage we're in now - especially since nobody forces you to use iCloud Backups - than we'll be when/if the US gov makes Apple an offer it can't refuse and forces a real backdoor master-key on the whole system top to bottom.
Assumption: that not moving to fully encrypted backups will prevent the government from having the political capital to enact a "crackdown" forcing a backdoor on the devices, iMessage, etc.
Another case of a headline not being supported by the story:
> Reuters could not determine why exactly Apple dropped the plan.
> "Legal killed it, for reasons you can imagine," another former Apple employee said he was told, without any specific mention of why the plan was dropped or if the FBI was a factor in the decision.
And further on:
> However, a former Apple employee said it was possible the encryption project was dropped for other reasons, such as concern that more customers would find themselves locked out of their data more often.
So 4 of the 6 sources were speculating (FBI), and one actively admits they don’t know the reason, but the lede says 6 sources confirmed this. Hmmm...
I know you're being cheeky and pedantic, but that's exactly the spin the title was trying to go for in order to get clicks. While not "technically" wrong, it's still pretty dishonest imo.
I disagree. I read "after" in the headline to succinctly imply that there are reasonable grounds to suspect a causal link between these events, while acknowledging that such a link has not been admitted by Apple (otherwise it would be "in response to").
It's true this kind of wording is often used in low quality journalism to float the idea of a causal link when there is no good reason to suspect one, but this is not one of those cases.
> However, a former Apple employee said it was possible the encryption project was dropped for other reasons, such as concern that more customers would find themselves locked out of their data more often.
This. It's also the reason why photos are not E2E encrypted on iOS: Apple really doesn't want to be in the position of saying "sorry, you lost all your data" or "sorry, it's sad that Grandma just died, but all of her photos are gone and there's nothing you can do about it."
So what about all the privacy billboards, 'What happens on your phone stays on your phone'? New version: 'What happens on your phone stays on your phone, and unencrypted in the cloud'.
A completely e2e encrypted backup system would have to include photos (current iPhone backups do not). But true encryption means that when customers forget their passwords, they lose their data.
Already, people who don't use iCloud Photo Library and lose their phones, or forget their passwords, lose the photos that were on the phones.
Anyway, I think the customer experience issues weigh pretty heavily here.
Shame on the FBI. Unfortunately the average Apple user won't understand or care about the implications. This is why I don't use iCloud or any cloud products. Apple is not nearly as bad as Google, though.
I stopped using GDrive about a year ago and I aim to be Google-free for 2020. I don't use Gmail for anything important any more.
It could be for many reasons too, including average people forgetting iCloud passwords and wanting their data back. Does Apple unlock an iCloud backup in that situation?
Perhaps a pro-privacy compromise would be for Apple to offer the feature but have it turned off by default, which means 99.99% of users won't ever change that.
The small bit of iCloud E2EE that Apple allows (for health data, passwords, and Mac-to-Mac clipboard sharing) is keyed off your iPhone's 6-digit passcode and your Mac's admin password.
That means Apple can also now perform an offline brute-force attack against your FileVault password.
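A back-of-envelope calculation shows why a 6-digit passcode is so weak against an offline attacker. The guess rate below is an illustrative assumption, not a measured figure; the real cost depends entirely on the key-derivation function (on-device, the Secure Enclave rate-limits guessing, which is exactly why the *offline* case is the worry).

```python
# A 6-digit numeric passcode has only 10^6 possibilities.
keyspace = 10 ** 6                 # 000000..999999

# Assumed guess rate for a weak KDF on commodity hardware (illustrative).
guesses_per_second = 100_000

worst_case_seconds = keyspace / guesses_per_second
print(f"worst case: {worst_case_seconds:.0f} s")   # prints: worst case: 10 s

# Compare an 8-character random alphanumeric password:
alnum_keyspace = 62 ** 8
print(f"alphanumeric keyspace is {alnum_keyspace / keyspace:.1e}x larger")
```

Even if the KDF were millions of times slower than assumed here, a million-entry keyspace stays searchable; a passcode-derived key is only as strong as the passcode.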
Apple should allow third-party backup solutions. Backblaze is the option I would use if I had the choice, because they already support end-to-end encryption (end-to-end does have a downside: losing the encryption password means losing the data. Most people wouldn't make this tradeoff, but many HN readers would).
When will a startup enter into this space with a privacy focused smartphone? Does this already exist, and is it good? If not, why all the complaints and why hasn't someone entered the market? Obviously startup costs will be high, but there will certainly be funding available for this market, right?
Pine64 and Purism make devices that might fit that bill, but keep in mind that the devices may not be as user-friendly in their current state as you might hope for. There's still a lot of rough edges, I hear.
a) Is this just a tactical move? Apple might choose to delay its plans in an effort to avoid confronting the current administration and wait for a more reasonable one.
b) Or is this a permanent change of strategy? Giving up.
1) There is no way Apple would be allowed to sell iPhones in China without the Chinese government having access to everything. So I assume that Apple users in China have exactly nothing e2e encrypted.
2) I have a strong suspicion that those 'enter your Apple ID password because your account needs it' messages really mean 'a government has requested your data, and even though it's encrypted, we will nag you about entering a password, and if you give it, you're fair game'.
I don't blame Apple for this; I'm sure they're doing what they can, but when a government says 'give us this data', they can't not comply and stay in business. And the whole point of a company is staying in business.
Vote responsibly - companies can't protect us from a government we have put into power.
Well, that's one way to force Apple to reverse a stupid decision. Looking forward to backups encrypted so well Christopher Wray would have a nervous breakdown, in the next iOS update.
To be fair to Apple, if you believe Apple's privacy policy, then even though they are able to access it, they likely won't access it or sell your data to advertising agencies.
Some companies are better than others but there's absolutely no reason to believe any one of them would ever be on "your side" for any reason. You can vote for who makes decisions in government, but you can't vote for who makes decisions in companies.
What choice is there beyond Apple or Google in terms of smartphones? And I mean actual, ergonomic, everyday convenient choice: my mother would firmly refuse if I said "I'll buy you a phone but will have to tinker a full weekend to make it half privacy-aware". And even if she were on board, she'd just yell at me if she couldn't do a basic task (this is a controversial topic around here, but heavily modded and supposedly Google-less Android is absolutely not as useful as a Pixel or vendor-modded Android).
So...
Google is an ad company. There's nothing they won't do to get to your data. And of course, being a huge company, they will lie about it at Congress hearings, lobby against punishments, make PR campaigns to mislead the general public, cover up their work with China until they are caught, etc. They already did all of these, many times.
Apple is seemingly a good citizen but do we really know what they do behind closed doors? As an Apple user I am still a realist and I know the answer to this question is a firm "No".
We seriously have no adequate choice. I like my iPhone; I don't play games on it (well, a few brain-teasers and a bunch more serious like chess but you get the idea), and I read a lot of stuff on it: work- and hobby-related. Social media gets almost zero attention from me. So smartphones can be very useful if you don't get hooked on BS.
And so I ask you again -- what actual choice do we have? How can we really vote in a way that will make a difference?
> How can we really vote in a way that will make a difference?
We (that is, tech users) have a choice, if we choose it. The ones who don't are the people who can't take that option (ie, your mother). You've stated as much.
What is your choice today, as someone technically inclined?
Well - you mentioned modded Android; but there are several other options, if you don't mind fiddling with things.
More than a few phone operating systems are out there, many open source.
For hardware, probably among the best right now is the Pine64 phone (it's currently in a strange "high-beta" state - you can order one, and it will supposedly be close to what will eventually be sold, but it isn't completely considered a "commercial product").
Alternatively, you could build a phone using a Raspberry Pi (or a similar board, like a Beaglebone or something), or an Arduino (you'll be very limited in what you can do using a standard Arduino - basically make/take calls, store some contacts, maybe SMS).
Note that most phone modules out there are 2/3G - but you can find 4G/LTE modules, fairly cheap if you know what you're looking for.
Your "phone" won't look pretty, but you probably know that. It will likely be fragile at times, and the software quality will depend on what you can find and what you can code yourself. Cloud storage, games, etc - will all be mostly up to you to implement in some manner.
A lot of work, certainly - but that's the only real option, as you can never be absolutely certain, if you're not in control of at least the OS and software.
But for everyone else? Yeah - they don't have any choice, unless they are willing to step up their knowledge (and most aren't, nor should they have to).
That's not how that works at all. You can buy something you need because you need it and that's not at all a vote saying you like the management of how it was produced.
It’s a vote whether you claim it is or not. If you need something but don’t want to support the company, buy it from someone else. And if you can’t find a company that you want to support, then you’ll see it’s just like real politics. You don’t get to only vote for the parts of a candidate you like, you vote for the whole package.
>then you’ll see it’s just like real politics. You don’t get to only vote for the parts of a candidate you like, you vote for the whole package.
Well, that's the problem with politics as well, and the reason that modern democracy is a sham (compared to ancient Athenian direct democracy [1]).
[1] obviously for those it included at the time. After all, modern democracy didn't include slaves, women, and even poor white folks (the extension of voting rights to non-property-owning white men happened in 1828, and it was hampered in the South until the early 20th century) until well into the 20th century.
> It’s a vote whether you claim it is or not. If you need something but don’t want to support the company, buy it from someone else.
Again, that's not how that works at all. I can name hundreds of items that I've purchased in the past year where there aren't meaningful competitors. I can name dozens of contracts I've entered into where management changed after the contract was signed (sometimes years afterward). Of course then I'm screwed because I'm still stuck in that contract.
If only there were a viable competitor phone. The choice is iPhone or some flavour of Android. Even if Android had E2E encrypted backups, it leaks in so many other unpleasant ways that it's not even a choice.
For me it's a vote for less tracking, or at least less invasion. It's not saying it's perfect or even close. There's a ton of things I'd change on iOS if I could.
So yeah, I 'vote' Apple because the alternative is a dumb feature phone.
They do. Including an independent third-party security audit.
> it leaks in so many other unpleasant ways
I recall Android security used to lag behind Apple at the device level, but I'm not sure that's still true with current hardware and OS. Could you educate us on the current state of Android data leaks?
For me, my device should have full-disk encryption, sandboxed apps and fine-grained control over app permissions. Both iPhone and Android have that.
Intentionally leaving iCloud insecure in the absence of legal compulsion is a sneaky evasion of all the much-ballyhooed device-level security.
Voting is a very bad way to send a message, especially if it is 'voting with your wallet'.
It's at most 1 bit of information, often even less, since there could be a huge number of reasons for each person to vote one way or another, or maybe not vote at all, or vote at random.
You can boycott a company your whole life, and nobody not even the company will care.
Tell me how I can choose to avoid buying fungible commodities from a particular source: suppose I want to avoid oil from BP because I disapprove of their handling of the Deepwater Horizon disaster. Avoiding BP-branded filling stations doesn't actually mean I'm not buying oil that came from BP. Or suppose I want to avoid wheat from Archer Daniels Midland, or crops sprayed with Monsanto products. The idea that capitalism provides meaningful choice is a joke.
The whole point of commodification (which arguably has little to do with capitalism - this process occurs in communism too) is to eliminate differentiation.
Pecunia non olet ("money doesn't smell") is the motto of any commodity market for thousands of years. You might not like who you're buying from, but the POINT is fungibility: to completely remove all distinguishing characteristics, allow interchangeability, separate the value of the good from the value of the producer or seller.
Early-stage markets (usually aided by capitalism), on the other hand, allow for disequilibrium, competition, and differentiation. At worst, this is how you get commodities repackaged as "artisanal bottled water", "bone broth", and the like. But it would be the way to differentiate ethical oil (is there any?) from unethical. It has also been pretty successful at labelling GMO/non-GMO food. So, yes, there are many cases where you do have meaningful choice.
Except that you only reasonably have two choices in smartphones: Android or iOS. So you can vote between plague and cholera.
And, no, you would have to be in a very privileged position already to have any other choice.
The "free market" can't fix things if there are no choices to choose from. Sometimes politics has to fix this instead.
You vote wholesale (on all aspects of a product) and not just on those you like or don't like, so the point is moot.
Plus, to vote "away" from a company/product, other products should exist that are better, and not just in this single aspect (encryption of backups), but in other aspects that count for your usage.
And at the end of the day, these tools are powerful and it's unclear that it's in the best interests of humanity at large to make it easy for every individual to cipher their data in a way that no other human can ever access. Apple (and their peers in the big-cloud-data-storage world) may not even be in the wrong for not willing to hand the same tool to the just and the unjust for cheaper than those individuals rolling their own.
> it's unclear that it's in the best interests of humanity at large to make it easy for every individual to cipher their data in a way that no other human can ever access
Arguably, this makes it harder for law enforcement agencies to do their jobs. I believe this is their burden to bear and work with, since privacy for every citizen is also important.
Tell you what. The minute the entire government and FBI start recording their activities openly on an immutable blockchain, or at least every police officer wears a bodycam on-duty, we can talk about handing over keys for all citizen data being open to said government. But still hard to search and index en masse.
And the same goes for every other government. Why should a government be able to do whatever it wants in secret?
I can't speak for all governments, but in the US the National Archives has responsibility for recording the things the federal government undertakes on behalf of the people. This includes even the tapes Nixon made of his own conversations as President.
The guiding principle the US government operates on in this context is "When a man assumes a public trust he should consider himself a public property" (Thomas Jefferson). There are plenty of ways the fed falls short of the goal, but the goal is set.
... and I don't think anyone's talking about "handing over the keys for all citizen data being open to said government." But we are talking about avoiding having common practice for private citizen information stored in servers owned by a third-party private corporation becoming "It's stored in such a way that nobody, not even the third-party private corporation, can ever access the data without a key the private citizen can throw away." There are some good cost-benefit discussions to be had about whether that should be a thing commonly offered (even if an individual can build it themselves).
To give a concrete example, imagine if Epstein's data on the human trafficking he conducted were impossibly ciphered now in an iCloud backup he made. Does that benefit society? And more practically (regardless of larger ideal morality questions), is it a good PR look for Apple if their tech made it easy for him to do and when the fed comes knocking on Apple's door to retrieve from Apple's servers a dead man's documents that could bring justice for sex-trafficked children, Apple's response was "Sorry; we don't have enough computing power to help you?"
It’s only a matter of time before easy-to-use open source technology becomes available to everyone to host and they won’t need Apple to manage their data.
Unless you mean Apple should be actively trying to siphon off private data via their OS and hardware and index it for the feds?
I would then say Apple’s “trusted computing base” isn’t so trusted.
> It’s only a matter of time before easy-to-use open source technology becomes available to everyone
I believe people have been saying that to me for thirty years now, but I'm younger than my peers. ;)
In the context of cloud services specifically, I think that's even less true than in the OS space. Half the benefit of clouds is someone else is maintaining the infrastructure, the backups, the ubiquitous connectivity, etc. None of those are trivial to handle as a solo project, and attempts to make them easier compete with free (as in time).
I agree that the responsibility is on the FBI to do the investigation, but transitioning from a world where private documents are irretrievable without a warrant to one where they're structurally irretrievable because of mathematics and computational limitation would fundamentally alter the balance of power between society and the individuals within it, in ways society hasn't had to explore. It's something worth thinking deeply about before leaping upon it.
Part of the FBI's responsibility of investigation is to surface this concern to companies that have the power to make that world a reality, and it appears Apple has agreed with them on the risks.
It's part of the Finder now. Same UI (and probably code); it just no longer ships as part of a monolithic application (but now as part of an even more monolithic OS).
I'm so done with Apple, but more generally, with cell-phones. I was very excited to have Signal available back in 2015, but when I learned about cellular baseband processors and DMA attacks, I realized the whole smartphone stack is insecure. We can't audit anything in our phones, down to the cellular chip.
Even if our phones were 100% trustworthy, they are triangulated by cell towers thousands of times per day. Location tracking can only be avoided by:
1: Not owning a phone.
2: Powering off your device and keeping in a Faraday cage while not using.
Tempting to own a cute little purse that blocks phone signals, but do I really need a cell-phone on my body, 24/7?
If I evaluate the overall pros & cons of my cellphone, it has been overwhelmingly negative. I've had a phone since August 2014, when I went to college. Before that, I would text with my parents' phones.
Here are the top negative things that have happened due to using a phone:
1. Miscommunication, isolation, social anxiety due to social media and texting. Talking in person is so much better. And what about the hours and hours of snapchatting, so pointless and sad looking back.
2. False sense of security, thinking you can know what's going on, help people, intervene when necessary (friend sexual assault stuff at parties). What about when their phone dies? It made me wish we had landlines, or that I had been there. If I didn't have a cell phone, I don't think I would've left the party. I would've stayed and kept watch.
3. Poor posture, lack of sleep, constant exposure to blue light (who knows if the light is really bad), etc.
4. Missed connections by having my head in my phone all the time in public.
5. The US government has a total map of my life since August 2014, even though I have sent thousands of encrypted messages and hundreds of encrypted phone calls.
6. Less time available each day. I have spent typically 1-3 hours per day on my phone since I got one, about 1,980 days ago. This amounts to probably 4,000 to 6,000 hours, or about 170 to 250 days. In other words, about 1/8th of my life since 2014 has been dedicated to bullshit technology.
Here are some positives:
1. I have lots of photos that would otherwise have required a camera. But I have dozens of film cameras and a few digital ones, and there's no reason to shoot photos on such a tiny format. Good luck printing cell-photos beyond 5x7 or even 8x10.
2. I occasionally talk to family. This could be accomplished with a landline.
Maybe I've missed some things, and maybe I'm being pessimistic, but the reality is that I've lost lots of sleep and experienced more problems with interpersonal relationships as a result of having a phone. It's likely that not owning a phone would expose me to a new class of problems, but I've decided to get rid of my phone.
I'm in the process of switching accounts and removing 2FA, so I don't need cell service. Once I get there, I'm planning to write a little blog post about it. After having a baby, it's become clear that a phone is sucking my life away, and I need to be present with my family. Hope to have this all dealt with in the next week or two.
I think you are indeed missing a few pros. Off the top of my head,
1. Having Google/Apple Maps has helped me find new restaurants and kept me from ever getting lost in foreign cities,
2. Tinder and other dating apps have enabled me to date people that I would never have met in my day-to-day life,
3. Lyft and Uber have come in handy more times than I can count,
and the list goes on. My use-case is different than yours, I’m sure, but there is a reason that smartphones are ubiquitous: we as a society have roughly evaluated the cost-benefit analysis of owning one and tend to side with the ‘benefit’.
No company cares about anything. A company is not a person.
Apple, because of its privacy-marketing, is incentivized to be the privacy player in the market. But only so far as consumers keep them honest about it.
They got away with this loophole because it stayed under the radar; if it gets enough attention and enough customers show that it matters to them, it could change.
On the other hand, it's possible that because we have a smartphone duopoly, Apple only needs to maintain a position where people will say "well, at least it's not as bad as Google". I'm upset about this personally, but I'm not ditching my iPhone. Of course, this does cement my decision to never pay for iCloud, for what that's worth (much less, but not nothing).
It's a common misconception that corporations are amoral incentive-driven machines impervious to ethics, morals or mission.
Corporations are run by leaders.
Many leaders choose to pursue unethical and immoral activities to maximize profit. They justify their actions by saying "it's just business", or "we have a fiduciary duty to the stockholders to maximize earnings per share by whatever means necessary".
Other leaders realize that an ethical purpose can often deliver outsized profits over the long term. Leaders with a moral mission make decisions that sometimes sacrifice short-term profits with the intent to build an organization and a brand for long term.
> It's a common misconception that corporations are amoral incentive-driven machines impervious to ethics, morals or mission.
Not really, those leaders are pretty quick to hide behind the corporate veil when it's convenient for dodging questions of moral (or even legal) responsibility.
The whole point of corporate legal structure is to create an entity that is _separate_ from the humans that occupy offices. That entity is not a person.
OK, but isn't this largely semantics? Saying Apple doesn't care about customers may technically be true, but that is taking it quite literally. The statement can also be meant to imply the people in Apple care, of course no one would speculate that a non human corporate entity would care.
I also largely agree that the Apple meme of privacy being trotted out lately doesn't quite jibe with this news, but at the same time surely there are people who care about it at Apple, and maybe as a whole they even prioritize it more than others.
But I also don't know how much I really disagree with the FBI's position. In general I have seen this kind of access used in the right situations (i.e., collecting communications of criminals). I understand this can be a slippery slope, but should we trade that for leaving clear evidence against criminals untouched in the name of "privacy"?
The people at Apple who write the code usually do care about privacy. Their bosses and execs? It is harder to tell.
From the information I have, the majority of Apple employees do care about values such as privacy and ethical business practices, as well as product quality and usability, but those values can sometimes be undermined by executive decisions based on business and monetary motives.
the (current) supreme court doesn't agree, and their opinion carries a tad more weight. corporations are basically super-persons in the eyes of the court, since corporations intrinsically funnel the resources of their many constituents, unlike individuals.
instead, any entity exerting outsized power, like corporations, should be held to higher standards of duty and transparency. that's an inherent value embedded in the US constitution that is absolutely being trampled over by power-seekers (commercial, political, or otherwise).
From a distance, to me it's interesting to read about fears of the FBI snooping in our phones and email, and see how quickly the discussion jolts to how immoral (or amoral) corporations are because one of them is not completely defying government.
(part of) how we mitigate unfettered power is pitting powerful entities against each other. the answer isn't "government!" or "law enforcement!" or "corporations!" but rather more even distributions of power, and lacking that, powerful entities in opposition.
that we also rationalize the actions of a corporation (to understand them) doesn't necessarily imply complicity. lacking more evenly distributed power, we should want apple to be pitted against google, the fbi, and the chinese government to funnel apple towards a privacy-oriented stance that it might not otherwise have.
incidentally, (many) americans vehemently support 2nd amendment rights as a way to have some semblance of power in an otherwise overwhelming power structure.
>Leaders with a moral mission make decisions that sometimes sacrifice short-term profits with the intent to build an organization and a brand for long term.
Shortly before being dragged off the stage by the stockholders. Careers have been ruined merely for not over-performing enough, let alone for leading multi-million-dollar enterprises off into the desert.
Worth noting that there are benefit corporations, or B-Corps, which get around this 'profits and nothing else' ideology. Some examples include Patagonia, Kickstarter, and Allbirds.
Thinking about long-term profits instead of short ones, and sacrificing profits for some greater moral purpose, are not at all the same thing.
It's entirely possible to do immoral things for the sake of profit while still being prudent about your company's future.
As for sacrificing profits to a moral end: private companies may do this occasionally. Not often, but sometimes. But the CEO of a publicly-traded company expressly does not have the option of sacrificing profits for any higher purpose, unless that directive comes from his shareholders. And in today's world, the shareholders of most major companies are of such a large number and have so many layers of detachment between themselves and the actual company (I don't even know what companies Betterment has me invested in; they can change every day) that the only common goal they can agree on is almost always profits.
An interesting exception to the norm is Facebook: despite being publicly-traded, Zuckerberg maintains both a majority holding and the position of CEO, and is therefore free to go on his crusade of "connecting the world", mostly ignoring what the other shareholders might want.
>> But the CEO of a publicly-traded company expressly does not have the option of sacrificing profits for any higher purpose, unless that directive comes from his shareholders.
to go a little deeper, corporate charters set out the values and goals of the company, as amended by the board from time to time, so it's whatever the (amended) charter says (it doesn't have to be solely profit-seeking). executives are judged by their ability to deliver on the goals of the charter.
boards are largely controlled by the various (professional) shareholders, and most, if not all, of them explicitly seek profits above all else. that's one way markets get dominated by profit-seeking companies.
the other common argument is that in capital-oriented markets, not-primarily-profit-seeking corporations are at a competitive disadvantage over time, as the extra profits of greedy corporations can push them faster/further along the technology adoption/innovation curve (or economies of scale/scope).
so it's hard for such companies to survive. i don't think in practice that this is a dominant factor in competitive markets, but it's an argument often made (in business schools, for example).
> On the other hand, it's possible that because we have a smartphones duopoly, Apple only needs to maintain a position where people will say "well at least it's not as bad as Google". I'm upset about this personally, but I'm not ditching my iPhone. Of course, this does cement my decision to never pay for iCloud, for what that's worth (much less, but not nothing).
Agreed, and I am likely moving away from Android to iOS for my next phone. Thankfully, backup is still a setting you can change, so your data isn't forced into their cloud.
I wish Microsoft hadn't backed out. A Surface Phone would be kind of cool. I guess too much stigma about Microsoft has held them back from being competitive. I think they should have waited for their Microsoft Store to grow much more organically, then released the Windows Mobile phones.
They botched the original Windows Phone through a failure of management. They botched subsequent pushes on it because the bootstrapping problem around apps had grown too deep. At this point, if they tried to give it another go, there would be trust issues: "Am I investing in a phone ecosystem that's going to be dead in a few years?" Not to mention how much they've gone all-in on Android development.
A Surface-branded Android phone wouldn't be out of the question, but my gut tells me it would die a quiet death from thin margins and differentiators that aren't big enough for people to get excited.
They botched the original Windows Phone through a failure of management
This. I had some Windows Phones besides my iPhones, because I liked very much what they were doing. Windows Phone 7, despite being technically weak (it was based on Windows CE), had an awesome UI. Nokia had some really affordable phones that were really well-built for the price and Windows Phone was getting traction. Quite a few friends/colleagues bought a Windows Phone, because it was the hip thing after the iPhone. The development story was also great, they used .NET and XAML (IIRC), which also made it possible to demo applications on web pages through Silverlight.
Then they screwed over all the early adopters by completely deprecating Windows Phone 7, doing one final release (7.8). None of the Windows Phone 7 devices were upgraded to Windows Phone 8. Most of the traction they had up till that point was lost and they were basically starting over with Windows Phone 8. Windows Phone 7 was already late to the market, the hard WP8 cut set them back even more years. And then it was simply too late.
There were technical reasons for WP7 -> WP8 (such as moving to the NT kernel, adding multi-processing support). But the hard cut was a catastrophic mistake. Either they should have started with the NT kernel in the first release or they should have had a gradual migration route from WP7 to WP8.
And underpinning all of that was the unfortunate fact that this truly innovative and lively new area of development at Microsoft found itself happening at the tail end of Ballmer's tenure.
There was this spark of energy around Zune, and then Kin (https://en.wikipedia.org/wiki/Microsoft_Kin), and then Windows Phone that was just completely orthogonal to the stagnant money-printing strategy that Microsoft followed before Nadella. It was this little glimpse of an Apple-like spirit somewhere deep in the behemoth. It was exciting. But it was always treated as a side thing instead of being placed front-and-center. It's now been diffused into Microsoft's various consumer products, most obviously the Surface, but Windows Phone was already dead by the time things started to change.
Aside: despite all the jokes about it, the Zune (2nd gen and forward) was awesome. It was a little late to the game - it really nailed the traditional mp3/video player right when the iPod touch had just come out - but I think it may've been the peak of that category. Everything from the UX to the hardware buttons was so meticulously considered, the screen was much bigger than an iPod Video, it had momentum scrolling that worked really well despite lacking a touch screen, etc. It did not at all feel like a Microsoft product of the time. It was even one of the first services to offer all-you-can-download, subscription-based music. And you could download songs over WiFi.
I completely agree. In an environment where everyone was just copying the iPhone UI, some of the design ideas that were coming out of Microsoft were really refreshing and exciting. There was really nothing else like it (and nothing like it ever since).
I would love to see how Windows Phone would’ve matured.
They botched subsequent pushes on it because the bootstrapping problem around apps had grown too deep.
I don't really think that is necessarily accurate. Lots of smaller developers would be more than happy to have a green field for app development if another player were willing to put in the effort to ensure the device isn't a POS and is priced reasonably. Some of the larger apps (Netflix, for example) already have a dozen different versions because they support not only iOS/Android but a pile of similar devices (TVs, etc.) and likely wouldn't have a problem with another platform if it has a future and has the prereqs (Widevine or similar DRM, for example).
It's that chicken-and-egg problem where people don't want to use it because it doesn't have the apps they need, and developers don't want to make apps for it because it doesn't have enough users. Microsoft tried to throw money at the problem to middling success. But I think it was too little too late.
I'm not sure why paying developers is a problem to bootstrap the ecosystem. Although, if they were paying for temple run (as the image might suggest) that seems odd. Mostly because presumably they should be targeting the apps everyone uses that don't have good replacements (aka netflix/prime streaming/etc).
I would guess that they aren't the only ones. Do you think LG/Sony/Samsung/Roku/Apple TV/Fire Stick/etc. all got the Netflix app ported for free? Maybe. Plex probably isn't getting paid, and they do it too.
But MS was a special case, it seems to me that every time I looked at CE/Mobile/etc they were tossing existing app compatibility aside for the latest and greatest toolkit that went with some not particularly good set of phones.
> But MS was a special case, it seems to me that every time I looked at CE/Mobile/etc they were tossing existing app compatibility aside for the latest and greatest toolkit that went with some not particularly good set of phones.
Their (forward) app compatibility was actually very good. I wrote a WP7 Silverlight app in 2011 that still works on the last release of Windows Mobile 10.
But that was basically quite late in the game; Microsoft had been making phone OSes for ~a decade at that point. In 2011 it seemed like MS was already putting it on life support.
It was not just any anonymous developers; it was Google specifically making Gmail and YouTube not work on Windows Phone. They refused to port their apps, and when Microsoft created clones (i.e., Microsoft-built YouTube apps) Google broke them on the back-end and threatened to sue MS.
It was just a monopoly abusing its walled garden and being anticompetitive, nothing else. Not that Microsoft would have behaved any differently if the positions were switched.
> A Surface-branded Android phone wouldn't be out of the question, but my gut tells me it would die a quiet death from thin margins and differentiators that aren't big enough for people to get excited.
There's definitely a differentiator, at least. Apparently they've forgotten what happened to the many previous attempts at dual-screen phones and tablets.
Yeah, I forgot about that one, although it's not really primarily "the Surface of Android phones"; it's an experimental form factor that happens to be lumped under the Surface brand because why not.
I'd argue that Android, without the Google stuff, is still the best option if you care about privacy and security.
This is not accessible to the average Joe, but I'm pretty certain the majority of readers here can use the tools to load an alternative ROM, can enable and use F-Droid just fine, and are knowledgeable enough to know which apps to avoid.
Ironically, the best phones to do that are Google's own Pixels - to give credit where it's due, at least the phones have unlockable bootloaders (unless you buy the Verizon variants).
e.g. GrapheneOS currently only supports the Pixel 2, 3, and 3a:
> No company cares about anything. A company is not a person.
Personification is a useful concept. Companies do have some mechanisms that lead to consistently different decisions compared to other companies in similar situation.
You're like the thousandth person objecting to the concept on HN, always with the same air of revelatory smartitude. But it's not contrarian insider knowledge. It's simply the inability/unwillingness to understand basic symbolic speech.
Apple cares about me paying $999 for another iPhone in a couple years, along with trying to sell me iCloud storage (that sweet, sweet recurring revenue). Now I’m never going to purchase iCloud storage because the feds could snoop in it.
If there is a more secure alternative available then I’ll go with that. Also iCloud backup is now turned off, no need to enable the surveillance state.
> I think we all understand what that word means in this context.
No, we don't, and that's exactly the point.
We tend to anthropomorphize companies. "Good Guy Google" has become "Evil Google". "Micro$oft" has become "Altruistic, OSS Microsoft". Apple has become "Defender Of Privacy". But all of these are illusions created by marketing departments; there is no real sentiment behind any of them that can be used to predict future behavior. And it takes constant vigilance to remind yourself that those narratives are empty.
Persons [edit: I don’t mean in the legal sense] they are not, but do you think it is reasonable to model companies as non-person _agents_? (In the sense of “thing which has goals/preferences, (and beliefs?) and acts so as to further those goals/preferences”.)
If it is useful to model such organizations as non-person agents, then while they of course would not “care”, in the sense of emotions, about things, it would be coherent to say that such an organization e.g. “has protecting privacy as a goal”.
Err, I guess I’m eliding the distinction between “being useful to model as an agent” and “being an agent”, which may be a mistake. I suppose what I should say is “if it is useful to model as an agent, it is useful to treat as coherent the claim that it has goals of e.g. privacy stuff.” .
Even at this late date in the advance of capitalism I think it's still a useful reminder.
With all of the carefully crafted marketing aimed at bypassing the forebrain and making people feel as though corporations care, it doesn't hurt to keep the fact that they don't at the forefront of the conversation.
No, it's because they can't effectively leverage your data to sell you stuff, they don't need it.
Thus follows the privacy marketing.
If Apple did find your data useful, they wouldn't be able to leverage that marketing angle.
Companies are made up of people, who care about people, and also, corporate objectives are not evil, generally. Working with the FBI might raise your eyebrow, but it may not for others, and it's an ambiguous question to most.
In the end, the balance of power has not fundamentally shifted. Most people have little to worry about, some criminals may have more to worry about. Of course the problem arises when innocents are needlessly entangled - hopefully this can be minimised. It's not like the FBI has instant and easy access to your phone, thankfully.
It could be signalling one is a far more attractive target to exploit, because of e.g. a hard shell, soft interior (M&M architecture). Or maybe Apple patches quicker and gets them out quicker, so exploit lifetime is shorter. Just to be clear, I don't know if that's true, but it's an equally plausible explanation for the exploit price.
That's a good point- there are nine times as many Android devices out there as iOS devices, making exploits for the former more valuable in certain ways.
As far as I know the consensus is still that iOS is more secure. Of course there are different definitions of that. Certainly Android is more ripe for abuse by apps downloaded from the store, for example; they can run freely in the background and do things like draw UX over other apps.
Also, privacy != security. iOS is absolutely in a better position on privacy, even after this new development.
iOS has been worse for privacy for a very long time. You cannot even develop apps for your own device without telling Apple who you are or download apps without creating an Apple account. You cannot get your location without also sending it to Apple. Android suffers from none of these problems.
Theoretically, maybe; I have no idea. Realistically, not at all, considering that even flagship Android phones tend to receive updates with an insane delay (except Pixel ones), to say nothing of non-flagships or any Android phones released 2+ years ago.
Last Android phone I have bought was Galaxy S8+, just a few months after the release. Had to wait about half a year (if not more) for the next major Android update after it had already dropped for Pixel devices.
Agreed. Once a company goes public, it is obliged to maximize its profits. The investors can literally sue you if you don't. So, anything considered legal is on the table.
The inconsistency in that is telling.
Corporations demand the rights of persons, yet do not tend to shoulder the responsibilities that come with it [0]. Given how gigantic they are, the consequences are way too costly for the society.
This used to work great and is still my preferred method of backup, however for at least the past year it has been extremely unreliable. I have to manually load iTunes on my computer and initiate a backup from there because it never happens automatically anymore. And there’s no way to manually start the Wi-Fi backup from my iPhone anymore (that was removed in iOS 13). Maybe it’s more reliable if you use a Mac instead of a PC, but at this point I get a full backup maybe once a month, and often end up plugging in via USB cable in order to even get that.
(to clarify, my desktop PC is always turned on and connected to the network, so most of the other complaints in this thread about it not working with low battery laptops or whatever don’t apply in my scenario. Even in my ideal-case scenario it still barely works)
Worth mentioning iMazing by DigiDNA fixes a LOT of the issues around Apple's poor implementation of iOS <-> PC/Mac support. You can use it to schedule automatic wireless local backups, among other things.
It's also flaky as hell (just looked and my latest backup is over 3 weeks old), and relies on having a computer plugged into power at the same time as your iPhone, with enough free space.
I've tried getting my family to use it (mainly because they didn't want to pay for iCloud storage), before giving up and just paying their storage for them.
> and relies on having a computer plugged into power at the same time as your iPhone, with enough free space.
I mean... yes? How else would they do it? Data transfer is power-intensive, and encryption is also power-intensive. Imagine having only 10% battery on your laptop while you're working at a coffee shop or something and suddenly it drops to 5% because it's started a backup.
Sure, the majority of the time it'd probably work out fine, but Apple's UX philosophy is to simply remove undesirable states from the equation by sufficiently limiting users' options. This is exactly the kind of behavior I would expect from this feature.
It was the primary reason I gave up trying to make it work for my family - none of them regularly leave their laptop plugged in overnight, so there was rarely a time when both laptop and phone were charging at the same time.
It's a command line suite for Mac, Linux, and Windows that interfaces with your iPhone through its native protocols, including for encrypted local backups.
I’m familiar with this suite. It’s fantastic, but at the same time it is often broken by a new iOS release. Apple seems to have no interest in maintaining compatibility. Recently I needed to copy some MP3s to my phone. I’ve done this before with my laptop with success. But that day it did not work. And neither did my Linux desktop. I ended up listening to the audiobook (a free audiobook from Cory Doctorow) by pairing my laptop with my car’s Bluetooth and playing from my laptop. Cumbersome.
I’ve also had instances where the latest version of that library supposedly works, but it’s dependent on another library that hasn’t hit Debian mainline yet. So I’ve had to wait months just to get a version that works even though it was released much earlier.
Apple EULA used to (still does?) prohibit virtualization of macOS on non-Apple hardware, so this setup could get flaky (need to patch around new hardware detection mechanisms when upgrading the OS).
You want me to pay for a Windows license to run a Windows VM on Linux, all just so I can back up an iPhone? I don't have to do any of this crap with an Android phone.
And do you believe that running Windows in a virtual machine prevents Microsoft from getting the telemetry data and who knows what else? While you're right that running Android does expose (some of) your data to advertisers, with Windows, we're not even sure what exactly is leaving your machine the last time I checked.
Or are you suggesting that an average Linux user who wants to back up their iPhone needs to buy and install Windows in a VM, and then is further expected to tinker with ingress/egress network rules to make sure no data is being sent over to Microsoft? I'd say that's a tall order.
It's encrypted with a password you type on the computer, but is stored on the phone. Annoyingly, if you lose the password, the only way to change it (even just for future backups) is resetting all settings on the phone.
Why do people care about phone backups anymore? I don't think I have any apps that don't store everything in the cloud. There's nothing of value that's only on my phone. I could toss it in the bin and set up a new one now and not lose anything.
iPhone backups are only one snapshot, taken semi-frequently, so unless you notice the corruption very quickly, it's going to 'back up' that corruption too.
I said: don't make your phone the primary store of your content, and don't back up your phone.
The reason is that if you lose your phone, everything since your last backup is lost.
Instead, have the primary store of your content be in the cloud, and back up your cloud.
The reason for this is that if you lose your phone, it's irrelevant, just get a new phone and re-install your cloud apps. Now you don't need to worry about backing up your phone ever.
The only failure mode is if your phone has not had a signal to upload new content to the cloud in the last few minutes. Well guess what you probably also haven't backed up your phone in that time either if that's what you were doing.
This is why I don’t use iCloud. I don’t think the benefit is worth making my phone subject to search. But when I lost my phone, I was a little disappointed to lose some of the photos I had never moved. A local backup with zero-effort automation does sound promising, if it’s secure.
Right, so don't backup your phone, on iCloud or anywhere else. Use another cloud service to store your files and only sync to your phone, don't back it up.
I didn't say syncing was a backup. That's the opposite of what I'm saying.
I'm saying that my phone becomes the synced copy. The cloud is the canonical version (backed up separately elsewhere) and my phone becomes a device I can lose or break and not worry about anything so I don't need to back it up. As soon as I create any content on my phone I upload it to the cloud.
Take my phone from my hand at any moment and as long as I've had signal in the last few minutes I've lost nothing.
I asked about why people care about backing up their phone. My point is the data should not live on the phone. It should live somewhere more durable, and that durable store should then be backed up.
It’s not only way more friction than iCloud, but you’d be surprised how many people nowadays don’t even own a PC or Mac.
In my wife’s family 60% of people only have phones, tablets and smart TVs. They use laptops/desktops only at work.
It's a shame, but I don't see how this is a knock against Apple. They offer a basic service to backup your device on their infrastructure. If you want better/more you have to provide the infrastructure yourself, and they support several common permutations of suitable infrastructure. I think that's reasonable.
On a daily basis there’s no more friction than iCloud. My phone backs up to my computer over WiFi once a day when it gets plugged in for charging. I don’t ever think about it—it just happens.
The other thing that’s a bit of a problem as I discovered is it doesn’t seem as if there’s any documented way (and I’m wary of undocumented workarounds) to have this backup anywhere but your boot drive. Especially with several OS X devices this can be a bit of a problem if you have an older undersized SSD. I was getting critically low on space and it turned out a huge amount was these backups.
Precisely my problem -- with all the household devices, we've outgrown the boot drive's ability to locally back them up. This is despite the fact that all itunes (or whatever it's called now) media is stored on a NAS and accessed through iTunes seamlessly.
I suppose I could mount a network drive / NFS share at the location iTunes believes is local ...
> I believe even encrypted backups can be cracked.
If anything, they're more prone to being cracked, since an attacker can load the backup onto a very fast system and run custom software which can run many permutations of keys against the backup to try to unlock it. With actual hardware, they're limited by having to manually enter the key in most cases plus any hardware or OS features which would notice that the user has entered too many incorrect passwords and lock down the device further.
On the other hand, if the computer that your backup has been backed up to is itself reasonably locked down, that might be an acceptable trade-off; they can't try to crack a backup they can't access in the first place.
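The asymmetry described above can be sketched in a few lines. This is an illustrative toy, not Apple's actual key-derivation scheme: it derives a "backup key" from a password with PBKDF2 (iteration count and salt handling are assumptions for the example) and shows that an attacker holding the backup file can test guesses offline with no lockouts or inter-attempt delays.

```python
import hashlib
import os

# Assumed iteration count for illustration only.
ITERATIONS = 100_000

def derive_key(password: str, salt: bytes) -> bytes:
    """Derive a backup key from a password (toy scheme, not Apple's)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)

def offline_crack(target_key: bytes, salt: bytes, candidates):
    """Dictionary attack: with the backup in hand, the attacker can try
    candidates as fast as their hardware allows -- no device to rate-limit
    them, no wipe-after-ten-failures policy."""
    for guess in candidates:
        if derive_key(guess, salt) == target_key:
            return guess
    return None

salt = os.urandom(16)
key = derive_key("hunter2", salt)
print(offline_crack(key, salt, ["123456", "password", "hunter2"]))  # prints hunter2
```

The only real brake on this attack is the iteration count (and the user's password entropy), which is exactly why an offline copy is a softer target than the phone itself.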
The ecosystem is incredibly insecure, despite what Apple's marketing department wants you to think. More than anything I'm surprised at how often their advertising talking points are parroted in tech circles, like here on HN.. in any thread hinting at Google vs Apple you're bound to find a long comment chain about how Apple is "secure" and "privacy oriented" and "cares about their users". Not that Google is any better, but it makes me wonder whether it's just astroturfing or if their marketing is actually working.
It all depends on your threat model. For the average person I’m not worried about the relatively unlikely occurrence of the government accessing their backups. I’m much more worried about the more likely occurrence of local malware or ad tracking by malicious apps including from big names like Facebook & other social media companies, and in that instance Apple is still ahead as the OS restricts what apps can do a lot more than Android, the App Store is more curated against malware, etc.
Here's a threat model for you: how fucked would you be if your iCloud backups suddenly became searchable online? Ever say anything bad about your boss or company in a convo? Ever taken any photos you wouldn't want released? Ever downloaded any files you wouldn't want public?
The agencies have proven time and time again that they're pretty terrible at keeping secrets (see: all the leaks, data breaches, TSA master keys, etc). As long as they have a back door, it's inevitable that it'll happen -- it's effectively a ticking time bomb.
When I send a message over Signal, I can trust that it'll be kept private, barring some truly extraordinary incident. With Apple, it's kept accessible by design (storing the encryption key with the encrypted backup? really?).
It's not about the government accessing my messages, because I really don't care all that much, it's that I don't trust them to keep secrets. If the government has access to something of yours you must assume it'll eventually be made public... whether or not you're okay with that is up to you.
I mean, I'm not up to anything illegal, and I don't actually care about people seeing my stuff, BUT... the concept as a whole of "government should have access to everything all the time, regardless of reasoning" does not sit well with me.
You don't even, for instance, torrent? Movies, music? (You don't have to answer that. :P ) Ever pirate apps, even just to try before you buy?
The point is, if someone from the law is interested in you, or suspects you of some grander illegal thing, a lot of us do illegal things that might just be a little less grand, and a lot of us have the digital equivalent of a broken taillight.
There are plenty of more mundane misuses as well. Ex-spouse stalking, slighted lawman with a grudge, technician with a weird fetish. Sure there should be safeguards but they don't always work as well as you'd hope.
Is Apple’s privacy white paper dealing with facts on iCloud just an eyewash now?
> Instead of protecting all of iCloud with end-to-end encryption, Apple has shifted to focus on protecting some of the most sensitive user information, such as saved passwords and health data.
> But backed-up contact information and texts from iMessage, WhatsApp and other encrypted services remain available to Apple employees and authorities.
Way to confuse laypeople with promises of security and data privacy. If Apple had concerns about users losing the key, why not implement it similar to two factor authentication on Apple IDs where Apple also provides the recovery codes (and additionally disallow any other mechanism of recovery)?