Hacker News

So, why are we mad about this? The techniques used maintain perfect privacy throughout the process. It's a neat feature with no downsides for the user.



> So, why are we mad about this?

Not everyone wants the software/OSes we run to automatically send data elsewhere. I bought the damn device, I own it, yet somehow it (or the company) decides that some things it comes across on it can be sent to the company?

No thank you, I prefer consensual computing.

> with no downsides for the user

No downsides for you, with your requirements/use cases. If the user has a requirement of "Doesn't send anything to anyone without consent", then this is obviously a downside.


>Not everyone wants the software/OSes we run to automatically send data elsewhere.

I personally find it offensive when a mega-corp makes the assumption that my connection to the Internet is available for them to build for-profit services without giving me sufficient agency.

The can't-disable-Wi-Fi-safely dark pattern is bad enough. But turning me into a data harvester for their million-dollar services, without even thinking about giving me a cut?

No thanks.

Alas, these anti-patterns have become a norm by way of ignorance, and it's not getting better.


Because Apple could, just as easily, have opted everyone out of this by default. I am not, by default, training data for AI or machine learning.


Could Apple get any training data out of this, given the homomorphic encryption?


If not, then why the default opt-in?


Because it's a) a useful function for the app that b) can't practically be done entirely on-device, and c) they believe they're not sending anything private off-device.


None of that justifies signing me up for something without my permission.


That wasn't the question you asked.

You asked "If not [for training data], then why the default opt-in?" and I gave three alternative reasons.


None of which are sufficient justifications...


It must be awfully convenient to believe that the only possible reason for it to be on is the same reason you made up entirely based on nothing. Personally, I don't find that logic convincing.


Your reasoning is also based on nothing beyond your own speculation.


no u

a) Compensating for missing location metadata is a valid feature for a photo library. Peer competitor Google Photos also implements location estimation from landmarks.

b) The sizes of contemporary general vision models and the size of the vector database for matching potentially millions of landmarks suggest that this is not suitable for running on-device.

c) Apple's entire strategy is to do cloud computation without private data, so it stands to reason that they believe they're not using private data.
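The matching described in (b) amounts to nearest-neighbor search over embedding vectors. A minimal sketch of the idea, with made-up landmark names and hypothetical 3-dimensional embeddings (real systems use high-dimensional vectors produced by a vision model and an approximate-NN index over millions of entries):

```python
import math

# Hypothetical embeddings — a real system would derive these
# from a vision model, not hand-pick them.
landmark_db = {
    "Eiffel Tower": [0.9, 0.1, 0.2],
    "Golden Gate":  [0.1, 0.8, 0.3],
    "Taj Mahal":    [0.2, 0.3, 0.9],
}

def cosine(a, b):
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def nearest_landmark(query):
    # Brute-force scan; production systems use approximate
    # nearest-neighbor indexes instead.
    return max(landmark_db, key=lambda name: cosine(query, landmark_db[name]))

print(nearest_landmark([0.85, 0.15, 0.25]))  # prints "Eiffel Tower"
```

The size argument in (b) follows directly: the model that produces the query embedding can run on-device, but the database being scanned does not fit there.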


Apple turning a feature on by default can only possibly be a conspiracy to surreptitiously obtain training data from people’s photos? It couldn’t just be that they think most people will want the feature?

It’s one thing to argue that Apple is doing this for nefarious reasons, but to suggest that this is somehow the only conceivable option is a bit nuts.


>Apple turning a feature on by default can only possibly be a conspiracy to surreptitiously obtain training data from people’s photos?

Apple-scale companies have done far worse. It is hardly a "conspiracy" to allege Apple behaves like other companies of its ilk.

>It couldn’t just be that they think most people will want the feature?

Apple doesn't care what the end user thinks. "A lot of times, people don't know what they want until you show it to them" -Steve Jobs

>only conceivable option is a bit nuts.

I am merely suggesting it as the most likely option, not the only option.


I think it's technically impossible for Apple to get training data from this due to the encryption. In which case it's not a likely option at all.


The landmark tagging models are obviously trained on something. Apple still benefits from my involuntary participation.


They’re not obviously trained on encrypted data that Apple can’t decrypt.


The purpose of the homomorphic encryption is that they can compare similar landmarks without seeing the unencrypted image data on the server side.
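To make the property concrete: homomorphic encryption lets a server compute on ciphertexts it cannot decrypt. Apple's scheme is lattice-based (they describe using BFV), which is far more involved; the toy below uses Paillier, a much simpler additively homomorphic scheme, purely to illustrate the principle — multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so the "server" step never sees the inputs:

```python
# Toy Paillier cryptosystem (additively homomorphic). Illustration only:
# key sizes here are absurdly small and this is not Apple's actual scheme.
import math
import random

def keygen(p, q):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid because we fix g = n + 1
    return (n,), (lam, mu, n)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    n2 = n * n
    l = (pow(c, lam, n2) - 1) // n
    return (l * mu) % n

pub, priv = keygen(1_000_003, 1_000_033)  # toy primes; real keys are far larger

# "Server side": multiply ciphertexts. Plaintexts add underneath,
# yet whoever does this never learns 42 or 58.
c = (encrypt(pub, 42) * encrypt(pub, 58)) % (pub[0] ** 2)
assert decrypt(priv, c) == 100
```

Similarity scoring against a landmark database needs homomorphic multiplication as well as addition, which is why the real deployments use schemes like BFV rather than Paillier; the privacy guarantee is the same shape, though.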


Right, so I assume that they can’t build up a database of images to use for training future models. But I was hoping someone who understands homomorphic encryption and machine learning better than me could confirm this.


The question should be "why are you not mad about this?".


I'm not mad about this because I use Google Photos, which has been doing the same thing for the last two years without people on the internet telling me to be mad about it.


Using Google Photos was your own choice. If your default browser decides tomorrow to opt you in to mining crypto, I'll wager you would be unhappy.


Not sure what you mean. Google Photos is the default on every smartphone I've ever owned and this setting has been on by default as long as it has existed. You could just as easily say "Using Apple Photos was your own choice" and get shouted down.

The point is that outrage isn't automatic. Not everyone is going to be equally mad about a check box.


People aren't complaining that Apple Photos is installed by default. They're complaining that it's sending data up to Apple by default. You have to explicitly opt in to Google Photos backing up your photos to the cloud. That setting is not on by default.


You have to opt in to send your photos to Google Photos.


Same. But honestly with all the "I pay extra for Apple because privacy" posts around here, I kind of expect better from them. Whereas everybody pretty much knows that if you dance with Google, they're going to be looking down your top...


Personally, the whole "send a vector embedding of part of a picture wrapped in encryption and proxies" approach seems like it probably is better, but maybe Google is doing all of that too.


Because Apple did a great job implementing a useful feature in a privacy-preserving way, and I don't want to toggle on 100 opt-in features when I set up a new iPhone.


This should be a choice between "recommended experience" and "advanced experience" when you set your phone up. If one selects the latter they get all the prompts. It should then be possible to toggle between experiences at any point.


Ok. Maybe Apple won’t mind running my executable and letting it send sensitive data to my server?


"We" don't automatically, naively assume that a brand new feature, which has undergone no external validation, that uploads data/metadata from your personal, private photos library without your foreknowledge or consent, is "perfect".



