This no longer works on iOS. Apple locked the ability to enumerate installed apps several years ago. It makes app deep linking more painful but eliminates this privacy nightmare.
Funnily enough, only Apple and its ad network now have access to the list of installed apps.
It is a hard problem because there are so many side channels, and apps keep switching over to use them. And then it becomes a policy and review challenge, which is tricky to nail correctly every time.
They could easily do this on device, but I still see ads for apps I own (including my own apps), often right above the organic App Store search result for the same app.
> Of course Apple can program in apps that look at your list
Why is that of course? I mean, I get that technically they can do that, but should they be allowed to? If apps need to follow some sort of guidelines to be published on the App Store, wouldn't it be unfair if Apple doesn't have to follow those guidelines?
An example of an 'app which can look at your list' is Springboard, the iOS internal name for the GUI with home screen and so on. Another example is of course System Preferences.
I don't believe Apple ships any optional apps which have this level of privilege, let alone one which doesn't obviously need to have it. This is possible, but to the best of my knowledge it isn't the case.
> Why is that of course? I mean, I get that technically they can do that, but should they be allowed to? If apps need to follow some sort of guidelines to be published on the App Store, wouldn't it be unfair if Apple doesn't have to follow those guidelines?
If you trust Apple to control the underlying software on your iOS device that, for example, is supposed to enforce all the other restrictions, then you are already extending them far more trust than you extend to any other app. It's not clear to me why that trust would end at a notional border between "the OS" and "an app".
I'm okay with this. It's nearly impossible to stop every company from collecting your data, but it becomes a hell of a lot easier if you try to consolidate and trust one major company. For me, I try to limit the collection by Facebook and Google by putting more trust in Apple services and features. The alternative to this isn't Apple getting less data from me, it's that same data also going to Google and Facebook.
As an example, if I'm going to buy a "smart assistant", it'll be a HomePod instead of Alexa or Google Home. Not out of brand loyalty, but because Siri may already be capturing all of that anyway, so the additional data loss is minimal.
It becomes not a question of whether to trust some for-profit company, but which one.
Many people feel, and I tend to agree, that Apple is a safer bet than Google, because Apple (at least, at the moment) makes its money via hardware and software sales, rather than advertising revenue. They have significant incentives to protect your data and not share it, in a way that an advertising-centric company like Google does not have.
Of course, as Apple gets into the ad business they will erode this advantage, whether actual or just perceived.
In the last few years Apple has been attempting incredibly successful vendor lock-in, mostly in the name of privacy: forcing Sign In with Apple, content sales via their channels, etc. Leaving the Apple ecosystem will only get harder each year.
They’re slowly boiling the frog.
Note - I do use Apple devices, so I’m also a frog.
I would say it is a safer bet but not by much. What info Apple is keeping is pure speculation though and nobody outside of Apple could verify it effectively.
I mean, sure, but Google has done zero to make me think they care even the slightest about privacy (security != privacy). Apple at least takes some steps. Trust me, when they go rogue I will be just as harsh.
Why isn't it possible to have 99% of apps be completely siloed off from others? Essentially a walled garden within a walled garden. Sure, a few will need to communicate with other apps, but that should be a user permission and not available "just because".
100% agree, I was just using it as an example. When I had an Alexa, it was much more well-rounded than Siri is. Living with Siri (or without?) is just an inconvenience of the consolidation.
Sadly, I think Apple stockholders demand growth, and one way is through forcing the user to listen to 5 ads before they can check their SMS or make a call.
Related anecdote: I recently had a banking app on my Android that would close itself after reporting that I had "Flappy Bird" installed on my phone and that it was malicious, which of course isn't true. (It's the original, authentic version of Flappy.) That said, it is very eerie that the banking app would scan my phone for other apps. I didn't log in and uninstalled the app. It was either the banking app or Flappy Bird.
I find it more eerie that your (and probably my) phone lets an app get that info at all.
We wouldn't need to talk about privacy abuses if phones didn't let apps do all this shady stuff in the first place, and if privacy controls were better.
It is fairly common for banking apps to include various "RASP" (runtime application self-protection) tools/libraries which scan for various signs of attack, including whether the phone is rooted, whether any screen overlay is active, whether a debugger is attached, or whether a dangerous app is installed (e.g. an SMS stealer). See e.g. https://www.ibm.com/products/trusteer-mobile-sdk
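To make that concrete, here is a minimal Kotlin sketch of the kind of check such a library might run; the su paths and Magisk package name are common heuristics I'm assuming, not anything taken from a particular SDK:

    import android.content.pm.PackageManager
    import android.os.Build
    import java.io.File

    // Very rough root check: look for a su binary, a test-keys build,
    // or a visible Magisk install. Real RASP SDKs do far more than this.
    fun looksRooted(pm: PackageManager): Boolean {
        val suPresent = listOf("/system/bin/su", "/system/xbin/su", "/sbin/su")
            .any { File(it).exists() }
        val testKeysBuild = Build.TAGS?.contains("test-keys") == true
        val magiskVisible = try {
            pm.getPackageInfo("com.topjohnwu.magisk", 0)
            true
        } catch (e: PackageManager.NameNotFoundException) {
            false
        }
        return suPresent || testKeysBuild || magiskVisible
    }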
In general that shouldn't be allowed. I believe on the latest versions of Android, apps need to explicitly declare which other apps they want visibility of. I think you can apply to Google to bless your app with visibility of all installed apps, but I think that's limited to things that need extra security. The banking app could qualify for that.
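A rough sketch of what that looks like in practice on Android 11+ (to the best of my understanding): without a <queries> entry naming the target package in your manifest, or the separately reviewed QUERY_ALL_PACKAGES permission, the lookup below throws NameNotFoundException even when the app is actually installed:

    import android.content.pm.PackageManager

    // True only if the package is installed AND visible to the calling app.
    fun isVisibleToUs(pm: PackageManager, pkg: String): Boolean =
        try {
            pm.getPackageInfo(pkg, 0)
            true
        } catch (e: PackageManager.NameNotFoundException) {
            false // not installed, or installed but filtered from our view
        }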
Very possibly just that it is an ID that is no longer on the App Store - in Flappy’s case because the developer was frustrated with the demands being placed on him by all of the app’s global users (it was a sad story at the time). But a security provider wouldn’t be wrong to assume that the majority of times an app is removed from the App Store it is because it was found by Apple to violate policy in some way - and why introduce the liability of having a whitelist for things like Flappy Bird?
I assume all public testimonies in front of government officials are a political exercise. The politician gets some video clips they can use as evidence of “taking action” or at least attempting to take action, and the business gets to keep doing what they are doing at the cost of an executive traveling to DC and saying nothing material over and over for a few hours.
Yes, especially this senator in particular, an obvious Russia supporter[1], has absolutely no grounds to conduct a McCarthy communist hunt. He knows his base is the audience here and that they are too ignorant to see a China-Russia alignment. His party should have chosen anyone other than him to run this grandstand hearing exercise.
And yeah she was being super evasive. Between the two of them, there was no information transfer.
FYI this is common if the report is referring to the Android app.
Apps can scan the device for installed packages that have opted to be “visible”. This is particularly common for apps that attempt to prevent rooted users from using the app. I would not be surprised if the iOS platform has similar functionality.
I’m not a fan of TikTok but this seems like it’s not a big deal.
It’s done for legitimate purposes, for example knowing which apps to launch for a given ‘Intent’ or checking for the existence of Google Chrome to attempt to launch a Chrome Custom Tab.
It’s normal functionality for mobile apps.
Now, if they are scanning visible installed packages and sending that data to their own servers for storage, that would be a story, but there’s no evidence that that is occurring.
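For example, the Chrome check mentioned above typically looks something like this sketch (assuming the androidx.browser Custom Tabs library; com.android.chrome is Chrome's stable-channel package, and on Android 11+ even this lookup needs a <queries> declaration):

    import android.content.Context
    import android.content.Intent
    import android.content.pm.PackageManager
    import android.net.Uri
    import androidx.browser.customtabs.CustomTabsIntent

    fun openUrl(context: Context, url: Uri) {
        val chrome = "com.android.chrome"
        val chromeInstalled = try {
            context.packageManager.getPackageInfo(chrome, 0)
            true
        } catch (e: PackageManager.NameNotFoundException) {
            false
        }
        if (chromeInstalled) {
            // Pin the Custom Tab to Chrome specifically.
            val tab = CustomTabsIntent.Builder().build()
            tab.intent.setPackage(chrome)
            tab.launchUrl(context, url)
        } else {
            // Fall back to whatever the default browser is.
            context.startActivity(Intent(Intent.ACTION_VIEW, url))
        }
    }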
It should be, but apps can create custom Intents anyway (for example, com.facebook.(katana|orca) definitely have custom Intents). It's closer to web pages scanning what fonts you have, though: they can't enumerate what packages you really have, but they keep huge lists of these custom Intents to probe for.
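A quick sketch of that probing technique; the action string is a placeholder, not a real Facebook intent, and as far as I know on Android 11+ even these probes are filtered unless declared in <queries>:

    import android.content.Intent
    import android.content.pm.PackageManager

    // If anything on the device resolves this custom action, the app that
    // registered it is almost certainly installed - no package enumeration needed.
    fun appLikelyInstalled(pm: PackageManager, customAction: String): Boolean =
        pm.queryIntentActivities(Intent(customAction), 0).isNotEmpty()

    // e.g. appLikelyInstalled(packageManager, "com.example.SOME_CUSTOM_ACTION")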
Ah yes, the old "it's not tracking, it's to prevent ad fraud". What harm are you trying to prevent by blocking "rooted users"? Do you think it was common for Windows applications to refuse to run on Administrator accounts?
> What harm are you trying to prevent by blocking "rooted users"?
Bypassing $streaming-app DRM / accessing its downloaded files for offline viewing (to copy elsewhere and keep forever) is a fair reason, I think (better if it would just disable that feature, of course).
No idea why banking apps don't like it though. I assume they (or rather some manager, presumably not their Android devs) think 'rooted, uh-oh, hacker!' or something.
It’s wild that content producers stoop to “the only way for me to protect my shitty content on your phone is for you to no longer be in control of your own phone.”
But it’s even more wild that the industry laps it up. Thank god the tech isn’t actually good enough to truly achieve the vision - a corporate wet dream and a consumer dystopian nightmare.
It's one of those things about privacy like Steve Jobs said: "Privacy is knowing what you sign up for". A lot of the time, the creepiness comes from poor communication. It's fair for people to assume something bad.
I don't think it's a communication issue, I think it's a phone/system permission issue.
Somehow we changed our view on permissions and now let users allow or deny apps access to our files (storage permission), location, and camera/microphone... but why the hell do we let them do all the other stuff (like this)?
There should be a giant popup saying "TikTok wants to see the list of all the installed applications on your phone, do you want to allow that? Yes once / not now / not ever" and it's done. Then have the developers deal with the "monetizers" if it's worth it for millions of people to see that popup and question why a video app needs such a permission.
This outrage against the invasion of our privacy is good. I just wish Americans would consistently extend this to US BigTech too and feel the same level of outrage. TikTok is not doing anything different from what Microsoft, Apple, Google, Meta and Amazon do. Over the last 2 decades, Windows, macOS, iOS and Android have been harvesting every bit of our personal information from our devices: the file name / metadata of every file you create on your system is transferred to their servers (either as part of their "cloud" feature or under the terms of their malware detection system); the OS continuously monitors what applications you run and when you run them, and informs on you; your geographic location is sent every time you start your OS and repeatedly when you connect to the internet; every time you connect to any network (Bluetooth, WiFi, LAN, internet, etc.) the OS relays this information to its creator along with the devices involved; and so on...
I am not saying this is a valid excuse to condone TikTok's privacy abuses in any manner. Just that we shouldn't lose focus that these types of abuses are becoming more and more common, and TikTok is just another symptom of it. The solution is not banning TikTok or some other company tomorrow, but a strong privacy preserving legislation to cover BigTech and future startups (irrespective of their origin country). This is a human rights issue - not a chinese or US BigTech one.
> the file name / metadata of every file you create on your system is transferred to their servers (either as part of their "cloud" feature or under the terms of their malware detection system)
Source? This seems like it would be a pretty extreme breach of privacy.
Their ToS mention collecting file names but not necessarily uploading them to the cloud. They may be pseudonymised or hashed before uploading or they may not be uploaded at all.
Scroll down to the section "Information we collect automatically" and the sub-section "Device Information".
This is what it says:
Device Information
We collect certain information about the device you use to access the Platform, such as your IP address, user agent, mobile carrier, time zone settings, identifiers for advertising purposes, model of your device, the device system, network type, device IDs, your screen resolution and operating system, app and file names and types, keystroke patterns or rhythms, battery state, audio settings and connected audio devices. Where you log-in from multiple devices, we will be able to use your profile information to identify your activity across devices. We may also associate you with information collected from devices other than those you use to log-in to the Platform.
It can be a mechanism to detect bots, for sure. Many app platforms have serious trouble with people just hanging hundreds of phones on a wall with an auto clicker for ad fraud or other nefarious purposes. Being able to detect and shadow ban them would be a good way to use this technology.
I don't think keystroke patterns will be similar enough across devices to properly detect and track users, though they can certainly make an attempt. They'll be able to detect users on shared devices, but I don't see that adding much value to their tracking.
I have a vaguely reconstructed memory, dating to the 80s, of Michael Crichton having written (related?) AppleSoft BASIC code that treated the intra-key delay on password entry as biometric validation that it was entered by the right party.
Does that mean they can collect all the filenames on the device?
Personally I'd be more worried about these: "keystroke patterns or rhythms" and "Where you log-in from multiple devices, we will be able to use your profile information to identify your activity across devices. We may also associate you with information collected from devices other than those you use to log-in to the Platform."
This is no longer possible on Android either, unless Google manually grants you this permission. You have to ask for it and explain why you need it when you publish the app. I wonder if they did that for TikTok.
All commercial app stores make exceptions for giants like Tiktok, Facebook, Netflix, Amazon, and anyone else profitable enough. If either Apple or Google were punishing all apps equally for behaviour that goes against the ToS, Facebook and Tiktok would've been killed years ago.
If Google kicks Tiktok out of the app store then that's going to cost them sales. They need to maintain an attractive platform more than they need to uphold the terms of service.
Apple risked banning Fortnite because the company started going into a direction that could cost them billions. The choice to ban is really no more than a calculation based on the cost of a lawsuit plus the probability of losing multiplied by the loss of revenue of said loss.
No, you can use third-party apps which list every app's permissions.
I just checked and TikTok doesn't have the QUERY_ALL_PACKAGES permission, so I suspect the privacy policy is just outdated or they are getting the info from third parties in some cases. Realistically, only launcher and antivirus-type apps are granted this permission nowadays because it's their core functionality, but some of these probably sell this data.
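For anyone who wants to check for themselves, here's a sketch of how such a tool can read another app's requested permissions. The TikTok package id in the example is what I believe it to be, so treat it as an assumption; on Android 11+ the inspecting tool also needs visibility of the target, which is why these tools tend to hold QUERY_ALL_PACKAGES themselves:

    import android.content.pm.PackageManager

    // Lists the permissions an installed app declares in its manifest.
    fun requestedPermissions(pm: PackageManager, pkg: String): List<String> {
        val info = pm.getPackageInfo(pkg, PackageManager.GET_PERMISSIONS)
        return info.requestedPermissions?.toList() ?: emptyList()
    }

    // e.g. "android.permission.QUERY_ALL_PACKAGES" in
    //      requestedPermissions(packageManager, "com.zhiliaoapp.musically")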
Could someone who knows international trade laws explain why the US Government allows Chinese social media apps to operate in the US while China bans American social media apps? I am genuinely curious about it.
To clarify my question further, "tit for tat" is an effective strategy in game theory. If China bans US social media companies, the optimum response from the US should be to ban Chinese social media companies. Trump went through a convoluted process rather than outright banning TikTok and then gave up. So I wonder whether there are any legal limits that stop the US from banning TikTok.
Because it depends on the context. Kill someone on the street - you are a criminal. Join the army and kill someone - you are a hero. Follow someone and collect data about him - it might be criminal. Make an app which does that - you are a respectable businessman.
I have root on my phone, is it possible to do something like a chroot jail just for TikTok? I'd love to know about it if there's interesting/useful ways to lock down and spoof the environments for apps on Android. I suspect there must be a way since I was able to get Google Pay to work despite being rooted.
What are we so afraid of with TikTok? If there's a real concern, someone propose a law and let's debate it.
It's so tiresome to just see senators personally attacking executives in Congressional testimony.
Imo we're afraid of: (1) the algorithm's influence on our kids' minds (a health risk), and (2) transfer and mining of personal data for dystopian uses we can imagine.
For 1, are we ready to regulate what individuals can do to their own minds (something we used to say books and TV could also hurt)? Maybe there's a case for it now. Let's debate it.
For 2, let's talk honestly about what we're okay with and what we aren't. Industry has decided these things are ok, but the law isn't super clear on it. It's mostly based on GDPR:
- it's okay to use personal data to fix bugs and provide the service the user expects
- it's okay to use it to secure the service and avoid fraud, etc.
- with consent, it's okay to use it for product development studies that inform how you build future features
- with more specific consent, it's okay to use it for targeted marketing in the app or other places online
- regardless of consent, any government whose country you care about can force you to hand the data over to law enforcement authorities.
Another complexity comes in with the vendors you share the data with for all of these purposes.
I think the biggest deal is that there is no wall between the data and the CCP; it is currently 100% available to them, and that means they are essentially data-mining an entire generation, some of them too dumb to realize that what they do now can and will be used against them in the future.
The American government is doing the same thing the EU government has been doing for years: setting up defences against influences from foreign tech companies.
Most of the world's social media used to come from the USA and that wasn't a problem for the American government because they could control those companies. The EU was all up in arms against this but the USA did not care as there was little need to give any concessions.
The result was the failing of Privacy Shield and more and more American companies and services now becoming illegal in the EU. I bet the GDPR wouldn't have even existed if it weren't for Facebook and Google.
The fact that your Congress is so afraid of what American tech companies have effectively been doing now happening to them is quite telling. It confirms that every other country's actions against American tech giants are warranted.
As for your points: the shortening of one's attention span is quite noticeable if you use these apps for a while and then take a break. The constant dopamine hits of videos you like are addicting and are definitely having an impact on children. This is incredibly hard to regulate and I expect this behaviour to continue for quite a while.
As for your second point, the less data you collect, sell, or transfer, the less likely it'll be that you run into trouble. If you want to be sure about what you can or can't do without a legal advisor, assume that wherever the law is vague about something, it's probably preventing you from doing something.
For most American states the privacy laws are incredibly forgiving as long as you make sure not to have any children under 13 on your platform (COPPA) or collect health information (HIPAA). California was the first state to set up privacy barriers, but they're not as strict as other countries' laws.
If you (plan to) cross borders, though, you'll have to stick to the lowest common denominator or do some very creative corporate structuring. In many cases this also means setting up subsidiaries that cannot share any data with your American company and physically store the data in the legislative area as many countries ban storing customers' private data in a foreign country (i.e. in the EU, in Russia, and I believe in India as well). If you're afraid of giving user data to a foreign government, don't operate under those governments.
If you want to mine data like you're describing, get yourself someone who can give you legal advice. Tiktok is breaking the law in many places but unlike you it's got enough millions to burn on an eventual lawsuit and settlement.
I'm no lawyer but the simplest answer I can give you is "stop mining so much data unless your legal team signs off on it".