The good: .aab files can be optimized by the Play Store for the device that is requesting them (for example, by stripping resources that don't apply to that particular device).

The bad: it will be more difficult for non-Google app distribution storefronts to jump-start their catalog by grabbing APKs from the Play Store, because they won't be able to get one neat APK per listing via some APK downloader. (For apps that do want to get listed on those storefronts, life won't be very different.)

The ugly: APK distribution is a "zero-trust" model in which neither the developer nor the user has to trust the store not to make changes to the application. In fact, that's what prevents the kinds of "good" optimizations mentioned above: Google can't reach into an APK to strip resources that are irrelevant to a particular device, because doing so would invalidate the APK's signature. Forcing apps to be deployed with keys under Google's control breaks this trust model. The Play Store no longer guarantees through cryptography that APKs haven't been tampered with between the developer's build system and the recipient device.
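
To make the lost guarantee concrete, here's a minimal sketch (Kotlin, API 28+; the helper name and the published fingerprint are made up for illustration) of pinning an installed app's signing certificate to a SHA-256 fingerprint the developer publishes out of band, which is exactly the kind of check that stops meaning much once the store holds the keys:

    import android.content.pm.PackageManager
    import java.security.MessageDigest

    // Sketch only (API 28+): compare an installed app's signing
    // certificate against a fingerprint the developer publishes
    // out of band; matchesPublishedKey is a made-up helper name.
    fun matchesPublishedKey(pm: PackageManager, pkg: String, publishedSha256: String): Boolean {
        val info = pm.getPackageInfo(pkg, PackageManager.GET_SIGNING_CERTIFICATES)
        val cert = info.signingInfo.apkContentsSigners.first().toByteArray()
        val hex = MessageDigest.getInstance("SHA-256").digest(cert)
            .joinToString(":") { "%02X".format(it) }
        return hex.equals(publishedSha256, ignoreCase = true)
    }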


As a side note, this is how the Windows Store (the old one) used to work: a cloud compiler generated native code specific to the device.

The whole JIT/AOT infrastructure Android has had from version 5 onwards, Microsoft did in the cloud instead.

Hence Windows Phone always tended to be snappier than comparable Android devices.

https://channel9.msdn.com/Shows/Going+Deep/Mani-Ramaswamy-an...

https://channel9.msdn.com/Shows/Going+Deep/Inside-NET-Native

It's also how iOS and watchOS apps have worked since app thinning and bitcode were introduced as the packaging format.


Idk about Windows, but yes, the App Store has been doing it this way from the start. I suspect this is a case of Apple envy.


Unfortunately the Apple envy doesn't go as far as providing proper C++ tooling instead of NDK pain, modern 3D frameworks with GPGPU debugging instead of cloning Vulkan GitHub repos, SIMD frameworks instead of low-level NEON intrinsics plus NDK pain, a modern language that actually supports value types (they don't care about Java anyway), a proper real-time audio API that is part of the NDK instead of yet another GitHub repo to clone, a mostly useless Wear OS, ...

Yep, Apple envy could do some good for the development experience on Android.


About the last point, I am not familiar with the key distribution there...

Say Google wanted to create an eavesdropping Facebook Messenger, couldn't it hide the real one and replace it with an app of the same name, signed by an entity named "Facebook Inc [random invisible unicode character]" and essentially do the same thing?

I always assumed that the APK security model did not protect against a compromised Google Store?
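
To make the premise concrete, a quick sketch (\u200B is a zero-width space; the names are hypothetical):

    // Sketch: a zero-width space makes two names distinct to the system
    // while rendering identically to the eye.
    val real = "Facebook Inc"
    val fake = "Facebook Inc\u200B"
    println(real == fake)  // false: different identities to any system
    println(fake)          // looks exactly like "Facebook Inc"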


If my understanding is correct, it wouldn't protect new installations of the app in your hypothetical; however, any attempt to update existing users' installs would fail.


That is correct. An APK is signed with its developer's key. When you first install an app, the system trusts that key. If you later update it (by installing an APK with the same package ID on top of the existing one), the key must be exactly the same for the update to succeed. The only way to install an app with the same package ID but a different signature is to uninstall the existing one first. This is done to protect the potentially sensitive data the app stores in its dedicated directory under /data/data/.
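
A rough sketch of that rule (illustrative only; the real check lives inside the platform's package manager, and the function name here is made up):

    // Illustrative only: on update, the new APK's signing certificates
    // must match the set recorded at first install, byte for byte
    // (order-sensitive comparison here for brevity).
    fun updateAllowed(installed: List<ByteArray>, update: List<ByteArray>): Boolean {
        if (installed.size != update.size) return false
        return installed.zip(update).all { (a, b) -> a.contentEquals(b) }
    }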


What happens if your key is compromised? Is there a way to continue with a new key or does everyone have to reinstall your app?


V3 signatures do support key rotation, iirc, but they're only supported on the last several Android releases. Their existence hasn't even been officially announced by Google yet. So, yeah, as of right now, everyone would have to reinstall the app.
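
For what it's worth, on devices that do support it, the rotation lineage is visible to verifiers; a sketch (API 28+, made-up helper name) of accepting any certificate in the history:

    import android.content.pm.PackageManager

    // Sketch (API 28+): with v3 rotation the platform records a lineage
    // of certificates, so a verifier can accept rotated-out certs too.
    fun signedByTrustedKey(pm: PackageManager, pkg: String, trustedCert: ByteArray): Boolean {
        val si = pm.getPackageInfo(pkg, PackageManager.GET_SIGNING_CERTIFICATES).signingInfo
        val certs = if (si.hasMultipleSigners()) si.apkContentsSigners
                    else si.signingCertificateHistory  // current + past certs
        return certs.any { it.toByteArray().contentEquals(trustedCert) }
    }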


That seems like a very interesting attack vector. Compromise a key, have a replacement/fake ready. In the ensuing confusion, you trick people into installing your app.


It sort of does. The new app in your case would be signed by a different key and so wouldn't have access to the existing app's data. It would boil down to a phishing attack - the new app would have to impersonate the UI of the old one and get users to log in again.

Hence my concern with this part of the article:

"While it’s unlikely Google would ever do so, it is possible that it could sign apps on behalf of a developer"

Actually, given the trend the company has been on over the past 6 years, I'd say it's very likely Google would do this ...


I don't think the "ugly" point is completely true; there's another, optional level of signing that uses keys under the developer's sole control, approximating the APK model: https://developer.android.com/guide/app-bundle/code-transpar...


From the linked documentation:

> The code transparency file does not verify resources, assets, the Android Manifest, or any other files that are not DEX files or native libraries contained in the lib/ folder.

> Important: The Android OS does not verify code transparency files at install time, and continues to rely on the APK signing schemes for verification of any installed APKs.

Do I understand it right that it basically means the user's device doesn't care about this signature at all? Wouldn't that make it pretty easy to supply modified versions of the app to specific devices without them noticing, unless users decide to check signatures by themselves?


Right, it's not checked by the device. Its purpose seems to be more forensic: say an app version turned out malicious, and the developer said "it wasn't me, Google was pwn3d and used my keys to sign malware." Code transparency would allow the developer to prove that tampering had occurred between their build of the app bundle and the user's installation. It looks more like an accountability mechanism in case of failure than a protection.
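
A sketch of where that forensic check starts, assuming you've already extracted the transparency JWT from the bundle or APK: decoding its payload shows the digests the developer attested to (full verification is what bundletool's check-transparency command does):

    import java.util.Base64

    // Sketch: the code transparency file is a JWS (JWT) whose payload
    // lists SHA-256 digests of the bundle's DEX files and native libs.
    // Verifying the JWT signature against the developer's transparency
    // key is elided; bundletool's check-transparency does the full job.
    fun transparencyPayload(jwt: String): String {
        val payload = jwt.split(".")[1]  // header.payload.signature
        return String(Base64.getUrlDecoder().decode(payload))
    }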


That's right, the transparency signature is not checked during installation. If you think somebody at Google might be out to get you and has all those powers and resources, there are many ways other than serving bespoke bundles, even with good old APKs. Since they control the store and the system software that goes with it, they could download and save the original APK to give you a false sense of security, but apply a patch before extraction or code compilation. Or they could just crack the developer's signing key using spare processing resources across their fleet, at least for anything comparable to RSA-1024 or weaker. Those three attacks are theoretically feasible, but, I think, all equally hard to pull off without detection inside or outside the company.


At least they give you the signing certificate, so you can verify the end user is running a Play Store build and not something potentially malicious. Testing release builds requires a round trip through the Play Store, though, and apparently AOT compilation is only triggered when the Play Store installs its blessed version of the app.


Users trust Google more than they do random app devs in many/most cases.


This isn't about trusting the app dev at all; it's about trusting that Google didn't tamper with anything. Whether or not you trust a dev, you can't trust that you are getting what the dev intended. If you download a version of Signal that Google silently changed to include a backdoor, it doesn't matter whether you trust Signal or not: Google will get all of your data.


I would not be surprised if Google force-monetised apps like they did with YouTube videos.


Totally agreed. However, for the few cases where it matters, IMO it's a regression.


I do not find "The ugly" very compelling, since most developers are not very good at keeping their signing keys secure.



