This is the problem with choice. All your choices are secretly malicious and have incentives to violate your privacy. Remember those flashlight apps that ran in the background, consumed CPU and data, and stole users' personal information? It's just generally a bad idea to rely on untrusted third parties for core functionality.
I would argue that, if you really want to talk about people "having a choice", you need a similar concept: you are "free to choose" if and only if your choices are indeed what you understand them to be. Which means some form of regulation or curation needs to exist to enforce that.
Of course, this doesn't have to be at all the same thing as the sort of "curation for quality" that the iOS App Store gets up to. Instead, more like FDA labelling requirements on drugs: list your active ingredients or get out.
Consider a hypothetical policy: "whatever misapprehensions a consumer has, due to your marketing, are your fault; a complaint that cites your own marketing as the source of such a misapprehension, and which we ascertain to be valid, will result in a ban of all your apps from the store."
Can you make the "I Am Rich" app? Sure; it does exactly what it says on the tin—proves you're rich by costing a fortune. Can you make a flashlight app that asks for your contact info? Nope; customers weren't expecting the app to ask. Banned. Even if you never send that info anywhere.