Python grapples with Apple App Store rejections (lwn.net)
254 points by leephillips on June 27, 2024 | hide | past | favorite | 110 comments



It's not just Apple that pulls shenanigans like this.

Try building a Python app with PyInstaller while you have Windows Defender live scanning on, which is the default setting. You won't even be able to compile a binary without Defender preventing you from doing so.

Similarly, try running the binary produced by PyInstaller with Windows Defender on. Defender will say it's malicious and won't run it.

It's a bit dystopian that both major OS platforms go out of their way to prevent you from distributing and running your Python apps.


It's not just Python apps. It's anything by small-time developers without expensive certificates. I once used MSVC to compile a C program that was little more than a "Hello, World", and Defender called it the Win32/Wacatac Trojan.


The code signing certificates that MS requires are ridiculously expensive. It's a cartel of certificate issuers. It's downright robbery. We need something similar to Let's Encrypt for code signing.


SignPath gives free certs to OSS projects:

https://signpath.org/about/

We (sqlitebrowser.org) have recently started using them for signing our Windows builds.


If it's limited to OSS projects then it's not like LetsEncrypt


Fair point. For OSS projects it's better than the alternatives though. ;)


Defender calls anything Wacatac.

Ironically I've seen tons of actual malware that doesn't even give the slightest warning.


Even when legit malware gets flagged as "Wacatac" some percentage of users are sure to google the name, see that for years (if not decades) MS has wrongly flagged a ton of legitimate software as being that virus and then whitelist the actual malware on their machine assuming that Microsoft must have just screwed up again. I'm not surprised that MS hasn't fixed the problem after all this time, just disappointed.


This is how malware propagates. Most apps you get from questionable places have instructions that say to disable your antivirus. I get it... but I don't want to play a game that badly.


I made a single for loop in Go at work to show a coworker, that binary got flagged as malware.


> It's not just Python apps. It's anything by small-time developers without expensive certificates.

This is definitely the case and has been my experience, as well.

We live in some dark times when it comes to building and sharing anything as small developers, especially if the things you're building are free.

I stopped updating my open-source Mac apps because I can't justify the cost of jumping over artificial hurdles Apple puts in place that ensure users can't run the apps they want to use. I have other hobbies where spending money actually gives me tangible goods and benefits versus paying an arbitrary yearly tax for the privilege to build stuff that ultimately benefits Apple.


> stopped updating my open-source Mac apps because I can't justify the cost of jumping over artificial hurdles Apple puts in place that ensure users can't run the apps they want to use.

Hah, that's the exact reason I stopped using OS X and went full Linux on my old MacBook Air about 8 years ago.


This was the case for our open source app as well. The only reason we're on Apple is one of our users likes the app so much, they handle the certificates.

Which is deeply ironic, if you think about it.


It is super interesting that someone who is not you can take care of proving to Apple that you are really you so that Apple can assert to all other users that they have verified that you are really you because they made you prove it.

This world is just awesome. :)


I think Homebrew is the best solution for shipping open source Mac apps if you don't want to pay the developer fee or jump through any hurdles, assuming your users are technical enough to use it.

The alternative is not signing your binaries and explaining to users that they can run them by right clicking and selecting "Open" from the menu.


Unfortunately, getting users to install Homebrew is a hurdle that's hard to pass for what I'm dealing with. It's a non-starter if users have to open a terminal to install anything, even though Homebrew has a .pkg installer now. The users typically don't know what a terminal even is.

> The alternative is not signing your binaries and explaining to users that they can run them by right clicking and selecting "Open" from the menu.

That's what I'm doing now, and it's still an issue, unfortunately. Non-power users are not going to remember the right-click -> Open ritual when they just double-click on everything else.

And the warnings Gatekeeper shows also caused users to think their apps, and even computers, were broken or hacked.


Why aren't you just... paying the $100 a year to sign your app?


> I stopped updating my open-source Mac apps because I can't justify the cost of jumping over artificial hurdles Apple puts in place that ensure users can't run the apps they want to use. I have other hobbies where spending money actually gives me tangible goods and benefits versus paying an arbitrary yearly tax for the privilege to build stuff that ultimately benefits Apple.


This is not my experience but maybe I'm doing something different. I ship an electron app. I build it into an installer with electron-builder. I'm not sure I set any configuration settings. It's set to install in the user's folders, not at the system level. My understanding is that's allowed and just works.


This is usually caused by their machine learning virus scanner. For some reason it determines basically anything that you compile yourself to be a virus. Can't miss a virus if you call everything a virus, I guess.


Tbf the gamedev community has seen people submit Trojans to game jams


This is not relevant to the topic.


No, it is. Running unsigned exes with no sandbox actually is risky, and game jams are a specific case where it comes up.


Running signed binaries isn't safe either, as the corresponding signing keys leak all the time [1].

Anyone can just get their malware signed by just throwing some dollars at it.

[1] https://bugs.chromium.org/p/apvi/issues/detail?id=100


Okay?


Okay :)


To be fair to Windows Defender, a PyInstaller binary does look like malware. If it didn't do this out of the box, it would become a standard way for malware to be distributed very quickly. Unfortunately, for every person trying to compile or run a valid Python program from the net on a Windows PC, there are a thousand malware/Trojan instances trying to infect one.

Ideally code signing / validation would be cheaper

https://www.reddit.com/r/learnpython/comments/e99bhe/comment...


This is reason enough not to use either windows or macos. Calling these programs malicious is an outright lie. They have not proven this. At most, they can say the program is untrusted or unverified. But calling it malicious is a falsehood and therefore a breach of trust.


> that both major OS platforms go out of their way to prevent you from distributing and running your Python apps

This goes beyond just Python apps. The signing procedures to get past Windows SmartScreen require you to buy certificates and/or deeply integrate with the golden path of Windows development. Stray from the Microsoft-approved path and all your users will get a nice scary warning. Part of it is justified by security concerns, but there is a bitter taste of major OSs pushing "the" way to develop.


I built PyInstaller binaries on Windows a few months ago and had no trouble with Windows Defender at all.


This was my experience as of two weeks ago using a pyenv-compiled Python 3.12.4 and the latest PyInstaller on a fresh install of Windows 11.

If you used the Microsoft-signed or Python Software Foundation-signed Python binaries, maybe it doesn't trip the Defender alarm, same thing if you used older versions of Python or PyInstaller. 3.12.4 only came out on the 6th.


Yup, it mostly works, but not reliably. I did this for a few years for a cross-platform proprietary PyQt app with PyInstaller. Signing helped a lot, but the release process still included submitting the binary to a website that checks it against most known antivirus software, just to be more certain we didn't ship a dud for most Windows users. Interestingly, sometimes a rebuild fixed the issue. Facepalm.


MS Defender _is_ malware. Avoid at all costs. /s


Windows isn't a platform for developers. It's a platform for normie consoomers. Isn't that obvious?

If you want an engineering OS, use GNU/Linux.


Until you can run a Windows-free build system with WINE and PyInstaller (there are a few reported blockers; several others and I have tried), cross-platform apps will require developers to compile their Windows ports on Windows itself.

Windows is where the users are. Not targeting it is a bad financial decision.


For the past 10 years we've shipped py2exe-based Win32/Win64 software from WINE / Ubuntu. There are no problems, and we don't sign it (https://www.vintech.bg).


Thanks for the info. I was hesitant to introduce more platform-specific tools, which is why I stuck with PyInstaller. I'll see if py2exe suits my needs.


To hell with cross-platform. Developing for Windows perpetuates a harmful ecosystem. It's really no different than selling ammo to the Sinaloa drug cartel.


It’s a little bit different.


Yes that's easy advice to follow when you don't have a job


What do you mean?

At my job, we build cross-platform software, but we build it on Linux. It runs on Windows, but building it on Windows is torture, so almost all our dev machines are Linux.


I hope one day I won't have a job!


I've met an uncomfortably high number of Mac users who cannot navigate the file system on their Macs.


The multi-billion-dollar company that I work for doesn't get this. I'm forced to do all my dev work on a virtual Windows machine. They have their reasons, many of them valid, but it's still a pain.


did you download your personality from /g/


I thought this was interesting

> Alex Gaynor suggested that the project try an approach that Keith-Magee had not put forward, inspired by Gaynor's experience with the cryptography library. The project often receives complaints that the library refuses to parse a certificate that is technically invalid, but was in wide use. He said that the policy was to accept pull requests that work around those issues ""provided they are small, localized, and generally aren't too awful"". But, he added, these patches should only be accepted on the condition that someone complains to the third party (in this case Apple), and extracts some kind of commitment that they would do something about it. He suggested that the workaround be time-limited, to give users a decent experience ""while also not letting large firms simply externalize their bizarre issues onto OSS projects"".

as a solution to the familiar problem of users wanting OSS to work around bugs in commercial software because OSS maintainers are easier to bully and they know bug reports to Megacorp go straight to a black hole.


Seems like the quote is a non-statement given that Apple's bug report system is a black hole. Big talk as usual from Pythonistas without any connection to reality.


Why can’t Apple just add “itms-services” as a forbidden URL scheme on a sandbox level? I don’t see why the App Sandbox can’t block (and isn’t already blocking) certain protocols.

Heck, what if I have a malicious web frame inside my app that tries to invoke “itms-services”, similar to this Polyfill.io debacle?


I’m not sure what the big deal with the url handler is, but I can’t imagine it causing remote code execution or other actual malicious behaviour.

At this point Apple seems to be using simple substring matches, so if there is any exploit vector the malware authors can circumvent the check using "itms" + "-services" or something more sophisticated like ROT13.
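Purely as an illustration of that point (and noting, per the thread, that Apple frowns on obfuscation), a short sketch of how trivially a substring check can be dodged:

```python
import codecs

# Illustrative only: the literal string "itms-services" never appears
# in the source; only its ROT13 form does.
obfuscated = "vgzf-freivprf"
scheme = codecs.decode(obfuscated, "rot13")
assert scheme == "itms-services"

# Plain concatenation defeats the same substring check just as easily.
assert "itms" + "-services" == scheme
```

Neither form would trip a scanner that only looks for the literal byte sequence, which is why a string match says little about what the app actually does at runtime.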


Which is also just why… the App Store review process claiming this is a problem doesn’t seem to make any sense.

Imagine your app embeds a WebView at myapp.com/terms. It’s your Terms of Service, you show it to everyone when they sign up. Everyone clicks OK.

After it’s on the App Store, you modify your WebView to include `itms-services` for some reason. You’ve just completely bypassed App Store review and gotten that URL handler into your app. The sandbox should stop you - but clearly the review processes don’t consider this possibility and are enforcing it before you publish. Why?

My point is that the scanning for this handler, if Apple doesn’t want to allow it, seems misplaced if they wanted the ban to actually be effective.


The review process is about Apple giving you a clear direction about what is acceptable.

There are many ways to get around their restrictions. But doing so will get you banned.


I haven't heard anything to suggest that Apple intends the review process to give clear direction.


They refused to tell him why it was rejected, hardly "clear direction"


> The app installed or launched executable code. Specifically, the app uses the itms-services URL scheme to install an app.

Seems like pretty clear direction to me:

https://github.com/python/cpython/issues/120522


The app in question doesn't actually do any such thing, though.


The wording could be better.

But it has been known for over a decade now that Apple searches binaries for strings. They've never done runtime execution checks which would pick up you actually making such an HTTP call.


My baseline would be similar to that which you get from modern compilers and automated tests. Something like:

    Lib/urllib/parse.py contains disallowed string "itms-services" at line 62 column 25
    Reason: Apps may not install or launch executable code, such as through the itms-services URI scheme.
To get credit for "giving you a clear direction", I'd want them to make available and suggest a fix. In general that could be "if this is a false positive, click to request a human reviewer make an exemption" but ideally monitoring for this kind of issue in the first place (sudden increase in identical rejections) and fixing the broken check.

Instead, Apple seem to omit the information on the first line that they likely already get from their internal tool, and not only don't suggest a fix but make the proper fix so unavailable that it's easier to get the language itself changed than a check in their review framework.

> But it has been known for over a decade now that Apple searches binaries for strings. They've never done runtime execution checks which would pick up you actually making such an HTTP call.

It's true that someone with folk knowledge about the way Apple does checks, gleaned about the process by other frustrated users, could likely infer the way in which Apple's test is broken and so eventually deduce the first line from the second line. That's not Apple giving a clear direction - that's developers managing to work around an inscrutable system.
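A diagnostic of the kind proposed above takes very little machinery. A minimal sketch (the disallowed string, file names, and message format are all hypothetical, not anything Apple documents):

```python
from pathlib import Path

DISALLOWED = "itms-services"


def scan(path: Path) -> list[str]:
    """Report each occurrence of the disallowed string with its position."""
    findings = []
    for lineno, line in enumerate(
        path.read_text(errors="replace").splitlines(), start=1
    ):
        col = line.find(DISALLOWED)
        while col != -1:
            findings.append(
                f'{path.name} contains disallowed string "{DISALLOWED}" '
                f"at line {lineno} column {col + 1}"
            )
            col = line.find(DISALLOWED, col + 1)
    return findings
```

Running this over a file containing the string yields exactly the file/line/column message the commenter asks for, which is presumably close to what Apple's internal tooling already computes and omits.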


Such a check would be possible if:

1. Apple received a copy of your source code to do source analysis

2. Apple actually supported running python code as an option for writing iOS apps and wrote source code analysis tools for python for reporting compliance issues

neither of these are true.


I'm not suggesting any language-specific features like telling you which function it's in - just the filename, position, and matched string. This is information already available to Apple, and can be useful even for compiled/object code.

From the linked Github issue:

> After lots of 'we can provide you with no further information' I finally submitted an appeal for the rejection which at last resulted in Apple telling me that parse.py and its .pyc were the offending files


It does the exact opposite of clear direction. It's an arbitrary black box


Python just has to rot13 the scheme and hash the encrypted schemes.


That came up in the original post: whether some obfuscation could be an acceptable workaround. But it turned out that Apple really doesn't like obfuscation techniques. The workaround for now is a compiler flag that excludes the problematic code from iOS builds.

I’m still asking though why, if Apple doesn’t like apps which use this protocol, the sandbox is not intervening; as surely an approved app could have a malicious web view.


Sandboxed apps are allowed to use itms-services:// links, it's just not allowed in the App Store - iOS enterprise apps using in-house deployments can use it for installs and updates, and sandboxed Mac apps deployed outside of the App Store can use it as well.

However, App Review Guidelines forbid App Store apps from installing other apps, so that scheme gets scanned during review.


That makes some sense, but then I have a new question: Why doesn’t Apple have different certificate schemes for in-house versus App Store (if they don’t already)? In which case, the iOS Sandbox should be smart enough, and probably does already delineate, allowed functionality based on whether an app comes via App Store or via a private deployment.


> Why doesn’t Apple have different certificate schemes for in-house versus App Store (if they don’t already)?

Enterprise distribution allows you to deploy applications to corporate managed devices with no App Store review whatsoever. The restrictions are mostly in the business agreement in who you can provide enterprise distribution to (e.g. employees and contractors) and on what your apps can do. The justification is that the enterprise has a relationship with the employee/contractor, is ultimately on the hook for abuses/harms they do via MDM on employee devices.

This is the situation that led to both Facebook and Google having their enterprise accounts banned temporarily a few years ago, as they were each using them as part of a market analysis program - they offered to install VPN software onto consumer devices that monitored web and third-party app usage. Such monitoring is not allowed even for employees per the enterprise developer account agreement.


This can be handled by granting privileges to open that scheme to enterprise Apps and not granting to regular App Store apps. Relying on string scanning is simply not secure.


Welcome to Apple's much applauded security model.

More seriously, I'm sure they also prevent the privilege to that URI scheme. This is likely part of some ill-thought defense-in-depth approach. Same way they search for the names of private symbols in the exec, even when the linker will outright refuse to give you those. I absolutely detest this pervasiveness of useless layers of security that add almost nothing. But since almost nothing is not nothing, no one can remove any of them. Like cockroach papers, I'm going to call them "cockroach security". Practically everything is infested with those these days.


> The workaround for now is a compiler flag that excludes problematic code from iOS builds.

That sounds like the correct fix, in that it is least likely to get your developer account banned


The URL is likely used from within Apple frameworks for various purposes, and therefore it's possible for an app process to open the URL even without the app itself knowing about the URL.


Can we have Separation of Powers on our digital platforms?

It is pretty shitty that the one who sells phones also determines what goes on them.


Are you asking for mainstream phones with no operating system on them?


I choose Apple to be the custodian of apps that go on my phone. There is a platform where the user is the one that determines what goes on their phone, it's called Android.


While I am glad that you have the choice to be a digital serf - paying your lord for using land (devices), I would much rather prefer the choice to not be one.

That way, you can opt into digital serfdom and people like me can opt out.

-

Serf here isn’t pejorative, but descriptive. Apple is behaving like a feudal lord. Down to the claims of protecting the land and arbitrating disputes.


Why do you act like Android doesn't exist? You _can_ opt out of "digital serfdom".


How is Android any different?


In some cases one can install a de-googlified version of android, and install whatever app (.apk) they wish through alternative app stores or directly.

For 99.9% of people, Android and iOS will be virtually identical in terms of freedom, as they will just install apps from the play store and use google-services on the manufacturer-provided (and -bloated) android install that comes with their phone.

That being said, for those who do care, the ability to take control of your phone and run AOSP, an actually FOSS distribution, and only run FOSS apps, or install whatever app you want, is unparalleled on Android vs iOS.


Android is open source, there are several freedom respecting distributions of it. You can even get mostly freedom respecting hardware.


Your line of thinking has so much wrong with it that it would be a waste of time trying to argue with you because you cannot be helped.


Sure, so why should everyone else with an iOS device have to make the same choice?


Those with an iOS device can make the exact same choice - to get an Android device. If I don't want Apple to be my custodian, that's the choice I have.

I don't understand why people feel the need to force Apple into a specific strategy when they are not a monopoly. They aren't the only game in town, you don't _need_ an iPhone. Every single person who has an iPhone has chosen to have that device.

Perhaps the reason people don't like the idea that if you want choice you choose the platform that gives you choice, is because Android is a steaming pile of garbage.


>Every single person who has an iPhone has chosen to have that device.

I didn't want an iphone. I was forced to get one, because Apple won't allow any web browser on their platform except their web browser. So I can't deliver a working website for iphone users without having a real iphone to develop on. This is an absolutely abusive, anti-competitive and shitty artificial limitation Apple has forced on the world. I'd really rather just tell my users to install Chrome or Firefox and use that instead of being forced to use Safari. This isn't good for developers, and it's not good for their customers whether they know it or not.


They're not a monopoly; they're a duopoly. It is a marginally better position but I have no idea how you justify to yourself handing over a consumer's power to the largest tech company in the world.


Because it's a set of tradeoffs and not a strict better/worse?


Apple is perfectly capable of having a phone that runs your own software with minimal trade offs, they just don’t want to because they’re not about empowering users but about creating a class of dependent consumers.


It is sad when even here people dont care about the right to run whatever code you want in the hardware you own.


You're mistaken, you don't actually own it.


This is not true - you do own your phone if you pay for it outright. Apple cannot take away your phone.

As far as the software - Apple owns iOS and licenses it to you for your device.


You are mistaken.

I bought it; it's not a lease, and it's not a license to use it. I own the device, and I should be able to run whatever I want on it.

Stop spreading misinformation; you are thinking about the software.


Apple, the trillion dollar company, doesn't need you to defend their shit business practices for them.


> I don't understand why people feel the need to force Apple into a specific strategy when they are not a monopoly.

According to your argument, they __are__ a monopoly. I.e. one on digital serfdom. Because apparently Android is in a different field.


Obfuscation seems like a great way to get your developer account suspended. I suspect Apple is doing a lot more than just basic static analysis of the binary on disk.

Glad they went with a config option instead.


Depends. The actual rule being "violated" isn't that the app can't contain the string "itms-services". Rather it's:

  Guideline 2.5.2 - Performance - Software Requirements
  The app installed or launched executable code. Specifically, the app uses
  the itms-services URL scheme to install an app.
i.e. the app can't try to trigger an install of another App Store app. The app in question isn't doing that, it's just that the basic check is incompetent for the rule it's supposed to be checking and the reviewer isn't doing any manual checking after that to see if it was a false-positive. So obfuscating the string, if you're genuinely not trying to install other apps, should leave your app just as non-violating as it was before... just not tripping the badly written check.

Apple can of course be arbitrary and capricious after that point.


They are opaquely rejecting apps for just literally containing the string "itms-services" in the binary and you still give them credit for a more sophisticated analysis? Lol.


You’re assuming that’s all they are doing, and that it’s all they will ever do, but neither assumption is supported by any evidence.

Apple is saying what test broke, not that other tests aren’t running.


No. See the above article for more information, specifically the discussion linked [1].

> Some light obfuscation of the magic string appears to avoid the issue.

It is a simple string match causing the failure.

[1] https://discuss.python.org/t/handling-incompatibilities-with...


Fair enough, though I was really just saying that simply because they are doing something simple doesn't mean they don't (or, more importantly, won't) do something complex. In the LWN article discussion it's mentioned that Apple doesn't like obfuscation; presumably this means they can detect some forms of it.

Put another way: if it was my app, and this string wasn't important to me, I wouldn't want it obfuscated, I'd want it removed.


If they are doing more, why are the apps getting rejected?


We can assume that a simple string search is one of the basic checks they do before moving onto more advanced checks.


Every story I've seen over the years about MAS/iOS AS rejections point to their checks not being advanced at all.


Why can we assume that?


Because it’s less effort to implement?


The offending string is only there because Python’s urllib has a hard-coded list of schemes which use a hostname component or “netloc”. It’s fine for that list to contain known schemes from RFCs. Anything else — including proprietary third party schemes — should just use a heuristic.

The list is called uses_netloc and is used to help parse the user@host:port part of https, ftp, etc. domains. It’s this list of schemes that includes the forbidden string itms-services, used for Apple’s proprietary iTunes software.

The only code that needs this is urlunsplit and urljoin. If your parsed URL has a netloc then the list isn’t even relevant — if you have a netloc then you are assumed to be in uses_netloc.
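The behavior described above can be seen directly in `urllib.parse`. A small sketch (the exact round-trip behavior varies by CPython version; the `uses_netloc` membership is true in recent releases, where `itms-services` was added to the list):

```python
from urllib.parse import urlsplit, urlunsplit, uses_netloc

url = "itms-services://?action=download-manifest&url=https://example.com/app.plist"
parts = urlsplit(url)

# The authority component is present but empty: netloc is "".
print(parts.scheme, repr(parts.netloc), parts.query)

# uses_netloc is the hard-coded list discussed above; urlunsplit consults
# it to decide whether to re-emit the "//" for an empty netloc.
print("itms-services" in uses_netloc)
print(urlunsplit(parts))
```

The heuristic the commenter suggests would make the list unnecessary for any URL that actually carried an authority component when parsed.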

This all seems like a much more sensible approach than trying to selectively include or exclude naughty strings from the source code, per some corporation’s passive aggressive demands.


Why is the “ itms-services URL scheme” in base Python to begin with? Why does Python have code for interacting with iTunes out of the box?


Apparently it was another hack to prevent incorrect parsing of such URLs on the server side, without the actual interaction happening [1].

[1] https://github.com/python/cpython/issues/104139


This is from a fucking test case to detect hyphens in the URL scheme? Just change it to asdf-zxcv. Or remove test cases from end-user python installs. What are we doing here?


Why does urllib have this URL scheme anyway? If Python libraries are hard-coding knowledge about Apple proprietary stuff, then it should be no surprise that Apple may take issue with that.


I believe it was added as the `itms-services://?` URL format is non-standard and broke the existing logic in `urllib.parse`: https://github.com/python/cpython/pull/104312


My understanding per RFC 3986 is this is a valid URL.

The authority consists of a host with a blank name and blank path (path-abempty).

The issue is a lot of libraries do not make a distinction between a field explicitly being declared blank and being left undeclared (e.g. nil vs ""). Looks like rather than change python's URL library to recognize this distinction, they created an exception list that just hard-codes output behavior.

One can make a case that client software should be robust enough to handle both forms of URL (e.g. itms-service URLs are likely just URI, but the original developers thought all URI should have two forward slashes in them). I can say specifically for "scheme://" calls that such robustness on the recipient side is very rare, because if they understood URI well enough to be robust then they wouldn't have had two extra slashes that they will never use.
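The distinction described above, an authority that is declared but empty versus no authority at all, is exactly what `urllib.parse`'s result type fails to record. A small illustration:

```python
from urllib.parse import urlsplit

# "//" present: the authority component is declared, but empty.
declared_empty = urlsplit("itms-services://?action=download-manifest")

# No "//": there is no authority component at all.
absent = urlsplit("itms-services:?action=download-manifest")

# Both parse to netloc == "", so the RFC 3986 distinction between an
# empty host and an absent authority is lost in the parsed result.
print(repr(declared_empty.netloc), repr(absent.netloc))
print(declared_empty == absent)
```

Because the two parse identically, `urlunsplit` cannot reconstruct which form it was given, and the `uses_netloc` list exists to paper over that.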


A comment on the issue linked from the PR references https://url.spec.whatwg.org/#url-serializing as a rationale for why itms-services would normally not have the "//" and therefore needs the exception. However that standard says non-null, not non-empty¹, so it sounds like python's URL library is actually just wrong here in that it's treating an empty host the same as null.

¹Also I checked, that standard does actually define an empty host (https://url.spec.whatwg.org/#empty-host) as a distinct concept from a null host.


The article is pretty clear that it has this string because it's the Apple recommended method for launching a new app, and Python on MacOS does that. It's part of a standard library.



