Signing Your Apps for Gatekeeper (developer.apple.com)
211 points by atarian on Oct 20, 2018 | 132 comments

On the one hand, of all the big companies, Apple seems closest to “getting it” with respect to security.

On the other hand, NO entity is inherently trustworthy “forever”, nor should any entity have the power to be a unilateral Decider even on its own platform. Just like when a good restaurant may some day become bad under new management, we are just one “new management” away from Apple becoming something that maybe we don’t trust so much. This system is being set up to give “Apple” tremendous power for “all future definitions of Apple”, which is ridiculous. That wouldn’t make sense even if Apple were a perfect saint today, invulnerable to buggy software and disgruntled employees and other potential weaknesses.

We need a system whereby users decide which SET of entities they trust, one of which may include Apple, and which may even exclude Apple if the user so chooses. The complex mechanism for signing and verifying things should be open-source so it can be understood and validated and reproduced cross-platform. Then you decide who you trust, period. You can rely on others to help you determine what is trustworthy. Given this type of system, I would be fine with macOS saying “select at least one trusted source to enable software installations”, knowing that I ultimately decide what those authorities will be. I am not fine with their seeming “father knows best” approach.

That works for "us" HN geeks. The consumer market is all about convenience, and the moment the customers have to jump through some hoops, a new company will pop up that will do it as a service, which they then trust to do it properly. And while I'm all for decentralizing power, I'd rather have one accountable company (or a couple of them) handle this than a bunch of small independent ones, considering they are less auditable and more open to skewing the system. Maybe something like the W3 consortium, but we know how that turned out...

True, but there need not be a convenience tradeoff. Make Apple the default signing authority, but make it configurable by the user. Normal users need not ever see it, but they retain an opt out if it ever becomes necessary.

Normal users see that when they're social-engineered into doing something they later regret.

You can always disable gatekeeper and run some 3rd party version if you really wanted.

Yeah, I would love that. Don't think that will happen any time soon unless legally obliged to.

It’s trivial to disable gatekeeper. It’s just a radio button in settings.
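(For reference, a rough sketch of how this looks from the command line on macOS of this era; the app path is a placeholder, and the radio button lives in System Preferences > Security & Privacy:)

```shell
# Check whether Gatekeeper assessments are currently enforced
spctl --status            # typically prints "assessments enabled" or "assessments disabled"

# Turn Gatekeeper off entirely (equivalent to the "Anywhere" option)
sudo spctl --master-disable

# Or, instead of disabling globally, whitelist a single app (placeholder path)
sudo spctl --add /Applications/SomeApp.app

# Re-enable enforcement
sudo spctl --master-enable
```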

Microsoft has done this with drivers since Windows 10 (attestation signing). I think this is okay, since users hold Microsoft responsible if the kernel crashes, and this at least gives them the possibility to investigate.

Signing apps with normal (EV) code signing certificates is the best we have for other apps, I think. If you detect malware signed by a certificate, you blacklist that one and forward the information to the relevant criminal authorities. If they don't investigate, you blacklist the whole country. That last step is probably the most controversial one, but otherwise you still have a malware problem.

Sure but you CAN install unsigned drivers if you want.

Windows is pretty open about this, Apple is a walled garden. Safe but boring.

To paraphrase an infamous quote: Apple have root.

What good is any signing if you can't trust the OS? There are a myriad of ways they could undermine the verification.

Your position is based on an (implicit) false dilemma.

It's perfectly possible to "trust" Apple in the sense that you use their operating system and hardware after you've bought it. This kind of "trust" is similar to "agreeing with an EULA or TOS". There is not much of an alternative to it.

This does not imply that you therefore must also trust Apple as the one and only broker and manager of certificates for software running on your machine. You could, for example, suspect that Apple might revoke a certificate for some developer that creates a controversial product (e.g. a p2p client) or some product that competes with Apple products or some future plans of features. For example, Apple has in the past essentially copied innovative apps and revoked licences for the app store of the original developers. In theory, Apple could just erase such applications, since they "have root". In practice, they wouldn't do so. But they would and will revoke certificates and app store licences.

The new rules also make alternative app stores impossible, of course, although I'm not sure they were possible before.

In a nutshell, trust != trust. You can trust a hardware or software maker in some respects but not in others.

The primary goal of Gatekeeper is monopolization of secure access, not security.

> For example, Apple has in the past essentially copied innovative apps and revoked licences for the app store of the original developers.

Source? I develop in the Apple ecosystem and never heard of this. Would be scandalous.

“Heavily inspired” Apple apps, yes. Revoking certificates or access to developer program (which I guess you must mean by “App Store licenses”, because there’s no such thing otherwise), not that I’m aware of.

Sure, I agree that I may not necessarily want to trust the organisations they trust or indeed mistrust the organisations they mistrust. Or to put it another way, my web of trust may differ from theirs.

But the OP was for example talking about potentially wanting to "exclude Apple" which would, IMHO, be nonsensical on Apple's OS.

I'm not an Apple fanboy in any way, but how exactly is trusting Apple to hold root access to your system any worse than trusting the baseband in any other smartphone? The whole chain of trust is so fundamentally broken that there's no way you can trust any device, and Apple is at least no worse here.

In literally every phone, the baseband is not isolated: it runs its own OS and can do anything to the main OS even if you have root. On top of that, every ARM SoC has TrustZone, which supposedly can also run arbitrary code that you have zero control over.

I think you misunderstand me. If "new management" were to take over Apple and you no longer trusted them, why would you continue to trust their OS?

In that scenario code signing is irrelevant because an untrusted OS can undermine it.

Agree. I humbly suggest that you add "revokable trust" to your "opt-in trust" thesis.

So this is another step for Apple to try to get everyone paying them their software development tax, since "notarization" requires you paying the $100 annual fee.

While you can currently still run digitally signed software that is not notarized, this document admits that in the future signed software will require Apple's approval. So Apple will make everyone's software LESS secure (by pushing developers who aren't paying Apple to stop signing their code) so that it will continue running.


> So this is another step for Apple to try to get everyone paying them their software development tax, since "notarization" requires you paying the $100 annual fee.

I don't think Apple really cares about the fee. I know that the three platforms are now bundled together under one developer program, but suppose that there are 100,000 Mac developers that publish signed or App Store applications (which is probably an overestimation), the fees will make Apple 10 million. This is less than pocket change to them.

To me it seems that the $100 annual fee is to prevent fraud: (1) using a credit card to tie the developer account to an actual person; (2) make it expensive enough to avoid playing whack-a-mole with malicious account creation.

Indeed. Plus, don't underestimate the psychological barrier that a price can provide. $100 is a sweet spot: it keeps everyone and their dog from signing up just because ("it's free, I might give it a go"), but at the same time it's not too much for a motivated developer who wants to dabble in programming for the Apple ecosystem of devices.

>it's not too much for a motivated developer

...living in a first-world country.

We're talking about owners of a Mac. They're not likely to be that strapped for cash, regardless of where they live.

Plenty of people out there with hand-me-down or 2nd (or 3rd) hand Macs.

At a global level, those are called "rich people". I'm sure you can find some individual exception somewhere, but by and large every mac user will be able to scrape or borrow $100 by the time their software is developed, tested and ready to ship.

That is so hilariously out of touch I find it hard not to consider it wilful ignorance.

That seems uncharitable of you.

And will they be the same set of people proficient enough to make paid software on the Mac platform?

...paid? You think only paid software needs certification? Also, yes, they will be the same people anyway, not every worthwhile project gets VC funding. Also, unpaid software is _already_ suffering from the certification situation.

You don't pay to develop, just publish. So hobby developers are fine.

> it's not too much for a motivated developer

This is pretty subjective.

Ok, what entitles anyone to publish their software in the repository of a for-profit company? Why should it be open to everyone for free? You don't want to pay that fee? Cool, there's Android and many other open platforms to program for. I will never understand the sense of entitlement people have toward Apple. You are free not to buy Apple products and services if you think they are overpriced. Plus, no, it's not subjective. It's objectively low compared to many other access or licence fees a developer might pay in his or her professional life.

I'm not sure where you get a sense of entitlement from my comment. But my point about the fee is that even if it is low compared to other fees, that doesn't mean all developers who want to publish in the Apple ecosystem can afford to do so.

And making it expensive enough to turn down hobbyists or teenagers. I don’t think I agree with this approach.

The $100 is for distribution that Apple can provide via its store and signing. You can build a non-signed version or a local-only version for free.

Isn't all this achieved with a generic code signing certificate? What additional benefit does Apple's "notarization" offer, other than further fragmenting the market?

The page has a list. In short, you get a better experience with regards to Gatekeeper, and you will need to notarize your apps if you want them to continue to work on later versions of macOS.

As others have pointed out, you already have to pay the tax to get a trusted signing certificate. So as far as the economics go, this isn’t a change.

It's actually a big change. One issue is that Apple may not approve you (for whatever reason) or may keep you on the waitlist for weeks (happened to me). Not to mention that there won't be any competition. The next step would be to make Safari and any browser running on macOS accept only Apple TLS certificates. That would be "awesome"! Finally, Apple "secures" the internet.

It's a big change because Apple may keep you on the waitlist for weeks, which already happened to you?

Why is a time traveler speculating on the future?

You don't need to be a time traveller to see that Apple will be the only accepted issuer. I really don't understand how people can feel comfortable with that. The next step is making the browsers trust only Apple's TLS certificates. At some point Apple will have to "obey the law" and ban any application that violates either certain countries' laws or its own TOS.

I don't think Apple is making any significant money out of these $100 subscriptions. The payment fee is the best spam detector out there.

$100 is nothing compared with game consoles fees, or enterprise software licenses.

Hopefully Apple's platform continues to be a general purpose computing platform and not a console.

Interestingly enough, I only see such complaints from devs that are using OS X as a pretty Linux, and seldom from those that are actually selling OS X apps.

We all know that software devs love Apple because it's a walled garden. No competition from F-Droid or piracy to worry about.

We love Apple because of their Objective-C and Swift stacks and related OS APIs.

Having a POSIX layer is a minor detail.

Anyone who buys Apple but actually wants a GNU/Linux laptop should just buy one.

Your post is full of assumptions.

First of all, $100 per year merely covers the costs for this service.

Apple will never force apps to always be notarized, although they might enable admins to enable such a feature for security purposes which I think is great.

> Apple will never force apps to always be notarized...

Apple's website actually says that they will:

"Note that in an upcoming release of macOS, Gatekeeper will require Developer ID signed software to be notarized by Apple."

I think this is about what is required. To get the benefit of being signed (i.e. opening without user interaction to get past Gatekeeper), apps will in the future require being notarised.

So you will still be able to ship an unsigned app, but you won't be able to ship a signed, un-notarised one.

A pretty good blog summary of how this differs from past practice.


TLDR: You have to adopt the new OS user privacy protections. (Users must give permission for your app to access things like the webcam, microphone, contacts, photos, location data, etc.)

Your app gets scanned for malware before it is signed.

There is still no App Store approval process involved.

> You have to adopt the new OS user privacy protections.

Don't you have to do this anyways?

> Note that in an upcoming release of macOS, Gatekeeper will require Developer ID signed software to be notarized by Apple.

Does this mean that anything distributed outside of the App Store will have to be approved by Apple? Will the App Store sandboxing rules apply to outside apps too?

No, that's not what it means. Currently there are three security tiers for applications on macOS: App Store, not App Store but cryptographically signed, and not cryptographically signed. All three can be run.

That sentence means that in the future, the second category (applications which are not on the App Store but are cryptographically signed) will need you to generate a certificate with Apple. Nothing about the app store's sandboxing rules, and nothing about all applications -- you can run unsigned applications to your heart's content.

Weren't most developers already generating a certificate with Apple for the second category? I was paying Apple $99/yr when I was actively developing a non-App-Store app, largely just so I could get the app signed with my Developer ID. I tried reading into alternatives and it seemed like a risky process. Ideally I'd like to pay a bit less if I'm not using the benefits of app review, but the notarization process seems like a decent compromise.

Even on Windows, my experience is that signed non-Windows-Store apps can get flagged as malicious upon download if they're fairly niche / aren't used by many people; supposedly using an EV cert helps with that. The user experience is actually worse in my experience (Windows will show something red, then you have to click some non-obvious buttons to successfully run the app).

Yeah, Gatekeeper has always had Apple as the sole root of trust. What’s changing is that each app will now have to be uploaded to Apple to be ‘notarized’, rather than signing being a purely offline process with a previously obtained certificate.

> Even on windows my experience is that signed-non-windows-store apps can get flagged as malicious upon download

But on Windows, any approved CA can issue a certificate, not just Microsoft.

Apple’s new regime is not only restricted to Apple being the only CA, but that Apple is the only one who can sign the apps.

That’s immensely restrictive.

It is still writing on the wall. It makes no sense to tighten the security on signed apps if it is not to prevent unsigned apps from running at some point. Which is clearly the direction of travel (and what Microsoft also did with WinRT and Windows 10 S). I don't have to wait until the final destination to dislike the direction.

How do you run unsigned code on iOS? e.g. How can I build an Xcode project and have it run on my iPhone?

This article is about "Mac Apps", as the title states. I figure—unless posted elsewhere—that iOS hasn't changed and is still as restrictive as ever.

You cannot run unsigned code on iOS. However you can, for free, generate code-signing certs (that expire after 7 days).

C’mon, Apple will never allow you to run your own code without their say-so.

On macOS? You totally can. You can also do the same on iOS, but in a much more limited way.

You can now, but this is clearly a step in the direction of locking down the platform.

It's another slow gradual step toward total lockdown.

I feel like eventually I'll have to abandon Mac, but for what? Linux is still flaky and Windows adware unless you spring for Enterprise.

I would pay for a commercial Linux as polished as MacOS, but there may not be enough of me. (It could also have a list of officially supported hardware to at least approach the stability benefits of Apple's vertically integrated HW/SW stack.)

Not debating Apple's overall trajectory, but I think it's cynical to call this a step towards lockdown. Mitigating certificate forgery attacks ultimately increases trust in software distributed outside Apple's walled garden.

How is this not lock-down? If Apple doesn't like VLC or whatever software competes with their interests they can simply not renew their membership.

Are you talking about the old policy?

You can run unsigned apps. The new policy affects the process for signing apps.

But things like gdb just don’t work unless they’ve been signed. So, if Apple starts blocking un-notarized signed apps, whether or not these programs will run will depend on whether Apple considers them malicious.

You can still self-sign your apps.
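(For the curious, a rough sketch of self-signing with codesign; `MyApp.app` and the identity name are placeholders. Note that Gatekeeper only accepts Developer ID signatures, so this mainly helps with things like debugger entitlements, not distribution:)

```shell
# Ad-hoc sign: the "-" identity requires no certificate at all
codesign --force --deep --sign - MyApp.app

# Or sign with a self-signed identity created in Keychain Access (placeholder name)
codesign --force --deep --sign "My Self-Signed Cert" MyApp.app

# Verify the resulting signature
codesign --verify --verbose MyApp.app
```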

* Assuming Apple never decides to flip certain switches, after the OS infrastructure and hardware support are in place

Turns out that even compiled software can be modified to suit the user’s intent, thankfully.

(Note: Not talking about Apple now, hypothetically...) How would one do this on a fully trust-chained system with processor support?

Barring software bugs that allow for arbitrary code exec as the binary?

Signed package + necessary keys embedded in silicon -> processor verifies signature at memory load -> processor disallows user privilege escalation to write to arbitrary memory

I mean, RHEL Workstation is a very polished Linux environment. It might not be up to date enough for your needs, but it's very stable and well documented. Not the best for a laptop, though.

I would spring for Fedora or Ubuntu and deal with the lack of paid support for my own personal use but then again I've been using Linux for a while.

Both options are perfectly stable desktops; it's really the commercial software support that will get you.

Ubuntu has Ubuntu Advantage paid support.

I'm in the same spot as you and am eyeing Elementary OS.

Elementary OS is quite polished and takes many cues from OSX.

What is adware about windows 10? I work on windows 10 pro and it seems fine.

Notarized, not approved. It's not a manual process, doesn't involve App Store-like approval rules, and doesn't require sandboxing.

But in the future, it could, as apple boils the frog bit by bit.

Of course it could. Almost anything "could". But it doesn't, as of right now.

This exact discussion first happened when the original iPhone came out. It's been a decade+ now, and nothing has happened to your ability to run whatever you want on a Mac.

If this really happens to be some sort of frog-boiling conspiracy, its progress must be glacial. Which doesn't square very well with the other usual criticisms, namely that Apple doesn't care about the Mac, and that they suffer from short-termism.

Apple was very clear that the notarization process is not an App Store review. It could be, at some point, but as it stands currently there are so many apps that don't fit into the sandbox model that they'd just be shooting themselves in the foot.

What exactly is this if it's not a review process? If you violate Apple's policies you are banned. Of course your app won't sit in a queue for review, but once Apple decides they don't like your app you can no longer sign it, and I'm pretty sure they can make Gatekeeper stop the existing installs as well.

It's automated scanning for malware. That's it.

So they cannot scan apps signed by a 3rd party cert authority?

That's irrelevant really - Gatekeeper only uses Apple's root when determining if software is signed.

Given the fucking shenanigans we've seen from the commercial CAs recently, I'm not surprised.

> boils the frog bit by bit.

That’s a myth: https://en.wikipedia.org/wiki/Boiling_frog

It's a metaphor, he doesn't think Tim Cook has a big pot

But the thing he's trying to use as a metaphor has been proven scientifically false, so the metaphor doesn't actually mean what the OP thinks it does.

Have any better metaphors?

Well I don't agree with your basic premise that this change is bad for either users or developers, so no.

Well. Seeing that the frog boiling thing is a myth....


It's pretty much the same (notarized/approved). The only difference is that Apple may not "reject" the app before you publish it, but once it decides to "ban" it you can no longer sign it, as they suspend your developer account. I can think of uTorrent as a possible victim of this "notarization" process. Apple can quickly put peer-to-peer file sharing apps on the "malicious/banned" list just like they did on the App Store.

They can do that already, and have been able to for years (block software) with XProtect.

No. Users can still disable Gatekeeper for complete control, just as they can disable Secure Boot on PCs, and just as they can disable Google Play Protect and enable "unknown sources" on Android.

Nitpick: Secure boot is about the firmware protecting the machine at boot time against unauthorized OSes or OS-modifications, establishing trust in the boot-media.

App-signing is about protecting the user within an already booted OS, trusted or not.

These are very different concepts.

Yeah, I agree. Just comparing them in that they both restricted users' perceived control and autonomy.

Secure Boot is 100% under the user's control. You can load your own keys.

Similarly, in Windows you can control which CAs (or individual certificates) you trust.

They 100% respect the user's freedom.

Very much unlike what Apple does with Gatekeeper.

So if Apple doesn't like my app, it won't renew my developer account, so I can no longer sign the app. Seems quite "fair", and I sense no lock-down or censorship. Let's be happy for making the internet "safer". As some have said, this is good news for ...(fill in the gap)

Are you talking about the old policy, or what? The only thing that has changed here is the process for signing an app, which has always required a developer account. It just requires uploading a binary now, instead of working entirely offline. And you can still run unsigned apps.

You didn't need an Apple developer account to sign the app if you used a third-party (e.g. DigiCert) certificate/signer.

Yes, you can sign an app with a third-party certificate, but only a Developer ID certificate works with Gatekeeper. And there is no app review when you notarise an app anyway.

Of course it's a review process. That's why you need a Developer ID. The only difference is that the app will be reviewed "later". Here is an example: you sign an app like "uTorrent", and all is working great; people can download and run it. Then the MPAA tells Apple to ban your application because it's used to pirate their content. Apple now "reviews" your app and suspends your Developer ID. On top of that, it tells Gatekeeper to stop running the installed instances. Review done! Any questions? Please review our TOS to find out why we blocked your app... bla bla blah

No it's not. It's an automated malware scan. They can already revoke Developer ID certificates and block whatever they want by updating XProtect (even unsigned apps).

And then uTorrent is released again as an unsigned app and everybody can happily keep on torrenting.

I was recently cleaning off someone's Mac and made this note:

> I found no fewer than eight fake Adobe Flash updaters, six of them identical and signed by Nevaeh Mitchell (WMAA75SZMS), one signed by Lambert Jeremy (B4MCPEJ42J), and one by Wolfe Bailey (3W8NF7PWUL). It does not appear that Apple has revoked any of these signing certificates or flagged any of these installers through macOS's built-in malware removal tools.

So it doesn't seem to me that malware authors are exactly afraid of signing requirements.

Today it may be optionally notarized.

Tomorrow it must be notarized.

Then it won't be notarized if it uses "dangerous" APIs.

Then it won't be notarized unless it's distributed through the AppStore.

You can't force every vendor into the App Store, but you can gradually train users to distrust everything that's not in it. What we are seeing here is just Act 1 of that.

Did anyone figure out how to just get a DMG or whatever created outside Xcode notarized? These docs are too Xcode focused.

The docs say you can use ‘xcrun altool’ to get a dmg/zip/pkg notarized. That’s an Xcode tool, but there’s no indication that what you’re submitting needs to be created in Xcode.

xcrun just locates tools inside the current $DEVELOPER_DIR, which is inside the Xcode.app bundle (based on which Xcode is selected with xcode-select). You can think of xcode-select as virtualenv or similar systems: you select which version of the tools you want to use.

Fun fact: If you symlink xcrun as a different name, it will assume that is the name of the tool you wish to run. So "ln -s /usr/bin/xcrun altool" will make altool an alias to run the version of altool bundled with the currently selected Xcode.

Xcode doesn't do anything special to your binary; you should be able to submit a valid codesigned application bundle AFAIK.

The problem is the “submit” part. I.e. the docs say “xcrun altool”, but doing that gets you a barely documented CLI tool where it seems like you have to supply your Apple ID and password on the command line.

Seems... less than ideal? How does that work with 2FA?

> How does that work with 2FA?

It's for people who want to submit their app non-interactively (e.g. as part of a build process). Those people aren't going to use 2FA since the keys are under control of a robot.
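(For what it's worth, the workflow Apple documented at the time looked roughly like the sketch below. The bundle ID, email, and file name are placeholders; `AC_PASSWORD` is assumed to be a keychain item holding an app-specific password, which is how accounts with 2FA are meant to authenticate non-interactively:)

```shell
# Submit a signed dmg/zip/pkg for notarization
xcrun altool --notarize-app \
    --primary-bundle-id "com.example.myapp" \
    --username "dev@example.com" \
    --password "@keychain:AC_PASSWORD" \
    --file MyApp.dmg

# Check the status using the RequestUUID printed by the previous command
xcrun altool --notarization-info <RequestUUID> \
    --username "dev@example.com" \
    --password "@keychain:AC_PASSWORD"

# Once approved, staple the ticket so Gatekeeper can verify it offline
xcrun stapler staple MyApp.dmg
```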

This protects against… making many slightly-different copies of a malware app and signing them all locally, so that revoking one doesn't affect the others?

This lets Apple nuke a compromised version of an app without doing it to all versions or apps by that developer. Which would have been useful in the case of Transmission [1][2].

[1] https://arstechnica.com/information-technology/2016/03/first...

[2] https://blog.malwarebytes.com/threat-analysis/2016/09/transm...

Transmission’s Developer ID cert wasn’t compromised, the compromised binary was signed by a different developer.

So, basically this is a replay of the Orwellian "1984" ad where the tables have turned. Now Apple is the Big Brother, and it is there to dictate. Who is going to throw that hammer nowadays?

Remember that time Apple messed up and decided to punish all their app developers by forcing them to re-sign their apps?

What's the difference between this and Authenticode?

Good: this doesn't even require the software to be sent to Apple, only the hash.

How can they scan your software and run security checks on it based only on a hash?

Are the checks actually run locally by Xcode?

> Give users even more confidence in your software by submitting it to Apple to be notarized. The service automatically scans your Developer ID-signed software and performs security checks.

Looks like it uploads a bundle to them for all that to happen on their side.

The software is sent to Apple when it is notarized. When a user downloads and runs it, it is not sent to Apple. It just uses the hash. It would be pretty crazy for Mac OS to send the whole package to Apple to do a security scan, but a naive reader might think that.

They scan the software, not your software. If the hashes match against a known-scanned good bit of software, then it's safe. Maybe.
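(As an aside, you can ask Gatekeeper locally what it would conclude about a given app using spctl's assess mode; the app path below is a placeholder:)

```shell
# Ask Gatekeeper to assess an app the way it would on first launch;
# the verdict ("accepted"/"rejected") and the trust source are printed
spctl --assess --verbose /Applications/SomeApp.app
```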

What does this mean for non-open-source projects? They need to submit the source code to ensure there is no malware, right? This is the standard practice for iOS apps.

> This is the standard practice in iOS apps.

Pretty sure you’ve got your wires crossed here - you absolutely do not have to submit source code to put your app on the App Store.

iOS app binaries are submitted to the App Store, not the source code. Malware is detected via static analysis of the byte code (and presumably some runtime heuristic analysis) but it isn’t perfect.

For the most part, the main thing that I’m aware static code analysis detects is use of non-public APIs. Most of the malware protection comes from iOS’s sandbox.

iOS does not involve sending your source code. This involves ensuring your app runs in their Hardened Runtime and is properly signed. Apple checks that there is no Malware.

