On the other hand, NO entity is inherently trustworthy “forever”, nor should any entity have the power to be a unilateral Decider even on its own platform. Just as a good restaurant may someday turn bad under new management, we are just one “new management” away from Apple becoming something we don’t trust so much. This system is being set up to give “Apple” tremendous power for “all future definitions of Apple”, which is ridiculous. That wouldn’t make sense even if Apple were a perfect saint today, invulnerable to buggy software, disgruntled employees, and other potential weaknesses.
We need a system whereby users decide which SET of entities they trust, one of which may include Apple, and which may even exclude Apple if the user so chooses. The complex mechanism for signing and verifying things should be open-source so it can be understood and validated and reproduced cross-platform. Then you decide who you trust, period. You can rely on others to help you determine what is trustworthy. Given this type of system, I would be fine with macOS saying “select at least one trusted source to enable software installations”, knowing that I ultimately decide what those authorities will be. I am not fine with their seeming “father knows best” approach.
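A sketch of what such a user-controlled trust set could look like. Everything here (the authority names, the digest-based check) is made up for illustration; a real system would use public-key signatures (e.g. Ed25519) rather than bare digests:

```python
# Hypothetical sketch of a user-controlled trust set; all names are
# made up, and the digest check stands in for real public-key
# signature verification.
import hashlib

# The user edits this set freely; Apple may be included or excluded.
trusted_authorities = {"Apple", "Debian", "MyCompanyInternalCA"}

def is_install_allowed(signer: str, payload: bytes, claimed_digest: str) -> bool:
    """Allow installation only if the signer is in the user's trust set
    and the payload matches the digest the signer vouched for."""
    if signer not in trusted_authorities:
        return False
    return hashlib.sha256(payload).hexdigest() == claimed_digest

app = b"example application bytes"
digest = hashlib.sha256(app).hexdigest()
print(is_install_allowed("Debian", app, digest))         # True
print(is_install_allowed("UnknownVendor", app, digest))  # False
```

The key point is that `trusted_authorities` belongs to the user, not to the OS vendor; the verification mechanism itself could be open-source and cross-platform.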
Signing apps with normal (EV) code signing certificates is the best we have for other apps, I think. If you detect malware signed by a certificate, you blacklist that one and forward information to the relevant criminal authorities. If they don't investigate, you blacklist the whole country. That last step is probably the most controversial one, but otherwise you still have a malware problem.
Windows is pretty open about this, Apple is a walled garden. Safe but boring.
What good is any signing if you can't trust the OS? There are a myriad of ways they could undermine the verification.
It's perfectly possible to "trust" Apple in the sense that you use their operating system and hardware after you've bought it. This kind of "trust" is similar to "agreeing with an EULA or TOS". There is not much of an alternative to it.
This does not imply that you therefore must also trust Apple as the one and only broker and manager of certificates for software running on your machine. You could, for example, suspect that Apple might revoke a certificate for a developer that creates a controversial product (e.g. a p2p client), or a product that competes with Apple's products or with some planned future feature. Apple has in the past essentially copied innovative apps and revoked the original developers' App Store licences. In theory, Apple could just erase such applications, since they "have root". In practice, they wouldn't do so. But they would and will revoke certificates and App Store licences.
The new rules also make alternative app stores impossible, of course, although I'm not sure they were possible before.
In a nutshell, trust != trust. You can trust a hardware or software maker in some respects but not in others.
The primary goal of Gatekeeper is monopolization of secure access, not security.
Source? I develop in the Apple ecosystem and never heard of this. Would be scandalous.
“Heavily inspired” Apple apps, yes. Revoking certificates or access to developer program (which I guess you must mean by “App Store licenses”, because there’s no such thing otherwise), not that I’m aware of.
But the OP was for example talking about potentially wanting to "exclude Apple" which would, IMHO, be nonsensical on Apple's OS.
In literally every phone, the baseband is not isolated: it runs its own OS and can do anything to the main OS, even if you have root. On top of that, every ARM SoC has TrustZone, which supposedly can also run arbitrary code that you have zero control over.
In that scenario code signing is irrelevant because an untrusted OS can undermine it.
While you can currently still run digitally signed software that is not notarized, this document admits that in the future signed software will require Apple's approval. Apple will thus make everyone's software LESS secure: developers who aren't paying Apple will be forced to leave their code unsigned so that it will continue running.
I don't think Apple really cares about the fee. I know that the three platforms are now bundled together under one developer program, but suppose there are 100,000 Mac developers who publish signed or App Store applications (probably an overestimate); the fees then make Apple $10 million a year. This is less than pocket change to them.
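The back-of-the-envelope arithmetic above, spelled out (the 100,000-developer figure is the comment's own guess):

```python
# Rough estimate: ~100,000 Mac developers paying the annual fee
# (rounded to $100 here), treated as an upper bound.
developers = 100_000
annual_fee = 100
revenue = developers * annual_fee
print(revenue)  # 10000000, i.e. about $10M/year
```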
To me it seems that the $100 annual fee is there to prevent fraud: (1) requiring a credit card ties the developer account to an actual person; (2) the price makes mass-creating malicious accounts expensive enough that Apple doesn't have to play whack-a-mole.
living in a first world state.
This is pretty subjective.
If this changes some day, I already know what I'll tell the users of my program: I, the developer, have not made any changes to the application, and the reason it no longer runs is solely Apple's responsibility. Please contact Apple's Customer Support and/or sue the company in case you can no longer access your data.
Further support requests will then be forwarded to Apple.
And I have predicted that people like you are wrong, because there is no hardware backing for it yet, unless by "the future" you mean literally a good decade or more down the line, which for tech is getting into really hazy territory. Equating the Mac to iOS devices right now just plain makes no sense, because the Mac doesn't have that whole hardware chain. iOS isn't magic: while I think Apple's level of lockdown should be illegal on a cultural level, on a technical level what enables it is the hardware. On the Mac, maybe the newest ones with Apple's T-series chips represent first steps in that direction, and of course if Apple does someday decide to switch the Mac from x86 to their own ARM-based processors, they'd have far more power there. But even that doesn't solve the legacy issue, and at this point Apple has built a great deal of culture and brand around supporting hardware for a long time. They still support Mojave officially on 2010 Mac Pros[1], for example, and right now they're still selling new Macs without even T-chips. Sure, it's not impossible that they could release an ARM Mac in 2020 and then immediately put all "legacy Macs" on life support[2], but that'd be a pretty surprising choice, wouldn't it? It'd engender a lot of justified fury.
>If this changes some day, I already know what I tell the users of my program, though
I hope you realize how pointless this is likely to be? It'd be better to work on a political answer around it and inform your users accordingly.
1: A few unfortunate caveats around features hampered by the MP's very old EFI aside, which is a complex discussion on its own.
2: Though I'd expect lawsuits over a decision like that at least in the EU.
As for the rest of your post, I don't get it. To close their ecosystem, Apple only has to take away the option to make Gatekeeper exceptions for unsigned apps and the option to switch off Gatekeeper. I wasn't talking about anti-jailbreaking protections; it suffices that you'd have to hack the OS and violate the EULA in order to run unsigned apps.
Whether that happens or not, I don't know. So far, though, they've been moving down that road. (Just look at the gradual changes in Gatekeeper, the introduction of sandboxing, the unification of iOS and macOS, etc.)
Why is a time traveler speculating on the future?
Having a POSIX layer is a minor detail.
Anyone who buys Apple but actually wants a GNU/Linux laptop should just buy one instead.
First of all, $100 per year merely covers the costs for this service.
Apple will never force apps to always be notarized, although they might let admins enable such a requirement for security purposes, which I think is great.
Apple's website actually says that they will:
"Note that in an upcoming release of macOS, Gatekeeper will require Developer ID signed software to be notarized by Apple."
So you will still be able to ship an unsigned app, but you won't be able to ship a signed, un-notarised one.
TLDR: You have to adopt the new OS user privacy protections. (Users must give permission for your app to access things like the webcam, microphone, contacts, photos, location data, etc.)
Your app gets scanned for malware before it is signed.
There is still no App Store approval process involved.
Don't you have to do this anyways?
Does this mean that anything distributed outside of the App Store will have to be approved by Apple? Will the App Store sandboxing rules apply to outside apps too?
That sentence means that in the future, the second category (applications which are not on the App Store but are cryptographically signed) will need you to generate a certificate with Apple. Nothing about the app store's sandboxing rules, and nothing about all applications -- you can run unsigned applications to your heart's content.
Even on Windows, my experience is that signed non-Windows-Store apps can get flagged as malicious upon download if they're fairly niche/aren't used by many people; supposedly using an EV cert helps with that. The user experience is actually worse (Windows will show something red, then you have to click some non-obvious buttons to successfully run the app).
But on Windows, any approved CA can issue a certificate, not just Microsoft.
Under Apple’s new regime, not only is Apple the only CA, Apple is also the only one who can sign the apps.
That’s immensely restrictive.
I feel like eventually I'll have to abandon Mac, but for what? Linux is still flaky and Windows adware unless you spring for Enterprise.
I would pay for a commercial Linux as polished as MacOS, but there may not be enough of me. (It could also have a list of officially supported hardware to at least approach the stability benefits of Apple's vertically integrated HW/SW stack.)
You can run unsigned apps. The new policy affects the process for signing apps.
Barring software bugs that allow for arbitrary code exec as the binary?
Signed package + necessary keys embedded in silicon -> processor verifies signature at memory load -> processor disallows user privilege escalation to write to arbitrary memory
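A toy sketch of that chain of trust, using a hash-based stand-in for the real checks (actual secure boot uses public-key signatures and keys fused into the processor, not bare digests):

```python
# Toy chain of trust: each stage carries the digest of the next stage,
# anchored in a root digest imagined as "burned into silicon".
# Hashes stand in for real public-key signature verification.
import hashlib

def digest(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

kernel = b"kernel image"
# The bootloader "vouches for" the kernel by embedding its digest.
bootloader = b"bootloader code||" + digest(kernel).encode()
root_of_trust = digest(bootloader)  # imagine this value fused into the CPU

def boot(bootloader_blob: bytes, kernel_blob: bytes) -> bool:
    """Refuse to run any stage whose digest doesn't match what the
    previous (already verified) stage expects."""
    if digest(bootloader_blob) != root_of_trust:
        return False  # bootloader was tampered with
    expected_kernel = bootloader_blob.split(b"||", 1)[1].decode()
    return digest(kernel_blob) == expected_kernel

print(boot(bootloader, kernel))             # True: chain intact
print(boot(bootloader, b"patched kernel"))  # False: verification fails
```

The point of the hardware anchor is that, without it, a compromised OS can simply skip the verification step; with it, tampering anywhere in the chain breaks the check at the stage below.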
I would spring for Fedora or Ubuntu and deal with the lack of paid support for my own personal use but then again I've been using Linux for a while.
Both options are perfectly stable desktops; it's really the commercial software support that will get you.
This exact discussion first happened when the original iPhone came out. It's been a decade+ now, and nothing has happened to your ability to run whatever you want on a Mac.
If this really happens to be some sort of frog-boiling conspiracy, its progress must be glacial. Which doesn't square very well with the other usual criticisms, namely that Apple doesn't care about the Mac and that they suffer from short-termism.
Given the fucking shenanigans we've seen from the commercial CAs recently, I'm not surprised.
That’s a myth: https://en.wikipedia.org/wiki/Boiling_frog
App-signing is about protecting the user within an already booted OS, trusted or not.
These are very different concepts.
Similarly, on Windows you can control which CAs (or individual certificates) you trust.
They 100% respect the user's freedom.
Very much unlike what Apple does with Gatekeeper.
> I found no fewer than eight fake Adobe Flash updaters, six of them identical and signed by Nevaeh Mitchell (WMAA75SZMS), one signed by Lambert Jeremy (B4MCPEJ42J), and one by Wolfe Bailey (3W8NF7PWUL). It does not appear that Apple has revoked any of these signing certificates or flagged any of these installers through macOS's built-in malware removal tools.
So it doesn't seem to me that malware authors are exactly afraid of signing requirements.
Fun fact: If you symlink xcrun as a different name, it will assume that is the name of the tool you wish to run. So "ln -s /usr/bin/xcrun altool" will make altool an alias to run the version of altool bundled with the currently selected Xcode.
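That works because xcrun dispatches on the name it was invoked as (argv[0]), the same multi-call-binary trick BusyBox uses. A toy sketch of the idea, with made-up tool names and behavior:

```python
# Toy illustration of argv[0] dispatch: one program behaves as
# whatever name it was invoked under. Tool names and behavior here
# are invented for the example.
import os

TOOLS = {
    "altool": lambda args: f"altool invoked with {args}",
    "xcrun":  lambda args: f"xcrun dispatching {args}",
}

def main(argv):
    name = os.path.basename(argv[0])          # the name we were run as
    handler = TOOLS.get(name, TOOLS["xcrun"])  # fall back to dispatcher
    return handler(argv[1:])

print(main(["/usr/bin/xcrun", "altool", "--help"]))
print(main(["./altool", "--help"]))  # symlinked name selects altool behavior
```

A symlink doesn't change the binary at all; it only changes the name the binary sees in argv[0], which is all the dispatch needs.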
Seems... less than ideal? How does that work with 2FA?
It's for people who want to submit their app non-interactively (e.g. as part of a build process). Those people aren't going to use 2FA since the keys are under control of a robot.
Tomorrow it must be notarized.
Then it won't be notarized if it uses "dangerous" APIs.
Then it won't be notarized unless it's distributed through the AppStore.
You can't force every vendor into the AppStore, but you can gradually train users to distrust everything that's not in it. What we are seeing here is just Act 1 of that.
Are the checks actually run locally by Xcode?
Looks like it uploads a bundle to them for all that to happen on their side.
Pretty sure you’ve got your wires crossed here - you absolutely do not have to submit source code to put your app on the App Store.