Apple memory holed its broken promise for an OCSP opt-out (lapcatsoftware.com)
422 points by latexr 40 days ago | hide | past | favorite | 108 comments



Regarding OCSP, and off-topic for Apple: Firefox enables OCSP by default. This means that for every TLS connection, a plaintext HTTP OCSP request is made to the certificate authority that signed the website's certificate. The CA therefore receives precisely timestamped information about the exact domains you visit, provided you use Firefox, haven't disabled OCSP, and the website doesn't use OCSP stapling (most don't). Note that disabling OCSP leaves Firefox without certificate revocation information (maybe it still uses the system's revocation store, I'm not sure about that, but it certainly doesn't fall back to the more privacy-preserving CRLs).
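For reference, the relevant setting lives in about:config; a user.js line to turn the fetching off (at the cost of live revocation checks) would look like the following. The pref name is current as of recent Firefox versions, so verify it in about:config:

```js
// security.OCSP.enabled: 0 = never fetch OCSP, 1 = fetch for all
// certificates (the default), 2 = fetch only for EV certificates.
user_pref("security.OCSP.enabled", 0);
```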


For intermediates, Firefox distributes OneCRL[1] based on CCADB[2] data and the Mozilla Root Program’s own trust decisions. For leaf certificates, they announced the CRLite[3,4] experiment a while ago but it’s not clear how far along it is.

[1] https://blog.mozilla.org/security/2015/03/03/revoking-interm...

[2] https://www.ccadb.org/

[3] https://blog.mozilla.org/security/2020/01/09/crlite-part-1-a...

[4] https://bugzilla.mozilla.org/show_bug.cgi?id=crlite


It's a trade-off though, isn't it? Everyone seems upset about OCSP as of late, but the domains being sent in plaintext is not exactly limited to just OCSP - we've had this with SNI and DNS.

The advantages of OCSP are that you get a real-time view of a certificate's status and you don't need to download large CRLs, which go stale quickly. If you set security.OCSP.require appropriately, you don't risk the browser failing open, either.
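For the hard-fail behavior mentioned here, the pref (assuming current Firefox naming) is:

```js
// Treat an unreachable OCSP responder as a connection failure
// instead of silently failing open. Defaults to false.
user_pref("security.OCSP.require", true);
```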

It seems to me that the people who most dislike OCSP are CAs, who have to maintain infrastructure capable of responding to queries. I have limited sympathy; that should be part of running a CA.

The privacy concerns could be solved by mandating OCSP stapling, and you could then operate the OCSP responders purely for web-servers and folks doing research.

Unfortunately the ship has sailed with ballot SC63 now, and we are where we are. I don't necessarily agree that OCSP as a concept was unfixable, though.


> Everyone seems upset about OCSP as of late, but the domains being sent in plaintext is not exactly limited to just OCSP - we've had this with SNI and DNS.

My main privacy related concern isn't about domains being sent in plaintext but that they are sent to the CA and they can then theoretically do analytics on this data and profile web users.

But maybe this concern doesn't really make sense as we have strict personal-data regulations now.


This kind of stuff is a major reason I completely cut all Apple stuff out of my life.

When your network is all Linux, everything actually does "just work", more so than it ever did with OS X. Everything is just an SSH away; it's really pretty amazing.


SSH is built in to macOS and can be enabled in a few clicks.

https://support.apple.com/lt-lt/guide/mac-help/mchlp1066/mac


Be careful doing this without enabling your own key/certificate validation. We deployed SSH to a few dozen OS X machines in a lab (for backend maintenance) without certificates for the handshake (i.e. password-based), and the next morning several machines had been compromised.

We were using complex passwords; this was when 10.6 was latest OS, so perhaps security is better now with simple password-based SSH — but I would never take such a risk again.


I find it hard to believe that your complex passwords got cracked overnight. Were your machines on the open internet with huge bandwidth, not behind a firewall? An inside job, perhaps, with access to /etc/shadow and passwords shared across machines? Or were user accounts compromised by other users? That's a phenomenal number of login attempts from across the internet.


... except the ssh bin is also tracked with OCSP ..


Can’t you disable it though or block the connection?


No developer should be rationally expected to do that. SSH doesn't ship with telemetry or tracking, you shouldn't have to modify the default to make the software behave the way it was intended to; this is runtime-level coercion. You either consider it a feature, or you see it as a bug.

Plus, if you know what OCSP is and care about it, chances are you don't use macOS anymore. Nobody who conscientiously objects to OCSP tracking should be assenting to the rest of macOS and its perverse sense of "security".


But you don't have to on Linux.

Like there's just so much crap you have to do to make these non-free OSes pleasant or private and it's always blowing up. Linux mostly "just works" OOTB.


How does one go about creating a little snitch rule to prevent these connections?


Well, let's not forget the little dance Apple did to make their programs and the system be able to bypass packet filters like Little Snitch...

... and then claim it was necessary for "updates and upgrades". Not sure why TextEdit.app needed a kernel network extension for "updates and upgrades".

They also denied it was possible until it was provably demonstrated that they did.

I like Apple stuff; everything I use is Apple. But too many people see them as infallible or resort to whataboutism for missteps like these.


Oddly enough, UTM VMs also bypass Little Snitch, making me wonder if you could always just bypass LS by running a VM in your evil app.


Will it bypass it regardless of the virtualization engine, or only if you're using the Hypervisor framework for macOS VMs?


Depends on the network adapter in use. E.g. a bridged adapter will bypass it, an emulated vlan won't.


Ah that clarifies things, I was confused for a moment.

Well, the goal of a bridged network adapter for VMs is to make the VM behave as if it were plugged directly into the network, independently of the host, so it makes sense for it not to be affected by a firewall.

> running a VM in your evil app.

IIRC back then, creating a bridged net adapter required a special entitlement (special as in you can't get it without explicitly asking Apple for it and the dev cert for it has additional entries, like for kernel extensions). Dunno if that's still the case.


Paste the following into the All Rules view: https://gist.github.com/jiripospisil/e8e0aeef0f83b76c9d8db08...

(Posting a link because HN formatting breaks the format).


That's obsolete. macOS now uses ocsp2.apple.com.

The process is "trustd".


Thanks. Does that depend on macOS version? Better to block both I guess.


I've checked my DNS logs and there hasn't been a single hit against ocsp.apple.com over the last year, but around 20-30 hits for ocsp2.apple.com per day per device. (iPhone, Mac mini, MacBook)

Just blocking ocsp2.apple.com is probably fine if you're running anything recent-ish.


Does Little Snitch support regex? Perhaps it should be `ocsp\d*\.apple\.com`
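Little Snitch's rule syntax aside, the pattern itself does what's intended; a quick sanity check in Python (Pi-hole regex blocklists, for example, accept patterns like this):

```python
import re

# The pattern suggested above: ocsp.apple.com, ocsp2.apple.com, ocsp3.apple.com, ...
pattern = re.compile(r"^ocsp\d*\.apple\.com$")

hosts = ["ocsp.apple.com", "ocsp2.apple.com", "ocsp3.apple.com", "gsp.apple.com"]
blocked = [h for h in hosts if pattern.match(h)]
print(blocked)  # ['ocsp.apple.com', 'ocsp2.apple.com', 'ocsp3.apple.com']
```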


Block ocsp3.apple.com while you're at it.

And ocsp4. And 5. Block them all!


> Does that depend on macOS version?

Yes, though I couldn't say offhand when exactly it changed.


Little Snitch WAS NEVER A GOOD PRODUCT if you're worried about DNS-leaks.

Fact: Little Snitch resolves the IP address (i.e. does domain-to-IP lookup) before the Deny/Allow dialogue ever appears onscreen. Only by pressing "Allow" does Little Snitch then allow/initiate connections to the already-resolved IP ADDRESS.


Then do you have a suggested alternative?


/r/PiHole

I run four, locally [50+ users].

Local DNS Server issues default lookup server @x.x.x.2 (e.g. minimal blocklist PageAd, Cloudflare), for maximal client compatibility.

Users can thereafter upgrade levels of blocking by manually increasing DNS IP +1 (e.g. x.x.x.3 for stricter blocking, whereas default x.2 only blocks seven very common trackers), at host-level.

They can also manually enter x.x.x.1 and thereafter have no DNS-restrictions [this is the router itself, which each x.x.x.N itself resolves to, via our ISP's own DNS serverlist].

Our x.x.x.5 is top-notch white-page browsing, but many applications won't work (e.g. banking; although surprisingly Youtube still works although you do have to watch pre-roll ads).

I have one user who thereafter runs his own local subnet/PiHole, although you can hop up to the upstream PiHoles, as long as the subnet is a child or sibling of that DNS resolver.



Can you get around that nonsense by turning off wireless radios before launching apps?


Yes you can. If there's no network it will skip the check.

It's not very practical though. Better to block the address with little snitch or a hosts file
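A minimal /etc/hosts sketch for that approach, using the hostnames mentioned in this thread (note that, as others report, some Apple system daemons may ignore /etc/hosts):

```
# Blackhole Apple's OCSP responders (old and current hostnames)
0.0.0.0 ocsp.apple.com
0.0.0.0 ocsp2.apple.com
```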


HOSTS hasn't worked very well for a long time; the OS often ignores it.


Do you have a source? I regularly use /etc/hosts and have never seen any inkling of it being ignored, but I do see plenty of cases confirming it is honored.


Safari, when using the "iCloud Private Relay", does ignore the hosts file. Not obvious to everyone, but does kinda make sense.


I just tested this, and it’s even worse than your description. Safari ignores /etc/hosts even when Private Relay is off.


I think it's the "first to connect" logic between IPv4 and IPv6. I saw an article recently saying that when the hostname you're connecting to has both an A and an AAAA record, the hosts file doesn't work unless you map both; otherwise the unmapped address family is used to connect.


That doesn't explain this happening on systems with 2009 Windows 7-era hardware which is only IPv4 capable, like one of my systems - I've noticed HOSTs being ignored for many, many years.


I totally blocked the internet in HOSTs. Blackholed everything to 127.0.0.1

Guess what I'm doing right now? Talking to you on HN.


Oh ok thanks! I didn't know that, I've used it for other stuff but mainly third party software. I've never tried to block this.


I feel the same way about this as I do about the whole NSA clusterfuck: if I had access to my own data and could do what I wanted with it, I'd be fine with it.


Ok I understand the technical considerations here. But really what is the risk surface for me here as a dumb end user who uses apps from the store and a few things off homebrew and not a lot else? I mean I've got a large pile of Apple crap sitting here. Is this even remotely worrisome enough to shift it and move to something else? The CSAM thing probably was. This? I don't know.

(I could probably do everything I need to do on Linux - I just don't want to)


tl;dr: Apple can see what apps (or at least what app developers' apps) are installed by a specific IP address through OCSP. It used to be unencrypted and therefore visible to the internet, it is now only visible to Apple. Apple said they don't keep logs, but theoretically they could if they wanted to.


IIRC this request happened every time you open the app, not just at install time. So Apple has a log of (IP, AppID, TimeOpened).


It's really quite unfortunate how much of Apple software is designed around "privacy is when you trust Apple" :/


Exactly. What does privacy even mean when your entire digital existence is owned by and visible to a single entity?


Some of it is self-serving and some is explainable by the deep and pervasive tension between (security / privacy / autonomy), and usability.


I generally do not impute malice to it, but in many cases it is the lazy way out.


Ultimately, whoever controls operating system updates always has control over your privacy. Even if Apple did offer perfect privacy, there's no reason an update couldn't completely change that.

Bluntly: if you don't trust your OS vendor, then you can't use OS updates. There are people in this category but it's a lot of work.

Much easier to trust your OS vendor (at least to this extent).


> Ultimately, whoever controls operating system updates always has control over your privacy

This is not true. FLOSS (and reproducible builds) allows the community to verify the code and significantly (though not fully) reduce the trust required in the OS vendor.


Not really; they are moving toward homomorphic encryption, where the entire query and its processing are encrypted and Apple has no knowledge of what you actually requested.


Completely unclear how much they're moving into homomorphic encryption. The only resource I'm able to find about it is an announcement from 30 July saying that they can now do caller ID lookup using homomorphic encryption, plus an SDK that developers can use to leverage it. But the announcement is so vague that it's entirely unclear whether this can actually be used for practical workloads. And the idea that they're going all in on homomorphic encryption is speculative based on what Apple has revealed so far.

That's notable, as we're discussing a case where Apple said they would do something, and then not only didn't do it, but went out of their way to pretend that they never said they would.


Also, the homomorphic encryption is a requirement for third-party caller ID providers, not Apple themselves. Apple's first-party "Contact Photos" caller ID feature operates primarily on the "trust Apple" security model AFAIK.


I'm not aware of any other company of Apple's size (or anywhere approaching it) that has been as committed to privacy tech. Of course they are not perfect and sometimes get it wrong, but they constantly release new technologies that further our privacy. Who else does it better?


It comes down to what you identify as privacy. Apple is committed to not giving your data to any other company and keeping it protected in their ecosystem. They'll sell access to you for ads, but only by exposing your cohort to the advertiser.

From that lens, Google is also committed to never giving your personal data (think Gmail content, Maps behaviour, pins, etc.) to other companies and keeping it all in their ecosystem, for themselves only. Your data is their key advantage, the base of the ad empire, and they won't let another company run away with it.

If we call Apple privacy focused, Google also fits the bill; the question just comes down to whether we see Apple or Google as part of our intimate circle, within our private life. I assume you do for Apple but not for Google.


There is no serious person that could think that Google is a privacy focused company. Their entire business is founded on knowing everything about their users. It's an ad company. They need user data to function and they will never release tech that compromises their business. Just look at the direction of ad blocking and chrome to see where they are headed.


The Apple side is similar: their current entire business is to middleman your relationship with other companies. You buy Apple products, purchase and subscribe to apps and services from the App Store, use iCloud, etc.

They need you in their ecosystem, the same way Google needs you in theirs.

And I totally agree with you: I wouldn't call Google privacy focused, and I don't call Apple privacy focused either, even as they market it harder than anyone else.


Google is a privacy antagonist. Apple is privacy focused because it suits their business. Apple has been privacy focused for years and has built several technologies to prove it. It's not hollow marketing to build privacy software.


I don't define "privacy" as "only a single company has access to all my stuff", so to me Apple's claims are just marketing. I'd buy an argument about good security and some protection against other companies, just not "privacy".


Google is a "privacy antagonist" with an Open Source OS you can build locally and modify to your heart's content? And Apple's been privacy focused, suing security researchers for copyright violation when they try to analyze iOS?

Methinks you're holding a double standard. Compared to Android and Linux, Apple's "promise" is no better than the one Microsoft offers Bitlocker customers.


These companies shouldn't be graded on a curve. Everyone knows Microsoft is crap for privacy. But Apple has their reality distortion field, and it's important to show people that their privacy promises are BS.


Okay, but in an evolutionary sense, which company should we be supporting: the one that is at least somewhat moving towards privacy, or the ten others that don't give a shit? Which one should survive? Would you like to see companies copy Apple's privacy approach or Facebook's "dumb fucks" approach?


I was unaware there exists a fully homomorphic encryption scheme with the right trade-offs between security and computational effort to make this economically viable for even moderate to small workloads.

I’ve always thought it was either far too time or far too space intensive to be practical.

Do you have sources on this, either from Apple or academic papers of the scheme they’re planning on using?


They posted about this recently [0][1]. They are using Homomorphic Encryption in iOS 18 for Live Caller ID Lookups.

[0] https://www.swift.org/blog/announcing-swift-homomorphic-encr...

[1] https://news.ycombinator.com/item?id=41111129


I posted about this above, a little after you did. Reading the article, I'm unable to determine whether this has any practical utility outside of niche applications, or whether it has the potential to be broadly useful. Has anyone reviewed the SDK who can render an opinion?


Homomorphic encryption is broadly useful and in fact should be ubiquitous for remote computation that would otherwise leak private data (not to comment specifically on Apple's implementation). They did open source it, though, which suggests they want others to follow.
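For intuition, here is a toy sketch of what "homomorphic" means, using textbook RSA's multiplicative homomorphism. This is emphatically not Apple's scheme (theirs is reportedly lattice-based BFV) and is completely insecure; it only illustrates computing on ciphertexts:

```python
# Textbook RSA with tiny primes: E(a) * E(b) mod n decrypts to a * b,
# so a server can multiply values it never sees in the clear.
p, q = 61, 53
n = p * q                          # modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

a, b = 7, 6
c = (enc(a) * enc(b)) % n  # computed from ciphertexts only
print(dec(c))  # 42 == a * b
```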


Can you point to other ways this is used or is intended to be used?


It's useful for situations that would otherwise be illegal, so that tradeoffs are less relevant.


Following through on a public privacy promise does not require R&D.


[flagged]


> You have to trust Apple and if you don't then buy another computer.

No, I don't have to trust Apple or anyone absolutely, and I can and do use a Mac, as the least bad option for my purposes, without fully trusting Apple. You're presenting a false dichotomy. Trust is a matter of degree, not all or nothing. In general I trust my mother (except her advice, which I often ignore), but if she insisted that she had to install cameras in my home for my "safety", I would start to have serious doubts about her.

> Because they have the ability at any point to push software to your computer that compromises your privacy and security in almost undetectable ways. And short of them having an entirely open-source OS that will always be the case.

1) They can't push software to my computer, because I've blocked the software update mechanism with Little Snitch and only install when I'm ready.

2) Open source is not a panacea, as demonstrated by the XZ Utils backdoor.

Moreover, closed source does not render users helpless. Closed source software can be reverse engineered; indeed, I've done that myself with macOS many times. The behavior of closed source software can also be observed in various ways with various tools, for example, the aforementioned Little Snitch. And even when macOS bypassed Little Snitch one time a few years ago, that was detected by developers. The whole world is watching Apple closely, so the company can't simply get away with whatever they want undetected.

> The whole point of their privacy stance is to protect you from companies like Google and rogue governments who we already know can't be fully trusted

We already know that Apple can't be fully trusted.

We also know that Apple actually caters to "rogue" governments such as China, by removing apps from the App Store at the Chinese government's request, as well as handing over control of iCloud servers to China, and reportedly Apple source code too, for inspection.


> 1) They can't push software to my computer, because I've blocked the software update mechanism with Little Snitch

They can because they (Apple) run in kernel space and Little Snitch no longer does/can. So LS, god love it, only works because Apple lets it. If Apple wanted to they could push updates to your OS without LS knowing about it.


> They can because they (Apple) run in kernel space and Little Snitch no longer does/can. So LS, god love it, only works because Apple lets it.

I already addressed this later in my comment: "And even when macOS bypassed Little Snitch one time a few years ago, that was detected by developers. The whole world is watching Apple closely, so the company can't simply get away with whatever they want undetected."

> If Apple wanted to they could push updates to your OS without LS knowing about it.

They could release an update that allows later updates to bypass Little Snitch, but they can't magically make that ability retroactive and get it on a Mac that doesn't currently have that update installed.


>They could release an update that allows later updates to bypass Little Snitch

Or the ability could be there already without us knowing.

But fortunately we can still run our own routers/switches and see outgoing traffic. If you configure LS to block everything then you could confirm it with your network gear.


> But fortunately we can still run our own routers/switches and see outgoing traffic.

Which is useless if they simply encrypt the data before sending it over SSL.

Then you will never know what they are sending to their servers.


>Which is useless if they simply encrypt the data before sending it over SSL.

Not entirely useless, you'd still know they were sending something and it would be proof they could bypass Little Snitch.


> Or the ability could be there already without us knowing.

An empirically baseless conspiracy theory.

> But fortunately we can still run our own routers/switches and see outgoing traffic. If you configure LS to block everything then you could confirm it with your network gear.

That was my point. Apple isn't actually capable of completely avoiding detection.

Anyway, just as I don't trust Apple absolutely, I don't distrust Apple absolutely either. They do some user-hostile things, which are richly deserving of criticism, but I prefer to engage with facts and evidence rather than idle speculation and paranoia.


>An empirically baseless conspiracy theory.

You misunderstand I think. I don't think it's likely. It would be easy to catch them. I also don't distrust Apple completely. I was (pedantically) replying to your comment that "They can't push software to my computer, because I've blocked the software update mechanism with Little Snitch". "Can't" is too strong. They could is what I'm saying, as in, they have the ability/possibility to do so. They can't time travel, because that's impossible. But they can establish network connections, because they control the firmware and kernel, which exceeds your control via Little Snitch.


> I was (pedantically) replying to your comment

Please don't be a pedant. It's not appreciated, and it only makes the conversation worse.


If you worry about Apple: if they thought you were a true adversary, they could install some code just for you. You really have no idea.


I really have no idea what you're saying.


And critically, the suspicion is that they handed off the keys for the iMessage encryption used inside China. And if they did it for money in China, would they really not do it for the right US government inquiry?

Google's better in that they chose not to do business in China the same way. For now.


> by removing apps from the App Store at the Chinese government's request

So they follow the law in the market they’re operating in? Holy shit, how dare they.


I'm not sure why you're replying to me with a sarcastic response when I was merely disputing the notion that "The whole point of their privacy stance is to protect you from companies like Google and rogue governments".

On the other hand, the issue is not as simple as Apple following the law, because Apple chose to lock down iPhone and set themselves up as the sole gatekeeper for app installations. On the Mac, which allows distribution from outside the App Store, it's not possible to completely ban apps in this way.


It depends; they cried much harder about the DMA in the EU and still aren't really fully compliant. In China, they were quite okay with throwing citizens under the bus without much complaint, though.


Because they assume the EU would be weaker and more malleable to US influence than China and the CCP.

US companies keep trying to impose the local US rules, policies and way of thinking of their HQ whenever they operate abroad, especially in Europe, but do a total 180 when they operate in China.


> Because they assume the EU would be weaker and more malleable to US influence than China and the CCP.

That doesn't really fit, because the course of action would be the same in both cases. If the CCP says they have to ban apps then enable side loading or third party stores so customers in China can install the banned apps from another source but the company can still feign compliance. This is the same thing the EU says they have to do anyway, except that it actually defeats the onerous regulation in China whereas in the EU the regulation is intended to benefit rather than oppress the user and Apple objects to it because they're the one wearing the boot.

The explanation that fits is that they care about their own control (and so fight the EU) but don't care as much about China oppressing their customers (and so bend the knee there).

> US companies keep trying to impose local US rules, policies and way of thinking of their HQ whenever they operate abroad, especially in Europe, but totally do an 180 when they operate in China.

In general company leaders should try to have morals and use their influence to push back against rules from governments trying to harm their people. Obviously publicly-traded companies have a perverse tendency to do the opposite of this once they're controlled by Wall St rather than the original founders.


Following the laws of a market you choose to operate in does not absolve you of your actions - just ask IBM.


> Because they have the ability at any point to push software to your computer that compromises your privacy and security in almost undetectable ways. And short of them having an entirely open-source OS that will always be the case.

We know this isn't true, because detection is precisely how we found out this was happening. Even when the OS is closed source, someone can still inspect its network traffic or get it running in a virtual machine to see what it's doing, and then publish the result if they find something untoward.

But once your data has been exfiltrated onto their servers, you no longer have any way of auditing what they do with it after that. So the only way to have any assurance of your own privacy is to call any behavior that exfiltrates your data a violation of trust.

It might be easier to audit open source code than closed, and maybe then we should be demanding open source operating systems, but there is still a large difference between "this is difficult but possible and so someone in the world could do it and let everyone else know" and "there is no mechanism for members of the public to validate their claims at all".


Trusting Apple not to break the computer I paid them $1k+ for is very different from trusting Apple to not hoover up and sell my personal data (or let it get stolen).


> Because they have the ability at any point to push software to your computer that compromises your privacy and security in almost undetectable ways. And short of them having an entirely open-source OS that will always be the case.

I don't believe such things would be undetectable. If that were the case, how would we have discovered the OCSP server?

Another point I don't understand: if it were made open source, wouldn't Apple have the same level of control? After all, they'd still be the ones building the binaries. As another commenter noted, the XZ backdoor proved that even open-source software shouldn't be blindly trusted.


> If that were the case, how would we have discovered the OCSP server

Because Apple is not trying to obfuscate anything.

If they did you would never have discovered it.


I regularly discover things that Apple tries to obfuscate.


Such as?


Unreleased products for example


You don't think anyone would have ever analyzed the network traffic coming out of a MacBook with Wireshark?


> and rogue governments

As opposed to the orderly and just governments, that are so clearly-defined and well trusted?

Or are you more highlighting the unlawful conduct of businesses like NSO Group, which are so shamefully allowed to persist without government scrutiny? It's hard for me to tell, since both of them have hacked iPhones to attain privileged access and will likely do it again. From where I'm standing, it looks like their "promise" has more holes in it than Swiss cheese.


Shameful behavior by Apple.


What are Apple up to? Very long term?


Selling that data.


>Selling that data.

To whom? And to what aim? They're not exactly short of money.


To the highest bidder apparently.

> They're not exactly short of money.

Shareholders are not satisfied (and never will).


Honestly, I can imagine the preference being axed, since OCSP is effectively the macOS antivirus, and I'm pretty sure I know the first thing any malicious software is going to do if it can be turned off.

macOS preferences aren't magically locked away from the rest of system, regular users can change their own user preferences, and root can change system preferences. An antivirus has to still work against an attacker who has root. It's why you can't block certain apps/domains from the firewall as well.

You could put the preference in recovery mode along with disabling SIP and I think that would accomplish everyone's goals.


> OCSP is the macOS antivirus

It's not. There are multiple layers of security, including notarization and XProtect.

> I'm pretty sure I know what the first thing any malicious software is gonna do.

What?

> macOS preferences aren't magically locked away from the rest of system, regular users can change their own user preferences, and root can change system preferences. An antivirus has to still work against an attacker who has root.

You sound confused about admin vs. root. Anyway, if you have a local attacker running on your Mac, then it's already too late for OCSP.


Those aren't so much layers as different parts, and OCSP is how notarization actually provides security; the developer cert is what's being checked to run the app.

This scheme isn't really designed to prevent attacks, so to speak; it's to stop malicious software on everyone's systems all at once. You can say that theoretically it's too late, because the software is already doing what it's doing and you should consider the machine compromised. But the rubber meets the road with regular users, who aren't going to wipe their machines in response to malware, and so this lets you purge it.

I don't think I'm confused about admin vs root. I'm talking about the System Administrator user, the "oops, I ran malware with sudo" user, uid 0, the user who is only constrained by the kernel via SIP. But yeah, admin is the more common case for regular users, and I'm not sure what point you're making: malware can get admin, and if you put the preference somewhere changeable by admin, then welp. You could put the preference in recovery mode, same as SIP; I think that would be fine.


> Those aren't so much layers as the different parts

Mkay.

> OCSP is how notarization actually works, that's what's being checked to validate the notarization.

No, you are misinformed. OCSP is checked by the trustd process on ocsp2.apple.com, whereas notarization is checked by the syspolicyd process on api.apple-cloudkit.com.

OCSP is simply checking whether the Developer ID certificate has been revoked. Notarization, on the other hand, requires uploading a build to Apple and receiving a special notarization ticket. The notarization ticket is either "stapled" to the app or downloaded from Apple when the app is first launched.


> Mkay

Well they're not. What would you call it? Windows Defender and Microsoft's code signing requirement aren't super related. You could purge discovered malware with a signature/scan but that's not impossible to get around.

I'm not sure I really grok the difference from a security perspective when the main thing with notarization is ensuring it's signed with your developer cert.

I guess the Venn diagram isn't technically a circle, but is it not that the actual security of notarization is provided by OCSP? I suppose I could have phrased that bit better.

Is there a case where a hypothetical notarization process that excludes that bit provides any real security? Because Apple "scanning it for malware" isn't going to be that different from XProtect.

I'm really not sure what I did to get such an, idk hostile? response.


> is it not that the actual security of notarization is provided by OCSP?

The security of notarization is provided by Apple's signature over the hashes of the executables in the app [0]. The hashes and signature are put into a "ticket". This ticket is stored on Apple's servers, and can also be "stapled" to the app. Gatekeeper (one of the macOS security systems) will prefer to fetch the ticket from Apple if possible, and fall back to the stapled ticket if available. Notarization is meant to guarantee that the code was sent to Apple and checked for malicious code.

OCSP checks that the Apple Developer ID certificate used to sign the app hasn't been revoked.

They are two separate checks done by the Gatekeeper system, which is meant to ensure that only trusted software runs on macOS. I believe it makes sense to call the OCSP check part of the Gatekeeper system, but this may be incorrect.

[0]: https://forums.developer.apple.com/forums/thread/710738
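If you want to watch the two checks separately on a Mac, the built-in tools expose them (the app path is a placeholder; output varies by macOS version):

```sh
# Gatekeeper's overall assessment (signature, notarization, quarantine):
spctl --assess --verbose=4 /Applications/SomeApp.app

# Is a notarization ticket stapled to the bundle?
xcrun stapler validate /Applications/SomeApp.app

# Show the Developer ID certificate chain whose revocation OCSP checks:
codesign --display --verbose=2 /Applications/SomeApp.app
```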


> the main thing with notarization is ensuring it's signed with your developer cert.

It's not the main thing.

> is it not that the actual security of notarization is provided by OCSP?

No.

I tried to explain the difference in my previous reply, but I'm not going to sit here and write an entire essay on the subject (though I could). The information is out there, for example on developer.apple.com. Or even on my own website. Inform yourself, or at least stop spouting falsehoods.


Oh god, don’t send people to developer.apple.com. It’s apple’s worst product.

I’d rather you shill your own blog posts. Even without reading them, I know they are better. ;)



