I feel like the headline is a bit click-baity but I don't want to jump to conclusions.

> Tsakalidis said that in order to make modifications to Electron apps, local access is needed, so remote attacks to modify Electron apps aren't (currently) a threat. But attackers could backdoor applications and then redistribute them, and the modified applications would be unlikely to trigger warnings—since their digital signature is not modified.

So the issue is that Electron app distributions don't include a signed integrity check, so there's no way for end users to detect if they got a modified version. I thought that the macOS builds did do this, but maybe the ASAR bundles aren't included in the hash, or maybe I'm wrong entirely.

I assume a solution would store the signing pubkey on initial install and then check updates against it. The only way the signing key could be checked, other than trust-on-first-install, would be through some kind of registry, which is what I assume the Windows and Mac stores are geared toward. Am I correct on all this?

EDIT: Either way, it seems like the solution is to only use the projects' official distribution channels. Signed integrity checks would be useful but probably not change the situation that dramatically. Is that accurate?




I'm still trying to figure it out too:

> I thought that the macOS builds did do this, but maybe the ASAR bundles aren't included in the hash?

Yeah, I think that's the problem they're describing. It sounds like the Mac setup will require binaries -- like the Electron runtime itself -- to be codesigned, but if the first thing your codesigned binary does is to read an unprotected JS file off disk and execute it, there's no codesigning benefit.

> Either way, I assume a solution would store the signing pubkey on initial install and then check updates against that

Not just updates that you initiate yourself, though -- I think the idea is that any other app on the system could backdoor the JS in the ASAR at any time. That's pretty hard to defend against.


Hey, Electron maintainer here.

> but if the first thing your codesigned binary does is to read an unprotected JS file off disk and execute it, there's no codesigning benefit.

The ASAR file described in this post is part of the application's signature, though. You can't modify that file and then redistribute the app to another machine without Gatekeeper getting incredibly angry at you.

E.g., try modifying the ASAR file, zip the app up, upload it to Google Drive, download it again, and try to run the app. Gatekeeper will boot it into the shadow realm :)
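
If you want to see that for yourself, something like this should do it (a rough sketch; Slack.app is just an example, and the copy has to be quarantined, i.e. freshly downloaded, for Gatekeeper to assess it at all):

    # Re-check the whole bundle, including the resource seal over app.asar
    codesign --verify --deep --strict --verbose=2 /Applications/Slack.app

    # Ask Gatekeeper directly whether it would allow the app to run
    spctl --assess --type execute --verbose /Applications/Slack.app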


> Not just updates that you initiate yourself, though -- I think the idea is that any other app on the system could backdoor the JS in the ASAR at any time. That's pretty hard to defend against.

Good point, but if the attacker has filesystem access you're already hosed. I suppose there could be some other risk where the ASAR could be modified without full FS access? But I'd want to know what that attack is, if that's the case.


> If the attacker has filesystem access you're already hosed.

I think that's not supposed to be true in modern (e.g. latest macOS) threat models. App Y isn't permitted to just replace App X unannounced, and on both Mac and Win there's a large codesigning infrastructure in place to provide that protection.


Also, sandboxing is designed to prevent unfettered filesystem access on macOS, meaning this isn’t part of the threat model if all apps are sandboxed and packaged.


What makes this unique to electron as opposed to any other application that doesn't run as a completely closed binary (not that binaries can't be backdoored, of course)?


On macOS, if my understanding of the current situation is correct, code signing normally covers all binaries in an application bundle, including binaries in all bundled frameworks. What's different about Electron is that it puts application code, which is not a binary, into the Resources/ directory, which is not signed.

I just tried this out with Slack on macOS, and it did work... almost as advertised. I had to use sudo to change the application files, which means this isn't really much of a novel attack surface, but it did bypass the code signing checks quite handily.
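Roughly the sort of thing involved (a sketch, not my exact steps; assumes the stock /Applications/Slack.app and the `asar` CLI from npm):

    # Unpack the packaged JS, tweak something, and repack it in place
    npx asar extract /Applications/Slack.app/Contents/Resources/app.asar /tmp/slack-src
    # ...edit a file under /tmp/slack-src...
    sudo npx asar pack /tmp/slack-src /Applications/Slack.app/Contents/Resources/app.asar

    # The modified app still launches, but an explicit signature check notices:
    codesign --verify --deep --strict /Applications/Slack.app
    # reports something like "a sealed resource is missing or invalid"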

So, is this a "vulnerability"? That may be a stretch, as far as I can see, but putting application code in Resources/ definitely counts as a "smell" in my book.


Hi, Electron maintainer here.

> I just tried this out with Slack on macOS, and it did work

Here's the thing with how Gatekeeper works: that application had already passed Gatekeeper and will never be _fully_ validated ever again.

If you zipped your modified Slack.app up, uploaded it to Google Drive, and downloaded it again, Gatekeeper would 100% reject that application; the ASAR file is included as part of the application signature. You can prove this by checking the "CodeResources" file in the app's signature.
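
For example (a sketch; standard Slack install assumed):

    # The resource seal is a plist inside the bundle's signature,
    # and it carries a hash entry for the ASAR:
    plutil -p /Applications/Slack.app/Contents/_CodeSignature/CodeResources | grep -A 2 app.asar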

You can't redistribute the app without Gatekeeper completely shutting you down.


Thanks for taking the time to reply! Like many here, I've been a critic of Electron, but I think it also does some amazing stuff, and I'm sorry you have to go into PR maintenance mode over such a weaksauce article.

I was coming back to follow up and say that that's exactly what I found -- running `codesign --verify` does show the modification. It makes sense that Gatekeeper wouldn't re-verify a 185 MB bundle on every launch, which makes me wonder if there's something else macOS could be doing at the FS level to see if any files have been modified and trigger a new check.

At any rate, while I don't quite take back what I said about application code in Resources/, I do take back the implication that it had anything to do with this; I suppose there doesn't seem to be anything Electron-specific about TFA, other than that exposing raw JS lowers the bar for who can write the code to inject. (Assuming you can get FS access to inject code in the first place, of course.)


If you have malicious code executing that has the access and opportunity to modify files, a modified Electron app is likely just the beginning of your troubles.


Developers on Windows, in this scenario, can generate a catalog of all files in their app, sign it, and verify it at runtime [1], negating the need to rely on upstream to incorporate signature support into the ASAR file spec. There may be workable equivalents on macOS and Linux.

[1] https://docs.microsoft.com/en-us/windows-hardware/drivers/in...

But this will all be in vain if the attacker scenario includes unfettered file-system access. (They can modify the app to not perform these checks, for example.)
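
For reference, the authoring side of that looks roughly like the following (a loose sketch based on the linked docs; the .cdf contents, file list, and cert path are placeholders, and the runtime verification piece is omitted):

    REM myapp.cdf lists the files whose hashes go into the catalog, roughly:
    REM   [CatalogHeader]
    REM   Name=myapp.cat
    REM   [CatalogFiles]
    REM   <hash>AppAsar=resources\app.asar

    REM Build the catalog, then sign it with your code-signing cert
    makecat -v myapp.cdf
    signtool sign /fd SHA256 /f mycert.pfx myapp.cat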



