
Electron co-maintainer here, so I'm a bit biased.

1) We should absolutely work towards allowing developers to sign their JavaScript.

2) Re-packaging an app and including some malicious component isn't really a unique threat vector. We should ensure that you can sign "the whole" app, but even once we've done that, an attacker could still take the whole thing, modify or add code, and repackage it. We sadly know that getting Windows SmartScreen and macOS to accept a code signature doesn't necessarily require exposing your identity, and I'd _suggest_ that most people don't _actually_ check who's signed their code.

3) If you ship your app as a setup bundle (say, an AppSetup.exe, an App.dmg, or rpm/deb files), you should code-sign the whole thing, which completely sidesteps this issue. The same is true if you use the Mac App Store, Windows Store, or Snapcraft Store.




> 1) We should absolutely work towards allowing developers to sign their JavaScript.

I've already been working on this for my own projects. It might be something that can be generalized for all Electron projects.

https://github.com/soatok/libvalence

https://github.com/soatok/valence-updateserver

https://github.com/soatok/valence-devtools

This uses Ed25519 signatures and an append-only cryptographic ledger to provide secure code delivery. The only piece it's currently missing is reproducible builds.
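As a rough sketch of what the client-side verification step can look like (this uses tweetnacl for the Ed25519 check; the file names and pinned key are illustrative placeholders, not libvalence's actual API):

    // Minimal sketch: verify a downloaded update against a pinned
    // Ed25519 public key before installing it. File names and the
    // hex-encoded key below are placeholders.
    const fs = require('fs');
    const nacl = require('tweetnacl');

    // 32-byte public key baked into the app at build time.
    const publicKey = Uint8Array.from(Buffer.from('<hex public key>', 'hex'));

    function verifyUpdate(bundlePath, sigPath) {
      const bundle = new Uint8Array(fs.readFileSync(bundlePath));
      const sig = new Uint8Array(fs.readFileSync(sigPath)); // 64-byte detached sig
      return nacl.sign.detached.verify(bundle, sig, publicKey);
    }

    if (!verifyUpdate('update.asar', 'update.asar.sig')) {
      throw new Error('Signature check failed; refusing to install update');
    }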

For greater context: https://defuse.ca/triangle-of-secure-code-delivery.htm


I think you need OS codesigning integration for this threat model. Otherwise, whatever runtime signature check you add to the app just gets removed by the same malicious overwrite of your app code.


I'm just doing this for secure updates, so that malware doesn't get delivered through the update mechanism. For precedent, see https://core.trac.wordpress.org/ticket/39309

It isn't meant to mitigate a compromised endpoint.


I don't think it's correct that 3) sidesteps the issue, if I'm understanding it. Electron App is installed via a codesigned setup bundle. Then Malicious App runs on the machine later and overwrites your ASAR. The OS doesn't complain because the ASAR isn't receiving codesigning protection, and Electron App has been backdoored in a way that the system's use of codesigning suggests wouldn't be possible.


If you're already running code on the victim's machine, presumably with sudo rights to change `/Applications`, you've already hit the jackpot. Yes, you can change apps, but if you're the victim, that's _probably_ not the biggest issue. It's the rootkit on your machine.


This (FS write access == game over) is usually true on Linux, but the Mac and Windows codesigning infrastructures exist to offer some protections and user warnings in this case, and they're what's being defeated by this attack.


With FS access you can just strip the signature entirely and it’ll run without any fuss. In this case it’s the machine that’s compromised, not the app.


macOS is getting rid of the ability to run unsigned kernel extensions pretty soon. Compiled off-the-shelf RATs are usually lit up pretty well by modern AV, as VirusTotal results show. And a noisy python/ruby/whatever executable on the marketing person's computer would raise a few eyebrows in some organizations. Slack/Discord, on the other hand...


3) is not a valid protection on macOS once the application is copied away from the signed DMG (which is then discarded).

macOS code signing does not extend to Contents/Resources/, which, unfortunately, is where every application on my system, without exception, stores 'electron.asar'.

    /Applications/VMware Fusion.app/Contents/Library/VMware Fusion Applications Menu.app/Contents/Resources/electron.asar
    /Applications/balenaEtcher.app/Contents/Resources/electron.asar
    /Applications/itch.app/Contents/Resources/electron.asar
    /Applications/lghub.app/Contents/Resources/electron.asar
    /Applications/Boxy SVG.app/Contents/Resources/electron.asar
    /Applications/Slack.app/Contents/Resources/electron.asar
    /Applications/Discord.app/Contents/Resources/electron.asar


This response from elsewhere [1] seems relevant:

> Here's the thing with how gatekeeper works, that application had already passed gatekeeper and will never be _fully_ validated ever again.

> If you zipped your modified Slack.app up, uploaded it to Google Drive, and downloaded it again, Gatekeeper would 100% reject that application: the ASAR file is included as part of the application signature. You can prove this by checking the "CodeResources" file in the app's signature.

> You can't re-distribute the app without gatekeeper completely shutting you down.

[1]: https://news.ycombinator.com/item?id=20637738


Hooray! I am glad to be wrong. For others looking to test this,

    $ codesign -dv /Applications/xyz.app
    ...
    Sealed Resources version=2 rules=13 files=122
    ...
With version=2 signatures, the seal covers all bundle resources (including Contents/Resources/).
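To actually re-check the seal on disk (rather than just print the signing metadata), something like this works; tampering with the ASAR makes verification fail:

    $ codesign --verify --deep --strict /Applications/xyz.app
    $ echo tamper >> "/Applications/xyz.app/Contents/Resources/electron.asar"
    $ codesign --verify --deep --strict /Applications/xyz.app
    /Applications/xyz.app: a sealed resource is missing or invalid

As noted upthread, though, Gatekeeper only runs this kind of full validation on quarantined downloads, not on every launch, which is why the in-place overwrite goes unnoticed.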


Electron developer here. I work on this project:

https://getpolarized.io/

We ship binaries for macOS, Linux, and Windows. ALL our binaries are signed. You're INSANE if you don't do it. It's still a MAJOR pain though, and I wish it were a lot easier.

If ANYTHING, what we need to do is make it easier for macOS and Windows developers to ship code-signed binaries.

It took me about 2-3 weeks to actually get them shipped. Code signing is very difficult to set up, and while Electron tries to make it easy, it's still rather frustrating.

The biggest threat to an Electron app is its configuration: permissions like disabling web security. If you make silly decisions there, an attacker may be able to get Electron to do privilege escalation.
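A minimal sketch of keeping those options locked down (these are real BrowserWindow settings; the preload path is illustrative):

    // Minimal sketch: a BrowserWindow that keeps Electron's
    // security-relevant options locked down instead of loosening them.
    const path = require('path');
    const { BrowserWindow } = require('electron');

    const win = new BrowserWindow({
      webPreferences: {
        nodeIntegration: false,   // don't hand Node.js to remote content
        contextIsolation: true,   // keep the preload's world separate
        webSecurity: true,        // never turn off the same-origin policy
        // Expose only a narrow, vetted API surface instead:
        preload: path.join(__dirname, 'preload.js')
      }
    });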


Can confirm. It took the better part of a month to get both Windows and Mac code signing certificates provisioned for PhotoStructure.

The diligence applied on both platforms at least exceeded pure security theater: they actually made a modicum of effort to verify I was who I said I was, but it wasn't much. It just took a lot of wall time.


Which is really weird. A Let's Encrypt-style approach to validating ownership of a domain should be sufficient: if the app comes from a domain you trust, that should be enough for most apps, with bonus checks for high-risk applications (banking/LoB, etc.).


I don't think it's analogous.

If you need a certificate to prove you own a domain, changing a DNS TXT record for that domain, or serving a secret from that domain, proves you own it.

If I need a certificate that proves I am the corporate entity named on a signature, say "PhotoStructure, Inc.", there isn't some magic TXT record I can add that uniquely identifies me as the owner of that business.
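That's the asymmetry: the domain-proof half is trivially automatable, the corporate-identity half isn't. A minimal sketch of the automatable half, using Node's built-in resolver (the record name and token format here are made up for illustration, not any real CA's protocol):

    // Minimal sketch: check that a domain publishes an expected
    // challenge token, DNS-01 style. Record name and token are
    // hypothetical.
    const dns = require('dns').promises;

    async function domainProvesToken(domain, expectedToken) {
      // resolveTxt returns an array of records, each split into chunks.
      const records = await dns.resolveTxt('_codesign-challenge.' + domain);
      return records.some((chunks) => chunks.join('') === expectedToken);
    }

    // domainProvesToken('photostructure.com', 'abc123') => true/false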



