
Basic Electron Framework Exploitation - adambb
https://www.contextis.com/en/blog/basic-electron-framework-exploitation
======
dvt
This is clickbait nonsense. Unfortunately, because it's so popular to hate on
Electron these days, it's going to get a lot of traction on HN and elsewhere.
The premise of the blog post is:

> It’s important to note that this technique requires access to the machine,
> which could either be a shell or physical access to it

I mean... what? I can literally do code injection on (almost) _any_
application I'm running, given that I have shell or physical access to the
machine. It's like the author never heard of Detours[1] or VTable
injection[2]. This is a low-effort, clickbaity post that brings nothing to the
table for serious security researchers or even hobbyist hackers.

It's a shame, too, because there are a lot of very interesting techniques out
there for injection and remote execution, but they are OS-dependent and
require a lot of research. Clearly, a more interesting post would have been
too much effort for OP and instead we're going to pile on Electron.

PS: ASAR code-signing is not fool-proof, as we can still do in-memory
patching, etc. Game hackers have been patching (signed) OpenGL and DirectX
drivers for decades. It's a very common technique.

[1] [https://www.microsoft.com/en-us/research/project/detours/](https://www.microsoft.com/en-us/research/project/detours/)

[2] [https://defuse.ca/exploiting-cpp-vtables.htm](https://defuse.ca/exploiting-cpp-vtables.htm)

~~~
pimterry
I think it's been exacerbated significantly by the reporting elsewhere:
[https://arstechnica.com/information-technology/2019/08/skype...](https://arstechnica.com/information-technology/2019/08/skype-slack-other-electron-based-apps-can-be-easily-backdoored/)

Notably, according to that Ars Technica coverage:

> attackers could backdoor applications and then redistribute them, and the
> modified applications would be unlikely to trigger warnings—since their
> digital signature is not modified

That isn't a claim in the original post, and doesn't seem to be true
afaict: every distribution mechanism I can think of signs the entire
distributable, so you really can't just modify the ASAR without breaking the
signature. Windows & macOS both require you to only install from signed
application bundles/installers (or at least they make it very difficult for
you to use unsigned software). On Linux you could get caught out, but only if
you download and install software with no signing/verification whatsoever, and
that's a whole other can of worms.

If that claim were true this would be a bigger concern, but given that it's
not I'm inclined to agree this is basically nonsense.

~~~
FreakLegion
_every distribution mechanism I can think of signs the entire distributable,
so you really can't just modify the ASAR without breaking the signature.
Windows & macOS both require you to only install from signed application
bundles/installers (or at least they make it very difficult for you to use
unsigned software)_

Only drivers have to be signed on Windows, and even then not all kinds until
Windows 8. Also many apps, including Visual Studio Code, are available in 'run
from USB' form, so there's no installer, just an archive you unpack and run.
Those archives can be modified and redistributed without invalidating any of
the PE signatures within, but since nobody pays attention to these signatures
anyway and Windows doesn't enforce them, yeah, this is typical Black Hat-week
PR nonsense.

~~~
dvt
> Only drivers have to be signed on Windows

This is half-true.

Windows and macOS both make it difficult to install self-signed (or unsigned)
software. For example, I made [http://www.lofi.rocks](http://www.lofi.rocks)
(an open-source Electron-based music player) and I'm not going to spend a few
hundred bucks a year on a non-self-signed cert. This makes both macOS and
Windows complain when users install the app. More draconian practices (that
"protect users from themselves") will make it even harder for independent
open-source devs like me to share cool projects with a wide audience.

~~~
voltagex_
[https://www.certum.eu/en/cert_offer_en_open_source_cs/](https://www.certum.eu/en/cert_offer_en_open_source_cs/)
Free, although you have to submit a worrying amount of personal
identification.

------
felixrieseberg
Electron co-maintainer here, so I'm a bit biased.

1) We should absolutely work towards allowing developers to sign their
JavaScript.

2) Re-packaging apps and including some menacing component as a threat vector
isn't really all that unique. We should ensure that you can sign "the whole"
app, but once we've done that, an attacker could still take the whole thing,
modify or add code, and repackage. We sadly know that getting Windows
SmartScreen and macOS to accept a code signature doesn't necessarily require
exposing your identity, and I'd _suggest_ that most people don't _actually_
check who's signed their code.

3) If you ship your app as a setup bundle (say, an AppSetup.exe, an App.dmg,
or rpm/deb files), you should code-sign the whole thing, which completely
sidesteps this issue. The same is true if you use the Mac App Store, Windows
Store, or Snapcraft Store.

~~~
some_furry
> 1) We should absolutely work towards allowing developers to sign their
> JavaScript.

I've already been working on this for my own projects. It might be something
that can be generalized for all Electron projects.

[https://github.com/soatok/libvalence](https://github.com/soatok/libvalence)

[https://github.com/soatok/valence-updateserver](https://github.com/soatok/valence-updateserver)

[https://github.com/soatok/valence-devtools](https://github.com/soatok/valence-devtools)

This uses Ed25519 signatures and an append-only cryptographic ledger to
provide secure code delivery. The only piece it's currently missing is
reproducible builds.

For greater context: [https://defuse.ca/triangle-of-secure-code-delivery.htm](https://defuse.ca/triangle-of-secure-code-delivery.htm)
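
The append-only ledger idea can be sketched with a plain hash chain. This is a
simplified stand-in, not libvalence's actual implementation -- the real
projects put Ed25519 signatures on top -- but it shows why rewriting history
is detectable:

```python
import hashlib
import json


def _entry_hash(version, package_hash, prev):
    # Canonical encoding so signer and verifier hash identical bytes.
    payload = json.dumps(
        {"version": version, "package_hash": package_hash, "prev": prev},
        sort_keys=True,
    ).encode()
    return hashlib.sha256(payload).hexdigest()


def append_release(ledger, version, package_hash):
    """Append a release record whose hash commits to the previous entry."""
    prev = ledger[-1]["entry_hash"] if ledger else "0" * 64
    record = {
        "version": version,
        "package_hash": package_hash,
        "prev": prev,
        "entry_hash": _entry_hash(version, package_hash, prev),
    }
    ledger.append(record)
    return record


def verify_ledger(ledger):
    """Recompute every link; rewriting any past entry breaks the chain."""
    prev = "0" * 64
    for record in ledger:
        if record["prev"] != prev:
            return False
        if record["entry_hash"] != _entry_hash(
            record["version"], record["package_hash"], prev
        ):
            return False
        prev = record["entry_hash"]
    return True
```

Because each entry commits to its predecessor, a server that silently swaps an
old release for a backdoored one invalidates every later entry.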

~~~
cjbprime
I think you need OS codesigning integration for this threat model. Otherwise
whatever special app runtime check code you add just gets removed by the
malicious overwrite of your app code.

~~~
some_furry
I'm just doing this for secure updates, so that malware doesn't get delivered
through the update mechanism. For precedent, see
[https://core.trac.wordpress.org/ticket/39309](https://core.trac.wordpress.org/ticket/39309)

It isn't meant to mitigate a compromised endpoint.

------
pfraze
I feel like the headline is a bit click-baity but I don't want to jump to
conclusions.

> Tsakalidis said that in order to make modifications to Electron apps, local
> access is needed, so remote attacks to modify Electron apps aren't
> (currently) a threat. But attackers could backdoor applications and then
> redistribute them, and the modified applications would be unlikely to
> trigger warnings—since their digital signature is not modified.

So the issue is that Electron app distributions don't include a signed
integrity check, so there's no way for end users to detect if they got a
modified version. I thought that the macOS builds did do this, but maybe the
ASAR bundles aren't included in the hash, or maybe I'm wrong entirely.

I assume a solution would store the signing pubkey on initial install and
then check updates against that. The only way the signing key could be checked
other than trust-on-first-install would be through some kind of registry,
which is what I assume the Windows and Mac stores are geared toward. Am I
correct on all this?

EDIT: Either way, it seems like the solution is to only use the projects'
official distribution channels. Signed integrity checks would be useful but
probably not change the situation that dramatically. Is that accurate?
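
The trust-on-first-install scheme described above might look something like
this. A minimal sketch only -- the pin-file path and raw-bytes key format are
illustrative, not any real Electron mechanism:

```python
import hashlib
import os


def key_fingerprint(pubkey_bytes):
    """SHA-256 fingerprint of the publisher's public key."""
    return hashlib.sha256(pubkey_bytes).hexdigest()


def check_update_key(pubkey_bytes, pin_path):
    """Trust-on-first-install: pin the key the first time, reject changes after."""
    fp = key_fingerprint(pubkey_bytes)
    if not os.path.exists(pin_path):
        # First install: nothing pinned yet, so record this key and trust it.
        with open(pin_path, "w") as f:
            f.write(fp)
        return True
    # Subsequent updates: the key must match the pinned fingerprint.
    with open(pin_path) as f:
        return f.read().strip() == fp
```

The weakness is exactly the one noted above: the first install is trusted
blindly, which is where a registry (or app store) adds value.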

~~~
cjbprime
I'm still trying to figure it out too:

> I thought that the MacOS builds did do this, but maybe the ASAR bundles
> aren't included in the hash?

Yeah, I think that's the problem they're describing. It sounds like the Mac
setup will require binaries -- like the Electron runtime itself -- to be
codesigned, but if the first thing your codesigned binary does is to read an
unprotected JS file off disk and execute it, there's no codesigning benefit.

> Either way, I assume a solution would store the signing pubkey on
> initial install and then check updates against that

Not just updates that you initiate yourself, though -- I think the idea is
that any other app on the system could backdoor the JS in the ASAR at any
time. That's pretty hard to defend against.

~~~
pfraze
> Not just updates that you initiate yourself, though -- I think the idea is
> that any other app on the system could backdoor the JS in the ASAR at any
> time. That's pretty hard to defend against.

Good point, but if the attacker has filesystem access you're already hosed. I
suppose there could be some other risk where the ASAR could be modified
without full FS access? But I'd want to know what that attack is, if that's
the case.

~~~
cjbprime
> If the attacker has filesystem access you're already hosed.

I think that's not supposed to be true in modern (e.g. latest macOS) threat
models. App Y isn't permitted to just replace App X unannounced, and on both
Mac and Win there's a large codesigning infrastructure in place to provide
that protection.

~~~
applecrazy
Also, sandboxing is designed to prevent unfettered filesystem access on macOS,
meaning this isn’t part of the threat model if all apps are sandboxed and
packaged.

------
Rotten194
I don't see why this is a big deal -- a native app can also be distributed
with malicious patches or DLLs, and those are common methods for e.g. game
modding and cracking. If you're worried about the integrity of a program,
check the hash.
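
Checking a downloaded file against a project's published digest is a few lines
(a generic sketch, not tied to any particular app's release process):

```python
import hashlib


def sha256_file(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 to compare with a published digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```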

------
efficax
If you can write to my binaries, you can do anything you want to me. Boring.

------
saagarjha
> The problem lies in the fact that Electron ASAR files themselves are not
> encrypted or signed

Resources on macOS get signed as part of the application bundle. I wonder why
this isn't possible for Electron apps as well.

~~~
marshallofsound
Hi, Electron maintainer here.

ASAR files are signed as part of the application bundle. The issue is that
folks don't understand how gatekeeper works, so let me try to explain it here.

When you download an application from the internet, macOS initially considers
it "quarantined". When a quarantined application is first opened, gatekeeper
scans it _completely_ and, if it's happy, removes the quarantine tag and lets
it launch.

Once that quarantine tag is removed, gatekeeper will _never_ run a complete
check of that application again. Meaning the ASAR files are validated once,
when the application is first launched.

What people are seeing here is they're taking an application that gatekeeper
has already signed off on, modifying it, and then asking why gatekeeper didn't
stop them.

If you took that modified application, zipped it up, uploaded it somewhere,
downloaded it again and tried to run it, it would NOT work. Gatekeeper would
boot that invalid application to the shadow realm.

~~~
ilikehurdles
So this sounds like a non-issue -- or at least not a new or novel one. How did
this get published so far and wide?

~~~
mceachen
How does _any_ nonsense get published far and wide?

People are trying to be helpful, perhaps, by amplifying some concern, while at
the same time not having the expertise necessary to see it as false.

------
f00b4r666
This article seems a bit clickbait-y considering this means that you'd have to
download the application from an untrusted source for this "exploit" to be
taken advantage of. The same could be said for most applications if people
aren't checking that the hashes match.

I feel like this will get a ton of discussion here anyway due to the Electron
hate train.

~~~
blackflame7000
What if you installed it via a trusted source, and then someone swapped the
ASAR files without your knowledge? A flash-drive programmed to operate as a
keyboard could easily swap in a malicious file simply by plugging it into the
victim's computer when they aren't paying attention.

~~~
volkk
can't you do far worse if you're actually plugging in and running code from a
flash drive on someone's computer?

~~~
blackflame7000
Perhaps, depending on what sort of anti-virus/monitoring software is
installed. It would definitely leave a bigger trace to install, run, and
persist a malevolent executable than to hijack an already trusted one. If you
saw a random exe running in Task Manager, you would be much more paranoid than
if you just saw Slack.

I guess a better example might be if you have two admins on one computer and
one could edit the files in the Programs directory to spy on the other. This
assumes that only trusted executables are run by the victim (i.e. Word) and
you don't have the ability to modify its source code to make it malicious.

------
hnbroseph
You could say the same about (for example) a Python-based QT app. Or any
scripting-language based application or framework.

It's also true to say something like "Rails can be back-doored by modifying
the code and redistributing it to unsuspecting developers!"

Actually, you could say the same or similar things about many applications,
including binary distributions. With some analysis you can figure out what
conditions a jump instruction is using, and modify it to always jump where you
want. Cheat Engine lets you analyse game memory at runtime and substantially
modify behavior.

------
davej
Here's the corresponding issue on Github:
[https://github.com/electron/asar/issues/123](https://github.com/electron/asar/issues/123)

As you can see from the issue, this exploit has been known for 2 years and
probably longer than that. As I said (November 2018) in the linked issue, I
believe it's only a matter of time before Skype/Slack/VSCode gets packaged up
with malicious code and flies under the radar of SmartScreen and Gatekeeper.
It probably won't be downloaded from the official websites but there are
plenty of other ways of distributing the software. I get the feeling that the
Electron team aren't taking it too seriously. I think this has the potential
for a really dangerous exploit.

My startup (ToDesktop[1]) uses Electron and I've put a huge effort into
securing certificates on HSMs (Hardware Security Modules). But it's mostly a
pointless exercise when a hacker can simply edit the JavaScript source.

[1] [https://www.todesktop.com/](https://www.todesktop.com/)

------
howlett
I wrote the original post. The main issue I was trying to highlight is that
you can make signed apps run your code from a local perspective. Here's a
real-life scenario that happened:

I was doing a security assessment for a client, and after gaining a foothold
on the host we needed to establish persistence. As the endpoint protection was
blocking anything unsigned, I used Slack to inject a PowerShell payload that
executed on startup and gave us access back to the internal network.

So the risk is there, not for the individual user but for the organisations
using it. I didn't expect this to become a big deal over "redistribution"; I
hoped the focus would be on command execution without modifying the signed
binary.

Having said that, this can be solved with a simple integrity check of the
ASAR files. Sure, the attacker can modify the binary file too, but then it's
not signed anymore.
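
Such a startup check could be as simple as hashing app.asar against a digest
baked into the signed binary at build time -- a hypothetical sketch of the
idea, not Electron's actual behavior:

```python
import hashlib


def verify_asar(asar_path, expected_sha256):
    """Return True only if the archive still matches the build-time digest."""
    h = hashlib.sha256()
    with open(asar_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256


def launch_app(asar_path, expected_sha256):
    """Refuse to start if app.asar was modified after signing."""
    if not verify_asar(asar_path, expected_sha256):
        raise SystemExit("app.asar integrity check failed; refusing to launch")
    # ... hand off to the real application entry point here ...
```

Since the expected digest lives inside the signed executable, an attacker who
edits the ASAR must also patch the binary, which breaks its signature.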

------
pimterry
I'm unclear about how this attack works. The article says:

> attackers could backdoor applications and then redistribute them

Most distribution mechanisms however ship a single signed bundle, containing &
thereby signing the entire application, including resources like ASARs. Any
that don't sign the application are of course vulnerable to all sorts of
trivial attacks (replace the whole binary with anything you like).

To make this a danger from a distribution POV, it seems you would need the
application to be _partly_ signed; i.e. the executable but not the included
resources. Where does that happen?

For macOS for example, all resources (including ASAR files) are signed, and
macOS makes it intentionally difficult to install anything that isn't signed.

Similarly for Windows you'll see large warnings if you open an unsigned
application; Electron apps are almost always distributed as a single signed
installer exe file, including the ASAR file.

On Linux it depends wildly, but most of the time either the entire package
(e.g. a deb from the official repos) is signed, or nothing is signed and
you're vulnerable regardless.

What am I missing?

(I'm not addressing the risk of altering an already-installed application --
that's a separate attack also mentioned, but it requires local access to
rewrite files on the target machine, at which point there are many other
options.)

EDIT: URL has now been updated; here I'm discussing points from
[https://arstechnica.com/information-technology/2019/08/skype...](https://arstechnica.com/information-technology/2019/08/skype-slack-other-electron-based-apps-can-be-easily-backdoored/).
The post now referenced doesn't mention redistribution, and I
suspect that in fact Ars is wrong, and allowing signed redistribution of
subverted versions isn't a real vulnerability here. I'd love to hear if I'm
wrong though!

~~~
seandougall
> For macOS for example, all resources (including ASAR files) are signed, and
> macOS makes it intentionally difficult to install anything that isn't
> signed.

I just tried this with Slack on macOS, and it launched without a single
complaint about code signing. It would appear that either the ASAR files are
not included in the signature, or the OS doesn't check the entire application
bundle on every launch.

(Edit: That said, I needed sudo to do the mod in the first place, so I'm not
about to start panicking about this as an attack vector.)

(Edit 2: As 'marshallofsound pointed out below and elsewhere, it is the latter
case; the OS doesn't check the entire bundle on every launch. Which makes
sense, and also means TFA is not really about Electron at all.)

~~~
marshallofsound
Hi, Electron maintainer here. I explained how gatekeeper and ASAR validation
play into macOS codesigning here:
[https://news.ycombinator.com/item?id=20637791](https://news.ycombinator.com/item?id=20637791)

------
thrax
This is pure FUD. Literally "hacker with her hands on your keyboard can
compromise your machine."

~~~
seandougall
"... but only if you give her sudo access"

------
barnson
The high-order bit is that if you install your apps to user-writable locations
in the file system, your app is vulnerable to any other app the user runs.
There's no reason Electron apps _can't_ be installed to protected locations.
VSCode provides a "system installer" that does, for example (on Windows).
However, updates then require elevation, so to reduce friction the per-user
installer is the recommended default for VSCode.

------
SamuelAdams
For those that do not read the article:

> Tsakalidis said that in order to make modifications to Electron apps, local
> access is needed, so remote attacks to modify Electron apps aren't
> (currently) a threat. But attackers could backdoor applications and then
> redistribute them, and the modified applications would be unlikely to
> trigger warnings—since their digital signature is not modified.

------
unnouinceput
Oh, I have power over my own applications? Thanks, Captain Obvious. Those who
also read The Old New Thing know what I'm talking about.

------
c-smile
Ideally, an HTML/CSS/script application would be a single monolithic signed
executable.

But that's achievable only with Sciter :)

------
sctb
We've updated the link from [https://arstechnica.com/information-technology/2019/08/skype...](https://arstechnica.com/information-technology/2019/08/skype-slack-other-electron-based-apps-can-be-easily-backdoored/),
which points to this.

~~~
adambb
The Contextis site was overloaded at submission time (and still is as I write
this).

Here is the Google Cache link:
[https://webcache.googleusercontent.com/search?q=cache:xeIOGz...](https://webcache.googleusercontent.com/search?q=cache:xeIOGzxMftkJ:https://www.contextis.com/en/blog/basic-electron-framework-exploitation+&cd=1&hl=en&ct=clnk&gl=us)

------
mavdi
Electron is the new Flash. Change my mind. ¯\\_(ツ)_/¯

~~~
bsmith0
It makes cross-platform desktop development much easier.

VSCode, Discord, and (the new) Slack are written in Electron, and those
absolutely excel, even at the cost of a bit of extra memory usage.

There's a circlejerk of Electron hate, but there's a reason it's so popular:
the ease of development outweighs the memory drawbacks for many companies and
individuals.

Edit: Not to mention that it uses Node.js, HTML, and CSS, so moving from
web apps => desktop becomes a much simpler endeavor.

~~~
packet_nerd
Slack excels? In my experience it's really slow and clunky and consumes way
more memory than it should.

As the sysadmin at my company I ban all electron apps unless it's clear they
are exceptionally well written and/or there are absolutely no alternatives.
VSCode is really good and one of my few exceptions. I strongly suspect it
would have been even better if they had developed with a more performant
platform, but who knows.

Edit: To expand on my reasoning:

Say an operation on a well-built, performant application is 5 seconds faster
than the Electron (or otherwise bloated) version. Say employees do that
operation on average 20 times a day. Say I have 2500 employees who work 246
days a year and get paid $25/h on average. The slow version will cost the
company $427,083 every year. That's the amount of money I'd be willing to
spend per year for the fast version of this hypothetical application.
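
The arithmetic above works out as claimed:

```python
employees = 2500
ops_per_day = 20
seconds_saved = 5      # per operation, fast app vs. slow app
work_days = 246
hourly_rate = 25       # dollars

hours_lost = employees * ops_per_day * seconds_saved * work_days / 3600
annual_cost = hours_lost * hourly_rate
print(round(annual_cost))  # 427083
```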

A company like Slack has hundreds of thousands of users, and the poor
performance must be costing _someone_ millions. It boggles my mind that, with
all that money, they still can't find the resources to make a performant
application.

And that's the naive calculation; there's also the administrative aspect of
installing, upgrading, and supporting the application. (The worse the
application's quality, the more time I, who am paid a lot more than $25/h,
spend supporting it.) While there are multiple variables here, a development
team that prioritizes "easy and fast" development doesn't inspire me with
confidence that they have also prioritized building a quality product.

~~~
applecrazy
The new Slack app, introduced last month, makes using it feel snappy. It also
uses significantly less RAM.

------
throwaway8879
At this point in time, it's reasonably healthy to assume that everything has
backdoors. The only place where information can be kept safe and hidden is
deep within our minds. Any method used to share said information with another
human being is subject to surveillance and backdoors. Only share what you
don't mind being read by the state and its friends.

~~~
he0001
For some reason, the key to my mind’s backdoor is beer.

~~~
r3bl
I don't drink much, but a $5 wrench would probably work on me.

~~~
mcbits
The beer and wrench won't work on me, but the $5 might.

------
StreamBright
It is amazing to see how large, fat, over-engineered frameworks are taking
over the internet. Not only are they easy to backdoor, they usually consume an
enormous amount of memory and CPU. Not sure how we ended up here.

~~~
t-writescode
Cross-platform GUIs are hard or ugly, and HTML+CSS came to save the day.

~~~
metalliqaz
And javascript[1], don't forget that lovely language.

[1]
[https://www.destroyallsoftware.com/talks/wat](https://www.destroyallsoftware.com/talks/wat)

