
This is clickbait nonsense. Unfortunately, because it's so popular to hate on Electron these days, it's going to get a lot of traction on HN and elsewhere. The premise of the blog post is:

> It’s important to note that this technique requires access to the machine, which could either be a shell or physical access to it

I mean... what? I can literally do code injection on (almost) any application I'm running given that I have shell or physical access to the machine. It's like the author never heard of Detours[1] or VTable injection[2]. This is a low-effort, clickbaity post that brings nothing to the table for serious security researchers or even hobbyist hackers.

It's a shame, too, because there are a lot of very interesting techniques out there for injection and remote execution, but they are OS-dependent and require a lot of research. Clearly, a more interesting post would have been too much effort for OP and instead we're going to pile on Electron.

PS: ASAR code-signing is not fool-proof, as we can still do in-memory patching, etc. Game hackers have been patching (signed) OpenGL and DirectX drivers for decades. It's a very common technique.

[1] https://www.microsoft.com/en-us/research/project/detours/

[2] https://defuse.ca/exploiting-cpp-vtables.htm




I think it's been exacerbated significantly by the reporting elsewhere: https://arstechnica.com/information-technology/2019/08/skype...

Notably, according to that Ars Technica coverage:

> attackers could backdoor applications and then redistribute them, and the modified applications would be unlikely to trigger warnings—since their digital signature is not modified

That isn't a claim made in the original post, and doesn't seem to be true afaict: every distribution mechanism I can think of signs the entire distributable, so you really can't just modify the ASAR without breaking the signature. Windows & macOS both require you to only install from signed application bundles/installers (or at least they make it very difficult for you to use unsigned software). On Linux you could get caught out, but only if you download and install software with no signing/verification whatsoever, and that's a whole other can of worms.
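
For what it's worth, this is easy to check on macOS, where the bundle seal covers every file under the .app, including the ASAR. A rough sketch (the Slack path is just an illustrative example, not something from the post):

    # macOS only: 'codesign --verify' re-checks the seal over the whole bundle,
    # so any change to Contents/Resources/app.asar makes verification fail.
    import subprocess

    def bundle_signature_ok(app_path):
        result = subprocess.run(
            ["codesign", "--verify", "--deep", "--strict", app_path],
            capture_output=True,
        )
        return result.returncode == 0

    print(bundle_signature_ok("/Applications/Slack.app"))
    # Append a single byte to the app.asar inside that bundle and this flips
    # to False, because the recorded resource hashes no longer match.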

If that claim were true this would be a bigger concern, but given that it's not I'm inclined to agree this is basically nonsense.


> every distribution mechanism I can think of signs the entire distributable, so you really can't just modify the ASAR without breaking the signature. Windows & macOS both require you to only install from signed application bundles/installers (or at least they make it very difficult for you to use unsigned software)

Only drivers have to be signed on Windows, and even then not all kinds until Windows 8. Also many apps, including Visual Studio Code, are available in 'run from USB' form, so there's no installer, just an archive you unpack and run. Those archives can be modified and redistributed without invalidating any of the PE signatures within, but since nobody pays attention to these signatures anyway and Windows doesn't enforce them, yeah, this is typical Black Hat-week PR nonsense.
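
To make the 'run from USB' point concrete: the Authenticode signature is embedded in the PE file itself and nothing signs the surrounding archive, so you can repack the archive (or edit the ASAR sitting next to the exe) and the embedded signature still verifies. A rough sketch, assuming Windows with signtool from the Windows SDK on PATH; the paths and filenames are illustrative, not taken from the post:

    import subprocess
    import zipfile

    PORTABLE_ZIP = "VSCode-win32-x64.zip"   # hypothetical portable download
    WORK_DIR = "vscode-unpacked"

    with zipfile.ZipFile(PORTABLE_ZIP) as z:
        z.extractall(WORK_DIR)

    # Modify a resource next to the executable (a typical Electron layout puts
    # the app code in resources/app.asar; the exact layout varies per app).
    with open(f"{WORK_DIR}/resources/app.asar", "ab") as f:
        f.write(b"\0")

    # The signature embedded in the exe is untouched and still verifies,
    # because it covers only the executable itself, not its neighbours.
    subprocess.run(["signtool", "verify", "/pa", f"{WORK_DIR}/Code.exe"], check=True)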


> Only drivers have to be signed on Windows

This is half-true.

Windows and macOS both make it difficult to install self-signed (or unsigned) software. For example, I made http://www.lofi.rocks (an open source Electron-based music player) and I'm not going to spend like a few hundred bucks a year to have a non-self-signed cert. This makes both macOS and Windows complain when users install the app. More draconian practices (that "protect users from themselves") will make it even harder for independent open source devs like me to share cool projects with a wide audience.


https://www.certum.eu/en/cert_offer_en_open_source_cs/ is free, although you have to submit a worrying amount of personal identification.



This is like saying a Python or Ruby application could be exploited if someone snuck code into your machine. This is a known scripting language "flaw" that nobody cares deeply about.


It is something that gets attention when you have a defense-in-depth approach. Look at https://www.python.org/dev/peps/pep-0551/ and https://www.python.org/dev/peps/pep-0578/ for Python.
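
For anyone who hasn't looked at them, PEP 578 (Python 3.8+) adds runtime audit hooks that can observe, and if you want veto, events like imports, compiles, and subprocess launches. A minimal sketch of the idea:

    import sys

    def audit(event, args):
        # A real policy could raise here to veto an event; this one just logs
        # a few interesting ones.
        if event in ("import", "compile", "subprocess.Popen"):
            print(f"audit: {event} {args!r}", file=sys.stderr)

    sys.addaudithook(audit)

    import json                          # raises an "import" audit event
    compile("1 + 1", "<demo>", "eval")   # raises a "compile" audit event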


Why just scripting languages? Can't I replace compiled binaries as well?


I’m not sure why you’re so upset by this. Electron is installed on our machines and deserves to be scrutinized.

The author presents the info clearly and even includes videos demonstrating the “technique,” so it doesn’t seem “low effort” and click-baity to me.

I’m not sure I can support your view that this is unworthy of attention or a fix because of in-memory patching, etc. If I told my customers not to worry about my product because there are much scarier ways they can get hacked elsewhere, they would still ask why I didn’t put my best effort into closing a known loophole.


It's clickbaity and low-effort because this is no more an "exploit" than running a random .exe is an "exploit." It can be "fixed" by always installing software from trusted vendors and not running random executables you download from IRC. In other words, it doesn't even really qualify as an attack vector. Electron isn't any more vulnerable than any given native app.

Compare that with an actual Chromium RCE vulnerability (a very clever PDF heap corruption exploit): https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1748...


The interesting thing to me is that the techniques you are talking about get lit up like a Christmas tree by some modern endpoint protection products, whereas a backdoored Electron app is squeaky clean on Virustotal...


This is untrue. A "backdoored" native app (i.e. an app where an executable or DLL was modified) would also be squeaky clean.


No, if you embedded malicious code (say a full-blown RAT, like this tool gives you) into an exe, modern AV models will do static analysis of that code and flag it as potentially malicious. Because none of the JavaScript code in this backdoored Electron app is even looked at by any engine (none of the engines on VirusTotal do analysis of JavaScript), the binary features are indistinguishable from the legitimate version.

Backdoored CCleaner flagged as malicious by multiple ML-based products: https://www.virustotal.com/gui/file/6f7840c77f99049d788155c1...

Backdoored Xmanager flagged by multiple ML-based products: https://www.virustotal.com/gui/file/d484b9b8c44558c18ef6147c...

Countless other examples.


True, but those are backdoored apps whose signatures have been identified and stored in some AV database. The solution is (provably) impossible to generalize with static analysis. Clearly, it's also reactive (people need to report the backdoored application before you know its signature). There are also fairly well-documented ways to get around this signature approach to AV (polymorphism comes to mind).


No, signatures have nothing to do with this. The ML models embedded in those products (and what is evaluated on VirusTotal) can flag legit software that has had a RAT executable inserted into it by modifying the binary. The ML models are trained on thousands of features and are pretty good at classifying malware. USCYBERCOM has been tweeting APT malware that was not seen by these models or anyone in the public, and yet was still flagged: https://twitter.com/CNMF_VirusAlert . That would be completely impossible if these products were relying on signatures.

Regardless, the entire point I was making in my original comment is that this article is far from clickbait nonsense, because you have a chance, significant from what I've seen, of flagging something like the backdoored pieces of software I linked or never-before-seen malware like in the tweets above, because the malware exists as compiled code. JavaScript is currently not evaluated whatsoever by ANY software security product, so the chance of it being flagged and blocked is 0. Signatures and polymorphism are 10 years ago, quite frankly. Backdoored Slack exfilling data in steganographic images over HTTPS to giphy.com and Instagram and Twitter and shit is one future realm of malware. Both the binary and the network traffic are completely indistinguishable from legitimate usage.


The claim is it's easier to bypass some app integrity protection mechanisms when the target is an Electron app.


"easier" than what? and is it particularly noteworthy? if malicious code has write access to any given app's constituent files, there's effectively no app that's hard to subvert.


Easier than apps that are better covered by system app integrity protection? I'm not sure what's unclear about this, it's right in the writeup.


If you're talking about installing apps, every installed app needs to be signed (unless you ignore Windows/macOS warnings). If you're talking about injection or modifying program files (be they executables, DLLs, or ASARs) post-install, every app is equally vulnerable. There is no functional difference between a native app and an Electron app in that regard, so maybe you can clarify what you mean by "system app integrity protection."


> so maybe you can clarify

I didn't write this thing, I'm just saying that the claims it makes are not the claims you say it makes. 'Functionally equivalent' is a bit like 'Turing complete' - it makes it easy to say something so true it's not actually interesting.

It's not some major discovery or controversial claim that Electron apps are an even more convenient and easier-to-leverage vector for exploitation than regular old binaries. But writing some blog post about it (they didn't give the vuln a name, they didn't rent it shoes, they aren't buying it a beer) does not warrant the weird invective you're throwing at it.


I wasn't trying to be snippy, I genuinely didn't understand what you meant since the term "system app integrity protection" isn't anywhere in the original blog post. Also, just to clarify, by "functionally equivalent" I meant "exactly the same."



