
Critical vulnerability of NPM package macaddress - tekkk
https://nodesecurity.io/advisories/654
======
koolba
Here's the offending line of code[1]:

    exec("cat /sys/class/net/" + iface + "/address", function (err, out) {

That's some serious amateur hour and may even be a new contender for the
Useless Use of Cat (and "exec(...)") Award.

It would be simpler and less error-prone to just read the file via
fs.readFile(...) after verifying that the iface parameter does not contain a
directory separator.

[1]: https://github.com/scravy/node-macaddress/blob/dd079620d11c91957ac897e88d539dfceec41d56/lib/linux.js#L4

~~~
tomxor
Nodejs API docs for exec say:

> Never pass unsanitized user input to this function. Any input containing
> shell metacharacters may be used to trigger arbitrary command execution.

I've always wondered what exactly the best way to do this is... searching
about sanitising user input for shells always turns up suggestions for
something quite extreme, like stripping non-alphanumerics or similar. But much
of the time user inputs are just passed as parameters or built into paths, as
in this case, so I wondered: is it enough to quote them instead? We don't need
the features of double quotes when building the string in some other language,
in which case the rules for single quotes apply:

From GNU Bash 3.1.2.2 Single Quotes:

> Enclosing characters in single quotes (‘'’) preserves the literal value of
> each character within the quotes. A single quote may not occur between
> single quotes, even when preceded by a backslash.

In which case, is it possible to get away with just "sanitizing" for single
quote usage and then single-quoting all parameters?:

    exec(`cat '/sys/class/net/${iface.replace(/'/g, '')}/address'`, () => {})

Is it possible to break out of that? I think in order to use && or ; to start
a new command you need to end the string context, but .replace(/'/g, '') seems
to remove that ability, e.g.

    iface = `'; touch /uhoh;`

results in the following string passed to exec

    cat '/sys/class/net/; touch /uhoh;/address'

    => cat: '/sys/class/net/; touch /uhoh;/address': No such file or directory

Also you can't do further exec inside single quotes with backticks e.g

    cat '/sys/class/net/` touch /uhoh `/address'

    => cat: '/sys/class/net/` touch /uhoh `/address': No such file or directory
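
For what it's worth, the classic shell-quoting technique preserves embedded
single quotes rather than stripping them: each `'` is rewritten as `'\''`
(close the quoted string, emit an escaped quote, reopen it). A sketch, with a
hypothetical helper name:

```javascript
// Hypothetical helper: single-quote a string for a POSIX shell.
// Each embedded ' becomes '\'' so the value survives verbatim,
// including names like bad'name that plain stripping would mangle.
function shellQuote(s) {
  return "'" + String(s).replace(/'/g, "'\\''") + "'";
}

// The injection attempt stays one literal argument:
// shellQuote(`'; touch /uhoh;`) === `''\''; touch /uhoh;'`
```

Stripping, by contrast, is lossy: it silently changes which file you are
asking for whenever the name legitimately contains a quote.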

~~~
koolba
Escaping and quoting is the wrong way to go. Even if it works you're
technically not totally correct as you can't reference my network interface
named "bad'name"[1].

If you can avoid fork/exec you're always better off. In this case reading the
file directly is fine. I can't think of a valid reason to fork and invoke cat.

If you have a legit reason to fork and run, then use execFile(...)[2], as you
can pass the arguments as an actual array. No escaping is needed, as each
argument is passed through whole, without shell parsing. Your example would be
something like:

    execFile('/bin/cat', ['--', `/sys/class/net/${iface}/address`],
             (error, stdout, stderr) => { ... })

Though that has the issue of being able to pass in ../../../ and read a file
named "address" anywhere on the filesystem. Should also add in a regex
validation that no directory separators are in the text of iface.

[1]: Yes this sounds stupid but it's a valid file name. A more realistic one
may be something like "Alice's Wi-Fi".

[2]: https://nodejs.org/api/child_process.html#child_process_child_process_execfile_file_args_options_callback

~~~
tomxor
> Escaping and quoting is the wrong way to go. Even if it works you're
> technically not totally correct as you can't reference my network interface
> named "bad'name"[1]. If you can avoid fork/exec you're always better off. In
> this case reading the file directly is fine. I can't think of a valid reason
> to fork and invoke cat.

Yes, I get that; I should have clarified: I mean the general use of exec (i.e.
a valid and necessary use).

> If you have a legit reason to fork and run then use execFile(...)

You are absolutely right, execFile is a much better way, I've used it before
and I guess you are right - in _most_ cases this should be the #1 choice.

[edited] - irrelevant details of obscure case where execFile isn't enough.

Anyway, in a case where exec through a shell is necessary for some reason, I'm
wondering if this method of single quoting is actually safe or not. As far as
I can see it is, but I'm not a security expert.

------
georgyo
Many people here are saying that the problem has been in the code base for
years, was super obvious, and that just no one looked at the code.

I think it is much more relevant that the node community uses so many micro
packages that there is too much noise to look at the quality of pulled-in
packages.

I recently pulled down a web app (Zulip); its package.json expands to almost
1400 npm packages. How is any human going to look at all that? And apparently
it is "normal" for an npm project to have that many dependencies.

To me this is madness. It is impossible to know the quality of all those
packages.

~~~
pi-squared
I am not a fan of the whole javascript craze, but for comparison, on my
desktop machine with what I assume is an average setup of ElementaryOS:

    $ apt list --installed | wc -l
    2394

And on my server the number is 502. Couldn't the same argument be applied here
as well? I understand that (probably) the maintainers of Ubuntu packages are
more involved than those supporting npm packages (or... are they?), but it's
still a lot of packages to eyeball.

~~~
georgyo
Sure, but the major difference is that those packages are for everything on
your system. Dependencies are often added very carefully.

In the node case it is not uncommon for a small app to include 50 small
packages, each of which then pulls in 30 unique smaller packages (on average).

Imagine you did an apt upgrade on your system, or decided to install a new
package and it said it was going to install 1500 new packages. Would you
continue? It would definitely make me pause for a while. But this is the norm
in the node world, and people seem to think it is a good idea.

------
jedisct1
npmjs shows 1,325,674 weekly downloads.

The current version was released 3 years ago.

The code is small and straightforward.

Using exec("cat x " + userdata) to read a file in any language should
immediately drive any developer nuts.

Apparently, nobody ever looked at the code. Or maybe some people did, but
never reported it. Pretty sure some companies did, fixed it in their closed
local fork and never notified the author nor contributed their fix back.

This is really scary, and really bad for the trustworthiness of open-source
software.

------
robin_reala
This has been biting a lot of people as the mostly unmaintained cssnano uses
it, which in turn is used by create-react-app. Previously this would have
probably gone unnoticed, but npm audit in npm6 has brought it front and
centre.

~~~
andyana
It's amazing that something like cssnano would have a dependency on getting
MAC addresses. WTF?

~~~
roblabla
I maintain a package that has an indirect dep on macaddress because of UUID
generation: UUID v1 is generated from MAC address + timestamp. My package only
generates UUID v4 (fully random), and even if it didn't, the uuid dep doesn't
expose the vulnerable argument. As such it isn't vulnerable. But the dep is
still there.

This isn't surprising at all tbh.

~~~
bluetech
A dependency for uuid4 is hardly needed:

      const crypto = require('crypto');

      function uuid4() {
          const bytes = crypto.randomBytes(16);
          bytes[6] = (bytes[6] & 0x0f) | 0x40; // version 4
          bytes[8] = (bytes[8] & 0x3f) | 0x80; // RFC 4122 variant
          return bytes.toString('hex').match(/(.{8})(.{4})(.{4})(.{4})(.{12})/).slice(1).join('-');
      }

~~~
roblabla
I mean, sure. But I also need to parse UUIDs, and do other things with them.
Even if I didn't need to, I'd still likely use the uuid-1345 package for it.

Here's the thing: If I import this code in my codebase directly instead of
using a library, it suddenly becomes my "responsibility" if it breaks. And
when we're talking about a FOSS side-project, I don't have that kind of time.
Third-party libraries means I get any improvements and bug-fixes for free. And
if it breaks, I get to talk with the original maintainer to figure it out, and
we're suddenly two people, with one hopefully knowledgeable enough on the
topic, working on the problem.

Importing this kind of thing into my codebase makes it an ugly, unwieldy mess
that _will_ inevitably break, and I will suddenly be alone trying to figure
out what I did wrong. This is what's good about npm and libraries: the
community that's built around them.

~~~
bluetech
Well, surely you draw the line somewhere? Would you depend on `leftpad`? How
about `is-odd`?

Your view of dependencies is very idealistic. And if I'd venture to guess -
apologies if I'm wrong - young, at least to programming. I'm saying this
because you don't tend to hear such phrases once reality hits.

Everything you said can be flipped:

> If I import this code in my codebase directly instead of using a library, it
> suddenly becomes my "responsibility" if it breaks.

I have full control & understanding of the code; I can adjust & trim so it
fits as much as possible. If it breaks, I can fix it directly.

> And when we're talking about a FOSS side-project, I don't have that kind of
> time.

Copying/hand rolling some trivial code is often faster than deciding among a
dozen libraries which do the same thing, reading their docs, their issue pages
and open PRs, adding and integrating to my project.

> Third-party libraries means I get any improvements and bug-fixes for free.

I must keep the dependencies updated. I must read changelogs and hope nothing
breaks.

> And if it breaks, I get to talk with the original maintainer to figure it
> out

I must use inefficient communication with a more-often-than-not absent
maintainer in order to resolve my problem instead of fixing it directly.

> I suddenly will be alone trying to figure out what I did wrong

I am able to be self-reliant in fixing the problem, because I wrote the code
and I understand it.

I do not have to trust the maintainer of the package or the maintainers of any
of its transitive dependencies to be competent or non-evil.

---

For some things this is worth it. IMO for generating 16 random bytes it is not
worth it.

~~~
roblabla
Well, everything I've said here is grounded in personal experience working on
FOSS projects. It's also grounded in trust of the maintainers. In 5 years
of full-time programming as a job and a hobby, I have found that to work to my
advantage. I can understand that this might not be everyone's experience. I
also understand that different projects have different needs - some might
simply be too sensitive to depend on external code if they can help it.

To answer your rhetorical questions: I would certainly depend on leftpad, as
it frees my mind for something more useful. I probably wouldn't depend on
is-odd, because I intuitively know how to make it work. That's pretty
personal. The line is basically drawn at "do I need to think to figure this
out?" Yes, leftpad is trivial. But this is death by a thousand papercuts: when
I program something, I'm solving a problem. Having to solve a million other
problems at the same time - no matter how trivial - is going to take me away
from what I'm supposed to actually be doing. It breaks my flow. I don't like
that.

> Copying/hand rolling some trivial code is often faster than deciding among a
> dozen libraries which do the same thing, reading their docs, their issue
> pages and open PRs, adding and integrating to my project.

I have found this not to be the case. In JS land, depending on a library is
beyond trivial, especially for those "simple" libraries that simply export a
function. Yes, I do need to read the docs, but there isn't a million ways to
do left-padding, nor a million ways to do UUID generation. Copy-pasting code
might require playing with requires, figuring out transitive dependencies, and
just some manual work that takes time away from actually thinking about the
code I'm working on - again, breaking my flow, forcing me to context switch.

> I must keep the dependencies updated. I must read changelogs and hope
> nothing breaks.

Semver takes care of that for you. And yes, I trust maintainers to follow
semver. Again, that generally works. I've only had a single case of a
semver-breaking release in my life, and that was in Rust land - docopt.rs
0.8.2:
[https://github.com/docopt/docopt.rs/issues/235](https://github.com/docopt/docopt.rs/issues/235).
Debugging the issue took me at most an hour, reporting it upstream took
seconds, and the fixed version was released literally 10 minutes later. In the
meantime, my install still worked because the lockfile pinned the previous
docopt version, so I could easily continue working normally.

Breakage happens. I've found that breakage also happens internally. Sometimes
I change my own code and break my own invariants, or refactor and forget to
fix an instance, which breaks the code in subtle ways. Bugs happen. I accept
that.

> I have full control & understanding of the code; I can adjust & trim so it
> fits as much as possible. If it breaks, I can fix it directly.

This is where I totally disagree. Depending on a third-party package does not
deprive me of the ability to fix code when it breaks. And in fact: it empowers
me. Again, the major advantage of dependencies, in my view, comes from the
community. And if it needs adjustment that could be beneficial to others, I
can PR it. I can fork, and npm has tools to make it easy to inject a fork into
the dependency tree, replacing all instances with my own. This lets me
experiment with potential fixes with ease, as if it was my own code.
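
As a sketch of what that looks like in practice (the fork URL and branch here
are purely hypothetical): point the dependency at your fork in package.json,
and use yarn's `resolutions` field (npm later gained a similar `overrides`
field) to force the replacement even when macaddress arrives transitively:

```json
{
  "dependencies": {
    "macaddress": "github:some-user/node-macaddress#fix-exec-injection"
  },
  "resolutions": {
    "macaddress": "github:some-user/node-macaddress#fix-exec-injection"
  }
}
```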

Sure, sometimes, the library might not do exactly what I want it to, exactly
how I want it to. Maybe what I want to do is incompatible with said libraries.
In those cases, rolling my own might make sense.

In the end, it comes down to this: I trust the community, and believe in
taking and giving back. If that's a young point of view, then so be it. Maybe
I'll learn the hard way that reality is not so forgiving, that my trust was
misplaced.

------
speps
The actual report, as that page contains nothing interesting:
[https://hackerone.com/reports/319467](https://hackerone.com/reports/319467)

------
cyphar
According to the HackerOne report[1] there was no response from the maintainer
and the last package update was 3 years ago. And yet the package is downloaded
several _million_ times a month (meaning it's definitely used in something --
I imagine looking at the reverse dependency tree would cause some concern).

This is something that bothers me a lot with most of these home-grown package
ecosystems. In distributions we remove packages from their main repositories
if old maintainers are no longer available and nobody is interested in
becoming the new maintainer (we do this in openSUSE all the time). Does the
Node (and Rust, Python, Ruby, ...) community have this sort of process for
deciding that a project is no longer adequately maintained?

One of the benefits of having macro-packages is that it's always clear whether
a package is being maintained -- if there isn't a release for a long time then
it's probably no longer maintained. But with micro-packages there are
thousands which are so trivial they require no updates (see left-pad,
ansi-{green,blue,red,yellow,cyan,...}, and so on) that a very large number of
projects depend on. Not to mention that distributions don't generally have
projects with thousands of dependencies each -- so removing a dependency and
everything that depends on it will not destroy the entire package ecosystem.

[1]:
[https://hackerone.com/reports/319467](https://hackerone.com/reports/319467)

------
royjacobs
That yet another NPM package proves to be pretty broken seems to be par for
the course, but what especially shocks me is that on Hackerone the people
working on this vulnerability can't reach the original maintainer and then
say: "Doesn't look epic to me, also last publish was 3 years ago."

It doesn't look epic, yet it's a package that has over 1.2 million weekly
downloads.

Also, fun fact, one of the dependent packages is react-input-select...WHY?

~~~
wolfgang42
react-input-select has 640 (!) dependencies, all of which were added by the
author in a commit named 'type fix'[1] which seems to have been ostensibly
intended to fix a typo, and none of which seem to be required to actually use
the library.

The only thing I can imagine is that the author somehow added their entire
node_modules tree as dependencies. Maybe it was an ill-fated attempt at
pinning?

[1]: https://github.com/rayescmata/react-input-select/commit/cd7b82578d70b9562013e00eb1ee7594ef628e9d

------
taf2
love those little bobby tables

------
cc-d
Another month, another ridiculous issue involving NPM that has no place being
in software with millions of weekly downloads.

It really is never ending amateur hour in the NPM community.

~~~
binarymax
This has nothing to do with NPM. It is a specific package published on NPM.
Also, stereotyping and insulting the NPM community is nonconstructive and out
of place. Vulnerabilities in packages happen all the time in all kinds of
package managers. The lesson here is to not blind-trust any package you come
across that might make your job easier.

------
ashelmire
This one has been out for about a month:

[https://snyk.io/vuln/npm:macaddress:20180511](https://snyk.io/vuln/npm:macaddress:20180511)

Unfortunately, it's introduced into any Rails app that uses webpacker or any
app using node-sass through a long dependency chain.

And once again demonstrates how the JS community is far too dependent on
extremely trivial, unsupported, and unaudited libraries.

