
Malicious remote code execution backdoor discovered in bootstrap-sass Ruby gem - qzio
https://snyk.io/blog/malicious-remote-code-execution-backdoor-discovered-in-the-popular-bootstrap-sass-ruby-gem/
======
deanclatworthy
I really don't know what we can do as developers to prevent this. What else
could be done? Is there any package manager system that isn't vulnerable to an
individual being compromised? Having an audit tool is only useful after you've
been compromised and the vulnerability is discovered. Some kind of
vulnerability scanner run over all pulled-in code might be one idea, but much
like the cat-and-mouse battle with AVs, I don't think it would be very
successful.

Ideas (shoot them down):

\- Somehow tie keys to location.

\- Require two-step verification for all commits for packages with over 1000
downloads (assuming their phone isn't compromised if the keys are).

~~~
raesene9
Lots of things could be done; it's whether people are willing to accept the
tradeoffs, which (in general) they haven't been up until now. Unfortunately,
modern software development has heavily embedded the idea of using large
numbers of un-reviewed open source libraries, and changing that assumption
would be very difficult.

A non-exhaustive list of things that could be done...

\- Improve repository security (e.g. stronger authentication requirements).
Problem here is most of the repositories are non-profit and this costs money.
Also there's a risk that if you increase friction for developers, they'll go
elsewhere.

\- Require library signing. This has some potential benefits (may even have
stopped this attack) but again increased friction for developers and
management overhead for repositories.

\- Curation of libraries. Actually pay some people to review the code in the
libraries. This doesn't scale easily when you consider the volume that places
like NPM have.

~~~
repolfx
Also (on some runtimes):

\- Sandbox code modules

If a library doesn't need any permissions, and it's not granted any, then
attempts to e.g. access the filesystem or network will throw exceptions. The
JVM can do this, but it's not well documented how to use it.

~~~
nih0
The java security manager is a cool concept but sadly not very ergonomic to
use.

------
intertextuality
It's worse than I suspected.

Rails 6.0, when generating a new app:

\- installs 102 ruby gems (without being asked)

\- installs yarn & 602 yarn dependencies (without being asked)

The modern programming industry is a joke. We call ourselves "engineers"
without any certification or oaths of responsibility, and basically ignore
security beyond user credentials at best.

Someone please tell me `yarn list | wc` showing 2,800 transitive packages is
incorrect.

[0]:
[https://gist.github.com/azah/c219844f95243cdfdb3b352ad3dee2f...](https://gist.github.com/azah/c219844f95243cdfdb3b352ad3dee2ff)

~~~
ChrisCinelli
We definitely trust more than we should.

And it is worse than what you just found. Did you think about all the
software and dependencies that are installed just to get to the point where
you can install Rails 6 (e.g. Linux)?

What about your coworkers? Are you sure that you can trust all of them, and
that nobody will sneak anything in?

If you are too paranoid about whom and what you can trust, you may end up
tossing all your electronic devices and going to live on a desert island.

~~~
intertextuality
Those attack vectors you mentioned are already known, and we essentially have
to live with them. When it comes to "reflecting on trusting trust", we already
know that it's basically impossible.

However, this does not mean we should be installing heaps of packages by
default when generating a new project. This is practically begging for a
security exploit, particularly the JS packages.

[0]:
[https://www.archive.ece.cmu.edu/~ganger/712.fall02/papers/p7...](https://www.archive.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf)

~~~
ChrisCinelli
I am not sure why the npm packages in yarn should be more exploitable than
Linux.

\- Linux is more ubiquitous than Node.js. If somebody exploits Linux, they get
at least one order of magnitude more machines. So there is a higher
motivation.

\- Funny code in C is harder to spot than in JavaScript. Furthermore ...

\- A binary is harder to inspect than any module written in JavaScript (even
if minified).

\- The code in a default Linux installation is at least two orders of
magnitude larger than what yarn installs.

I was very conservative in my estimates.

Considering these kinds of attacks, Linux seems more likely to already be
exploited by a few organizations, in multiple ways.

We have been trusting too much. As more code gets written and many more people
learn to program, the number of supply chain attacks will increase. And at the
same time, as security know-how becomes easier to access on the Internet, we
will get smarter exploiters capable of hiding their wrongdoing.

------
thibaut_barrere
Relevant links:

\- [https://github.com/twbs/bootstrap-
sass/issues/1195](https://github.com/twbs/bootstrap-sass/issues/1195)

\-
[https://github.com/rubygems/rubygems.org/issues/1941](https://github.com/rubygems/rubygems.org/issues/1941)

------
h1d
Do people building mission-critical (e.g. life-threatening) systems write code
from scratch all the time, without using any external dependencies? I'm sure
they don't reinvent encryption.

I'd like to know how they audit their systems, and their threshold for saying
a system is adequately secure.

------
fxfan
Rust has an effort going on (by a single unaffiliated person) to have a
trust-chain sort of thing per version: cargo-crev.

I hope others copy this. This is truly scary.

------
jlangenauer
For those who haven't yet come across it, bundler-audit is a very useful tool
to pick up any old and/or insecure gems in your application. It should be part
of your CI pipeline, or at a minimum, run locally every so often.

[https://github.com/rubysec/bundler-audit](https://github.com/rubysec/bundler-audit)
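Wiring it into CI can be as small as a Rakefile entry. This is a sketch based
on the gem's documented Rake integration, and assumes bundler-audit is already
in your bundle:

```ruby
# Rakefile -- fail the build when Gemfile.lock contains a gem with a known advisory
require "bundler/audit/task"

Bundler::Audit::Task.new # defines the `bundle:audit` Rake task

task default: "bundle:audit"
```

Note that it checks against the ruby-advisory-db, so it is only as current as
your last `bundle audit update`.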

~~~
thibaut_barrere
Note that it won't yet detect this specific vulnerability:
[https://github.com/rubysec/ruby-advisory-
db/pull/386](https://github.com/rubysec/ruby-advisory-db/pull/386)

------
raesene9
Another one in the line of inevitable supply chain attacks on software
libraries.

This isn't going to go away, it's just a question of if/when the damage caused
will get bad enough that people start to accept some of the tradeoffs that
would be required to improve matters...

~~~
jimmychangas
Can you elaborate on what you think would be a good solution to this
problem? In my opinion, having big central sources of libraries makes it
easier for security scanning services to operate. Companies hosting libraries
locally (e.g. with Sinopia, Nexus, Artifactory) have to proactively monitor
them, but most commonly will leave them frozen at a specific version and miss
all security patches.

~~~
raesene9
Really the only "solution" is to ensure that you have some way to establish
trust in all the software libraries you use.

For commercially provided libraries that could be a contract specifying
security requirements, with some specific measures of how to do that.

For open source, the only real option I can see is to curate your own package
repositories and apply a level of review that you're comfortable with.
Pinning versions and insisting on review before upgrades would definitely help.
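In the Ruby ecosystem, that kind of pinning is just exact version requirements
in the Gemfile, plus committing Gemfile.lock and reviewing the diff before any
bump (gem names and versions below are purely illustrative):

```ruby
# Gemfile -- exact pins, so `bundle update` can't silently pull a new release
source "https://rubygems.org"

gem "rails",          "= 6.0.0"
gem "bootstrap-sass", "= 3.4.1"
```

Loose constraints like `~> 3.2` are exactly what let a freshly published
malicious version slip into the next `bundle install`.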

The problem with automated scanning is that it can't really find backdoors
or generally insecure code; it can only find known vulnerabilities. It
_could_ use static analysis to find insecure code, but my experience of SAST
tooling is that it takes a lot of manual effort to tune; it's not a purely
automated scan option.

That's not to say that automated scanning provides nothing, but that it has
limitations.
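At its core, that kind of known-vulnerability scanning is just version
matching against an advisory database. A toy Ruby sketch, with made-up
advisory data (not bundler-audit's real format):

```ruby
# Toy sketch of known-vulnerability scanning: match pinned versions against
# an advisory list. The advisory data and versions here are illustrative only.
require "rubygems" # Gem::Version / Gem::Requirement

ADVISORIES = [
  { gem: "bootstrap-sass",
    bad: Gem::Requirement.new("= 3.2.0.3"),
    note: "backdoored release, pulled from RubyGems" },
].freeze

# lockfile_versions: { "gem name" => "pinned version", ... }
def audit(lockfile_versions)
  lockfile_versions.flat_map do |name, version|
    ADVISORIES
      .select { |a| a[:gem] == name && a[:bad].satisfied_by?(Gem::Version.new(version)) }
      .map { |a| "#{name} #{version}: #{a[:note]}" }
  end
end

puts audit("bootstrap-sass" => "3.2.0.3", "rails" => "6.0.0")
```

The catch is the same one raesene9 describes: this only flags what is already
in the database, so the backdoor window before disclosure is invisible to it.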

~~~
pjc50
> curate your own package repositories [and review them]

This is a non-starter. This is even worse than "just run your own mail
server".

~~~
raesene9
Sure, but what's the alternative? Unlike mail servers, where there are tonnes
of good commercial solutions you can use, I'm not aware of many good options
for curated software library repos.

There are tools you can use to add automated scanning to your repos and to pin
versions of packages, but that doesn't really feel like curation to me...

Companies like SourceClear seemed to be going down this line, but that doesn't
seem to be the drive of their offerings any more.

~~~
zzzcpan
In the end it all comes down to reviewing the code you have to trust. It's
just that that's too much for most code, so there has to be something that
makes potentially insecure code explicit and dealt with at the point of use,
while the rest of the code stays usable without having to trust it.

~~~
ChrisCinelli
Reviewing code mitigates a lot of the risk, but it does not solve everything
(e.g. vulnerabilities hidden in plain sight).

It also implies that you are compiling everything from source code, or that
you trust the distribution channel of the binaries.

And you have to do it every time there is an update you want to use, and you
need a system that promptly alerts you of vulnerabilities in the libraries
you use.

Unless you have a handful of modules that are already self-contained, it is
not practical for most teams.

------
rajangdavis
This is very timely: I was trying to start up an old Rails app today and when
I ran the bundle install command, I got a warning that there were no sources
for this gem... now I'm glad that this failed to install!

------
Foxboron
I hope this will help RubyGems adopt reproducible builds. Currently,
SOURCE_DATE_EPOCH is not yet properly supported there.

[https://github.com/rubygems/rubygems/issues/2290](https://github.com/rubygems/rubygems/issues/2290)

[https://reproducible-builds.org/docs/source-date-
epoch/](https://reproducible-builds.org/docs/source-date-epoch/)
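For context, the convention the issue is about is simple: a build tool that
honors SOURCE_DATE_EPOCH uses that value instead of the wall clock for any
timestamps it embeds, so rebuilding the same commit yields byte-identical
artifacts. A minimal Ruby sketch of the behavior (not rubygems' actual code):

```ruby
# Reproducible-builds convention: when SOURCE_DATE_EPOCH is set (seconds since
# the Unix epoch), use it for embedded timestamps instead of the wall clock,
# so two builds of the same source produce identical bytes.
def build_timestamp(env = ENV)
  if (epoch = env["SOURCE_DATE_EPOCH"])
    Time.at(Integer(epoch)).utc
  else
    Time.now.utc # non-reproducible fallback
  end
end

puts build_timestamp("SOURCE_DATE_EPOCH" => "1559347200") # fixed, repeatable
```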

------
inglor
Great writeup by Liran, he's also a member of the Node.js security team and I
recommend checking his talks about package ecosystem security.

~~~
thibaut_barrere
Ultimately, I hope we end up (at some point) with some form of cross-language,
global network for signalling versions & vulnerabilities, with e.g. alerts
when you install something or at runtime, and push notifications for all
"subscribers" (outside of a paid product).

~~~
tannhaeuser
OWASP [1] maintains npm, maven (and nuget, I believe) plugins for detecting
use of packages with known vulnerabilities, and a vulnerability db, of course.

[1]: [https://www.owasp.org/](https://www.owasp.org/)

------
saurabhnanda
> We assume that the attacker has obtained the credentials to publish the
> malicious RubyGems package from one of the two maintainers, but this has not
> been officially confirmed.

Didn't this _also_ require a commit to the relevant Git repo? Or is it
possible to upload a tarball directly to Rubygems without it being backed by a
git repo?

~~~
joekrill
5th Paragraph in:

> The backdoor was wisely hidden in the 3.2.0.3 version that was only
> published to RubyGems and no source of the malicious version existed on the
> GitHub repository and allowed remote attackers to dynamically execute code
> on servers hosting the vulnerable versions.

------
dpc_pw
Relevant: I'm working on a system for auditing open source libraries:
[https://github.com/dpc/crev](https://github.com/dpc/crev). Currently focused
on Rust/crates.io.

------
RoryH
Anyone maintaining a library with a large audience of consumers should be
mandated to use 2FA/MFA. I can only assume from the information here that they
only had a set of credentials for authing :-(

~~~
ChrisCinelli
Having every package management system require 2FA to publish is a great
idea. It may be a problem, though, when the build happens on a CI system.

------
F117-DK
This is scary. Gems don't have any audits?

~~~
pmontra
Not this kind of audit. Is there any major language for which all third-party
modules are audited for security before they are released? I would be
surprised.

