Ask HN: How do you security-audit external software using NPM packages?
Hi,

At my current client I've been doing more and more security-related tasks, such as audits of external software. Currently the type of software I audit is WordPress plugins. I have more than 15 years of experience with WordPress, and in the past I could fairly easily assess a WordPress plugin's potential security impact. Nowadays not so much, due to the seemingly increased usage of npm packages included with these plugins.

Often these plugins include neither a package.json nor a package-lock.json, nor are the JavaScript files readable (bundled & minified). This makes using npm audit near impossible. Good for production, less so for audits.
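For illustration, this is roughly what you run into without a lockfile (the plugin path is a placeholder; the exact error wording varies by npm version):

    $ cd wp-content/plugins/some-premium-plugin
    $ npm audit
    npm ERR! code ENOLOCK
    npm ERR! audit This command requires an existing lockfile.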

Sometimes I can grab development files such as package.json, package-lock.json from a public repo, but in the case of so-called 'premium' plugins a public repo is usually absent.

So my question is: How do you (security) audit external software depending on npm packages?




As much as the JS ecosystem terrifies me, Node isn't really the problem here. Receiving plugins that contain minified blobs of JS is, practically, quite equivalent to receiving plugins that contain binary blobs.

If you accept receiving and using plugins that contain unauditable blobs of software, whether it's minified JS or a binary, a good-quality audit is going to be virtually impossible.

In many other ecosystems this wouldn't be normal. If a Rust crate ships binary blobs with no easy access to source code, I wouldn't ever consider depending on it.

If you can't prevent these blobs from infecting your system, you have to deal with the risk another way – locked-down containers on the server side, strict CSPs on the client side, and monitoring.


One of the big tradeoffs with WordPress is that it is a user-first, developer-second kind of framework. I say framework because that is what it has become, for better or worse.

Despite that, developers are building serious/complex websites with tons of custom functionality with WP, by extending it via the hooks that the theme and plugin system provide.

One of many problems with the above is that your whole default project structure is optimized for user workflows rather than developer workflows. Dependency management becomes even more complicated and fragile than it already is. So in terms of JS inclusions you get pre-compiled stuff instead of the source, because WP doesn't have a dependency and build system.

The best strategy to solve that is to _avoid_ dependencies, _especially_ plugin-level dependencies, like hell, and to keep only a minimal vetted list of them. Most functionality plugins provide is, again, optimized for users who will only ever touch the GUI, and is typically completely unnecessary, too complex, and often painful to interact with at the code level.

Plus, as a kind of important aside: there is another issue with providing plugins with pre-compiled JS and other stuff without the source. I'm pretty sure it violates the GPL license[0] to do so, or it is at least a grey area. I'd be happy to hear from people more knowledgeable about this issue.

TL;DR: When developing with WP, avoid dependencies and especially plugins. There is already enough accidental complexity as is.

[0] https://developer.wordpress.org/themes/getting-started/wordp...


We've been using trivy [1] to audit the container builds we've been producing for a relatively security-focused project. As well as scanning for OS-level package vulnerabilities, it also scans for reported vulnerabilities in npm packages. Works well for us.
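For anyone wanting to try it, minimal invocations look something like this (the image name is a placeholder; note that trivy's filesystem mode discovers npm packages via lockfiles, so it won't help much with bundles that ship without one):

    # scan a built container image for OS-level and npm vulnerabilities
    trivy image my-wordpress-build:latest

    # or scan a directory; npm findings depend on package-lock.json being present
    trivy fs ./wp-content/plugins/some-plugin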

But the other complementary approach is to lock down other things - so for example, if you're running in a container, make sure that container can only talk to the proxy in front of it. That way, even if there was some kind of malicious code running in one of the modules, there's no way for any data to get in or out (unless it finds a way of injecting into any web input/output, but then you need to be scanning for that too)

[1] https://github.com/aquasecurity/trivy


It is also useful to keep track of entries in a vulnerability database for some of the more "enterprisey" dependencies: https://nvd.nist.gov/

Running a pen test against web apps can also be educational and amusing. ZAP is highly customizable, so you can extend it to cover particular areas of concern. https://www.zaproxy.org/getting-started/
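As a sketch, a passive baseline scan can be run via the official Docker image (image name and script as of recent releases; check the getting-started docs for your version):

    # spider the target and report passive findings, without active attacks
    docker run -t ghcr.io/zaproxy/zaproxy:stable zap-baseline.py -t https://staging.example.com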


> make sure that container can only talk to the proxy in front of it

Is there a tool like trivy that can help with that?


If you're deploying containers in Kubernetes, that's what network policy will do for you :)
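As a sketch, a default-deny egress policy looks like this (assuming a CNI plugin that actually enforces NetworkPolicy; you'd layer an allow rule for the proxy on top):

    kubectl apply -f - <<'EOF'
    apiVersion: networking.k8s.io/v1
    kind: NetworkPolicy
    metadata:
      name: default-deny-egress
    spec:
      podSelector: {}    # applies to every pod in the namespace
      policyTypes:
        - Egress         # no egress rules listed, so all egress is denied
    EOF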


network namespaces and any reverse proxy :)


Independent of the language, I only use external code if it is small enough that I can manually review it. Often I refactor it into a single file during this process.

This of course excludes the majority of packages out there. But apart from security, it has another benefit: these dependencies very rarely break and need updates. So compared to projects with a more complex stack, projects with a lean stack are easier to maintain.

It would be great if there was a "single small file packages" movement so that more lean open source software will be created.


Does that mean you roll your own database because you can't review the whole codebase for Postgres or whatever? How about your own cryptography suite, and web server, and compiler?

I think it's reasonable to err on the side of rolling your own for simple stuff instead of `npm install is-even` or whatever. But using other people's software is a net positive for both productivity and security for sufficiently complex applications. And the range from "simple" to "complex" is a continuum and it's not trivial to decide where on that continuum to draw the line.


Refactoring into a single file sounds like a bit of a pain, since you have to do it every time the external code gets updated. Also how do you deal with dependencies that come with their own dependencies? Do you avoid them?


This may be slightly tangential but I recently discovered ncc[1] from vercel which can take a single node project and compile it and all dependencies to a single file.

As an added benefit it also collapses all contained dependencies' license files into a single licenses.txt file too!

[1] https://github.com/vercel/ncc


    every time the external code gets updated
I do not keep my fork in sync afterwards.

    dependencies that come with
    their own dependencies
Depends on the dependencies. If you give me an example, I can tell you what I would do.


This is a fantastic trick! By copying the source code (which is legal) but not declaring the dependencies in a package.json or similar, nobody will ever get on your case for CVEs in dependencies, and you can save so much time and churn by not updating them.


Not necessarily. It depends on why the external code was updated and whether you need the new functionality.

I'm assuming it isn't a security flaw, because ideally you would have fixed that already during your refactor.


> It would be great if there was a "single small file packages" movement so that more lean open source software will be created.

There are certainly npm authors doing this already, feross[1] is a good example. That means you get packages like is-buffer[2].

[1]: https://www.npmjs.com/~feross

[2]: https://github.com/feross/is-buffer/blob/master/index.js


What I would do if I wanted to use "is-buffer" is I would copy this index.js to a new file called "isBuffer.js" and it would look like this:

    export function isBuffer (obj) {
        return obj != null && obj.constructor != null &&
            typeof obj.constructor.isBuffer === 'function' &&
            obj.constructor.isBuffer(obj)
    }
Imho, there is no need to pull 10 files into my project to use one function.


You would, of course, preserve the copyright and license notices too. Otherwise that would be a violation of the license.


Yes, in this case I would put something like this on top of the file:

    # Fork by TekMol of https://github.com/feross/is-buffer
    # Which is MIT licensed by Feross Aboukhadijeh
I am actually never completely sure how to properly do this. Would the next forker write the following then?

    # Fork by Joe of https://github.com/tekmol/isBuffer
    # Which is a fork by TekMol of https://github.com/feross/is-buffer
    # Which is MIT licensed by Feross Aboukhadijeh


Looking at that function, I'm not convinced that it's copyrightable in the first place. Once you know what it's supposed to do, how else would you express it?

Of course it's only appropriate to credit the author anyway, and one way to do that nicely is by following the license requirements. I just think "violation of the license" is a bit of an unnecessarily strong statement, given the triviality and the likely non-copyrightability of the code.

Your lawyer's opinion may vary.


I think a developer who does that is doing a disservice to the company they are working for. If I start looking at a project and see a bunch of outdated packages that have been copied and pasted there and tweaked who knows how, I quit right away. Not to mention that these packages most likely have various unpatched security issues.

And if the package is not "small enough", I guess it means it needs to be re-implemented from scratch? If a company is happy to pay you for that (and is aware of it), why not, it's fun actually, but not very productive.


>It would be great if there was a "single small file packages" movement so that more lean open source software will be created.

Uh, no. What would be even better is if TC39 did something beyond window dressing and JS got a sane standard API, so these idiotic requirements for API fill-in were no longer necessary.

These packages exist solely because JS has a crappy standard API and a vacuum was filled. This increases the surface for supply chain attacks, a la ua-parser-js in October.

Other languages have their own issues. But they also have saner stdlibs so the attack vectors are different.


User agent string parsing (already of dubious merit on its own) by no means belongs in the ECMAScript standard. It would fit right in with the work that the Web platform standards bodies are doing, though. If anything, ECMA-262 itself has already gotten too complicated and needs to be pared down to a smaller, look-we-haven't-completely-lost-our-minds profile. Compare ES5 to anything that came after ES2015.

Even ecosystems that do have developer kits with massive API surface area like you want (such as the ecosystem associated with the other TC39 initiative) had the good sense to define collections of common classes separately, speccing out their implementation as being optional. Then again, there's nothing stopping anyone from doing exactly that and just maintaining it outside the scope of the technical committee, a la Boost or Qt in the world of C++. The fact that people try doing this and fail to retain long-term interest from their short attention span colleagues gives you all the evidence you need for why the irreversible step of transmuting that work into a part of JS's core is a bad idea.


> uh, no. what would be even better if TC39 did something beyond window dressing and JS gets a sane standard API so these idiotic requirements for API fill in are no longer required.

That would remove the most used libraries. You could make babel, express, lodash/underscore, moment and stuff like that core, sure. But then you still have people using lots of libraries, especially in the frontend world for components.

> Other languages have their own issues. But they also have saner stdlibs so the attack vectors are different.

There's also more of a culture of just writing things yourself.


While you'll probably get a good overview of the question you asked from the other comments, don't overlook that non-audit measures might give you even better bang for the buck — and that going 80% on a mix of lots of different measures will usually give you better overall effectiveness than going 99.8% on just auditing.

For example, putting that external software in a hardened container with network policies, a non-root user, capability drops, a read-only filesystem, etc. will go a long way towards securing it, even if there's a cleverly disguised backdoor you didn't manage to find.

Dumping your database is only half the fun if the app isn't able to send all the data to Alonistan...
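A sketch of what that hardening can look like with plain Docker (the flags are real Docker options; the image and network names are placeholders):

    # read-only rootfs, no capabilities, non-root user, no privilege escalation,
    # attached to an internal network that can only reach the reverse proxy
    docker run --read-only --cap-drop=ALL --user 1000:1000 \
      --security-opt no-new-privileges \
      --network internal-only \
      example/wordpress-plugin-host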


Good advice, but these are npm packages in WordPress plugins. It'd be unusual to have Node running on a server that's running WordPress, so it's very likely that these npm packages are actually being delivered to the users of the website as part of the pages they're viewing (client-side components like custom dropdowns, calendars, etc. are a typical use case). No amount of server hardening is going to protect the client if that's the case.

The only option is to flag them as either insecure or unchecked.


Yes, npm is typically used to build minified JavaScript code for WordPress plugin distribution. WordPress developers themselves use it for the Gutenberg editor.

I personally dislike this new age and miss the days of hacking with just a text editor and browser, but I understand the benefits.


In practice, I don't think anyone bothers. I asked a Node developer how they ensure none of their 3000+ NPM packages would send our customers' confidential information to Somalia, and he looked at me like I'm from another planet. To me that's reason enough to not touch something like this with a hundred foot pole, and keep well away from the blast radius when this inevitably backfires.


How is this different from any other programming language that has dependencies?


A quantitative difference: in other programming languages you usually use a very limited number of dependencies, and the organization can follow every individual dependency used in production, knowing which versions you are running and tracking their releases/notifications/changelogs as part of the software inventory. For example, a random quite large project that I recall had 10 dependencies from three vendors, two of those being the Apache Foundation and Oracle, and you can track and maintain those dependencies in exactly the same manner as you track the OS and database system (and their versions/updates/patches) that the product is using.

This becomes difficult for node.js ecosystem simply due to the large quantity of those dependencies.

In many other cases you are only using dependencies that are considered verified and monitored/patched by others, e.g. those included in an LTS release of your OS. You can then make a statement that you will only use dependencies that are actively maintained, including security fixes backported to the major release you are using, and you check for that by verifying (and periodically re-verifying) the process and maintainers of each and every third-party package you're using. Again, not practical when something is importing things like left-pad.


Sure, but this problem is not unique to npm.

Python has pip, Rust has crates, Ruby has gems..

The fact that other languages have fewer dependencies is (imo) probably because there aren't that many dependencies available in the first place?

My argument, by the way, is that many dependencies are a feature, not a bug. And the developer probably looked at you like that because he thinks you have no trust in his ability to pick dependencies.


Folks who publish stuff to PyPI had the good sense not to require a hundred different dependencies for each module they publish there. So it is tractable to just go through your entire transitive closure and ensure you're not including something you don't want.

And also, just because something is "hard" is not an excuse to ignore it. npm devs should be more aware of this issue and make design decisions that improve the situation, such as not including a dep just to use a single function (that dep, by the way, might pull in 10 other deps, each of which could pull in some more). DRY is not a religious maxim; it's OK to repeat yourself a little to reduce "bushiness" and improve the predictability of the dependency graph.


The primary difference with other languages is not the availability of features - in general, pretty much every major language has almost all what you might need - but the differences in bundling. Many other languages have larger standard libraries, so things that might be "yet another dependency" in npm would be part of the standard distribution in other languages; and there are different approaches to bundling, as the same thing that might be dozens of dependencies in npm would be distributed and maintained as a single large package elsewhere.


Nexus Lifecycle / Nexus Auditor tends to be useful for this: in the absence of a package.json it crawls the raw JS files and finds their source. It can help figure out things like embedded jQueries etc. That being said, it has the same limitation as other tools: minification and bundling obfuscate origins and make it harder to assign identity to the source package.

The only way I can think of to get around this is to have a hard requirement for a source registry, or to ask the premium plugin producers to provide an SBOM like CycloneDX or SPDX and evaluate that in lieu of the source.


You're primarily talking about proactive auditing here, but if something does sneak in you've got problems. In the best spirit of layered security you should also build up a strong Content Security Policy and include it with your pages, to make sure that there's a whitelist of the servers the page can talk to, and that technologies you know you're not using are locked down.
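As a starting point, such a policy might look like this (a sketch; a real WordPress site usually needs a longer source list, and plugins tend to drag in 'unsafe-inline' fast):

    Content-Security-Policy: default-src 'self'; script-src 'self'; connect-src 'self'; object-src 'none'; base-uri 'self'; form-action 'self'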


I agree. Pro-active audits will only go so far; there is definitely a need for other measures (which are implemented as well). A Content-Security-Policy is, as far as I know, still really hard to implement well (as in truly protecting assets instead of being a policy tick-off) on WordPress with external plugins and themes. Sadly, a CSP will also not protect against attacks that run post npm install in your development environment, which is another risk of using npm packages.


I believe one thing that can help security a little is to use exact version numbers in your dependency list. This applies to all package managers.

Because then, the moment someone hits update on the package manager, nothing will get updated and you won't receive potentially dangerous updates you did not review first.
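With npm specifically, that looks something like this (axios@0.21.4 is just an example version):

    # record an exact version instead of a ^range in package.json
    npm install --save-exact axios@0.21.4

    # or make exact versions the default for the project
    npm config set save-exact true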

Edit: sorry this is not relevant to the question...


You want to have a file specifying your dependencies, and a file specifying your currently locked set - e.g. Cargo.toml and Cargo.lock, or Gemfile and Gemfile.lock, or pyproject.toml and poetry.lock

You'll then want tooling to periodically update your locked dependencies, so that you pick up fixes to security vulnerabilities. That wants to go through your CI.
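The npm equivalent of that split is package.json plus package-lock.json; a sketch of the CI side:

    # install exactly what the lockfile specifies (fails if it is out of sync)
    npm ci

    # report known vulnerabilities in the locked dependency set
    npm audit

    # periodically refresh the lock within the declared semver ranges
    npm update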


This is a double edged sword.

Because on one hand you do pick up hotfix patches, but on the other hand you are possibly a bit more exposed to supply chain attacks.

Any ideas on how to balance that out? Or should we just not consider supply chain attacks to be a real threat?


Mirror the upstream, and as part of mirroring, do an automated security analysis of your dependencies.

Sandbox your dependencies.

Run automated security vulnerability testing on your program, looking for rogue behaviour.

Require code signatures on dependencies.

Identify security-critical components and audit them.


Full version numbers, pin your dependencies, commit dependencies to a local repo. Update them only when necessary (security patch, feature you need).


Can you elaborate? Does your local repo then re-export the dependencies to be consumed by your application, or how would you do it properly?


> sorry this is not relevant to the question...

It’s a very good tip though. I must start doing this.


If any software pulls in more than a few independent npm packages, I call it a huge risk and sandbox it as if it's a ticking time bomb. After some deliberation I've come to the conclusion that this is a reasonable approach with all software. For me it's a nice approach to deny every capability unless it is critical for the functioning (that you want) of the software. If that's "full network access and subprocess spawn capability", then you should probably not be using it anyway.
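On Linux, one way to sketch that deny-by-default stance is systemd's sandboxing properties (the property names are real systemd options; app.js is a placeholder):

    # no network, read-only view of the OS, no /home, no privilege escalation
    systemd-run --pipe \
      --property=PrivateNetwork=yes \
      --property=ProtectSystem=strict \
      --property=ProtectHome=yes \
      --property=NoNewPrivileges=yes \
      node app.js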


Yeah. I was thinking about some of this stuff a couple of years ago and came up with a proof of concept of what I called 'package sandboxing'. Basically the idea is that if you use a dependency that is only supposed to left-pad strings, why would you let it have access to privileged functions like spawning processes, or using the filesystem or network? So I wrote https://github.com/ashward/byrnesjs which allow-lists privileged functions to only trusted code. If untrusted code is in the stack then those function calls will be blocked. The project is pretty out of date, and really only a proof of concept, but if anyone's interested in helping out I'd be happy to bring it back to life!


What's your sandboxing strategy/stack, and on what OS?


Sonatype Lifecycle is designed to analyze a built package and figure out what's inside it, specifically when there aren't manifest files to tell you what's -supposed- to be there. It can obviously do a lot more, but the analysis is designed to solve the exact problem you're describing.

https://blog.sonatype.com/mapping-the-javascript-genome-for-...


Yeah, currently in the process of evaluating Lifecycle, Firewall and Repository. Impression so far is great, and coming from an org where everything is blocked by default having these tools in place is night/day for us...


Will have a look at this. Thanks for sharing.


I am a maintainer of a couple of OSS projects: https://github.com/ossf/scorecard https://github.com/ossf/package-analysis

These projects help with repository best practices and do some level of npm package analysis based on rules.


These projects are under https://github.com/ossf which is part of Linux foundation.


I'm in the process of moving to Deno (away from Node+npm) due to its feature set reducing the need for, and the management of, external code, in part by designing away the package manager (dependency 1) and instead using web platform features (i.e. removing many other dependencies before my own project's). The docs include a section specific to external code: https://deno.land/manual/linking_to_external_code and the solutions work well overall. Bundling has some bugs, and there are transformations and complexities in the mix of rollup+npm that I'm still working through; specifics related to caching, subresource integrity and bundling external code are at https://stackoverflow.com/a/69833335/965666 As endymi0n noted, a mix of strategies is the best approach.
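For example, the Deno 1.x CLI can pin remote imports by hash (a sketch; deps.ts is the conventional re-export file):

    # cache remote dependencies and record their hashes in a lockfile
    deno cache --lock=lock.json --lock-write deps.ts

    # later (e.g. in CI), re-fetch and verify everything against the lockfile
    deno cache --reload --lock=lock.json deps.ts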


I've been avoiding the Node.js/NPM ecosystem for this reason among others. Unfortunately more and more packages are including Node for simple scripting functionality.

At this point I can no longer hold my breath and hope that the Node.js trend goes away.


If you don't use node, what do you use? You will have the exact same problems if you want to use a library.


For js libraries, I generally prefer something that offers a prebuilt package. If I need to use NPM just to include a Javascript file on a webpage, I generally look elsewhere. Same with CSS themes.

For command line tools and packages that require Node, again I generally look elsewhere. Dealing with a rest API on a command line shouldn't require so much bloat.

If a tool requires Node as part of a build script, I avoid it like the plague.

In other cases, I use a VM to build the JS package, but I honestly dread this workflow. Whenever a README/INSTALL contains, "Just npm our package..." or "first install node, then npm our helper utilities..." I start looking for alternatives.

Maybe I'm becoming a curmudgeon and maybe my dislike of Node is irrational, but this is how I'm coping.


What would you rather download assuming you don't know if you can trust the author: The source files of a program or a distributed binary?


Is there really anything better than `npm audit`, even with all its faults? Any relatively popular and well tested library will pull in dozens of dependencies.

  $ du -hs node_modules
  289M    node_modules

Yeah no...


> Yeah no...

That is only a valid stance to take while you're developing most of the "regular" software out there, but once you start dealing with finance data in an industry that's highly concerned with compliance and security you might get more demands forced upon you in regards to what you can or cannot do.

That's not to say that there exists a better auditing mechanism, short of allotting a large amount of your time to building all of your dependencies from source and checking all of their changes manually, which almost no one actually does in practice - it is indeed easier to lie about compliance and just shrug your shoulders when you get caught.

I do not, however, condone any of it. I believe that the industry is largely headed in the wrong direction: there are entire Linux distros that weigh in at less than 289 MB, and not a single codebase out there needs that much code to actually run. I briefly touched upon this in my blog article, "Never update anything" (the title is not to be taken at face value), which explored the nature of updates in our industry: https://blog.kronis.dev/articles/never-update-anything

Of course, sadly that doesn't actually help a person who is stuck with such a codebase and needs a solution now. I don't think there is one. And in regard to starting new projects, large numbers of dependencies can definitely creep up on you! It would be interesting to have a CLI tool that warns you about the number or size of dependencies in your project, to get something like: "CI build aborted, the project exceeds your configured number of dependencies: 30 libraries (10 MB total size). Please reconsider what you're importing in your own project. Here's a dependency graph: ..."
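A crude version of that guard can already be sketched with what npm ships (the threshold here is arbitrary):

    # count all transitive dependencies and fail the build past a limit
    limit=30
    count=$(npm ls --all --parseable 2>/dev/null | tail -n +2 | wc -l)
    if [ "$count" -gt "$limit" ]; then
        echo "CI build aborted: $count dependencies exceed the configured limit of $limit"
        exit 1
    fi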


> That is only a valid stance to take while you're developing most of the "regular" software out there, but once you start dealing with finance data in an industry that's highly concerned with compliance and security you might get more demands forced upon you in regards to what you can or cannot do.

I'd think financial institutions would avoid such a scenario entirely? The work also can't easily be split up: just because two libraries are each correct doesn't mean the two together are correct.


> I'd think financial institutions would avoid such a scenario entirely?

Then you'd only be relying on either the standard library or packages that you (the company) have written yourself, which has a completely different set of challenges: everything from it being impossible to find people who know your internal libraries, to documentation and tests becoming a challenge, etc.


I think I just read half of your blog. Good stuff in there, keep it up.


npm audit reports known vulnerabilities, but I think it doesn't help against supply chain attacks, or does it?


In terms of preventative measures, harden the underlying infrastructure. For example: prevent outbound connection initiations. If you need it, profile the connections and lock them down.
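For example, with iptables on the host (a sketch; allow rules for your proxy, and DNS if needed, would come before the drop):

    # permit replies on existing connections, then block new outbound ones
    iptables -A OUTPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
    iptables -A OUTPUT -m conntrack --ctstate NEW -j DROP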

From a detection standpoint, the free options are npm audit and GitHub's Dependabot, which are OK. A commercial option (e.g. Snyk, WhiteSource, BlackDuck) is typically more recommended, to manage exceptions and get more accurate results (e.g. whether the vulnerable code is actually used by your code).


Third-party npm libs are a massive security breach waiting to happen. The recent trend towards compiled JavaScript has made the problem even worse, since the code can no longer be manually checked for security issues. At the moment the main advice seems to be to keep your `dependencies` as up to date as possible (which doesn't remove vulnerabilities), or to lock down your runtime environment (not an option for code that is run by others).


Snyk and Dependency Check are two tools I've used when performing security checks.


I second Snyk - using at work at the moment. Seems ok so far.


It's not free, but we're using WhiteSource, which provides alerts for libraries being used in the codebase. It can scan package.json, but it can also scan individual files. It matches the hashes of those files against those from open source projects, so it is usually able to identify which library a file came from, or at least where it was first seen. That way the package.json isn't always needed.


Does it recognize hashes of proprietary (closed source, minified) files too?


Nope, it does not. If you remove the comment at the beginning of an unminified JS file, it will no longer recognize it as outdated. You should treat WhiteSource as something that can potentially help you find problems, but it will by no means grant you security on its own. It is an enterprise tool to help people check boxes.


We leverage it mainly to confirm license compliance but the package vuln notifications are nice


I would love to know what you specifically look for in Wordpress plugins, especially on the PHP side. I've written quite a few but would appreciate some tips on security.

If you prefer a private discussion, my gmail username is the same as my HN username. Thank you!


This should be related, but I haven't tried it (my brother builds it, so I am not affiliated) https://requiresecurity.com/


A WordPress plugin may contain hundreds of interdependent npm packages all neatly bundled and minified. Without access to a package.json or package-lock.json it is quite hard to find out which individual packages have been used. Quite often there is also no public repo available of the development files.

To give an example of my process thus far:

Someone in my team wants to see if we can use plugin X. I download the plugin to have a look at the code. Luckily this plugin includes a non-minified version of the JS file, so I can derive the npm packages used from this file. Using Snyk I have a look at the first package mentioned. It's axios. The included version is vulnerable (high & medium severity) and has been for almost a year. (Note: the latest version of the plugin is 3 months old and does not exclude this vulnerable version in its package.json, which I found in a GitHub repo later on.)
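When there is no readable file, about the only handle left is string-hunting in the bundle itself; a rough sketch (the file name is a placeholder, and results are hit-and-miss since minification strips most identifiers):

    # license banners and version-like strings often survive minification
    grep -oE '@license[^*]*' plugin.bundle.js | head
    grep -oE '(axios|lodash|moment|jquery)[ /@-]?v?[0-9]+\.[0-9]+\.[0-9]+' plugin.bundle.js | sort -u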

Since I have no package.json or package-lock.json (all I have is the distributed build), I can't easily update the npm package. I have no clue how this package relates to the other packages and how their versions might depend on each other. Even if I updated the package, all other users of this plugin would still be vulnerable. I contacted the plugin author; he told me he would update the plugin as soon as possible. The plugin is (as of today) still not updated and no new version has been released. In the meantime, two new versions of the axios package have been released.

Every user of plugin X is still vulnerable to the issues mentioned on Snyk, but is this a real problem in this specific WordPress plugin context? I'm not sure how to interpret the high & medium severity in the context of this plugin. How exploitable are these issues, and what is the impact of the exploits in the context of this plugin? Do I need to be a logged-in user? Is this something which can be triggered by any visitor? What am I able to do when I can exploit these vulnerabilities? I can only find answers to these questions if I'm willing to invest a lot more time, which more or less defeats the purpose of using a 'ready-made' WordPress plugin. And this is just one of multiple npm packages used in this plugin. Packages which have their own dependencies as well...

At this moment I’m wondering if any WordPress plugin using npm packages can be trusted at all.

PS: The way the npm ecosystem is structured is, in my view at least, problematic. Often packages are not like the libraries I'd expect, but look more like a single function or method call. I'd prefer to write these short pieces of code myself instead of depending on external code, which also brings extra risks. The very rapid release schedules make it even harder to trust external software (like a WordPress plugin) using npm packages, as it seems they cannot keep up.

I'm sorry if this seems like an npm rant, but I'm seriously looking for methods to deal with these issues, so we can use external software (like WordPress plugins) built with npm packages.


My first question here would be: what is the attack vector you are worried about? If your WordPress instance is taken over, what is the problem? That the intruder gains access to data they should not have? Or that they will use your machine in some way that would harm you?


There are multiple attack vectors I can think of, although most can be mitigated using other security measures. I don't want to rely on audits only, of course. To give you some examples: using the WordPress environment as a stepping stone to gain more access, running client-side software without our permission (stealing data from visitors, abusing our resources e.g. crypto miners), defacement/fake news, etc.


My reply to this would be that this is very broad.

In my experience, if you really want to make your infrastructure more secure, you need to explicitly define what it is you want to avoid.

Taking your first point: You say "using the WordPress environment as a stepping stone to gain more access". What type of stepping stone would this be? How can malicious JS on the WP instance escalate its privileges?


npm with WordPress usually means front-end code, so one possible issue is attackers sneaking in stuff like credit card number stealing scripts. So it is more about protecting end users and less about protecting the server/system.


The security concept behind credit cards is insane. Who thinks that a number which you hand over to everyone you buy from is a secret?

Shouldn't this be fixed at the root by handling payments via PayPal or Crypto?


It would have similar security risks if your frontend were compromised; for example, it could make users send their cryptocurrency payments to an attacker-controlled address.


Can you reverse lookup the packages using the abstract syntax tree? Most of it's probably from the top 10,000 packages.


I've built https://bundlescanner.com which is similar to what you're describing. It has indexed 35,000 of the most popular npm packages. However, it is not accurate enough to reliably identify which specific version of a package is present in a JS bundle.

I'd be curious to hear if anyone can think of possible applications of it in security auditing.


I use `npm audit` (and maven-dependency-check) and I trust that the vulnerabilities discovered by others are enough.

I assume that if I were a sensitive institution, I would pay people to inspect those dependencies and discover vulnerabilities.

The medium term would be to publish a bug bounty, so researchers are incentivized to find vulnerabilities.


Opening a bug bounty program where security is immature may cost too much money. And if issues aren't handled at a good enough pace, researchers may stop submitting reports.



