The documentation currently just says 'avoid typos'.
siddharthkp: please give me a way to contact you. see my contact info on my profile.
So their security model is basically the same as with C: "do it perfectly the first time."
https://news.ycombinator.com/user?id=siddharthkp doesn't return an email address, and I don't see one on the GitHub profile page either.
With this, you have to be careful the WHOLE time you are typing. You normally don't worry about hitting return in your editor causing RCE.
Opened an issue here: https://github.com/siddharthkp/auto-install/issues/2
(defun install-from-npm-region (start end)
  "Run `npm install' on the region's text, sending output to *NPM Output*."
  (interactive "r")
  (start-process-shell-command
   "npm-install" (get-buffer-create "*NPM Output*")
   (concat "npm install "
           (shell-quote-argument (filter-buffer-substring start end)))))
Also note that that code hasn't been tested. I wouldn't copy this into my .emacs just yet, if I were you. Run it by somebody who actually knows elisp, first.
One other possibility is to have a delay, so that it waits an amount of time before installing, to give a chance to catch and fix the typo.
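That delay is essentially a debounce. A hypothetical sketch in plain JavaScript (the function names and the 3-second wait are my own choices, not anything auto-install actually does):

```javascript
// Hypothetical sketch: hold off the install until typing has settled,
// so a typo'd package name can be corrected before npm ever runs.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);                      // cancel the pending install
    timer = setTimeout(() => fn(...args), wait);
  };
}

// Wait 3 seconds after the last change before actually installing.
const installLater = debounce(pkg => {
  console.log(`npm install ${pkg}`);          // would spawn npm here
}, 3000);
```

Each new keystroke resets the timer, so only the final (hopefully corrected) name ever reaches npm.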
Yes. You're right. If it scares you, don't use it.
It's always possible to run malicious code by typo, and this is only a little different from installing dependencies from the terminal. Even when you do spell a package's name correctly, you still don't know for sure what you're installing.
The guy just made a cool thing - it seems a little out-of-scope to freak out over security when npm never really had it in the first place.
The concerns are fair though; just added a --secure flag which installs only popular modules (>10k downloads last month).
I've successfully used Detective in a couple of my personal projects to find all require statements.
Relevant issue on Detective:
So I wouldn't call regex a 'very naive' choice, since the people making it are aware of the tradeoffs and picked regex intentionally.
Whether the choice was deliberate or not is debatable, but given the speed of these parsers I'd reason that there isn't any advantage to using regexes.
Reducing the number of false positives is also one step closer to making this tool somewhat more secure, though it certainly doesn't address any of the previous comments in this thread.
Then you can traverse it and modify it like any other AST.
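To make the regex-vs-parser point concrete, here's a minimal (hypothetical) regex extractor and the kind of false positive a real parser would avoid:

```javascript
// Naive regex extraction of require() targets - the approach being
// debated. It has no notion of syntax, so it also matches requires
// inside comments and strings.
function findRequires(src) {
  const re = /require\(\s*['"]([^'"]+)['"]\s*\)/g;
  const found = [];
  let m;
  while ((m = re.exec(src)) !== null) found.push(m[1]);
  return found;
}

const src = `
const fs = require('fs');
// require('not-a-real-dep')  <- a parser would skip this comment
`;
console.log(findRequires(src)); // → [ 'fs', 'not-a-real-dep' ]
```

For a tool that auto-installs whatever it finds, that second match would mean installing a package you never asked for.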
I love that the Node community enjoys innovating for convenience, but ideas like this one are less than half-baked from a security perspective. Just publish a few typo'd popular packages, use npm install scripts, and you have a very easy remote code execution vector on developer workstations.
The bigger problem I see is that npm is a circus. No package signing, and a ridiculous debate about why not that has been going on for a year and a half. Credential leaks of popular modules.
When everything is a module and everyone is supposed to include modules vs. writing their own very simple functionality for things even like isArray polyfills (24MM downloads a month!), you end up with the same attack surface that gives WordPress such a shitty reputation for security. It's not usually core, it's all the plugins by authors of unknown provenance and skill. WordPress gets pwned because there are a lot of plugins hastily written by new developers and used without audit by mom-and-pop web app shops and/or those that trust the code because they aren't capable of auditing it meaningfully.
When you use an npm dependency, you are taking on all of their dependencies. You are trusting they don't leak creds, that npm has not been compromised, and that the chain underneath has been audited for malicious behavior. In reality this is impractical: go npm install express and see just how deep the dependency chain goes. Things like Snyk are required to just understand what might be vulnerable.
EDIT: Ironically, this module itself is vulnerable to code injection.
This is otherwise known as an active developer community and is a good thing. In any open library ecosystem, it's ultimately up to the developer to carefully choose and vet third-party modules. There isn't any substitute for that.
The alternative is a tightly controlled standard library, but that isn't npm's stated goal. Such a controlled, curated, audited standard library is, however, something that could be built on top of npm, but obviously not vice versa.
So npm being a circus is, in the grander scheme of things, a good thing. Novice programmers will necessarily produce novice code.
edit: if it wasn't clear, I completely agree about the security risks of this project.
What npm says it is doing on paper and in its charter is not necessarily what npm gets used for. At this point in the ecosystem's maturity, npm developers are doing their users a massive disservice and opening them up to a lot of risk. Maintaining this line of a vibrant active community, and "developers should be responsible for their own security", rings hollow.
As I see it, npm appears to be acting like there are a lot of unsolved problems in this realm, and in doing so is endangering a developer community that is absolutely full of amateurs.
Debian solved this problem years ago. Restrict npm defaults to vetted packages, and have people add repositories as need be (e.g. multiverse, Ubuntu PPA, etc.) for packages that aren't audited or by trusted parties.
The Node user experience often ignores most security issues for ease of use; this is OK when you are guarding people against the most likely mistakes. This is a problem of setting insecure defaults and expecting the Internet to play nice. The user experience, the marketing message, and the community at large defends its openness to the death. I'm all for openness, but at some point senior developers should be attempting to shepherd their new developers into making secure decisions, and thinking in a way that is somewhat security minded.
I agree that npm has been a bit slow with a bunch of important features like package signing, sandboxing post-install scripts, etc. but as a counterpoint to the authoritativeness issue, I would argue that vetting and defining "authoritative" packages is a difficult problem. I'm not aware of any open/semi-open package ecosystem that has solved this problem (please do correct me if I'm wrong).
These are solvable problems, but not easy ones to reach consensus on.
The Red Hat-like alternative is to have a central entity employ/pay contributors to audit and maintain libraries, but it's debatable whether npm would have grown to its current size with that model.
Ramda is a utility belt that sticks to pure functional practices wherever it can, something the JS community at large doesn't do, so it shouldn't be the authoritative choice.
Underscore is dead and was replaced by Lodash.
While it's hard to do the above with all kinds of libraries, there are some where it's easy.
But, "Novice programmers will necessarily produce novice code."
Having had my own variety of experience with CPAN, PEAR, Tcl's package thing, C by way of building my own RPMs and DEBs, and now watching the ongoing trainwreck that is npm, yes. You're right about that.
Installing modules from npm is dangerous enough. Nice for education or playing around, unsuitable for a serious developer's workstation.
Heaven knows I've made the mistake of triggering a big download on a bad network.
Opened up an issue https://github.com/siddharthkp/auto-install/issues/1
I don't actually know that there's any problem that `rm -rf node_modules && npm install` won't solve, since that would make things the same as a fresh install.
All of the scenarios I could come up with were about long-running installs, rather than new deploys, so more likely to happen on dev than prod.
ETA: actually wrong, see below.
Even in long-running installs I still don't see a potential issue here. Do you have a small specific example you can think of?
Package B depends on lodash: ^4.14.2
... you know what? I just installed semver to check my beliefs, and it turns out I'm just spreading FUD. Caret doesn't work the way I thought it did.
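For anyone else checking their beliefs: `^4.14.2` means `>=4.14.2 <5.0.0`. A simplified check (my own sketch; the real `semver` package also handles prereleases and the special `^0.x` rules):

```javascript
// What ^x.y.z allows, for x >= 1: same major, and at least y.z beyond that.
// Simplified sketch assuming plain numeric "a.b.c" versions.
function satisfiesCaret(version, range) {
  const [maj, min, pat] = version.split('.').map(Number);
  const [rMaj, rMin, rPat] = range.replace('^', '').split('.').map(Number);
  if (maj !== rMaj) return false;        // major bump never allowed
  if (min !== rMin) return min > rMin;   // higher minor is fine
  return pat >= rPat;                    // same minor: need >= patch
}

satisfiesCaret('4.99.0', '^4.14.2'); // true  - still within major 4
satisfiesCaret('5.0.0',  '^4.14.2'); // false - major bump
```

So a "Package B depends on lodash: ^4.14.2" entry can float all the way up to the latest 4.x, but never to 5.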
a) a typo can run unintended code on your box
b) a bad/old/hacked project (correctly typed) can run unintended code on your box
c) both of the above conditions are considerably more severe because they can be triggered with near-zero friction (unlike a Gemfile, which is pulled down manually and whose code is run from a separate command; Gemfile output is also generally small enough (1-2 pages) that you can visually spot typos if you're watching the output)
d) complex dependency systems can often turn into dependency hell
e) (IMHO) instantly-available dependencies potentially reduce a given programmer's likelihood to attempt to solve their problems with the language itself (i.e. don't use Cassandra when a Dictionary will do) ... the nodejs community seems especially susceptible to this
That's all I can think of right now...
However, I'm not sure I agree with the statement - you could use this tool and still have the discipline not to pull in random packages.
If there is friction, the balance shifts a little against pulling in dependencies, at least the most trivial ones.
Added a --secure flag which will install popular modules only (>10k downloads last month)
Oh wait, npm doesn't even have namespaces...
Remind me again why everyone is using this shitheap?