
“Curl Bash piping” wall of shame - type0
https://gnu.moe/wallofshame.md
======
wang_li
I had this conversation with a person on Reddit just the other day. After I
pointed out that there is no difference between piping a script directly into
bash and downloading a package and installing it, they told me that they can
audit a script/package they download, and that they have personally audited
every bit of code on their Linux box: the kernel, GNOME, and Metasploit.

I got to thinking about what it would take to do such a task, and mainly what
I came up with was a whole damned lot of time. Assuming a person can read,
understand, and remember 5,000 lines of code per day (which honestly I think
is way more than is realistic), it'd still take about 8.5 years to audit the
Linux kernel alone. Add in all the other stuff and I figure something on the
order of two decades. And in the end you'd be running two-decade-old software,
or you'd have to start over.

At the end of the day, security comes from the personal and corporate
economics of reputation, profit, and prison avoidance. Do your best to get
your stuff from people who you judge to be trustworthy and rely on their own
self interest to not be malicious and to do their best to protect their
repositories. And rely on others to be good citizens and report the bad shit
that happens to them so that things can be cleaned up.

Now I'm not saying to throw out security best practices, but people should be
aware that the quality of their systems is built on trust in human nature and
self-interest.

~~~
edwintorok
What if the download gets interrupted due to a network error and bash runs
something other than what was intended? E.g. the script has:

    
    
      rm -rf /tmp/something
    

But when the pipe gets interrupted it executes:

    
    
      rm -rf /

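You can demonstrate the hazard safely, with echo standing in for rm (the byte
count 15 is chosen to cut the stream right after the slash):

```shell
# Cut the stream mid-command: the shell still runs the fragment it
# received. "echo" stands in for "rm -rf" so nothing is deleted.
printf 'echo removing /tmp/something\n' | head -c 15 | sh
# prints: removing /
```

The shell executes the truncated final line at EOF exactly as if it were the
intended command.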
~~~
plorkyeran
The body of the shell script should be wrapped in a function which is invoked
at the end. If you do not trust them to get this incredibly basic thing right,
how can you possibly trust any of the other code they have written?
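The pattern described here might be sketched like this (a minimal example;
the path is made up):

```shell
#!/bin/sh
# Everything lives inside a function; nothing executes while the
# script body is still streaming in through the pipe.
main() {
    target="/tmp/example-install"   # hypothetical install location
    mkdir -p "$target"
    echo "installed into $target"
}

# The invocation is the very last line. If the download is cut off
# anywhere above, the shell hits EOF inside an unfinished function
# definition -- a syntax error -- instead of running half a script.
main "$@"
```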

~~~
RealityVoid
Soo, maybe a silly question... but are downloads from the internet always
"linear"? Meaning, do they go uninterrupted from start to end, and if the
connection drops, does the download stop in the middle of the file? Or does it
fetch "pieces" and then put them together?

~~~
Ao7bei3s
It depends on your download program, but that's irrelevant here because of
the pipe: the data has to be linearized before going through it.

------
arcticfox
I'm really not a fan of this author's style. The content seems angry just for
the sake of it. I'm no expert on the subject, but it's pretty unclear whether
the complaint is even valid, as there are talented, non-"retarded" developers
who disagree:

[https://sandstorm.io/news/2015-09-24-is-curl-bash-insecure-pgp-verified-install](https://sandstorm.io/news/2015-09-24-is-curl-bash-insecure-pgp-verified-install)

And for an egregious example of the author's approach of being angry without
validating anything:

[https://gnu.moe/petpeeves.html](https://gnu.moe/petpeeves.html) - an angry
post complaining about English mistakes that is itself splattered with basic
grammar errors

~~~
0xmohit
> an angry post complaining about English mistakes that is itself splattered
> with basic grammar errors

The following on the home page [0] is hilarious:

    
    
      I also maked a sitemap for those who want to dounloud
      errything.
    

[0] [https://gnu.moe/](https://gnu.moe/)

~~~
tempodox
OMG, that looks like a parody of a 1990's web site, except that it's not a
parody.

~~~
cheiVia0
How's it not a parody? It seems to be parodying a pseudo-Japanese Free
Software enthusiast, likely by a 4channer, and just happens to have some real
content.

Also: <!DOCTYPE QTML>

------
eridius
This page is completely wrong.

`curl | sh` is insecure if you're using an http url, although as has already
been said here, it's not really any more insecure than "download this script
and run it", unless you're expecting all of your users to actually read
through the whole script first. But if you're using an https url then you
should be ok since the page cannot be hijacked or modified en route (unless
the attacker actually has a trusted certificate for the domain, which is an
attack that's way out of scope of this discussion).

The biggest risk with this approach, which the page doesn't even mention, is
the danger of the connection being terminated before the whole script is
downloaded, as the shell will still evaluate what was sent. But this can be
handled in the script by making sure that an early EOF means nothing is
actually run (e.g. you can wrap the whole script in a bash function, and then
the last line executes the function).

So if you're using an https url, and the script is written to be resilient
against early termination, then this is a perfectly reasonable way to install
things.

~~~
btgeekboy
There are a few subtle differences you haven't mentioned:

1) If you attempt to read the script in your browser first, and everything's
great, then go pipe to bash, the server can send alternate content based on
your user agent.

2) You'd have a hard time proving the first point, or reviewing to see if a
script acted poorly, if you didn't have a local copy, which piping to a shell
like this normally prevents you from doing.

~~~
chrismonsanto
> If you attempt to read the script in your browser first, and everything's
> great, then go pipe to bash, the server can send alternate content based on
> your user agent.

If you can't trust the site to give you a safe installer then you can't trust
the rest of the sources it gives you either--you would need to audit the
entire package. Virtually nobody is going to do that. Singling out the
installer as uniquely dangerous is security theater.

~~~
seanp2k2
Windows and OS X also do this, and present pointless, non-defeatable "omg but
this is from the scary internet, are you SURE you want to run it????" dialogs.

~~~
thedudemabry
To be fair, those have been very effective in keeping my non-techie family
members from accidentally installing malware because they'll at least check
with me when presented with a scary system dialog. And it hasn't impacted my
own use other than the occasional extra clicks. In a locked-down corporate
setting, I can see how that could be maddening, though.

------
hannob
"curl | bash" isn't actually very different security-wise from "download this
piece of software from our webpage and install it". Every time you install
software these days you put a certain amount of trust in the vendor. (This may
change in a theoretical future of reproducible builds and binary transparency
logs.)

I'd argue that `curl http://.* | sh` is always bad, but so is every webpage
offering software downloads that isn't https by default (and there are plenty
of them).

------
profmonocle
How is this more of a security risk than having the user do wget
[https://example.com/whatever.deb](https://example.com/whatever.deb) and then
dpkg -i whatever.deb? Or adding their apt repo & public keys? Sure, the
project maintainer could include malicious code with a curl bash, but they
could do so in either of the ways I mentioned as well.

And maybe it's not relevant, but I find it really off-putting how the author
calls these developers idiots and retards constantly.

~~~
bkanber
It's definitely off-putting. And I'm totally fine with curl | bash. Composer
(PHP package manager) does that and I've used it hundreds of times (production
is containerized, but I do this on my personal machines too).

BUT. There is a difference -- code signing. HTTPS ensures that the data isn't
compromised en route, and trust in the vendor is what makes you OK with
letting them run code on your computer, but neither of those things protects
against a compromised payload. I.e., if the vendor's server gets hacked and
the script replaced, HTTPS doesn't help, and you get code that the vendor
never intended for you to run. Code signing is what protects against this,
cryptographically ensuring that the code you got is exactly the code the
vendor wanted you to have; it is the last link in the chain that connects your
machine to a trusted vendor.
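A minimal sketch of the idea, using a checksum rather than a full signature,
and with a local file standing in for the download (in real use the expected
digest would come from the vendor over a separate, trusted channel -- here we
compute it ourselves only to keep the example self-contained):

```shell
# A "downloaded" installer, faked locally for the demonstration.
printf 'echo install ok\n' > /tmp/install.sh

# The vendor would publish this digest out-of-band.
expected=$(sha256sum /tmp/install.sh | cut -d' ' -f1)

# Verify before executing: refuse to run anything that doesn't match.
actual=$(sha256sum /tmp/install.sh | cut -d' ' -f1)
if [ "$actual" = "$expected" ]; then
    sh /tmp/install.sh
else
    echo "digest mismatch; refusing to run" >&2
fi
```

Real code signing (e.g. `gpg --verify` against a key you already trust) adds
the piece this sketch lacks: the digest itself is authenticated, not just
compared.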

~~~
eridius
But it also requires you to have some way to actually validate that the
signature is valid. Apple provides this service to registered Apple
developers, so you can code-sign your independently-distributed app or
installer and the certificate you sign with is generated and signed by Apple.
But in nearly all cases of code signing I've seen outside the Apple developer
ecosystem, it's GPG signatures, which relies on you being able to
independently verify that the signing key is valid and belongs to the vendor
and was not compromised. Which is to say, not very many people who download
stuff outside of a package system are actually going to validate that sort of
thing.

------
notacoward
I often see the claim that curl|bash is no worse than what a package manager
does. That is simply untrue, for the following reasons.

(1) HTTP (no S) MITM. At least the lazy devs admit this one.

(2) No key/signature checking _at all_. Sure, some semi-lazy devs will tell
you to add their own repo, and maybe you don't check the key for that repo
yourself, but there are others who do and they'll raise an alarm you might
hear. With curl|bash you don't even get this kind of herd immunity.

(3) No dependency checking.

(4) No adherence to standards. If you've ever tried to get a package included
in e.g. Fedora or Debian, you know that there are people who will go over them
with a fine-tooth comb and will reject them if they do bad things (or do them
in a bad way).

(5) Most install scripts don't handle interrupted downloads well unless the
author has taken special care (thank you for this one eridius). If you're
piping directly into the shell you have no idea whether that's the case, and
if the dev's lazy enough to be doing things this way in the first place the
odds are poor.

Packages and package repos can be deployed and used in many ways. Some ways
provide pretty strong safeguards and guarantees; other ways are weaker.
Curl|bash is weaker than _any_ of them. There's just no excuse.

~~~
seanp2k2
(1) You can easily see that and just not use it. (2) is incorrect; see
[https://rvm.io](https://rvm.io) -- it verifies the script. (3) You can do
that within the script, though you'd have to handle different package
managers. (4) is true for packages as well: you can put whatever you'd like in
an RPM or deb, including e.g. a post-install hook to sudo rm -rf
--no-preserve-root /

Making packages is hard if you want to support many distros. Maybe we should
adopt the Go model and just ship binaries.

~~~
Rapzid
Underneath the curl|bash method, which is what is being discussed I believe,
there is a link to a "more secure installation":
[https://rvm.io/rvm/security](https://rvm.io/rvm/security). These instructions
have you download and verify a signed installer.

------
inlined
This is a perfect example of dogma over logic. I'm not sure if the author just
prefers we all use the Mac app store. Three rules for any installation
process:

1. Make sure you trust the vendor

2. Make sure you trust the delivery (TLS)

3. Think twice before you sudo

~~~
Rapzid
In the world of signed payloads, the delivery (2) includes all of the
distribution infrastructure. And trusting the vendor (1) means verifying the
signature of the payload with their public keys. The assumption here is that
the building and signing infrastructure is more secure than the distribution
infrastructure.

This level of security isn't desirable for everyone in all situations, but
it's not just dogma and it's not "the same thing".

------
GuiA
I don't disagree with the author that it is a huge security risk. But this is
yet another example of a software enthusiast who doesn't get the value of
convenience. In fact, "convenience" is probably the first attribute of great
software.

So again, while I agree with their general sentiment, being "baffled by how
[oh my zsh] became so popular" just because it instructs users to curl pipe
shows that they don't get the core issue at play here.

~~~
paulddraper
If I had to provide some sage piece of wisdom that summarized the learnings of
my life, it would be this:

"Never, ever underestimate the power of convenience."

------
amalcon
Piping curl into a scripting environment is problematic for a lot of reasons.
For starters, it has a worse user experience than "Download and run this
program". I know we love inventing worse user experiences than what's been
done for decades, but it's a trend that needs to reverse.

It's easier to mess up security (the early reset thing). This type of install
script can't be run offline, because it needs to fetch dependencies (so you
can't download it once and run it on multiple systems, and you can never run
it on an airgapped system). It accustoms users to pasting commands into shells
without knowing what they do, which is irresponsible even without JS clipboard
shenanigans.

The author of this page went way overboard with the rhetoric, but the simple
truth is that there's no good reason to suggest an install method like this.
Even taking the exact same script and asking the user to download and run it
is a better plan, because it gives an improved user experience (though still
nowhere near ideal).

And yes, I've seen Sandstorm's defense of this practice. I use and very much
respect that project, but I couldn't disagree more with the choice of
installation method.

~~~
chrismonsanto
> there's no good reason to suggest an install method like this

It is useful for bootstrapping a package manager. Haskell's Stack uses this,
Rust uses this, Nix uses this, etc.

~~~
amalcon
Why, though? Why is it better than asking users to download and run the exact
same script?

~~~
chrismonsanto
Because it's easier for their use case to copy and paste (1 step) than do
however many steps you want them to do. If you want to run something offline
or run it on multiple machines there is generally a different method available
for you to do so. That's not the use case curl | sh is trying to solve in this
instance.

~~~
amalcon
Copy and paste is 3 steps:

    
    
      - Select text
      - Start up a shell
      - Paste the text into the shell
    

Download and run is 3 steps for a shell script:

    
    
      - Click download link
      - Start up a shell so you can access stdin/stdout/stderr
      - Run the script in the shell
    

For a GUI program, it's 2 steps:

    
    
      - Click download link
      - Run the file from your download manager once it's downloaded
    

Why is piping the file directly into my shell easier than any of these? It's
considerably worse than the GUI option, and slightly worse than the shell
option because running programs is something I do from shell all the time.

~~~
chrismonsanto
> Download and run

I generally have a shell open, so it is easier for me to copy/paste into that
session than open a file dialog, save a temporary file, (potentially) switch
to a different directory, run the script, and delete it.

> GUI program

It creates a temporary file on my disk, and it needs a correct file
association for .sh since the download manager won't save it with +x (on my
machine it automatically opens in Emacs, which isn't what I want). I don't
even know how to make Linux open a terminal automatically for .sh. I know on
Windows cmd.exe would close the window immediately after running the script,
so you can't see the output. All in all, more complicated than just copying
and pasting into an active shell session.

~~~
amalcon
The shell is an invariant for script-based installers (i.e. required no matter
what). My workflow is also apparently somewhat different from yours, because I
tend to open a new shell for each distinct task.

Still, if you're the type of person who generally has a shell open, you can
probably figure out how to pipe something from a URL into an interpreter on
your own. Which is the better thing to teach to users that can't figure that
out?

I haven't manually deleted a temporary file in years, I just have a cron job
that clears out my downloads directory. Though this may be a case where my
weird workflow changes things from typical, from what I've seen most people
just ignore the temporary files.

Most of your objections to the GUI program workflow don't actually apply to
GUI programs. I admit that I forgot about the executable bit. I use a mix of
Linux and Windows, and basically zero Linux packages have GUI installers --
most prefer .deb/.rpm files, which would be ideal. So I haven't actually run
into that before.

It seems like it's only a couple of clicks to fix the permissions in the
Nautilus GUI through Firefox, though. Really I would just run the installer
from a shell anyway, but that's part of the better user experience: I get to
choose that, it's not forced on me.

------
pcwalton
"And most of all, the people that are part of the project are also likely to
be malicious because trying to infect someone is the only valid reason to
recommend this method of installation."

I've seen a lot of random accusations about us in the Rust project, but this
is the first time I've seen anyone accuse us of deliberately trying to spread
malware. I guess I'm learning how it feels to be a politician :)

------
michaelneale
Amusing list. I would like to share it but the unnecessary use of "retard" is
a bit upsetting when there are many other appropriate terms (fools, dangerous
idiots, whatever)

------
seagreen
Railing against `curl https://foo/ | sh`? I guess security theater's not just
for the TSA.

(Do be aware of pastejacking though, but this is not nearly an important
enough issue for a wall of shame)

EDIT: We can all learn something from the readability of that page though. One
zoom and it's better than 90% of the websites I've seen. Text -- it works.

~~~
lowboy
Monospaced typefaces have never been good for readability.

~~~
tempodox
I beg to differ. There is so much botched typography on the internet, and
such widespread use of Arial-like fonts, that monospaced, while not ideal, is
far from the worst of choices.

------
raz32dust
It boils down to: is there any type of attack that can compromise `curl | sh`
without compromising a download from the same site? I can only think of one --
the download allows for an md5 check.

Also, the language is too condescending to be taken seriously. Not that the
tone has anything to do with the argument, and this probably matters less in
the software world. But if you want to be taken seriously, you either need to
be Linus Torvalds or learn to disagree respectfully.

------
codehusker
Security is not pass/fail. What threats are enabled by this method? Are there
mitigations? Are there alternatives?

It would be a mistake to group all curl pipes. Does it require elevated
privileges? Is it served over TLS? Does it do any signature verification? What
the heck does the script actually do?

Different levels of security are required depending on trust. I trust Debian's
repository, so I feel less need to audit packages. But a random startup
promising ponies? I'd like to at least skim what I can, then throw it in a
jail/vm/container to test.

How is curl piping beneficial compared to grabbing the script, giving it a
quick read, then executing it? Convenience is all I can come up with, and
convenience often seems to be at odds with security.

------
richard_todd
Even though I know it's fine, it still feels weird to me when a site asks me
to pipe their script directly into a shell. Since I always have the option of
breaking it into two steps, though, it doesn't bother me, and I don't think a
"wall of shame" is called for.

I do usually glance over a script/makefile/whatever before I run it -- not so
much to find security issues but to see if there's anything I'd like to tweak
about it. For example, I always install Homebrew in a nonstandard location,
and that means changing a couple of things about the installer script first.

------
icameron
At least you have the option to download the script and glance at it before
running it if you're worried. Compare that to "Save and Run" on an MSI in
Windows, which puts the same level of trust in the product you're downloading
except it's much more opaque as to what is actually happening.

RVM and Homebrew are two big projects that also use this method of
installing. They are a breeze to set up. There's something to be said for
just getting a job done and going home for the day.

------
voltagex_
This might be attacking the wrong issue. How about making it easier to create
packages for major distributions? (this includes Windows and macOS)

~~~
nixpulvis
You mean like `brew` for macOS, or the many other options like it?

It's not the wrong issue, because I believe the main goal here is to raise
general awareness that piping untrusted data to bash is a bad idea, and that
we as developers should frown on it, not promote it.

~~~
voltagex_
Yes, but what alternative are you suggesting for developers who want to
distribute software to multiple platforms?

IT security people can't just come in and say "you've got issues here, here
and here" without also providing solutions.

~~~
notacoward
Bash scripts aren't magically multi-platform either. If you want something to
work right on multiple platforms, you have to understand those platforms'
proper distribution and packaging methods. Sure it's a pain, but so is writing
the mother of all bash scripts to handle every condition on every platform.
You won't get it right, just like all the people who tried to do the exact
same thing back in the 80s didn't get it right. That approach failed so
consistently and so badly that people created package managers to bring some
sanity to the situation. They're still the best solution available, even if
they're not perfect.

BTW, it's not _just_ about security. It's also about correctness, consistency,
repeatability, reversibility, auditability, etc. As a developer, I still build
and install actual packages on my test systems not because there's a security
issue but so I can be sure that an uninstall/reinstall will work exactly as
they should and not pollute my system with untracked changes. I don't know
what kind of developer wants to risk checking in changes that don't match what
they tested, but not the kind I want to work with.

------
echelon
Aside from the fact that this is a plaintext markdown file, the tone of this
page is an immediate turn off.

------
tyre

      And most of all, the people that are part of the project
      are also likely to be malicious because trying to infect
      someone is the only valid reason to recommend this method of 
      installation.
    

This lacks any empathy.

One valid reason: large projects (e.g. rvm) have proven over time to not be
malicious. It is far easier for a user to copy and paste one line than any
other install method. This lowers the barrier to entry and reduces support
requests for the maintainer.

Having an opinion is fine. Disagreeing is fine. Pretending like your answer is
the only reasonable one: not likely to win over many people.

------
thinkmoore
This is among the many use cases we're hoping to solve with our secure shell
scripting language Shill
([http://shill.seas.harvard.edu/](http://shill.seas.harvard.edu/)).

We're currently working on a commercial version of Shill targeting Linux. If
Shill sounds like a product your company wishes it could find, we'd love to
hear from you. Shoot me an email at sdmoore@fas.harvard.edu.

------
new299
Unless you're installing from a package manager with signed packages
everything is going to suck by comparison.

What exactly is the alternative the author would suggest? Git checkout?
Couldn't you paste-jack that too? If the instructions come from a webpage,
aren't they all basically paste-jackable? What is the specific issue with
using this method?

------
lucb1e
I've seen so many arguments over this, I wrote a short article on it.

In short, I'd prefer apt-get over curl-bash any day of the week, but most
Windows users install loads of software (sometimes signed, never checked)
since their OS offers nothing better, and also on Linux you never hear this
debate when someone offers a package for download. People worry about what you
might be piping to bash because they can see it and notice it could have
easily been anything; a deb package (or equivalent) is much more opaque.

More details: [http://lucb1e.com/!126](http://lucb1e.com/!126)

------
phn
How is this different than, say, `brew install <whatever>`?

I'm trusting someone to serve me a piece of code to run either way: Brew, or
the people that provide the https cert for the given endpoint, right?

~~~
htns
When a third party is involved there is a degree of accountability. A proper
package repository would require everything to be logged and signed.

~~~
phn
Well, there's also accountability if I'm installing, say, docker.

Surely they want to make it easy to install their thing without a hitch, and
that's why they provide [https://get.docker.com](https://get.docker.com) for
me to pipe into bash.

My point is it all boils down in who you trust. If you are downloading
something unknown, sure, it's harder to go wrong with a package manager (if
the package is available), but you're still trusting someone not to attack you
or to leak the private keys.

Crucifying curling into bash has nothing to do with how safe you are. It's
almost like saying "Never run anything you download from the internet, it's
dangerous!"

------
mindslight
Many of the comments here are utterly missing the point. Likely, you've been
spoiled by the cancerous javascript ecosystem and so don't realize what
exactly you're giving up.

1. An https URL is _not_ secure unless your trust model involves trusting the
server completely. Unless you're running this script in an isolated throwaway
sandbox, this is a terrible idea.

2. Obviously auditing can't rule out well-hidden maliciousness or clever
bugs, as we're up against the halting problem. But it is quite easy to do a
quick sanity check on a downloaded script to see if it is going to do anything
wacky or boneheaded.

3. Explicit packages are expected to have identities like versions and
hashes. This allows us to talk about how something has been modified, whether
a specific download instance has been tampered with, etc. A rando script has
these things from the developer's perspective, but not from the users'.

4. These "easy install" scripts usually want to puke files into random places
on your system: /usr/local, /opt, /home/crap, /who/knows, etc. This is a great
way to create an unmanageable system. Standard practice for software outside
the package manager is to let the user choose where things should be
installed, e.g. ./configure --prefix=xxx. Lazy people choose /usr/local and
can always blow that part of their system away, the more astute use
/usr/local/pkg-1.0.0 (for stow), and some even have completely arbitrary paths
(I personally use something like /x/local-x64/pkg-1.0.0, which gets synced
across machines with unison).

5. Such scripts usually continue their reign of brokenness by instituting
some sort of auto-updating. Now the user has little idea what version they
were running (say they want to upgrade to a source install to investigate some
bug), is less able to avoid changing versions at an inconvenient time, and is
further discouraged from bringing the package under their own management.

The vast majority of these installers provide files that would be fine as
plain archives, but the distributors think they're being clever while
forgetting about users' general requirements. I do understand that in this age
of concentrated Metcalfe's law, it helps to appeal to the lazy people who
don't really care if their system becomes an insecure unmaintainable mess. But
really, you owe it to your users to provide a proper downloadable package that
is installed in a manageable way. (And the same goes for proper versioned
source releases, as opposed to telling users to grab a random git checkout.)

~~~
jjnoakes
You seem to be arguing install scripts vs package managers, and we are all
discussing something else.

~~~
mindslight
Sure, but the two generally go together.

The last time I saw a downloadable shar-esque installer was quite some time
ago, and I've never seen software which installs a proper distribution's
package using curl | sh.

Plain install scripts are still around, but are generally fixed when a
package grows up. A large problem with curl | sh (especially one which runs
further curls) is that it makes a crappy approach masquerade as a polished
solution.

------
mioelnir
I do not even dislike curl|sh for security reasons. Package managers go to
great lengths to provide a reproducible runtime within them. When was the last
time you saw 'curl | env -i sh -C' as an instruction?

If the script fails halfway? Good luck trying to undo whatever it did if you
do not have access to `zfs rollback` or similar.

It is also less than fun to go through `zfs diff` and the downloaded script
to make a package out of it that can be distributed and automated.
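What `env -i` buys you is easy to demonstrate (the variable name here is made
up):

```shell
# A variable from your session that a piped script could silently
# pick up -- or be subverted by.
export INSTALL_OPTS="--prefix=/somewhere"

sh -c 'echo "opts: ${INSTALL_OPTS:-unset}"'
# prints: opts: --prefix=/somewhere

# With a scrubbed environment the script sees none of it, so a run
# on your machine behaves like a run on anyone else's.
env -i sh -c 'echo "opts: ${INSTALL_OPTS:-unset}"'
# prints: opts: unset
```

The `-C` flag additionally makes the shell refuse to clobber existing files
with `>` redirection.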

~~~
__david__
Agreed, I think this is the biggest issue. Also you have to trust some random
installer to put the binary...somewhere? Is it going to overwrite junk in
/usr/local? Does it assume ~/.something is available? Does it require root and
then try to stuff code into /etc? Does it work if the install directory has a
space in it? What if there's a symlink somewhere in the path?

There's a million things that the script can do stupidly, and practically
every single one has at least one assumption that is bad.

One trick I've learned is to edit the script before running it and prefix
anything that looks dangerous with "echo" (because of course none of them ever
support --dry-run). Then I can at least see what they are doing, what they are
downloading, etc.

curl|sh is the bane of my existence. Shame on you if that's your _only_ means
of installing.
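The echo-prefix trick can even be roughly automated with sed (the command
list here is ad hoc and nowhere near complete -- it's a triage aid, not a
sandbox, and the file names are made up):

```shell
# A stand-in for a downloaded installer.
printf 'mkdir -p /tmp/demo\nrm -rf /tmp/demo\n' > /tmp/installer.sh

# Prefix destructive-looking commands with echo so they print
# instead of run. Only a handful of commands are covered here.
sed -E 's/^([[:space:]]*)(rm|mv|dd|chown|chmod) /\1echo SKIPPED: \2 /' \
    /tmp/installer.sh > /tmp/installer.dry.sh

sh /tmp/installer.dry.sh
# prints: SKIPPED: rm -rf /tmp/demo
```

The harmless mkdir still runs; the rm is printed instead of executed, so you
can see what the script would have done.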

~~~
jjnoakes
None of this is an inherent problem with curl and piping however; any
installer you could download and run has the same list of issues, and many of
those aren't even auditable.

You should redirect your anger away from curl and pipe and toward using
install scripts vs package managers in general, because that's where your beef
really is.

~~~
__david__
I don't know about you but I don't end up running binary installers too much.
Certainly not on linux.

Even so, windows style binary installers are at least frameworks designed for
installing stuff (many with years of bug fixes under their belt), while the
curl|sh style installers are just ad-hoc one-offs written in a language that's
known for being pretty hostile to defensive programming.

So yes, any installer _could_ make those errors, but in my experience only
random shell installers seem to do that. Saying they are the same is a false
equivalency in my eyes.

~~~
jjnoakes
Who said anything about only binary installers? I specifically mentioned
install scripts as being different from the subject of the discussion, and you
seem to be conflating the two.

~~~
__david__
You said, "many of those aren't even auditable."

The only installers I can think of that aren't auditable are binary
installers. If you meant something else, I'm not understanding.

------
yeowMeng
I try to read scripts before I download and execute them. The only time I
thought WTF was with this one:

[https://bootstrap.pypa.io/get-pip.py](https://bootstrap.pypa.io/get-pip.py)

Even though the author made it clear something funny was about to happen, I
just could not execute it.

When I showed it to the guy next to me (at work), he said he had already
installed it on another box and didn't even look.

~~~
deathanatos
Where is the WTF in that script? Just because the author chose a base-85
encoding over base64? (That's a little weird, as I suspect the savings are so
minor as to not be worth it …)

~~~
yeowMeng
It could be base3, base64, base84... that does not matter.

The part that I find interesting is how many people who read the script are
going to decode the blob and verify that it is non-malicious.

------
gepoch
I like this approach:

[https://github.com/ellotheth/pipethis](https://github.com/ellotheth/pipethis)

Either check a pgp signature if there is one, or skim the script before
executing. Also covers off that pesky dropped connection problem.

curl | sh is too handy to ever die, but it's possible to be smart about it!

Edit: typos

------
dwightgunning
> because they are either run by retards or intelligence agencies.

Now would be a good time for the author to expand their vocabulary.

------
arnarbi
curl|sh isn't the least bit less secure than downloading a software
distribution and just running it.

------
0xmohit
I'd be tempted to say that `curl | bash` is as insecure as

    
    
      ./configure && make && sudo make install
    

The logic in the article effectively implies that one shouldn't install any
software without auditing every single line of code.

------
rkeene2
AppFS ( [http://appfs.rkeene.org/](http://appfs.rkeene.org/) ) solves almost
all of the problems with "curl | bash" and indeed package managers in general.
I'm the author. Feedback desired.

------
0942v8653
(Archived version if it's down:
[http://archive.is/Y9z0w](http://archive.is/Y9z0w))

------
ryanlm
Doesn't that popular software that tracks you tell you to install by piping
the output of curl? I think it's called Brew Ware.

------
NuSkooler
"Download this executable"

"Add this repo and key"

...so on.

------
api
This stupid hack exists because Linux is so fragmented it's impossible to
distribute software easily in any other way.

That and most Linux distributions are significantly harder to deal with both
technically and administratively than Apple or Google app stores.

Fix these problems and curl pipe bash will die.

But here we have the usual sort of nonproductive advice you get from security
people: shaming with no thought to underlying causes and God forbid we try to
improve anything.

------
kylek
Also see [https://curlpipesh.tumblr.com/](https://curlpipesh.tumblr.com/)

------
yuhong
[https://weakdh.org/logjam.html](https://weakdh.org/logjam.html) even uses it
as one of the examples. (the site has since been fixed)

