
Someone should push a patch to upstream bash that will make any script longer than 50 lines abort. If you're building something that takes 2500 lines of shell script, you're using the wrong tool for the job. You're trying to make a program, but in a syntax (I was going to say language, but shell scripts aren't a programming language) that is utterly unsuited for anything like it.

A shell script has a very simple function: execute a number of tedious but common commands. That's it. Unfortunately someone decided they wanted control flow, so they wrote the horrifying [ and [[ programs, which I still consider to be a crime against humanity. Enter decades of write-once-read-never scripts that run with abysmal efficiency, kill maintainability, and neuter innovation. And somehow the entire Linux ecosystem depends on it and now every distro has to ship a shell with cruft dating back to 1970, with probably as many hidden vulnerabilities. And now every shell that tries to position itself on the market has to implement the same quirks, bugs, and design flaws in order to even be considered.
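(For the record, `[` really is a program, or at least started as one: on most systems it still exists as an executable, and the closing `]` is just its required final argument. `[[`, by contrast, is a bash/ksh shell keyword, not a program. A quick sketch:)

```shell
#!/bin/sh
# "[" exists as an actual executable on most systems (paths vary):
ls /bin/[ /usr/bin/[ 2>/dev/null

# The closing "]" is simply its mandatory last argument:
[ 3 -gt 2 ] && echo "yes"
# prints: yes
```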

Let shells be shells, let shell scripts be dumb as bricks, and please use python or even perl for the love of god if you're going to make a build system.

I see this sentiment a lot, but it's not a useful one, as that ship has already sailed.

Please read this comment: https://www.reddit.com/r/linux/comments/7lsajn/oil_shell_03_...

It's not a small amount of shell, and it's not even old in some cases. People are still writing big pieces of shell code. I point out at the end of this comment that Kubernetes, a brand new cluster manager from Google (2014 or so), has 48,000+ lines of shell in its repo.

Shell is definitely a terrible language for many things, but that's what I'm trying to fix, in a way that doesn't force you to rewrite all your code. Some things will never be rewritten -- it's like saying "Hey why don't you rewrite Wikipedia in Python and get rid of all the PHP?" It will never happen, for fundamental reasons of economics.

Also see this comment for more of the "why":


I plan to write these up on the blog, as mentioned at the end of this post.

> I see this sentiment a lot, but it’s not a useful one...

Consider this response for entertainment purposes only. It’s easy to become insult(ed|ing) or get into a holy war, but I’m really not invested in any of that.

That said: it is a useful sentiment, and a kubernetes script written in sh(1) is not the answer to the question.

That one “can” does not mean one “should”.

The reddit comment cites init scripts (I happen to use RC scripts, and haven’t viewed init scripts in a while) as an example of why shell is so great - everybody uses init!

1) of course everybody does. And it -happens- to be written in shell. So what?

2) this is fine! The scripts I see are 10s of lines long and practically flow top->bottom w maybe one or two conditionals.

This is more akin to a job control language. A perfect application for sh(1).

I just feel it (and you can too): sh(1) (or bash - nobody cares) features feel archaic and brittle. People have written impressive work in brainfuck[0][1], or C that emits only mov[2] instructions; is that a case for brainfuck projects or mov-only assembly? No. It's just interesting. And kubernetes scaffolding, or anything else big that happens (by some miracle) to work, is no excuse for huge shell scripts. That's just a tautology - otherwise we'd be talking about the virtue of these scripts. For a boot sequence (rc/init), even setting aside that the short scripts are actually suitable for that important job, I'd say their other grace is that sh(1) is demanded by POSIX, so you don't need to wonder whether it's available in the nascent state of booting your OS - it's guaranteed. The same couldn't be said for ruby or python.

From the “see this comment” comment:

> Have you ever written a bash completion script yourself? I did a few years ago and I am scared to ever touch it again! I have a non-deterministic bug that I can't figure out, so I just restart my terminal every time completion gets borked. And I know bash very well.

The comment admits what a warzone the space is... Sure, fix it, but we don’t need to promote it.

The sentiment of discouraging huge shell scripts the GP expressed is useful. Resist.

[0] https://en.wikipedia.org/wiki/Brainfuck

[1] https://rosettacode.org/wiki/Category:Brainf***

[2] https://youtu.be/R7EEoWg6Ekk

It's clear from these comments that I need to explain the project more concisely on the blog (which I mentioned in the conclusion), but here is a short response:

(1) In retrospect, I should have responded to the first commenter differently. He was clearly angered by having to debug other people's shell scripts in the past. But that is exactly what I'm trying to fix -- I'm trying to provide an UPGRADE PATH OUT OF BASH. I explicitly state that in this long post, which I think some people have missed.

These posts show one part of the plan: http://www.oilshell.org/blog/tags.html?tag=osh-to-oil#osh-to...

(2) I'll repeat my objection: trying to convince people not to use bash is going to be about as successful as convincing people not to use PHP. Not only is it not a solution to the problem, it also ignores the humongous installed base of PHP code, like Wikipedia, etc.

Even if another line of bash never gets written, you'll still have to deal with it on a regular basis!

I am providing a path out of bash, while other people are just wishing the problem would go away. The world is not how you want it to be, but exhortation on online forums isn't going to change the world. Less charitably, "you're wasting your breath".

I need to write a blog post entitled "Reimplementing Bash is the Only Way to Kill It". This is analogous to how Facebook is "removing" PHP from their codebase by developing the Hack language and VM.

It's also analogous to how Microsoft "killed" Lotus and other competitors by understanding their file formats.

(3) I also don't think you understand why cloud projects like Kubernetes are using shell scripts, probably because you don't work in that domain. Shell really is the best tool for that job.

I think it's presumptuous to state that the people writing that code don't know what they're doing. A lot of the project is in Go, but a lot of it is in shell, and there's probably a reason for that other than ignorance.

To start, I think your project is excellent and I don't mean to take away from your work. It has borne fruit, which is an accomplishment on its own, to say nothing of the intrinsic value of simply investigating and hacking in whatever domain interests you. I like to think I'm fond of hacking in "unsexy" spaces, and I think hot-rodding a shell is that kind of space.

Re: 3) above -- Curious to hear about the reasons.

Re: "...people writing that code ...other than ignorance" -- I'm not trying to imply people working on this are idiots... though with broad-sounding statements like "scripts with more than 20 lines should be looked at suspiciously" I can see how someone might read that. I think there are myriad poor reasons a shell script might be inappropriately used. There's also a case for "if it's not mine, who am I to judge?". I'd be curious to hear why shell is especially suited for (e.g.) cloud projects like Kubernetes, though.

Good luck with your work.

For #3, shell is a language for dealing with processes and the file system. And that's exactly what you need to bring up servers / containers / cluster managers like Kubernetes. Also stuff like OpenStack and distributed file systems. Shell is universally used to automate these projects.

I think you need to actually do it to have an appreciation for this. Other people here have made the same points I'm making, but if you don't try it, you won't viscerally understand it. I would just caution against the attitude of assuming that people who use a tool you don't like / don't want to learn (understandably) are using the wrong tool for the job.

As mentioned, I should write a blog post called "Python Is Not an Acceptable Shell", in part based on this post, which is very detailed, but I think proves the opposite point:


Of course, bash has significant downsides for this task too. That is why Oil exists!

> sh(1) (or bash - nobody cares) features feel archaic and brittle

Maybe some shell features are not that great, but in general I feel the opposite. Many of the shell features are great and lacking in other, more general-purpose languages - for example, program composition via pipes, stdout/stderr redirection syntax, direct access to the filesystem, interchangeability of a function and a program call, and others.
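To make the interchangeability point concrete, here is a tiny sketch (the `upper` function is a made-up helper defined for illustration):

```shell
#!/bin/sh
# A shell function drops into a pipeline exactly like an external program.
upper() {
    tr '[:lower:]' '[:upper:]'
}

# grep is a program, upper is a function; the pipe doesn't care which:
printf 'error: disk full\nok\n' | grep 'error' | upper
# prints: ERROR: DISK FULL
```

Swap `upper` out for a real program later and nothing else in the pipeline changes.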

When somebody decides to write a big system, like Kubernetes, they had better be pragmatic rather than idealistic. That means choosing the right tool for the job, even if this tool has some deficiencies or is considered quirky by some people. Manipulating VMs and containers on localhost or remote hosts, interacting with the OS and the filesystem, all on a unix-like system -- it seems to me that shell is the best tool there is for most of those tasks.

> composition via pipes

That’s a good, big one in favour of shells.

Re: file system, std err/out, I’m not convinced, but pipes alone are worth a lot. I wasn’t thinking at all of pipes, and you’ve tipped the scales, but is it enough? Do Tcl/Perl/ruby/python/... compete in that space?

FWIW this post is related:

Pipelines Support Vectorized, Point-Free, and Imperative Style


I started writing a series on "what shell can do that other languages can't", but I put it on hold until the shell is in better shape. I hope to resume that series in the next few months.

> Re: file system, std err/out, I’m not convinced, but pipes alone are worth a lot.

How does the existence or lack of a standard data interchange format affect the value of those pipes?

It seems to me that theoretically the effect is brutal. Practically, it’s negligible.

We pipe everyday, all day, and the world carries on, so that’s a pragmatic vote of confidence for pipes.

Are you trying to get me to write a “worse is better” essay that ends with “...and the existence and continued durability of shell scripts is proof enough that their place is not only justified, it’s vital!”, because it won’t work. Pipes are good. Shell scripts should be looked at suspiciously if they’re more than 20 lines long.


Pipes aren't even something shell scripts have a monopoly on. They're just as easy to use in Perl. Perl may not be known for its readability, but it still ends up miles ahead of an equivalent script in bash.

I do wish there was a first class syntax for it in Python (but you can get most of the way there by using generator/coroutine pipelines).

So much arrogance and negativity, so little evidence for the claims you make.

> A shell script has a very simple function: execute a number of tedious but common commands. That's it.

And why is that? What will happen to me if I write more complicated shell scripts? The functionality is there and one can write very expressive and readable programs in shell. I would definitely recommend shell for writing long programs (for example various system administration and web backend tasks which do not require complex libraries). Preferably bash since it is the most common one.

> And somehow the entire Linux ecosystem depends on it and now every distro has to ship a shell with cruft dating back to 1970

The 'somehow' has quite a simple rationale, and it isn't something that should outrage you. The family of Linux-based OSes deliberately takes unix systems as its inspiration and compatibility target. Unix systems are primarily operated via shell.

The current standard of shell may be an old idea from the 70s with some 'cruft', but what language does not have cruft? The shell has proven to be usable and worthy of continued use. Compatibility with the unix shell is a big part of the early success of GNU/Linux, and the majority of users need it.

> And now every shell that tries to position itself on the market has to implement the same quirks, bugs, and design flaws in order to even be considered.

I have no idea what you are talking about.

> please use python or even perl for the love of god if you're going to make a build system

OK, so you think perl and python are better for making a build system. Have you made one?

> bash since it is the most common one.

I'm pretty sure you meant to say "POSIX, because its rules are supported by multiple shells, and you can't depend on bash being available, or on a recent version, on some systems".

Actually, I meant bash -- that is, if one works solely with Linux systems. Bash has some very useful features and is the default on common Linux systems. Those features are not POSIX. POSIX was frozen in time long ago and is very limiting; do not wear that straitjacket if you do not have to.

For commercial Unices or BSDs, bash could be hard to use, and POSIX shell may make more sense.
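For what it's worth, a few of the bash conveniences in question (a sketch, not an exhaustive list; none of these are in POSIX sh):

```shell
#!/bin/bash
# 1) Arrays -- POSIX sh only has the single positional list "$@".
deps=(curl git jq)
echo "installing ${#deps[@]} packages: ${deps[*]}"

# 2) [[ ]] with glob matching -- POSIX has only [ / test.
f="backup.tar.gz"
[[ $f == *.tar.gz ]] && echo "looks like a tarball"

# 3) Process substitution -- no POSIX equivalent.
diff <(sort <<<$'b\na') <(printf 'a\nb\n') && echo "sorted correctly"
```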

From my experience, the extras Bash gives you are likely to be bigger hints that it is time to move to a more appropriate language.

I'm definitely pro shell scripts, but I'm also willing to use a different language when POSIX sh isn't going to cut it.

Any unix system will have sh(1).

Not all have bash. Not all have python. More probably have perl than python, so if you want something other than sh, then perl should probably be it.

I'm aware of that. When I said POSIX, I was referring to a POSIX-compatible sh implementation.

Just for the error handling and namespacing, python is more suited for big code bases. But there are way, way more reasons.

> What will happen to me if I write more complicated shell scripts?

As part of my job currently, I am going to delete it.

The same goes for poor python code, but I will take bad python over shell scripts any day of the week.

> As part of my job currently, I am going to delete it.

Go away, or I will turn you into a very small shell script.

    find / -name \*.sh -size +10k -type f -print0 | xargs -0 rm

    $ mangle find -delete
                  Delete  files;  true  if  removal  succeeded.   If the removal failed, an error message is issued.  If -delete fails, find's exit status will be nonzero (when it eventually exits).  Use of -delete automatically turns on the
                  -depth option.

                  Warnings: Don't forget that the find command line is evaluated as an expression, so putting -delete first will make find try to delete everything below the starting points you specified.  When testing a  find  command  line
                  that you later intend to use with -delete, you should explicitly specify -depth in order to avoid later surprises.  Because -delete implies -depth, you cannot usefully use -prune and -delete together.

Is mangle an actual program that's publicly available? Searched for it but didn't find anything.

Thank you.

Does this escape spaces in filenames properly?

Find's `-print0' will output results using null separation. Xargs' `-0' argument will treat input as null-separated.
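A quick demonstration of why the null separation matters, using a throwaway temp directory:

```shell
#!/bin/sh
# Filenames with spaces (or even newlines) survive the pipe only
# because of null separation.
dir=$(mktemp -d)
touch "$dir/my old script.sh" "$dir/plain.sh"

# With plain -print | xargs, "my old script.sh" would be split into
# three bogus arguments; -print0 | xargs -0 passes it through intact.
find "$dir" -name '*.sh' -type f -print0 | xargs -0 rm

ls -A "$dir"     # prints nothing -- both files were removed correctly
rmdir "$dir"
```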

As a sysadmin you can pry the ba(sh) scripts out of my cold dead hands. I'm not a dev, I'm not really a programmer, I'm the one who has to support all the crazy shit you devs are throwing at the wall to see what sticks. Bash is how I get shit done.

I'm just tired of bash getting bashed every time someone mentions shell scripting, based on what mostly seems to be bandwagon reasoning. Of course, if you are writing more than $arbitrarycomplexity, drop into perl, etc., but there is so much that can be done quickly, easily, and readably, even by a non-programmer like myself, that I think there are many benefits to using shell scripts that are far too often ignored.

Also, a perfect example of a great bash script that's over 2k LOC, and I've got plenty more where that came from: https://github.com/centminmod/centminmod/blob/master/centmin...

Don't take this badly, but when you're not a programmer it's hard to understand how 2k LOC are probably 50 LOC done wrong. It's a self-fulfilling prophecy: programmers spend their days finding abstractions to turn the long and winding into the concise, but to the rest of the world we're just lunatics.

Now I'm not as pushy as OP, but it's far from baseless bandwagon.

> it's hard to understand how 2k LoC are probably 50 LoC done wrong

I challenge you to reimplement the shell script that grandparent linked in 50 LOC. In fact, shell script has the chance to be a lot more succinct than other languages for a lot of administrative tasks since the invocation of other programs is a first-class citizen. For example, compare this Go snippet:

  cmd := exec.Command("apt", append([]string{"install"}, dependencies...)...)
  cmd.Stdin = os.Stdin
  cmd.Stdout = os.Stdout
  cmd.Stderr = os.Stderr
  if err := cmd.Run(); err != nil {
    fmt.Fprintln(os.Stderr, err.Error())
    os.Exit(1)
  }
to its equivalent in shell script:

  if ! apt install "${DEPENDENCIES[@]}"; then
    echo "apt install failed" >&2
    exit 1
  fi
Or even just `apt install "${DEPENDENCIES[@]}"` with "set -e".

Finally, what is it with sysadmins maintaining large scripts while at the same time quipping that they are not programmers? A script is very much a program, and a scripting language is very much a programming language.

> what is it with sysadmins maintaining large scripts while at the same time quipping that they are not programmers?

We've met some programmers, and don't want to be compared to that[1].


Programmers will embark on some massive mission to build what Apache and dumb CGIs can already do[2].

Programmers will justify their slow software by saying it's readable, or by saying that the JIRA ticket didn't specify how fast the solution needed to be.

Programmers will make user interfaces that require clicking multiple times to find some bug, then require the bug report to tediously repeat those clicks, then complain that the bug report wasn't accurate (they can't find a button that says "OK" when it says "Close", and there are no other buttons).

Programmers will actually shit in the pool, and say it improves readability. Seriously: I have seen programmers delete perfectly working code and replace it with something broken (especially on the edge cases) and then ask the sysadmin team to wake up at 3am to restart their software every night. Code that was working fine for years. Code that wasn't even in their project. Fuck that.

[1]: https://news.ycombinator.com/item?id=16155641

[2]: http://marcio.io/2015/07/handling-1-million-requests-per-min...

What you mean is not "programmer", but "software engineer". Everyone who introduces themselves as a "software engineer" to me starts with -100 points on the scorecard in my mind, because that's exactly the sort of people you're talking about.

I program. I’m not a “programmer”.

I do lots of other things, like make sure my software works, and make sure the users are successful with it — but I don’t know what that job title is.

“I print money for my employer” seems a little OTT...

Software developer

I also sell, do marketing, technical writing, negotiate with vendors, manage multiple product roadmaps, manage teams of people who program, manage servers and networks, manage sysadmins, cook, and so on. But because I was also AS21863 for a decade, I've always been a sysadmin who programs, and not a "programmer" who runs a network (and does these other things). But this isn't really about my job title.

The point is that I've been programming for over three decades, and over time this skill became just one of many things that I do.

"Programmers" are people for whom that's all they do, so they look to solve every problem with more programming. Wanting to call them "software engineer" or "software developer" or "potato" or anything else doesn't matter to me.

This is probably common, but I'm not praising "programmers" like that, just to be clear.

Rewriting for the sake of it, with regressions, is plain bad.

Also, just in case: I'm not saying sysadmins are lesser; but their job is not to learn about programming languages and paradigms, so they don't know what goes on there. That they can make good programs, sure. Sound logic is not limited to "programmers".

It would be a fun challenge. Note, for instance, that the first part of the script is a dependency check; the whole thing is basically `import a,b,c,d` in meaning.

It's true that calling programs shortens things tremendously, but every time you have to glue things together you're back to grep/sed and all the fragile bash idioms.

Of course you could have standalone programs to do that; in the end it would be like having a non-bash programming language in disguise.

> Finally, what is it with sysadmins maintaining large scripts while at the same time quipping that they are not programmers? A script is very much a program, and a scripting language is very much a programming language.

You are correct, but I mostly use this preface to let people know I don't have the detailed formal training in programming most devs do. So while I can wiggle my way around perl/python/bash/awk/sed and other "systems languages" just fine I am constantly finding out how much I don't know.

I suppose in the early hacker days sense of the word most of us are programmers.

> I am constantly finding out how much I don't know.

You think that once you have a CS degree, it's not like that anymore? If anything, that feeling gets more and more common over the years.

> In fact, shell script has the chance to be a lot more succinct than other languages for a lot of administrative tasks since the invocation of other programs is a first-class citizen.

Sure, so the fair comparison to make is with calling library functions - all those other programs had to be implemented separately too. If I were reimplementing the linked script I'd use something like Puppet where "ensure package installed" is a first-class citizen too.

> shell scripts aren't a programming language

Please explain why. Because simple observation seems to prove the opposite.

That would be pretty hard to sell to the bioinformatics crowd. My Bash scripts are long, but easy to read. I'm just setting some environment variables, running some algorithms (many output to stdout and can be piped directly into other algorithms), moving the results around, writing to a log here and there. Easily over 50 lines. Why would you force me into Python or whatever? I'm just gluing different programs together and moving files around.
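That glue pattern might look something like this sketch, with `sort`/`uniq`/`gzip` standing in for the domain-specific algorithms (the file names and the `THREADS` variable are made up for illustration):

```shell
#!/bin/bash
set -euo pipefail

export THREADS=4                 # environment variables the tools read
log=pipeline.log
echo "run started" >> "$log"

# Made-up input; in practice this would be your sequencing data etc.
printf 'b\na\na\n' > input.txt

# Tools that write to stdout pipe straight into the next algorithm:
sort input.txt | uniq -c | gzip > results.gz

# ...then move the results around and log:
mkdir -p results
mv results.gz results/
echo "run finished" >> "$log"
```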

> And now every shell that tries to position itself on the market has to implement the same quirks, bugs, and design flaws in order to even be considered.

That's patently false: https://fishshell.com

fish is praised a lot, but how often is it used?


Tell that to the Zsh community. Or the community for any embedded scripting language -- Emacs Lisp, Vimscript, etc.

Zsh, especially with 'set -eu', is a perfectly suitable, albeit slow, replacement for Perl, which is in my opinion more distasteful.

Another example: Arch Linux is an entire distribution developed (at least 90%) in shell scripts. For example, the installer [1] is three shell scripts of 300-400 lines each. Packages are built with `makepkg`, which is a 2400-line shell script.

It helps that the Arch developers are good at writing clean shell scripts.

[1] In the package "arch-install-scripts".

Yes, Alpine's APKBUILD seems to be very much modelled after PKGBUILD, and "abuild" sounds analogous to "makepkg". It's also 100% shell, but it uses busybox ash rather than bash.

I prefer pure hand-written shell to an unholy mix of auto-generated shell and Make. The latter is what I remember Debian's build system being, although admittedly I haven't looked at it in a while.

Alpine's apk-build was also written in Shell, but rewritten in C: https://github.com/alpinelinux/apk-tools

I wonder what the reason was.

There is "abuild" to build packages, and "apk" for users to install packages. abuild is definitely still in shell (~2600 lines).

I'm not sure if apk used to be in shell, but I wouldn't be surprised if it were. If you have a link to the old version I'm interested.

I think the main reason not to use shell for the user side is the version solver. To install packages, you have to solve dependency constraints, which is actually an NP-complete problem! Those heuristics are best coded in C.

In contrast, the build side doesn't have to do anything like that, or it can just shell out to "apk" if it does.

Usually rewrites to C are for performance and portability. In this case I suspect the former.

Example script:

    #!/bin/bash
    trap '(( LINENO < 10+3 )) || { echo "Script executed more than 10 lines of code! Aborting."; exit 88; }' DEBUG
    echo 1
    echo 2
    echo 3
    echo 4
    echo 5
    echo 6
    echo 7
    echo 8
    echo 9
    echo 10
    echo 11

    $ ./test.sh
    1
    2
    3
    4
    5
    6
    7
    8
    9
    10
    Script executed more than 10 lines of code! Aborting.

That would have the nice side effect of killing the systemd/init debate. (Disclaimer: I was foolish enough to write a continuous integration script in bash).

What's wrong with a continuous integration script in bash? A basic CI script just has to check source control periodically, run the build (make or something similar) and notify anyone if there's an error and maybe publish the result somewhere. I've done similar with the far less capable batch files.

Sure it's not as great as a full on CI server, but it doesn't require the same investment either. For my personal projects I have bash scripts to handle basic CI (compiling and testing), given a bit of time I think I could string cron, make, bash and m4 into a CI server that fits 90%+ of my needs.
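That cron+make+bash CI might be sketched like this (the repo path, branch name, and notify step are all assumptions for illustration):

```shell
#!/bin/bash
# Minimal CI loop: poll source control, build, record failures.
# Schedule from cron, e.g.:  */5 * * * *  /usr/local/bin/ci.sh
set -u

notify() {                        # stand-in for mail/IRC/whatever you use
    echo "$(date): $*" >> ci-failures.log
}

cd /path/to/repo                  # assumption: your checkout lives here
git fetch origin
if [ "$(git rev-parse HEAD)" != "$(git rev-parse origin/master)" ]; then
    git merge --ff-only origin/master
    make test > build.log 2>&1 || notify "build failed, see build.log"
fi
```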

Oh it was great in the sense that it was very quick to write and it works... but it is not very easy to graft on new features.

FWIW I wrote a post that touched on that debate last year:


(I didn't come down on either side of the debate -- it's more of an exploration.)

Comments: https://news.ycombinator.com/item?id=13477842

I have always wondered why we don't have a shell program that doesn't do scripts -- no flow control or fancy macros -- and a separate program that runs scripts -- no readline or job control (with pipes and redirects simply treated as syntax).

We tried that. It was called the C-shell. Most people would use it on the command line and write scripts in Bourne shell, but some people kept trying to write scripts in C-shell for some incredibly stupid reason, and so it got banned.

Indeed, I have vivid memories of reading the chapter in Unix Power Tools devoted to this very topic: https://docstore.mik.ua/orelly/unix2.1/upt/ch47_01.htm

Good to learn! I always thought the C-shell was aimed at the scripting part (due to the sound of "C"); I guess I missed the boat.

One could argue that HTML is a programming language. https://www.youtube.com/watch?v=4A2mWqLUpzw

Peter van Roy puts XML and S-expressions in the "descriptive declarative programming" paradigm, with data structures only (no Turing equivalence):

