Hacker News
I got tired of PHP and Perl, so I tried bash (github.com)
312 points by nerdgeneration 6 days ago | 177 comments

Shell scripting gets a lot of hate, but after spending time to grok the language (and copious use of shellcheck), I've grown quite fond of it. This also has the nice side effect of emphasizing use of old unix tools.

I think the "big realization" for me was shifting to think in pipelines instead of passing state around in variables. That completely removed the pain of no arrays in POSIX shell for me. Stdin/stdout are your friend.

Certainly, shell scripting isn't the answer for everything, but with shellcheck (and bats!), I feel like it's a really reasonable default for systemy things. Heck, the status bar I use in my window manager is essentially just a 700-line bash script, with asynchronous updates and everything. It's even quite readable and orthogonally organized, if I might boast. I have been pleasantly surprised over the years every time I go back and read it.

I think one of the biggest problems is the inconsistent runtime environment. I mean, yes, you can just say fuck POSIX and code for Bash only, but then again, not even Bash is consistent across its versions. Just to give an example: The infamous `set -e` behaves quite differently across different shells
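For illustration, here's a sketch of one classic `set -e` gotcha (the suppression rule for condition contexts; how strictly shells have applied it has varied over the years and across bash versions):

```shell
#!/bin/sh
set -e

check() {
    false                  # would normally abort the script under set -e...
    ran_past_false=yes     # ...but not when check is called as a condition
}

# set -e is suppressed for commands tested by if/while/&&/||, and that
# suppression extends into functions called from that position.
if check; then
    branch=then            # taken: check's status is that of the assignment
else
    branch=else
fi
```

Names like `check` are of course made up; the point is that the same script can die or carry on depending on which shell, and which version, runs it.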


And those are just POSIX-compliant shells. I mean, this doesn't matter if you just want to write a script for yourself, but it becomes frustrating when you want to level up your shell skills and write cross-platform scripts ;-)

That said, I have to add that I love shell scripting... I am not exactly sure why, but I guess it is because it lets you build powerful things with very few lines of code.

Yes. I have already been burned by it. Some scripts were developed without bashisms in order to be portable across POSIX shells. Initially, they were used on ksh. Recently, they were tested only on bash. I had to use them on Ubuntu (where /bin/sh is a symlink to dash). Nothing worked and it was hard to identify the issues. I changed the first line of all the scripts -> problem fixed. Portability of scripts is an illusion: all you can do is test on all the target shells. If you do not have the budget for testing on shells other than bash, do not waste your time trying to write portable code.
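For illustration, a few of the bashisms that typically bite when /bin/sh turns out to be dash (POSIX replacements shown alongside; the variable names are made up):

```shell
#!/bin/sh
x=y

# [[ $x == y ]]            # bashism: dash fails with "[[: not found"
[ "$x" = y ] && match=yes   # POSIX test builtin

# echo -e "a\tb"           # bashism: echo flags/escapes vary wildly by shell
tabbed=$(printf 'a\tb')     # POSIX printf handles escapes portably

# arr=(1 2 3)              # bashism: POSIX sh has no arrays
set -- 1 2 3                # the positional parameters are the portable "array"
count=$#
```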

I agree, and I like having those ‘oh wow, I can do that in bash? That’s dangerous, but awesome’ moments just by researching how to get stuff done. Really people don’t know what they’re getting themselves into when they start down the shell scripting path.

When isn't this true, though? JavaScript behaves differently based on your browser and version. C runs differently based on your compiler version and system libs. This is just the nature of software.

I think the particular problem with shells is that POSIX (the standard) didn't really keep up with the development of new features over time. Instead, the different implementations introduced new features and nowadays there are some features which are quite common but not part of the standard (just from the top of my head: curl/wget; mktemp; bashisms).

With JS that is different. Yes, every browser has a few extra bits here and there, but there is very little that many browsers have that is not part of the standard. With compilers, it is different because the developer can decide which one he wants to use at build-time. After that, his program still might behave differently on different platforms (depending on APIs, etc.) but in general, the language that he uses behaves the same. With shell scripts that is not the case as the behavior depends on the run-time shell and the developer has very little influence on which shell will be used.

So while you are right that every programming language has this problem, some are better at managing it than others and shells are particularly bad at it in my opinion.

You do realize standardization exists to avoid just that?

What's your point? Shell is also standardized by POSIX.

Shells, C, and JavaScript all have standards. That doesn't mitigate versioning and platform differences entirely.

And to make things worse, macOS ships with a 12-year-old version of Bash.

Indeed, I have been burned by that... Wrote a script which worked perfectly fine with Linux. Tried it with MacOS...

Oh gosh, it was such a pain to work around those old bugs...

Does anyone know why Apple doesn't update bash?

I believe Apple ships the last version of bash released under GPLv2, and does not ship versions under GPLv3.

Apple has a problem with GPL3, not sure what exactly but that’s the underlying issue.

The problem is that "shell scripting" is "bash scripting", and that particular outgrowth of the original Bourne shell isn't really focused on scripting. It was mostly about interactive improvements, and while a few of the bits of added functionality help (like not needing to shell out for every bit of string manipulation), it doesn't contribute a lot to writing longer scripts. Zsh just doubled down on the interactive use. And let's never mention csh again.

Now there was one offshoot that tried to get better at longer scripts: the Korn Shell. In its later edition (ksh93) it actually had hashes and a pretty good way of expanding its functionality. Heck, dtksh would've given Tcl/Tk a run for its money in the 90s if both ksh and Motif (the library dtksh used for its widgets) weren't proprietary at the time.

I worked on a "devops" system in the early '00s that was used to provision all kinds of Unix servers and workstations (including HP-UX, AIX, Solaris, Linux, BSD...), and the major parts of it were written in ksh93. It worked quite well.

I still think that all of this is inferior to both Perl5 and Tcl/Tk, but ksh93 made some decent strides way back when.

I have a special place in my heart for Perl5, but I would never use it for anything DevOps related... Perl’s motto is “there’s always another way to write it” — and that’s the exact opposite of what you want in a DevOps environment :)

Oh yeah, and Motif can die in a fire.

Sorry to say, but I find the criticism in this form to be quite meaningless. What does a motto have to do with any kind of validity for a certain purpose? Python has the "Zen of Python", does this inherently make it a better solution?

"There are many ways to skin a cat" is older than computers, and it applies to basically any programming language. Some languages make it harder to do it in a different manner, some allow it, but don't offer it without including some additional package. C has no hash tables, but it's not that hard to get them, for example.

I'd go so far as to say, that plenty of current programming languages allow more ways to do a thing out of the box than Perl. What mythical language does offer a straight and completely obvious path from problem to solution? Haven't I ascended enough in the halls of programmer-dom to be told of this coding panacea?

And why would this be particularly bad for DevOps? Are DevOps programmers worse and/or more easily confused?

Just recently we had a thread here about different ways to remove duplicates from arrays, and that was done in Golang, devops Holy Grail du jour.

> And why would this be particularly bad for DevOps? Are DevOps programmers worse and/or more easily confused?

DevOps is all about consistency and readability... neither of which are strong points of Perl. Perl was the de-facto cross-platform DevOps language in the 90s (seriously quite a few Mac programs of the day installed Perl as a dependency). There’s a reason the industry moved away from it 15 years ago.

I’ve been building in Perl5 for over 20 years, and it’s really easy to paint yourself into an edge case where things don’t quite work the way you expect them to. Everyone who writes Perl learned it a slightly different way, which makes managing an active codebase of Perl “fun” — it has a tendency to devolve into spaghetti code over time once you get a lot of contributors.

> DevOps is all about consistency and readability

As opposed to any other field of programming?

Again, what weird single-approach programming language are we comparing Perl5 with?

Bats, shellcheck and jq have completely transformed the shell scripting experience for me. To the point where I'll reach for bash much more often than other languages these days, for anything terminal based.

jq also has some really nice quick interactive features. My favorite super simple one is that jq beautifies JSON by the very virtue of piping it into jq with no additional arguments.
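For example (assuming jq is installed; passing the identity filter `.` explicitly does the same thing and reads better in scripts):

```shell
#!/bin/sh
# Skip quietly on machines without jq installed.
command -v jq >/dev/null 2>&1 || exit 0

# jq re-indents compact JSON; '.' is the identity filter.
pretty=$(printf '%s' '{"a":1,"b":[2,3]}' | jq .)
lines=$(printf '%s\n' "$pretty" | wc -l | tr -d ' ')
```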

Haven't tried BATS before, but it sounds like the missing part of our bash codebase, which I am currently mostly skeptical of because it isn't unit tested.

I'd fear though that having BATS would just increase the tendency to let a bash script grow from 50 lines to 200, when it should've been ported to Python at 100.

Actually, my experience with bats is that it forces you to modularize and keep things small. Easier to test things that way. Bats is truly an indispensable tool when doing shell scripting – I even use it to test CLIs built in other languages too!

Pipeline thinking: This is the pipe-and-filter architectural style. Well documented in the industry.

For pipe-and-filter system management use case, I tend to think that Powershell is the peak of the shell evolution currently, but that is probably my limited insight. When I learnt shell scripting, zsh was the best available :) but I preferred the ksh.

> I think the "big realization" for me was shifting to think in pipelines instead of passing state around in variables.

Ever since I started dabbling in Racket I've grown fond of not dealing with state and looking at everything like a function. (I'm not a functional programmer.)

The old UNIX philosophy with pipes is very close to that as in every command in the pipeline is a function in similar way. I'm not saying hacking together a bash script is FP but it's still possible to apply the FP thinking in the UNIX pipeline better than building a monolith that handles the whole pipe. I can pass state around (env vars) but it's an extra effort and it forces me to think whether I can get it done without state first (via optimizing the input output formats).
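A tiny sketch of that style, with made-up function and data names: each stage reads stdin and writes stdout, so stages compose like functions and no state crosses between them except the stream itself:

```shell
#!/bin/sh
# Each "function" is a filter: stdin in, stdout out, no shared variables.
scores() {
    printf '%s\n' 'alice 42' 'bob 7' 'carol 99'
}

passing() {
    # Keep the name on every line whose second field exceeds 10.
    while read -r name score; do
        [ "$score" -gt 10 ] && printf '%s\n' "$name"
    done
}

result=$(scores | passing)   # composes like passing(scores())
```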

It is very inconvenient, inefficient and slow. For example, lists are slow as hell. And the syntax is kind of a workaround - look at the '[' executable for example. It definitely has its place in CS history and it made things possible in the past, but using it for anything more complex nowadays is a bad, bad, bad idea!

I am glad people prefer Python or Ruby for more complicated stuff and Go, Java or compiled langages like Rust, C/C++ for performance-intensive jobs.

Perl is fast but the syntax is terribly non-intuitive and I would mark it 'do not use for new designs'.

> [...] the "big realization" for me was shifting to think in pipelines instead of passing state around in variables. That completely removed the pain of no arrays [...]

Could you elaborate please? I see that data - especially sequential, array-like data - is conveniently passed via pipes (and e.g. jq https://stedolan.github.io/jq/ uses this concept), but I feel you're saying something deeper or a different kind of use-case... [perhaps because you say "state" not "data")

The sibling comment by @glic3rinu pretty much says it, but I'll elaborate a little bit. Can't make any claims about being anything deep, but in a sense, my shift in perspective came from treating program state as "data in pipes" and doing that as extensively as possible.

A big part of this was figuring out I could use the `read` builtin to parse stuff into local variables:

    while read -r foo bar baz _; do
        # do stuff with $foo, $bar, $baz
        # $_ contains anything extra on the line
    done
Also, once everything is on stdin, all the standard unix tools Just Work without having to echo and print everywhere, e.g. something like

    echo "$some_local_variable" | unix_tool
For me, this turned a lot ugly scripts into primarily a collection of functions that compose nicely via pipelines.

Anyway, using `read` works especially nicely with tabular data too. For example, a naive CSV parser might look like

    while IFS=',' read -r field1 field2 field3 _; do
        # do stuff
    done
This is the reason we need something better than bash, rather than sticking with it until it's been used for 50 years. It's just too cryptic.

Fish is better but still looks legacy.

> Fuck quoting/escaping, it will work 99% of the time.

Encouraging this is why bash scripts scare me.

By "state" I think OP is referring to a style of programming heavily based on global variables (global state). Function calls set global variables that the caller can read afterwards. Functions and imports (sourcing) are all riddled with "side effects", making the execution of the program very difficult to follow (spaghetti code).

Because "return values" in bash are implemented by echoing (printing to stdout):

  return=$(my_function arg1)
I believe OP's point is that echoing easily-parsable output is a way of implementing "data structures" in shells.

  result=$(my_function arg1)
  result_a=$(echo "$result" | grep A)
  result_b=$(echo "$result" | grep B)

Shell (Bash) scripting has a few huge problems, mostly with error handling and number arithmetic. The arithmetic one isn't a large problem for its niche, but the error handling is so large a bother that it's best avoided for anything complex, even if something else (AKA Python) will lead to much more code.

That said, there are niches where bad error handling isn't as problematic. Ironically, Bash is much more fit for web programming than for system scripting.

Exactly this.

Every time you reach for an array in bash, try a pipe instead.

Every time you desire multidimensional arrays, pipe through filters.

(At least that's what I think you're saying.)

After writing a decent backup shell script in fish, I converted it to bash, as IntelliJ didn't have support for fish and fish's non-strict POSIX compatibility wasn't too comfortable either. I then realized it's quite a pain to write a long script in bash knowing modern languages are far easier to write in. I may just go back to using perl for shell scripting.

None of the later scripting languages are really meant for writing shell scripts, as invoking commands is just tedious compared to using backticks in perl.

Talk about going in circles of history.

If that was the big realisation, you may enjoy FP.

So anyway, I've been writing websites for 20 years, and this seemed fun. Some friend suggested I post it here, so I'm doing that just to shut him up.

What will you do once you get tired of bash?

Port it to Javascript, naturally.

bash.js has a nice ring to it

WebAssembly would be pretty cool too. I hear there is now something called WebAssembly "usermode" which can run directly in ring-0.

WebAssembly + Ring-0 = Security /s


LOL. I see what you did there xD

I made some progress with an assembler version but it's just gruelling, hard work.

I knew a guy who was writing Win32 apps in asm in high school. Heard that he became really weird in uni.

Your title made me smirk

Many years ago I taught myself Bash by writing this: https://github.com/louwrentius/PPSS

It's a huge bash script for parallel processing of work. It was great fun and I'm proud of it as I don't see myself as a programmer of any sort.

But gosh since I learned Powershell and Python, I would never ever ever ever ever use Bash for anything substantial.

If you ever feel like you grow 'fond' of Bash or Shell scripting, it's time to seek a doctor. Please learn Python or some equivalent instead.


Edit: reading the code of this web server project:

Let's be absolutely clear: this is a joke project. DO NOT USE THIS FOR ANYTHING, ESPECIALLY ON A PUBLIC FACING SERVER.

I've found it harder than expected to simply port a bash script over to python. Mostly because the syntax for 'run command c with arguments x y' is much worse than bash's. And that is before you want to handle output from the command.

It would be super nice if there was a package offering "native" shell experience in python. Say

  from pyshell import sh, ShellError

  def func(a, b):
      output, retcode = sh("df -k | awk '{print $5}'")
      if retcode == 0:
          return output
      raise ShellError(output, retcode)

The sh module is really excellent, and it lets you take the best parts of bash (the external commands, pipes) and ditch the bad parts (string handling, general programming). I think it’s what you’re looking for.


I wrote https://github.com/tarruda/python-ush for my own scripts. It was inspired by "sh", but my version supports native "lazy" pipelines (using python operator overloading) and other nice features not provided by sh. It also works seamlessly between Unix/Windows.

I was also planning to extend it to support python3 asyncio, but never got around to it.

https://shell.readthedocs.io/en/latest/ looks promising, will try next time before writing "just a small script" in bash (small scripts always grow and you pile on dirt on them when starting in bash).

This looks great, thanks! It looks very easy compared to subprocess.run and friends. I hope it'll help convincing my colleagues to default to python instead of bash (or, heavens forbid, csh)

Check out https://github.com/tarruda/python-ush

It provides an idiomatic shell scripting experience for python.

For example, it uses the "pipe" operator overload to allow complex pipelines to be created (pipes are created using platform native pipes, so there's no performance difference from a shell pipeline). The same operator can also be used for redirections

Isn't this what os.system(...) does?

By the way, why not put the raise inside sh?

EDIT: Ok, the difference is that os.system does not return stdout. But you can use the subprocess package to capture stdout (and stderr as well). If you want pipe syntax, you can invoke e.g. bash -c "...", where the dots are your command. But the problem with this is that it's prone to injection attacks, unless you are very careful. I guess the same warning applies in your original "sh" example.

Fabric'll do that for you (or rather Invoke, I guess I should say as part of the Fabric 2 changes)

    >>> from invoke import run
    >>> date = run("date")
    Sat Apr 13 11:09:18 PDT 2019
    >>> dir(date)
    ['__bool__', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__nonzero__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', 'command', 'encoding', 'env', 'exited', 'failed', 'hide', 'ok', 'pty', 'return_code', 'shell', 'stderr', 'stdout']
    >>> date.return_code
    0
    >>> date.stdout
    'Sat Apr 13 11:09:18 PDT 2019\n'
    >>> do_they_exist = run("grep banana /etc/passwd")
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/Users/pgraydon/Envs/python3/lib/python3.7/site-packages/invoke/__init__.py", line 46, in run
        return Context().run(command, **kwargs)
      File "/Users/pgraydon/Envs/python3/lib/python3.7/site-packages/invoke/context.py", line 94, in run
        return self._run(runner, command, **kwargs)
      File "/Users/pgraydon/Envs/python3/lib/python3.7/site-packages/invoke/context.py", line 101, in _run
        return runner.run(command, **kwargs)
      File "/Users/pgraydon/Envs/python3/lib/python3.7/site-packages/invoke/runners.py", line 271, in run
        return self._run_body(command, **kwargs)
      File "/Users/pgraydon/Envs/python3/lib/python3.7/site-packages/invoke/runners.py", line 404, in _run_body
        raise UnexpectedExit(result)
    invoke.exceptions.UnexpectedExit: Encountered a bad command exit code!
    Command: 'grep banana /etc/passwd'
    Exit code: 1
    Stdout: already printed
    Stderr: already printed
If it gets a non-zero code back (like grep does when it fails to get a match), it raises an UnexpectedExit exception.

This is a simple example, but usually you have command line arguments, the output can be large, and you might have input to process. As soon as you have that, you arrive at a more complex API, but one where you don't shoot yourself in the foot so easily (e.g. with special characters in file names and friends).

I disagree. If I’m porting a shell script to another language then I would want to use as much native code as possible rather than shelling out. Sure, it takes a little more effort, but the output is more idiomatic code that runs faster (fork() is slow).

Fwiw ruby borrows heavily from perl, and can be coaxed into serving as a driver for other commands. See eg:



Perl/awk like one-liner/filters in ruby:



Of note is that ruby allows "backtick execution":

  filename = 'some_file.txt'
  wordcount = `wc --words #{filename}`

I find that much of the strength of bash/shell comes from the surrounding utilities though. So simply jumping to ruby won't magically let your "shell script" work on both Windows and Linux just because you have ruby installed; there likely won't be a similar set of building blocks like "wc" etc. You could of course replicate most via the ruby std lib - and it's arguably easier to write test-driven ruby than bash.

Perl is much closer to bash in this regard; I mostly use

  qx(some command some parameters some variables)

Nice one. This is my favorite way of learning languages. I find it really hard going step by step through a book from scratch; my fingers get really itchy and want to skip ahead to solving real problems. My result is usually a lot worse than what you built there.

Seems like a lot of effort and iterations went into this?

I have seen quite a bit of legacy plumbing like this in production / automation until better (or at least standardized) tooling came along with the agile/devops movement. A lot of this plumbing survived surprisingly long (which is also scary). Often code like this matures over long periods into stable solutions that could even deal with errors well. Makes them hard to replace.

I agree it doesn't belong in production but it looks pretty neat for small internal stuff even in 2019.

It was a lot of work but it was my first real attempt at programming and the topic itself was motivating enough to bring it to completion.

Learning from books doesn't work, you need to have a project with a goal to apply it in.

The horrible honest truth is that the world actually doesn't run on bash, python, C or Java.

It runs on Excel macros.

This. Some of my Structural engineering professors have invested so much time in Excel they practically have an instance of their mind in it. Really fancy systems too that could do just about anything they needed. Always impressed me.

Your project is a great read!

I hated Perl until I had to do Bash scripting.

I must apologize to Perl. I see now where it was coming from [1], and what it saved us from.

May god have pity on your soul for inflicting us with this.

[1] There are lots of quirks in Perl that I now recognize come from shell scripting. Most obviously, $VAR notation and string quoting.

Really you should be starting with bash; that's how you know why these other languages exist. My rule of thumb used to be that if I have more than one "if" in a shell script then it's going to python... But now it's any ifs or variables.

Edit: now that I read the readme I think the title should have been "Show HN: I made a web framework in bash". This is pretty neat. I still wouldn't use it, but it's cool as hell.

> My rule of thumb used to be if I have more than one "if" in a shell script then its going to python... But now its any ifs or variables

Hmm. I have a slightly different view. If logic is complex, then it should be moved to a different language. But if there are a lot of flat if statements then shell is fine.

Yeah it’s almost inevitable for a bash script to have lots of ifs but they’re almost always simple, checking some variable is empty or something.

Learning perl first stunted my bash knowledge; I'm very, very quick to bail into perl at the first sign of difficulty. :)

Do you consider that a bad thing?

Unless he didn't mean to use the word "stunted", I would say yes.

Note to self: Don't ask rhetorical questions on HN.

Suggestion: You use "set -e". The problem with this is that it's silent about which line the script died at, so if your script doesn't print a lot, you may have to do detective work. This is an automatic way of giving you a bit of a hint:

  $ cat meh.bash
  set -e
  die() {
    echo "Failed at line $1: $2"
    exit 1
  }
  trap 'die $LINENO "$BASH_COMMAND"' ERR
  echo a
  test 1 = 3
  echo b
  $ ./meh.bash
  a
  Failed at line 8: test 1 = 3
Adapted from: https://unix.stackexchange.com/a/462157

If your script dies because of `set -e` the detective work is just the punishment to condition yourself to write more stable code in the future ;-)

By the way, if you are just using Bash (instead of POSIX /bin/sh) you can do even better. Some time ago I found a stacktrace function somewhere:

  set -Eeuo pipefail
  trap stacktrace EXIT
  function stacktrace {
      if [ $? != 0 ]; then
          echo -e "\nThe command '$BASH_COMMAND' triggered a stacktrace:"
          for i in $(seq 1 $((${#FUNCNAME[@]} - 2))); do j=$(($i+1)); echo -e "\t${BASH_SOURCE[$i]}: ${FUNCNAME[$i]}() called in ${BASH_SOURCE[$j]}:${BASH_LINENO[$i]}"; done
      fi
  }

If you also set 'x' then you'll get a log of how it's interpreting each line.
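A quick illustration (the `+ ` prefix is the default PS4, and the trace goes to stderr, so it can be captured separately from normal output):

```shell
#!/bin/sh
# 2>&1 >/dev/null captures only stderr: the stream carrying the -x trace.
trace=$(sh -c 'set -x; greeting=hello; echo "$greeting"' 2>&1 >/dev/null)
```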

That's useful. Thanks.

Had to register just for this, hilariously inappropriate web servers written in various *sh dialects is my weird pet hobby.

Some additional related projects not yet mentioned:

balls (bash): https://github.com/jneen/balls

czhttpd (zsh; full disclosure, author here): https://github.com/jsks/czhttpd

sh-httpd (ksh): https://www.dartmouth.edu/~rc/classes/ksh/sh-httpd.html

zshttpd (zsh): https://github.com/hkoba/zshttpd

ZWS (zsh): http://www.chodorowski.com/projects/zws/

With zsh you can avoid the dependency of netcat/socat and use the builtin TCP module which is both super cool and terrifying.

On the topic of "why would you write that with that", this is worth a mention: https://github.com/azac/cobol-on-wheelchair

I think "One does not write a web server in Bash" is like "One does not simply walk into Mordor." You're practically daring short people with hairy feet to attempt it.

(Quote: seebs on slashdot)

I wrote a CMS in Standard ML. My only excuse was that I did not know better. The only other language I knew was Prolog. Should have done it in bash instead, would have been easier.

I got tired of crack and meth, so I tried heroin.

Everything being a string does solve potential type issues...

It also creates in-band signalling security hell, even worse than the web already inherently has.

Thanks, I almost killed my laptop spitting coffee on it :)

Has this been run through ShellCheck? Once scripts are run through ShellCheck, I'd almost consider them safe.

No disrespect to ShellCheck, though. It's a fantastic tool.

Shellcheck is indeed great. I needed to make a Bash script to give to my girlfriend that would install a snake game I made on her OS X machine. Using shellcheck made that a lot easier. For example, I didn't know that the [[ ]] syntax wasn't POSIX compliant. Really great for learning and remembering little things like that for portability.
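For the curious, a small before/after of the sort of thing shellcheck flags when the shebang is /bin/sh (the filename is made up):

```shell
#!/bin/sh
f=notes.txt

# [[ $f == *.txt ]]         # bash/ksh/zsh only; plain sh has no [[ ]]
case $f in                   # POSIX pattern matching uses case instead
    *.txt) kind=text ;;
    *)     kind=other ;;
esac

# [[ -n $f && -d /etc ]] style conjunctions become chained [ ] tests:
[ -n "$f" ] && [ -d /etc ] && checked=yes
```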

Nope. TIL about ShellCheck and BATS, and got a Dockerfile feature request. They're all on my to do list.

I can't tell you which one, but I know that a very popular website you will have heard of with a surprisingly high Alexa ranking is almost entirely powered by a bash script via CGI.

I'm constantly surprised it hasn't exploded yet.

This anecdote means nothing without identifying the site.


Perl and MySQL.

Im not the old mod_cgi model when done by a competent programmer for things that should be simple, stacks up surprisingly well against modern complexity because we can "kitchen sink included" frameworks.

> Im not the old

Could you clarify?


Bash is the love of my life. However, it has serious weaknesses. Its array/table concept is flawed. You cannot do tables containing tables. Secondly, out of the box it cannot load or invoke functions in libraries. Fortunately, ctypes.sh solves this problem, and to some extent, indirectly, also the table of tables problem.

Whenever I write websites in bash, I start off fine, but tend to regret it, and end up refactoring into Perl.

Whenever I write anything in bash, I start off fine, but tend to dig my heels in as it gets harder and refuse to switch languages.

I write shell scripts all the time for a myriad of tasks, but I would tear my arm off before writing a server in one.

It is very well suited for imperative scripts, but the language breaks down pretty quickly once complexity and the level of abstraction go up; you start relying on obscure syntax, behaviour, and piling up more and more patterns that require full week-long immersion to grasp. Perl is harmless in comparison.

Bash is a misunderstood language. A lot of developers don't take time to learn it enough and discard it because of that. So I started a twitter handle that posts semi-regular tips and tricks about it:


The best thing about projects like this is how well they illustrate the MVP web framework. A lot of people think things like Django or Rails are magical massive beasts, and to some extent they are, but vast portions of it can get boiled down to something like this.

tl;dr if you’re not sure how to roll your own very simple web request/response framework, this is a great intro.

Bocker is similar for better understanding Linux containers.


I wrote something similar, including a CGI webserver written in shell: https://github.com/tlrobinson/martin.sh

If you used `<<-EOT` you could tab-indent the HTML so it reads better. The tabs will be automatically removed and the source code will be much more readable.[1] Of course, it would require tab-indenting all your shell code (but not the HTML itself), which may be a hard pill to swallow if you're a spacer. But the fact is that tab indenting was once upon a time the norm in the land of Unix programming and the tools reflect this.

Windows converts and Python programmers are the ones who turned the tide. I suspect that when most read older source code from the Unix universe they don't realize it's almost entirely tab indented because 90% of the source code will still look clean no matter the tab stop (which is of course the very point of hard tabs =)

[1] I never understood why Perl's heredoc construct didn't also adopt this behavior. I always cringe when people use heredoc (even in shell) without indenting the entire body properly. I mean, that's kinda the point of the heredoc--so the data becomes part of the code.
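A hedged sketch of the `<<-EOT` trick (the TABs that `<<-` strips are spelled out with printf here, so an editor or copy/paste can't silently turn them into spaces):

```shell
#!/bin/sh
# <<- strips leading TAB characters (tabs only, never spaces) from every
# body line and from the terminator, so the heredoc can follow the code's
# indentation. We generate the demo script into a temp file with explicit \t.
tmp=$(mktemp) || exit 1
printf 'cat <<-EOT\n\t\t<p>Hello, world</p>\n\tEOT\n' > "$tmp"
html=$(sh "$tmp")
rm -f "$tmp"
```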

This is of course just personal preference, and I have to admit that it might be heavily influenced by the shift in responsibilities (i.e. my job involves less and less scripting at the CLI level nowadays) but I was always partial to DCL: https://en.wikipedia.org/wiki/DIGITAL_Command_Language

If I'm doing any scripting, I always use BASH. However, I'm trying to actively push myself to learn another more portable scripting language. I still haven't been able to decide which out of Python, Perl and Ruby that I really want to throw myself into -- although recently I have started to become more and more attracted to Ruby.

If you are thinking about Perl, then take a look at Mojolicious. An amazing framework to work with that covers so many common use cases with async and functional programming style. I find that the code becomes very precise and readable.

Knock bash if you will, but in my experience, it has the best and most helpful community on SO of any language.

Probably because more grown ups know it well.

Ganesh is a fun option as a Sinatra-inspired bash web framework for internal services:


Also: there is an AWS Lambda Bash runtime.


Does anyone know what the theoretical limitations of trying to use Bash as a Web Server are?

How many requests per second could it serve? What about HTTP/2 support?

It would share all the same limitations as any other forking server, plus extra overhead for any spawned child processes. Basically the same as using CGI scripts, and probably not much worse than any Apache-hosted, PHP-based website, tbh. Probably wouldn't be worth implementing HTTP/2, since it would increase the complexity with no real benefit.
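To make the CGI comparison concrete, here is a hedged sketch of a one-shot handler in that style (the function name is hypothetical; a forking front end such as inetd or `nc -l` would spawn one of these per connection, which is exactly where the per-request process overhead comes from):

```shell
# handle_request: CGI-style, one process per request. Reads the HTTP
# request line from stdin and writes a complete response to stdout.
handle_request() {
    local method path proto body
    read -r method path proto            # e.g. "GET /index.html HTTP/1.0"
    body="<p>${method} ${path}</p>"
    printf 'HTTP/1.0 200 OK\r\n'
    printf 'Content-Type: text/html\r\n'
    printf 'Content-Length: %s\r\n\r\n' "${#body}"
    printf '%s' "$body"
}
```

Feeding it a request by hand: `printf 'GET /hi HTTP/1.0\r\n\r\n' | handle_request`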

Performance is kind of a meaningless question in this case, since the "bash" script makes calls to external binaries. You could write a webserver binary in C, then invoke it with bash and call it a shell script.

Maybe you could ask the people who use/maintain this: https://github.com/avleen/bashttpd

This is a good question. I've always kind of wondered what bash's performance characteristics were compared to other programming languages. Bash certainly feels fast.

It’s a neat experiment, totally impractical, but neat. What this has to do with PHP or Perl I have no idea.

It's just personal. I did Perl in 1997/98 and 2017 to now. I did PHP from 1998 to 2017.

Not the smartest idea putting "www.sh" in the title, which redirects to a Chinese site.

China is .cn, .sh is for a British Overseas Territory and the domain is not registered.

It could be the ISP's nameserver snarfing NXDOMAINs.

Indeed - it is reserved by the registry, and not available to be used. Presumably to avoid abuse

Why not OCaml or Rust?

Maybe when I finish the assembler version...

I got tired of bash. Still waiting for `typed bash`.

Cool idea, and very easy to migrate between VMs.

When you're tired of PHPing and Perling yourself in the head.

lol living life in reverse :)

You monster

Thank you! ;)

That title reads like "I got tired of punching myself in the face and kidney, so I tried punching myself in the crotch".

I love it.

I'll leave BoB[1] here as a reference for the OP's comment

[1] https://news.ycombinator.com/item?id=2781019

Came here to say something like that, but my analogy was far more vulgar.

Second commit to the repo:

  Initial commit of a really stupid idea.
  computerx138 committed on Mar 6

That's exactly how I read it. PHP and Perl are really from a time gone by. Neither has a REPL. PHP doesn't even have a proper print command; you have to add your own newline. Granted, they are both quite fast, but with Perl's "write only" syntax I would only use either of them for the most basic of scripts.

Except that PHP does have a REPL, and using PHP_EOL is hardly a big deal. PHP induces an undeserved gag-reflex for most Ruby / Python / JS devs because they read some "why php sucks" post on someone's blog ten years ago, but really the language is quite modern now (not to mention faster).

The fundamentals of why PHP is disliked have not changed. It is still covered in annoying problems, though there have been substantial improvements in some areas. It's less painful, but it's still the infamous PHP.

I thought PHP was disliked because its flexibility can lead to horrible code if it's used without discipline, and for a long time its default settings were completely insecure?

I don't recall hearing any serious criticism claiming that it was hard to get things done with (unless you count security).

There are some design decisions, in the core language, that cause pervasive weirdness that's hard to code around. It's worse than "too permissive" - it's like there are bad defaults that you have to constantly fight against.

Some examples? I can't think of any.

At least PHP doesn't have the whole Python 2 vs. Python 3 fiasco. And a gazillion different ways to package and install dependencies. Don't get me wrong, I love Python but every language has its warts.

True, PHP never made a clean break with its legacy, unlike Python. That’s not positive.

PHP consistently deprecates old features at a steady, manageable pace while introducing new features and performance enhancements.

> PHP consistently deprecates old features at a steady, manageable pace

Manageable is subjective. There's no evidence that it's "just right" or even "timely". A major API cleanup would be appropriate for PHP 8, but there are still too many people clinging to the idea that PHP would be damaged because someone couldn't look up how to solve PHP problems with specific versions (i.e. people who are stuck in the late 90s).

There's so much that could be done in one fell swoop, the steering council of neckbeards (PHP neckbeards are as bad as it sounds) are responsible for the stunted development of the language. To suggest they are the progressive face of PHP is, at best, revisionist.

Steady might mean too slow, but there's no 2/3 schism.

There's plenty of features in the docs that say at the top "deprecated in x, removed in y, see also alternative z", so I'm unsure of what problem you're referring to.

What would a major API cleanup actually give except for major breakage?

How about PHP 6? Not a fiasco?

They canceled a major break in BC that they weren't sure would have been a good idea and then came out with one of the best releases in PHP history, so kind of the opposite of a fiasco.

I honestly tried to come back to PHP after most of a decade in ruby. I couldn’t find a decent ORM! And after trying to write one, I think I see why.

We ended up rewriting the entire product in Python.

There are tons of ORMs for PHP. Of course, I find ORMs, in the end, just get in the way of getting the most from a DB, and SQL isn't that hard for most things.

I determined there’s no point in an ORM... just write logical code using an SQL wrapper like PDO.

Why not Ruby if you'd just spent a decade with it?

Sometimes preference doesn't match project choice?

I’ve never read a blog post about why PHP sucks. I wrote PHP for many years and it wasn’t that bad but compared to the alternatives I would never go back. Also tormented by all of the Wordpress plugins and template hacks that I got paid to write over the years. Yuck, no thanks I don’t want to do that anymore.

So, you hated the _work you did with PHP_, not really the language.

I never said I hated the language but I do think almost every other alternative language I’ve used since doing full time PHP has been a better experience.

...or they're like me. I spent over a decade making a living on PHP and now it gives me major P(HP)TSD. I'd have to be REALLY desperate to consider a job that involved PHP.

Conjecture: Anyone that spends over ten years making a living writing the same programming language will have major P(LANGUAGE)-TSD. I feel like the deeper concern is more a psychological fear of moving your career in a backwards direction. PHP just happened to be a part of that.

I'm not sure... I've spent six years with C# and I still love the heck out of the language. I keep learning new things about it, too! Most recently, stackalloc and unsafe blocks. I've burned out on other, less comfortable languages though.

I think it's more from being forced to use some framework you didn't like for too long.

I've done PHP for more than 10 years and got no problem using it if I have to.

OTOH, I don't want to touch stuff like rails because I don't like it being in the way but love Ruby.

I'd prefer TypeScript these days though.

It seems you don't know what REPL means. The P is print, and `php -a` doesn't do that.

That's true about a few languages actually. Javascript being one, C++ being another.

(Last comment didn't post??) To be fair js gets the same bad rap.

Many users = more vocal minorities and elitists who can't help but bash mass-adopted products

that's because it still has global variables.

Others have pointed to PHP REPLs. Perl 5 also has a REPL: https://metacpan.org/pod/Devel::REPL. Perl 6 comes with a REPL built-in. Unlike Python/Ruby, Perl does not have a GIL, and with some discipline it can be used in large programs.

Ruby's built-in REPL is irb.

Also, `php-cli` has a very basic REPL which can be accessed using `php -a`.

Remember to terminate every line with a semicolon, and enjoy quirks like true printing boolean(true) and false printing nothing at all.

This position only really holds up if you don't know either language well.

For example, I guess you didn't know there are PHP and Perl REPLs? And robust OO code is not only possible in Perl, but there are tens of thousands of very reusable Perl modules that have stood up over time much better than Python or Node modules have.

If you are writing more than 10 lines of bash, write python instead.

Awful language. Waste of time.

You're using it wrong. The things that bash is good for are not the same things that Python should be used for. For what bash is good for, it is the simplest and fastest language one could use.

What is bash good for?

For a task like taking backups on servers, which involves invoking a bunch of commands, I'd rather not use a scripting language that asks you to open a process, bind variables to its stdin/stdout, and do all the boilerplate handling.

How do you do this in a scripting language without complication?

  $ ssh remote.server 'mysqldump db | gzip -c' | gunzip -c | mysql db

It's the power of the individual commands, though. I admit bash sucks as a language, but it got invented forever ago, so you can't blame it. Still, we need something better than bash. Fish is close, but not good enough.

Plugging programs together; pipelines. It exposes the UNIX principle. You can use it to augment, to extend, to improve programs written in any language you can imagine, past, present and future. If the program is written in C, C++, Rust you'll get the performance of those programs. What I'm saying you might find very obvious when using it interactively but there have been so many instances where it fulfils my needs. More often than not in certain problem domains it's the first thing I reach for and it usually solves the issue. I'd also like to add that I exclusively write POSIX shell scripts rather than Bash.
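A small illustration of that pipeline-first, POSIX-only style (the helper name is invented): counting word frequencies with no variables or arrays, just stdin/stdout flowing between standard tools.

```shell
# top_words FILE: print the five most frequent words in FILE.
# All state flows through the pipeline; each stage is a standard tool.
top_words() {
    tr -cs '[:alpha:]' '\n' < "$1" |   # split into one word per line
        tr '[:upper:]' '[:lower:]' |   # fold case
        sort | uniq -c |               # count duplicates
        sort -rn | head -n 5           # keep the five most frequent
}
```

Swapping any stage (say, `head -n 5` for `awk '$1 > 10'`) changes the behavior without touching the rest, which is most of the appeal.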

Look through the code, though... I thought this would do more than serve a static HTML file. Not impressed.

It doesn't. The route defines the controller. The controller displays the content, ideally using the view function (there are examples with and without a view, but always with a controller). The view function takes an associative array for template replacements.
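In rough outline, that flow could look something like this (a hypothetical sketch, not the repo's actual API; it assumes bash 4.3+ for associative arrays and the `-n` nameref):

```shell
# view TEMPLATE ARRAYNAME: replace each {{key}} placeholder in TEMPLATE
# with the matching value from the named associative array.
view() {
    local out=$1 key
    local -n vars=$2                       # nameref (bash 4.3+)
    for key in "${!vars[@]}"; do
        out=${out//"{{$key}}"/"${vars[$key]}"}
    done
    printf '%s\n' "$out"
}

# A "controller" builds the context and hands it to the view.
hello_controller() {
    local -A ctx=([title]="Hello" [body]="served from bash")
    view '<h1>{{title}}</h1><p>{{body}}</p>' ctx
}
```

Calling the hypothetical `hello_controller` prints `<h1>Hello</h1><p>served from bash</p>`; the route table would just map a request path to a controller function name.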

Guidelines | FAQ | Support | API | Security | Lists | Bookmarklet | Legal | Apply to YC | Contact