This dogmatic approach means you lose out on ergonomics by sticking with poorly designed tools like bash and perl, so you incur those costs all the time for a small potential benefit far off in the future (after all, that effect is just a broad hypothesis).
Very helpfully, python has stuck around for just as long and is almost always a better choice than either of those two tools for anything complicated. It's not perfect, but I'm much more likely to open a random python script I wrote 6 years ago and at least recognize what the basic syntax is supposed to be doing. Bash beyond a certain complexity threshold is... hard to parse.
Python's standard library is just fine for most tasks, I think. It's got loads of battle-tested parsers for common formats. I use it for asset conversion pipelines in my game engines, and it has so far remained portable between windows, linux and mac systems with no maintenance on my part. The only third-party package I depend on is Pillow, which is also decently well maintained.
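To make that concrete, here's roughly the shape of such a step. This is a sketch only, with made-up paths and manifest fields rather than my actual pipeline, assuming a JSON manifest that lists texture files:

    # Sketch of a stdlib-plus-Pillow asset step (hypothetical paths and
    # manifest layout): read a JSON manifest with the standard library,
    # convert the referenced images to PNG with Pillow.
    import json
    from pathlib import Path

    from PIL import Image  # Pillow, the one third-party dependency

    ASSETS = Path("assets")        # hypothetical source directory
    OUT = Path("build/textures")   # hypothetical output directory

    def convert_assets(manifest_path=ASSETS / "manifest.json"):
        OUT.mkdir(parents=True, exist_ok=True)
        manifest = json.loads(manifest_path.read_text())
        for entry in manifest["textures"]:       # e.g. [{"file": "hero.tga"}, ...]
            src = ASSETS / entry["file"]
            dst = OUT / (src.stem + ".png")
            with Image.open(src) as img:
                img.convert("RGBA").save(dst, "PNG")

    if __name__ == "__main__":
        convert_assets()

Because everything except Pillow is standard library (json, pathlib), the same script runs unchanged on windows, linux and mac.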
It becomes significantly less ideal the more pip packages you add to your requirements.txt, but I think that applies to almost anything really. Dependencies suffer their own software rot and thus vastly increase the "attack surface" for this sort of thing.
Python is a very bad example because of the incompatibility between Python 2 and Python 3. All my pre-2012 Python code is now legacy because of this, and since most of it is not worth updating I will only be able to run it as long as there are Python 2 interpreters around.
I like Python as a language, but I would not use it for something that I want to be around 20+ years from now, unless I am ok doing the necessary maintenance work.
There's a script to update from Python 2 to Python 3, Python is now the most used language in the world, and they learned their lessons from the 2-to-3 migration. A Python 3 script is literally the most likely candidate to still be working/maintainable by someone else in 20 years.
It never worked for any of my nontrivial Python files; even for simple projects it often failed. It was a good start, but it was not a fully automatic magic migration script. Otherwise the ecosystem migration wouldn't have taken as long as it did.
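To illustrate the kind of thing 2to3 couldn't do (a made-up example, not one of my actual files): it rewrites syntax such as print statements and renamed modules, but not semantics like the bytes/str split, which is exactly what bites nontrivial code.

    # Made-up illustration: this Python 2 idiom passes through 2to3 unchanged
    # and then breaks on Python 3, because socket.recv() returns bytes there,
    # not str.
    import socket

    def read_greeting(sock):
        data = sock.recv(1024)
        if data.startswith("HELLO"):   # fine on Python 2; TypeError on Python 3
            return data.strip()
        return ""

    # The Python 3 fix is a manual, semantic change that 2to3 won't make:
    #     if data.startswith(b"HELLO"): ...
    # or decode first:
    #     text = data.decode("utf-8")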
It depends largely on what you're doing with it. True, I would never want to have to talk a customer through setting up and running a python system. I know there are ways to package them (like 37 different ways), but even that is confusing.
However, a decade ago, a coworker and I were tasked with creating some scripts to process data in the background, on a server that customers had access to. We were free to pick any tech we wanted, so long as it added zero attack surface and zero maintenance burden (aside from routine server OS updates). That ruled out the tech we work with all day every day, which needs constant maintenance. We picked python because it was already on the server (even though my coworker hates it).
A decade later, those python scripts (some of which we had all but forgotten about) are still chugging along just fine, now in a completely different environment: a different server on a completely different hosting setup. To my knowledge we had to make one update about 8 years ago to add handling for a new field, and that was that.
Everything else we work with had to be substantially modified just to move to the new hosting. Never mind the routine maintenance every single sprint just to keep all the dependencies and junk up to date and deal with all the security updates. But those python scripts? Still plugging away exactly as they did in 2015. Just doing their job.
It's not just 2 to 3, either. Both 3.12 and 3.13 introduced breaking changes; that's once per year that you at minimum need to audit all your Python code to ensure it doesn't break.
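For example (going from my reading of the release notes): 3.12 dropped distutils, and 3.13 removed a batch of long-deprecated standard library modules such as telnetlib and cgi. In practice the "audit" often starts with grepping for imports of things that no longer exist, something like this throwaway sketch (my own illustration, not anyone's official tooling):

    # Scan a codebase for imports of modules that recent Python releases
    # removed (partial, hand-maintained list). A throwaway audit helper.
    import ast
    import sys
    from pathlib import Path

    REMOVED = {"distutils", "telnetlib", "cgi", "imghdr", "pipes"}  # incomplete

    def removed_imports(path):
        tree = ast.parse(path.read_text(), filename=str(path))
        hits = set()
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                hits |= {alias.name.split(".")[0] for alias in node.names}
            elif isinstance(node, ast.ImportFrom) and node.module:
                hits.add(node.module.split(".")[0])
        return hits & REMOVED

    if __name__ == "__main__":
        root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
        for py in root.rglob("*.py"):
            found = removed_imports(py)
            if found:
                print(f"{py}: imports removed module(s): {', '.join(sorted(found))}")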
Python even has venv and other tooling for this sort of thing. Though, admittedly, I seem to have dodged most of this by not seriously writing much python until after Python 3 had already happened. With any luck the maintainers of the language have factored that negative backlash into future language plans, but we'll see.
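For anyone who hasn't used it, the stdlib venv module is enough to pin a script's environment without any of the fancier tools. A minimal sketch, where the directory and file names are just examples:

    # Create a per-project virtual environment and install pinned dependencies
    # into it, so the script stops depending on whatever the system Python
    # happens to have installed. Names here are examples only.
    import subprocess
    import venv
    from pathlib import Path

    ENV_DIR = Path(".venv")

    def ensure_env(requirements="requirements.txt"):
        if not ENV_DIR.exists():
            venv.create(ENV_DIR, with_pip=True)   # stdlib venv module
        pip = ENV_DIR / "bin" / "pip"             # (.venv\Scripts\pip.exe on Windows)
        subprocess.run([str(pip), "install", "-r", requirements], check=True)

    if __name__ == "__main__":
        ensure_env()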
Mostly I recoiled in horror at bash specifically, which, in addition to the bash version itself, also ends up invisibly depending on a whole bunch of external environment stuff that is constantly updating as well. That's sort of bash's job, so it's still arguably the right tool to write that sort of interface, but it ends up incredibly fragile as a result. Porting a complex bash script to a different distro is a giant pain.
I dislike Python for that reason. I don't love the offside-rule syntax, but that's nothing compared to how often I have an issue with software written in Python due to some old/deprecated/broken packaging setup...
I've lately been pretty deep into 3d printing, and basically all the software involves Python... and breaks quite easily. Whether it's a new version of pip with some new packaging rule, or forced venvs... I really don't like dealing with Python software.
I've used virtualenv for the past 15 years and I don't recall it changing significantly. I don't get why people use new fancy tools like pipenv/pyenv/poetry/uv and then complain that there are too many tools to learn. There is nothing wrong with just using virtualenv. It has its warts, but it has always worked for me and it's stable.
I think if you had chased every single latest hotness then you would have hit lots of breakages, but depending on what you are doing, where you are running, and what dependencies you are using, I think you could easily have something from 10-15 years ago work today. Part of the trick would have been being aware enough to pick the boring long-term options (though at some level that applies to every language and ecosystem); the other part is understanding what the tools are actually doing and how they are maintained.
That's why your build pipeline alerts you when tests no longer work, and then you have a release of the previous build still available for download at any time. This is how containers are released!
Sure. It still is burdensome, though. Now there are lots of nightly builds from old projects that break at random times and require developer attention.
There's a lot of software that ends up lasting for decades, through multiple OS platform refreshes. Normally there's a small platform/OS team that gets to slog through gardening that mess while everyone else is long gone.
But now I have frozen an old language runtime and a bunch of old libraries into my environment, all of which are not just security hazards but interoperability landmines (a common one being lack of support for a new TLS standard).
I don't see how that solves either problem. If the thing in the container makes a web request out, that code might both become obsolete and offer an attack surface for getting back in, and wrapping the outside of it doesn't change anything.
As far as TLS is concerned it does: if you are running a server, run it through a TLS terminating reverse proxy. If you are running a client, run it through a TLS terminating forward proxy. As long as your application logic isn't exposed to a security issue, you're fine.
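Concretely, the legacy client side of that pattern can stay completely dumb. A minimal sketch, assuming a local TLS-terminating forward proxy (stunnel, nginx, whatever) listening on 127.0.0.1:8443 and configured to speak modern TLS to the real upstream:

    # The frozen legacy code only ever makes a plaintext request to a local
    # proxy, so its outdated TLS stack never touches the network; the proxy
    # (assumed to be configured separately) handles modern TLS upstream.
    import urllib.request

    LOCAL_PROXY = "http://127.0.0.1:8443"   # hypothetical local proxy endpoint

    def fetch_report(path="/v1/report"):    # hypothetical API path
        with urllib.request.urlopen(LOCAL_PROXY + path, timeout=30) as resp:
            return resp.read()

When the TLS standard moves on, you update the proxy, not the frozen container.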
It has to be weighed against all the time spent learning, evaluating, and struggling with new tools. Personally, I've probably wasted a lot of time learning /new/ that I should have spent learning /well/.
If "This and Lindy Effect" do not "factors a lot", but instead the major factor is you believe perl is better designed, then no, dogmatism of vague future risk is replaced with pragmatism of the immediate usefulness
On the point being discussed, which is not breaking backward compatibility, perl is indeed arguably better than more popular tools, and I believe it has other advantages too.
It's not "far away in the future". Every other IT job right now is supporting, maintaining and fixing legacy software. These are the software choices of the past, and you pay for them in manpower.
> There isn't much to evaluate, and nothing to compare against
These are exactly the skill issues I meant! Git gud at evaluating and you'll be able to come up with many more sophisticated evaluation criteria than the primitive "it's installed everywhere".
There are other parameters I would consider, like maintainability, ergonomics, mind share, ease of deployment, etc., but the ubiquitous-availability point trumps most of the others. Installing a new toolchain is usually a hassle when the same task can be done with existing tools. Also, when I present it in a company setting, installing new software and broadening the security attack surface is the first pushback I get.
Do you advocate the use of Notepad on Windows to edit text because it already exists? What about the increase in the security attack surface from using languages that make it easy to make mistakes in something basic like quoting/escaping? Does that make it into the top 10 of pushbacks?
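To make the quoting/escaping point concrete (a generic illustration, not aimed at any particular script in this thread): the classic failure mode is interpolating input into a shell string, which Python sidesteps by passing an argument list.

    import subprocess

    filename = "monthly report.txt; echo oops"   # awkward or hostile input

    # Fragile: the whole string is re-parsed by the shell, so spaces,
    # semicolons and quotes inside `filename` become shell syntax.
    subprocess.run(f"wc -l {filename}", shell=True)

    # Safer: an argument list goes to the program directly, with no shell
    # re-parsing and therefore no quoting/escaping to get wrong.
    # (shlex.quote() is the escape hatch if you really need shell=True.)
    subprocess.run(["wc", "-l", filename], check=False)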
I'd advocate for 'nano' on Linux because it's widely installed and easy for newcomers. A seasoned professional will know they can substitute vim or what have you; I don't need to explain that to them. So yes... if I were trying to explain to a noob how to open a text file on Windows and I didn't know what they had installed, I'd absolutely tell them to use Notepad.
Would I advocate writing my core business software in bash or perl? No, I'd hire and train for what was chosen. For small scripts I might need to share with coworkers? 100%
My comment didn't convey it, but I'm with you on using the right tool for the right job. It's just that I don't always have the luxury to do so. And yes, like the other commenter, I'd use it for throwaway scripts and glue code instead of installing a new toolchain. Longevity and importance should warrant doing it properly. Cheers!