Ask HN: Why did Python win?
578 points by MatthiasPortzel on Aug 29, 2023 | 828 comments
I started programming in ~2013 in JavaScript. I’ve since learned and tried a handful of languages, including Python, but JavaScript was always my favorite. Just within the last year I learned Ruby, and I was blown away by how fun and easy to use it is. At the present time, I’m starting all my new projects in Ruby.

My impression is that in the ‘00s, Python and Ruby were both relatively new, dynamically typed, “English-like” languages. And for a while these languages had similar popularity.

Now Ruby is still very much alive; there are plenty of Rails jobs available and exciting things happening with Ruby itself. But Python has become a titan in the last ten years. It has continued to grow exponentially and Ruby has not.

I can guess as to why (Python’s math libraries, numpy and pandas, make it appealing to academics; Python is simpler and possibly easier to learn; Rails was so popular that it was synonymous with Ruby) but I wasn’t paying attention at that time. So I’m interested in hearing from some of the older programmers about why Ruby has stalled out and Python has become possibly the most popular programming language (when, in my opinion, Ruby is the better language).




Python ended up 'specializing' in data contexts, thanks to Numpy / Pandas, and as a result became the first exposure to programming that anyone doing data stuff had. That was millions of people. In that space, it had no competitors.

Ruby ended up 'specializing' in web dev, because of Rails. But when Node and React came out, Ruby on Rails had to compete with Node.js + React / MERN as a way of building a web app. Since people first learning programming to build a web app would usually start with JavaScript anyway (since a lot of very first projects might not even need a backend), it was a lot easier for the Node.js/React route to become the default path. Whereas if you were a data scientist, you started on Python, and as you got better, you basically just kept using Python.


I think Python was popular as a general-purpose language first. After all, there was a reason people put so much effort into writing Numpy in the first place.

I think a lot of people were attracted to the language design, as captured in the Zen of Python (https://peps.python.org/pep-0020/), such as:

Explicit is better than implicit.

Readability counts.

Errors should never pass silently (unless explicitly silenced).

There should be one-- and preferably only one --obvious way to do it.

In many cases, Ruby has almost the opposite philosophy. There's nothing wrong with that - but I think a lot of people prefer Python's choices.


> There should be one-- and preferably only one --obvious way to do it.

This is so hilariously wrong in python though


So I imagine you have that perspective because you started less than 20 years ago. In some ways the idea of the Pythonic Way to do things evolved in opposition to Perl's vigorous advocacy of More Than One Way.

Python has been really winning for some time, so it's natural that its ideological discipline has grown ragged. The crop of kids who value options above consistency don't have the scars of the Perl age to inform their prejudices.

But Python is -dramatically- better focused, as a community, on finding a Pythonic way to proceed, and then advocating it, than previous cultures.


Back when I decided it was time to add a scripting language, Perl and Python seemed like the obvious choices, and in my mind were equally good options. I asked my best friend which I should choose, and he more or less said, "You can't go wrong with either one, but when you ask for help Perl people are assholes and Python people are nice."

I can't confirm his thoughts on Perl and I haven't interacted much with Ruby, but the Python community is definitely welcoming and patient in my experience. I wouldn't be surprised if this was a significant factor in Python's prevalence over Perl, Ruby, or anything else.


Yep, the Perl community kind of had issues around the turn of the millennium, and the perl6 debacle did a lot to convince people that Perl was kind of a dead end.

I don't think there was any toxicity in the Ruby community, but it was made up of working programmers, whereas the big leading voices in the Python community were teaching assistants and students, so it might have been more tailored to newbies.

I don't recall there being much real industrial use of Python prior to Ruby emerging, even if Python is technically older, so I think the real answer lies in why the educational sector decided that teaching Python was easier. Significant whitespace plays a huge part here.


> I don't recall there being much real industrial use of Python prior to Ruby emerging, even if Python is technically older

Yeah, that's my recollection too. Around 2011 there weren't a lot of jobs in Python yet. Perhaps in SV, but not out in the real world. Several startups were doing it, including YouTube and Google at the time.

But in the F500 world, python wasn't used at all. I started using it in 2008/9-ish.


> [...] the perl6 debacle did a lot to convince people that Perl was kind of a dead end.

Not GP, but the Python 2 vs 3 holy wars were also something that kept me from adopting Python as a scripting language for a couple of years.


Yeah, the Python 2->3 transition was painful. But I would argue that was self-inflicted. Guido and company chose not to develop a 2.8/2.9/etc series where people could move their code base over incrementally.

I mean, I love python, but that sucked!

Yes, it would have been more work for the devs, but the amount of work it meant for the users was worse.

In fact, they've pretty much thrown away anything before Python 3.6 now anyway. Many things introduced in the 3.x series before 3.6 just don't work anymore (the old asyncio syntax being the notable one).


> Guido and company chose not to develop a 2.8/2.9/etc series where people could move their code base over incrementally.

That is literally what 2.7 was, and later P3 releases (up to 3.4) also reimplemented some P2 features.

The core team definitely had the wrong transition model at the start, and it took some time for the community to convince them and then decide on which items were the most important, but let’s not act like they did not get it in the end.

> In fact, they've pretty much thrown away anything before Python 3.6 now anyway. Many things introduced in the 3.x series before 3.6 just don't work anymore (the old asyncio syntax being the notable one).

What?


What would have been a better transition model? Are there any languages with major breaking changes that have done the upgrade smoothly?


> What would have been a better transition model?

Better supporting cross-version transition codebases.

The core team initially saw the transition as “run 2to3, fix what’s left, publish updates, P2 is gone”, but aside from 2to3 being quite limited, such a transition is quite hard for dependencies, as it means they leave all older dependents behind entirely (dependents which might be the primary users for e.g. company-sponsored projects), or they have to keep two different codebases in sync (which is hard), plus the limitations of pypi in terms of version segregation.

What ended up happening instead was that libraries would update to 2.7, then to cross-version codebases; that way their downstream could migrate at their leisure, and a few years down the line people started dropping P2.
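
A typical cross-version module from that era leaned on __future__ imports plus a version check, something like this (a minimal sketch; the to_text helper is a made-up example of the pattern):

    # Runs unchanged on Python 2.7 and 3.x.
    from __future__ import absolute_import, division, print_function, unicode_literals

    import sys

    PY2 = sys.version_info[0] == 2

    if PY2:
        text_type = unicode  # noqa: F821 -- this name only exists on Python 2
    else:
        text_type = str

    def to_text(value, encoding="utf-8"):
        # Coerce bytes to text the same way on both majors.
        if isinstance(value, bytes):
            return value.decode(encoding)
        return text_type(value)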

> Are there any languages with major breaking changes that have done the upgrade smoothly?

Some, but usually statically typed languages. E.g. elm’s upgrade tool worked pretty well as long as you didn’t have native modules and all your dependencies had been ported. I think the Swift migrator ended up working pretty well after a while (Swift broke compatibility a lot “initially”), though I’ve less experience with that.

An alternative, again for statically typed languages more than dynamically typed ones, is to allow majorly different versions of the language to cohabit e.g. editions in Rust (Rust shares the stdlib between all editions but technically you could version the stdlib too).

Not workable for Python, not just because it doesn’t really have the tooling (it has some with the __future__ imports, but nowhere near enough) but also because it changed runtime data model components, specifically the entire string data model, which is not a small matter (and was by far the most difficult part of the transition, and why they piled on smaller breakages while at it, really).


The only Ruby person I've met was insistent that Ruby was the one true way and he tried to force it into everything. That attitude turned me off.

Of course I already knew Python, and so did the rest of my team, so we had been doing tools in Python (the guy wasn't on my team). But until he pushed Ruby into places where Python would have been better (importing a Python library rather than shelling out to a program), I was willing to accept it was probably fine.


> The only Ruby person I've met was insistent that Ruby was the one true way and he tried to force it into everything. That attitude turned me off.

I mean, if you read almost any Elixir article that has hit the front page of HN, there are always comments from Pythonistas saying, "Why bother when there's Python?" Similar attitude. Obviously it's not everyone, but it's not everyone in the Ruby community either.


It’s such a bizarre reason. “One person using something rubbed me the wrong way so I decided not to use it”. Did the person extrapolate to an entire community from a sample size of 1?


    The only Ruby person I've met was insistent that Ruby was the one true way
That sucks. I've been doing Ruby full-time since 2014 at 4 companies and I've never seen that sentiment, even from people who really love it. My experiences have been really positive.


I agree. I’ve found ruby and its developers to be pretty friendly and open to other languages and styles.


My experience with Python was simply people wanting to get shit done. This was circa 2008. They weren't really engaging in language wars, but doing innovative things like extending Java with Jython.

I was arguing for the F500 company I was working at to explore using Jython to write unit tests for Java code.

Why not have a scripting language to write unit tests for Java code?

I see this with Rust trying to extend python in interesting ways. I don't see this with Java trying to extend C/C++ or Python.


Python and Ruby have some things where the intuitions are exactly inverted from one another. It took me a long time to figure out why Python rubbed me the wrong way, and that if I dig up how I used to structure code in Pascal, it’s fine.

Not that I care much these days since I prefer writing in Elixir.


> I haven't interacted much with Ruby

“Matz is nice and so we are nice” https://en.wiktionary.org/wiki/MINASWAN :)

The Rails community is another story, unfortunately.


That’s funny because that’s one of the reasons I tend to point beginners to R instead of Python for data work.


I can't imagine Python's welcoming community has anything to do with it. If anything it was Ruby that had a reputation for being the most welcoming community with its MINASWAN (cringe) philosophy.


> I can't imagine Python's welcoming community has anything to do with it. If anything it was Ruby that had a reputation for being the most welcoming community with its MINASWAN (cringe) philosophy.

TBH, community had nothing to do with Python's enormous success over its competitors (Perl and Ruby, possibly Tcl too).

Nor did any technical merit, nor ergonomics.

There's one, and only one, reason why Python exploded at the expense of the other competitors: The ease and acceptance of using the language as glue for C functions.

Python's popularity is built on a solid foundation of compatibility with C.

If, in the 90s, C++ had taken off enough to displace C, I doubt Python would be as popular as it is. Python owes its ubiquity to C, because if C was not ubiquitous, Python wouldn't be either.

(It's only recently, like the last 10 years or so, that I started seeing popular Python programs which didn't have a dependency on C libraries. And even now, it's still rare.)
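
To make "glue" concrete: the lowest-friction modern form of the idea is ctypes, and the whole pitch fits in a few lines (a minimal sketch; how find_library locates libc varies by platform):

    import ctypes
    import ctypes.util

    # Load the C standard library and call straight into one of its functions.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    libc.printf(b"Hello from C, called by Python\n")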


I don't think your analysis is accurate.

My experience of trying to get my own C functions usable from Python has been nightmarish. Yes, you can do it... if you have exactly the same compiler & version used to produce the Python interpreter itself.

C's only usefulness to Python is: it allows optimization of the 80/20 or 90/10 rule, so performance doesn't have to totally suck with Python.

Python 'won' IMHO because it hit a sweet spot: simple enough for beginners (in fact, beginner-friendly), with a good basic set of datatypes (lists, tuples, sets, plus the usual ints, floats, and complex) that allowed complex ideas to be compactly expressed. The ability to switch between functional and imperative styles also helped.

Python is a 'good enough' lisp. MIT switched, and Norvig has said as much.

No, the astonishing thing is that Python survived the 2->3 transition, and came out stronger on the other end. Language cleanups, new 'syntactic sugar' (e.g. the @ matrix-multiplication operator), and what you see is Python actively trying to steal all the successful programming paradigms under one unified syntax.

Is python perfect? Hardly. But it's beginner-friendly and expert-optimized. AND, unlike C++ (at least for me), you can get ALL of Python into your head at the same time. (Libraries, ok, but true in any language). In this specific sense, it is exactly like C (you can keep it all in your head, even the edge cases).

There are newer languages gunning for a piece of Python's mindshare (Zig, Nim). But because Python is a moving target, getting better and better, the others will need to provide a spectacular use-case advantage --- and I just don't see that happening.


“Matz (Ruby author) is nice and so we are nice” isn’t so bad. It’s twee-sounding, but saying you are going to follow the example set by the founder is absolutely fine. Is it any worse than the Python ‘benevolent dictator for life’ example?


I'm going to answer your question directly: No, it's not worse.

My interactions with Guido haven't been awesome. But the people put up with him regardless. The other people in python have been awesome.


I'm genuinely curious: what's "cringe" about MINASWAN?

(I write mostly Python these days, but have been involved in both communities for a long time, and MINASWAN never particularly stood out to me other than as a cute reminder to be nice.)


There is no problem with someone being nice. It's only a problem when they want you to be nice exactly like them.


It was meant tongue in cheek as I was defending Ruby being the most welcoming community. It's just a bit twee like the voice-over on London Underground - "See it, say it, sorted".


Oh I have met Ruby people and it's a big factor in why I never learnt the language.


> But Python is -dramatically- better focused, as a community, on finding a Pythonic way to proceed, and then advocating it, than previous cultures.

I would revise that: the Pythonic culture of one acceptable way to do it is better matched with a lot of good development practices.

Perl was also very good at finding a Perl way to proceed. It's just that with Perl that often meant a lot of implicitness and "do what I mean", and multiple ways to do it so you could fit it within your preferred style and tailor it to the current project's needs.

That all sounds good until you are confronted with the flip side of the coin, which is that something written from those perspectives is harder to understand when you look at it for the first time or after a long hiatus from the project. That puts a lot of importance on policy and coding standards, which saps effort from just getting something done.

I love Perl, and it's still the language I write most often for work, and it's no great mystery why Python became preferred.


But Ruby took all the best bits of Perl so I'm still perplexed as to why Python "won".


Ruby did have a lot going for it about 15 years ago. Many Java/JSP people jumped ship and got on the ruby train. Ruby was a breath of fresh air comparatively.

Python had a great community though, and the Python Software Foundation went out of its way to make people feel accepted in the community. And frankly, programming is often more of a social activity than most people realize -- particularly for new programmers.

So new programmers tended to lean towards python, because the resources to lean on others were there. And people like Raymond Hettinger, Naomi Ceder, Ned Batchelder, and Yarko Tymurciak were approachable even if the BDFL wasn't.


What did the PSF do to make people feel accepted?


Taking a look at the bigger picture, it does indeed seem like the Perl philosophy lost to the Python philosophy overall.

Looking at the hip n cool languages (not just Python for scripting, but surely Go, and to some extent Rust, for native stuff; Dart for scripting also, but it didn't outright "win"), these are mostly languages that deliberately simplified things. Yes, even Rust: it needs to be compared to C++, and its biggest feature is basically that it doesn't let you do all the things C++ does, within a very similar model.

The only language I've heard of that goes against these trends (though I'm largely clueless about what it's actually like) is Julia, which sort of has its own "lay" niche like Perl did back in the day, and is mostly winning on the premise that it's a performant scientific language.

The industry obsesses over costs of adoption, stability, maintenance; in short: how to get the most out of the least actual new development. It does make quite a lot of sense, to be honest, although sometimes it's really demotivating at an individual level.

And frankly, "learn the language, duh" usually comes up when the language is complex or unintuitive for no practical purpose. Of course there will be people who always complain if they have to learn anything, but I don't think they make up the majority, or even a significant minority, of the software engineering world. "Learning the language" is only as much of a virtue as the language itself serves its purpose, which includes ease of use and the readability of someone else's code.


Because the best bits of Perl were kinda trash that Python was smart to avoid.


Yeah, for sure. The Perl influence is why I dislike Ruby; it makes Ruby really hard to read if you haven't been using it constantly.


Python's clean, obvious syntax is what drew me to it. They actively decide not to do things because they would detract from the cleanliness of the syntax. Very often, less is more. My biggest fear is that Python might be forgetting this, which I see with things like the := operator, etc.
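
For the unfamiliar, := folds an assignment into an expression, so there are now two ways to write the same loop (a sketch; read_chunk and handle are hypothetical helpers):

    # Before 3.8, the one obvious way:
    chunk = read_chunk()
    while chunk:
        handle(chunk)
        chunk = read_chunk()

    # 3.8+, the walrus way -- shorter, but a second way to say the same thing:
    while chunk := read_chunk():
        handle(chunk)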


God forbid we should have to familiarise ourselves with a language before using it.


God forbid things be intuitive.


Funny, because 99% of current regex libraries use Perl's regex extensions.
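
Python's own re module is a good example: non-greedy quantifiers and lookaheads both came to it by way of Perl (a small illustration):

    import re

    # The non-greedy +? and the (?=...) lookahead are Perl extensions.
    m = re.search(r"(\w+?)@(?=example\.com)", "alice@example.com")
    print(m.group(1))  # alice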


Not just practices, tooling. The whole typing system. Linting to check for mistakes. For the love of black.


Python's static typing still feels very clunky and bolted on, and the tooling around it was rather bad, like mypy. In general I definitely wouldn't say Python tooling was good. It's improving rapidly with ruff though, just as JavaScript tooling did when esbuild appeared.


Dumb question: I know the built-in typing (import typing) is limited in some ways, but it works pretty well for most basic needs. So what does mypy add? I'm super interested in adding typing to some code we have, but I'm just a bit confused by the choices available.

Would mypy with pydantic be a good combination or do they overlap?


The two are complementary: the built-in `typing` module only provides type annotations, not type checking. Mypy provides the latter, via the type annotations that you (and others) add to Python codebases.
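
Concretely, the division of labor looks like this (a minimal sketch; the mypy error message is paraphrased):

    # example.py
    def greet(name: str) -> str:
        return "Hello, " + name

    greet(42)  # the interpreter happily runs this line (then crashes inside greet)

    # The annotation does nothing at runtime; mypy checks it statically:
    #
    #   $ mypy example.py
    #   example.py:4: error: Argument 1 to "greet" has incompatible type "int"; expected "str"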

Pydantic overlaps with mypy in the sense that it provides a mypy plugin, but otherwise it's just like any other Python library.


So mypy runs "at run time"? I guess that makes sense, I thought the annotations provided some form of checking too, but now I realize that I should really spend some time to inform myself better :').


Sort of -- mypy is its own standalone program, and you run it to typecheck your program's type annotations. It does some evaluation of the Python program (specifically imports and some very basic conditional evaluation, among other things), but it never really executes the codebase itself.


Mypy runs as a type linter/checker. See https://mypy-lang.org/


No, it's more that pydantic runs "at run time" while mypy does not.
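
i.e. something like this (a minimal sketch):

    from pydantic import BaseModel, ValidationError

    class User(BaseModel):
        id: int
        name: str

    try:
        User(id="not a number", name="alice")  # validated while the program runs
    except ValidationError as exc:
        print(exc)  # pydantic rejects the bad field at runtime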


Isn't Python's tooling usually considered some of the worst?


By what standard? Yours? Java developers'? Rust developers'? Given that Python has typing now, tools to check it, and plenty of tooling around “make it faster”, I think your world view might be stuck in 2010. Python has gone from a slow, obscure scripting language to a powerhouse. Conda, NumPy, Scikit, PyTorch, GPU programming, games, analytics, web apps, APIs: I think it’s safe to say this ain’t your grandpa’s Python anymore.


I have seen Python in action since 2010 plenty of times, including the present. It's been a mess every time. My worldview is driven by F# and Elixir, namely the `dotnet` and `mix` tooling, respectively. So no, Python's tooling does not impress me in the slightest. The first thing you need to do is get agreement on all the different checkers, linters, formatters. For Elixir, there's one tool for each task. My worldview feels quite current.

Those other things you mentioned aren't relevant to tooling. They're distributions, libraries, and applications. But note that Conda's existence was basically due to Python's tooling being poor.


You see tooling in a different way than typical Python users. Python users see things like IPython, notebooks, the ability to quickly do statistics and plot results, as tooling to do their jobs. Data science. Machine learning. Things that perplex and confound static-typing OOP purists. So yes, by your world view, Python is a mess. There's no one way to do things, no one tool, no one linter, no one formatter. I praise the fact that there isn't. What a boring world that would be. What choice would you have if that tool didn't satisfy your needs? Find another language?

I began in that world view: the C/C++/Java/.NET world where everything must have a standard, a fixation on a singular consensus. That's not how things work in the open-source world of Python, JavaScript, Rust, etc. Will there be gravitation towards a paradigm? Sure, until such time as a new one emerges.

If you looked at Python in the 2.5 days and look at Python today, you cannot argue that the tooling hasn't gotten better.


> You see tooling in a different way than typical Python users.

That doesn't surprise me.

I do what you mentioned with F# in Polyglot Notebooks (which support several languages in the same notebook) and Elixir in Livebook all the time, both of which are superior notebook implementations to the outdated Python Jupyter notebooks.

My worldview is not driven by comparing Python to C#, Java, C++, and other such ilk.


So following your logic, "some of the worst" includes the vast majority of actually used languages and tools. Must be cool to be on the special side. :)


They want to be lisp SOOO BAD! =)


+1 for mentioning that Python's original competitor was Perl. This point is forgotten some 20-30 years later.


Not just Perl, but C/C++/Java as well. Ruby was a competitor to Java JSP development back in the day. And I remember when a lot of Java people jumped ship to Ruby. I moved from C++ to Python over a decade ago and never looked back.

Back then, the python jobs were scarce -- but based upon how I picked up the language, many of the typical C++ issues just disappeared -- and I knew it was going to become popular.

One comp sci professor talking at a PyCon years ago made the point that the best college-level introductory course probably should not have been SICP-based but Python-based. His example was that for the first time in 5 years of teaching intro courses he had people coming up to him looking to change majors.


C++ and Python are not competitors. Sure, both are Turing complete, so you can implement anything in either; however, whether you should is a different question. Python is very difficult to maintain when your program goes over 100k lines of code, while the static type system of C++ is good for millions. C++ compiles to fast binary code, while Python is typically 60x slower for logic (though Python often calls a fast algorithm implemented in a compiled language for anything significant, so this is hard to measure in typical programs). However, if your programs are not very large and you don't need every last bit of speed, Python is going to be easier to work with. (And frankly, if you're not working with a legacy C++ codebase, you should look at Rust, Ada, Go, or something else I'm not aware of; C++ has many ugly warts that you shouldn't need to worry about.)


As a person who uses C++, I must say something that also applies, somewhat, to Java.

We have all the cool kids like Kotlin, Rust, etc.

However, when it is about finishing a project, beating C++ or Java ecosystems is almost impossible.

Besides that, C++ has improved a lot over time. It still has its warts and knives, but you can write very reasonable C++ with a few guidelines, and there are also linters, and -Wall -Wextra -Weverything -Werror. That takes you quite far in real life, provided you have a reasonable amount of training.

I would choose C++ over Rust any day. The only hopes I have for C++ successors are Cppfront and Carbon, and they are highly experimental. As successors, those two fit the bill.

There is a third one, my favorite. It is this one: https://www.hylo-lang.org/ but I am not sure how compatible it will/would be.


The one thing Rust is getting right, which I hope Carbon et al. take from it, is using the type system to manage memory.

Not having to explicitly remember `free` in safe Rust code is amazing. Knowing that if my types are sound then memory will be managed reasonably is great.

I also think that immutable by default, mutable by explicit declaration is pretty great.

I do think there is a lot of room to add better ergonomics on top of these ideas, however.


The amount of complexity that Rust adds is not worth it in most scenarios, in my opinion. I can see Rust as something for OSes or other critical-safety work.

Besides that, in real life you end up having C in most of your software, so I am not sure how Rust behaves compared to well-written C++ with all warnings and linters on and using smart pointers. But my hypothesis is that they do not stay very far apart in safety.

There are many ways to achieve safety, and the most promising, IMHO, is the path that Hylo programming language is taking. It sticks to Generic programming and mutable value semantics.

The language base draws from Swift, and it has very strong people behind it who know what they are doing, for example David Abrahams and Sean Parent. It is an implementation in a language of many of the ideas from Sean Parent's "Better Code". It has a very solid foundation.

Besides being a solid foundation for generic programming, value semantics, and concurrency, what I like the most is how much it simplifies the model by not escaping references all around and by preventing unnecessary copies. This removes the (IMHO) mess of having to manage reference escaping constantly in Rust, which is the reason why many patterns such as linked lists are not even possible.

And lists are not just an academic exercise, as I was told sometimes by Rust proponents. Linked structures (immutable linked structures, actually) are important in scenarios such as telco backends where you need replication, fast moving of data, history versioning, rollbacks and so on.


> The amount of complexity that Rust adds is not worth it in most scenarios, in my opinion. I can see Rust as something for OSes or other critical-safety work.

It's difficult to have this discussion in any sane way when Rust (or C) comes up. I tried Rust, but I have projects to deliver on strict timelines and I have yet to find a client who is prepared to pay me for the (what I found to be) very large onramp time to gain deep Rust expertise.[1]

The argument of "just get gud" whenever you point out the deep learning curve of Rust is pointless; I have noticed that Rust experts only come when your employer is rich enough to pay the team to not deliver while they learn it: Basically only FAANGs and startups flush with VC money.

[1] When I have a small project I reach for C. When I need something bigger for which C is not suitable, I don't reach for C++, or Rust, I rather take the tiny performance hit and move to Go. On extremely large projects, where I work with others, C# and Java seem to hit the sweet spot.[2]

[2] Although, C# and Java are also getting a bit too complicated for my tastes too. Seems to me that every language follows C++ evolution towards complexity, because the people stewarding the language are experts in that language and/or in programming language theory.[3] They are comfortable with the changes they propose because they have no need to upskill (they are already at that skill level).

[3] I propose that a language designed by your average corporate developer who has 15 years of experience but no CS degree will have much higher adoption than languages designed by PL purists.


What makes you choose Go over C#/Java for medium projects? And why not go for the large projects?


> What makes you choose Go over C#/Java for medium projects?

Because I said:

>> C# and Java are also getting a bit too complicated for my tastes too.

I abhor complications.

> And why not go for the large projects?

Because I said:

>> where I work with others, C# and Java seem to hit the sweet spot

Yeah yeah, I know it sounds like I am whining (Maybe I am :-), but at least I am complaining about all of them.

Java and C# do appear to have been battle-tested for very large projects that aren't microservices.

Go? I dunno. I've only ever seen very large projects in Go using microservices. I like its simplicity.

My main complaint is that programming languages have too much minutiae to track that I really shouldn't have to be tracking.

Take, for example, asynchronous builtins:

Why are all the explanations wrapped in jargon that only an expert in the language would grok immediately? Promises? Futures? You gotta explain those, with examples, before you can explain what to do with a value from an async call. Go look at the popular introductions to async (say, on MDN for js, or Microsoft for C#, etc) and count how many times they have to explain something because of their leaky abstraction implementation rather than explaining the concept.

How about simply saying "calling async functions only schedules the function for later execution, it doesn't execute it".

That naturally leads into "So you need to check if it is finished using an identifier to identify which scheduled call you want to check"...

Which itself naturally leads to "The identifier you need was given to you when you scheduled the call"...

Which leads to "Using that identifier from `id = foo();`, you wait for the result like this: `result = id.wait()`".

You can even add "You can return that id, or pass it around so some other code can do `id.wait()`".
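
Mapped onto Python's actual names, the explanation above comes out as (a sketch; fetch is a made-up stand-in, and the asyncio task plays the role of the "identifier"):

    import asyncio

    async def fetch():
        await asyncio.sleep(0.1)  # stand-in for real work
        return 42

    async def main():
        task = asyncio.create_task(fetch())  # "schedules the call, hands back an identifier"
        result = await task                  # "id.wait()": wait for the result
        print(result)                        # 42

    asyncio.run(main())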

Now they don't explain it this way, because their implementation(s) is more of a leaky abstraction exposing irrelevant information about the compiler than a straightforward implementation of the concept. They are unable to separate the concept from their implementation.

The common implementation of async is so counterintuitive that they have to explain their particular implementation instead of the concept, because it has almost nothing to do with the concept. If they just explained the concept, programmers would still be confused because the implementation needs things like colored functions just to work poorly.

The concept of scheduled functions (which may return once, or may yield multiple times before returning), which is a simple path to understanding asynchronous calls, is completely divorced from the implementation which will produce errors like "cannot call asynchronous function from a top level"[1] or "cannot call await in a function not declared as async".[2]

So, yeah, I'm kinda upset at how programming has evolved over the years (I wrote my first program in 1986, so had a good seat for the later part of this particular movie), from "simple and straightforward", to "complex for complexities sake".

[1] Why? Because their abstraction is an abstraction of their implementation, and not an abstraction of asynchronous calls.

[2] See [1] above.


> How about simply saying "calling async functions only schedules the function for later execution, it doesn't execute it".

This is not always true.


All this may be true (I'm not the strongest C++ developer in the world; relatively limited exposure), however the Rust memory management via the type system feels natural once you wrap your head around it. That idea is really good. I always hated dealing with `delete`, `free` and `malloc`.

Being able to offload all that busy work to the type system is just nice. There are definitely ergonomic improvements that could be made around this.

All the rest? I'll leave that to someone else to talk through, as I'm no expert here.


I write C++ all the time, and I go months between needing new or delete. unique_ptr is a wonderful thing. Not quite as powerful as Rust's borrow checker, but it saves me a lot of thinking.


> however the Rust memory management via the type system feels natural once you wrap your head around it

It disallows many valid patterns. That is why I recommend taking a look at the Hylo programming language (formerly called Val) to see what I think is a very good example of how to make a language safe without making the learning curve so steep and without needing a GC.


The way Rust does it may disallow valid patterns, but it is not inherent to the idea


> well-written C++ with all warnings and linters on and using smart pointers. But my hypothesis is that they do not stay very far apart in safety.

Can C++ compilers + linters reliably detect all misuses of unique_ptr? Because that sounds like a halting-problem kind of problem, and as soon as you can't guarantee memory-safety, you're definitely not in the same ballpark in terms of safety. I mean, memory-unsafety is the number one vulnerability cause in software. C++ has many qualities, but safety certainly isn't one of them.


> Can C++ compilers + linters reliably detect all misuses of unique_ptr? Because that sounds like a halting-problem kind of problem, and as soon as you can't guarantee memory-safety, you're definitely not in the same ballpark in terms of safety.

Are C and assembly the same level of memory safety? In theory, probably yes... but in practice, no.

And C and C++? The same in theory; in practice... C++ is safer.

How about Rust? In theory Rust is safer. In practice, you are going to use C libraries here and there, so... in practice not as safe as advertised.

Well-written C++ with -Wall -Werror, -Weverything, -Wextra... that is very safe, including detecting even dangling stuff to some extent (gcc-13). If you stick to `shared_ptr` and `unique_ptr`, no matter how much you complain about it, Rust with its C shims and C++ with all linters and a good environment are practically at similar levels of safety.

This is the practical, real thing that happens. I have used C++ every day for around 14 years professionally and 20 years in total.

You are all in the terrain of theory, but how much Rust and C++ have you really written?

Of course, the CVEs data about memory safety, well, those are true. And they are a real problem. But with a reasonably good use of C++ those would be much, much, much lower than they have been so far.


> However, when it is about finishing a project, beating C++ or Java ecosystems is almost impossible.

Yet, somehow people do this with python, perl, and ruby. Google hires professional python people too.


Not only do this, but do it way more successfully. I'll never get tired of repeating that among top YC startups, Java as a primary language contributes roughly 1% of the value, while Python + Ruby are at almost 70%.

https://charliereese.ca/y-combinator-top-50-software-startup...


If by successfully you mean time to market, for sure you are right.

C++ gives more return when you start to save in infra because you have a more efficient language, if coded properly. Same goes for Go vs Python.

The right tool for the right job. I would use (and will, I am on it) Django for the backend of a SaaS that I have. If things start to work relatively well, I will keep identifying bottlenecks, if there are any, and migrate parts of the workload to C++ with a hybrid architecture.

This allows me to save in infrastructure bills.


Hylo page says it was formerly Val.

IIRC, Val has been mentioned on HN sometimes earlier.


As painful as the Python package system is, C/C++ is so much worse.

I remember trying to compile GTK+ on a Solaris system a decade ago, and how terrible it was even to get it to compile.

You're really deluding yourself if you think spending your time in compiler dependency hell is so much better than Python's packaging.


The new CMake package manager may make things easier.


> Python is very difficult to maintain when your program goes over 100k lines of code, while the static type system of C++ is good for millions

I see this argument a lot, but people often forget that Python is very concise (yet readable) compared to other languages.

100k LOC in Python typically contains way more business logic than C++, so it is only natural to be harder to maintain.


Python is much less concise at this size. Sure, your algorithms are more concise, but you lose all of that and more, because if you don't have 100% test coverage you never know when a stupid typo will make your program crash, while with C++ you can typically be fine with more reasonable coverage, say 80%, where the last 20% is things that are hard to test and unlikely to fail anyway. At that scale Python is slower to build as well, because C++ lets your tests depend only on the files that changed, and thus your code-build-test cycle is faster despite C++ being famously slow to compile.


MIT dropping SICP/Scheme for Python coincided with the general decline of education as an end in itself. Python is the VHS of computer languages. I couldn't believe it when I heard MIT dropped a language renowned for its elegant lambda implementation in favour of a language in which lambdas were not only frowned upon but literally throttled. I think it says everything that MIT ditched Scheme and SICP at the same time, as it would be near impossible to teach SICP with Python.



I would argue that Python is the Betamax of computer languages, and C++ is the true VHS.

Fight me.

But seriously, Python is good. Don't let the perfect be the enemy of the good; only computer science people can do this.


> the best college-level introductory course probably should not have been SICP-based but Python-based

I saw this linked on here recently: https://wizardforcel.gitbooks.io/sicp-in-python/content/


And we're of course forgetting Visual Basic, because who cares about Microsoft-land these days, but back when Python/Ruby emerged even Windows Server and IIS were relevant, as this was kind of the peak of Microsoft's dominance.


Yes, it is pretty weird to think that the VB.NET flavor was fairly widely used in web dev even as late as 2006.


> There should be one-- and preferably only one --obvious way to do it.

I just wish Python applied this approach to package management. It is needlessly complicated.


The obvious part is definitely lacking, but the long and short of it is basically to ignore all the newfangled solutions. They are much more trouble than they are worth. Just use pip with a venv and a requirements.txt (easier) or a pyproject.toml (cleaner).

I really fail to see what the newer tools actually bring to the table. From my experience they solve no actual problems and just introduce more bugs and weird behavior.
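
For reference, the whole boring-but-reliable workflow is three commands (assuming a POSIX shell):

    python -m venv .venv
    . .venv/bin/activate
    pip install -r requirements.txt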


I think Python has some of the worst API documentation I’ve ever read.

Even Java puts it to shame and that is sad


Coming to Python from PHP, it was interesting to see that in the PHP world I'd get 80% of my knowledge from the PHP documentation and 20% elsewhere. In the Python world it's easily the other way around. It also doesn't help that the PHP documentation is more up to date, so I am using the most current info, while for Python their own docs are so bad that I rely on third-party docs, which all vary in what version of Python they cover and whether they follow current best practice. The difference is night and day, and it's one of the reasons I ended up going back to PHP.


Yeah the PHP documentation is extremely pragmatic and seems designed to get you going quickly with useful examples.

The Python documentation seems to be suffering from some sort of weird snobbery. Very wordy, as another comment mentioned. Examples are frequently lacking or inadequate. They seem like they're trying to compete with MSDN in "professionalism", although these days even the MSDN examples are better. There is an entire ecosystem of Python tutorial websites around the web that would not exist if the documentation were as helpful as PHP's.


The stdlib documentation definitely has a unique flavour. I would characterise it as "wordy".


I disagree: Python seems to have a zillion build tools and multiple competing type-checkers.


I like Python, but most of the Zen has always been a meme, and not a guideline of design principles of either the language, or software written with it.

Besides the one you mention, I also find the "Explicit is better than implicit" line to be against everything Python stands for. The language is chock full of implicit behavior, and strongly encourages writing implicit code. From the dynamically typed nature of the language itself, to being able to define dunder methods that change the behavior of comparison and arithmetic operators.
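
For example, any class can quietly redefine what the operators mean (a minimal sketch):

    class Money:
        def __init__(self, cents):
            self.cents = cents

        def __add__(self, other):  # "+" now does whatever the class decides
            return Money(self.cents + other.cents)

        def __eq__(self, other):   # "==" too
            return self.cents == other.cents

    print(Money(100) + Money(250) == Money(350))  # True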

I really like Go partly because of this. Not only does the language itself strictly follow the explicit over implicit and TOOWTDI principles, but it encourages or even enforces them on the programmer.


It is absolutely not a meme; it's PEP 20 (https://peps.python.org/pep-0020/). Just because some people don't take it seriously doesn't mean it isn't part of the language's soul.


I think the Zen was really important for Python's success as well. Having your core values right there, well defined and out in front, isn't something you get with a lot of languages. Any language that's been around for a long time gets sorta... muddled.


I'm from the outside looking in so don't take me too seriously because I tried Python and bounced off of it; there's very likely elegant parts of the language that I never really internalized because I didn't spend enough time working with it.

But speaking as someone who tried Python because I agree with principles like "have one right way to do things" and "be explicit", my initial impression working with language is that much like real souls, Python's soul is metaphysical, unobservable, and doesn't seem to interact much with the physical world in an observable way ;)

If I had to list some of my main criticisms of Python it would be that the language seems to have way too much implicit behavior and seems to have way too many ways of doing everything. I'm going to say something heretical, but it was weirdly enough early Javascript that I found to be a lot more consistent and explicit[0]. Type casting was a disaster of course, dates and the standard APIs were a complete mess, but beyond that it was rare for me to look at Javascript code and think "I have no idea what the heck that is doing." But it happened all the time in Python, it took me a while to get used to things and I still feel generally less capable in Python than I do in other languages that I've spent less time working with. There's so many little syntactic tricks in the language that are... convenient, but I resent having to memorize all of them.

[0]: Until it started messing around with classes and const and Symbols and crap -- the language is probably much harder to learn now than it used to be in the past, but I don't know because I'm disconnected from new users now. But certainly having 3 ways to declare a variable now probably doesn't help new users.

----

As an example, just this weekend I tried to convert a Python codebase from 2 to 3 and was immediately hit by needing to resolve implicit casting rules about integers, buffers, and strings that were all different now. Python has this weird thing where sometimes it does implicit casting behind the scenes and sometimes it doesn't? There's probably a rule about it, but it's never been explained to me.

So then I wanted to figure out the best way to convert a string of hex values into a manageable format, so I searched that up and was advised to use `struct.unpack`, except to be careful because that would give me a tuple instead of a list and would also require me to pass in the length; so instead I should actually use `map(ord, s)`, which prints out something that certainly seems to look like a list but is not subscriptable, which is not a thing I knew I needed to care about but apparently do, because the program broke until I cast it back to a list. And probably what I should have done was a list comprehension from the start? But it wasn't clear to me if I could do a list comprehension on a string, since I do know that strings in Python technically aren't lists, they're sequences, and anyway list comprehensions were not what people were suggesting.
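
For the record, the zoo I ran into looks like this (a sketch of the same five bytes in four shapes):

    s = "48656c6c6f"               # hex for "Hello"
    b = bytes.fromhex(s)           # bytes: subscriptable, b[0] == 72

    import struct
    t = struct.unpack("5B", b)     # tuple, and I had to supply the length myself
    m = map(ord, "Hello")          # map object: iterable but not subscriptable
    l = [ord(c) for c in "Hello"]  # list comprehension: finally an actual list

    print(list(b) == list(t) == list(m) == l)  # True -- four types, one idea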

And I know it's unfair because this is very beginner stuff in the language, but my immediate thought was, "oh right, Python. Of course when my debugger prints out something that looks like an array of values it might be one of 3 or 4 different types behind the scenes, all of which will error for subtly different reasons. It was silly of me not to see this coming."

Again, fully aware that this is basic stuff that would completely go away with familiarity with the language, but like.. oh my goodness my kingdom for having one array type that just works everywhere and one iterable quality that supports the same manipulations everywhere no matter what the underlying type is. I'm trying to do quick scripts, if I cared about these distinctions and if I cared enough about performance to need multiple ways to have a list of values, I'd have written this in Rust or at least C# or some fully typed lower-level language. There doesn't need to be this many ways in a scripting language to say "I have an ordered collection of values."

I'm not saying you're wrong; I suspect you're right. I suspect the underlying language is much more elegant than what I'm seeing. All I'm saying is that the initial impressions of Python for people like me, who are really inexperienced with the language, are anything but the PEP 20 list -- the impressions are the opposite, and that's exactly why I bounced off of Python so hard.

And I don't think that's individuals doing something weird; it seems baked into the language. Individuals didn't give Python 4 different ways to represent a sequence of values. I don't think it's a few coders' fault that I'm constantly seeing syntax in Python where having the code look prettier seems to be the priority over making it understandable or explicit.

Again, take it with a grain of salt, just... I don't know, I always laugh when I see PEP 20 linked, because it's so contrary to how I think the language looks to new users. I could compare this to something like Lisp, which I am also extremely inexperienced with and extremely bad at writing, but when people talk about Lisp having simple rules, I think, "yeah, I see that. I see the system that you're talking about and I see the consistency you're talking about." With Python I just don't see it; the initial impression makes it feel like a language written by graphic designers trying to create something that's pretty rather than systemic.


For what it's worth, I have come around to really enjoying the more recent versions of python, and it's what I'm writing most at the moment, but I totally agree with you here. I don't think of python as being explicit and having only one way to do things. I think that's a pretty inevitable result of trying to both add new things and also keep older things for compatibility.


I code both.

> explicit over implicit

Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly. C/Python/Java/C++ didn't have this issue. Null was always null in each of these languages.

Further, Go panics in surprising ways sometimes. A library developer might decide to panic and cause your program to crash, even when that library was only used in your program for optional behavior.


> Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly.

Maybe I'm reading it wrong, but it reads that you have been accustomed to implicit nil type conversions in other languages and that tripped you up when Go required you to be explicit. It seems Go is the explicit one here.

> A library developer might decide to panic and cause your program to crash

Go is abundantly clear that panics should never cross package boundaries except in exceptional circumstances. And those exceptions should crash your program: a panic means that the programmer made a mistake and the program is now invalid. There is nothing left to do but crash.

It is technically true that a library developer can violate that guidance, but it is strongly encouraged for them to not. The parent did indicate that sometimes the explicitness is only encouraged, not enforced.

Not that Go is some kind of explicitness panacea. It has its own fair share of implicitness. But I'm not sure these two are representative.


> Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly. C/Python/Java/C++ didn't have this issue. Null was always null in each of these languages.

I think it makes sense - a `nonexistent<ThisType>` is different from a `nonexistent<ThatType>`.

In C, Java, C#, C++ and other languages, null/nil is a value. In Go null/nil has a type. Now that I've used Go a little, I actually miss having error messages for certain things, and comparing a `nonexistent<ThatType>` to a `nonexistent<ThisType>` is actually a logic error in my code that I want to be warned about.


I don't consider "you can define + on custom data types" to be the same as implicit behavior.

"Explicit is better than implicit" is more around things like Django not just automatically importing every directory it sees (instead requiring you to list the apps used). It's also a decent rationale for changes made from Py2 to Py3 regarding bytes-to-string conversions needing to happen explicitly.

It's also about how you usually end up explicitly listing imports (wildcard imports exist but are used sparingly), and about the lack of implicit type conversions between data types.
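
e.g. the bytes-to-string explicitness mentioned above (a minimal sketch):

    data = b"caf\xc3\xa9"        # bytes, e.g. straight off a socket
    text = data.decode("utf-8")  # explicit decode; Py2 would sometimes coerce silently
    raw = text.encode("utf-8")   # and explicitly back again
    assert text == "café" and raw == data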

This stuff is of course dependent on what library you are using, the projects you are working on, etc. And there's a certain level of mastery expected. But I think the general ideas exist despite the language allowing for much more in theory.


This held up much better in the earlier days of Python.

Sooner or later, every sufficiently popular programming language is confronted with the dilemma of either breaking backwards compatibility and/or adding "alternative ways to do things".

Pathlib is an interesting one. It even has a correspondence table [1]. The introduction of pathlib made sense because it is just much nicer to use. But you couldn't drop the equivalent functionality in the "os" module for backwards compatibility. It's just far too widely used.

There is no magic bullet for this one. Either you accept introducing duplication in your language or you go through a hard change (f.ex. the Python 2 to 3 change).

The softest way to migrate might be to flag functionality as deprecated and give users a fairly long time (talking years) before dropping it. But no matter how much time you give them, there will be users who won't change until it's too late.

So there really isn't a winning path for this one.

[1]: https://docs.python.org/3/library/pathlib.html#correspondenc...
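
The duplication is easy to see side by side (a small sketch):

    import os.path
    from pathlib import Path

    # The same questions, answered twice by the stdlib:
    os.path.join("src", "pkg", "mod.py")   # 'src/pkg/mod.py'
    Path("src") / "pkg" / "mod.py"         # PosixPath('src/pkg/mod.py') on POSIX

    os.path.splitext("archive.tar.gz")[1]  # '.gz'
    Path("archive.tar.gz").suffix          # '.gz'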


That was the one mantra from the zen of python that I always laugh at too.

Because my #1 complaint with Python is that there are so many ways to do the same thing. It sounds nice in theory if you are a new developer, because you can create your own voice and character in Python.

For example, let's say I gave a simple coding puzzle (think leetcode) to 10 Python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts, but they would be significantly different.

By comparison if I gave the same puzzle to some Go engineers, all of the responses would be probably very close to identical.

This sounds fun as a new engineer; it sounds like an argument for using Python. But as you get older and more experienced, you will likely find yourself hating the flexibility of Python, because as you scan any decent-size codebase, you start to be able to identify entire parts of the codebase and who wrote them, without even needing to do a git blame.

I am currently an SRE manager who works in a very polyglot environment (primarily Node/JS, bash/sh, Python, and Go, with just enough Java thrown in to ruin your day). When I read through Python codebases, I can identify exactly who wrote what, even past employees. Then when you need to fix something, you often have to adapt to that style as you fix a bug or add an updated feature. This is a little true with Bash too, because you will see certain engineers always lean towards sed and others towards awk and grep depending on their strengths, but it is less significant. In our Go codebases, however, everyone writes nearly identical code.

I've been writing Python for over a decade and I still learn new features of the language every week or month. Just this week I had to dive into the `ast` library (abstract syntax trees), a native module I hadn't ever touched before. By contrast, I haven't had to learn any new core syntax or tools in Go or Bash in a long time. You have a fairly small set of syntax and that's it. The power comes in how you use that small set of tools, not in learning a new language module.

Again, the infinite flexibility sounds nice. But in a corporate environment the strictness of other languages is actually a benefit because it keeps everyone following the same path.

I truly believe the mantra from zen of python:

> There should be one-- and preferably only one --obvious way to do it

Sadly, Python lost this tradition far before I ever became acquainted with the language. And the Python 3.10+ mentality of supporting everything forever is only going to make it worse over time.


Either you move with the times or you become obsolete. 20 years ago Python codebases were clean and consistent, but language design has moved on, and Python has - barely - kept up with it, so now you have people who know the new ways and people who know the old ways and various stages in between (and it's not like they didn't try ditching backward compatibility, but that didn't work out well either). Go has the luxury of starting 20 years later and being a lot less ambitious, but it'll happen to Go too in time.


> For example, let's say I gave a simple coding puzzle (think leetcode) to 10 Python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts, but they would be significantly different.

I get what you're saying, but I don't think that's as true as you mean. I think that most experienced Python developers tend to gravitate towards the same style of solutions for things like coding puzzles (including the classic "read in data, map it through a comprehension, write out data" pattern).

There are multiple ways of expressing the same thing, but very often in a specific context one of the ways tends to be overwhelmingly a better fit than the other.


> For example, let's say I gave a simple coding puzzle (think leetcode) to 10 Python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts, but they would still be significantly different.

This is true for any language. Arguably, what's different about Python is that the more senior the engineers you're interviewing, the more likely their solutions are to converge.

Just because something is in the Zen of Python, doesn't mean it automatically gets followed by every Python engineer. It's Zen, after all - you don't automatically get enlightened just by reading it.


There's generally one obvious way to do it. There's almost certainly other ways to do it, and those other ways might be better suited for different specific tasks, but the obvious way tends to exist and be widely usable.

Are there some exceptions? Sure.


I'd suggest "pythonic", rather than obvious, for that sentence. It's one of the ways the community reminds itself of the utility of consistent discipline.


It's perhaps worth noting that the next line of the Zen of Python is: "Although that way may not be obvious at first unless you're Dutch." (I.e. unless you're Guido.)

So it was a bit of a tongue in cheek statement from the very beginning. :D


IMO, it's not “hilariously wrong” in Python.

Note that, as written, the priority is that every use case should have at least one obvious approach; that the obvious approach be unique is secondary.

People often seem, however, to misread it as “there should be only one possible way to perform any given task.” Which, yes, is hilariously false for Python, but also not what it says.


Yeah. It is now.

It didn't use to be, though. People would wax poetic about how pythonic some codebase was and sing the praises of idiomatic Python. There was an ideological war between Python ("there should be one--and preferably only one--obvious way to do it") and Perl ("there's more than one way to do it", or TIMTOWTDI, pronounced "Tim Toady").

Generators and comprehensions and decorators and `functools` are all relatively new.
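
For anyone who joined after that era, a small illustration of those newer idioms:

    from functools import lru_cache

    @lru_cache(maxsize=None)        # decorator, from functools
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    squares = [n * n for n in range(10)]        # list comprehension
    evens = (n for n in squares if n % 2 == 0)  # generator expression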


I feel like the entire industry has adopted Python's approach here, so probably Python doesn't stand out on this point as much as it did in the early days.

Compare Python to C/C++, where even just getting a working binary from source is fraught with many competing tools. Or boost vs std and whatnot.

Others have already pointed out the contrast with Perl.


There is always one obvious way to do things in Python, but it happens that what's obvious varies from person to person.


Like those various personal subsets of C++ :)


As evidenced by the unified Python library / dependency management system.


I use pip 100% of the time


>> I use pip 100% of the time

What about pipenv, poetry, conda, setuptools, hatch, micropipenv, PDM, pip-tools, ActiveState platform, homebrew, or your Linux / BSD distro's package manager?


As much as I love to rag on things, I would go so far as to say that the big problem with Python packaging is the fact that it tries to manage C/C++ packaging and integration.

If Python is only managing Python code, the issues to be solved are VASTLY simpler. Just about anything works.

Once it has to deal with dynamic libraries, compiled code, plugins, build systems, etc. the combinatorial explosion just makes life suck.


Python is a victim of its own success and versatility.

It has become a better "glue code to solve your problem" than many other solutions and so people want to use it everywhere for everything.

It gets packaged in many different ways because it gets used in many different ways for many different purposes.


Packaging is a difficult problem and all attempts to simplify things fail to understand how complex the problem is and so fail in some way. (Attempts like .deb do okay by only focusing on a subset of the problem)


I use none of those except for homebrew, but I didn't mention it because it's for installing complete programs that happen to be written in Python, not Python dependencies for when I'm working with Python.


You forgot about egg!


pip (with venvs, which are a built-in Python feature) covers 99% of all use cases.

Yes, its dependency resolution could be better, and lock files are nice (they can be emulated using constraints [1]), but I don't understand why people are so busy writing alternatives. I work on pretty sophisticated codebases on a daily basis and haven't used anything but pip.

[1] https://pip.pypa.io/en/stable/user_guide/#constraints-files
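
The constraints workflow is just an extra flag on the install you already run (file names here are the conventional ones, not mandated):

    # constraints.txt pins exact versions; requirements.txt stays loose
    pip install -r requirements.txt -c constraints.txt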


Good for you but many don’t. Most people I know use Anaconda and install things with Conda


Use Poetry to produce a lockfile (or whatever lockfile-producing tool you like), and then let Nix take the wheel. It solves the dependency problem not just for Python, but for your entire stack. I've been able to hand it off to my coworkers and have everything just work with a simple "nix-shell" command, even with a dependency tree chock full of gnarly ML libs.


Whatever in the NamedTuples do you mean?


There are also TypedDict, Pydantic, attrs, and dataclasses.
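
A sketch of the same record in three of those, stdlib only:

    from dataclasses import dataclass
    from typing import NamedTuple, TypedDict

    class PointNT(NamedTuple):
        x: int
        y: int

    class PointTD(TypedDict):
        x: int
        y: int

    @dataclass
    class PointDC:
        x: int
        y: int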


Yep, but couldn't be bothered typing them all :D


Far faaaar better with respect to this than R though at least.


You're young enough to have never dealt with write-only Perl code ...

Be glad.


> You're young enough to have never dealt with write-only Perl code ...

How young does one have to be for that to be true?

As recently as 2017 I was employed at a place where there was a significant amount of perl code, as part of the build system (ugh) for the C code, and generating html (for some other system).


Back then (in 2004) the most popular programming languages were PHP, Perl, C++ and Java. Java was pretty focused (and even then, there was the distinction between int and Integer, between int[] and ArrayList, etc.), but C++ and (especially) Perl were driving people crazy because there were thousands of ways to do the same thing. And nobody had any respect for PHP's design, so let's not even talk about that one.


Not really. For the core language this applies. This does not extend to 3rd-party libraries, obviously, since anyone is free to reproduce whatever someone else is already doing better, if they want.


I also like the idea of explicit > implicit, but then you see Django.

I hate significant whitespace though. I can work with it, I get it, I still don't like it.


That was then. Python does much, much less explicitly these days.


> I think Python was popular as a general-purpose language first.

That matches my memory as well. I can't find any references, but I seem to remember a quote going around way back circa 2008 that goes something like "Ruby is popular because it's the language used to write Rails; Django is popular because it's written in Python."


I think this is it. Love Ruby as a language. Have grown to hate the way Ruby developers write stuff. Stuff ends up with so much abstraction and indirection. End up yearning for a simpler time when you could write code that just did what you needed it to.


I've seen that in Rails projects too. They're often way more complicated than they need to be.

My rule of thumb for a medium or large project is that one probably shouldn't mess with the dynamic and meta bits of Ruby unless writing a framework - and maybe not even then.


100%. You can absolutely still just use off-the-shelf Ruby-the-language to get things done, without involving Rails at all, but at this point I think it would be just as weird as using awk or perl for writing a command-line program.

I love tons of things about Ruby, like first-class/syntactic symbols and especially the pervasive and intuitive (in my opinion) use of blocks in the standard library.


Delving into a gem sometimes makes me lose faith in ever using dependencies again.


There are just too many ways to do things in Ruby. How many forms of an if-else or for looping can you name in Ruby? Just as the simplest example.

Monkeypatching is also awful for readability.

Explicit imports are way more readable than things implicitly appearing in the current namespace, as happens with Ruby.


> There are just too many ways to do things in Ruby

I think this is not completely true. Ruby is conceptually very simple. It's not C++ or Perl. Everything is an object, and everything is done through method calls. There is also extensive support for functional programming, so iteration is performed via filter/map/reduce and similar higher-order functions. Besides, the block/proc/lambda design makes it easy to turn APIs into DSLs.

For those reasons, Ruby gives the closest experience to programming in Scheme I have ever had outside the Lisp world.


I've been writing Ruby daily for years and never written a `for` loop


Started Ruby in 2007 and have used it professionally essentially nonstop. I have also never written or seen a for loop. I don’t even think I could correctly guess the syntax of it.


Monkeypatching exists in both ruby and Python.


It does but it's generally considered a bad practice in Python (and many other languages) and a feature in Ruby.


Two: `for` and `each`. Most people use `each`.


Python fixtures anyone?


I remember getting into Python because you have slices and other syntactic sugar that makes you think other programming languages were making your life miserable on purpose!

Also, the batteries included in the standard libs were incredible when the alternative was the ACE/TAO monster in C++.

Finally, interfacing with native code via SWIG enables you to quickly use critical optimized code.

Obviously, other programming languages have captured Python's powers since then.


This sounds like an idea without any real data. For people to prefer one, they had to try both, and I'd guess data science people especially just took the most common language in their field.

People seem to forget or miss that it took many, many years for Python to become popular outside data science. If it had just been clearly better or easier, that transition should have gone much faster.

As someone else mentioned, at one point universities started using Python as an introductory language. I am still sad that they did not choose Ruby or JS, but here we are.


I don't buy this at all. I think it's path dependent. The language differences aren't big enough to dominate over ecosystems and libraries. Ease of integration with C can make a bigger difference than significant whitespace. Etc.

Personally I think Python is an exceptionally ugly language for one as popular as it is (the magical underscore identifiers really bug me, and I think list comprehensions are deeply inferior to monadic chaining - there's a reason nobody copies them but everyone got LINQ envy). But it's clear from a perusal of code in the areas where Python dominates, data science and machine learning, that aesthetics are very far from people's minds. They'd be using Javascript if it had the libraries available.


>I think Python was popular as a general-purpose language first.

What were their choices though, Perl? It's easy to see why Perl lost out. Other than PHP, I don't really know of any other JIT scripting languages they could have chosen.


I knew about Python in 2006 or so, when I got into Linux, and at that time (when Python 2 was a thing; iirc it had just come out) it was very popular within the FOSS world for doing apps and GUIs (I even toyed a lot with PyGTK), whereas I felt Perl was much more about "serious" stuff like text processing, kind of an sh language on steroids. I had just barely heard about Ruby and wasn't sure what its purpose was - I had just heard about its "gems" but not about Rails. Still, as both Python and Perl were FOSS, I supposed their niche and user base were going to stay around it, like many things FOSS at the time.

In 2008 I started my graphic design studies and pretty much had to forget about programming (in hindsight, if I had kept doing it I would have a very strong programming background and maybe my life would be much better now, but it is what it is) - but I was very surprised to discover around 2011 or so that it seemed _everyone_ was using Python. Like I just blinked and it took over the world somehow.


The other strong contender at the time was Tcl, especially when combined with the graphical toolkit as Tcl/Tk. Python adopted Tk as well, given its popularity.

Tcl's extensibility led to Expect, which was very useful for automating interactive sessions over telnet.

https://en.wikipedia.org/wiki/Expect


Tcl also influenced many aspects of Microsoft PowerShell.


There were always other choices. Lua is the main one that comes to mind.

The point is that data science use came quite late in the Python world, and the increase in users due to it is incremental. Python was already at 3.x before the ML world adopted it. If the ML world picked another language, Python would still be in the top 5 languages.


You arbitrarily restricted the scope. Java is what I saw.

I was in college 2006 - 2010 in CS, and while all the introductory courses were in Java, by 2008 or so a lot of the other students had switched to Python on their own, for projects where either language would work. Didn't really see anything else, just Java and Python.


No, it wasn't arbitrary. A JIT language is much easier to pick up than something needing a compiler and an executable. The focus on a scripting language was deliberate.

Edit: turns out the term I'm looking for wasn't JIT but an interpreted language.


Perl and python have opposite philosophies with regards to standards. Python prefers a standard "pythonic" way, while perl had its "there is more than one way to do it".

It would seem that having a standard is more popular.


Yeah, there's the ongoing Perl joke about writing a script that works today, but not understanding how it works tomorrow. Too much one-liner type stuff that did not allow for maintainability.


That’s any code I haven’t looked at for a while, to be honest. I can’t count how many times I’ve looked at code I wrote or bug tickets I fixed and have absolutely no memory of doing it. It’s almost like the act of committing flushes the local storage in my brain.


I run into this problem as well. I'll often come across something I wrote a few years ago and struggle to remember why I wrote it that way.

I've learned to add comments to my code - from what I see, commenting code is frowned upon by a certain subset of developers, but I've taught myself to add a short comment explaining what the code is doing whenever I am doing something subtle or unintuitive. For every potentially unnecessary comment I've added, I've also saved myself time when I've had to come back to something months or years later and been able to refer back to the comment.


> from what I see commenting code is frowned upon by a certain subset of developers

I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a... belief.


The main argument seems to be that comments should be left in the commit message and not "inline", i.e. mixed with the code. Personally I make heavy use of inline-style comments.


I've worked in tech my whole life and never heard a dev make these comments are bad claims. I only work in the boring world though. Is this a startup or FAANG or silicon valley thing?


In Clean Code there is a chapter about eliminating useless comments. I think this has some merit. Think of the following made-up example:

sum = sum_two_numbers(number1, number2)

Probably doesn't need a comment.

The other argument is that comments need to be maintained and are subject to decay. E.g. the code they are commenting on changes.

The book goes into other examples, but I think the idea isn't to eliminate comments but to be thoughtful and judicious in which ones you use.


Perl is different, as people really did this on purpose, maximizing the concept of one-liners. Maybe it's a PTSD kind of effect? "Any" code - or at least good code - isn't written in a way that is difficult to parse for the next person. Perl developers definitely didn't adhere to the policy of writing code as if the person maintaining it is a serial killer who targets people who write unmaintainable code and knows where you live.


> Perl and python have opposite philosophies with regards to standards. Python prefers a standard "pythonic" way, while perl had its "there is more than one way to do it".

It wasn't just "having a standard" that mattered; in Perl it was actively encouraged to find a different way of doing something.

"There is more than one way to do it" was often taken by the Perl programmer as a challenge to find all the other ways, and then use them all in different parts of the same program.


Icon? Smalltalk? Dylan? Scheme? Common Lisp?


Perl's actually excellent at processing unstructured data, and it had a strong foothold in bioinformatics for a time. I don't think the decision was as obvious as it looks.


This is true. Bioinformatics was full of Perl scripts for a variety of (text) analyses. I remember well, however, that many students began to hate it soon after working with it, as it was very difficult to understand existing code. So, when given a choice, many chose Python as an alternative. And stayed with it.


For most people, the thing that is hard to understand about Perl scripts is the regexp code. However, regexps look more or less the same in any language. The thing is, most Perl scripts process things such as log files and similar data, which makes the scripts highly dependent on regexps, hence hard to read and maintain. The same goes for any code that uses a lot of regexps.

Actual Perl code, disregarding regexps, certainly isn't any more difficult to comprehend than code in most other languages.


Python may be JIT now (is it?), but it certainly wasn’t back then.


JIT may not be the correct thing to call it. At least in my head, any script that doesn't need to be compiled by the dev or the user before running is JIT. It's the scripting vs building difference to me. If that's not correct, then I'd love to be corrected.

Here's my reference:

https://en.wikipedia.org/wiki/Just-in-time_compilation


Interpreter vs. Compiler might be closer to the distinction you are looking for:

https://en.wikipedia.org/wiki/Interpreter_(computing)


Thanks. "My interpreter doesn't care about white spaces, why should I" being something that should have clued me in.


JIT is a compilation model for interpreted languages, which translates bytecode into machine code at runtime to obtain better performance. The opposite of JIT is AOT (Ahead of Time). JIT languages go through two compilation steps: one to bytecode, and another to native code at runtime. Java is one example of a JIT-compiled interpreted language that also does ahead-of-time compilation to bytecode, while Python compiles to bytecode transparently but does not do any JIT compilation to native code, which is part of the reason it's considered a "slow language" (though this expression doesn't really make any sense, as slowness is a property of the implementation, not of the language).

TLDR:

Java uses AOT bytecode compilation + JIT compilation

Python uses import-time bytecode compilation + slow interpreter
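
You can watch the bytecode step directly (illustrative):

    import dis

    def add(a, b):
        return a + b

    # prints the bytecode CPython's interpreter loops over
    # (BINARY_ADD on older versions, BINARY_OP on 3.11+)
    dis.dis(add)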


> (though this expression doesn't really make any sense, as slowness is a property of the implementation, not of the language)

Yes - this is how CPython works, but PyPy uses a JIT and is faster.


> and is faster.

After a very long warmup time, which may make one-shot applications much slower.


> There should be one-- and preferably only one --obvious way to do it.

There’s not even one obvious flavour of Python to use.


It's hard for me to name a problem domain that doesn't have 3 different python packages doing the same thing three different ways, each more clever and less understandable than the last.


for real

If it's not in the stdlib, I'd go with the most popular one, or just DIY.

No, it is not "best practice" to use a barely tested module with half the bugs affecting its main functionality just because someone thinks it is

Unfortunately, package abandonment is a thing, and maintainers don't even bother fixing needed bugs (which, coupled with 'the urge to deprecate', makes things annoying).


Why TF would you name an HTML parser BeautifulSoup?


I Googled it, apparently it comes from this https://en.m.wikipedia.org/wiki/Tag_soup


OK, I get it, but when I'm trying to write automation for a shop that isn't natively Python-savvy (long story short, it made sense when I started a side project which evolved), if I use that library and ever move on, I now have to document in comments or somewhere WTF "BeautifulSoup" means. Because of some rando's inside joke they thought was funny.


This is actually one reason I prefer the Ruby ecosystem — whimsy is still welcome. The world has enough Noun Manager projects.


When I'm trying to figure out how to build a side project for my team and simultaneously fighting company bureaucracy who reflexively stonewalls things on the order of "no one's done that before, we need permission," I don't also want to deal with some immature idiot's stupid namespace. I want a damned Noun Manager for my own sanity's sake.


The name is also a reference to Alice in Wonderland.


Only one of the items in your "such as" list is unique to python's philosophy, and that is "there should be one way to do it". Ruby embraces the many ways to accomplish something, it's true. However, the three others are not unique to python. Ruby especially embraces "readability counts". And I can't think of anything in the ruby language itself that is implicit over explicit. Perhaps you are thinking of rails and comparing that to python.


For me, the Ruby community's comfort with monkey patching was a big turn off. In Python, you can hack around on class and replace its methods, but if you do, it's expected that your coworkers might stick you in a dunk tank and sell tickets. It's just not the thing that's done.


Monkeypatching is maybe more of a Rails thing than Ruby. I think the biggest problems with Ruby around the time Python began to take off were slowness (MRI was a disaster, slow, and full of memory leaks), plus patchy documentation (lots of things were in Japanese).

Still, I preferred and prefer Ruby. Python has fantastic libraries, but it is a mediocre language. Ruby feels like a simpler version of Perl + Smalltalk, and it is a joy to use. Python has intentionally crippled anonymous functions plus syntactic whitespace, which often leads to long and ugly code.

I think it is a shame Guido hated functional programming and he did not embrace an Algol-like syntax with begin/do end blocks. Those two things could have vastly improved Python. Ruby's block, procedure and lambda design is a stroke of genius that yields beautiful code and makes DSLs trivial.


>Ruby's block, procedure and lambda design is a stroke of genius

Hardly. Not only was it nothing great, it had many variations, making it harder to remember.


> For me, the Ruby community's comfort with monkey patching was a big turn off

If it helps, the Ruby community really soured on monkeypatching in general quite a while back. You don't see it much these days and every decent book or guide warns against it.

It was definitely a crazier place 10+ years ago in the early Rails days.


>It was definitely a crazier place 10+ years ago in the early Rails days.

Yeah! It sure was.

I was around, was an early Ruby and Rails adopter, and worked on a few commercial Rails projects at that time.

That's how I know about the monkey patching issues. Faced them in real life.


I certainly wouldn't blame a person for experiencing those crazier days and thinking the community didn't have the greatest technical vision.

I played around with Rails and Ruby when Rails first blew up, but I didn't start doing it fulltime professionally until 2014. By that time it seemed to me that the community was maturing. I think it's in a good place now.


I figure if Django built an entire framework around metaprogramming, I'm going to do it to Django model objects as a form of vengeance.


I want to see the malicious wonders you can create!


An easy way to start taking the fight back to the enemy is by overriding __new__ and tinkering with things before/after Django's metaclass __new__ does its own metafoolery.

Also overriding mro() is a fun way to monkey with the inner workings.
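
A generic (non-Django) sketch of the pattern:

    class Meddling(type):
        def __new__(mcls, name, bases, namespace):
            # tinker with the class body before the class is built...
            namespace.setdefault("created_by", "Meddling")
            cls = super().__new__(mcls, name, bases, namespace)
            # ...and with the finished class after
            return cls

    class Model(metaclass=Meddling):
        pass

    print(Model.created_by)  # "Meddling"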


Whereas in Ruby you get to go to a conf for doing that.



> And I can't think of anything in the ruby language itself that is implicit over explicit

This is surprising to me. I love ruby, but it embraces implicitness more than any other language I've used. method_missing, monkey patching internals, and other types of metaprogramming make things very implicit at the interface level


If you want to really experience the true horror of implicit try Scala.


Scala's implicits are a lot less unpleasant because the type system means that you (and, perhaps more importantly, your IDE) know exactly where they're happening.


> Ruby embraces the many ways to accomplish something, it's true.

Far less true than with Perl.

On the other hand, many questions I asked about Python came with various ways to do the thing. So I'm not sure about that advertisement.


Ruby has implicit return values for methods, but unless I'm wrong, so does Python.


Python has a default None return, Ruby returns the value of the last expression.

Neither (except maybe the None case in Python) is really implicit, Ruby is just an expression oriented language while Python is statement-oriented.

OTOH, in Ruby the keyword “return” is superfluous except for altering control flow for an early return, while in Python it is the mechanism for supplying a return value.
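
In Python terms:

    def side_effect_only():
        x = 1 + 1  # no return statement anywhere

    print(side_effect_only())  # None, the implicit default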


> Python has a default None return, Ruby returns the value of the last expression.

I would really hate this feature in a language without strong static typing.


Doesn’t seem like that big of a deal. You’re going to have a really hard time getting your first test to pass if you make a mistake.

I don’t mean using testing as a poor man’s type system, I mean even the tests you would write in a statically typed language will quickly uncover any mistakes you could possibly make there.


In statically typed systems, this wouldn't even compile.

I'm pretty sure I'd see multiple cases where the function would return different types depending on a code path.


> In statically typed systems, this wouldn't even compile.

Still, it's not like you're going to miss that a function doesn't return something expected. Even in statically typed languages you are going to have tests to validate that much.

> I'm pretty sure I'd see multiple cases where the function would return different types depending on a code path.

That's another problem entirely. Although even then it would be pretty hard for an errant type not to blow up your tests. Again, not tests specifically meant to check that your types are sane, just the same tests you would write in a statically typed language.


> I'm pretty sure I'd see multiple cases where the function would return different types depending on a code path.

Quite commonly by design, or, “why union (and sum) types are a thing”.

Something returning type “Foo | nil” is very often going to return Foo on one branch and nil on another.


Python's only implicit return value is the default `None`.


Thanks! Adding that to the things I've learned (or unforgotten, I guess is more accurate)


Well, also, if you're going to write something like Numpy, Python is the most hospitable language for it. C extensions really are a superpower of the language. It would not be easy to achieve a similarly effective result while working through a more standard-issue foreign function interface.


> There should be one-- and preferably only one --obvious way to do it.

It looks too late for 3.13[0]

Maybe they can channel the BDFL in the Packaging[1] thread for version Pi.

[0] https://docs.python.org/3.13/whatsnew/3.13.html

[1] https://discuss.python.org/c/packaging/14


> I think Python was popular as a general-purpose language first.

Is that true? My understanding was that it was a scripting language first (still is), but then got taken up by data and science people and various other niches. Then some education books and courses, like the Artificial Intelligence: A Modern Approach and later MIT and other university adopters. And all of that began to snowball where Python was either the only or main language people knew, so they started using it for everything indiscriminately.


> I think Python was popular as a general-purpose language first. After all, there was a reason people put so much effort into writing Numpy in the first place.

What general purpose? That's just a buzzword. Especially back then, shipping Python apps was never a viable option compared to binaries compiled from C/C++ or Java. There never was such a "general purpose".


I've used both ruby and python extensively, and I don't actually consider them to be any different, really, on any of these points.


> I think Python was popular as a general-purpose language first

What I understand is that Python was popular in the anglosphere as a general purpose language first, Ruby was somewhat popular in Japan earlier as a general purpose language but didn't become popular in the anglosphere until Rails.


I was actually surprised when I found out they made a web framework in "the RPG Maker XP language".


> Readability counts.

In what way is numpy readable?


It's designed to be familiar to people who know Matlab. Matplotlib bears the same burden. Sometimes you have to start with something that's familiar to users and slowly change when you have people on board.


LOL, yeah. While the principles listed in the Zen are good advice and practice, unfortunately most of them don't really depend on the language itself but rather on whoever writes the code and their implementation choices. Of those few that do depend on the language, too bad Python violates pretty much all of them. Starting with the very 1st and 2nd lines, where significant whitespace (one of the worst ideas ever in programming, if you ask me) makes block limits implicit in the indentation rather than explicit in the code with clear delimitation marks (brackets, for instance), making most of the source code written in it look quite ugly in return - and, as a dyslexic, I can add: not very accessible. Let alone the horrible experience while iterating on a piece of code: 99% of the time you're briefly stopped by an error simply because in the iteration some lines got commented out and now the damn thing throws a fit over the indentation. Not really practical, in my experience.

The reason why one language is more used than others at any given time is way simpler and more bound to humans than to the languages themselves: fashion trends, laziness, sloth.

Most of the people out there writing code and "increasing the numbers for any given language" have no real idea why they started with one language rather than some other one; they never really dig deep enough to make an informed choice, and most will keep using a single programming language because they "don't feel the need to learn a new one", aka: I'm too lazy to ever go deep into the only language I know, let alone learn a new one. And it's the market's fault: we spent the last decade or more touting how many bazillion programmers will be needed, how anyone can get a great life by simply learning a bit of coding, etc. No one gave a fuck about quality, the only goal being to cheapen the Software Developer profession further and further, until neural networks came about and indirectly revealed the truth: we haven't been raising SW developers/engineers/etc, most of them were just Code Typists copying out of Stack Overflow. If something like Copilot or ChatGPT can substitute for them, it means there wasn't much value there in the 1st place. In 2007, Jeff Atwood made the quote that came to be popularly referred to as Atwood's Law: "Any application that can be written in JavaScript, will eventually be written in JavaScript." And that's NOT a good thing; it's just the epitome of the state of the industry.

In Python's case, its luck was Google: Python (like Go, for instance) is a convenient language for system automation, let's say a more sane version of what Perl was mostly used for in the past (if you notice, lots of the Zen of Python's ideas are attempts to fix Perl's insanity). Google has lots of system engineering going on, lots of people using (and abusing) Python, and a single repo where everything ends up, and when they started making neural networks with it, Python became fashionable for making neural networks. Anyone and their dog wanting to try out some kind of machine learning (10+ years ago) would find a tutorial in Python, and TensorFlow sealed the deal.

Yes, numpy and pandas did have quite a bit of weight in luring the math community into using Python, but there's nothing inherent in Python that makes them possible; they could have been made in any other language. For instance, Haskell and Lisp are way more approachable from a math standpoint; they're just not in fashion anymore.


No, that’s self-important bullshit. These are just evidence that Python really suffers from people in that community not using any other languages.


Numpy is certainly amazing, but there are tons of competitors in the data/scientific space, which pure "A-type" data scientists tend to prefer: R, SPSS, Matlab...

The difference is that Python doesn't entirely suck as a general-purpose language. Sure, you might have better options, but it's still reasonable to write almost anything in Python.

Other scripting languages like Ruby, JS and Lua are probably a little bit better when evaluated strictly on their merits as programming languages, but they lack the ecosystem.

In other words, it might be inelegant, slow, a mess, whatever, but the concept behind it is basically correct.


>> R, SPSS, Matlab...

I'd immediately throw out "competitors" that cost hundreds of dollars to use. https://www.mathworks.com/pricing-licensing.html Academic pricing is $275/yr and normal is almost a thousand a year. Then you pay for modules on top.


Yes, and this is why R has become so dominant in the data and statistics spaces. R is free, and has gained so much momentum that it often has packages that no other language has. Yesterday I found out that there was just no way to easily reproduce mgcv::gam in python. There are loads of similar examples.

As someone who likes both R and Python, and had the misfortune to learn Stata; I am very glad to see R winning over these expensive dinosaurs. Maybe one day it will even displace matlab!


R even has packages to read in NanoSIMS data! [1]

There are perhaps a hundred NanoSIMS machines in the world as they're extremely expensive [2]. Yet we have an R package on CRAN to analyse their data!

[1] https://cran.r-project.org/web/packages/lans2r/vignettes/lan... [2] This 2017 review says there are 42 machines: https://www.osti.gov/servlets/purl/1398172


Numpy exists (somewhat indirectly) because Matlab cost $$$. And now numpy is a far larger use of Matlab's concepts than Matlab will ever be.

I haven't seen anything an "A-type" data scientist needs in R/SAS/SPSS that was intrinsic to the language and couldn't be ported to Python. I don't want a "data/scientific" language, or a "web" language, or a "UI" language - I want one language that supports all the use cases well enough.


> I haven't seen anything an "A-type" data scientist needs in R/SAS/SPSS that was intrinsic to the language

Not intrinsic to the language but in terms of the ecosystem, for a purely A-type data science workload R is significantly better. dplyr + ggplot vs. pandas + matplotlib isn't even remotely close.

Now, obviously in the real world nothing is ever purely model work, which is why Python more than makes up for the difference.

> I don't want a "data/scientific" language, or a "web" language, or a "UI" language- I want one language that explicitly supports all the usecases well-enough.

Me neither. *taps sign*

>> The difference is that Python doesn't entirely suck as a general-purpose language.


R is great when your data is already together in one place and homogeneous/processed. In that instance the workflow using dplyr/ggplot is better. R the language is hot garbage though, and the libraries available for doing random things - things other than working with certain data structures, doing math, or plotting - are generally much worse than their Python equivalents. That is the reason Python dominates data engineering, and why it's pushing R out everywhere but academia.


R will still have a solid place in industrial pharma for a long time (even as Python grows, R growth will continue). R is pretty tightly intertwined with the process of processing and submitting data from clinical trials to the FDA, much to SAS's chagrin.

Personally I think we have to accept that R, Python, Java, C++, and Go are going to be with us for the rest of our lives. I would expect PHP, Ruby, and Perl to go away much faster. Rust is still in question.


Are you from the future? R is making progress toward replacing SAS in pharma, but has a long way to go. Here is an article from last year [0] patting themselves on the back for making an R submission to the FDA. There are oodles of existing processes where data has to be formatted just so, because that is how SAS does it. Nobody wants to rip up and re-validate that code until they must.

[0] https://www.r-consortium.org/blog/2022/03/16/update-successf...


The trope about SAS being required is pretty old... people trotted that out when I last worked in pharma ~15 years ago. The FDA specifically released an article saying that R is perfectly acceptable for submission https://blog.revolutionanalytics.com/2012/06/fda-r-ok.html

Please don't spread the SAS FUD. I work in Pharma and talk to the statisticians all the time.


I currently work in pharma, and I can point to teams of SAS programmers. There are definitely R efforts, and I have no doubt it is the eventual future, but it is not yet here.


Rails is probably going to keep Ruby relevant for a lot longer than Perl. I might go as far as to predict that Ruby will be the new COBOL or FORTRAN - widely used but neither cheap nor easy to find someone to modify it.


If there is anything in R that is better, it could be ported. It's kind of different from language intrinsics that can't be easily ported. In fact I think there's a great argument for writing a standardized data processing and visual representation layer in C++, and then making it usable from all the languages. It would be nice if this was true for the notebook layer.


It's the domain knowledge that's difficult. Like the example up thread about the lack of generalised additive models. The version in R was written originally by the developers of the method. There's no Python version because no one with sufficient expertise has ported it.

Don't get me wrong, I now mostly write Python because it's a better general-purpose language, but there are big, big tradeoffs.


Matlab has a more interesting dispatch model than python though. (Mathematica has even better dispatch but I haven't used it in almost 20 years at this point) Julia's model is even better, but I just never have time to dig into it deeply enough to get to the zen of Julia. One day when I have time to understand Julia I'd like to, but it's hard to compete with the reliability of python just always working.


What is dispatch in this context? Function call polymorphism?

Aside: I wasn't a big fan of the Mathematica language, so I wrote "PyML" a looooong time ago - it turned unevaluated Python expressions into Mathematica, transmitted them via MathLink to be evaluated by the Mathematica kernel, and then the results were sent back and converted to a Python expression. This was long before sympy. It never went anywhere (I found ways to simply not depend on Mathematica).


Yes, function call polymorphism. Especially combined with Mathematica's piecewise syntax. I didn't really use it for numerics, it was more for Lisp/Prolog type things. I was always impressed at how short and readable the Mathematica code ended up being. I used to use Mathematica quite a bit to generate C++ functions.

But it's a very different use case than Matlab. One of the things I like about python is it isn't too bad for sliding between lispy/functional modes for some things and more lapacky things elsewhere. Matlab has a better syntax for the lapack parts but trying to do anything functional gets really annoying and very, very slow. But still I would say Mathematica has the best pure math syntax for those sorts of things. I don't have any experience with sympy, I should probably check it out.


Mathematica is nothing less than a work of art. I dedicated a fair amount of time to become proficient, but it never solved the use case I had (finding derivatives for complicated functions - this was before autodiff was common).

Unfortunately, Wolfram himself has made a number of decisions that greatly limited its applicability and so it will always be niche.


> and couldn't be ported to Python

“Couldn’t be” is not the same as “has been”. If R has a package that Python doesn’t, I’m not going to port it to Python, I’m just going to use R.


>I haven't seen anything an "A-type" data scientist needs in R/SAS/SPSS that was intrinsic to the language and couldn't be ported to Python.

I don't know what "A-Type" is so I might be misunderstanding you.

For SAS, at least in my org, what keeps it entrenched is Enterprise Guide and the associated environment. Having a drag-and-drop GUI to do the data extraction and manipulation and the analytical tasks makes it very easy for non-experts to be incredibly productive. The people that use it here are engineers (non-software variety) and finance/accountant types. These people would not be productive writing Python code (or R) but still need to use something heavier than Excel for data analysis.

Over the years we have chipped away at parts of SAS with things like Power BI and Azure ML Studio but I don't see python playing in the same space.


Well, why didn't the free alternatives to Matlab like Octave or Scilab win out then?


Same reason people use Excel and not LibreOffice. Until you can promise 100% compatibility, users are going to run into something that breaks vs the original platform. For real business processes, the licensing cost is not worth sweating about the problem for parity.


I think people were not ready to switch from some heavy, established software. Also, by that time, costless was seen as lower quality (certainly because organisations are happy with licences and are always looking for a support contract?). Coming later, when the market(?) was ready, made R more successful.

I think people used to compare them to the established standard, and detractors are always there to point out missing features, giving the impression that Octave or Scilab are behind. Coming with a radically different language, R was condemned to quickly succeed or perish, and in any case avoided direct, frontal comparisons.


But R is itself a reimplementation of S.


Anyone who used Matlab to process text before 2015 wouldn't consider it anywhere close to a competitor to Python.

Same goes for R/SAS.

Python has no competitor in terms of the whole package: it is a full-featured language while being particularly good at data.


This is a good point.

I've done a fair bit of R and there is no doubt that as a pure data science tool it's usually quicker to produce something than Python.

But I prefer Python because inevitably any data science project ends up having other "bits" that aren't pure data science, and Python's super strong general-purpose computing libraries are hard to beat.


Some industries are stuck on matlab (jet propulsion, for example) but in general people from sciences are either Python or R, and SAS/SPSS are only really used in places like health care or pharma where there's a lot of regulation in place, because the software is designed around that regulation, unlike Python and R.


It is also still quite dominant in neuroscientific fields, as there is a lot of software and code there from the past, but it is changing little by little.


>Python doesn't entirely suck as a general-purpose language.

I always get caught off guard by comments like these. In my mind Python doesn't at all suck as a general purpose language. Only real argument is execution speed, but most people aren't actually writing code where Python's "slowness" matters and if you actually need that speed you can write that part in C and still use it from your Python application.
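
You don't even have to write the C yourself - a sketch with ctypes, assuming a Unix-like system where libm can be found:

    import ctypes
    import ctypes.util

    # load the C math library and call its sqrt directly
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.sqrt.restype = ctypes.c_double
    libm.sqrt.argtypes = [ctypes.c_double]
    print(libm.sqrt(2.0))  # 1.4142...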


Matlab costs $$. Matlab was ahead in perf for a long time, not sure now. E.g. they used to JIT code, etc.

Comparing python to R, python feels like a much more mature language. R always seemed only useful for small scale projects, e.g. writing some script to process data with < 20 lines of code. Python codebases just scale much better.


It also has the nice feature of being good at multiple things. If you get a model running in Python, and decide you want to be able to run the code via an API, put it behind Flask and ta-da!, now it's a network service.
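
A minimal sketch of that ta-da (the model function is a made-up stand-in):

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def run_model(x):
        return x * 2  # placeholder for the actual analysis code

    @app.route("/predict", methods=["POST"])
    def predict():
        data = request.get_json()
        return jsonify(result=run_model(data["x"]))

    if __name__ == "__main__":
        app.run()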

It's not the best way to write analysis code, and it's not the best way to write a web service, but it's probably the best way to write analysis code that runs on a web service.

I picked web services as a random example, not one I'm particularly passionate about. But as a general trend, if you want to combine 2 unrelated domains, Python probably has you covered better or more easily than just about any other language.


> If you get a model running in Python, and decide you want to be able to run the code via an API, put it behind Flask and ta-da!, now it's a network service.

You could say the same about a model (or any function) running in R. Want to instantly convert it to a web service? Use the plumber or RestRserve packages for that. Want to make a quick web application around it? Use the shiny package for that.

Just because Python is good at X, doesn't mean other languages are necessarily worse at it.


I would just add that for scientific scripting, Julia is usually nicer to program in than the combo of numpy and Python. Julia is fast enough on its own that you can do the matrix calculations directly, and it supports real mathematical matrix operators. It also has a great type-based design that makes using packages magical.


I think Julia has a hard uphill battle against the "Python is the second best language for anything" effect. Julia looks pretty cool to me, but I already know Python. I'll probably learn more about it in my spare time, but I'm also never going to be able to convince my coworkers to try it out, because they also know Python and that's good enough for them. And thus Python's dominance continues.


I was able to convince my coworkers to stop working on a Python project (a simulator for a scientific instrument) once we all realised that we would not have been able to keep using plain Python because of performance. When the alternatives were (1) to force everybody to learn either C or Fortran, or (2) to rewrite everything in Julia, they asked me to provide examples of both solutions. After I showed them a few different approaches (pybind, cbind, f2py...), there was full consensus for Julia. We moved what we had implemented so far (it was not too much, luckily), and so far nobody has regretted the choice.

The problem with using two languages is that you do not just have to learn a new language besides Python. You also have to figure out how to properly interface them. (f2py was considered the easiest solution, but we feared that the 1- vs 0-based array indexing would likely have caused many bugs.)


Did you evaluate Cython? I'm not anti-Julia, but I like that my Cython code is usable out of the box from Python, with no wrapping, and then users can continue to use their Jupyter + Python scripting workflows with performant bespoke modules complemented by the full Python ecosystem.

Someday I'll do a project in Julia. But for some such projects, Rust seems fully guaranteed to be performant while Julia might or might not be, so I might still lean towards Rust (unless one of the high quality packages of Julia removes a lot of development time, which is a decent possibility).


I used it for another project but was not impressed. When I tested it, the documentation was scarce, and the deployment was harder than Julia's because it needed a working C compiler alongside the CPython interpreter.


Very true, it isn’t usually the best designed language that wins in these cases. The language just needs to be good enough like CPP and JS were. Python is nicer to write than either and with [mojo](https://www.modular.com/mojo) possibly rising in popularity, Julia will lose some of its benefits. The momentum is behind Python which is the ultimate factor.


> which pure "A-type" data scientists tend to prefer: R, SPSS, Matlab

I very much disagree, and I would say it's the opposite. The only competitor for data scientists is R, especially if you're doing stats-heavy analysis (as opposed to ML-heavy analysis). SPSS is in my experience used mainly by people without a data-science background, e.g. in psychology; similarly for Matlab, where the only users I know are engineers.

I would bet that the overwhelming majority of data scientists use Python, even if it's just to set up the data pipeline. It's just the better general programming language, and data scientists are not analysing all the time but have to get the data in shape and available first.


I agree with these as major points. A few other secondary ones

Ruby was primarily maintained in Japanese, so it had a barrier to entry for language-level issues. It also lacked English-language evangelists and a university presence.

When Ruby was new (invented 1995), Python had some older design issues (as it was 6 years older), but it really recovered and implemented a lot of change through Python 2 (2000) and Python 3 (2008). Though there were compatibility issues in the short term, in the long term this change worked out.

Ruby inherited from Perl the TIMTOWTDI (there is more than one way to do it) philosophy, which is a little more at odds with the scientific community.


> Ruby inherited from perl TIMTOWTDI (there is more than one way to do it) philosophy which is a little more at odds with the scientific community

It's at odds with humanity, TBH.

I once spent like two days trying to figure out how a small (couple hundred lines) Ruby script worked, because it overrode `method_missing`.


Wasn’t Python released in 1991? That would make it 4 years old when Ruby was released.


To add to your comment, the web space was also already very crowded, with PHP, C#/ASP, whatever Java people were using, Django, etc. Rails has always been fairly niche compared to giants like PHP.


People like to hate on PHP (and to be honest I really never enjoyed it), but there was a time when if you wanted to build powerful, large scale websites (or hosting, or scalable web farm deployments) it was the most reliable and performant thing short of Java.

It also helped that the PHP ecosystem had some pretty solid and battle tested HTTP components and pretty productive frameworks--server-side rendering with Varnish was as fast as a CDN-backed site feels today.


The thing PHP had that made it win (for a while) was that it was easy to set up and configure for a hosting environment (Apache plus mod_php worked out of the box consistently), and the HTML-first design made it easy to add to static web pages, whereas forms and CGI were multiple steps and more confusing for less experienced devs.


I'd qualify that. A lot of the hype around PHP being easy for cheap web hosting misses the point that a lot of that cheap hosting was configured with PHP as a CGI module, not mod_php. In that sense it was on a level playing field with Perl.


Agreed. The one thing PHP demonstrably did better than other langs (other than be a simpler Perl) was be web-first.

Other langs assume you could be doing anything, and here's the extra steps necessary to make it serve pages. PHP assumes the reverse.


> Python ended up 'specializing' in data contexts, thanks to Numpy / Pandas

Yeah, it was a success 20 years in the making. It had a fair amount of input from science folks, and that being GvR's background, he responded positively - e.g. the extended slices (striding) and the ability to index with an unparenthesised tuple (multidimensional indexing) were very much added for the convenience of numpy's ancestors.
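
Both are easy to see in action:

    import numpy as np

    a = np.arange(12).reshape(3, 4)
    a[::2]       # extended slice with a stride: rows 0 and 2
    a[1, 2]      # unparenthesised tuple index: row 1, column 2 -> 6
    a[:2, ::2]   # both at once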

I would not say it had no competition, Perl used to be quite popular in at least some sci-comp fields (I want to say bioinformatics but I’m probably wrong).


Why probably wrong? Bioperl was very big in the bioinformatics world.


> Why probably wrong?

Because I don’t have a great memory and I was never into Perl so I was extremely unsure of the suggestion and didn’t want it to come across as an assertion.


James Tisdall and O'Reilly produced "Beginning Perl for Bioinformatics" which sold like hotcakes at Perl conferences.


This is part of it, but the reasons go back further. As others have mentioned, Python is a pretty old language. It debuted as "Python" in 1991 even before Linux was out, and it existed as ABC even in the late 80s. It had a very good, easy to use C API early on. Combining that with the way it can load shared object files as modules and the ability to create user-defined operators and effectively extend the default syntax allowed it to mimic both MATLAB with SciPy and Numpy and R's data.frame with pandas, which is a huge part of why it attracted all the people from scientific computing and data analytics. The API to the major packages was nearly identical to what they were already familiar with from before. It's not an array language on its own, but the ability redefine operators and the fact that array indexing is itself an operator made it possible to turn it into an array language. The fact that all of these packages can easily just wrap existing BLAS and LAPACK implementations made it very fast for these purposes, too.
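
A toy illustration of that last point - indexing and arithmetic are just methods a class can define:

    class Grid:
        def __init__(self, rows):
            self.rows = rows

        def __getitem__(self, key):
            i, j = key  # a[i, j] arrives here as the tuple (i, j)
            return self.rows[i][j]

        def __add__(self, other):
            return Grid([[x + y for x, y in zip(r1, r2)]
                         for r1, r2 in zip(self.rows, other.rows)])

    g = Grid([[1, 2], [3, 4]])
    print(g[1, 0])        # 3
    print((g + g)[0, 1])  # 4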

That also happened pretty early. Numpy came out in the 90s. But even before that, Python and Perl were about the only two scripting languages other than the shell that were guaranteed to be present on a GNU/Linux system from the start. That made it really popular to create high-level system utilities in Python. A whole lot of the util-linux packages, libvirt, and the apt packaging system make heavy use of Python. So it's not just the academics already familiar with it, but system administrators and hackers, too.

It also gained widespread popularity as a teaching language alternative to Java. Once MIT started using it in their intro to programming course and put that on the Internet via OCW, it really took off as many people's first exposure to programming.

The batteries included approach to the standard library makes it very usable for one-off automation tasks, too. I don't even like Python that much, but the other day I was doing some math work and just needed to compute a bunch of binomials and enumerate subset combinations. You want to look up how to do that in your language of choice? Python just has math.factorial and itertools.combinations in the standard library. If you're using Linux or Mac, you already have it. It may not be a great choice for application development, but if you need to do some quick, interactive tasks that are never going to get deployed or even necessarily stored on disk as files but just run once from the repl, and it's too much for your calculator app or Excel to handle, Python is perfect.
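
For instance:

    import itertools
    import math

    print(math.factorial(10))  # 3628800
    n, k = 5, 2
    print(math.factorial(n) // (math.factorial(k) * math.factorial(n - k)))  # C(5, 2) = 10
    print(list(itertools.combinations("abc", 2)))  # [('a', 'b'), ('a', 'c'), ('b', 'c')]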


Not only math/academia, but nearly everywhere. As Python started coming pre-installed on macOS and was a lot easier than fumbling with shell scripts, a lot of scripts were being written to automate boring stuff. Also, the hidden beasts are Django and Flask, which are commonly used by both serious and non-serious folks, further increasing the user base.

Also, Python is less verbose (compared to Java, C#, Scala, JavaScript pre-ES6, etc.) in terms of syntax, so it is much easier to learn and can do both OOP and functional, while being easier to adopt (no compiler, easy installation, lots of popular libraries); hence academia picked it up. Also, Python has built-in parallel-processing features which are less daunting (in terms of concepts and syntax) than other competitors' (C++/Java etc.).

See the popularity of Go; the same thing happened with Python.


I think you're describing a symptom (popularity in data contexts) rather than a cause. If you're looking for causes, https://jeffknupp.com/blog/2017/09/15/python-is-the-fastest-... makes a good case that the answer is PEP 3118.


There is a lot more history before the data part. But yes, it is a big "modern" driver for its popularity, and Python owes it to its easy C interop.

I would contest the Ruby as web dev part, though. I used Ruby for Puppet and Vagrant a long time ago and it was great, but all my experiences with Rails (and associated gems) turned into maintenance nightmares.


As a Python programmer in the 2.x days, I think data science happened fortuitously to counteract the exodus over the 2->3 transition. For a long time, it felt to me like python was on the way out because of the changes.


At the same time, a lot of undergrad programmers were being introduced to Python 3. I think that's probably the single biggest factor in Python's success.


It’s not because of data science, it’s popular for the same reason Java is popular, it’s what they teach in schools. It’s a virtuous cycle, industry adoption grows school adoption which grows industry adoption…


Being taught in schools is helpful, but not sufficient on its own, or we'd see more professional Forth, Prolog and Scheme programmers


future headlines:

Scratch for Cancer Diagnosis.

NASA adopts no-code GenAI interface for critical navigation and telemetry on first human flight to Mars.


> In that space, it had no competitors.

Well, there's R.

But basically, yes that's the long and short of it.


Or Julia, a language custom designed for data science.

Also, Matlab was the go-to before Python took over the scene. So it definitely has competitors.

Python's strength is that it is the "jack of all trades" language. It's not the best at anything, but it's pretty good at everything.


Julia is much younger.

Matlab is closed source. I haven’t touched it since I had a uni license.


Today I learnt I'm fucking old. My first web programming exposure was Perl CGI scripts



This fine day, we are all old together.


Me too.. learned it in college.. use CGI;

It was fun.


#metoo, 1993.


That's a fascinating but wrong way to look at history. Python never specialized; it always played strong in many popular fields.

From the early days it had a strong leaning toward science; after all, that's where it came from. But it also had a strong presence in Unix circles, establishing itself as a good tool for sysadmins and competing with Perl and bash. It also had strong bindings to other languages, which early on earned it a reputation as a glue language. And it dabbled early in web stacks and networking, though it had too many web-stack options for any one of them to gain the "one framework" fame of Ruby on Rails.

Python was simply the language that allowed you to play on all the important fields, with simplicity and even some level of speed. Other languages lacked this: they were either complicated, slow, or missing support for specific areas. Python somehow acquired all of these strengths and made a big impact across a broad area, which made it so popular, because for the majority of use cases it was a natural option.


Good answer.

Python is, imho, not a very good language. But it has a few great libraries which led to overwhelming momentum and inertia. In a cruel twist of irony those libraries are probably written in C.


I'll argue that python is very alive in the system management/scripting space.

Because of the library support (especially for data) it's also popular for API servers. It's not as popular for entire applications but is still very active.

EDIT: Django is actually about equally or more popular than Node.js based on Google Trends. Not sure that means anything, as idk how many people google "nodejs" when working on it.


I think HTMX has given a new lease of life to old school, server-side frameworks like Django and Flask.


> Python ended up 'specializing' in data contexts

There's a big sub-continent of data stuff in the Python ecosystem, but Python the language hasn't specialized there I'd argue - it has enabled the 3rd party number crunching and data libraries like NumPy and Pandas, but not specialized over other domains where Python is used.


But it's still slower and more energy-consuming than Javascript or Java unless you use it as a wrapper for C/C++ libraries:

https://stratoflow.com/efficient-and-environment-friendly-pr...


It is easy to underestimate how much Python as a language benefited from numpy, scipy, pandas and scikit-learn.


Python doesn’t compare to R for analytical work — it doesn’t have the ecosystem, and R is specialised for it.

Where Python broke in was machine learning which is not analytical work and often involves lots of general purpose programming. sklearn, skimage, opencv, Tensorflow, Torch, JAX, and so on are often used in a general code base. Torch was actually a C++/Lua framework initially before switching to C++/Python.

Python has a dominant general-purpose ecosystem. It’s also a simple language to learn compared to R/Matlab/etc., which are just horrible to use for data structures or paradigms other than vector/matrix-based pipelines.


I think that data science & the web competition are at least 80% of it, but I'm going to throw out a few possible ideas I haven't seen mentioned that may have helped. Maybe I'm way off base though.

* Udacity's popularity. Its first few free courses were in Python. I believe they advertised it as a way to get into Google. Not sure how many people got into Google via Udacity though.

* Leet Code Job Interviews. Maybe I'm way off here as I don't do these but from what I've read people who spend a lot of time on these prefer Python because the syntax & libraries often give a speed & simplicity advantage.

* Lots of devs had to install Python to use some utility or tool someone else created.


Good points.


Maybe it all came down to the decision to fuse Python's Numeric and Numarray libraries into Numpy, providing a unified "fast" Matlab-like library, which ended up being a great fit for a lot of data science use cases.


Might be wrong, but I think it started with Jupyter/Matplotlib + Python's permissive license. I heard it all started with a physics professor unable to use Matlab for teaching.


Python also had a near-death experience with the Python 2 to Python 3 transition; it could have wound up like Perl did with Perl 6, but I think the data science use case really carried it through.

Overall I think Python is the best language for the non-professional programmer who wants to script things and put their skills on wheels.


Python specialized in ease of learning and running the code. Everything else came after as a byproduct of the large user base.

One reason why I switched our team from Matlab to Python was the capabilities of the core libraries. Argparse alone sold it for us.
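
As a sketch of what sold us (the flags here are made up for illustration, but the API is the real stdlib argparse): a complete command-line interface, with --help output and argument errors for free, in a handful of lines.

    import argparse

    parser = argparse.ArgumentParser(description="Run one analysis pass.")
    parser.add_argument("path", help="input data file")
    parser.add_argument("--verbose", action="store_true", help="chatty output")
    args = parser.parse_args()

    if args.verbose:
        print(f"reading {args.path}")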


And this was all despite Python 3, which was, imo, the worst software migration in history.


The worst one that succeeded.


Bingo! Then you have junior devs with 2-3 years of experience join your company because they know React, and they give you a completely blank stare when you ask them how they are serving their data.


It actually does have a competitor, R. R has a superior ecosystem for quite a few data tasks. Mainly in the analysis/statistics space.

However, aside from that, R is vastly inferior.


That was Matlab. Python was the substitute


Ruby is also kind of horrible.


Numpy and Pandas existed as they did because Python is written in C and cooperates so well with C, so I believe that would be the better answer.


The way I (somewhat hazily) remember it, Python was mainly competing with Perl as a scripting language, not Ruby as an application/web language. I started hearing about Ruby years later when Ruby on Rails drove the Web 2.0 movement, but that feels like a different era to me. Eric S. Raymond wrote an article[1] in 2000 about his experience trying Python after spending a lot of time with Perl. I'll quote some of it here:

> When you're writing working code nearly as fast as you can type and your misstep rate is near zero, it generally means you've achieved mastery of the language. But that didn't make sense, because it was still day one and I was regularly pausing to look up new language and library features!

> This was my first clue that, in Python, I was actually dealing with an exceptionally good design. Most languages have so much friction and awkwardness built into their design that you learn most of their feature set long before your misstep rate drops anywhere near zero. Python was the first general-purpose language I'd ever used that reversed this process.

[1] https://www.linuxjournal.com/article/3882


I think a good part of the issue is that right around then, Perl decided to do Perl 6.

People decided to wait on doing anything big, since Perl 6 wouldn't be quite compatible. Later it turned out to be extremely incompatible. And Perl 5 would be dead long term, so why spend time on it, when you'd have to do a radical rewrite soon?

Meanwhile, Python was there.

And Perl 6/Raku took 15 years to finally arrive in some form, by which point Python had completely eaten its lunch.


The Perl 5 to Perl 6 transition took so long that Python had its own internal crisis with the Python 2 -> Python 3 transition inside of it (and handled it far better).


Plus Perl6/Raku is still dog slow compared with Ruby/Python/Perl5 by a factor of 10 when parsing a log file with a regex.


This is an accurate explanation.

I started to teach myself Python just before 2000.

The general consensus before then was that for a first language Perl or Python would be a good choice. Python was typically preferred because it was more approachable than Perl, i.e. more succinct.

With no prior programming experience (besides manually typing out programs into my C64 as a kid), and just by referring to the standard documentation, I managed to build a program that connected to my webpage hosted by my ISP and changed a link in the page to point to a web server I was running on my PC.

That's one point that doesn't get mentioned much, I think. The standard documentation was clear and accessible in a variety of formats. Also, Python on Windows was very easy to set up.


> it was more approachable than Perl, i.e. more succinct.

My dim memory of that time is that Python code was often longer than Perl, but that was OK because its selling-point was that it didn't have so many syntax edge-cases and it was easier to read. More things were actual words-for-humans as opposed to special symbols one had to memorize and apply according to special rules.


You are probably right. "Succinct" implies it was short and clear. Maybe a better word choice would have been "clear" or "readable".

I remember Perl's creator at the time saying that he wanted a programming language that was more like a natural language, i.e. the programmer could express the same idea in a multitude of ways. Maybe this approach resulted in code that was easier to write than to read, since understanding natural language relies heavily on context and convention.


This comment was very helpful.

Part of my question is why Python was used as a successor to Perl and not Ruby, but it sounds like Python was established as a competitive scripting language 5 years before people were talking about Ruby.

I can’t find it now, but there was another comment that pointed out that the languages weren’t contemporaries.


That's a nice way to put it, I get the same feeling with Python. I pick it up now and then and I always seem to get it to do exactly what I need real fast. It just works, has a library for what you need, and its usage is clearly explained in English in the docs. Compare that to Go for example, the other language I know, and it's not even close.


> When you're writing working code nearly as fast as you can type and your misstep rate is near zero, it generally means you've achieved mastery of the language. But that didn't make sense, because it was still day one and I was regularly pausing to look up new language and library features!

This really jibes with my experience. I learned Ruby first and used it as my scripting language of choice for years, but I constantly found myself having to look up how to do even the most basic tasks.

A day or two after learning Python I was already as productive as I had been with Ruby, it was wild.

This isn't so much a jab at Ruby, just that Python worked the way I thought a programming language should, which made everything easy.


It goes to show how your upbringing affects these things. Before Ruby I spent a lot of time in Smalltalk which really shaped the way I thought about language and program structure, APIs, etc. I'd also spent a metric ton of time writing Perl. Jumping into Ruby was like putting on a comfortable pair of shoes.

Meanwhile Python always felt like an ill-fitting suit to me. I just personally didn't connect with the language design.

But, to each their own!


I think it's kinda nonsense. No programming language is intuitive. It's arcane by nature. Having slightly more concise syntax saves you maybe a week of learning time. I used to write Perl about as fast as I could think because I had a lot of practice with it. I mostly do Python now, but do a lot less coding in general and I have to google syntax constantly. Python is less noisy, but it's not really more intuitive for me.

Frankly I think language design is secondary to IDE integration. The only time I've ever coded without thinking at all was using Java with IntelliJ or C# with Visual Studio. Type-matching and library introspection all work flawlessly 99% of the time. Not only do I not need to know the details of syntax, but I didn't even need to know APIs because I could autocomplete everything all the time. Python with VS Code still feels stone age by comparison.


Your point about statically typed languages rings so true to me. I learned Object Oriented programming on Java, then did professional C# with ASP.NET.

I changed companies and learned Python because everyone was raving about it on the Web. I was so confused when I started writing it; I thought I had a misconfigured environment or that I was missing some libraries.


> I used to write Perl about as fast as I could think because I had a lot of practice with it. I mostly do Python now, but do a lot less coding in general and I have to google syntax constantly. Python is less noisy, but it's not really more intuitive for me.

Well, it isn't intuitive now. It was a lot better than Perl back in 2002 or so, when it really started taking off.

Now, a lot of serious Python code is simply incomprehensible.


That's the time frame I'm talking about. Everyone got bent out of shape over Perl's sigils and built-in variables, but if you memorized the 20 most useful ones they save an enormous amount of time. If you name your subroutines appropriately and keep them short, code will be readable no matter what. Modern node is far worse than Perl ever was:

const foo = async () => {}


this resonates with me.. I fell in love with python because it made it really easy to write 100-to-1000-line scripts that I could whip out in no time and make something happen. I still love it for that. But I fell out of love with it working on a large codebase -- mostly someone else's code -- where I felt imprisoned by not knowing the types of most function parameters, what I could do with those types, and what code I could change without breaking other code. The lack of ability to navigate in the way that I can navigate C# or Java killed python for me.

I still have to use python though. I make an effort to type-annotate everything I see, and python's type annotation features keep improving, but it still often feels like an awkward uphill battle.
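
For what it's worth, the kind of annotation I mean is just this (a made-up example; the list[float] spelling needs Python 3.9+):

    # hypothetical function: the annotations document what callers may pass
    def apply_discount(prices: list[float], rate: float = 0.1) -> list[float]:
        return [p * (1 - rate) for p in prices]

    apply_discount([100.0, 250.0])   # [90.0, 225.0]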


Yes, Java has very good IDE integration, but for other languages, being easy to write without an IDE was important back then. A lot of people were using text editors.

The situation has changed now that having at least somewhat reasonable IDE integration is common. AI tools can also give you pretty good autocomplete.


I am endlessly baffled by developers' preference for lightweight text editors because they never bothered to learn how to use a debugger. The last 5+ years of my career have been mostly Node and Python, and the state of their debuggers is abysmal and even stack traces are always useless. Want to impress me with your programming language? Don't show me code, show me a stack trace when it fails.


The quotation is so true. There was a "Python WTFs" list posted recently, which was fun, but the thing is you could work with Python successfully for years without ever running into them. Contrast that with other languages where you need to learn the WTFs before you can stop writing buggy code.


That list of Python WTFs was mostly stuff where I was going WTF at the code before even looking at what the result was. Stuff like having a try/except with a return in both the except block and the finally block.


Good memories. I still have that issue in the attic somewhere.


Ruby was never similar in popularity to Python. It was just a fad in webdev circles. I rarely saw Ruby before RoR, and I saw Python all over.

Python was popular as a teaching language, which is how it gained a foothold in academia.

I remember 20 years ago when I was starting out it was popular for game scripting (Lua was more embeddable, but Python was also used), with plenty of first-class ways to interface with native code (I still remember C++'s Boost shipping a Python binding generator).

Many tools shipped python as an embedded scripting language.

Python was used for creating build systems.

Not to mention any distro out there ships Python; it replaced Perl in that regard. Batteries included and being available everywhere helped it become the norm for random scripting.

Python was big before numpy/ML.


I'll bite on "just a fad in webdev circles", in case you are serious and not trolling. Railsworld is celebrating the 20th anniversary of Rails this year. As I'm sure you know, Rails is built on Ruby. 20 years does not fit my definition of fad.


I see what you mean, but I counter with Plone. It's been around for 24 years, and the annual Plone Conference is coming up in a couple months. Still, you don't see a whole lot of new Plone sites rolling out these days, and there's no Automattic-scale company with click-here-to-deploy convenience.

Nothing against Plone (although I was very happy to put its foundation layer, Zope, in my rear view mirror). It's a fine program that's very good at what it does. But just because something's been around a while doesn't mean that it's still vibrant and growing.


Plone never had anything even resembling Rails' popularity.


So true. Still, you can't judge that from how long it's been around.


That's why RoR was a fad and Plone isn't.

Check out the charts from Google Trends some time.


Just because there are people who still use it doesn't make the statement that RoR was a fad less true. RoR was "the thing that everyone was using" at one point. It was that for several years. Anyone who's anyone was building stuff with Rails.

Now it's probably the third option when building websites... and a fourth option when building APIs.


fad - an intense and widely shared enthusiasm for something, especially one that is short-lived and without basis in the object's qualities; a craze.

---

I don't think Ruby is a fad. The drop-off Ruby has had since the early 2010s is dramatic, but it has stabilized around 5% of all PRs on GitHub in the last few years:

https://madnight.github.io/githut/#/pull_requests/2023/2

It's still one of the most popular languages for web development.


> It's still one of the most popular languages for web development.

But is it? Stats on major job boards such as Indeed show a steep decline in recent years. There are consistently twice as many Django/Flask roles listed compared with Rails. Node is the most popular but Spring, ASP.Net and PHP occupy the next level below Node.


Thanks for linking that graph. I wonder to what extent Ruby’s decline as a percentage is due to its use declining versus other languages growing.


>RoR was "the thing that everyone was using" at one point. It was that for several years. Anyone who's anyone was building stuff with Rails.

I very much doubt that. Maybe in your geographical area, with small early-stage startups.

At any given time, Spring has probably seen 10x the usage of RoR.


My point was more that Ruby exploded because of a period when Rails was growing super fast. I don't know what the growth in the RoR space is, but I'm certain it's nowhere near what it was in the late 2000s/early 2010s. And outside of that there's really not much in the Ruby space.

Python was all over the place when RoR came out, meanwhile I only remember hearing about Ruby on random forum posts or by some enthusiast before RoR.


I don't think moonchrome was disparaging Rails or minimizing its popularity, but rather pointing out that Ruby is only as popular as it is because of webdev, because of Rails (and the Rails-likes that came after)—outside of that specific niche, it's not very commonly used just about anywhere else, especially when compared to Python.


Chef and Puppet are fairly big ones though. Other than config management, I think this is true. Rails is king in Ruby land.


Homebrew, the macOS package manager, and CocoaPods, the Swift and Objective-C dependency manager, are both written in Ruby.


Mh. Honestly, I've encountered Ruby in about 3 primary ways: RoR and Sinatra, which are kinda the same webdev thing, and Chef for config management. There are additional things, but those are more supporting tools - Rake and Capistrano as deployment systems, Berkshelf and such alongside them.

Outside of this, it's been Perl and Python, with Python replacing Perl after that whole Perl 6 fiasco, once the language and the distros started figuring out Python packaging.


10-ish years ago, pretty much all the hip new HN startups plus the existing success stories started on Ruby on Rails, or it feels like they did anyway.


I think they meant back before either Ruby or Python hit their stride, not afterwards.


You're forgetting the period of Ruby's popularity in the devops world, especially Chef and Puppet.


Puppet was pretty awesome. We used it to maintain and provision nearly every kind of software across a few thousand machines, even though it had a worrying tendency to remove user accounts (we once got locked out of the master...)


Puppet was just built with Ruby (and later rewritten, I think?); in all my years interacting with it I never actually wrote any Ruby.

Chef you obviously had to, but my impression is that the DSL is so extensive you might as well be writing something that's not Ruby.

Maybe this says something about the power of Ruby, but I don't feel like these tools contributed to the language's popularity or ecosystem - definitely not the way RoR did.


> my impression is that the DSL is so extensive you might as well be writing something that's not Ruby.

This is just how Ruby is. It's designed around the idea of building DSLs, so writing in Ruby is almost always an exercise in writing in one DSL or another.


I spent a year and a half writing Chef at my first "programming" job, like 6 years ago. I learned almost zero Ruby, and that's probably why Ruby didn't stick.


Yeah, I suspect this might be one reason why Ruby hasn't gained as much traction as others: knowledge doesn't translate well between frameworks, so there's little advantage to re-using the same language for a new project (unless it's a new project in the framework you're already comfortable with).


I think that's just building on RoR hype in that same field. Puppet even moved away from Ruby AFAIK.


I beg to differ. Most sysadmins I knew who used Ruby extensively had never touched Rails. There's no real connection between the two phenomena other than the language.


> Many tools shipped python as an embedded scripting language.

I had dabbled with Python before encountering it (in its embedded form), but once I started using it inside ESRI's ArcGIS software, it became my go to language for many, many tasks.


When I first looked into Ruby, pre-RoR, but only just, the stdlib docs (in English) were a gaping void compared to the Python docs. Would have been around early Python 2 days, 2.3 I think.


I first bumped into Python because Mailman was written in it. It was large, slow and a bit funny. I saw it as another odd academic language that had spilled out of a university somewhere.

Then in about 2005 I heard about zope, and how easy and wonderful it was compared to php. But it was slow and memory heavy.

In about 2008 more and more stuff started getting python bindings, specifically in VFX (where I was working). There was a tussle between javascript and python, and python won out, even displacing MEL and other domain specific languages.

Ruby was always more obscure to me; it was for people that liked C and wanted a more perl-y version of it, but without the anarchy or the universality of CPAN.

I think the biggest reason was that it had loads of stuff in the standard lib, way more than anything else at the time. Much more than perl, so it meant that things were a bit more "portable". So people came for the stdlib, then started adding things to it because they'd spent all that time learning it.


There's a joke that goes roughly "Python is not the best language for any particular task, but is nearly always the second best language for that task" - it's a jack-of-all-trades language.

Because of this, lots of people learned Python, and then applied it in many different areas, and it just became more prevalent and useful.


I think that's part of the answer. Python has multiple independent communities of use: web, data science, ML, devops. Most languages only have one or two ecosystems of use.

I think the other part of Python's success compared to JS is a better story about C integration. Python itself is just a series of macros to make C nicer to use, which meant the existing C/C++ ecosystem was relatively easy to integrate. And then the SciPy world also brought in Fortran. JS is huge on the backend web, but it hasn't spread into other ecosystems nearly as much because the C integration story isn't as good.


> Python is not the best language for any particular task

Python basically solved this problem by wrapping C code that's far more performant, while staying exceptionally simple to leverage thanks to Python's simple syntax. LlamaCpp has C++ in its name, yet its most popular platform is Python. So for certain applications, Python became the undisputed #1. Because it was C in disguise, with better usability.

With parallelization, Python having 100x slower loops became an old problem. 2023's coding paradigm is "if you're using loops, you're doing it wrong." People love complaining about pandas, but Dask solved all the single-core problems in 2021.
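
To make the "no loops" point concrete, a minimal sketch of the difference (assuming numpy is installed; the million-element array is a made-up example):

    import numpy as np

    xs = np.arange(1_000_000)

    # the "doing it wrong" version: a pure-Python loop
    total = 0
    for x in xs:
        total += x * x

    # the vectorized version: one expression, the loop runs in C
    total = int((xs * xs).sum())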

> https://twitter.com/iammemeloper/status/1692901551558320254

Amen


> Python basically solved this problem by wrapping around C code

I would rather call it a work-around. Having to switch to a vastly different language to get halfway decent performance is hardly a good solution.

It's not a rare thing in normal applications that you need some routine to be fast and there's no pre-built C solution for it.


Maybe so, but it's not like there are a lot of languages around that are as expressive, flexible and easy to learn as python, but as fast as C.

JS is probably the closest, but not because of superior language design -- people with money wanted it to be faster, so it is.

Also, the choice is not just Python or writing a module in C -- there are numba and cython, various Python compilers, pypy and cffi.


> Maybe so, but it's not like there are a lot of languages around that are as expressive, flexible and easy to learn as python, but as fast as C.

There are several languages on the JVM which are that or better. As fast as C, no, but JVM is considered fast enough for a lot of applications where performance matters.

> JS is probably the closest, but not because of superior language design

There's this myth that Python as a language is superior to JS. In reality both have their long history and warts. JS/V8 is just much faster.

For me personally, JS is a much more ergonomic language than Python.


>There are several languages on the JVM which are that or better. As fast as C, no...

Like what? None of the ones I know of meet the "expressive, flexible and easy to learn as python" test.

> There's this myth that Python as a language is superior to JS. In reality both have their long history and warts. JS/V8 is just much faster.

I never said anything about being better or worse, just that as far as I know there's nothing about JS which makes it much easier to optimise than Python.

I'm not arguing that python is the best thing to use for everything, or that there are no faster languages; that clearly isn't the case.

The topic is "why is python popular", you're arguing that it's too slow and it's too much effort to speed up code, while I'm arguing that there are no languages which are faster, and don't give up some of the things that made python popular in the first place.


> Like what? None of the ones i know of meet the "expressive, flexible and easy to learn as python" test.

Kotlin is IMHO the best of the bunch, but Groovy might be the most similar to Python.

> I'm arguing that there are no languages which are faster, and don't give up some of the things that made python popular in the first place.

Python has some advantages for scripting - it's almost ubiquitous on Linux distros. But for application development, I don't think it's better than other platforms.


>> I'm not arguing that python is the best thing to use for everything

> I don't think it's better than other platforms

I think we're done here.


It is a great solution. If you write Python, nobody will mistake it for fast code; it stays where it belongs.

JavaScript for example is much worse, because people are convinced it can be fast for some reason, and keep trying to write nested loops in it.


These mental excuses are pretty funny :-D

One problem with this is that you expect a perfect dichotomy where you can isolate the code which can be slow from the code which must be fast. But for a lot of code it's not easy to say in advance (premature optimization plays a role too) and/or it's somewhere in the middle.

We have a big Java application with hundreds of thousands lines of code. Its performance is so-so. But if you look at the profiler, there isn't a clearly responsible module/method (all obviously slow methods have been already optimized). Now it's a death by a thousand cuts. Making it in Python would make it unacceptably slow with no obvious places which could be converted into C. Writing the whole thing in C would make it a much more complex endeavor.


It is not hard to write C. It is hard to write large safe multi-developer C programs with complex object lifetimes. But if you are just writing an inner loop in a CPython extension it is easy.


> Python is not the best language for any particular task, but is nearly always the second best language for that task

Pointing out any problem where speed is the difference between working and unusable—e.g. ray tracing—is probably too cheap a shot, as it’s not like Ruby or Perl are any better in that regard, and LuaJIT came too late to be relevant to the current question.

One range of problems where I’ve found Python (unextended by a code generator) to be surprisingly awful, though, is everywhere the ML style of algebraic datatypes + pattern matching is natural: compilers, basically, all the way down to a regex engine. There’s just no comparison.

Maybe things have changed now that `match` exists? I’ve not yet had the time to try, even if the double indentation doesn’t make me hopeful.
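
For reference, the ML-ish style that match (Python 3.10+) allows looks like this; the tagged-tuple expression type is a made-up toy:

    def evaluate(expr):
        match expr:
            case ("lit", value):
                return value
            case ("neg", inner):
                return -evaluate(inner)
            case ("add", left, right):
                return evaluate(left) + evaluate(right)
            case _:
                raise ValueError(f"bad expression: {expr!r}")

    evaluate(("add", ("lit", 2), ("neg", ("lit", 5))))   # -3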


Definitely a jack of all trades language.

And when you get older, and you care more about solving problems than about trying new things, and you've got more responsibilities in life and lack the time to devote to learning the new language/technology du jour, then knowing and using Python becomes so handy.

If it is indeed the 2nd best language for the job, it is still a decent choice if you get to solve the problem.


yeah, it's good enough, especially to start poking around

many accounts of people dropping other languages with better perf or semantics to use python as a prototyping++ tool

it was a trend in many dynlangs (php and others) where you'd write the core in it and drop to C for hot loops

python is better than php: it has some metaprogramming (hello CLOS) to mold it syntactically, a good-enough stdlib, and few enough semantic warts ..

and then numpy/pandas for the non-sweng crowds


I have always thought it was a good prototyping language. You can get something working fast then rewrite it in a better language later.


> Python is not the best language for any particular task, but is nearly always the second best language for that task

The "second best language" is false, it's just usable for a wide variety of tasks. Just like many other languages, but specifically not Ruby.


I was there when it happened, I remember it well. Python displaced Perl, when Perl was a dominant scripting language, way back in the 1990s. What motivated people to replace Perl with Python is that we started using scripting languages for serious software with non-trivial complexity. In this context, Python was much more scalable and maintainable language than Perl, it was just an easier scripting language for software engineering purposes. I used to write apps in Perl and moved to Python for the same reason everyone else did.

At the time this was happening, neither Ruby nor JavaScript were credible alternatives, Python won by default. The only other scripting language I can remember being used in similar contexts at the time was Tcl.

The other half of it is that Python had a great C binding story, which made it easy to integrate Python with almost everything since C ABI was the lingua franca of interoperability. You could wrap high-performance code in a thin layer that would allow you to efficiently script it in Python. This is why Python became ubiquitous in HPC/supercomputing decades ago. It allowed Python to have a vast library of functionality with good performance and minimal effort by leaning on the vast number of C libraries that already existed to implement that functionality.
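
As a sketch of how thin that layer can be: with nothing but the stdlib's ctypes you can call straight into a shared C library (the library lookup below assumes a Unix-like system):

    import ctypes, ctypes.util

    # load the system math library and declare one function's C signature
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.cos.argtypes = [ctypes.c_double]
    libm.cos.restype = ctypes.c_double

    libm.cos(0.0)   # 1.0, computed by the C library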

Another underappreciated advantage of Python, a bit like C++, is that it is a promiscuous language. It isn't so in love with its purity that it won't happily do slightly dodgy things to hook up with things that were never intended to work together. It may not be pretty, but you can actually make it work, and the language doesn't try to prevent it.

As time went on, a school of thought emerged that you could efficiently do almost anything you needed to do in software engineering with a combination of Python and C++. There is definitely an element of truth to that, and the ease with which you can combine those languages made them common bedfellows in practice.

Python initially won because it was better for writing complex software than Perl, but its staying power was based on its easy integration with literally everything.


> Python was much more scalable and maintainable language than Perl,

This is a really excellent point. We were a Ruby/Golang shop and out of nowhere one of the sales engineers started writing scripts in Perl. He wrote some really useful stuff, but the downside was that it was incomprehensible gobbledygook baked into 4 or 5 lines, and completely unmaintainable by anyone other than the original author. After he left we decided to rewrite all his code in Python/Ruby based on the general idea of what we wanted the inputs and outputs to be, rather than assign anyone to maintain the actual Perl code.

Compare that to the 8-year-old analytics/data science codebase I had to unwind a couple of years back: it was written entirely in Python, was easily understandable from the first line, and had been maintained by no fewer than six people over that time.


Perl being write-only isn't a meme, it's actual reality. It is _possible_ to write readable and elegant Perl, but it requires huge amounts of skill and self-control not to mush everything into essentially a 3-line regex that happens to work =)

And for Python there is the 13th aphorism of the Zen of Python: "There should be one-- and preferably only one --obvious way to do it."

The language is built around pushing you to do things a certain way, it doesn't force you, but the correct way is usually the easiest and cleanest way.


There was a brief period of schools teaching Java compiling down to CLI programs in Unix (beans?). Perl was popular in the semiconductor world, replacing Tcl for gluing together complex manufacturing flows. Ruby was heading to be a replacement for Perl, with bidirectional file handles and similar syntax. Shell scripts and Perl are still in use there. Python never made inroads because it brought nothing new and was less suited to file munging than Perl. Perl still has superior text manipulation semantics compared to Python.


Text manipulation was Perl’s raison d’être; it was exceptional at it. The real legacy of Perl is how pervasive its text manipulation concepts have become in other software. That was what made it work so well in the early days of the Web: everything Web was text-based.

Perl would probably still have a place at the table if Perl 6 hadn’t turned into a debacle over an astonishing number of years.


What concepts does a language force a new user to be aware of, and how reliable are a user's first intuitions about those concepts?

I would argue that Python dominates Ruby in this metric.

New users wonder how to call functions. They form an intuition ("use parentheses"), but it's unreliable. "Oh, parentheses are optional--oh, parentheses are only optional sometimes".

New users wonder what a function is exactly. They form an intuition, but their initial intuition doesn't encompass all 7 different types of callables in Ruby, so many surprises and frustrations await.

Python is much more boring in this respect; users are more likely to form accurate intuitions.
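
The Python rule fits in a trivial sketch:

    def greet():
        return "hi"

    f = greet    # no parentheses: just the function object, nothing runs
    f()          # parentheses: the call happens, returns "hi"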

What follows is my subjective experience as I came to like Python but hate Ruby: I learned Python in the mid 2000's while trying to script Asterisk. Asterisk had an existing community around PHP and Ruby, so I looked at Python, PHP, and Ruby with fresh eyes, Python appealed to me most. I remember being very confused about what Ruby "gems" were, I thought they were something like "JavaEE" that I'd heard about in passing at school, some kind of compiler plugins or something complicated. Python had "libraries" though, I knew what they were. I didn't seek out learning materials for the languages, I only saw how people were talking about them in the Asterisk community. I'm sure the right material would have explained what a "gem" was very well, but those obscure mailing lists I was reading did not. I never did give Ruby a fair shake, so don't take this as advice, these were merely my experiences as a new programmer looking at both languages with fresh eyes.


I'd argue that better relative ergonomics had diddly squat effect on Python's current position. It's first-mover advantage, plain and simple. It got its hooks into data science like JavaScript got its into the browser.

I can say that personally using Ruby, Python felt like a massive step backward. I can see how others might disagree, but I feel like it's all in what you know first. On a certain level, most of us know JavaScript is just terrible, but there have been millions of new devs who knew nothing else and thought it's fine--better than fine, its great, it's the other stuff that's weird! But then you go on for a while, maybe eventually find lisp and/or functional programming, and you realize how brain-damaged our most commonly used tools really are.


> I'd argue that better relative ergonomics had diddly squat effect on Python's current position.

And you'd be wrong.

I can confidently say that, since I fully expected Python to win over Ruby, and it did. Every time I heard hipster devs being all the rage about Ruby, I knew they implicitly discounted the cognitive load that goes with learning Ruby.

The path from pseudocode to working Python code was (is?) straightforward, and Ruby doesn't bring anything in terms of paradigm over Python that justifies forgoing that advantage.

So every time a bright mind wanted to implement a library in her field of expertise, Python was her tool of choice. And that's how Python conquered field after field.


> I'd argue that better relative ergonomics had diddly squat effect on Python's current position.

I'd agree with that. I've done a lot of work in Python and Ruby. I mostly do Python these days. I think Ruby's ergonomics were better, especially for novices. (E.g., if you want to know what methods you can call on an object, in Ruby you can just ask the object, but in Python there is a whole zoo of global functions you're supposed to know about. [1]) But I think ergonomics just don't matter much when compared with more practical considerations like availability of libraries or number of developers available to hire.

[1] https://docs.python.org/3/library/functions.html


Python was already the #2 scripting language (after perl; not counting Visual Basic) back in ~1994 when I learned it. Tcl was already dying out by then and Ruby hadn't been released. So you basically had Perl or Python. You're right that it was a "first mover" advantage but I don't think data science had a lot to do with it. Perl didn't evolve gracefully over the 1995-2005 period but Python did. I think it is probably as simple as that.

By 2005 or so Python was already the #1 scripting language (other than PHP, I guess).


Was python really ahead of shell for scripting in 1994?


I remember it as: at that time, Windows and Unix were the common systems. I know we collected statistics for our business software offering, which could run on multiple types of Unix and on Windows. Around 1998, I think 70% of all customers chose Windows. It was easier and cheaper.

So, I would say that the most common scripting language in the late 90s was VBScript.

I also can't recall any of the Unix gurus at our company using Python. They used bash, zsh or other shells.


1998 was pretty different than 1994, though?


No :-)


Can you explain more about how (you feel) Python is a step backwards? I'm curious if the "steps backward" are syntax level?

I feel the Ruby community is very syntax oriented. As evidence of this:

I see Ruby developers interested in Elixir and Crystal, languages that are syntactically similar, but technically very different.

I do not see Ruby developers interested in Python, even though, if we ignore syntax, Python and Ruby might as well be the same language. Technically speaking, and in the grand scope of all languages, Ruby and Python are very very similar.


I really wanted to love Python. As a Ruby programmer I thought Python would be like Ruby, but with a prettier syntax because of the syntactic whitespace.

Unfortunately this turned out not to be the case. Here's my gripes with Python as a Ruby developer that really wanted to love Python:

There's a bunch of global functions that should have been methods on objects, like `len()`. OO in general seems slapped on later, as evidenced by the weird double underscores and the explicit self object as the first argument of methods. (OO is core to Ruby and very elegantly implemented imo.)

Higher-order functional programming is a lot more awkward than you would expect from a scripting language that's generally praised for its data processing qualities. It's nice that function definitions are closures, but you can't make them anonymously as function arguments, the reason apparently being that Guido didn't want to complicate the parser. This is a contrast with Ruby's most interesting and unique feature, the do syntax, which is a syntactic trick making it really easy to use higher-order functions that take a single function as their last argument.
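
To illustrate the gripe (a sketch with made-up names): Python's lambda is restricted to a single expression, so any multi-statement callback has to be hoisted out and named, where Ruby would pass a block inline:

    # fine: a one-expression lambda
    doubled = list(map(lambda x: x * 2, [1, 2, 3]))

    # not expressible as a lambda: statements require a named function
    def log_and_double(x):
        print("doubling", x)
        return x * 2

    doubled = [log_and_double(x) for x in [1, 2, 3]]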

Sort of an aside, but from an interview I read with Guido I understand that he actually regrets giving Python semantic whitespace; he doesn't like it. To me the semantic whitespace is one of Python's few redeeming qualities, and I just wish Ruby had adopted it as well. Maybe if it had, we'd all be unanimously using Ruby.


> There's a bunch of global functions that should have been methods on objects, like `len()`. OO in general seems slapped on later, as evidenced by the weird double underscores, and the explicit this object as first arguments for methods.

From some angles these are strange, but from where I sit they seem elegant and pragmatic.

The global functions and double underscore methods are part of the same thing: the language defining and enforcing a protocol that many types of objects should support. Other languages do this by having common names (.length() or whatever), but this has problems that the python way avoids:

- it's hard to add new ones because they share the same namespace as user code

- they may have different meanings in different contexts (e.g. a line has a length, but isn't iterable)

- the language can't enforce the protocol (e.g. len accepts only one argument, checks the result is an integer, and raises the correct exceptions)

For operator overloading, using regular methods with special names is nice because it avoids having special syntax for it, and makes it easier to interactively inspect and experiment with (as you can just call the methods).
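
Concretely, the protocol is just a specially named method that len() dispatches to and polices (Playlist is a made-up example):

    class Playlist:
        def __init__(self, tracks):
            self.tracks = tracks
        def __len__(self):                # opt in to the len() protocol
            return len(self.tracks)

    len(Playlist(["a", "b", "c"]))        # 3; len() rejects non-integer results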

The explicit self thing is similar, in that it avoids a load of special syntax: no this keyword, no syntax for static methods, no syntax for superclass calls (until super was added). It even means that there's only really one 'kind' of call.

I can see some confusion about where self actually comes from, but then most languages I've experienced where this is implicit also have corner cases and awkwardness around it.

IMO these kinds of simplifications seem a bit frivolous when looked at in isolation, but they end up simplifying your mental model of the language, which helps beginners get more out of it quicker.

> It's nice that function definitions are closures but you can't make them anonymously as function arguments.

I thought this for a while, and it would still be nice in a few situations, but really it's not a burden to name your callbacks, and discouraging overly-clever code isn't a bad thing if you're trying to build an ecosystem that's accessible to as many people as possible. I used to use map/reduce/filter a lot, but most of the time list comprehensions and regular loops end up easier to read.


I agree with the objective observations, but subjectively I prefer Python.

Python's OO does feel slapped on. As a Python user I've come to see object-oriented code as just some syntax to organize data and functions. To me, it's all functions, always has been.

I'm curious to know if you have similar thoughts biased towards objects. Do you view functional code as just a way of creating and organizing object-like concepts? I have heard that Ruby is quite pure in its philosophy and approach to object orientation, like Smalltalk, but I don't know either language well enough to fully appreciate this way of thinking.


I think Ruby has a super interesting middle ground. It’s strictly OOP in that everything is an object or a method (or a block), but because of implicit method calls (no parens), implicit return values, ease of use creating anonymous blocks passed to methods, and meta programming allowing dynamic control flow (i.e. calling a method by name at runtime), it has a lot of the elegance of writing functional code. You end up with small, unit-testable, methods that you can chain together, concise syntax, and meta programming that is second only to Lisp macros. It feels a little bit like if you asked a functional programmer to design an OOP language.


In the case of Javascript, (I think) it followed the "worse is better" principle. It's been trying to be as practical as possible. And it's everywhere, even though it was not really great (until the new standards and Typescript came along). So a new joiner would pick Javascript to have a wide choice of possibilities later.

The same thing is happening with Python. And if I'm honest, if I came from a different profession and wanted to change my career, I would pick either Python or Javascript for my first step.


>New users wonder how to call functions. They form an intuition ("use parenthesis"), but it's unreliable. "Oh, parenthesis are optional--oh, parenthesis are only optional sometimes".

print f"are you {sure} you need parenthesis to call a function in python"

>Python is much more boring in this respect, users are more likely to form accurate intuitions.

is defining a class the same as defining a function? What about functions implicitly defined when you define a class?

__what__ is __with__ __some_words__?

why do you need an empty file called __init__.py in the same directory as your actual code?


> print f"are you {sure} you need parenthesis to call a function in python"

This example doesn't call any functions. The print statement was removed in Python 3 and turned into a function, so you do need parentheses to call it and the example above is a syntax error. Python 2 (which had a print statement instead of a print function) didn't support f-strings. And f-strings, unlike JavaScript's template strings, are not function calls.

In other words, yes, you consistently use parentheses to call functions in Python.
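
A quick sketch of both rules, with made-up variable names:

    name = "sure"
    print(f"are you {name} you need parentheses?")   # print is a function: parens required

    greeting = f"2 + 2 = {2 + 2}"   # f-string: literal syntax, not a function call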


> And f-strings, unlike JavaScript's template strings, are not function calls.

IIRC, they're syntactic sugar over "string".format(locals())



With the exception of properties, of course.


Double underscores are ugly, but not surprising or frustrating. I say they are not frustrating because the language doesn't force beginners to be aware of how they work right away. Double-underscore methods are there for people to seek out when they're ready, but the language and the surrounding community doesn't push people into comprehending them. Intuition is good enough for a long time. Ruby has more focus on metaprogramming which is likely to force users into some pretty complicated stuff before they're ready.

print as a statement is inconsistent and surprising, but the explanation is shallow--they made an exception for print, that's all there is to it. It's not beautiful but it is unlikely to cause a 2 hour debugging session.


A lot of Rubyists are turned off by Python because there are just too many ugly things and exceptions in Python.


100% fair. They're probably right; certainly right from their own subjective view, which is what matters.

I'll also note that the OP focused on how "fun" Ruby was. Ruby is beautiful and fun.

Going back to the original question: why did Python grow more than Ruby? My first answer was going to be a rhetorical question: which beautiful and fun language has been the most successful?

C isn't especially beautiful or fun, not in the same way. (It has a nice minimalism and connection to the hardware maybe.) C++ is beautiful--is a thing that nobody has ever said. Java ain't beautiful. JavaScript, Python, etc, none are especially beautiful or fun.

I think another answer to the OPs question is that beautiful and fun languages are, apparently, not what most people are looking for. Evidently we should not expect a language to succeed because it is beautiful and / or fun.


Python had a huge head start. It was already one of the most used scripting languages by the late 90s and Ruby barely even existed outside of Japan.

Ruby’s beginner-friendliness and consistent OO design managed to help it expand in the early aughts and Rails brought the language to prominence in 2005, but by that time Python was already replacing Perl in its niche and had support from giants like Google.

Ruby got more adoption than would have been predicted based on its late entrance and lack of truly differentiating features, and that’s a credit to its ergonomics, community and how easy it is to learn.


I prefer to say that underscores mean that designers failed


Failed to design a beautiful language, yes. But they didn't fail to design a usable workhorse language.


It isn't hard to make stuff usable

Designing it correctly is an art


It is sad to me that you hate Ruby. It is a better language than Python for my taste in all dimensions except number of libraries available.

To begin with I could never understand why there need to be those global functions in Python to do meaningful things with lists.

In Ruby the "everything is an object" really works down to every nut and bolt and is clean and very conceptually pleasing.

Whether you call a package a gem, module, jar or whatever isn't very central.


It was simply for readability, as in similar to English prose. What if the length of my list is greater than 10?

  if len(mylist) > 10:
      pass

Only general things dealing with common needs are global; list-centric methods are on the list where they belong. I learned Java before Python and don't really miss value.toString() or value.length() much.


But it is absolutely unintuitive why it shouldn't be

    if mylist.length > 10


If the length of mylist ... see how length comes first? It's only unintuitive if you never spoke English or other European languages.


This construction is something my French friend often does in English instead of using a possessive—the <property> of <thing>.

As a native speaker of English, I’d just say, “if mylist’s length is more than 10”.


With all other object-oriented structures you would think that you want to do something with mylist, so you would first write down mylist.

In Ruby I thus often write something such as:

    if mylist.empty?.!

Because my line of thinking goes from the object to a function or attribute and then chains further along. Calling the `.!` method on the returned boolean object is very nice, because I don't have to go back to before mylist and insert the !-operator there.


I haven't written any Ruby. Maybe one gets used to this after some time, but this honestly looks like it will enable you to write extremely clever and succinct code that makes perfect sense at the time, but is a chore for anyone else to read and grok quickly (including the writer themselves a few weeks later). I could be wrong though.


A really, really bad argument in my opinion. That thinking would lead us to Cobol.


A really, really bad reply imho—see how useful that is?


Yes, it's essentially UI/UX principles as applied to language design, with developers being the users. You want your users to be able to build reliable mental models of your interface [1].

The optional parentheses example is a good one. That mistake was carried over to Crystal; it's such a silly little inconsistency that I'm sure the designers considered it "cute" or "clever". It's naught but a constant pain in the butt.

Another good example is YAML vs JSON. I understood the entirety of JSON syntax pretty much immediately upon seeing it, it's trivial. YAML is supposed to be "easier for humans", but by forgoing a uniform and consistent syntax it ended up an order of magnitude harder.

[1]: https://news.ycombinator.com/item?id=36396256


I first encountered Python around 2000. At the time, Perl was more popular in the circles I frequented, and Tcl probably about as popular as Python (eggdrop[1] notwithstanding), but Python was growing steadily. I encountered Ruby in the late '00s (as others have noted, it made its English-language debut almost a decade after Python started) and Lua was another contender during this time.

My memory is that, at least outside of the web/RoR world, Ruby was at no point as popular in the US as Python. Let's take, say, 2005 as a point when Ruby was really gaining popularity due to Rails: at this point I was already seeing internal tools that previously would have been written in Perl being written in Python.

Python just seems to have maintained a certain momentum, and I'm not sure why, but if I had to guess:

1. The syntax is almost a dead ringer for pseudocode. The first time I encountered Python, I fixed a bug in an open-source project having never seen Python before. I recall two different CS programs switched to Python from Scheme for exactly this reason.

2. It was designed to glue C together. Other languages were too (TCL and Lua; Perl primarily interfaced with the outside world in the same way shell does, with pipes).

3. A kitchen-sink approach to the standard library helped it a lot for the first 20 years of its existence. CPAN was sui generis at the time for installing libraries, so the more you could do without relying on third-party libraries, the better off you were.

1: https://en.wikipedia.org/wiki/Eggdrop


I think it upset the Perl hegemony in part because of the enterprise infatuation with object orientation. Perl got a lot of flak from the Java/XML crowd (“Perl’s not a real language—it’s not object oriented!”).

You could kind of write Java in Python, and thus a certain kind of enterprise techie couldn’t dismiss Python as “just” a scripting language.


In my experience, Perl got a lot of flak for being "write only", I don't really remember anybody complaining it wasn't "object oriented".

Every Perl code base I've seen, or inherited, has been a mess. My favourite was a bunch of utilities all written by the same person who never did anything the same way twice. Each utility was like an entry in an obfuscated 'C' contest.

I used Perl to build my first web application a LONG time ago. That project is included in the list of Perl code bases I've seen that were a mess :-)


There was definitely a group of people who thought "object oriented" was the One True Way and any languages that don't support it are "toys." I'm not sure I ever heard their ire directed at Perl specifically, but I recall the sentiment existing.


Oh, definitely the attitude exists. I just never saw it directed at Perl. Maybe since, in my experience, Perl has mostly been used as a systems language and not an application development language.


How the pendulum has swung!

Rust doesn't even sport 'Class'


And of course there is the fact that the Perl community relished inventing new and exciting shorthand tricks that allowed Perl scripts to be essentially write-only unless you stuck to a style guide that had never been codified.

Perl, like Visual Basic, had a well-deserved reputation for facilitating atrocious, hard-to-debug code, and it was this reputation, more than the lack of OO/XML support, that led both of them to fade back into the realm of forgotten memory and business-critical enterprise applications that nobody dares to touch.


The word "enterprise" kind of reminded me of one possible significant reason for python's huge adoption rate after the second part of the 2000s. Google started using it for a lot of things at some point and was advertising a ton of python jobs. That was both a shining endorsement and a possible way into the company.


I actually think Google picked the winner by deciding to use Python, especially for TensorFlow, right at the time when deep learning really started taking off, and also at a time when they were seen as the leader in engineering best practices that other companies wanted to emulate.


Google followed the writing on the wall, nothing else.

Google was known as one of the most academic engineering companies in the '00s.


That would be similar to saying C isn't a real language while ignoring the existence of C++. Perl 4 wasn't OO, but Perl 5 enthusiastically embraced OO. The first release of Perl 5 was in 1994, whereas Java 1.0 was 1996.

It did take a while for the wider Perl community to embrace OO, though. There were a lot of Perl programmers who weren't into lofty paradigms. Plenty of Perl programmers weren't fully sold on the idea of breaking your program into functions.


I (rather to the displeasure of many people) refer to that kind of OOP Python as "Java fanfic"...


Before Python got popular, Perl was the most similar popular language. Once I encountered Python, I felt, this is Perl, but done right.


That was my experience. I like Perl. I'm comfortable with Perl. And after about a day of Python, I never wrote another line of new Perl code.

There were so many "it can't possibly be that easy, but it is!" moments.

Let's write a function to add five to a number:

  def add_five(num):
      return num + 5
OK. So, can I pass that function as an argument to another function?

  def call(func, value):
      return func(value)

  call(add_five, 10)
  => 15
What? That worked?! And there was zero additional syntax, you just... do it?

After a seemingly endless list of happy discoveries like that, I seriously rethought my idea of what programming languages could be, and ought to be.


That sounds trivial? I mean, you can do that in C, what’s surprising about it?

I have never found Python as nice to use as you seem to have, but Ruby always fit like a glove. I’ve written Python since 1999 and I still would rather use almost anything else. It’s so labored to do anything complex. You can build worlds in Ruby in the time it takes you to align indentation properly for a single class in Python.


> That sounds trivial? I mean, you can do that in C, what’s surprising about it?

Here's the C version of it:

  #include <stdio.h>
  
  int addFive(int num) {
      return num + 5;
  }
  
  int call(int(*func)(int), int value) {
      return func(value);
  }
  
  int main() {
      printf("%d\n", call(addFive, 10));
      return 0;
  }
It's still manageable, but not nearly so simple.

Even Perl doesn't let you escape having to consider pointers and references:

  sub addFive {
      return shift() + 5;
  }
  
  sub call {
      my($func, $value) = @_;
      return $func->($value);
  }
  
  print(call(\&addFive, 10));
Why do I have to remember to write `\&addFive` in Perl, when Python doesn't require it? Why do I have to write `$func->($value)` instead of the normal `func($value)` in this case? Why do I have to write `$value` in the first place? That's rhetorical: I know the answers. Still, this was the kind of thing that instantly won me over from Perl to Python. It's not that I could somehow write things in Python that were unwriteable in Perl (or C, or assembler, or ...), but that Python let me concentrate on what I was trying to say instead of how to say it.

In fairness, the equivalent Python code with optional typing is also more verbose than my original, minus the pointer stuff:

  from collections.abc import Callable
  
  def add_five(num: int) -> int:
      return num + 5
  
  def call(func: Callable[[int], int], value: int) -> int:
      return func(value)
  
  print(call(add_five, 10))
The key word there being "optional". Python lets you add that if you want to, but you don't have to. And note that the underlying syntax is identical if you delete the annotations. If you opt in to using them, you don't have to alter the code you're annotating.


  int call(int(*func)(int), int value) {
      return func(value);
  }
At least in gcc, you don't need to think about pointers. Just use the function signature:

  int call(int func(int), int value) {
      return func(value);
  }
I don't know if this is an extension or what, but it's something I "just tried" on a hunch with no guide in 2009 and it worked.


You don't have to shift() or unroll @_ any longer in recent Perls.


> you just... do it?

A few years from now, when researchers look at mental workload when writing/reading software, we will have the data to prove that Guido was a Neuroergonomics savant.


Python didn't corner the market there:

  (define (add-five num)
      (+ num 5))
  (define (call func value)
      (func value))
  (call add-five 10)
Admittedly, this was one of many examples.

I do wonder how much would be different today if not for the Monty Python gags in the documentation.


It did corner the market for "no extra brackets needed just in case people want to treat code as data which most people don't want to".


Sigh. I know. We are truly walking in the darkness.

When I’m appointed Lord Emperor, we’ll all upgrade back to Lisp.


If that blew you away you really should try Ruby:

    def call(fn, val) = fn.(val)
    call(->(n){ n + 5 }, 10)


It is still hard to understand unless you are familiar with the language. Python got popular because it made it simple.

Ruby on the other hand encouraged cleverness like what you wrote.

I think ruby faded for the same reason perl did. It is cool to write code, but once it's time to maintain it, especially after the rock star who wrote it left, it sucks.


> cleverness like what you wrote.

So we should optimise languages for users who are not prepared to familiarise themselves with the language? How is this more complicated than its commonly-used Javascript equivalent?

    function call(fn, val) { return fn(val) }
    call(n => n + 5, 10)
You see, when written in a language in which lambdas are first-class citizens, it doesn't look clever at all. So maybe Python's sad implementation of lambdas and its anti-FP culture could be at fault?


+100

I loved Perl, but it had a culture of doing things in N different ways, too many shortcuts, and you ended up with incredibly tiny scripts which were unreadable -- even to the author -- a month later. This wasn't practical for business use in the face of more consistent code from Python.


Excellent point. I moved from Perl 4/5 (writing fairly involved RADIUS code) to Python (doing other stuff, but still intricate) and the fact that I could actually read my code six months later made all the difference.


As someone who began his programming career with Perl I've never understood this idea that Python somehow improved on Perl. For a start Ruby is Perl5's most obvious successor. Perl puts Python's text-handling to shame whilst Python's lambda is nowhere near as expressive as Perl's. Python also goes head-to-head with Perl's fundamental philosophy - TIMTOWTDI. If anything Python is the Anti-Perl.


> The syntax is almost a dead-ringer for pseudo code.

This is the big one. You look at it, and you think you know what it's going to do. The indentation rules also means things look predictable rather than being a mess of brackets.


This was the reason that made me choose Python over Ruby.

I wanted a language that I could come back to and still understand what I was trying to do. Ruby looked like its spirit animal was Perl.


Python got Numpy very early on, and other languages never got an equivalent.

And then lots of things were built on top of Numpy - image processing, reading GIS raster data, scipy, pandas, etc etc - and they're all trivial to combine because it's all just Numpy arrays.
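
A tiny sketch of what that interop looks like in practice (the data here is made up):

  import numpy as np
  import pandas as pd
  from scipy import ndimage

  tile = np.random.rand(64, 64)                     # stand-in for an image or raster tile
  blurred = ndimage.gaussian_filter(tile, sigma=2)  # scipy consumes the ndarray directly
  stats = pd.DataFrame(blurred).describe()          # pandas wraps the very same array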

Python also had a very friendly and approachable community from the start, and great docs, while Ruby had its documentation in Japanese only for some time.

Django is also top quality, imo. Similar to Rails. But it always seemed Ruby was only good for Rails, whereas Python combines with everything.

So as usual, it's not about the language, it's about the rest.


Rails is far, far more popular than Django.

However, the significance of both declined with the expansion of high-tech options (Spring, Go, Scala, Node.js) and the expansion of low-tech options (Firebase).

So today the popularity of each language's backend framework is less important.


I would dispute that. I always felt Rails was used by a very communicative subset of web developers while the Django users just contracted, delivered, got paid and moved on to the next project without any hassle.


I can confirm. A lot of Django apps out there. People don't make a big scene about it, and it quietly powers a lot of websites. I think one of the powers it has that no one talks about is that you can do microservices within a monolith.


I would like to hear more about this paradigm if you have a minute.


In your wsgi.py file, you specify which settings module to use when loading your WSGI application with, e.g., gunicorn. In your settings.py file, you can specify what your root URL config is, e.g. a urls.py file which recursively defines all routes in your application. But nothing is stopping you from having n different settings files, e.g. settings/service1.py, with n different urls.py files, loading them in n different wsgi.py files, and serving them with n different gunicorn deployments.

You can also do things like adjust the settings defined in your settings.py file to use a different URL conf programmatically based on environment variables. It is just Python, after all.
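
A minimal sketch of one such slice (module paths like myproject.settings.service1 are hypothetical):

  # service1/wsgi.py
  import os
  from django.core.wsgi import get_wsgi_application

  os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings.service1")
  application = get_wsgi_application()

  # myproject/settings/service1.py
  from .base import *                        # shared settings
  ROOT_URLCONF = "myproject.urls_service1"   # this service's own route tree

Running `gunicorn service1.wsgi:application` then serves just that slice of the monolith.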


> Rails is far, far more popular than Django.

Maybe it used to be, but it seems a lot more people search for Django pretty much everywhere except the US, Canada and Japan.

https://trends.google.com/trends/explore?date=all&q=%2Fm%2F0...


For people who are wondering like me: the big jump in Django searches was the release of Django Unchained by Quentin Tarantino on Dec 25, 2012.


And Django was far from the only Python web framework at the time Ruby on Rails became popular. Django gained popularity largely because it was seen as a Python equivalent to Rails.

Before then Zope (on which the popular Plone CMS was built) was probably the most widely used web framework in Python. But there were many, many others too.


Fair enough, my information is outdated.

StackOverflow agrees. [1]

[1] https://insights.stackoverflow.com/trends?tags=django%2Cruby...


If you properly scope to the Computer & Electronics category to avoid matching "Django Unchained" the graph is quite different: https://trends.google.com/trends/explore?cat=5&date=all&q=%2...


Python grew out of a myriad of applications built on hard-to-replicate components, while Ruby relied only on web development, which is remarkably subject to fashion.

Python is very simple, making it easy to learn. Thus, it became more and more popular in fields where programming is not the main task, like science. It is also very easy to extend with C, so these fields could bolt time-tested scientific code onto their Python scripts.
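
As an illustration of how short that path can be, here is a ctypes sketch (assuming a Unix-like system with a shared libm):

  import ctypes, ctypes.util

  libm = ctypes.CDLL(ctypes.util.find_library("m"))  # the C math library
  libm.sqrt.restype = ctypes.c_double
  libm.sqrt.argtypes = [ctypes.c_double]
  print(libm.sqrt(2.0))  # 1.4142135623730951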

Ruby seemed to be entirely tied to Rails.

When the 00s and 2010s came with needs for web development and scientific computing for big applications (e.g. ads on social media), there were multiple language options for web development but really only one that made it easy to onboard academics into the task of building software systems.

From there, it was just a feedback loop, and when deep learning became a major field, the ML community was already knees deep in Python, so it was hard to justify making tools elsewhere.

Meanwhile, it seems to me that Rails' appeal (and with it, Ruby's) was taken by Node's promise of using the same language on both server and client, and more generally diluted by the fashion waves of web dev.

If you want to swap Python for another language in scientific computing, you will soon find annoyances: certain packages are missing, or don't talk to each other well, or aren't optimized. If you want to swap Ruby for another language, you just use the other language's web dev library and that's it.

As an aside, Julia has been the promised replacement for Python for over a decade. It tries to, and does, replicate the package ecosystem, the ease of use, the extensibility, etc. It also goes out of its way to be able to use Python packages out of the box. But it just seems to be very hard to convince practitioners to move to Julia for what they see as small performance gains.


> Python is very simple making it easy to learn.

I would argue that there's little difference between them in ease of learning. I personally found Ruby easier because everything being an object with methods and no free floating functions felt more natural and easier to look up. i.e.

string.length

over Python

len(string)


Ruby has implicit imports which pollute namespaces. This is extremely annoying for anyone who wants to learn the language. It uses extensive monkey patching, which is once again a substantial challenge for a beginner that wants to understand what their code is doing. It has a nicer syntax with procs and lambdas but syntactically distinguishes between them with different ways of invoking functions and all the issues associated with bindings. In Python all functions look and behave the same, to the point where methods have an explicit self argument to make it clear that they're just regular functions (the semantics for function binding could be easier but then again such is the case in all object oriented languages).
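
That uniformity is easy to demonstrate with a toy class (names made up):

  class Greeter:
      def greet(self, name):
          return "hi " + name

  g = Greeter()
  g.greet("ada") == Greeter.greet(g, "ada")  # True: the method is just a function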

Python's challenges are superficial syntax; the semantics are extremely uniform everywhere and don't suffer from special cases galore.


I don't think either of those issues (namespaces and monkey patching) makes the language harder to learn. You require a file, and the file's namespaces are available. How often do libraries modify your runtime with monkey patches in a way that you'd notice? I'm sure it's come up during my career, but not when I was a beginner. There was a period of time when metaprogramming was more popular in the Ruby and Rails community, but that was some time ago. Still, it's a tool that's nice to have available.


I vividly remember how I was told that you could write "2.years" in RoR.

So, if you copy that code snippet and run it as an independent script, it would just not work because it was a RoR monkey patch method.

Worse, the implicit imports mean that unless you have a global view of which files and gems are require-d, you don't know what your code is actually going to do.

That, and coworkers suddenly feeling at liberty to override `method_missing`, abusing `bindings`, and all that stuff, makes it really hard to like the language if only for its tendency to bring out the worst in people.


Requiring a file from a library that's part of a framework's source code can be quite challenging as it requires either the complete docs or perusing the source code to fully grok what's getting included, especially with the monkey patches.

I don't know the current state of the RoR ecosystem but back then when both languages were competing for similar mindshare this was a massive turnoff.


It's easier than Java, though, where you have string.length(), array.length and list.size(). In Python it's always len(whatever). But maybe Ruby wins this one if it's consistent in its method naming.
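
Under the hood, len() just delegates to the object's __len__ method, which is why the one spelling covers every container, your own classes included:

  len("abc"), len([1, 2, 3]), len({"a": 1})  # (3, 3, 1)

  class Deck:
      def __len__(self):  # your own types opt in the same way the built-ins do
          return 52

  len(Deck())  # 52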


The global functions throw me off too.

To me, Ruby is in general the nicer-feeling of the two, just generally more "friendly" and "human". Whether those are desirable traits for a language is up for debate, but I think there could stand to be more languages with such a flavor.


> Ruby seemed to be entirely tied to Rails.

I’d be interested in going one step further and asking why that was. Rails is great, but so is Ruby, but normally a great library like Rails helps attract developers to the language. Rails is the only example I can think of where a widely popular library (ostensibly) leads to fewer users of the language for other purposes.


I wonder if it's because Rails creates such a specific set of expectations that when you actually try to do something without it you run into problems. "Oh, that's only a rails thing? I thought that was a ruby thing..."

Ruby's monkey patching convention might make this worse.


I actually failed a technical interview relatively recently because of this.

It was a fullstack position with some Ruby scripts here and there mostly centered around scheduling. Had a leetcode style takehome that requested I build out a pretty simple reservation system with time off and quarter hours and stuff like that.

I booted up the leetcode environment and was surprised at just how many things were Rails-specific rather than being in the Ruby standard library. It completely threw me off, since I suddenly had to figure out how to wrestle the Date stdlib to work for me, and anything date/time related is my personal hell :)


Not exactly the same thing but flutter and dart have a similarly weird relationship too.


I agree. I saw Python being taught in physics and hard-science university curricula as an introduction to programming 15+ years ago.


When I was at CERN in 2003 - 2004, it was when the first set of Python introductory courses started.

The build infrastructure for ATLAS used the CMT tooling, written in Python.

The Summer School of Computing 2004 had several sessions related to Python in the context of grid computing.


> Ruby relied only on web development

Chef, Puppet, Vagrant, Logstash, etc are not web development.

Ruby dominated (dominates?) DevOps tooling.


Back in 2012 there was no question that Ruby was the language of DevOps; then Ansible took over. Puppet was a true pain to work with, and through at least version 4 its language designers were simply wrong about what it meant to be a declarative language (you could only declare each resource once; they falsely believed that inference and/or unification were incompatible with declarative languages). But wow, was it powerful for maintaining dependable state on long-lived servers.


I'm sorry, but it seems to me that Go is taking the lead in adoption rates for DevOps tooling.

Ruby isn't the go-to language there anymore... probably because Go compiles statically.


The k8s world is Go Central, so that probably has a lot to do with it.


I think their comment still holds. Those are all tools that you use, mostly without caring what language they're written in. When you had to write Ruby, it was predominantly in the context of RoR.

I think also that dominance has mostly faded. When I think of languages that dominate devops now, I think of Go, and I say that as someone that dislikes Go.


Ansible is Python, Terraform is Go. Those are 2 giant projects in DevOps.


YMMV, but I've only seen Rails fans use any of those.


There was a time when Puppet and Vagrant were head-and-shoulders the best options, and I say this as a JVM/Python guy who never got on with Rails. (I'd argue they still are, but sadly everyone is on docker and k8s these days whether it makes sense for their use case or not)


> My impression is that in the ‘00s, Python and Ruby were both relatively new

They were, but Python is close to 5 years older than Ruby (February 1991 vs December 1995), so in the ‘00s, Python was significantly older than Ruby.

Ruby also was more or less a Japanese-only thing until around 2000. https://en.wikipedia.org/wiki/Ruby_(programming_language)#Ea...:

“In 1999, the first English language mailing list ruby-talk began”

“In September 2000, the first English language book Programming Ruby was printed, which was later freely released to the public“

In contrast, Python was posted on alt.sources from the get-go (https://en.wikipedia.org/wiki/History_of_Python#Early_histor...).

So, effectively, Python had about 9 years head start on Ruby in the English speaking world.

> when, in my opinion, Ruby is the better language

IMO, its flexibility makes it deceptively simple. Certainly, early users used its flexibility a bit too much, to implement useful functionality that felt like spooky action at a distance to newcomers.


> Ruby also was more or less a Japanese-only thing until around 2000

And after that it still took quite a few years until "English support" really became first-class and comparable to Python. I looked at Ruby around 2002-2003 or so (I was looking at many different languages back then, pretty much anything I could get my hands on), and lots of Ruby stuff was in Japanese or poor English.

That's all fine, but I specifically remember that was the major reason I ended up not using Ruby at the time.

(I did end up programming a lot of Ruby many years later, and for what it's worth I do prefer Ruby for most things now)


Ruby’s English documentation remains terrible, unfortunately.


I can't say I've had much trouble with it; I have generally found it more than adequate.


Early on, English-speaking Ruby at least had an extremely poor culture around documentation. Code was barely commented, and "the Ruby way" was to use single-character variable names.


And the culture of metaprogramming and monkey patching didn’t help.

If the documentation is bad you can read the source, but if you need to grok 3 layers of metaprogramming living in unpredictable locations before you can do that there’s no way to advance (practically).


Original asker here, thank you for this comment. Lots of people are arguing about the pure merits of the languages, but the gap in usage IMO is way too big to be explained by “Python has only one way of doing something” or “Python is multi-paradigm.”

Understanding that Python was established years before Ruby was being discussed answers the question of why Python is preferred for many use cases.


Python's old as dirt; it even predates Linux: https://en.wikipedia.org/wiki/History_of_Python

Sometimes the key to success is to just be adequate for a really long time.



> just be adequate for a really long time.

Seems to be working for Pepsi, and their killer model of "they don't have Coke, is Pepsi okay?"


What, no Dr. Pepper? I'll just drink water, then.


I'll have the Khlav Kalash, thanks.


I'll take a crab juice!


Well, Delphi is still adequate, but others got more popular.


> Sometimes the key to success is to just be adequate for a really long time.

If that was true I'd be successful by now


Both Ruby and Python had 1.0 releases in the mid-'90s.


I come from an analytic philosophy background. I studied mathematical logic and set theory in grad school as part of my logic requirements. I never learned to program for a long time, and it was always a source of embarrassment. A friend finally suggested that I learn Python because it was "English-like" as you say, so I bought Learn Python the Hard Way.

When I opened the book, unlike with Java or JS, I could just read the code. Like... I could just read it and already understood what was happening. I already knew way, way more logic than I would ever need. I think it took me about a day to figure out object orientation, but it was trivial. I went from not knowing how to program to being able to run fizz buzz in like a day, and do it with objects in like three. Other languages are so caught up in so much bullshit technical stuff I still can't stand them.

Python is intuitive. It's approachable. It focuses on getting you from zero to code over a bunch of much more technical concerns. Python is the party that welcomes you in, instead of checking your ticket at the door.

As a non-CS person who has fallen in love with programming, I honestly never want to learn another language if I don't have to.


Same here, an engineer with no formal CS training. A long time ago I would write macros with Visual Basic. When it came time to jump to a more robust language that is also cross-platform, Python was the obvious choice. It's readable, easy to learn, has great support, and the PyPI package index is huge.


Can I quote zach's 15 year old comment on Python vs Ruby? Maybe it will have some relevant insights:

> Ruby has clever syntax. Python has pure syntax.

> Ruby has method aliases. Python does not allow a string to capitalize itself.

> Ruby uses Ruby methods within Ruby classes to extend Ruby. Python has decorators so you can write functions that return functions that return functions to create a new function.

> Ruby has strict object-oriented encapsulation. Python is laid-back about objects, because you probably know what's going on inside them anyway.

> Ruby lets you leave off parentheses so you don't miss objects having attributes too much. Python will let you mix tabs and spaces for indentation, but passive-aggressively mess up your scoping as punishment.

> Ruby has seven kinds of closures. Python has one, in the unlikely case a list comprehension won't do.

> Ruby's C implementation is a mess of support for language-level flexibility. Python's C implementation is so clean you get the unsettling thought that you could probably write Python using C macros.

> Ruby supports metaprogramming for cases when programmers find it more descriptive. Python supports metaprogramming for cases when programmers find it necessary.

> Ruby is expressive. Python is direct.

> Ruby is English. Python is Esperanto.

> Ruby is verse. Python is prose.

> Ruby is beautiful. Python is useful.

> I like Python, but coming to it after using Ruby for seven years, well, I think it's like dog people and cat people. You can like having one species around, but you're always thinking -- why they can't be more like the other?

-- This is one of my favorite HN comments of all time: https://news.ycombinator.com/item?id=682364


I wish I could favorite comments. Thanks for sharing this one.


Click the time on a comment ("1 hour ago", "60 days ago", etc), that will focus you on the single comment and give you the option to mark it a favorite. You can access your favorites through your HN profile.


> My impression is that in the ‘00s, Python and Ruby were both relatively new, dynamically typed,

Your impression is wrong. Python goes back to 1989. It was known in FOSS circles in the '90s: a minority within a minority.

The rise of the web exposed more programmers to FOSS, due to pretty much everyone in web eventually having to work with GNU/Linux servers. More developers than ever were suddenly exposed to cruft that only a few GNU/Linux and Unix geeks were previously privy to.

Eric S. Raymond (ESR) was once influential, and wrote articles on Python; he helped draw attention to the language.

For instance, this article in the Linux Journal

https://www.linuxjournal.com/article/3882

For me, that article stands out in my memory as when I became more consciously aware of Python; that there may be something there.

I suspect that the article had a big impact; I'm not the only one.

People read rags like Linux Journal then. Sysadmins, developers, students, hobbyists.


Python, like Perl before it, also had some very well written books that were very accessible if you were interested in the language. Learning Python and Programming Python first came out in the 90s and then Dive Into Python was released in 2000. I think these really helped with Python's momentum when FOSS platforms were also building momentum and being taken seriously.


Wikipedia suggests that Python debuted in 1991, with Ruby following in 1995.


I haven't checked, but I seem to recall that Python started out as a successor to the ABC language in around 1988 or 1989 or so. The initial release may have been in 1991?


It has very little to do with Python itself. Python won because GvR ran an extremely tight ship, and there were many extremely talented core developers who helped make Python a polished product. The development team made all the right decisions, except for rejecting PEP 355, which was a big mistake. The batteries-included approach helped grow the ecosystem. Developers wrote third-party libraries in the hope of them eventually, someday, being incorporated into Python's standard library. Python also came in an MSI installer that made the Windows experience pleasant, unlike Ruby and Perl which, IIRC, were painful to get up and running.


> except for rejecting PEP 355 which was a big mistake

Looks like you are the author of PEP 355. Well, we got pathlib in the end. It's not that bad.


> Python also came in an msi installer and made the Windows experience pleasant

For a long time, the official Windows Python distribution was overshadowed by ActiveState's, called ActivePython. That included more Windows-specific packages such as win32com, and the very capable PythonWin IDE from Mark Hammond. That was my go-to Python around 2000-2005.

ActiveState was also notable in the early 2000s for hosting the Python Cookbook, the main repository of useful Python snippets and scripts. This was long before PyPI, StackOverflow and Github.


That, and Enthought Canopy, which bundled Spyder, SciPy and NumPy; it came a bit later, but before Anaconda.


> PEP 355

Didn't this eventually become what is now pathlib (which I am a big fan of) ?


pathlib is truly excellent! The overlap with the os library is unfortunate, but I can’t complain.
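
For anyone who hasn't tried it, a small taste (the paths are made up):

  from pathlib import Path

  cfg = Path.home() / ".config" / "app.toml"  # "/" joins path segments
  for log in Path("/var/log").glob("*.log"):
      print(log.name, log.stat().st_size)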


The reason is simple - Ruby is not good enough and lags in many, many areas.

Ruby was an unknown till RoR was introduced. And the reason RoR was so easy and popular was Ruby's biggest shortfall - excessive metaprogramming. The automated creation of all of the CRUD operations was definitely a kick in the... of Java and many other languages that lacked such ability. But that comes with all of the readability downsides, when "programming by convention" lacks a commonly understood convention. (Scala also suffers from this; it's just that Spark works so well for the majority of its use cases.)

The worst part for Ruby was that none of the features RoR brought were hard to implement in other languages. Others caught up, and Ruby didn't get enough windfall to keep sailing.

Then there was the community that RoR attracted in the early days... the best way to describe it was - toxic.

> how fun and easy to use it is

First impressions are deceptive. When RoR was getting popular, many people who said the same typically showed me an example of some functionality implemented in just one line. Which is not exactly a great case for a programming language.

As for Python - it just grew steadily, building the fundamentals and spreading. Having strong opinions from a single person made Python very easy to read and understand. It's by far the easiest popular language to use to demonstrate concepts to anyone - there's a very gentle learning curve. The community, tooling and documentation are considerably better than Ruby's.

Remember that there are really beautiful, expressive languages that make people fall in love with them... but syntax is just one small part of it.


What many others here are saying is correct. Python was fairly popular, and considered a great language that good developers wanted to develop in, way before Ruby became popular.

Just look at this article by Paul Graham, written in 2004, called "The Python Paradox" http://www.paulgraham.com/pypar.html:

> I didn't mean by this that Java programmers are dumb. I meant that Python programmers are smart. It's a lot of work to learn a new programming language. And people don't learn Python because it will get them a job; they learn it because they genuinely like to program and aren't satisfied with the languages they already know.

> Which makes them exactly the kind of programmers companies should want to hire.

This was written more or less at the same time that Ruby on Rails was first released. Clearly by then, people had known about Python for a while.

That head-start led it to be more popular early on, except in webdev circles specifically. But in the other circles, and especially in Academia, Python was gaining popularity, which helped it both get lots of great scientific libraries (making it today's default "data" language), and also made it eventually become a language taught in Universities.


Python was developed in the late '80s and first released in 1991 - it's been around a while. One of the things that always made it useful before it became so popular is its great interop with C/C++.

Ultimately that interop led to NumPy and SciPy, which allowed you to use a rather simple language and yet get high performance when doing calculations. That, as they say, was that.

It doesn't hurt that Python is a multi-paradigm language - unlike many of the other popular languages today that demand a particular mindset for programming. You want to do structured programming? Fine. Object-oriented? That's fine too. A little bit of functional (lite)? There are some limited capabilities for that too.

Add in 30 years of Java JIT compilation optimizations that's been applied to Python interpreted code, C/C++ interop - Python is a simple, yet powerful language. That's why it's become so popular.


I don't know enough about Ruby, but if it's true that Python had better C interop early on, then roll the clock forward and that should explain an underappreciated amount of the difference.

Why are so many GTK apps written in Python? Because `from gi.repository import <my_c_library>` gets you bindings to any GObject C library without extra work for the library authors or the library users. If that sort of thing has existed for a long time, then yeah: Python's gonna end up more integrated with the C-heavy computing environment of the era than any language that doesn't bind so easily.
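
A sketch of how little ceremony that involves (assuming PyGObject and GTK 3 are installed):

  import gi
  gi.require_version("Gtk", "3.0")
  from gi.repository import Gtk  # a C library, imported like any Python module

  win = Gtk.Window(title="hello")
  win.connect("destroy", Gtk.main_quit)
  win.show_all()
  Gtk.main()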


>Java JIT compilation optimizations that's been applied to Python interpreted code

I thought CPython doesn't JIT at all? What optimizations are you referring to


PyPy and IronPython


If you think about it from the perspective of people learning programming, Python is dead, dead simple. On Unix-like systems it comes preinstalled most of the time. For the many students in many parts of the world who use an HP/Lenovo laptop for university work, the OS is Windows, and Python installation is, again, dead, dead simple.

Ruby starts off by saying you should have a version manager. And Windows installation is not straightforward (from a student perspective, it's important to note that something that looks like it isn't officially supported makes it feel like the entire language is going to be a compatibility slog). This is not dead, dead simple. It's not hard. But it requires more than one concept to be learned to start.

Then the docs. Ruby has so many ways of getting started. Python puts up one main way. Same for the API reference. Python has a single link, and they even say "keep this under your pillow". As a beginner, I know what I should do. Ruby? Many links that say arcane stuff (from a beginner's perspective) like rdoc and whatnot. Python is super clear in its breakdown of the reference too. I cannot stress enough how simple this page (https://docs.python.org/3/library/index.html) makes it to find what you want.

Overall, Ruby and its documentation feel like they're made for people who already know Ruby, or at least more than the bare minimum. Python is made for people who'll be trying out programming itself for the first time.

To me, that’s what made Python win. They just seemed like they thought more about folks starting out for the first time.


Another beginner-friendly point for Python: it has IDLE, a barebones IDE, included as part of the standard library. It's unfortunate Ruby doesn't provide such niceties to beginners out of the box, given it inherited a lot from the Smalltalks, the language family with the best programming environments ever.
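
On a stock Python 3 install it's one command away:

  python -m idlelib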


Definitely. I forgot to mention this one. I remember running scripts through IDLE till I learnt how to run stuff in the command prompt. So darn friendly.


Python and Ruby both have an optparse library in their standard library. Compare the documentation for yourself:

https://docs.python.org/3/library/optparse.html

https://ruby-doc.org/stdlib-2.7.1/libdoc/optparse/rdoc/index...

I learned Python 2.3 almost 20 years ago, and the standard library documentation in Python has always been just as good. As a beginner I found Python's documentation very helpful; I could read it like a tutorial, like a book, like a good teacher. To this day I still look at the Python regular expression documentation whenever I'm using a regular expression in any language. I learned regular expressions from the Python documentation; there was literally no blog or tutorial I found more approachable when I was first learning them.

Ruby had, and still has, a list of methods for documentation.


A lot of people who picked up Python, especially in the 1990s, picked it up as a side language. They were doing "serious" programming in C or C++, and they needed another powerful, potentially cryptic language like they needed a hole in the head.

I believe that Python gained popularity among scientists for the same reason. They needed a simple programming language, not because they lacked brainpower, but because their job was to focus their intelligence on the scientific problems they were solving, not on programming. Writing code was necessary but not meaningful in itself, like washing beakers or feeding lab rats. The less effort it took, the better.

For people like that, Python won because they were able to become productive with it while learning very little about it. Also, it was rare to accidentally write Python code that was hard to read. This was especially important for scientists, who didn't have a software engineer's appreciation of the difficulty of reading code and the need to practice restraint when writing it.


Readability. The most important thing for code is to be comprehensible to developers. Python has several factors contributing to readability. Whitespace as syntax is the most obvious one. More than that, unlike Perl, which has more than one way to do things, Python tends to have one or maybe two canonical ways of accomplishing common tasks. This makes it easier to navigate code without having much experience. Python was also good at interfacing with existing libraries from the start, so a lot of work got scripted quickly in Python using high-powered C libraries. Arguably, being able to work with popular C interfaces was yet another factor contributing to the generally high readability of code.


I agree the language is very readable overall, but I find the whitespace-based syntax less readable and practical compared to just using curly braces. Editors can color matching pairs, let me jump between the start and end of a scope, select all lines between two braces, etc., and all that just doesn't work in Python. I also find it more difficult to see where a scope ends. It has upsides too, and it's not a huge deal, but I often wonder why people prefer this so much.

Imho the reason for Python's success is mainly how easy and quick it is to learn, and how little time it takes to get something running. That alone removes so much frustration and perceived effort.


Try turning on indentation guides in your editor, they help to visualize scope.

If you are moving around large blocks often, well maybe they are too big? There are linters that can help with that kind of thing.


All IMO, but in the beginning Python took Perl market share for scripting because it was simply less "esoteric". 10 Python programmers wrote similar programs to solve the same task, whereas 10 Perl programmers would be more creative. All while coming with "batteries included", which helped to get stuff done quickly. And Python has always played well with C, so you could optimize if need be.

At the same time, Python was a pretty early adopter for the server backend (and stuff like Zope existed) and also had great interop with databases, but PHP was a strong contender/#1. I'd say Rails' timing was off by a bit, or Ruby would have gotten more momentum (at that time there were already many things in Python land to compete, like Django etc.). Meanwhile JS was still pretty kludgy to use and required a lot of hoops just to set up an environment, and things like Node came a bit late as well, or JS would have taken more "web backend market share" from Python immediately. There was a window when the Python 2->3 transition slowed things down (IMO), but overall it went slowly but smoothly for such a big transition.

Python was also an early player in other niches like security (both scripting and tooling).

And when ML took off (grossly simplified), once again Python was there as an early player, with good tooling and support for most things and with excellent ties to scientific computing.

It also got good support early from some key players, most notably Google.

So basically, Python was always there when it mattered and was "good enough", came with "batteries included", and the Python community has always been great and willing to quickly get working tooling and libraries into all relevant areas. Importantly, Python has always been beginner-friendly enough to attract a decent chunk of non-programmer specialists from other fields who used it to dive into programming (scientific computing in general, but biology comes to mind as a typical use case).


I've been writing code since the mid-90's, probably in a dozen languages or so over that timespan.

In my experience, Python is "easy" -- as in, you can quickly achieve what you want. It doesn't get in your way with opinions about types or data structures. You can rapidly create horrific kludges that produce the results you want.

Not sure I'd say it's a programmer's language. It's like javascript that way, but without the absurdity and the footguns. It's overall intuitive, and has all kinds of handy shortcuts.

It's interpreted, which means you can futz around and prototype and get things working, hacking at it until it clicks. There are no hang-yourself-in-frustration build chains, nor even the need for reliable and responsible tooling - no overhead at all.

That's the language itself - but it's embedded in an ecosystem of libraries and tools favoured by the data processing / data science community. Given the mad gold rush in the recent years in these domains, it's no puzzle why python is one of the top choices these days.


I believe what you describe is basically a preference for a language you already know.

I'm quite proficient in JavaScript and could repeat everything you say about ... JavaScript. OTOH I had to do things in Python a bunch of times, and it was always very frustrating.


The JavaScript ecosystem is objectively smaller. JavaScript doesn’t have a foothold in data science and ETL, nor deep learning, nor DevOps. There are no decent ORMs in JS, and Python has several. JS is poor for numerical and scientific programming.


Python has smaller footprint in webdev backend and practically non-existent on the frontend :-D

But I think this thread is about being comfortable with the language itself rather than an ecosystem.


Prisma is better than any Python ORM in my opinion.


Judging by the very contentious opinions on it, I'd rather stick with ORMs that have been chugging along for over a decade without fail.


According to David Beazley, Python was lucky to already be around by the time scientific computing folks were in need of dynamic languages as wrappers, to help with increasingly complex hardware and projects written in lower-level languages; the only other viable alternatives back then were Perl and Tcl. Watch this talk: https://youtu.be/4RSht_aV7AU?si=d3rJvYJPFGcRy6BK

Then it just snowballed since that time.


Coming from Perl, I love Ruby (for non-Rails stuff), but end up using Python because:

- The documentation in the Python ecosystem is generally better.

- There's more Python libraries, and they're generally better maintained and better documented.

- Python has a better interface to C.

- Python is more popular and generally more supported by third parties.

- Django/Flask

- There is comparatively little Ruby activity outside of Rails and a few cybersecurity projects.

I really do not like Python. I think it's really inelegant. But it's a better tool for getting my tasks done, due to the above reasons.


I also had a hard time choosing between Python and Ruby for my go-to scripting language. But given that the Python community is more active outside of webdev, I chose to stick with Python and get more professional with it.


When I first encountered Python, it was as an alternative to Perl.

Given the choice between Perl and Python, I think it's clear why the latter won out.


> When I first encountered Python, it was as an alternative to Perl.

Me too

I am amazed that Python won out against Perl

Perl is much easier to use. It does not have the incredibly irritating treatment of whitespace.

Perl was a disrupter, whereas Python was "computer sciencey". A false dichotomy but it really mattered to people


Perl being easier to use is a joke... Ruby suffers the same issue as Perl does - you can do the same thing a million different ways, and that's too many.


a million different ways, but all with liberal use of human-unfriendly symbols like dollar signs, at signs, pound signs, etc.


Some people (me included) LOVED the whitespace-as-a-feature. I had swum through too many scripts where everything (regardless of nesting level) was on the first character of a line, or otherwise spaced randomly, where my first step was to properly indent just to understand what the code was doing. A language that enforced indentation was a godsend.


Nowadays in languages like C/C++/Go/etc we typically enforce the use of an autoformatter to convert source code to a canonical style. It's also great for removing all discussion of such formatting from code reviews.


Goes to show there is no pleasing everybody


“Masterstroke” was the term I used when I discovered it.


I loved Perl. I _loved_ it. It was so much fun. It felt like vim in how you could guess weird corners of the language.

But every reason that made it fun is also an actual reason why it's a nightmare unless you are very strict in how it's used. Strict on things like whitespace.


I was about to say, as much as I enjoy python.

Perl is more effective than Python. You can do some pretty amazing things in Perl in a few lines. You can also write amazingly readable scripts in Perl.

But... if you aren't careful how you write Perl, it's a very deep pit to fall into.

Python just forces you to write code that's better for the team.


Whitespace strictness is the biggest downside of python.


The use of whitespace in Python was controversial 20+ years ago. These days it seems like no big deal.

The only annoyance is when copy-and-pasting code from one place to another, the indentation can get screwed up, which is a small annoyance.

But other than that, this seems like an argument that was settled 20 years ago.


In my experience, Python has way bigger warts than something that's just automatically handled by my IDE.


It’s the biggest upside.


A bizarre opinion, to be sure. Python is vastly more readable than Perl and is far easier to write as well; hence so many educators in schools reaching for it.


I can understand a love of Perl by folks who love a good puzzle, or who are interested in "code golf" (trying to get something done in the fewest number of characters).

But those are not great traits for a general-purpose programming language. I agree with the other response: "Python just forces you to write code that's better for the team."


Perl is a write-only language, whereas Python almost reads like pseudocode. They're on absolutely opposite ends of the readability spectrum.


The real reason Python won, in my opinion, is that it has been a default install on most Linux distros for decades. A ton of the groundbreaking ML research (especially in the 2000s and 2010s) was done in government labs and DoD settings where it is extremely difficult to get external software approved/installed, yet there was Python, sitting there already installed on every single DoD Red Hat machine. So a lot of research code ended up getting written in Python as a prototype and then improved with C++ in the speed-sensitive bits, and before you know it, Python = ML.

Being a default install also has a lot of barrier-to-entry advantages in other settings, particularly in academia where there are similar (but usually not nearly as strict) bureaucratic restrictions on installing things, etc.

source: was a DoD computer science researcher focusing on ML for several years and witnessed people reach for python because they were still waiting for months, sometimes years for whatever software package or language to be installed / approved by security/IT and had no other dev tools


I always thought of Ruby as Perl++ more than anything. It's relatively easy to write, not always so easy to look at and refactor later. As to why Python has won, I think you nailed it, the library integrations and academic space has made it popular and useful there.

Ruby's popularity for what it was, was heavily tied to Rails. As that space progressed, a lot of people came to dislike it as much as anything, and maybe more. There are a lot of ways it didn't work so great and those warts made more and more avoid it in future projects.

That's my $.02. I started with JS as well, but in '96, and went through classic ASP (JScript and VBScript), VB6, .NET (and C#), then into/through Node.js, etc. I also dabbled in PHP, Ruby and a handful of other tech along the way. Python feels like a massive gap in my knowledge base at times; I have a few books to read, but not much in terms of opportunity/need. And the time I'd spend on Python, I feel like I'd rather spend leveling up in Rust.


Python had significant traction back in the late '90s (~1.4/1.5). It was perl vs python vs TCL (possibly vs Guile) back then.

Cross-platform support and the ease of integrating with other codebases (both extending and embedding) are two parts of it. Perl's transition to Perl 6 seemed to hurt/fragment that community more than Python 3000 did Python's. Tcl seemed to slowly die except for a couple of niche applications - possibly expedited by Sun's slow demise. Guile never really ran well on non-*nix systems.

Fundamentally, Python had more of a C/Pascal flavor than Ruby's Smalltalk flavor. That made it easy to teach and adopt. A couple of big-name universities moving from Scheme to Python for intro CS helped too.


Because I loved C, liked C++, but hated writing complex programs in either language (circa 1995), and somebody said "try Python, it's object oriented and easy to use" - and I still use it today as my preferred language for a wide range of tasks. It just feels like a macro language on top of C/C++ in some ways. Numeric Python, then numarray, then NumPy transformed my way of working with multidimensional data.

Perl: could not stand the syntax or the language philosophy

C++: too many footguns

TCL: a bit primitive for what I wanted to do

Java: ugh. just ugh.

Go: not targeted at my use cases


Python beat Ruby because it's boring technology, whereas Ruby prides itself on being as clever as possible. Take a large application written with Django vs the equivalent Rails app. Even if the Django app was written by mediocre engineers and their Rubyist counterparts were very smart, it's probably much easier to spot the footguns in the Django app. The cognitive load of the average Rails monolith is almost unbearable to non Rails experts.


Ruby was never that great of a language. Rails was much better than any alternative, though, and it brought Ruby along with it. Basically every company using Ruby chose it because they wanted to use Rails.

The main thing that has led to the decline of Rails is the rise of no-HTML-just-an-API backends. Your web frontend development stack is more likely nowadays to be, first you pick between React or a similar framework, and then the backend just sends API data and doesn't mess with HTML. This makes for better websites, and it makes joint web and mobile development easier too, because you're going to have to build that API anyway for your mobile app.


I started programming Python in '98. It won because, before that, Perl was king. Have you seen Perl?

Python made a great decision: it was the anti-Perl. One way to do things, keep the syntax easy and simple, force whitespace indents. Before Python got really "pythonic", it read like pseudocode. As someone who explored Ruby for a bit, Ruby has too much magic. Magic is cool when you're the one casting the spell, but for your audience, it's annoying if their job is to figure it out.

It offered OOP, but like C++, it didn't force it on you. If you came from a C background, you could write C-like code. If you came from an objects/structs background, you could do OOP.

It then offered a great standard library. It had a REPL. DO NOT UNDERESTIMATE THE REPL. Not just a basic REPL, but an interactive one. dir()? __doc__? The power of man pages built into the REPL allowed one to explore an unknown library.
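
A typical exploratory session (any module works; json is just an example):

  >>> import json
  >>> dir(json)                  # what does this module expose?
  >>> print(json.loads.__doc__)  # read the docstring without leaving the REPL
  >>> help(json.loads)           # full man-page-style help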

Say what you want, but the FFI was well thought out and embraced. This made it possible for lots of C libraries to be made available in Python. PyQt, wxPython, Allegro, OpenGL - all these paved the way for NumPy, TensorFlow, PyTorch. Your favorite graphics library, sound library and many others were wrapped in Python. With computers being fast enough, most people didn't have to write C anymore.

Python won because of many little correct decisions along the way. The Python 2 to Python 3 drama was not a nightmare like Perl 5 -> Perl 6 or VB to VB.NET. Many good little decisions, with a bit of luck, help.


I have come around to disliking forced whitespace. I don't think non-visible characters should be syntactically important; spaces end up overloaded, both separating tokens within lines and defining blocks.

With linters being what they are, a traditional semicolon makes things a lot cleaner IMO.


Semicolons, braces, and indentation are redundant. That can be useful, but it’s the opposite of clean.


The REPL is huge and makes it so much easier to learn to program by trying out little snippets of code to see what they do. You can trial and error your way to a working program pretty quickly. Only Lisp has a better REPL.


Server-side JavaScript split the vote for fancy web development. Ruby never really diversified meaningfully out of webdev, but Python has always had other major constituencies. There are powerful arguments for doing server work in JavaScript (single language), and not so much in Ruby, so in the late 2010s Python became the most pragmatic option for non-JavaScript server-side web dev.

Another way to say the same thing: Ruby's fate is tied to Rails, and Rails put down a lot of bets that haven't been clear wins, most especially about the relationship between frontend and backend webdev.

I don't think Ruby's going anywhere, though I'd probably default to Flask for a new project. There is a lot of Rails out there.


> Python’s math libraries, numpy and pandas make it appealing to academics; Python is simpler and possibly easier to learn; Rails was so popular that it was synonymous with Ruby

Yeah, that’s basically why.

Back in its heyday, Ruby had two basic niches. The bigger one was Rails. Rails is still alive today, but after the move towards richer frontends a lot of the hype, and consequently the developers, followed that trend into the JS world. It’s obviously possible to have rich React frontends and Rails backends if you want to, but that isn’t quite what you got with “omakase” Rails.

The other niche was in devops tooling, e.g. Chef and Puppet. But that space moved towards containerization and all of that tooling, more or less, got written in Go.


The recent DevOps tooling is also not conducive to using code at all - it's YAML all the way down. Nobody is writing Go for everyday DevOps now, whereas for a bit they were writing Ruby and Python.


Python kind of had three things that really mattered: a culture of "one correct way", an invisible but conservative type system, and a style guide built into the language and dictated by the language designers, in the form of significant whitespace and PEP-8.

Ruby could have been a contender but still had some of the "there is more than one way to do it" left over from Perl (Ruby was kind of a pragmatic alternative to the "real soon now" of Perl 6), and it's a bit more dynamic than Python in how it handles variables internally.

And the field of academic data crunching was hurting badly from the fact that the only two viable options for most entry-level researchers were Excel/VisualBasic or Perl, so everyone wanted some level of structure, which the Python community led by Guido provided once people began to settle on numpy and pandas.

And Python's popularity definitely grew from the fact that it's what non-programmers learned as an alternative to Excel/SPSS/SAS/Matlab (all of which are expensive proprietary tools) in classes focused on analyzing data.


This may be chicken-and-egg, but Google standardised on Python (along with Java and C++) very early on for their internal work, and sponsored a lot of the open source infrastructure. So even though Ruby attracted a lot of attention for a big chunk of time, there was no diminishing in the baseline popularity and functioning of the Python ecosystem.


Here's my take.

Ruby essentially stole the thunder from Perl and PHP as the language for back-end development with Rails. But with Django, Python pretty quickly had a more or less equivalent web framework.

Here's my recounting of history:

- For about a decade, Rails remained one step ahead on the cutting edge of web development. Thought leaders like 37 Signals / Basecamp constantly rolled out innovations through Rails, appealing to the most forward-thinking web devs. But Python was never far behind.

- I would also argue that Python is an easier to understand language. Ruby encouraged the development of DSLs with its second-class functions / block syntax, but Python tried hard to keep its core language simple and emphasized explicitness.

- Meanwhile, the simplicity of Python's syntax made it very appealing for developing data tooling, competing with a bunch of expensive proprietary tools, like MATLAB. This exploded Python's userbase.

During this period, I would say Ruby was the hotter language for startups, but Python had a bigger userbase, as the more middle-brow option.

Starting around 2015, both languages started to see other languages encroach on their web dev moat. Node was better suited for simple microservice APIs. People got into typed programming for its perceived advantages. Compiled languages took on a lot of the high-performance systems.


I started programming around 2007. I actually wanted to get into Common Lisp, because I was heavily influenced by Paul Graham, HN and early Reddit. I did try a couple of web frameworks with Common Lisp, but didn't get very far.

Ruby on Rails was very big at the time, but I ended up choosing Python for a couple reasons. Python was a bit older and already had a reputation as a scripting language for system administration. It had more libraries, like Numpy, and it seemed like you could do more things with Python. You could script, write desktop apps, use it for data science, while Ruby was pretty much just for web dev.

Even back then there was a huge ecosystem of Python packages, and it seemed like for anything you wanted to do with Python, there was already a decent library available. So it seemed like you had more options learning Python. Django was also just coming out at the time, so you could do web dev with Python too. Then Reddit rewrote their site and switched from Common Lisp to Python, so that seemed like a good endorsement too.

Python became as big as it has because Machine Learning / AI exploded and Python was there to benefit from that. Numpy / Pandas / Jupyter Notebooks etc. gave everyone who wanted to learn ML/AI FREE tools that were as good or better than Matlab / Mathematica / SPSS or other software that cost thousands of dollars. Grad students are poor. Maybe their school has a Matlab license they can use, but they probably can't install it on their personal laptop, so they used Python instead, wrote all of the influential papers about ML using Python, then anyone else who wanted to learn from that installed Python and so it goes.


To me Ruby syntax looks too magic. Python is much simpler to understand: a person who doesn't know Python but knows another language such as C or Java can write programs in it in like 10 minutes. The only "strange" thing is the use of indentation for dividing blocks, but that's something you easily learn in 2 minutes.

Another thing that helped Python is the fact that you can do fairly low-level operations in a very high-level language. Python gives you os-level APIs/system calls that you would otherwise need a language like C to access. This makes Python great for system programming: indeed, a lot of Linux system tools where performance is not critical are written in Python. For this reason you can assume any Linux/UNIX system has a Python interpreter, and thus a lot of scripts and tools started to be written in Python.
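
A small sketch of that kind of os-level access, straight from the standard library (Unix-only calls; /etc/hosts assumed to exist):

    import os

    st = os.stat("/etc/hosts")          # the stat(2) system call
    print(st.st_size, oct(st.st_mode))

    pid = os.fork()                     # fork(2)
    if pid == 0:                        # child process
        os.execv("/bin/echo", ["echo", "hello from the child"])  # execve(2)
    else:
        os.waitpid(pid, 0)              # wait for the child to finish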

It's also quite easy to call C code from Python code, and vice versa (for this reason Python is used as a scripting/plugin engine in programs written in other languages). This is the reason a lot of scientific calculation tools are written in Python: they have the algorithms implemented in C for maximum performance, with a Python API on top so they can be used easily.


I always used the word "magic" to describe Ruby syntax. It was my biggest complaint with it too. RoR took that even further by adding even more magic through a framework, where simple functions did HUGE amounts of hidden work.

I will say, my favorite feature of Ruby was the question-mark convention in method names.

For those that don't know, a common convention in Ruby was to put a question mark at the end of any method that returns a boolean.

So for example you could write:

if product.active? ... DO THIS THING

It was fun and I enjoyed it. Behind the scenes it was just a method, but the question mark at the end was always fun to me from a readability standpoint.

However, it's really not all that different from the pythonic way:

if product.is_active(): ... DO THIS THING

Something about the question mark felt fun and magical.
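
For comparison, the closest idiomatic Python analogue is probably a boolean property, which at least drops the parentheses (Product here is a hypothetical class for illustration):

    class Product:
        def __init__(self, active):
            self._active = active

        @property
        def is_active(self):
            # Reads like an attribute, no () needed -- but still no '?'.
            return self._active

    product = Product(active=True)
    if product.is_active:
        print("DO THIS THING")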


It's the "just works" part of magic that's the gotcha. I lead a team with a multi-EiB Hadoop cluster, and "magically" listing status or renaming a directory can only be done with more and more caveats as you scale.


Rails was really good at building a very specific type of website, and that kept Ruby popular for a while. But SPAs resulted in back-end web apps that mostly just shuffled JSON around, something you don't need Rails, with its page-oriented worldview, for (though Rails does a fine job of it); that responsibility got delegated to React.

Meanwhile, Python ended up being the default programming language taught to domain experts in a ton of different domains. Having your engineers write the same programming language as your domain experts turns out to be really useful for business. Using NodeJS as your backend, if you consider the web your domain, is an excellent choice too, and also eats into Rails' market share.

I have seen some shops with modern rails deployments do truly awesome stuff, but it's hard to beat being able to take a script an analyst has written, hit it with a linter, and throw it into your production application.

Also, for what it's worth, the Python 2/3 debacle was a much bigger deal back at the height of Rails' popularity. I was genuinely worried that the schism would continue to haunt the community; that someone would fork 2.x and we'd have two competing Pythons. Thankfully, that didn't happen.


Some notable reasons:

  - Ruby was 4 years younger than Python on the day it was introduced and cites Python as one of its influences.  
  - It's a fine language, but besides Rails it wasn't adopted for other uses like Python was.   
  - In the UK at least, Django is way more prevalent than Rails for web dev. And if Django doesn't float your boat, there are Flask, FastAPI and others.  
  - Python's 3rd-party libs are incredibly wide and many.  
  - It was boosted by Google early on, which gave it a stamp of authority.  
  - Schools in the UK teach it to secondary students (high school).  
  - Devices like the Raspberry Pi came out with full MicroPython support.  
  - There is MicroPython.   
  - There is Python in Python (PyPy).  
  - There is full Windows support.  
  - It's the AI/ML/data crowd's darling.  
  - It's fast enough for most things and getting faster still.
As someone else mentioned, "win" is not the right term here. Win what? A T-shirt? A toaster oven? It's just a matter of adoption in specific geographical areas.

(Edited formatting)


To offer a slightly different perspective than just "ecosystem", I learned Ruby a few years after Python, and found it kind of annoying, in that many Ruby programmers tend to invent their own DSLs and little hacks because the language is so flexible. I know many programmers love Ruby for class_eval, but it really makes codebases hard to read, and code is read more than it is written.

Python has honestly had some opportunities to die out (2->3 transition, type hinting missteps, incredibly slow, etc.) but survived because writing extensions was always easy and it was taught a lot in universities.


Many comments have mentioned the ecosystem, but I've thought a big reason behind Python's success is that it's also a well-designed language in terms of readability and lack of surprise. My background was in Perl and shell scripting before finding Python, and the absolutely beautiful form of the language (whitespace is significant in more ways than one!) and its utter insistence that There's One And Only One Obvious Way To Do It (more or less) made programming much more ergonomic compared to the other similar scripting languages at the time.


> There's One And Only One Obvious Way To Do It

From the external POV, that's one of my pet peeves with Python, and it also soaks into other aspects of the community. The Python community seems almost religious and sort of self-righteous about how Python was designed and how things should be done. Other language communities are more open to (self-)criticism.


having used both ruby and python extensively, my guess is that what gave python an edge is better support for namespacing and stricter adherence to local scoping. it is far easier to read a largish python project and easily see where every symbol the code uses is defined; ruby makes it a lot harder because by default everything is tossed into one big top-level namespace and imported implicitly.

concrete example:

python:

    foo.py:
      def f():
        ...

    bar.py:
      import foo
      x = foo.f()  # very clear where f comes from, just look up foo.py
ruby:

    foo.rb:
      def f
        ...
      end

    bar.rb:
      require 'foo'
      x = f  # grep through the code if you want to know what f is


Google used Python, so a LOT of folks figured that decision was the best one because "Google used it." Google also chose to employ Python's BDFL Guido van Rossum at a key time (2005), which pushed at least the perception of the language forward. Google's support is likely a key factor in its use in data science, but I don't know the dates/timeline for that as well.

Additionally, during that time the Ruby core team seemed less focused on performance and more focused on ergonomics. I think the height of this was refinements, which, to my naive understanding, were (are?) a real PITA to support/implement for the non-MRI implementations of Ruby. The complexity of the language probably turned off folks who wanted a dynamic language but wanted it to be fast (which is, I imagine, how all the different implementations of Python came about).


We are in a verrrry similar boat! Always been into fiddling with computers but got my first job in tech around 2013. JavaScript and PHP. Only the past couple years have I messed around with Ruby + RoR and I love it.

Every time I have to touch Python to mess around with anything going on in the ML space, it just feels so clunky, in ways that Ruby feels like it wouldn’t have trouble with.

I think r_thanbapillai’s top level comment is spot on. Ruby became a “web” language because of the singular focus on Rails (at least in the west) which hampered its growth in other areas. It’s not so much the language that makes a community but the libraries people are inspired to build with it. Python specialized in the “right” thing for its moment in the sun.

I do think we're slowly seeing Julia and other more purpose-built tools eating Python's numpy lunch.


Adding my 2 cents:

Python was not relatively new in the early 00's, but its popularity really started taking off around that time. It is older than Java and Javascript.

Ruby became big primarily because of Ruby on Rails. Python was already big before Django came around. In fact, one of the reasons Django exists is "Python is cool, and we need a good web framework for it".

Why was it already popular? My guesses:

Very easy for new programmers, but enabled you to build fairly powerful stuff with it (unlike, say, BASIC). Eric Raymond, around 2000, wrote an article on why he loved Python, and the main takeaway was that within a day or two of encountering it, he was already building useful scripts with it.

It is fairly well designed (well, at this point it may be overdesigned). A common remark by people learning Python was "I didn't know how to do X, so I guessed and it worked!"

Has some functional programming aspects that appealed to people (e.g. lambdas, map, reduce, filter, etc). Has laziness if needed.
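
For example (reduce lives in functools as of Python 3, and generators supply the laziness):

    from functools import reduce

    nums = [1, 2, 3, 4, 5]
    total = reduce(lambda a, b: a + b, nums)          # 15
    evens = list(filter(lambda n: n % 2 == 0, nums))  # [2, 4]
    squares = map(lambda n: n * n, nums)              # a lazy iterator in Python 3

    lazy = (n * n for n in range(10**12))  # generator: nothing computed yet
    print(next(lazy), total, evens)        # 0 15 [2, 4]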

But the real reason: It was not Perl, and was a viable alternative to it. It explicitly was against Perl's philosophy of "There's more than one way to do it."

Batteries included: every standard Python distribution came with what was then a really good set of libraries. Sure, Perl had CPAN, which was more extensive, but consider that not everyone had high-speed Internet, and you still had to do the labor of identifying which CPAN package you wanted. For many Python developers, the standard library was all they needed.

Then, with NumPy, it became a good alternative to MATLAB, and it started to be heavily used in scientific computation from 2006 onwards. With pandas, people started abandoning R for it, even though R is, and continues to be, a better statistics language. With NumPy and pandas available, machine learning easily followed in the early 2010s.

The benefit of Python over stuff like R and MATLAB (and possibly JS) is that you have access to all the other libraries. In those days, if I wanted to write some code in MATLAB, but have it also crawl some web pages, and do some scripting, it was a pain. You'd write the MATLAB code to deal with the computation, and separate scripts for the rest. Now I can do it all in one language/program.

And yes, I'll say it: Whitespace. The lack of braces and semicolons, and the semi-rigidity of whitespace just made it a lot easier to read.


I'll just add that Python provides a pretty good transition for MATLAB people doing scientific programming: it keeps all the simplicity for scripting, has a good range of packages, etc. Another person brought this up too.

For the generation who used matlab in grad school and then abandoned engineering and science to become software engineers, it provided a good soft landing to transfer skills.


1) Because it was a better Perl.

2) Single-CPU speed was doubling almost every 18 months. Wait a few years, stick the program on a new Pentium III or whatever, and you got a nice speed boost.

3) Batteries included. It was a huge quality-of-life improvement. Want to traverse directories, parse ini files, send emails, do socket programming, even edit audio files -- it was all there (see the sketch after this list).

4) REPL. Huge improvement over the other popular languages at the time, especially once IPython came about.

5) No curly braces. People were sick of Java and C++'s verbose syntax. Those languages were seen as old and crusty by that time, so people wanted something new. "Why do I need 30 lines of code to open and read a file?"
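
On point 3, a sketch of how much came in the box, all standard library (settings.ini is a hypothetical file; the email is only built here, not sent):

    import configparser
    import os
    from email.message import EmailMessage

    # Traverse directories: no third-party package required.
    for root, dirs, files in os.walk("."):
        print(root, len(files))

    # Parse an ini file.
    cfg = configparser.ConfigParser()
    cfg.read("settings.ini")

    # Build an email, ready to hand to smtplib.
    msg = EmailMessage()
    msg["Subject"] = "batteries included"
    msg.set_content("all standard library")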


Around 2010, I recall that a lot of computer science departments in the US made a switch from using Java to using Python as the first language to teach programming concepts. So a large number of students were exposed that way.

Also, a number of top-tier websites (YouTube, Instagram, Dropbox, Netflix) made Python a central part of their stacks.


I see the flip as happening when MIT switched from Lisp to Python for its introductory courses. That happened in the '00s. They had to choose between Java and some other languages, and chose Python. Gradually everyone else followed. New grads came out knowing Python; they started developing with it, and it took over from other languages.


I have never encountered a language as easy to pick up and do cool stuff with as Ruby. Even after years of working with it, I still find Python to be a bit of a mess. I think there was an article years ago about how certain people liking a product means it is going to die. I'm pretty sure that is me for technology.


I've heard so many people say that Ruby is really easy to pick up and work with and for some reason I've had the exact opposite experience. I have tried to learn Ruby at least three or four times and I bounce off it every time. There are a bunch of other languages that I've learned and worked with effectively, but for some reason my brain just refuses to grok Ruby.


That's word for word my experience with Go. I wonder if people are different enough that programming languages cannot be one-size-fits-all?


Experience between people differs. Problems they want to focus on differ. The environment changes.

Maybe in a hundred years or so software development will somewhat settle, but just look at something as trivial as a hammer... there are so many different kinds, and some people have their favorite brand.


That's funny, I tried Go for Advent of Code last year and got through the first 10 days or so without any real difficulty, having never written a line of it before.

Just as there are different spoken/written languages and people find different ones easier or harder to learn based on a huge number of variables, I believe the same is true of programming languages. Sometimes it's the syntax that makes a language difficult, sometimes it's a different paradigm, sometimes it's an unfamiliar memory model or type system -- there are lots of things that can make your brain throw up a block when trying to dive into a new language.



> If products you like keep getting discontinued, get used to it.

Well, that's me. I used Windows Phone 8, my favorite phone of all time was the BlackBerry Passport, and my first iPhone is likely the last iPhone Mini ever made.

At least the Steam Deck seems to go against that trend...I hope.


It is off topic for the Python/Ruby discussion, but I wonder if there is a way to discover whether I am one of these people.


It seems simple enough:

a) a tendency to buy oddball products

b) track record of finding that products you use are discontinued

That is all there is to it. There is no psychometric test or theory.


I started using reddit in 2006, so I remember a lot of the Ruby vs. Python battle from around that time. Here are a few points that were often raised back then (and which I haven't seen raised here):

- The common wisdom back then was that Python was attracting mostly C programmers who were looking for a faster way to write C and to interface with it, while Ruby was attracting mostly Java programmers who were tired of writing FactoryFactoryFactory classes. As a result, the Python community had a lot more C programmers who were able to work on the lower-level parts of Python (such as the VM and native libraries, like Numpy).

- The original Ruby interpreter (MRI) was very, very slow. It was replaced by YARV when Ruby 1.9 was released. YARV was a bytecode interpreter, while MRI, if I recall correctly, didn't even use bytecode. It just did the parsing while it was executing code. Before the switch to YARV, I remember regularly seeing Ruby at the bottom of the programming language benchmarks page, and it took something like 100x-150x longer than C for a task. Python was about 5x as fast as Ruby back then, IIRC.

- Aside from speed, the Ruby VM had a lot of other problems. The Python VM was rock-solid in comparison. See the links below for more info.

- The Ruby community back then had a reputation for being... unruly, let's say. Some characters there. It also suffered from that sort of chronic beginner syndrome that appears in isolated programming communities. I remember one guy in particular who was fond of justifying Ruby's weaknesses in odd ways. (For example, he said that Ruby doesn't need a debugger because it makes coding so easy that you'll never need one! In reality, Ruby didn't have a debugger because no one felt like making one.)

Here is a blog post talking about issues with the Ruby VM: https://web.archive.org/web/20100606205042/http://cbcg.net/2... (I never forgot that title, haha)

and the discussion on reddit: https://old.reddit.com/r/programming/comments/1kg8e/python_u...


Funny story:

When I took an "intro to programming for scientists" type course in early 2010s, it was taught in Ruby. This was at the height of ROR's popularity, and the instructor must have been curious about Ruby.

So Ruby became my first language, and for a year after taking the course I did all my scripting in Ruby. There was nobody around me (a lab full of C and Fortran coders) to tell me otherwise. Eventually, I switched to... MATLAB, and a couple years later to Python.

And in 2023, I don't remember a lick of Ruby. But I remember it was fun to code in it.


Reading the comments makes me wonder where the perception that Ruby is complex or hard to learn comes from. Let's talk about just the language, without external libraries, package management, or the context of specific software.

I'm using both Ruby and Python for tooling (so there is neither a web nor an ML context, and no metaprogramming). When it comes to Ruby, I learn one thing and safely extrapolate, i.e. you don't need to learn all of the language to be able to use it. With Python I've had to go back to the docs many times to understand how a particular thing works, and I keep finding ways to shoot myself in the foot. This includes very basic stuff like variable scoping (global, class, instance or local) or non-obvious type conversion: in Ruby only nil and false are treated as false in conditional expressions, while in Python this includes the empty string, empty list, empty dictionary and so on.
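
A quick illustration of the Python side of that truthiness footgun:

    for value in ["", [], {}, 0, None, "0", [0]]:
        print(repr(value), "->", bool(value))
    # '', [], {}, 0 and None are all falsy in Python;
    # '0' and [0] are truthy (non-empty) -- in Ruby, only nil and false are falsy.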

When it comes to readability in Ruby it is natural to do data processing by "pipelining", e.g. you don't need to do f1(f2(f3(f4(data)