I have recently been learning Python after primarily coding in Clojure and OCaml for the last 36 months.
It has been a pretty frustrating experience. Lots of the tools from functional languages are there, but Python as a language is extremely inconsistent, and all these little quirks take up a significant chunk of cognitive overhead. Python also liberally adds different syntax in places where it doesn't seem to add much value (e.g. lambdas cannot destructure tuples and don't need to use return). Python has really shown me the value of keeping a simple syntax and having value semantics everywhere. Mutation just isn't a great way to write code outside of leetcode.
I fully agree with you and have expressed this sentiment many times. I use Racket and F# in my spare time, basically sister languages to what you use (!), and I have the same opinions. Knowing these other more structured, consistent, and principled languages makes Python a frustrating and difficult experience. Whenever I program in Python, never by choice, it's an exercise in unlearning sanity.
Knowing these other languages and with others like them out there (F#, Racket, Clojure, OCaml, Elixir), I have no practical or intellectual need for Python.
My favorite language has been Clojure for a long time, but given Python's popularity and widespread use I often wonder whether there is just some big underlying principle that would cause Python's appeal and superiority to become clear, if only I could see it. Comments like yours and GP's help reassure me that there's not just some fundamental thing about Python that I'm missing. (And don't even get me started on Python's concurrency situation!)
The appeal is the libraries - numpy, scipy, matplotlib, tensorflow, qiskit - their quality and the ease of using them. Basically Python became what Tcl dreamt of becoming (ok, don't kill me).
Oooo, neat! Thanks for pointing this out, I'll have to look into it. Many of my struggles have also revolved around the ecosystem (pulling in and/or keeping up to date with modules that other teams have written, etc.) so I'm especially interested to see how this deals with that.
There's also Hy (https://github.com/hylang/hy) if you just want Python with Clojure-ish syntax. At the very least it frees you from indentation and single-expression lambda restrictions.
Python is easy to get started with and has syntax that's nice to look at, and easy/nice almost always wins over better. So now the issue is that many have to keep using Python because of the ecosystem, not because they find the language good.
> The top 3 programming languages - JS, Python & Java - are all pretty uninspiring.
And Clojure beautifully solves many pain points of Java and JavaScript, and even C#. I haven't tried the clj-python mentioned here, but I hope that someday soon, running Clojure code interoperably with Python in production becomes a reality.
The hosted nature of the language is a brilliant idea. Clojure eliminates many small annoyances I hate in other languages - syntactic, semantic, and operational.
I wish more programmers would give Clojure a heartfelt try instead of whining that "Lisp is unreadable" and "the parentheses are awful".
The "ecosystems" aren't so tied to languages anymore. There are lots of nice languages targeting JS, JVM and .NET that let you use libraries and framework made for those platforms.
I feel the same way. If I need performance, I'm going to use something like C, C++ or Rust. If I don't care about performance, I'm going to use something as high level as possible, like Haskell (and probably still end up getting performance better than Python, and probably only one order of magnitude away from C).
Python seems to be in this weird middle area where it's not performant, but it's still more low level[0] than languages like Haskell and OCaml. If you don't need performance, you should go as high level as practically possible!
[0] - By low level, I mean things like having to manually write loops, deal with state, etc. More succinctly: "A programming language is low level when its programs require attention to the irrelevant."
Functions (and I assume lambdas) could destructure tuples in Python 2.x. I started my first serious Python project in 2.something before discovering it was nearly EOL, and learning that all of my tuple management was broken under version 3 was quite demoralizing.
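For anyone who never saw it, a minimal sketch of the difference (the distance function here is made up for illustration; the Python 2 form won't even parse under Python 3):

    # Python 2 only: tuple parameters in the signature (removed by PEP 3113).
    def distance((x1, y1), (x2, y2)):
        return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

    # Python 3: the same function must unpack manually in the body.
    def distance(p1, p2):
        (x1, y1), (x2, y2) = p1, p2
        return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5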
I find the mutability of values and object orientation of 3rd party libraries to be among the most alienating aspects of Python, but overall it’s too useful for me to ignore, so I live with it.
I just don't find layering functional capabilities onto an OO language to be nearly as useful as a true FP solution. As you said, too much cognitive overhead.
I'm not sure why they removed the ability to destructure sequences in args. It bugs me sometimes as well. The rationale is written up in PEP 3113 ("Removal of Tuple Parameter Unpacking"), and there's no doubt an even longer email thread behind it.
That's a decent argument. It's more typing (in both senses!) to use a specific type as the parameter instead of unpacking in the signature, but it's just as or more readable that way.
Yeah, I was so bummed when they did that. I had a whole library of functions for a stack-based interpreter that destructured the stack in the function sig, and it was sooooo coooool...
I learned it in the '00s in high school, and it was a very nice small scripting language: you could just write dumb code that worked, and you wrote C where you needed performance.
For some reason people have pushed it everywhere, and it has lost the spark it used to have.
Lisp and functional programming in the Lisp style (I prefer Racket) with multi-threading could be the way of the future, but instead we keep getting stuck in general-purpose languages that are, at best, great second-best languages.
Using Lisp to create DSLs is so powerful but underused.
That is incredibly true. For some reason we all have an obsession with using one language for everything (cough Python cough), rather than different, better-suited ones for each task. I really hope Lisp makes a comeback.
> The two main drawbacks of Python from my point of view are (1) there is very little compile-time error analysis and type declaration, even less than Lisp, and (2) execution time is much slower than Lisp, often by a factor of 10 (sometimes by 100 and sometimes by 1).
Both have been improved on - in particular (1), with the `typing` module, type annotations, mypy, and so on. With (2), Python's speed has increased somewhat since, but then again, you won't be doing large-scale math without numpy, and for non-vectorizable numeric computations you'd probably use something like the JIT-compiled numba or the like.
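As a minimal sketch of what (1) looks like in practice (function and names invented for illustration), annotations are checked by an external tool such as mypy rather than at runtime:

    from typing import List, Optional

    def find_index(items: List[str], target: str) -> Optional[int]:
        for i, item in enumerate(items):
            if item == target:
                return i
        return None

    find_index([1, 2, 3], "x")  # runs fine at runtime, but `mypy file.py`
                                # reports the int-vs-str mismatch beforehand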
I'm not fond of the habit of glossing over the slowness of the Python interpreter by pointing out that you can use it as an FFI to fast code.
That's great if you're using Python for numerics, and color within the lines. But: a general purpose programming language is used for general purposes. Even for numerics, composability suffers if the language itself is slow, since crossing between FFI packages must be done very carefully to prevent paying the cost of the interpreter.
The fact is that Common Lisp, Julia, Swift, and LuaJIT (just to name a few), are substantially faster than Python, and it's not hard to come up with situations where this will matter. Worse, you might stumble into such a situation after committing to Python.
Python has much to recommend it. But its slowness has been mitigated, not addressed, in the years since Norvig wrote this article.
You omitted Go. I think Go has the best chance of replacing Python in the long run, as it is very similar to Python and is often used as a replacement. In time I think we might see Google implementing most of the Python numeric and scientific libraries in Go.
The big thing that has allowed Python to flourish is the fact that it has a good foreign function interface and was easy enough to use that both academics and web developers would learn it. It really got a second wind as ML got more and more popular. Not many languages can be used to create an ML application end to end (albeit building on the shoulders of Google and other large big tech / ad tech companies).
Python had its smear campaign claiming TIOOWTDI against Perl's TIMTOWTDI, and dubious claims that sigils (`@$`) hurt readability. But once Python had the ecosystem going, it brought back `@`, `{}`, numerous `_`, and the fanciful `:=`. And, of course, there is always more than one way to do it in Python.
"Python is easy to use" is a relative statement about its priorities. Compare to other languages that have different priorities. Python is easier and slower than C++, for example.
There are many design errors in Python, of course. But most "wtfpython" things don't actually affect people using the language -- they're engineered to identify funny compiler optimizations and floating-point peculiarities. The real problems of a language aren't usually the kind of thing demonstrated in a couple-line snippet. Though there are exceptions.
> But most "wtfpython" things don't actually affect people using the language
Plenty do, though. I've passed that page on to co-workers a lot, back when the list was smaller, and almost every time they've found something on the page that explains a bit of weirdness with Python that they'd just dismissed as buggy behavior and avoided touching.
IMO those two gotchas (mutable defaults and late-binding closures) are actually the main ones. That's why they're mentioned explicitly here: https://docs.python-guide.org/writing/gotchas/
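For anyone who hasn't been bitten yet, minimal sketches of both gotchas (the names here are made up):

    # 1. Mutable default argument: the list is created once, at def time,
    #    and shared across every call that doesn't pass its own list.
    def append_item(item, items=[]):
        items.append(item)
        return items

    append_item(1)  # [1]
    append_item(2)  # [1, 2] -- same list as the first call

    # 2. Late-binding closures: each lambda looks up i when called,
    #    not when defined, so all of them see the final value.
    fns = [lambda: i for i in range(3)]
    print([f() for f in fns])  # [2, 2, 2], not [0, 1, 2]

    # The usual workarounds: a None default, and binding i at definition time.
    def append_item_fixed(item, items=None):
        items = items if items is not None else []
        items.append(item)
        return items

    fns_fixed = [lambda i=i: i for i in range(3)]
    print([f() for f in fns_fixed])  # [0, 1, 2]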
Shallow vs. deep copying is an unavoidable part of having mutable data. The main alternative way to handle mutation is a fancy type system like Rust's, which is not an acceptable tradeoff for new programmers.
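A quick sketch of the shallow/deep distinction, in case it's unfamiliar:

    import copy

    nested = [[1, 2], [3, 4]]
    shallow = copy.copy(nested)   # new outer list, same inner lists
    deep = copy.deepcopy(nested)  # fully independent structure

    nested[0].append(99)
    print(shallow[0])  # [1, 2, 99] -- the inner list is shared
    print(deep[0])     # [1, 2]     -- unaffected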
There are many dimensions that different languages prioritize. As you increase the number of dimensions you consider important, the number of languages that are better than $language on all of your important dimensions tends to decrease.
Sure. I have a lot of problems with Python's design and find it grating, and I don't think its biggest problems are about tradeoffs; they're just mistakes.
> Python had its smear campaign claiming TIOOWTDI against Perl's TIMTOWTDI
It was nothing of the sort; the claims would have landed on deaf ears if Perl hadn't genuinely suffered maintainability problems from that approach.
> dubious claims that sigils (`@$`) hurt readability
AIUI the (admittedly limited) scientific data that exists supports that.
> But once Python had the ecosystem going, it brought back `@`, `{}`, numerous `_`, and the fanciful `:=`. And, of course, there is always more than one way to do it in Python.
You present this as some kind of bait-and-switch, but it's nothing of the sort; no one's happy about the use of @ or := (I don't know what you're talking about regarding {} or _), but they were the least-bad compromises for things that were felt to be needed. Multiple ways to do something is still seen as a bad thing - "There should be one-- and preferably only one --obvious way to do it" is the standard Python phrase, acknowledging that having only one way is an aspiration that can't always be fulfilled.
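For reference, a minimal made-up sketch of the two additions being argued about:

    import functools

    @functools.lru_cache(maxsize=None)  # @ attaches a decorator to the function
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    data = [1, 2, 3, 4]
    if (n := len(data)) > 3:            # := binds and tests in one expression (3.8+)
        print("long list:", n, "elements")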
> Python's "easy to use" is just a lie
Nonsense. Like every language it's accumulated some warts, but it's still the language beginners find easiest to learn and teachers find easiest to teach.
> But, I know, people hate to put their `(` in front of their function names, and people hate to omit `,` between their function arguments.
Wow, way to conform to the Lisp stereotypes. If the problems of Lisp were actually that superficial, don't you think there'd be someone who'd produce a Lisp with a more familiar syntax and reap the popularity gains? Is every single Lisper really too haughty to make a trivial syntax change?
Everyone claims to be able to explain in hindsight why something became popular.
I believe that the overwhelming factor of why something does or does not become popular is simply chaotic luck.
In an alternate history where Python was designed exactly the same, but a butterfly at Guido's desk flapped its wings slightly differently, Python would have been obscure.
This is technically correct, but it's also true that if a group of dedicated Python enthusiasts (led by Travis Oliphant) hadn't created a Python ecosystem before they had users, we can say with certainty that Python would not have taken off in the numerical computing space. In 2005, when I was looking for a new language, I really wanted to use Python but it was simply not an option.
> I believe that the overwhelming factor of why something does or does not become popular is simply chaotic luck.
This seems to me a bit like saying, "That color isn't cerulean, it's blue." Describing the specific instances of serendipity that have led to Python's continuing success doesn't imply that it wasn't dumb luck. It's just a way of saying, "Here's an interesting bit of dumb luck."
I find the section on hashtables funny. I cannot deny that, but not having a dedicated sethash is odd at first. That said, the beauty of how gethash plays with setf is quite nice once you get used to it.
This is even more pronounced when used with mutable lists. (setf (cadr lst) new-element) doing what you'd expect is quite useful. Obviously, this can be abused, but it's a really nice tool in the box.
At the cost of having zero Python libraries in your toolkit. More than with any other language, I'd say Python's success is due to the available libraries. Without them Python is a very average, boring scripting language with pointless limitations such as single-expression lambdas.
Plenty of people write literal parsers in Python, for things like HTML template languages -- however, that's not really related to macros.
The craziest thing I've seen along these lines (but it leads to really nice syntax) is PonyORM (https://ponyorm.org/), which allows you to write Python generator expressions like this:
    select(c for c in Customer if sum(c.orders.price) > 1000)
And Pony will translate it to SQL like this:
SELECT "c"."id"
FROM "customer" "c"
LEFT JOIN "order" "order-1"
ON "c"."id" = "order-1"."customer"
GROUP BY "c"."id"
HAVING coalesce(SUM("order-1"."total_price"), 0) > 1000
But how it does this is pretty funky. Python doesn't actually provide the AST for already-compiled code, just the bytecode. So Pony decompiles the bytecode back to an AST, and then converts the Python AST into SQL. More here (from the PonyORM author): https://stackoverflow.com/a/16118756/68707
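You can see the constraint Pony works around directly: a compiled generator expression carries only a code object, so bytecode is all there is to inspect (illustrative snippet):

    import dis

    gen = (c for c in range(10) if c > 5)
    dis.dis(gen.gi_code)  # dumps the generator's bytecode -- the only artifact
                          # available at runtime; the AST is gone after compilation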
> Interestingly, Lisp has exactly the same philosophy on this point: everyone uses emacs to indent their code, so they don't argue over the indentation.
They don't argue more or less than in, say, C, but they argue.