Python's popularity is self-reinforcing. It's popular because it's popular, not because it is fast or particularly good at anything it does. It arguably isn't that great for a lot of things people use it for. A lot of that popularity just stems from people already knowing it and the existence of lots of libraries and tools. It's a bit like the Visual Basic (if you are old enough to remember that being popular) of the data science world. Lots of people who use it aren't necessarily very good software engineers, or even focused on that.
It's not my first choice for a lot of things, but I can work with it when needed and it gets the job done. I'm currently doing a side project in Python for one of my clients and I'm getting familiar with some newer things like async. It's fine.
Nim, Mojo, and a few other Python-inspired languages kind of point to the future for Python itself as well. The recent decision to get rid of the GIL is a step in the right direction, and there has been incremental progress on interpreter and JIT internals in recent releases as well. Python might never get quite as fast as those languages, but it can get a lot closer.
And having some options other than C for implementing the stuff that needs to be fast is probably a good thing. A lot of popular Python libraries don't actually contain much Python code. You see the same in the JavaScript world, where a lot of WASM has been creeping into the space lately.
A lot of your points are valid but I disagree that Python is only popular due to network effects. I think it hit a sweet spot between expressiveness and accessibility that makes it attractive, and your point that popular libraries are often written in other languages to me just highlights another advantage: painless integration. You can get the speed of Rust or C code where it counts without the overhead of needing to write everything in those less approachable languages.
I love how naturally it reads too--all the little details, like allowing "is" and "not" in place of symbols, add up to make it uniquely easy for my brain to parse. List comprehensions in simple cases I find much more readable than the equivalent implementations in other languages. Of course it's far from perfect; for example, when those comprehensions start to get more complex they quickly become, well, incomprehensible.
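A trivial example of what I mean (my own toy snippet, not from any particular codebase):

    names = ["Ada", None, "Grace"]
    cleaned = [n.upper() for n in names if n is not None]
    print(cleaned)  # ['ADA', 'GRACE']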
The ecosystem is pretty nice too, beyond just being large I find most of the packages I end up with are reliable and configurable enough to support many use cases. In comparison Javascript (an arguably more popular language) is a nightmare of awful syntax and unintuitive restrictions, with a hellscape of an ecosystem where even the most well-regarded frameworks feel half-baked at best.
I agree with all your points, except list comprehensions. I used to like them, but now I prefer the callback style from JavaScript much more.
Cannot autocomplete at the start:
My autocomplete doesn't work at the start of a comprehension. When I start typing `[obj.abcdef...` (before the `for obj in list` part), what is obj? It cannot know. In a callback, using TypeScript, it knows: in `list.forEach((obj) => obj.abcdef...)`, obj is an element of list, and if list is typed, obj's type can be inferred.
It's minor but it's not optimal.
Confusing multi-line comprehensions:
I don't know how to format this in HN, but this is pretty weird across multiple lines: [obj.y \n for obj in my_list \n if obj.x > 3]. Something like this reads much better: myList\n.filter((obj) => obj.x > 3)\n.map((obj) => obj.y)
I'm comparing with JavaScript, which I don't enjoy much either. But I liked using this style in C#, with the LINQ module.
4 spaces at the beginning of the line make a code block.
    result = [obj.y
              for obj in my_list
              if obj.x > 3]

    result = myList
        .filter((obj) => obj.x > 3)
        .map((obj) => obj.y)

    IEnumerable<float> result =
        from obj in myList
        where obj.x > 3
        select obj.y;
In a strange coincidence, SQL suffers from exactly the same issue, due to select coming before the from clause. I nevertheless still prefer list comprehensions over doing trivial things with map/reduce, simply because they read like math.
Yeah that makes sense, and I find the map chaining for complex cases much clearer too. I always have to look up how to do a nested list comprehension, and I always end up mad about how dumb the syntax for it is when I figure it out. I do very much prefer Rust's use of || instead of the parentheses for arguments, though, just to differentiate things. And the inconsistency in JavaScript between function declarations, types, and function literals in whether you need a colon, a fat arrow, or nothing drives me insane. I don't hate Python's solution here either: why bother with weird syntax when you can just literally label it lambda?
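For example, the nested case I always have to look up; a small sketch of my own (flattening a list of lists):

    matrix = [[1, 2], [3, 4], [5, 6]]

    # Nested comprehension: the for-clauses read left to right, outermost first,
    # which I can never remember.
    flat = [x for row in matrix for x in row]

    # The equivalent explicit loops, for comparison.
    flat_loop = []
    for row in matrix:
        for x in row:
            flat_loop.append(x)

    assert flat == flat_loop == [1, 2, 3, 4, 5, 6]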
It's interesting - Python originally became popular because of the language itself, which led to the creation of libraries like NumPy, SciPy etc. But more recently, people come to Python for the libraries rather than for the language.
The problem is that many of those people seem to be actively opposed to the philosophy that made the language popular in the first place, and want to turn it into something else. That's led to Python evolving away from "executable pseudocode" and towards the current rapid accumulation of poorly-designed features like ever-more-complex type annotations, walrus operators, and an insane version of pattern matching.
I agree that Python doesn't need to get more complex, but I love type annotations.
I think people sometimes complain about complex type annotations instead of considering that complex type annotations are just a symptom of a design that is more complex than it needs to be. In general, it baffles me that people treat type annotations as a hoop to jump through (commit messages like "mypy fixes" or "getting mypy to pass") rather than a constant source of feedback that, yes, sometimes encourages you to simplify your code.
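A rough sketch of the kind of simplification I mean (hypothetical names, nothing from a real codebase):

    from dataclasses import dataclass

    # Before: the annotation is carrying a shape that never got a name.
    def summarize(rows: dict[str, list[tuple[int, str | None]]]) -> None: ...

    # After: name the shape, and both the annotation and the call sites get simpler.
    @dataclass
    class Measurement:
        value: int
        label: str | None

    def summarize_clean(rows: dict[str, list[Measurement]]) -> None: ...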
Python is good because it has batteries included. Other languages have taken a more idealistic approach of not including batteries.
Python understands that programmers just want to get shit done without having to search for what the current hotness in XML parsing is, or whatever.
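For instance, XML parsing straight from the standard library, no third-party package needed (a minimal sketch):

    import xml.etree.ElementTree as ET  # ships with Python

    root = ET.fromstring("<order><item qty='2'>widget</item></order>")
    print(root.find("item").text, root.find("item").get("qty"))  # widget 2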
Python understands that I don’t want to set up a package management environment for every tiny script I write. I can install a package once, globally, and then I can use it forever.
Funnily enough, the recent Clojure release shows how to do this far better.
Python packages are global (or global in a virtual env) and you need to install them before running your script. This is horrible! You should be able to lock down dependencies in the script itself, and they shouldn’t modify the global environment at all.
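Something like the inline script metadata from PEP 723 is the direction I mean; tools like uv or pipx can read this header and run the script in an isolated, throwaway environment (a minimal sketch):

    # /// script
    # requires-python = ">=3.11"
    # dependencies = [
    #     "requests<3",
    # ]
    # ///
    import requests

    print(requests.get("https://example.com").status_code)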
$ python
>>> import numpy as np
>>> import pandas as pd
Everything's just there for me. I want my Python packages installed globally; Python is like a Swiss Army knife, always ready for me to use.
Yes you can do dependency management in Python, but I very rarely use it because Python is not so good as a language for software development.
If I want to write serious code then I use a serious language with a serious type system. If I want to parse or download or analyze or automate or generate something, I use Python.
This only works if you have installed cowsay. Send your script to a colleague and they also need to install cowsay. Often Python packages are extremely difficult to install - incorrect version constraints, weird native packages, etc.
And it’s not just colleagues on other machines; it’s also your own machine some time in the future.
Python scripts may be convenient at the point of creation, but at any other time or place they can be a nightmare.
It's not only about batteries included, but even more about the ecosystem of 3rd-party libraries such as numpy, scipy, pytorch, onnx, and other batteries built on top of those: opencv, open3d, mediapipe, pyqt, etc., plus frameworks: django, fastAPI, fastHTML. At this point it's like C++ - not the most beautiful language, and there are many better ones, but the ecosystem is too big to ignore.
"English popularity is self re enforcing. It's popular because it's popular. Not because it is concise or particularly good at anything it does. It arguably isn't that great for a lot of things people use it for. A lot of that popularity just stems from people already knowing it and the existence of lots of idioms and tropes. It's a bit like the Greek (if you are old enough to remember that being popular) of the casual conversation world. Lots of people use it that aren't necessarily very good authors or rhetoricians or even focused on that."
This is just to say that yes, programming languages are like natural languages in this respect. In fact one of the key insights behind the development of Python was "code is read more often than it is written." I think once you internalize that idea it makes sense that programming languages would achieve popularity for the same reasons that natural languages do.
Python is good because the barriers to entry are almost non-existent.
Of the widely used languages, it was the one that most closely resembled writing pseudocode. It removed a lot of friction for people starting with programming.
And since it has every library under the sun, that made it really easy for non-SWE people to use it as a tool.
That is the reason it will continue to dominate the market. Until someone makes a language which is even easier to start with, Python will sit firmly on its throne.
(My personal theory is that LLMs will become so advanced that writing in your spoken language will be the next step, and that programming languages will kind of become abstracted away.)
I completely agree. As an amateur programmer, I started with JavaScript before moving to Python.
Even now, many JavaScript patterns seem nonsensical to me. (/rant on)
The infamous callback hell, the IIFE, and the default-to-async behavior (when used outside the browser) look like anti-patterns to me in most cases. I understand their benefits, I just don't think they're worth it.
The syntax doesn't help either—after getting used to Python's simplicity, the excessive use of parentheses and brackets alone is off-putting.
For something as simple as iteration, there are countless methods, none of which are particularly appealing (and I don't know why someone thought it was a good idea to have both `for of` and `for in`).
You can't even iterate over some common object types without resorting to things like `[...someList].forEach` or `Array.from(someList)`. This should be built-in, even just as syntactic sugar.
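By contrast, in Python one `for ... in` loop covers strings, lists, dicts, generators, and anything else iterable (a trivial sketch):

    for ch in "abc":                         # strings
        print(ch)
    for item in [1, 2, 3]:                   # lists
        print(item)
    for key, value in {"a": 1}.items():      # dicts
        print(key, value)
    # files, sets, generators, etc. all work the same way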
To make matters worse, a lot of professional JavaScript code seems intentionally written to resemble a puzzle rather than a straightforward workflow.
Ruby's object system is much nicer than Python's, but Ruby has plenty of warts of its own. For example: extremely complex syntax compared to Python's LL(1), overloaded constructs like blocks/procs/lambdas (whereas Python just has functions), and cryptic Perl-isms ($_ etc).
Another example: initial setup is beginner-unfriendly. There's no IDLE-like barebones editor provided out of the box in the standard install, just the REPL. Anything beyond that you must figure out and configure yourself.
> Python is good because the barriers to entry are almost non-existent.
Spoken like* a person for whom none of the barriers are about having two people work on two or more things running on two or more systems. :-)
Barrier to comprehension is incredibly low. Barrier to doing it yourself, also incredibly low.
Barrier to sharing what you made? Possibly impossible to resolve.
* Note: I'm not saying you're such a person, or would personally have difficulty getting 2+ systems to successfully operate 2+ independent python developer projects. Just -- the barrier to multiple collaborations has been nuts.
>Not because it is fast or particularly good at anything it does
Speak for yourself. It's a great programming language that lets you utilize lots of great libraries. I have never understood this need to pretend that Python sucks and that everyone using it is a bunch of dumb people following trends.
> Python popularity is self re enforcing. It's popular because it's popular
I'm sorry but that's just an incredibly ignorant take. Python revolutionized programming.
I was at EuroPython 2016 (or thereabouts) when Guido van Rossum (creator of the language) made a surprise visit, and every now and then a line of people would form around him, thanking him for making programming and development so thoughtful and accessible that it changed their lives. I was in the line too.
Python's approach to programming and its whole language design, community, enhancement proposals like PEP 8, and the Python Software Foundation remain to this day mostly unmatched. Python is absolutely brilliant!
Not typesafe, everything is global, terrible syntax with significant whitespace, virtual environment hassles, very slow...
All it did was make people not have to think whilst programming, leading to a whole bunch of terribly optimised one-shot scripts that have no business being in production.
I'll agree with you about the speed and type-safety, but I think the syntax is great, venvs are fine, and I don't understand your complaint that "everything is global". Python does have function-level variable scoping and one of the best module systems of its time.
You could even argue that dynamic typing is better for Python's original ambitions (scripting), and now that it's used for more "serious" coding, Python's developers are doing the right thing by supporting gradual typing.
On a more general level, I think all programming needn't be serious engineering work; there's room for quick scripts and end-user programming, which should be as easy as possible, robustness be damned.
Python was originally a teaching programming language. You know, for kids and first-year students, as a replacement for BASIC.
Now that we use it for building enterprise crap it has turned into some sort of Lovecraftian Rube Goldberg machine, but around the Python 1.5 days it was, indeed, a nice and simple no-nonsense language.
If it was used only as that, for perhaps the first introductory course, I think it’d be great. That’s exactly the scope it should be used in, and no further
They should mostly be used for scripting, not full-blown applications, unless a JIT is part of the story.
Which, incidentally, Facebook had to spend millions of dollars adding to PHP, across multiple attempts.
And Shopify is doing the same for Ruby.
Because as Facebook pointed out during their efforts, every single 1% performance improvement represents millions of saved dollars in infrastructure costs.
I'm with Dijkstra on this one. Teaching people these simplified languages, full of magic and encouraging all sorts of bad design decisions by omission, cripples their minds.
If you're referring to Python, I wouldn't call it simplified, and considering that it is one of the most used languages in real-world production code, then I definitely don't think there's any problem with teaching it.
I also don't know the context of what Dijkstra said, but in general I think teaching people simplified things as a first step is often a great pedagogic method, so this sounds... wrong?
I think there's a problem in starting with such languages. It fills the mind with vague and subtly wrong mental models that are hard to change.
I know a lot of developers who can't see beyond the text box where they have to get some code to run, and that's detrimental to them and to the industry. I believe that teaching should focus on fundamentals and how to compose them, rather than going straight to 'let's write some code in this box and get some magical results, you don't need to know more than that'.
It's still perfectly fine for many use cases. You see terrible code in every language, but that's more about undisciplined developers. Python does make it easier due to its accessibility. That can also be a good thing. Many people would've never started programming if it weren't for Python.
I don't know who hurt you, but you should really actually try learning Python, because all of your points here range from nonsensical to outright wrong.
Variables are lexically scoped. Additionally, to _assign_ to an outer/global variable, one needs to use the nonlocal/global keyword. Otherwise one will get a new local variable. Variables in a module are also namespaced to that module.
So I have no idea what the "everything is global" complaint is about.
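A minimal sketch of what that looks like in practice:

    counter = 0

    def bump_local():
        counter = 1          # assignment creates a new local; the module-level counter is untouched

    def bump_global():
        global counter       # explicit opt-in is required to rebind the module-level name
        counter += 1

    bump_local()
    bump_global()
    print(counter)  # 1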
That's the point. Look up what duck typing means in Python. Your program is meant to throw exceptions if you pass in data that doesn't look and act how it needs to. This means that in Python you don't need to do defensive programming. It's not like in C, where you spend many hundreds of lines safe-guarding buffer lengths, memory allocation, return codes, static type sizes, and so on. That means that Python code can really just look directly like the algorithms it needs to implement. That makes Python genuinely shorter and more readable than almost every other language out there.
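A small sketch of that style (my own toy example):

    def total_length(items):
        # No isinstance checks: anything whose elements support len() works.
        return sum(len(item) for item in items)

    print(total_length(["abc", [1, 2], {"a": 1}]))  # 3 + 2 + 1 = 6

    try:
        total_length([42])               # an int has no len()
    except TypeError as exc:
        print(exc)                       # the error surfaces right where the data doesn't "quack"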
>Everything is global
Not really. Python passes basic types as a copy. This is 'pass by value.' Objects are passed as a 'reference' to an object. So you can change that named reference without affecting the object, or you can use it to manipulate the item it points to. Your own classes, lists, and dicts fall under this category. But I don't think ints, strings, or bytes do.
>virtual environments hassle
Yeah, you're not wrong. pyenv doesn't work that well and installs often break when you try to set it up. It's well worth getting it properly installed though. I mostly just test from Python 3.6 (the version where asyncio became stable) and a scattering of major versions since. I also test on all major operating systems using ESXi by VMware.
>very slow.
This is mostly a meme. If you're doing CPU-bound work where each instruction counts, then there are ways to tune Python performance. There's https://numba.pydata.org/ and https://cython.org/ ('easily tune readable Python code into plain C performance by adding static type declarations, also in Python syntax'). Python can use multiple processes, multiple interpreters, and soon 'real threads.' I should add that a lot of algorithms aren't parallel, so they're not likely to benefit from such improvements. But Python bytecode is already quite fast as it is.
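As a rough sketch of the numba route (assuming numba and numpy are installed; not a benchmark):

    import numpy as np
    from numba import njit

    @njit                      # compiled to machine code on first call
    def sum_of_squares(a):
        total = 0.0
        for i in range(a.shape[0]):
            total += a[i] * a[i]
        return total

    x = np.random.rand(1_000_000)
    print(sum_of_squares(x))   # subsequent calls run the compiled version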
As for 'data processing' -- if the workload is I/O bound, it's not going to matter whether you use Python or C. It will be just as fast.
Everything in CPython is pass-by-reference. But there's no difference between that and pass-by-value for types like `int` and `str` because they're immutable.
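A minimal demonstration of that point:

    def mutate(lst, n):
        lst.append(99)   # mutates the caller's list through the shared reference
        n += 1           # rebinds the local name only; the caller's int is unchanged

    numbers, count = [1, 2], 5
    mutate(numbers, count)
    print(numbers, count)  # [1, 2, 99] 5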
> Not because it is fast or particularly good at anything it does.
True. But crucially, Python is fast enough, and good enough, for everything.
Sure, it takes 30x longer to parse the data the author has with a Python script than in <insert-a-compiled-language>, but sharing your Python script is as easy as... sharing the script. The other party does not need to install anything or learn anything new. It just works.
And at the end of the day this script is just very simple, and assuming the data isn't changing hourly - and I don't think the coronavirus is mutating that fast, or at the very least we are not sequencing it that fast - waiting even a minute for the data is fast enough.
Often people complain that something is 10-100x slower with Python, conveniently forgetting that spending 1s instead of 1ms on a thing you run at most once per week doesn't make a difference other than between your ears.
> Sure, it takes 30x longer to parse the data the author has with a Python script than in <insert-a-compiled-language>, but sharing your Python script is as easy as... sharing the script. The other party does not need to install anything or learn anything new. It just works.
The only thing which will just work is viewing the script.
For execution, you first need to install the correct python interpreter.
Probably you need to activate an environment, because the global one is incompatible.
Then you need to install the dependencies, which hopefully are 100% specified and restricted to pip, otherwise you might run into hours of SAT resolving.
For a compiled language, you need to install a compiler instead of an interpreter, or nothing at all if you just get the executable.
With a modern compiled language, given the required dependencies, running the script is probably simpler than doing it with Python (e.g. cargo run takes care of fetching deps, which python doesn't).
Those issues are annoying but I'm not sure they're all that common. The default interpreter is perfectly fine for most cases as long as it's not years behind in versions, and I honestly have no idea where you'd get a Python dependency outside of pip, literally never seen one before. The company I worked for that did a lot of Python code had an internal Artifactory with a pip repo so even private packages went through it.
Cargo is great, I agree it's approaching the usability of running a Python script there. But now you're compiling Rust, which is going to take orders of magnitude longer than any time you'd save from the script itself running faster. Plus for most of the scripts Python is used for from what I can tell, more time is spent writing them than running them anyways. So as much as I do love Rust I'm not buying that it's the better choice here.
I love python but I find the above issues to be very common. If I make a script for a non-computational colleague I often get asked, "nice, how can I run it?", and then it usually gets converted into a flask app because I can't be bothered to explain anaconda environments, or trying to install scipy and fortran compilers on their windows laptop. With nim I can just hand them a binary and tell them to double-click it.
I thought of conda after I made the post, I've used it a handful of times. But still it's just another package manager, has a GUI installer and env setup usually takes max two commands that will be in the README. My point essentially is that the extra friction in running a script in Python vs Rust is minor relative to extra friction in learning and writing Rust vs Python, so the comparison in some ways misses the purpose of a scripting language. Personally I write a lot of Rust lately and very little Python, I'd absolutely choose it for a script but I understand why it doesn't always make sense.
It is easy to come up with strawmen, but that is not the reality. Again remember the context of the post. We are talking about parsing some data. At most you would use numpy and pandas which are standard for any scientific install anyway.
Nim is fast, powerful, and has a lightweight syntax. I used it a lot for hobby projects. I wrote a program in Nim that started bringing in some money. But as soon as feature requests from customers started coming in, I had to rewrite it.
The tooling and library ecosystem for Nim are just not there. It's so much more productive to work with Python, .NET, or the JVM. I decided to rewrite it in Kotlin because the JVM gave me similar performance.
I am curious whether you could elaborate on what you ended up making and why you picked Nim over, let's say, Python. Was the core usage something performance-based, or did it really fit the Nim feature set better than most languages?
Would you say Nim is good for prototyping, or was it a waste of time since you ended up having to port it for tooling reasons?
I like Nim, though I agree that the toolset isn't there yet. However, I have encountered an interesting quirk with Nim that kind of drove me crazy.
In a lot of languages, string types are syntactic sugar for arrays or lists of chars. And usually, when you try to treat a string as a list, because it's just syntactic sugar, it works flawlessly. In Nim it is also true that strings are just lists of chars, but for some reason the compiler will not allow you to treat them as such! It seems to have all kinds of special corner cases about how you can do one thing or another that behave differently. It doesn't really seem to have a holistic fundamental design or form; it feels to me like just a bunch of stuff slapped together.
But if you can get to know the quirks, it's incredibly powerful, if for no other reason than that tooling exists to transpile Nim into anything. Well, almost. A core part of the design philosophy is to leverage as much existing tooling as possible, and so it inherits this property and enables you to compile Nim for just about any architecture and into any language. This, to me, is incredibly powerful.
> In Nim it is also true that strings are just lists of chars, but for some reason the compiler will not allow you to treat them as such!
That's actually a feature - Nim is designed to be a type-safe[0] language. Type safety has a number of advantages:
- Compile-time checks: Type system catches many potential issues and silly bugs early at compile time.
- API discoverability: IDEs and other tools can provide better autocompletion and type hints because they know exactly what operations are available on a string.
Nim tries to strike a balance between flexibility and type safety. Instead of having strict restrictions, you have explicit control with slightly more verbose code. For example, if you want to treat a string as an array, you can use the `openarray` type (`openarray` is a special type that acts as an interface to all array-like types and is supported by most of the stdlib):
var hello = "Hello, World"
proc foo(s: seq[char]): bool = discard
proc bar(s: openarray[char]): bool = discard
echo foo(hello) # this doesn't compile because string != seq[char]
echo bar(hello) # this compiles because openarray[char] is compatible with string / seq[char] / etc..
But a string is a seq of chars in Nim. I ran some type-checking code that showed me that that is what it is. So how could it violate type safety to treat it as such?
Type of a string is `string`, the fact that it's internally a `seq[char]` is just an implementation detail.
> So how could it violate type safety by treating it as such?
By not allowing you to treat a type as a type it's implemented with, Nim could prevent some logical errors. Here's a good example of such logical error from Wikipedia:
> For instance, inches and millimeters may both be stored as integers, but should not be substituted for each other or added.
Or an example with strings: Nim has a `Path` type that's implemented as `distinct string`. You can't mix and match Path and string unless you explicitly convert between the types:
import std/paths
let prefix = Path("/usr")
let bins = "bin"
echo prefix / bins # not allowed
echo prefix / Path(bins) # allowed
This makes it a bit harder to do something bad or stupid by accident.
The primary reason for the separate string type is to interop better with the various backends. The default C backend appends an extra null termination to ease C compatibility.
Yeah, I think I looked into it and read something like that. I guess when your primary goal is interoperability with already-built systems you've got to make trade-offs like that, and they'll show through in the experience. But it doesn't change the fact that Nim feels a little incoherent when you're trying to learn it.
It's fine though, I like Nim a lot, and the value in the language significantly outweighs the downside of learning those little quirks.
> import bioseq # my library, has k-mer iterator and FASTA parsing
I'm not a developer, but isn't that a bit unfair? If one is going to rewrite all the libs instead of just importing them from Python, one is trading convenience for speed.
That said, it's also mentioned that Nim is compatible with Python; does that mean it can import Python libraries directly?
It does seem like a bit of an unfair comparison. That being said, I am not aware of any k-mer Python packages that have great performance. If k-mer counting performance were really important, I'd probably start to think about reaching for one of the more performant counters, such as KMC.
The real benefit of python in bioinformatics is the number of libraries available that let you interact with a bunch of software, perhaps do some light analysis, maybe make some pretty plots. And if you want to do some more intensive "useful" analysis, then all the ML packages, numpy, polars etc are quite hard to dismiss.
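For reference, a naive pure-Python k-mer counter of the kind being discussed (fine for toy inputs; dedicated tools like KMC exist for real workloads):

    from collections import Counter

    def count_kmers(seq, k):
        # Slide a window of length k across the sequence and tally each substring.
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    print(count_kmers("ATATAT", 2))  # Counter({'AT': 3, 'TA': 2})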
> so it’s usually not worth spending a ton of time optimizing single-threaded performance for a single experiment when I can just perform a big MapReduce.
Is this the scientific version of "rich people problems"?
But I have a problem with taking a real-life application and using that to claim
> it will be impossible for pure Python to beat pure Nim at raw performance
Because it may well be true. I didn't try it, but there are many things that can be optimized in the Python code example. Putting that aside, in a real-life scientific computing application I don't think anyone would avoid using numpy to deal with this, which would make things much better. Also, the power of Python in data analysis and scientific computing is the ecosystem and community. That will be very hard to beat. And there are more mature alternatives like Julia.
Edit: The author's code for reading the data is creating a new file object for each iteration. I would guess that in Nim this would be a similar problem, but I am not sure how it actually works or whether it has the same effect. Anyway, you don't do this in a real-life application with Python. Also, it would be nice to use a list comprehension to count the occurrences of 'C' and 'G' in each line.
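Something along these lines, opening the file once and counting per line (hypothetical filename):

    with open("sequences.txt") as handle:   # hypothetical input file
        gc_per_line = [line.count("C") + line.count("G") for line in handle]

    print(sum(gc_per_line), "GC bases in total")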
> > so it’s usually not worth spending a ton of time optimizing single-threaded performance for a single experiment when I can just perform a big MapReduce.
> Is this the scientific version of "rich people problems"?
Author here. Yes, most certainly. In fact, it was one of the things that drew me towards the NIH for my PhD. My overall point in the post was to show that a somewhat naive Python implementation and a much faster Nim version have a small Levenshtein distance. For many people in bioinformatics who don't have a background in software engineering (a significant fraction, if not a majority), this could be a huge boon. Combined with the fact that most bioinformatics researchers don't have the privilege of the world's largest biomedical HPC cluster at their disposal, I still think Nim would be a great drop-in replacement for quick single-threaded line-oriented string processing. For numerical stuff, probably not.
However, I am mostly writing in Rust these days for longer term projects that require threading and good ecosystem support. Perhaps I'll write a follow-up retrospective on Rust versus Nim in this area.
> > so it’s usually not worth spending a ton of time optimizing single-threaded performance for a single experiment when I can just perform a big MapReduce.
> Is this the scientific version of "rich people problems"?
It's more like
programming to solve many people's recurring problem and selling that as a product
vs.
programming to solve one man's many different ad hoc problems.
The optimizations the compiler can do because of the "var" the author added in the Nim version of the first example should also be possible without it, because Python determines whether variables are local or global at compile time.
The code looks like it avoided that by putting everything into the global scope?
If the code of the first example is what the author really ran, then he got a speed penalty for running the loop in the global scope.
One might consider it a bit of a quirk that Python code runs slower in the global scope. But in practice it rarely matters, as a script with just a loop and no functions (not even a main function) is rare.
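For completeness, the usual fix is just wrapping the loop in a function (a minimal sketch):

    def main():
        total = 0
        for i in range(10_000_000):
            total += i           # `total` and `i` are fast locals here, not module globals
        print(total)

    if __name__ == "__main__":
        main()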
I don't think the advantage here is so much that Nim is fast, as Python is slow. If you're willing to dump Python you have many compiled language options, but I'll pick two: C and Rust.
For the kind of tasks the author outlines, I'd use AI. It excels at this: these are really simple, well-defined tasks it won't screw up.
So what I would do is pick a faster language - I'd pick Rust - then ask AI to script it and then repeat for as many tasks as you need.
Author here. This is basically the approach I am using these days to get maximum multithreaded performance when it really counts (inner loops) [1]. I draft in Python and use Copilot to convert it to Rust, then optimize from there. However, Nim is still better than Rust in my opinion for simple scripts that I don't want to spend a bunch of time writing. Its only major downside is its relative lack of support for parallelism and bioinformatics (which is why I used Rust for a more serious project).
Author here. I love the type system. By using distinct strings to represent DNA, RNA, and protein, I can avoid silly errors while still using the optimized implementations under the hood. This is what the `bioseq` library (about two hundred lines) does [1] and I find it incredibly elegant.
This example would benefit greatly from JIT compilation, which is already possible in CPython with the @jit decorator from numba. I assume the time benefit of Nim will eventually fade away, while you keep the benefit that every dev can understand Python.