This is an INCREDIBLE improvement. I'm so excited about it for myself, and I think it will also do wonders for getting new programmers comfortable with the language. Seeing exactly where in the line your error occurred is amazing! IMO this 100% cements Python as the single most newbie-friendly language, if there was any question before now (or at least, before the release of 3.11).
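For anyone who hasn't tried a beta yet, the new fine-grained tracebacks look roughly like this (adapted from the examples in PEP 657; exact output may differ in the final release):

Traceback (most recent call last):
  File "distance.py", line 11, in <module>
    print(manhattan_distance(p1, p2))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "distance.py", line 6, in manhattan_distance
    return abs(point_1.x - point_2.x) + abs(point_1.y - point_2.y)
                           ^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'x'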
This can be done in some IDEs, although it can get confusing if you're working in an app framework that does some sort of catch-all exception handling.
Really wish they'd just go ahead and add a way to replace better-exceptions[0]. I don't use it anymore and thus don't really like maintaining it these days, and would love to see it in mainline.
Python is deceptive. It lures you in with its simplicity, and then one day you want to deploy your software and you realize that it's more complicated than any other programming language ever conceived of.
Couldn't have said it better myself. Everything you gain from fast coding you lose tackling performance issues, the GIL, dependency management... things that are harder to solve once you're committed in production.
It depends very much on what you are doing. Python for big complicated applications is a No for me too, but Python for extraction jobs in an ETL pipeline or for data science tasks is a huge YES.
.NET can export a self-contained single-file executable that doesn't require any runtime on the target computer/server.
Python has solutions like PyInstaller, but it's not built-in and requires building on the same system. In .NET, you can use the built-in tools to export binaries for Linux from Windows.
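For reference, the basic PyInstaller flow is short; assuming a script named myscript.py (a placeholder name), it's something like:

$ pip install pyinstaller
$ pyinstaller --onefile myscript.py

The bundled executable lands in dist/, but it only targets the OS/architecture you ran the build on, which is exactly the cross-compilation gap mentioned above.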
Except when you have some specific dependencies/targets that make this difficult. I'm sure I'll get there eventually, but I chose .NET for this thinking it would be dead simple, and yet here I am wading through documentation and Stack Overflow posts to get to my self-contained single-file executable.
Most Windows users have at least 4 or 5 versions of Linux installed on their machine, so it's a non-issue. They have WSL, a base installed partition, Vagrant, a VM, and probably Docker containers, so running one extra command to generate a Windows package isn't a big deal.
Can you be more specific about your claims? What is wrong with Python?
> more complicated than any other programming language ever conceived of
is definitely untrue. Even if you take only mainstream languages, there are many that are more annoying to deploy, more complicated, and so on.
It was far easier for me to grok monads in Haskell (and it took less time), than to learn how to properly deploy a Python application. Given that monads are often a complaint and testament to the "complexity" of Haskell, I think it's unfair to make the claim that Python is uncomplicated. If you use a language for some period of time, it is very likely that you will have to deal with the tooling at some point.
When you say "deploy" do you mean "put it on a webserver"? I've deployed Python web services onto very limited environments that have nothing more than an interpreter. Nowadays I use pip-tools and docker, which makes updating trivial.
Do you perhaps mean distribution as an app? I've had luck with pyinstaller. There can be fiddly bits but overall it's pretty smooth. It's easier if you are targeting a system with an interpreter, like a Linux distro.
Overall I just think you need to know what you're doing. I don't really think this is a bad thing. Maybe you could elaborate on what your difficulties are? From here I just don't see how it's more difficult than anything else.
Yes - this is the reason I've switched to Go for writing command-line apps. I don't like Go as much as Python (Go is simpler, but not as easy), but being able to compile a self-contained executable is huge. It's also great to have access to GIL-free parallelism.
I don't expect Python to support either of those things any time soon.
Some of these approaches work for Windows, others for Mac, some for Linux, depending on versions of Python that are installed, etc. You can get around some of the problems with pyenv and a bunch of other tools.
I was originally using Python (learned it in my first job back in 2001!) and that was the reason I eventually moved to other languages - Python as a language is great, but the tooling/ecosystem isn't, for exactly these issues (or at least it wasn't at the time, which is a decade+ ago by now).
My impression at the time was that using it for server-side stuff was fine (I have control of the server and simply install all the right things/versions myself), but packaging it up for end users to install on their desktops/mobile was a huge pain. There was no real easy/simple way to just export a self-running installer that you double-click (on Windows/Mac) or an IPA/APK (on iOS/Android) and end up with a running program, the way users were used to from e.g. a bundled .NET (at the time using Mono; now MS provides cross-platform .NET implementations themselves) or a native C/C++ executable.
I recently wrote a python script to handle serial communication and uploading of a bare metal binary to my RPi3, this saves me so much time debugging now that I don't need to swap SD cards continually. Excellent tool.
Because you have to know what WSGI and ASGI are and pick the right one. Then pick which implementation of the above you want to use. Then configure it correctly for your app. And finally make it play nicely with nginx/apache/whatever.
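To be fair, WSGI itself is tiny; the whole contract is a single callable, as in this minimal sketch (no framework, servable with e.g. `gunicorn app:application`):

# app.py - a complete WSGI application
def application(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello, world\n"]

The pain is everything around it: choosing and configuring the server that hosts this callable.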
If on the other hand you're trying to deploy a desktop app, that is an entirely different kettle of 'fun'.
That's not really a criticism of Python, is it? What do you want? One Way To Do Things with No Configuration with magical compatibility with nginx/apache/whatever?
> Because you have to know what WSGI and ASGI are and pick the right one. Then pick which implementation of the above you want to use. Then configure it correctly for your app. And finally make it play nicely with nginx/apache/whatever.
None of the Python I deploy has to deal with any of that; all of it applies only in one narrow application domain. Even if it were a source of valid criticism, it would only be criticism of using Python for that specific use, not of Python generally.
<dream> There is a very fast Gunicorn replacement in Rust that accepts a WSGI/ASGI module and a static dir and starts serving both. It breaks performance barriers for pure Python servers. It can automagically create a systemd service and an nginx site, or just exist as a Docker entry point.
This comment is deceptive. This is a solved problem with virtualenvs. It is why Piku (https://github.com/piku) can deploy isolated apps with ease (and actually devotes less than 80 lines of code to the actual Python deployment).
It’s not a product. It’s an example of why it isn’t a problem, and it was actually _designed_ around virtualenvs, which is why it’s just a single script (again, not a product).
Between this link and the OP, we're seeing the first improvements to drop from Mark Shannon's Microsoft-funded full-time work on CPython optimization, alongside Eric Snow and Guido van Rossum:
- bpo-44590: All necessary data for executing a Python function (local variables, stack, etc) is now kept in a per-thread stack. Frame objects are lazily allocated on demand. This increases performance by about 7% on the standard benchmark suite. Introspection and debugging are unaffected as frame objects are always available when needed. Patch by Mark Shannon.
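For anyone wondering what "frame objects are always available when needed" means in practice: anything that explicitly asks for a frame still gets one, it's just materialized lazily. A quick sketch:

import inspect

def f():
    frame = inspect.currentframe()  # forces the lazy frame object into existence
    print(frame.f_code.co_name)     # -> f
    print(list(frame.f_locals))     # -> ['frame']

f()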
And from the What's New:
- “Zero-cost” exceptions are implemented. The cost of try statements is almost eliminated when no exception is raised. (Contributed by Mark Shannon in bpo-40222.)
- Method calls with keywords are now faster due to bytecode changes which avoid creating bound method instances. Previously, this optimization was applied only to method calls with purely positional arguments. (Contributed by Ken Jin and Mark Shannon in bpo-26110, based on ideas implemented in PyPy.)
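If you want to eyeball the "zero-cost" exceptions claim yourself, here's a crude micro-benchmark sketch (absolute numbers are machine- and version-dependent; the point is that on 3.11 the two timings should come out nearly identical):

import timeit

def without_try():
    x = 1
    return x

def with_try():
    try:
        x = 1
    except ValueError:
        pass
    return x

print(timeit.timeit(without_try, number=10_000_000))
print(timeit.timeit(with_try, number=10_000_000))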
Really, really exciting beginnings. I can't wait to see what's next!
7% is nothing, and the benchmark suite is notoriously unreliable. As usual, whenever a company that has done nothing at all for Python in 30 years attaches its name to the product of others, people fall over themselves to praise it.
Real improvements made by individuals for the last decade are taken for granted and aren't mentioned. The corporations are credit thieves.
> Between this link and the OP, we're seeing the first improvements to drop from Mark Shannon's Microsoft-funded full-time work on CPython optimization, alongside Eric Snow and Guido van Rossum
Oh good - after the proposals to speed up Python, I hadn't heard of any organisation stepping up to sponsor them. It's great to find that Microsoft did.
Development on a new branch begins at the time of beta 1 of the previous branch, which is the feature freeze point. ("Development on a new branch" is metaphorical, there's no new git branch created at that point, just the master branch.)
Python 3.8.10 (default, Jun 2 2021, 10:49:15)
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more
information.
>>> print(3.11 - 3.1)
0.009999999999999787
>>>
> “Zero-cost” exceptions are implemented. The cost of try statements is almost eliminated when no exception is raised. (Contributed by Mark Shannon in bpo-40222.)
> Method calls with keywords are now faster due to bytecode changes which avoid creating bound method instances.
The deprecation of lib2to3 has occasioned some difficulties for formatters. Neither `black` nor `yapf` can correctly format a script with the `match` syntax introduced in Python 3.10, even though Python 3.10 is formally released tomorrow. It seems that transitioning to the new PEG parser isn't trivial work.
Certainly glad to see Python 3.11 getting so many performance improvements.
Just spent about 4 hrs getting Python 2 and pip set up on my Mac. Any time I have the misfortune of needing to use Python I get cold sweats at the thought of the environment stuff.
The compile time depends largely on dependencies. If you have OpenSSL, zlib, and whatnot pre-installed (either via Homebrew or self-compiled and specified with environment variables), the compile time is close to 5 minutes. If you don’t, by default pyenv (actually python-build) compiles all those from scratch, and can easily spend north of 20 minutes (mostly for OpenSSL).
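In other words, the fast path on a Mac is roughly this (assuming Homebrew; python-build should pick up the brewed libraries, or you can point at them with the environment variables mentioned above):

$ brew install openssl readline zlib
$ pyenv install 3.10.0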
Anaconda is also great for beginners - GUI, lots of normal packages precompiled (also numpy, scipy, pandas, scikit-learn). If you do the full install, it comes with lots of packages pre-loaded.
Using Python 2 is a big part of the problem. Modern Python 3 is a lot nicer. Also you're much better off if each Python project has its own virtual environment using the venv module.
Yes, virtual environments are baked into the language. python3 -m venv venv will make an environment named venv. source venv/bin/activate activates it. pip install package in an activated environment installs the package only into the active environment.
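A full round trip looks like this (on Windows the activation script is venv\Scripts\activate instead):

$ python3 -m venv venv
$ source venv/bin/activate
(venv) $ pip install requests
(venv) $ python -c "import requests; print(requests.__version__)"
(venv) $ deactivate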
It's still a mess, honestly. They tried to standardize on venv but since it doesn't deal well with Python libraries that depend on non-Python components, conda is still better for many purposes.
I've had nothing but pain with conda in mixed linux/mac/win environment at work, and actively worked to get it deprecated. We're on plain venvs now, with the occasional setup.py
Curious if I missed some useful case for it.
- very slow dependency resolvers
- using conda in docker is annoying
- the worst thing: inconsistencies in downloading binaries and other resources between win/mac/linux when installing packages
- conda insisting on messing with bashrc to run their 'activate' crap
- beware the fool who runs 'sudo conda whatever' on a multiuser system
Conda will package and version things like compilers and C libraries etc. in a sane way. I can 'conda install' a package and start using it, but if I 'pip install' the same package I'm still left having to hunt down and install a bunch of additional dependencies. For example, 'conda install cython' gives me a fully working and consistent cython on all platforms in a way that 'pip install cython' won't.
There are still several computational science and data science packages where 'pip install X' on windows simply doesn't work (and 'conda install X' does) due to missing dependencies and other weirdness.
Conda can version and manage R as well as Python; if you work on projects that use both, then you only need one tool.
Conda can also install dev tools like Spyder, VSCode or Jupyter(labs) making it easier to offer a single point of entry for an entire dev environment.
But basically the big one is simply that Conda will install a lot more numeric packages with a lot fewer headaches on win, linux and mac than pip. Although in pip's defense it has gotten a lot better over the past 2-3 years. And I do agree that the dependency resolver can take a ridiculously long time in some situations.
I think this is an interesting case where people just want different things, and so the thing that they want is "better".
Some people really just want Python+extras and don't want to worry about getting a fortran compiler. Conda is great for this.
Other people really don't want a big meta-install system that bundles a bunch of dependencies. Maybe they're on Linux, they target a single platform, or they want to use a package not in conda that works with a more recent version of the dependencies that conda bundles.
> I think this is an interesting case where people just want different things
For some people Python is really more of an end user application. They don't want to develop and distribute and deploy software, they want to write scripts and analyze and transform data. They expect the script a colleague emailed them to just work and expect their colleague to be able to open and run their Jupyter notebook as easily as they can open and run an Excel document. For those people Conda comes much closer to solving their problems.
Here's an example, let's say you're developing GPU code that needs to work with different CUDA versions on different machines. Look at the `cupy` install instructions for example:
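For illustration, the difference looks roughly like this (package/channel names from memory; check the cupy docs for the current ones):

# conda pulls in a matching non-Python cudatoolkit for you:
$ conda install -c conda-forge cupy

# with pip you pick the wheel matching the CUDA toolkit already on the box,
# e.g. for CUDA 11.2 (or build from source, which needs a full C++/CUDA setup):
$ pip install cupy-cuda112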
With conda this installation tends to just work; conda will also install the non-Python cudatoolkit for you. With pip you either have to make all your Python developers set up their C++ environment the same way so they can install from source, or fix a single CUDA version that all users have to be on.
Now `cupy` is just one Python library that has non-Python dependencies. If your project has several dependencies like this, where conda is a one-line install and pip means you have to mess around with your C++ environment, conda is probably the right choice for you.
All of the pain you mention with conda is totally correct, though, it's just a question of which sort of pain happens to be worse for the packages you are going to use.
Except when Conda plays hard to get, like in any corporate environment with connection inspection, custom root CAs, or not quite admin privileges. Then Conda is an absolute dog and hard to work with, yet regular old python is fine.
I think the problem is that unless someone already knew Python, they probably wouldn't understand environments, so their first thought would be to somehow install another version of Python and replace the system Python, or to use a switcher to switch between versions. That would be my first intuition too. (and I believe that's how Ruby works, via RVM).
But Python works just a little bit differently in that it advocates for the use of its own environments to isolate dependencies. I'll have to admit it is not super intuitive (I already have a system package manager (apt, brew, etc.) and I have Docker containers -- why do I need another yet another environment?) but over time I've just accepted it for what it is. It's a Python convention.
I mean, conda is installing another version of Python, and it provides a switcher to switch between environments. Conda and rvm are very similar to use.
Folks have given a number of good options here, but I thought I'd mention some other ones in case they're helpful for your use case.
1) docker! If you only plan on tinkering with something and don't want to install a bunch of random things on your machine, this command will start a python 2 image with access to the current directory: `docker run --rm -it -v "$PWD:/app" python:2.7 bash` (IIRC, on my phone atm!) (Windows users will need to write `"/$PWD://app"` if using mingw/etc). Once inside, you can pip install, etc, and it'll all go away once you exit the terminal! If you want it to stick around, don't use --rm and add a --name to give it a name you can remember for later. It does require learning some docker, but it's a good investment :) I use the same technique to try out/play with other languages, versions, packages, etc. without making complicated mods to my actual computer. And there might even be docker images that come with certain packages/etc pre-installed for certain use cases!
2) Pycharm! A good python IDE, with virtualenv support baked in if you don't feel like playing around with a bunch of commands at all. Although disclaimer, I can't fully remember how hard it was to get things setup (or how easy it is to work with multiple python versions), so YMMV.
I don't think Pycharm will install different versions for you, but if you have multiple versions installed, it is indeed super easy to switch between them. Though if you have a ton of packages you'll have to reinstall them all for the right version, and if you're dealing with lots of venvs + system installs, you have to make sure to choose the right thing from the dropdown and you might get somewhat lost and confused if you don't know what you're doing.
The right menu is File -> Settings -> Project: <your project name> -> Project Interpreter, and then you pick whatever you want by Python Interpreter.
On Windows I use Chocolatey to manage Python versions; on Mac idk though.
It installs versions of Python or versions of packages? I can't find anything in the docs about installing versions of Python, just packages (which I've done on occasion when I'm really confused about why my environment isn't working; I have the Pro version now, but I'm pretty sure I'd done it when I had the Community version too).
Ah, just checked. I thought it installed python versions as well, but it can create environments (virtualenv, conda, system pythons, pipenv, docker), and can install packages in those environments.
I do not see a way to install a whole new python version (only the ability to add to the list, which requires you to select the interpreter from an open file dialog)
If you spend 2 minutes googling it (or even just reading HN comments here), you'll get various people recommending venv, pyenv, poetry, conda, pyflow and probably a whole bunch of deprecated stuff, where it's totally unclear to outsiders which one to use.
It is a standard library module and serves as the base functionality; most of the other things you list are third-party tools that build on the venv concept in various ways.
Once you grok venv, you will be in a position to understand if those tools can provide value to your particular situation.
I wanted to install Python on a new Mac last night and just couldn't get up and running. Closest I got was with Anaconda, but although I could get the right env in the terminal, I couldn't get either PyCharm or VSCode to pick up the correct interpreter/packages etc.
I’d almost guarantee there was something simple I was missing but for someone not familiar with the ecosystem getting python set up to do more than just hello world is a shitshow.
virtualenv, pyenv, venv.. I'm sure they all work well, but for a newbie, the sheer choice is a killer, especially when they all work almost same, but not quite.
Now compare that with cargo or npm, where the canonical way makes onboarding newbies so much easier.
1. Python has a low barrier to entry and is a popular first language, so most users don't realise how bad it is.
2. Python was originally popular with old-school sysadmins, Debian types, and a lot of its package management is based around that philosophy of carefully hand-tended servers shared by multiple users.
The pile of debootstrap chroot stuff that you get told to use works pretty similarly to virtualenv. It's pretty clunky and bolted-on, but so is the Python version.
Care to elaborate on what is bad? How bad are they compared to other languages? (Assuming you are always accepting some tradeoffs when moving from one ecosystem to another.)
It's painfully archaic compared to languages that have all this sorted out.
Pip is a recipe for disaster, as indicated by the huge amount of churn in the Python packaging sphere. It's constantly the worst part of my day whenever I pick a Python project up. Conflicting dependencies, non-deterministic installs, etc.
I used to cope with this crap fest until I tried Elixir and experienced the beauty of modern package management and build tools. One tool, Mix, that handles task running and dependency management with a proper resolver.
I honestly think even Node has a better package management and tooling story.
Also: virtual environments are a hack and a pain. Python is moving forward on this with the recent PEP for the __pypackages__ folder, but we're still a long way from adoption.
All of this stuff is painful for beginners. Virtual environments might seem easy to us, but I've had ridiculous amounts of trouble explaining why they're necessary to beginner developers. Then you have to explain `poetry` or `pip freeze`.
Even I have trouble coming back to older projects of mine and I'm an experienced Pythonista: I usually waste an hour or two trying to sort the pip dependencies out with whatever conflicts the resolver comes up with this time.
Python package management is not okay, we're all just used to coping with it. Other languages put it to shame.
Things continue to improve. For example, Pip's resolver has been deterministic since around November 2020. Bringing something minimal into Python that provides deterministic builds (e.g. pip-tools) would help packaging a lot.
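For the curious, the pip-tools flow is essentially the two-file intent/lock split discussed further down this thread:

$ pip install pip-tools
$ echo "django" > requirements.in   # what you intend to depend on
$ pip-compile requirements.in       # writes requirements.txt with every transitive dep pinned
$ pip-sync requirements.txt         # makes the active venv match it exactly

pip-compile also has a --generate-hashes flag if you want hash-checked installs.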
- What you get when you import a library depends on state that's scattered all over the system: system-managed packages, pip-managed system-global packages, pip-managed per-user packages, which virtualenv is currently active, which directory you're currently in, which directory the program you're running is in, whatever it is that conda does....
- There's no concept of reproducible builds or dependency pinning. There's "pip freeze" but that's a one-time operation that you can't then reverse, so it's only usable for leaf applications. If you're developing a library, you'd better get used to having your transitive dependencies changed on you all the time. And since the whole ecosystem is built that way, even if you use some tool that lets you make stable releases of your library, that doesn't help you develop at all.
- Virtualenvs are stateful and attached to whatever terminal you were in at the time. This interacts hilariously with the previous point: if you accidentally run "cd myproject && pip install -r requirements.txt" in the wrong terminal, you permanently, irreversibly fuck up that virtualenv. All you can do is wipe it out and try to recreate it - but, per the previous point, it probably won't come out the same as before.
- You're supposed to use pip to manage which python version each project is using. But you're supposed to use the installer for it that's distributed with the python runtime. But only certain versions of the python runtime...
- There's only one global repository. If you want to build some libraries and reuse them the same way you'd use a normal library dependency, you have to publish them to the global PyPi. I think there might be an expensive service that works around this, but there's no repository program that you can just spin up on your own servers.
It's really a lot worse than other languages. If you build a real system (like, a couple of libraries and applications) in another language (not, like, C/C++ - but even Perl or TCL will prove the point) and then come back to Python, you'll find yourself hating it all the time.
> There's "pip freeze" but that's a one-time operation that you can't then reverse, so it's only usable for leaf applications.
What do you mean by reverse? You can certainly upgrade all packages locally and then run pip freeze again if you want to set up newer versions, or like manually change versions in your requirements.txt and `pip install --upgrade` to update them. For stronger reproducibility guarantees there's also `--require-hashes`, although admittedly freeze doesn't appear to support easily writing hashes to a requirements.txt.
> per the previous point, it probably won't come out the same as before.
I don't see how this follows. If you have frozen dependencies, it will.
> You're supposed to use pip to manage which python version each project is using. But you're supposed to use the installer for it that's distributed with the python runtime. But only certain versions of the python runtime...
No, you use venv for that. venv has been available since Python 3.3, in 2012, and pip has shipped with it (via ensurepip) since 3.4, in 2014. If you're using a version of Python that's 10 years old, I don't really know what to say.
> There's only one global repository. If you want to build some libraries and reuse them the same way you'd use a normal library dependency, you have to publish them to the global PyPi. I think there might be an expensive service that works around this, but there's no repository program that you can just spin up on your own servers.
> What do you mean by reverse? You can certainly upgrade all packages locally and then run pip freeze again if you want to set up newer versions, or like manually change versions in your requirements.txt and `pip install --upgrade` to update them. For stronger reproducibility guarantees there's also `--require-hashes`, although admittedly freeze doesn't appear to support easily writing hashes to a requirements.txt.
You need to know, and maintain, both what you intended to depend on and what you physically ended up depending on. So in more sensible ecosystems you will have, e.g., Gemfile and Gemfile.lock. pip freeze is, effectively, the way you create Gemfile.lock, but it forces you to destroy Gemfile to do it. And so you can't really use it.
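To make that concrete, compare what you'd write by hand with what pip freeze emits (versions purely illustrative):

# requirements.in - the Gemfile equivalent, your actual intent
requests>=2.25

# pip freeze output - the Gemfile.lock equivalent, every transitive dep pinned
certifi==2021.5.30
charset-normalizer==2.0.4
idna==3.2
requests==2.26.0
urllib3==1.26.6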
> So in more sensible ecosystems you will have, e.g., Gemfile and Gemfile.lock. pip freeze is, effectively, the way you create Gemfile.lock, but it forces you to destroy Gemfile to do it.
There's plenty of legitimate criticism of Python in general and pip in particular, but much of yours seems to be criticism of things that are factually untrue.
> There's only one global repository. If you want to build some libraries and reuse them the same way you'd use a normal library dependency, you have to publish them to the global PyPi. I think there might be an expensive service that works around this, but there's no repository program that you can just spin up on your own servers.
What are you trying to do? It generally is not a problem unless you are using old or obscure packages. Sure, some projects need those things, probably a lot of them, but I don't think that is really Python's fault. Not that this couldn't be done better in Python. What issues did you encounter?
This too is a solved problem. First of all, the bundled Python 3 in Big Sur is perfectly OK and works fine (it’s become my default after years of pyenv).
Second, brew and pyenv "just work" if you need any other version. I've been developing in Python for decades, and it is one of the cleanest and easiest approaches to running multiple runtimes I've used in any language.
the problem is apple. they are (understandably) refusing to ship an up-to-date global python interpreter with macOS. not only that, they aren't removing the default 2.7 that they ship.
macOS Big Sur ships Python 2.7. ha. ha. ha.
none of the proposed "virtual environments" solutions are going to rescue you when you operate globally at the OS-level and not in a sandbox.
1. macOS ships a /usr/bin/python3 that is up-to-date at the time of the feature freeze of a major macOS release, e.g. macOS 11 ships Python 3.8.2 from February 2020.
2. Keeping ``python`` pointed at Python 2 follows PEP 394, which recommends:

* Unix-like software distributions (including systems like Mac OS X and
Cygwin) should install the ``python2`` command into the default path
whenever a version of the Python 2 interpreter is installed, and the same
for ``python3`` and the Python 3 interpreter.
* When invoked, ``python2`` should run some version of the Python 2
interpreter, and ``python3`` should run some version of the Python 3
interpreter.
* If the ``python`` command is installed, it should invoke the same version of
Python as the ``python2`` command (however, note that some distributions
have already chosen to have ``python`` implement the ``python3``
command; see the `Rationale`_ and `Migration Notes`_ below).
> not only that, they aren't removing the default 2.7 that they ship.
Yes they are:
$ python
WARNING: Python 2.7 is not recommended.
This version is included in macOS for compatibility with legacy software.
Future versions of macOS will not include Python 2.7.
Instead, it is recommended that you transition to using 'python3' from within Terminal.
Python 2.7.16 (default, Aug 30 2021, 14:43:11)
[GCC Apple LLVM 12.0.5 (clang-1205.0.19.59.6) [+internal-os, ptrauth-isa=deployment-target-based]] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>
It's pretty common to have kept `/usr/bin/python` as Python2. This is the most conservative option and won't break anything that's already running. Anyone who is using Python currently will have already installed Python 3 (or knows how to do this), and is probably using `#!/usr/bin/env python3`. As an OS vendor with many (many...) systems in the wild, you don't want to switch Python versions on people unexpectedly. Especially when it's still pretty painless to keep python2 around even if there is an installed python3.
For anyone that's still using python2 (as a nearby comment mentioned), there is now at least a warning to avoid using the default Python 2.7 (/usr/bin/python). For an OS vendor, this transition takes a long time... but at least it's in process.
Oh, sure, it's definitely not a good idea to change what the unqualified python binary refers to. But I interpreted "refusing to ship an up-to-date global python interpreter with macOS" as meaning that Python 3 isn't shipped at all (with the binary name "python3").
Is that the case? If so, I still don't understand the reasoning.
I don’t think anything in macOS (the base OS) requires Python, does it? If not, then, I don’t see why Apple would want to ship any Python with the OS. Let people install the Python version they want and then the users will have to maintain it.
I think you mean you don't have such problems on Linux; there are plenty of people who trip over the fact that the default Python on RHEL <= 7 was comically old and unintuitive to update...
You are comparing apples and oranges, obviously. Perhaps some people are just unconsciously too enamored with apples.
OP's problem was not configuring default Python version as in your second link. Your first link is about some IBM stuff, and IBM is proprietary, so no comments there.
Neither was OP's problem about setting up Python on an old version of the Mac OS.
Configuring the default python, also referred to as installing a new Python, is exactly what OP was doing.
The process in the supplied link (which you weirdly dismissed as IBM-proprietary even though it has nothing to do with IBM) is essentially the same as the process on macOS: realize that the OS Python is not what you want, enable a non-default package repository with a few shell commands, and install Python.
Prejudice about anything to do with Apple aside, glad to see we agree that RHEL and macOS have similar install processes.
Python dependency management is rubbish, and many python projects have incomplete or missing requirements.txt files. I dislike Python as well (why on earth is whitespace relevant??) but unfortunately if you want to tinker with any of the newest deep-learning projects, you'll need python.
I use Anaconda for package management on Ubuntu, which helps a lot, but it's still not my preferred programming language.
To eliminate the need for braces and semicolons, of course. It's part of what makes python so easy to read and write. Is it really such a problem?
I honestly can't imagine disliking Python. It's my go-to for anything where performance doesn't matter, due to its unmatched ergonomics and vast libs. Golang comes close, but for some reason having to implement really common and basic things yourself is part of the language's design philosophy. Plus it's compiled, so you can't just drop into the interpreter for a slick one-liner.
What's your preferred quick and dirty scripting/programming langage?
I prefer Ruby for quick scripting - in my opinion it beats Python in readability (sugarcoating/hiding the guts, like not needing self littered everywhere), and it seems to have more logical consistency in its calls (e.g. array.append(element) but len(array) in Python vs. array.append(element) and array.length in Ruby, among many other things).
It's definitely a preference, and Python has a lot of library support that the Ruby community doesn't (yet, hopefully), but headaches like the grandparent's when setting up Python for a simple project are pretty common.
Rarely do I encounter a Python project that "just works". If your setup isn't exactly the same as the repo owner's (and often even they don't know why their setup is the way it is), there's often a lot of fiddling and package adjustment needed. Not a deal breaker normally, but enough to make it more painful to quickly drop in and start experimenting.
> bpo-44590: All necessary data for executing a Python function (local variables, stack, etc) is now kept in a per-thread stack. Frame objects are lazily allocated on demand. This increases performance by about 7% on the standard benchmark suite. Introspection and debugging are unaffected as frame objects are always available when needed. Patch by Mark Shannon
That's pretty massive! My impression of CPython was that there had been few across-the-board performance improvements since 3.0. I'm having trouble finding benchmarks for all the minor versions.
7% is massive for C, nothing for Python, and the benchmark suite is unreliable.
I'd wait for independent measurements in real world applications. Python core developers are very good at writing bloated benchmark suites that are so convoluted that no one knows what is going on.
For more than two decades, big projects like "The need for speed" and sites like pythonspeed.com have popped up periodically without anything happening. But apparently that's enough to get folks excited.