Hacker News

which python

which python3

which python3nogil




What do you propose as an alternative? I write code in a lot of languages and I can't think of a single one where I don't have to consider the version. This applies to C, node, ruby, swift, gradle/groovy and java at least. Even bash. When developing for Android and iOS, I have to consider API versions.


Almost every third party ML model I look at seems to have different versions, different dependencies, and requires deliberate trial and error when creating container images. It's a mess.

Having interpreters and packages strewn across the machine is a nightmare. The lack of standard tooling has created a lawlessly dangerous wild west. There are no maps, no guardrails, and you have to beware of the hidden snakes. It goes against the zen of python.

As a counter example, Rust packs everything in hermetically from the start. Python4 [1] could use this as inspiration. Cargo is what package and version management should be, and other languages should adopt its lessons.

[1] Let's make a clean break from Python3 even if we don't need a new version right now.


The ML community has horrendous engineering practices. Everyone knows this. This isn’t the fault of Python, nor should Python cater to people who build shoddy scaffolding around their black boxes.


I mean, you're not entirely wrong but Python really really doesn't make it easy.

Consider R, which is filled with the same kind of people. There's one package repository and if your package doesn't build cleanly under the latest version of R, it's removed from the repo.

Don't get me wrong, this has other problems but at least it means that all packages will work with a single language version.


> I mean, you're not entirely wrong but Python really really doesn't make it easy.

That's a vast exaggeration. It is not "really really" hard to spin up a venv and specify your requirements. People just don't do it, and blame the tools for what are bad engineering practices agnostic to any language.
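For what it's worth, the whole dance is only a few commands. A minimal sketch (the `touch` line is only there so the snippet is self-contained; a real requirements.txt lists name==version pins):

```shell
python3 -m venv .venv              # per-project interpreter + site-packages
. .venv/bin/activate               # puts .venv/bin first on PATH
touch requirements.txt             # stand-in; a real file lists name==version pins
pip install -r requirements.txt    # deps land in .venv, not in the system Python
```

Anything installed while the venv is active stays inside `.venv`, so deleting that directory is a complete uninstall.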

"Really really" not easy would be handling C, C++, etc. dependencies.


Generally that is a straightforward process of compiling, reading the error message, googling "$dist install $dirname of missing dep", running the apt-get / emerge / yum command, and then repeating the compile command. Sometimes people will depend on a rare, unbundled dep, but not that often. Worst case you need to upgrade the automake toolchain or rebuild boost or something.

Maybe it takes more time than getting Python deps to work, but it's more deterministic and takes less cleverness.


I work in data science in Python (and the parent was about ML), and basically everything in that space has C- and Fortran-level dependencies. This is where Python is really really bad, so no, it is not as simple as you're making out.

I really really wish it was, as then I wouldn't have had to learn Docker.


Python is a much older and more generalist language than R, so yes, while it would be great to impose this kind of order on things, it's not practical for its current extent of use.

That being said, after two decades of using Python professionally, the only real problems I've ever encountered are "package doesn't support this version for {reasons}" and "ML library is doing something undocumented and/or dumb that requires a specific Python version." The former is normally because the package author is no longer maintaining their package, and the latter is because, again, the ML community is among the absolute worst at creating solid tooling.


I don't disagree that Python's place in the ecosystem ("generalist", i.e. load-bearing distro fossilization in everything from old binary Linux distros, container layers, SIEM/SOAR products, serverless runtimes...) leads to much packaging complexity that R just doesn't have.

However, Python (1991) is only 2 years older than R (1993).


Oh, my bad. I thought R was quite a bit younger.


And R is a clone of S which is from 1973.


When people could design systems that work for long periods of time.


Rust and Node (via nvm) feel good. The worst I run into is “this version of node isn’t installed” and then I just add it. And I don’t have to worry about where dependencies are being found. Python likes to grab them from all over my OS.


I use direnv and pyenv. When I cd to a repo/directory, the .envrc selects the correct Python and the directory has its own virtual environment into which I install any dependencies. I don't find that Python grabs packages from all over the OS.
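A sketch of what that looks like on disk (the exported variable is hypothetical; `layout python3` is a stock direnv helper that creates and activates a venv tied to the directory, and pyenv picks the interpreter from a `.python-version` file):

```shell
# .envrc at the repo root; direnv sources this on cd
layout python3                # create/activate a per-directory venv
export APP_ENV=dev            # hypothetical per-project setting
```

After editing .envrc you approve it once with `direnv allow`, and from then on entering the directory activates the environment automatically.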


Indeed there’s a good half-dozen options in Python for this kind of thing. But it depends on which one each project opts to use.


pyenv works locally, no matter what the project opts to use. The only thing it needs for a project to be managed is a .python-version file, which you can throw in .gitignore.


It doesn't matter what you do. The vast majority of code I'm using from other people doesn't. Even my personal python methodology differs from yours.

Plus, you now have to teach and evangelize your method versus the dozens of others out there. It's crazy town.

The negative thoughts and feelings I once had for PHP are now directed mostly at Python. PHP fixed a lot of its problems over the last decade. Python has picked up considerable baggage in that time. It needs to take the time to do the same cleanup and standardization.


> It doesn't matter what you do.

I was describing a workflow that works for me to someone who didn't seem to have found an effective Python workflow, in hopes that it can work for them too. I work across a variety of languages and every one of them has some issue I can complain about[1]. I personally don't find Python all that painful to work with (and I've been working with it since 1.5.2), but I understand my experience is not universal.

[1] If it's not the language, it's the dependency manager. If it's not the dependency manager, it's the error handling. If it's not the error handling, it's the build process. If it's not the build process, it's the community. If not the community, the tooling. Etc. I have some languages I like more and some less. Mostly it comes down to taste. I'm not here to apologize for or defend Python. I'm only here to describe how I use it effectively, and to correct what I thought were inaccuracies with respect to removing the GIL.


> When I cd to a repo/directory, the .envrc selects the correct Python

For this you don't need .envrc and direnv, as this is handled perfectly fine by pyenv itself: pyenv local <pyenv / virtualenv name>


pyenv explicitly does NOT manage virtual environments:

https://github.com/pyenv/pyenv

I use direnv because I work with many languages and repos and I don't want each language's version manager linked into my shell's profile. As well, direnv lets me control things besides the language version. Finally, direnv means I don't have to explicitly run any commands to set things up. I just cd to a directory.


Also rustup. If you send me a repo with a Cargo.toml which references a specific version of Rust, it'll download it on the fly.

It's insane that neither Node nor Python has a first-class version selector.


FWIW, I don't think it's nice that rustup fetches and installs new versions without prompting, but I suppose that other users like it or get used to it. Fortunately most Rust projects work on any recently stable version.


> rustup fetches and installs new versions without prompting

I don't think that's true. rustup installs a new version only when you run `rustup update`. What the parent is talking about is pinning a particular rustc version for the project, which allows rustup to download that version of rustc to build that particular project/crate.


rustup will automatically download that version when you interact with that project, though, and that's what I mean. It doesn't sit right with me, comes as a surprise, but I guess it's not the biggest issue in the world.
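For reference, the pin that makes rustup auto-install on the fly normally lives in a `rust-toolchain.toml` at the repo root (the version and components below are illustrative; Cargo.toml's separate `rust-version` field only declares a minimum supported version and does not trigger a download):

```toml
# rust-toolchain.toml: rustup reads this and fetches the pinned toolchain on demand
[toolchain]
channel = "1.74.0"
components = ["clippy", "rustfmt"]
```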


Node does allow you to declare which node versions are supported in your package.json. The definition is there, but there isn't any tool that reads the declaration and switches to it accordingly. I feel it is somewhat half-assed. But it could also be caused by the fact that the entity that distributes the packages (npm) and the node binaries (various Linux repositories) isn't the same group of people. So there isn't really anyone who can do anything about it unless we get something like corepack someday. (Probably someone should name it 'corenode'?)
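The declaration in question is the `engines` field in package.json; by default npm only warns on a mismatch, and nothing switches interpreters for you (name and version range here are illustrative):

```json
{
  "name": "example-app",
  "engines": {
    "node": ">=18 <21"
  }
}
```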


Isn't this all handled by pip typically? Even though most models don't necessarily put it in the readme, the user should be using some sort of env manager.


rye is an experimental way to manage python installs and dependencies, it's inspired by both rustup and cargo.


I mean, Java seems like a pretty good alternative? Obviously it's trivially true that programmers have to care about versions, but they've done miracles in the VM without breaking compatibility.


8 to 11 needed changes in application code.


It's not the only time; the change to how string slicing worked also broke a lot of performance guarantees.


Pragmas seem like the correct way to have done the Python 2->3 migration. Does anyone know of some technical limitation as to why they weren't used? It is a very obvious solution in hindsight, but I wasn't there.


I saw some people mentioning changes like changing the print statement to a print function. That was actually one of the most trivial changes, and you could import print_function from __future__, which worked like a pragma.

A similar problem existed with the changed division behavior (which actually was more challenging), but similarly you could enable that behavior per file.
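As a sketch of how those two opt-ins behaved (these imports are accepted as no-ops on Python 3, which is what made the file-by-file migration workable for them):

```python
# On Python 2, these imports switch just this file to Python 3 semantics;
# on Python 3 they are accepted and do nothing.
from __future__ import print_function, division

print("1/2 =", 1 / 2)   # print is a function, not a statement

assert 1 / 2 == 0.5     # "true division" returns a float
assert 1 // 2 == 0      # floor division remains available explicitly
```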

The main problem with the migration, though, was the addition of Unicode. You can't just enable it on a file-by-file basis, because once you enable the new behavior in a single file, you will start passing arguments as Unicode to code in other files, and if that code wasn't adapted it will break.

And it was even worse than that, because the problem extended to your dependencies as well. Ideally dependencies should be updated first, then your application, but since Python 2 was still supported (for a decade after Python 3 was released) there was no motivation to do it.

And if that wasn't enough, Python 2 already had Unicode support, but that implementation was incorrect, so even if you imported unicode_literals from __future__ you potentially broke compatibility with existing Python 2 dependencies, with no guarantee that your code would work on Python 3.

IMO that particular change couldn't be done with pragmas. The core issue is that Python 3 put a clear separation between text and binary data, while Python 2 mangled them together. That was still true even when you used Unicode in Python 2.
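That separation is easy to demonstrate on Python 3: text (`str`) and binary data (`bytes`) no longer mix implicitly, so a boundary Python 2 silently coerced now fails loudly:

```python
text = "héllo"                 # str: a sequence of Unicode code points
data = text.encode("utf-8")    # bytes: an explicit encoding step

# Python 2 coerced between the two silently; Python 3 refuses.
try:
    text + data
except TypeError:
    print("str + bytes raises TypeError on Python 3")

# Crossing the boundary requires naming an encoding.
assert data.decode("utf-8") == text
```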

The proper way to perform the migration, IMO, would have been to type-annotate the code and then run a mypy check in Python 3 mode.


Back when Python 3 was initially conceived, the language just wasn't that widely used, and mostly by enthusiasts. Some breakage wasn't considered a big deal; it was expected users would easily update their code.

But during the time it took to design and deliver Python 3, the language exploded in popularity, reached a much wider audience, and 3rd party libraries like numpy became crucial. So when Python 3 was ready, it faced a completely different ecosystem which was much harder to migrate. But I don't think the core team really realized that before it was too late.


asdf makes all of this pretty easy. For consulting I often need multiple versions of everything to match client projects: just install them with asdf and put a .tool-versions file in the project folder with the desired tooling builds.
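A sketch of such a file (versions here are illustrative; asdf switches tools automatically when you cd into the directory):

```
# .tool-versions at the project root
python 3.11.9
nodejs 20.11.1
```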


sudo apt install python3nogilispython3

sudo apt install python3ispython


You would not need separate packages to do that (in fact, you can't do this with separate packages because dpkg will complain if two packages provide the same file).

  sudo update-alternatives --set python3 /usr/bin/python3-nogil
  sudo update-alternatives --set python /usr/bin/python3


The PEP actually states that the nogil version would also have an environment variable allowing you to temporarily enable the GIL. Although I guess in practice they might still build separate versions.
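Per PEP 703 (and the CPython 3.13 free-threaded builds), the switch is the `PYTHON_GIL` environment variable, with a `-X gil` command-line equivalent; on such a build you toggle at launch rather than installing a second interpreter (app.py is a placeholder):

```shell
PYTHON_GIL=1 python3 app.py     # free-threaded build, GIL forced back on
python3 -X gil=0 app.py         # free-threaded build, GIL off
```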


It's probably going to be

/home/you/project/venv/bin/python


It would be more like: does the code support nogil?

Because the code that is updated is expected to work with both versions.


It is not the language’s task to solve. Use a proper dependency manager like nix.


Or pyenv (https://github.com/pyenv/pyenv) if you don't want to take the plunge into something like nix.

As for managing Python library dependencies, I use poetry (https://python-poetry.org), though unfortunately both it and pipenv seem to progressively break functionality over time for some reason.


venv, virtualenv, pipenv, pyvenv, venvwrapper, conda, ...

Python4 needs a hard reset.


Those are all means to the same end.

venv == virtualenv

virtualenvwrapper is ancient and not really used anymore.

pyenv is a third party tool that makes some of this easier, notably around creating more than just a virtual env in that you also choose the Python version.

Python is not hard to deal with in this regard; I think people are just uninformed.


This is Stockholm syndrome


It’s one of the most popular programming languages in the world for a reason.


And the terrible package management story is not one of those reasons


correct, because it’s not terrible.


I routinely help data science people with mangled local Python installs. It could be a full-time job, but a really bad one.


how about:

pip install

pip install -u

sudo pip install

conda install

sudo conda install

Some packages require one, are fine with another (with a few small warnings), and don't work with a third way of installing.


If you're installing a Python package into the global site-packages directory (i.e., into the system Python) you might need sudo. That's how permissions work.

I don't know the -u flag on pip; I've never used it and can't find it in the docs.

With a virtual environment sudo is not needed. Assuming you created it, and/or it is owned by you.

Virtual environments are just directories on disk. They are not complex.

I don’t use conda because it’s never felt even remotely necessary to me.

pip and a requirements file is all you need.


-u flag is short for --user ( https://stackoverflow.com/questions/42988977/what-is-the-pur... )

How about when you are authoring a script under your own user, but then want to schedule it for cron to run periodically?

I often find myself working under my user on a remote server, but then when I want to schedule a cron job I run into all sorts of permission bugs and missing packages.

Especially when multiple machines need to run the script, and I don't want to involve containers just to run a simple 20-line Python script.

This is why Golang is so popular: you can just scp a single binary across machines and it will just work.
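One way to take the sting out of the cron case without containers is to skip `activate` entirely and point cron at the venv's interpreter by absolute path (paths here are illustrative); the venv's python finds its own site-packages on its own:

```shell
# crontab entry: no activation step needed
*/15 * * * * /home/me/project/.venv/bin/python /home/me/project/script.py >>/var/log/script.log 2>&1
```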


I've started packaging up my clients' Python scripts as Docker images. Works great for cron tasks, and updates/rollbacks are a breeze.


I also forgot about:

1. pip install (with/out --user flag )

2. pip3 install (with/out --user flag )

3. sudo pip install

4. sudo pip3 install

5. conda install

6. sudo conda install


And because conda wasn’t enough, there are mamba and micromamba rewrites of conda in C++.


Are you kidding me? The horrendous way Python does dependency management and virtual environments, and the fractured ecosystem around those, is one of its biggest pain points, often covered by core CPython developers and prominent third-party Python developers, hardly "misinformed" people.

https://xkcd.com/1987/


That comic is very old. In the days of 2.x it was a little hairier, but nothing like people make it out to be.

The only thing you really need to understand is "sys.path". If you inspect this in a shell you will know what you're up against. Python is literally just directories. It's so easy, and yet people get so bent out of shape over it.

Create a venv, activate it, and use pip as normal. If you ever run into issues, look at sys.path. That’s it.
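For example, a quick way to see what you're up against (output differs per machine, so none is shown):

```python
import sys

# sys.path is just an ordered list of directories searched on import;
# a venv works by putting its own site-packages directory into this list.
for entry in sys.path:
    print(entry)

# sys.prefix points at the active environment: the venv directory when one
# is active, the base installation otherwise.
print("prefix:", sys.prefix)
```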


>That comic is very old.

And few things have changed since then.


I’m gonna have to disagree with ya there bud. Never been easier to be a Python developer.


Which is irrelevant. We're talking about the dependencies/packaging/virtual environments situation, not whether "it's easy to be a Python developer" in general.

And you can disagree all you want, but it's simply wrong that Python's packaging/venv ecosystem is "just fine".


There are options to do things other ways, but most of the time I just use venvs and pip for everything.

Is it because people have to use venvs that people complain about it?

I’ll admit being able to install via an OS package manager, vs pip, vs anaconda etc etc can be confusing, but is any of that really Python (the language)’s fault?


more like which python3-compat


Just use conda/mamba, unironically. It solves all of these problems. venv and the other solutions should be deprecated already.



