Oh wow, my package on the front page again. Glad that it's still being used.
This was written 10 years ago when I was struggling with pulling and installing projects that didn't have any requirements.txt. It was frustrating and time-consuming to get everything up and running, so I decided to fix it; apparently many other developers had the same issue.
[Update]: Though I do think the package is already at a level where it does one thing and it does it well. I'm still looking for maintainers to improve it and move it forward.
I used this last year, with relative success. I was asked to fix some Python code written by an intern who had since left. The code used external libraries, but did not declare any. I had only a zip of the source code, without any of the libraries. pipreqs was able to identify the 22 required libraries. Unfortunately, there was a runtime crash because a library was at the wrong version. So I had to let a real Python developer handle the next steps.
BTW, this tool is not a dependency manager. Many sibling comments seem to misunderstand the use case and promote unrelated tools.
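(For anyone curious, the basic invocation is just pointing it at the source tree; exact flags may differ between versions, but something along these lines wrote the requirements.txt for me:

    pip install pipreqs
    pipreqs /path/to/project --force   # --force overwrites an existing requirements.txt

)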
Can I have it find the requirements as of a specific date?
If I find a script/notebook/project with no requirements.txt, I usually know when it was created. Being able to get the versions that were selected back then on the author's machine would be great for reproducibility.
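I don't think pipreqs does this, but as a workaround the PyPI JSON API exposes upload timestamps, so you can approximate "latest version as of date X" yourself. A rough sketch (package name and cutoff date are placeholders, and it ignores pre-releases/yanked files):

    # Sketch: newest release of a package uploaded before a cutoff date,
    # using PyPI's JSON API (https://pypi.org/pypi/<name>/json).
    import json
    from datetime import datetime, timezone
    from urllib.request import urlopen

    def version_as_of(package, cutoff):
        with urlopen(f"https://pypi.org/pypi/{package}/json") as resp:
            releases = json.load(resp)["releases"]
        best = None  # (version, upload_time) of the newest file before the cutoff
        for version, files in releases.items():
            for f in files:
                uploaded = datetime.fromisoformat(
                    f["upload_time_iso_8601"].replace("Z", "+00:00")
                )
                if uploaded <= cutoff and (best is None or uploaded > best[1]):
                    best = (version, uploaded)
        return best[0] if best else None

    print(version_as_of("requests", datetime(2019, 6, 1, tzinfo=timezone.utc)))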
For the future, please pick a package manager that can give you a lock file alongside your code, so that you have a definitive record of the dependencies.
Even if you have all versions as of the time of the last modification to the code, you don't know if the dependency resolution happened at that point in time, or if the environment was set up years prior and never updated.
In fairness, that only seems to list those packages where the import name doesn't match the package name (e.g., it doesn't include numpy), so its overall coverage is a lot larger than that.
I don’t know if this is how pipreqs works, but I’d be concerned about a typo in an import that inadvertently installs a library.
I’ve found pip-tools [1] to be a nice middle ground between `pip freeze` and something heavier like poetry. It’s especially helpful in showing you where a library came from, when a library installs other libraries it depends upon.
One could still have a typo in their requirements.in file with pip-tools, but changes there are much less frequent than imports in any Python file in your project, which could be a daily occurrence in some codebases.
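To make that concrete (package names and versions below are just illustrative): you keep only your direct dependencies in requirements.in, and pip-compile pins everything and annotates where each transitive pin came from:

    $ cat requirements.in
    requests
    $ pip-compile requirements.in
    $ cat requirements.txt
    certifi==2024.2.2
        # via requests
    charset-normalizer==3.3.2
        # via requests
    idna==3.6
        # via requests
    requests==2.31.0
        # via -r requirements.in
    urllib3==2.2.1
        # via requests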
Awesome project! Didn't realise this existed when I took a stab at implementing a similar thing a few months back: https://github.com/nmichlo/pydependence. It uses graph traversal of imports, with naive support for lazy imports, to generate optional extras and to modify the pyproject.toml as part of pre-commit hooks.
Edit: the main goal was to generate optional extras for different entry points in a large codebase, find missing or extra imports, resolve dependencies across multiple repos, and see which files reference the mapped packages. Ideally, if you have many internal repos which are not published and you cannot correctly use dependency resolution, you can generate requirements before you pass them on to something like uv.
I was looking for this and thought I was doing something wrong not finding anything...
Great job!
I do think, though, that a "clean" development workflow that avoids needing this would be to start a new project in a fresh virtual environment and run pip freeze on that env.
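i.e. something along these lines (the installed package is just an example):

    python -m venv .venv
    source .venv/bin/activate
    pip install requests              # install only what you actually use, as you go
    pip freeze > requirements.txt     # the env now reflects exactly what the project needs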
But pyproject itself isn't even taking a stance on the underlying question of dependency management, which can still be flit, poetry, uv, pipx, pipenv, and who knows what else.
(I'm a poetry user myself, but I can see the writing on the wall that poetry is probably not going to be the winner of who ultimately ends up in the PEP.)
There isn't a stance on the build backend itself, beyond supporting the basics[0]. The spec itself however exists as the accepted successor to requirements.txt, and I think is actually implemented by all but the last tool in your list.
I don't spend a lot of time in Python, but my current understanding, having read Python documentation and seen some projects online, is that you use pip and requirements.txt with --require-hashes, venv, and pyproject.toml as a standard dependency management toolchain.
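For what it's worth, the hash-checking part usually looks something like this; pip-compile's --generate-hashes is one way to produce the hashes (not the only one):

    pip-compile --generate-hashes requirements.in     # pins versions and records file hashes
    pip install --require-hashes -r requirements.txt  # pip refuses anything without a matching hash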
I believe pip-chill still operates on packages installed into the environment. This project seems to derive requirements from the code itself, even if no packages are installed in the current environment.
In other words, bring the thinking here. Whether it's new thinking, or decade old thinking, it's well past time to converge. We've had decades of trying it like Heinz catsup varieties. Worth trying more wood behind fewer arrows.
This is not fragmenting it. The requirements.txt file has been a steady and well used API for well over a decade. This tool just helps you produce one for a project that is missing one.
You're just picking a winner like every other Python dependency project. If it's not in a PEP do whatever the hell you want. Good ideas will get turned into future PEPs for tooling to standardize on. uv itself has two separate locking formats.
How about you stop trying to create package managers for every single language/ecosystem in existence and instead converge on trying to solve the whole problem once and for all with Nix.
Asking honestly since I've seen Nix and NixOS show up on the front page here a bunch over the years, but never used it: my impression of it is that it fills the same kind of niche as the Zig or Nim languages: conceptually pretty cool, but not widely adopted outside of a dedicated core user group.
Is this really the case for Nix, or is it actually widely adopted, and this adoption is underreported?
If it's not actually widely adopted, what do you think are the biggest obstacles in Nix's way?
They thought they had a better advertising chance for `uv` than they actually did. As far as I can tell, `uv` is a replacement for `pip`. But the project linked by OP doesn't actually replace `pip`, but rather a small subset of functionality in `pip` - `pip freeze`. Unless `uv` has some sort of import scanning functionality, the suggestion to use `uv` instead doesn't really make any sense.
Domination is a goal because it means that most investment will go into one stack. I can only re-iterate my wish and desire that Rye (and with it a lot of other tools in the space) should cease to exist once the dominating tool has been established. For me uv is poised to be that tool. It's not quite there yet today for all cases, but it will be in no time, and now is the moment to step up as a community and start to rally around it.
Yes, I linked to an obscure feature uv supports. It's already rather a lot more than that.
Again, you seem to be arguing against points no one is making. uv is rad, you don't have to convince me. But people are still going to use pip for a while, and if they do, the repo in OP is helpful. You need to apply your zealotry to relevant situations for it to be effective.
You are talking past people. It comes across like you're not reading comments with sincerity but rather as an empty vessel to attach your own personal opinion as a rebuttal.
For instance, the Rye author's views (no one mentioned Rye, btw) have little bearing on how uv helps in this particular instance.
Seconded. I tried it when it was the topic around here a month or so back and it blew me away. Wow. I've been writing Python for ~25 years. I dipped my toes in Rust a year ago and quickly appreciated how nice Cargo is for that ecosystem. uv feels, to me, like Cargo for Python.
It doesn't do anything I couldn't already do with other tools. It just combines those other tools' functions into one cohesive, ridiculously fast, ergonomic tool. I won't say anything bad about Poetry. I used that prior to uv. I don't want to return to it though.
I looked into uv a while back and it indeed was considerably faster than pip, but the folks who created it seem to be trying to build a business around Python tooling and are in the early stages. There is a risk of a rug pull of some kind down the road when they decide they want to make money, and that to me was too much of a risk at present.
I'm not gonna dig up someone's history to figure out what you mean. If you're going to make a throwaway account to insult someone at least be explicit about your accusations.
Mamba doesn't even interact with the official Python packaging ecosystem... It is purely a conda replacement, and conda packaging is not Python packaging (it's a more fundamental fragmentation than choosing a particular tool). So it's a weird choice for not fragmenting Python dependency management.
If you depend on both conda and standard Python packaging (e.g. PyPI.org) you can use Pixi: https://github.com/prefix-dev/pixi?tab=readme-ov-file#pixi-p.... Pixi attempts to bridge the conda world and the Python package world, so it's possible to rely more deeply on standards based packaging but still use conda tooling when you need it.
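e.g. (if I remember the CLI right) within a pixi project you can mix the two sources, and both end up in the same lock file:

    pixi init myproj && cd myproj
    pixi add python=3.11 firefox   # resolved from conda channels
    pixi add --pypi requests       # resolved from PyPI
    pixi install                   # writes/uses pixi.lock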
Yes, the entire workflow of conda/mamba is a liability. Having environments detached from directories encourages bad practices (reusing environments for unrelated things, installing everything into the same environment and getting very broken dependency resolutions, that stuff) and it doesn't even have a lock file format, so environments can't be saved short of dumping its entire directory.
But conda-style packages (or anything with install time dependency resolution really) also have the issue of being completely untestable. That makes them unsuitable at least for user-facing package management, if you care about your packages not randomly breaking without warning.
I'd rather see every language converge on a single package manager that implements functional package management, i.e. guix or nix. One can dream...
I've been using a conda+poetry flow for years and it's worked very well. Particularly because envs aren't tied to projects. I tried pixi for a few days in a project months ago and it was just breakage and untenable limitations all around. I prefer the freedom of using per project envs when I want, and shared envs other times.
What are you using each of them for in that workflow?
I've found that there really only are two kinds of packages I want to install: those I want globally (e.g. CLI tools like git, git-annex, DataLad, whatever shell enhancements you want, etc.) and project-specific dependencies (anything required to get going with it on a freshly installed system with the smallest amount of assumptions possible).
The former is sufficiently addressed by `pixi global` and other package managers, the latter by pixi projects. Notably, conda environments are a bad fit for both (not global, not really updatable, not automatically tied to a project, not rebuildable due to missing lock files, ...).
Conda for env management, particularly because it also allows for non-Python dependencies (one of my recent projects involved a browser automation solution, and I chose to install Firefox in a conda env as Ubuntu now only has a transitional package that points to a snap for it, and the snap install causes a lot of problems from an automation angle); I also have a few shared envs, for quick tests and general utils (I also use pipx, but it's restricted to packages with a CLI interface). Poetry for project management.
> particularly because it also allows for non-Python dependencies (one of my recent projects involved a browser automation solution, and I chose to install Firefox in a conda env
You could just as well install it either globally with pixi global, or as a project dependency in a pixi project. In this case, the latter.
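Roughly (assuming the CLI hasn't changed):

    pixi global install firefox   # available everywhere
    pixi add firefox              # or: pinned as a dependency in that project's pixi.lock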
> I also have a few shared envs, for quick tests
Fair, I just use temporary directories with pixi projects for that and thus no longer see a point in conda envs for this. It has the added benefit that all my temporary directories are in the same location and easy to get rid of.
> general utils
I would want those to be available at all times, so conda envs aren't fit for that purpose. Instead, I use pixi's global subcommand.
> Poetry for project management.
That limits you to lock files for python dependencies only, unfortunately. In a project relying on browser automation, I would want the browser version to be recorded in the lock file as well. Pixi does that.
---
I still don't really see why you found pixi lacking. It addresses all of your problems, maybe with the exception of globally installing multiple and non-CLI packages. But again, conda isn't any better as you are generally advised not to install stuff into the base env and conda has no other facility for globally installing general utilities.
Conda is nice for those of us stuck in government or similar roles with really locked-down systems. Without conda, I'd be stuck with a global Python 3.9 and without the ability to install the non-Python dependencies, like OpenSSL, required for various libraries or to run Jupyter locally.
Pixi is installable under the same conditions and uses the same repositories as conda, so unless you are specifically allowed to use conda and forbidden to use anything else this isn't an argument for conda.
When I first started with Python, long ago, I looked into these kinds of solutions, which didn't work so well, and wondered why the concept was not better developed. Later, with experience, I realized it is not a great idea, and more hassle than the benefits it brings.
I don't think it is a good idea to merrily write tens of import statements and end up with loads of dependencies.
True, but Pipfiles are basically whatever pipenv wants them to be. I'm not aware of anything but pipenv that uses them, although I'm sure there are more than zero such things. My point here being that I wouldn't take a lack of commits to the Pipfiles repo to mean that they're not still under development. More likely it's just being developed in the pipenv repo.
But in any case, I'd vastly prefer the standard pyproject.toml over some file specific to mostly one tool.
> This repository contains the design specification of the Pipfile format, as well as a proposed implementation of a parser for the specification which can be used by Pipenv and, in the future, any other consumer (e.g. pip)
It should probably be updated to reflect whatever the current status and goals are.
"In Python, are there any libraries where the pip install command is different than the import?"
ChatGPT: Yes, there are some Python libraries where the `pip install` command differs from the module name you use for `import`. Below are a few examples of such cases:
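For instance, a few well-known cases of that mismatch (exactly the kind of mapping a tool like pipreqs has to know about):

    pip install Pillow          ->  import PIL
    pip install beautifulsoup4  ->  import bs4
    pip install scikit-learn    ->  import sklearn
    pip install opencv-python   ->  import cv2
    pip install PyYAML          ->  import yaml
    pip install python-dateutil ->  import dateutil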
If you install this in a company that takes security seriously, you should get warned or fired pretty much.
It's one thing to import code from a random developer on the internet to do your job, we get a pass on that if it makes us more productive, but to import code that helps us with importing code? Red flag.
How is it more secure to manually browse through the codebase and construct a requirements.txt file yourself, than to run this script to automate the process? In either case you end up with a requirements.txt file which will need to be audited anyway.
I guess that it's more of a semantic implication of the tool.
If the project were to create a directory that maps Python namespaces to PyPI packages, sure. But it's designed as a "tool" that "automates" a pesky little inconvenience of a missing requirements.txt file. The readme explains in no way how that mapping works or any security considerations.
It doesn't give me the impression that the author or the users would care about auditing the requirements.txt file at all. (Also, what would auditing the dependencies entail? Reading the source code of the dependencies? Doubt it.)
...it would also be nice if Debian (Ubuntu and other derivatives) stopped shitting on pip in some ridiculous paranoid attempt to stop breaking apt packages (that don't break apt packages anyway) or because of "muh security".
Huh? Messing with the system Python install absolutely can break your system.
Just use venvs, or install separate Python versions for development e.g. via pyenv. You likely don't want to use the relatively outdated Python version shipped with Debian anyways.
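Roughly (the version number is just an example):

    pyenv install 3.12.4                           # build/install a separate interpreter
    pyenv local 3.12.4                             # pin it for this directory
    python -m venv .venv && source .venv/bin/activate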
This looks like it was copied from a readme file without regard for the context of this discussion thread. Instead of merely advertising, you should have posted something that actually refers to the capabilities and purpose of pipreqs and explained how PDM handles that better or obviates the need for a tool to do what pipreqs does.
I already stated the obvious differences. Requirements files cannot capture a lot of the metadata that is required to reproduce a Python setup. PDM handles a lot of important stuff: dependency management in an easy-to-edit pyproject.toml file, properly implementing the standard specs PEP 518 and PEP 621. Plus it automatically creates a .venv with all the required packages when you run `pdm install`. It also installs and manages the required Python version in the .venv. And it locks the proper package dependencies in a pdm.lock file with proper hashes, which pipreqs and the others can't.
With pyproject.toml being the main config file, you can also define build scripts and run scripts, like you can with `npm run dev`, using `pdm run dev` for example. You can also define all the development tools and their config alongside it.
We used PDM; it streamlines package and dependency management, project management, and venv config all in one pyproject.toml. You can generate requirements.txt and pdm.lock, which in turn build your venv, and easily share it with the team (just one pyproject.toml file to share).
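For example, a minimal pyproject.toml with a PDM-style script (the project name and command are made up):

    [project]
    name = "myapp"
    version = "0.1.0"
    requires-python = ">=3.9"
    dependencies = ["requests>=2.31"]

    [tool.pdm.scripts]
    dev = "python -m myapp --reload"   # invoked with: pdm run dev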