It does its job well, but it doesn’t do much more than that.
The main workflow w/ pip is:
- install whatever you need to get your app working
- `pip freeze > requirements.txt` so the next person can just `pip install -r requirements.txt` instead of figuring out the requirements
- either edit `requirements.txt` manually to do updates, or `pip freeze` again later (an example of what that generated file ends up looking like is below)
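For context, a freeze-generated requirements.txt pins everything installed in the environment, direct and transitive deps alike, with nothing marking which is which (package names and versions here are just made-up examples):

```
# output of `pip freeze` -- direct deps and their sub-dependencies all look the same
flask==3.0.2
click==8.1.7          # pulled in by flask
itsdangerous==2.1.2   # pulled in by flask
jinja2==3.1.3         # pulled in by flask
requests==2.31.0
certifi==2024.2.2     # pulled in by requests
urllib3==2.2.0        # pulled in by requests
```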
There are some issues with this:
- dependencies that are no longer needed tend to stick around since they’re in the requirements.txt
- updating dependencies is a manual process and pretty annoying, especially when different packages pull in their own (sometimes conflicting) sets of dependencies that you have to reconcile by hand
- requirements.txt doesn’t have any structure, so to do something like separating dev vs prod dependencies, you need two (or more) requirements files (see the sketch below)
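The usual workaround looks something like this (the file names are just the common convention, nothing pip enforces, and the packages are placeholders): a prod file plus a dev file that includes it via `-r`.

```
# requirements.txt (prod deps)
flask==3.0.2
requests==2.31.0

# requirements-dev.txt (dev/test deps, includes the prod file)
-r requirements.txt
pytest==8.0.0
ruff==0.2.2
```

Then prod runs `pip install -r requirements.txt` and local/CI runs `pip install -r requirements-dev.txt`.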
It’s totally fine for one-off scripts and whatnot, but it gets pretty annoying when working across multiple repositories on a larger project (i.e. what I do at work with microservices).
Poetry improves this in a few ways:
- poetry.lock - similar to requirements.txt, in that it locks all dependencies to specific versions
- pyproject.toml - lists only your direct dependencies and certain exceptions (e.g. if you want to force a specific version of a transitive dependency)
- package groups - we tend to have local (linters and whatnot), test (deps for unit tests), and the default group (prod deps), so you can install only what you need (e.g. our CI uses test, prod uses no groups, and local uses everything; rough sketch below)
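To make the groups concrete, a pyproject.toml along those lines might look like this (package names and versions are just placeholders, group names match the setup I described):

```
[tool.poetry.dependencies]   # default group = prod deps
python = "^3.11"
flask = "^3.0"
requests = "^2.31"

[tool.poetry.group.test.dependencies]   # unit test deps
pytest = "^8.0"

[tool.poetry.group.local.dependencies]  # linters and whatnot
ruff = "^0.2"
mypy = "^1.8"
```

If I’m remembering the flags right, prod then does `poetry install --only main`, CI does `poetry install --with test`, and local dev does `poetry install --with test,local`.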
There’s a simple command to update all dependencies, and another command to try to add a dependency with minimal impact. It makes doing package updates a lot nicer, and I can easily compare direct dependencies between repositories since there’s minimal noise in the pyproject.toml (great when doing bulk updates of some critical dependency).
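If it helps, the commands I have in mind are roughly these (httpx is just a placeholder package, and exactly what gets bumped depends on the version constraints in your pyproject.toml):

```
# bump everything to the latest versions allowed by pyproject.toml and rewrite poetry.lock
poetry update

# add one new dependency without upgrading unrelated locked packages
poetry add httpx

# or add it with an explicit version constraint
poetry add "httpx@^0.27"
```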
TL;DR - pip is fine for small projects; alternatives are nice when dealing with large, complex projects because they give you finer control over dependencies.
Good to know, thank you for educating me!