I can understand your frustrations in some of those cases, but it almost seems like the issues are being overcomplicated. I have worked with complex code bases and three commands got me running. Occasionally requirements.txt might fail me, so yes, I might have to install a package manually. But for most projects I am up and running in 2-3 minutes. I don't bother with pipenv or poetry. I use the KISS method.
How do you create the requirements.txt file - by hand, or by pip freeze? If by hand, how do you make sure to lock the versions of your transitive dependencies? If by pip freeze, how do you keep track of what you actually depend on as opposed to what your dependencies depend on?
How do you update your dependencies? Do you modify requirements.txt directly? If so, how do you keep it in sync with setup.py? How do you find updated versions of your dependencies - do you search PyPI by hand and then update the file? Or, do you leave the versions of your dependencies unlocked in requirements.txt and ask pip to re-install everything to pull in the updated ones? And if it's the latter case, how do you then lock them down again so that if you do multiple deployments you'll always get the same set of dependencies installed?
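To make the distinction concrete, here is roughly what I mean (versions are just illustrative):

    # requirements.txt written by hand - only what the project imports,
    # transitive pins left to whatever pip resolves on install day:
    flask>=1.0,<2
    requests>=2.20,<3

    # requirements.txt from `pip freeze` - everything in the venv pinned,
    # with no record of which packages are mine and which are transitive:
    certifi==2019.3.9
    chardet==3.0.4
    click==7.0
    flask==1.0.2
    idna==2.8
    itsdangerous==1.1.0
    jinja2==2.10.1
    markupsafe==1.1.1
    requests==2.21.0
    urllib3==1.24.3
    werkzeug==0.15.2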
I'm really not trying to say that there is anything wrong with the way you work - for some types of work, these aren't big issues. And, if these issues don't matter for your use case, well, that's awesome and keep on rocking it. However, for use cases where these issues do matter, the current Python dependency landscape is a bit of a dumpster fire - there are 50 ways to do everything and none of them work well. And worse, none of them really seem designed to solve the problem. And it's not like it's the UNIX philosophy at play, where tools are designed to solve one problem at a time and you can solve complex problems by composing them. I've spent a ton of time trying to make a reasonable workflow that handles the update and deploy lifecycle well, and no combination of tools seems to do it. (Except poetry - that tool, while not perfect, seems to actually be trying to fix this.)
Pip freeze is dead simple and I upgrade packages as needed. I've rarely manually updated requirements.txt. I literally make a venv, pip install what I need, and I am off and running. My editor/IDE automatically recognizes the environment. I can't imagine how much simpler it could be. I've been doing it for years without any problems. Dependencies, versioning, etc. are incredibly simple. A couple of commands and I am up and running, boom, done. I probably don't even have time to get a cup of coffee in that workflow. I can also deploy to QA and production when necessary fairly fast, although that might take a few more minutes.
Having a proper set of requirements files for production is a trivial 30-second task and can be automated easily.
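Roughly, the whole workflow is just this (package names are placeholders, and the exact file handling is my own habit rather than any standard):

    python -m venv .venv
    source .venv/bin/activate
    pip install flask requests          # whatever the project needs
    pip freeze > requirements.txt       # lock it down

    # for QA/production, install from the locked file
    pip install -r requirements.txt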
In my case, it doesn't. I'm maintaining libraries and also applications that use those libraries.
For libraries, you need your setup.py to be kept up to date, as requirements.txt doesn't do anything when you pip install a library package. Of course, requirements.txt is necessary when you want to run the library's tests, since few things are as frustrating as having the tests broken by some random dependency having a new version come out. But then you have to keep setup.py and requirements.txt kinda synced - only kinda, because in setup.py you will list your test and dev dependencies separately, but in requirements.txt they all get mixed up together. In theory it's possible to script keeping requirements.txt up to date - in practice, when working with a big team, it's a tremendous pain. The first option is to tell everyone not to mess it up, but that doesn't work. The second option is to develop a bunch of scripts to do it, but then you have to get everyone to install and use them, and that's quite frustrating since it's not at all clear to me why the standard tools don't do it already.
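To illustrate the duplication, here is a stripped-down version of what a library ends up carrying (names and pins are made up):

    # setup.py - what consumers of the library see; loose ranges on purpose
    from setuptools import setup, find_packages

    setup(
        name="mylib",
        version="1.2.0",
        packages=find_packages(),
        install_requires=["requests>=2.20,<3"],
        extras_require={"test": ["pytest>=4"]},
    )

    # requirements.txt - what CI and developers install; pinned so a new
    # upstream release can't break the test run, and kept in sync with
    # the ranges above by hand (direct, transitive, and test dependencies
    # all mixed together)
    requests==2.21.0
    certifi==2019.3.9
    urllib3==1.24.3
    pytest==4.4.1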
We also have applications. And those applications depend on some libraries we wrote. Those libraries have their own dependencies. When a library is updated, it might gain or lose dependencies. So, when an application is updated to use a new version of that library, its requirements.txt should be updated accordingly. pip installing the new version of the library will bring in its new dependencies - but it won't get rid of the old ones from the virtual environment. A subsequent pip freeze will freeze a bunch of dependencies that aren't needed anymore - and they get harder and harder to find and eliminate as the number of unused dependencies grows. Again, this could be scripted, but it's a pain to do, and the tools should support it.
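Concretely, the failure mode looks something like this (library and package names are made up):

    # Application venv today, with mylib 1.x which uses requests:
    #   mylib==1.2.0  requests==2.21.0  certifi==2019.3.9  urllib3==1.24.3
    #
    # mylib 2.0 switches from requests to, say, httpx:
    pip install --upgrade mylib       # brings in httpx and its dependencies
    pip freeze > requirements.txt     # still pins requests/certifi/urllib3,
                                      # because nothing ever uninstalled them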
What's so frustrating is that these aren't unsolved problems in computer science. There are solutions. NPM/yarn does an OK job (I have minimal experience). Rust's cargo is fantastic. I've heard that Ruby's bundler is great. I fully appreciate that solving these problems probably requires volunteers - and I'm not volunteering, so maybe there is only so much I can complain. But, looking at most of the work going into the ecosystem, it seems to be ignoring these problems.
Anyway, maybe I was a bit too harsh in my initial response. If your workflow is working for you, that is great. What I would suggest, however, is that if you find a need to add additional requirements to that workflow (such as an easy way to update a single dependency and its transitive dependencies - both adding and removing them), you'll find that the available Python options quickly disappoint you. For your sake, I hope that doesn't happen, since it's unpleasant to deal with.
You seem fixated on setup.py. I have literally never cared about that file in 20 years of Python development. Keep the workflow simple and problems tend to disappear.
You can't create a Python library without a setup.py file. Some of the things that I develop are Python libraries, so, having a setup.py file is a must. It would seem that you aren't developing the same types of things as I am - which is totally fine. But, I've been trying to describe my particular use cases, and for those use cases, the Python tooling isn't great. My use cases aren't that niche, and, they aren't invalid. They are different than yours, but, that doesn't mean that I'm not keeping it as simple as I can.
Why would I care if a requirement is direct or a dependency? If it's a dependency it's a requirement. Period.
Speaking of dependencies, NPM generally has a lot more and they are deeper and all too often for little shit that never should have been an import in the first place IMO. More parts == more stuff to go wrong and all too often it seems to.
Pip isn't perfect and granted I use it more than NPM. And once in a blue moon I do run into troubles and wind up installing a package manually or editing requirements.txt by hand. But it mostly works. NPM on the other hand seems to blow up quite often. At least for me. Versioning, this is different than it was a week ago, oh, this dependency only works on Macs, etc. etc. Plus it just feels about as trustworthy as gas station sushi. Only my opinion and experience, but I've spent a lot more time fighting with NPM than pip.
> Why would I care if a requirement is direct or a dependency? If it's a dependency it's a requirement. Period.
Let's say I need package A. And, package A depends on package B, but, I'm not otherwise using package B.
1. I want to say that I need package A, >=1.0 and <2. I want to be able to tell my tools to go find the most recent version of package A and install it, as long as it meets my requirements. I do not care at all which version of package B is installed, as long as it doesn't conflict with any of my other requirements. I don't want to see package B at all - it's just an implementation detail of package A.
2. When I deploy to production, I want to make sure that the deploy is repeatable. If the last time I deployed, I had package A==1.2.3 and B==4.5.6, then, when I deploy again, I want those exact same versions.
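One way to express both of those at once - roughly what pip-tools does today with a requirements.in/requirements.txt pair, and what I'd like to be a first class part of the standard tooling (output format approximated):

    # requirements.in - what I actually depend on
    A>=1.0,<2

    # requirements.txt - generated lock file (e.g. `pip-compile requirements.in`),
    # committed so every deployment installs exactly the same thing
    A==1.2.3
        # via -r requirements.in
    B==4.5.6
        # via A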
Ok, but still, why should I care if a package listed in requirements is a dependency? What difference does it make in the real world?
And `pip freeze > requirements.txt` writes out the versions of the packages that `pip install` later installs.
Very very seldom has this ever caused problems and I've been doing Python for over 8 years. I just really don't get the "problems" you see here, they seem pedantic and theoretical rather than real world issues.
Again, not to say pip is perfect, it isn't. Just that it's way more reliable than NPM.
You don't care, and that's fine. I, however, do care.
I prefer to keep the list of packages that I'm installing limited to just those that I actually need - maybe someone will argue this is unnecessary, but I think that limiting what you install to what you are actually going to use, as much as possible, is just good behavior when you are installing onto your production servers. If I have a flat list of packages in a requirements.txt, it's super hard to keep track of which ones I'm actually using as opposed to those that some other dependency is using. And when some dependency stops using them, they tend to get stuck in requirements.txt for no reason.
It's hard to keep track of what the actual version requirements for those packages are - which means it's hard to keep them updated. As much as possible, I want to keep the versions of the packages I'm using up to date. Or, at least, have the option to do regular updates on some schedule that works for me and my team. But, when I have this flat list of locked packages generated by pip freeze, that becomes a big giant pain. It should just be a single command and then some testing.
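With a separate list of direct dependencies, an update really is a single command plus testing; with a frozen flat list it isn't. Using pip-tools as one example of what I mean:

    pip list --outdated                        # see what has new releases
    pip-compile --upgrade-package requests     # re-lock just requests (and its
                                               # transitive deps) in requirements.txt
    pip install -r requirements.txt            # install, then go run the tests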
These aren't niche or pedantic use cases. Just because my use case isn't the same as yours doesn't make mine invalid, or even uncommon. I'm sure your use case works for you, and that's great - and I'm not going to describe it pejoratively just because it's different than mine.
I haven't used NPM much, but NPM at least does attempt to address some of these issues. I can't speak to how well it does. I do know that package managers for other languages, such as Rust's cargo, do address these types of issues and make them first class concerns. So, it's not like this is some giant unsolved problem in computer science - it's just that most of the Python tooling doesn't address it.
The argument is not really whether it's easy to get up and running with some trivial app. Clearly it is easy in Python. Parent's point is about the longer term development cycle, when you have more dependencies, more project complexity, and more potential for small errors to ripple and affect your production systems.
The argument is invalid. If a project has that many issues, someone did a terrible job or chose a terrible ecosystem. pip/venv/python is dead simple even for incredibly complex projects. If someone screws that up routinely, they might need to consider a different career.