I honestly don't get why I was downvoted, but mkvirtualenv, workon xyz, pip install -r requirements.txt isn't exactly a huge barrier to entry for a Python project. Personally I always saw the node_modules directory as something to be ignored. It's a crowded bunch of junk.
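For reference, the whole workflow I'm describing is roughly this (assuming virtualenvwrapper is installed and the project ships a requirements.txt):

    mkvirtualenv xyz                   # create the environment (virtualenvwrapper)
    workon xyz                         # activate it
    pip install -r requirements.txt    # install the project's pinned dependencies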
I admit I haven't used Python that much, but I'd say about 90% of the libraries and projects I was interested in didn't have a requirements.txt.
I've used Node a LOT, and I have yet to have a library or project -not- have a package.json.
I'll readily admit that Python has the -ability- to be a well maintained environment. The fact that it doesn't make this the default and practically enforce it has led to a culture where it's not often used. And half the mindshare uses conda environments instead.
I didn't downvote you. However, I do disagree with what you are saying. In Python, even the absolute most trivial case is already a big pain. In order to get a project started, you need to:

1) Create a virtual environment - which you can do with virtualenv, python -m venv, virtualenvwrapper, pyenv-virtualenv, Pipenv, poetry, or Conda (but let's ignore Conda from here on out).

2) Activate the virtual environment - it's easy to forget to do this, or to activate the wrong one, and depending on how you created the virtual environment, you have to do it differently.

3) Once activated, install your dependencies - possibly using pip, or maybe using Pipenv or poetry.

4) Depending on the type of project you are setting up, you may also need to create a setup.py file, otherwise you won't be able to install the new project you are working on into a virtual environment.

5) Configure your IDE to use your virtual environment - depending on how you created it, your IDE may pick it up automatically, but it probably didn't.

Then, you can get down to work.
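To make that concrete, here's one common variant of that dance using only the standard tools (the file and directory names are just conventions, and step 2 differs on Windows):

    python -m venv .venv                # 1) create the virtual environment
    source .venv/bin/activate           # 2) activate it (easy to forget; .venv\Scripts\activate on Windows)
    pip install -r requirements.txt     # 3) install dependencies
    pip install -e .                    # 4) install your own project into the venv (needs a setup.py)
                                        # 5) then point your IDE at .venv/bin/python if it doesn't find it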
But that's the easy case - the more painful case is when you want to either deploy your project or update it. If you didn't use Pipenv or poetry, you're going to need to create a requirements file - probably with pip freeze. You can then go to a different virtual environment and do a pip install -r to install the requirements from that file. Of course, when developing your code, you may have installed modules like pytest that you don't want on your production system - but pip doesn't know the difference between a development and a runtime dependency, so you either need to edit the requirements file generated by pip freeze by hand, or just live with deploying code to production that you don't want there.

If you used Pipenv or poetry, at least then you can keep development and runtime dependencies separate. However, both of these tools are less available than pip, so this generally means you have to install them on your production system - which, given that they are newer, tends to be awkward since it may involve pulling down code from GitHub directly. Alternatively, you can do a pip freeze to create a requirements file, but then you are back to pulling in dependencies you may not want.
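The common workaround with plain pip is to split the requirements into two hand-maintained files - something like this (the file names and the example packages are just a convention, nothing pip enforces):

    # requirements.txt - runtime dependencies only
    requests>=2.20,<3

    # requirements-dev.txt - everything needed for development
    -r requirements.txt
    pytest

Then you pip install -r requirements-dev.txt locally and pip install -r requirements.txt in production. It works, but nothing keeps the split honest - pip freeze will still happily dump both sets into one flat list.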
The next thing you're going to want to do is to update some dependencies. If all you have is a requirements.txt file, well, you are pretty much out of luck. If it was created by pip freeze, it's going to include all of your transitive dependencies - good luck remembering which ones you use directly and which ones you don't. Maybe you didn't use pip freeze to create it, and instead wrote it by hand. Well, now you'll know which dependencies you actually are using, since you only put those in the file - but the problem then becomes that since you didn't list your transitive dependencies, every time you install the requirements you could get a different set of transitive dependencies - and if you accidentally started using one of them without realizing it, this could break your production system. So, maybe you listed all of your dependencies in your setup.py file - if so, you can always delete your virtual environment, reinstall everything from setup.py, and then re-generate your requirements file. However, doing that is a massive, massive pain since it involves a number of commands. If you try to do this, odds are that your setup.py and your requirements files start to fall out of sync and you give up on one or the other of them.
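For reference, the delete-and-regenerate dance looks something like this (a rough sketch, assuming your direct dependencies live in setup.py's install_requires):

    # after bumping the versions you care about in setup.py:
    rm -rf .venv && python -m venv .venv && source .venv/bin/activate
    pip install -e .                                 # re-resolve everything from setup.py
    pip freeze | grep -v '^-e' > requirements.txt    # re-lock, dropping the editable install line

Compare that with a single "update this one package" command in cargo or bundler and you can see why I'm grumpy.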
Pipenv helps - a bit. It's more of a replacement for the requirements file than for the setup.py file - which leads to the odd problem of not knowing whether you should list your requirements in both places or try to have one include the other. What makes it more fun is that Pipenv's interface includes a bunch of options that don't make much sense (pipenv install includes the options "--selective-upgrade", "--keep-outdated", "--skip-lock", and "--ignore-pipfile", and it's not really all that clear what they are supposed to do). What I'd like to be able to do is either update a single dependency OR update them all, at my discretion. I assume that some combination of its arcane options is supposed to allow you to update a single dependency without updating all of them - however, if so, it's not clear which one is supposed to do that, since it seems like both "--selective-upgrade" and "--keep-outdated" might. Worse than not knowing which option to use, it seems like neither of them actually works: https://github.com/pypa/pipenv/issues/966 has been open for a while and has been dismissed by the maintainers as not a problem, then "fixed", then acknowledged to not actually work, and then they went dark. So, as it stands, if you try to update any dependency, Pipenv is probably going to insist on updating everything - so, have fun testing that.
Poetry is probably the strongest contender for making this whole mess sane. But, for reasons that seem to completely defy logic, Pipenv is getting most of the attention in this space. Poetry appears to be mostly a one-person project - and so tying a project to it feels risky. Despite all that, it does work pretty well, but there are still a lot of features that would be great to see, and it would be really great to see it get some more attention and manpower.
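For comparison, this is roughly the workflow poetry gives you (from memory, so the exact flags may vary between versions; "requests" is just an example package):

    poetry add requests          # add a direct dependency and update the lock file
    poetry update requests       # update just that dependency (and its transitive deps)
    poetry update                # update everything
    poetry install --no-dev      # install only runtime dependencies, e.g. in production

That's more or less the cargo/bundler model, which is why it's frustrating that it's the underdog.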
I can understand your frustrations in some of those cases, but it almost seems like the issues are being overcomplicated. I have worked with complex code bases and three commands got me running. Occasionally requirements.txt might fail me, so yes, I might have to install a package manually. But for most projects I am up and running in 2-3 minutes. I don't bother with pipenv or poetry. I use the KISS method.
How do you create the requirements.txt file - by hand, or by pip freeze? If by hand, how do you make sure to lock the versions of your transitive dependencies? If by pip freeze, how do you keep track of what you actually depend on as opposed to what your dependencies depend on?
How do you update your dependencies? Do you modify requirements.txt directly? If so, how do you keep it in sync with setup.py? How do you find updated versions of your dependencies - do you search PyPI by hand and then update the file? Or do you leave the versions of your dependencies unlocked in requirements.txt and ask pip to re-install everything to pull in the updated ones? And if it's the latter case, how do you then lock them down again so that if you do multiple deployments you'll always get the same set of dependencies installed?
I'm really not trying to say that there is anything wrong with the way you work - for some types of work, these aren't big issues. And if these issues don't matter for your use case, well, that's awesome and keep on rocking it. However, for use cases where these issues do matter, the current Python dependency landscape is a bit of a dumpster fire - there are 50 ways to do everything and none of them work well. And worse, none of them really seem designed to solve the problem. And it's not like it's the UNIX philosophy at play, where tools are designed to solve one problem at a time and you can solve complex problems by composing them. I've spent a ton of time trying to make a reasonable workflow that handles the update and deploy lifecycle well, and no combination of tools seems to do it. (Except poetry - that tool, while not perfect, seems to actually be trying to fix this.)
Pip freeze is dead simple and I upgrade packages as needed. I've rarely manually updated requirements.txt. I literally make a venv, pip install what I need, and I am off and running. My editor/IDE automatically recognizes the environment. I can't imagine how much simpler it could be. I've been doing it for years without any problems. Dependencies, versioning, etc. are incredibly simple. A couple of commands and I am up and running, boom, done. I probably don't even have time to get a cup of coffee in that workflow. I can also deploy to QA and production when necessary fairly fast, although that might take a few more minutes.
Having a proper set of requirements files for production is a 30-second trivial task and can be automated easily.
In my case, it doesn't. I'm maintaining libraries and also applications that use those libraries.
For libraries, you need your setup.py to be kept up to date, as requirements.txt doesn't do anything when you pip install a library package. Of course, requirements.txt is still necessary when you want to run the library's tests, since few things are as frustrating as having the tests broken by some random dependency putting out a new version. But then you have to keep setup.py and requirements.txt kinda synced - only kinda, because in setup.py you will list your test and dev dependencies separately, but in requirements.txt they all get mixed up together. In theory it's possible to script keeping requirements.txt up to date - in practice, when working with a big team, it's a tremendous pain. The first option is to tell everyone not to mess it up, but that doesn't work. The second option is to develop a bunch of scripts to do it, but then you have to get everyone to install and use them, and that's quite frustrating since it's not at all clear to me why the standard tools don't do it already.
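To be concrete about what "listed separately" means, here's a stripped-down sketch (the package and dependency names are made up):

    # setup.py - minimal library packaging, names are illustrative
    from setuptools import setup, find_packages

    setup(
        name="ourlib",
        version="1.2.3",
        packages=find_packages(),
        install_requires=[
            "requests>=2.20,<3",            # runtime dependency
        ],
        extras_require={
            "dev": ["pytest", "flake8"],    # test/dev dependencies, kept separate here
        },
    )

Meanwhile "pip freeze > requirements.txt" flattens runtime and dev pins into one undifferentiated list, and keeping the two views consistent is left entirely to you.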
We also have applications. And those applications depend on some libraries we wrote. Those libraries have their own dependencies. When a library is updated, it might gain or lose dependencies. So when an application is updated to use a new version of that library, its requirements.txt should be updated accordingly. pip installing the new version of the library will bring in its new dependencies - but it won't get rid of the old ones from the virtual environment. A subsequent pip freeze will freeze a bunch of dependencies that aren't needed anymore - and those get harder and harder to find and eliminate as the number of unused dependencies grows. Again, this could be scripted, but it's a pain to do, and the tools should support it.
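A contrived illustration (the package names and the dependency change are made up): say ourlib 1.x pulled in simplejson and ourlib 2.x dropped it.

    pip install -U ourlib            # upgrades ourlib; simplejson stays behind in the venv
    pip freeze > requirements.txt    # simplejson gets pinned again, even though nothing needs it

pip has no "remove whatever nothing depends on anymore" step, so the orphans accumulate until a human notices.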
What's so frustrating is that these aren't unsolved problems in computer science. There are solutions. NPM/yarn does an OK job (I have minimal experience). Rust's cargo is fantastic. I've heard that Ruby's bundler is great. I fully appreciate that solving these problems probably requires volunteers - and I'm not volunteering, so maybe there is only so much I can do to complain. But looking at most of the work going into the ecosystem, it seems to be ignoring these problems.
Anyway, maybe I was a bit too harsh in my initial response. If your workflow is working for you, that is great. What I would suggest, however, is that if you find a need to add additional requirements to that workflow (such as an easy way to update a single dependency and its transitive dependencies - both adding and removing them), you'll find that the available Python options quickly disappoint you. For your sake, I hope that doesn't happen, since it's unpleasant to deal with.
You seem fixated on setup.py. I have literally never cared about that file in 20 years of Python development. Keep the workflow simple and problems tend to disappear.
You can't create a Python library without a setup.py file. Some of the things that I develop are Python libraries, so, having a setup.py file is a must. It would seem that you aren't developing the same types of things as I am - which is totally fine. But, I've been trying to describe my particular use cases, and for those use cases, the Python tooling isn't great. My use cases aren't that niche, and, they aren't invalid. They are different than yours, but, that doesn't mean that I'm not keeping it as simple as I can.
Why would I care if a requirement is direct or a dependency? If it's a dependency it's a requirement. Period.
Speaking of dependencies, NPM generally has a lot more and they are deeper and all too often for little shit that never should have been an import in the first place IMO. More parts == more stuff to go wrong and all too often it seems to.
Pip isn't perfect and granted I use it more than NPM. And once in a blue moon I do run into troubles and wind up installing a package manually or editing requirements.txt by hand. But it mostly works. NPM on the other hand seems to blow up quite often. At least for me. Versioning, this is different than it was a week ago, oh, this dependency only works on Macs, etc. etc. Plus it just feels about as trustworthy as gas station sushi. Only my opinion and experience, but I've spent a lot more time fighting with NPM than pip.
> Why would I care if a requirement is direct or a dependency? If it's a dependency it's a requirement. Period.
Let's say I need package A. And package A depends on package B, but I'm not otherwise using package B.
1. I want to say that I need package A, >=1.0 and <2. I want to be able to tell my tools to go find the most recent version of package A and install it, as long as it meets my requirements. I do not care at all which version of package B is installed, as long as it doesn't conflict with any of my other requirements. I don't want to see package B at all - it's just an implementation detail of package A.
2. When I deploy to production, I want to make sure that the deploy is repeatable. If the last time I deployed, I had package A==1.2.3 and B==4.5.6, then, when I deploy again, I want those exact same versions.
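In other words, I want something like this (using pip's requirements syntax only to illustrate - the A and B versions are the ones from the example above):

    # the file I maintain by hand: direct dependencies only
    A>=1.0,<2

    # the file that gets generated and used for deploys: everything, exactly pinned
    A==1.2.3
    B==4.5.6

The hand-maintained list is the one I reason about; the generated list is what makes deploys repeatable. Stock pip only really gives you the second one (via pip freeze), and keeping the two in sync is on you.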
Ok, but still, why should I care if a package listed in requirements is a dependency? What difference does it make in the real world?
And `pip freeze > requirements.txt` writes out the exact versions of the packages that `pip install` later installs.
Very very seldom has this ever caused problems and I've been doing Python for over 8 years. I just really don't get the "problems" you see here, they seem pedantic and theoretical rather than real world issues.
Again, not to say pip is perfect, it isn't. Just that it's way more reliable than NPM.
You don't care, and that's fine. I, however, do care.
I prefer to keep the list of packages that I'm installing limited to just those that I actually need - maybe someone will argue this is unnecessary, but I think that limiting what you install to what you are actually going to use, as much as possible, is simply good behavior when you are installing onto your production servers. If I have a flat list of packages in a requirements.txt, it's super hard to keep track of which ones I'm actually using as opposed to which ones some other dependency is using. And when some dependency stops using them, they tend to get stuck in requirements.txt for no reason.
It's also hard to keep track of what the actual version requirements for those packages are - which means it's hard to keep them updated. As much as possible, I want to keep the versions of the packages I'm using up to date, or at least have the option to do regular updates on some schedule that works for me and my team. But when all I have is a flat list of locked packages generated by pip freeze, that becomes a big giant pain. It should just be a single command and then some testing.
These aren't niche or pedantic use cases. Just because my use case isn't the same as yours, doesn't make mine invalid, or even uncommon. I'm sure your use case works for you, and that's great - and I'm not going to describe it pejoratively just because its different than mine.
I haven't used NPM much, but NPM at least does attempt to address some of these issues. I can't speak to how well it does. I do know that package managers for other languages, such as Rust's cargo, do address these types of issues and make them first-class concerns. So it's not like this is some giant unsolved problem in computer science - it's just that most of the Python tooling doesn't address it.
The argument is not really whether it's easy to get up and running with some trivial app. Clearly it is easy in Python. Parent's point is about the longer-term development cycle, when you have more dependencies, more project complexity, and more potential for small errors to ripple and affect your production systems.
The argument is invalid; if a project has that many issues, someone did a terrible job or chose a terrible ecosystem. pip/venv/python is dead simple even for incredibly complex projects. If someone screws that up routinely, they might need to consider a different career.
A language is more than syntax. These things matter a lot, and scaling Python can be a huge pain ... despite how much most of us love Python syntax, it has problems at a project level.