One thing is good dependency management. Right now, if you want to upgrade your Python version or one of your packages, it's a mountain of manual work. There is nothing in the stack helping you with the dependency graph.
Another thing is providing a standalone build: something you can just ship without asking the client to run commands in a terminal to make it work. I use Nuitka (http://nuitka.net/) for this. It's a fantastic project, but man, it's a lot of work for something that works out of the box in Go or Rust.
One last thing is generating OS packages (msi/deb/rpm/dmg/snap). Your sysadmin will like you. Pex (https://pypi.python.org/pypi/pex) is the closest, but not very standard.
Other pet peeves of mine:
- you can't easily move virtualenvs;
- creating a setup.py is very hard for a beginner and has numerous traps;
- setup.py files are executable. Meh.
- what's with this setup.cfg containing two lines? And MANIFEST.in being a separate file? Why do I have to put config in tox.ini too? And in one file for each of my linters? I want ONE setup.cfg with all the config for all the tools of my project inside and be done with it. TOML can handle rich sections; just stop creating new files.
- one place to put __version__, please. I want it readable in my code AND in my package metadata, without having to use a regex or have side effects on import.
- remove the "can't build wheel" message when it's useless. It scares newcomers.
- the README is the long_description. Don't make me read() it in manually.
- how do I vendor dependencies in a clean way?
- install_requires, extras_require, setup_requires, tests_require... Make it one requires with hooks and tags and be done with it.
- creating a setup.py test config is way harder than it should be and breaks in CI on strange edge cases.
- can we get a PEP on a standard project structure and build it into our tools, to be done with it? We all have src/package + setup.py at the root anyway.
- pip installs packages in the site-packages dir of the Python executable it's installed for. It makes sense, and I think Python deals pretty well with the fact that you can have several versions installed on the same machine. But God, people are confused by this. Now you can recommend "python -m pip", but it's very verbose and it assumes people know which version of Python is behind the "python" executable. On Windows it can be any of them, and they must choose with... yet another command ("py")! pipenv just bypasses all that by assuming you want a virtualenv and giving you access to it. It's a very good call.
- pip install --user creates commands you can't use unless you edit your PATH. This drives newcomers mad.
Oh my god, you've described every single one of my issues with Python packaging.
The whole setup.py/setup.cfg situation really is ridiculous. Having to import __version__, read() the README, no Markdown support on PyPI, MANIFEST / MANIFEST.in files, tox.ini... what a mess.
This. Particularly the need for a minimum standard project structure.
Pipenv shows its pedigree and looks like a great tool...that also overlaps significantly with conda. What are the use cases that Pipenv addresses better than/in lieu of conda?
It looks like Pipenv does not handle the Python install itself or associated non-Python libraries. With Conda I can tell it to install Python 3.6 along with FreeTDS (for MSSQL). Conda lets me do this in one environment.yml file and have it work cross-platform. Separate homebrew or apt-get steps are no longer necessary.
That said, pipenv still looks awesome. Any improvement to the Python packaging world is a welcome gift.
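For reference, the one-file cross-platform setup described above might look like this hypothetical environment.yml (names and versions are illustrative, not from the original comment):

```yaml
# Hypothetical conda environment.yml: pins the Python version itself and a
# non-Python library (freetds) alongside pip-installed packages, in one file.
name: myapp
channels:
  - defaults
dependencies:
  - python=3.6
  - freetds
  - pip
  - pip:
      - pymssql
```

`conda env create -f environment.yml` then builds the whole environment, including the interpreter, with no separate homebrew/apt-get step.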
You don't need to install (ana|mini)conda just to get a package manager; that's why I would use Pipenv over Conda. Miniconda alone requires somewhere close to 400MB of space and comes with a whole bunch of extra things I don't need just to manage packages and virtualenvs.
The miniconda bootstrap of conda is ~20-30 MB (compressed) depending on platform. It contains only conda and its dependencies, like Python and requests. It's how you install conda if you want only conda. The 400 MB number is for the Anaconda Distribution, which is a self-contained, single-install, get-everything package primarily aimed at scientists and engineers.
pipenv allows you to completely ignore the virtualenv, like node_modules does in JS. It seems like a detail, but after giving a lot of Python and JS trainings, I came to realize newcomers need little conveniences like this.
Then you can let people like Kenneth build a big friendly wrapper on top of it.
The only problem with those is that they must be pure Python; otherwise you will have problems. Wheels are not bulletproof, and while you can get away with tinkering for your dependencies, you can't with your package manager. It should work out of the box.
But creating "cargo for Python" is a very, very hard job. And nobody will remember you for it.
Note: you must run this after you've installed any packages into the environment. If you make an environment relocatable, then install a new package, you must run virtualenv --relocatable again.
Nice list! We manage to avoid most problems with distribution by using Docker containers, but it brings its own set of problems and downsides. I would love to have a better solution!
Honest question: have you seen how the Perl world handles this stuff? Now that I've mostly moved over to Python, the Perl experience (overall package mgmt) seems much, much better.
Note, it doesn't feel terrible in Python land, to me at least. But it was almost a joy working with Perl's packaging system.
>One things is a good dependency management. Right now if you want to upgrade your Python version, or one of your packages, it's a mountain of manual work. There is nothing in the stack helping you with the dependency graph.
pip-tools doesn't solve the problem at all. It will update everything to the latest version, cascading from package to package.
That doesn't guarantee your setup will work.
Dependency management is supposed to build a graph of all requirements, with lower and upper version bounds for the runtime and the libs, and find the most up-to-date combination of those.
If a combination can't be found, it should let you know that either you can't upgrade, or suggest alternative upgrade paths.
pip-tools will just happily upgrade your packages and leave you with something broken, because it's based on pip, which does exactly that. Neither checks mutually exclusive dependency versions, deprecations, runtime compatibility and such. And neither builds a graph of the relations.
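A toy sketch of what that graph-based resolution means, stripped down to brute force (all package names, versions, and bounds here are invented for illustration):

```python
from itertools import product

# Available versions per package, as tuples; newest is preferred.
candidates = {
    "web": [(1, 0), (2, 0)],
    "db":  [(1, 3), (1, 4), (2, 0)],
}

# constraints[(pkg, version)] maps a dependency name to a
# (min_inclusive, max_exclusive) bound on its version.
constraints = {
    ("web", (2, 0)): {"db": ((1, 4), (2, 0))},  # web 2.0 needs 1.4 <= db < 2.0
    ("web", (1, 0)): {"db": ((1, 0), (2, 0))},
}

def valid(combo):
    """Check every bound declared by the packages actually chosen."""
    for (pkg, ver), reqs in constraints.items():
        if combo.get(pkg) != ver:
            continue
        for dep, (lo, hi) in reqs.items():
            if not (lo <= combo[dep] < hi):
                return False
    return True

def resolve():
    """Try combinations newest-first; return the first valid one, or None."""
    names = list(candidates)
    for vers in product(*(sorted(candidates[n], reverse=True) for n in names)):
        combo = dict(zip(names, vers))
        if valid(combo):
            return combo
    return None

print(resolve())  # {'web': (2, 0), 'db': (1, 4)}: db 2.0 is skipped
```

A real resolver replaces the brute-force product with graph search and backtracking, but the contract is the same: either the newest mutually compatible set, or an explicit "no combination exists", never a silent cascade of upgrades.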
How can you have an upper bound on compatibility? When a library is released, it knows that it works with version 1.3.2 of its dependency, but how can it ever know it doesn't work with 1.4, unless the developer goes back and re-releases the app?
If the library follows semantic versioning, then you can always declare that you work with everything from the current version to before the next major version.
That's what I usually do (although I pin the minor version too, because you never know). I should also be better about following semver, but it just feels wrong to have your library at version 5.43.55 :/
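The "current version up to the next major" rule described above can be sketched with plain tuples (stdlib only; pip expresses roughly the same idea with version specifiers like ">=1.3.2,<2.0" or the ~= compatible-release operator):

```python
# Semver-style compatibility check: a dependency known to work at 1.3.2 is
# accepted up to, but excluding, the next major release.
def compatible(version, known_good=(1, 3, 2)):
    next_major = (known_good[0] + 1, 0, 0)
    return known_good <= version < next_major

print(compatible((1, 4, 0)))  # True: new minor versions are trusted
print(compatible((2, 0, 0)))  # False: next major is excluded
```

Pinning the minor as well, as the comment above suggests, just means shrinking the upper bound from the next major to the next minor release.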
One more pet peeve from the same list: accessing a file shipped inside your package with pkg_resources is way harder than it should be. I made a wrapper for this (http://sametmax.com/embarquer-un-fichier-non-python-propreme...).