Poetry: Dependency Management for Python (eustace.io)
236 points by jtanderson on Aug 12, 2019 | 75 comments



After years of python dev professionally, I just quit worrying about it, and embraced:

- vanilla virtualenv: I don't even bother with the wrapper most of the time.

- vanilla setup.py/setup.cfg: It's just really not that bad. Forget about pyproject.toml or whatever the next big thing is.

- pip-tools: ditching pipenv and using this for pinning app requirements has made my life so much simpler (sketch below).
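
For anyone unfamiliar: the whole pip-tools loop is two commands. A minimal sketch, assuming your top-level dependencies live in a requirements.in file (the conventional input):

  pip-compile requirements.in    # resolves and pins everything into requirements.txt
  pip-sync requirements.txt      # makes the active virtualenv match the pins exactly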


Most of the time I just stick with this in a .envrc file:

  export VIRTUAL_ENV=${XDG_CACHE_HOME:-~/.cache}/virtualenvs/$(basename $PWD)
  export PATH="${VIRTUAL_ENV}/bin:$PATH"
  python3 -m venv $VIRTUAL_ENV
I just enter the directory and the environment is created/'activated'.

In all the years I never had any need to use the real activate method. Just running Python and scripts directly from the bin/ in the virtualenv is enough to run 'inside the env'.


When I get the choice I use Nix and ignore as much of the "python" packaging infrastructure as possible - it's all a vague and unreliable toy compared to a real package manager. It all particularly falls apart when it comes to non-python dependencies.


`pip-tools` has served me well, but an annoying gripe that pipenv/Poetry resolves is that you can't install git links in non-editable mode, i.e.

   -e git+https://github.com/...
Works, but

   git+https://github.com/...
Doesn't.

Also, can you pin your dependencies to a specific hash? Last I checked you cannot. A major advantage of pipenv/Poetry is that the lockfile protects you against a compromise of the package index; if someone swaps in a new binary for yourdep==1.2.3, the lockfile hash check will fail and you'll get an error.


pip-compile added support for pinning hashes in 1.8.0 (in 2016) and support for non-editable URL dependencies in 3.7.0 (in May).
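
For reference, that combination looks something like this (both flags are real; the exact format of the generated hash pins varies by version):

  pip-compile --generate-hashes requirements.in
  pip install --require-hashes -r requirements.txt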


Did you try the HTTP link to the source tarball instead of the git URL?


Same. Plus a script to automatically freeze dependencies into requirements.txt after each `pip install/update/uninstall`.


I've never moved from it. I never got the point of any other dependency management tool.

The one large problem with it is forgetting to activate a venv, so I guess I'll spend some time learning how to use direnv, per aequitas's advice.


If you like yarn I think you will like poetry.

Using --user here, no per-project virtualenv. I used to have a virtualenv in ~/.env, where I kept everything up to date and had many -e installs from ~/src. For production deployments: --user in containers.

This means all my projects have to work with the same versions of dependencies.

This means all my dependencies have to work with the same versions of dependencies.

All my dependencies have to be up to date: all versions must be latest, or next to-be-released.

When a dependency doesn't support new versions of other dependencies, I deploy a fork and open a PR.

As such, I'm not spending time trying to make obsolete versions work; rather, I'm spending time making new versions work together.

My perception is that this strategy offers better ROI than all others which I have tried, not only for me, but for the whole ecosystem. That's also how I made my Python 2 to 3 transition, and have been using 100% Python 3 for a while (once you've ported one big software, porting others and their smaller dependencies is easy).

I'm extremely happy with this strategy, my life has just been better since I started working this way, and I highly recommend it.

For my own (countless) packages: I use setupmeta, which simplifies auto-updates (I think it succeeds at what pbr tried to achieve).


> This means all my projects have to work with the same versions of dependencies.

> This means all my dependencies have to work with the same versions of dependencies.

Then why not just install everything globally?


So as not to mess with system packages, though I sometimes do so in containers. I just started switching to --user in containers so that my users don't have to rebuild Python dependencies from scratch when they use the image for development: with the source bind-mounted into /app, which is declared as the home of the app user in my container, .local persists between builds on the developer checkout (like node_modules).
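
A hypothetical Dockerfile along those lines (a sketch; the image tag, user name and paths are assumptions, not my exact setup):

  FROM python:3.7
  RUN useradd --create-home --home-dir /app app
  WORKDIR /app
  COPY --chown=app:app . /app
  USER app
  RUN pip install --user -r requirements.txt
  # in development, bind-mount the checkout over /app; ~/.local then lives
  # on the host checkout and persists between image builds, like node_modules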


One reason might be that you can blow away your user dir and start clean if something goes horribly wrong.


There are a few highly dismissive comments here. While the Python community has (clearly) been getting by with requirements.txt/setup.cfg/setup.py, the project/package management story is far less usable than in more recently developed systems like npm and pub.

With poetry+pyproject.toml, I have one file that is flat, simple, and readable. @sdispater has done incredible work on this project, and I hope they get more resources for development.
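
To make "flat, simple, and readable" concrete, a minimal sketch of that one file (names and versions are placeholders):

  [tool.poetry]
  name = "myproject"
  version = "0.1.0"
  description = "An example"
  authors = ["Jane Doe <jane@example.com>"]

  [tool.poetry.dependencies]
  python = "^3.7"
  requests = "^2.22"

  [tool.poetry.dev-dependencies]
  pytest = "^5.0"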


Your comment does not explain why you think the package management story is better in npm; it's merely like saying "node-gyp is a pile of debt" without going into details. But I can surely tell you I have no idea why node_modules ends up so fat and why npm install eats so much bandwidth, time and disk space... Is npm trying to work around incompatibilities by installing different versions of the same package in the same node_modules? I hope not, because that would be the perfect way to accumulate debt in the whole ecosystem. Not to mention that I was just handed frontend source code where building requires node-gyp, which requires g++ and python2 :)

Quick question, since you seem to know npm very well: is there a better solution than this to automate npm package publishing nowadays?

  sed -i "s/GIT_TAG/${CI_COMMIT_REF_NAME/v/}/" package.json

(I don't have this issue with Python's setupmeta)


It took me a while to learn to love pipenv. It also bothers me how many developers "blame pipenv" for any problem they can't easily explain.

Anyway, has Poetry caught up to the dependency resolution provided by the pipenv lock command yet? Last time I tried it (~6 months ago), it couldn't produce working environments for my requirements.

BTW, for anyone who wants to educate themselves, read the issue description here and follow the links: https://github.com/pypa/pip/issues/988


That's funny: Poetry's README [1] claims exactly the opposite (that it resolves dependencies better than pipenv). And looking at the git history, that section of the README is two years old, so that claim was already being made when you experimented.

Was the problem you had with dependency resolution similar to what Poetry describes in its README? Or are there other problems the README fails to mention?

Their description got me interested, but so far every alternative I've tried for building moderately complex Python projects has failed to deliver on its promises, so I'm definitely interested in what you have to say.

[1] https://github.com/sdispater/poetry/blob/master/README.md#de...


I do really like the workflow pipenv provides. But last time I checked, the major downside of pipenv is that it doesn't and won't support multiple Python versions [0]. For libraries, this can be a dealbreaker. For applications, especially ones you're deploying to a controlled environment, it isn't an issue.

Been meaning to look into poetry. Not a huge fan of TOML, but hopefully all the tooling supports pyproject.toml now (I know black does, but I'm not sure about flake8, isort, pylint, pytest, coverage). I know there are still some hacks required for tox + poetry, though.

[0] https://github.com/pypa/pipenv/issues/1050


It seems like pipenv and poetry fill different niches: pipenv is intended for application deployment, so you're likely to only have one version of python, while poetry is intended for building libraries.

I found the tox support a bit hacky, but not bad, and got 3.5 to 3.8 testing just fine.

Poetry does make the publishing process pretty seamless for the simple case. If you're starting a new library, poetry is great.

I think poetry needs a "fuzz option" to test random valid solutions to the version constraints, to check if the older versions are really valid.


I use pyenv to switch between Python versions (or just the `py` launcher on Windows), and pipenv will follow suit. It does require regenerating the lockfile for each Python version/environment in the scenario you describe, but you can also keep the lockfiles organized in a folder and swap them out per environment. You're right that this doesn't come included in the tool, though.
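
Something like this for each version you need (a sketch; the version numbers are arbitrary):

  pyenv local 3.6.9          # pin the interpreter for this directory
  pipenv --python 3.6 lock   # regenerate Pipfile.lock against it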


What don't you like about TOML? It seems to be the best language for simple configuration files. I think YAML is often overkill for basic configuration, and JSON isn't ideal. pipenv's Pipfile is also in TOML.

What config language would you prefer to use instead?


To be honest, I don't know. All I can say is that YAML parsers exist and YAML works well enough for many applications, although I do understand the issues with it. Sure, it isn't perfect, but there's no need to re-invent the wheel for marginal improvements.

I roughly agree with the arguments laid out in PEP 518, even though the array of tables thing is not at all "obvious" to me. And I actually do think an "official"/semi-official YAML subset would be great for loads of use-cases - yaml.safe_load already is that in practice.


Pipenv's dependency resolution doesn't handle backtracking[1].

I've tried similar configurations with poetry and it doesn't seem to have a problem with them.

[1]: https://github.com/pypa/pipenv/issues/2596


Not even with the `pipenv lock` command? It seems to work out solutions in multiple rounds, including backtracking, but I might be wrong on that observation...


It does do multiple rounds, but once it has decided on a constraint, it can't revisit that constraint.


My experience was the opposite. pipenv always told me something couldn't be installed because of version mismatches and poetry has never given me any trouble.


When you use `pipenv lock --verbose` you trigger holistic dependency resolution with reporting on the process, and I have yet to encounter _any_ issues with that.

It is important to apply pipenv as intended; I only use `pipenv lock` and `pipenv sync`, and edit Pipfile manually.

The problem with other commands such as `pipenv install` is that they don't apply the holistic dependency resolution as `pipenv lock` does.
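
So the whole workflow described above boils down to (sketch):

  $EDITOR Pipfile   # edit dependencies by hand
  pipenv lock       # full resolution, written to Pipfile.lock
  pipenv sync       # install exactly what the lockfile says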


How are you supposed to know you're not meant to use `pipenv install <package>` and should instead manually edit Pipfile? The documentation starts at `pipenv install <package>` and never gets to "but that's not applying pipenv as intended so don't do it".


I don't know. I'm just happy to have stumbled upon an effortless workflow (for my applications) and thought I would share the nugget of knowledge.

Perhaps "as intended" is not a correct assumption on my part.


Does pipenv create cross-platform lock files? That has been my primary requirement when looking at solutions, more than the usability issue of Python using a bunch of tools and files for the whole development process without a coherent story.

IIRC Poetry does have cross-platform lock files and a coherent story. I'm just blocked on some PRs due to the original author getting busy and not yet having delegated to lieutenants.


Mostly. The lock file format is certainly consistent between platforms, and I've been able to install applications on windows, mac and linux.

But it doesn't handle platform-specific dependencies correctly.

For instance, `doit` has a watch feature, and it will depend on pyinotify on linux and macfsevents on Mac.

If I run pipenv lock on the Mac, it correctly generates this:

        "macfsevents": {
            "hashes": [
                "sha256:hexhexhex..."
            ],
            "markers": "sys_platform == 'darwin'",
            "version": "==0.8.1"
        },
But it won't lock the Linux version. Similarly, if you lock on Linux, it throws out macfsevents. And `doit` is not doing anything strange[1] to report these extra dependencies.

[1]: https://github.com/pydoit/doit/blob/660d3a2e75dc9d861e5a8916...
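
For context: these conditional dependencies are declared with standard PEP 508 environment markers, roughly like this sketch of a requirements file, so the information pipenv would need is right there:

  pyinotify; sys_platform == 'linux'
  macfsevents; sys_platform == 'darwin'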


Pipenv has failed to resolve dependencies for my project in the past, whereas Poetry (run last week) succeeded. Not a fair comparison, as I haven't tried Pipenv recently, but it at least tells you that Poetry is now better than where Pipenv was a few months ago.

   $ pip freeze |wc -l                    
        204


The big win seems to be the lock file - like Cargo in Rust, or yarn in the JS world. It's really, really hard to lock down dependencies reliably in Python, especially when you are talking about the dependencies of the primary packages you are installing (and their dependencies, and so on).

One solution at the moment is to run 'pip freeze' and use the output as your requirements file, but that very much feels like an 'and now I have a different problem' solution.
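
That is, a sketch of the workaround:

  pip freeze > requirements.txt      # pin everything currently installed
  pip install -r requirements.txt    # reproduce it elsewhere

The 'different problem' being that the file now mixes your direct dependencies with everything transitive, with no record of which is which.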


For dependency management needs, 99% of what you need can be done easily with vanilla virtualenv, vanilla pip, vanilla setup.py, and a Makefile. A small shell wrapper that sources bin/activate and runs your Python app is all you need to allow anyone to run your code without additional steps.
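
Such a wrapper can be as small as this (a sketch, assuming the virtualenv lives in ./venv and the entry point is app.py):

  #!/bin/sh
  . "$(dirname "$0")/venv/bin/activate"
  exec python app.py "$@"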

The biggest problem with Python packages isn't the dependency manager; it's that Python developers don't reuse and extend existing projects. They just keep churning out new similar projects with different features. All of the most commonly used Python packages are basically one-offs that use a bunch of other one-offs. This creates a cycle of reinventing the wheel, friction in ramping up on existing technology due to all the extra integration work and split development, and later a lift-and-shift into completely different dependencies that implement the same thing. PyPI is awash with duplicate abandoned one-offs, to the point that just trying to find a useful module to do what you want can take a long time.


> done easily with vanilla virtualenv, vanilla pip, vanilla setup.py, and a Makefile

I wouldn't call that "easy", but OK.


Would you consider this easy?

  # Makefile (recipe lines are tab-indented)
  venv:
  	python3 -m virtualenv --version 1>/dev/null 2>/dev/null || \
  	  ( echo "Please install virtualenv (python3 -m pip install virtualenv wheel setuptools)" && false )
  	[ -d venv.d ] || python3 -m virtualenv -p python3 venv.d
  	./venv.d/bin/pip install -r requirements.txt

  $ make venv
  
  # Running your project without setup.py
  $ ./venv.d/bin/python3 my-program.py
  
  # Running your project with setup.py
  $ ./venv.d/bin/pip install .
  $ ./venv.d/bin/my-program.py
You can copy the setup.py from another project and customize it to your liking.


I can't find a good comparison to Conda - is the main distinction simply that Poetry uses the official repos?

For people working outside of scientific Python: conda is a package and env manager maintained by a private company that's become the go-to because it's really good at handling binary dependencies.


I use conda as a Python manager and poetry as an environment manager. For every project I just create a new conda environment with the bare-minimum Python 3.x version I need, then use poetry to handle the dependencies and manage the environment for me. For stuff that is strictly scientific or numerical in nature, where I need things from conda (e.g. MKL-accelerated numpy), I skip poetry and just use conda to install my packages to avoid any potential issues.


Can you really build packages using solely Conda?

As far as my understanding goes, conda focuses on providing virtual environments for interactive use, but to build and distribute a package on conda forge, you still need to rely on setuptools/distutils.

My impression here is that Poetry aims more at _replacing_ the pip + setuptools toolchain. Users of your package could still install it using conda, if relevant. It seems a bit limited in what it can do at the build step, unfortunately, so it is not a replacement yet.

Coming from the Java world, I spend my days dreaming of a "maven for python", and this project definitely goes in that direction. I will keep it on my radar.


Well conda packages don't need to be python based, so technically yes you can build and package up whatever your heart desires.

conda build recipes are essentially just shell scripts that conda runs in a sandbox, taking care of library paths and so on. You could, if you really wanted to, develop a poetry-based Python package, have your conda build script use poetry to build and package it, and then add whatever lines are necessary to the recipe to make that the package conda installs.
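
Something along these lines, say (a hypothetical build.sh; I haven't verified it against conda-build):

  poetry build --format wheel
  pip install --no-deps dist/*.whl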


That basically confirms the feeling I was trying to express, in that conda and poetry seem to solve different problems: conda focuses on easing the distribution to the end user, whereas poetry tries to simplify managing the build on the developer side.


Worked a little with it, and it looks really nice and promising, but my biggest concern is the number of issues on GitHub and that the vast majority of commits are by one person.


From what I've gathered, the project grew faster than the original dev had time to build up other people to also monitor PRs and has since gotten very busy, slowing down the review process. I know I have several straightforward outstanding PRs. I hope this is able to be resolved so we can have a sustainable community.


That is also my worry. I've opened issues that were show-stoppers for many people and we couldn't even get the author to give us an update on when the fix would be released.

I wish he'd run the project a bit better. Other than that, it's stellar software.


I love poetry, and am using it in all my python projects, but despite that I would hesitate to recommend it for this reason.

That said, "ejecting" to a plain requirements.txt is possible on the alpha branch via `poetry export`.
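
Something like this (a sketch; check `poetry export --help` on your version for the exact flags):

  poetry export -f requirements.txt -o requirements.txt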


I like how transparent poetry is about what's happening when you run it, and how well presented that information is. I've come to loathe pipenv's progress bar. Running it in verbose mode isn't much better.

I can't be too mad at pipenv, but all in all poetry is a better experience.


Can I use this to build a package that can be uploaded to a pypi repository for other people to depend on?


Yep, it works really well for that. https://poetry.eustace.io/docs/cli/#publish
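
In the simple case it's just (standard poetry commands; assumes PyPI credentials are already configured):

  poetry build     # builds sdist and wheel into dist/
  poetry publish   # uploads them to PyPI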


Python is the worst thing to happen to SWE (specifically SWE, not ML, Education, etc).

Python makes huge sacrifices in readability and efficiency for a hypothetical win in writeability. It's also fundamentally at odds with the multicore world we're living in and will continue to live in, an issue that many people have tried to fix and failed at.

I can't count the number of times I've had to spend my entire night reading through an older Python service without type annotations, desperately trying to understand the types of its inputs and outputs. It's just a mess, and programmers' bias toward believing they write readable code (when in reality it's only readable to them) exacerbates the use of names that provide little context.

Python is awful, and it's an uphill battle to fix it. Asyncio can solve some performance issues, but it negates the benefit of Python's simplicity by destroying the sequential mental model that programmers can easily grok. It also requires a complete rewrite of libraries and a split between non-asyncio land and asyncio land.

Type annotations can make Python more readable and thus more maintainable, but gradual typing still leaves lots of unanswered questions when you're interfacing with untyped code.

Python is really not good unless you are using it to prototype or build something on your own. It's led to a world of slow, buggy software that is impossible to rewrite. Its downsides are easy to measure, but its benefits are difficult to quantify.


http://trio.rtfd.org replaces asyncio with a model that feels much closer to sequential. And it's very good.


How do you start a new Python project in 2019? I mean, there's so much stuff. Poetry, pip, virtualenv... oh, my. With PHP, I just composer init and that's it.


Never used poetry before. How does it compare against pipenv?


Much better


Love poetry, I hope it gets to feature parity with pew regarding project naming and jumping.


Neat, definitely something I would want to give a try.

I see, though, that it only supports pure Python when building packages; does that mean it can't build if you depend on compiled libraries?

Is there also a plan to add some of the functionality of bundling tools such as webpack into this build phase? Like automated CSS optimization, image compression... Could be handy for some Django/Flask projects.


Poetry is very good. I think projects should use it.

I hope the rest of the ecosystem can catch up quickly.

Tox and pip and pex need full support for PEP 517/518.


They can't support reading the poetry data because it's not generic. Nor can most tools make practical use of pyproject.toml, because they all rely on custom tool-namespaced fields in it: it's far from a standardized format yet, despite the push for it from the people who made it.

I still wish poetry would support setup.cfg in parallel. It has been working for 2 years, is compatible with pip, setuptools and the whole legacy stack, and most of its fields are standardized and documented.


I should be more specific about what I meant.

I would like to see tox support `install_commands` (https://github.com/tox-dev/tox/issues/715)

I would like to see pex support the `build-system` feature. (https://github.com/pantsbuild/pex/issues/660)

For pip, I'm not sure what's involved in getting everything to work generically. This is probably not simple. It has the consequence that when we don't have control over the install command, we end up needing hacks like using `extras` for ReadTheDocs (e.g. https://github.com/readthedocs/readthedocs.org/issues/4912#i...)


Looks great! So, dependency management in Python is now a solved problem?


Seems to be solved only five or six times; not nearly enough.


I was excited by this project, and pipenv, but unfortunately I haven't had consistent results cross platform with either. I have ended up sticking with venv because of this.


Could an effort like this ever be implemented as a (backwards-compatible) extension to, or modification of, something existing with traction?


This looks awesome, well done! Is there a TLDR about how it improves on pip or other python dependency management?


The GitHub readme explains how it improves on Pipenv.

I like Poetry a lot; I hope platforms like Heroku will start supporting it soon.


> The GitHub readme explains how it improves on Pipenv.

It's a shame that's not on the project webpage, though. That was the first information I searched for, without success.


I raised an official support request with Heroku for this. "It's coming," along with other improvements to the Python buildpacks.


How is the execution time when resolving a non-trivial project? Like your typical Django app, for example.


Adding dependencies takes a (little) bit of time, because of the lockfile. But deployment (i.e. installing dependencies) is as fast as with any dependency manager (pip, pipenv, ...).


Congrats on hitting version 1.0! I’ve been putting off completely switching off of pipenv until this milestone.


I'm not being snarky, it's probably just too early in the morning, but where did you see the project hit v1.0?


I am Le idiot. I saw a version 1 email update but it’s for the prerelease >_<


Well, it's 1.0.0b1 which is still good news considering how long it was in alphas.


Does anyone know if there's a name for the "terminal" theme used on the poetry site?


What can poetry do that venv cannot, considering venv is the default tool and is solid and fairly easy to use already?


> What can poetry do that venv cannot?

Venv doesn't do dependency management; it just provides an isolated Python environment. By default, poetry uses venv for environment isolation, but it also does dependency management.
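
In practice that looks like this (all standard poetry commands; the project name is a placeholder):

  poetry new myproject    # scaffolds a project with a pyproject.toml
  poetry add requests     # resolves, pins into poetry.lock, and installs
  poetry install          # recreates the locked environment elsewhere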


What about pip freeze to requirements.txt for manual but solid dependency management under venv? pip takes care of dependencies whether you use venv or poetry, and requirements.txt records what's needed, so your project stays reproducible and portable. What else does poetry bring to the table? If there is a strong selling point I would like to try poetry again.

yarn came out great, but these days I've switched back to npm for the sake of a simpler workflow, especially since npm absorbed yarn's good features.

The same thing happened with parcel/webpack: the neat new tool beats the old players, but then the old dominant one quickly catches up by taking in the good stuff from its smaller competitors.




