Rye: A Hassle-Free Python Experience (astral.sh)
527 points by jcbhmr 3 months ago | 317 comments



A lot of our core packaging development is now happening in uv [1]. Rye uses uv under the hood, so as we improve uv, Rye gets better too.

E.g., we recently added support for "universal" resolution in uv, so you can generate a locked requirements.txt file with a single resolution that works on all platforms and operating systems (as opposed to _just_ the system you're running on). And Rye supports it too in the latest release.
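
A minimal sketch of how that looks on the command line, assuming a recent uv with the `--universal` flag (check `uv pip compile --help` for your version):

```shell
# Create a small input file, then resolve it once for all platforms.
printf 'flask>=3.0\n' > requirements.in

# The actual resolution step (shown as a comment since it needs uv installed):
#   uv pip compile requirements.in --universal -o requirements.txt
# The resulting requirements.txt uses PEP 508 environment markers
# (e.g. `; sys_platform == "win32"`) instead of baking in the current platform.
```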

[1] https://github.com/astral-sh/uv

---

I work on Rye and uv, if you have any questions :)


Does Rye/UV with universal resolution work properly with PyTorch, or does it suffer from the same issue as Poetry?

https://github.com/python-poetry/poetry/issues/6409

https://news.ycombinator.com/item?id=39257501


It works properly with PyTorch. For what it's worth at $DAYJOB we switched from Poetry to Rye starting from version 0.15 (even before it supported uv) for that reason initially.


That is encouraging to hear. I am curious specifically about the universal resolution feature that has been newly added. Poetry's issue is that it cannot properly resolve PyTorch's more complicated situation. I imagine Rye before this change would have used pip under the hood, so it would have worked as well as pip. I hope this still holds with the new under-the-hood changes.


Any ongoing work/plans to follow for cross-platform lock files?

This is one concern that would prevent the team I'm on from moving. We do ML work, so I'll use a pytorch-based project for my example. The desired state is that all dependencies are in pyproject.toml, e.g., from which you can generate a set of lockfiles from an AArch64 Mac or an AMD64 Windows workstation for the following platform configurations:

1. Mac with default (MPS) pytorch

2. Windows with CUDA pytorch

3. AArch64 Linux with CPU pytorch

4. AMD64 Linux with CPU pytorch

5. AMD64 Linux with CUDA pytorch
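
A hypothetical sketch of what that single source of truth could look like, using optional-dependency groups plus PEP 508 markers. The group names and torch pins here are illustrative, not an advertised Rye/uv feature:

```shell
# Write the sketch to a file so the marker syntax can be eyeballed.
cat > pyproject-sketch.toml <<'EOF'
[project]
name = "ml-app"
version = "0.1.0"
dependencies = ["numpy>=1.26"]

[project.optional-dependencies]
# One group per target; in the desired world, a resolver would emit
# one lockfile per group from any host platform.
mps  = ["torch==2.3.*; sys_platform == 'darwin'"]
cpu  = ["torch==2.3.*; sys_platform == 'linux'"]
cuda = ["torch==2.3.*; sys_platform == 'linux' or sys_platform == 'win32'"]
EOF
```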

P.S. Big thanks for making Ruff, we're huge fans of linting and formatting both.


I have almost the same problem. With Poetry, I managed to work around it using this technique, involving a combination of groups and extras: https://github.com/lucaspar/poetry-torch

It's a hacky workaround, but it seems to work so far. It would be much nicer to see this solved in a better way, though!


Poetry does cross-platform lockfiles, but the absence of cross-platform lockfiles in uv is one of the reasons it benchmarks significantly faster (on top of using Rust for the resolver).


Unfortunately, installing pytorch via Poetry was not for the faint of heart, last I tried. [1]

For example, it used to download wheels for all matching distributions (say pytorch for a few platforms) to get metadata. That would be like a 10 GB download for 4 wheels of pytorch, to generate the lockfile.

Another thing it used to do is ignore the local cache, so some locking operations would download the same wheel multiple times.

In general, it was not entertaining, from an ML practitioner's perspective, to infer from the Poetry maintainers an attitude that the mountain shall come to Muhammad, so to speak. Their abridged position was that pytorch is not compliant with standards, when Poetry itself is not PEP 440-compliant, in a manner pertinent to the issue at hand. [2]

That said, I realise my grumbling about pytorch has nothing to do with your perfectly adequate comment. Please interpret it as nothing more than an expanded rationale for being more interested in learning about Rye/uv than in revisiting Poetry, at least for the projects I've worked on.

[1] https://github.com/python-poetry/poetry/issues/6409

[2] https://github.com/python-poetry/poetry/issues/7748


My experience is that both PyTorch and Tensorflow are incredibly fickle with the interplay between cuda versioning and python package versioning.

You need to go straight into their websites and manually figure out what goes with what.

Quite honestly, this needs to be done in Docker with fancy conditional logic, and not via the python package manager.


Speaking strictly about PyTorch, it's actually elementary on the distribution side. I know there are packages which do weird setup magic at runtime, unfortunately, but pytorch is not one of them.

Installing it via PyPI gets you the CPU version on Windows/macOS, and CUDA 12.1 on Linux. You can specify the CUDA version explicitly, which you need to get CUDA on Windows, or CUDA 11.8 on Linux, by referencing the corresponding PyTorch package index, e.g., https://download.pytorch.org/whl/cu118 - cu118 means CUDA 11.8. Doing that explicitly is the normal practice, since in ML land you'll know your hardware well - no one is asking for tooling that automatically resolves CUDA/cuDNN/etc. compatibility.

Hence, it hopefully follows that installing this with pip is trivial; you just need to maintain multiple requirements.txt files so that you specify the correct index URL where necessary. What I specifically want, then, is for someone to give me a UX where I don't have to maintain these requirements.txt files. I should be able to just specify arbitrary build environments* in pyproject.toml, and override the global default package index for specific packages in [platform-]specific build environments.

*Really, we're probably talking about combining blocks of optional dependencies into an ersatz build environment, but that's what they ultimately are for.

If I can then "compile" these build environments into properly reproducible lock files, and then use the dependency manager to export these lock files into conventional requirements.txt files with exact pins, then we've arrived at what I would call a good UX for specifying dependencies in Python.
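
For contrast, the status quo being described: per-target requirements files that point pip at the right index (real pip syntax; the exact torch pin is illustrative):

```shell
# One file per deployment target, differing only in the index line.
cat > requirements-cu118.txt <<'EOF'
--extra-index-url https://download.pytorch.org/whl/cu118
torch==2.3.1
EOF

cat > requirements-cpu.txt <<'EOF'
torch==2.3.1
EOF

# Installed per target with:  pip install -r requirements-cu118.txt
```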

To the best of my knowledge, for 99.9% of packages this doesn't require Docker. Different parts of this process work well across the ecosystem, just not all in one package. To name a few:

- Poetry has many of the building blocks, but has had a questionable implementation per the parent you're replying to, plus assorted other problems, like not working at all with private indexes back when I tried, and so on;

- pip-tools does the entire pipeline well, but it quite unfortunately doesn't support cross-platform targeting;

So, quite honestly, until someone can do this correctly and in one place - and, ideally, with endorsements from PSF and PyPA - I'll just consider Python dependency management tooling to be inadequate for my use cases.


I guess I'll add that this is mostly possible using a Docker-based workflow, but the hope is that one day we could get something more lightweight.


Requiring Docker implies that a packaging tool is too simplistic to do its job.


Well, that's my question - I'm not familiar with Rye/uv, and I'm curious to know if it can substantively simplify the workflows that I'm aware of. I simply clarified to pre-empt a potential comment about Docker.


Great work. I've switched to using Rye now as I used to have to occasionally setup a new computer to work on a project and it was always a complete pain (pyenv+venv+pip).

Now it's:

* Install Rye

* Pull from GitHub

* Type rye sync
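
As a shell session, roughly (the install one-liner is the one from Rye's docs; the repo URL is a placeholder):

```shell
# 1. Install Rye (official installer script):
#      curl -sSf https://rye.astral.sh/get | bash
# 2. Pull the project:
#      git clone https://github.com/yourorg/yourproject && cd yourproject
# 3. Create the venv, fetch the pinned Python, install deps:
#      rye sync
steps=3
```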


Given that you guys are in charge of both uv and Rye, why keep two alive at the same time? Why not just kill Rye now to avoid the fragmentation and confusion that come with having to choose between the two?


UV doesn’t do all these things yet. The end goal is that UV can fully replace Rye and a neat migration will be provided so that one can move over.

Since Rye already uses uv behind the scenes there won’t be a lot of incompatibilities to deal with so the migration should be trivial.


I love Rye, especially with UV, and we are rolling it out where I work.

Question: are there any plans to add sandboxing support to Rye or UV? I realize it is a big ask and that it might not be possible on all platforms.

My dream scenario is that I can define a section in pyproject.toml that goes like

  [sandbox]  
  allowed_urls = [...]  
  allowed_dirs = [".",...]
and then "rye run" wraps the executed code in a sandbox which forbids the code to access anything outside those two whitelists. This would make me sleep so much better at night, as it solves 99% of my supply chain attack fears. And it's lightweight enough from the user side that it will actually be used.


No plans. That said, independently I have been thinking about this quite a bit since Deno has demonstrated that there is interest in stuff like that. Unfortunately to make that work in Python would require significant changes to Python itself.


What I'm thinking about would technically be language independent, just a wrapper that is interposed between Rye and the Python interpreter to apply something like a seccomp filter to the Python interpreter (and all code that it calls). The wrapper could equally well be used on a piece of C code or whatever. But I'm sure you have a far better understanding than me of whether something like that is feasible :)


That sounds like a job for "firejail", at least in linux. Implementing that as part of python would be a major undertaking, and I wouldn't necessarily trust it, I'd prefer an external tool implement the sandbox.


Is there a technology in existence that would facilitate this from userspace and wouldn't require elevated kernel privileges to set caps?


From my cursory googling, I believe seccomp on Linux is one way to achieve this. See e.g. Cloudflare's "sandbox" wrapper:

https://github.com/cloudflare/sandbox

FWICT, it's not possible to achieve something like this reliably on Windows natively, but maybe in WSL it could work. On Mac, there is the similar libsecinit aka App Sandbox which also can be spawned from a userspace process, I think?

Of course in every case the program spawning the sandbox has to be outside the sandbox. But having to trust Rye or UV is much better than having to trust thousands of "RandomDevsNichePythonPackage".
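
One userspace sketch of that interposition idea on Linux, using firejail as the sandbox. The flag names come from firejail's man page, but `--whitelist` normally expects paths under $HOME, so treat this as illustrative:

```shell
# Run a command with no network and a restricted filesystem view,
# falling back to an unsandboxed run where firejail isn't available.
run_sandboxed() {
  if command -v firejail >/dev/null 2>&1; then
    firejail --quiet --net=none --whitelist="$PWD" -- "$@"
  else
    echo "firejail not found; running unsandboxed" >&2
    "$@"
  fi
}

run_sandboxed echo "hello from the sandbox"
```

A wrapper like this could sit between `rye run` and the interpreter without any changes to Python itself.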


IIUC, uv is a replacement for pip (?), and rye is for pyenv+poetry.


In the end, uv will be the pyenv + poetry replacement, and supersede rye.


So which should I use for a new project, as of July 2024?


Rye is perfectly fine to use as of now. It uses uv under the hood anyway, but has some niceties of its own.


For example?


I would use uv, if you can handle that it's a moving target right now (uv python, uv lock, uv sync and so on, plus tool.uv.sources in pyproject.toml). All preview features that replace similar Rye features with a more robust implementation.


Poetry.


I would.


A lot of firms that care about predictable, performant builds are shifting to you, and one of the things that comes up is SBOM generation for ingestion into tools like GUAC.

https://guac.sh/

https://pypi.org/project/sbom4python/ https://github.com/CycloneDX/cyclonedx-python

Your recently added ability to unpin dependencies, so devs are encouraged to stay compatible as they develop and can then generate a correct explicit requirements.txt for reproducibility, makes both vuln management and the SBOM step far easier than with Poetry etc.

Thank you!

For similar reasons, we use https://hatch.pypa.io/latest/why/ and appreciate that it plays nicely with `uv`.


> I work on Rye and uv, if you have any questions :)

How do you pronounce "uv"?


Rhymes with bruv. London pronunciation


No idea how a Londoner pronounces bruv. How about IPA?


/bɹʌv/


How do you notate that the v is aspirated? It's practically an f


Like the American pronunciation of "of".


We say "you - vee" :)


Like ultraviolet? What's the origin of the name?


I assume this "Universal" resolution is enabled by adding the "--universal" cli argument?

It doesn't look like this is currently documented but I found some hints in the release notes.


How does it compare to Pipenv and Poetry? I had some problems every time I used Poetry. I wanted to like it, but it often hung or took forever, among other things.


So dep resolution is fast with uv.

Setting up a new project is: rye init && rye sync

Adding a dep is: rye add flask && rye sync

You can pin your Python version in the pyproject.toml.

Migrating from an established project is a little harder than it should be. Importing the requirements.txt into the pyproject.toml is not a good idea, as uv gets itself in a twist with all the low-level dependencies that exist in the requirements.txt. I've never tried it with a Poetry-made pyproject.toml; report back if you try it.

On the whole it's a good experience: fast and straightforward.


> Importing the requirements.txt into the pyproject.toml is not a good idea as uv gets itself in a twist with all the low-level dependencies that exist in the requirements.txt.

Can you explain? I wonder if you mean "requirements.txt generated via pip freeze" rather than "a human curated requirements.txt"


Yeah you're right.

If you just keep the requirements.txt "high level" then you should be okay. Just my experience with uv hanging for ages on a pip-freeze-generated requirements file.


This is awesome. I’ve really struggled with cross-platform resolutions in my bazel setup, as our services are built for containers that run in k8s, but we also want to be able to build and run locally for scripts, tests, etc. I have a branch that uses PDM, which works much better, but there are still a ton of hacks in it. Rye looks like it could help quite a bit.


How can you generate a universal lock file when some packages will query the running system in arbitrary ways during resolution?


Most packages don't do that. You can get really far by assuming that all of a package's wheels have the same set of dependencies (maybe using environment markers), and that its sdist always returns the same list of dependencies (also maybe using environment markers). No, it's not perfect, but it's also what Poetry and PDM do as far as I know.


Yeah that's right -- we make the assumption that all distributions for a given package will yield the same dependencies, similar to Poetry, PDM, and other tools. This is not strictly required by the standards, but it's very rare for it to be violated.
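
Concretely, "the same set of dependencies using environment markers" means lock entries like these (real PEP 508 marker syntax; the pins are just examples):

```shell
cat > lock-excerpt.txt <<'EOF'
colorama==0.4.6 ; sys_platform == 'win32'
uvloop==0.19.0 ; sys_platform != 'win32' and implementation_name == 'cpython'
EOF
# One lockfile, valid everywhere: the installer evaluates the markers
# against the running interpreter and skips entries that don't apply.
```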


It's rare, but in my experience it can still be very impactful and hindering adoption.

E.g., that practice seems to be quite prevalent in a few high-profile pytorch packages, which caused our team a lot of pain when trying to enable reproducible cross-platform builds 1-2 years ago.


Indeed. I gave up on this and just build in Docker.


I like how you ignore the "best practices" for packaging created by the PyPA (i.e., pyproject.toml and friends) and just do requirements.txt cowboy stuff.

Don't get me wrong, both are hilariously bad, but I like to see more chaos in Python infrastructure. You pushing against PyPA (rightfully) delegitimizes them.


What's the relationship between uv and uvloop?


None. "UV" was chosen because it is short and available.


How is it available, given libuv?


They probably mean that it didn't conflict with other common executables. I guess it could be a problem if libuv ever includes an executable called uv, though I can't imagine how it would make sense for libuv to have an executable component like that.


I can do this with pip-tools. Exactly how does one enable this in rye?


The thing that put me off of Rye is that it strongly encourages you to use non-standard Python builds.

From their philosophy page: https://rye.astral.sh/philosophy/

> No Python Binary Distributions: CPython builds from python.org are completely inadequate. On some platforms you only get an .msi installer, on some you literally only get tarballs. The various Python distributions that became popular over the years are diverging greatly and cause all kinds of nonsense downstream. This is why this Project uses the indygreg standalone builds. I hope that with time someone will start distributing well maintained and reliable Python builds to replace the mess we are dealing with today.

And here is info about those particular indygreg builds.

https://gregoryszorc.com/docs/python-build-standalone/main/

It is, however, possible to choose a different Python.

https://rye.astral.sh/guide/toolchains/
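
Per that toolchains page, pointing Rye at an existing interpreter looks roughly like this (commands shown as comments since they need Rye installed; the interpreter path is a placeholder):

```shell
#   rye toolchain register /usr/local/bin/python3.12
#   rye pin 3.12        # record the version for the project
#   rye toolchain list  # confirm what Rye can see
registered="/usr/local/bin/python3.12"
```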

I've never really experienced the problem they are describing. Any official Python build I've gotten from python.org has worked. Every normal old Python I've gotten from my distribution's package manager has worked. Every Python included with an official Docker container image has worked.

I'm sure their special builds will also work just fine, but their non-standardness gives me pause. There's even a list of behavior quirks. Why are we adding quirks to something that has no problems? And the fact that the rye philosophy seems so intent on using them, turns me off from it compared to the alternatives that sensibly default to using the Python you already have available on your OS.


I'm just guessing, but I imagine the scenario goes like this:

1. Work at a company that runs some ancient version of CentOS or something.

2. The system package manager doesn't have anything newer than Python 3.6.

3. Official binaries from python.org print an inscrutable error about the glibc version when you try to run them.

4. Building Python from source requires some new system dependencies, takes forever, and then produces an even less scrutable error about a missing header file.

5. Googling that error leads to a ./configure flag that works around the issue, which upstream GCC fixed in 2017.

6. Success?

If you haven't seen that error #3 before, or dealt with "manylinux" build environments, you've dodged an entire world of pain. The same goes for scripting installs on Windows, and for the part of that page that's talking about "limiting the CPU instructions that can be used" :')


The scenario might also go something like this:

1. Try to install pytorch


Which is kinda, to some degree, on the pytorch people as well.

I would think that if you add a new way of packaging projects and managing dependencies, there will always be legacy projects that require you to manually cart files to some undisclosed location with a wheelbarrow.

I don't think any new build system is to blame if an existing project's installation process is user-hostile. It would certainly be better to make the pytorch installation more straightforward instead of carving out special niches for it in every build tool.


Or spend days (literally) trying to install OpenCV...


This is one place containers can save a lot of pain


> If you haven't seen that error #3 before

I have been in the #3 hell, almost exactly as you described, but it was always about SSL and its missing headers. On my desktop wiki, the most important section about Python is the one containing the incantations required to compile SSL, setting a myriad of variables and using pyenv to build a newer (3.10/3.11/3.12) Python.


I've run into #3 quite often in embedded Linux projects, especially when dealing with the Jetson ecosystem where upgrading to a modern Python is a nightmare due to all the specialized hardware. Glad to see I'm not the only one who runs into this.


Python building, packaging and deployment has two extreme states: the king's highway and the hall of a thousand knives. If the portable Python suggestions do not make sense to you, then consider yourself lucky, because you have managed to stick to the highway.


Best of the thousand knives I think is `shiv`. Produces one big artifact which self-installs without colliding with other runtimes on the system.


I regularly download the Python source code, compile it with standard prod optimizations, then install to /usr/local/python${version}. This has worked extremely consistently since Python 3.7 (released in 2018). In my experience, these commands are so stable and consistent they could be automated away. What might the author's issue or underlying protest be?
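
For reference, the routine being described, roughly. Shown as comments since a full CPython build takes minutes and needs root for the install; the build-dep package names are Debian-flavoured:

```shell
#   sudo apt install build-essential libssl-dev libffi-dev zlib1g-dev
#   wget https://www.python.org/ftp/python/3.12.4/Python-3.12.4.tgz
#   tar xf Python-3.12.4.tgz && cd Python-3.12.4
#   ./configure --prefix=/usr/local/python3.12.4 --enable-optimizations --with-lto
#   make -j"$(nproc)" && sudo make altinstall
prefix="/usr/local/python3.12.4"
```

`make altinstall` avoids clobbering the system `python3` symlink, which is part of why this coexists cleanly with distro Pythons.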


I've also compiled Python from source a good amount, and it usually works... until something breaks and I realize some standard lib wasn't compiled because I was missing an optional dependency. But some lib assumes it's always included, because the standard distro ships it.

I think it's easy to compile Python, but it's easy to end up just having to go re-compile it for some random component that was not compiled in the "standard" fashion.

If you have a good test suite this stuff shows up quite loudly, though. At some point the core issue is more that collaborators don't want to have to compile things themselves.

(And to "automating away" as a comment... indygreg's releases _are_ this! Someone has done the work for us all)


>And to "automating away" as a comment... indygreg's releases _are_ this!

They most definitely are not. There's a world of difference between downloading a portable Python build and building one on your own machine, and C extensions can give you a world of trouble when they start referencing paths that are not on your machine, but rather were on the CI machine that built your static build. The FAQ even has a big section that boils down to "There's a bunch of paths included in these builds that will not make sense to consumers and we don't have a way to fix it."


lol readline I'm looking at you.


Same. We build our own Python and have been running it for years without a single hiccup. Not sure what the big fuss is. Pyenv does the same thing.

The concern could be addressed by simply improving the docs with the recommended compile flags; I think they are actually noted there. Also of note: our build time decreased substantially with LLVM.


Doesn't pyenv basically do that automating away? I don't think I've ever had issues compiling Python using pyenv.


There is also a world of windows out there. Compiling Python from scratch is not a ton of fun there. Rye wants to have a very consistent experience for everybody.


One of the key benefits of Python is that it makes it very easy to justify getting a Mac or Linux laptop from IT :)


Python is a nightmare to use on those platforms as well. It's one of my favourite languages, but I have (until now I guess) started avoiding it like the plague for anything important or distributable because of the actual hell that is dealing with it in production.


What actual hell in production have you experienced? I'm curious!


I've had a ton of issues with pyenv and using pyenv/pyenv-virtualenv/poetry that were fixed by moving to rye


No idea what that is, the standard distribution is so easy to work with I don't need anything else.


Well an obvious issue is that you have to do that!

Also, I think a big issue is the inconsistency between platforms. For example, the official Python installer doesn't include python3.exe (frankly because the devs are idiots), but the one from the Microsoft app store does!

If you stay on one platform you wouldn't see those issues.


> No Python Binary Distributions: CPython builds from python.org are completely inadequate. On some platforms you only get an .msi installer, on some you literally only get tarballs.

I'm just guessing, but they could mean that there are no macOS/Windows binaries for security releases of older Python versions. You can't, for example, download an installer for Python 3.10.14. The last Windows installer is for Python 3.10.11 (April 5, 2023).


Agree. Support PSF, don’t advocate a new “right way” of doing it. Weird position.


The PSF is the marketing and self-promotion arm in the Python ecosystem. It allows people who have done little or nothing to gain influence.

The PSF has nothing to do with software development.


UV is more likely to support SPF


Those French splitters!


I tried Rye during its first days. It would (without any indication) download some custom build of Python, which was dynamically linked, so it wouldn't work unless you were running a distribution similar to the build environment.

Linux distributions ARE NOT binary compatible, you can't just build Python on one distro and run in on another. You need to statically link everything to do that.


One way different Python builds don't work is, for example, that various optional modules are (not) included. For example, for my own Python installs, I build without tkinter. I have no use for this module, and it's always a burden to find the outdated X widget libraries necessary for this pile of garbage to build.

Seeing how this module is basically useless, a lot of Linux distros also exclude it from the Python they package. But PyPI builds try to include everything.

There are few more modules like that.

Another aspect is various defaults that help Python locate its parts or control its loading. Eg. the site module or sysconf. For various reasons various vendors may configure these differently. This may result in some packages being broken upon install.

I.e., Python programmers are notoriously bad at creating packages and understanding how they work (also, the Wheel format is a dumpster fire of nonsense). A lot of developers don't understand the consequences of packaging anything that's not strictly Python source code (which, frankly, should never have been packaged! but hey, remember, Wheel? dumpster fire? So... where was I...). Anyways, native libraries packaged alongside Python source may end up in some random place Python doesn't look at, and consequently fail to load; or other data files end up in the wrong place because the developer packaging them, after countless rounds of trial and error, got it working on their computer, with their particular Python installation, but still doesn't know why it worked.

Similarly, if a package wants to provide header files so that other packages can compile against the native modules the package provides... oh-ho-ho, bet you didn't know that's possible, right?! Well, don't bother. It mostly doesn't work anyways. And there's a bunch more stuff like that.

As a "typical" user, you might never have encountered any of the issues above, but as a Python infra person who's summoned to extinguish fires started by talented Python programmers using tools like the one in the OP, I deal with this stuff pretty regularly. So I can relate to the disappointment with any aspect of Python infrastructure. There has never been a case of me discovering something in Python infra and looking at it with admiration. At best it's passable, but in most instances it's hilariously bad.


Original author of Rye here: there are no official Python builds outside of macOS and Windows and the macOS builds cannot be installed programmatically. They also cannot be placed in any location other than the dedicated framework location which often causes issues for people who do not expect specific versions to be installed. Quite often installing the macOS builds of Python breaks other tools that expect a different version to be there.

I’m glad regular Python versions work for you, and you can register them with Rye. That’s very intentionally supported.

The goal of rye is to reach higher. It wants to see how close we can get to an experience you can get from node or Rust. That includes getting access to Python itself.

I have been programming Python since 2.2 and have helped countless people over the years with their Python challenges, from mentoring to tutoring to just helping on IRC and other places. So many times people had a completely broken Python installation or ran into compilation errors. Even pyenv ships patches to make Python compile cleanly in some cases.

The indygreg builds have issues, no doubt about it. In an ideal world the PSF would distribute portable Python builds for all platforms.


I do like that Rye's Python builds apparently use libedit instead of readline. One less GPL-licensed thing to worry about for redistribution :)


As someone with a large project that depends on the standard readline that was a major hiccup when moving to rye. Luckily there's a gnureadline package.


Wow, so many haters :(

I love Rye. It does what it says on the tin. It makes the whole venv/Python-version/packaging process actually pleasant, and it’s invisible to someone used to Python-official usage (pyproject.toml et al). And it makes Python feel like Cargo, which is a great thing to work with too.


If, like me, you've ignored Poetry and friends and stuck with pip-tools (congrats!), uv (used by Rye internally) is a drop-in replacement.

IMHO pip-tools was always the far nicer design than poetry, pipenv etc as it was orthogonal to both pip and virtualenv (both of which have been baked into Python for many years now). I would argue Rye is the iterative, standards compliant approach winning out.

Beyond the speedups from Rust, it's nice to have some opinionated takes on where to put virtualenvs (.venv) and how to install different Python versions. It sounds small, but since wheels fixed numpy installs, sane defaults for these and a baked-in pip-tools are basically all that was missing. Talking of which, what has been the point of Anaconda since binary wheels became a thing?


> what has been the point of anaconda since binary wheels became a thing?

When you need python + R + some linked or CLI binary in an isolated environment. Also you will use the same tool to manage this environment across multiple OSs (e.g. no OS specific `apt`, `brew`, etc).


I still love miniconda for DS work. If you want to setup a project to process some videos using some python libraries, you can use conda to install a specific version of ffmpeg into the project without worrying about your system installation.

Lots of random C/C++/Fortran libraries can be used directly from conda, saving a massive headache.


On Linux, binary wheels are unreliable and sometimes segfault.


As somebody who tried to pick up Python after hearing there was one way to do everything…the installation and environment management experience was a train wreck.

Glad to hear it’s getting better finally.


What you heard is from the Zen of Python, a short text meant to express core ideas behind the design of the Python language. You can read it by typing `import this` in the Python interpreter. The exact sentence is:

    There should be one-- and preferably only one --obvious way to do it.
This sentence was coined as an answer to a catchphrase used to describe the Perl programming language: There Is More Than One Way To Do It. Giving programmers more freedom to express themselves in different ways was presented as a good thing by the Perl community.

Python was partly marketed as a replacement for Perl and the sentence from the Zen of Python expresses a difference from Perl. The idea is that having different ways to do things leads to confusion and code that is harder to maintain, problems that Perl was supposed to incur according to its critics.

The sentence was true to a certain extent when it came to the Python language. I don't think it has ever been true for the Python ecosystem. For example, during the early 2000s, there was a plethora of web back-end frameworks for Python. As the Python language has since gained a lot of features, I'm not even sure this is still true for the language itself.

Regarding package management, this has always been a weak point of the Python ecosystem. Python developers often make jokes between themselves about that. Unfortunately, I would be very surprised if this project was to put an end to this issue.

Despite all this, I encourage you to learn Python because it's a very interesting and powerful language with an extremely rich ecosystem. Yes, there are many ways to do the same thing with it. But on the other hand, there is a way to do pretty much anything with it.


> Python feel like cargo

I am sold. Was thinking of trying out Pixi after Poetry took a whole day and still couldn't resolve deps.

Looks like there are more Python package managers than chat apps from Google?


> Poetry took a whole day and still couldn't resolve deps.

I hate doing this, but the solution is to reduce the search space for poetry to find a compatible version.

Verbosely install with poetry (-vvv) and note the package it gets stuck on. Find the currently installed version from the lock file and specify it as the minimum in your pyproject.toml file.

The time to find a solution went from 2-4 hours to <90 seconds when I did this a few months ago for a large complex work project.
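To illustrate, the fix is just a tighter lower bound in pyproject.toml (the package name and versions here are hypothetical placeholders):

    [tool.poetry.dependencies]
    python = "^3.10"
    # Before: scipy = ">=1.7" gave the solver a huge search space.
    # After: pin to the version already recorded in poetry.lock:
    scipy = ">=1.11.4"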


Pixi is focused on the Conda side of Python's ecosystem. Rye is not quite what Cargo is to Rust; it's more like a faster Poetry. Both Rye and Pixi use uv, which aspires to close the gap and become the Cargo of Python. Rye will likely fold into uv at some point in the future.


I was going to complain, but I’ll ask you/yall instead: what do you mean “makes it actually pleasant”? Is it too hard to summarize? Because I don’t think I ever identified anything about Anaconda or Poetry that felt like a particular choice, at least UX-wise. And curation-wise, it seems like a hard sell to trust a smaller org over the larger established group.

In other words: what does it say on the tin?? All I can read is “good package manager is good because good and fast and good”. Maybe there’s a comparison or ethos page…


A lot of data people use Anaconda. Anaconda is sooo slow. Even on a very beefy workstation, Anaconda often needs > 10 mins to solve an environment, and often fails. I would be excited to try something without these issues.


Mamba fully replaces Anaconda and uses a SAT solver written in C++. IIRC, conda now uses libmamba under the hood as well. If you post a list of dependencies, I can time it on my box and post the timings here. (Not saying conda/mamba are best or perfect, but the last time I saw 10-minute resolve times was a very long time ago.)


Everyone using Anaconda should switch to Mamba or Pixi, if not for speed, then for Anaconda's licensing switcheroo. Their legal department will chase you to the ends of the earth to get their money.

Really horrific experience with the folks at Anaconda. Stay far away.


Speed, for one thing. Rye also manages your Python version by downloading a prebuilt build, with a less finicky setup than the pyenv/pipenv virtualenv shell scripts (which take longer and are less reliable because they compile Python from source instead of downloading it).

As someone who has had to deal with his team's Python setup: installing Poetry and pipenv and compiling Python automatically on every user's machine is a lot more finicky in practice. Plus, Poetry wasn't just much slower; sometimes locking took many minutes to finish, appearing to lock up.

There's also rye install / rye tool install, which works like pipx: tools are installed in siloed virtualenvs with a run file in the Rye dir you've already added to $PATH (it also takes parameters to pass in extras, such as installing DB packages for sqlacodegen, and can optionally expose their executables on your path). Rye bundles other tools from Astral, e.g. ruff, which is the new hotness for Python linting/auto-formatting/import sorting and is also fast/written in Rust.
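For example, the tool workflow looks roughly like this (a sketch; assumes Rye is installed and its shims directory is on $PATH):

    $ rye install ruff       # installs ruff into its own siloed virtualenv
    $ rye tools list         # shows which tools are installed
    $ rye uninstall ruff     # removes the tool and its virtualenv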

I feel that with Rye/uv/ruff, Astral is finally making progress towards a fast, useful, all-in-one Python package manager/tool like Cargo. And they keep adding a lot of useful features; for example, ruff is slowly working towards implementing the flake8 and pylint rule sets, and other lints.


The cynicism of HN surprises me sometimes.

I love Rye. After using package managers from other languages like Cargo and Hex, the lack of a similar system for Python always had me yearning for more. I'm really happy to say Rye has completely scratched this itch for me, and it's a real pleasure to use as someone who doesn't want to install different tools to manage venvs, Python versions, and my project's dependencies. Rye just does it all.


Have you tried conda? How does it compare to Rye?

I haven't formed an opinion of Rye yet, but conda can "manage venvs, python versions and my project's dependencies" just fine.


ML researcher perspective: Conda is... dog slow, even for relatively simple tasks (clone and run a project). The recommendation nowadays is to use Mamba (iirc), but in my experience (a few years old now) it's been unstable and unable to find package solves which worked on my system / our cluster.

I've settled on just using Poetry for most things, and then using pip in a venv to install the pyproject.toml file from Poetry either in a Dockerfile or directly on a cluster. That's worked fairly well so far, even with torch/cuda (and the mess of CUDA versioning) and from macOS to Linux.

I think uv/rye is a good next step, Poetry can be a bit slow as well at times.


It IS slow, no argument there, but I never find the speed of a package management tool too important.

Maybe it's different for other ecosystems such as Node, but when I'm doing research in ML I configure my project mostly just once and do the bulk of the work then (install CUDA, PyTorch, etc.); later it's mostly just activating the environment and occasionally adding some util packages via pip.

What makes conda better than native venv+pip is its extensive libraries/channels and its ability to solve/build complicated dependencies effortlessly, especially when you have to run your project on both Windows and Linux.

This is not to say speeding up isn't needed, of course!


> What makes conda better than native venv+pip is its extensive libraries/channels and its ability to solve/build complicated dependencies effortlessly, especially when you have to run your project on both Windows and Linux.

For me, most stuff is installed via pip anyways. The only things I'm pulling via conda is blas, scipy, torch and all that stuff that's a PITA to install.


If you are working on a large collaborative project, switching between branches can mean needing to rebuild your container images. It's not something I do every day, but it happens enough that the difference between 1 minute (doesn't disrupt flow/train of thought) and 10 minutes (disrupts flow) means something.


The mamba solver comes with conda nowadays. It's not slow any more.


Not only is it slow, it has so many idempotency issues that it's barely usable.


I agree. Rye is working great for me. I wanted a Python cargo, and Rye has achieved that from my perspective so far.


A new python dependency manager is like stumbling across a new JavaScript framework


My philosophy is simple. If the program is intended to be distributed, just use Go. If it doesn't involve port stuff, use Docker. If you have an IT team, or someone to hand you a computer with the OS and Python version installed that everyone else in the org uses, use venv.

If you have to work with ports, have to distribute programs, or your libraries depend on C or OS stuff, then start consulting, where you don't have to manage the codebase or have any commitment to it after getting paid.


It's more complicated to write machine learning software in Go than it is to write portable apps in Python. The same goes for a lot of use cases for Python outside of backend servers or similar web-related use cases. You can't really just "use Go" for a lot of the things people use Python for, at least not realistically.


I have seen my fair share of ML Python codebases. Distribution is a mess; onboarding new people is a mess. The things I would say just work are OS-level configuration tools like Kubernetes or NixOS: proven technology, with enough resources that issues can be self-debugged instead of opening tickets/GH issues or reaching out to support. But as these are much more complicated technologies, you need domain experts and should not pressure ML engineers or data scientists to figure them out. I have seen Python packaging be such a mess that it is easier to teach Python engineers ML or DS than to teach ML engineers proper package handling and distribution. The very existence of dozens of packaging solutions shows that engineers would rather create something from scratch than work with existing tools.


I mean, I completely agree with that. I'm an MLE and I absolutely, utterly hate how much of a mess it can be and how much time is spent just helping interns get their env set up reliably (we now have a pretty reliable setup/docs, but that was after a few painful onboardings). I just think that using another language for some of what Python does would be even more painful, just not on the packaging side of it.


Choosing a language based on its distribution capabilities is the wrong criteria. Instead, decide based on what it enables you to do, and deal with the distribution later. The distribution won't matter if your project is not successful anyway.


> If the program is intended to be distributable, just use Go.

Sometimes you need to use a Python library.


None of the problems you pose should be of any issue to anyone who calls themselves a professional.


Hilarious that this is being downvoted. Can you imagine professionals in any other industry being so pathetic? “Oh man, making bridges is sooo hard, I won’t stand by my work on anything that is above the ground. Making highways is sooo hard, I won’t stand by my work for anything that has to hold a lot of weight. Making food safely is soooo hard, I won’t stand by my work for anything that requires chilled storage and/or cooking to an appropriate temperature.”

Grow up. Have some respect for yourself, your work, and the industry.


Indeed. Docker solved distributing and running python programs like 10 years ago. You can even run CUDA and pytorch in docker nowadays. And the usual answers you see on HN every time someone brings up "just use docker" on those threads, is "but I don't wanna """learn""" docker". Takes 10 min to get a python container running with 0 experience in Docker.


Yes, however unlike Poetry et al, this one is actually good!


It's really insane. And, as a user, frustrating that there is still no standard after so many years.


I like pixi (https://pixi.sh/latest/). It lets me pin the Python version and install packages from conda and PyPI. And it's also written in Rust.


It looks really interesting but it is hard to really invest in yet another ecosystem that tells you to curl and pipe into bash and then tells you to eval arbitrary command output.


For what it's worth, you can install pixi with Cargo. The current invocation is:

  cargo install --locked --git https://github.com/prefix-dev/pixi.git pixi
I try new versions of pixi from time to time because I have a project that depends on LAVIS and EasyOCR. My default project-management tool, Poetry, has problems with PyTorch. Right now, I use pip-tools for the project. While Conda worked, I didn't like the tooling that much. What is currently blocking me from using pixi is the PyPI integration (https://github.com/prefix-dev/pixi/issues/1295). I can evaluate pixi in earnest when it is solved.


Thanks for the link. Is it faster than conda?


It's orders of magnitude faster than conda


yes


I find pixi great. If anyone uses conda, pixi is a drop-in replacement where the environment is associated with the git/project directory, similar to devbox/devenv/flox.

The story is a bit complicated. First there was conda, written in Python by the Anaconda company. Then came conda-forge, an open-source ecosystem around a conda channel with CI build bots. Then mamba, under the same conda-forge umbrella, a drop-in replacement for conda written in C++ (an actual drop-in: `alias conda=mamba` should work). Now conda uses libmamba as the solver to speed it up.

Then the author of mamba spun it off into pixi, a rewrite in Rust with a different philosophy on how environments should be located and activated, but with full compatibility with conda environments.

Conda has always supported installing packages from PyPI via pip (when a package isn't available from conda channels, for example), and pixi supports PyPI packages via uv. That makes pixi fast. (There are other optimizations, outlined in their blog post, that make it much faster than even mamba.)

If you use any non-pure-Python packages, then conda is the way to go. The choice of package manager (conda/mamba/pixi) is secondary.

The problem with PyPI is the lack of gatekeeping. That, coupled with the lack of a standard way to package non-pure-Python packages, makes environments leaky (see comments on errors encountered in exotic or old environments) and/or non-reproducible (especially when people distribute source only and do crazy things in setup.py to bootstrap their environments, including compilers).

In conda land, the conda-forge channel has pretty good gatekeeping to ensure quality: packages are constrained properly, licensed properly (PyPI maintainers sometimes didn't include the necessary license file in the distribution), environments are isolated properly, etc. It's not bulletproof, as there is an official bot that maintainers can use to auto-merge changes from PyPI that have wrong version constraints, for example.

The problems that no tool can solve right now are centered around PyPI: dealing with packages not available in conda, and the fact that releasing a package virtually mandates releasing on PyPI first.

When installing a package that's only available on PyPI into a conda environment, some of its dependencies may still be available through conda. AFAIK, no package manager will use conda packages to fulfill the PyPI package's dependencies. You can manually add the conda packages to resolve dependencies, at the risk of not subjecting them to the right version constraints.

And when you author an open-source Python package, even if your setup relies on conda channels only, you most probably would/need to release it on PyPI first (releasing on the conda-forge channel virtually mandates a presence on PyPI first). Then you need non-conda tools to help you. This is why Rye would still be useful to people like me, and worth checking out.


Worth noting that the uv folks (https://astral.sh/blog/uv) have taken over the reins of this project from Armin.


On the introduction page:

> Rye is still a very experimental tool, but this guide is here to help you get started.

While I’m really excited about this project, I’m planning on waiting until this project is in a more mature stage. I am a big fan of everything else the Astral team has put out so I have high hopes for this.


We use rye to develop the openai-python library and it's been great.


Rye looks great (i've also really enjoyed using Poetry as well). But man, do I have respect for Go's quality of out-of-the-box tooling. Feels like every time I start a Python project I need to re-learn the state-of-the-art wrt dependency/venv management.


Oh, this is incredible. As a non-Python developer, I never understood why Python projects have such a terrible DX. Say what you want about npm/Node, but it's super easy to bootstrap a project and to understand what dependencies are required.

Is there a reason why Python has taken this long to get a mature alternative to "requirements.txt"?


My favorite ecosystem, Ruby, has an order of magnitude fewer issues in this regard, so you can imagine my sadness to discover that my second tech love, Emacs, is just absolutely unreal in this regard. And there's just no way around it; I wish I could hack on Emacs in Ruby, but there appears to be no way around learning elisp. And elisp is... crufty, to say the least. At least the documentation is really good once you get past all the beginner issues. And there's a refreshing self-awareness there; it's fun seeing 'this function has been around since around Emacs 23' when looking stuff up.


Are you claiming rye came up with the first mature alternative? Tons of people have been using solutions like Poetry for years. Don't discredit that.


I think it's just in Python's DNA. I dunno, even going back to the Python 2/3 debacle, it's just one language and ecosystem I've never felt "has it together". Despite being a language for beginners, I've never felt it especially easy to use.


> Say what you want about npm/node, but it's super easy to bootstrap a project and to understand what dependencies are required.

Huh? JavaScript is legendary for having a near-infinite amount of ever-changing frameworks and "build" systems, with various overlapping abilities.


I would think that making it support musl/Alpine first-class, without requiring a fork in the road to deal with it, would be worth the effort. Things like this are enough of a hurdle that people will resort to hacky workarounds.

(I hit similar things with libressl/libcrypt from time to time. These are not quite exactly drop-in replacements. It's painful enough to make back-porting modern SSL-dependent code a royal pain of broken shared-lib dependencies.)


>Alternatively if you don't trust this approach, you can download the latest release binary

Is there a security difference between running a shell script that runs a binary vs. running a binary directly? Or downloading an executable from the browser vs. downloading a shell script using curl?

I get that running the shell script can subjectively look more scary, but doesn't it just basically reveal the inherent danger of running an exe anyhow, assuming there's no code signing keys involved?


I remember that you can detect the "curl | bash" server side and serve a different script than what the user would get by downloading using other methods[1]. But yeah, the binary itself already has enough attack surface.

[1] https://news.ycombinator.com/item?id=34145799


You can virus scan a binary before running it


Attackers can run a virus scan before distributing it.


not sure what you’re getting at


attackers can check if a virus scan would detect the virus and change it until it passes the scans, so virus scans are not sufficient protection against dedicated attackers.

just because a virus scan did not find anything in a binary, that doesn't mean the binary is safe.


That's true, but no sane malware developer would share their binary with VT. Downloading a binary is still safer than having your shell run arbitrary stuff.


Sure but then it will trigger false positives some of the time

https://github.com/astral-sh/rye/issues/468


Sure, but there are a few trusted AVs whose results I look at to know whether this is the case: usually Malwarebytes, ESET, and Kaspersky.


https://news.ycombinator.com/item?id=40905891

*Python has too many package managers*


I don't disagree, but the trouble is, which do you axe?

Everyone agrees there's too many, but nobody yet agrees what to converge on.

It's probably made worse by the fact that package management isn't considered part of the python core team's remit.

There's a GitHub issue on the Rye page titled "Should Rye exist?" for exactly that reason, and Armin (the original developer) seems pretty conscious of the package manager clutter. But in all honesty, it solves a real problem (one that other package managers don't solve), so I can't see any reason Rye should do anything differently just because lots of package managers already exist.


I would say it's the job of the people making python to select or provide a better solution than the current one


Rough edges as they may be, I'm just going to put my chips on Hatch because it works well, the defaults for build/formatting/testing/publication are sane and built on other tools, I can actually read the source code easily enough to PR it (which I have done), and it's under the auspices of the PyPA. In short, it's a good facade.

Also, the CLI tooling doesn't even depend on using Hatchling as a build system, which is an understated benefit.

Unless one's got a particularly weird use case where they actually need to know the ins and outs of PEP 517 (Maturin/PyO3 with system specific dependencies, for example) it is going to be completely fine for the vast majority of packages with minimal additional configuration.


Hatch is very featureful and achieves a lot, but for me, a lot of the defaults and how it's configured is entirely backwards from what I expect. I've concluded it is not for me.

My concrete suggestion is that Hatch would be better for me if it was developed by a team and not a single person. Then its core ideas would be judged by multiple people from different angles. As it is now, it's written to the taste and philosophy of a single author.


That may be the case, but my experience from submitting issues is that ofek is very receptive to others' perspectives and spends a lot of time engaging with other takes in issues on the Hatch repo and elsewhere.

As an example I see he's in this thread replying to a cousin comment :).

Sidebar: the benefit of one person early on in a project's development is that it's much easier to establish conceptual integrity in the Fred Brooks sense: "Conceptual integrity in turn dictates that the design must proceed from one mind, or from a very small number of agreeing resonant minds".


One person's coherent vision can be very good for a project, I think so too!


I’ve used both Hatch and Rye extensively. Hatch is great, but currently lacks monorepo support (it is expected to be added by the Autumn). I do like the way that Rye bootstraps itself, the dependency resolver does seem to be much faster, I like that you get updated pip lockfiles on every sync, and it already has opinionated monorepo support. Getting a new developer spun up is as easy as “rye sync”. But I agree that the fact that Hatch is under the auspices of the PyPA is persuasive indeed.

Rye for some reason requires that you put dev dependencies under a non-standard key in pyproject.toml, but other than that, it’s pretty trivial to swap between Hatch and Rye as you like (which was indeed one of the primary goals of PEP621 afaiu). I for one will certainly be checking in on Hatch again when monorepo support is added.


Workspaces is unblocked now, it depended on a large refactor which has now been merged: https://github.com/pypa/hatch/pull/1602

In my free time, it's the only thing I'm working on!


Excellent news - thank you for all of your excellent work on Hatch!


Awesome, thank you. Happy Hatch user.


I'm sure experienced Python developers will continue to use the familiar pyenv/venv/pip toolset, or its many variations. However, Rye is better for beginners and developers coming from other languages.

First, Rye is an all-in-one tool, written in Rust for speed, with a unified set of commands for setting a Python version and installing packages, with environments created automatically.

Second, the tool uses a folder/project approach to development like languages such as JavaScript and Ruby, where a developer sets up a project in a folder and then specifies a language version and dependencies, rather than the old Python approach where packages get installed into a Python version that is shared among projects (unless a virtual environment is created with venv).

The Rye documentation is good; still, I thought it would be helpful to offer a tutorial for beginners, so I wrote "How to install Python on Mac" [0] and "How to use Rye for Python" [1]. I don't know if Rye will catch on, but it's a breath of fresh air.

[0] https://mac.install.guide/python/install

[1] https://mac.install.guide/python/use-rye
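The day-to-day project loop from those tutorials looks roughly like this (a sketch; the project name is a placeholder):

    $ rye init my-project && cd my-project   # scaffold a pyproject.toml
    $ rye pin 3.12                           # choose a Python version for this folder
    $ rye add requests                       # record a dependency in pyproject.toml
    $ rye sync                               # create .venv and resolve/install everything
    $ rye run python -c "import requests"    # run inside the project environment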


The amount of effort that goes into making Python usable is insane.


Question about the company Astral: how does it make money? I see it has backers, and a mission, and beliefs, but is there something that's sold?


It's kind of ironic that there isn't a decent business model for folks who build tools that uplift the productivity of MILLIONS of devs.

I'll just say that even though Linux is FOSS, Red Hat still became a multi-billion-dollar company.

Likewise, Travis Oliphant was able to build things like Anaconda and Quansight to monetize some of these essential open-source tools.

With wide enough adoption, almost anything can be monetized at some level.


There is a decent business model. There are several, in fact. I'm asking which Astral has, if any.


I don't think it makes money yet.

They've said the eventual plan is services [1].

[1]: https://astral.sh/blog/announcing-astral-the-company-behind-...


Here I am and pip+venv+pyproject.toml work just fine. I don’t distribute outside our team, maybe that’s why?


Try making a portable package that pulls PyTorch with CUDA (reliably) with that setup. I don't even think it's possible, as you can't add another package index when using python build.
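For a direct install, plain pip can at least point at PyTorch's own index (the CUDA tag below is a placeholder; the right one depends on your driver). The pain is that there's no standard way to record that index in project metadata:

    $ pip install torch --index-url https://download.pytorch.org/whl/cu121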


Cross-platform applications that rely on binary libraries have always been the beginning of my Python dependency misery. It usually works fine, but when it doesn't, it's miserable.


I get that, but isn’t that a problem with dependency management itself and not the tools?


Same for me. Does rye help? Or what's your silver bullet?


I haven't encountered any silver ammunition for that particular gun. I haven't worked on a big python project in a while, so I'm not sure if rye would do the trick.


PyInstaller + bundling the dlls and binaries works well for me


One annoying problem that I encountered with that approach was different binaries having different version conflicts on different platforms. Not unsolvable, but definitely not solvable without a whole lot of irritation, and a huge PITA to maintain. It does make it possible, though, which is a whole lot better than nothing.


If I want to dip a toe in and run Rye and pyenv side-by-side on the same machine (not for the same project obviously)... is that going to be a footgun? I'm assuming... yes, since the shims from both are going to conflict?

Update: looks like this might be supported, from the FAQ...

Can I use Rye Alongside Other Python Installations?

Rye, given its experimental nature, does not want to disrupt already existing Python workflows. As such, using it alongside other Python installations is intentionally supported. Even if the Rye shims come first on the PATH, Rye will automatically resolve to a different Python installation on the search path when invoked in a folder that contains a non-Rye-managed project.

As such the answer is a clear yes!


Just watched the video introduction. I don't mind people's preferences regarding keyboard type, but those noisy mechanical keyboards sound awful, tiresome, and downright aggressive when watching a recording; they stand out and sound a bit like a machine gun in an FPS. I don't need that kind of aggression, especially when it is so loud it covers the voice of the presenter.

I would appreciate it if people making video presentations would switch to a quieter keyboard, use directional microphones so that only their voice is recorded, or edit the sound afterwards to filter out the keyboard frequencies, out of respect for their audience. It is not hard to do.


Funny, I realized within the first few minutes that I’m not interested in Rye, but I couldn’t stop watching because I loved the mechanical keystroke sounds so much. I don’t use a mechanical keyboard for various reasons (I tried), but those sounds…

(Not saying having them in the video for everyone is a good thing—I respect the fact that some people can find them annoying)


Well you are perfectly entitled to watch ASMR videos of people typing on a loud keyboard :D

I personally do like a mechanical keyboard and the sound of it when I am using it, but I don't use it much unless I am the only one at home, out of respect for my kids and partner, especially in the morning[1]. And I would also switch keyboards to record a video or do a video call, as I don't want the sound to be so present.

I think this is just a negative point of the video, as it puts a busy foreground sound over what sounds like a bad microphone recording in the background. It should be the opposite: the voice should be the foreground sound to your ears.

[1] My partner has an irregular work schedule, so they might want to sleep longer in the morning after coming in late the night before.


Is this a common opinion? After watching the video, I couldn't imagine why anyone would be so bothered by the keyboard sounds, which are maybe just a little bit too loud, and way less distracting than the poor sound quality of the microphone used (which, again, is pretty easy to get over).


I really don't think this is "aggressive". You seem to be saying "I don't like the keyboard sound", which is fine, but overcharacterising isn't necessary. And writing a two paragraph comment about how easy it is to fit in with your preference appears more deliberately aggressive than anything the video does.


Ruff and uv are both excellent tools, which are developed by a VC-backed company, Astral: https://astral.sh/about. I wonder what their pitch was.


That's what I'm worried about. What if we start using Rye, bake it into our projects, and then Astral goes "Sike! Now we will collect all information available to the Rye binary and require $1 for each build"?


Rye-powered Python deployment platform a la Vercel?


Anaconda competitor? Many companies in this space start out by releasing new OSS tools and then turn into consultancy sweatshops.


I recommend Rye because of its Python binaries (credit to Indygreg). What a time saver!


I'm not sure about Rye, but I've been experimenting with uv as a replacement for pip-tools and so far I'm impressed. I've been using and contributing to pip-tools for years now but it has a few quirks that I don't like. uv seems to solve them already, and it's way faster.

There's a few bits I don't like in uv, but I intend to submit patches soon.

The one thing I think sets these projects back is the naming. Rye, uv... They just aren't as good names as pip, pip-tools, venv etc. This could make it harder to convince people of their legitimacy. But I could be wrong, maybe they will succeed based on their merits.

"A Cargo for Python" I find a bit misleading. Python is very different from Rust, and upon learning Rust I didn't find anything that I hadn't already got in all my Python work. If anything, Cargo is just "the best parts of all packaging software so far, including Python's".


It looks interesting; I'll keep an eye on it. Side note: The intro video is great, but the keyboard clicks were annoying.


As a Mac user, I'd be interested if you could add:

* Installation through Homebrew

* Plugin for Oh My Zsh!


Rye installs via a shell script that you pipe to bash:

curl -sSf https://rye.astral.sh/get | bash

(I believe this downloads the correct binary for you. Rye is a single binary built in Rust, but it will also download a version of Python to use for some operations.) Or you can download a zipped tarball from https://rye.astral.sh/guide/installation/#installing-rye

https://rye.astral.sh/guide/installation/#add-shims-to-path

For Oh-My-Zsh:

    mkdir $ZSH_CUSTOM/plugins/rye
    rye self completion -s zsh > $ZSH_CUSTOM/plugins/rye/_rye

Then make sure rye plugin is enabled in ~/.zshrc

I think $ZSH_CUSTOM would be your Oh My Zsh folder.


I am starting to wonder if I’m the only one that just types “brew install NameOfThatThingTheyWantMeToPipeInFromCurl” into my terminal as a reflex.

It’s already available: https://formulae.brew.sh/formula/rye#default


I have not yet tried Rye, but the installation guide shows me that it's yet another (seemingly new, too) tool that creates yet another folder in the user's home directory (as opposed to ~/.config or ~/.local/share). Why?


It looks like Armin Ronacher answered this question in the issues when he was maintainer (https://github.com/astral-sh/rye/issues/316#issuecomment-158...):

> While I understand the desire for XDG, I personally hate it because it encourages common files to be in different locations on different operating system, driving up the complexity for everybody. Once you go down that path, every single documentation has to explain different paths for different platforms and configurations, how different configs can affect where stuff is. It gets gnarly quickly.

> I do not believe the benefits outweigh the downsides and XDG is generally not enough of a standard that it prevents people from putting stuff into the home folder. If you want to relocate `rye` there, you can already do by exporting `RYE_HOME`.


Rye has been wonderful for me. It works nicely and isn't adding too many opinions. Tools like poetry tend to be not amenable to certain workflows, which has likely made many people stick with plain `pip install -r requirements.txt`.


[context: honest question from occasional Python user] What's wrong with poetry?


Poetry has two flaws imo:

1. It's written in Python, which makes it slower and prone to bootstrapping issues.

2. It doesn't manage your Python installation, which necessitates the use of a tool like pyenv.

Rye sidesteps both of those by (a) being written in Rust and (b) trying to solve all of the problems solved by poetry and pyenv in one go.


I like that poetry doesn't manage the python/venv installation. Rye creates a .venv for every project, and with large packages that starts to add up as more projects are added. With poetry, I can separately create a "common" virtualenv which I can use with a bunch of throwaway projects; this of course assumes that the version requirements of these projects do not clash, and if they do, I can always create another virtualenv.

With rye, I activated a virtualenv and then created a new project, it proceeded to setup its own .venv within the project instead of just using the one that was already activated.


I would like for more venv sharing, but rye is leaning heavily into correctness, which (in the current Python universe) is much easier to do by recreating the world.


I would consider both of those flaws of Rye, not Poetry. Python package managers not written in Python will by definition have fewer contributors, and to me, they make Python look like a toy language (you can't write a package manager, a fairly trivial program with the exception of dependency resolution, in Python - what can you write then?)

As for managing Pythons, I would consider this to be orthogonal to packaging, and the default system Python is often good enough.


The problem with Python for such tools is not so much the speed but that it’s hard to distribute and bootstrap.


I think this is a fine opinion, we like tools that do exactly how much we want them to. But I'd suggest setting up python (and virtual envs) was actually a big headache for a lot of newer users, and some of the old ones (me that is).

I also don't see why leaning into python being a wrapper around rust/cpp/c is a bad thing. Each language has its own niche and packaging/bootstrapping is more of a systems level language problem.


If poetry works for you, it's fine. It works great 90+% of the time for me and I found it before rye or hatch or whatever, so I use it too. Eventually it will either have to adapt to the new standards as they firm up or it will probably get left in the dust, but that's fine. Hopefully if/when that happens something like rye will be mature and boring and well supported everywhere instead.

The latest tools are not always the best tools, old and boring, as long as they work for you are perfectly fine.


Does it automatically install and setup the required Python version for the project?


Yes. You choose a standard version (usually the latest stable; 3.12 currently) that it downloads at setup, or it downloads and sets up the pinned version from your pyproject.toml.
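For reference, the pin is just files in the repo; a rough sketch (name and versions here are examples):

```toml
# .python-version (written by `rye pin 3.12`) holds the exact interpreter pin:
#   3.12
# while pyproject.toml carries the lower bound:
[project]
name = "myapp"              # hypothetical
version = "0.1.0"
requires-python = ">= 3.12"
```

`rye sync` then fetches and uses the pinned interpreter for the project's venv.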


It doesn’t work consistently across environments


Could you explain further? By 'environment' do you mean platform?


Like during dev setup: does it work on your friend's laptop but not yours, even though you're both running the latest macOS? (This was my experience.) Or it works on your laptop but not a CI machine (also my experience).

What we want is for a declarative toml file to reliably reproduce dependencies regardless… sadly even across very similar machines it fails to do so.


Isn't this what the lockfile is for? Are people installing poetry lock dependencies on macOS that turn out different?


I’d recommend reading about Python Wheels - https://realpython.com/python-wheels

Unlike, say, npm, Python doesn't just download a versioned set of python files depending on your lock file


Probably they have different versions of Python installed manually, and/or different architectures.



Poetry struggles with complex packages like PyTorch.


My world is really simple and I wonder if I am missing anything:

1: I develop and run Python projects in Docker containers.

2: Each Python project has a setup.py.

3: The setup.py installs dependencies from their git repos.

What benefit would I get from using any additional tools?


Well... why are you using Docker?


Without containers, how do you prevent some dependency going rogue or bonkers and messing with files outside of the project directory?


Nix would be another option. I find it odd so many tools like this are so Python specific, when a lot of the complexity is dealing with non-Python dependencies (glibc versions, and all the different C shared libs).


I think there's room for both. I want speed and correctness within my little Python ecosystem, so I reach for a tool like Rye to help with the package management. Then to solve the system level issue maybe I reach for a simple direnv implementation to install the system level dependencies needed for the project.

You could of course lean all the way in and manage and build your Python dependencies using Nix as well, but that is... introducing a lot more complexity and slowness at this time.

I think it comes down to the frequency at which we iterate on things and the level of expertise required to deal with them. The system level dependencies might change much more rarely, and require more expertise to work with tooling, but if I can hand off a clean system environment to a developer who doesn't know about or need to care about Nix at all and then they are able to quickly iterate inside the Python ecosystem I get the best of both worlds - I solve the non-Python dependency problem myself and the end user (the developer) has a fast, smaller scoped, way less inscrutable problem to deal with when they are working within the Rye ecosystem.


If I'm not building a shared library, what is the benefit of having my project as a package? It seems Rye only supports python packages, but most my stuff never touches PyPI or anything similar.


For that use-case, you can set up a `virtual project` using Rye. And use it just to create python envs, and sync dependencies.

Honestly, the biggest time-saver for me has been Rye automatically fetching python binaries that work everywhere, and setting up clean venvs.

- https://rye.astral.sh/guide/virtual/


Rye is a python only thing, and if you're using python without any external dependencies, it probably doesn't have that much benefit (although it does make having different python versions for different projects a lot cleaner if you need that).

I think that it has a huuuuge benefit for applications as well as libraries. Rye generates lock files, which are really handy for pinning down versions of something running in production so that it'll behave predictably.


I am using external dependencies I’m just usually not packaging my project.


Some OSes need shims to easily call scripts on the command line. Also, you can define multiple entry points that might live in one big ol file.


One thing that isn't clear is editor support -- I recently switched to poetry for a few repos at work because it took the pain out of editor setup for my team -- it works with intellij, vscode, and emacs (well, doom emacs made it pretty easy) -- anyone know how well this integrates with e.g. emacs?

A quick search turned up this unanswered question: https://github.com/astral-sh/rye/issues/888


I use it with vscode and I've not had any problems. vscode just sees the environment. Rye puts things in standard places (PEP places) and uses common tools under the hood, so I think that it should generally just work with an editor that understands python environments.


The presentation shows that if you pin a Python version in a .python-version file, `rye sync` will not work and will complain, and you have to fetch the version manually with `rye fetch <version>`.

I am wondering why the developers chose not to offer to fetch the missing version during `rye sync`. I would expect anyone preparing the video tutorial to think "wait, this is stupid, let's fix this asap" instead of showing potential users how unfriendly the UX is.


I read this thread early this morning and have been trying Rye for the first time over the last several hours. I like it, and a few things have helped: 1) adding Makefile targets for auto-creating a requirements.txt file, so my two new projects play well with pip, etc., for people not wanting to use Rye; 2) ChatGPT saved me time setting up VSCode and Emacs for Rye+Python dev.
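For anyone curious, such a Makefile target can be as small as this (a sketch, assuming Rye's requirements.lock is pip-compatible apart from the editable self-install line; adjust the pattern to your lock file's contents):

```make
# Derive a plain requirements.txt from Rye's lock file for non-Rye users.
requirements.txt: requirements.lock
	sed '/^-e /d' requirements.lock > $@
```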

So far, I love Rye! So fast and no roadblocks yet.


> desire to establish a one-stop-shop for all Python users, Rye provides a unified experience to install and manages Python installations

This seems an impossible task on linux, where yum/apt packages are installed long before any higher-level python tools like poetry or rye. With the continual sprawl of "one tool to rule them all" in Py & Js, I'm thankful for Go being a much more consistent ecosystem.


Rye isn't aiming to replace apt. Rye is just aiming to do the same as Go: having one tool to manage dependencies. In the current ecosystem there are many, many different tools to manage a Python project's dependencies, unfortunately.


Rye is never going to replace apt. There are too many system dependencies and people do not want to deal with multiple package managers in most cases. I suspect even within Python, there will never be one tool. Go is materially different, because it is the language and tooling wrapped into one. Python would have to have the solid tooling built into the CLI, so that we don't feel the need for tools like Rye to begin with. Python is already too fractured to ever realize this, imho.

We have been moving away from python packagers towards linux packagers because it is trivial to mirror them internally. We do this because external mirrors have been unreliable and we have full control over when updates get rolled out without having to pin every version (the mirror effectively pins them)


I'm still using Poetry for dependency management, but that will most probably change someday (hopefully with automated conversion of the pyproject.toml files).

I'm OK with the pythons that are provided by the OS I'm using (including Deadsnakes), so this side of rye doesn't appeal to me. I've started using uv to create virtualenvs, though, so that's probably the gateway drug. We'll see.


Why Rust? Aren't you alienating Python devs from working on it?

I see that UV is bragging about being 10-100x faster than pip. In my experience the time spent in dependency resolution is dwarfed by the time making web requests and downloading packages.

Also, this isn't something that runs every time you run a Python script. It's run once during installation of a Python package.


I actually think that Python's tooling should not be written in Python, because if it is, you end up with at least two versions of Python: one to run the tooling, one to run the project.


I'm not sure of the answer, but one thing Rust has obviously bought them is native binaries for Mac/Windows/Linux. For a project that purports to be about simplicity, it's very important to have an onboarding process that doesn't replicate the problems of the Python ecosystem.


If you are building a production app that uses python in a containerized way, you may find yourself rebuilding the containers (and reinstalling packages) multiple times per day. For us, this was often the slowest part of rebuilds. UV has dramatically sped it up.


Uv has already proven itself by being faster at every step, it seems, except maybe downloading. Notably, that includes unpacking and/or copying files from the cache into the new virtualenv, which is very fast.


It parallelizes downloads and checking of the packages.

It also doesn't compile .py files to .pyc at install time by default, but that just defers the cost to first import.
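If you'd rather pay that cost up front, the stdlib can precompile a tree the way pip's install-time compilation does; a minimal self-contained sketch:

```python
import compileall
import pathlib
import tempfile

# Demo on a throwaway directory containing one module.
with tempfile.TemporaryDirectory() as d:
    pkg = pathlib.Path(d)
    (pkg / "mod.py").write_text("X = 1\n")
    # Compile every .py under the tree to __pycache__/*.pyc,
    # which is what pip does by default at install time.
    ok = compileall.compile_dir(d, quiet=1)
    pycs = list(pkg.rglob("*.pyc"))

print(ok, len(pycs))
```

Running `python -m compileall <site-packages>` after a uv install achieves the same thing.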


It runs every time you build a docker image or build something in your CI


So it takes 0.3 seconds to run instead of 3? Don't get me wrong, that's a huge improvement, but in my opinion not worth switching languages over.

Features should be developed and tested locally before any code is pushed to a CI system. Dependency resolution should happen once while the container is being built. Containers themselves shouldn't be installing anything on the fly it should be baked in exactly once per build.


Modern CI can also cache these dependency steps, through the BuildKit based tools (like Buildx/Dagger) and/or the CI itself (like GHA @cache)
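A sketch of that caching pattern in a Dockerfile (BuildKit syntax; base image and paths are examples, not from the thread):

```dockerfile
# Copy the lock file before the source, so the install layer stays
# cached until dependencies actually change; mount a persistent pip
# cache so even a bust of that layer reuses downloaded wheels.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN --mount=type=cache,target=/root/.cache/pip \
    pip install -r requirements.txt
COPY . .
```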


their uv tool is incredible -- it is substantially faster than vanilla pip even on a fast laptop, and they were smart to adopt pip-tools commands for pinning

will not go back to pipenv / poetry unless uv does something truly awful to me

I feel more meh about rye, except as a way to use uv for package management

and I don't understand why astral is a company


I have a project managed by Poetry - how easy would it be to switch that to use rye? What are the upsides/downsides compared to Poetry?


Do you have problems with Poetry? If not, don't switch


That's a sane comment here. Don't switch tools when a new one appears just because the new might be better. Switch tools when you actually have a problem with the current one and the new one solves that while not degrading the rest.


While we're on this topic:

1. Can anyone recommend a good file layout/organization for a polyglot monorepo that contains Python, C++, and JS?

2. Does the build system need to be use something like Buck/Bazel?

3. How do you usually open your project in an IDE? do you open the root of the monorepo, or do you open each subproject separately?


Ironic that this just went by as well https://dublog.net/blog/so-many-python-package-managers/


> and most notably Rust has Cargo – quite possibly the most widely loved package manager tool in existence.

I thought Rubygems was "the most widely loved package manager" given how often it's been used as inspiration for other languages' package management systems... but maybe I'm just getting old :)


Counterpoint: directly from the article, a relevant quote: "[rye] has matured to the point where it now does most if not all of what poetry does only faster"

So perhaps rye is all we need. I've been happy with pipenv before (letting me keep a list of dependencies and their versions in a file separate from the requirements.txt - so I can upgrade core dependencies rather than sub-dependencies). But I'll give rye a try next time I touch Python packages.


I kinda like that it downloads its own interpreters... could be useful when using pyinstaller to have a more portable interpreter.


conda works really well for local hacking. Whatever it does, it sidesteps almost all the other footguns I have encountered with local pythons.

on the other hand, I've found Go to be meaningfully superior for many things, and Rust to be meaningfully superior for many other things, and Scala excellent for the JVM and I simply can't actually recommend other languages for professional greenfield work outside of exceptionally niche cases.


How experimental is it? Should we use it for production for small app deployments?

We are open to the cutting edge, but it can't be too sharp or rough.


I read in another thread that people are doing dev in a container. It’s supposed to be awesome.

How does this compare do developing inside a container?


> The windows installer normally will automatically register the rye path in the PATH environment variable. If this does not work

Why would it not work?


Very interesting! I will definitely try this out.

Out of curiosity though: is there anything like this that exists, but less Python-specific?


Ruff, then uv, then Rye. All of them changed my everyday Python life, one after another.

Great thanks to all the team.


I thought they were "really going to get it right this time" N-1 python packaging tools ago.

(happy Nix user)


Why should I use this over Miniconda?


Hassle isn't really what I think of when I think of Python. C++, yes, but not Python.


Could you publish it on Homebrew to make the installation on Mac hassle-free too?



“There should be one, and preferably only one, obvious way to do it—xor it’s packaging”


Pretty cool. Would be nice to have Cython handling.


What does rye offer to a pdm-based project?

https://pdm-project.org/


Can Rye work with Conda projects?


How does it differ from Poetry


but why?


If you're interested in the historical rationale for Rye I suggest reading the original README from when it was first released: https://github.com/astral-sh/rye/blob/8e430b0519fe2a125837e1...

And Armin's later post about the vision for the project a year later: https://lucumr.pocoo.org/2024/2/4/rye-a-vision/


[flagged]


In case you're being serious, this very clearly wasn't a resume boosting project. Rye creator Armin has Flask, Jinja and Click on his resume (among many others) - it's stacked already.


The author's resume already stands above most. I think this is solving a real problem.

Even the PSF, which frustratingly does not take a stance on the matter, is somewhat considering a very similar idea in PyBI.


I suppose but they could've just made a whole API on a homelab that is a resume to test against and for luls.


Not to be confused with Rye, the programming language: https://ryelang.org


Confused with uv/rye


venv, pip, and requirements.txt are just fine.


Unless you want reproducible installations on multiple platforms.


cries in CI


No, they're not. Just teaching a beginner what the hell a venv is is a nightmare


I'm about 20 years into coding, and still don't "actually" know what a venv is haha


venv really doesn't seem meaningfully different than jenv or nvm to me.


Also bad. Modern languages have per-project dependencies.


I don't mind venv and pip — they may be inconvenient but at least they work.

The requirements.txt however is essentially broken for any project that changes in time. If you don't freeze transitive dependencies versions your builds break with no prior notice. If you do, transitive dependencies stay there forever even when not needed anymore.


pip-tools solves this problem.
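Roughly, the pip-tools flow (`pip-compile` and `pip-sync` are its two commands; the package here is an example):

```
# requirements.in: direct dependencies only, loosely pinned
flask>=2.0

# pip-compile requirements.in  ->  fully pinned requirements.txt,
#                                  transitive deps included and annotated
# pip-sync requirements.txt    ->  make the venv match the lock exactly
```

You edit and commit the .in file; the generated requirements.txt is the lock, so stale transitive pins get dropped on the next compile.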


I am fortunate enough to not know why you're wrong (having only ever used Python for my own personal projects); but I've seen enough people claim that (in a corporate setting) you are, to be confident that they're right.


Even with your own personal projects it can become a nightmare. If you like to code interstitially (in those quiet moments between your day job), hot desk, and have a multi-platform environment, it's a pain to redo that setup for each new place - I code on Windows (+WSL), macOS, and Linux sometimes. I use GitHub privately to sync everything. It's just me.

Rye is much quicker to get started on a machine: install rye+git etc., pull the project from git, `rye sync`. The sync command does all the lifting: installing the Python version I pinned, setting up a venv, and installing deps.


Obligatory https://xkcd.com/927 and https://xkcd.com/1987

Probably not actually a large issue here since it uses pyproject.toml which is what Poetry uses and seems to be the standard people are moving to.


> it uses pyproject.toml which is what Poetry uses and seems to be the standard people are moving to

Yes, PEP 517 is standard now. But that's not actually a build system, and depending on how you configure it, the actual contents can differ wildly. While build-system.build-backend is standard, how one actually configures that backend in `tool.<your-build-tool>` is not standard at all.

It's more proper to think of it as a standard meta-format for build system configuration.

There is effectively no real portability between build/project tools except in the most absolutely trivial cases.
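A concrete illustration (backend and layout picked arbitrarily): the [build-system] table is portable, while the [tool.*] table is meaningful only to that one backend:

```toml
# Standardized (PEP 517/518): which backend builds the project.
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

# Not standardized: this table only means something to hatchling.
[tool.hatch.version]
path = "src/mypkg/__init__.py"   # hypothetical layout
```

Swap the backend and the whole [tool.*] section has to be rewritten in that tool's dialect.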


#1987 is actually outdated and way too small. Nowadays it should include build/install/pip/flit/rye/setup/setuptools/setuptools-deprecated/poetry/[insert 50 more build tools].


Obligatory indeed. No need to water it down - both are still highly relevant for python



I wrote the `so-many-python-package-managers` article and posted it earlier today. I'd like to apologize for misrepresenting `rye` in the first draft. I did not realize it had made so much progress in the span of the past year, and it really does look like an excellent tool. Certainly, with the `uv` backend, it just about fixes my one major gripe with `poetry`.

I'll be giving it a go on my next project!

(Note: I updated my article just now to say more good things about `rye`)


Don't worry, I didn't actually read your article (I read the first few paragraphs), then I scrolled a bit, went "yeah, that's lots of package managers" and pasted it into my notes somewhere.


Yup that's why I wrote it. It's basically my notes dump.

Wait and see my next blog post. It's going to be about Genghis Khan and a paternity dispute.


This is a very nice article, thank you for the kind words.


I mention you specifically! Thank you for your oversized contributions to the Python programming landscape.

When I hear folks say you need an entire team to do X, Y, Z, I just point them to examples like Charlie Marsh!


Neat! A relevant quote: "[rye] has matured to the point where it now does most if not all of what poetry does only faster"


How are they doing this? A few months ago it was a replacement for pip; now they've basically rewritten most of Poetry? In a few months? Let alone Ruff being a rewrite of like fifty tools.

How is this even possible?


I think you're thinking of UV, this is a project that they adopted. As far as I understand UV is where development is happening and actually Rye is soft deprecated.



Ah yeah, Rye is Armin's thing. Still, this is a lot of development.


It's "just" a wrapper for the existing tools. If you're already using ASDF for your python versions and know how to use venv, there's not much gain here - saves a couple of simple commands only. It could replace your poetry (https://python-poetry.org/) if you're using it though.

Specifically, it doesn't do anything new/different by itself.


It's all-in-one. Also mise > asdf.


So much free advertising XKCD gets simply by being always relevant... :~P


requirements.txt

daily reminder: you can specify exact commits from git repos.
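e.g. a line like this (repo URL and commit are placeholders):

```
# requirements.txt
somelib @ git+https://github.com/example/somelib@1a2b3c4d
```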


You should use pyproject.toml to specify loose constraints instead.


guys just use matrioshka.py it is your one-stop package management system for python.

just prefix any other package manager with matrioshka

e.g. python -m matrioshka rye install pip install pipx setup.py install

A single distribution command for all of your projects.


Instead of creating solutions to problems we create…

Why don’t we just stop creating those problems =p


Why is the installation binary? Conda pulls the same shenanigans, which I hate. Why should I trust a self-extracting shell archive?

I only install Conda on untrusted machines.


$ nano thing.py

$ python3 thing.py

Don’t see the hassle there.


Keep reading the python tutorial you're working through and you'll eventually see they import 3rd party libraries. That's when things get difficult.


I keep my 3rd party libraries to a bare minimum as I actually know how to code.

The time it takes to write a utility function is less than the long term pain of dependency hell and all this tooling.

I suggest you go read through your CS 101 textbook.


Surely everyone working on codebases large enough to require package management has the flexibility to make unilateral architectural decisions and re-solve problems a battle-tested library already solved, having mitigated all of the counterintuitive edge cases and security holes you haven't discovered yet. Well, maybe not, but please, please don't let reality discourage you from vigorously patting yourself on the back for not being stymied by limitations that weren't imposed on you.


- Open piano

- Push keys in the correct order

If you forget what order to push the keys, it's literally written on the sheet in code easy enough for young children to decipher. I don't understand the kerfuffle.


Yes, you don’t need a concert hall, a grand piano, or a team of recording engineers to push a key. You are correct.


Right. Everything can be reduced to a simple problem if you ignore enough important details.


I wish people would just stop using Python for large projects. It's such a terrible language and ecosystem with a huge surface area of gotchas.


no we won't


Who needs it. I install everything globally and then when some dependency fucks it all up I just reinstall the OS. Been meaning to learn the venv thing but I've been doing this for years and there's no turning back now. My workflow works for me.


This would have been not totally insane pre-containers: using VirtualBox and just reimaging.



