A Docker container starts in two seconds or so and gives me everything I need, so there's no need to dabble with a VM.
There is not much overhead in running a project in a container. The project has a setup script that turns a fresh Debian 10 into whatever environment it needs, and that's it. Run that setup script in your Dockerfile to build the image and you're all set. Want to run the project in a VM or on bare metal? Just install Debian 10, run the setup script, and you're all set.
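A minimal sketch of that pattern, assuming the project's provisioning script is called setup.sh (the script name, image tag, and project name are placeholders):

```sh
# Bare metal or VM: provision a fresh Debian 10 host by running the project's setup script.
sudo ./setup.sh

# Container: the Dockerfile does nothing more than run the same script on top of debian:10,
# roughly:
#   FROM debian:10
#   COPY setup.sh /tmp/setup.sh
#   RUN /tmp/setup.sh
docker build -t myproject .
docker run --rm -it myproject
```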
> Also at what point do people just realize that all of this overhead is a gigantic waste of time and just use a better language?
Probably some time shortly after your developer time costs less than your cloud compute time. Until you hit that point (if ever) there are few options as cost-effective as Python.
In any language, if you use non-vendored shared libs, you will hit this problem. It's certainly not specific to Python; in fact, this is exactly why package managers on *nix are necessary and not just a nice-to-have.
Yeah, to be honest, if you need containers to make a project reproducible, that's just a sign of failure. You're basically saying you need to encapsulate the entire system for your code to run correctly.
Loads of tools have external dependencies that are hard dependencies. And library search paths on my machine vary from those on prod... I could go on, but I'm not sure I understand why using a container to manage all that is a failure?
There are plenty of reasons why you might want to containerize a project. If you have a lot of system dependencies, for example, you might want to consider including a Dockerfile in your project to make it portable.
However, what makes Python a failure is that people feel they need this just to dependably run a Python program that only has pure-Python dependencies.
Compare this to a language like Rust, or the NPM ecosystem. In those cases, the tools have managed to dependably encapsulate projects such that you only need the package manager to make a project fully repeatable.
With either of those ecosystems, there's basically one system dependency, and you can find any repository online and dependably do `git clone ...` then `cargo build` etc. to make it work. With Python, you effectively have to reproduce the original developer's system, and that is a failure.
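For example, assuming only the toolchain itself (rustup/cargo or node/npm) is preinstalled, getting an arbitrary repo running looks roughly like this (the repo URLs are placeholders):

```sh
# Rust: cargo resolves, downloads, and builds everything declared in Cargo.toml/Cargo.lock.
git clone https://github.com/example/some-crate.git
cd some-crate
cargo build

# Node: npm installs everything declared in package.json/package-lock.json into ./node_modules.
git clone https://github.com/example/some-app.git
cd some-app
npm install
```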
Huh? Either something is really weird about your env or we have different ideas about what counts as a pure Python package.
Because if you don't rely on Python packages with extensions that farm out to external libs, it's as easy as `git clone`, `pyenv virtualenv`, `pip install -r`, and `python -m build`.
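Spelled out, that workflow looks something like this; the repo URL, project name, and Python version are placeholders, and `pyenv virtualenv` assumes the pyenv-virtualenv plugin (plain `python -m venv` works just as well):

```sh
git clone https://github.com/example/pure-python-project.git
cd pure-python-project

# Create and select an isolated environment (assumes `pyenv install 3.11` has been run).
pyenv virtualenv 3.11 pure-python-project
pyenv local pure-python-project

# Install the declared dependencies, then build the package.
pip install -r requirements.txt
pip install build
python -m build
```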
The thing that makes this worse than other ecosystems is:
1. virtualenv shouldn't be necessary. It's more or less the same concept as containerization, and it's only needed because Python has a fractured ecosystem where setting up your environment for one project can break another.
2. You also have to know which environment-encapsulation and package-management solution the library author is using - this is not standardized.
1. Virtualenv is essentially the same as node_modules, yet everyone rants and raves and loves that. And the kind of breakage you're talking about is astronomically rare in my experience.
2. No you don't - what makes you say that?
virtualenv is so much less user-friendly than npm. Like, why do I have to run a `source` command to make virtualenv work? I don't use either often, but I can remember how to use npm if I haven't used it for six months, whereas I have to look up the right virtualenv commands if I haven't used it for two weeks.
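Concretely, the comparison is roughly this (using the stdlib `venv` module; `.venv` and `requirements.txt` are just conventional names):

```sh
# npm: one command, dependencies land in ./node_modules, no activation step.
npm install

# Python: create the environment, activate it in the current shell, then install.
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

The `source` step is the part that has no counterpart in the npm workflow.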
Also at what point do people just realize that all of this overhead is a gigantic waste of time and just use a better language?