I actually like Python's package management (with just pip and venv). It's different from all the other modern solutions, taking a per-shell rather than per-project approach, but that doesn't make it worse.
The advantage is that it's less "magic" than, say, npm. No need for rules about how to find node_modules; you just have a PYTHONPATH env variable.
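For concreteness (the path below is made up), PYTHONPATH entries just end up on sys.path, and that list is the whole lookup rule:

```python
# Run as:  PYTHONPATH=/home/me/libs python show_path.py
import sys

# sys.path starts with the script's own directory, then any PYTHONPATH
# entries, then the stdlib and site-packages of whichever interpreter
# is running (the venv's, if one is activated).
for entry in sys.path:
    print(entry)
```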
The Rust and Java approach is to do everything through a build tool's CLI, and I can't complain about that. It's probably the best compromise: Less magic than npm, and more user-friendly than Python.
I was talking more about the package lookup when running code, via e.g. `node script.js`. I think it looks in all parent directories of the CWD, or maybe of the script? It's not too complicated, but it is "more magic" IMO.
Building Python packages, on the other hand, is pretty complex, but that's the case for JS too. Java avoids this by distributing compiled libraries.
It looks up from script.js's directory, walking up to the nearest parent that has a node_modules. Not the CWD. A lot like how `import somemodule` in script.py tries to import a somemodule.py file from the directory script.py is in.
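A rough sketch of that walk (not Node's actual resolver, which also handles core modules, file extensions, package.json "exports", etc.; the path and package name are made up):

```python
from pathlib import Path
from typing import Optional

def resolve_node_style(script_dir: Path, package: str) -> Optional[Path]:
    """Check node_modules in the script's directory, then in each parent."""
    for directory in (script_dir, *script_dir.parents):
        candidate = directory / "node_modules" / package
        if candidate.is_dir():
            return candidate
    return None

# The lookup starts from the importing file's directory, not the CWD:
print(resolve_node_style(Path("/home/me/project/src"), "lodash"))
```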
Traversing to the parent is especially nice for scripts. In Python, having scripts outside the package directory is quite painful.
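For example (the layout and package name here are hypothetical), a script living next to the package rather than inside it usually ends up doing path surgery like this, or the package has to be installed in editable mode first:

```python
# scripts/run.py -- the package lives in ../src, so when this script is
# run directly, a plain `import mypkg` fails.
import sys
from pathlib import Path

# Put ../src on sys.path by hand; Node's parent-directory walk makes
# this kind of workaround unnecessary.
sys.path.insert(0, str(Path(__file__).resolve().parent.parent / "src"))

import mypkg  # noqa: E402  -- import after the sys.path tweak
```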
Gyp is a lot easier than Python's setup.py. The "easy" Java packages are comparable to pure-Python/pure-JS packages; with C bindings, though, JNI makes Java packaging a horrid pain.
NPM’s original sin was making package installation and general management so painless that folks installed micro packages for everything. Can’t really fault the software for that.
The issue was, as you say, the introduction of ESM. It used to be that you required modules one way and one way only (yes, there was AMD for advanced use cases, but it was an add-on); then people felt the need to “standardize” that, and now we have this mess of ESM and CJS.
Review and pin your direct dependencies. As for transitive dependencies, trusting them is no different from trusting large dependencies in general.
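For illustration (the package names, versions, and file name are made up), a pin check can be as small as this Python sketch; real projects would keep the pins in a requirements/lock file and let the installer enforce them:

```python
# check_pins.py -- toy check that installed versions of direct
# dependencies match a pin list.
from importlib.metadata import PackageNotFoundError, version

PINS = {"requests": "2.31.0", "urllib3": "2.2.1"}  # hypothetical pins

for name, pinned in PINS.items():
    try:
        installed = version(name)
    except PackageNotFoundError:
        print(f"{name}: not installed")
        continue
    if installed == pinned:
        print(f"{name}=={pinned}: ok")
    else:
        print(f"{name}: pinned {pinned}, installed {installed}")
```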
The alternative to micropackages has significant downsides: pulling in extra surface area, and rolling your own buggy implementations while waiting for some committee to bikeshed for years on a standard one.
Making the right thing easy, rather than the wrong thing hard, is a much better approach.
If you don’t vendor your dependencies, that is. Not vendoring is a poor practice commonly associated with NPM, but it is by no means a requirement of the technology.
Especially if…, I should say. Even vendored dependencies are a risk, as NPM commits the additional sin of letting the mere act of pulling a package onto the local machine for inspection execute arbitrary code, in the form of “postinstall” hooks.
The Python community needs to solve this ASAP. This almost weekly pain point has turned a language I used to love in college into one of my most despised languages. The fact that the ML community uses this broken platform is infuriating.
Make a Python version 4 that focuses only on fixing the packaging. 100% per-project hermeticity. No global packages whatsoever. Solve just this one issue and bring the entire ecosystem on board. Kill all the various virtualenvs, the anacondas, the global packages. All of it.
Learn from Rust/Cargo. That project does it mostly right (aside from the lack of namespaces and of reproducible builds).
It took 10 years to force a breaking migration from Python 2 to Python 3 in order to handle Unicode...
I have resigned myself to repeatedly smashing my keyboard on the desk until it rains keys, in blind-rage-induced frustration, to let off steam, and then calmly creating an issue in the appropriate repo, instead of allowing even a modicum of hope that this will ever be fixed systemically.