
Does this mean it would be easy to move off Python altogether? It seems like the problem stems from everyone using PyTorch at the base layer. How realistic is it to recreate those APIs in another, more modern language? Coding in Rust, Go, etc. and then distributing a single binary instead of pip hell seems like it would be worth it.



Check https://pytorch.org/tutorials/advanced/cpp_frontend.html

You can easily build a standalone binary (well, it would be a GiB+ if you use CUDA... but that's the cost of statically linking the cu* libraries) if you write your model and training loop in C++.

It then happily runs everywhere as long as an NVIDIA GPU driver is available (no need to install CUDA separately).
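For a rough idea of what that looks like with the libtorch C++ frontend from the tutorial above, here's a minimal sketch. The toy MLP, layer sizes, batch size, learning rate, and synthetic data are all made up for illustration, not taken from the tutorial:

    // Minimal libtorch C++ frontend sketch: toy MLP + training loop.
    // Model shape and hyperparameters are illustrative assumptions only.
    #include <torch/torch.h>

    struct Net : torch::nn::Module {
      Net() {
        fc1 = register_module("fc1", torch::nn::Linear(784, 128));
        fc2 = register_module("fc2", torch::nn::Linear(128, 10));
      }
      torch::Tensor forward(torch::Tensor x) {
        return fc2->forward(torch::relu(fc1->forward(x)));
      }
      torch::nn::Linear fc1{nullptr}, fc2{nullptr};
    };

    int main() {
      torch::Device device(torch::cuda::is_available() ? torch::kCUDA : torch::kCPU);
      Net net;
      net.to(device);
      torch::optim::SGD opt(net.parameters(), /*lr=*/0.01);

      for (int step = 0; step < 100; ++step) {
        // Synthetic batch; a real program would use a torch::data::DataLoader.
        auto x = torch::randn({64, 784}, device);
        auto y = torch::randint(0, 10, {64},
                                torch::TensorOptions().dtype(torch::kLong).device(device));
        opt.zero_grad();
        auto loss = torch::nn::functional::cross_entropy(net.forward(x), y);
        loss.backward();
        opt.step();
      }
    }

Build it with CMake's find_package(Torch REQUIRED) as in the tutorial, and the result is a single binary plus whatever CUDA bits you link in (hence the GiB+ caveat above).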

Protip: your AI research team REALLY DOESN'T WANT TO DO THIS, BECAUSE THEY LOVE PYTHON. Having Python, even with the dependency management shit, is a feature, not a bug.

(If you want Rust/Go and don't want to wrap libtorch/TF, then you have a lot of work ahead of you, but yeah, it's possible. There are also the model-compiler folks [1], where the promise is model.py in, model.o out, and you just link it with your code.)

[1] https://mlc.ai


Go would be interesting because you could ship a single executable.

I'd love for JS/TS to dominate as well. Use 'bun bun' to produce an executable if need be, but also use it in web backends.





