
Here is the process I use for smallish services -

1. Create a Python package using setup.py
2. Upload the resulting .tar.gz file to a central location
3. Download it to the prod nodes and run pip3 install <packagename>.tar.gz
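Concretely, the steps look roughly like this (the package name, version, and artifact URL below are just placeholders, and the upload assumes the central location accepts HTTP PUT):

    # build the source distribution (assumes a setup.py in the project root)
    python3 setup.py sdist            # produces dist/myservice-1.2.0.tar.gz

    # upload to a central location (placeholder URL)
    curl -T dist/myservice-1.2.0.tar.gz https://artifacts.internal/myservice/

    # on each prod node: download and install
    curl -O https://artifacts.internal/myservice/myservice-1.2.0.tar.gz
    pip3 install myservice-1.2.0.tar.gz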

Rolling back is pretty simple - pip3 uninstall the current version and re-install the old version.
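For example, rolling back from 1.2.0 to 1.1.0 (version numbers are illustrative) is just:

    # remove the current version, then reinstall the previous tarball
    pip3 uninstall -y myservice
    pip3 install myservice-1.1.0.tar.gz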

Any gotchas with this process?




If you are using it for small services, it's probably fine. But the original article did say that pip uninstall sometimes doesn't work correctly, and apt is a more formal package manager than pip.

So at some point, as you know, you'll need to move on.


You have to do this every time there's a change in the codebase, which is not easy. How do you fit this into CI without the git and pip issues discussed in the post?


I have to do this every time I deploy, which is similar to Nylas having to create a deb package every time they deploy.

There are no git dependencies in the process I describe above.

The pip drawback discussed in the post is PyPI going down. In the process described above there is no PyPI dependency. Storing the .tar.gz package in a central location is similar to Nylas storing their deb packages on S3.
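If the central location happens to be something like S3, the upload/download step might look like this (the bucket name is a placeholder, and this assumes the AWS CLI is installed and credentials are configured):

    # upload from the build machine
    aws s3 cp dist/myservice-1.2.0.tar.gz s3://my-artifacts-bucket/myservice/

    # download on a prod node, then install
    aws s3 cp s3://my-artifacts-bucket/myservice/myservice-1.2.0.tar.gz .
    pip3 install myservice-1.2.0.tar.gz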


Are you using a venv?


Nope.


If you did, it would probably strengthen the isolation of your modules against conflicts or, say, uninstallation errors. Whether that's needed is up to you.
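A rough sketch of the same install inside a venv (the path and package name are placeholders):

    # create a dedicated environment for the service
    python3 -m venv /opt/myservice/venv

    # install the tarball into that environment only,
    # leaving the system site-packages untouched
    /opt/myservice/venv/bin/pip install myservice-1.2.0.tar.gz

    # rolling back works the same way, scoped to this venv
    /opt/myservice/venv/bin/pip uninstall -y myservice
    /opt/myservice/venv/bin/pip install myservice-1.1.0.tar.gz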



