
We have a Python application that runs on a single server (the master) and is copied to multiple other servers (the nodes) via rsync, in an all-Ubuntu environment. Currently, whenever a code change pulls in a new library, we have to log in to each node via SSH and run `python3.7 -m pip install` for whatever is new. What is the pythonic way to handle this, so that a Python application, including any new libraries, can be cloned efficiently?
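
For concreteness, here is a minimal sketch of the kind of automation we are after, assuming the dependencies are pinned in a requirements.txt (e.g. from `python3.7 -m pip freeze`) and using placeholder hostnames and paths:

```python
#!/usr/bin/env python3
"""Push the app tree to each node and install its pinned dependencies.

Hostnames, paths, and the requirements file are placeholders.
"""
import subprocess

NODES = ["node1.example.com", "node2.example.com"]  # placeholder hostnames
APP_DIR = "/opt/myapp/"                             # placeholder path

for node in NODES:
    # rsync only transfers files that changed since the last push.
    subprocess.run(
        ["rsync", "-az", "--delete", APP_DIR, f"{node}:{APP_DIR}"],
        check=True,
    )
    # Install anything newly added to requirements.txt on the node.
    subprocess.run(
        ["ssh", node, "python3.7", "-m", "pip", "install",
         "-r", f"{APP_DIR}requirements.txt"],
        check=True,
    )
```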

krypterro
  • Seems like the answer is in your tags. Why not just use venv/pyenv to capture the dependencies and copy over the virtual environment directory instead of just the script by itself? Using something like rsync would only copy the things that have changed, so it'd be relatively efficient (see the sketch after these comments). Or go whole hog and use a container (like Docker) to manage it. – estabroo Apr 22 '19 at 16:22
  • I will look into pyenv further, but on my initial review it seemed to be a means of selecting a Python version, which isn't exactly what I thought I needed. I guess what I am really asking is: of all the Python virtual environment options, which is the most modern, best supported, and most likely to provide the solution I need? – krypterro Apr 23 '19 at 00:27
  • [pex](https://pex.readthedocs.io/en/stable/) might make the whole push-a-virtualenv-out approach easier. Personally I just use Docker containers to manage things like this; all dependencies are self-contained and you can run them just about anywhere. – estabroo Apr 23 '19 at 01:40
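
A minimal sketch of the venv route estabroo describes, using the stdlib venv module. The paths are placeholders, and it assumes every node keeps the environment at the same absolute path, since venvs hard-code absolute paths into their scripts:

```python
"""Build a virtual environment that can be rsynced to the nodes as-is.

Paths are placeholders; the environment must live at the same absolute
path on every node, because venv scripts embed that path.
"""
import subprocess
import venv

ENV_DIR = "/opt/myapp/venv"  # placeholder; identical on master and nodes

# Create the environment with its own pip.
venv.create(ENV_DIR, with_pip=True)

# Install the pinned dependencies into the environment.
subprocess.run(
    [f"{ENV_DIR}/bin/pip", "install", "-r", "/opt/myapp/requirements.txt"],
    check=True,
)
```

After this runs on the master, rsyncing /opt/myapp/ (including venv/) to the nodes carries the libraries along with the code, so no per-node pip install is needed.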

0 Answers