I work at a small enterprise where the number of end users and contributors for a particular set of packages will never exceed 5 (this is guaranteed). For the past 3 months, we've been doing all our research and development on a shared Windows drive using Anaconda and local git repos for each project. As you can imagine, the resulting source code has become a bit of a mess, with each person's code pointing at a specific package in someone else's folder and dependencies breaking all the time.
We've finally gotten a GitHub repository and much of our code has become production stable. I want to distribute this code as a tarball and have each person build from source with distutils in a shared Anaconda environment. The full package would be organized by subpackage and would contain both Cython and Python files. Ultimately, it would look something like this:
|tools
   |setup.py (created with distutils)
   |__init__.py
   |package A (dir)
      |__init__.py
      |module A1.py
      |module A2.pyx
      |....
   |package B (dir)
      |__init__.py
      |module B1.py
      |module B2.pyx
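For the setup.py itself, this is a rough sketch of what I have in mind (the names are just the placeholders from the tree above; I said distutils, but the Cython examples I've found use setuptools with Cython.Build, so I'm assuming that here):

```python
# Rough sketch only -- "tools", "package A", etc. are placeholders mirroring
# the tree above. Assumes setuptools and Cython are already installed in the
# shared Anaconda environment.
from setuptools import setup, find_packages
from Cython.Build import cythonize

setup(
    name="tools",
    version="0.1.0",
    packages=find_packages(),            # finds every subpackage with an __init__.py
    ext_modules=cythonize("**/*.pyx"),   # compiles all .pyx modules in the tree
    zip_safe=False,
)
```

Each person would then rerun this from their local checkout (e.g. `pip install .` or `python setup.py build_ext --inplace`) whenever the compiled Cython extensions need to be refreshed.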
I am going to put the "tools" package into the GitHub repo. Every time an important change happens to anything in the tools package, everyone pulls from the master branch and reruns setup.py to keep their local tools package up to date. In addition, we would all be using a shared Anaconda environment, so that dependencies on external packages would not break the tools package install.
Is this the correct way to distribute production code among a small number of end users? I'm coming from a research background, not a software development one, and I genuinely don't know how software distribution works. Is forcing everyone to rebuild from source every single time we update the "tools" package overkill? Traditionally, when I install with conda/pip, I can just say something like "pip install mypackage --upgrade". Is there some similar procedure we can use here?