I'm faced with a Docker build process that's pretty slow, largely because we build and install the same Python packages over and over. I'd very much like to speed it up.
I've downloaded the packages from PyPI so I can get a good look at them. I've also put them in a local pypiserver (two instances, actually) and confirmed I can install them from there.
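For reference, this is roughly how I'm exercising the local index; the URL, port, and package name here are placeholders, not my real setup:

```python
# Minimal sketch of the install check, assuming pypiserver is listening
# on localhost:8080. The index URL and package name are hypothetical.
import subprocess
import sys

LOCAL_INDEX = "http://localhost:8080/simple/"  # hypothetical pypiserver URL

def install_from_local_index(package: str) -> None:
    """Install `package` from the local index only, never from pypi.org."""
    subprocess.run(
        [sys.executable, "-m", "pip", "install",
         "--index-url", LOCAL_INDEX, package],
        check=True,  # raise if pip fails, so a broken index is obvious
    )

install_from_local_index("pycapnp")
```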
A count of the packages by extension:

```
 87 .whl
 23 .tar.gz
  2 .zip
```
I'm thinking some of those .tar.gz's (and .zip's? and source-only .whl's?) would install much faster if I converted them to manylinux wheels and put them in a local pypiserver instance under the same version number. In fact, one package tends to fail to compile at random, so the build process should also get a little more reliable if this works out.
Is there a (relatively?) straightforward process for doing such a thing? That is, taking a .tar.gz from PyPI (not an arbitrary .tar.gz, only a handful from PyPI) and converting it to a binary manylinux .whl?
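To make the question concrete, here's the two-step shape I think I'm asking about, as a sketch meant to run inside one of the official manylinux build images (e.g. quay.io/pypa/manylinux2014_x86_64, which I believe ships auditwheel); the version pin and directories are made up:

```python
# Sketch: compile an sdist to a wheel, then retag it as manylinux.
import glob
import subprocess

def sdist_to_manylinux(requirement: str, out_dir: str = "/wheels") -> None:
    # Step 1: force pip to compile the sdist into a platform wheel
    # (tagged linux_x86_64, which pip elsewhere won't accept by default).
    subprocess.run(
        ["pip", "wheel", "--no-deps", "--no-binary", ":all:",
         "--wheel-dir", "/tmp/raw-wheels", requirement],
        check=True,
    )
    # Step 2: auditwheel copies in external shared libraries and retags
    # the wheel as manylinux, making it installable on ordinary distros.
    for wheel in glob.glob("/tmp/raw-wheels/*.whl"):
        subprocess.run(
            ["auditwheel", "repair", "--wheel-dir", out_dir, wheel],
            check=True,
        )

sdist_to_manylinux("pycapnp==0.6.4")  # hypothetical pin
```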
For example, probably the most time-consuming package in our Docker build is pycapnp (https://pypi.org/project/pycapnp/). It takes about 80 seconds to build and install on my Linux Mint 19.1 laptop, and it ships only as a .tar.gz.
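If it matters, here's the end-to-end shape I'd try for pycapnp specifically: build and repair once in a throwaway manylinux container, then push the result to the local pypiserver with twine. The image tag, interpreter path, and URL are all guesses on my part:

```python
# Build + repair pycapnp once in a disposable manylinux container, then
# upload the result so later Docker builds just download the wheel.
import glob
import subprocess

# The manylinux images expose their bundled interpreters under /opt/python/*.
subprocess.run(
    ["docker", "run", "--rm", "-v", "/tmp/wheels:/wheels",
     "quay.io/pypa/manylinux2014_x86_64", "bash", "-c",
     "/opt/python/cp37-cp37m/bin/pip wheel --no-deps pycapnp -w /tmp/raw"
     " && auditwheel repair -w /wheels /tmp/raw/*.whl"],
    check=True,
)

# Push the repaired wheel(s) to the local pypiserver (hypothetical URL).
for wheel in glob.glob("/tmp/wheels/*manylinux*.whl"):
    subprocess.run(
        ["twine", "upload", "--repository-url", "http://localhost:8080/",
         wheel],
        check=True,
    )
```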
Thanks!