Is pip capable of recursively searching through a file system path and installing a package along with its dependencies? For example, given the following file structure (files not shown):

C:\packages\
C:\packages\packageA\
C:\packages\packageA\1.0.0
C:\packages\packageA\1.0.1
C:\packages\packageB\
C:\packages\packageB\2.2.1
C:\packages\packageB\2.2.4

the command pip install packageA -f C:\packages\ does not work.
Additionally, can these packages be pure source with a setup.py file, or do they need to be binaries like wheel or zip files? And finally, is there a way to resolve dependencies? For example, if packageA requires a particular version of packageB, can pip get that version of packageB from my folders? Do I need HTML index files describing what is available and where?
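To make the dependency part concrete, here is a rough sketch of the kind of setup.py I have in mind for packageA; the metadata and the ==2.2.4 pin are just illustrative:

# setup.py for packageA (illustrative) -- declares a pinned dependency on packageB
from setuptools import setup, find_packages

setup(
    name="packageA",
    version="1.0.1",
    packages=find_packages(),
    # packageA needs a specific version of packageB; ideally pip would
    # satisfy this requirement from C:\packages\ rather than from PyPI
    install_requires=["packageB==2.2.4"],
)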
I am aware I can point pip directly to a local path (pip install C:\packages\packageA\1.0.0), but I want this to function as if the packages were available on PyPI. For example, if a user types pip install packageB, or runs pip install -r requirements.txt where the requirements file lists packages that exist locally but not on PyPI, it should just work. (I could set the local package storage path in a configuration file so the pip command doesn't need the -f argument.)
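To illustrate the configuration-file idea, this is roughly what I picture; a sketch assuming pip reads these options from its per-user config file (on Windows typically %APPDATA%\pip\pip.ini):

; pip.ini -- point pip at the local package folder instead of PyPI
[global]
; never contact PyPI
no-index = true
; search this local directory for distributions
find-links = C:\packages\

The intent is that pip install packageB or pip install -r requirements.txt would then look only in C:\packages\ for distributions.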
Basically, I want to replicate PyPI functionality using a file system without a webserver (security won't let us run one). Any insight would be greatly appreciated.