There is no quick way to describe this problem, so stay with me! A similar question was already asked, but my use case is a bit different. The easiest way to explain it is by describing an actual use case:
I have a folder with some common utility modules that are used by my scripts:
commonPythonFiles/
    pathUtils.py
    procUtils.py
    specialModuleUtils.py
    groupedCommonPythonFiles/
        groupUtils1.py
        groupUtils2.py (*)
These modules may have cross-imports: procUtils.py uses functions from pathUtils.py, while groupUtils1.py uses both of them (procUtils.py and pathUtils.py).
There is a special module which is not available from the start - it is extracted/copied/generated/... at runtime of main.py by means of specialModuleUtils.py functions.
specialModuleFolder/  # not available from the start
    specialModule.py
On the other hand, groupUtils2.py (*) is a wrapper around such a specialModule.py.
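To make the setup concrete, a minimal sketch of specialModuleUtils.py could look like the following; the file-writing body and the sys.path handling are placeholders for illustration only, not my actual code:

# specialModuleUtils.py (illustrative sketch only; the real generation logic is more involved)
import os
import sys

SPECIAL_MODULE_FOLDER = "specialModuleFolder"  # assumed location of the generated module

def create(args):
    # extract/copy/generate specialModule.py based on 'args' (details assumed)
    os.makedirs(SPECIAL_MODULE_FOLDER, exist_ok=True)
    with open(os.path.join(SPECIAL_MODULE_FOLDER, "specialModule.py"), "w") as f:
        f.write("# generated at runtime\n")  # placeholder for the real extraction/generation
    # make the generated module importable for the rest of the run
    if SPECIAL_MODULE_FOLDER not in sys.path:
        sys.path.append(SPECIAL_MODULE_FOLDER)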
In some worker script (for example, main.py), these utility modules are needed, so they are usually imported at the beginning of the file.
Problem
# main.py
import sys

import pathUtils
import procUtils
import specialModuleUtils
import groupUtils1
import groupUtils2  # (!)

def main():
    # prepare special module
    args = groupUtils1.getSpecialModuleArguments(sys.argv)  # might be a lot more than one line to get arguments
    specialModuleUtils.create(args)

    # do some stuff with groupUtils2, which uses the created special module
    groupUtils2.doSomeStuffWithSpecialModule()
You might already suspect the problem I am facing: I am importing a module that is not yet available. main.py fails at import groupUtils2, since specialModule.py has not been created yet.
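The failure happens because groupUtils2.py imports the generated module at the top level; it presumably looks something like this (the internals here are assumed for illustration):

# groupUtils2.py (assumed structure of the wrapper)
import specialModule  # raises ModuleNotFoundError until specialModule.py has been generated

def doSomeStuffWithSpecialModule():
    # delegate to the generated module (hypothetical call)
    specialModule.doStuff()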
Possible solutions
- Set a rule: no imports of common modules should be placed in the file header.
# main.py
import sys

import pathUtils
import procUtils

def main():
    import groupUtils1
    import specialModuleUtils

    # prepare special module
    args = groupUtils1.getSpecialModuleArguments(sys.argv)  # might be a lot more than one line to get arguments
    specialModuleUtils.create(args)

    # do some stuff with groupUtils2, which uses the created special module
    import groupUtils2
    groupUtils2.doSomeStuffWithSpecialModule()
This clutters the functions, duplicates import statements, and complicates the use of the common utility modules.
- Make special module generation a prerequisite for the scripts - main.py should run in an already prepared environment, with specialModule.py available to import. This means that before any script is executed, some other script/job/process needs to run to prepare specialModule.py. That preparation script would itself be limited in its use of the common Python files, otherwise it might fail in the same way as main.py.
Since some logic is needed to extract this special module (args = groupUtils1.getSpecialModuleArguments(sys.argv)), simple virtual environments are not an option (or are they?).
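A minimal sketch of such a preparation script, assuming it only uses modules that do not (directly or indirectly) depend on specialModule.py, could look like this (the script name is made up):

# prepareSpecialModule.py (illustrative sketch; name and structure are assumptions)
import sys

import groupUtils1         # safe: depends only on pathUtils/procUtils
import specialModuleUtils  # safe: does not import specialModule itself

def prepare():
    args = groupUtils1.getSpecialModuleArguments(sys.argv)
    specialModuleUtils.create(args)  # generates specialModuleFolder/specialModule.py

if __name__ == "__main__":
    prepare()

main.py could then keep all of its imports, including import groupUtils2, at the top of the file.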
Q: What is the right approach to handling imports in a case like this, or, more generally, what is the best module hierarchy for such non-standard cases?