tl;dr How do I import a Python module from an embedded Python script, so that the imported module can use the global variables and functions provided by the host system?
I am writing a Python script for a system (presumably written in C++). I write the script, put it into a special predefined folder, and the system executes it on certain events.
As the script grows large and unwieldy, I want to split it into several modules, say `module1.py` and `module2.py`, imported by `main_script.py`, which is loaded and executed by the host system. However, the imported modules cannot use the global stuff that `main_script.py` can (I assume the host system adds some global variables, functions, classes etc. while loading `main_script.py`; the modules, however, aren't loaded by the host system directly, so they end up missing all those globals).
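For context, here is a guess at what such a host typically does internally (all names here are invented for the demo): it executes `main_script.py` with extra names pre-seeded into the script's global namespace, which is why those names never reach separately imported modules:

```python
# Hypothetical names the host might inject; the real host's API is unknown
host_globals = {
    "host_version": "1.0",
    "log": print,
}

# The host executes the script text with those globals pre-seeded
main_script_source = """
log("host version is " + host_version)  # works: injected by the host
result = host_version
"""
exec(main_script_source, host_globals)

# A module imported from inside that script gets its own fresh namespace,
# without host_version or log -- which is exactly the problem above.
```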
So far I've come up with the following: it looks for globals that are present in `main_script.py` but not in the modules, and adds them to the modules:
```python
# main_script.py
import module1, module2

for m in [module1, module2]:
    # list() so we don't mutate globals() while iterating over it
    for k, v in list(globals().items()):
        if not hasattr(m, k):
            setattr(m, k, v)
```
It works (at least for my case so far), but it doesn't look particularly elegant: I have to list the imported modules twice; if I want to import sub-modules from within the modules, I would have to repeat the same dance there; I have to watch out for possible global name clashes; and so on. As the problem doesn't sound too uncommon, I feel like I may be reinventing the square wheel here. Is there a better way to do this?
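For what it's worth, the mechanics can be demonstrated without the host system by faking the imported module with `types.ModuleType` (all names below are made up for the demo):

```python
from types import ModuleType

# Stand-ins for globals that the host would have injected
injected_value = 42
def injected_function():
    return "hello from host"

# A throwaway stand-in for an imported module1.py
module1 = ModuleType("module1")

# The same injection loop as above (list() avoids mutating
# globals() while iterating over it)
for k, v in list(globals().items()):
    if not hasattr(module1, k):
        setattr(module1, k, v)

print(module1.injected_value)       # 42
print(module1.injected_function())  # hello from host
```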
UPD.: Based on [my interpretation of?] the suggestions from @Merlin Katz's answer and a comment from @Sraw, I've modified my scripts as follows. First, I added an empty script `core.py`. Then I modified `main_script.py`:
```python
# main_script.py
import core

# only inject into the empty 'core' module
# (list() so we don't mutate globals() while iterating over it)
for k, v in list(globals().items()):
    if not hasattr(core, k):
        setattr(core, k, v)

# can now import modules that depend on those globals
import module1, module2
```
Then, every module that has to use the injected globals should import `core` and use them from there:
```python
# module1.py
import core

_blah = core.blahblah  # a shortcut

core.call_global_function()
my_obj1 = core.blahblah.SomeClassDefinedInBlahblah()
my_obj2 = _blah.SomeClassDefinedInBlahblah()  # a bit shorter version of the above
# etc.
```
This looks somewhat cleaner, and there's no risk of overwriting some existing global variables. Modules imported by `module1` and `module2` can also simply `import core` and use the global variables.
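The whole flow can be simulated in one self-contained script (the host's injected globals, the `core.py` file, and `module1.py` are all faked here with invented names; in the real setup they are separate files):

```python
import sys
from types import ModuleType

# --- hypothetical host-injected globals ---
class SomeClassDefinedInBlahblah:
    pass

blahblah = ModuleType("blahblah")
blahblah.SomeClassDefinedInBlahblah = SomeClassDefinedInBlahblah

def call_global_function():
    return "called"

# --- what main_script.py does ---
core = ModuleType("core")   # stands in for the empty core.py file
sys.modules["core"] = core
for k, v in list(globals().items()):
    if not hasattr(core, k):
        setattr(core, k, v)

# --- what module1.py does ---
module1_source = """
import core
_blah = core.blahblah
result = core.call_global_function()
obj = _blah.SomeClassDefinedInBlahblah()
"""
module1 = ModuleType("module1")
exec(module1_source, module1.__dict__)

print(module1.result)  # called
```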
UPD.: Furthermore (I'm not sure if it's worth it), instead of keeping an empty `core.py` module on disk, you can create it dynamically:
```python
# main_script.py
import sys
from types import ModuleType

core = ModuleType("core")
sys.modules["core"] = core

# inject into our dynamically created 'core' module
# (list() so we don't mutate globals() while iterating over it)
for k, v in list(globals().items()):
    if not hasattr(core, k):
        setattr(core, k, v)

# the modules can still `import core` the same way as before
import module1, module2
```
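This works because the import system consults the `sys.modules` cache before looking for files on disk; once the module object is registered there, any later `import core`, anywhere in the process, returns that same object:

```python
import sys
from types import ModuleType

core = ModuleType("core")
sys.modules["core"] = core
core.answer = 42  # a hypothetical injected global, for the demo

# Any subsequent import resolves to the cached module object
import core as core_again

print(core_again is core)  # True
print(core_again.answer)   # 42
```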