I'm currently working on an abstract set of modules for Python 2.7 that I will ship as a Python package:
myabstractpkg
  - abstract
    - core
      - logging
      - ...
    - nodes
    - ...
The modules will then be implemented in a totally different set of packages:
myimppkg
  - implementation
    - core
      - logging
      - ...
    - nodes
    - ...
At runtime, however, I want the tools that use the implemented modules to always do imports like this:
from myabstractpkg.api import nodes
from myabstractpkg.api.core.logging import Logger
This way the developer always imports from the "virtual" api module, which then decides where to actually point the importer.
I know I might be able to hack it together by modifying the sys.modules dict:
import sys
from myimppkg import implementation

sys.modules["myabstractpkg.api"] = implementation
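If I went down that road, I suppose I would also have to alias the individual submodules so that the dotted imports shown above actually resolve. A rough sketch (the submodule names are just the ones from the layout above):

import sys

from myimppkg import implementation
from myimppkg.implementation import core, nodes
from myimppkg.implementation.core import logging as imp_logging

# Alias the implementation package and its submodules under the abstract
# API name, so that e.g. "from myabstractpkg.api.core.logging import Logger"
# picks up the implementation module.
sys.modules["myabstractpkg.api"] = implementation
sys.modules["myabstractpkg.api.core"] = core
sys.modules["myabstractpkg.api.core.logging"] = imp_logging
sys.modules["myabstractpkg.api.nodes"] = nodes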
Alternatively, I could do a clever import of everything in the __init__.py of myabstractpkg.api, but both approaches feel a bit fragile to me.
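For completeness, the __init__.py variant I have in mind would look roughly like this (again just a sketch, assuming myimppkg is importable when myabstractpkg.api is first loaded):

# myabstractpkg/api/__init__.py
import sys

from myimppkg.implementation import core, nodes

# Also register the implementation subpackages under the api namespace, so
# that dotted imports like "myabstractpkg.api.core.logging" can still resolve.
sys.modules[__name__ + ".core"] = core
sys.modules[__name__ + ".nodes"] = nodes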
I wonder if you guys have some input on what the best approach would be. I might be on a really ugly track with this whole module-remapping idea, so if you have any smarter, more Pythonic solutions for my API abstraction/implementation/usage approach, I would love to hear them.