I've read the documentation about __init__.py files and some nice questions here on SO, but I'm still confused about their proper usage.
Context
I have a codebase with many packages and sub-packages. I've defined many classes, and for some of them I need to create one (and only one) instance for the whole user session. These instances are then used in different parts of the code, so that any time I (or the user) update data/information in one of these objects, the change is picked up across all the code without having to modify anything else. To be clearer, let me show you a basic scheme of what I'm talking about.
The code has an over-simplified structure like:
    root/
        __init__.py
        tools/
            __init__.py
            ... (some modules)
            aliases.py (which defines the class Aliases)
        physics/
            __init__.py
            ... (some modules)
            constants/
                ... (some modules)
                constants.py (which defines the class Ctes)
                units.py (which defines the class Units)
In the code, I need to manage aliases, units and constants. The way I found to deal with that is to create one instance of each and use it across all the code. With this method I'm sure, for example, that if an alias is added while the code is running, it can be used anywhere in the code, because there is only one shared instance of Aliases. This is what I need (the same applies to units and constants, by the way).
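To make the idea concrete, here is a minimal sketch of what such a manager looks like; the method names (add, resolve) are only illustrative, not my real API:

    class Aliases(object):
        """Keeps a single, shared mapping of aliases for the whole session."""

        def __init__(self):
            self._table = {}

        def add(self, alias, target):
            # an alias registered here should be visible everywhere else
            self._table[alias] = target

        def resolve(self, alias):
            # fall back to the name itself if no alias was registered
            return self._table.get(alias, alias)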
Current status
For now, the way I'm doing this is, I think, not the best. Indeed, I'm creating the instance of, let's say, Aliases directly after declaring the class, in the same file:
In root/tools/aliases.py:

    import ...  # (stuff)

    class Aliases(object):
        """The Aliases manager"""
        ...

    ALIASES = Aliases()
And then, in any file where I need to use Aliases, I do:
In any_file.py (anywhere in the code):

    from root.tools.aliases import ALIASES

    ALIASES.method1()  # possibly in functions or other classes
    ALIASES.method2()  # ... etc.
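As far as I understand, this already behaves like a single shared instance, because Python caches the module in sys.modules, so every import returns the same ALIASES object. Here is the quick sanity check I used to convince myself (the add/resolve methods are the illustrative ones from the sketch above):

    from root.tools import aliases
    from root.tools.aliases import ALIASES

    # both names refer to the very same object: the module body only runs once
    assert ALIASES is aliases.ALIASES

    ALIASES.add("vel", "velocity")           # added through one import...
    print(aliases.ALIASES.resolve("vel"))    # ...visible through the other one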
And for some other classes, I'm even using the __init__.py file at the root of the code:
In root/__init__.py:

    # CTES is the instance of Ctes() created in root/physics/constants/constants.py
    from root.physics.constants.constants import CTES

    CTES.add(...)  # add a new constant that needs to be known
(Of course, CTES does not just store constants; I also define some methods to exploit them, so it makes sense to keep everything in this class instead of defining the values as regular Python constants in a module.)
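For instance, it is something along these lines (again just a sketch; the names, values and methods below are made up):

    class Ctes(object):
        """Stores physical constants together with helper methods."""

        def __init__(self):
            self._values = {}

        def add(self, name, value, unit=None):
            # register a new constant at runtime
            self._values[name] = (value, unit)

        def get(self, name):
            value, _unit = self._values[name]
            return value

    CTES = Ctes()
    CTES.add("c", 299792458.0, "m/s")   # made-up example entry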
Questions
I'm wondering if I'm doing this right (probably not). Maybe it would be better to use the __init__.py files and create the shared instances there. But would that bring problems (like dependency cycles, or increased memory usage...)? Also, how would I use the created instances elsewhere in the code? Like this:
In root/tools/__init__.py:

    import root.tools.aliases as Al

    ALIASES = Al.Aliases()
    # should I delete the imported module afterwards: del Al ??
and then in any_file.py:

    from root.tools import ALIASES

    ALIASES.method(...)
Or would it be better to gather all these instances in a single file (e.g. root/shared.py) which I import in root/__init__.py, so that I'm sure they are created?
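Concretely, I imagine that alternative would look something like this (just a sketch, assuming the classes themselves no longer instantiate anything at import time):

    # hypothetical root/shared.py: the single place where shared instances are built
    from root.tools.aliases import Aliases
    from root.physics.constants.constants import Ctes
    from root.physics.constants.units import Units

    ALIASES = Aliases()
    CTES = Ctes()
    UNITS = Units()

and then:

    # in root/__init__.py
    from . import shared   # make sure the instances exist as soon as root is imported

    # anywhere else in the code
    from root.shared import ALIASES, CTES, UNITS
    ALIASES.method(...)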
I've read many times that it is better to keep __init__.py files empty (which is the case right now in my code, except of course for root/__init__.py). What do you think?
I'm a bit lost (you can probably tell from the fact that I'm not being very clear). Any help/advice is more than welcome. I'd like to avoid any non-Pythonic solution, or solutions that could confuse the user or make things unsafe.