I have multiple deep neural networks in my model (the networks are instances of different classes) and I want them to share the same input size. For example, my model is:
```python
from omegaconf import DictConfig


class Model:
    def __init__(self, cfg: DictConfig):
        self.net1 = Net1(**cfg.net1_hparams)
        self.net2 = Net2(**cfg.net2_hparams)
```
Here, `Net1` and `Net2` have different sets of hyperparameters, but the `input_size` parameter appears in both and has to match, i.e., `cfg.net1_hparams.input_size == cfg.net2_hparams.input_size`.
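For concreteness, here is a minimal runnable sketch of such a config (the key names other than `input_size` are hypothetical), built with OmegaConf, which Hydra uses under the hood:

```python
from omegaconf import OmegaConf

# Hypothetical config: input_size is duplicated in both groups
# and must be kept in sync by hand.
cfg = OmegaConf.create(
    """
net1_hparams:
  input_size: 128
  hidden_size: 256
net2_hparams:
  input_size: 128
  num_layers: 4
"""
)
assert cfg.net1_hparams.input_size == cfg.net2_hparams.input_size
```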
I could define `input_size` at the parent level as `cfg.input_size` and manually pass it to both `Net1` and `Net2`. But I want each net's hparams config to be complete, so that later I can build `Net1` using only `cfg.net1_hparams`.
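A minimal sketch of that workaround (class bodies stubbed out, names hypothetical):

```python
from omegaconf import OmegaConf


class Net1:  # stand-in for the real network class
    def __init__(self, input_size: int, hidden_size: int): ...


class Net2:  # stand-in for the real network class
    def __init__(self, input_size: int, num_layers: int): ...


cfg = OmegaConf.create(
    """
input_size: 128  # defined once at the parent level
net1_hparams:
  hidden_size: 256
net2_hparams:
  num_layers: 4
"""
)

# Works, but neither hparams group is self-contained anymore:
net1 = Net1(input_size=cfg.input_size, **cfg.net1_hparams)
net2 = Net2(input_size=cfg.input_size, **cfg.net2_hparams)
```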
Is there a good way to achieve this in Hydra?