I am developing a project based on the PyTorch Lightning + Hydra template found here: https://github.com/ashleve/lightning-hydra-template. I am trying to instantiate a PyTorch Dataset object with Hydra's instantiate(), overriding the default cfg value in the config with the full DictConfig object at runtime. Specifically, I have this config file:
...
training_dataset:
  _target_: src.datamodules.components.baseline_datasets.FSD50K
  cfg: omegaconf.dictconfig.DictConfig(content={})
  mode: "training"
...
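For context, the dataset class that _target_ points to takes the project config and a mode string. A stripped-down sketch of it (the real FSD50K in src/datamodules/components/baseline_datasets.py contains the actual loading logic):

from omegaconf import DictConfig
from torch.utils.data import Dataset

class FSD50K(Dataset):
    def __init__(self, cfg: DictConfig, mode: str = "training"):
        # Hydra maps the YAML keys "cfg" and "mode" onto these arguments.
        self.cfg = cfg
        self.mode = mode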
The PyTorch Lightning datamodule, meanwhile, does the following:
from typing import Optional
from hydra.utils import instantiate
from omegaconf import DictConfig
from pytorch_lightning import LightningDataModule

class AudioTagDataModule(LightningDataModule):
    def __init__(self, cfg: DictConfig):
        super().__init__()
        self.cfg = cfg
        self.save_hyperparameters(logger=False)

    def setup(self, stage: Optional[str] = None):
        # Pass the full project config so the dataset can read from it.
        self.trainset = instantiate(self.cfg.datamodule.training_dataset, cfg=self.cfg)
...
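To make my expectation explicit, here is a self-contained toy example of the keyword-override behaviour that setup() relies on (MyDataset and the config keys below are made-up stand-ins, not part of the template):

from hydra.utils import instantiate
from omegaconf import OmegaConf

class MyDataset:
    def __init__(self, cfg=None, mode="training"):
        self.cfg = cfg
        self.mode = mode

node = OmegaConf.create({
    "_target_": "__main__.MyDataset",
    "cfg": None,  # placeholder, to be overridden at instantiation time
    "mode": "training",
})

full_cfg = OmegaConf.create({"some_key": 1})
# My expectation: keyword arguments passed to instantiate() take precedence
# over the values stored in the config node.
obj = instantiate(node, cfg=full_cfg)
print(obj.cfg)  # expected output: {'some_key': 1}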
The rest of the code is pretty much unmodified from the template. However, when the PyTorch Dataset is instantiated, I get an error because the config it receives is empty. Stepping through in the debugger, I can see that the cfg value is not being overridden, even though the keyword argument has the same name as the key in the config. Could someone explain why the override is not working and how to proceed correctly? Thanks!