
I am trying to override a group of parameters from the CLI, and I am not sure how to do it. The structure of my conf directory is the following:

conf
├── config.yaml
├── optimizer
│   ├── adamw.yaml
│   ├── adam.yaml
│   ├── default.yaml
│   └── sgd.yaml
├── task
│   ├── default.yaml
│   └── nlp
│       ├── default_seq2seq.yaml
│       ├── summarization.yaml
│       └── text_classification.yaml

My task/default.yaml looks like this:

# @package task
defaults:
  - _self_
  - /optimizer/adam@cfg.optimizer

_target_: src.core.task.Task
_recursive_: false

cfg:
  prefix_sep: ${training.prefix_sep}

while optimizer/default.yaml looks like this:

_target_: null
lr: ${training.lr}
weight_decay: 0.001
no_decay:
  - bias
  - LayerNorm.weight

and one specific optimizer config, say adam.yaml, looks like this:

defaults:
  - default

_target_: torch.optim.Adam
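
The other optimizer configs (sgd.yaml, adamw.yaml) follow the same pattern; roughly, sgd.yaml is along these lines:

defaults:
  - default

_target_: torch.optim.SGD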

In the end, the config I'd like to have composed is this:

task:
  _target_: src.task.nlp.nli_generation.task.NLIGenerationTask
  _recursive_: false
  cfg:
    prefix_sep: ${training.prefix_sep}
    optimizer:
      _target_: torch.optim.Adam
      lr: ${training.lr}
      weight_decay: 0.001
      no_decay:
      - bias
      - LayerNorm.weight
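
For completeness, train.py is a standard Hydra app along these lines (simplified sketch; how the task uses its optimizer config is indicative only):

import hydra
from hydra.utils import instantiate
from omegaconf import DictConfig


@hydra.main(config_path="conf", config_name="config")
def train(cfg: DictConfig) -> None:
    # Because of _recursive_: false, instantiate() passes cfg.task.cfg through
    # as a plain DictConfig; the Task later builds the optimizer itself, e.g.
    # instantiate(self.cfg.optimizer, params=model.parameters()).
    task = instantiate(cfg.task)
    ...


if __name__ == "__main__":
    train()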

I would like to be able to swap the optimizer via the CLI (say, use sgd), but I am not sure how to achieve this. I tried the following, though I understand why it fails:

python train.py task.cfg.optimizer=sgd # fails
python train.py task.cfg.optimizer=/optimizer/sgd # fails

Any tips on how to achieve this?

GitHub discussion here.

Pietro

1 Answer


You can't override default list entries in this form. See this. In particular:

CONFIG : A config to use when creating the output config. e.g. db/mysql, db/mysql@backup.
GROUP_DEFAULT : An overridable config. e.g. db: mysql, db@backup: mysql.

To be able to override a default list entry, you need to define it as a GROUP_DEFAULT. In your case, it might look like this:

defaults:
  - _self_
  - /optimizer@cfg.optimizer: adam
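
You can then pick the optimizer from the command line by addressing the group at its new package, e.g.

python train.py optimizer@task.cfg.optimizer=sgd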
Omry Yadan

    Thanks a lot @Omry, that works perfectly. I tried to override it from the CLI with `python train.py optimizer=adam` and it did not work, but Hydra automatically suggested using `python train.py optimizer@task.cfg.optimizer=adam`, which worked perfectly! Thanks a lot – Pietro Mar 12 '22 at 17:15