Currently training models using AllenNLP 1.2:
allennlp train -f --include-package custom-exp /usr/training_config/mock_model_config.jsonnet -s test-mock-out
The config is very standard:
"dataset_reader" : {
"reader": "params"
},
"data_loader": {
"batch_size": 3,
"num_workers": 1,
},
"trainer": {
"trainer_params": "various"
},
"vocabulary": {
"type": "from_files",
"directory": vocab_folder,
"oov_token": "[UNK]",
"padding_token": "[PAD]",
},
"model": {
"various params": ...
}
and the models are serialized to the test-mock-out directory (which also contains model.tar.gz).
Using the allennlp train command, is it possible to continue training? The documentation states that Model.from_archive should be used, but it's unclear how the config should be adapted to use it.
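My guess is that only the model section of the config would change, to something like the sketch below (this assumes from_archive is the registered type name backing Model.from_archive and that archive_file should point at the existing archive; I haven't been able to confirm this is the intended approach):

"model": {
    // placeholder sketch: load the previously trained weights from the archive
    "type": "from_archive",
    "archive_file": "test-mock-out/model.tar.gz"
}

Is that the right adaptation, and does the rest of the config (vocabulary, trainer, etc.) stay the same?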