I save and load a PyTorch state_dict file and then evaluate and retrain the model. That works:
torch.save(net.state_dict(), path)
net.load_state_dict(torch.load(path))
However, when I modify the state_dict (manually changing the values) after loading and evaluating it, I get the following error:
RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.
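Roughly, this is what the modification looks like (a simplified sketch; the actual network, path, and values I change are different, but the error is the same):

import torch
import torch.nn as nn

# Simplified stand-in for my real network.
net = nn.Linear(4, 2)
path = "net_state.pt"

# Saving and reloading works fine.
torch.save(net.state_dict(), path)
net.load_state_dict(torch.load(path))

# Manually change the parameter values after loading.
for param in net.parameters():
    param += 0.01  # in-place change on a leaf tensor that requires grad -> RuntimeError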
How can I safely modify the state_dict values and then retrain the model without hitting this error?