
I am facing an issue loading a model with torch that was trained on a GPU; I am trying to load it on a CPU-only machine. The model loads successfully, but I get an error while predicting. On a GPU machine the prediction works fine, just not on the CPU:

My code:

**To save the model I am using:**

    PATH = "model.pt"
    torch.save(model, PATH)

**To Load the Model**


    import torch
    PATH = "model.pt"
    device = torch.device('cpu')
    loaded_model = torch.load(PATH, map_location=device)
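For reference, a minimal, self-contained sketch of this save-on-GPU / load-on-CPU pattern, using a toy `nn.Linear` in place of the original model (which is not shown in the question):

```python
import torch
import torch.nn as nn

# Toy stand-in for the trained model (the original model class is not shown).
model = nn.Linear(4, 2)

PATH = "model.pt"
torch.save(model, PATH)

# Load onto the CPU regardless of which device the model was saved from.
# weights_only=False is needed on newer PyTorch to unpickle a full nn.Module.
device = torch.device('cpu')
loaded_model = torch.load(PATH, map_location=device, weights_only=False)

# After loading with map_location, all parameters should live on the CPU.
print(next(loaded_model.parameters()).device)  # cpu
```

Note that `map_location` only relocates the model's own tensors; it does nothing to tensors created later (such as the inputs passed at prediction time).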

I am able to load the model successfully, but while predicting I get a runtime error.

**Predicting with the loaded model on CPU**

    predicted_title = loaded_model.predict([abstract])

    RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver

I am sorry if the error turns out to be very simple, but I am not able to rectify it.

James Z
Harry DSOUZA
  • Well according to the [documentation](https://pytorch.org/tutorials/beginner/saving_loading_models.html#save-on-gpu-load-on-cpu) this should work. Can you verify if the the abstract is also on CPU device? – Ramesh Arvind Jun 01 '21 at 20:20

1 Answer


You can output the model's device with

    print(next(loaded_model.parameters()).device)

(a plain `nn.Module` has no `.device` attribute, but its parameters do). If it is not `cpu`, do

    model = model.to('cpu')
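Putting it together: both the model and its inputs must be on the CPU before predicting, otherwise any CUDA tensor triggers the missing-driver error on a machine without a GPU. A minimal sketch, assuming the model's forward pass takes a tensor (the original `predict([abstract])` wrapper is not shown in the question):

```python
import torch
import torch.nn as nn

# Toy stand-in for the loaded model (the original class, with its
# predict() wrapper, is not shown in the question).
loaded_model = nn.Linear(4, 2)

# Move parameters and buffers to the CPU; .to() returns the moved module.
loaded_model = loaded_model.to('cpu')
loaded_model.eval()

# The inputs must live on the same device as the model.
x = torch.randn(1, 4).to('cpu')
with torch.no_grad():
    out = loaded_model(x)
print(out.shape)  # torch.Size([1, 2])
```

If the model's `predict` method calls `.cuda()` internally or builds CUDA tensors itself, that code path also needs to be made device-aware, since `map_location` cannot fix it.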
Berkay Berabi