
I want to do binary classification, and I used the DenseNet model from PyTorch.

Here is my prediction code:

import torch

densenet = torch.load(model_path)   # load the saved model
densenet.eval()                     # switch to evaluation mode
output = densenet(input)            # forward pass on a preprocessed image tensor
print(output)

And here is the output:

Variable containing:
54.4869 -54.3721
[torch.cuda.FloatTensor of size 1x2 (GPU 0)]

I want to get the probability of each class. What should I do?

I have noticed that torch.nn.Softmax() can be used when there are multiple categories, as discussed here.
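
To make this concrete, here is a minimal sketch of what I tried, applying softmax to the 1x2 logits returned by the prediction code above:

import torch.nn.functional as F

# Convert the raw 1x2 logits into class probabilities along the class dimension.
probs = F.softmax(output, dim=1)
print(probs)   # two values that sum to 1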

  • Have you tried using softmax for that already? – dennlinger Oct 16 '18 at 06:46
  • Thanks for your reply! I tried it and it gives out two floats that add up to 1, but I'm not sure whether that is correct or not. – 龚世泽 Oct 16 '18 at 07:17
  • Can you edit your post with the additional information of a `print(model)`, so we can see what the last steps in the model are? If there already is a softmax built into it, it doesn't make sense to redo it, but oftentimes this is instead left open to the user when retraining. Edit: When loading it with `densenet = torchvision.models.densenet121()`, it doesn't. So unless you specified something else, you can safely use Softmax on that. – dennlinger Oct 16 '18 at 08:01
  • Yes, its last layer is a `(classifier): Linear(1920 -> 2)`. Thanks for your help :). I think I should learn deep learning first before I use it. – 龚世泽 Oct 16 '18 at 08:42
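
Following the suggestion in the comments, the classifier head can be inspected with `print(model)`; below is a minimal sketch using a stock torchvision DenseNet-121 (the model in the question was retrained with a `Linear(1920 -> 2)` head, but the check is the same):

import torchvision

# The stock DenseNet ends in a plain Linear classifier, i.e. it outputs raw
# logits; no softmax is built into the model itself.
model = torchvision.models.densenet121()
print(model.classifier)   # Linear(in_features=1024, out_features=1000, bias=True)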

1 Answer

Add a softmax layer to the classifier. The typical classifier is:

import torch.nn as nn

num_ftrs = model_ft.classifier.in_features
model_ft.classifier = nn.Linear(num_ftrs, num_classes)

Updated, with a softmax so the forward pass returns probabilities:

model_ft.classifier = nn.Sequential(
    nn.Linear(num_ftrs, num_classes),
    nn.Softmax(dim=1),
)
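
For completeness, a minimal end-to-end sketch of this approach, assuming a fresh torchvision DenseNet-121 and a 2-class head (the model choice, input size, and variable names are illustrative):

import torch
import torch.nn as nn
import torchvision

# Build a DenseNet and swap in the Linear + Softmax head described above.
model_ft = torchvision.models.densenet121()
num_ftrs = model_ft.classifier.in_features
model_ft.classifier = nn.Sequential(
    nn.Linear(num_ftrs, 2),    # 2 classes for binary classification
    nn.Softmax(dim=1),         # turn logits into probabilities
)

# Inference: the forward pass now returns probabilities directly.
model_ft.eval()
with torch.no_grad():
    dummy = torch.randn(1, 3, 224, 224)   # one RGB image, ImageNet-sized
    probs = model_ft(dummy)
print(probs)                               # shape 1x2, each row sums to 1

Note that if the network is retrained with this head, nn.CrossEntropyLoss should not be applied to the softmaxed output, since it expects raw logits; a common alternative is to keep the plain Linear head and apply softmax only at prediction time.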
Ben D