
This is basically the VGG-16 model; I have performed transfer learning and fine-tuned it. I trained the model two weeks ago and found both the train and test accuracy, but now I also need the class-wise accuracy. I am trying to compute the confusion matrix and want to plot it as well. Training code:

# Training the model again from the last CNN block to the end of the network
dataset = 'C:\\Users\\Sara Latif Khan\\OneDrive\\Desktop\\FYP_\\Scene15\\15-Scene'
model = model.to(device)
optimizer = Adam(filter(lambda p: p.requires_grad, model.parameters()))

# Training the fixed feature extractor for 5 epochs
num_epochs = 5
batch_loss = 0
cum_epoch_loss = 0  # cumulative loss over the batches of an epoch

for e in range(num_epochs):
    cum_epoch_loss = 0
  
    for batch, (images, labels) in enumerate(trainloader,1):
        images = images.to(device)
        labels = labels.to(device)

        optimizer.zero_grad()
        logps = model(images)
        loss = criterion(logps, labels)
        loss.backward()
        optimizer.step()
    
        batch_loss += loss.item()
        print(f'Epoch ({e}/{num_epochs}) : Batch number ({batch}/{len(trainloader)}) : Batch loss : {loss.item()}')
        torch.save(model, dataset+'_model_'+str(e)+'.pt')  # saves a checkpoint after every batch; move outside the inner loop to save once per epoch
    
print(f'Training loss : {batch_loss/(num_epochs*len(trainloader))}')  # batch_loss accumulates across all epochs
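
For reference, the transfer-learning setup described above (training only from the last convolutional block to the end of the network) is not shown in the question. A minimal sketch of how that freezing step might look with torchvision's VGG-16 is below; the layer index 24 and the 15-class output layer are assumptions based on the description of the 15-Scene dataset, not code from the question.

# Hypothetical sketch of the freezing step: keep everything before the last
# convolutional block frozen and train from that block to the end of the network.
# Index 24 assumes torchvision's vgg16 (without batch norm); adjust for your setup.
import torch
from torchvision import models

model = models.vgg16(pretrained=True)

for param in model.features[:24].parameters():   # blocks 1-4 stay frozen
    param.requires_grad = False

for param in model.features[24:].parameters():   # last conv block is trainable
    param.requires_grad = True

# replace the final classifier layer to match the 15 scene categories
model.classifier[6] = torch.nn.Linear(model.classifier[6].in_features, 15)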

This is the code I am using to check the accuracy of my model based on data from the test loader.

model.to('cpu')

model.eval()
with torch.no_grad():
    num_correct = 0
    total = 0
    
    #set_trace ()
    for batch, (images,labels) in enumerate(testloader,1):
        
        logps = model(images)
        output = torch.exp(logps)
        
        pred = torch.argmax(output,1)
        total += labels.size(0)
        
        num_correct += (pred==labels).sum().item()
        print(f'Batch ({batch} / {len(testloader)})')
        
        # to check the accuracy of model on 5 batches
        # if batch == 5:
            # break
            
    print(f'Accuracy of the model on {total} test images: {num_correct * 100 / total }% ')  

Next, I need to find the class-wise accuracy of the model. I am working in a Jupyter notebook. Should I reload a saved model and compute the confusion matrix from that, or what would be the appropriate way of doing it?

1 Answer

You have to save all the predictions and targets of the test set.

predictions, targets = [], []

model.eval()
with torch.no_grad():  # no gradients are needed for evaluation
    for images, labels in testloader:
        logps = model(images)
        output = torch.exp(logps)
        pred = torch.argmax(output, 1)

        # convert to numpy arrays of integer class indices
        pred = pred.detach().cpu().numpy()
        labels = labels.detach().cpu().numpy()

        predictions.extend(pred.tolist())
        targets.extend(labels.tolist())

Now you have all the predictions and actual targets of the test set stored. The next step is to create the confusion matrix. I think I can just give you the function I always use:

import numpy as np
import matplotlib.pyplot as plt

def create_confusion_matrix(y_true, y_pred, classes):
    """Creates and plots a confusion matrix given two lists (targets and predictions).
    :param list y_true: list of all targets (integer class indices)
    :param list y_pred: list of all predictions (integer class indices)
    :param dict classes: a dictionary mapping the class names to their index representation
    """

    amount_classes = len(classes)

    confusion_matrix = np.zeros((amount_classes, amount_classes))
    for idx in range(len(y_true)):
        target = y_true[idx]
        output = y_pred[idx]

        # rows are the true classes, columns the predicted classes
        confusion_matrix[target][output] += 1

    fig, ax = plt.subplots(1)

    ax.matshow(confusion_matrix)
    ax.set_xticks(np.arange(amount_classes))
    ax.set_yticks(np.arange(amount_classes))

    ax.set_xticklabels(list(classes.keys()))
    ax.set_yticklabels(list(classes.keys()))

    plt.setp(ax.get_xticklabels(), rotation=45, ha="left", rotation_mode="anchor")
    plt.setp(ax.get_yticklabels(), rotation=45, ha="right", rotation_mode="anchor")

    plt.show()

So y_true holds all the targets, y_pred all the predictions, and classes is a dictionary that maps the class names to their label representation, for example:

classes = {"dog": 0, "cat": 1}

Then simply call:

create_confusion_matrix(targets, predictions, classes)
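
Since the question also asks for class-wise accuracy: once you have the two lists, the accuracy of each class is simply the fraction of its test samples that were predicted correctly (equivalently, the diagonal of the confusion matrix divided by the row sums). A small sketch, assuming targets and predictions hold integer class indices as collected above; per_class_accuracy is a hypothetical helper, not part of the code above:

import numpy as np

def per_class_accuracy(y_true, y_pred, num_classes):
    """Hypothetical helper: accuracy for each class from integer labels."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    accuracies = {}
    for c in range(num_classes):
        mask = (y_true == c)                      # all test samples of class c
        if mask.sum() == 0:
            continue                              # class absent from the test set
        accuracies[c] = (y_pred[mask] == c).mean()
    return accuracies

# e.g. per_class_accuracy(targets, predictions, len(classes))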

You will probably have to adapt it to your code a little, but I hope this works for you. :)
