
Context:

  • I have built a detection application. For each image, M mask detections are produced, and each image has N ground-truth elements. M and N can differ per image.

  • A mask has shape [width, height].

  • I've grouped those images in batches of size 8.

  • To gather the detected masks for all batches, I've constructed a list all_detected_masks.

  • all_detected_masks is of type list[list[Tensor]], where each inner tensor holds one image's masks.

  • all_detected_masks' shape (if it were NumPy) would be [N_BATCHES, 8, M, width, height].

  • I have a similar list for the ground truths; its shape is [N_BATCHES, 8, N, width, height].
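To make the layout concrete, here is a minimal sketch of the two nested lists with dummy data. NumPy arrays stand in for the torch tensors, and the per-image mask counts are made up for illustration; only the shapes matter:

```python
import numpy as np

N_BATCHES, BATCH_SIZE = 2, 8
h, w = 16, 16

# M (detections) and N (ground truths) vary per image, which is why the
# innermost tensors cannot be merged into one rectangular array.
rng = np.random.default_rng(0)
all_detected_masks = [
    [np.zeros((rng.integers(1, 6), w, h)) for _ in range(BATCH_SIZE)]
    for _ in range(N_BATCHES)
]
all_gt_masks = [
    [np.zeros((rng.integers(1, 4), w, h)) for _ in range(BATCH_SIZE)]
    for _ in range(N_BATCHES)
]

print(len(all_detected_masks), len(all_detected_masks[0]))  # N_BATCHES, 8
print(all_detected_masks[0][0].shape)  # (M, width, height) for the first image
```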

Problem:

  • For each batch, I want to calculate the Jaccard index, preferably with the torchmetrics module.

  • But when I use binary_jaccard_index, I face two problems:

  1. I need to call torch.stack(pred_masks) to convert the list to a tensor, but since M and N vary per image, I can't stack.

  2. To avoid (1), I can loop over the images (which I wanted to avoid; the goal is to compute the whole batch at once). That means applying binary_jaccard_index to [M, width, height] and [N, width, height] tensors. But when I do this, the result is always 0, which I know is wrong from printing out the masks.
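For reference, a minimal reproduction of problem (1), again with NumPy standing in for torch (np.stack behaves like torch.stack here): stacking per-image mask tensors fails as soon as M differs between images.

```python
import numpy as np

# hypothetical batch of 2 images: image 0 has M=3 detections, image 1 has M=5
w, h = 4, 4
pred_masks = [
    np.zeros((3, w, h)),  # [M0, width, height]
    np.zeros((5, w, h)),  # [M1, width, height]
]

try:
    batch = np.stack(pred_masks)  # would need equal M across all images
except ValueError as e:
    print("stack failed:", e)
```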

Key Question:

How can I calculate Jaccard Index for a batch of images directly?

Note:

There is an extra layer of complexity: each of these masks belongs to a different class. But I'd prefer to get past the problem above before moving to that step. If you have tips on that next layer, I'd be very pleased too.

Code snippet for (2):

for i in range(len(batches_predictions_masks)):  # for each batch
    # ground-truth masks for this batch
    gts_msks = gts_masks[i]

    # predicted masks for this batch
    preds_masks = batches_predictions_masks[i]

    logger.info(preds_masks[0].shape)
    logger.info(gts_msks[0].shape)
    logger.info(type(preds_masks[0]))

    for j in range(len(preds_masks)):  # for each image in the batch
        # index with [j] only: preds_masks and gts_msks are already
        # this batch's lists, so [i][j] here would be a double-indexing bug
        val = binary_jaccard_index(preds_masks[j], gts_msks[j].cuda())
        logger.info(val)
