
I am trying to do multiclass segmentation with U-Net. In previous trials I tried binary segmentation and it worked. But when I try multiclass segmentation I get this error:

ValueError: `generator` yielded an element of shape (128, 192, 1) where an element of shape (128, 192, 5) was expected

The 5 denotes the number of classes. This is how my output layer is defined: output: Tensor("output/sigmoid:0", shape=(?, 128, 192, 5), dtype=float32).

I kept a crop size of input_shape: (128, 192, 1), since the images are grayscale, and label_shape: (128, 192, 5).

Data is loaded into a TensorFlow Dataset and consumed through a tf.data iterator. A generator yields the data for the Dataset:

def get_datapoint_generator(self):
    def generator():
        for i in itertools.count(1):
            datapoint_dict = self._get_next_datapoint()
            yield datapoint_dict['image'], datapoint_dict['mask']
    return generator

The `_get_next_datapoint` function fetches the next datapoint from RAM and applies cropping and augmentation.
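From the error, the generator appears to yield the mask as integer class indices with shape (128, 192, 1), while the dataset (and the model output) expects a one-hot mask of shape (128, 192, 5). A minimal NumPy sketch of the missing one-hot step — the helper name and the class count are assumptions based on the question, not the asker's actual code:

```python
import numpy as np

NUM_CLASSES = 5  # assumed from the question's label_shape (128, 192, 5)

def to_one_hot(mask, num_classes=NUM_CLASSES):
    """Convert an integer mask of shape (H, W, 1) to one-hot (H, W, num_classes)."""
    idx = mask.squeeze(-1).astype(np.int64)             # (H, W) integer indices
    return np.eye(num_classes, dtype=np.float32)[idx]   # (H, W, num_classes)

mask = np.random.randint(0, NUM_CLASSES, size=(128, 192, 1))
one_hot = to_one_hot(mask)
print(one_hot.shape)  # (128, 192, 5)
```

Doing this conversion inside `_get_next_datapoint` (or in the generator itself) would make the yielded mask match the declared (128, 192, 5) element shape.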

Now, where could it have gone wrong, such that the mask shape doesn't match the expected output shape?

ram
1 Answer


Can you try this implementation? I am using it myself, but it is in Keras:

import tensorflow as tf
from tensorflow.keras import backend as K

def sparse_crossentropy(y_true, y_pred):
    # y_true carries integer class indices in its last channel;
    # one-hot them to the same depth as y_pred before comparing.
    nb_classes = K.int_shape(y_pred)[-1]
    y_true = K.one_hot(tf.cast(y_true[..., 0], dtype=tf.int32), nb_classes)
    return K.categorical_crossentropy(y_true, y_pred)
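To see the idea without Keras: the labels can stay as integer indices of shape (H, W, 1) as long as the loss one-hots them to y_pred's depth before comparing. A NumPy analogue of the loss above — a sketch for illustration, not the exact Keras computation:

```python
import numpy as np

def sparse_crossentropy_np(y_true, y_pred, eps=1e-7):
    """NumPy analogue: y_true holds integer class indices with a trailing
    channel of 1; y_pred holds per-class probabilities."""
    nb_classes = y_pred.shape[-1]
    idx = y_true[..., 0].astype(np.int64)                    # (H, W)
    one_hot = np.eye(nb_classes)[idx]                        # (H, W, nb_classes)
    return -np.sum(one_hot * np.log(y_pred + eps), axis=-1)  # per-pixel loss

y_pred = np.full((128, 192, 5), 0.2)                 # uniform predictions
y_true = np.random.randint(0, 5, size=(128, 192, 1))  # sparse integer labels
loss = sparse_crossentropy_np(y_true, y_pred)
print(loss.shape)  # (128, 192)
```

With uniform 0.2 predictions every pixel's loss is -log(0.2), regardless of the label, which makes the shape handling easy to verify.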
Cenk Bircanoglu
  • Sure, I will try it and see, but can you explain a bit why the loss affects the shape of the labels? – ram Nov 13 '20 at 09:34
  • The loss function calculates the difference between the output of the model and the label, and the optimizer uses that difference to update the weights of the model, so the output of the model and the label should be in the same format. But you can make changes to the loss function to support different shapes. To put it simply, in a classification problem, if the output of the model is an integer, the label should be an integer so the difference can be calculated; but you can also use a one-hot-encoded label and compute the difference in the loss function by reshaping y_true or y_pred. – Cenk Bircanoglu Nov 13 '20 at 09:40
  • Hey, thanks! Even the normal categorical crossentropy works. The problem was something I had read in a blog: for losses like sparse crossentropy, the label shape and the output shape are different. But I kept them the same, and that's why it failed. – ram Nov 13 '20 at 12:58
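As a sanity check on the comments above: cross-entropy computed from integer labels (by indexing the predicted probabilities) is identical to cross-entropy computed from the equivalent one-hot labels, which is why either label format works once the loss handles the conversion. A small NumPy demonstration, with shapes chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)
y_pred = rng.random((8, 8, 5)) + 1e-3
y_pred /= y_pred.sum(-1, keepdims=True)       # normalize to probabilities

idx = rng.integers(0, 5, size=(8, 8, 1))      # sparse integer labels, (H, W, 1)
onehot = np.eye(5)[idx[..., 0]]               # same labels, one-hot, (H, W, 5)

# Cross-entropy via one-hot labels vs. via direct indexing of y_pred.
ce_from_onehot = -(onehot * np.log(y_pred)).sum(-1)
ce_from_indices = -np.log(np.take_along_axis(y_pred, idx, axis=-1))[..., 0]

print(np.allclose(ce_from_onehot, ce_from_indices))  # True
```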