
I am working on a project for semantic segmentation of retinal blood vessels in TensorFlow, using the MobileUNet model, and I get this error:

    InvalidArgumentError (see above for traceback): logits and labels must
    be broadcastable: logits_size=[82944,2] labels_size=[90000,2]
    [[Node: softmax_cross_entropy_with_logits_sg = SoftmaxCrossEntropyWithLogits[T=DT_FLOAT,
    _device="/job:localhost/replica:0/task:0/device:CPU:0"](softmax_cross_entropy_with_logits_sg/Reshape,
    softmax_cross_entropy_with_logits_sg/Reshape_1)]]

My code is as follows:

    import tensorflow as tf

    # Placeholders for the RGB input batch and the one-hot segmentation labels
    net_input = tf.placeholder(tf.float32, shape=[None, None, None, 3])
    net_output = tf.placeholder(tf.float32, shape=[None, None, None, num_classes])

    network = build_mobile_unet(net_input, preset_model=args.model, num_classes=num_classes)

    losses = tf.nn.softmax_cross_entropy_with_logits(logits=network, labels=net_output)
    cost = tf.reduce_mean(losses)
    opt = tf.train.AdamOptimizer(0.001).minimize(cost)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())
    _, current = sess.run([opt, cost],
                          feed_dict={net_input: input_image_batch, net_output: segmented_image_batch})

The input image is 300x300 and in the RGB colour space. The output is a binary image of the same size as the input.

Can someone help me?
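
The two sizes in the error factor as 82944 = 288x288 and 90000 = 300x300, which suggests the network's output is spatially smaller than the 300x300 labels. A minimal debugging sketch, reusing the names from the code above and assuming a batch of a single image, to print the actual shapes before the loss is computed:

    # Evaluate the dynamic shapes of the logits and of the fed labels.
    logits_shape, labels_shape = sess.run(
        [tf.shape(network), tf.shape(net_output)],
        feed_dict={net_input: input_image_batch, net_output: segmented_image_batch})
    print("logits shape:", logits_shape)   # e.g. [1, 288, 288, 2] -> 288*288 = 82944
    print("labels shape:", labels_shape)   # e.g. [1, 300, 300, 2] -> 300*300 = 90000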

  • In the `tf.nn.softmax_cross_entropy_with_logits` function, you need to provide one N-element label for each training point. If each batch contains K points that can be classified into N classes, the `labels` argument must be set to a tensor of size [K, N] (see the sketch after these comments). – MatthewScarpino Aug 29 '18 at 13:41
  • Yes, I do provide one label for each training point: a binary image of size 300*300, and I still get this problem. – Henda Boudegga Aug 29 '18 at 14:39
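
As a quick illustration of the shape contract mentioned in the first comment, here is a tiny self-contained sketch (toy numbers, not from the question): logits and one-hot labels passed to `tf.nn.softmax_cross_entropy_with_logits` must have the same shape [K, N], and the op returns one loss value per point.

    import numpy as np
    import tensorflow as tf

    K, N = 4, 2
    logits = tf.constant(np.random.randn(K, N), dtype=tf.float32)
    labels = tf.constant(np.eye(N)[[0, 1, 1, 0]], dtype=tf.float32)  # one-hot, shape [K, N]

    loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
    with tf.Session() as sess:
        print(sess.run(loss).shape)  # (4,) -- one cross-entropy value per point

In the question, K is the number of pixels in the batch, which is why the error reports the flattened sizes 82944 and 90000 (the Reshape inputs in the error trace).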

2 Answers


We answered this problem, which is also related to the architecture, in the following link: Input to reshape is a tensor with 37632 values, but the requested shape has 150528. Let us know if you face any issue.
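
To connect the linked answer to the numbers in this question: 82944 = 288x288 while 90000 = 300x300, and 300 is not a multiple of 32, so a U-Net-style encoder that halves the spatial resolution five times (an assumption based on the reported sizes) cannot reconstruct the original 300x300 resolution on the way back up. A small arithmetic sketch:

    # Hypothetical round trip through 5 levels of downsampling (floor division)
    # and 5 levels of upsampling, as in a U-Net-style encoder-decoder.
    def output_side(side, levels=5):
        for _ in range(levels):
            side //= 2   # pooling / strided convolution
        for _ in range(levels):
            side *= 2    # transposed convolution / upsampling
        return side

    print(output_side(300))  # 288 -> 288*288 = 82944, the logits size in the error
    print(output_side(320))  # 320 -> a multiple of 32 survives the round trip

If that is the cause here, resizing or padding the images and labels to a multiple of 32 (for example 320x320 or 256x256) is a common way to make the output shape match the labels.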


The same problem occurred for me. It happens when the label size is larger than the number of classes in the dataset.

In the last fully connected (Dense) layer I had used 46 units, but my dataset had only 38 classes. When I changed 46 to 38, the problem was solved.
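
The mismatch this answer describes can be reproduced with a small sketch (the 46 and 38 are the numbers from the answer; the rest is made up): logits from a 46-unit layer against one-hot labels with 38 classes trigger the same error at run time when the shapes are only known dynamically.

    import numpy as np
    import tensorflow as tf

    logits_ph = tf.placeholder(tf.float32, shape=[None, None])  # e.g. output of a 46-unit Dense layer
    labels_ph = tf.placeholder(tf.float32, shape=[None, None])  # one-hot labels with 38 classes
    loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits_ph, labels=labels_ph)

    with tf.Session() as sess:
        sess.run(loss, feed_dict={
            logits_ph: np.random.randn(8, 46),
            labels_ph: np.eye(38)[np.random.randint(0, 38, 8)],
        })  # InvalidArgumentError: logits and labels must be broadcastable

Changing the final layer to output 38 units (one per class) makes both tensors [8, 38] and the error disappears. Note that in the original question the mismatch is in the spatial dimension rather than the class dimension, but the error message is the same.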