
I would like to use this function in a TensorFlow loss function:

def rectified_projection(self, disp_x, image):
    H, W, B = self.HEIGHT, self.WIDTH, self.batch_size
    disp_x = tf.cast(disp_x, tf.int32)            # layer output -> integer column offsets
    disp_x = self.bias_x + disp_x                 # add the fixed offset self.bias_x
    disp = tf.concat([self.disp_y, disp_x], 3)    # combine with the index components in self.disp_y
    disp = tf.clip_by_value(disp, 0, W)           # keep indices inside the image
    sdisp = tf.scatter_nd(disp, image, (B, H, W, 3), name="SCATTER")  # write pixels at their new positions
    return sdisp

This code shifts the pixels of `image` along each row by the value of `disp_x`, which is the output of a layer.
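
To make the intent concrete, here is a minimal, self-contained toy sketch (a 1 x 4 row, not the author's shapes or class attributes) of how `tf.scatter_nd` relocates values to target indices:

import tensorflow as tf

# Toy example: write each value of a 1 x 4 "row" into a new column.
row = tf.constant([10.0, 20.0, 30.0, 40.0])               # pixel values
new_pos = tf.constant([[0, 1], [0, 3], [0, 0], [0, 2]])   # (row, column) targets
shifted = tf.scatter_nd(new_pos, row, shape=(1, 4))
# shifted == [[30., 10., 40., 20.]]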

The problem appears when I want to train my network with this transformation. TensorFlow reports that it does not know how to propagate gradients through the network. How can this be fixed?

EDIT:

Complete Error message:

    ValueError: No gradients provided for any variable, check your graph
    for ops that do not support gradients, between variables "last_layer"
    and "loss_function"

1 Answer


This is a common issue when you use `tf.clip_by_value()`.

SOLUTION: Use a differentiable mapping from R into [0, W] instead of the hard clip, for example `vector = W * (vector / (2 * tf.reduce_max(tf.abs(vector))) + 0.5)`, for which a gradient can be calculated.
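
A minimal sketch of that suggestion, reusing the question's variable names (only an illustration of the proposed rescaling, not a drop-in fix for the whole function):

import tensorflow as tf

def smooth_rescale(vector, W):
    # Differentiable rescaling of arbitrary real values into [0, W],
    # intended to replace the hard tf.clip_by_value.
    max_abs = tf.reduce_max(tf.abs(vector))
    return W * (vector / (2.0 * max_abs) + 0.5)

# In rectified_projection this would be applied to the float displacements
# before any integer cast, e.g.:
# disp_x = smooth_rescale(disp_x, float(self.WIDTH))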

  • Thx for your answer. I know what you mean, but the error occurs even if I remove the clipping. I think that `tf.cast` and `tf.scatter_nd` are causing the problem. – marek094 Sep 06 '18 at 17:18
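
A quick way to confirm the suspicion in this comment is to check the gradient through an integer cast in isolation; the sketch below (TF 2.x eager style, not the question's graph-mode setup) shows that the cast alone already blocks gradient flow:

import tensorflow as tf

x = tf.Variable([1.5, 2.5, 3.5])
with tf.GradientTape() as tape:
    y = tf.cast(x, tf.int32)                      # non-differentiable op
    loss = tf.reduce_sum(tf.cast(y, tf.float32))
print(tape.gradient(loss, x))                     # None: no gradient reaches x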