
I want to implement a custom loss function (from this paper: https://arxiv.org/abs/1706.00909) in MatConvNet.

My code uses the DagNN wrapper. I want to modify the class SegmentationLoss() to use a custom loss function, custom_loss(), instead of vl_nnloss(). For the forward() pass, custom_loss() returns the calculated loss value.

What I don't understand is what custom_loss() should do during the backward() pass in SegmentationLoss(). What is the additional input derOutputs, where does it come from, and what should custom_loss() return?

Thanks!

jlhw
  • I'm not sure where you found the class `SegmentationLoss` in the MatConvNet package. I don't see it in version 1.0-beta25. In any case, to implement a custom loss function, you'll want to implement the forward and backward passes of the network. In the backward pass, you're taking derivatives with respect to (w.r.t.) the weights and inputs. To get the derivative w.r.t. the weights, you need the backpropagated error signal at the node the weight feeds into, which I believe is what `derOutputs` represents. Take a look at the documentation of the `eval` function for a more thorough understanding. – Vivek Subramanian Oct 22 '17 at 05:23
  • @VivekSubramanian Thanks! The class `SegmentationLoss` is in the fully convolutional network package. – jlhw Nov 06 '17 at 21:13
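
For what it's worth, the convention in MatConvNet's built-in losses (e.g. `vl_nnloss(x, c)` vs. `vl_nnloss(x, c, dzdy)`) is that the same function computes the forward value when called without a derivative argument, and the derivative of the loss w.r.t. its input, scaled by the incoming derivative, when called with one. `derOutputs{1}` is that incoming derivative, supplied by the network's `eval` call (for a final loss node it is typically just 1). A sketch of how a DagNN loss layer could wire this up, assuming `custom_loss` follows the same two-mode convention (this is an illustrative outline, not code from the paper or the fully convolutional network package):

```matlab
classdef SegmentationLoss < dagnn.Loss
  methods
    function outputs = forward(obj, inputs, params)
      % inputs{1}: network predictions, inputs{2}: ground-truth labels.
      % Without a derivative argument, custom_loss returns the scalar loss.
      outputs{1} = custom_loss(inputs{1}, inputs{2}) ;
    end

    function [derInputs, derParams] = backward(obj, inputs, params, derOutputs)
      % derOutputs{1} is dz/dloss, the derivative of the network output z
      % w.r.t. this layer's output, backpropagated to this node by eval().
      % custom_loss must return dz/dx = (dloss/dx) * derOutputs{1}, i.e. the
      % derivative of the loss w.r.t. the predictions, scaled by derOutputs{1},
      % with the same size as inputs{1}.
      derInputs{1} = custom_loss(inputs{1}, inputs{2}, derOutputs{1}) ;
      derInputs{2} = [] ;   % no derivative w.r.t. the labels
      derParams = {} ;      % a loss layer has no learnable parameters
    end
  end
end
```

In other words, the return value of `custom_loss()` in the backward pass is an array the same size as the prediction input, holding the loss gradient scaled by `derOutputs{1}`; `derOutputs` itself originates from the `derOutputs` argument passed to `net.eval()` during training.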

0 Answers