
I'd like to define a loss function in TensorFlow based on a histogram, but that requires the histogram op to support gradients, which is not the case. For example, using tf.histogram_fixed_width I get the error "No gradients provided for any variable, check your graph for ops that do not support gradients". I am therefore looking for a workaround, or an alternative function for computing histograms in TensorFlow that supports gradients.
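For reference, a minimal sketch that reproduces the problem (assuming TF 2.x eager execution; the variable values and bin count are made up):

```python
import tensorflow as tf

x = tf.Variable([0.1, 0.4, 0.35, 0.8])
with tf.GradientTape() as tape:
    # tf.histogram_fixed_width returns integer bin counts; the bucket
    # assignment is a discrete operation with no registered gradient.
    hist = tf.cast(tf.histogram_fixed_width(x, value_range=[0.0, 1.0], nbins=4),
                   tf.float32)
    loss = tf.reduce_sum(tf.square(hist - tf.ones(4)))
print(tape.gradient(loss, x))  # None -- no gradient flows through the histogram
```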

Time2Lime

1 Answer


A loss function must be of the form f(x, [y, ...]) -> R: it has to produce a single real number and must be differentiable (its gradient must point towards a better solution). A histogram takes your input but produces a data structure as output, and the hard bucket assignment is not differentiable. You can try to define in your own words what a "better" or "good" histogram should look like. If you have a shape in mind, you can describe it as an ideal histogram and use KL divergence as a loss function between the ideal and actual histograms.
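One common workaround, sketched below under my own assumptions (the names soft_histogram, bin_centers, and sigma are illustrative, not from any library), is to replace the hard 0/1 bucket assignment with a soft kernel-based one, so gradients can flow into the histogram, and then apply the KL divergence mentioned above:

```python
import tensorflow as tf

def soft_histogram(x, bin_centers, sigma=0.05):
    """Differentiable histogram: each value in x contributes a Gaussian
    weight to every bin instead of a hard 0/1 bucket assignment."""
    diff = tf.expand_dims(x, 1) - tf.expand_dims(bin_centers, 0)  # shape [N, B]
    weights = tf.exp(-0.5 * tf.square(diff / sigma))              # shape [N, B]
    hist = tf.reduce_sum(weights, axis=0)                         # shape [B]
    return hist / tf.reduce_sum(hist)                             # normalize to a PDF

def kl_loss(target_pdf, actual_pdf, eps=1e-8):
    """KL(target || actual); eps guards against log(0)."""
    return tf.reduce_sum(
        target_pdf * tf.math.log((target_pdf + eps) / (actual_pdf + eps)))
```

Here bin_centers could be, e.g., tf.linspace(0.0, 1.0, 32), and sigma controls the smoothing; setting it roughly equal to the bin width makes the soft histogram approximate the hard one while keeping it differentiable.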

y.selivonchyk
  • Let me be more precise. I have a 1D tensor from which I want to extract a probability density function (PDF) and then define a loss that makes this PDF as close as possible to a desired PDF. The desired PDF is defined as a 1D tensor (I don't necessarily have an analytical expression for it). Does your suggested approach work for this? Do you have a reference to some sample code? That would be very helpful. I've never used tfp so far. – Time2Lime Jun 08 '19 at 17:33
  • Yes, after your explanation it seems even more so. You can look into https://stackoverflow.com/questions/41863814/kl-divergence-in-tensorflow for inspiration and check if anything fits you. If your target and actual PDFs are defined over the same number of buckets, then we can call them 'classes' and you can simply use a cross-entropy loss. – y.selivonchyk Jun 09 '19 at 02:49
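For the same-bucket case the comment mentions, a minimal cross-entropy sketch (the PDF values are made up for illustration):

```python
import tensorflow as tf

# Hypothetical PDFs over the same 5 buckets, each summing to 1.
target_pdf = tf.constant([0.10, 0.20, 0.40, 0.20, 0.10])
actual_pdf = tf.constant([0.15, 0.25, 0.30, 0.20, 0.10])

eps = 1e-8  # guards against log(0)
cross_entropy = -tf.reduce_sum(target_pdf * tf.math.log(actual_pdf + eps))
```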