
I need an approximation to the maximum and minimum. Since max and min are not differentiable, I am looking for a differentiable approximation to them.

Does anybody know of one? For example, I need to minimize the following:

\left(a - \max_{x\in c}(x)\right)^2 + \left(a - \max_{x\in d}(x)\right)^2


2 Answers


The Softmax function is a differentiable mapping from a vector to a scalar, and approximates the maximum function.
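A minimal sketch of this idea, assuming numpy and reading "Softmax" here as the softmax-weighted average of the vector (the closely related log-sum-exp form is covered in the other answer):

    import numpy as np

    def softmax(x):
        # Shift by the max for numerical stability; the result is unchanged.
        e = np.exp(x - np.max(x))
        return e / e.sum()

    def soft_maximum(x):
        # Softmax-weighted average of x: a differentiable approximation of max(x).
        x = np.asarray(x, dtype=float)
        return np.dot(softmax(x), x)

    print(soft_maximum([1.0, 2.0, 5.0]))  # ~4.79, slightly below the true max of 5

The approximation sits a little below the true maximum, since the other entries still receive nonzero weight.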


Smooth maximum (and minimum) is one candidate:

sum(x * exp(alpha * x)) / sum(exp(alpha * x))

where alpha -> +Inf converges to the maximum, and alpha -> -Inf to the minimum.
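A small sketch of that formula, assuming numpy; larger |alpha| gives a tighter (but less smooth) approximation:

    import numpy as np

    def smooth_extremum(x, alpha):
        # sum(x * exp(alpha * x)) / sum(exp(alpha * x)),
        # with the exponents shifted for numerical stability (the ratio is unchanged).
        x = np.asarray(x, dtype=float)
        z = alpha * x
        w = np.exp(z - z.max())
        return np.sum(x * w) / np.sum(w)

    x = [1.0, 2.0, 5.0]
    print(smooth_extremum(x, alpha=10.0))   # ~5.0, approximate maximum
    print(smooth_extremum(x, alpha=-10.0))  # ~1.0, approximate minimum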

Another is LogSumExp:

log(sum(exp(x)))

which approximates a max. Taking the max of the negated xs and negating the result, -log(sum(exp(-x))), should give a min.
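For example, using scipy's logsumexp (an assumption on tooling; any numerically stable log-sum-exp works):

    import numpy as np
    from scipy.special import logsumexp

    x = np.array([1.0, 2.0, 5.0])

    approx_max = logsumexp(x)    # ~5.07, overshoots max(x) by at most log(len(x))
    approx_min = -logsumexp(-x)  # ~0.67, undershoots min(x) by at most log(len(x))

    print(approx_max, approx_min)

Scaling the inputs by a temperature, log(sum(exp(t*x)))/t, tightens the bound as t grows.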

A couple of others are mentioned in the paper Multiple Instance Learning: Algorithms and Applications, such as the generalized mean, noisy-or, and the "ISR" model (so called because it's described in the "Integrated segmentation and recognition" paper).
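As an illustration of one of those, the generalized (power) mean is easy to write down; this is a sketch assuming numpy and strictly positive x:

    import numpy as np

    def generalized_mean(x, p):
        # (mean(x**p))**(1/p); p -> +inf approaches max(x), p -> -inf approaches min(x).
        x = np.asarray(x, dtype=float)
        return np.mean(x ** p) ** (1.0 / p)

    x = [1.0, 2.0, 5.0]
    print(generalized_mean(x, 20))    # ~4.7, close to the maximum
    print(generalized_mean(x, -20))   # ~1.06, close to the minimum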
