An example is the following: the goal is to optimize a function f(x1,...,xN) subject to the constraint x1 + ... + xN = 1, with 0 < x_i <= 1. How do I do this with TensorFlow in general (i.e. with arbitrary constraints)?
I have seen the ScipyOptimizerInterface, which allows one to do this.
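ScipyOptimizerInterface delegates to scipy.optimize under the hood, so as a point of comparison, here is a sketch of the same constrained problem solved with SciPy directly. The objective f and the target point are made up purely for illustration; the strict bound 0 < x_i is approximated with a tiny positive lower bound, since SLSQP only handles closed bounds:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical objective: squared distance to a target point that lies
# on the simplex (the real f would be problem-specific).
target = np.array([0.5, 0.3, 0.2])

def f(x):
    return np.sum((x - target) ** 2)

N = 3
x0 = np.full(N, 1.0 / N)  # feasible starting point on the simplex

res = minimize(
    f,
    x0,
    method="SLSQP",
    bounds=[(1e-9, 1.0)] * N,  # approximates 0 < x_i <= 1
    constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}],  # x1 + ... + xN = 1
)
print(np.round(res.x, 3))
```

The equality constraint and the bounds are enforced by the solver itself, so this generalizes to arbitrary (differentiable) constraints, which is what ScipyOptimizerInterface exposes from within TensorFlow.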
In this special case, a different approach is to use the softmax transform to write x_i = exp(y_i) / sum_j(exp(y_j)), i = 1,...,N. The y_1,...,y_N are now unconstrained and can be optimized with SGD, i.e. with TensorFlow, without any constraints. In GPflow, this would be equivalent to a transform associated with the variables {y_1,...,y_N}.
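A minimal NumPy sketch of this reparameterization, assuming a made-up objective (negative squared distance to a target on the simplex) and finite-difference gradients for self-containedness; a TensorFlow version would instead make y a tf.Variable and use automatic differentiation with any stock optimizer:

```python
import numpy as np

def softmax(y):
    # Shift by the max for numerical stability; the result sums to 1
    # and every component is strictly positive.
    z = np.exp(y - np.max(y))
    return z / z.sum()

# Hypothetical objective to maximize: closeness to a target on the simplex.
target = np.array([0.5, 0.3, 0.2])

def f(x):
    return -np.sum((x - target) ** 2)

y = np.zeros(3)  # unconstrained parameters; x = softmax(y) is always feasible
lr, eps = 1.0, 1e-6
for _ in range(500):
    # Finite-difference gradient of f(softmax(y)) w.r.t. y.
    grad = np.zeros_like(y)
    for i in range(len(y)):
        yp, ym = y.copy(), y.copy()
        yp[i] += eps
        ym[i] -= eps
        grad[i] = (f(softmax(yp)) - f(softmax(ym))) / (2 * eps)
    y += lr * grad  # plain gradient ascent, no constraint handling needed

x = softmax(y)
print(np.round(x, 3))
```

Note that y is only identified up to an additive constant (softmax is shift-invariant), which is harmless for optimization but means the y values themselves are not unique.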
Is there any other approach?