I want to create my own loss function in Keras, one that contains derivatives. For example, the loss
from keras import backend as K

def my_loss(x):
    def y_loss(y_true, y_pred):
        # K.gradients returns a list with one tensor per variable in x
        res = K.gradients(y_pred, x)[0]
        return res
    return y_loss
is defined, and the model is built and compiled as
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(10, input_dim=2, activation='sigmoid'))
model.add(Dense(1, activation='linear'))

model_loss = my_loss(x=model.input)
model.compile(loss=model_loss, optimizer='adam')
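For reference, I believe the gradient's shape could be inspected with something like the following (a rough sketch, assuming the TensorFlow backend; get_grad is just my own name for the check):

import numpy as np

# gradient of the model output with respect to the two-dimensional input
grad = K.gradients(model.output, model.input)[0]   # expected shape: (batch, 2)
get_grad = K.function([model.input], [grad])
print(get_grad([np.random.rand(5, 2)])[0].shape)   # presumably prints (5, 2)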
Now, because the input is two-dimensional,

K.gradients(y_pred, x)

must give a two-dimensional gradient vector for each sample. However, I don't know how to get each scalar component of that gradient. What I finally want is all the second derivatives of y_pred with respect to x. Is there a convenient way to get them?
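For concreteness, this is roughly what I am imagining, though I am not sure it is correct or idiomatic (a sketch assuming the TensorFlow backend; my_second_deriv_loss, first_deriv, second_derivs, and hessian are just my own names for illustration):

def my_second_deriv_loss(x):
    def y_loss(y_true, y_pred):
        # first derivatives: d(y_pred)/dx, shape (batch, 2)
        first_deriv = K.gradients(y_pred, x)[0]
        # differentiate each scalar component again w.r.t. x;
        # first_deriv[:, i] is d(y_pred)/dx_i, so its gradient w.r.t. x
        # gives the i-th row of the Hessian, shape (batch, 2)
        second_derivs = [K.gradients(first_deriv[:, i], x)[0] for i in range(2)]
        # stack into a (batch, 2, 2) tensor of all second derivatives
        hessian = K.stack(second_derivs, axis=1)
        # example use: penalize the squared second derivatives
        return K.sum(K.square(hessian), axis=[1, 2])
    return y_loss

My worry is whether slicing first_deriv[:, i] like this is the proper way to get each scalar, or whether there is a more direct approach.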
This is similar to this post, but that post separated the two-dimensional input into two one-dimensional input variables. Is there any other way to get the gradients without separating the inputs?