
I want to create my own loss function in Keras which contains derivatives. For example,

from keras import backend as K

def my_loss(x):
    def y_loss(y_true, y_pred):
        # K.gradients returns a list with one gradient tensor per variable in x
        res = K.gradients(y_pred, x)
        return res
    return y_loss

is defined, and

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(10, input_dim=2, activation='sigmoid'))
model.add(Dense(1, activation='linear'))
model_loss = my_loss(x=model.input)  # differentiate with respect to the model's input
model.compile(loss=model_loss, optimizer='adam')

Now, because the input is two-dimensional,

K.gradients(y_pred,x)

must be a two-dimensional vector. However, I don't know how to get each scalar component of the gradient. What I finally want is all the second derivatives of y_pred with respect to x. Is there a convenient way to get them?


It is similar to this post, but that post separated the two-dimensional variable into two one-dimensional variables. Is there any way to get the gradients without separating the inputs?

CSH
  • Sorry if I am misunderstanding what you are trying to achieve, but typically you want to calculate the gradient of the loss with respect to something, not of the output?! Anyway, I think your question is a duplicate of [this](https://stackoverflow.com/questions/49935778/second-derivative-in-keras) post – pafi Mar 14 '19 at 07:55
  • @pafi Yes, it is similar. But your reference separated the two inputs using the functional API, while I am trying to differentiate with respect to an element of a vector input. – CSH Mar 14 '19 at 10:52
  • Yea, the core concepts are the same. But I think these answers will help you as well. – pafi Mar 14 '19 at 11:10
  • It's a bit unclear what you are asking: do you want to calculate the Laplacian of a neural network with respect to its inputs, or do you want the loss function to be something related to the Laplacian? If you just need to calculate the Laplacian, there is no need to add it as a loss function. Some clarification would help us answer. – Abhimanyu Mar 14 '19 at 19:37
  • @Abhimanyu At first it was a Laplacian, so maybe tf.linalg.trace(tf.hessians) resolves the problem, but now I want access to each partial derivative. Sorry for the confusion; I edited the question. – CSH Mar 14 '19 at 22:37

2 Answers


If you want the Laplacian, why not use tf.hessians, which contains all the second derivatives? The Laplacian equals the trace of the Hessian matrix (by identity).

https://www.tensorflow.org/api_docs/python/tf/hessians
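
For example, a minimal graph-mode sketch (assumptions: TF1-style TensorFlow, a simple stand-in function instead of the actual model output, and a single input point of shape (2,) so the Hessian is a plain 2x2 matrix):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(2,))
f = tf.reduce_sum(tf.sin(x))        # stand-in for the network output y_pred

hess = tf.hessians(f, x)[0]         # shape (2, 2): all second derivatives of f w.r.t. x
laplacian = tf.linalg.trace(hess)   # d^2f/dx1^2 + d^2f/dx2^2

with tf.Session() as sess:
    print(sess.run([hess, laplacian], feed_dict={x: [0.5, 1.0]}))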

Abhimanyu
  • A problem arises when I try to access the value of each element of the Hessian matrix. I want the values from tf.hessians(y_pred, x), but this only gives me a Tensor. Can you tell me how to access it? – CSH Mar 14 '19 at 10:51

Unfortunately, Keras does not have a convenient way to get each component of the gradient, so I used TensorFlow to resolve this problem.

If f is the objective function of the variable x = (x1, x2),

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None, 2))
f = f(x)  # assume f is already defined as a function of x

then df/dx_1 is

tf.gradients(f, x)[0][:, 0]

df/dx_2 is

tf.gradients(f, x)[0][:, 1]

d^2f/dx_1^2 is

tf.gradients(tf.gradients(f, x)[0][:, 0], x)[0][:, 0]

d^2f/dx_2^2 is

tf.gradients(tf.gradients(f, x)[0][:, 1], x)[0][:, 1]

and d^2f/dx_1dx_2 is

tf.gradients(tf.gradients(f, x)[0][:, 0], x)[0][:, 1]
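
Putting this together, here is a minimal runnable sketch (a TF1 graph-mode illustration; the quadratic objective below is just a stand-in for the real f):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None, 2))
f = x[:, 0] ** 2 * x[:, 1]                        # stand-in objective: f(x1, x2) = x1^2 * x2

g = tf.gradients(f, x)[0]                         # shape (None, 2): [df/dx1, df/dx2]
d2f_dx1dx1 = tf.gradients(g[:, 0], x)[0][:, 0]    # = 2 * x2
d2f_dx1dx2 = tf.gradients(g[:, 0], x)[0][:, 1]    # = 2 * x1
d2f_dx2dx2 = tf.gradients(g[:, 1], x)[0][:, 1]    # = 0

with tf.Session() as sess:
    print(sess.run([g, d2f_dx1dx1, d2f_dx1dx2, d2f_dx2dx2],
                   feed_dict={x: [[1.0, 2.0], [3.0, 4.0]]}))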

I believe there is a better way, but I couldn't find one.
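
For completeness, the same slicing can be done through the Keras backend in the setting of the question. A hedged sketch (assuming a TF1-era Keras/TensorFlow graph backend and the two-column model input from the question; the MSE-plus-Laplacian-penalty loss is purely illustrative, not a prescribed choice):

from keras import backend as K

def my_loss(x):
    def y_loss(y_true, y_pred):
        grad = K.gradients(y_pred, x)[0]                  # shape (None, 2)
        d2_dx1 = K.gradients(grad[:, 0], x)[0][:, 0]      # d^2 y_pred / dx1^2
        d2_dx2 = K.gradients(grad[:, 1], x)[0][:, 1]      # d^2 y_pred / dx2^2
        laplacian = d2_dx1 + d2_dx2
        # illustrative loss: ordinary MSE plus a penalty on the Laplacian
        return K.mean(K.square(y_true - y_pred)) + K.mean(K.square(laplacian))
    return y_loss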

CSH