
I'm trying to solve the 2D Darcy equation in its mixed formulation. Suppose I have a target vector and a source vector as follows:

u = [u1, u2, p]
x = [x, y]
grad(u, x) =
[du1/dx, du2/dx, dp/dx;
 du1/dy, du2/dy, dp/dy]

I don't understand whether this is what tf.gradients(u, x) returns.

John Conde

1 Answer


tf.gradients(u, x) doesn't return what you want. From https://www.tensorflow.org/api_docs/python/tf/gradients:

gradients() adds ops to the graph to output the derivatives of ys with respect to xs. It returns a list of Tensor of length len(xs) where each tensor is the sum(dy/dx) for y in ys and for x in xs.
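To see that summing behaviour concretely, here is a minimal sketch using the same toy function as the Jacobian example below. tf.GradientTape.gradient collapses a non-scalar target the same way, returning sum(du_i/dx) rather than the per-component rows you want:

import tensorflow as tf

x = tf.constant([3.0, 4.0])

with tf.GradientTape() as tape:
    tape.watch(x)
    u = tf.stack([x[0]**2 + x[1]**2,   # u1
                  x[0]**2,             # u2
                  x[1]**3])            # u3

# Non-scalar target: the gradient is summed over the components of u,
# i.e. du1/dx + du2/dx + du3/dx for each input.
g = tape.gradient(u, x)
print(g)  # tf.Tensor([12. 56.], shape=(2,), dtype=float32)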

Here is how you can get the Jacobian with tf.GradientTape:

import tensorflow as tf

x = tf.constant([3.0, 4.0])

with tf.GradientTape() as tape:
    tape.watch(x)                 # x is a constant, so it must be watched explicitly
    u1 = x[0]**2 + x[1]**2
    u2 = x[0]**2
    u3 = x[1]**3
    u = tf.stack([u1, u2, u3])    # vector-valued target

J = tape.jacobian(u, x)           # full Jacobian du_i/dx_j, shape (3, 2)
print(J)
'''
tf.Tensor(
[[ 6.  8.]
 [ 6.  0.]
 [ 0. 48.]], shape=(3, 2), dtype=float32)
'''
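Note that J[i, j] = du_i/dx_j, i.e. the transpose of the layout in your question (outputs along the rows, inputs along the columns).

For your Darcy setup, u = [u1, u2, p] is usually evaluated at a whole batch of points at once, in which case tape.batch_jacobian gives one 3x2 Jacobian per point. The sketch below uses a small Dense network purely as a hypothetical stand-in for whatever maps (x, y) to (u1, u2, p); swap in your own model:

import tensorflow as tf

# Hypothetical placeholder for the mapping (x, y) -> (u1, u2, p).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="tanh"),
    tf.keras.layers.Dense(3),       # outputs [u1, u2, p]
])

xy = tf.random.uniform((5, 2))       # 5 collocation points, columns (x, y)

with tf.GradientTape() as tape:
    tape.watch(xy)
    u = model(xy)                    # shape (5, 3)

# One Jacobian per point: J[n, i, j] = d u_i / d x_j at point n.
J = tape.batch_jacobian(u, xy)       # shape (5, 3, 2)

du1_dx = J[:, 0, 0]                  # du1/dx at every point
dp_dy = J[:, 2, 1]                   # dp/dy at every point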
Laplace Ricky