Questions tagged [gradienttape]

147 questions
0
votes
1 answer

Avoid Warnings during Jacobian Matrix Calculation

I have a trained LSTM model. I would like to calculate the Jacobian matrix of the output w.r.t. the input. I have written the following code: data = pd.read_excel('filename') a = data[:20] #shape is (20,5) b = data[50:70] #shape is (20,5) A =…
Pradyumna
  • 13
  • 4
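A minimal sketch of the Jacobian-of-output-w.r.t.-input calculation for a recurrent model, assuming a small stand-in LSTM rather than the asker's trained network; passing experimental_use_pfor=False to tape.jacobian is one hedge that often quiets the pfor/while_loop conversion warnings, at some cost in speed:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for the asker's trained LSTM (input shape (20, 5)).
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(8),
    tf.keras.layers.Dense(1),
])

x = tf.convert_to_tensor(np.random.rand(1, 20, 5), dtype=tf.float32)

with tf.GradientTape() as tape:
    tape.watch(x)                    # x is a plain Tensor, so it must be watched
    y = model(x, training=False)     # shape (1, 1)

# Falling back from pfor to a plain while_loop sidesteps the conversion
# warnings that the vectorized path sometimes emits for RNN ops.
jac = tape.jacobian(y, x, experimental_use_pfor=False)
print(jac.shape)                     # (1, 1, 1, 20, 5)
```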
0
votes
3 answers

Using Gradient Tape for Jacobian of LSTM model - Python

I am building a sequence-to-one prediction model using an LSTM. My data has 4 input variables and 1 output variable that needs to be predicted. The data is time-series data with a total length of 38265 timesteps. The…
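For a sequence-to-one setup like this, tape.batch_jacobian gives one Jacobian per sample. A minimal sketch with made-up shapes (30 timesteps, 4 features) standing in for the asker's 38265-step series:

```python
import tensorflow as tf

timesteps, features = 30, 4          # illustrative shapes only
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1),
])

x = tf.random.normal((8, timesteps, features))

with tf.GradientTape() as tape:
    tape.watch(x)                    # inputs are Tensors, not Variables
    y = model(x, training=False)     # shape (8, 1)

# batch_jacobian treats the leading axis as the batch, returning the
# per-sample Jacobians with shape (8, 1, timesteps, features).
jac = tape.batch_jacobian(y, x)
print(jac.shape)
```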
0
votes
1 answer

Getting None from GradientTape.gradient in TensorFlow

I tried the following code: from d2l import tensorflow as d2l import tensorflow as tf @tf.function def corr2d(X, k, Y): #@save """Compute 2D cross-correlation.""" with tf.GradientTape() as tape: for i in range(Y.shape[0]): …
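gradient() returns None whenever the target is not connected to the source through TensorFlow ops that the tape recorded. A minimal sketch (not the asker's exact d2l corr2d) that keeps the cross-correlation differentiable by assembling the output with tf.stack:

```python
import tensorflow as tf

def corr2d(X, k):
    """2D cross-correlation built only from differentiable TF ops."""
    h, w = k.shape
    rows = []
    for i in range(X.shape[0] - h + 1):
        cols = []
        for j in range(X.shape[1] - w + 1):
            cols.append(tf.reduce_sum(X[i:i + h, j:j + w] * k))
        rows.append(tf.stack(cols))
    return tf.stack(rows)

X = tf.random.uniform((6, 6))
k = tf.Variable(tf.random.uniform((2, 2)))   # trainable, so watched automatically

with tf.GradientTape() as tape:
    Y = corr2d(X, k)
    loss = tf.reduce_sum(Y)

print(tape.gradient(loss, k))                # a (2, 2) tensor, not None
```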
0
votes
1 answer

Checking condition in call method of custom layer using tf.cond()

I'm implementing a custom layer in tensorflow 2.x . My requirement is such that, the program should check a condition before returning the output. class SimpleRNN_cell(tf.keras.layers.Layer): def __init__(self, M1, M2, fi=tf.nn.tanh,…
Lawhatre
  • 1,302
  • 2
  • 10
  • 28
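A minimal sketch of gating a custom layer's output on a runtime condition with tf.cond; the layer below is a hypothetical stand-in, not the asker's SimpleRNN_cell:

```python
import tensorflow as tf

class ThresholdedDense(tf.keras.layers.Layer):
    """Hypothetical layer that picks its activation with tf.cond."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.dense = tf.keras.layers.Dense(units)

    def call(self, inputs):
        h = self.dense(inputs)
        # tf.cond evaluates a scalar boolean tensor and runs one branch;
        # both branches must return tensors of matching shape and dtype.
        return tf.cond(tf.reduce_mean(h) > 0.0,
                       lambda: tf.nn.relu(h),
                       lambda: tf.nn.tanh(h))

layer = ThresholdedDense(4)
print(layer(tf.random.normal((2, 3))).shape)   # (2, 4)
```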
0
votes
1 answer

Is it possible to acquire an intermediate gradient? (Tensorflow)

When using GradientTape, you can calculate the gradient after running: with tf.GradientTape() as tape: out = model(x, training=True) out = tf.reshape(out, (num_img, 1, 10)) # Resizing loss =…
user13906837
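Intermediate tensors computed inside the tape's context are valid gradient sources, so the gradient of the loss with respect to an activation can be read off directly. A minimal sketch with a hypothetical two-stage model:

```python
import tensorflow as tf

backbone = tf.keras.layers.Dense(8, activation="relu")
head = tf.keras.layers.Dense(1)

x = tf.random.normal((4, 3))
y = tf.random.normal((4, 1))

with tf.GradientTape() as tape:
    tape.watch(x)
    hidden = backbone(x)             # intermediate tensor produced on the tape
    out = head(hidden)
    loss = tf.reduce_mean(tf.square(out - y))

# Any tensor on the recorded path can be used as a source.
d_hidden = tape.gradient(loss, hidden)
print(d_hidden.shape)                # (4, 8)
```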
0
votes
1 answer

How do I access Tensor values (e.g. Metrics) which are updated within a tf.function?

I have been working on a model whose training loop uses a tf.function wrapper (I get OOM errors when running eagerly), and training seems to be running fine. However, I am not able to access the tensor values returned by my custom training function…
sbab94
  • 1
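One common pattern is to keep tf.keras.metrics objects outside the tf.function, update them inside the step, and read .result() (or a returned tensor) back in eager code. A minimal sketch with a hypothetical model and data:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.build((None, 4))
optimizer = tf.keras.optimizers.Adam()
train_loss = tf.keras.metrics.Mean(name="train_loss")   # lives outside the function

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x) - y))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    train_loss.update_state(loss)     # metric state survives the tf.function
    return loss

loss = train_step(tf.random.normal((8, 4)), tf.random.normal((8, 1)))

# Back in eager code both the returned tensor and the metric are concrete.
print(float(loss), float(train_loss.result()))
```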
0
votes
1 answer

Tensorflow gradient always gives None when using GradientTape

I was playing around and trying to implement my own loss function in TensorFlow but I always get None gradients. To reproduce the problem I've now reduced my program to a minimal example. I define a very simple model: import tensorflow as tf model…
man zet
  • 826
  • 9
  • 26
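The usual fix when a hand-rolled loss yields None gradients is to make sure both the forward pass and the loss run inside the tape and use only TensorFlow ops (no .numpy() round-trips). A minimal sketch with a hypothetical model and loss:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])

def my_loss(y_true, y_pred):
    # Pure TF ops keep the computation on the tape.
    return tf.reduce_mean(tf.abs(y_true - y_pred))

x = tf.random.normal((5, 3))
y = tf.random.normal((5, 1))

with tf.GradientTape() as tape:
    pred = model(x, training=True)    # forward pass inside the tape
    loss = my_loss(y, pred)

grads = tape.gradient(loss, model.trainable_variables)
print([g.shape for g in grads])       # real gradients, not [None, None]
```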
0
votes
1 answer

GradientTape not computing gradient

I understand that so long as I define a computation inside the tf.GradientTape() context, the gradient tape will compute the gradient w.r.t. all the variables that the output of the computation depends on. However, I think I am not quite grasping the…
figs_and_nuts
  • 4,870
  • 2
  • 31
  • 56
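A minimal sketch of what the tape does and does not record: trainable variables are watched automatically wherever they were created, plain tensors only after watch(), and only operations executed inside the context are taped (persistent=True just allows multiple gradient() calls here):

```python
import tensorflow as tf

v = tf.Variable(2.0)        # trainable variable: watched automatically
c = tf.constant(3.0)        # plain tensor: not watched by default

y_outside = v * c           # computed before the tape opens: not recorded

with tf.GradientTape(persistent=True) as tape:
    tape.watch(c)
    y_inside = v * c

print(tape.gradient(y_inside, v))    # tf.Tensor(3.0, ...)
print(tape.gradient(y_inside, c))    # tf.Tensor(2.0, ...), thanks to watch()
print(tape.gradient(y_outside, v))   # None: that op was never taped
```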
0
votes
0 answers

Gradient Calculation in Tensorflow using GradientTape - Getting unexpected None value

I am having a problem calculating the gradient in TensorFlow 1.15. I think it's something related to the context manager or the Keras session, but I am not sure. Following is the code I have written: def create_adversarial_pattern_CW(input_patch,…
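Without the full function it is hard to say where the graph breaks, but the generic eager-style recipe for an input-space gradient is to watch the input patch and keep both the forward pass and the loss inside the tape. A hedged sketch with a hypothetical model and target, not the asker's CW setup:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Flatten(), tf.keras.layers.Dense(10)])
input_patch = tf.random.uniform((1, 28, 28, 1))
target = tf.one_hot([3], depth=10)

with tf.GradientTape() as tape:
    tape.watch(input_patch)                      # the patch is a Tensor, not a Variable
    logits = model(input_patch, training=False)
    loss = tf.nn.softmax_cross_entropy_with_logits(labels=target, logits=logits)

grad = tape.gradient(loss, input_patch)
print(grad.shape)                                # (1, 28, 28, 1), not None
```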
0
votes
1 answer

Jacobian matrix of logits with respect to image using tf.GradientTape

I am trying to find the Jacobian of the logits with respect to the input, but I get None and I cannot figure out why. Let's say I have a model that I trained and saved. import tensorflow as tf print("TensorFlow version: ",…
ARAT
  • 884
  • 1
  • 14
  • 35
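The most common reason for None here is that the input image is a plain tensor that was never watched. A minimal sketch, with a freshly built classifier standing in for the asker's saved-and-reloaded model:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Flatten(), tf.keras.layers.Dense(10)])
image = tf.random.uniform((1, 28, 28, 1))   # a Tensor, not a Variable

with tf.GradientTape() as tape:
    tape.watch(image)                # without this, jacobian() comes back None
    logits = model(image, training=False)

jac = tape.jacobian(logits, image)
print(jac.shape)                     # (1, 10, 1, 28, 28, 1)
```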
0
votes
3 answers

How to save and load a model with tf.GradientTape in TensorFlow 2

I am using tf.GradientTape for model training, and it successfully saves checkpoints for every epoch. with train_summary_writer.as_default(): with tf.summary.record_if(True): for epoch in range(epochs): for train_id in…
Brian Lee
  • 173
  • 3
  • 14
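For a custom GradientTape loop, tf.train.Checkpoint plus CheckpointManager is the standard way to save per-epoch state and restore it later; the model, optimizer, and directory below are hypothetical:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.Adam()
ckpt = tf.train.Checkpoint(model=model, optimizer=optimizer)
manager = tf.train.CheckpointManager(ckpt, "./ckpts", max_to_keep=3)

def train_step(x, y):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x) - y))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

for epoch in range(2):
    train_step(tf.random.normal((8, 4)), tf.random.normal((8, 1)))
    manager.save()                    # one checkpoint per epoch

# Later (or in a separate script): restore both model and optimizer state.
ckpt.restore(manager.latest_checkpoint)
```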
0
votes
1 answer

Error when working with GradientTape() and jacobian() in Tensorflow 2.0

I am working with GradientTape() and jacobian() in Tensorflow 2.0 in Python. This code executes fine: x = tf.Variable(2.0, dtype=tf.float32) with tf.GradientTape() as gT: gT.watch(x) g = tf.convert_to_tensor([x, 0.0], dtype=tf.float32) dg =…
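A minimal sketch of the working pattern from the excerpt, using tf.stack to assemble the vector so its dependence on the watched variable stays on the tape:

```python
import tensorflow as tf

x = tf.Variable(2.0, dtype=tf.float32)

with tf.GradientTape() as gT:
    gT.watch(x)                               # redundant for a Variable, but harmless
    g = tf.stack([x, tf.constant(0.0)])       # shape (2,); only entry 0 depends on x

dg = gT.jacobian(g, x)
print(dg)                                     # [1., 0.]
```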