I have two unnamed op tensors, logits and outputs, under a variable scope, but the lt command isn't listing these two tensors under the ops 'MatMul' and 'Softmax' during a tfdbg session after a test run on a checkpoint. Here is a snapshot of the code:
import tensorflow as tf

# dropout, dense and softmax_mask are helper functions defined elsewhere in the project
with tf.variable_scope(scope):
    d_inputs = dropout(inputs, keep_prob=keep_prob, is_train=is_train)
    d_memory = dropout(memory, keep_prob=keep_prob, is_train=is_train)
    JX = tf.shape(inputs)[1]
    with tf.variable_scope("attention"):
        inputs_ = tf.nn.relu(dense(d_inputs, hidden, use_bias=False, scope="inputs"))
        memory_ = tf.nn.relu(dense(d_memory, hidden, use_bias=False, scope="memory"))
        outputs = tf.matmul(inputs_, tf.transpose(memory_, [0, 2, 1])) / (hidden ** 0.5)
        mask = tf.tile(tf.expand_dims(mask, axis=1), [1, JX, 1])
        # The first tensor in question
        logits = tf.nn.softmax(softmax_mask(outputs, mask))
        # And the second tensor in question
        outputs = tf.matmul(logits, memory)
        res = tf.concat([inputs, outputs], axis=2)
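For context, my understanding is that unnamed ops get auto-generated names from their enclosing scopes, so I expected to be able to look them up by name; the exact names below are my guess:

# Sketch of what I expected to work, assuming auto-generated op names
# ("MatMul_1" because the scaled dot product above already creates a "MatMul" op):
graph = tf.get_default_graph()
logits_t = graph.get_tensor_by_name(scope + "/attention/Softmax:0")
outputs_t = graph.get_tensor_by_name(scope + "/attention/MatMul_1:0")

# Correspondingly, in the tfdbg CLI I would expect a node-name filter like
#   tfdbg> lt -n .*attention.*
# to match them, but it doesn't.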
What can I do to retrieve these tensors for testing purposes in tfdbg?
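One workaround I'm considering is passing explicit name= arguments to the two ops so they are easier to find, though I don't know whether that is the right approach for tfdbg:

# Sketch of the naming workaround (untested); "att_logits"/"att_outputs" are arbitrary names of mine
logits = tf.nn.softmax(softmax_mask(outputs, mask), name="att_logits")
outputs = tf.matmul(logits, memory, name="att_outputs")
# These should then appear as <scope>/attention/att_logits etc. in the lt output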
Alternatively, can I retrieve them in a normal TensorFlow session by using tf.add_to_collection(op_name, tensor), as mentioned in this answer?
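If so, I imagine it would look roughly like this (a sketch; the collection keys, saver, checkpoint_path and my_feed are placeholders of mine):

# At graph-construction time:
tf.add_to_collection("att_logits", logits)
tf.add_to_collection("att_outputs", outputs)

# Later, at test time, after restoring the checkpoint:
logits_t = tf.get_collection("att_logits")[0]
outputs_t = tf.get_collection("att_outputs")[0]
with tf.Session() as sess:
    saver.restore(sess, checkpoint_path)
    logits_val, outputs_val = sess.run([logits_t, outputs_t], feed_dict=my_feed)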