
With TensorFlow, I've made `dataset = tf.data.TFRecordDataset(filename)` and `iterator = dataset.make_one_shot_iterator()`. Then each call to `iterator.get_next()` yields a mini-batch of data as input.
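A minimal runnable sketch of this setup (an in-memory dataset stands in for the TFRecord file, and the TF 1.x graph-mode API is reached through `tf.compat.v1` so it also runs under TF 2):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # run in graph mode, as in TF 1.x

# Stand-in for tf.data.TFRecordDataset(filename); same iterator mechanics.
dataset = tf.data.Dataset.from_tensor_slices([1.0, 2.0, 3.0, 4.0]).batch(2)
iterator = tf.compat.v1.data.make_one_shot_iterator(dataset)
next_batch = iterator.get_next()

with tf.compat.v1.Session() as sess:
    first = sess.run(next_batch)   # each sess.run advances the iterator
    second = sess.run(next_batch)  # a different batch than the first call
```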

I am training a network with a Dropout layer, so I'm supposed to write something like this:

sess.run(train_op, feed_dict={keep_prob: 0.5})
accuracy, loss_val = sess.run([acc, loss], feed_dict={keep_prob: 1.0})

in which keep_prob represents the probability of keeping a neuron alive, which differs between training and testing (here, evaluation).

The problem is that each sess.run() triggers iterator.get_next() to fetch a new batch of input, which is not what it is supposed to do here.

What should I do if I want these two sess.run() calls to receive the same input tensors?

Thank you very much :-)

Woody. Wang

1 Answer


I've just been pointed to a place where you can find the answer to this question.

The main idea is to create the iterator with tf.data.Iterator.from_structure() instead of dataset.make_one_shot_iterator(), and to initialize it for the training, validation, and test datasets separately.
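A sketch of that reinitializable-iterator pattern (the small in-memory `train_data` / `val_data` datasets are placeholders for the real TFRecord datasets; `tf.compat.v1` exposes the 1.x graph-mode API):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Two datasets with the same structure (dtypes and shapes).
train_data = tf.data.Dataset.from_tensor_slices([1.0, 2.0, 3.0, 4.0]).batch(2)
val_data = tf.data.Dataset.from_tensor_slices([10.0, 20.0]).batch(2)

# One iterator built from the common structure, not tied to either dataset.
iterator = tf.compat.v1.data.Iterator.from_structure(
    tf.compat.v1.data.get_output_types(train_data),
    tf.compat.v1.data.get_output_shapes(train_data))
next_batch = iterator.get_next()

# Separate initializer ops, one per dataset.
train_init = iterator.make_initializer(train_data)
val_init = iterator.make_initializer(val_data)

with tf.compat.v1.Session() as sess:
    sess.run(train_init)              # point the iterator at training data
    train_batch = sess.run(next_batch)
    sess.run(val_init)                # switch to validation data
    val_batch = sess.run(next_batch)
```

Because training and evaluation share one `next_batch` tensor, you can also fetch several ops in a single sess.run() call so they all consume the same batch.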

Woody. Wang