I saw an example of training on the CIFAR-10 dataset using TensorFlow: https://github.com/tensorflow/models/tree/master/tutorials/image/cifar10
The code assembles a batch of images from several single images using tf.train.batch, and then creates a queue of batches using prefetch_queue. I understand that queues are necessary to pre-fetch data when the training set is large. I assume tf.train.batch maintains a queue internally (since it has a capacity parameter). Given that tf.train.batch already maintains a queue, is it necessary to create another queue with tf.contrib.slim.prefetch_queue? What exactly does tf.contrib.slim.prefetch_queue do?
The key part of the CIFAR-10 example code is shown below:
import tensorflow as tf

# tf.train.batch maintains an internal queue of single examples
# (sized by `capacity`) and assembles them into batches.
# (Note: min_after_dequeue is a parameter of tf.train.shuffle_batch,
# not of tf.train.batch, so it is omitted here.)
images, labels = tf.train.batch(
    [image, label],
    batch_size=...,
    num_threads=...,
    capacity=...)

# prefetch_queue holds fully assembled batches in a second queue.
batch_queue = tf.contrib.slim.prefetch_queue.prefetch_queue(
    [images, labels],
    capacity=...)
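To make the question concrete, here is a minimal stdlib sketch (plain Python, not TensorFlow; all names are illustrative) of the two-stage pipeline I understand this code to build: a first queue of single examples that worker threads assemble into batches, feeding a second queue that holds ready-made batches so the consumer never waits on batch assembly:

```python
# Illustrative two-stage producer/consumer pipeline (stdlib only).
# Stage 1 (analogue of tf.train.batch): a worker thread pulls single
# examples from example_queue and groups them into batches.
# Stage 2 (analogue of prefetch_queue): assembled batches wait in
# batch_queue, ready for the consumer to dequeue immediately.
import queue
import threading

BATCH_SIZE = 4

example_queue = queue.Queue(maxsize=32)  # single examples ("capacity")
batch_queue = queue.Queue(maxsize=8)     # pre-fetched, assembled batches

def batcher():
    """Assemble BATCH_SIZE single examples into one batch, forever."""
    while True:
        batch = [example_queue.get() for _ in range(BATCH_SIZE)]
        batch_queue.put(batch)

threading.Thread(target=batcher, daemon=True).start()

# Producer: enqueue 8 single examples.
for i in range(8):
    example_queue.put(i)

# Consumer (the "training step"): dequeue two ready batches.
first = batch_queue.get()
second = batch_queue.get()
print(first, second)  # [0, 1, 2, 3] [4, 5, 6, 7]
```

With a single batcher thread the output order is deterministic; with several threads (as with num_threads > 1 in tf.train.batch) the grouping would vary between runs.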