I'd like to train a classifier on the ImageNet dataset (1000 classes, each with around 1300 images). For a particular reason, I need each batch to contain 64 images from one specific class (provided as a Python int or a placeholder). How can I do this efficiently with the latest TensorFlow?

This is a follow-up question to How to sample batch from only one class at each iteration.

My current thought is to use tf.data.Dataset.filter:

specific_class = 2  # as an example

dataset = tf.data.TFRecordDataset(filenames)
# __parser_fun__ produces datum tuple (x, y)
dataset = dataset.map(__parser_fun__, num_parallel_calls=num_threads)
dataset = dataset.shuffle(20000)
# print(dataset) gives <ShuffleDataset shapes: ((3, 128, 128), (1,)), 
# types: (tf.float32, tf.int64)>

dataset = dataset.filter(lambda x, y: tf.equal(y[0], specific_class))
dataset = dataset.batch(64)
dataset = dataset.repeat()
iterator = dataset.make_one_shot_iterator()
x_batch, y_batch = iterator.get_next()

A minor problem with filter is that I need to construct a new iterator every time I want to sample from a different class.

Another idea is to use tf.contrib.data.rejection_resample, but it seems computationally prohibitive (or is it?).

Is there another efficient way to sample batches from a particular class?
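For reference, here is the kind of per-class batching I am after, sketched in plain Python over hypothetical in-memory labels (not a tf.data solution, just to pin down the semantics): build an index list per class once, then draw 64 example indices from the requested class at each step.

```python
import random
from collections import defaultdict

def build_class_index(labels):
    """Map each class label to the list of example indices with that label."""
    index = defaultdict(list)
    for i, y in enumerate(labels):
        index[y].append(i)
    return index

def sample_batch(class_index, specific_class, batch_size=64, rng=random):
    """Draw batch_size example indices uniformly (with replacement) from one class."""
    pool = class_index[specific_class]
    return [rng.choice(pool) for _ in range(batch_size)]

# Hypothetical labels: four examples of class 1, four of class 2.
labels = [1, 2, 1, 2, 1, 2, 1, 2]
class_index = build_class_index(labels)
batch = sample_batch(class_index, specific_class=2, batch_size=64)
assert all(labels[i] == 2 for i in batch)
```

Switching classes here is just a different dictionary lookup, which is the behavior I'd like to reproduce inside the input pipeline.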

Richard_wth

1 Answer

Conceptually, your Dataset is parameterized by a variable (the label to sample), and that is totally doable: keep the label in a tf.Variable and read it inside the filter predicate.

Executing eagerly:

import numpy as np
import tensorflow as tf
tf.enable_eager_execution()

data = dict(
    x=tf.constant([1., 2., 3., 4.]),
    y=tf.constant([1, 2, 1, 2])
)

requested_label = tf.Variable(1)
dataset = (
    tf.data.Dataset.from_tensor_slices(data)
    .repeat()
    .filter(lambda d: tf.equal(d["y"], requested_label)))


it = dataset.make_one_shot_iterator()
for i, datum in enumerate(it):
  assert int(datum["y"]) == 1
  assert float(datum["x"]) in [1., 3.]
  if i > 5:
    break

requested_label.assign(2)

for i, datum in enumerate(it):
  assert int(datum["y"]) == 2
  assert float(datum["x"]) in [2., 4.]
  if i > 5:
    break

Graph building:

import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
  data = dict(
      x=tf.constant([1., 2., 3., 4.]),
      y=tf.constant([1, 2, 1, 2])
  )

  requested_label = tf.Variable(1)
  dataset = (
      tf.data.Dataset.from_tensor_slices(data)
      .repeat()
      .filter(lambda d: tf.equal(d["y"], requested_label)))


  it = dataset.make_initializable_iterator()
  datum_tensors = it.get_next()
  switch_label_op = requested_label.assign(2)

  graph.finalize()
  with tf.Session() as session:
    session.run(requested_label.initializer)  # label=1
    session.run(it.initializer)
    for _ in range(5):
      datum = session.run(datum_tensors)
      assert int(datum["y"]) == 1
      assert float(datum["x"]) in [1., 3.]

    session.run(switch_label_op)  # label=2

    for _ in range(5):
      datum = session.run(datum_tensors)
      assert int(datum["y"]) == 2
      assert float(datum["x"]) in [2., 4.]
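To make the key point explicit without TensorFlow, here is the same pattern in plain Python (a sketch with made-up data): because the filter predicate re-reads a mutable label every time it runs, reassigning that label redirects the stream mid-iteration, with no need to rebuild the iterator.

```python
import itertools

# Mutable state playing the role of the tf.Variable.
state = {"requested_label": 1}

data = [(1.0, 1), (2.0, 2), (3.0, 1), (4.0, 2)]  # (x, y) pairs

# An endless stream filtered by a predicate that re-reads the label each step,
# analogous to Dataset.repeat().filter(...) with a tf.Variable in the lambda.
stream = (d for d in itertools.cycle(data)
          if d[1] == state["requested_label"])

first = [next(stream) for _ in range(4)]
assert all(y == 1 for _, y in first)

state["requested_label"] = 2  # analogous to requested_label.assign(2)

second = [next(stream) for _ in range(4)]
assert all(y == 2 for _, y in second)
```

The iterator object never changes; only the state it consults does, which is exactly why the tf.Variable version above doesn't need a new iterator per class.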
Allen Lavoie
  • Thank you very much for your detailed answer. However, yours also used `tf.data.Dataset.filter` and is indeed the same as the example I provided. – Richard_wth Sep 25 '18 at 04:07
  • You asked about a way to sample from different classes without recreating an iterator. If this isn't what you want, you'll need to edit more detail into your question. – Allen Lavoie Sep 25 '18 at 17:09
  • Sorry, my fault, I did not notice `requested_label` is a `tf.Variable`. Thanks again for your answer. – Richard_wth Sep 25 '18 at 23:53
  • No worries, I should probably add more non-code explanation. – Allen Lavoie Sep 26 '18 at 18:18