
I am using the Inception v1 architecture for transfer learning. I have downloaded the checkpoint file, the network definitions, and the preprocessing file from the GitHub repository below:

https://github.com/tensorflow/models/tree/master/slim

I have 3700 images. For each image I extract the activations of the last pooling layer from the graph and append them to a list. With every iteration the RAM usage increases, and the run is finally killed at around 2000 images. Can you tell me what mistake I have made?

https://github.com/Prakashvanapalli/TensorFlow/blob/master/Transfer_Learning/inception_v1_finallayer.py

Even if I remove the list appending and just print the results, this still happens. I guess the mistake is in the way I am calling the graph. When I watch my RAM usage, it grows with every iteration, and I don't know why, since I am not saving anything and each iteration does the same thing as the first.

From my point of view, I am just sending one image, getting the outputs, and saving them. So it should work regardless of how many images I send.
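I can't run the linked script here, but the usual cause of exactly this symptom is building *new* graph ops inside the per-image loop (e.g. wrapping each image in `tf.constant` or calling the preprocessing function per image). A minimal sketch that reproduces the growth, using `tf.compat.v1` graph mode and `tf.reduce_mean` as a stand-in for the real network (an assumption, not your actual code):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    sess = tf.Session()
    sizes = []
    for i in range(3):
        # Each iteration adds NEW ops to the graph -- they are never freed,
        # so memory grows with every image processed this way.
        img = tf.constant(np.zeros((224, 224, 3), np.float32))
        pooled = tf.reduce_mean(img)  # stand-in for the real pooling layer
        sess.run(pooled)
        sizes.append(len(graph.get_operations()))

print(sizes)  # op count is strictly increasing
```

Watching `len(graph.get_operations())` across iterations is a quick way to confirm whether this is what your script is doing.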

I have tried this on both a GPU (6 GB) and a CPU (32 GB RAM).

Prakash Vanapalli

1 Answer


You seem to be storing images in your graph as tf.constants. These will be persistent, and will cause memory issues like you're experiencing. Instead, I would recommend either placeholders or queues. Queues are very flexible, and can be very high performance, but can also get quite complicated. You may want to start with just a placeholder.
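A minimal sketch of the placeholder approach, in `tf.compat.v1` graph mode, with `tf.reduce_mean` standing in for the real Inception pooling layer (an assumption, since I don't have your graph): the graph is built once, and only numpy data flows through `feed_dict`, so the graph size stays constant no matter how many images you process.

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    # Build the graph ONCE, outside the loop.
    images_ph = tf.placeholder(tf.float32, [1, 224, 224, 3])
    pooled = tf.reduce_mean(images_ph, axis=[1, 2])  # stand-in for the pooling layer
    sess = tf.Session()
    n_ops = len(graph.get_operations())

    features = []
    for _ in range(3):  # in your case, loop over all 3700 images
        batch = np.random.rand(1, 224, 224, 3).astype(np.float32)
        # Only numpy data is fed in; no new ops are created per image.
        features.append(sess.run(pooled, feed_dict={images_ph: batch}))

    assert len(graph.get_operations()) == n_ops  # graph did not grow
```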

For a full-complexity example of an image input pipeline, you could look at the Inception model.
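For a taste of the queue option without the full pipeline, here is a minimal `tf.FIFOQueue` sketch in `tf.compat.v1` style (again with `tf.reduce_mean` as a hypothetical stand-in for the network):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    # Queue and network are built once; images are pushed in at run time.
    queue = tf.FIFOQueue(capacity=10, dtypes=[tf.float32],
                         shapes=[[224, 224, 3]])
    image_ph = tf.placeholder(tf.float32, [224, 224, 3])
    enqueue = queue.enqueue(image_ph)
    batch = tf.expand_dims(queue.dequeue(), 0)
    pooled = tf.reduce_mean(batch, axis=[1, 2])  # stand-in for the pooling layer
    sess = tf.Session()

    feats = []
    for _ in range(3):
        sess.run(enqueue,
                 feed_dict={image_ph: np.zeros((224, 224, 3), np.float32)})
        feats.append(sess.run(pooled))
```

In a real pipeline the enqueueing would be driven by reader ops and `tf.train.QueueRunner` threads rather than a Python loop, which is where the complexity mentioned above comes in.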

Allen Lavoie
  • 5,778
  • 1
  • 17
  • 26
  • The image I am sending to the inception preprocessing function needs to be a tensor. If I keep a placeholder and feed images to the network, sess.run would convert it into a numpy array, which will be rejected by the preprocessing function. How do I create a tensor from my np array without using a constant as you mentioned above? – Prakash Vanapalli Feb 08 '17 at 12:40
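On that comment: a placeholder is itself a `tf.Tensor`, so it can be passed straight into graph-building preprocessing; the numpy array never touches the preprocessing code directly and is only supplied at `sess.run` time via `feed_dict`. A minimal sketch, using generic `tf.image` ops as a hypothetical stand-in for `inception_preprocessing` (an assumption on my part):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    # A placeholder IS a tensor, so graph-building preprocessing accepts it.
    raw_ph = tf.placeholder(tf.uint8, [None, None, 3])
    img = tf.image.convert_image_dtype(raw_ph, tf.float32)  # uint8 -> [0, 1]
    img = tf.image.resize_images(img, [224, 224])  # stand-in for the real preprocessing
    batch = tf.expand_dims(img, 0)
    sess = tf.Session()

    # The numpy array only appears here, at run time, through feed_dict.
    out = sess.run(batch,
                   feed_dict={raw_ph: np.zeros((300, 400, 3), np.uint8)})
```

So the preprocessing graph is built once against the placeholder, and each numpy image is fed through it without creating any new ops.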