
I am trying to do image classification using Keras's Xception model, modeled after this code. However, I want to use multiple GPUs to do batch-parallel image classification with this function. I believe it is possible, and I have the original code working without multi-GPU support, but I cannot get the multi_gpu_model function to work as I would expect. I am following this example for the multi-GPU setup. This is my code (it is the backend of a Flask app): it instantiates the model, makes a prediction on an example ndarray when the class is created, and then expects a base64-encoded image in the classify function:

import os
from keras.preprocessing import image as preprocess_image
from keras.applications import Xception
from keras.applications.inception_v3 import preprocess_input, decode_predictions
from keras.utils import multi_gpu_model
import numpy as np
import tensorflow as tf
import PIL.Image
from numpy import array


class ModelManager:

    def __init__(self, model_path):
        self.model_name = 'ImageNet'
        self.model_version = '1.0'
        self.batch_size = 32
        height = 224
        width = 224
        num_classes = 1000
        # self.model = tf.keras.models.load_model(os.path.join(model_path, 'ImageNetXception.h5'))
        with tf.device('/cpu:0'):
            model = Xception(weights=None,
                             input_shape=(height, width, 3),
                             classes=num_classes, include_top=True)
        # Replicates the model on 8 GPUs.
        # This assumes that your machine has 8 available GPUs.
        self.parallel_model = multi_gpu_model(model, gpus=8)
        self.parallel_model.compile(loss='categorical_crossentropy',
                                    optimizer='rmsprop')

        print("Loaded Xception model.")
        # Warm-up prediction on a dummy batch so the graph is fully built
        # before it is finalized below.
        x = np.empty((1, 224, 224, 3))
        self.parallel_model.predict(x, batch_size=self.batch_size)
        self.graph = tf.get_default_graph()
        self.graph.finalize()

    def classify(self, ids, images):
        results = []
        all_images = np.empty((0, 224, 224, 3))
        # all_images = []
        for image_id, image in zip(ids, images):
            # This does the same as keras.preprocessing.image.load_img
            image = image.convert('RGB')
            image = image.resize((224, 224), PIL.Image.NEAREST)

            x = preprocess_image.img_to_array(image)
            x = np.expand_dims(x, axis=0)
            x = preprocess_input(x)
            all_images = np.append(all_images, x, axis=0)
        # all_images.append(x)
        # a = array(all_images)
        # print(type(a))
        # print(a[0])

        with self.graph.as_default():
            preds = self.parallel_model.predict(all_images, batch_size=288)
        #print(type(preds))

        top3 = decode_predictions(preds, top=3)[0]
        print(top3)
        output = [((t[1],) + t[2:]) for t in top3]

        predictions = [
            {'label': label, 'probability': probability * 100.0}
            for label, probability in output
        ]

        results.append({
            'id': 1,
            'predictions': predictions
        })
        print(len(results))
        return results

The part I am not sure about is what to pass to the predict function. Currently I am creating an ndarray of the images I want classified, after they are preprocessed, and then passing that to the predict function. The function returns, but the preds variable doesn't hold what I expect. I tried to loop through the preds object, but decode_predictions errors when I pass a single item and responds with only one prediction when I pass the whole preds ndarray. In the example code they don't use the decode_predictions function, so I'm not sure how to use it with the response from parallel_model.predict. Any help or resources are appreciated, thanks.

Troy Zuroske

1 Answer


The following site illustrates how to do that correctly: link
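On the decode_predictions part specifically: it takes the whole (num_images, 1000) prediction array and returns one list of (class_id, label, probability) tuples per image, so you iterate over its result rather than indexing [0], which keeps only the first image. A minimal sketch, assuming the same variable names (ids, all_images, parallel_model) as in the question's code:

    preds = parallel_model.predict(all_images, batch_size=32)

    # decode_predictions returns a list with one entry per image;
    # each entry is a list of (class_id, label, probability) tuples.
    decoded = decode_predictions(preds, top=3)

    results = []
    for image_id, top3 in zip(ids, decoded):
        predictions = [
            {'label': label, 'probability': float(probability) * 100.0}
            for (_, label, probability) in top3
        ]
        results.append({'id': image_id, 'predictions': predictions})

Indexing decoded[0] (as in the posted code) discards every image's predictions except the first, which matches the behavior described in the question.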

Jamal Alkelani
  • That link is the second link I have in the question. Nowhere on that page is the decode_predictions function used, which is where I'm stuck. – Troy Zuroske May 27 '19 at 23:25