
I want to use the tf.keras.TimeDistributed() layer with the tf.hub inception_v3 CNN model from the latest TensorFlow 2 version (tf-nightly-gpu-2.0-preview). The output is shown below. It seems that tf.keras.TimeDistributed() is not fully implemented to work with tf.hub models. Somehow, the output shape of the wrapped layer cannot be computed. My question: is there a workaround for this problem?

tf.keras.TimeDistributed works fine with regular tf.keras layers. I just want to apply the CNN model to each time step.

Model

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import layers, Model

model_url = "https://tfhub.dev/google/tf2-preview/inception_v3/feature_vector/3"

feature_layer = hub.KerasLayer(model_url,
                               input_shape=(299, 299, 3),
                               output_shape=[2048],
                               trainable=False)

video = layers.Input(shape=(None, 299, 299, 3))
encoded_frames = layers.TimeDistributed(feature_layer)(video)
model = Model(inputs=video, outputs=encoded_frames)

Expected output

tf.keras model

Error messages

File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/base_layer.py", line 489, in compute_output_shape
    raise NotImplementedError
NotImplementedError

2 Answers


Wrapper layers like TimeDistributed require a layer instance to be passed. If you build the model out of custom callables, you'll need to at least wrap them in tf.keras.layers.Lambda. This might not be possible in your case of a model loaded via hub.KerasLayer, so you might consider the solutions posted here:

TimeDistributed of a KerasLayer in Tensorflow 2.0
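To illustrate the Lambda-wrapping idea, here is a minimal sketch with a toy function standing in for the hub model (the doubling function and the shapes are placeholders, not part of the original question):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# A plain callable is not a layer instance; wrapping it in Lambda
# makes it acceptable to TimeDistributed.
frames = layers.Input(shape=(None, 4))  # (batch, time, features)
doubled = layers.TimeDistributed(layers.Lambda(lambda x: x * 2.0))(frames)
model = Model(frames, doubled)

print(model.output_shape)  # (None, None, 4)
```

This works because Lambda is a proper Keras layer whose output shape Keras can infer by tracing the function.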

MarkV

In TensorFlow 2 it is possible to use custom layers in combination with the TimeDistributed layer. The error is thrown because TimeDistributed cannot compute the wrapped layer's output shape (see here): hub.KerasLayer does not implement compute_output_shape, so the base-class implementation raises NotImplementedError.

So in your case you should be able to subclass hub.KerasLayer and implement compute_output_shape manually.
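A minimal sketch of that pattern, using a stand-in callable (global average pooling) instead of the real hub.KerasLayer so the example is self-contained; the class name, the toy CNN, and the small input shapes are all illustrative assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

class FeatureLayerWithShape(layers.Layer):
    """Wraps a per-frame feature extractor and implements
    compute_output_shape so TimeDistributed can infer shapes."""

    def __init__(self, feature_fn, output_dim, **kwargs):
        super().__init__(**kwargs)
        self.feature_fn = feature_fn  # e.g. a hub.KerasLayer instance
        self.output_dim = output_dim

    def call(self, inputs):
        return self.feature_fn(inputs)

    def compute_output_shape(self, input_shape):
        # (batch, H, W, C) -> (batch, output_dim)
        return tf.TensorShape([input_shape[0], self.output_dim])

# Stand-in for the inception_v3 feature extractor: pool each
# frame down to a fixed-length vector over the spatial axes.
fake_cnn = lambda x: tf.reduce_mean(x, axis=[1, 2])

video = layers.Input(shape=(None, 8, 8, 3))  # (batch, time, H, W, C)
frame_features = FeatureLayerWithShape(fake_cnn, output_dim=3)
encoded = layers.TimeDistributed(frame_features)(video)
model = Model(video, encoded)

print(model.output_shape)  # (None, None, 3)
```

With the real hub model you would pass the hub.KerasLayer instance as feature_fn (or subclass hub.KerasLayer directly) and set output_dim to 2048.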

Josef