
I am trying to make a spatio-temporal graph convolutional network where a GCN layer is sandwiched between two temporal CNN layers. The code is as follows:

import tensorflow as tf
from tensorflow.keras.layers import Input, Flatten, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

inputs = Input(shape=(train_x.shape[1], train_x.shape[2], train_x.shape[3]), batch_size=None)
# temporal convolution
y = tf.keras.layers.Conv1D(128, 9, activation='relu')(inputs)
# graph convolution
y = tf.keras.layers.Conv2D(32, (1,1), activation='relu')(y)
n, v, t, kc = y.shape
y = tf.reshape(y,(n, 1, kc//1, t, v))
y = tf.einsum('nkctv,kvw->nwtc', y, AD_tensor)
# temporal convolution
y = tf.keras.layers.Conv1D(16, 9, activation='relu')(y)

concat = Flatten()(y)

fc = Dense(units=80, activation='relu')(concat)
fc1 = Dense(units=40, activation='relu')(fc)
fc2 = Dense(units=40, activation='relu')(fc1)
fc3 = Dense(units=80, activation='relu')(fc2)
out = Dense(1, activation = 'sigmoid')(fc3)

model = Model(inputs, out)
model.compile(loss='mse', optimizer= Adam(lr=0.0001))
model.fit(train_x, train_y, validation_data = (valid_x,valid_y), epochs=300, batch_size=2)

When I run this code, it gives me the following TypeError:

    TypeError: Failed to convert object of type <class 'tuple'> to Tensor.
    Contents: (None, 1, 32, 72, 25). Consider casting elements to a supported type.
1 Answer

To use TensorFlow operations with Keras layers, you should wrap them in a Lambda layer. The Lambda layer takes the function to apply as its argument:

y = tf.keras.layers.Lambda(lambda x: tf.reshape(x, (-1, v, t, kc)))(y)

However, Keras already provides a layer for reshaping, so you could simply do

y = tf.keras.layers.Reshape((v, t, kc))(y)

The layer version of reshaping already takes into account the batch dimension, so you only need to specify the other dimensions.
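
For example (the layer sizes here are just illustrative, not taken from your model):

x = tf.keras.Input(shape=(64, 25, 32))        # batch dimension is implicit (None)
r = tf.keras.layers.Reshape((25, 64, 32))(x)  # no batch size in the target shape
print(r.shape)                                # (None, 25, 64, 32)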

For the einsum operation, you can use

y = tf.keras.layers.Lambda(lambda x: tf.einsum('nkctv,kvw->nwtc', x[0], x[1]))([y, AD_tensor])
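
Putting both pieces together, a minimal sketch of your graph convolution block could look like the following (assuming, as in your error message, that y has shape (None, 25, 72, 32) at this point and that AD_tensor has shape (1, 25, w)):

_, v, t, kc = y.shape                          # ignore the None batch dimension
y = tf.keras.layers.Reshape((1, kc, t, v))(y)  # -> (None, 1, 32, 72, 25)
y = tf.keras.layers.Lambda(
    lambda x: tf.einsum('nkctv,kvw->nwtc', x[0], x[1]))([y, AD_tensor])

If AD_tensor is a fixed adjacency tensor rather than a model input, you can also capture it in the lambda's closure instead of passing it in the input list, e.g. Lambda(lambda x: tf.einsum('nkctv,kvw->nwtc', x, AD_tensor))(y).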