
I'm trying to create a custom neural network model in TensorFlow 2.0. I'm aware that the TF 2.0 community repeatedly advises building custom models from the existing modules in the Functional API wherever possible.

However, my model requires one hidden layer whose neurons have different activation functions. For example, I want a hidden layer with three neurons, one of which is linear and the other two sigmoid. The final model is just this layer stacked some N times.

There is no layer in the tf.keras.layers module that implements this directly. Is there a way to implement it myself with a class definition like MyDenseLayer(tf.keras.layers.Layer)? Then it would be easy for me to build the complete model by stacking this custom-defined layer.
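For concreteness, here is a rough sketch of the kind of subclassed layer I have in mind (MixedActivationDense and all of its details are placeholder names I'm imagining, not working code I've settled on):

```python
import tensorflow as tf

class MixedActivationDense(tf.keras.layers.Layer):
    """Dense layer whose units each get their own activation (sketch)."""
    def __init__(self, activations, **kwargs):
        super().__init__(**kwargs)
        # One activation per neuron, e.g. ['linear', 'sigmoid', 'sigmoid']
        self.acts = [tf.keras.activations.get(a) for a in activations]
        self.dense = tf.keras.layers.Dense(len(self.acts))

    def call(self, inputs):
        z = self.dense(inputs)          # shape (batch, units)
        cols = tf.unstack(z, axis=1)    # one column per neuron
        return tf.stack([a(c) for c, a in zip(cols, self.acts)], axis=1)

layer = MixedActivationDense(['linear', 'sigmoid', 'sigmoid'])
y = layer(tf.zeros((4, 10)))
print(y.shape)  # (4, 3)
```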

1 Answer


You can do the following:

import tensorflow as tf
from tensorflow.keras.layers import Input, Lambda, Dense
from tensorflow.keras.models import Model

# One activation per neuron of the Dense layer
activations = [tf.keras.activations.linear,
               tf.keras.activations.sigmoid,
               tf.keras.activations.sigmoid]

def f(x):
  # Split the (batch, 3) tensor into per-neuron columns, apply each
  # column's activation, and stack the results back into one tensor
  return tf.stack([a(c) for c, a in zip(tf.unstack(x, axis=1), activations)],
                  axis=1)

inp = Input(shape=(10,))
out = Dense(3)(inp)
out = Lambda(f)(out)
model = Model(inputs=inp, outputs=out)

Explanation:

The trick is in the Lambda layer.

  • First split the Dense output along axis=1 into one tensor per neuron.
  • Zip these with the list of activations and apply the corresponding activation to each.
  • Stack the activated outputs back into a single tensor.
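Since the question also asks about repeating this layer N times, the Dense-plus-Lambda pair can be wrapped in a small helper and applied in a loop. This is a sketch; mixed_block is a name introduced here and N = 4 is an arbitrary illustrative choice:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Lambda, Dense
from tensorflow.keras.models import Model

# One activation per neuron, as in the answer above
ACTS = [tf.keras.activations.linear,
        tf.keras.activations.sigmoid,
        tf.keras.activations.sigmoid]

def mixed_block(x):
    # Dense followed by per-neuron activations
    z = Dense(len(ACTS))(x)
    return Lambda(lambda t: tf.stack(
        [a(c) for c, a in zip(tf.unstack(t, axis=1), ACTS)], axis=1))(z)

inp = Input(shape=(10,))
out = inp
for _ in range(4):          # N = 4 stacked blocks
    out = mixed_block(out)
model = Model(inputs=inp, outputs=out)

y = model.predict(tf.zeros((2, 10)))
print(y.shape)  # (2, 3)
```

Each block maps its input to 3 units, so after the first block every subsequent block sees a (batch, 3) input and the final output stays (batch, 3).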
thushv89