
I am creating a custom activation function, an RBF activation function in particular:

from keras import backend as K
from keras.layers import Lambda

gamma = 1.0  # scalar hyper-parameter of the RBF (I am using gamma = 1)

l2_norm = lambda a, b: K.sqrt(K.sum(K.pow((a - b), 2), axis=0, keepdims=True))

def rbf2(x):
    X = ...  # here I need the inputs that I receive from the previous layer
    Y = ...  # here I need the weights that I should apply for this layer
    l2 = l2_norm(X, Y)
    res = K.exp(-1 * gamma * K.pow(l2, 2))
    return res

The function rbf2 receives the output of the previous layer as its input:

# some Keras layers
model.add(Dense(84, activation='tanh'))  # layer1
model.add(Dense(10, activation=rbf2))    # layer2

What should I do to get the inputs from layer1 and weights from layer2 to create the customized activation function?

What I am actually trying to do is implement the output layer for the LeNet-5 neural network. The output layer of LeNet-5 is a bit special: instead of computing the dot product of the inputs and the weight vector, each neuron outputs the square of the Euclidean distance between its input vector and its weight vector.

For example, layer1 has 84 neurons and layer2 has 10 neurons. In the usual case, to calculate the output of each of the 10 neurons of layer2, we take the dot product of the 84 outputs of layer1 with the 84 weights connecting layer1 to that neuron, and then apply a softmax activation over the results.

But here, instead of taking the dot product, each neuron of layer2 outputs the square of the Euclidean distance between its input vector and its weight vector (I want to use this as my activation function).
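For concreteness, here is a small NumPy sketch of the two computations (the sizes 84 and 10 are taken from the example above; the data is random):

import numpy as np

x = np.random.rand(84)      # output of layer1 (84 values)
W = np.random.rand(84, 10)  # weights between layer1 and layer2 (one 84-dim vector per layer2 neuron)

# usual dense layer: dot product per neuron -> shape (10,), then softmax
dot_out = x.dot(W)

# LeNet-5 style output: squared Euclidean distance per neuron -> shape (10,)
dist_out = ((x[:, None] - W) ** 2).sum(axis=0)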

Any help on creating an RBF activation function (calculating the Euclidean distance between the inputs the layer receives and its weights) and using it in a layer would also be helpful.

  • Do you mean you want to get **the output** of `layer1` and `layer2` and pass it to your rbf function? If that's the case, are you sure it would work with the current definition of your activation function, since they have different shapes? – today Dec 19 '18 at 17:09
  • What I am actually trying to do is implement the output layer for the LeNet-5 neural network. The output layer of LeNet-5 is a bit special: instead of computing the dot product of the inputs and the weight vector, each neuron outputs the square of the Euclidean distance between its input vector and its weight vector. – NewToCoding Dec 19 '18 at 21:59
  • In short, I need the outputs of `layer1` and the weights of each neuron of `layer2`, and I want to calculate the Euclidean distance between them. – NewToCoding Dec 19 '18 at 23:03
  • Is gamma a single parameter for each neuron or is it a vector? I think it is a single parameter. – today Dec 20 '18 at 10:35
  • Yes, it is a single parameter. I am using gamma = 1 – NewToCoding Dec 20 '18 at 10:41

2 Answers


You can simply define a custom layer for this purpose:

from keras.layers import Layer
from keras import backend as K

class RBFLayer(Layer):
    def __init__(self, units, gamma, **kwargs):
        super(RBFLayer, self).__init__(**kwargs)
        self.units = units                    # number of RBF neurons
        self.gamma = K.cast_to_floatx(gamma)  # scalar width parameter of the RBF

    def build(self, input_shape):
        # one trainable centre (weight vector) per unit: shape (input_dim, units)
        self.mu = self.add_weight(name='mu',
                                  shape=(int(input_shape[1]), self.units),
                                  initializer='uniform',
                                  trainable=True)
        super(RBFLayer, self).build(input_shape)

    def call(self, inputs):
        # inputs: (batch, input_dim) -> (batch, input_dim, 1), broadcast against mu
        diff = K.expand_dims(inputs) - self.mu
        # squared Euclidean distance to each centre: (batch, units)
        l2 = K.sum(K.pow(diff, 2), axis=1)
        res = K.exp(-1 * self.gamma * l2)
        return res

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)

Example usage:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(20, input_shape=(100,)))
model.add(RBFLayer(10, 0.5))
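As a quick shape check for the 84 → 10 layout described in the question (a minimal sketch: the input size of 120 is arbitrary, and gamma = 1 follows the comments above):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(84, activation='tanh', input_shape=(120,)))  # layer1
model.add(RBFLayer(10, gamma=1.0))                           # layer2: distance-based output layer

out = model.predict(np.random.rand(4, 120))
print(out.shape)  # (4, 10)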
  • Comments are not for extended discussion; this conversation has been [moved to chat](https://chat.stackoverflow.com/rooms/185572/discussion-on-answer-by-today-how-to-implement-rbf-activation-function-in-keras). – Samuel Liew Dec 20 '18 at 22:19
  • Can someone explain what gamma, kwargs, and self.mu are? – Jerry Jun 19 '21 at 09:03

There is no need to reinvent the wheel here. A custom RBF layer for Keras already exists.
