I'm using Keras with Theano as the backend, and I have a Sequential neural network model.

I wonder: is there a difference between the following?

model.add(Convolution2D(32, 3, 3, activation='relu'))

and

model.add(Convolution2D(32, 3, 3))
model.add(Activation('relu'))

1 Answer

They are essentially the same. The advantage of adding the activation as a separate layer is that you can insert other layers (say BatchNormalization) in between.
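For instance, a common pattern is to normalize the convolution output before the nonlinearity. Here is a minimal sketch, assuming Keras 1.x with the Theano dim ordering; input_shape=(3, 32, 32) is just an arbitrary example:

from keras.models import Sequential
from keras.layers import Convolution2D, Activation
from keras.layers.normalization import BatchNormalization

model = Sequential()
# (channels, rows, cols) under the Theano dim ordering; the shape is arbitrary
model.add(Convolution2D(32, 3, 3, input_shape=(3, 32, 32)))
model.add(BatchNormalization())  # normalize the pre-activations
model.add(Activation('relu'))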

In Keras, if not specified, Convolution2D uses the 'linear' activation by default, which is just the identity function:

def linear(x):
    '''
    The function returns the variable that is passed in, so all types work.
    '''
    return x 

and all the Activation layer does is apply the activation function to its input:

def call(self, x, mask=None):
    return self.activation(x)
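You can verify this from a Python shell. Here is a quick sketch showing that 'linear', as resolved by keras.activations.get, is the identity (the sample values are arbitrary):

import numpy as np
from keras import activations

f = activations.get('linear')   # the default activation of Convolution2D
x = np.array([-1.0, 0.0, 2.0])
print(f(x))                     # [-1.  0.  2.] -- the input comes back unchanged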

Edit:

So basically Convolution2D(activation='relu') applies the relu activation function after performing the convolution, which is the same as applying Activation('relu') after Convolution2D(32, 3, 3).

The last two lines of the call function of the Convolution2D layer are

output = self.activation(output)
return output

where output is the result of the convolution. So we know that applying the activation function is the last step of Convolution2D.
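As a sanity check, here is a minimal sketch that copies the weights of the fused model into the split model and confirms both produce the same predictions (assuming Keras 1.x with the Theano dim ordering; the shapes and the random input are arbitrary):

import numpy as np
from keras.models import Sequential
from keras.layers import Convolution2D, Activation

# convolution and relu fused into a single layer
fused = Sequential()
fused.add(Convolution2D(32, 3, 3, activation='relu', input_shape=(3, 8, 8)))

# convolution and relu as separate layers
split = Sequential()
split.add(Convolution2D(32, 3, 3, input_shape=(3, 8, 8)))
split.add(Activation('relu'))
split.set_weights(fused.get_weights())  # Activation has no weights, so the lists match

x = np.random.rand(1, 3, 8, 8).astype('float32')
print(np.allclose(fused.predict(x), split.predict(x)))  # True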

Source code:
Convolution2D layer: https://github.com/fchollet/keras/blob/a981a8c42c316831183cac7598266d577a1ea96a/keras/layers/convolutional.py
Activation layer: https://github.com/fchollet/keras/blob/a981a8c42c316831183cac7598266d577a1ea96a/keras/layers/core.py
activation functions: https://github.com/fchollet/keras/blob/master/keras/activations.py

  • Just to clarify: so basically Convolution2D(activation='relu') applies the relu activation function after performing the convolution, which is the same as applying Activation('relu') after Convolution2D(32, 3, 3)? – angubenko May 09 '16 at 20:17
  • @angubenko yes, I added some code and explanation in the answer, hope that helps. – dontloo May 10 '16 at 02:03