
This is my code that works if I use other activation layers like tanh:

model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(Activation(act))
model.add(Dropout(0.15))
model.add(Dense(64, init='uniform'))
model.add(Activation('softplus'))
model.add(Dropout(0.15))
model.add(Dense(2, init='uniform'))
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)
model.fit(X_train, y_train, nb_epoch=20, batch_size=16, show_accuracy=True, validation_split=0.2, verbose=2)

In this case it doesn't work and says "TypeError: 'PReLU' object is not callable"; the error is raised at the model.compile line. Why is this the case? All of the non-advanced activation functions work, but none of the advanced activation functions, including this one, does.

– pr338

3 Answers

The correct way to use advanced activations like PReLU is to add them as layers with the add() method, not to wrap them in the Activation class. Example:

model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(act)
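
Note that PReLU has learnable parameters, so if you want one after each Dense layer, each should be its own instance. A sketch of the question's full model with only the PReLU wiring changed (same layers and hyperparameters as in the question) could look like this:

model = Sequential()
model.add(Dense(64, input_dim=14, init='uniform'))
# add the PReLU layer directly instead of wrapping it in Activation(...)
model.add(keras.layers.advanced_activations.PReLU(init='zero', weights=None))
model.add(Dropout(0.15))
model.add(Dense(64, init='uniform'))
model.add(Activation('softplus'))
model.add(Dropout(0.15))
model.add(Dense(2, init='uniform'))
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)
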
– Tarantula

  • If we have two dense FC layers, should we add a PReLU after each of them? And if we also have dropout, what should we do? – fermat4214 Mar 16 '17 at 13:59
  • In the case of ReLU, it doesn't matter if you add Dropout before or after the activation (maybe performance differences only, but results will be the same). – Tarantula Jun 26 '18 at 05:52

If you are using the Model API in Keras, you can call the activation directly inside the Keras layer. Here's an example:

from keras.models import Model
from keras.layers import Dense, Input
# the PReLU advanced activation
from keras.layers.advanced_activations import PReLU

# Model definition
# encoder
inp = Input(shape=(16,))
lay = Dense(64, kernel_initializer='uniform', activation=PReLU(),
            name='encoder')(inp)
# decoder
out = Dense(2, kernel_initializer='uniform', activation=PReLU(),
            name='decoder')(lay)

# build the model
model = Model(inputs=inp, outputs=out, name='cae')
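
The resulting model can then be compiled and inspected as usual; the optimizer and loss below are just placeholders for illustration, not part of the original answer:

model.compile(optimizer='adam', loss='mse')
model.summary()
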
– Mattia Paterna

For the Keras functional API, I think the correct way to combine Dense and PReLU (or any other advanced activation) is to use it like this:

# output of the preceding focus layer (focus_lr and enc_bidi_tns are defined elsewhere)
focus_tns = focus_lr(enc_bidi_tns)

# Dense layer followed by PReLU applied as its own layer
enc_dense_lr = k.layers.Dense(units=int(hidden_size))
enc_dense_tns = k.layers.PReLU()(enc_dense_lr(focus_tns))

dropout_lr = k.layers.Dropout(0.2)
dropout_tns = dropout_lr(enc_dense_tns)

# second Dense + PReLU block
enc_dense_lr2 = k.layers.Dense(units=int(hidden_size / 4))
enc_dense_tns2 = k.layers.PReLU()(enc_dense_lr2(dropout_tns))

Of course, one should parametrize the layers according to the problem.
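
To make that pattern self-contained, a minimal end-to-end sketch could look like the following; the input shape, the hidden_size value, and the softmax output head are illustrative assumptions rather than part of the original snippet:

import keras as k

hidden_size = 128  # assumed value for illustration

inp = k.layers.Input(shape=(16,))                          # assumed input shape
dense_tns = k.layers.Dense(units=hidden_size)(inp)
act_tns = k.layers.PReLU()(dense_tns)                      # PReLU applied as its own layer
drop_tns = k.layers.Dropout(0.2)(act_tns)
dense_tns2 = k.layers.Dense(units=hidden_size // 4)(drop_tns)
act_tns2 = k.layers.PReLU()(dense_tns2)
out = k.layers.Dense(2, activation='softmax')(act_tns2)   # assumed output head

model = k.models.Model(inputs=inp, outputs=out)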

– sashaostr

    There are really few examples out there with advanced activations using the functional API. If you are to use multiple inputs or outputs this is the way to go. This gave me a great insight. Thanks. – Julian C Feb 25 '18 at 18:51