I would like to use the relu activation function with its alpha parameter set to 0.2, but I could not figure out how to do this for my model:
import numpy
from tensorflow.keras.layers import Dense, Activation, Dropout, Input
from tensorflow.keras.models import Sequential, Model, load_model
from tensorflow.keras.optimizers import Adam

model_input = Input(shape=x_train[0].shape)
x = Dense(120, activation='relu')(model_input)
x = Dropout(0.01)(x)
x = Dense(120, activation='relu')(x)
x = Dropout(0.01)(x)
x = Dense(120, activation='relu')(x)
x = Dropout(0.01)(x)
model_output = Dense(numpy.shape(y_train)[1])(x)
model = Model(model_input, model_output)
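One thing I considered was passing a function instead of the string 'relu' to Dense, roughly like the sketch below, but I am not sure whether that is the intended way:

from tensorflow.keras import activations

# guess: wrap relu so its alpha (negative slope) can be set to 0.2
x = Dense(120, activation=lambda t: activations.relu(t, alpha=0.2))(model_input)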
I also saw there is a way to do this in this answer, which uses model.add(), but I am not sure how that translates to the functional API I am using. Could you please help me?
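If I try to adapt that answer to my functional model, my guess is something like the following, replacing the activation inside Dense with a separate LeakyReLU layer, but I am not sure this is equivalent to relu with alpha=0.2:

from tensorflow.keras.layers import LeakyReLU

x = Dense(120)(model_input)    # no activation inside Dense here
x = LeakyReLU(alpha=0.2)(x)    # applied as its own layer instead
x = Dropout(0.01)(x)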
Thank you in advance!