
I would like to use the ReLU activation function with its parameter `alpha` set to 0.2, but I could not figure out how this can be done for my model:

import numpy
from tensorflow.keras.layers import Dense, Activation, Dropout, Input
from tensorflow.keras.models import Sequential, Model, load_model
from tensorflow.keras.optimizers import Adam

model_input = Input(shape = x_train[0].shape)
x = Dense(120, activation = 'relu')(model_input)
x = Dropout(0.01)(x)
x = Dense(120, activation = 'relu')(x)
x = Dropout(0.01)(x)
x = Dense(120, activation = 'relu')(x)
x = Dropout(0.01)(x)
model_output = Dense(numpy.shape(y_train)[1])(x)
model = Model(model_input, model_output)

I saw that there is a way to do this in this answer, which uses `model.add()`, but I am not sure how that would translate to my model. Could you please help me?

Thank you in advance!

  • Hi and welcome to AI SE! Unfortunately, this question is off-topic here because you're asking how something can be done in a certain library, which is a programming issue. See [https://ai.stackexchange.com/help/on-topic](https://ai.stackexchange.com/help/on-topic). I will migrate this question to SO. – nbro Mar 24 '20 at 04:52

1 Answer


First, note that you're specifying the activation as a string, while in the answer you link to, the activation function is specified by creating an instance of the class that represents it. Second, note that ReLU with a non-zero `alpha` (a negative slope) is actually the "leaky ReLU" activation function, while you're currently specifying only plain "relu".

To answer your question, you can probably do something like this:

import numpy
from tensorflow.keras.layers import Dense, Dropout, Input, LeakyReLU
from tensorflow.keras.models import Model

model_input = Input(shape = x_train[0].shape)
x = Dense(120)(model_input)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
x = Dense(120)(x)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
x = Dense(120)(x)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
model_output = Dense(numpy.shape(y_train)[1])(x)
model = Model(model_input, model_output)

I haven't tried this code, but it should work!
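As an alternative (not tested either, and not what the linked answer does), `tf.keras.activations.relu` itself accepts an `alpha` argument, so you can keep the `activation` parameter on each `Dense` layer by wrapping it in a lambda. A minimal runnable sketch, with dummy `x_train`/`y_train` standing in for your data:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Model

# relu with a non-zero alpha (negative slope) is leaky ReLU,
# so this is equivalent to a separate LeakyReLU(alpha=0.2) layer.
leaky = lambda z: tf.keras.activations.relu(z, alpha=0.2)

# Dummy data standing in for the question's x_train / y_train.
x_train = np.random.rand(8, 10).astype("float32")
y_train = np.random.rand(8, 3).astype("float32")

model_input = Input(shape=x_train[0].shape)
x = Dense(120, activation=leaky)(model_input)
x = Dropout(0.01)(x)
model_output = Dense(y_train.shape[1])(x)
model = Model(model_input, model_output)
```

One caveat: a lambda activation can make the model harder to serialize with `model.save()`, so the explicit `LeakyReLU` layer shown above is usually the safer choice.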
