
I am a beginner in Python, deep learning, and neural networks. I have made a custom activation function. What I want to know is: when I make a custom activation function derived from the sigmoid, where should I define its derivative?

I've tried reading about automatic differentiation, but I am not sure whether Keras automatically differentiates my custom sigmoid.

My custom activation function, in keras/activation.py:

from keras import backend as K  # needed for K.sigmoid

def tempsigmoid(x, temp=1.0):
    return K.sigmoid(x / temp)

My model:

def baseline_model():
    # create model
    model = Sequential()
    model.add(Conv2D(101, (5, 5), input_shape=(1, 28, 28), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Dropout(0.2))
    model.add(Flatten())
    model.add(Dense(128, activation='relu'))
    model.add(Dense(num_classes, activation='tempsigmoid'))
    # Compile model
    model.compile(loss='mse', optimizer='adam', metrics=['accuracy'])
    return model
astri

2 Answers


Yes, Keras uses automatic differentiation, as it only supports backends with this feature (like TensorFlow).

So you do not need to define the gradient or derivative at all; it will be computed for you automatically, as long as your function is built from differentiable backend operations (as K.sigmoid is).
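
For instance, here is a minimal sketch (assuming a TensorFlow 2.x backend; the input values and temperature are arbitrary) showing that the derivative of tempsigmoid comes out of automatic differentiation without any hand-written gradient:

import tensorflow as tf
from tensorflow.keras import backend as K

def tempsigmoid(x, temp=1.0):
    return K.sigmoid(x / temp)

x = tf.constant([-2.0, 0.0, 2.0])
with tf.GradientTape() as tape:
    tape.watch(x)                    # track x so its gradient is recorded
    y = tempsigmoid(x, temp=0.5)

# dy/dx is computed automatically by the backend
print(tape.gradient(y, x))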

Dr. Snoopy
  • Thank you for your response. Do you know a good resource for reading about automatic differentiation? I thought other DL frameworks such as Chainer don't have this feature, and I'd also like to know about PyTorch. – astri Apr 25 '19 at 01:30
  • Is there a reference to any documentation confirming that Keras automatically takes the derivative? – Albert Oct 01 '19 at 16:00

You can define the custom activation function in the same script as your model, at least to start.

You don't need to pass it as a quoted string ' '; you can simply pass the function itself:

model.add(Dense(num_classes, activation=tempsigmoid))
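
As a minimal sketch (assuming tempsigmoid is defined in the same script; the input shape and number of classes below are just placeholders):

from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

def tempsigmoid(x, temp=1.0):
    return K.sigmoid(x / temp)

model = Sequential()
model.add(Dense(128, activation='relu', input_shape=(784,)))  # placeholder input shape
model.add(Dense(10, activation=tempsigmoid))                  # function object, not the string 'tempsigmoid'
model.compile(loss='mse', optimizer='adam', metrics=['accuracy'])

If you prefer to keep the string form activation='tempsigmoid', Keras also provides a get_custom_objects() registry under keras.utils that can map the name to your function, but passing the callable directly is the simplest fix.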
Sreeram TP
Timbus Calin
  • Thank you for your response. What's the difference between defining the custom activation function in the same script and in the main activation.py file? – astri Apr 25 '19 at 01:27
  • I just wanted you to see that you can implement the activation function successfully first, and only then move it to another script. – Timbus Calin Apr 25 '19 at 08:05