
Note: The logistic (sigmoid) activation function already exists as tf.nn.sigmoid. However, I am trying to develop a custom activation along similar lines. Please refer to the code snippet below.

import tensorflow as tf

def custom_logistic(x):
    # tf.cast replaces tf.to_float, which was deprecated and removed in TF 2.x
    value = tf.cast(tf.math.exp(-x), tf.float32)
    activated = 1.0 / (1.0 + value)
    return activated
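As a sanity check (my own addition, assuming TensorFlow 2.x eager mode; the name custom_logistic_v2 is hypothetical), the hand-rolled formula should match the built-in sigmoid for moderate inputs:

```python
import numpy as np
import tensorflow as tf

def custom_logistic_v2(x):
    # tf.cast replaces the deprecated tf.to_float (removed in TF 2.x)
    value = tf.cast(tf.math.exp(-x), tf.float32)
    return 1.0 / (1.0 + value)

x = tf.constant([-2.0, 0.0, 2.0], dtype=tf.float32)
custom = custom_logistic_v2(x).numpy()
builtin = tf.math.sigmoid(x).numpy()

# For inputs in a moderate range the two agree to float32 precision
print(np.allclose(custom, builtin, atol=1e-6))
```

If this check passes but training accuracy still differs, the forward math is not the culprit.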

Now, I am calling the above function as the activation of the input layer:

model.add(Dense(10, input_dim=3, activation=custom_logistic))

Note: Here, the model is a sequential model.

I get a lower accuracy than when I use activation = tf.nn.sigmoid.

Is there anything else I need to define? Please let me know.
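One plausible source of a gap (an assumption on my part, not confirmed by the question): tf.nn.sigmoid uses a numerically stable kernel, while the naive 1/(1+exp(-x)) overflows float32 for inputs below roughly -88 and saturates to exactly 0.0:

```python
import tensorflow as tf

x = tf.constant([-100.0], dtype=tf.float32)

# exp(100) overflows float32 to inf, so the naive formula collapses to exactly 0.0
naive = 1.0 / (1.0 + tf.math.exp(-x))

# The built-in op avoids the overflow internally
stable = tf.nn.sigmoid(x)

print(naive.numpy()[0])   # 0.0
print(stable.numpy()[0])  # a tiny positive value on most backends
```

With small inputs (input_dim=3) this regime is rarely reached, so run-to-run noise is also a plausible explanation.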

Mitra Lanka
  • In your function, what is `mx` ? – Susmit Agrawal Feb 11 '20 at 08:50
  • Import mxnet as mx – Mitra Lanka Feb 11 '20 at 14:39
  •
    You can use numpy arrays or Tensorflow tensors with Tensorflow; you can't use MXNet tensors. – Susmit Agrawal Feb 11 '20 at 16:04
  • Thank you! It works when I am using (1/1+tf.to_float(tf.math.exp(-x))), but the accuracy is low. I will edit my question to make it clear. Kindly, look into it. – Mitra Lanka Feb 11 '20 at 18:14
  • Are you missing brackets? Try " (1/(1+tf.to_float(tf.math.exp(-x))))".... – Andi Schroff Feb 14 '20 at 08:41
  • It did not make any difference to the results @Andy Schroff. Thank you for the response! – Mitra Lanka Feb 18 '20 at 20:10
  • Hi @Mitra Lanka, Can you provide a minimum reproducible code and an example screenshot of your model's accuracy? – TF_Support Apr 02 '20 at 09:08
  • @MitraLanka, Hi. I have reproduced your case but saw no difference in accuracy even with the custom activation function. `tf.to_float` was resulting in an error, and `tf.cast` resolved it. If your issue is still not resolved, please share the complete code so that we can help you. Thanks! –  May 17 '20 at 06:09

0 Answers