An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions include sigmoid, tanh, and ReLU.
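For reference, a minimal plain-Python sketch of the three functions just named (illustrative only; in practice these are applied element-wise to tensors by the framework):

```python
import math

# Three common activation functions, applied here to a scalar input.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    return math.tanh(x)

def relu(x):
    return max(0.0, x)
```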
Questions tagged [activation-function]
343 questions
8 votes, 4 answers
Keras: How to use max_value in the ReLU activation function
The relu function as defined in keras/activation.py is:
def relu(x, alpha=0., max_value=None):
    return K.relu(x, alpha=alpha, max_value=max_value)
It has a max_value which can be used to clip the value. Now how can this be used/called in the…
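As a sketch of what max_value does (not the Keras source itself, just the clipping semantics of the signature quoted above): negative inputs are scaled by alpha, and the output is clipped at max_value when it is set.

```python
def clipped_relu(x, alpha=0.0, max_value=None):
    """Mirrors the quoted relu signature: leaky slope alpha, optional upper clip."""
    y = x if x > 0 else alpha * x
    if max_value is not None:
        y = min(y, max_value)
    return y
```

In Keras this can then be wired into a model with, e.g., `Activation(lambda x: relu(x, max_value=6.0))`; newer versions also expose it as a `ReLU(max_value=...)` layer.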

krat (129)
7 votes, 4 answers
Why does almost every activation function saturate at negative input values in a neural network?
This may be a very basic/trivial question. For negative inputs:
- the output of the ReLU activation function is zero
- the output of the sigmoid activation function tends to zero
- the output of the tanh activation function tends to -1
My questions are:
Why is it that…
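Numerically, the saturation the question describes is easy to check; a small sketch:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# At a strongly negative input, each function sits at (or very near) its lower limit.
x = -20.0
relu_out = max(0.0, x)      # exactly 0
sigmoid_out = sigmoid(x)    # ~2e-9, effectively 0
tanh_out = math.tanh(x)     # ~-1
```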

RakTheGeek (405)
7 votes, 2 answers
RL Activation Functions with Negative Rewards
I have a question regarding appropriate activation functions with environments that have both positive and negative rewards.
In reinforcement learning, our output, I believe, should be the expected reward for all possible actions. Since some options…

ZAR (2,550)
7 votes, 1 answer
How to specify the axis when using the softmax activation in a Keras layer?
The Keras docs for the softmax activation state that I can specify which axis the activation is applied to. My model is supposed to output an n-by-k matrix M where M[i][j] is the probability that the i-th letter is symbol j.
n = 7 # number of symbols in…
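For reference, softmax over the last axis of an n-by-k matrix turns each row into a probability distribution; a list-based sketch of that axis=-1 behaviour (the Keras `softmax` activation defaults to the last axis):

```python
import math

def softmax_rows(m):
    """Softmax along the last axis: each row of the output sums to 1."""
    out = []
    for row in m:
        mx = max(row)  # subtract the row max for numerical stability
        exps = [math.exp(v - mx) for v in row]
        total = sum(exps)
        out.append([e / total for e in exps])
    return out

probs = softmax_rows([[1.0, 2.0, 3.0],
                      [1.0, 1.0, 1.0]])
```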

RobertJoseph (7,968)
6 votes, 2 answers
What is the best activation function to use for time series prediction?
I am using the Sequential model from Keras with the Dense layer type. I wrote a function that recursively calculates predictions, but the predictions are way off. I am wondering what the best activation function for my data is. Currently I…

the_dankest (195)
6 votes, 3 answers
Is there a logit function in tensorflow?
Is there a logit function in TensorFlow, i.e. the inverse of the sigmoid function? I have searched Google but have not found one.
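The logit is just the log-odds, log(p / (1 - p)); if a library does not expose it directly, it is a one-liner:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def logit(p):
    """Inverse of the sigmoid: defined for p in the open interval (0, 1)."""
    return math.log(p / (1.0 - p))
```

It round-trips through the sigmoid: `logit(sigmoid(0.7))` recovers 0.7 up to floating-point error.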

Larry Xu (79)
6 votes, 1 answer
Advanced Activation layers in Keras Functional API
When setting up a neural network using Keras you can use either the Sequential model or the Functional API. My understanding is that the former is easy to set up and manage, and operates as a linear stack of layers, and that the functional approach…

Joseph Bullock (75)
6 votes, 2 answers
Binary threshold activation function in TensorFlow
I have a piece of code that uses the sigmoid activation function for classification, which outputs values in (0, 1). But I need an activation function that outputs binary values, either 0 or 1.
x = tf.placeholder("float", [None, COLUMN])
Wh =…
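One common workaround is to threshold the sigmoid output at 0.5. A dependency-free sketch of that idea (in TensorFlow this could be expressed as a comparison plus a cast; note the hard step has zero gradient almost everywhere, so it is typically applied only at inference time, not during training):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def binary_threshold(x, threshold=0.5):
    """Hard 0/1 output: squash with sigmoid, then threshold."""
    return 1.0 if sigmoid(x) >= threshold else 0.0
```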

user3104352 (1,100)
5 votes, 1 answer
Is it true that `inplace=True` activations in PyTorch make sense only for inference mode?
According to the discussions on the PyTorch forum:
What’s the difference between nn.ReLU() and nn.ReLU(inplace=True)?
Guidelines for when and why one should set inplace = True?
The purpose of inplace=True is to modify the input in place, without…
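The in-place idea itself is not PyTorch-specific; a sketch with plain Python lists showing the trade-off (in-place saves allocating an output buffer, but destroys the input, which the backward pass may still need, hence the question about inference-only use):

```python
def relu_out_of_place(xs):
    """Allocates a new list; the input survives (like nn.ReLU())."""
    return [max(0.0, v) for v in xs]

def relu_inplace(xs):
    """Overwrites the input buffer (like nn.ReLU(inplace=True))."""
    for i, v in enumerate(xs):
        if v < 0.0:
            xs[i] = 0.0
    return xs

a = [-1.0, 2.0]
b = relu_out_of_place(a)  # a is untouched
c = [-1.0, 2.0]
relu_inplace(c)           # c itself is modified
```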

spiridon_the_sun_rotator (774)
5 votes, 5 answers
How to change the activation layer in a PyTorch pretrained module?
How can I change the activation layer of a PyTorch pretrained network?
Here is my code :
print("All modules")
for child in net.children():
    if isinstance(child, nn.ReLU) or isinstance(child, nn.SELU):
        print(child)
print('Before changing…

Hamdard (265)
5 votes, 1 answer
How do I write a custom wavelet activation function for a wavelet neural network using Keras or TensorFlow?
I am trying to build a wavelet neural network using Keras/TensorFlow. For this neural network I am supposed to use a wavelet function as my activation function.
I have tried doing this by simply creating a custom activation function. However…

King (51)
5 votes, 2 answers
How to implement RBF activation function in Keras?
I am creating a customized activation function, in particular an RBF activation function:
from keras import backend as K
from keras.layers import Lambda

l2_norm = lambda a, b: K.sqrt(K.sum(K.pow((a - b), 2), axis=0, keepdims=True))

def rbf2(x):
    X =…
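The Gaussian RBF itself is exp(-gamma * ||x - c||^2); a self-contained sketch of that computation (the centre and gamma are illustrative choices, not taken from the question):

```python
import math

def rbf(x, centre, gamma=1.0):
    """Gaussian RBF: exp(-gamma * squared Euclidean distance to the centre)."""
    sq_dist = sum((xi - ci) ** 2 for xi, ci in zip(x, centre))
    return math.exp(-gamma * sq_dist)
```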

NewToCoding (199)
5 votes, 1 answer
Why is ReLU used in regression with Neural Networks?
I am following the official TensorFlow with Keras tutorial and I got stuck here: Predict house prices: regression - Create the model
Why is an activation function used for a task where a continuous value is predicted?
The code is:
def…
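The usual answer is that the ReLU sits in the hidden layers while the output layer stays linear, so the network can model non-linearities yet still emit any continuous value; a toy sketch with made-up weights (not the tutorial's model):

```python
def tiny_regression_net(x):
    """One ReLU hidden unit feeding a linear output: piecewise-linear, unbounded."""
    h = max(0.0, 2.0 * x + 1.0)  # hidden layer with ReLU activation
    return 3.0 * h - 0.5          # output layer: linear, no activation
```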

Popovici Andrei-Sorin (61)
5 votes, 2 answers
How do I use categorical_hinge in Keras?
Maybe a very dumb question, but I can't find an example of how to use categorical_hinge in Keras. I am doing classification and my target has shape (,1) with values [-1, 0, 1], so I have 3 categories. Using the functional API I have set up my output layer like…
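For reference, categorical hinge compares the score of the true class against the best wrong class; a sketch of the usual definition, max(0, 1 + best_wrong_score - true_score), for one-hot targets:

```python
def categorical_hinge(y_true, y_pred):
    """y_true is one-hot; hinge on the margin between the true and best wrong score."""
    pos = sum(t * p for t, p in zip(y_true, y_pred))        # score of the true class
    neg = max(p for t, p in zip(y_true, y_pred) if t == 0)  # best wrong-class score
    return max(0.0, 1.0 + neg - pos)
```

Since the loss expects one-hot y_true, integer labels in {-1, 0, 1} would first be mapped to 3-way one-hot vectors.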

Manngo (829)
5 votes, 1 answer
Round an activation function in Keras
I am trying to create an activation function to use in my Keras model.
Basically, what I want is a sigmoid function that outputs only two decimal places. So I was trying to create my own activation function like this:
def mySigmoid(x):
    return…
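The intent, as a plain-Python sketch, is simply to round the sigmoid output to two decimal places (note the rounding step has zero gradient almost everywhere, so in Keras it would typically need to be kept out of the backward pass):

```python
import math

def my_sigmoid(x, decimals=2):
    """Sigmoid whose output is rounded to the given number of decimal places."""
    return round(1.0 / (1.0 + math.exp(-x)), decimals)
```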

fsofelipe (51)