An activation function is a non-linear transformation, usually applied in a neural network to the output of a linear or convolutional layer. Common activation functions include sigmoid, tanh, and ReLU.
Questions tagged [activation-function]
343 questions
0 votes · 2 answers
How do we know a neuron is activated when we use an activation function?
I need clarification on when exactly we say an activation function is activated. The job of an activation function is to introduce non-linearity, right? Is it just scaling a given input to a confined range?

User1312 · 43 · 6
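A minimal NumPy sketch of the distinction at play here (the numbers are made up for illustration): the neuron computes a weighted sum plus bias (the pre-activation), and with ReLU it is commonly said to be "activated" when that sum comes out positive.

    import numpy as np

    def relu(z):
        return np.maximum(0.0, z)

    # Pre-activation: weighted sum of the inputs plus a bias term.
    z = np.dot([0.5, -1.2], [2.0, 1.0]) + 0.3   # z = 0.1
    a = relu(z)                                  # a = 0.1 > 0: the neuron "fires"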
0 votes · 1 answer
How do you write a custom activation function in Python for Keras?
I'm trying to write a custom activation function for use with Keras. I cannot write it with TensorFlow primitives, as it does not properly compute the derivative. I followed How to make a custom activation function with only Python in Tensorflow? and it…

Anonymous Geometer · 16 · 1
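For reference, the common pattern for a custom Keras activation is a plain Python function built from TensorFlow ops; since every op is differentiable, autodiff supplies the derivative. A minimal sketch (the scaling factor here is an arbitrary example, not from the question):

    import tensorflow as tf
    from tensorflow.keras import layers

    def scaled_sigmoid(x):
        # Composed entirely of differentiable tf ops, so the gradient
        # is derived automatically by autodiff.
        return 5.0 * tf.sigmoid(x)

    layer = layers.Dense(32, activation=scaled_sigmoid)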
0 votes · 1 answer
Get Specific Indices from a Tensorflow Tensor
I am trying to implement the BReLU activation function, which is described below, using tensorflow.keras.
The following is the code I wrote for the custom layer:

    class BReLU(Layer):
        def __init__(self):
            super(BReLU, self).__init__()
        def…

Soumik Rakshit · 859 · 9 · 22
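For context, BReLU (bipolar ReLU) applies ReLU at even feature indices and its mirrored form, -ReLU(-x), at odd indices. A sketch of one way to express this in tensorflow.keras with a mask instead of explicit index gathering (assuming the activation acts on the last axis):

    import tensorflow as tf
    from tensorflow.keras.layers import Layer

    class BReLU(Layer):
        def call(self, inputs):
            # Alternating 1/0 mask over the last axis: 1 at even indices.
            dim = tf.shape(inputs)[-1]
            even = tf.cast(tf.equal(tf.range(dim) % 2, 0), inputs.dtype)
            # ReLU at even indices, -ReLU(-x) at odd indices.
            return even * tf.nn.relu(inputs) - (1.0 - even) * tf.nn.relu(-inputs)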
0 votes · 2 answers
What do non-linear activation functions do at a fundamental level in neural networks?
I've been trying to find out what exactly non-linear activation functions do when implemented in a neural network.
I know they modify the output of a neuron, but how and for what purpose?
I know they add non-linearity to otherwise linear neural…

Michael Pirrall · 11 · 5
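The key fact behind this question, checked numerically: without a non-linearity between them, stacked layers collapse into a single linear map, so depth adds no expressive power. A small NumPy demonstration (shapes are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 3))
    W1 = rng.normal(size=(3, 5))
    W2 = rng.normal(size=(5, 2))

    # Two stacked linear layers equal one linear layer with weights W1 @ W2.
    assert np.allclose((x @ W1) @ W2, x @ (W1 @ W2))

    # With a non-linearity in between, the equivalence breaks.
    relu = lambda z: np.maximum(0.0, z)
    assert not np.allclose(relu(x @ W1) @ W2, x @ (W1 @ W2))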
0 votes · 1 answer
Implementing the Square Non-linearity (SQNL) activation function in Keras
I have been trying to implement the square non-linearity (SQNL) activation function as a custom activation function for a Keras model. It's the 10th function on this list: https://en.wikipedia.org/wiki/Activation_function.
I tried using the keras…

gb4 · 11 · 2
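For reference, SQNL is defined piecewise: 1 for x > 2, x - x²/4 on [0, 2], x + x²/4 on [-2, 0), and -1 for x < -2. A sketch of a tensor-friendly version using nested tf.where:

    import tensorflow as tf

    def sqnl(x):
        # Saturates at ±1 outside [-2, 2]; quadratic in between.
        return tf.where(x > 2.0, tf.ones_like(x),
               tf.where(x >= 0.0, x - tf.square(x) / 4.0,
               tf.where(x >= -2.0, x + tf.square(x) / 4.0,
                        -tf.ones_like(x))))

    # Usable directly as a Keras activation, e.g. Dense(32, activation=sqnl).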
0 votes · 1 answer
May I Learn Some Details about Implementing a Custom Activation Function in Keras?
@patapouf_ai
Relating to How to make a custom activation function with only Python in Tensorflow?
I am a newcomer to Python, Keras, and TF. I implemented a piecewise-constant custom activation function using the method above, as follows:
import…

Theron · 567 · 1 · 7 · 21
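Background for this entry: a piecewise-constant function has zero gradient almost everywhere, which is why such activations need special handling. A sketch of one common workaround in current TF, a straight-through estimator via tf.custom_gradient (this is not necessarily the method from the linked question):

    import tensorflow as tf

    @tf.custom_gradient
    def step_activation(x):
        # Piecewise-constant forward pass: round to the nearest integer.
        y = tf.round(x)
        def grad(dy):
            # Straight-through estimator: pretend the forward pass was the
            # identity, since the true derivative is zero almost everywhere.
            return dy
        return y, grad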
0 votes · 1 answer
Do the derivatives of activation functions have to lie in [0,1]?
I found that the derivatives of the common activation functions lie in [0,1].
https://ml-cheatsheet.readthedocs.io/en/latest/activation_functions.html
This is a cause of vanishing gradients in RNNs.
What is the reason that the derivatives are…

QuantCub · 3 · 4
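For context, the bound does hold for the classic sigmoidal activations, and it is easy to verify:

    \sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr) \in \bigl(0, \tfrac{1}{4}\bigr],
    \qquad
    \tanh'(x) = 1 - \tanh^2(x) \in (0, 1]

But it is not a requirement in general: scaling any activation by a constant c scales its derivative by c as well.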
0 votes · 1 answer
How to define a morphological operation in a neural network as an activation function?
I was recently given a task to study how to use morphological operations as activation functions for neural networks, but I have no idea how, and I don't know how to use Keras for custom functionality. Can anyone provide suggestions or related papers?
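One formulation from the morphological neural network literature replaces the layer's dot product with a max-plus operation (grayscale dilation). A hedged tensorflow.keras sketch of that idea, assuming rank-2 (batch, features) inputs; the layer name and initializer choice here are my own:

    import tensorflow as tf
    from tensorflow.keras.layers import Layer

    class MaxPlusDilation(Layer):
        """Dilation 'neuron': y_j = max_i (x_i + w_ij)."""
        def __init__(self, units, **kwargs):
            super().__init__(**kwargs)
            self.units = units

        def build(self, input_shape):
            self.w = self.add_weight(
                name="w", shape=(input_shape[-1], self.units),
                initializer="zeros", trainable=True)

        def call(self, inputs):
            # Broadcast to (batch, in_dim, units), then max over in_dim.
            return tf.reduce_max(inputs[:, :, None] + self.w, axis=1)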
0 votes · 2 answers
Where should I define the derivative of a custom activation function in Keras?
I am a beginner in Python, deep learning, and neural networks. I have made a custom activation function. What I want to know is: when I make a custom activation function derived from sigmoid, where should I define the derivative for my custom activation…

astri · 17 · 2 · 7
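Short answer for context: if the custom activation is written with differentiable TF ops (as a sigmoid variant would be), you normally don't define the derivative anywhere; autodiff computes it. A quick check with GradientTape (the activation here is an illustrative stand-in):

    import tensorflow as tf

    def stretched_sigmoid(x):
        # A sigmoid-derived activation; no manual derivative needed.
        return 2.0 * tf.sigmoid(x) - 1.0

    x = tf.Variable([0.0, 1.0])
    with tf.GradientTape() as tape:
        y = stretched_sigmoid(x)
    print(tape.gradient(y, x))   # derivative computed by autodiff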
0 votes · 1 answer
Maxout activation function: implementation in NumPy for forward and backpropagation
I am building a vanilla neural network from scratch using NumPy and trialling the model performance for different activation functions. I am especially keen to see how the 'Maxout' activation function would affect my model performance.
After doing…

Abishek · 767 · 5 · 9
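For reference, maxout takes the maximum over k affine pieces, and in backprop the gradient flows only through the winning piece. A self-contained NumPy sketch (shapes and names are my own choices):

    import numpy as np

    def maxout_forward(x, W, b):
        """x: (batch, d_in), W: (k, d_in, d_out), b: (k, d_out)."""
        z = np.einsum('bi,kio->bko', x, W) + b   # (batch, k, d_out)
        idx = np.argmax(z, axis=1)               # winning piece per unit
        return np.max(z, axis=1), (x, W, z, idx)

    def maxout_backward(dy, cache):
        x, W, z, idx = cache
        batch, k, d_out = z.shape
        # Route the gradient only through the argmax piece.
        mask = np.zeros_like(z)
        mask[np.arange(batch)[:, None], idx, np.arange(d_out)] = 1.0
        dz = mask * dy[:, None, :]               # (batch, k, d_out)
        dW = np.einsum('bi,bko->kio', x, dz)
        db = dz.sum(axis=0)
        dx = np.einsum('bko,kio->bi', dz, W)
        return dx, dW, db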
0 votes · 1 answer
Would backpropagation work as expected when using this code as swish in a CNN?
I would like to use swish (as a layer) in a CNN.
I am not sure if this is the correct way to implement such an activation function.
Will back propagation work properly with this code?
    class Swish(nn.Module):
        def forward(self, x):
            return x…

Jaja · 662 · 7 · 15
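For context: as long as swish is composed from differentiable torch ops, autograd handles backpropagation correctly, with no manual backward pass needed. A minimal complete version:

    import torch
    import torch.nn as nn

    class Swish(nn.Module):
        # swish(x) = x * sigmoid(x); built from differentiable torch ops,
        # so autograd derives the backward pass automatically.
        def forward(self, x):
            return x * torch.sigmoid(x)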
0 votes · 1 answer
Where is the activation function calculated in session.run()?
I'm studying the TensorFlow open source code.
I would like to find the specific place where the actual calculation is executed.
However, it's really hard to find in such a deep codebase.
So, I want to get some directions from people who've already…

C.H.Song · 39 · 6
0 votes · 1 answer
Create a custom 'non-differentiable' activation function in Keras
Is it possible to create a custom activation function of the form:

    def newactivation(x):
        if x <= -1:
            return -1
        elif x > -1 and x <= 1:
            return x
        else:
            return 1

So basically it would be a linearized version of…

kleka · 364 · 3 · 14
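The function in question is exactly a "hard tanh", i.e. clipping to [-1, 1]. Element-wise Python if/elif won't work on symbolic tensors, but a tensor-friendly sketch is:

    import tensorflow as tf

    def newactivation(x):
        # Equivalent to the piecewise definition above: clip to [-1, 1].
        # Non-differentiable only at the two kink points, which autodiff
        # handles the same way it handles ReLU's kink at 0.
        return tf.clip_by_value(x, -1.0, 1.0)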
0 votes · 0 answers
Softmax activation function output (with Tanh)
I am working on an MLP neural network using supervised learning.
For the hidden layers I am using Tanh (-1, 1), and for the output layer Softmax (which gives a probability distribution between 0 and 1).
As I am working with supervised learning, should my…

LVoltz · 15 · 4
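For reference, softmax turns the output layer's raw scores into a probability distribution (non-negative entries summing to 1). A numerically stable NumPy version:

    import numpy as np

    def softmax(z):
        # Subtracting the row max avoids overflow; the result is unchanged.
        e = np.exp(z - np.max(z, axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)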
0 votes · 1 answer
Keras custom activation to drop values under certain conditions
I am trying to drop the values less than 1 and greater than -1 in my custom activation, like below.

    def ScoreActivationFromSigmoid(x, target_min=1, target_max=9):
        condition = K.tf.logical_and(K.tf.less(x, 1), K.tf.greater(x, -1))
        case_true =…

Isaac Sim · 539 · 1 · 7 · 23
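A sketch of the same idea with current TF ops (tf.where instead of the TF1-era K.tf alias), zeroing values strictly inside (-1, 1); the function name here is hypothetical:

    import tensorflow as tf

    def drop_inside_unit_interval(x):
        # Zero out values strictly between -1 and 1; keep the rest.
        inside = tf.logical_and(tf.less(x, 1.0), tf.greater(x, -1.0))
        return tf.where(inside, tf.zeros_like(x), x)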