An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions include sigmoid, tanh, and ReLU.
Questions tagged [activation-function]
343 questions
1
vote
3 answers
How can I add a user-defined activation function in a CNN model instead of a built-in function in keras?
Instead of sigmoid, which gives sigmoid(x) = 1 / (1 + exp(-x)),
I want an activation function mish such that mish(x) = x * tanh(softplus(x)).
I want to use it as
conv_layer1 = Conv3D(filters=8, kernel_size=(3, 3, 5), activation='mish')(input_layer) like…
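A hedged sketch, assuming TensorFlow 2.x and a hypothetical input shape: Keras accepts any callable as an activation, and registering the function under a name keeps the string form usable.

import tensorflow as tf
from tensorflow.keras.layers import Conv3D, Input
from tensorflow.keras.utils import get_custom_objects

def mish(x):
    # mish(x) = x * tanh(softplus(x)), built entirely from TF ops
    return x * tf.math.tanh(tf.math.softplus(x))

get_custom_objects().update({'mish': mish})  # makes activation='mish' resolvable

input_layer = Input(shape=(32, 32, 16, 1))  # hypothetical input shape
conv_layer1 = Conv3D(filters=8, kernel_size=(3, 3, 5), activation='mish')(input_layer)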

Tasmia Jannat
- 15
- 5
1
vote
0 answers
Activation function for Output Layer for Regression model in Neural Network
My new question is about the activation function for the output layer in a regression model in a neural network.
So: which is better, a linear activation function or NO activation function?
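In Keras, at least, the two choices coincide: Dense's default activation is None, which applies the identity, and 'linear' is that same identity function. A minimal sketch:

from tensorflow.keras.layers import Dense

# the two layers compute the same thing: a(x) = x
out_a = Dense(1)                       # no activation -> identity
out_b = Dense(1, activation='linear')  # explicit linear -> identity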

Andrei
- 73
- 1
- 13
1
vote
0 answers
custom activation function in PyTorch - fix prediction
I read this post about custom activation functions, but I still can't implement my code. My activation function can be expressed as a combination of existing PyTorch functions and it works fine: function_pytorch(prediction, Q_sample). [Q_samples, is…
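The usual pattern, sketched with a hypothetical body since function_pytorch is elided: wrap the composition of built-in ops in an nn.Module, and autograd derives the backward pass on its own.

import torch
import torch.nn as nn

class CustomActivation(nn.Module):
    """Hypothetical wrapper: any combination of existing PyTorch ops."""
    def __init__(self, Q_sample):
        super().__init__()
        self.Q_sample = Q_sample  # extra tensor the activation depends on

    def forward(self, prediction):
        # placeholder body: replace with function_pytorch(prediction, self.Q_sample)
        return prediction * torch.sigmoid(self.Q_sample * prediction)

act = CustomActivation(Q_sample=torch.tensor(0.5))
print(act(torch.randn(4)))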

DeepRazi
- 259
- 4
- 13
1
vote
0 answers
Model is not learning: problem with custom activation function and/or with custom loss function
class QuaternionLoss(torch.nn.Module):
    def __init__(self):
        super(QuaternionLoss, self).__init__()

    def forward(self, output, target):
        loss = 100 * (1 - torch.dot(output.squeeze(0), target.squeeze(0)))
        return…
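A hedged observation, not taken from the question: a dot-product loss like the one above only measures orientation similarity when both quaternions are unit-length, so normalizing before the dot product is a common first fix when such a model fails to learn. A sketch with hypothetical tensors:

import torch
import torch.nn.functional as F

output = torch.randn(1, 4)  # hypothetical raw network output (a quaternion)
target = torch.randn(1, 4)  # hypothetical ground-truth quaternion

output = F.normalize(output, dim=-1)  # force both onto the unit sphere
target = F.normalize(target, dim=-1)
loss = 100 * (1 - torch.dot(output.squeeze(0), target.squeeze(0)))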

Johnyyy
- 11
- 1
1
vote
0 answers
Negative weights and biases in siamese keras model
I try to train a siamese model in keras. I use a really simple encoder with only convnets to encode a 32x32 RGB picture into a feature vector. The encoder encodes two pictures A and B. Then an MLP compares the two vectors and computes a score between…

Oliver Tautz
- 11
- 4
1
vote
1 answer
Implement x=T if abs(x)>T as an activation function in pytorch
I would like to implement the following activation function in pytorch:
x = T if abs(x)>T else x
I could do something close with torch.clamp(min=-T, max=T) but it's not exactly the behavior I want (this would behave the same as above for x>-T but…
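A sketch with torch.where that matches the stated piecewise definition (returning +T on both tails, which is exactly where clamp differs); T is a hypothetical constant:

import torch

T = 1.0  # hypothetical threshold

def threshold_act(x):
    # x -> T wherever |x| > T, identity elsewhere; built-in ops only,
    # so autograd handles the (sub)gradient automatically
    return torch.where(x.abs() > T, torch.full_like(x, T), x)

print(threshold_act(torch.tensor([-2.0, -0.5, 0.3, 1.5])))
# tensor([ 1.0000, -0.5000,  0.3000,  1.0000])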

Noé Achache
- 195
- 2
- 9
1
vote
1 answer
Why Is My Machine Learning Algorithm Getting Stuck?
So I am hitting a wall with my C# machine learning project. I am attempting to train an algorithm to recognize numbers. Since this is only an exercise, I have an image set of 200 numbers (20 each for 0 to 9). Obviously if I wanted a properly…

JMC0352
- 25
- 7
1
vote
0 answers
Tensorboard in Tensorflow 2.0: Plot activations and gradients of a model
First of all, in the TensorFlow documentation for the TensorBoard callback, https://www.tensorflow.org/api_docs/python/tf/keras/callbacks/TensorBoard , it is mentioned that you can plot histograms of the activations of a model. However, when I use the…
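For reference, the pattern the TF 2.x docs describe, hedged since the failing code is elided: histogram_freq must be non-zero and fit() must receive validation data for histograms to show up.

import tensorflow as tf

# histogram_freq=1 logs histograms every epoch; per the docs, validation
# data (or a validation split) must be passed to fit() for this to work
tb_callback = tf.keras.callbacks.TensorBoard(log_dir='./logs', histogram_freq=1)
# model.fit(x_train, y_train, validation_data=(x_val, y_val), callbacks=[tb_callback])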

Ζι Βάγγο
- 184
- 1
- 9
1
vote
1 answer
Training the same model with two different outputs with Keras
I have a simple GRU network coded with Keras in python as below:
gru1 = GRU(16, activation='tanh', return_sequences=True)(input)
dense = TimeDistributed(Dense(16, activation='tanh'))(gru1)
output = TimeDistributed(Dense(1,…
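A hedged sketch of the usual two-output functional-API pattern, with a hypothetical input shape and second head since the excerpt cuts off:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import GRU, TimeDistributed, Dense, Input

inp = Input(shape=(None, 8))                                 # hypothetical input
gru1 = GRU(16, activation='tanh', return_sequences=True)(inp)
out_a = TimeDistributed(Dense(1))(gru1)                      # first output
out_b = TimeDistributed(Dense(1))(gru1)                      # hypothetical second output

model = Model(inputs=inp, outputs=[out_a, out_b])
model.compile(optimizer='adam', loss=['mse', 'mse'])         # one loss per output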

ICHaLiL
- 29
- 3
1
vote
1 answer
keras - adding LeakyReLU to a sequential model throws an error
second_fashion_model.add(LeakyReLU(alpha=0.05))
throws error as:
The added layer must be an instance of class Layer. Found:
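This error message usually means the layer and the model come from different Keras packages (standalone keras vs. tensorflow.keras); a sketch that keeps every import under tensorflow.keras:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

# mixing `import keras` with `tensorflow.keras` makes LeakyReLU fail the
# isinstance(Layer) check; importing both from the same package avoids it
second_fashion_model = Sequential()
second_fashion_model.add(Dense(64, input_shape=(784,)))  # hypothetical preceding layer
second_fashion_model.add(LeakyReLU(alpha=0.05))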

jay
- 65
- 1
- 10
1
vote
1 answer
Don't all neurons in a neural network always fire/activate?
I'm a bit confused by activation functions and by blogs/posts that continually mention that neurons are not activated or do not fire.
But mathematically speaking, if whatever activation function (whether it's sigmoid, tanh, relu) calculates an output of…
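A small numeric illustration of the point: sigmoid never outputs exactly zero for finite inputs, while ReLU does for every non-positive input, which is the sense in which a ReLU neuron can be said not to fire.

import numpy as np

x = np.array([-2.0, -0.5, 3.0])
print(1 / (1 + np.exp(-x)))   # sigmoid: always in (0, 1), never exactly 0
print(np.maximum(0.0, x))     # relu: exactly 0 for every non-positive input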

Bob de Graaf
- 2,630
- 1
- 25
- 43
1
vote
1 answer
How do I modify the activation functions from keras?
I would like to use the relu activation function with its parameter alpha set to 0.2, but I could not figure out how this can be done for my model
import numpy
from tensorflow.keras.layers import Dense, Activation, Dropout, Input
from…
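One approach that stays close to the built-in API, assuming tf.keras: tf.keras.activations.relu accepts an alpha argument, so a partial (or a lambda) gives a callable that Keras layers accept:

from functools import partial
import tensorflow as tf
from tensorflow.keras.layers import Dense

# relu with a leaky slope of 0.2 on the negative side
leaky_relu_02 = partial(tf.keras.activations.relu, alpha=0.2)
layer = Dense(64, activation=leaky_relu_02)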

zyy
- 1,271
- 15
- 25
1
vote
0 answers
Implementation of softmax derivative
I know there are already multiple similar questions out there, but I still don't really understand the derivative of the softmax function. This is how I implemented the softmax function in Java:
public double[] activation(double[] input) {
    double[]…
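For reference, the softmax derivative is the Jacobian J[i][j] = s_i * (delta_ij - s_j); a minimal NumPy sketch (Python here rather than the questioner's Java, same math):

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())          # shift for numerical stability
    return e / e.sum()

def softmax_jacobian(s):
    # J[i, j] = s[i] * (kronecker_delta(i, j) - s[j])
    return np.diag(s) - np.outer(s, s)

s = softmax(np.array([1.0, 2.0, 3.0]))
print(softmax_jacobian(s))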

Jannik
- 399
- 2
- 5
- 22
1
vote
1 answer
Activation Function to help find anomalies
So I have one column of data; all the data in it is normal, without any anomalies.
Let's say the data is scattered just like the picture below.
K-means doesn’t really work on one column. I was given the advice to plot the data and then use an…

E199504
- 425
- 4
- 12
1
vote
0 answers
Accuracy differences when using custom activation functions in neural network
Note: The logistic activation function already exists in tf.nn.logistic. However, I am trying to develop a custom activation along similar lines. Please refer to the code snippet below.
def custom_logistic(x):
    value=…
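Hedged, since the function body is elided: a hand-rolled logistic built from TF ops is mathematically tf.math.sigmoid, and small numerical-stability differences between the two forms are a plausible source of accuracy gaps. A sketch:

import tensorflow as tf

def custom_logistic(x):
    # mathematically identical to tf.math.sigmoid(x), but the naive form
    # can overflow in exp(-x) for large negative x
    return 1.0 / (1.0 + tf.exp(-x))

x = tf.constant([-3.0, 0.0, 3.0])
print(custom_logistic(x))
print(tf.math.sigmoid(x))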

Mitra Lanka
- 69
- 2
- 9