Questions tagged [activation-function]

An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions: sigmoid, tanh, ReLU, etc.

343 questions
0 votes · 1 answer

Plotting activation function gradients in PyTorch?

Using a PyTorch model, I want to plot the gradients of the loss with respect to my activation functions (e.g. ReLU). For the non-activation layers I can get gradients as follows, but for the activation functions I cannot. How can I plot my…
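One common pattern (a minimal sketch; the toy model and the `"relu"` label are illustrative, not from the question): install a forward hook on the activation module and, inside it, a tensor hook on its output, which fires during backward with the gradient of the loss w.r.t. that output:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
grads = {}

def forward_hook(module, inputs, output):
    # The tensor hook receives d(loss)/d(activation output) during backward.
    output.register_hook(lambda g: grads.setdefault("relu", g.detach().clone()))

model[1].register_forward_hook(forward_hook)

x = torch.randn(4, 10)
model(x).sum().backward()
print(grads["relu"].shape)  # torch.Size([4, 16]); plot e.g. a histogram of it
```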
0 votes · 1 answer

Trainable beta in Swish activation function, CNN, torch

I am using the Swish activation function with a trainable parameter, following the paper "Swish: A Self-Gated Activation Function" by Prajit Ramachandran, Barret Zoph and Quoc V. Le. I am using a LeNet-5 CNN as a toy example on MNIST to train 'beta'…
Arun · 2,222
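For reference, a minimal PyTorch sketch of Swish with a trainable beta, following the paper's f(x) = x · sigmoid(βx) (not the asker's exact code):

```python
import torch
import torch.nn as nn

class Swish(nn.Module):
    """Swish with a learnable beta: f(x) = x * sigmoid(beta * x)."""
    def __init__(self, beta: float = 1.0):
        super().__init__()
        # nn.Parameter registers beta so the optimizer updates it with the weights
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x):
        return x * torch.sigmoid(self.beta * x)

# Drop-in replacement for nn.ReLU() in e.g. a LeNet-5 block:
block = nn.Sequential(nn.Conv2d(1, 6, kernel_size=5), Swish())
```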
0 votes · 0 answers

How to restrict the output value range for regression networks dynamically for each set of inputs

I used a regression CNN to estimate soil moisture from 5 input images and a numeric SMAP value, and the predicted soil moisture will always be less than this SMAP input. So I want a way to integrate this constraint into the model training so that…
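One way to encode such a per-sample upper bound (a sketch with made-up shapes and layer sizes): squash the raw prediction into (0, 1) with a sigmoid and multiply it by the SMAP input, so the output always stays below that value:

```python
import tensorflow as tf

image_in = tf.keras.Input(shape=(64, 64, 5))  # 5 stacked input images (assumed layout)
smap_in = tf.keras.Input(shape=(1,))          # the per-sample upper bound

x = tf.keras.layers.Conv2D(16, 3, activation="relu")(image_in)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
frac = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # in (0, 1)
out = tf.keras.layers.Multiply()([frac, smap_in])         # in (0, smap)

model = tf.keras.Model([image_in, smap_in], out)
```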
0 votes · 2 answers

Difference between ways of calling an activation function

While studying TensorFlow, a question came up. There are two ways to define an activation function: activation = 'relu' and activation = tf.nn.relu. I want to know the difference between them. (Actually, I think the other activation functions are included in…
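The short answer: both resolve to the same computation. The string form is looked up in Keras's activation registry; passing `tf.nn.relu` just supplies the callable directly, which is the only option for functions that have no string alias. A quick check:

```python
import tensorflow as tf

a = tf.keras.layers.Dense(8, activation="relu")      # string, resolved via the registry
b = tf.keras.layers.Dense(8, activation=tf.nn.relu)  # the callable itself

print(tf.keras.activations.get("relu"))  # the function the string resolves to
```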
0 votes · 2 answers

How to create a custom conditional activation function

I want to create a custom activation function in TF2. The math is like this: def sqrt_activation(x): if x >= 0: return tf.math.sqrt(x) else: return -tf.math.sqrt(-x). The problem is that I can't compare x with 0, since x is a…
David H. J. · 340
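The element-wise comparison itself is fine inside TensorFlow ops; the Python `if` is what fails on a tensor. A branch-free sketch of the same signed square root (note `tf.where(x >= 0, tf.sqrt(x), -tf.sqrt(-x))` would also return the right values, but leaks NaN gradients from the unused branch, so sign/abs is safer):

```python
import tensorflow as tf

def sqrt_activation(x):
    # sign(x) * sqrt(|x|) == sqrt(x) for x >= 0 and -sqrt(-x) for x < 0
    return tf.math.sign(x) * tf.math.sqrt(tf.math.abs(x))

print(sqrt_activation(tf.constant([-4.0, 0.0, 9.0])))  # [-2., 0., 3.]
```

One caveat: the gradient of this function is unbounded near zero, which may call for gradient clipping.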
0 votes · 1 answer

Adaptive activation function in TensorFlow 2: trained variable for multiple calls

So I want to try out an adaptive activation function for my neural network. This means I want a custom activation that is similar to a standard one (like tanh or ReLU), but with some trainable parameters added. Currently, I am trying to add…
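A common way to hold trainable activation parameters (a sketch, assuming a simple a · tanh(b · x) form rather than the asker's exact function): subclass `tf.keras.layers.Layer`, create the variables in `build`, and reuse the same layer instance wherever the parameters should be shared across calls:

```python
import tensorflow as tf

class AdaptiveTanh(tf.keras.layers.Layer):
    """Adaptive activation: f(x) = a * tanh(b * x) with trainable a, b."""
    def build(self, input_shape):
        # add_weight registers the variables once per layer instance
        self.a = self.add_weight(name="a", shape=(), initializer="ones", trainable=True)
        self.b = self.add_weight(name="b", shape=(), initializer="ones", trainable=True)

    def call(self, x):
        return self.a * tf.math.tanh(self.b * x)
```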
0 votes · 1 answer

sklearn: Set the attribute out_activation_ to 'logistic'

I need to set the attribute out_activation_ = 'logistic' in an MLPRegressor from sklearn. Supposedly this attribute can take the names of the relevant activation functions ('relu', 'logistic', 'tanh', etc.). The problem is that I cannot find the…
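For the record, `out_activation_` is an internal attribute that `fit()` overwrites (MLPRegressor hard-codes `'identity'`), so the only way to "set" it is after fitting — a fragile hack, shown here as a sketch on made-up data; transforming the targets instead is usually cleaner:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

X, y = np.random.rand(100, 3), np.random.rand(100)

reg = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500)
reg.fit(X, y)                     # fit() sets out_activation_ = 'identity'
reg.out_activation_ = "logistic"  # overridden afterwards; predict() now applies it
print(reg.predict(X[:3]))
```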
0 votes · 0 answers

How to get one-hot encoded output from a Keras sequential model + how to code a custom binary step function in Keras?

Basically I'm trying to get an array of 1s and 0s as the output from my sequential Keras model, so that I can unflatten the array into a one-hot encoded array. I assume the best way to do this would be to code a binary step function as a…
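A hard binary step has zero gradient almost everywhere, so it can't be trained through directly; the usual pattern is to train with a soft output and make it hard only when needed. A minimal sketch (shapes made up):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(5, activation="softmax"),  # soft, trainable output
])

probs = model(tf.random.normal((2, 10)))
one_hot = tf.one_hot(tf.argmax(probs, axis=-1), depth=5)  # exact one-hot rows
hard = tf.cast(probs > 0.5, tf.int32)                     # or per-unit 0/1 thresholding
```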
0 votes · 2 answers

What is the purpose of having the same input and output in PyTorch nn.Linear function?

I think this is a comprehension issue, but I would appreciate any help. I'm trying to learn how to use PyTorch for autoencoding. In the nn.Linear function, there are two specified parameters: nn.Linear(input_size, hidden_size). When reshaping a…
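In an autoencoder the layer widths mirror each other, which is usually why the same numbers show up twice: the encoder compresses and the decoder's last `nn.Linear` restores the original width. A minimal sketch with assumed sizes:

```python
import torch.nn as nn

autoencoder = nn.Sequential(
    nn.Linear(784, 64), nn.ReLU(),    # encoder: 784 -> 64 bottleneck
    nn.Linear(64, 784), nn.Sigmoid(), # decoder: 64 -> 784, back to input size
)
```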
0 votes · 2 answers

Function on a tensor value generates this error: 'false_fn' must be callable

I am creating a function that takes a tensor value and returns the result of applying the following formulation. There are 3 conditions, so I am using @tf.function. def Spa(x): x = tf.convert_to_tensor(float(x), dtype=tf.float32); p =…
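That error comes from passing already-evaluated tensors to `tf.cond`; both branches must be zero-argument callables, which TensorFlow invokes lazily. A sketch of the fix (the formulas are placeholders, not the asker's):

```python
import tensorflow as tf

x = tf.constant(-1.5)

# Wrong: tf.cond(x > 0, tf.sqrt(x), -x)  -> "'false_fn' must be callable"
y = tf.cond(x > 0,
            true_fn=lambda: tf.sqrt(x),
            false_fn=lambda: -x)
print(y)  # 1.5
```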
0 votes · 3 answers

Why ReLU function after every layer in CNN?

I am taking Intro to ML on Coursera, offered by Duke, which I recommend if you are interested in ML. The instructors of this course explained that "We typically include nonlinearities between layers of a neural network. There's a number of reasons to…
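The core reason is easy to demonstrate: without a nonlinearity, stacked linear layers collapse into a single linear map, so depth adds no expressive power. A quick check in PyTorch:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(1, 4)

f = nn.Linear(4, 8, bias=False)
g = nn.Linear(8, 3, bias=False)

stacked = g(f(x))                          # two layers, no activation between
collapsed = x @ (g.weight @ f.weight).T    # one equivalent weight matrix
print(torch.allclose(stacked, collapsed))  # True
```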
0 votes · 1 answer

Output 0 of DequantizeAndLinearBackward is a view and is being modified inplace. This view was created inside a custom Function and the autograd…

I am trying to fine-tune GPT-J, but I get this error. I think it's related to the activation function being in-place, but I don't know how to fix it in code. Is there a parameter inside the activation function that needs to be disabled? If yes,…
May Ouir · 21
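The usual fix for this family of errors (a sketch of the general pattern, not specific to GPT-J's code): the tensor being complained about is a view returned by a custom autograd Function, and something later mutates it in place, so make those operations out-of-place:

```python
import torch.nn as nn

act = nn.ReLU(inplace=False)  # instead of nn.ReLU(inplace=True)

# and inside a forward pass:
#   x = x + residual          # instead of x += residual
#   x = x.clone()             # clone the view before any in-place mutation
```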
0 votes · 1 answer

About tensorflow-wavelets

I'm trying to use tensorflow-wavelets as a layer. Everything so far is great; the problem is that my inputs are signal sequences, and when I run the code, it gives me error 1 below. When I searched in the code of the…
0 votes · 0 answers

Tanh activation function in perceptron returns RuntimeWarning: invalid value encountered in double_scalars

I am trying to implement a single-layer perceptron in Python. Some of the code is what I have written, and some is from my professor, who gave us a skeleton of basically how it should be implemented. Here is the necessary code to implement…
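That warning is typically a hand-rolled tanh overflowing for large |x|: `exp(x)` becomes `inf`, and `inf / inf` is NaN ("invalid value"). A demonstration and the fix:

```python
import numpy as np

def naive_tanh(x):
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

print(naive_tanh(1000.0))  # nan, with the RuntimeWarning from the question
print(np.tanh(1000.0))     # 1.0 -- the numerically stable built-in
```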
0 votes · 0 answers

How can I change the activation function of the nodes in the hidden layer using neurolab?

Hello dear users of neurolab, I want to change the activation function of the hidden-layer nodes to ReLU and keep a linear function in the output nodes. import numpy as np import neurolab as nl # Create train samples input = np.random.uniform(-1, 1, (5,…
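As far as I know, neurolab has no built-in ReLU transfer function, but a custom one can follow the interface of its built-ins in `nl.trans` (a callable with a `deriv` method plus `out_minmax`/`inp_active` range hints — this sketch assumes that interface, and the finite range hints are rough placeholders):

```python
import numpy as np
import neurolab as nl

class ReLU:
    out_minmax = [0, 500]   # rough output-range hint (assumed acceptable)
    inp_active = [0, 500]   # rough "active input" hint

    def __call__(self, x):
        return np.maximum(x, 0)

    def deriv(self, x, y):
        return (x > 0).astype(float)

# ReLU in the hidden layer, PureLin (identity) in the output layer:
net = nl.net.newff([[-1, 1]] * 2, [10, 1], transf=[ReLU(), nl.trans.PureLin()])
```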