Questions tagged [activation-function]

An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions include sigmoid, tanh, and ReLU.

343 questions
0 votes · 1 answer

keras custom activation function with threshold replacement

For the custom activation function below, which maps outputs to scores, I want to replace the values in activated_x that are less than threshold=0.5 with 0. How can I modify it? def ScoreActivationFromSigmoid(x, target_min=1, target_max=9) : …
Isaac Sim · 539 · 1 · 7 · 23
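The replacement step the question asks about can be sketched independently of the scoring function itself. A minimal NumPy sketch, assuming array-valued activations (the helper name `threshold_scores` is mine, not the asker's):

```python
import numpy as np

def threshold_scores(activated_x, threshold=0.5):
    """Zero out any activated value below the threshold.

    `activated_x` and `threshold` mirror the names in the question;
    the scoring activation itself is omitted, so this only shows the
    element-wise replacement step.
    """
    activated_x = np.asarray(activated_x, dtype=float)
    return np.where(activated_x < threshold, 0.0, activated_x)

print(threshold_scores([0.2, 0.5, 0.9]))  # values below 0.5 become 0
```

In Keras the same `where`-style masking can be applied inside the custom activation with the backend's tensor ops, keeping the operation differentiable everywhere except at the threshold.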
0 votes · 3 answers

Role of activation function in calculating the cost function for artificial neural networks

I have some difficulty understanding the role of activation functions and cost functions. Let's take a look at a simple example. Let's say I am building a neural network (artificial neural network). I have 5 "x" variables and one "y" variable.…
Emil · 11 · 4
0 votes · 1 answer

Neural Network pruning mechanism

I am working on SqueezeNet pruning. I have some questions regarding the pruning code, which is based on the paper: PRUNING CONVOLUTIONAL NEURAL NETWORKS FOR RESOURCE EFFICIENT INFERENCE def compute_rank(self, grad): activation_index =…
0 votes · 1 answer

Derivative of activation function vs partial derivative wrt. loss function

Some terms in AI are confusing me. Is the derivative used in backpropagation the derivative of the activation function or the derivative of the loss function? These terms are confusing: derivative of act. function, partial derivative wrt. loss…
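The short answer is that backpropagation uses both, combined by the chain rule. A small sketch with a sigmoid activation and squared-error loss (the specific values are illustrative only), checked against a finite difference:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

z = 0.3          # pre-activation (input to the activation function)
y = 1.0          # target
a = sigmoid(z)   # activation output

# Chain rule: the gradient backprop propagates through the neuron is the
# product of (derivative of the loss w.r.t. the activation) and
# (derivative of the activation w.r.t. its input).
dL_da = 2.0 * (a - y)    # from the squared-error loss (a - y)^2
da_dz = a * (1.0 - a)    # from the sigmoid
dL_dz = dL_da * da_dz
print(dL_dz)
```

So neither derivative is used "instead of" the other: the loss derivative says how wrong the output is, and the activation derivative says how the pre-activation influences that output.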
0 votes · 1 answer

How to use maxout in Tensorflow?

Hi guys! I have a question to ask. If I want to use maxout as an activation function, how should I write the code in Tensorflow? An input parameter is required in the slim.maxout() function, so it cannot be used for slim.arg_scope([slim.conv],…
wang · 1 · 2
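Maxout splits the feature axis into groups and keeps the maximum within each group. A NumPy sketch of the idea (this shows the computation itself, not the slim/arg_scope wiring the question is about):

```python
import numpy as np

def maxout(x, num_units):
    """Maxout over the last axis: split the features into `num_units`
    groups and take the element-wise maximum within each group."""
    x = np.asarray(x, dtype=float)
    features = x.shape[-1]
    assert features % num_units == 0, "features must divide evenly into groups"
    k = features // num_units
    return x.reshape(x.shape[:-1] + (num_units, k)).max(axis=-1)

print(maxout([1.0, -2.0, 3.0, 0.5], num_units=2))  # -> [1. 3.]
```

Because maxout needs the group count as a parameter, in a framework it is usually applied as a separate layer after the linear op rather than passed as a plain `activation_fn`.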
0 votes · 1 answer

How to plot keras activation functions in a notebook

I wanted to plot all Keras activation functions, but some of them are not working, e.g. linear throws an error: AttributeError: 'Series' object has no attribute 'eval', which is weird. How can I plot the rest of my activation functions? points =…
KIC · 5,887 · 7 · 58 · 98
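One way to sidestep errors from feeding pandas objects into backend functions is to evaluate plain NumPy re-implementations of the common activations on an array. A sketch (these are my own NumPy stand-ins, not calls into the Keras API):

```python
import numpy as np

# NumPy stand-ins for a few common Keras activations, evaluated on a
# plain array rather than a pandas Series.
activations = {
    "linear":  lambda x: x,
    "relu":    lambda x: np.maximum(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "tanh":    np.tanh,
}

points = np.linspace(-3.0, 3.0, 7)
curves = {name: fn(points) for name, fn in activations.items()}
# To plot: for name, ys in curves.items(): plt.plot(points, ys, label=name)
print(curves["relu"])
```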
0 votes · 0 answers

Normalization for MLPRegressor

I'm using scikit-learn and trying to understand how to normalize my input variables for different activation functions using MLPRegressor: ReLU, Tanh, Logistic. How can I properly normalize the data for these activation functions?
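A common default for all three activations is to standardize the inputs to zero mean and unit variance (scikit-learn's StandardScaler does this); the sketch below shows the arithmetic on a toy matrix, using NumPy rather than the scaler itself:

```python
import numpy as np

# Standardize each feature column to zero mean and unit variance.
# This is the same transform StandardScaler applies, written out by hand.
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])

mean = X.mean(axis=0)
std = X.std(axis=0)
X_scaled = (X - mean) / std
print(X_scaled.mean(axis=0), X_scaled.std(axis=0))  # ~0 and ~1 per column
```

For the logistic activation some practitioners instead scale inputs to [0, 1] with min-max scaling; which is better for a given dataset is an empirical question.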
0 votes · 0 answers

Inception Model mxnet on Raspberry Pi: fix an unknown activation type error?

I'm trying to implement the Inception model on a Raspberry Pi using mxnet, and getting an error that I haven't been able to unravel: "unknown activation type". I successfully installed opencv, mxnet, and all dependencies. I'm running a short python…
DJP_123 · 1 · 1
0 votes · 2 answers

Understanding of threshold value in a neural network

Consider the hypothetical neural network here $o_1$ is the output of neuron 1. $o_2$ is the output of neuron 2. $w_1$ is the weight of connection between 1 and 3. $w_2$ is the weight of connection between 2 and 3. So the input to neuron 3…
idpd15 · 448 · 2 · 5 · 22
0 votes · 2 answers

Implementation of tanh() activation function for a CNN

I'm trying to implement the tanh activation function in my CNN, but it doesn't work; the result is always "NaN". So I created a simple application where I have a randomized matrix and try to apply the tanh(x) function, to understand where the…
Sabrina Tesla · 13 · 1 · 10
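One frequent cause of "always NaN" in hand-rolled tanh is overflow in the textbook formula: for large |x|, both exponentials overflow to infinity and inf/inf evaluates to NaN. A sketch of the failure and a numerically stable rewrite (I am assuming this is the cause; the question's own code is not shown):

```python
import numpy as np

def naive_tanh(x):
    # Textbook formula: exp(x) overflows for large |x|, giving inf/inf = NaN.
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

def stable_tanh(x):
    # Only exponentiate a non-positive argument, so exp() never overflows.
    x = np.asarray(x, dtype=float)
    e = np.exp(-2.0 * np.abs(x))
    return np.sign(x) * (1.0 - e) / (1.0 + e)

with np.errstate(over="ignore", invalid="ignore"):
    print(naive_tanh(1000.0))   # nan
print(stable_tanh(1000.0))      # 1.0
```

In practice, calling the library's own `tanh` (math.tanh, np.tanh, or the framework's op) is the safest fix, since those already handle the extreme-value cases.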
0 votes · 0 answers

Neural Chess: Sample neural network gets stuck at value

I am attempting to write a neural network to play chess, but I am running into problems with the output. I am using the python-chess library and built in rewards. The network has 4 outputs and three fully connected layers. The 4 outputs should map…
0 votes · 0 answers

Which activation function should I use for my output layer if my output is a GloVe vector representing a word

My output is a 332-dimensional vector (300 GloVe + 32 from my custom vector); the values of this vector range from -1 to +1. I got horrible results using sigmoid, as it confined the output to 0 to 1. I'm trying tanh right now. What about softmax? Does it suit my case?
user9929853
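The output ranges themselves suggest the answer: sigmoid maps to (0, 1), tanh to (-1, 1), and softmax produces a probability distribution (non-negative values summing to 1), which is a poor fit for dense real-valued embedding targets. A quick numeric check:

```python
import numpy as np

x = np.array([-5.0, 0.0, 5.0])

sigmoid = 1.0 / (1.0 + np.exp(-x))            # stays in (0, 1)
tanh = np.tanh(x)                              # covers (-1, 1)
softmax = np.exp(x - x.max()) / np.exp(x - x.max()).sum()  # sums to 1

print(sigmoid.min(), tanh.min(), softmax.sum())
```

So tanh matches a [-1, 1] target range directly, while softmax would force the 332 components to compete for a total mass of 1.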
0 votes · 1 answer

Is there a better activation function for my neural network?

I am writing a program to recognize handwritten letters. I have 500px*500px images that I import as BufferedImages, and I am taking every pixel's getRGB() value as an input to the neural network; therefore there are 250,000 inputs. The values for…
0 votes · 1 answer

Neural Network with Input - Relu - SoftMax - Cross Entropy Weights and Activations grow unbounded

I have implemented a neural network with 3 layers: input, to a hidden layer with 30 neurons (ReLU activation), to a softmax output layer. I am using the cross-entropy cost function. No outside libraries are being used. This is working on the MNIST dataset…
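When pre-activations grow large in a hand-rolled ReLU-softmax network, a naive softmax overflows in `exp()` long before anything else fails. The standard max-subtraction trick keeps the result identical mathematically but finite numerically; a sketch (this addresses a common numerical failure mode in such setups, not necessarily the asker's exact bug):

```python
import numpy as np

def stable_softmax(z):
    # Subtracting the row max before exponentiating leaves the softmax
    # unchanged mathematically but prevents exp() overflow as the
    # pre-activations grow large.
    z = np.asarray(z, dtype=float)
    shifted = z - z.max(axis=-1, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=-1, keepdims=True)

print(stable_softmax([1000.0, 1001.0, 1002.0]))  # finite, sums to 1
```

If weights themselves grow without bound, the usual remedies are a lower learning rate and L2 weight decay; the stable softmax only keeps the forward pass from turning those large values into inf/NaN.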
0 votes · 1 answer

About the impact of activation functions in CNN on computation time

Currently I am reading the following paper: "SqueezeNet: AlexNet-level accuracy with 50 x fewer parameters and <0.5 MB model size". In this 4.2.3 (Activation function layer), there is the following statement: The ramifications of the activation…