ReLU is an abbreviation for Rectified Linear Unit, an activation function used in neural networks.
Questions tagged [relu]
101 questions
0 votes · 1 answer
Why can ReLU solve the vanishing gradient problem?
With sigmoid, the earliest edges rarely get updated, since the gradient is a product of many factors between 0 and 1. I've learned this is called the vanishing gradient problem.
But why isn't it a problem for ReLU? I think the earliest edges of a ReLU network also rarely get…

marks jun · 25 · 4
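A minimal NumPy sketch (not from the posted answer) of the usual explanation: the sigmoid derivative never exceeds 0.25, so a product over many layers shrinks geometrically, while an active ReLU path contributes a factor of exactly 1:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-2, 2, 5)
sig_grad = sigmoid(x) * (1 - sigmoid(x))   # sigmoid' <= 0.25 everywhere
relu_grad = (x > 0).astype(float)          # ReLU' is 0 or 1, never a small fraction

depth = 20
print(np.prod(np.full(depth, sig_grad.max())))  # ~0.25**20, about 9e-13
print(np.prod(np.full(depth, 1.0)))             # 1.0 along an active ReLU path
```

Note that ReLU does not remove the problem entirely: units whose input is negative pass zero gradient, which is the separate "dying ReLU" issue.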
0 votes · 1 answer
Creating a custom non-linearity like relu6
Can someone help me find the function that lets me fix a threshold (maximum) not to be exceeded, as in relu6?
I tried
X = max(X , 6)
but I received this error:
(OperatorNotAllowedInGraphError: using a tf.Tensor as a Python bool is not…
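A sketch of one likely fix, assuming TensorFlow 2.x: Python's built-in max tries to evaluate a tensor as a bool (hence the OperatorNotAllowedInGraphError), and capping at a maximum needs min rather than max. tf.minimum, tf.clip_by_value, or the built-in tf.nn.relu6 all work elementwise:

```python
import tensorflow as tf

x = tf.constant([-2.0, 3.0, 7.5])

capped = tf.minimum(x, 6.0)              # cap at 6: min, not max
relu6 = tf.clip_by_value(x, 0.0, 6.0)    # clamp to [0, 6], same as tf.nn.relu6
print(capped.numpy(), relu6.numpy())     # [-2.  3.  6.] [0. 3. 6.]
```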
0 votes · 1 answer
Neural network for fitting a sine wave
So, I have been learning about neural networks, have tried coding them from scratch, and have been successful in some instances. So I thought of fitting a simple single-layer neural network to a sine wave.
I know I can use Keras, but I want to learn the…

Milind Kamath · 1 · 1
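A minimal from-scratch sketch of the kind of network the question describes, assuming NumPy and a tanh hidden layer (the layer sizes and learning rate are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)   # one hidden layer, 16 units
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(3000):
    h = np.tanh(x @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    g_pred = 2 * (pred - y) / len(x)      # dMSE/dpred
    g_W2 = h.T @ g_pred;  g_b2 = g_pred.sum(0)
    g_h = g_pred @ W2.T * (1 - h ** 2)    # backprop through tanh
    g_W1 = x.T @ g_h;     g_b1 = g_h.sum(0)
    W2 -= lr * g_W2; b2 -= lr * g_b2      # plain gradient descent
    W1 -= lr * g_W1; b1 -= lr * g_b1

print(np.mean((pred - y) ** 2))           # should end well below the initial MSE
```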
0 votes · 1 answer
Backpropagation with ReLU - Understanding the calculation
I've been getting started with neural networks and am attempting to implement a forward and backward pass with a ReLU activation function. However, I feel like I'm misunderstanding something relatively fundamental here when it comes to the backward…

user25758 · 109 · 10
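A minimal NumPy sketch of the standard ReLU backward pass: the local derivative is 1 where the input was positive and 0 elsewhere, so the backward step just masks the upstream gradient:

```python
import numpy as np

def relu_forward(x):
    return np.maximum(0.0, x)

def relu_backward(upstream_grad, x):
    # Local derivative is 1 where the input was positive, 0 elsewhere,
    # so the backward pass simply masks the upstream gradient.
    return upstream_grad * (x > 0)

x = np.array([-1.5, 0.3, 2.0])
print(relu_forward(x))               # [0.  0.3 2. ]
print(relu_backward(np.ones(3), x))  # [0. 1. 1.]
```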
0 votes · 1 answer
How to simulate ReLU gradient with tf.GradientTape
TensorFlow has a feature called GradientTape, which seems to be a way of getting gradients (a Monte Carlo method?).
I'm trying to simulate the gradient of ReLU, but this doesn't work on the negative half of X.
#colab or ipython reset
%reset -f
#libs
import…

Dee · 7,455 · 6 · 36 · 70
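For reference, GradientTape is reverse-mode automatic differentiation (it records operations as they execute), not a Monte Carlo method, and a zero gradient on the negative half of x is the correct answer for ReLU. A minimal check, assuming TensorFlow 2.x:

```python
import tensorflow as tf

x = tf.Variable([-2.0, -0.5, 0.5, 2.0])
with tf.GradientTape() as tape:
    y = tf.nn.relu(x)

# Exact reverse-mode derivative at each point: 0 for x < 0, 1 for x > 0.
print(tape.gradient(y, x).numpy())  # [0. 0. 1. 1.]
```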
0 votes · 1 answer
ReLU neural network outputting 0 or 1
I tried implementing a simple neural network using both sigmoid and ReLU activation functions.
With the sigmoid function I got some good outputs, but when using ReLU I got an array of either 0's or 1's.
(I need the ReLU function because I'm willing to use the…

yo bar · 13 · 5
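One common remedy when ReLU units get stuck at constant outputs is a leaky variant that keeps a small gradient for negative inputs; a minimal NumPy sketch (the slope alpha is an illustrative choice):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # A small nonzero slope for x <= 0 keeps otherwise "dead" units trainable.
    return np.where(x > 0, 1.0, alpha)

x = np.array([-3.0, -0.1, 0.0, 2.0])
print(leaky_relu(x))       # [-0.03, -0.001, 0., 2.]
print(leaky_relu_grad(x))  # [0.01, 0.01, 0.01, 1.]
```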
0 votes · 0 answers
TensorFlow activation function with Dense layer
I would like to know if it's possible to define a custom activation function (like tanh) to constrain the 12 outputs of the dense (fully connected) layer like this:
If X[0:6] > 1 then X[0:6] else 1
If X[6:12] < 0 then X[6:12] else 0
X is…

SkyEeros · 1 · 2
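A sketch of one way to express those two conditions, assuming TensorFlow 2.x / Keras; split_activation is a hypothetical name, and the conditions translate to an elementwise floor of 1 on the first six outputs and a cap of 0 on the last six:

```python
import tensorflow as tf
from tensorflow import keras

def split_activation(x):
    head = tf.maximum(x[:, 0:6], 1.0)   # keep values > 1, otherwise 1
    tail = tf.minimum(x[:, 6:12], 0.0)  # keep values < 0, otherwise 0
    return tf.concat([head, tail], axis=1)

layer = keras.layers.Dense(12, activation=split_activation)
print(layer(tf.zeros((1, 4))).numpy())  # first six entries >= 1, last six <= 0
```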
0 votes · 0 answers
How to define a ReLU with TensorFlow custom_gradient?
I'm practicing using TensorFlow's custom_gradient decorator and I tried to define a simple ReLU. One would think it would be as simple as defining the gradient to be 1 when x > 0 and 0 otherwise. However, the following code does not yield the same…

Sam Lerman · 301 · 2 · 8
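For comparison, a custom_gradient ReLU that does match tf.nn.relu (whose derivative is taken as 0 at x == 0); a minimal sketch assuming TensorFlow 2.x:

```python
import tensorflow as tf

@tf.custom_gradient
def my_relu(x):
    result = tf.maximum(x, 0.0)
    def grad(upstream):
        # Derivative of ReLU: 1 where x > 0, 0 otherwise (including x == 0).
        return upstream * tf.cast(x > 0, x.dtype)
    return result, grad

x = tf.Variable([-1.0, 2.0])
with tf.GradientTape() as tape:
    y = my_relu(x)
print(tape.gradient(y, x).numpy())  # [0. 1.]
```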
0 votes · 1 answer
Set up different activation functions for different layers using the "neuralnet" package
Hi,
I am working with neuralnet in R.
I used to program this kind of thing using Keras in Python, so I would expect to be able to set up different activation functions for different layers.
Let me explain. Suppose I want to build a neural net with 2…

clarkmaio · 359 · 2 · 12
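As far as I know, neuralnet's act.fct applies a single activation to all hidden layers, so there is no direct per-layer equivalent; for contrast, a minimal Keras sketch where each layer takes its own activation (layer sizes are illustrative):

```python
from tensorflow import keras

# Each Keras layer takes its own activation, unlike neuralnet's single act.fct.
model = keras.Sequential([
    keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    keras.layers.Dense(8, activation="tanh"),
    keras.layers.Dense(1, activation="linear"),
])
model.summary()
```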
0 votes · 2 answers
My convolutional network's loss does not change and stays stagnant throughout training. How do I fix this?
I am trying to train a convolutional network, but the loss does not change no matter what I do. I want to know where I am going wrong and would also appreciate any friendly advice, as this is the first time I am dealing with such large data.
I have…

sparkles · 174 · 8
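Without the full code one can only guess, but two frequent causes of a flat loss are unscaled inputs and an unsuitable learning rate. A hedged Keras sketch with illustrative shapes and values:

```python
import numpy as np
from tensorflow import keras

# Random placeholder data standing in for the real dataset.
x_train = np.random.rand(64, 28, 28, 1).astype("float32")   # already in [0, 1]
y_train = keras.utils.to_categorical(np.random.randint(10, size=64), 10)

model = keras.Sequential([
    keras.layers.Conv2D(8, 3, activation="relu", input_shape=(28, 28, 1)),
    keras.layers.Flatten(),
    keras.layers.Dense(10, activation="softmax"),
])

# A flat loss often points at the learning rate or at unscaled inputs
# (e.g. raw pixel values in [0, 255] instead of [0, 1]).
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, verbose=0)
```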
0 votes · 1 answer
Does ReLU just change all negative values to 0?
I have just spent an hour reading a paper on Rectified Linear Units (ReLU). I find it very difficult to unpack the maths involved.
Is it basically just saying that, after you do convolution and pooling, you change any negative values to 0?
Is that…

Simon Kiely · 5,880 · 28 · 94 · 180
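In short, yes: elementwise, ReLU(x) = max(0, x), so applied to a feature map after convolution and pooling it only zeroes the negative entries:

```python
import numpy as np

feature_map = np.array([[-1.2, 0.5],
                        [3.0, -0.1]])
print(np.maximum(0, feature_map))
# [[0.  0.5]
#  [3.  0. ]]
```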
0 votes · 0 answers
How to implement the code of this new activation function for a CNN
There are ReLU, leaky ReLU, and a new type of ReLU that I designed myself, but I don't know how to implement it. I hope you can help me!
[image]

YFye · 77 · 2 · 4
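Since the actual formula is only in the linked image, here is just a generic template, assuming TensorFlow 2.x / Keras: any elementwise function of a tensor can be passed as a layer's activation (my_activation and the leaky slope are placeholders):

```python
import tensorflow as tf
from tensorflow import keras

def my_activation(x):
    # Placeholder body: substitute the formula from the question's image here.
    return tf.where(x > 0, x, 0.1 * x)

# Any elementwise tensor function can be passed as a layer activation.
layer = keras.layers.Conv2D(16, 3, activation=my_activation)
```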
0 votes · 1 answer
Neural Network ReLU Outputting All 0s
Here is a link to my project: https://github.com/aaronnoyes/neural-network/blob/master/nn.py
I have implemented a basic neural network in Python. By default it uses a sigmoid activation function, and that works great. I'm trying to compare changes in…

Aaron · 117 · 7
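A frequent cause of all-zero ReLU outputs is initialization and step size tuned for sigmoid: if every pre-activation goes negative, gradients vanish entirely ("dying ReLU"). A small NumPy sketch of He initialization, which is scaled for ReLU (sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in = 64

# He initialization scales weights by sqrt(2 / fan_in) so that ReLU
# pre-activations stay in a range where a good fraction of units fire.
W_he = rng.normal(0.0, np.sqrt(2.0 / fan_in), (fan_in, 32))

x = rng.normal(size=(8, fan_in))
h = np.maximum(0, x @ W_he)
print((h > 0).mean())  # roughly half the units active, rather than 0.0
```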
0 votes · 0 answers
Can ReLU replace a sigmoid activation function in a neural network?
I'm new to this and I'm trying to replace the sigmoid activation function in the following simple NN with ReLU. Can I do that? I've tried replacing the sigmoid function, but it's not working. The output should be the AND gate (if input (0,0) ->…

andree17914 · 1 · 3
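A sketch of one workable arrangement, assuming NumPy: ReLU in the hidden layer but a sigmoid at the output, since 0/1 targets like the AND gate pair naturally with a sigmoid plus binary cross-entropy (sizes, seed, and learning rate are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [0], [0], [1]], dtype=float)   # AND gate targets

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

for _ in range(2000):
    h = np.maximum(0, X @ W1 + b1)        # ReLU hidden layer
    out = sigmoid(h @ W2 + b2)            # sigmoid output for 0/1 targets
    g_out = out - y                       # binary cross-entropy grad w.r.t. logits
    g_W2 = h.T @ g_out;  g_b2 = g_out.sum(0)
    g_h = (g_out @ W2.T) * (h > 0)        # ReLU mask in the backward pass
    g_W1 = X.T @ g_h;    g_b1 = g_h.sum(0)
    W2 -= 0.1 * g_W2; b2 -= 0.1 * g_b2
    W1 -= 0.1 * g_W1; b1 -= 0.1 * g_b1

print(out.round(2).ravel())  # should approach [0. 0. 0. 1.]
```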
0 votes · 1 answer
Linear Regression using Neural Network
I am working on a regression problem with the following sample training data.
As shown, I have an input of only 4 parameters, with only one of them changing (Z), so the rest carry no real information, while the output has 124 parameters denoted O1…

Saeed AbdelWahab · 23 · 5
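For a regression target, a linear output layer with MSE loss is the usual baseline; a minimal Keras sketch using the shapes mentioned in the question (4 inputs, 124 outputs; the data here is a random placeholder):

```python
import numpy as np
from tensorflow import keras

# Random placeholder data with the question's shapes: 4 inputs, 124 outputs.
x = np.random.rand(100, 4).astype("float32")
y = np.random.rand(100, 124).astype("float32")

model = keras.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(4,)),
    keras.layers.Dense(124, activation="linear"),  # no squashing for regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=2, verbose=0)
```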