Questions tagged [relu]

ReLU is an abbreviation for Rectified Linear Unit, an activation function used in neural networks.

101 questions
1 vote, 0 answers

Why does VGG16 use ReLU after each convolution layer?

In the CS231n course, it says we want zero-centered data to prevent the local gradient from always having the same sign as the upstream gradient coming down, which causes inefficient gradient updates. But using ReLU in each layer is going to output all…
Teresa Ho • 63 • 8
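
A minimal numpy sketch of the premise behind the question: whatever comes in, ReLU outputs are never negative, so the activations fed to the next layer are not zero-centered (the CS231n sign-of-gradient argument starts from exactly this observation).

    import numpy as np

    x = np.random.randn(10000)    # roughly zero-centered pre-activations
    a = np.maximum(0.0, x)        # ReLU output: every value is >= 0
    print(x.mean(), a.mean())     # the ReLU outputs are no longer zero-centered
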
1 vote, 0 answers

Cannot use relu activation in IndyLSTMCell in TensorFlow 1.10

I tried the IndyLSTMCell in TensorFlow 1.10. It works with the default activation (tanh), but it does not work with nn_ops.relu. When I set the activation to relu, the loss became NaN. IndyGRUCell has the same problem. The relu activation does work…
Bo Shao • 143 • 6
1 vote, 2 answers

TensorFlow ReLU doesn't work?

I have written a convolutional network in TensorFlow with ReLU as the activation function; however, it is not learning (the loss is constant for both the eval and the train data set). With different activation functions everything works as it should. Here is the code…
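
The asker's code is truncated, so this is only a hedged sketch of the usual first checks when a ReLU network's loss stays flat (the dying-ReLU problem), not a diagnosis: He initialization and a modest learning rate, shown here on a hypothetical small Keras model.

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu",
                               kernel_initializer="he_normal",   # He init is tailored to ReLU
                               input_shape=(28, 28, 1)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
                  loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
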
0 votes, 0 answers

Find the mathematical interpretation of the loss function and backpropagation

I have code that uses machine learning and a neural network. I used TensorFlow 2.0 and Keras. It is a classification program that gives its output as 0 or 1. I used ReLU as the activation function, and sparse softmax cross-entropy is used as the loss function.…
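
Since the excerpt names sparse softmax cross-entropy, a hand-written numpy sketch of what that loss computes for a single example may help with the mathematical interpretation (this is an illustration, not the TensorFlow implementation):

    import numpy as np

    def sparse_softmax_cross_entropy(logits, label):
        # L = -log softmax(logits)[label]
        #   = -logits[label] + log(sum_j exp(logits[j])), shifted for numerical stability
        shifted = logits - logits.max()
        log_probs = shifted - np.log(np.exp(shifted).sum())
        return -log_probs[label]

    print(sparse_softmax_cross_entropy(np.array([2.0, -1.0]), label=1))
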
0 votes, 0 answers

What is the importance of using ReLU?

I get confused by activation functions. Why do we use the ReLU function so widely when, in the end, its mapping is a line? Using sigmoid and tanh makes the decision boundary a squiggle that fits the data well, but ReLU maps a line(…
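
A minimal numpy sketch of why stacked ReLUs are not "just a line": two ReLU units with made-up weights already give a piecewise-linear function that is not a single straight line.

    import numpy as np

    def relu(z):
        return np.maximum(0.0, z)

    x = np.linspace(-3, 3, 7).reshape(-1, 1)
    W1, b1 = np.array([[1.0, -1.0]]), np.array([0.5, 0.5])   # two hidden units, made-up weights
    W2 = np.array([[1.0], [1.0]])
    y = relu(x @ W1 + b1) @ W2      # piecewise linear, but clearly not one line
    print(np.hstack([x, y]))
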
0 votes, 0 answers

tf.GradientTape computing gradients as None

I want to implement a new version of ReLU with a learnable parameter 'p' as follows - import tensorflow as tf #myrelu x_shape = [4,6] p = tf.Variable(tf.random.uniform([], minval=-10, maxval=10)) x = tf.Variable(tf.random.uniform(x_shape,…
psj • 356 • 3 • 18
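
The asker's snippet is cut off, so only a hedged sketch: gradients usually come back as None when the variable is used outside the tape or the chain to it is broken by non-TensorFlow ops. Keeping the parameterised ReLU entirely inside the tape gives non-None gradients for both x and p.

    import tensorflow as tf

    p = tf.Variable(tf.random.uniform([], minval=-10, maxval=10))
    x = tf.Variable(tf.random.uniform([4, 6], minval=-1.0, maxval=1.0))

    with tf.GradientTape() as tape:
        y = tf.where(x > 0.0, x, p * x)   # ReLU-like op with a learnable slope p
        loss = tf.reduce_sum(y)

    print(tape.gradient(loss, [x, p]))    # neither gradient should be None
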
0 votes, 0 answers

Custom ReLU layer with TensorFlow

For pedagogic purposes, I would like to implement a custom dense ReLU layer (without bias) using TensorFlow. The idea is to clearly see the backpropagation, both for the inputs and for the synaptic weights. The custom gradient is thus not on the…
MadMax2048 • 21 • 2
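
A minimal sketch, assuming TF 2.x, of a bias-free dense + ReLU op with an explicit custom gradient, so the backpropagation to both the inputs and the synaptic weights is spelled out by hand:

    import tensorflow as tf

    @tf.custom_gradient
    def dense_relu(x, w):
        z = tf.matmul(x, w)               # forward pass, no bias
        y = tf.nn.relu(z)

        def grad(dy):
            dz = dy * tf.cast(z > 0.0, dy.dtype)      # gate the upstream gradient with the ReLU mask
            dx = tf.matmul(dz, w, transpose_b=True)   # gradient w.r.t. the inputs
            dw = tf.matmul(x, dz, transpose_a=True)   # gradient w.r.t. the synaptic weights
            return dx, dw

        return y, grad

    x = tf.random.normal([5, 3])
    w = tf.Variable(tf.random.normal([3, 2]))
    with tf.GradientTape() as tape:
        out = tf.reduce_sum(dense_relu(x, w))
    print(tape.gradient(out, w))
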
0 votes, 2 answers

The function for a tensor value generates this error: 'false_fn' must be callable

I am creating a function that takes a tensor value and returns the result of applying the following formulation. There are 3 conditions, so I am using @tf.function. def Spa(x): x = tf.convert_to_tensor(float(x), dtype=tf.float32) p=…
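
The formulation itself is truncated, so the three branches below are invented; the point of the sketch is the cause of the error message: tf.cond expects both branches as callables (e.g. lambdas), not as already-evaluated tensors.

    import tensorflow as tf

    @tf.function
    def spa(x):
        x = tf.convert_to_tensor(x, dtype=tf.float32)
        # passing tensors instead of callables here raises "'false_fn' must be callable"
        return tf.cond(x > 1.0,
                       lambda: x,
                       lambda: tf.cond(x > 0.0,
                                       lambda: x * x,
                                       lambda: tf.zeros_like(x)))

    print(spa(0.5))
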
0 votes, 0 answers

Is there any way to rename relu to relu_love?

I want to rename the operation layers at a deeper level, not the sequential model layers. For example: concatV2, AvgPool, AddN, Sub, MaxPool, relu, reluGrad, etc. I want to change the names of these operations. I couldn't find anything related to it no…
0 votes, 0 answers

As a result of profiling, can the occurrence of each operation change in the same model?

I profiled several CNN models, such as ResNet50, with the TensorFlow Profiler. However, I confirmed that the operations showed different occurrence counts in different environments (GPU, batch size, etc.). I thought that the profiling layer, i.e. operation, was…
yoon • 33 • 7
0 votes, 1 answer

Setting ReLU inplace to 'False'

Below I have written code which accepts a pretrained model as an argument (VGG, ResNet, DenseNet, etc.) and returns the model with each ReLU's inplace state set to 'False'. It was written after testing many different specific architectures. I would like to re-write it in a…
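
The architecture-specific code isn't shown, so as a generic sketch: model.modules() already walks every nested container, so one loop handles VGG, ResNet, DenseNet and the rest alike.

    import torch.nn as nn
    from torchvision import models

    def disable_inplace_relu(model: nn.Module) -> nn.Module:
        # visit every submodule, whatever the architecture, and flip the flag
        for module in model.modules():
            if isinstance(module, nn.ReLU):
                module.inplace = False
        return model

    model = disable_inplace_relu(models.vgg16())
    print(any(m.inplace for m in model.modules() if isinstance(m, nn.ReLU)))   # False
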
0 votes, 1 answer

Using PReLU in TensorFlow

I am building a reinforcement learning model. I am trying to use PReLU in my 2D conv model using TensorFlow. Here is the code for the actor model. code: from tensorflow.keras.layers import Conv2D, Input, MaxPool1D, concatenate, Lambda, Dense,…
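
A minimal sketch of the usual pattern, with a hypothetical input shape: give Conv2D no built-in activation and apply tf.keras.layers.PReLU as its own layer, sharing the learned slope across the spatial axes.

    from tensorflow.keras import layers, Model

    inp = layers.Input(shape=(64, 64, 3))          # placeholder input shape
    x = layers.Conv2D(32, 3, padding="same")(inp)  # no activation here
    x = layers.PReLU(shared_axes=[1, 2])(x)        # one learned slope per channel
    x = layers.Conv2D(64, 3, padding="same")(x)
    x = layers.PReLU(shared_axes=[1, 2])(x)
    x = layers.GlobalAveragePooling2D()(x)
    out = layers.Dense(4)(x)
    actor = Model(inp, out)
    actor.summary()
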
0 votes, 0 answers

Leaky ReLU backpropagation with numpy

I wanted to implement the Leaky ReLU activation function with numpy (forward and backward pass) and wanted to get some comments on whether this implementation is correct. So Leaky ReLU(x) = x if x > 0 and alpha * x if x <= 0. This means the…
binaryBigInt • 1,526 • 2 • 18 • 44
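
For comparison with the asker's version, a minimal numpy sketch of the forward and backward pass that follows directly from Leaky ReLU(x) = x for x > 0 and alpha * x otherwise:

    import numpy as np

    def leaky_relu_forward(x, alpha=0.01):
        out = np.where(x > 0, x, alpha * x)
        cache = (x, alpha)
        return out, cache

    def leaky_relu_backward(dout, cache):
        x, alpha = cache
        dx = dout * np.where(x > 0, 1.0, alpha)   # local derivative: 1 for x > 0, alpha otherwise
        return dx

    x = np.array([-2.0, -0.5, 0.0, 1.5])
    out, cache = leaky_relu_forward(x)
    print(out, leaky_relu_backward(np.ones_like(x), cache))
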
0 votes, 1 answer

What's wrong with my relu_backward? The error is always 1.0

I am writing the CS231n assignment1 two-layer-net and I have run into difficulty with relu_backward. My implementation is as below: def relu_backward(dout, cache): """ Computes the backward pass for a layer of rectified linear units (ReLUs). Input: -…
dubugger • 89 • 6
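
For reference, a minimal numpy sketch of the usual relu_backward; a relative error of exactly 1.0 in the CS231n checker often means dx came back as all zeros or as the bare mask instead of dout multiplied by the mask.

    import numpy as np

    def relu_backward(dout, cache):
        x = cache                   # relu_forward cached its input
        dx = dout * (x > 0)         # upstream gradient gated by where x was positive
        return dx

    x = np.array([[-1.0, 2.0], [3.0, -4.0]])
    dout = np.full_like(x, 10.0)
    print(relu_backward(dout, x))   # [[ 0. 10.] [10.  0.]]
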
0 votes, 1 answer

Getting "ValueError: Unknown activation function: PReLU" when I try to load a trained model that employs PReLU as the activation function

As the title is self-descriptive, I'm getting the ValueError: Unknown activation function: PReLU error when I try to load my trained CNN model, which employs PReLU as the activation function for both the convolutional and dense layers. How can I use…
talha06 • 6,206 • 21 • 92 • 147
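
A common remedy, sketched with a placeholder file name: register PReLU under custom_objects so the deserializer can resolve the activation name stored in the saved model.

    from tensorflow.keras.layers import PReLU
    from tensorflow.keras.models import load_model

    # "my_model.h5" is a placeholder path for the asker's saved model
    model = load_model("my_model.h5", custom_objects={"PReLU": PReLU})
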