Questions tagged [relu]

ReLU is an abbreviation for Rectified Linear Unit, an activation function used in neural networks.

101 questions
0
votes
0 answers

How is the derivative in this ReLU backpropagation being calculated? (Neural Network)

The "dvalue" variable is what I'm hung up on... I understand the derivative of the ReLU. Picture 1 Picture 2
0
votes
0 answers

Double leaky relu (custom activation function) (tf 2.5.0)

I'm trying to create a leaky relu that has the same gradient for values > 1 as for values < 0. I have an implementation that seems to work, but it's about 50% slower than the normal leaky relu, so I think there must be a better way. Here is a…
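A hedged sketch of one way to do this with a single vectorized tf.where instead of Python-level branching (the name double_leaky_relu and alpha=0.2 are assumptions, not the original implementation):

```python
import tensorflow as tf

def double_leaky_relu(x, alpha=0.2):
    # Slope alpha for x < 0, slope 1 on [0, 1], and slope alpha again for x > 1;
    # the x > 1 branch is anchored at (1, 1) so the function stays continuous.
    return tf.where(x > 1.0,
                    alpha * (x - 1.0) + 1.0,
                    tf.where(x < 0.0, alpha * x, x))

# Usage as a Keras activation, e.g.:
# model.add(tf.keras.layers.Dense(64, activation=double_leaky_relu))
```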
0
votes
1 answer

XOR with ReLU activation function

import numpy as np import matplotlib.pyplot as plt %matplotlib inline input = [[0,0,1],[0,1,1],[1,0,1],[1,1,1]] output = [0,1,1,0] N = np.size(input,0) # number of samples Ni = np.size(input,1) # dimension of the samples of input No = 1 #…
JungSoo Ok
  • 11
  • 4
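For reference, a minimal self-contained numpy sketch of a two-layer network learning XOR with a ReLU hidden layer (hidden size, learning rate, and the sigmoid output head are illustrative choices, not taken from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(5000):
    # Forward pass: ReLU hidden layer, sigmoid output.
    z1 = X @ W1 + b1
    a1 = np.maximum(0.0, z1)
    z2 = a1 @ W2 + b2
    a2 = sigmoid(z2)

    # Backward pass (binary cross-entropy + sigmoid simplifies to a2 - y).
    dz2 = (a2 - y) / len(X)
    dW2 = a1.T @ dz2; db2 = dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (z1 > 0)   # ReLU derivative gates the gradient
    dW1 = X.T @ dz1; db1 = dz1.sum(0)

    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(a2.round(3))   # should be close to [0, 1, 1, 0] if training succeeds
```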
0
votes
2 answers

How does a LeakyReLU layer work without setting the number of units?

When building a Sequential model, I notice there is a difference between adding a relu layer and a LeakyReLU layer. test = Sequential() test.add(Dense(1024, activation="relu")) test.add(LeakyReLU(0.2)) Why can't we add a layer with activation =…
Boom
  • 1,145
  • 18
  • 44
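LeakyReLU is an elementwise activation layer with no units of its own, so it is typically stacked after a Dense layer that has no activation; a minimal sketch (the input shape and layer sizes are illustrative assumptions):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

model = Sequential()
model.add(Dense(1024, input_shape=(100,)))   # Dense supplies the units; no activation here
model.add(LeakyReLU(0.2))                    # applies f(x) elementwise with negative slope 0.2
model.add(Dense(1, activation="sigmoid"))
```

Writing Dense(1024, activation="relu") followed by LeakyReLU(0.2), as in the excerpt above, applies ReLU first and then LeakyReLU on an already non-negative tensor, which is usually not what is intended.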
0
votes
1 answer

Multiple Activation Functions for multiple Layers (Neural Networks)

I have a binary classification problem for my neural network. I already got good results using the ReLU activation function in my hidden layer and the sigmoid function in the output layer. Now I'm trying to get even better results. I added a second…
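A minimal Keras sketch of using different activation functions in different hidden layers for binary classification (layer sizes, the tanh second layer, and the input dimension are illustrative assumptions):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([
    Dense(64, activation="relu", input_shape=(20,)),   # first hidden layer
    Dense(32, activation="tanh"),                      # second hidden layer, different activation
    Dense(1, activation="sigmoid"),                    # binary classification output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```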
0
votes
0 answers

Pytorch-Forecasting N-Beats model with SELU() activation function?

I am working on time series forecasting, and I've been using the PyTorch library pytorch-forecasting lately. If you don't know it, try it. It's great. I am interested in the SELU activation function for Self-Normalizing Networks (SNNs, see, e.g., the docs). As I…
P. Navarro
  • 87
  • 10
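This does not show how to change the activation inside pytorch-forecasting's N-Beats implementation (its own docs govern that); it is only a plain PyTorch sketch of how SELU is typically used in a self-normalizing network, with sizes chosen arbitrarily:

```python
import torch
import torch.nn as nn

# SELU activations plus AlphaDropout, which preserves the self-normalizing property.
# NOT the pytorch-forecasting N-Beats API; this only illustrates the activation itself.
model = nn.Sequential(
    nn.Linear(32, 64),
    nn.SELU(),
    nn.AlphaDropout(p=0.1),
    nn.Linear(64, 1),
)
x = torch.randn(8, 32)     # dummy batch of 8 samples with 32 features
print(model(x).shape)      # torch.Size([8, 1])
```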
0
votes
1 answer

NER activation function in spaCy

I have searched the documentation, but I couldn't find the answer. Does spaCy use ReLU, Softmax, or both as activation functions? Thanks
0
votes
1 answer

If we primarily use LSTMs over RNNs to solve the vanishing gradient problem, why can't we just use ReLUs/leaky ReLUs with RNNs instead?

We all know that the vanishing gradient problem occurs when training deep neural networks with sigmoid activations; ReLU solves this problem but introduces the dead neuron problem, which in turn is addressed by leaky ReLU. Why do we move toward LSTMs if…
0
votes
2 answers

Neural Network Using ReLU Activation Function

I am trying to use a neural network to predict the price of houses. Here is what the top of the dataset looks like: Price Beds SqFt Built Garage FullBaths HalfBaths LotSqFt 485000 3 2336 2004 2 2.0 …
325
  • 594
  • 8
  • 21
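A minimal sketch of a regression network for tabular data like this, with ReLU hidden layers and a linear output (the column names follow the excerpt; layer sizes and training settings are assumptions):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# 7 features: Beds, SqFt, Built, Garage, FullBaths, HalfBaths, LotSqFt (target: Price).
model = Sequential([
    Dense(64, activation="relu", input_shape=(7,)),
    Dense(32, activation="relu"),
    Dense(1),                    # linear output for a regression target like Price
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# X: (n_samples, 7) feature matrix, y: (n_samples,) prices. Scaling the features
# (e.g. with StandardScaler) usually matters a lot for ReLU networks on data like this.
# model.fit(X, y, epochs=100, batch_size=32, validation_split=0.2)
```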
0
votes
1 answer

How to implement Leaky ReLU in Keras from scratch?

How do I implement Leaky ReLU from scratch and use it as a custom function in Keras? I have a rough snippet but am not sure how close I am to the correct definition. My question comes in two parts: 1 - Is my implementation correct? 2 - If not, what am I…
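One hedged sketch of a from-scratch leaky ReLU used as a custom Keras activation (the function name, alpha=0.1, and layer sizes are assumptions, not the asker's snippet):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Leaky ReLU "from scratch": f(x) = max(alpha * x, x), which equals x for x > 0
# and alpha * x for x < 0 whenever 0 < alpha < 1.
def my_leaky_relu(x, alpha=0.1):
    return tf.maximum(alpha * x, x)

model = models.Sequential([
    layers.Dense(64, activation=my_leaky_relu, input_shape=(10,)),
    layers.Dense(1, activation="sigmoid"),
])
```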
0
votes
0 answers

LSTM activation function for monotonic input data

I am using an LSTM to predict future values of a time series that is more or less monotonically increasing. Does tanh work as an activation function for all the LSTM units, since it is a bounded function? Or would relu be the right function…
bcsta
  • 1,963
  • 3
  • 22
  • 61
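For reference, swapping the cell activation in a Keras LSTM is a single argument; a minimal sketch (window length and layer sizes are illustrative), with the usual caveat that scaling the target and using a linear output head often matters more than the choice between tanh and relu:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential([
    LSTM(64, activation="tanh", input_shape=(30, 1)),  # tanh is the default; "relu" also accepted
    Dense(1),                                          # linear output for the forecast value
])
model.compile(optimizer="adam", loss="mse")
```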
0
votes
1 answer

Keras activation layer is not working well

I made a model as below at first: from tensorflow.keras.layers import Dense, Flatten, Conv2D, Dropout, BatchNormalization, AveragePooling2D, ReLU, Activation from tensorflow.keras import Model class MyModel(Model): def __init__(self): …
Geonsu Kim
  • 601
  • 1
  • 6
  • 12
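For comparison, a minimal sketch of three equivalent ways to apply ReLU in a subclassed Keras Model (layer sizes are illustrative); each layer must be created in __init__ and then actually invoked in call():

```python
from tensorflow.keras import Model
from tensorflow.keras.layers import Dense, ReLU, Activation

class MyModel(Model):
    def __init__(self):
        super().__init__()
        self.dense1 = Dense(64, activation="relu")   # activation fused into the Dense layer
        self.dense2 = Dense(64)
        self.relu = ReLU()                           # standalone ReLU layer
        self.dense3 = Dense(64)
        self.act = Activation("relu")                # generic Activation layer

    def call(self, x):
        x = self.dense1(x)
        x = self.relu(self.dense2(x))
        x = self.act(self.dense3(x))
        return x
```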
0
votes
1 answer

Simple ANN model converges with tanh(x) as the activation function, but it doesn't with leaky ReLU

I'm training a simple ANN model (MLP) using tanh(x) as the activation function and, after some iterations, it converges with an error of 10^-5. Here's my full code: import numpy as np import pandas as pd # Dataset to be trained x =…
Luís Eduardo
  • 52
  • 1
  • 6
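For reference, a common numpy formulation of leaky ReLU and its derivative (alpha = 0.01 is an assumption); with an unbounded activation like this, a smaller learning rate than the one that works for tanh is often needed:

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    # z for z > 0, alpha * z otherwise
    return np.where(z > 0, z, alpha * z)

def leaky_relu_deriv(z, alpha=0.01):
    # 1 for z > 0, alpha otherwise
    return np.where(z > 0, 1.0, alpha)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(leaky_relu(z))        # [-0.02, -0.005, 0.0, 0.5, 2.0]
print(leaky_relu_deriv(z))  # [0.01, 0.01, 0.01, 1.0, 1.0]
```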
0
votes
1 answer

TensorFlow Lite PReLU Fusion and TransposeConv Bias

When we convert a tf.keras model with PReLU using tf 1.15, the PReLU layers become ReLU and seem to get fused with the previous operators. As a result, the keras h5 file of 28 MB becomes 1.3 MB in size. It looks like the number of parameters gets…
anilsathyan7
  • 1,423
  • 17
  • 25
0
votes
1 answer

The truth value of an array with more than one element is ambiguous. Use a.any() or a.all() python numpy using ReLu function

import numpy as np class NeuralNetwork(): def __init__(self): np.random.seed(1) self.synaptic_weights = np.random.random((8, 5)) def rectified(self, x): return max(0, x) def rectified_derivative(x): …
mwckres0
  • 15
  • 6
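The built-in max(0, x) cannot decide a truth value for a whole numpy array, which is exactly what this error message reports; a hedged sketch of elementwise replacements (the method names follow the excerpt, the rest is illustrative):

```python
import numpy as np

# np.maximum broadcasts the scalar 0 against the whole array, unlike Python's
# built-in max(0, x), which tries to treat the array as a single bool and raises
# "The truth value of an array with more than one element is ambiguous".
def rectified(x):
    return np.maximum(0.0, x)

def rectified_derivative(x):
    return (x > 0).astype(float)   # 1 where x > 0, else 0

x = np.array([-1.5, 0.0, 2.0])
print(rectified(x))             # [0.  0.  2.]
print(rectified_derivative(x))  # [0.  0.  1.]
```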