Questions tagged [relu]

ReLU is an abbreviation for Rectified Linear Unit, an activation function commonly used in neural networks.

101 questions
1
vote
1 answer

Change the threshold value of the keras ReLU activation function

I am trying to change the threshold value of the activation function ReLU while building my neural network. So, the initial code was the one written below, where the default value of the ReLU threshold is 0. model = Sequential([ Dense(n_inputs,…
Prakhar Rathi
  • 905
  • 1
  • 11
  • 25
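
A minimal sketch of how a non-default threshold can be attached to a Keras layer (the layer sizes and the 1.5 threshold below are placeholders, not the asker's values):

```python
import tensorflow as tf
from tensorflow.keras import layers, Sequential

n_inputs = 10  # hypothetical input size, just for the sketch

# Option 1: wrap tf.keras.activations.relu with the desired threshold.
custom_relu = lambda x: tf.keras.activations.relu(x, threshold=1.5)
model = Sequential([
    layers.Dense(64, input_shape=(n_inputs,), activation=custom_relu),
    layers.Dense(1),
])

# Option 2: use the ReLU layer, which exposes `threshold` directly.
model2 = Sequential([
    layers.Dense(64, input_shape=(n_inputs,)),
    layers.ReLU(threshold=1.5),
    layers.Dense(1),
])
```
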
1
vote
1 answer

Unable to load_model due to 'unknown activation_function: LeakyReLU'

I have constructed, fitted, and saved the following model: import tensorflow as tf from tensorflow import keras from tensorflow.keras import layers from tensorflow.keras import preprocessing from tensorflow.keras.models import Sequential import…
niketp
  • 409
  • 1
  • 9
  • 20
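
The usual workaround for this deserialization error is to register LeakyReLU as a custom object when loading; a sketch assuming an HDF5 file (the path is hypothetical):

```python
import tensorflow as tf
from tensorflow.keras.layers import LeakyReLU

# Registering the class lets load_model resolve the 'LeakyReLU' name
# recorded in the saved model's config.
model = tf.keras.models.load_model(
    "my_model.h5",  # hypothetical path
    custom_objects={"LeakyReLU": LeakyReLU},
)
```
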
1
vote
0 answers

Neural Network does not converge when using ReLU or Leaky ReLU

I have programmed a simple NN library which creates a neural network of any chosen size and can train the network with a given activation function and its derivative. The networks do a very good job with sigmoid as the activation function but never…
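
When a hand-written network trains fine with sigmoid but diverges with ReLU or leaky ReLU, the weight initialization and the derivative are the usual suspects; a generic NumPy sketch of a leaky ReLU pair plus He initialization (an illustration, not the asker's library):

```python
import numpy as np

# Leaky ReLU and its derivative, both taken w.r.t. the pre-activation z.
def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)

def leaky_relu_deriv(z, alpha=0.01):
    return np.where(z > 0, 1.0, alpha)

# He initialization is the common pairing for ReLU-family activations;
# overly large initial weights often make such networks diverge or "die".
def he_init(n_in, n_out, seed=0):
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
```
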
1
vote
0 answers

How to clip layer output in MLP with `tf.keras.activations.relu()`?

According to the documentation, tf.keras.activations.relu(x, alpha=0.0, max_value=None, threshold=0) seems to clip x within [threshold, max_value], but x must be specified. How can I use it for clipping the output of a layer in a neural network? Or is…
Paw in Data
  • 1,262
  • 2
  • 14
  • 32
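
One way to clip a layer's output without calling the activation on a tensor by hand is the ReLU layer's max_value argument; a sketch with made-up sizes and a 6.0 cap:

```python
import tensorflow as tf
from tensorflow.keras import layers, Sequential

model = Sequential([
    layers.Dense(32, input_shape=(8,)),  # hypothetical sizes
    layers.ReLU(max_value=6.0),          # output clipped to [0, 6]
    layers.Dense(1),
])

# Equivalent function form, if an `activation=` argument is preferred:
clipped = lambda x: tf.keras.activations.relu(x, max_value=6.0)
```
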
1
vote
0 answers

Negative predictions with R's neuralnet using ReLU activation function

I am building an ANN to predict total healthcare costs (therefore a continuous variable) using a range of input variables such as age, gender, insurance coverage, and number of chronic conditions. I have taken the following steps using R's neuralnet…
Isobel M
  • 55
  • 1
  • 6
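
The question uses R's neuralnet, but the underlying fix of constraining the output to be non-negative is framework-independent; a Keras sketch of that idea (the five inputs and layer sizes are assumptions):

```python
from tensorflow.keras import layers, Sequential

# For a strictly non-negative target such as cost, a non-negative output
# activation (softplus is smooth; relu allows exact zeros) rules out
# negative predictions by construction.
model = Sequential([
    layers.Dense(32, activation="relu", input_shape=(5,)),
    layers.Dense(1, activation="softplus"),
])
model.compile(optimizer="adam", loss="mse")
```
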
1
vote
1 answer

NaN in regression neural network

I was trying to build a NN in Python to solve a regression problem with inputs X (a, b) and output Y (c), using leaky ReLU as the activation function for the hidden layer and a linear function for the output layer. After 3-4 iterations the NN seems to blow up…
joskiy18
  • 11
  • 2
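
NaNs that appear a few iterations into training usually point to exploding activations or gradients; a tf.keras sketch of the usual first remedies, with layer sizes chosen only for illustration:

```python
import tensorflow as tf
from tensorflow.keras import layers, Sequential

model = Sequential([
    layers.Dense(32, input_shape=(2,)),  # two inputs (a, b), as in the question
    layers.LeakyReLU(alpha=0.1),
    layers.Dense(1),                     # linear output for regression
])

# Clipping the gradient norm and keeping the learning rate modest are the
# typical first things to try when the loss blows up to NaN.
opt = tf.keras.optimizers.Adam(learning_rate=1e-3, clipnorm=1.0)
model.compile(optimizer=opt, loss="mse")
```
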
1
vote
0 answers

val_loss and loss not decreasing u-net

I am training a U-Net model on 238 satellite images. My val_loss is not decreasing below 0.3, despite the different architectures that I…
1
vote
1 answer

keras - adding LeakyReLU to sequential model throws error

second_fashion_model.add(LeakyReLU(alpha=0.05)) throws this error: The added layer must be an instance of class Layer. Found:
jay
  • 65
  • 1
  • 10
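
That "must be an instance of class Layer" message usually comes from mixing the standalone keras package with tensorflow.keras in the same model; a sketch that keeps every import in one namespace (the model shape is hypothetical):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

second_fashion_model = Sequential()
second_fashion_model.add(Dense(128, input_shape=(784,)))  # hypothetical sizes
second_fashion_model.add(LeakyReLU(alpha=0.05))           # added as a layer, not an activation string
second_fashion_model.add(Dense(10, activation="softmax"))
```
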
1
vote
1 answer

TypeError: relu() missing 1 required positional argument: 'x'

I am getting this error and I don't know why it occurs. Can anyone help me out? import warnings warnings.filterwarnings('ignore',category=FutureWarning) import tensorflow as tf import keras from keras.layers.convolutional import Conv2D,…
Udit Jain
  • 13
  • 1
  • 4
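
That TypeError is what you get when relu is called with no tensor, e.g. activation=relu() instead of activation=relu; a short illustration:

```python
from tensorflow.keras.layers import Conv2D
from tensorflow.keras import activations

# relu is a function of a tensor x, so pass the function (or the string);
# do not call it yourself when wiring up a layer.
conv_ok  = Conv2D(32, (3, 3), activation="relu")            # string form
conv_ok2 = Conv2D(32, (3, 3), activation=activations.relu)  # function object
# conv_bad = Conv2D(32, (3, 3), activation=activations.relu())
#   -> TypeError: relu() missing 1 required positional argument: 'x'
```
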
1
vote
1 answer

Unexpected output for keras ReLU layer

In the keras documentation, the function keras.activations.relu(x, alpha=0.0, max_value=None, threshold=0.0) is defined as: f(x) = max_value for x >= max_value, f(x) = x for threshold <= x < max_value, f(x) = alpha * (x - threshold) otherwise. I…
Julien REINAULD
  • 599
  • 2
  • 5
  • 18
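
A worked example of the piecewise definition quoted above, with arbitrary parameter values:

```python
import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.5, 2.0, 7.0])
y = tf.keras.activations.relu(x, alpha=0.1, max_value=5.0, threshold=1.0)
# Applying the quoted definition element by element:
#   -3.0 -> alpha * (x - threshold) = 0.1 * (-4.0) = -0.4
#   -1.0 -> 0.1 * (-2.0) = -0.2
#    0.5 -> 0.1 * (-0.5) = -0.05
#    2.0 -> x = 2.0            (threshold <= x < max_value)
#    7.0 -> max_value = 5.0    (x >= max_value)
print(y.numpy())  # approximately [-0.4, -0.2, -0.05, 2.0, 5.0]
```
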
1
vote
1 answer

Fitting a neural network with ReLUs to polynomial functions

Out of curiosity I am trying to fit a neural network with rectified linear units to polynomial functions. For example, I would like to see how easy (or difficult) it is for a neural network to come up with an approximation for the function f(x) = x^2…
RikH
  • 2,994
  • 1
  • 16
  • 15
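
A small experiment along those lines (widths, range, and epochs are arbitrary choices): a ReLU network can only approximate x^2 with piecewise-linear segments, so the fit quality hinges on width/depth and the training interval.

```python
import numpy as np
from tensorflow.keras import layers, Sequential

x = np.linspace(-3, 3, 2000).reshape(-1, 1).astype("float32")
y = x ** 2

model = Sequential([
    layers.Dense(64, activation="relu", input_shape=(1,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),  # linear output for regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=200, batch_size=64, verbose=0)
print(model.predict(np.array([[2.0]], dtype="float32")))  # should be near 4.0
```
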
1
vote
2 answers

relu activation function using lambda

Hi, I want to implement a lambda function in Python which gives me back x if x > 0 and 0 otherwise (ReLU). So I have something like: p = [-1,0,2,4,-3,1] relu_vals = lambda x: x if x>0 else 0 print(relu_vals(p)) It is important to note that I want to…
2Obe
  • 3,570
  • 6
  • 30
  • 54
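
The lambda itself is fine for a single number; the problem is applying it to a whole list at once. Applying it element-wise (or vectorising with NumPy) gives the expected result:

```python
import numpy as np

p = [-1, 0, 2, 4, -3, 1]
relu = lambda x: x if x > 0 else 0

print([relu(v) for v in p])        # [0, 0, 2, 4, 0, 1]
print(list(map(relu, p)))          # same result
print(np.maximum(np.array(p), 0))  # vectorised: array([0, 0, 2, 4, 0, 1])
```
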
1
vote
1 answer

dx=(x>0)*dout: what does the x>0 part do in this code?

There is Python code for ReLU backward propagation, and the code is like dx=(x>0)*dout. What does the x>0 part do? Can anyone explain this line of code to me?
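
A small NumPy illustration of that line: (x > 0) is a boolean mask marking where ReLU was active in the forward pass, and multiplying it by dout zeroes the gradient everywhere else.

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])    # input to ReLU in the forward pass
dout = np.array([0.1, 0.2, 0.3, 0.4, 0.5])   # gradient arriving from the next layer

mask = x > 0      # [False, False, False, True, True]
dx = mask * dout  # the booleans are cast to 0/1 during multiplication
print(dx)         # [0.  0.  0.  0.4 0.5]
```
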
1
vote
1 answer

Why do both tf.nn.relu and tf.nn.sigmoid work the same in this custom estimator

This is the guide for making a custom estimator in TensorFlow: https://www.tensorflow.org/guide/custom_estimators The hidden layers are built using tf.nn.relu: # Build the hidden layers, sized according to the 'hidden_units' param. for units in…
Dee
  • 7,455
  • 6
  • 36
  • 70
1
vote
1 answer

Why am I getting NaN after adding relu activation in LSTM?

I have a simple LSTM network that looks roughly like this: lstm_activation = tf.nn.relu cells_fw = [LSTMCell(num_units=100, activation=lstm_activation), LSTMCell(num_units=10, activation=lstm_activation)] stacked_cells_fw =…
Pawel Faron
  • 312
  • 2
  • 9
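
The LSTM's default tanh activation keeps the recurrent state bounded in (-1, 1); swapping in an unbounded ReLU lets the state grow across time steps until it overflows to NaN. A tf.keras sketch of the usual remedies (keeping tanh, or clipping gradients), with made-up sizes; the original question uses the older TF 1.x LSTMCell API:

```python
import tensorflow as tf

cells_fw = [
    tf.keras.layers.LSTMCell(100),  # default activation is tanh (bounded)
    tf.keras.layers.LSTMCell(10),
]
rnn = tf.keras.layers.RNN(tf.keras.layers.StackedRNNCells(cells_fw))

# If ReLU is really needed, clipping gradients and lowering the learning
# rate make the NaNs less likely.
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3, clipnorm=1.0)
```
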