Questions tagged [relu]

ReLU is an abbreviation for Rectified Linear Unit, an activation function used in neural networks, defined as f(x) = max(0, x).

101 questions
0
votes
3 answers

Core ML coremltools AttributeError: module 'keras.applications.mobilenet' has no attribute 'relu6'

We are trying to convert a .h5 Keras model into a .mlmodel model. Our code is as follows:
    from keras.models import load_model
    import keras
    from keras.applications import MobileNet
    from keras.layers import DepthwiseConv2D
    from…
Rex
  • 43
  • 7
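A likely cause, hedged since the question's code is truncated: newer Keras releases removed relu6 from keras.applications.mobilenet, so older MobileNet conversion recipes break. A minimal sketch of the usual workaround is to define relu6 yourself and pass it through custom_objects when loading the .h5 file (the file name here is hypothetical):

    from keras import backend as K
    from keras.layers import DepthwiseConv2D
    from keras.models import load_model

    def relu6(x):
        # relu6 is ReLU clipped at 6: min(max(x, 0), 6)
        return K.relu(x, max_value=6)

    model = load_model('model.h5',  # hypothetical file name
                       custom_objects={'relu6': relu6,
                                       'DepthwiseConv2D': DepthwiseConv2D})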
0
votes
1 answer

Why use ReLU in the final layer of a neural network?

It is recommended that we use ReLU in the final layer of the neural network when we are learning regressions. It makes sense to me, since the output from ReLU is not confined between 0 and 1. However, how does it behave when x < 0 (i.e. when ReLU…
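For a quick numerical illustration: ReLU returns max(0, x), so for x < 0 the output is exactly 0. A ReLU output layer therefore only suits regressions whose targets are known to be non-negative; a sketch in NumPy:

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    # Negative pre-activations are clamped to zero, so a ReLU output
    # layer can never predict a negative target value.
    print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]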
0
votes
1 answer

Using the Proper ReLU derivative prevents learning

I'm trying to implement backpropagation with ReLU as the activation function. If I am not mistaken, the derivative of that function is 1 for x > 0 and 0 for x < 0. Using this derivative, the network does not learn at all. Searching for other…
Ymi_Yugy
  • 513
  • 1
  • 4
  • 7
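For reference, the stated derivative is correct; a minimal NumPy sketch of the pair:

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def relu_grad(x):
        # 1 where x > 0, 0 elsewhere (the value at exactly x == 0
        # is a convention; 0 is the common choice)
        return (x > 0).astype(x.dtype)

If a network stops learning with this correct derivative, the usual suspects (hedged, since the question's code is not shown) are a learning rate tuned for sigmoid, which is often too large for ReLU, or an initialization that drives most pre-activations negative so the gradient is zero almost everywhere.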
0
votes
2 answers

Implementation of the ReLU function in C++

I'm currently working on a CNN using a sigmoid as the activation function, but I would like to use ReLU instead. I have implemented ReLU using Eigen, but it doesn't seem to work. Can you help me, please? Here's my code: Matrix…
Sabrina Tesla
  • 13
  • 1
  • 10
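Whatever the library, ReLU is a coefficient-wise max(0, x), and a common bug is applying a scalar comparison to a whole matrix. A minimal NumPy sketch of the intended element-wise behaviour (in Eigen, the analogous coefficient-wise operation is cwiseMax):

    import numpy as np

    def relu(m):
        # coefficient-wise: every negative entry becomes zero
        return np.maximum(m, 0.0)

    m = np.array([[-1.0, 2.0],
                  [3.0, -4.0]])
    print(relu(m))  # [[0. 2.]
                    #  [3. 0.]]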
0
votes
1 answer

TensorFlow on a non-MNIST dataset, softmax ReLU softmax. Why is the prediction accuracy low?

I'm sorry if this is a naive question; this is my first attempt at using TensorFlow. I'm using it after trying NumPy on a non-MNIST dataset as part of the Udacity course. Now, this is the code that I've written. However, this is giving me just…
Sayantan
  • 315
  • 5
  • 20
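Without the full code one can only guess, but two frequent culprits in this setup are applying softmax before the hidden ReLU layer, which squashes the activations, and feeding unscaled pixel values. A minimal modern sketch of the intended architecture, assuming tf.keras and 28x28 grayscale images with 10 classes:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation='relu'),    # hidden ReLU layer
        tf.keras.layers.Dense(10, activation='softmax'),  # softmax only at the output
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])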
0
votes
1 answer

Python ReLU activation function doesn't work

My first neural network used the sigmoid activation function and worked fine. Now I want to switch to a more advanced activation function (ReLU), but with ReLU my NN doesn't work at all: 90% errors, whereas with sigmoid there were 4%.…
user9363390
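A frequent cause when swapping sigmoid for ReLU, hedged since the question's code is truncated: the weight initialization and learning rate that worked for sigmoid are a poor fit for ReLU. A sketch of He initialization, the usual companion of ReLU (layer sizes are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)

    # He initialization: std sqrt(2 / fan_in) keeps ReLU activations
    # from collapsing to zero or exploding; an initialization tuned
    # for sigmoid often breaks a ReLU network outright.
    fan_in, fan_out = 784, 128
    W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

Lowering the learning rate relative to the sigmoid version is usually the second thing to try.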
0
votes
1 answer

Frequency-domain ReLU: How to compute the sum of Diracs where spatial-domain values are positive?

I'm attempting to implement the frequency-domain ReLU as detailed in: http://cs231n.stanford.edu/reports/2015/pdfs/tema8_final.pdf The formula that confuses me is on the bottom left of page 4. I am not confident that I am computing the sum of…
Kevinj22
  • 966
  • 2
  • 7
  • 11
-1
votes
1 answer

I have created a neural network with 1 hidden layer and with parametric ReLU as the activation for the hidden layer

    import numpy as np
    from scipy.special import expit as sigmoid
    from scipy.special import softmax as sm
    import pandas as pd
    import math
    from sklearn.metrics import mean_squared_error
    from sklearn.metrics import accuracy_score
    from sklearn.metrics…
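For context, a minimal NumPy sketch of parametric ReLU and the two gradients backpropagation needs; the slope a is a learned parameter, and the names here are illustrative:

    import numpy as np

    def prelu(x, a):
        # parametric ReLU: x for x > 0, a * x otherwise
        return np.where(x > 0, x, a * x)

    def prelu_grad_x(x, a):
        # derivative with respect to the input
        return np.where(x > 0, 1.0, a)

    def prelu_grad_a(x):
        # derivative with respect to the learned slope a
        # (zero on the positive side)
        return np.where(x > 0, 0.0, x)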
-1
votes
2 answers

Python Neural Networks. ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

I am trying to add a ReLU activation function layer to my neural network. However, when I try the following code, I get this error: ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all() I tried…
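That error almost always means a scalar-style if was applied to a whole NumPy array, for example if x > 0: where x is an array; NumPy cannot reduce the boolean array to a single truth value. A sketch of the vectorized alternatives:

    import numpy as np

    x = np.array([-1.0, 0.5, 2.0])

    # Scalar-style control flow raises the ValueError, because the
    # boolean array (x > 0) has no single truth value:
    #     if x > 0: ...

    # Vectorized alternatives that work element-wise:
    relu_a = np.maximum(x, 0.0)
    relu_b = np.where(x > 0, x, 0.0)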
-1
votes
2 answers

Questions about programming a CNN with PyTorch

I'm pretty new at programming CNNs, so I'm a little bit lost. I'm trying to do this part of the code, where they ask me to implement a fully-connected network to classify the digits. It should contain 1 hidden layer with 20 units. I should use ReLU…
Georgia
  • 109
  • 1
  • 6
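A sketch matching the stated spec, one hidden layer with 20 units and ReLU for 10-way digit classification; the 784 input size is an assumption (28x28 flattened images):

    import torch.nn as nn

    model = nn.Sequential(
        nn.Flatten(),
        nn.Linear(784, 20),
        nn.ReLU(),
        nn.Linear(20, 10),  # raw logits; pair with nn.CrossEntropyLoss
    )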
-1
votes
1 answer

When does ReLU kill the neurons?

I am confused regarding the dying ReLU problem. Does ReLU kill the neuron only during the forward pass, or also during the backward pass?
Joshua
  • 409
  • 1
  • 4
  • 12
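In short: the zero appears in the forward pass (the neuron outputs 0), but the neuron "dies" because of the backward pass, where the gradient is also 0, so its incoming weights stop updating. A tiny NumPy illustration:

    import numpy as np

    # If a neuron's pre-activation z is negative for every input, the
    # forward pass outputs 0 and the backward gradient is also 0, so
    # the incoming weights receive no updates: the neuron is "dead".
    z = np.array([-3.2, -0.7, -1.5])   # pre-activations over a batch
    out = np.maximum(z, 0.0)           # forward: all zeros
    grad = (z > 0).astype(float)       # backward: all zeros
    print(out, grad)                   # [0. 0. 0.] [0. 0. 0.]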