Questions tagged [activation-function]

An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions include sigmoid, tanh, and ReLU.

343 questions
-1
votes
1 answer

How do I get the maximum-valued label when using a softmax activation function in the output layer of a neural network?

In a model I have trained, I am applying the softmax function in the output layer of the neural network. The output has 41 categories, and I want to fetch the label with the maximum value and the value itself, i.e. one of 41 diseases for a given set of inputs…
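A minimal sketch of the usual answer, using np.argmax on the softmax output. The names here (labels, probs) are illustrative stand-ins, not from the question; probs plays the role of model.predict(batch) for a 41-way softmax:

    import numpy as np

    # Illustrative stand-ins: `probs` mimics model.predict(batch) for a
    # 41-way softmax; `labels` lists the 41 disease names in output order.
    labels = [f"disease_{i}" for i in range(41)]
    probs = np.random.dirichlet(np.ones(41), size=3)  # 3 fake softmax rows

    best = np.argmax(probs, axis=1)        # index of highest-probability class
    for i, cls in enumerate(best):
        print(labels[cls], probs[i, cls])  # label with max value, and the value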
-1
votes
1 answer

How to choose a suitable activation function for an ANN with negative input values

I am creating an ANN with 3 input neurons which take inputs from the device's accelerometer in the form of x, y, z. These values are positive as well as negative depending on the acceleration. I am not able to find a suitable activation…
user9477964
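For inputs with mixed signs, a zero-centered activation such as tanh is a common choice, since it passes negative values through with their sign intact. A tiny sketch (the accelerometer reading is an illustrative value, not from the question):

    import numpy as np

    # tanh maps any real input, negative or positive, smoothly into (-1, 1),
    # so negative accelerometer readings keep their sign.
    xyz = np.array([0.3, -9.8, 1.2])   # illustrative x, y, z reading
    print(np.tanh(xyz))                # approx [ 0.2913 -1.      0.8337]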
-1
votes
1 answer

When does ReLU kill the neurons?

I am confused about the dying ReLU problem. Does ReLU kill the neuron only during the forward pass, or also during the backward pass?
Joshua
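A small NumPy sketch of why a "dead" unit shows up in both passes: once the pre-activation is negative, the forward output is 0 and the gradient through ReLU is also 0, so the unit's weights receive no update (the values below are illustrative):

    import numpy as np

    def relu(z):
        return np.maximum(0.0, z)

    def relu_grad(z):
        return (z > 0).astype(float)   # derivative of ReLU: 0 wherever z <= 0

    z = np.array([-2.0, -0.5, 1.5])    # illustrative pre-activations
    print(relu(z))                     # [0.  0.  1.5] -> forward output is 0
    print(relu_grad(z))                # [0.  0.  1. ] -> backward gradient is 0 too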
-1
votes
1 answer

Custom sigmoid activation function

So, I'm using Keras to implement a convolutional neural network. At the end of my decoding topology there's a Conv2D layer with sigmoid activation. decoded = Conv2D(1, (3, 3), activation='sigmoid', padding='same')(x) Basically, I want to change the…
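One common way to swap in a custom variant in Keras is to pass a callable in place of the string 'sigmoid'. A hedged sketch: the scaled_sigmoid function, the steeper-slope factor, and the input shape are illustrative assumptions, not from the question:

    import tensorflow as tf
    from tensorflow.keras import Input, Model, layers

    # Hypothetical custom activation: a sigmoid with a steeper slope.
    def scaled_sigmoid(z):
        return tf.keras.activations.sigmoid(2.0 * z)

    inp = Input(shape=(28, 28, 1))     # illustrative input shape
    x = layers.Conv2D(8, (3, 3), padding='same', activation='relu')(inp)
    # Pass the callable where the string 'sigmoid' used to go.
    decoded = layers.Conv2D(1, (3, 3), activation=scaled_sigmoid,
                            padding='same')(x)
    model = Model(inp, decoded)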
-2
votes
1 answer

Does the choice of activation function depend on the input value range?

I am currently working with audio data and an autoencoder. The input data lies in [-1, 1], and the same has to be true for the output data. So, to help the network keep values between -1 and 1 throughout, I'm using Tanh() activation functions to…
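A minimal PyTorch-style sketch of that setup; the layer sizes and batch are assumptions, but the final Tanh is the part that bounds reconstructions to (-1, 1), matching audio scaled to [-1, 1]:

    import torch
    import torch.nn as nn

    # Illustrative autoencoder (sizes are assumptions): the final Tanh keeps
    # reconstructions in (-1, 1), matching audio samples scaled to [-1, 1].
    model = nn.Sequential(
        nn.Linear(1024, 256), nn.Tanh(),
        nn.Linear(256, 1024), nn.Tanh(),   # bounding the output layer
    )
    wave = torch.rand(8, 1024) * 2 - 1     # fake audio batch in [-1, 1]
    out = model(wave)
    print(out.min().item(), out.max().item())  # stays within (-1, 1)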
-2
votes
1 answer

Why does the tanh function return different results in TensorFlow and PyTorch?

I find that the TensorFlow and PyTorch tanh results are different, and I want to know why this happens. I know that the difference is very small, but is it acceptable? import numpy as np import tensorflow as tf import…
roger
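A small sketch of the comparison. Differences at the level of float32 rounding are expected, because the two libraries may use different low-level kernels; np.allclose is the usual way to confirm they agree within tolerance:

    import numpy as np
    import tensorflow as tf
    import torch

    x = np.linspace(-3.0, 3.0, 7, dtype=np.float32)
    tf_y = tf.tanh(tf.constant(x)).numpy()
    pt_y = torch.tanh(torch.from_numpy(x)).numpy()

    print(np.abs(tf_y - pt_y).max())           # typically ~1e-7 at float32
    print(np.allclose(tf_y, pt_y, atol=1e-6))  # True: rounding noise, not a bug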
-2
votes
1 answer

Which activation function should I use in the layer just before the final layer in a deep neural network?

I have a single-label, multiclass data set (the MNIST dataset). I want to build a deep neural network classifier on that dataset. It is obvious that the activation function on the last layer will be softmax, but I am very curious which activation…
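A hedged Keras sketch of the common pattern: ReLU (or a similar non-linearity) in the hidden layers, including the one just before the output, with softmax only on the final 10-way layer. The layer widths are illustrative assumptions:

    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Dense, Flatten

    # Illustrative MNIST classifier: ReLU in every hidden layer; softmax
    # only on the final 10-way output layer.
    model = Sequential([
        Flatten(input_shape=(28, 28)),
        Dense(128, activation='relu'),
        Dense(64, activation='relu'),      # layer just before the final layer
        Dense(10, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')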
-2
votes
1 answer

Which Activation Function to use for Neural Networks

Apologies in advance if this question does not follow the conventional approach, where a snippet of code or a question about code is involved. I'm just trying to understand certain specific points on the subject of neural networks. I was watching a YouTube…
Hazzaldo
-2
votes
1 answer

Neural Networks: What does the activation layer do?

I was reading all those fancy articles about neural networks. I know I have to use one, but I'm having a problem understanding what an activation layer actually does. Could someone explain it in the easiest possible way? Correct me if I am…
sebb
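A tiny NumPy sketch of the core idea behind the usual answer: without a non-linear activation, stacked linear layers collapse into a single linear map, and the activation is what breaks that collapse. The weights here are arbitrary illustrative values:

    import numpy as np

    W1, W2 = np.array([[2.0]]), np.array([[3.0]])   # arbitrary weights
    x = np.array([[1.0]])

    # Two layers with no activation collapse into one linear map, W2 @ W1.
    print(W2 @ (W1 @ x))       # [[6.]] == (W2 @ W1) @ x

    # A non-linearity between the layers breaks that collapse.
    relu = lambda z: np.maximum(0.0, z)
    print(W2 @ relu(W1 @ x))   # [[6.]]
    print(W2 @ relu(W1 @ -x))  # [[0.]], not [[-6.]]: no single linear map does this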
-3
votes
2 answers

How do I implement the function f(x) = { x^2, x > 0; -x^2, x < 0 } for a NumPy array 'x'?

I'm trying to change every value in a NumPy array 'x' according to the following function: f(x) = { x^2, x >= 0; -x^2, x < 0 } @numpy.vectorize def squares(x): return (x ** 2) if x >= 0 else -(-x ** 2) The function seemed to…
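Note the bug in the excerpt above: because unary minus binds looser than **, the expression -x ** 2 parses as -(x ** 2), so -(-x ** 2) evaluates back to x ** 2. A corrected, vectorized sketch using np.where (no @numpy.vectorize needed):

    import numpy as np

    def signed_square(x):
        # x**2 where x >= 0, -(x**2) where x < 0, computed element-wise.
        return np.where(x >= 0, x ** 2, -(x ** 2))

    x = np.array([-3.0, -1.0, 0.0, 2.0])
    print(signed_square(x))   # [-9. -1.  0.  4.]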
-3
votes
1 answer

Activation function for linear dataset

I have been working with data sets that mostly show a linear relationship between different attributes/features. What activation should I be using with linear datasets? I have been using the sigmoid function until now. Is there any other activation…
Lambar
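If the relationship really is linear, the usual answer is a linear (identity) output activation for regression, with sigmoid reserved for classification outputs. A hedged Keras sketch on synthetic data (the data and hyperparameters are illustrative):

    import numpy as np
    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Dense

    # Synthetic linear data: y = 3x + 1 plus a little noise.
    X = np.random.rand(256, 1).astype(np.float32)
    y = 3 * X + 1 + 0.01 * np.random.randn(256, 1).astype(np.float32)

    # The default linear (identity) activation fits this directly; a sigmoid
    # output would squash every prediction into (0, 1).
    model = Sequential([Dense(1, activation='linear', input_shape=(1,))])
    model.compile(optimizer='adam', loss='mse')
    model.fit(X, y, epochs=5, verbose=0)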
-4
votes
1 answer

Deep neural network not learning

I am training MNIST on an 8-layer (1568-784-512-256-128-64-32-10) fully-connected deep neural network with a newly created activation function, as shown in the figure below. This function looks a bit similar to ReLU; however, it gives a little…
-4
votes
1 answer

Are there any plans to implement a leaky ReLU in H2O?

Are there any plans to implement a leaky ReLU in the Deep Learning module of H2O? I am a beginner with neural nets, but in the limited amount of model building and parameter tuning I have done, I have found ReLUs to generalize better, and was wondering if…
bio.rf
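Whatever H2O itself exposes, the function being asked about is simple to state. A NumPy sketch of leaky ReLU; the slope alpha=0.01 is the commonly used convention, not something from the question:

    import numpy as np

    def leaky_relu(z, alpha=0.01):
        # Like ReLU, but negative inputs are scaled by a small slope instead
        # of zeroed, so the gradient never vanishes on the negative side.
        return np.where(z > 0, z, alpha * z)

    print(leaky_relu(np.array([-2.0, 0.0, 3.0])))   # [-0.02  0.    3.  ]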