Questions tagged [activation-function]

An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions include sigmoid, tanh, and ReLU.

343 questions
0
votes
1 answer

Keras "Tanh Activation" function -- edit: hidden layers

The tanh activation function bounds the output to [-1, 1]. I wonder how it works if the input (features & target class) is given in one-hot-encoded form. How does Keras internally manage the negative outputs of the activation function to compare them…
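A quick way to see the mismatch the question describes is to compare tanh outputs against one-hot targets directly. This is a minimal NumPy sketch (not Keras internals; the pre-activation values are made up): tanh can emit negatives while one-hot targets are only 0 or 1, which is why sigmoid or softmax is the usual choice for the output layer.

```python
import numpy as np

# Pre-activations for a 3-class example; values are made up for illustration.
z = np.array([2.0, -1.5, 0.3])

tanh_out = np.tanh(z)                 # bounded to (-1, 1), can be negative
one_hot = np.array([1.0, 0.0, 0.0])  # targets are only ever 0 or 1

# tanh produces negative "scores" that a 0/1 target never matches exactly.
print(tanh_out)

# softmax maps the same pre-activations into a probability distribution,
# which lines up naturally with one-hot targets:
softmax_out = np.exp(z) / np.exp(z).sum()
print(softmax_out.sum())  # 1.0
```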
0
votes
2 answers

Text Classification Using Neural Network

I am new to machine learning and neural networks. I am trying to do text classification with a neural network from scratch. My dataset contains 7500 documents, each labeled with one of seven classes, and about 5800 unique words. I am using…
0
votes
1 answer

In simple multi-layer FFNN only ReLU activation function doesn't converge

I'm learning TensorFlow and deep learning, and experimenting with various kinds of activation functions. I created a multi-layer FFNN for the MNIST problem, mostly based on the tutorial from the official TensorFlow website, except that 3 hidden layers were…
0
votes
1 answer

ReLU activation function outputs HUGE numbers

I have finally been able to implement backpropagation, but there are still some bugs I need to fix. The main issue is the following: my ReLU activation function produces really big dJdW values (the derivative of the error function with respect to the weights). When this…
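A common cause of exploding ReLU activations (and hence huge dJdW values) is weight initialization at too large a scale: ReLU is unbounded above, so activation magnitudes compound multiplicatively through the layers. A hedged NumPy sketch (layer sizes and depth are made up) comparing unit-scale initialization against He initialization:

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(0.0, x)

def forward(scale, n=256, depth=5):
    """Push one random input through `depth` ReLU layers with weight std `scale`."""
    x = rng.standard_normal(n)
    for _ in range(depth):
        W = rng.standard_normal((n, n)) * scale
        x = relu(W @ x)
    return np.abs(x).mean()

big = forward(1.0)                  # std 1.0: magnitudes blow up layer by layer
he = forward(np.sqrt(2.0 / 256))    # He init: std sqrt(2/fan_in) keeps them stable

print(big, he)  # `big` ends up orders of magnitude larger than `he`
```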
0
votes
1 answer

How to create a simple one time activation process using c#?

I want to create a simple one-time activation process for my Windows Forms application. I basically have two forms: form1 is the activation window and form2 is the actual program. I've created a very basic activation program in form1 given…
Bumba
  • 321
  • 3
  • 13
0
votes
1 answer

how to apply a transformation to a single neuron?

Usually, an activation function is applied to all neurons of a given layer, as in layer = tf.nn.relu(layer). How can I apply an activation function to, say, the second neuron only? How can I apply a specific transformation (say tf.exp()) to a specific…
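One way to do this (a sketch of the idea, not a specific TensorFlow recipe): slice the layer into the target column and the rest, apply the transformation to the slice only, and concatenate back. Shown here with NumPy on a small batch; in TensorFlow the same pattern works with slicing and tf.concat.

```python
import numpy as np

batch = np.array([[1.0, -2.0, 3.0],
                  [0.5,  4.0, -1.0]])   # shape (batch, neurons)

# Apply exp() to the second neuron (column 1) only; identity elsewhere.
left = batch[:, :1]               # columns before the target neuron
target = np.exp(batch[:, 1:2])    # the transformed neuron, kept 2-D
right = batch[:, 2:]              # columns after it

out = np.concatenate([left, target, right], axis=1)
print(out.shape)  # (2, 3): same shape, only column 1 changed
```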
0
votes
0 answers

tensorflow throws an error of ValueError: None values not supported

I have a piece of code that tries to produce output with the same dimension as the input, containing either 0 or 1 instead of values in [0, 1]. I tried to get a binary activation function to evaluate a binary dataset. x = tf.placeholder("float", [None, COLUMN]) Wh =…
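If the goal is hard 0/1 outputs at evaluation time, one common workaround (a sketch — a true step function has zero gradient almost everywhere, so it cannot be trained directly) is to train with a sigmoid and threshold the result afterwards:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-3.0, -0.2, 0.1, 4.0])   # example pre-activations

probs = sigmoid(z)                      # values in (0, 1), differentiable
binary = (probs >= 0.5).astype(int)     # hard 0/1 decision for evaluation

print(binary)  # [0 0 1 1]
```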
user3104352
  • 1,100
  • 1
  • 16
  • 34
0
votes
1 answer

Tensor flow has no attribute elu (exponential linear unit)

I have been using TensorFlow to implement a neural network, but I am not sure what is happening: I am getting this error message: h1 = tf.nn.elu(tf.matmul(X, w_h1) + b_h1) AttributeError: 'module' object has no attribute 'elu' If I replace…
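tf.nn.elu was added in a later TensorFlow release than some older installs, so upgrading usually fixes this AttributeError. As a fallback, the ELU is simple enough to write by hand; a NumPy sketch of the definition elu(x) = x for x > 0 and alpha*(exp(x) - 1) otherwise:

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential linear unit: identity for x > 0, alpha*(exp(x)-1) for x <= 0."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(elu(x))  # negatives saturate toward -alpha, positives pass through
```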
DanielTheRocketMan
  • 3,199
  • 5
  • 36
  • 65
0
votes
3 answers

Validation loss in keras while training LSTM and stability of LSTM

I am using Keras to train my LSTM model for a time-series problem. My activation function is linear and the optimizer is RMSprop. However, I observe that while the training loss decreases slowly over time, it fluctuates around a…
Thanh Quang
  • 193
  • 1
  • 11
0
votes
1 answer

How to make a piecewise activation function with Python in TensorFlow?

The activation function in my CNN has the form:

    f(x) = 1.716*tanh(0.667x)                                 if |x| < tou
    f(x) = 1.716*[tanh(2tou/3) + tanh'(2tou/3)*(x - tou)]     if x >= tou
    f(x) = 1.716*[tanh(-2tou/3) + tanh'(-2tou/3)*(x + tou)]   if x <= -tou

where tou is a constant. So, in TensorFlow it is possible…
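A hedged NumPy sketch of this piecewise function (the 1.716 and 0.667 constants and the name tou come from the question; tanh'(u) = 1 - tanh(u)^2), using np.where to select branches — in TensorFlow the same structure works with tf.where:

```python
import numpy as np

def piecewise_act(x, tou=1.0):
    """Scaled tanh in the middle, linear extrapolation outside [-tou, tou]."""
    u = 2.0 * tou / 3.0                  # the 2*tou/3 knot point
    t = np.tanh(u)
    dt = 1.0 - t ** 2                    # tanh'(u); tanh' is an even function
    mid = 1.716 * np.tanh(0.667 * x)
    hi = 1.716 * (t + dt * (x - tou))    # branch for x >= tou
    lo = 1.716 * (-t + dt * (x + tou))   # branch for x <= -tou (tanh is odd)
    return np.where(x >= tou, hi, np.where(x <= -tou, lo, mid))

x = np.array([-2.0, 0.0, 0.5, 2.0])
print(piecewise_act(x))
```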
0
votes
1 answer

Gradient update with a custom "loss" function

I am working in TensorFlow on a neural network that tries to maximize the correlation between two data sets: http://ttic.uchicago.edu/~klivescu/papers/andrew_icml2013.pdf I have a "loss" function which is a bit complicated, so I wrote it in terms of numpy…
0
votes
1 answer

Are there cases where it is better to use sigmoid activation over ReLu

I am training a complex neural network architecture where I use an RNN to encode my inputs, followed by a deep neural network with a softmax output layer. I am now optimizing the deep neural network part of the architecture (number of units and number of hidden…
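One concrete reason ReLU often wins in deep stacks, sketched numerically: the sigmoid gradient peaks at 0.25 and vanishes for large |x|, so gradients shrink multiplicatively with depth, while ReLU's gradient is exactly 1 on the active side. Sigmoid still has its place, e.g. when an output must be a probability. A minimal NumPy illustration (the depth of 10 is an arbitrary example):

```python
import numpy as np

def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)          # peaks at 0.25, vanishes for large |x|

def relu_grad(x):
    return (x > 0).astype(float)  # exactly 1 wherever the unit is active

x = np.array([-6.0, 0.0, 6.0])
print(sigmoid_grad(x))   # tiny at the tails, 0.25 at the center
print(relu_grad(x))      # [0. 0. 1.]

# Gradient through 10 stacked units, each contributing its local derivative:
print(0.25 ** 10)        # sigmoid best case: ~9.5e-07 -- vanishing
print(1.0 ** 10)         # ReLU active path: 1.0
```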
ryuzakinho
  • 1,891
  • 3
  • 21
  • 35
0
votes
1 answer

Python Keras LSTM input output shape issue

I am running Keras over TensorFlow, trying to implement a multi-dimensional LSTM network to predict a linear continuous target variable, a single value for each example (return_sequences=False). My sequence length is 10 and the number of features…
NRG
  • 149
  • 2
  • 10
0
votes
0 answers

MATLAB transfer function which use custom threshold?

Is there any MATLAB transfer (activation) function whose threshold can be set to a desired value? (Meaning we could, for example, set its threshold to a value a, so that if the sum of weighted inputs is greater than a, the neuron fires, and in other cases,…
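A built-in with a configurable threshold may not exist, but shifting the input achieves the same effect: a hard-limit step applied to (x - a) fires at a instead of 0. Since the question is about MATLAB, here is only the concept as a Python/NumPy sketch, not a toolbox call:

```python
import numpy as np

def step(x, threshold=0.0):
    """Hard-limit (step) activation with a custom threshold:
    returns 1 where the weighted input sum exceeds `threshold`, else 0.
    Equivalent to applying a standard step function to (x - threshold)."""
    return (np.asarray(x) > threshold).astype(int)

weighted_sum = np.array([0.2, 1.7, 3.5])
print(step(weighted_sum, threshold=1.5))  # [0 1 1]
```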
0
votes
1 answer

How to Implement modifiable activation function in the neuron class in java?

I'm learning the concept of neural networks. I decided to try writing a neuron class myself. What is the best way to implement different activation functions in my code? Right now it uses only the binary step function. It's my first try at coding…
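A common design for swappable activation functions is to inject the function into the neuron rather than hard-code it — the strategy pattern. In Java that would be an interface or a Function&lt;Double, Double&gt; field; the same idea in Python, sketched with a made-up Neuron class:

```python
import math

def binary_step(x):
    return 1.0 if x >= 0 else 0.0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class Neuron:
    """Minimal neuron whose activation function is injected, not hard-coded."""
    def __init__(self, weights, bias, activation=binary_step):
        self.weights = weights
        self.bias = bias
        self.activation = activation

    def forward(self, inputs):
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return self.activation(total)

n1 = Neuron([0.5, -0.5], bias=0.0)                     # default: binary step
n2 = Neuron([0.5, -0.5], bias=0.0, activation=sigmoid) # same neuron, new activation
print(n1.forward([1.0, 2.0]))  # 0.0 (weighted sum is -0.5, below the step threshold)
print(n2.forward([1.0, 2.0]))  # ~0.378
```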