Questions tagged [activation-function]

An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions: sigmoid, tanh, ReLU, etc.

343 questions
1
vote
1 answer

Apply own activation function to layer in tensorflow

I'm using a model where the TensorFlow relu function is used for the activation of the hidden layers. So basically the model does this: h = tf.nn.relu(zw), where zw is the output of the previous layer multiplied by the weights. According…
Atirag
  • 1,660
  • 7
  • 32
  • 60
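A minimal sketch of one way to substitute a custom activation for tf.nn.relu, assuming TensorFlow 2.x; my_activation and the Dense layer below are illustrative, not taken from the question:

import tensorflow as tf

# Hypothetical elementwise activation (a hand-written leaky ReLU).
def my_activation(z):
    return tf.where(z > 0.0, z, 0.1 * z)

# Drop-in replacement where the model computed h = tf.nn.relu(zw):
# h = my_activation(zw)

# Or pass it to a Keras layer as its activation:
layer = tf.keras.layers.Dense(64, activation=my_activation)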
1
vote
0 answers

how to use ReLu unit and get output in dl4j

I'm trying to make an AutoEncoder in dl4j. Input: 200 integers ranging from 0 to approximately 40000. Code for the model: MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder() .seed(seed) .iterations(ITERATIONS) …
user2659088
  • 133
  • 1
  • 7
1
vote
1 answer

Artificial Neural Network - why is the sigmoid activation function usually used in the hidden layer instead of the tanh-sigmoid activation function?

Why is the log-sigmoid activation function the primary choice for the hidden layer instead of the tanh-sigmoid activation function? Also, if I use Z-score normalization, can I still use a sigmoid activation function in the hidden layer?
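For reference, the two candidates side by side in NumPy; tanh is a rescaled logistic sigmoid, the practical difference being that its output is zero-centred:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-3.0, 3.0, 7)
# tanh(z) == 2 * sigmoid(2z) - 1: same shape, but outputs lie in (-1, 1)
print(np.allclose(np.tanh(z), 2.0 * sigmoid(2.0 * z) - 1.0))   # True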
1
vote
0 answers

Error Applying Selu Activation function with tensorflow

I was trying to implement the new SELU activation function from https://arxiv.org/pdf/1706.02515. For more information here is my code: import tensorflow as tf import numpy as np from PIL import Image import os from keras.activations import…
I. A
  • 2,252
  • 26
  • 65
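For context, SELU as defined in the linked paper is only a few lines; this sketch uses the constants from the paper and plain TensorFlow ops (recent TensorFlow also ships it as tf.nn.selu):

import tensorflow as tf

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scale * x                      for x > 0
    # scale * alpha * (exp(x) - 1)   for x <= 0
    return scale * tf.where(x > 0.0, x, alpha * (tf.exp(x) - 1.0))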
1
vote
1 answer

Making training example of multilayer perceptron

I'm trying to make several training examples to get a set of weights and biases for a particular network which correctly implements a hard threshold activation function. There are four inputs x_1, ..., x_4, where each x_i is a real number, and the network must…
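A hard threshold unit itself is simple; a NumPy sketch under the common convention that the unit fires when the weighted sum plus bias is non-negative (the weights here are placeholders, not a solution to the exercise):

import numpy as np

def hard_threshold(x, w, b):
    # x, w: length-4 vectors of real numbers; output is 0 or 1
    return 1 if np.dot(w, x) + b >= 0 else 0

print(hard_threshold(np.array([1.0, -2.0, 0.5, 3.0]),
                     np.array([0.5, 0.5, 0.5, 0.5]), -1.0))   # 1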
1
vote
1 answer

Normalization of data before activation function

I am adapting this tutorial to MATLAB, where I am trying to classify into a 1/0 class. Each of my data points x is of dimension 30, that is, it has 30 features. This is my first NN. My problem is, when I try to calculate a1 = np.tanh(z1), or in MATLAB a1 =…
havakok
  • 1,185
  • 2
  • 13
  • 45
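A common reason tanh misbehaves here is that unscaled 30-dimensional inputs push z1 far into the saturated region; a NumPy sketch of z-scoring the features first (all names and shapes are illustrative):

import numpy as np

X = np.random.randn(100, 30) * 50.0 + 10.0   # fake data with a large, offset scale
X = (X - X.mean(axis=0)) / X.std(axis=0)     # z-score each of the 30 features

W1 = 0.1 * np.random.randn(30, 5)            # small initial weights
z1 = X @ W1
a1 = np.tanh(z1)                             # now unlikely to saturate at +/-1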
1
vote
1 answer

Logistic function misclassification

I'm having trouble trying to teach a neural network the XOR logic function. I've already trained the network with successful results using the hyperbolic tangent and ReLU as activation functions (regarding the ReLU, I know it's not the appropriate for…
tulians
  • 439
  • 5
  • 23
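For reference, XOR is representable with logistic units as long as there is a hidden layer; a NumPy sketch with hand-picked weights (not the asker's network) that a correct training run should be able to approach:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hidden unit 1 ~ OR(x1, x2), hidden unit 2 ~ AND(x1, x2), output ~ h1 AND NOT h2
W1 = np.array([[20.0, 20.0],
               [20.0, 20.0]])
b1 = np.array([-10.0, -30.0])
W2 = np.array([20.0, -20.0])
b2 = -10.0

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    h = sigmoid(np.array(x, dtype=float) @ W1.T + b1)
    y = sigmoid(h @ W2 + b2)
    print(x, round(float(y)))   # 0, 1, 1, 0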
1
vote
1 answer

sigmoid - back propagation neural network

I'm trying to create a sample neural network that can be used for credit scoring. Since this is a complicated structure for me, I'm trying to learn on small ones first. I created a network using back propagation - input layer (2 nodes), 1 hidden layer…
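The detail these small back-propagation networks most often get wrong is the sigmoid derivative; a NumPy sketch of the function and its derivative expressed in terms of the activation:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(a):
    # derivative of the sigmoid, written in terms of a = sigmoid(z)
    return a * (1.0 - a)

a = sigmoid(np.array([-2.0, 0.0, 2.0]))
targets = np.array([0.0, 1.0, 1.0])
delta = (a - targets) * sigmoid_prime(a)   # output-layer error term for a squared-error loss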
1
vote
0 answers

Implementing Maxout Activation in Theano

The only example of a maxout implementation in Theano is at this link. My understanding is that I use any activation function and then maxout is just a post-processing of the hidden layer outputs. I tried to apply this to my own HiddenLayer class.…
Zhubarb
  • 11,432
  • 18
  • 75
  • 114
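Maxout is not a post-processing step applied after another activation; it is itself the activation, taking the element-wise maximum over k affine "pieces" per output unit. A NumPy sketch of the forward pass (the Theano version is the same reshape-and-max written with theano.tensor):

import numpy as np

def maxout(x, W, b, k):
    # x: (batch, n_in); W: (n_in, n_out * k); b: (n_out * k,)
    z = x @ W + b
    batch, total = z.shape
    # group the k pieces belonging to each unit, then take the max per unit
    return z.reshape(batch, total // k, k).max(axis=2)      # (batch, n_out)

out = maxout(np.random.randn(4, 10), np.random.randn(10, 6), np.zeros(6), k=2)
print(out.shape)   # (4, 3)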
0
votes
0 answers

How to give custom activation function to Conv2DNormActivation

It seems that torchvision.ops.Conv2dNormActivation only takes activation functions defined under torch.nn, due to the declaration of the activation_layer argument as Callable[..., torch.nn.Module] in the source code. I tried defining a…
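One workaround, since activation_layer only needs to be a callable that constructs an nn.Module, is to wrap the custom function in a tiny Module class; a sketch (MyAct is hypothetical, and it accepts the inplace keyword that Conv2dNormActivation may pass to the constructor):

import torch
import torch.nn as nn
from torchvision.ops import Conv2dNormActivation

class MyAct(nn.Module):
    # Hypothetical custom activation wrapped as a Module.
    def __init__(self, inplace: bool = False):   # Conv2dNormActivation may pass inplace=...
        super().__init__()

    def forward(self, x):
        return x * torch.sigmoid(x)   # e.g. a hand-written SiLU

block = Conv2dNormActivation(3, 16, kernel_size=3, activation_layer=MyAct)
print(block(torch.randn(1, 3, 32, 32)).shape)   # torch.Size([1, 16, 32, 32])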
0
votes
0 answers

Pytorch: Back-propagation of custom many-to-one nonlinearity

I have code for an input-output nonlinear function that takes a list of inputs X and weights W and produces a single nonlinear output. I am interested in using this as my "neuron" and seeing if I can use back-propagation to train this. (Ideally I…
Steven Sagona
  • 115
  • 1
  • 11
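If the many-to-one "neuron" is written entirely with differentiable torch operations, autograd back-propagates through it with no extra work; a sketch with a made-up nonlinearity:

import torch

def neuron(X, W):
    # Hypothetical many-to-one nonlinearity: a vector of inputs in, one scalar out.
    return torch.tanh((X * W).sum()) ** 2

X = torch.tensor([0.5, -1.0, 2.0])
W = torch.tensor([0.1, 0.2, 0.3], requires_grad=True)

y = neuron(X, W)
y.backward()          # autograd fills in d y / d W
print(W.grad)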
0
votes
1 answer

Can Pytorch handle custom nonlinear activation functions that are not 1-to-1 functions?

I am interested in making a neural network with custom nonlinear activation functions that are not 1-to-1 functions. I see that it is possible to add custom nonlinear activation functions to Pytorch, but the only functions that are considered are…
Steven Sagona
  • 115
  • 1
  • 11
0
votes
0 answers

Can't concatenate neurons which each have a specific activation function

I want to assign different activation functions to the neurons in a linear layer but don't want to lose the grads. What I do: def __init__(self): super(DataNet, self).__init__() self.fc1 = nn.Linear(2, 5) self.fc2 = nn.Linear(5, 5) …
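Slicing the layer output, applying a different activation to each slice, and joining the slices with torch.cat keeps everything on the autograd graph, so no gradients are lost; a sketch built around the same fc1 = nn.Linear(2, 5) from the question:

import torch
import torch.nn as nn

class DataNet(nn.Module):
    def __init__(self):
        super(DataNet, self).__init__()
        self.fc1 = nn.Linear(2, 5)
        self.fc2 = nn.Linear(5, 5)

    def forward(self, x):
        z = self.fc1(x)
        # a different activation per group of neurons; cat preserves the graph
        h = torch.cat([torch.relu(z[:, :2]),
                       torch.tanh(z[:, 2:4]),
                       torch.sigmoid(z[:, 4:])], dim=1)
        return self.fc2(h)

net = DataNet()
net(torch.randn(8, 2)).sum().backward()   # gradients flow through all three activations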
0
votes
0 answers

Is it possible to change activation functions all at once?

I'd like to load a pretrained Inception-V3 model. I have two questions. How can I access the activation functions? Is it possible to change all ReLU activation functions to tanh all at once? import torch model = torch.hub.load("pytorch/vision",…
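One common approach is to walk the module tree and replace every nn.ReLU child with nn.Tanh; a sketch, assuming a recent torchvision. Note that torchvision's Inception-V3 calls F.relu functionally inside its BasicConv2d blocks, so those activations are not nn.ReLU modules and this swap will not reach them:

import torch
import torch.nn as nn

model = torch.hub.load("pytorch/vision", "inception_v3", weights="DEFAULT")

def replace_relu_with_tanh(module):
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, nn.Tanh())   # swap the child module in place
        else:
            replace_relu_with_tanh(child)      # recurse into submodules

replace_relu_with_tanh(model)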
0
votes
0 answers

What is the importance of using ReLU?

I get confused about activation functions. Why do we widely use the ReLU function when, in the end, its mapping is a line? Using sigmoid or tanh makes the decision boundary a curve that fits the data well, but ReLU maps a line (…
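Each individual ReLU is piecewise linear, but a network stacks and sums many shifted ReLUs, so the overall mapping bends at many points and can trace a curved decision boundary; a small NumPy illustration:

import numpy as np

relu = lambda z: np.maximum(0.0, z)

x = np.linspace(-2.0, 2.0, 9)
# a sum of shifted ReLUs is already a bent, non-straight function of x
y = relu(x) - 2.0 * relu(x - 0.5) + 1.5 * relu(x + 1.0)
print(np.round(y, 2))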