Questions tagged [activation-function]

An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions include sigmoid, tanh, and ReLU.

343 questions
1
vote
1 answer

Keras - Default Axis for softmax function is set to Axis = -1

I am learning how to create sequential models. I have a model: model = Sequential(). I then went on to add pooling layers and convolution layers (which were fine). But when creating the dense layer: model.add(Dense(num_classes, activation =…
J Houseman
  • 91
  • 2
  • 10
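For context, Keras applies softmax over the last axis (axis=-1) of a layer's output, which for a Dense layer is the class dimension. A minimal sketch of the kind of model described above, with hypothetical layer sizes:

    # Minimal sketch with hypothetical layer sizes; Keras applies softmax
    # over the last axis (axis=-1) of the Dense layer's output by default.
    from keras.models import Sequential
    from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

    num_classes = 10  # hypothetical

    model = Sequential()
    model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Flatten())
    model.add(Dense(num_classes, activation='softmax'))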
1
vote
0 answers

Supervised classification combined with off-policy reinforcement learning

I have 2 neural networks: one predicts action values Q(s, a) using off-policy reinforcement learning, approximating the best response to an opponent's average behaviour; the other imitates its own average best response behaviour using supervised…
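This setup resembles Neural Fictitious Self-Play, where a Q-network is trained off-policy and a separate policy network is trained by supervised learning on the agent's own actions. A rough sketch with hypothetical sizes:

    # Rough sketch (hypothetical sizes); the Q-network is trained with an
    # off-policy TD target, the policy network with a supervised
    # cross-entropy loss on the agent's own past actions.
    import tensorflow as tf

    def build_net(n_inputs, n_actions):
        return tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation='relu', input_shape=(n_inputs,)),
            tf.keras.layers.Dense(n_actions)
        ])

    q_network = build_net(n_inputs=8, n_actions=4)       # predicts Q(s, a)
    policy_network = build_net(n_inputs=8, n_actions=4)  # imitates average best response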
1
vote
1 answer

Implementing sigmoid function in python

I am trying to implement a simple neural network for the XOR function. The activation function I am using is the sigmoid function. The code for the sigmoid function is: def ActivationFunction(a) e = 2.671 # Sigmoid Function expo =…
pr22
  • 179
  • 1
  • 2
  • 9
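For reference, the constant in the excerpt is off (e ≈ 2.71828, not 2.671), and using numpy.exp (or math.exp) avoids the issue entirely. A corrected sketch:

    import numpy as np

    def sigmoid(a):
        # 1 / (1 + e^-a), using np.exp instead of a hand-typed constant
        return 1.0 / (1.0 + np.exp(-a))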
1
vote
2 answers

Sigmoid activation for multi-class classification?

I am implementing a simple neural net from scratch, just for practice. I have got it working fine with sigmoid, tanh and ReLU activations for binary classification problems. I am now attempting to use it for multi-class, mutually exclusive problems.…
KOB
  • 4,084
  • 9
  • 44
  • 88
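For mutually exclusive classes, the usual choice is a softmax output layer rather than independent sigmoid units, since softmax forces the class probabilities to sum to 1. A minimal NumPy sketch:

    import numpy as np

    def softmax(z):
        z = z - np.max(z)   # subtract the max for numerical stability
        exp_z = np.exp(z)
        return exp_z / np.sum(exp_z)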
1
vote
2 answers

Tensorflow custom activation function

I implemented a network with TensorFlow and created the model with the following code: def multilayer_perceptron(x, weights, biases): layer_1 = tf.add(tf.matmul(x, weights["h1"]), biases["b1"]) layer_1 = tf.nn.relu(layer_1) …
Gilfoyle
  • 3,282
  • 3
  • 47
  • 83
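In TF1-style graphs like the excerpt's, a custom activation can be any Python function built from differentiable TensorFlow ops; gradients then flow through it automatically. A sketch with a hypothetical leaky variant (the "out" weight keys are also hypothetical):

    import tensorflow as tf

    def custom_activation(x):
        # hypothetical leaky-ReLU-style function built from TF ops
        return tf.maximum(0.1 * x, x)

    def multilayer_perceptron(x, weights, biases):
        layer_1 = tf.add(tf.matmul(x, weights["h1"]), biases["b1"])
        layer_1 = custom_activation(layer_1)  # drop-in for tf.nn.relu
        out = tf.add(tf.matmul(layer_1, weights["out"]), biases["out"])
        return out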
1
vote
2 answers

Softmax MLP Classifier - which activation function to use in hidden layer?

I am writing a single Multi-Layer Perceptron from scratch, with just an input, hidden and output layer. The output layer will use the softmax activation function to produce probabilities of several mutually exclusive outputs. In my hidden layer it…
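A common pattern is ReLU (or tanh) in the hidden layer with softmax only at the output; the hidden activation just needs to be non-linear and differentiable. A sketch under that assumption:

    import numpy as np

    def relu(z):
        return np.maximum(0, z)

    def softmax(z):
        z = z - np.max(z, axis=-1, keepdims=True)
        e = np.exp(z)
        return e / np.sum(e, axis=-1, keepdims=True)

    def forward(x, W1, b1, W2, b2):
        hidden = relu(x @ W1 + b1)         # ReLU (or tanh) in the hidden layer
        return softmax(hidden @ W2 + b2)   # softmax only at the output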
1
vote
1 answer

How to replace relu6 operations with regular relu in Tensorflow checkpoint?

Straightforward question really, I need to convert a Tensorflow model I have to a format that doesn't support relu6, just regular relu. My model is in the form of 3 ckpt (checkpoint) files (the data, index, and meta files). I need to be able to…
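Checkpoint files store variables rather than ops, so one plausible approach (an assumption about the workflow, not a guaranteed recipe) is to rebuild the same graph with tf.nn.relu substituted for tf.nn.relu6, restore the weights, and re-save. Note that relu6 clips activations at 6, so the converted model's outputs can differ:

    import tensorflow as tf

    # hypothetical single layer standing in for the real graph definition;
    # variable names must match those stored in the checkpoint
    x = tf.placeholder(tf.float32, [None, 128])
    w = tf.get_variable("dense/kernel", [128, 64])
    b = tf.get_variable("dense/bias", [64])
    y = tf.nn.relu(tf.matmul(x, w) + b)   # was tf.nn.relu6

    saver = tf.train.Saver()
    with tf.Session() as sess:
        saver.restore(sess, "model.ckpt")       # hypothetical checkpoint path
        saver.save(sess, "model_relu.ckpt")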
1
vote
1 answer

how to write softmax derivative in python code

I am trying to write a neural network MLP model from scratch. However, I am stuck on the derivative of softmax function. I know that the softmax function in python code is def softmax(input_value): input_value -= np.max(input_value) return…
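For reference, the Jacobian of softmax is ∂s_i/∂z_j = s_i(δ_ij − s_j); when softmax is paired with a cross-entropy loss, the combined gradient simplifies to s − y. A NumPy sketch:

    import numpy as np

    def softmax(z):
        z = z - np.max(z)
        e = np.exp(z)
        return e / np.sum(e)

    def softmax_jacobian(z):
        # d s_i / d z_j = s_i * (delta_ij - s_j)
        s = softmax(z).reshape(-1, 1)
        return np.diagflat(s) - s @ s.T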
1
vote
1 answer

Meaning of y-axis in TensorBoard Activation Summary

I'm having trouble interpreting the y-axis for my activation summaries. I understand that the x-axis is values and the z-axis is the global step. I thought the y-axis is a density chart of activated nodes in the layer, but that doesn't seem right.…
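In TensorBoard's histogram view, the y-axis is the count (frequency) of activation values falling into each bin, with values on the x-axis and the global step on the depth axis. A sketch of how such a summary is typically recorded (TF1-style, hypothetical layer):

    import tensorflow as tf

    x = tf.random_normal([100, 32])                      # hypothetical batch
    layer = tf.layers.dense(x, 16, activation=tf.nn.relu)
    tf.summary.histogram("layer_activations", layer)     # x: value, y: count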
1
vote
1 answer

Is weight initialization different for dense and convolutional layers?

In a dense layer, one should initialize the weights according to some rule of thumb. For example, with ReLU, the weights should come from a normal distribution and should be rescaled by 2/n, where n is the number of inputs to the layer (according to…
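The He rule gives variance 2/n (i.e. standard deviation √(2/n)) with n the fan-in; the same rule carries over to convolutional layers, with fan-in computed as kernel_height × kernel_width × input_channels. A NumPy sketch:

    import numpy as np

    def he_init_dense(n_in, n_out):
        # std = sqrt(2 / fan_in), fan_in = number of inputs to the layer
        return np.random.randn(n_in, n_out) * np.sqrt(2.0 / n_in)

    def he_init_conv(kh, kw, c_in, c_out):
        # for conv layers, fan_in = kernel_h * kernel_w * input_channels
        fan_in = kh * kw * c_in
        return np.random.randn(kh, kw, c_in, c_out) * np.sqrt(2.0 / fan_in)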
1
vote
1 answer

How to use tf.nn.crelu in tensorflow?

I am trying different activation functions in my simple neural network. Whether I use tf.nn.relu, tf.nn.sigmoid, ... the network does what it should do. But if I use tf.nn.crelu, I get a dimension error. It returns something like…
j35t3r
  • 1,254
  • 2
  • 19
  • 53
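The dimension error is expected: tf.nn.crelu concatenates ReLU(x) and ReLU(−x), doubling the last dimension, so the next layer's weights must double in size accordingly. A TF1-style sketch with hypothetical sizes:

    import tensorflow as tf

    x = tf.placeholder(tf.float32, [None, 128])
    w1 = tf.get_variable("w1", [128, 64])
    h = tf.nn.crelu(tf.matmul(x, w1))      # output is [None, 128], not [None, 64]
    w2 = tf.get_variable("w2", [128, 10])  # next layer must expect 2 * 64 inputs
    logits = tf.matmul(h, w2)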
1
vote
1 answer

How to use my own activation function in tensorflow train API?

Can I define my own activation function and use it in the TensorFlow Train API, i.e. the high level API with pre-defined estimators like DNNClassifier? For example, I want to use this code but replace the activation function tf.nn.tanh with…
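tf.estimator.DNNClassifier accepts an activation_fn argument, which can be any callable mapping tensors to tensors, so a custom function slots in directly. A sketch with a hypothetical scaled tanh:

    import tensorflow as tf

    def my_activation(x):
        return 1.7159 * tf.nn.tanh(x)   # hypothetical scaled tanh

    classifier = tf.estimator.DNNClassifier(
        feature_columns=[tf.feature_column.numeric_column("x", shape=[4])],
        hidden_units=[10, 10],
        n_classes=3,
        activation_fn=my_activation)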
1
vote
1 answer

Artificial Neural Network ReLU Activation Function and Gradients

I have a question. I watched a really detailed tutorial on implementing an artificial neural network in C++, and now I have more than a basic understanding of how a neural network works and how to actually program and train one. So in the tutorial a…
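For reference, the ReLU derivative used in backpropagation is 1 for positive inputs and 0 otherwise (the value at exactly 0 is conventionally taken as 0). A NumPy sketch:

    import numpy as np

    def relu(z):
        return np.maximum(0, z)

    def relu_derivative(z):
        # 1 where the input was positive, 0 elsewhere
        return (z > 0).astype(z.dtype)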
1
vote
1 answer

PReLU Activation Function update rule

I just finished reading the Delving Deep into Rectifiers paper. This paper proposes a new activation function called PReLU. Maybe it is obvious, since the paper does not mention it, but I want to know: when is the parameter 'a' of PReLU updated? Is it…
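Per the paper, the slope a is learned jointly with the other weights: it receives a gradient from backpropagation and is updated at every SGD step (the authors use momentum and no weight decay for it). A sketch of the forward pass and the gradient w.r.t. a:

    import numpy as np

    def prelu(y, a):
        # f(y) = y if y > 0 else a * y
        return np.where(y > 0, y, a * y)

    def prelu_grad_a(y, upstream):
        # dL/da sums upstream * y over units where y <= 0; 'a' is then
        # updated in the same gradient step as the ordinary weights
        return np.sum(upstream * np.where(y > 0, 0.0, y))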
1
vote
1 answer

Activation Function NNET

I've created a neural network using caret and nnet. Now, I need to deploy the NN in Oracle for production. I already have the weights for the input and hidden layers. However, I'm not sure which activation function was used. Is there any way…
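For what it's worth, R's nnet uses the logistic sigmoid in its hidden units, and for classification a logistic (or softmax, for multinomial) output; with linout = TRUE the output is linear instead. A sketch of reproducing that forward pass from exported weights, assuming hypothetical matrices W1/b1/W2/b2:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def nnet_forward(x, W1, b1, W2, b2):
        hidden = sigmoid(x @ W1 + b1)      # nnet hidden layer: logistic sigmoid
        return sigmoid(hidden @ W2 + b2)   # binary-classification output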