Questions tagged [activation-function]

An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions include sigmoid, tanh, and ReLU.

343 questions
1
vote
2 answers

Constraint on the sum of parameters in Keras Layer

I want to add custom constraints on the parameters of a layer. I wrote a custom activation layer with two trainable parameters a and b such that: activation_fct = a*fct() + b*fct(). I need the sum of the parameters (a+b) to equal 1, but I don't…
ENT
  • 13
  • 2
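A minimal sketch of one common way to satisfy such a constraint (not the asker's layer, and plain Python rather than Keras): instead of constraining a and b after the fact, train a single unconstrained raw parameter and derive a and b from it so that a + b = 1 holds by construction. In a Keras custom layer the same mapping would live in `call()`.

```python
import math

def constrained_pair(w):
    """Map one unconstrained parameter w to (a, b) with a + b = 1."""
    a = 1.0 / (1.0 + math.exp(-w))  # sigmoid keeps a in (0, 1)
    b = 1.0 - a                     # b is defined so the sum is 1
    return a, b

a, b = constrained_pair(0.3)
# a + b == 1 up to floating-point rounding, for any w
```

The alternative is a projection-style `keras.constraints.Constraint`, but reparameterization like the above keeps the constraint exact during gradient descent.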
1
vote
1 answer

Keras custom layer input shape compatibility problem

I'm trying to write a custom activation layer in Keras. The problem is, I've tried to do it with a sigmoid and with a ReLU activation function. The examples are practically identical, but one works while the other doesn't. The working example…
1
vote
1 answer

How to implement a single hidden layer containing neurons with different activation functions?

I'm trying to create a custom neural network model in TensorFlow 2.0. I am aware that it's been repeatedly advised in the TF2.0 community that custom models should be built with the existing modules in the Functional API as much as…
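A sketch of the idea behind mixed per-neuron activations (plain NumPy with hypothetical shapes, not the asker's model): compute one hidden layer's pre-activations, apply a different activation to each slice of neurons, and concatenate. The same split/concat pattern works on tensors inside a TF 2.0 custom layer or the Functional API.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))        # batch of 4 inputs, 3 features
W = rng.normal(size=(3, 6))        # one hidden layer with 6 neurons
z = x @ W                          # pre-activations, shape (4, 6)

h = np.concatenate(
    [np.maximum(z[:, :3], 0.0),    # first 3 neurons: ReLU
     np.tanh(z[:, 3:])],           # last 3 neurons: tanh
    axis=1,
)                                  # hidden output, still shape (4, 6)
```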
1
vote
2 answers

Can gradient descent itself solve a non-linear problem in an ANN?

I've recently been studying neural network theory, and I'm a little confused about the roles of gradient descent and the activation function in an ANN. From what I understand, the activation function is used for transforming the model to non-linear…
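The distinction can be demonstrated independently of gradient descent: stacking purely linear layers collapses into a single linear map, so depth adds no expressive power without a non-linear activation in between. A small NumPy check of that collapse:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(5, 3))
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 2))

two_layers = (x @ W1) @ W2   # "deep" network with no activations
one_layer = x @ (W1 @ W2)    # equivalent single linear layer
assert np.allclose(two_layers, one_layer)
```

Gradient descent is just the optimizer; the activation function is what lets the hypothesis class represent non-linear decision boundaries at all.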
1
vote
1 answer

Combining multiple activation functions in the output layer in Keras in R

Is it possible to combine the softmax and linear activation functions in the output layer in the Keras interface for R? E.g. the first 5 neurons will use softmax, because they should predict mutually exclusive classes, and the 6th and 7th neurons will…
pikachu
  • 690
  • 1
  • 6
  • 17
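A sketch of such a mixed output head (NumPy rather than the R interface, with hypothetical sizes): softmax over the first 5 units, identity over the last 2, concatenated. In Keras this is typically done either with two output layers sharing the same penultimate layer, or with a custom/Lambda layer doing exactly this split.

```python
import numpy as np

def mixed_head(z):
    """Softmax over z[:5], identity over z[5:], concatenated."""
    e = np.exp(z[:5] - z[:5].max())   # numerically stable softmax
    return np.concatenate([e / e.sum(), z[5:]])

out = mixed_head(np.array([1.0, 2.0, 0.5, -1.0, 0.0, 7.3, -2.1]))
# out[:5] sums to 1 (class probabilities); out[5:] is unchanged
```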
1
vote
0 answers

Is there any BLAS routine to take abs/tanh/sin of a whole C vector?

Assuming that we have a C vector real myVector[50] = {3,2.2,31...} and I want to take the abs, tanh or sin, or why not max(0, m) or the sigmoid. Is there any BLAS routine for that, or do I need to use a for-loop? Can I improve the for-loop if there…
euraad
  • 2,467
  • 5
  • 30
  • 51
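As far as I know, reference BLAS only covers linear-algebra kernels (axpy, gemm, and so on), so there is no standard BLAS routine for elementwise abs/tanh/sin; vendor libraries (for example Intel MKL's vector math functions) or a plain loop are the usual options in C. For comparison, here are the same elementwise operations vectorized in NumPy, which dispatches to optimized C loops internally:

```python
import numpy as np

v = np.array([3.0, 2.2, -31.0, -0.5])
abs_v = np.abs(v)            # elementwise absolute value
tanh_v = np.tanh(v)          # elementwise tanh
relu_v = np.maximum(v, 0.0)  # elementwise max(0, x), i.e. ReLU
```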
1
vote
0 answers

Invalid Syntax in keras model.add Convolutional layers

I am trying to build a VGG16 model but came across an invalid syntax error while compiling it. The error is in the activation function of the line below. model.add(Convolution2D((64,3,3,activation='relu'))) However, if I change the code as below,…
Sree
  • 973
  • 2
  • 14
  • 32
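The likely cause, sketched generically (not tied to any Keras version): the extra parentheses turn the arguments into a single tuple literal, and a keyword argument inside a tuple literal does not parse in Python. The fix is to drop the inner parentheses so `activation='relu'` is passed as a keyword argument to the layer constructor.

```python
# A keyword argument inside a tuple literal is a SyntaxError:
#   Convolution2D((64, 3, 3, activation='relu'))   # tuple literal -> error
#   Convolution2D(64, 3, 3, activation='relu')     # keyword arg -> parses
try:
    compile("t = (64, 3, 3, activation='relu')", "<demo>", "exec")
    raised = False
except SyntaxError:
    raised = True
# raised is True: the parenthesized form never reaches Keras at all
```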
1
vote
0 answers

Custom Softmax Function on Keras

I need a custom Softmax function whose output is directly in one-hot encoded form; however, I got an error. E.g. the output of the original softmax function is [0.4, 0.2, 0.3, 0.1]; I want to force the output to be [1, 0, 0, 0]. Note: My model is an LSTM with…
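A sketch of the forward pass being asked for (NumPy, not the asker's model): replace the softmax output with a one-hot vector at its argmax. Note that a hard argmax has zero gradient almost everywhere, which is why the usual workaround is a straight-through estimator: hard one-hot on the forward pass, soft softmax gradient on the backward pass.

```python
import numpy as np

def hard_one_hot(p):
    """One-hot vector at the argmax of p."""
    out = np.zeros_like(p)
    out[np.argmax(p)] = 1.0
    return out

print(hard_one_hot(np.array([0.4, 0.2, 0.3, 0.1])))  # [1. 0. 0. 0.]
```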
1
vote
1 answer

How to add noise to activations in Keras at inference time?

GaussianNoise in Keras seems to only add noise at training time. I need to add noise to activations at test time. My architecture is a ResNet50 pretrained on ImageNet with all layers frozen, except that the Gaussian noise needs…
1
vote
0 answers

Thresholding in intermediate layer using Gumbel Softmax

In a neural network, for an intermediate layer, I need to threshold the output. The output of each neuron in the layer is a real value, but I need to binarize it (to 0 or 1). But with hard thresholding, backpropagation won't work. Is there a way to…
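A sketch of the Gumbel-softmax relaxation being referred to (NumPy, with hypothetical logits): a temperature-controlled, differentiable approximation to hard 0/1 thresholding. As the temperature tau approaches 0 the samples approach one-hot/binary values, while the softmax keeps the operation differentiable for backprop.

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    """One Gumbel-softmax sample; approaches one-hot as tau -> 0."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0,1) noise
    y = (logits + g) / tau
    e = np.exp(y - y.max())                               # stable softmax
    return e / e.sum()

rng = np.random.default_rng(0)
soft = gumbel_softmax(np.array([2.0, -1.0]), tau=5.0, rng=rng)   # smooth
hard = gumbel_softmax(np.array([2.0, -1.0]), tau=0.05, rng=rng)  # near-binary
```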
1
vote
2 answers

relu activation function using lambda

Hi, I want to implement a lambda function in Python which gives me back x if x > 0 and 0 otherwise (ReLU): so I have something like: p = [-1,0,2,4,-3,1] relu_vals = lambda x: x if x>0 else 0 print(relu_vals(p)) It is important to note that I want to…
2Obe
  • 3,570
  • 6
  • 30
  • 54
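The lambda itself is fine for a scalar; the failure comes from calling it on the whole list, since `list > int` comparison is not defined in Python 3. Applying it elementwise works:

```python
p = [-1, 0, 2, 4, -3, 1]
relu = lambda x: x if x > 0 else 0

print([relu(v) for v in p])  # [0, 0, 2, 4, 0, 1]
```

`list(map(relu, p))` gives the same result; with NumPy arrays, `np.maximum(p, 0)` avoids the Python-level loop entirely.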
1
vote
2 answers

Confusion about neural network activation functions

I followed a tutorial about an image classifier using Python and TensorFlow. I'm now trying to apply deep learning to a custom situation. I made a simulation program of sellers/buyers where customers buy a stone according to their wishes. The stones…
M. Ozn
  • 1,018
  • 1
  • 19
  • 42
1
vote
2 answers

Ways to limit the output of an NN regression problem to a certain range (i.e. I want my NN to always predict values only between -20 and +30)

I am training an NN for a regression problem, so the output layer has a linear activation function. The NN output is supposed to be between -20 and 30. My NN performs well most of the time. However, sometimes it gives output greater than 30, which is not…
jd95
  • 404
  • 6
  • 14
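One common fix, sketched in plain Python (not the asker's model): replace the linear output activation with a sigmoid rescaled to the target range, so every prediction lies in [-20, 30] by construction: y = lo + (hi - lo) * sigmoid(z).

```python
import math

def bounded_output(z, lo=-20.0, hi=30.0):
    """Rescaled sigmoid: maps any real z into (lo, hi)."""
    return lo + (hi - lo) / (1.0 + math.exp(-z))

print(bounded_output(-100.0))  # approaches -20
print(bounded_output(100.0))   # approaches 30
```

The alternative of clipping the linear output post hoc also bounds predictions, but it zeroes the gradient outside the range during training, which is why the rescaled-sigmoid activation is usually preferred.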
1
vote
2 answers

Plotting a new activation function defined using an existing one from Keras

Is it possible to plot an activation function that I define using an already existing activation from Keras? I tried doing it simply like this: import keras from keras import backend as K import numpy as np import matplotlib.pyplot as plt # Define…
kamilazdybal
  • 303
  • 3
  • 9
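A sketch of the usual pattern (NumPy stand-ins rather than the Keras backend): an activation defined in terms of an existing one must be evaluated to plain numbers before plotting; with the Keras backend that means applying it to a tensor and converting via `K.eval()`. Here a hypothetical swish-like activation built from sigmoid:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def my_activation(x):
    """New activation defined from an existing one (x * sigmoid(x))."""
    return x * sigmoid(x)

xs = np.linspace(-5.0, 5.0, 101)  # grid of inputs
ys = my_activation(xs)            # concrete values, ready for plt.plot(xs, ys)
```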
1
vote
0 answers

What is a suitable activation function for determining odd and even numbers?

I want to create a neural network with TensorFlow.js just for learning purposes. It should determine whether the input is odd or even, and I think that cannot be achieved with sigmoid or linear activations. Is there a sin or cos function which is…
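Why a periodic activation helps here, sketched in plain Python (not TensorFlow.js): cos(pi * n) equals +1 for even integers n and -1 for odd ones, so a single cosine unit separates the two classes, whereas parity is not separable by a monotone activation like sigmoid over the integers.

```python
import math

def parity(n):
    """Classify an integer as even or odd via a cosine unit."""
    return "even" if math.cos(math.pi * n) > 0 else "odd"

print(parity(4))  # even
print(parity(7))  # odd
```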