Questions tagged [activation-function]

An activation function is a non-linear transformation, usually applied in neural networks to the output of a linear or convolutional layer. Common activation functions: sigmoid, tanh, ReLU, etc.
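For instance, each of the common functions named above maps a layer's raw outputs element-wise through a simple non-linearity. A quick illustration in TensorFlow:

```python
import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])  # pretend these are raw layer outputs

print(tf.math.sigmoid(x))  # squashes values into (0, 1)
print(tf.math.tanh(x))     # squashes values into (-1, 1)
print(tf.nn.relu(x))       # zeroes out negatives, keeps positives unchanged
```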

343 questions
0
votes
1 answer

Keras – Artificial Neural Networks – Error when using a custom activation function

I'm creating an Artificial Neural Network (ANN) using the Keras Functional API. Link to the data CSV file: https://github.com/dpintof/SPX_Options_ANN/blob/master/MLP3/call_df.csv. Relevant part of the code that reproduces the problem: import pandas as…
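The asker's full code isn't reproduced in this excerpt, so the exact error is unknown, but a frequent cause is passing the custom function's name as a string, which Keras cannot find among its built-in activations. A minimal sketch of a custom activation in the Keras Functional API, with a placeholder function and placeholder layer sizes:

```python
import tensorflow as tf
from tensorflow import keras

# Placeholder custom activation (stands in for whatever the question defines).
def custom_activation(x):
    return tf.math.softplus(x)

inputs = keras.Input(shape=(10,))
# Pass the callable itself, not the string "custom_activation".
hidden = keras.layers.Dense(64, activation=custom_activation)(inputs)
outputs = keras.layers.Dense(1, activation=custom_activation)(hidden)

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")

# If the model is saved and reloaded, the function must be registered:
# keras.utils.get_custom_objects()["custom_activation"] = custom_activation
```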
0
votes
1 answer

PyTorch V-Net final softmax activation layer for segmentation: channel dimensions differ from the labels. How do I get the prediction output?

I am trying to build a V-Net. When I pass the images to segment during training, the output has 2 channels after the softmax activation (as specified in the architecture in the attached image), but the label and input have 1. How do I convert this…
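A common way to turn a two-channel softmax output into a prediction map with the same single-channel shape as the labels is an argmax over the channel dimension. A minimal PyTorch sketch, assuming the usual (N, C, D, H, W) layout:

```python
import torch

# Assumed shapes: network output (N, 2, D, H, W) after softmax,
# labels (N, 1, D, H, W).
probs = torch.randn(1, 2, 32, 64, 64).softmax(dim=1)

# Pick the most probable class per voxel; keepdim=True keeps the channel
# axis so the result matches the label shape exactly.
pred = torch.argmax(probs, dim=1, keepdim=True)  # (N, 1, D, H, W)
```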
0
votes
1 answer

TensorFlow 2 Keras: tuning both units and activation function

I am trying to set up a Keras Tuner to simultaneously tune both the number of layers and the activation function. The network attempts to warp a 2D function into another 2D function. I keep getting the error: ValueError: Unknown activation function:…
The Dude
  • 661
  • 2
  • 11
  • 20
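For the Keras Tuner question above, the "Unknown activation function" error typically appears when something other than a recognised activation name (or callable) reaches the layer. A minimal sketch of tuning both the units and the activation per layer; the hyperparameter ranges and input/output sizes are chosen only for illustration:

```python
import tensorflow as tf
import keras_tuner as kt

def build_model(hp):
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Input(shape=(2,)))          # 2-D input, 2-D output
    for i in range(hp.Int("num_layers", 1, 3)):
        model.add(tf.keras.layers.Dense(
            units=hp.Int(f"units_{i}", 16, 128, step=16),
            # hp.Choice must yield a name Keras knows ("relu", "tanh", ...);
            # anything else raises "Unknown activation function".
            activation=hp.Choice(f"activation_{i}", ["relu", "tanh", "elu"]),
        ))
    model.add(tf.keras.layers.Dense(2))
    model.compile(optimizer="adam", loss="mse")
    return model

tuner = kt.RandomSearch(build_model, objective="val_loss", max_trials=10)
```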
0
votes
1 answer

What activation functions can produce two answers?

I have an algorithm that reads a 4x4 grid with a shape on one of the cells. I am trying to develop a CNN that can tell what the shape is and where it is. The following would be the possible outputs [0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0,…
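If the output vector is meant to flag both which shape is present and which cell it occupies, a sigmoid output layer trained with binary cross-entropy lets several units be active at once, which a softmax cannot do. A rough sketch; the output width and the convolutional stack are placeholders, not the asker's:

```python
import tensorflow as tf

num_outputs = 20  # e.g. 16 cell flags + 4 shape flags (hypothetical split)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4, 4, 1)),
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
    tf.keras.layers.Flatten(),
    # Sigmoid treats every output as an independent yes/no, so two (or more)
    # of them can be 1 at the same time.
    tf.keras.layers.Dense(num_outputs, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```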
0
votes
1 answer

Neural Net Activation Function with Numerical, Categorical Output

I have a neural network that outputs numeric values, but these values are categorical (e.g., 0, 0.25, 0.5, 0.75, 1). What would be a good activation function to use for my output layer? I am wondering this because my output is numerical, but…
325
  • 594
  • 8
  • 21
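One common answer to the question above is to treat the five numeric levels as classes: a softmax output over five units, with the predicted class mapped back to its numeric value afterwards. A sketch under that assumption (the input size is a placeholder):

```python
import numpy as np
import tensorflow as tf

levels = np.array([0.0, 0.25, 0.5, 0.75, 1.0])  # the five discrete output values

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),              # placeholder feature count
    tf.keras.layers.Dense(32, activation="relu"),
    # One softmax unit per discrete level.
    tf.keras.layers.Dense(len(levels), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# After training, map class indices back to numeric values:
# predictions = levels[np.argmax(model.predict(x_new), axis=1)]
```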
0
votes
0 answers

Why doesn't my LSTM predict low and high values in regression?

I made a stack of bidirectional LSTM layers followed by 4 Dense/Dropout layers with a swish activation function in order to predict a continuous value between 0 and 2. I compiled the model with the mean_squared_error loss function, the Adam optimizer and…
kabhel
  • 324
  • 3
  • 11
0
votes
1 answer

How to implement Leaky ReLU in Keras from scratch?

How do I implement Leaky ReLU from scratch and use it as a custom function in Keras? I have a rough snippet but am not sure how close I am to the correct definition. My question comes in two parts: 1. Is my implementation correct? 2. If not, what am I…
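Without the asker's snippet it is hard to say what exactly is off, but a from-scratch Leaky ReLU in Keras can be as small as the sketch below: f(x) = x for positive inputs and alpha·x otherwise, wired in as the layer's activation callable (alpha = 0.1 and the layer sizes are arbitrary here):

```python
import tensorflow as tf
from tensorflow import keras

def leaky_relu(x, alpha=0.1):
    # x where x > 0, alpha * x elsewhere.
    return tf.where(x > 0, x, alpha * x)

model = keras.Sequential([
    keras.layers.Input(shape=(8,)),
    # Wrap in a lambda so a non-default alpha can be passed in.
    keras.layers.Dense(32, activation=lambda x: leaky_relu(x, alpha=0.1)),
    keras.layers.Dense(1),
])
```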
0
votes
1 answer

Difference between cost function and activation function?

I would like to understand the difference between a cost function and an activation function in machine learning problems. Can you please help me understand the difference?
Gowdhaman008
  • 1,283
  • 9
  • 20
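In Keras terms the distinction in the question above is easy to see in code: the activation function lives inside a layer and transforms that layer's outputs element-wise, while the cost (loss) function compares the final predictions with the targets and is the quantity the optimizer minimises:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    # Activation functions: applied element-wise inside each layer.
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Cost (loss) function: a single number measuring how far predictions are
# from the targets; training adjusts the weights to minimise it.
model.compile(optimizer="adam", loss="binary_crossentropy")
```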
0
votes
0 answers

LSTM activation function for monotonic input data

I am using an LSTM to predict future values of a time series that is more or less monotonically increasing. Does tanh work as an activation function for all the LSTM units, since it is a bounded function? Or would ReLU be the right function…
bcsta
  • 1,963
  • 3
  • 22
  • 61
0
votes
1 answer

LSTM Predictions

I'm working on an LSTM model. I found some examples and was confused about the output. Here, I'm trying to predict the next 24 hours; should I put 1 or 24 in the Dense layer? Is this section correct? I've been following this video. reg =…
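The referenced video and code aren't shown in the excerpt, but the usual rule of thumb is: Dense(1) predicts a single next step, while Dense(24) predicts all 24 future hours in one shot (direct multi-step forecasting). A sketch with placeholder window sizes:

```python
import tensorflow as tf

lookback, horizon, n_features = 168, 24, 1   # hypothetical window sizes

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(lookback, n_features)),
    tf.keras.layers.LSTM(64),
    # Dense(horizon) = 24 outputs, one per future hour, predicted jointly.
    # Dense(1) would predict only the next hour and require feeding
    # predictions back in recursively to reach 24 steps.
    tf.keras.layers.Dense(horizon),
])
model.compile(optimizer="adam", loss="mse")
```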
0
votes
1 answer

How to add an activation layer with a specified value in Deeplearning4j?

For example, the default value of alpha in Activation-ELU is 1. How do I set it to, say, 1.5? In other frameworks like PyTorch we can do this using torch.nn.ELU(1.5). I cannot find any documentation for this in Deeplearning4j.
0
votes
0 answers

Limit outputs of a neural network between 0 and 1, while also allowing exactly 0 and 1

My question is whether it is possible to limit the outputs of a neural network to between 0 and 1 while also allowing them to take exactly the values 0 and 1. I tried the sigmoid activation function in Keras, but it only lets me get to ~0.99 and…
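One option is a "hard" output activation that is linear inside [0, 1] and clipped at the boundaries, so exact 0s and 1s are attainable (unlike sigmoid, which only approaches them asymptotically). A sketch of that idea with placeholder layer sizes; note the gradient is zero once a unit saturates at either end, which is the trade-off of this approach:

```python
import tensorflow as tf
from tensorflow import keras

def clipped_linear(x):
    # Identity inside [0, 1], clamped to exactly 0 or 1 outside it.
    return tf.clip_by_value(x, 0.0, 1.0)

model = keras.Sequential([
    keras.layers.Input(shape=(16,)),                 # placeholder input size
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation=clipped_linear),
])
model.compile(optimizer="adam", loss="mse")
```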
0
votes
0 answers

Understanding a CNN by visualizing class activations using Grad-CAM

I followed the blog Where CNN is looking? to understand and visualize the class activations in order to predict something. The given example works very well. I have developed a custom model using autoencoders for image similarity. The model accepts…
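For reference, the core of Grad-CAM for a plain classifier fits in a short function: differentiate a scalar score with respect to the last convolutional feature maps, average those gradients into per-channel weights, and combine. For an autoencoder / image-similarity model the scalar would have to be the similarity score itself rather than a class logit, which is presumably where the custom-model difficulty lies. A rough sketch for the classifier case, with a placeholder layer name and class index:

```python
import tensorflow as tf

def grad_cam(model, image, last_conv_layer_name="conv_last", class_index=0):
    """Minimal Grad-CAM heatmap for a classifier-style Keras model."""
    grad_model = tf.keras.Model(
        model.inputs,
        [model.get_layer(last_conv_layer_name).output, model.output],
    )
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[tf.newaxis, ...])
        score = preds[:, class_index]            # scalar to explain
    grads = tape.gradient(score, conv_out)       # d(score)/d(feature maps)
    weights = tf.reduce_mean(grads, axis=(1, 2))             # one weight per channel
    cam = tf.reduce_sum(conv_out * weights[:, None, None, :], axis=-1)
    cam = tf.nn.relu(cam)[0]                     # keep only positive evidence
    return cam / (tf.reduce_max(cam) + 1e-8)     # normalise to [0, 1]
```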
0
votes
1 answer

R-squared negative for a linear regression model - TensorFlow

I built a neural network without an activation function, which is therefore a linear regression model: def build_model(): model = tf.keras.models.Sequential([ tf.keras.layers.Input(shape=(1,)), tf.keras.layers.Dense(1) ]) …
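Independent of the model in the question, it helps to remember what a negative R² means: the model's squared error is larger than that of a constant predictor that always outputs the mean of the targets. A small self-contained check (the numbers here are made up, not the asker's data):

```python
import numpy as np

def r_squared(y_true, y_pred):
    """R^2 = 1 - SS_res / SS_tot; negative when the model does worse than
    simply predicting the mean of y_true."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

print(r_squared([1.0, 2.0, 3.0], [3.0, 3.0, 3.0]))  # -1.5: worse than the mean
```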
0
votes
1 answer

Sending an email activation (confirmation) link when a user updates their email address in Laravel

How can I resend the account activation email to a user once the user updates their email address? The email address should not be changed in the DB until the user clicks the confirmation link. The following is my update function for a user in my…
Volka Dimitrev
  • 337
  • 4
  • 15
  • 38