I am trying to build a neural network for a problem with a continuous output variable. A schematic representation of the network I am using is shown below.
Is there any reason why I should use the tanh() activation function instead of the sigmoid() activation function in this case? In the past I have used the sigmoid() activation function to solve logistic regression problems with neural networks, and it is not clear to me whether I should switch to tanh() when the output variable is continuous.
Does it depend on the range of the continuous output variable? For example: (i) use sigmoid() when the output variable is normalized to [0, 1]; (ii) use tanh() when the output variable can take negative values. A minimal sketch of the two variants I am comparing is below.
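In case it helps, here is a rough sketch of what I mean (PyTorch is just an assumption, and the layer sizes are made up). In both versions the output layer is left linear because the target is continuous; only the hidden activation changes between tanh() and sigmoid().

```python
# Sketch of the two variants I am comparing (hypothetical sizes:
# 4 input features, 8 hidden units, 1 continuous output).
import torch
import torch.nn as nn

def make_net(hidden_activation: nn.Module) -> nn.Sequential:
    """One hidden layer with the given activation; linear output for regression."""
    return nn.Sequential(
        nn.Linear(4, 8),
        hidden_activation,
        nn.Linear(8, 1),   # no activation on the output, so it is unbounded
    )

net_tanh = make_net(nn.Tanh())        # hidden units squashed to (-1, 1)
net_sigmoid = make_net(nn.Sigmoid())  # hidden units squashed to (0, 1)

x = torch.randn(16, 4)                # dummy batch of 16 samples
print(net_tanh(x).shape, net_sigmoid(x).shape)  # both torch.Size([16, 1])
```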
Thanks in advance