I want the neural network to guess a number as close to the actual output as possible. What activation function best suits this scenario?
1 Answer
It sounds like you are doing a regression task, so you will most likely want a linear activation function (the default activation in Keras) for your final layer. You can also use relu if you need the outputs to be non-negative.
You will also want a loss function suited to regression, such as mean_squared_error.
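As a minimal sketch (the layer sizes and the toy target y = 2x + 1 here are just illustrative assumptions, not from your question):

```python
import numpy as np
from tensorflow import keras

# Final layer: one unit with linear activation (the Keras default),
# so the network can output any real-valued number.
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(1,)),
    keras.layers.Dense(1, activation="linear"),
])

# Mean squared error is a standard regression loss.
model.compile(optimizer="adam", loss="mean_squared_error")

# Toy data: learn y = 2x + 1.
x = np.random.uniform(-1, 1, size=(256, 1)).astype("float32")
y = 2 * x + 1
model.fit(x, y, epochs=5, verbose=0)

preds = model.predict(x[:4], verbose=0)  # shape (4, 1), one real number per input
```

If you swap the last layer's activation for "relu", the predictions will be clamped to be >= 0.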
