
I want the neural network to guess a number as close to the actual output as possible. What activation function best suits this scenario?


1 Answer


It seems like you are trying to do a regression task, so you would most likely want a linear activation function (the default activation in Keras) for your final layer. You can also use ReLU if you require the outputs to be non-negative.

You will also want to use a loss function suited to regression, such as mean_squared_error.
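A minimal sketch of what this looks like in Keras (the input size, layer widths, and synthetic data here are just placeholders for illustration):

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(4,)),
    # Final layer: one unit with linear activation (Keras's default
    # when no activation is specified) -- appropriate for regression.
    tf.keras.layers.Dense(1),
])

# mean_squared_error is a standard regression loss.
model.compile(optimizer="adam", loss="mean_squared_error")

# Tiny synthetic dataset: predict the sum of 4 inputs.
X = np.random.rand(64, 4).astype("float32")
y = X.sum(axis=1, keepdims=True)
model.fit(X, y, epochs=2, verbose=0)

preds = model.predict(X, verbose=0)  # shape (64, 1), unbounded real values
```

Because the last layer is linear, the predictions can take any real value; swapping in `activation="relu"` on that layer would clamp them to be non-negative.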

bigmac