
I have a trained neural network which suitably maps my inputs to my outputs. Is it then possible to specify a desired output y and use a gradient descent method to determine the optimal input values that produce that output?

When using backpropagation, the partial derivative of the error function with respect to each weight is used to proportionally adjust the weights; is there a way to do something similar with the input values themselves, given a target y value?
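The idea described above can be sketched numerically: fix the trained weights, and backpropagate the error all the way to the input instead of stopping at the weights. Below is a minimal sketch assuming a toy single-layer tanh network with hypothetical weights `W` and `b` standing in for a trained model; the gradient step is applied to `x` rather than to the parameters.

```python
import numpy as np

# Toy "trained" network: y = tanh(x @ W + b).
# W and b are hypothetical stand-ins for weights learned earlier.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
b = rng.normal(size=2)

def forward(x):
    return np.tanh(x @ W + b)

y_target = np.array([0.5, -0.2])   # desired output
x = np.zeros(3)                    # initial guess for the input

lr = 0.1
for _ in range(2000):
    h = x @ W + b
    y = np.tanh(h)
    err = y - y_target                # dL/dy for L = 0.5 * ||y - y_target||^2
    grad_x = W @ ((1 - y**2) * err)   # chain rule through tanh, back to the input
    x -= lr * grad_x                  # gradient step on the INPUT, weights stay fixed

print(forward(x), y_target)
```

Note that the recovered `x` is only one of possibly many inputs mapping to the target, and the descent can get stuck if the target lies outside the network's output range.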

Stephen

1 Answer


A neural network is essentially a complex mathematical function, and adjusting the weights adjusts that function's parameters. Given that, you are really asking whether you can easily and automatically invert the function. I don't think this can be done easily, since the mapping is generally not one-to-one: many different inputs may produce the same output.

I think that the only thing you can do is to create another inverted network and train it with inverted data.
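The "inverted network" suggestion above can be sketched as follows: generate (input, output) pairs from the forward mapping, then train a second network on the swapped (output, input) pairs. This is a minimal sketch using scikit-learn's `MLPRegressor`; the `forward` function is a hypothetical stand-in for the original trained network.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical forward mapping standing in for the trained network.
def forward(x):
    return np.column_stack([np.sin(x[:, 0]) + x[:, 1],
                            0.5 * x[:, 0]])

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(5000, 2))   # sampled inputs
Y = forward(X)                           # corresponding outputs

# Train on swapped pairs: the inverse network learns the mapping Y -> X.
inverse_net = MLPRegressor(hidden_layer_sizes=(64, 64),
                           max_iter=2000, random_state=0)
inverse_net.fit(Y, X)

# Ask the inverse network which input should produce a desired output.
y_target = np.array([[0.3, 0.2]])
x_guess = inverse_net.predict(y_target)
print(forward(x_guess), y_target)
```

This works only where the forward mapping is (approximately) invertible on the sampled region; if several inputs map to the same output, the inverse network will regress toward some average of them.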

Andrzej Gis