
Is supervised training of a neural network with two unknown outputs possible when there is a relation such as y = a·x^b between the known parameters (y, x) and the unknowns (a, b)? Here (a, b) are the outputs of the network.

  • This is a very special NN. I only know of NNs built from multiplication and addition, not from "power of". This looks more like a math problem and is probably not well placed on SO. – RaphMclee Sep 14 '13 at 20:27
  • I don't really see why you need a neural network for this -- can't you just re-express your relation as log y = log a + b log x and then fit your parameters using linear regression (see the sketch just below)? Also, @RaphMclee is probably right, this is more of a question for stats.SE or math.SE. – lmjohns3 Sep 19 '13 at 05:10
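The log-linear fit suggested in the comment above is easy to demonstrate. A minimal sketch in Python/NumPy, assuming noisy synthetic data; the names a_true, b_true, a_hat, b_hat are illustrative only:

    import numpy as np

    rng = np.random.default_rng(0)
    a_true, b_true = 2.5, 1.7  # illustrative ground-truth parameters

    x = rng.uniform(1.0, 10.0, size=200)
    # Multiplicative noise keeps the model exactly linear in log-space.
    y = a_true * x**b_true * np.exp(rng.normal(0.0, 0.05, size=200))

    # y = a * x**b  becomes  log y = log a + b * log x,
    # i.e. an ordinary linear regression of log y on log x.
    b_hat, log_a_hat = np.polyfit(np.log(x), np.log(y), deg=1)
    a_hat = np.exp(log_a_hat)

    print(f"a ~ {a_hat:.3f} (true {a_true}), b ~ {b_hat:.3f} (true {b_true})")

One caveat: taking logs requires x, y > 0, and the transform reweights the errors, so for data with additive noise a nonlinear least-squares fit on the original scale may be preferable.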

1 Answer


A direct consequence of the universal approximation theorem is that any continuous function from a compact subset of R^d to the k-dimensional hypercube can be approximated by a standard feed-forward neural network to within any given error bound eps.

So, in simple words: in principle every such function can be approximated by a neural network. This does not mean that in practice any training algorithm will actually find the approximation; the proof is purely existential and gives no intuition about "where to look".

So if your question is "is it possible to train a network that will approximate my function?", the answer is yes. If the question is "is it possible to make a neural network represent my function exactly?", then the answer is also yes, but it requires a custom activation function.
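In fact, for this particular relation you don't even need a hidden layer: since y = a·x^b is differentiable in both unknowns, you can treat (a, b) as the trainable parameters themselves and minimize the squared error by gradient descent, exactly as a network would be trained. A minimal sketch, assuming noiseless synthetic data; the initial guesses, learning rate, and iteration count are illustrative:

    import numpy as np

    rng = np.random.default_rng(1)
    a_true, b_true = 2.5, 1.7          # illustrative ground truth
    x = rng.uniform(0.5, 2.0, size=200)
    y = a_true * x**b_true

    a, b = 1.0, 1.0                    # illustrative initial guesses
    lr = 1e-2                          # illustrative learning rate
    for _ in range(50_000):
        err = a * x**b - y             # residuals of the power-law model
        grad_a = np.mean(2.0 * err * x**b)                  # dL/da
        grad_b = np.mean(2.0 * err * a * x**b * np.log(x))  # dL/db
        a -= lr * grad_a
        b -= lr * grad_b

    print(f"a ~ {a:.3f} (true {a_true}), b ~ {b:.3f} (true {b_true})")

This recovers (a, b) directly; a standard feed-forward network would instead only approximate the mapping, which is what the universal approximation argument above guarantees.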

lejlot