
If I use a trained RNN (or LSTM) to generate series data (call this network the generator RNN), and then use that series data to train a new RNN with the same architecture from scratch, is it possible to end up with the same trained network (the same trained weights) as the generator RNN?

grizzthedj
Liepill Li

1 Answer


You take a series with X as input and Y as output to train a model; with that model you then generate series O.

Now you want to recover the weights W from `O = sigmoid(X*W_1 + b_1) * ... * W_N + b_N`, using O as input and X as output.

Is it possible?

Unless X = O, most likely not. I can't give a formal mathematical proof, but multiplying forward through a network is not equivalent to multiplying backwards, mainly because of the non-linear activation function. If you removed the activation function, or applied its inverse, you would be more likely to approach the weights you want. Even then, another set of weights can produce the same output for a given input, so the mapping from weights to network function is not one-to-one.
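To illustrate the last point, here is a minimal NumPy sketch (not from the original post) showing that two different weight sets can define exactly the same function: permuting the hidden units of a one-layer sigmoid network changes the weight matrices but leaves the output unchanged, so no training procedure could distinguish them from data alone.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))  # a small batch of inputs

# one hidden layer: O = sigmoid(X @ W1 + b1) @ W2 + b2
W1 = rng.normal(size=(3, 4)); b1 = rng.normal(size=4)
W2 = rng.normal(size=(4, 2)); b2 = rng.normal(size=2)

O = sigmoid(X @ W1 + b1) @ W2 + b2

# Permute the hidden units: different weights, identical function.
perm = [2, 0, 3, 1]
W1p, b1p = W1[:, perm], b1[perm]
W2p = W2[perm, :]

Op = sigmoid(X @ W1p + b1p) @ W2p + b2

print(np.allclose(O, Op))    # True: outputs match exactly
print(np.allclose(W1, W1p))  # False: the weights differ
```

The same hidden-unit permutation symmetry exists in RNNs and LSTMs, which is one reason retraining from generated data is very unlikely to reproduce the original weights.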

Also, this question would be better received on stats.stackexchange.com than on Stack Overflow.

Mo Azim