I am using TensorFlow 0.8 to train deep neural networks. I want to define two identical neural networks, N1 and N2, train N1, and copy the updated weights from N1 to N2 every 4 iterations of the training loop. I know I can use tf.train.Saver.save() to write all of N1's weights to a .ckpt file on disk and tf.train.Saver.restore() to load them back, which is equivalent to the copy. However, this save/restore round trip slows down training, and I wonder whether there is a more efficient way to do the copy (for example, an in-memory copy). Thanks!
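For concreteness, here is a rough sketch of the save/restore approach I mean, assuming (hypothetically) that N1 and N2 are built identically under variable scopes named "N1" and "N2", so their variable names differ only by prefix:

    import tensorflow as tf

    # Assumed setup: variables live under scopes "N1/..." and "N2/...".
    n1_vars = [v for v in tf.trainable_variables() if v.name.startswith("N1/")]
    n2_vars = [v for v in tf.trainable_variables() if v.name.startswith("N2/")]

    # Saves N1's variables under their own names ("N1/w", "N1/b", ...).
    save_n1 = tf.train.Saver(var_list=n1_vars)
    # Restores the checkpoint entries named "N1/..." into N2's variables.
    restore_n2 = tf.train.Saver(
        var_list=dict(zip([v.op.name for v in n1_vars], n2_vars)))

    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())
        # ... train N1 for a few iterations ...
        path = save_n1.save(sess, "/tmp/n1_weights.ckpt")  # write to disk
        restore_n2.restore(sess, path)                     # read back into N2

This works, but every copy pays for serializing the weights to disk and reading them back, which is exactly the overhead I would like to avoid.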

Ruofan Kong
Take a look at my question here: https://stackoverflow.com/questions/48547688/tensorflow-keras-copy-weights-from-one-model-to-another?noredirect=1#comment84092982_48547688 – benbotto Jan 31 '18 at 22:51
1 Answer
It would help if you included your code or more detail here. However, you can keep the session you use to train N1 and reuse it when you work with N2, so the copy stays in memory instead of going through a checkpoint file.
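A minimal sketch of that idea, assuming (hypothetically) that both networks are built by the same function under variable scopes "N1" and "N2": build one assign op per variable pair once, then run the grouped op in the existing training session every 4 iterations.

    import tensorflow as tf

    # Hypothetical builder: both networks are created by the same function
    # under different variable scopes, so their variables line up one-to-one.
    def build_net(scope, x):
        with tf.variable_scope(scope):
            w = tf.get_variable("w", [784, 10],
                                initializer=tf.truncated_normal_initializer(stddev=0.1))
            b = tf.get_variable("b", [10],
                                initializer=tf.constant_initializer(0.0))
            return tf.matmul(x, w) + b

    x = tf.placeholder(tf.float32, [None, 784])
    logits_n1 = build_net("N1", x)
    logits_n2 = build_net("N2", x)

    # Pair up the variables by their scope prefix.
    n1_vars = [v for v in tf.trainable_variables() if v.name.startswith("N1/")]
    n2_vars = [v for v in tf.trainable_variables() if v.name.startswith("N2/")]

    # One assign op per pair, grouped into a single in-graph copy op.
    copy_op = tf.group(*[n2.assign(n1) for n1, n2 in zip(n1_vars, n2_vars)])

    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())
        for step in range(1000):
            # sess.run(train_op_n1, feed_dict=...)  # train N1 as usual
            if step % 4 == 0:
                sess.run(copy_op)  # copies N1's weights into N2 in memory

Because the assign ops are part of the graph, sess.run(copy_op) moves the weights directly between variables without touching disk.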

Elmira