
Some architectures require two or more training ops (for example, in a GAN you need to train both a generator and a discriminator). How can you achieve that with TF-Slim's training functions? As far as I can see, slim.learning.train takes only one training op.

mrry

2 Answers


You can sum the training ops created by slim.learning.create_train_op. A train_op is just a tensor that, when evaluated, updates the parameters and returns the loss. If you add two train ops together, evaluating the sum will run both updates (in parallel).
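A minimal sketch of this approach, assuming TF 1.x with contrib slim available. The toy variables, losses, and optimizers below are placeholders standing in for a real generator/discriminator setup; only the slim calls are the point.

import tensorflow as tf
import tensorflow.contrib.slim as slim

g_var = tf.get_variable('g_var', shape=[], initializer=tf.ones_initializer())
d_var = tf.get_variable('d_var', shape=[], initializer=tf.ones_initializer())
g_loss = tf.square(g_var - 2.0)   # stand-in for the generator loss
d_loss = tf.square(d_var + 1.0)   # stand-in for the discriminator loss

train_op_g = slim.learning.create_train_op(
    g_loss, tf.train.GradientDescentOptimizer(0.1),
    variables_to_train=[g_var])
# global_step=None here so only one of the two ops advances the global step.
train_op_d = slim.learning.create_train_op(
    d_loss, tf.train.GradientDescentOptimizer(0.1), global_step=None,
    variables_to_train=[d_var])

# Evaluating the sum runs both parameter updates and returns the total loss.
combined_train_op = train_op_g + train_op_d

slim.learning.train(combined_train_op, logdir='/tmp/sum_train_ops',
                    number_of_steps=100)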

Paweł Nowak

You can override train_step_fn, the function that slim.learning.train() calls to run the body of each training step.

For example, suppose you have train_op1 and train_op2. Set train_ops = [train_op1, train_op2], pass it to slim.learning.train() in place of the usual single train_op, and then try the following:

def train_step_fn(session, train_ops, global_step, train_step_kwargs):
  # Run each update in turn; slim expects (total_loss, should_stop) back,
  # so delegate to slim.learning.train_step, which also handles the
  # logging and stop-condition checks.
  loss1, _ = slim.learning.train_step(session, train_ops[0], global_step, train_step_kwargs)
  loss2, should_stop = slim.learning.train_step(session, train_ops[1], global_step, train_step_kwargs)
  return loss1 + loss2, should_stop

slim.learning.train(train_ops, logdir, train_step_fn=train_step_fn)
yx luo