
I want to broadcast some values from the chief to all workers with distributed TensorFlow, like MPI's bcast: https://mpi4py.readthedocs.io/en/stable/tutorial.html#collective-communication

I guess broadcast_send or tf.raw_ops.CollectiveBcastSend is the operation I need, but I could not find any examples in the official TensorFlow documentation.

Is there a good example of using such low-level distributed operations?

Shuhei Fujiwara

0 Answers