Ray Train automatically logs various metrics to TensorBoard. In addition, I want to log custom histograms, images, PR curves, scalars, etc. How do I get a handle to Ray Train's internal TBXLogger so that I can log these additional things?
1 Answer
Hi, Ray team member here.
Which Ray Train version are you using?
I don't think we currently have an end-to-end example that shows how to do this, but you should be able to call something like tf.summary.image directly in your training function. Note that this would log images per training worker.
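Roughly like this (untested sketch: the log directory and the image batch are placeholders, and it assumes TensorFlow is importable on the workers):

```python
import numpy as np
import tensorflow as tf


def train_loop_per_worker(config):
    # Ray Train runs one copy of this function on every training worker,
    # so each worker emits its own image summaries.
    writer = tf.summary.create_file_writer("/tmp/tb_logs")  # placeholder log dir
    for step in range(config["num_steps"]):
        # ... your usual training step here ...
        images = np.random.rand(4, 28, 28, 1).astype("float32")  # fake [k, h, w, c] batch
        with writer.as_default():
            tf.summary.image("sample_inputs", images, step=step, max_outputs=4)
        writer.flush()
```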

Xiaowei
- Currently Ray 2.2.0. Our stack is PyTorch and we don't have TF in our requirements, although tensorboardX is installed via Ray. Isn't there a way to get a handle to the summary writer? – crypdick Feb 12 '23 at 03:33
- Ah, I see. Could you use this in your training function? https://pytorch.org/docs/stable/tensorboard.html – Xiaowei Feb 13 '23 at 23:38
- Multiple SummaryWriters are not supported, so I would need to get Ray's SummaryWriter. See this related comment on the Ray issue tracker: https://github.com/ray-project/ray/issues/4762#issuecomment-491501357 – crypdick Feb 14 '23 at 00:48
- Can we scope each SummaryWriter to each train worker rank? If you have, say, 4 workers, each gets its own local path to write to, so there shouldn't be an issue with multiple SummaryWriters pointing to the same folder. – Xiaowei Feb 17 '23 at 19:27
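A rough sketch of that per-rank approach (untested; the log path and the loop body are placeholders, and it assumes Ray 2.2's `ray.air.session` API together with `torch.utils.tensorboard`):

```python
import os

from ray.air import ScalingConfig, session
from ray.train.torch import TorchTrainer
from torch.utils.tensorboard import SummaryWriter


def train_loop_per_worker(config):
    # One SummaryWriter per worker, keyed by world rank, so no two writers
    # point at the same event-file directory.
    rank = session.get_world_rank()
    writer = SummaryWriter(log_dir=os.path.join("/tmp/tb_logs", f"rank_{rank}"))  # placeholder path
    for step in range(config["num_steps"]):
        # ... real training step goes here ...
        loss = 0.0  # replace with the actual loss
        writer.add_scalar("loss", loss, step)
        session.report({"step": step, "loss": loss})
    writer.close()


trainer = TorchTrainer(
    train_loop_per_worker,
    train_loop_config={"num_steps": 10},
    scaling_config=ScalingConfig(num_workers=4),
)
trainer.fit()
```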