
Using TensorFlow's DCGAN tutorial as an example: https://www.tensorflow.org/tutorials/generative/dcgan?hl=en

To log the loss, the following example was used: https://www.tensorflow.org/tensorboard/get_started?hl=en

Using the above as a reference, I added a few lines to view the loss in TensorBoard, but I couldn't do the same for the generator/discriminator weights and biases.

Code used to track the generator/discriminator loss:

g_loss = tf.keras.metrics.Mean('g_loss', dtype=tf.float32)
d_loss = tf.keras.metrics.Mean('d_loss', dtype=tf.float32)

Preparing the writer / log directory:

current_time = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
train_log_dir = 'logs/' + current_time + '/train'
train_summary_writer = tf.summary.create_file_writer(train_log_dir)

Then, for each epoch, I pass gen_loss and disc_loss into g_loss and d_loss respectively, and do the following:

 with train_summary_writer.as_default(): 
     tf.summary.scalar('g_loss', g_loss.result(), step=epoch) 
     tf.summary.scalar('d_loss', d_loss.result(), step=epoch)
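To illustrate what I mean, here is the pattern in isolation (following the get_started guide); the two numeric values are stand-ins for the per-batch gen_loss from the train step:

```python
import tensorflow as tf

g_loss = tf.keras.metrics.Mean('g_loss', dtype=tf.float32)

# During the epoch: feed each batch's loss into the running mean.
g_loss(0.5)  # stand-in for gen_loss from one train step
g_loss(1.5)
print(float(g_loss.result()))  # 1.0 -> the value logged via tf.summary.scalar

# After logging, clear the state so the next epoch starts fresh
# (older TF versions spell this reset_states()).
g_loss.reset_state()
```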

The above lets me view g_loss and d_loss under the Scalars tab in TensorBoard.

So how can I do the same for the weights and biases?
The tutorial uses tf.GradientTape() to carry out the backpropagation rather than model.fit(), so I presume Keras callbacks are not an option, and that I should instead pass generator.trainable_variables to tf.summary.histogram() — but I'm unsure how to put it all together.
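Here is roughly what I have in mind — I don't know if this is the right approach. The two small Dense models are stand-ins for the tutorial's generator/discriminator, and the writer would be the same train_summary_writer as above:

```python
import tensorflow as tf

# Stand-in models; the real generator/discriminator come from the tutorial.
generator = tf.keras.Sequential([tf.keras.Input(shape=(100,)),
                                 tf.keras.layers.Dense(4)])
discriminator = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                                     tf.keras.layers.Dense(1)])

writer = tf.summary.create_file_writer('logs/demo/train')

def log_histograms(model, prefix, step):
    # One histogram per trainable variable (each layer's kernel and bias).
    for var in model.trainable_variables:
        name = f"{prefix}/{var.name.replace(':', '_')}"
        tf.summary.histogram(name, var, step=step)

# Called once per epoch, alongside the tf.summary.scalar calls.
with writer.as_default():
    log_histograms(generator, 'generator', step=0)
    log_histograms(discriminator, 'discriminator', step=0)
writer.flush()
```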

Also, do you need to "merge" scalars and histograms at some point if you want to view both?
