
I have a subclassed model that instantiates a few custom layers via subclassing. I tried using keras.utils.plot_model(), but it only prints the model block; none of the layers appear.

Can a TensorFlow expert comment on this? Will this feature be implemented in the future? If not, what is the next best alternative for examining the computation graph? Note that model.summary() only gives a summary of the parameters of the custom layer, which contains two dense layers. Ideally, I'd like to see all the computations, if that is not asking too much...

Update: I dug into the source; it looks like plot_model() first checks for the _is_graph_network attribute. Graph Networks are used in the Functional and Sequential APIs. From the source:

Two types of Networks exist: Graph Networks and Subclass Networks. Graph networks are used in the Keras Functional and Sequential APIs. Subclassed networks are used when a user subclasses the Model class. In general, more Keras features are supported with Graph Networks than with Subclassed Networks, specifically:

  • Model cloning (keras.models.clone())
  • Serialization (model.get_config()/from_config(), model.to_json()/to_yaml())
  • Whole-model saving (model.save())

(custom graph component) Naturally, I'd like to know whether I can build a graph network component, so that my subclassed model/layer can work with these features. Does that involve a lot of effort?
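One way to get back Graph Network features without rewriting the custom layers is to keep them as subclassed layers but assemble them with the Functional API, so the model-level connectivity is an explicit graph. This is a minimal sketch (the layer names and sizes here are illustrative, not from the original post):

```python
import tensorflow as tf
from tensorflow.keras import layers

# A custom subclassed layer, standing in for the post's custom layers.
class MyBlock(layers.Layer):
  def __init__(self, units, **kwargs):
    super().__init__(**kwargs)
    self.dense_1 = layers.Dense(units, activation='relu')
    self.dense_2 = layers.Dense(units)

  def call(self, inputs):
    return self.dense_2(self.dense_1(inputs))

# Wire the subclassed layers together with the Functional API.
# Because the layer-to-layer connectivity is now an explicit graph,
# the resulting model is a Graph Network, so plot_model() and
# whole-model serialization work at the layer level (the internals
# of MyBlock still appear as a single block).
inputs = tf.keras.Input(shape=(32,))
x = MyBlock(16)(inputs)
outputs = layers.Dense(10, activation='softmax')(x)
model = tf.keras.Model(inputs, outputs)

model.summary()
```

The trade-off is that plot_model() will still draw MyBlock as one node; only the top-level wiring becomes visible.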

(tf.function graph visualization) Can someone let me know whether graph visualization via TensorBoard works with TensorFlow 2 tf.functions? In TensorFlow 1.x, one defines a name scope for a logical group of ops (e.g. generator/discriminator in a GAN, encoder/decoder in a VAE, and loss/metrics), which is then displayed as a high-level block in the graph visualization. Can I define something similar for tf.functions?
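For what it's worth, tf.name_scope still works inside a tf.function in TF2: ops created under the scope get the prefix in the traced graph, which TensorBoard renders as a collapsible block, much like TF1 name scopes. A small sketch (the scope names and shapes are just examples):

```python
import tensorflow as tf

@tf.function
def forward(x):
  # Ops created inside tf.name_scope are grouped under one node
  # in the TensorBoard graph view.
  with tf.name_scope("encoder"):
    h = tf.nn.relu(tf.matmul(x, tf.ones([32, 16])))
  with tf.name_scope("decoder"):
    y = tf.matmul(h, tf.ones([16, 32]))
  return y

# Trace the function and inspect the op names: they carry the
# "encoder/" and "decoder/" prefixes.
graph = forward.get_concrete_function(
    tf.TensorSpec([None, 32], tf.float32)).graph
for op in graph.get_operations():
  print(op.name)
```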

David KWH
    If anyone sees this and knows someone who can help, please let me know. I really hope to find out more about the inner workings of tensorflow2. Thanks. – David KWH Aug 10 '19 at 18:49
  • Have you figured out how to plot the model created using subclassing technique? – donto Dec 22 '19 at 08:44
  • @DavidKWH if you're still looking for the answer, I've posted a simple workaround [here](https://stackoverflow.com/a/63898244/9215780). – Innat Sep 15 '20 at 08:47

1 Answer


According to the official documentation (https://www.tensorflow.org/tensorboard/graphs), you can

use TensorFlow Summary Trace API to log autographed functions for visualization in TensorBoard

Here is a simple example of visualizing a subclassed model:

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

class MyModel(tf.keras.Model):

  def __init__(self, num_classes=10):
    super(MyModel, self).__init__(name='my_model')
    self.num_classes = num_classes
    self.dense_1 = layers.Dense(32, activation='relu')
    self.dense_2 = layers.Dense(num_classes, activation='sigmoid')

  def call(self, inputs):
    x = self.dense_1(inputs)
    return self.dense_2(x)

model = MyModel(num_classes=10)

model.compile(optimizer=tf.keras.optimizers.RMSprop(0.001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

data = np.random.random((1000, 32))
labels = np.random.random((1000, 10))

# Wrap a forward pass in tf.function so the tracer can record it as a graph.
@tf.function
def trace():
  data = np.random.random((1, 32))
  return model(data)

logdir = "trace_log"
writer = tf.summary.create_file_writer(logdir)
tf.summary.trace_on(graph=True, profiler=True)
trace()  # forward pass, recorded by the tracer
with writer.as_default():
  tf.summary.trace_export(name="model_trace", step=0, profiler_outdir=logdir)

Then you can use TensorBoard to examine the computation graph:

tensorboard --logdir trace_log
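If you just want to list the ops without launching TensorBoard at all, one alternative (a sketch, not part of the original answer) is to trace the model into a concrete function and inspect its FuncGraph directly:

```python
import tensorflow as tf
from tensorflow.keras import layers

class MyModel(tf.keras.Model):
  def __init__(self, num_classes=10):
    super().__init__()
    self.dense_1 = layers.Dense(32, activation='relu')
    self.dense_2 = layers.Dense(num_classes, activation='sigmoid')

  def call(self, inputs):
    return self.dense_2(self.dense_1(inputs))

model = MyModel()

# Trace the forward pass for a fixed input signature; the resulting
# concrete function owns a FuncGraph whose ops can be enumerated.
forward = tf.function(model).get_concrete_function(
    tf.TensorSpec([None, 32], tf.float32))

for op in forward.graph.get_operations():
  print(op.name, op.type)
```

This prints every op in the traced graph (MatMul, BiasAdd, Relu, etc.), which answers the "see all the computations" part of the question, just in text form rather than as a diagram.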
wangyu