
I use MLflow with autolog to keep track of my TensorFlow models:

import mlflow
import mlflow.tensorflow

mlflow.tensorflow.autolog(every_n_iter=1)
with mlflow.start_run():
  model = ...
  model.compile(...)
  model.fit(...)

and then I want to use my TensorBoard logs located in the artifacts. But when I run:

%tensorboard --logdir=<logs_path>

I get the error message:

"No dashboards are active for the current data set. Probable causes:

- You haven’t written any data to your event files.
- TensorBoard can’t find your event files."

I work on Databricks, so <logs_path> is something like:

/dbfs/databricks/mlflow-tracking/..

Any ideas?

1 Answer

I found something that works for me.

My experiment_log_dir in a Databricks notebook:

import uuid

# Use the Databricks username for the log dir; fall back to a random id if it is unavailable
try:
    username = dbutils.notebook.entry_point.getDbutils().notebook().getContext().tags().apply('user')
except Exception:
    username = str(uuid.uuid1()).replace("-", "")
experiment_log_dir = "/dbfs/user/{}/tensorboard_log_dir/".format(username)
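
Since /dbfs/... is the DBFS FUSE mount, standard Python file APIs can write there; a small optional sketch to make sure the directory exists before training:

import os

# create the log directory on the DBFS FUSE mount if it doesn't exist yet
os.makedirs(experiment_log_dir, exist_ok=True)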

My TensorBoard callback in model.fit():

    # one timestamped TensorBoard log directory per run
    # (assumes: import datetime, import mlflow, and from tensorflow import keras)
    run_log_dir = experiment_log_dir + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    tensorboard_callback = keras.callbacks.TensorBoard(log_dir=run_log_dir, histogram_freq=1)

    with mlflow.start_run(nested=True) as run:
        self.model.fit(
            [...]
            callbacks=[tensorboard_callback]
        )
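
If you also want these event files attached to the MLflow run as artifacts (which is where the question expected autolog to put them), you can log the directory explicitly; a minimal sketch, still inside the with mlflow.start_run(...) block after fit, with the artifact_path name being my own choice:

        # attach the TensorBoard event files to the current MLflow run
        mlflow.log_artifacts(run_log_dir, artifact_path="tensorboard_logs")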

Then, in the Databricks notebook, when I run TensorBoard:

 %tensorboard --logdir $experiment_log_dir

doesn't work, but if you add a slash at the end:

 %tensorboard --logdir $experiment_log_dir/

it does.
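
For completeness, the %tensorboard magic needs the TensorBoard notebook extension loaded first, so the full sequence in the notebook looks roughly like this:

%load_ext tensorboard
%tensorboard --logdir $experiment_log_dir/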