I want to save the model weights only when the loss decreases, and then reuse those best weights for evaluation. Roughly:
lowest_loss = float('inf')  # plain Inf is not valid Python

# inside the training loop, after each federated round:
if loss[round] < lowest_loss:
    lowest_loss = loss[round]
    # keep the current server weights as the best so far
    model_weights = transfer_learning_iterative_process.get_model_weights(state)
    eval_metric = federated_eval(model_weights, [fed_valid_data])
where:
federated_eval = tff.learning.build_federated_evaluation(model_fn)
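For context, the surrounding loop I am running looks roughly like the sketch below. NUM_ROUNDS, fed_train_data, fed_valid_data and model_fn come from my own setup, and the way I pull the round loss out of metrics may differ between TFF versions:

import tensorflow_federated as tff

state = transfer_learning_iterative_process.initialize()
federated_eval = tff.learning.build_federated_evaluation(model_fn)

lowest_loss = float('inf')
best_weights = None
for round_num in range(NUM_ROUNDS):
    # run one round of federated training
    state, metrics = transfer_learning_iterative_process.next(state, fed_train_data)
    round_loss = metrics['train']['loss']  # metrics structure varies across TFF versions
    if round_loss < lowest_loss:
        lowest_loss = round_loss
        # extract and evaluate the best server weights seen so far
        best_weights = transfer_learning_iterative_process.get_model_weights(state)
        eval_metric = federated_eval(best_weights, [fed_valid_data])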
Is there a way to save the server weights in HDF5 format or as a checkpoint and reuse them later?
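What I have in mind is something like the sketch below, though I am not sure it is the right approach. create_keras_model() is a hypothetical helper that rebuilds the same Keras architecture used inside model_fn, and I am relying on ModelWeights.assign_weights_to, Keras save_weights/load_weights, and tf.train.Checkpoint:

import tensorflow as tf
import tensorflow_federated as tff

# Save: copy the TFF server weights into a plain Keras model, then write HDF5.
keras_model = create_keras_model()  # hypothetical helper, same architecture as model_fn
best_weights = transfer_learning_iterative_process.get_model_weights(state)
best_weights.assign_weights_to(keras_model)
keras_model.save_weights('best_weights.h5', save_format='h5')

# Alternatively, save a TensorFlow checkpoint instead of HDF5.
ckpt = tf.train.Checkpoint(model=keras_model)
ckpt.save('checkpoints/best')

# Reuse: load the weights into a fresh Keras model and rebuild a ModelWeights
# object so that federated_eval can consume it.
restored_model = create_keras_model()
restored_model.load_weights('best_weights.h5')
restored_weights = tff.learning.ModelWeights(
    trainable=restored_model.trainable_weights,
    non_trainable=restored_model.non_trainable_weights)
eval_metric = federated_eval(restored_weights, [fed_valid_data])

Is this a reasonable way to do it, or does TFF provide a built-in mechanism for checkpointing the server state?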