
I'm using Ray Tune to do hyperparameter optimization with a TensorFlow model.

What I want to do is, after training, upload model.h5 and its training log to S3.

My code is something like this:

tuner = tune.Tuner(
    tune.with_resources(),
    tune_config=tune.TuneConfig(metric='val_loss', mode='min', num_samples=5, time_budget_s=10000, reuse_actors=True),
    run_config=air.RunConfig(local_dir='hpo_test/', name='test_1',
       sync_config=tune.SyncConfig(
           upload_dir='s3://raytune_logs',
           syncer='auto',
           sync_artifacts=True, sync_on_checkpoint=True),
       checkpoint_config=air.CheckpointConfig(num_to_keep=5,
                  checkpoint_score_attribute='val_loss',
                  checkpoint_score_order='min'),
       log_to_file=True),
)

This successfully uploads to S3, but it does so for every hyperparameter configuration. I want to upload only the model and log of the best-performing configuration, and I also need to be able to locate the uploaded files afterwards so I can read data from them. The trial folder name is a concatenation of the hyperparameter values, which makes it awkward to address the folder programmatically.
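One workaround I've been considering, sketched here under the assumption of the Ray 2.x `ResultGrid` API (the `best_trial_files` helper, the bucket name, and the `upload_best_trial` function are my own placeholders, not part of Ray): skip per-trial syncing entirely, let `tuner.fit()` finish locally, then upload only the best trial's directory with boto3 under a fixed, predictable key prefix.

```python
import os

def best_trial_files(trial_dir):
    """Yield (local_path, s3_key) pairs for every file under the trial dir.

    The S3 key is the path relative to the trial directory, so the upload
    gets a fixed layout regardless of the hyperparameter-derived folder name.
    """
    for root, _, files in os.walk(trial_dir):
        for fname in files:
            local = os.path.join(root, fname)
            yield local, os.path.relpath(local, trial_dir)

def upload_best_trial(tuner, bucket, prefix="best_trial"):
    """Run the tuner, then upload only the best trial's files to S3.

    Assumes Ray >= 2.x: ResultGrid.get_best_result() and Result.path
    (in older releases the local directory is Result.log_dir instead).
    """
    import boto3  # assumed installed and configured with AWS credentials

    results = tuner.fit()
    best = results.get_best_result(metric="val_loss", mode="min")

    s3 = boto3.client("s3")
    for local, key in best_trial_files(best.path):
        s3.upload_file(local, bucket, f"{prefix}/{key}")
```

Because every run lands under the same `best_trial/` prefix, a downstream consumer can fetch `best_trial/model.h5` without knowing which hyperparameter combination won.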

haneulkim

0 Answers