
I'm trying to create a Vertex AI Pipeline that runs a hyperparameter tuning job reading its data from a Vertex AI Dataset, so that the metadata functionality can track the relationship between the dataset, the model, and the endpoint (once I deploy the best model).

I'm following this tutorial, which reads the data directly from tensorflow_datasets, but I don't see any way to pass a Vertex AI Dataset to the hyperparameter tuning job op.

Does anyone know how to access a Vertex AI Dataset in a hyperparameter tuning job?

Thank you.

Daruri

1 Answer


You will need to add hypertune to your training code so that each trial can report its hyperparameter performance (the metric value) back to the tuning service:

import hypertune
from sklearn.metrics import f1_score

# Evaluate the trial's model on the held-out set
hp_metric = f1_score(y_test, y_pred, average='weighted')

# Report the metric back to the Vertex AI hyperparameter tuning service
hpt = hypertune.HyperTune()
hpt.report_hyperparameter_tuning_metric(
    hyperparameter_metric_tag='accuracy',
    metric_value=hp_metric,
    global_step=100)
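Note that the hyperparameter_metric_tag must match the metric name you declare when configuring the hyperparameter tuning job's metric spec. Here the tag is 'accuracy' even though the value being reported is a weighted F1 score, so you may want to keep the two names consistent.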

You also need to add command-line arguments for the hyperparameters you want to tune, so the tuning service can pass a different value to each trial:

import argparse

if __name__ == '__main__':

    parser = argparse.ArgumentParser()
    # Input arguments: hyperparameter values supplied by the tuning service

    parser.add_argument(
        '--max_depth',
        help='RF model parameter - max depth',
        type=int,
        default=100
    )

    parser.add_argument(
        '--max_features',
        help='RF model parameter - max features',
        type=int,
        default=34
    )

    parser.add_argument(
        '--max_leaf_nodes',
        help='RF max_leaf_nodes',
        type=int,
        default=8
    )

    parser.add_argument(
        '--min_samples_leaf',
        help='RF min_samples_leaf',
        type=int,
        default=1
    )

    args = parser.parse_args()
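As a rough sketch of how the two pieces fit together (assuming a scikit-learn RandomForestClassifier and pre-loaded X_train/X_test/y_train/y_test arrays, which are not part of the original snippets), the parsed arguments configure the model for each trial and the resulting metric is reported with hypertune:

import hypertune
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

def train_and_report(args, X_train, y_train, X_test, y_test):
    # Build the model from this trial's hyperparameter values
    model = RandomForestClassifier(
        max_depth=args.max_depth,
        max_features=args.max_features,
        max_leaf_nodes=args.max_leaf_nodes,
        min_samples_leaf=args.min_samples_leaf)
    model.fit(X_train, y_train)

    # Evaluate and report the metric for this trial
    y_pred = model.predict(X_test)
    hp_metric = f1_score(y_test, y_pred, average='weighted')

    hpt = hypertune.HyperTune()
    hpt.report_hyperparameter_tuning_metric(
        hyperparameter_metric_tag='accuracy',
        metric_value=hp_metric,
        global_step=100)

The tuning service then launches one such trial per hyperparameter combination and uses the reported metric to decide which combinations to try next.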
razimbres
  • Sorry if my question wasn't clear. My problem is regarding how to use a Vertex AI Dataset while performing a hyperparameter tuning job in a Vertex AI Pipeline – Daruri Oct 06 '22 at 17:53