
I'd like to use Ray Tune to optimize for metric_slow, but since that metric takes a long time before it becomes available, I want to use ASHA to early stop trials based on metric_fast_but_rough. I tried to do this by giving the scheduler one metric and tune.run() a different one, and setting TUNE_DISABLE_STRICT_METRIC_CHECKING.
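Roughly, the setup looked like the sketch below (the trainable my_objective and the config search space are placeholders, not my actual code):

import os

from ray import tune
from ray.tune.schedulers import ASHAScheduler

os.environ["TUNE_DISABLE_STRICT_METRIC_CHECKING"] = "1"

analysis = tune.run(
    my_objective,                 # placeholder trainable
    metric="metric_slow",         # metric I actually want to optimize
    mode="max",
    scheduler=ASHAScheduler(metric="metric_fast_but_rough", mode="max"),
    config=config,                # placeholder search space
    num_samples=8,
)

However, I got the following error: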

You passed a `metric` or `mode` argument to `tune.run()`, but the scheduler you are using was already instantiated with their own `metric` and `mode` parameters. Either remove the arguments from your scheduler or from your call to `tune.run()`

How can I tell Ray Tune to optimize one metric but schedule (early stop) based on a different one?

SRobertJames

1 Answer


Does the following work for you?

from ray import tune
from ray.tune.search.optuna import OptunaSearch
from ray.tune.schedulers import ASHAScheduler
from ray.tune.search import ConcurrencyLimiter


def my_objective(config):
    # Hyperparameters
    width, height = config["width"], config["height"]

    for step in range(config["steps"]):
        # Iterative training step - can be any arbitrary training procedure
        # that produces both metrics (placeholder for your own code).
        metric_fast_but_rough, metric_slow = your_calculation_here(width, height)
        # Feed the scores back to Tune.
        tune.report(
            iterations=step,
            metric_fast_but_rough=metric_fast_but_rough,
            metric_slow=metric_slow,
        )


# Example search space; replace with your own hyperparameters.
config = {
    "width": tune.uniform(0, 20),
    "height": tune.uniform(-100, 100),
    "steps": 10,
}

searcher = OptunaSearch(metric="metric_slow", mode="max")
algo = ConcurrencyLimiter(searcher, max_concurrent=4)

analysis = tune.run(
    my_objective,
    search_alg=algo,
    num_samples=8,
    scheduler=ASHAScheduler(
        metric="metric_fast_but_rough",
        mode="max",
        max_t=10,
        grace_period=1,
        reduction_factor=2,
    ),
    config=config,
)
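
The idea is to give the metric you want to optimize (metric_slow) to the searcher and the metric you want to early stop on (metric_fast_but_rough) to the ASHA scheduler, while passing no metric or mode to tune.run() itself, so the conflict check that produced your error is never triggered.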
Xiaowei