Are Population Based Training (PBT) and HyperOpt Search combinable?
The AsyncHyperBandScheduler is used in the Hyperopt example of ray.tune. Here config sets some parameters for the run() function:
    config = {
        "num_samples": 10 if args.smoke_test else 1000,
        "config": {
            "iterations": 100,
        },
        "stop": {
            "timesteps_total": 100
        },
    }
and space is the hyperparameter space, which goes inside the Tune search algorithm and is built with hyperopt functions:
    from hyperopt import hp
    from ray.tune import run
    from ray.tune.schedulers import AsyncHyperBandScheduler
    from ray.tune.suggest.hyperopt import HyperOptSearch

    space = {
        "width": hp.uniform("width", 0, 20),
        "height": hp.uniform("height", -100, 100),
        "activation": hp.choice("activation", ["relu", "tanh"])
    }

    # the space dict is handed to the search algorithm, not to run()
    algo = HyperOptSearch(space, metric="mean_loss", mode="min")
    scheduler = AsyncHyperBandScheduler()
    run(easy_objective, search_alg=algo, scheduler=scheduler, **config)
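Note the nested "config" key: since run() is called with **config, the outer dict is only keyword arguments. If I unpack it by hand, the call above should be equivalent to:

    run(easy_objective,
        search_alg=algo,
        scheduler=scheduler,
        num_samples=10 if args.smoke_test else 1000,
        config={"iterations": 100},
        stop={"timesteps_total": 100})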
Yet in the population-based training example with Keras, the hyperparameter space is loosely given by hyperparam_mutations inside the Tune trial scheduler, built with numpy functions:
    import numpy as np
    from ray.tune.schedulers import PopulationBasedTraining

    pbt = PopulationBasedTraining(
        hyperparam_mutations={
            "dropout": lambda: np.random.uniform(0, 1),
            "lr": lambda: 10**np.random.randint(-10, 0),
            "rho": lambda: np.random.uniform(0, 1)
        })
and config is used in a different way: it sets the starting parameters for the individuals of the population.
    run(MemNNModel,
        scheduler=pbt,
        config={
            "batch_size": 32,
            "epochs": 1,
            "dropout": 0.3,
            "lr": 0.01,
            "rho": 0.9
        })
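As far as I understand hyperparam_mutations (a rough sketch of the idea, not Ray's actual explore code), PBT either perturbs a running trial's value or resamples it by simply calling the given function:

    import numpy as np

    # hypothetical illustration of a resample during PBT's explore step
    mutations = {
        "dropout": lambda: np.random.uniform(0, 1),
        "lr": lambda: 10**np.random.randint(-10, 0),
        "rho": lambda: np.random.uniform(0, 1)
    }
    resampled = {key: fn() for key, fn in mutations.items()}
    # e.g. {'dropout': 0.42, 'lr': 1e-07, 'rho': 0.87}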
To summarize:
Hyperopt takes its hyperparameter space inside the search algorithm, with hyperopt functions.
Population-Based Training takes its hyperparameter space inside the trial scheduler, with numpy functions.
Both use config differently.
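If the two were combinable, I imagine the call would look roughly like this, reusing space and config from the first example (a sketch only; run() does accept both a search_alg and a scheduler at once, but whether HyperOpt's suggestions and PBT's mutations cooperate is exactly what I am asking):

    # hypothetical combination of the two examples above
    algo = HyperOptSearch(space, metric="mean_loss", mode="min")
    pbt = PopulationBasedTraining(
        hyperparam_mutations={
            "width": lambda: np.random.uniform(0, 20),
            "height": lambda: np.random.uniform(-100, 100)
        })
    run(easy_objective, search_alg=algo, scheduler=pbt, **config)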
From this answer I take it that HyperOpt Search will only take completed trials into account.
Is there a conflict between hyperparameters sampled by HyperOpt Search and hyperparameters "changed while running" by Population-Based Training? Quote: "a single trial may see many different hyperparameters over its lifetime"