Questions tagged [ray-tune]
72 questions
0
votes
0 answers
Ray Tune conflicting with the `breakpoint()` function
The code section attached below, which is basically copy-pasted from the official Ray Tune documentation, runs as expected until the last line, but calling breakpoint() after tuner.fit() breaks the debugger (I can't see any call stack or local…
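
A minimal sketch of the situation being described (the trainable and search space below are placeholders, not the asker's code); the breakpoint() call after tuner.fit() is where the debugger reportedly stops working:

from ray import tune

def objective(config):
    # toy trainable: report a single final metric
    return {"score": config["x"] ** 2}

tuner = tune.Tuner(objective, param_space={"x": tune.uniform(-1.0, 1.0)})
results = tuner.fit()

breakpoint()  # the post-fit debugger session is what reportedly misbehaves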

Dvir Berebi
- 1,406
- 14
- 25
0
votes
0 answers
Hyperparameter search while adding special tokens
# define get_model function
def get_model(params):
    db_config = config
    if params is not None:
        db_config.update({'attention_probs_dropout_prob': params['attention_drop_out'],
…
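
A hedged sketch of the general pattern the snippet points at, assuming a Hugging Face setup: the model checkpoint, the added tokens, and the dropout key are placeholders, and the step that matters for added special tokens is resizing the embedding matrix inside the per-trial model factory:

from transformers import AutoConfig, AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-uncased"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
# hypothetical special tokens for illustration only
tokenizer.add_special_tokens({"additional_special_tokens": ["<ent>", "</ent>"]})
config = AutoConfig.from_pretrained(model_name)

def get_model(params):
    db_config = config
    if params is not None:
        db_config.update({"attention_probs_dropout_prob": params["attention_drop_out"]})
    model = AutoModelForSequenceClassification.from_pretrained(model_name, config=db_config)
    # resize so the newly added special tokens have rows in the embedding matrix
    model.resize_token_embeddings(len(tokenizer))
    return model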

LearnToGrow
- 1,656
- 6
- 30
- 53
0
votes
1 answer
Ray tune AssertionError: HpBandSter must be installed! | Cannot run HyperBandForBOHB
I am trying to use HyperBandForBOHB in Ray Tune, but I always get this error even though all the requirements have been installed:
AssertionError: HpBandSter must be installed!
You can install HpBandSter with the command:
`pip install…
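
For context, a hedged sketch of a typical HyperBandForBOHB setup (the trainable and search space are placeholders); per the Ray documentation, BOHB depends on the ConfigSpace and hpbandster packages being importable from the same Python environment that Ray runs in, which is the usual cause of this assertion:

from ray import tune
from ray.tune.schedulers import HyperBandForBOHB
from ray.tune.search.bohb import TuneBOHB

def trainable(config):
    # toy objective
    return {"loss": (config["lr"] - 0.01) ** 2}

tuner = tune.Tuner(
    trainable,
    param_space={"lr": tune.loguniform(1e-4, 1e-1)},
    tune_config=tune.TuneConfig(
        metric="loss",
        mode="min",
        scheduler=HyperBandForBOHB(time_attr="training_iteration", max_t=100),
        search_alg=TuneBOHB(),
        num_samples=8,
    ),
)
results = tuner.fit()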

Arman Asgharpoor
- 259
- 3
- 17
0
votes
1 answer
Using Ray Tune `tune.run` with PyTorch returns different optimal hyperparameter combinations
I've initialized two identical ANNs with PyTorch (same structure and initial parameters), and I've noticed that hyperparameter tuning with Ray Tune returns different results for the two ANNs, even though I didn't have any random…
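
A hedged sketch of one way to pin the sources of randomness so two otherwise identical searches become comparable (the training body and metric are placeholders); the search algorithm's own sampling may also need a fixed seed, for example HyperOptSearch exposes a random_state_seed argument:

import random
import numpy as np
import torch
from ray import tune

def train_fn(config):
    # per-trial determinism for weight init, dropout, data shuffling, etc.
    random.seed(0)
    np.random.seed(0)
    torch.manual_seed(0)
    # ... build and train the network with config["lr"] ...
    return {"val_loss": float(config["lr"])}  # placeholder metric

analysis = tune.run(
    train_fn,
    config={"lr": tune.loguniform(1e-4, 1e-1)},
    num_samples=8,
    metric="val_loss",
    mode="min",
)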

Lorenzo Boletti
- 41
- 3
0
votes
0 answers
Ray Tune | Find optimal network hidden size using PBT
I intend to develop a model to test whether PBT is working correctly, and I want to find the optimal hidden layer size via PBT in Ray Tune, but the hidden layer sizes found by PBT are not optimal. The results are not even suboptimal. I have…
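
A hedged sketch of a typical PBT setup in Ray Tune (the training loop below is a placeholder). One relevant detail: PBT perturbs the hyperparameters of running trials, so a value that changes the architecture, such as a hidden-layer size, is usually sampled once in param_space rather than listed under hyperparam_mutations:

from ray import tune
from ray.air import session
from ray.air.checkpoint import Checkpoint
from ray.tune.schedulers import PopulationBasedTraining

def trainable(config):
    val_loss = float(config["hidden_size"])  # placeholder "loss"
    for step in range(20):
        val_loss *= 0.99  # stand-in for a real training step
        session.report({"val_loss": val_loss},
                       checkpoint=Checkpoint.from_dict({"step": step}))

pbt = PopulationBasedTraining(
    time_attr="training_iteration",
    perturbation_interval=4,
    hyperparam_mutations={"lr": tune.loguniform(1e-4, 1e-1)},
)

tuner = tune.Tuner(
    trainable,
    param_space={"hidden_size": tune.choice([32, 64, 128]),
                 "lr": tune.loguniform(1e-4, 1e-1)},
    tune_config=tune.TuneConfig(metric="val_loss", mode="min",
                                scheduler=pbt, num_samples=8),
)
results = tuner.fit()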

Arman Asgharpoor
- 259
- 3
- 17
0
votes
0 answers
RayTune HyperOptSearch - fitting resampling into pipeline throws error: All intermediate steps should be transformers and implement fit and transform
I'm getting started with Ray Tune and trying to set up a HyperOptSearch with imbalanced data.
Fitting a pipeline without RandomOverSampler works fine, but when I add that in, I get the error:
TypeError: All intermediate steps should be transformers…
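
For context, a hedged sketch of the usual workaround: scikit-learn's Pipeline only accepts transformers as intermediate steps, so samplers such as RandomOverSampler normally go into imbalanced-learn's own Pipeline, which can then be handed to the search as usual (the estimators below are placeholders):

from imblearn.pipeline import Pipeline  # not sklearn.pipeline.Pipeline
from imblearn.over_sampling import RandomOverSampler
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("oversample", RandomOverSampler(random_state=0)),  # allowed here
    ("clf", LogisticRegression(max_iter=1000)),
])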

Hanafi Haffidz
- 148
- 8
0
votes
0 answers
Ray tune and pytorch - How to prevent running out of space on disk?
I have this code:
search_alg = HyperOptSearch()
hyperopt_search = HyperOptSearch(
metric="val_acc", mode="max")
tuner = tune.Tuner(tune.with_resources(train_fn, {"cpu": 1}),…
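
A hedged sketch of the two knobs that usually control disk usage with the Ray 2.x Tuner API used above: point RunConfig at a larger volume and cap how many checkpoints each trial keeps (paths and the training function are placeholders):

from ray import air, tune
from ray.tune.search.hyperopt import HyperOptSearch

def train_fn(config):
    return {"val_acc": 0.5}  # placeholder

tuner = tune.Tuner(
    tune.with_resources(train_fn, {"cpu": 1}),
    param_space={"lr": tune.loguniform(1e-4, 1e-1)},
    tune_config=tune.TuneConfig(metric="val_acc", mode="max",
                                search_alg=HyperOptSearch(), num_samples=8),
    run_config=air.RunConfig(
        local_dir="/path/with/more/space",                      # instead of ~/ray_results
        checkpoint_config=air.CheckpointConfig(num_to_keep=1),  # keep only the latest checkpoint
    ),
)
results = tuner.fit()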

Slowat_Kela
- 1,377
- 2
- 22
- 60
0
votes
0 answers
Function checkpointing is disabled. This may result in unexpected behavior when using checkpointing features or certain schedulers
Could someone explain why this code:
config_dict = {
"c": tune.choice([64,128,256]),
"dp": tune.choice([0.6,0.7,0.8,0.9]),
"layers":tune.choice([2,3,4,5]),
}
hyperopt_search = HyperOptSearch(
…
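
For context, a hedged sketch of what the warning refers to: it concerns the trainable rather than the search space, and it goes away when the function trainable reports checkpoints explicitly (the training loop below is a placeholder):

from ray.air import session
from ray.air.checkpoint import Checkpoint

def train_fn(config):
    for step in range(10):
        metrics = {"val_loss": 1.0 / (step + 1)}  # placeholder metric
        session.report(metrics,
                       checkpoint=Checkpoint.from_dict({"step": step}))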

Slowat_Kela
- 1,377
- 2
- 22
- 60
0
votes
0 answers
Ray tune / syne tune string search space
I am trying to set up a string Domain wrapper for Syne Tune search parameters (the Domain class is inherited from Ray Tune).
My entry point requires an argument such as --config [lr=0.1, optimizer=adam] with other override configurations.
I would like to…
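
A hedged sketch of how string-valued hyperparameters are usually expressed, as a categorical domain; Ray Tune's tune.choice accepts strings, and Syne Tune's syne_tune.config_space.choice is the analogous construct (worth verifying against the installed Syne Tune version):

from ray import tune

param_space = {
    "optimizer": tune.choice(["adam", "sgd", "rmsprop"]),  # string categorical
    "lr": tune.loguniform(1e-4, 1e-1),
}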

Isdj
- 1,835
- 1
- 18
- 36
0
votes
1 answer
How can I use GPU properly on Ray tune?
When I try to use Ray Tune for hyperparameter optimization, the error below occurs.
RuntimeError: No CUDA GPUs are available
(main pid=4099) *** SIGSEGV received at time=1664685800 on cpu 0 ***
(main pid=4099) PC: @ 0x7f7999651050 (unknown) …
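
For context, a hedged sketch of the usual pattern: Tune only exposes GPUs to a trial if the trial requests them, so the trainable is typically wrapped with a per-trial GPU request (the training function and metric are placeholders):

import torch
from ray import tune

def train_fn(config):
    # should hold once Tune has actually allocated a GPU to this trial
    assert torch.cuda.is_available()
    return {"acc": 0.0}  # placeholder metric

tuner = tune.Tuner(
    tune.with_resources(train_fn, {"cpu": 2, "gpu": 1}),
    param_space={"lr": tune.loguniform(1e-4, 1e-1)},
)
results = tuner.fit()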

Semayuki
- 11
- 3
0
votes
1 answer
How to print metrics per epoch for the best model from ray tune?
I have this code:
from ray import tune
from ray import air
from ray.air.config import RunConfig
from ray.tune.search.hyperopt import HyperOptSearch
from hyperopt import fmin, hp, tpe, Trials, space_eval, STATUS_OK
import os
config_dict = {
…
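
A hedged sketch of how per-epoch metrics for the best trial can be read back from the ResultGrid after fitting (the trainable and metric name here are placeholders):

from ray import tune
from ray.air import session

def train_fn(config):
    for epoch in range(5):
        session.report({"val_acc": 0.1 * (epoch + 1)})  # placeholder per-epoch metric

tuner = tune.Tuner(train_fn, param_space={"lr": tune.loguniform(1e-4, 1e-1)})
results = tuner.fit()

best = results.get_best_result(metric="val_acc", mode="max")
print(best.config)             # best hyperparameter combination
print(best.metrics_dataframe)  # one row per reported epoch for that trial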

Slowat_Kela
- 1,377
- 2
- 22
- 60
0
votes
0 answers
How to get ray tune to run with pytorch - IsADirectoryError: [Errno 21] Is a directory
I have this code (a complete reproducible example):
## Standard libraries
CHECKPOINT_PATH = "/home/ad1/test_predictor/new_dev_v1"
DATASET_PATH = "/home/ad1/test_predictor"
import torch
device = torch.device("cuda:0") if torch.cuda.is_available()…

Slowat_Kela
- 1,377
- 2
- 22
- 60
0
votes
2 answers
Why does Ray Tune with PyTorch HPO give the error 'trials did not complete, incomplete trials'?
Could someone explain why this code (that I took from here):
## Standard libraries
import os
import json
import math
import numpy as np
import time
## Imports for plotting
import matplotlib.pyplot as plt
#%matplotlib inline
#from IPython.display…

Slowat_Kela
- 1,377
- 2
- 22
- 60
0
votes
1 answer
Ray Tune: How to optimize one metric but schedule (early stop) based on a different one?
I'd like to use Ray Tune to optimize for metric_slow, but since that metric takes a long time to become available, I want to use ASHA to early-stop based on metric_fast_but_rough. I tried to do this by giving the scheduler one metric and tune.run a different…
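
A hedged sketch of the attempt being described, using the question's metric names and a placeholder trainable that reports both; note that, depending on the Ray version, passing a metric to both the scheduler and tune.run may itself be rejected, which is essentially what the question is asking about:

from ray import tune
from ray.air import session
from ray.tune.schedulers import ASHAScheduler

def trainable(config):
    for step in range(50):
        session.report({
            "metric_fast_but_rough": step * config["lr"],  # placeholder values
            "metric_slow": step * config["lr"] * 0.5,
        })

scheduler = ASHAScheduler(metric="metric_fast_but_rough", mode="max",
                          grace_period=5, max_t=50)

analysis = tune.run(
    trainable,
    config={"lr": tune.loguniform(1e-4, 1e-1)},
    scheduler=scheduler,
    metric="metric_slow",  # what best-trial selection should optimize
    mode="max",
    num_samples=16,
)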

SRobertJames
- 8,210
- 14
- 60
- 107
0
votes
1 answer
How can I change the ray_results folder when using TuneGridSearchCV?
I am running quite a large parameter search using TuneGridSearchCV on an XGBoost model on my university's HPC cluster. The results are being saved to ~/ray_results; however, I don't have enough space to save all the files in the home directory as…
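
A hedged sketch, assuming tune-sklearn's search classes accept a local_dir argument (worth confirming against the installed tune-sklearn version); the estimator, grid, and scratch path are placeholders:

from tune_sklearn import TuneGridSearchCV
from xgboost import XGBClassifier

search = TuneGridSearchCV(
    XGBClassifier(),
    param_grid={"max_depth": [3, 6], "n_estimators": [100, 300]},
    cv=3,
    local_dir="/scratch/username/ray_results",  # hypothetical path with more space
)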

shoopdoop
- 1
- 1