
Is there a way to use pruning with CatBoost and Optuna? In LightGBM it's easy, but for CatBoost I can't find any hint. My code looks like this:

from catboost import cv

# train_dataset is assumed to be a catboost.Pool built from the training data

def objective(trial):
    param = {
        'iterations': trial.suggest_int('iterations', 100, 1500, step=100),
        'learning_rate': trial.suggest_uniform("learning_rate", 0.001, 0.3),
        'random_strength': trial.suggest_int("random_strength", 1, 10),
        'max_bin': trial.suggest_categorical('max_bin', [2, 3, 4, 5, 6, 8, 10, 20, 30]),
        'grow_policy': trial.suggest_categorical('grow_policy', ['SymmetricTree', 'Depthwise', 'Lossguide']),
        "colsample_bylevel": trial.suggest_uniform("colsample_bylevel", 0.1, 1),
        'od_type': "Iter",
        'od_wait': 30,
        "depth": trial.suggest_int("max_depth", 1, 12),
        "l2_leaf_reg": trial.suggest_loguniform("l2_leaf_reg", 1e-8, 100),
        'custom_metric': ['AUC'],
        "loss_function": "Logloss",
    }

    # Ordered boosting is only tried with the SymmetricTree grow policy
    if param['grow_policy'] == "SymmetricTree":
        param["boosting_type"] = trial.suggest_categorical("boosting_type", ["Ordered", "Plain"])
    else:
        param["boosting_type"] = "Plain"

    # Added subsample manually
    param["subsample"] = trial.suggest_float("subsample", 0.1, 1)

    ### CV ###

    # How to add a callback for pruning?
    scores = cv(train_dataset,
                param,
                fold_count=5,
                early_stopping_rounds=30,
                plot=False, verbose=False)

    return scores['test-AUC-mean'].mean()
Andrea Dalseno

2 Answers


No, because CatBoost doesn't provide a callback mechanism like the other boosting libraries do. However, CatBoost plans to introduce callbacks in the near future. Once that feature is released, Optuna may implement an integration for CatBoost like the one for LightGBM. See also the feature request on GitHub: https://github.com/optuna/optuna/issues/2464.

nzw0301
    Optuna has implemented CatBoost callback officially! https://optuna.readthedocs.io/en/latest/reference/generated/optuna.integration.CatBoostPruningCallback.html#optuna.integration.CatBoostPruningCallback – nzw0301 Mar 14 '22 at 07:49
  • This has pruning for `train_test_split`. How can I do it for cross_val_score or cross_validate, if I don't want to write a for-loop? – paradocslover Dec 11 '22 at 10:32
  • I suppose it is not possible without a for-loop, I'm afraid. – nzw0301 Dec 26 '22 at 06:38
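To flesh out the for-loop approach mentioned in the comments above, here is a rough sketch (not from either answer) that runs its own 5-fold loop and reports the running mean AUC to the trial after each fold, so the study's pruner can stop unpromising parameter sets before all folds finish. The toy dataset, the KFold split, and the trimmed-down parameter grid are placeholders, not part of the original question.

import numpy as np
import optuna
from catboost import CatBoostClassifier
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import KFold

# Toy data standing in for the real training set (placeholder for the sketch).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)


def objective(trial):
    params = {
        "iterations": trial.suggest_int("iterations", 100, 1500, step=100),
        "learning_rate": trial.suggest_float("learning_rate", 0.001, 0.3),
        "depth": trial.suggest_int("depth", 1, 12),
        "loss_function": "Logloss",
        "eval_metric": "AUC",
        "verbose": False,
    }

    fold_scores = []
    kf = KFold(n_splits=5, shuffle=True, random_state=0)
    for fold, (train_idx, valid_idx) in enumerate(kf.split(X)):
        model = CatBoostClassifier(**params)
        model.fit(
            X[train_idx], y[train_idx],
            eval_set=[(X[valid_idx], y[valid_idx])],
            early_stopping_rounds=30,
        )
        preds = model.predict_proba(X[valid_idx])[:, 1]
        fold_scores.append(roc_auc_score(y[valid_idx], preds))

        # Report the running mean AUC after each fold; the pruner can then
        # stop clearly bad parameter sets without training the remaining folds.
        trial.report(float(np.mean(fold_scores)), step=fold)
        if trial.should_prune():
            raise optuna.TrialPruned()

    return float(np.mean(fold_scores))


study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)

This prunes at fold granularity rather than per boosting iteration, which is coarser than the callback-based approach below but works with any cross-validation scheme.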

Yes, CatBoost now supports pruning with Optuna. Adding to @nzw0301's comment, see Optuna's example of pruning a CatBoost model here:

Optuna Documentation - Pruning a CatBoost Model
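For completeness, here is a minimal sketch in the spirit of that example, adapted to the AUC/Logloss setup from the question. It assumes a plain train/validation split; the breast-cancer dataset, the split ratio, and the trimmed parameter grid are placeholders rather than part of the original code.

import optuna
from catboost import CatBoostClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split


def objective(trial):
    # Stand-in dataset and split for the sketch.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.25, random_state=0)

    params = {
        "iterations": trial.suggest_int("iterations", 100, 1500, step=100),
        "learning_rate": trial.suggest_float("learning_rate", 0.001, 0.3),
        "depth": trial.suggest_int("depth", 1, 12),
        "loss_function": "Logloss",
        "eval_metric": "AUC",  # the metric the pruning callback watches on the eval set
    }

    model = CatBoostClassifier(**params)

    # Reports the eval metric to Optuna after each boosting iteration and
    # asks the study's pruner whether the trial should be stopped.
    pruning_callback = optuna.integration.CatBoostPruningCallback(trial, "AUC")
    model.fit(
        X_train, y_train,
        eval_set=[(X_valid, y_valid)],
        callbacks=[pruning_callback],
        early_stopping_rounds=30,
        verbose=0,
    )

    # Raise TrialPruned if the callback decided to prune during training.
    pruning_callback.check_pruned()

    preds = model.predict_proba(X_valid)[:, 1]
    return roc_auc_score(y_valid, preds)


study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)

Note that, as far as I know, CatBoost does not support user-defined fit callbacks when training on GPU, so this pattern applies to CPU training.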

K. Thorspear