
I ran into a strange problem. I defined my XGBoost hyper-parameter 'max_depth' with hyperopt:

hp.choice('max_depth',range(2,20))

But I got 'max_depth' = 0 or 1 in the results, which is outside the [2, 20) range. Why? Did I miss anything? Thanks.

Sample results showing the problem:

{'colsample_bytree': 0.18, 'learning_rate': 0.05, 'max_depth': 1, 'reg_alpha': 3.44, 'reg_lambda': 0.92}

{'colsample_bytree': 0.41, 'learning_rate': 0.09, 'max_depth': 0, 'reg_alpha': 0.14, 'reg_lambda': 3.53}

{'colsample_bytree': 0.71, 'learning_rate': 0.17, 'max_depth': 0, 'reg_alpha': 2.21, 'reg_lambda': 2.82}
My code:

def xgb_classifier_tune(params):
    obj='binary:logistic' if class_nums==2 else 'multi:softmax'
    random.seed(time.time())
    xgb_model=xgb.XGBClassifier(
            max_depth=params['max_depth'],
            colsample_bytree=params['colsample_bytree'],
            learning_rate=params['learning_rate'],
            reg_alpha=params['reg_alpha'],
            reg_lambda=params['reg_lambda'],
            objective=obj,
            n_estimators=100000,
            random_state=random.randint(0,99999),
            n_jobs=-1)

    if params['max_depth']<2:
        return {'loss':999.999,'status': STATUS_FAIL,'info':[0,0,0,{}]}
    xgb_model.fit(tune_train_x,tune_train_y,eval_set=[(tune_valid_x,tune_valid_y)],verbose=1,early_stopping_rounds=100) #verbose: 0 (silent), 1 (warning), 2 (info), 3 (debug)
    predict_y=xgb_model.predict(tune_valid_x)
    f1,mcc,roc_auc,table=get_score(tune_valid_y[y_feature].values,predict_y)
    return {'loss': -mcc, 'status': STATUS_OK}

def xgb_hyper_tune():
    mdep=list(range(2,20))
    space={'max_depth':hp.choice('max_depth',mdep),
        'colsample_bytree':hp.uniform('colsample_bytree',0.1,0.9),
        'learning_rate':hp.quniform('learning_rate',0.01,0.2,0.01),
        'reg_alpha':hp.uniform('reg_alpha',0.1,6.0),
        'reg_lambda':hp.uniform('reg_lambda',0.1,6.0)}

    trials=Trials()
    best_param=fmin(xgb_classifier_tune,space,algo=tpe.suggest,max_evals=100, trials=trials)
    return best_param
barbsan
Ayung

2 Answers


I also faced the same problem. It is not an error.

The official hyperopt documentation does not make this clear, but the answer by zilin xiang and this very nice explanation by ncalik helped me understand why the problem occurs and how to solve it.

Why the Problem Occurs

It happens because hp.choice() returns the index of the chosen option, and fmin() keeps those indices unless you set return_argmin=False. Since you defined max_depth over range(2, 20), an index of 0 actually means max_depth = 2, and an index of 1 means max_depth = 3.

Solving the Problem

If you want the actual parameter values instead of indices, you can use either of two approaches.

  1. Use hyperopt.space_eval(): hyperopt.space_eval(space, best_param).
  2. Set return_argmin=False in fmin(): best_param=fmin(xgb_classifier_tune,space,algo=tpe.suggest,max_evals=100, trials=trials, return_argmin=False). It will then return the actual parameter values instead of the indices.
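The decoding can be sketched without running a full search. Below is a minimal illustration of what the index-to-value lookup does for an hp.choice entry; the result dict is made up to mirror the question's output, and hyperopt.space_eval performs this lookup for you:

```python
# fmin() with hp.choice returns the *index* into the options list,
# not the option itself. Hypothetical result dict for illustration:
mdep = list(range(2, 20))          # the max_depth options, as in the question
best_param = {'max_depth': 1, 'learning_rate': 0.05}  # index 1, not depth 1

# Manually map the index back to the actual value; this is what
# hyperopt.space_eval(space, best_param) does for hp.choice entries.
decoded = dict(best_param)
decoded['max_depth'] = mdep[best_param['max_depth']]

print(decoded['max_depth'])  # index 1 -> actual max_depth 3
```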

For more details, you may check this.

Md. Sabbir Ahmed

Because hp.choice returns the index of the chosen item instead of its value. For example, an index of 0 means max_depth is 2, the first element of range(2, 20).
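A tiny self-contained sketch of that index-to-value mapping (no hyperopt needed):

```python
# hp.choice('max_depth', range(2, 20)) samples an index into this list:
options = list(range(2, 20))

# What fmin() reports vs. what XGBoost should actually receive:
assert options[0] == 2   # reported index 0 -> real max_depth 2
assert options[1] == 3   # reported index 1 -> real max_depth 3
```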

Obsidian
zilin xiang