
I am trying to use the XGBClassifier wrapper provided by sklearn for a multiclass problem. My classes are [0, 1, 2], and the objective I use is multi:softmax. When I try to fit the classifier, I get:

xgboost.core.XGBoostError: value 0 for Parameter num_class should be greater equal to 1

If I try to set the num_class parameter myself, then I get the error:

got an unexpected keyword argument 'num_class'

The sklearn wrapper sets this parameter automatically, so I am not supposed to pass that argument myself. But then why do I get the first error?

LetsPlayYahtzee

4 Answers


You need to add the num_class parameter to xgb_param manually:

    # model is an XGBClassifier; xgb is the imported xgboost module
    xgb_param = model.get_xgb_params()
    xgb_param['num_class'] = 3
    cvresult = xgb.cv(xgb_param, ...)

XGBClassifier sets this value automatically when you use its fit method, but the standalone xgb.cv function does not.

mxdbld

In my case, the same error was thrown during a regular fit call. The root of the issue was that the objective was manually set to multi:softmax, but there were only 2 classes. Changing it to binary:logistic solved the problem.
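
As a sanity check before building the model, you can pick the objective from the number of distinct labels. This is a minimal sketch (the helper name is my own, not part of XGBoost):

```python
import numpy as np

def pick_objective(y):
    """Choose an XGBoost objective from the label count (hypothetical helper).

    With exactly two classes, binary:logistic needs no num_class;
    with more, multi:softmax requires num_class to be set.
    """
    n_classes = len(np.unique(y))
    if n_classes == 2:
        return {"objective": "binary:logistic"}
    return {"objective": "multi:softmax", "num_class": n_classes}
```

The returned dict can then be merged into the parameters you pass to xgb.cv or the booster.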

Czyzby

Are you using the xgboost.cv function? I encountered the same problem and found a solution. Here is my code:

    xgb_param = model.get_xgb_params()
    extra = {'num_class': 3}
    xgb_param.update(extra)
    cvresult = xgb.cv(xgb_param, xgtrain, ...)

xgb_param is the dictionary of XGBoost model parameters. I merge the extra dict into it to specify num_class, then pass the updated dict to the cv function. This works.

Bruce Jinru Su

In xgboost version 1.4.2, what worked for me was including num_class as a parameter to the regressor with a value equal to the number of targets/outputs.

    import xgboost as xgb

    params = {"objective": "multi:softmax", "num_class": 3}
    model = xgb.XGBRegressor(**params)
janeon