I saw that some xgboost methods take a parameter `num_boost_round`, like this:
model = xgb.cv(params, dtrain, num_boost_round=500, early_stopping_rounds=100)
Others, however, take `n_estimators`, like this:
model_xgb = xgb.XGBRegressor(n_estimators=360, max_depth=2, learning_rate=0.1)
As far as I understand, each time boosting is applied a new estimator is created. Is that not correct?
If that is so, then `num_boost_round` and `n_estimators` should be equal, right?