I need to run xgboost inside a BaggingRegressor. This is how I currently use xgboost:
import xgboost
from sklearn.ensemble import BaggingRegressor     # used further down
from sklearn.metrics import mean_squared_error    # used further down
D_train = xgboost.DMatrix(X_train, lab_train)                          # full training set
D_val = xgboost.DMatrix(X_train[test_index], lab_train[test_index])    # validation fold
D_pred = xgboost.DMatrix(X_train[test_index])                          # validation fold without labels
D_test = xgboost.DMatrix(X_test)
D_ttest = xgboost.DMatrix(ttest)
xgb_params = {
    "objective": "reg:linear",
    "eta": 0.01,
    "min_child_weight": 6,
    "subsample": 0.7,
    "colsample_bytree": 0.6,
    "scale_pos_weight": 0.8,
    "silent": 1,
    "max_depth": 10,
    "max_delta_step": 2,
}
watchlist = [(D_train, 'train')]
xg = xgboost.Booster()   # note: this Booster object is never used below
print('1000')
model = xgboost.train(params=xgb_params, dtrain=D_train, num_boost_round=1000,
                      evals=watchlist, verbose_eval=1, early_stopping_rounds=20)
y_pred1 = model.predict(D_ttest)
How can I use all the same parameters, but inside BaggingRegressor?
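My guess is that I need the scikit-learn wrapper xgboost.XGBRegressor instead of a raw Booster, with the parameters translated roughly as below (the parameter mapping, e.g. learning_rate for eta and n_estimators for num_boost_round, and the names xgb_est / bag are just my assumption), but I do not know whether this is the right direction or how to keep the watchlist and early_stopping_rounds with it:

from sklearn.ensemble import BaggingRegressor
from xgboost import XGBRegressor

# Sketch only: the same settings expressed through the sklearn wrapper,
# since, as far as I understand, BaggingRegressor expects an unfitted
# scikit-learn style estimator that it can clone and fit itself.
xgb_est = XGBRegressor(
    objective="reg:linear",
    learning_rate=0.01,        # eta
    min_child_weight=6,
    subsample=0.7,
    colsample_bytree=0.6,
    scale_pos_weight=0.8,
    max_depth=10,
    max_delta_step=2,
    n_estimators=1000,         # num_boost_round
)
bag = BaggingRegressor(base_estimator=xgb_est, n_estimators=10)  # 10 bagged copies (sklearn default)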
If I do
gdr = BaggingRegressor(base_estimator=xgboost.train(params=xgb_params,
                                                    dtrain=D_train,
                                                    num_boost_round=3000,
                                                    evals=watchlist,
                                                    verbose_eval=1,
                                                    early_stopping_rounds=20))
then xgboost starts training immediately, and then the following code
gdr_model = gdr
print(gdr_model)
gdr_model.fit(X_train, lab_train)
train_pred = gdr_model.predict(X_test)
print('mse from log: ', mean_squared_error(lab_train, train_pred))  # lab_train vs. predictions on X_test
train_pred = gdr_model.predict(ttest)
seems to make no sense, or am I mistaken? Please tell me how to solve this problem.
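If the wrapper route above is correct, I would expect the fit/predict part to look roughly like this (again only my sketch, reusing bag and the imports from above):

bag.fit(X_train, lab_train)                       # fits the bagged XGBRegressor copies on bootstrap samples
val_pred = bag.predict(X_train[test_index])       # predictions on the validation fold
print('mse on validation fold: ', mean_squared_error(lab_train[test_index], val_pred))
test_pred = bag.predict(X_test)                   # predictions on the test set
ttest_pred = bag.predict(ttest)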