How do I print the hyperparameters of a fitted XGBoost model?

Posted 2025-01-13 10:18:59


If I fit an XGBoost model on data and set none of the parameters (all are defaults), how do I then print those settings?

xgb_outofbox = XGBClassifier(random_state=0).fit(X_train, y_train)

I'm looking to call something like xgb_outofbox.params_, but that doesn't work. I can't find any answers to this very simple question.


3 Answers

Answered by 已下线请稍等 on 2025-01-20 10:18:59


TL;DR
Call xgb_outofbox.get_params() (documented in the XGBoost scikit-learn API reference).

The Details
Say you have a model named xgb_outofbox.

With data:

X_train = np.random.random((1000, 10))
y_train = np.random.randint(2, size=1000)

It's a classifier: XGBClassifier()

And you provide it the following parameters:

params = {"objective": "binary:logistic",
          "max_depth": 7,
          "learning_rate": 0.1,
          "n_estimators": 50}

Such that you create the classifier:
xgb_outofbox = XGBClassifier(**params)

And then fit the data: xgb_outofbox.fit(X_train, y_train)

You would then be able to print out the parameters as follows:
print(xgb_outofbox.get_params())

Altogether the code could look like this:

import numpy as np
from xgboost import XGBClassifier

# generate data
X_train = np.random.random((1000, 10))
y_train = np.random.randint(2, size=1000)

# hyperparameter dictionary
params = {"objective": "binary:logistic",
          "max_depth": 7,
          "learning_rate": 0.1,
          "n_estimators": 50}

# unpack hyperparameters into classifier
xgb_outofbox = XGBClassifier(**params)

# fit the model
xgb_outofbox.fit(X_train, y_train)

# get the parameters
print(xgb_outofbox.get_params())
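For intuition, get_params() is not XGBoost-specific: it comes from the scikit-learn estimator convention that XGBClassifier follows, where the method reports the values of the constructor's keyword arguments. A minimal sketch, using a hypothetical TinyEstimator class with hard-coded parameter names (real estimators introspect __init__ instead):

```python
# Minimal sketch of the scikit-learn estimator convention that
# XGBClassifier follows: get_params() reports the keyword arguments
# the estimator was constructed with.
class TinyEstimator:
    def __init__(self, max_depth=7, learning_rate=0.1):
        self.max_depth = max_depth
        self.learning_rate = learning_rate

    def get_params(self):
        # Real estimators discover these names by introspecting
        # __init__; hard-coding them keeps the sketch short.
        return {"max_depth": self.max_depth,
                "learning_rate": self.learning_rate}

est = TinyEstimator(max_depth=3)
print(est.get_params())  # {'max_depth': 3, 'learning_rate': 0.1}
```

This is also why get_params() works on a default-constructed model: the defaults are just the keyword arguments you never overrode.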
Answered by 伪装你 on 2025-01-20 10:18:59


This can also be done by printing the model object itself:

print(xgb_outofbox)

In an interactive session (REPL or notebook), simply evaluating xgb_outofbox displays the same representation.
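This works because scikit-learn-style estimators, XGBClassifier included, implement __repr__ to render their constructor parameters. A minimal sketch, using a hypothetical EchoModel class as a stand-in:

```python
# Sketch of why printing the model shows its settings:
# scikit-learn-style estimators implement __repr__ to render
# their constructor parameters.
class EchoModel:
    def __init__(self, random_state=0):
        self.random_state = random_state

    def __repr__(self):
        return f"EchoModel(random_state={self.random_state})"

model = EchoModel(random_state=42)
print(model)  # EchoModel(random_state=42)
```

Note that recent scikit-learn/XGBoost versions may abbreviate the repr to show only non-default parameters, so get_params() remains the reliable way to see everything.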
Answered by 打小就很酷 on 2025-01-20 10:18:59


Use get_xgb_params:

print(xgb_outofbox.get_xgb_params())

It returns the XGBoost-specific parameters, excluding the scikit-learn wrapper arguments that get_params() also reports.

The output would be something like this:

{'objective': 'binary:logistic', 'base_score': None, 'booster': None, 'colsample_bylevel': None, 'colsample_bynode': None, 'colsample_bytree': None, 'gamma': None, 'gpu_id': None, 'interaction_constraints': None, 'learning_rate': None, 'max_delta_step': None, 'max_depth': None, 'min_child_weight': None, 'monotone_constraints': None, 'n_jobs': None, 'num_parallel_tree': None, 'random_state': None, 'reg_alpha': None, 'reg_lambda': None, 'scale_pos_weight': None, 'subsample': None, 'tree_method': None, 'validate_parameters': None, 'verbosity': None}
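As the output shows, parameters you never set come back as None (XGBoost substitutes its internal defaults at training time). If you only want the parameters that were set explicitly, a dict comprehension filters out the None entries; the raw dictionary below is a trimmed, hypothetical stand-in for the real return value:

```python
# Hypothetical, trimmed stand-in for the dictionary returned by
# get_xgb_params(): unset values come back as None.
raw = {"objective": "binary:logistic",
       "max_depth": None,
       "learning_rate": None}

# Keep only the parameters that were set explicitly.
explicit = {k: v for k, v in raw.items() if v is not None}
print(explicit)  # {'objective': 'binary:logistic'}
```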