Unrecognized/invalid eval_metrics in AWS XGBoost model

Posted 2025-02-12 11:07:29

xgb.set_hyperparameters(objective='binary:logistic',num_round=100)
xgb.fit({'train': s3_input_train})

...


from sagemaker.tuner import IntegerParameter, CategoricalParameter, ContinuousParameter, HyperparameterTuner
hyperparameter_ranges = {'eta': ContinuousParameter(0, 1),
                         'min_child_weight': ContinuousParameter(1, 10),
                         'alpha': ContinuousParameter(0, 2),
                         'max_depth': IntegerParameter(1, 10),
                         'num_round': IntegerParameter(1, 300),
                         'gamma': ContinuousParameter(0, 5),
                         'lambda': ContinuousParameter(0, 1000),
                         'max_delta_step': IntegerParameter(1, 10),
                         'colsample_bylevel': ContinuousParameter(0.1, 1),
                         'colsample_bytree': ContinuousParameter(0.5, 1),
                         'subsample': ContinuousParameter(0.5, 1)}


objective_metric_name = 'validation:aucpr'

tuner = HyperparameterTuner(xgb,
                            objective_metric_name,
                            hyperparameter_ranges,
                            max_jobs=50,
                            max_parallel_jobs=3)

tuner.fit({'train': s3_input_train, 'validation': s3_input_val}, include_cls_metadata=False, wait=False)

Returns the error:


An error occurred (ValidationException) when calling the CreateHyperParameterTuningJob operation: The objective metric for the hyperparameter tuning job, [validation:aucpr], isn’t valid for the [811284229777.dkr.ecr.us-east-1.amazonaws.com/xgboost:latest] algorithm. Choose a valid objective metric.

The same error occurs when replacing aucpr with f1 or logloss. They are clearly listed as evaluation metrics for classification in the documentation: https://docs.aws.amazon.com/sagemaker/latest/dg/xgboost-tuning.html

What can I do to allow the f1, aucpr and logloss evaluation metrics?

Comments (1)

路弥 2025-02-19 11:07:29

While validation:auc, validation:f1 and validation:logloss are indeed evaluation metrics, they are not tunable XGBoost hyperparameters.

Please see the tunable-hyperparameters table in the documentation you linked.

Your code is trying to set the objective metric to one that is not supported.

Evaluation metrics would be input as part of the hyperparameters:

For example,

xgb.set_hyperparameters(
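    # eval_metric is set here as a static hyperparameter on the estimator, not on the tuner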
    eval_metric="auc",
    objective="binary:logistic",
    num_round=10,
    rate_drop=0.3,
    tweedie_variance_power=1.4,
)

From the doc you shared:

a learning objective function to optimize during model training

an eval_metric to use to evaluate model performance during validation
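
Putting the pieces together, a minimal sketch of the corrected flow might look like the following. It reuses the estimator xgb, the hyperparameter_ranges dict, and the S3 inputs from the question, and assumes validation:auc is among the objective metrics supported by the container (matching the eval_metric set on the estimator):

# Sketch only: `xgb`, `hyperparameter_ranges`, `s3_input_train` and `s3_input_val`
# are assumed to be defined as in the question.
xgb.set_hyperparameters(
    objective="binary:logistic",
    eval_metric="auc",   # evaluation metric goes in as a plain hyperparameter
    num_round=100,
)

tuner = HyperparameterTuner(
    xgb,
    objective_metric_name="validation:auc",   # a supported objective metric name
    hyperparameter_ranges=hyperparameter_ranges,
    max_jobs=50,
    max_parallel_jobs=3,
)

tuner.fit({"train": s3_input_train, "validation": s3_input_val},
          include_cls_metadata=False, wait=False)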