Optuna LightGBM LightGBMPruningCallback



I am getting an error in my LightGBM modeling while searching for the optimal AUC. Any help would be appreciated.

import optuna
import numpy as np
import lightgbm as lgb
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import log_loss, roc_auc_score
from optuna.integration import LightGBMPruningCallback

def objective(trial, X, y):
    param = {
        "objective": "binary",
        "metric": "auc",
        "verbosity": -1,
        "boosting_type": "gbdt",
        "lambda_l1": trial.suggest_loguniform("lambda_l1", 1e-8, 10.0),
        "lambda_l2": trial.suggest_loguniform("lambda_l2", 1e-8, 10.0),
        "num_leaves": trial.suggest_int("num_leaves", 2, 256),
        "feature_fraction": trial.suggest_uniform("feature_fraction", 0.4, 1.0),
        "bagging_fraction": trial.suggest_uniform("bagging_fraction", 0.4, 1.0),
        "bagging_freq": trial.suggest_int("bagging_freq", 1, 7),
        "min_child_samples": trial.suggest_int("min_child_samples", 5, 100),
    }


    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=1121218)

    cv_scores = np.empty(5)
    auc_scores = np.empty(5)
    for idx, (train_idx, test_idx) in enumerate(cv.split(X, y)):
        X_train, X_test = X.iloc[train_idx], X.iloc[test_idx]
        y_train, y_test = y[train_idx], y[test_idx]

        pruning_callback = optuna.integration.LightGBMPruningCallback(trial, "auc")
        
        model = lgb.LGBMClassifier(**param)
        
        model.fit(
            X_train,
            y_train,
            eval_set=[(X_test, y_test)],
            early_stopping_rounds=100,
            callbacks=[pruning_callback])
        
        # predict_proba returns class probabilities; roc_auc_score needs the positive-class column
        preds = model.predict_proba(X_test)
        cv_scores[idx] = log_loss(y_test, preds)
        auc_scores[idx] = roc_auc_score(y_test, preds[:, 1])
        
    return np.mean(cv_scores), np.mean(auc_scores)
    


study = optuna.create_study(direction="minimize", study_name="LGBM Classifier")
func = lambda trial: objective(trial, sample_df[cols_to_keep], sample_df[target])

study.optimize(func, n_trials=1)

Trial 0 failed because of the following error: ValueError('The intermediate values are inconsistent with the objective values in terms of study directions. Please specify a metric to be minimized for LightGBMPruningCallback.')


Comments (1)

少女情怀诗 2025-01-25 06:33:43


Your objective function returns two values but you specify only one direction when creating the study. Try this:

study = optuna.create_study(directions=["minimize", "maximize"], study_name="LGBM Classifier")
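For reference, a minimal sketch of how the two-value objective pairs with a two-direction study. This assumes the objective above keeps returning the pair (mean log loss, mean AUC) in that order, and that sample_df, cols_to_keep and target are defined as in the question:

# Multi-objective study: minimize log loss, maximize AUC
# (the order of directions must match the order of values returned by objective)
study = optuna.create_study(
    directions=["minimize", "maximize"],
    study_name="LGBM Classifier",
)

func = lambda trial: objective(trial, sample_df[cols_to_keep], sample_df[target])
study.optimize(func, n_trials=1)

# A multi-objective study has no single best trial; inspect the Pareto-optimal trials instead
for trial in study.best_trials:
    print(trial.number, trial.values, trial.params)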