Python int too large to convert to C long - when running the RandomizedSearchCV method

Posted on 2025-02-05 02:10:10

I am trying to do hyperparameter tuning for an XGBClassifier and I'm getting this error ("Python int too large to convert to C long") while fitting with RandomizedSearchCV. I have no idea why this error occurs: I'm not using any number greater than the limit, X_train_mms only has values between 0 and 1, and the y_train values are binary (0 and 1). Does anyone know how to solve this issue?

Here is the code:

    # Imports assumed by the snippet below
    import numpy as np
    import pandas as pd
    from sklearn.preprocessing import MinMaxScaler
    from sklearn.model_selection import StratifiedKFold, RandomizedSearchCV
    from sklearn.metrics import matthews_corrcoef, make_scorer
    from xgboost import XGBClassifier

    # Preprocessing: fit the scaler on X and scale every split with it
    scaler = MinMaxScaler()
    X_mms = pd.DataFrame(scaler.fit_transform(X), index=X.index, columns=X.columns)
    X_train_mms = pd.DataFrame(scaler.transform(X_train), index=X_train.index, columns=X_train.columns)
    X_test_mms = pd.DataFrame(scaler.transform(X_test), index=X_test.index, columns=X_test.columns)
    X_valid_mms = pd.DataFrame(scaler.transform(X_valid), index=X_valid.index, columns=X_valid.columns)

    # Hyperparameter tuning function
    def hypertune(model_parameter):
        model = model_parameter['function']
        parameter = model_parameter['params']
        scores = {'mcc': matthews_corrcoef}
        cv = StratifiedKFold(n_splits=10)
        search = RandomizedSearchCV(model, parameter, n_iter=100,
                                    scoring=make_scorer(scores['mcc']), cv=cv,
                                    random_state=42, return_train_score=True)
        search.fit(X_train_mms, y_train)
        # Collect the cross-validation results into a DataFrame
        attr = {}
        attr['rank'] = search.cv_results_['rank_test_score']
        attr['test_means'] = search.cv_results_['mean_test_score']
        attr['test_stds'] = search.cv_results_['std_test_score']
        attr['train_means'] = search.cv_results_['mean_train_score']
        attr['train_stds'] = search.cv_results_['std_train_score']
        attr['params'] = search.cv_results_['params']
        attributes = pd.DataFrame(attr)
        return attributes

    # Search space
    parameter = {'gamma': np.concatenate((np.arange(0.0001, 0.001, 0.0001),
                                          np.arange(0.001, 0.01, 0.001),
                                          np.arange(0.01, 0.1, 0.01),
                                          np.arange(0.1, 1, 0.1),
                                          list(range(1, 11))), axis=None),
                 'learning_rate': np.arange(0.01, 10, 0.01),
                 'max_depth': list(range(1, 21)),
                 'n_estimators': [int(x) for x in np.linspace(start=50, stop=1000, num=20)],
                 'reg_alpha': np.arange(0, 10, 0.1),
                 'reg_lambda': np.arange(0, 10, 0.1)}

    xgbooster = {'name': 'XGBooster', 'function': XGBClassifier(), 'params': parameter}

    optimized_xgb = hypertune(xgbooster)
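
For context, this error message usually means that an integer is being converted to a 32-bit C long somewhere in the stack (a C long is 32 bits on Windows builds of CPython even on 64-bit machines), and NumPy scalar types coming out of np.arange/np.concatenate are one common way for a value to reach such a conversion. Below is a minimal, hedged sketch of a check worth running, not a confirmed fix: print the platform's C long width, and see whether the error persists once the search space holds only built-in Python numbers. The to_builtin helper is hypothetical, not part of scikit-learn or XGBoost.

    # Hedged diagnostic sketch, not a confirmed fix: check whether this
    # interpreter's C long is 32 bits (typical on Windows) and strip NumPy
    # scalar types out of the search space before rerunning hypertune().
    import ctypes
    import numpy as np

    # 32 on Windows, 64 on most Linux/macOS builds
    print("C long width (bits):", ctypes.sizeof(ctypes.c_long) * 8)

    def to_builtin(space):
        """Copy of a parameter grid with NumPy scalars replaced by built-in int/float."""
        return {key: [v.item() if isinstance(v, np.generic) else v for v in values]
                for key, values in space.items()}

    # 'parameter', 'XGBClassifier' and 'hypertune' are the names from the code above
    parameter = to_builtin(parameter)
    xgbooster = {'name': 'XGBooster', 'function': XGBClassifier(), 'params': parameter}
    optimized_xgb = hypertune(xgbooster)

If the reported width is 32, an overflow in a 32-bit conversion is at least plausible; if it is 64, the cause is probably elsewhere and the full traceback would help pin it down.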
