Polynomial regression with sklearn returning wrong results

Published 2025-02-12 12:26:30

I am trying to do a polynomial regression using the Python sklearn library, but the result I get is very different from the one I get from Excel.

Code:

import matplotlib.pyplot as plt
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression


def polynomial_regression(x_param, y_param):
    """create a polynomial regression graph"""
    print(x_param)
    print(y_param)
    # convert x_param features to a numpy array
    x_param = np.array(x_param)

    # save a PolynomialFeatures with degree of 3
    poly = PolynomialFeatures(degree=3, include_bias=False)

    # we fit and transform the numpy array x_param
    poly_features = poly.fit_transform(x_param.reshape(-1, 1))

    # create a LinearRegression instance
    poly_reg_model = LinearRegression()

    # we fit our model to our data
    # which means we train our models by introducing poly_features and y_params values
    poly_reg_model.fit(poly_features, y_param)

    # predict the response 'y_predicted' based on the poly_features and the coef it estimated
    y_predicted = poly_reg_model.predict(poly_features)

    # visualising our model
    plt.figure(figsize=(10, 6))
    plt.title(f"Polynomial regression, coef={poly_reg_model.coef_}", size=16)
    plt.scatter(x_param, y_param)
    plt.plot(x_param, y_predicted, c="red")
    plt.show()

Result: (plot not shown)

Expected result: (plot not shown)

Are the results supposed to look like this? If so, why? If not, what am I doing wrong?
Thanks in advance for your help.

Comments (1)

勿忘初心 2025-02-19 12:26:30

@Adam Demo_Fighter: Let me maybe post a solution to clarify the remark by @Andrey Lukyanenko.

The issue is plt.plot(x_param, y_predicted, c="red"). This plot command connects consecutive points from x_param and y_predicted with line segments. If the values in x_param are not monotonic, this creates the zig-zag pattern that appears in your plot. The solution is simply to sort the x values (together with their y values) before carrying out the analysis.
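To see the plotting effect in isolation, here is a minimal sketch (toy values chosen purely for illustration, not the data from the question): the same points are drawn twice, once in the order given and once after sorting by x.

import matplotlib.pyplot as plt
import numpy as np

# Hypothetical toy points, deliberately given with unsorted x values.
x_unsorted = np.array([3, 1, 4, 2, 5])
y_values = x_unsorted ** 2

fig, (left, right) = plt.subplots(1, 2, figsize=(8, 3))

# plt.plot connects the points in the order they are passed in,
# so unsorted x values produce a zig-zag line.
left.plot(x_unsorted, y_values, c="red")
left.set_title("unsorted x")

# Sorting by x first gives the smooth curve you would expect.
order = np.argsort(x_unsorted)
right.plot(x_unsorted[order], y_values[order], c="red")
right.set_title("sorted x")

plt.show()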

import matplotlib.pyplot as plt
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Original data
xraw = [3, 6, 6, 9, 12, 15, 3, 6, 13, 9, 13, 16]
yraw = [9, 12, 9, 9, 12, 9, 6, 3, 8, 3, 1, 3]
# Order the data by x:
OrderedID = np.argsort(xraw)  # indices that would sort xraw in ascending order
x = np.array(xraw)[OrderedID]
y = np.array(yraw)[OrderedID]

print(x)
print(y)

def polynomial_regression(x_param, y_param):
    """create a polynomial regression graph"""
    # save a PolynomialFeatures with degree of 3
    poly = PolynomialFeatures(degree=3, include_bias=False)
    print(poly)

    # we fit and transform the numpy array x_param
    poly_features = poly.fit_transform(x_param.reshape(-1, 1))

    # create a LinearRegression instance
    poly_reg_model = LinearRegression()

    # we fit our model to our data
    # which means we train our models by introducing poly_features and y_params values
    temp = poly_reg_model.fit(poly_features, y_param)
    print(temp)

    # predict the response 'y_predicted' based on the poly_features and the coef it estimated
    y_predicted = poly_reg_model.predict(poly_features)

    # visualising our model
    plt.figure(figsize=(10, 6))
    plt.title(f"Polynomial regression, coef={poly_reg_model.coef_}", size=16)
    plt.scatter(x_param, y_param)
    plt.plot(x_param, y_predicted, c="red")
    plt.show()

polynomial_regression(x, y)

PS1: I converted the data to np.array outside of the function.

PS2: Nice profile pic =).
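
A variation on the same idea, sketched below on the same xraw/yraw values (assuming the only goal is a smooth plotted curve): fit on the data in whatever order it comes, and evaluate the model on a dense, sorted grid just for drawing the line. Least squares does not depend on row order, so the coefficients are the same; only the plotting changes.

import matplotlib.pyplot as plt
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

xraw = [3, 6, 6, 9, 12, 15, 3, 6, 13, 9, 13, 16]
yraw = [9, 12, 9, 9, 12, 9, 6, 3, 8, 3, 1, 3]
x = np.array(xraw)
y = np.array(yraw)

# Fit on the data as given; the order of the rows does not affect the fit.
poly = PolynomialFeatures(degree=3, include_bias=False)
poly_features = poly.fit_transform(x.reshape(-1, 1))
poly_reg_model = LinearRegression()
poly_reg_model.fit(poly_features, y)

# Evaluate the fitted polynomial on a dense, sorted grid just for plotting.
x_grid = np.linspace(x.min(), x.max(), 200)
y_grid = poly_reg_model.predict(poly.transform(x_grid.reshape(-1, 1)))

plt.figure(figsize=(10, 6))
plt.title(f"Polynomial regression, coef={poly_reg_model.coef_}", size=16)
plt.scatter(x, y)
plt.plot(x_grid, y_grid, c="red")
plt.show()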
