
Generalized Linear Models

1. Ordinary Least Squares

Linear regression: LinearRegression

Least-squares regression solves for the coefficients that minimize the residuals:
LinearRegression fits a linear model with coefficients w = (w_1, …, w_p) to minimize the residual sum of squares between the observed responses in the dataset, and the responses predicted by the linear approximation.

LinearRegression will take in its fit method arrays X, y and will store the coefficients w of the linear model in its coef_ member:

Algorithmic complexity

This method computes the least-squares solution using the singular value decomposition of X. If X is a matrix of size (n, p), the cost of this method is O(n p²), assuming n ≥ p.
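The SVD-based solve described above can be sketched directly in NumPy (a minimal illustration with made-up data; scikit-learn's actual implementation delegates to scipy.linalg.lstsq):

```python
import numpy as np

# Toy design matrix (n, p) and targets; y = 2*x + 1 exactly,
# so the least-squares fit should recover w = [2, 1].
X = np.array([[0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([1.0, 3.0, 5.0])

# Thin SVD of X; for n >= p this costs O(n p^2).
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Least-squares solution via the pseudo-inverse: w = V diag(1/s) U^T y
w = Vt.T @ ((U.T @ y) / s)
print(w)  # -> approximately [2. 1.]
```

The same result is returned by np.linalg.lstsq(X, y), which is the standard library entry point for this computation.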

Example

from sklearn import linear_model
reg = linear_model.LinearRegression()
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
reg.coef_              # coefficients
reg.intercept_         # intercept
reg.predict([[3, 3]])  # predict on new samples (input here is illustrative)

2. Ridge Regression

Compared with LinearRegression, Ridge takes an additional alpha parameter when the model is built with sklearn's linear_model:
from sklearn import linear_model
reg = linear_model.Ridge(alpha=0.5)
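To see what alpha does, the snippet below fits Ridge on the same toy data as the LinearRegression example with a few alpha values (the values 0.0, 0.5, and 10.0 are illustrative); larger alpha shrinks the coefficients toward zero:

```python
from sklearn import linear_model

# Same toy data as the LinearRegression example above.
X = [[0, 0], [1, 1], [2, 2]]
y = [0, 1, 2]

# As the regularization strength alpha grows, the fitted
# coefficients shrink toward zero.
for alpha in (0.0, 0.5, 10.0):
    reg = linear_model.Ridge(alpha=alpha)
    reg.fit(X, y)
    print(alpha, reg.coef_, reg.intercept_)
```

With alpha = 0 Ridge reduces to ordinary least squares; the choice of alpha is a bias-variance trade-off and is usually tuned by cross-validation (e.g. with RidgeCV).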
