2025-04-03 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)05/31 Report--
This article explains how to implement the ordinary least squares method in Python. The editor finds it very practical and shares it here for your reference; follow along to have a look.
Generalized linear regression model:

y^(w, x) = w0 + w1*x1 + ... + wp*xp

Here the vector w = (w1, ..., wp) is the coefficient vector (coef_), and w0 is the intercept (intercept_).
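As a minimal sketch of how scikit-learn exposes these two attributes (the tiny dataset below is made up for illustration), fitting an exact linear relationship recovers the coefficients and intercept directly:

```python
import numpy as np
from sklearn import linear_model

# Made-up, noise-free data following y = 1 + 2*x1 + 3*x2
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1]

reg = linear_model.LinearRegression()
reg.fit(X, y)

print(reg.coef_)       # the coefficient vector (w1, ..., wp)
print(reg.intercept_)  # the intercept w0
```

Because the data are exact, coef_ comes back as approximately [2, 3] and intercept_ as approximately 1.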
1. Ordinary least squares method (Ordinary Least Squares)
The purpose of linear regression is to minimize the residual sum of squares between the predicted values and the actual values:

min_w ||Xw - y||_2^2
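This minimization can be sketched directly in NumPy via the normal equations (the data below are randomly generated for illustration); the key property is that the closed-form solution really does minimize the residual sum of squares:

```python
import numpy as np

# Made-up data: two features, known weights, small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = X @ np.array([2.0, -1.0]) + 0.5 + rng.normal(scale=0.1, size=50)

# Prepend a column of ones so the intercept is estimated too
A = np.column_stack([np.ones(len(X)), X])

# Closed-form OLS via the normal equations: w = (A^T A)^{-1} A^T y
w = np.linalg.solve(A.T @ A, A.T @ y)

# Perturbing w away from the solution increases the residual sum of squares
rss = np.sum((A @ w - y) ** 2)
rss_perturbed = np.sum((A @ (w + 0.01) - y) ** 2)
print(rss <= rss_perturbed)  # True
```

Note that solving the normal equations directly is fine for a sketch, but scikit-learn uses a more numerically stable SVD-based solver internally.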
import matplotlib.pyplot as plt
import numpy as np
# Load the datasets module and the linear models from sklearn
from sklearn import datasets, linear_model

# Get the diabetes dataset
diabetes = datasets.load_diabetes()

# Use only one of the features; np.newaxis adds a dimension so X stays 2-D
diabetes_X = diabetes.data[:, np.newaxis, 2]

# Split the X variable into training and test sets
diabetes_X_train = diabetes_X[:-20]
diabetes_X_test = diabetes_X[-20:]

# Split the y target variable into training and test sets
diabetes_y_train = diabetes.target[:-20]
diabetes_y_test = diabetes.target[-20:]

# Create the linear regression object
regr = linear_model.LinearRegression()

# Train the model on the training data
regr.fit(diabetes_X_train, diabetes_y_train)

# View the coefficients
print('Coefficients: \n', regr.coef_)

# View the mean of the squared residuals (mean squared error, MSE)
print("Residual sum of squares: %.2f"  # % is string formatting
      % np.mean((regr.predict(diabetes_X_test) - diabetes_y_test) ** 2))

# Explained variance score (R^2); 1 is a perfect prediction.
# R^2 = 1 - u/v, where u is the residual sum of squares,
#   u = ((y_true - y_pred) ** 2).sum(),
# and v is the total sum of squares,
#   v = ((y_true - y_true.mean()) ** 2).sum()
print('Variance score: %.2f' % regr.score(diabetes_X_test, diabetes_y_test))

# Plot the test points
plt.scatter(diabetes_X_test, diabetes_y_test, color='black')
# Plot the fitted regression line
plt.plot(diabetes_X_test, regr.predict(diabetes_X_test), color='blue', linewidth=3)

# Remove the x-axis ticks
plt.xticks(())
# Remove the y-axis ticks
plt.yticks(())
plt.show()
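To see that the R^2 reported by regr.score really follows the formula R^2 = 1 - u/v given in the comments above, it can be recomputed by hand on the same train/test split (plotting omitted here):

```python
import numpy as np
from sklearn import datasets, linear_model

# Same setup as the example: one feature of the diabetes dataset
diabetes = datasets.load_diabetes()
X = diabetes.data[:, np.newaxis, 2]
X_train, X_test = X[:-20], X[-20:]
y_train, y_test = diabetes.target[:-20], diabetes.target[-20:]

regr = linear_model.LinearRegression()
regr.fit(X_train, y_train)
y_pred = regr.predict(X_test)

# u: residual sum of squares; v: total sum of squares
u = ((y_test - y_pred) ** 2).sum()
v = ((y_test - y_test.mean()) ** 2).sum()
r2_manual = 1 - u / v

print(np.isclose(r2_manual, regr.score(X_test, y_test)))  # True
```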
Computational complexity of the ordinary least squares method
This method computes the least-squares solution using the singular value decomposition (SVD) of X. If X is a matrix of shape (n, p), the cost is O(n * p^2), assuming that n >= p.
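The SVD route can be sketched with NumPy (random data, made up for illustration): the least-squares solution is recovered from the thin SVD of X via the pseudoinverse, and it matches NumPy's own least-squares solver.

```python
import numpy as np

# Random full-rank design matrix with n >= p, as the cost estimate assumes
rng = np.random.default_rng(42)
n, p = 100, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=n)

# Thin SVD of X: X = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Least-squares solution via the pseudoinverse: w = V @ diag(1/s) @ U^T @ y
w_svd = Vt.T @ ((U.T @ y) / s)

# Same answer as numpy's dedicated least-squares solver
w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(w_svd, w_lstsq))  # True
```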
Thank you for reading! That concludes this article on how to implement the ordinary least squares method in Python. I hope the content above is helpful and lets you learn something new. If you think the article is good, please share it so more people can see it!