
How to write the basic code of Java linear regression

2025-01-17 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/02 Report --

This article mainly introduces how to write basic linear regression code. Many people have doubts about this topic in their daily work, so the editor has consulted various materials and put together a simple, easy-to-follow walkthrough, in the hope of answering those doubts. One note up front: despite the title, the sample code is written in Python, using scikit-learn and NumPy to fit a linear model and then reproducing the fit by hand with gradient descent. Next, please follow the editor to study!
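
For reference, here are the cost function and the two partial derivatives that the code below computes, for m training points and the linear model h(x) = theta1*x + theta0:

$$J(\theta_0,\theta_1)=\frac{1}{2m}\sum_{i=1}^{m}\left(\theta_1 x_i+\theta_0-y_i\right)^2$$

$$\frac{\partial J}{\partial \theta_0}=\frac{1}{m}\sum_{i=1}^{m}\left(\theta_1 x_i+\theta_0-y_i\right),\qquad \frac{\partial J}{\partial \theta_1}=\frac{1}{m}\sum_{i=1}^{m}\left(\theta_1 x_i+\theta_0-y_i\right)x_i$$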

# Use a linear model to fit this data.
# pga is assumed to be a pandas DataFrame with 'distance' and 'accuracy'
# columns; load it before running this code.
from sklearn.linear_model import LinearRegression
import numpy as np
import matplotlib.pyplot as plt

lr = LinearRegression()
# Another way is using pga[['distance']]
lr.fit(pga.distance.values[:, np.newaxis], pga['accuracy'])
theta0 = lr.intercept_
theta1 = lr.coef_
print(theta0)
print(theta1)

# Cost function: calculates the average cumulative squared error
# for a given theta0 and theta1
def cost(x, y, theta0, theta1):
    J = 0
    for i in range(len(x)):
        mse = (x[i] * theta1 + theta0 - y[i]) ** 2
        J += mse
    return J / (2 * len(x))

# Plot the cost over a range of theta1 values, with theta0 held fixed
theta0 = 100
theta1s = np.linspace(-3, 2, 100)
costs = []
for theta1 in theta1s:
    costs.append(cost(pga['distance'], pga['accuracy'], theta0, theta1))
plt.plot(theta1s, costs)
plt.show()
print(pga.distance)

# Adjust theta: partial derivative of the cost with respect to theta0.
# Our model is the linear fitting function y = theta1*x + theta0, not a
# sigmoid, so we can operate on the whole x series at once, saving a
# calculation, and finally take the sum and average.
def partial_cost_theta0(x, y, theta0, theta1):
    h = theta1 * x + theta0
    diff = h - y
    partial = diff.sum() / len(diff)
    return partial

partial0 = partial_cost_theta0(pga.distance, pga.accuracy, 1, 1)

# Partial derivative of the cost with respect to theta1
# (same idea, with each residual multiplied by x)
def partial_cost_theta1(x, y, theta0, theta1):
    h = theta1 * x + theta0
    diff = (h - y) * x
    partial = diff.sum() / len(diff)
    return partial

partial1 = partial_cost_theta1(pga.distance, pga.accuracy, 0, 5)
print(partial0)
print(partial1)

# Gradient descent: compute the cost, adjust the weights, recompute the
# cost, and judge whether it converges or reaches the maximum number of
# iterations
def gradient_descent(x, y, alpha=0.1, theta0=0, theta1=0):  # set the default parameters
    max_iterations = 1000
    convergence_thres = 0.000001
    c = cost(x, y, theta0, theta1)
    costs = [c]
    cost_pre = c + convergence_thres + 1.0
    counter = 0
    while (np.abs(cost_pre - c) > convergence_thres) and (counter < max_iterations):
        cost_pre = c
        # assumed standard update rule: a simultaneous gradient step on
        # both parameters, then re-evaluate the cost
        update0 = alpha * partial_cost_theta0(x, y, theta0, theta1)
        update1 = alpha * partial_cost_theta1(x, y, theta0, theta1)
        theta0 -= update0
        theta1 -= update1
        c = cost(x, y, theta0, theta1)
        costs.append(c)
        counter += 1
    return {'theta0': theta0, 'theta1': theta1, 'costs': costs}
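
To see the pieces working together, here is a minimal usage sketch. It is an assumption-laden example rather than part of the original article: it presumes the pga DataFrame described above, and it standardizes the inputs first, since gradient descent with alpha=0.1 on raw yardage values is likely to diverge.

# Minimal usage sketch; assumes pga and the functions defined above.
# Standardizing the inputs is an added preprocessing step (an assumption,
# not shown in the original listing).
x = (pga.distance - pga.distance.mean()) / pga.distance.std()
y = (pga.accuracy - pga.accuracy.mean()) / pga.accuracy.std()

result = gradient_descent(x, y, alpha=0.1)
print(result['theta0'], result['theta1'])

# Plot the cost per iteration to confirm the descent actually converged
plt.plot(range(len(result['costs'])), result['costs'])
plt.xlabel('iteration')
plt.ylabel('cost')
plt.show()

On standardized data the fitted slope equals the Pearson correlation between distance and accuracy and the intercept is close to zero, so the result is easy to sanity-check; the cost curve should flatten out once the convergence threshold is reached.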
