In this article, the editor shares how to use a TensorFlow neural network to build a linear regression model. Many readers are not very familiar with the topic, so the article is offered for your reference; I hope you learn a lot from it. Let's get started!
Let's start with some data:
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# Randomly generate 1000 points scattered around the line y = 0.1x + 0.3
num_points = 1000
vectors_set = []
for i in range(num_points):
    # np.random.normal(mean, stdev) returns a Gaussian random number with the given mean
    # and standard deviation; if size is passed (e.g. size=100), it returns that many numbers.
    x1 = np.random.normal(0.0, 0.55)
    # The second Gaussian term is artificial noise added to the samples
    y1 = x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)
    vectors_set.append([x1, y1])

# Split the generated samples into inputs and targets
x_data = [v[0] for v in vectors_set]
y_data = [v[1] for v in vectors_set]

plt.scatter(x_data, y_data, c='r')
plt.show()

# Construct a one-dimensional w matrix whose weight is randomly initialized in [-1, 1]
w = tf.Variable(tf.random_uniform([1], -1.0, 1.0), name='w')
# Construct a one-dimensional b matrix and initialize it to 0
b = tf.Variable(tf.zeros([1]), name='b')
# The regression formula: y is the estimated value
y = w * x_data + b

# Define the loss function: the mean squared error between the estimate y and the actual y_data
loss = tf.reduce_mean(tf.square(y - y_data), name='loss')
# Optimize the parameters with gradient descent; the learning rate is 0.5
optimizer = tf.train.GradientDescentOptimizer(0.5)
# train is the optimization step; training means repeatedly minimizing the loss
train = optimizer.minimize(loss, name='train')

sess = tf.Session()
# Initialize the global variables
init = tf.global_variables_initializer()
sess.run(init)

# Print the initial w, b and loss
print('w =', sess.run(w), 'b =', sess.run(b), 'loss =', sess.run(loss))
# Run 20 training iterations
for step in range(20):
    sess.run(train)
    # Print w, b and the loss after each training step
    print('w =', sess.run(w), 'b =', sess.run(b), 'loss =', sess.run(loss))
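Note that the script above uses the TensorFlow 1.x API (tf.Session, tf.train.GradientDescentOptimizer, tf.random_uniform), so it will not run unchanged on TensorFlow 2.x. The original article does not cover this, but as a minimal sketch, assuming TensorFlow 2.x with eager execution and reusing x_data and y_data from the script above, the same model can be trained with tf.GradientTape:

import tensorflow as tf

# Assumes x_data and y_data were generated as in the script above
x = tf.constant(x_data, dtype=tf.float32)
y_true = tf.constant(y_data, dtype=tf.float32)

# Same parameters: w randomly initialized in [-1, 1], b initialized to 0
w = tf.Variable(tf.random.uniform([1], -1.0, 1.0), name='w')
b = tf.Variable(tf.zeros([1]), name='b')

# Gradient descent with the same learning rate of 0.5
optimizer = tf.keras.optimizers.SGD(learning_rate=0.5)

for step in range(20):
    with tf.GradientTape() as tape:
        y_pred = w * x + b                                   # the regression formula
        loss = tf.reduce_mean(tf.square(y_pred - y_true))    # mean squared error
    grads = tape.gradient(loss, [w, b])                      # gradients of the loss w.r.t. w and b
    optimizer.apply_gradients(zip(grads, [w, b]))            # one gradient descent step
    print('w =', w.numpy(), 'b =', b.numpy(), 'loss =', loss.numpy())

Either version should converge to roughly the same w and b.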
Running the original script produces the figure below, which shows the data points the code has just constructed:
Once we have the data, we build a linear regression model to learn which w and b fit it, and after training we can check whether the learned w and b are close to the values used to construct the data. The final result is w = [0.10149562], b = [0.29976717], loss = 0.000948041; in other words, the linear regression model has learned the underlying distribution of the data. You can also see that as training iterates, the loss value keeps shrinking, i.e. the model keeps improving. Plotting the trained w and b gives the blue line in the figure, which is the line that best fits the data at this point. The running result is shown in the figure below.
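The article does not give the code for drawing the blue fitted line, but a minimal sketch (assuming the session sess and the variables w and b from the training script above are still available) is to evaluate w and b and draw the line over the scatter plot; NumPy's closed-form least-squares fit makes a handy cross-check:

import numpy as np
import matplotlib.pyplot as plt

# Assumes x_data, y_data, sess, w and b from the training script above
w_val, b_val = sess.run(w), sess.run(b)

xs = np.sort(np.array(x_data))
plt.scatter(x_data, y_data, c='r')         # the generated samples
plt.plot(xs, w_val * xs + b_val, c='b')    # the fitted line y = w*x + b
plt.show()

# Cross-check: np.polyfit with degree 1 gives the closed-form least-squares slope and
# intercept, which should be close to the trained w and b (and to the true 0.1 and 0.3)
slope, intercept = np.polyfit(x_data, y_data, 1)
print('polyfit slope =', slope, 'intercept =', intercept)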
That is all the content of the article "How to use a TensorFlow neural network to build a linear regression model". Thank you for reading! I hope the shared content has helped you; if you want to learn more, you are welcome to follow the industry information channel!