
How to use TensorBoard to display the graph of a neural network

2025-01-18 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/02 Report --

This article explains how to use TensorBoard to display the graph of a neural network. The content is simple, clear, and easy to follow; please work through it with the editor to learn how to use TensorBoard to display the graph of a neural network.

# create a neural network and use TensorBoard to display its graph
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt  # if missing: pip install matplotlib

# define a neural layer
def add_layer(inputs, in_size, out_size, activation_function=None):
    # add one more layer and return the output of this layer
    with tf.name_scope('layer'):
        with tf.name_scope('Weights'):
            Weights = tf.Variable(tf.random_normal([in_size, out_size]), name='W')
        with tf.name_scope('biases'):
            biases = tf.Variable(tf.zeros([1, out_size]) + 0.1)
        with tf.name_scope('Wx_plus_b'):
            Wx_plus_b = tf.matmul(inputs, Weights) + biases
        if activation_function is None:
            outputs = Wx_plus_b
        else:
            outputs = activation_function(Wx_plus_b)
        return outputs

# make up some real data
x_data = np.linspace(-1, 1, 300)[:, np.newaxis]  # 300 values between -1 and 1; np.newaxis makes the shape (300, 1)
noise = np.random.normal(0, 0.05, x_data.shape)  # mean 0, standard deviation 0.05, same shape as x_data
y_data = np.square(x_data) - 0.5 + noise

# define placeholders for the inputs to the network
with tf.name_scope('inputs'):
    xs = tf.placeholder(tf.float32, [None, 1], name='x_input1')  # None means any number of examples can be fed
    ys = tf.placeholder(tf.float32, [None, 1], name='y_input1')

# add hidden layer
l1 = add_layer(xs, 1, 10, activation_function=tf.nn.relu)
# add output layer
prediction = add_layer(l1, 10, 1, activation_function=None)

# the error between prediction and real data
with tf.name_scope('loss'):
    # reduction_indices=[1] sums over each example, then the mean is taken
    loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), reduction_indices=[1]))
with tf.name_scope('train'):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)  # correct the error with a learning rate of 0.1

# initialization
# init = tf.initialize_all_variables()  # deprecated form
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)

# write the entire graph to a file, then load that file with TensorBoard and view it in a browser
# writer = tf.train.SummaryWriter("logs/", sess.graph)  # old API
# to view the graph, find tensorboard.exe (e.g. under C:\Anaconda\Scripts) and run:
# tensorboard --logdir=<path of the generated logs> (the path must not contain Chinese characters)
writer = tf.summary.FileWriter("../../logs/", sess.graph)

fig = plt.figure()
ax = fig.add_subplot(1, 1, 1)
ax.scatter(x_data, y_data)
plt.ion()  # plt.show() is a one-off blocking display; plt.ion() keeps the figure updating continuously
plt.show()

for i in range(1000):
    sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
    if i % 50 == 0:
        # to see the step-by-step improvement, print the loss at the actual data points
        # print(sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
        try:
            # erase the previous line before drawing a new one; lines holds a single
            # line, so remove lines[0] (the first pass has no line yet, hence try/except)
            ax.lines.remove(lines[0])
        except Exception:
            pass
        # show the prediction data
        prediction_value = sess.run(prediction, feed_dict={xs: x_data})
        # draw the prediction with a red line of width 5
        lines = ax.plot(x_data, prediction_value, 'r-', lw=5)
        # pause for 0.1 seconds, then draw the next line
        plt.pause(0.1)
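The code above uses the TensorFlow 1.x API (tf.placeholder, tf.Session, tf.summary.FileWriter), which is no longer available at the top level in TensorFlow 2.x. Below is a minimal sketch, assuming TensorFlow 2.x is installed, of how the same kind of graph export can still be done through the tf.compat.v1 compatibility module; the layer sizes and the "logs/" directory are only illustrative.

# Minimal sketch (assumes TensorFlow 2.x): export a graph for TensorBoard
# through the tf.compat.v1 compatibility API.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # build a static graph, as in the TF 1.x example above

with tf.name_scope('inputs'):
    xs = tf.placeholder(tf.float32, [None, 1], name='x_input')

with tf.name_scope('layer'):
    Weights = tf.Variable(tf.random_normal([1, 10]), name='W')
    biases = tf.Variable(tf.zeros([1, 10]) + 0.1, name='b')
    outputs = tf.nn.relu(tf.matmul(xs, Weights) + biases)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # write the graph definition to the (illustrative) logs/ directory
    writer = tf.summary.FileWriter("logs/", sess.graph)
    writer.close()

After running either script, launch TensorBoard from the command line with "tensorboard --logdir=logs/" and open the address it prints (by default http://localhost:6006) in a browser; the Graphs tab shows the name_scope blocks (inputs, layer, loss, train) defined above.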

The generated TensorBoard graph:

Thank you for reading. The above is the content of "how to use TensorBoard to display the graph of a neural network". After studying this article, I believe you have a deeper understanding of how to use TensorBoard to display the graph of a neural network; the specific usage still needs to be verified in practice. The editor will push more articles on related knowledge points for you; welcome to follow!

Welcome to subscribe to "Shulou Technology Information" to get the latest news, interesting stories, and hot topics in the IT industry, and to keep up with the hottest Internet news, technology news, and IT industry trends.
