
Common basic operations of TensorFlow for Python neural networks

2025-01-18 Update From: SLTechnology News&Howtos



This article walks through common basic TensorFlow operations for building neural networks in Python. The methods introduced are simple and practical, so feel free to follow along.

To apply deep learning to new problems faster and more easily, choosing a suitable deep learning tool is an essential first step.

TensorFlow is an open source computing framework officially launched by Google on November 9, 2015. The TensorFlow computing framework can well support various algorithms of deep learning.

TensorFlow is well compatible with the different needs of academic research and industrial production.

On the one hand, the flexibility of TensorFlow enables researchers to use it to quickly implement new model designs.

On the other hand, TensorFlow's strong distributed support is important for industrial model training on massive data sets. As Google's open source deep learning framework, TensorFlow embodies a decade of Google's exploration of artificial intelligence and its successful commercial applications.

In addition to TensorFlow, several other mainstream open source deep learning tools are available. Each has its own characteristics, and you can choose according to your own needs and preferences. For example, I started out with Caffe, then learned some of TensorFlow's features after it was open-sourced, and I still prefer TensorFlow's style. Of course, it is fine to use more than one deep learning tool.

One benchmark study makes a rigorous comparison of five popular open source deep learning frameworks: Caffe, Neon, TensorFlow, Theano, and Torch. The authors open-sourced their benchmark code: https://github.com/DL-Benchmarks/DL-Benchmarks

The study compares three aspects: extensibility, hardware utilization, and speed.

All evaluation tests are deployed on a single machine; both multithreaded CPU and GPU (Nvidia Titan X) configurations are tested.

The speed criteria include gradient computation time and forward propagation time. For convolutional neural networks, the authors also run experiments on the different convolution algorithms supported by these frameworks and compare their corresponding performance.

The experiments lead to the following conclusions.

Theano and Torch are the most scalable deep learning frameworks.

In terms of test performance on CPU, Torch is the best, followed by Theano.

For large-scale convolutional and fully connected networks, Torch performs best on GPU, followed by Neon.

Theano performs best for deploying and training LSTM networks. Caffe is the easiest standard deep learning framework for testing and evaluating performance.

Finally, TensorFlow is somewhat similar to Theano and is a more flexible framework, but its performance at the time was not as good as that of the frameworks above.

However, that study is dated: at the time, TensorFlow could only use cuDNN v2, whereas cuDNN v5.1 is now available and TensorFlow has released version 1.0. A fresh evaluation would be needed to compare current performance.

Variables: create, initialize, save, and load

When training a model, use variables to store and update parameters. Variables hold tensors kept in in-memory buffers. They must be explicitly initialized when building the model, and they can be saved to disk after training; the saved values can later be loaded for analysis or further training.

This document describes the following two TensorFlow classes. Click the following link to view the complete API document:

tf.Variable class

tf.train.Saver class

Refer to TensorFlow Chinese Community
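As a minimal, framework-free sketch of the save/restore cycle that tf.train.Saver performs, here using numpy.savez to stand in for a checkpoint file (the array names and file path are illustrative, not TensorFlow's actual checkpoint format):

```python
import os
import tempfile
import numpy as np

# Hypothetical "parameters" standing in for TensorFlow variables.
weights = np.random.randn(784, 200).astype(np.float32)
biases = np.zeros(200, dtype=np.float32)

# "Save": write the named arrays to a checkpoint-like file on disk.
path = os.path.join(tempfile.mkdtemp(), "model.npz")
np.savez(path, weights=weights, biases=biases)

# "Restore": load them back by name, as Saver.restore would.
ckpt = np.load(path)
restored_w, restored_b = ckpt["weights"], ckpt["biases"]
assert np.array_equal(restored_w, weights)
assert np.array_equal(restored_b, biases)
```

The key idea is the same as Saver's: parameters are saved and restored by name, so the variable names chosen at construction time matter.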

-add a neural-network layer

Input parameters are inputs, in_size, out_size, and activation_function

# add layer
def add_layer(inputs, in_size, out_size, activation_function=None):
    weights = tf.Variable(tf.random_normal([in_size, out_size]), name='weights')
    biases = tf.Variable(tf.zeros([1, out_size]) + 0.1, name='biases')
    y = tf.matmul(inputs, weights) + biases
    if activation_function is None:
        outputs = y
    else:
        outputs = activation_function(y)
    return outputs
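The computation add_layer performs is just an affine transform followed by an optional activation. A NumPy sketch of the same math (function name, seed, and shapes are illustrative, not part of the original code):

```python
import numpy as np

def add_layer_np(inputs, in_size, out_size, activation_function=None):
    # Same math as the TensorFlow version: y = inputs @ W + b
    rng = np.random.default_rng(0)
    weights = rng.normal(size=(in_size, out_size))
    biases = np.zeros((1, out_size)) + 0.1
    y = inputs @ weights + biases
    return y if activation_function is None else activation_function(y)

x = np.ones((4, 3))  # a batch of 4 samples with 3 features each
hidden = add_layer_np(x, 3, 5, activation_function=np.tanh)
print(hidden.shape)  # (4, 5)
```

Note how the weight matrix's shape (in_size, out_size) turns a batch of in_size-dimensional inputs into out_size-dimensional outputs.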

-loss

Mean squared error for regression problems:

loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), reduction_indices=[1]))

Cross-entropy loss for classification problems:

loss = tf.reduce_mean(-tf.reduce_sum(ys * tf.log(prediction), reduction_indices=[1]))
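What these two lines compute, in plain NumPy terms: reduce_sum over reduction_indices=[1] sums across each sample's outputs, and reduce_mean then averages over the batch. The example targets and predictions below are made up for illustration:

```python
import numpy as np

ys = np.array([[0.0, 1.0], [1.0, 0.0]])          # one-hot labels / targets
prediction = np.array([[0.2, 0.8], [0.9, 0.1]])  # model outputs

# Mean squared error: sum over each row, then average over the batch.
mse = np.mean(np.sum((ys - prediction) ** 2, axis=1))

# Cross-entropy: -sum(y * log(p)) per row, averaged over the batch.
cross_entropy = np.mean(-np.sum(ys * np.log(prediction), axis=1))

print(round(mse, 4), round(cross_entropy, 4))  # 0.05 0.1643
```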

-create

When you create a variable, you pass a tensor as an initial value to the constructor Variable (). TensorFlow provides a series of operators to initialize the tensor, whether the initial value is constant or random.

Note that all of these operators require you to specify the shape of the tensor, and that shape automatically becomes the shape of the variable. The shape of a variable is usually fixed, but TensorFlow provides advanced mechanisms to change it.

# Create two variables.
weights = tf.Variable(tf.random_normal([784, 200], stddev=0.35), name="weights")
biases = tf.Variable(tf.zeros([200]), name="biases")

-initialize

The initialization of variables must be done explicitly before other operations in the model are run. The easiest way is to add an operation that initializes all variables and run that operation first before using the model.

Use tf.global_variables_initializer () to add an operation to initialize the variable. Remember to run that operation after the model has been fully built and loaded.

# 7. Initialize the variables
init = tf.global_variables_initializer()
# tf.global_variables_initializer() initializes all variables in parallel.
# Sometimes you need to initialize a new variable from the value of another
# variable; in that case use the other variable's initialized_value() attribute.
# You can use the initialized value directly as the new variable's initial
# value, or treat it as a tensor and compute a value to assign.
# w1 = tf.Variable(tf.random_normal([784, 200], stddev=0.35), name="w1")
# w2 = tf.Variable(w1.initialized_value(), name="w2")

# 8. Launch the graph
sess = tf.Session()
sess.run(init)

-initialized by another variable

You sometimes need to initialize a variable from the initial value of another variable. Because tf.global_variables_initializer() initializes all variables in parallel, you need to be careful in this situation. When initializing a new variable from the value of another, use the other variable's initialized_value() attribute. You can use the initialized value directly as the new variable's initial value, or treat it as a tensor and compute a value to assign to the new variable.

w1 = tf.Variable(tf.random_normal([784, 200], stddev=0.35), name="w1")
w2 = tf.Variable(w1.initialized_value(), name="w2")

At this point, you should have a deeper understanding of the common basic operations of TensorFlow for Python neural networks. You might as well try them out in practice!
