Shulou(Shulou.com)06/01 Report--
How do TensorFlow and PyTorch compare? This article analyzes the question in detail and offers an answer, in the hope of giving readers facing the same choice a simple and practical way to decide.
TensorFlow or PyTorch? Which one should you start with? A year ago the question was hardly controversial: TensorFlow, of course. But times have changed, and the situation now looks very different. Let's analyze and compare the two mainstream frameworks.
First, let's look at the latest statistics. The data in the figure below are drawn from arXiv papers. The upper, yellow line is the share of papers using TensorFlow, and the lower, red line is the share using PyTorch. The recent figures are almost identical, and by June 2019 the red PyTorch line even edges slightly ahead.
The bar chart on the right shows cumulative data from January to June. TensorFlow's share (23%) is still slightly higher than PyTorch's (19.4%), but its growth rate is clearly lower. In other words, TensorFlow and PyTorch are now roughly neck and neck in academia.
Source: https://www.oreilly.com/ideas/one-simple-graphic-researchers-love-pytorch-and-tensorflow
What factors determine how widely a framework is adopted? I would summarize them in the following four aspects:
Ease of use
Speed
Number of operators
Open source model
The first is ease of use. Since the official release of the stable PyTorch 1.0 in December 2018, it has taken PyTorch only a little over half a year to win users over with its ease of use. That ease of use shows up in two ways: debugging is simple, since you can set a breakpoint and inspect the value of any tensor directly; and tensors and NumPy arrays can be converted back and forth and used together, while models can use ordinary Python control flow, which greatly increases flexibility. (TensorFlow's eager mode was introduced with a similar purpose and effect.)
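To make the ease-of-use points concrete, here is a minimal sketch of tensor/NumPy interoperability and plain-Python control flow; the clipped_relu helper is a made-up example for illustration, not part of either framework.

import torch

# Tensors and NumPy arrays convert back and forth cheaply.
x = torch.randn(3, 4)
a = x.numpy()                # Tensor -> NumPy ndarray (shares memory on CPU)
y = torch.from_numpy(a)      # ndarray -> Tensor

# Ordinary Python control flow works inside model code, so a debugger
# breakpoint (e.g. pdb.set_trace()) can inspect any intermediate tensor.
def clipped_relu(t, max_val=1.0):    # hypothetical helper for illustration
    if t.max() > max_val:            # a plain Python `if` on a tensor value
        t = torch.clamp(t, max=max_val)
    return torch.relu(t)

print(clipped_relu(y).shape)         # torch.Size([3, 4])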
The second is speed. Training is time-consuming and laborious: training on a GPU for a day or two is normal, and some large models with large datasets (such as BERT-large) can take ten days to half a month, so training speed is an indicator people care about a great deal. Although PyTorch offers a very flexible interface and uses a dynamic-graph mechanism, it also does a lot of optimization, such as launching operations asynchronously and pipelining execution wherever possible, so its speed is comparable to TensorFlow's and even better in some scenarios.
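As a rough illustration of the asynchronous execution mentioned above, the sketch below times a batch of matrix multiplications and calls torch.cuda.synchronize() before reading the clock; it assumes a CUDA-capable GPU is available (falling back to CPU otherwise), and the sizes and loop count are arbitrary.

import time
import torch

# CUDA kernels launch asynchronously: the Python call returns as soon as the
# work is queued, so a fair timing must synchronize before reading the clock.
device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(2048, 2048, device=device)
b = torch.randn(2048, 2048, device=device)

if device == "cuda":
    torch.cuda.synchronize()
start = time.time()
for _ in range(100):
    c = a @ b                    # queued on the device, returns immediately
if device == "cuda":
    torch.cuda.synchronize()     # wait for all queued kernels to finish
print(f"100 matmuls on {device}: {time.time() - start:.3f} s")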
The third is the number of operators. TensorFlow is without doubt the biggest winner here, offering more than 8,000 Python APIs (see https://tensorflow.google.cn/api_docs/python); there is essentially no operator a user cannot find, and any algorithm can be assembled from TensorFlow operators. However, so many APIs are also a burden: with both low-level and high-level interfaces on offer, it is easy for users to get lost. Because of its late start, PyTorch lags somewhat in quantity, but it has the advantage in quality.
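As a small illustration of the low-level versus high-level split (assuming TensorFlow 2.x with Keras bundled), the same convolution can be written either as a one-line Keras layer or spelled out with tf.nn, managing the filter variable by hand:

import tensorflow as tf

# A batch of dummy images: 8 samples, 28x28 pixels, 1 channel.
images = tf.random.normal([8, 28, 28, 1])

# High-level API: a Keras layer creates and tracks its own weights.
conv = tf.keras.layers.Conv2D(filters=32, kernel_size=3, padding="same")
hi = conv(images)

# Low-level API: the same convolution via tf.nn, with the filter variable
# and stride/padding arguments managed by hand.
kernel = tf.Variable(tf.random.normal([3, 3, 1, 32]))
lo = tf.nn.conv2d(images, kernel, strides=[1, 1, 1, 1], padding="SAME")

print(hi.shape, lo.shape)        # both (8, 28, 28, 32)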
The fourth is open-source models. This matters a great deal. Imagine you want to use BERT today: it was published by Google Research and open-sourced on TensorFlow, so naturally you can only start with TensorFlow. Some people have open-sourced PyTorch ports, but those inevitably arrive later and are unofficial, so their quality cannot be fully guaranteed and they attract much less attention. PyTorch, however, also takes this area seriously: Kaiming He was recruited and built Detectron, an open-source library for image detection and segmentation, which did a great deal to raise PyTorch's adoption.
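For comparison, pulling an open-source pretrained model on the PyTorch side is a one-liner through torchvision's model zoo; the sketch below uses the pretrained= flag from the torchvision versions of that era (newer releases use a weights= argument instead):

import torch
import torchvision.models as models

# Load a pretrained ResNet-50 from torchvision's model zoo (one of the
# models listed in the appendix below).
model = models.resnet50(pretrained=True)
model.eval()

dummy = torch.randn(1, 3, 224, 224)   # one fake 224x224 RGB image
with torch.no_grad():
    logits = model(dummy)
print(logits.shape)                   # torch.Size([1, 1000])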
Overall, both frameworks are worth mastering. Learn the strengths of each and keep an eye on new models, and you will stay ahead of the game.
Appendix:
TensorFlow models:
Bert
Boosted_trees
Mnist
Resnet
Transformer
Wide_deep
Adversarial_crypto
Adversarial_text
Attention_ocr
Audioset
Autoencoder
Brain_coder
Cognitive_mapping_and_planning
Compression
Cvt_text
Deep_contextual_bandits
Deep_speech
Deeplab
Delf
Differential_privacy
Domain_adaptation
Fivo
Gan
Im2txt
Inception
Keypointnet
Learning_to_remember_rare_events
Learning_unsupervised_learning
Lexnet_nc
Lfads
Lm_1b
Lm_commonsense
Maskgan
Namignizer
Neural_gpu
Neural_programmer
Next_frame_prediction
Object_detection
Pcl_rl
Ptn
Marco
Qa_kg
Real_nvp
Rebar
Resnet
Seq2species
Skip_thoughts
Slim
Street
Struct2depth
Swivel
Syntaxnet
Tcn
Textsum
Transformer
Vid2depth
Video_prediction
PyTorch models:
AlexNet
VGG
ResNet
SqueezeNet
DenseNet
Inception v3
GoogLeNet
ShuffleNet v2
MobileNet v2
ResNeXt
FCN ResNet101
DeepLabV3 ResNet101
Faster R-CNN ResNet-50 FPN
Mask R-CNN ResNet-50 FPN
Keypoint R-CNN ResNet-50 FPN
That concludes the answer to how TensorFlow and PyTorch compare. I hope the content above is of some help. If you still have questions, you can follow the industry information channel for more related knowledge.