2025-03-29 Update From: SLTechnology News&Howtos — Servers
This article explains what an in-place operation means in PyTorch. Many newcomers are unclear on this point, so the editor walks through it in detail below; I hope you find it useful.
An in-place operation in PyTorch changes the value of a tensor directly in its original memory, without allocating a copy.
PyTorch marks in-place operations with a trailing underscore, for example .add_() or .scatter_(). Python's += and *= are likewise in-place operations.
The following is a normal (out-of-place) add; the value of x is unchanged after the operation:

import torch
x = torch.rand(2)
print(x)      # tensor([0.8284, 0.5539])
y = torch.rand(2)
print(x + y)  # tensor([1.0250, 0.7891])
print(x)      # tensor([0.8284, 0.5539]), x is unchanged
The following is the in-place version, which changes the value of the original variable after execution:

import torch
x = torch.rand(2)
print(x)      # tensor([0.8284, 0.5539])
y = torch.rand(2)
x.add_(y)
print(x)      # tensor([1.1610, 1.3789])
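One way to see the difference concretely is to compare storage addresses via Tensor.data_ptr(). This is a minimal sketch (variable names are my own): an in-place op reuses the tensor's existing buffer, while an out-of-place op must allocate a fresh one.

```python
import torch

x = torch.rand(2)
before = x.data_ptr()        # address of x's underlying storage
x.add_(torch.rand(2))        # in-place: the same storage is reused
same_storage = x.data_ptr() == before

z = x + torch.rand(2)        # out-of-place: x is still alive, so z gets a fresh buffer
fresh_storage = z.data_ptr() != x.data_ptr()

print(same_storage, fresh_storage)  # True True
```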
The official autograd documentation makes the following point:

If an in-place operation does not raise an error, you can be sure that the computed gradients are correct.
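The flip side of that guarantee is that autograd will raise an error when an in-place overwrite would corrupt a gradient. As an illustrative sketch: autograd saves the output of exp() for the backward pass, so modifying that output in place trips the version check and backward() raises a RuntimeError.

```python
import torch

a = torch.ones(3, requires_grad=True)
b = a.exp()      # autograd saves b to compute exp's gradient later
b.add_(1)        # in-place overwrite of a tensor autograd needs

try:
    b.sum().backward()
    raised = False
except RuntimeError:
    raised = True  # "a variable needed for gradient computation has been modified"

print(raised)  # True
```

When no such error appears, the in-place op was safe and the gradients are correct, which is exactly what the documentation promises.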
Supplementary knowledge: the role of inplace in nn.ReLU(inplace=True) in PyTorch

When building a neural network with PyTorch, you will often encounter nn.ReLU(inplace=True). What does inplace=True mean?
nn.Conv2d(...),
nn.ReLU(inplace=True),  # inplace=True here; the default is False
The inplace flag controls whether the computed value directly overwrites the previous value.

For example: x = x + 1

That is, the result of adding 1 to the original value x is assigned straight back to x.
Instead of going through an intermediate variable y, as follows:

y = x + 1
x = y

Here x + 1 is first assigned to the intermediate variable y, and y is then assigned back to x. This requires extra memory to store the variable y.
So when inplace=True, the ReLU directly modifies the tensor passed up from the preceding layer (e.g. the output of nn.Conv2d), saving memory because no additional tensor needs to be stored.
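A minimal sketch of that effect (the tensor values are chosen for illustration): with inplace=True, the module clamps the input tensor itself and returns the very same storage rather than allocating an output tensor.

```python
import torch
import torch.nn as nn

relu = nn.ReLU(inplace=True)
x = torch.tensor([-1.0, 0.5, -2.0, 3.0])
y = relu(x)

print(x)                             # tensor([0.0000, 0.5000, 0.0000, 3.0000]) -- x itself was clamped
print(y.data_ptr() == x.data_ptr())  # True: no new tensor was allocated
```

Note the trade-off: because the input is destroyed, inplace=True is only safe when no other part of the network (or autograd) still needs the pre-activation values.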