Default value of requires_grad and how to change it after the merger of PyTorch Variable and Tensor


This article covers the default value of requires_grad and how to change it after the merger of PyTorch's Variable and Tensor. Many people run into this situation in practice, so let's walk through how to handle it. I hope you read carefully and get something out of it!

After the PyTorch update, Variable and Tensor were merged. torch.Tensor can now backpropagate and be updated just like Variable; wrapping a tensor in Variable simply returns a Tensor again, so there is no need to use Variable anymore. After a Tensor is created, requires_grad defaults to False, and it can be changed to True via xxx.requires_grad_(). Let's look at how the official documentation introduces this.

The following code is attached (the official documentation code appears further below):

import torch
from torch.autograd import Variable  # must import this library when using Variable

lis = torch.range(1, 6).reshape((-1, 3))  # create a FloatTensor matrix from a 1x6 range; -1 means the row count is computed automatically
print(lis)
print(lis.requires_grad)  # check that requires_grad defaults to False
lis.requires_grad_()      # use .requires_grad_() to change requires_grad to True
print(lis.requires_grad)

The results are as follows:

tensor([[1., 2., 3.],
        [4., 5., 6.]])

False

True
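As a side note, requires_grad_() also accepts a boolean argument, so the same in-place method can switch gradient tracking back off. A minimal sketch:

import torch

t = torch.ones(2, 2)      # requires_grad defaults to False
t.requires_grad_()        # switch gradient tracking on in place
print(t.requires_grad)    # True
t.requires_grad_(False)   # the same method also switches it back off
print(t.requires_grad)    # False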

To create a Variable, it must receive Tensor data; you cannot directly write a = Variable(range(6)).reshape((-1, 3)).

Otherwise it reports the error: Variable data has to be a tensor, but got range

Correct as follows:

import torch
from torch.autograd import Variable

tensor = torch.FloatTensor(range(8)).reshape((-1, 4))
my_ten = Variable(tensor)
print(my_ten)
print(my_ten.requires_grad)
my_ten.requires_grad_()
print(my_ten.requires_grad)

Results:

tensor([[0., 1., 2., 3.],
        [4., 5., 6., 7.]])

False

True

As you can see from the above, Tensor can completely replace Variable.
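Here is a minimal sketch of that replacement: a plain Tensor created with requires_grad=True backpropagates on its own, with no Variable wrapper anywhere:

import torch

x = torch.ones(2, 2, requires_grad=True)  # a plain Tensor, no Variable wrapper
y = (x * 3).sum()                         # build a small computation graph
y.backward()                              # backpropagate directly on the Tensor
print(x.grad)                             # tensor([[3., 3.], [3., 3.]])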

The following is the official documentation code:

# create a Tensor whose requires_grad = False by default
x = torch.ones(1)   # create a tensor with requires_grad=False (default)
x.requires_grad
# out: False

# create another Tensor, again with requires_grad = False
y = torch.ones(1)   # another tensor with requires_grad=False

# both inputs have requires_grad=False, so does the output:
# because x and y both have requires_grad=False, automatic differentiation
# is impossible, so the result z = x + y cannot be differentiated either
z = x + y
z.requires_grad
# out: False

# then autograd won't track this computation. Let's verify!
# since autograd cannot run here, the program reports an error
z.backward()
# out: RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

# now create a tensor with requires_grad=True
w = torch.ones(1, requires_grad=True)
w.requires_grad
# out: True

# add it to the previous result, which has requires_grad=False;
# because the input w has requires_grad=True, the operation can
# backpropagate and differentiate automatically
total = w + z
# the total sum now requires grad!
total.requires_grad
# out: True

# autograd can compute the gradients as well
total.backward()
w.grad
# out: tensor([1.])

# and no computation is wasted on gradients for x, y and z, which don't
# require grad, because they all have requires_grad=False
z.grad == x.grad == y.grad == None  # True

existing_tensor.requires_grad_()
existing_tensor.requires_grad
# out: True

Alternatively, requires_grad=True can be given directly when the Tensor is created:

my_tensor = torch.zeros(3, 4, requires_grad=True)
my_tensor.requires_grad
# out: True

lis = torch.range(1, 6, requires_grad=True).reshape((-1, 3))
print(lis)
print(lis.requires_grad)
lis.requires_grad_()
print(lis.requires_grad)

Result

tensor([[1., 2., 3.],
        [4., 5., 6.]], requires_grad=True)

True

True

Addendum: the meaning of volatile and requires_grad in PyTorch

Excluding subgraphs from the backward pass

The backpropagation pass in PyTorch is triggered by a single function call, loss.backward(), and notice that backward() is never told whose gradients are required. So we can reasonably infer that during backpropagation, PyTorch computes the gradient of every Variable that affects loss.
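A minimal sketch to illustrate this: backward() is called with no argument, yet the gradient of every leaf tensor with requires_grad=True that feeds into loss gets filled in:

import torch

w1 = torch.randn(3, requires_grad=True)
w2 = torch.randn(3, requires_grad=True)
loss = (w1 * w2).sum()   # loss depends on both w1 and w2
loss.backward()          # no argument saying whose gradient we want
print(w1.grad)           # populated anyway (equals w2's values)
print(w2.grad)           # populated anyway (equals w1's values)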

But sometimes we don't want gradients for every Variable. Then we need to consider how to exclude subgraphs (i.e., skip unnecessary gradient computations).

How do we exclude subgraphs from the backward pass? With two Variable flags: requires_grad and volatile.

requires_grad=True: gradients are required

requires_grad=False: gradients are not required

volatile=True is equivalent to requires_grad=False; conversely, volatile=False is equivalent to requires_grad=True.

Note: if a has requires_grad=True and b has requires_grad=False, then c = a + b has requires_grad=True. The same rule applies to volatile.
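This propagation rule is easy to verify with a minimal sketch:

import torch

a = torch.ones(2, requires_grad=True)
b = torch.ones(2)        # requires_grad=False
c = a + b
print(c.requires_grad)   # True: one tracked input is enough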

Why exclude subgraphs?

Some people may ask: why not just compute all the gradients and simply not use the ones you don't need for updates?

This is a question of efficiency: computing lots of useless gradients wastes resources (time and memory).
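Note that in current PyTorch (0.4 and later) the volatile flag has been removed; the torch.no_grad() context manager is the idiomatic way to exclude a whole computation from the graph. A minimal sketch:

import torch

x = torch.ones(3, requires_grad=True)
with torch.no_grad():
    y = x * 2            # no graph is built here, so no time or memory is spent on it
print(y.requires_grad)   # False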

This concludes "Default value of requires_grad and how to change it after the merger of PyTorch Variable and Tensor". Thank you for reading!
