This article presents an example-based analysis of automatic differentiation in Python with PyTorch. It is meant to be easy to understand and clearly organized; I hope it helps resolve your doubts as we study the topic together.
I. Brief Introduction
The autograd package is at the core of all neural networks in PyTorch. autograd provides automatic differentiation for all operations on Tensors. It is a define-by-run framework, which means that backpropagation is defined by how the code runs, so each iteration can be different.
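As a rough sketch of what "define-by-run" means in practice (this snippet is illustrative and not from the original article), ordinary Python control flow decides which operations run, so the recorded graph can differ between iterations:

import torch

x = torch.ones(2, 2, requires_grad=True)
y = x + 2

# The number of multiplications depends on a runtime value, so the graph
# autograd records (and therefore the backward pass) can change each run.
for _ in range(int(torch.randint(1, 4, (1,)))):
    y = y * 2

out = y.mean()
out.backward()
print(x.grad)  # the gradient reflects however many "* 2" steps actually ran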
II. Tensor
torch.Tensor is the central class of the package.
1. If the attribute .requires_grad is set to True, all operations on the tensor will begin to be tracked.
2. After the computation is finished, you can call .backward() to have all the gradients computed automatically. The gradient for the tensor is accumulated into its .grad attribute.
3. To stop a tensor from tracking history, you can call .detach(), which detaches it from the computation history and prevents future computation from being tracked.
4. To stop tracking history (and using memory), you can wrap the code block in with torch.no_grad():. This is useful when evaluating a model, because the model has trainable parameters with requires_grad=True during training, but gradients are not needed during evaluation. (A short sketch of both .detach() and torch.no_grad() appears after the examples below.)
5. Another class that is important for the autograd implementation is Function. Tensor and Function are interconnected and build up an acyclic graph that records the complete history of the computation. Each tensor has a .grad_fn attribute that holds a reference to the Function that created the tensor (grad_fn is None if the tensor was created by the user).
6. If you want to compute derivatives, call Tensor.backward(). If the Tensor is a scalar (it holds a single element of data), you do not need to pass any arguments to backward(); if it has more elements, you need to pass a gradient argument that matches the shape of the tensor.
import torch

# create a tensor and set requires_grad=True to track computation with it
x = torch.ones(2, 2, requires_grad=True)
print(x)

# do a tensor operation
y = x + 2
print(y)
print(y.grad_fn)  # y was created as the result of an operation, so it has a grad_fn

# more operations on y
z = y * y * 3
out = z.mean()
print(z, out)
Running result
# If requires_grad is not given when a tensor is created, it defaults to False.
# .requires_grad_() changes the requires_grad flag of an existing tensor in place.
a = torch.randn(2, 2)
a = (a * 3) / (a - 1)
print(a.requires_grad)   # False, since requires_grad was never set
a.requires_grad_(True)
print(a.requires_grad)   # True after the in-place change above
b = (a * a).sum()
print(b.grad_fn)         # prints the Function that created b
Running result
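Items 3 and 4 of the list above (.detach() and torch.no_grad()) are not exercised by the examples; here is a minimal sketch of both (not from the original article, just an illustration):

import torch

x = torch.ones(2, 2, requires_grad=True)
y = (x * x).sum()
print(y.requires_grad)   # True: y is part of the tracked computation

# .detach() returns a tensor with the same data but cut off from the graph
z = y.detach()
print(z.requires_grad)   # False

# torch.no_grad() turns off tracking inside the block, e.g. during evaluation
with torch.no_grad():
    w = (x * 2).sum()
print(w.requires_grad)   # False: no history was recorded for w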
III. Gradient
Now backpropagate. Because out contains a single scalar, out.backward() is equivalent to out.backward(torch.tensor(1.)).
out.backward()   # backpropagate
print(x.grad)    # print the gradient d(out)/dx
Running result
Principle
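The "Principle" part is empty in the text (likely an image on the original page); reconstructing the calculation from the code above, with x_i = 1, y_i = x_i + 2, z_i = 3 y_i^2 and out = (1/4) Σ z_i, the gradient printed as 4.5 follows from

$$\frac{\partial\,\text{out}}{\partial x_i} = \frac{1}{4}\cdot 6\,(x_i+2) = \frac{3}{2}(x_i+2)\Big|_{x_i=1} = 4.5 .$$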
IV. Example: Jacobian-Vector Product
# Jacobian-vector product
x = torch.randn(3, requires_grad=True)
y = x * 2
print(y)
# keep doubling y until its norm grows large (the loop condition and body are
# reconstructed from the usual form of this example; the original text is cut off here)
while y.data.norm() < 1000:
    y = y * 2
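As a minimal sketch of how the Jacobian-vector product call typically completes (this continuation is not in the original text, and the values in v are illustrative): since y is no longer a scalar, backward() needs a vector argument, and autograd computes the vector-Jacobian product vᵀ·J rather than the full Jacobian.

# y is not a scalar, so pass a vector v to backward(); autograd then
# computes the vector-Jacobian product v^T @ J instead of the full Jacobian
v = torch.tensor([0.1, 1.0, 0.0001], dtype=torch.float)
y.backward(v)
print(x.grad)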