
What is the PyTorch gradient clipping method?

2025-02-23 Update From: SLTechnology News&Howtos

Shulou(Shulou.com)06/02 Report--

This article introduces gradient clipping in PyTorch, which is commonly used to keep exploding gradients under control during training. Many people run into this situation in real projects, so let the editor walk you through how to handle it. I hope you read it carefully and get something out of it!

Without further ado, let's look at an example.

import torch.nn as nn

outputs = model(data)
loss = loss_fn(outputs, target)
optimizer.zero_grad()
loss.backward()
# Clip so the total L2 norm of all parameter gradients is at most 20
nn.utils.clip_grad_norm_(model.parameters(), max_norm=20, norm_type=2)
optimizer.step()
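Since model, data, and loss_fn are not defined in the snippet above, here is a minimal self-contained sketch of the same pattern. The toy linear model, random data, and hyperparameters below are placeholders chosen for illustration, not part of the original example.

import torch
import torch.nn as nn

# Hypothetical toy setup: a small linear model trained on random data.
model = nn.Linear(10, 1)
data = torch.randn(32, 10)
target = torch.randn(32, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for step in range(5):
    outputs = model(data)
    loss = loss_fn(outputs, target)
    optimizer.zero_grad()
    loss.backward()
    # Rescale gradients in place so their total L2 norm is at most 20;
    # the call returns the total norm measured before clipping.
    total_norm = nn.utils.clip_grad_norm_(model.parameters(), max_norm=20, norm_type=2)
    optimizer.step()
    print(f"step {step}: loss={loss.item():.4f}, grad norm={float(total_norm):.4f}")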

Parameters of nn.utils.clip_grad_norm_:

parameters - an iterable of Tensors (or a single Tensor) whose gradients will be normalized

max_norm - the maximum allowed norm of the gradients

norm_type - the type of the p-norm to use; defaults to 2 (the L2 norm)
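To make the roles of these arguments concrete, here is a rough, simplified re-implementation of what the call does under the hood. It is an illustrative sketch rather than the actual library code (the real implementation handles more cases, such as infinite norms and non-finite values).

import torch

def clip_grad_norm_sketch(parameters, max_norm, norm_type=2.0, eps=1e-6):
    # Illustrative re-implementation of norm-based clipping, not PyTorch's own code.
    grads = [p.grad for p in parameters if p.grad is not None]
    # Total norm of all gradients, treated as one long vector.
    total_norm = torch.norm(
        torch.stack([torch.norm(g, norm_type) for g in grads]), norm_type
    )
    # If the total norm exceeds max_norm, scale every gradient down by the same factor.
    clip_coef = max_norm / (total_norm + eps)
    if clip_coef < 1:
        for g in grads:
            g.mul_(clip_coef)
    return total_norm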

That concludes "what is the PyTorch gradient clipping method". Thank you for reading. If you want to learn more about the industry, you can follow the website, where the editor will keep publishing more practical, high-quality articles for you!
