2025-04-02 Update From: SLTechnology News&Howtos
How do you fine-tune a model in PyTorch? Many newcomers are unclear on this, so the following walks through four practical tips for fine-tuning a pre-trained model. I hope you find it useful.
1. Freeze parameters

To keep the pre-trained layers fixed, set requires_grad to False on their parameters:

for name, child in model.named_children():
    for param in child.parameters():
        param.requires_grad = False

After that, pass only the parameters that still require gradients to the optimizer, otherwise an error will be reported:

filter(lambda param: param.requires_grad, model.parameters())
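The two snippets above can be combined into a short sketch. The model here is a toy stand-in (an assumption for illustration, not from the article); any nn.Module with named children works the same way:

```python
import torch
import torch.nn as nn

# Toy model standing in for a pre-trained network (assumption: a
# real use case would load torchvision weights instead).
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

# Freeze every parameter in the pre-trained body.
for name, child in model.named_children():
    for param in child.parameters():
        param.requires_grad = False

# Replace the last layer with a fresh head; new parameters default to
# requires_grad=True, so only the head will be updated.
model[2] = nn.Linear(16, 4)

# Pass only the trainable parameters to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.001)

print(len(trainable))  # 2 (the new head's weight and bias)
```

This is the standard transfer-learning pattern: the frozen body acts as a fixed feature extractor while only the new head learns.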
2. Lower the learning rate and speed up its decay

Fine-tuning starts from a pre-trained model, so the learning rate must not be too large. It also helps to decay the learning rate faster; with the step policy, that means using a smaller step size.

base_lr: training from scratch on raw data typically uses 0.01; for fine-tuning use something smaller, e.g. 0.001.
stepsize: smaller than for training from scratch, e.g. 50000 for fine-tuning versus 100000 for direct training.
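In PyTorch, the step policy corresponds to torch.optim.lr_scheduler.StepLR; a minimal sketch with the values above (the Linear model is a placeholder assumption, and note that StepLR counts scheduler.step() calls, so step_size=50000 assumes one call per iteration):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # placeholder for the fine-tuned network

# From-scratch training might use lr=0.01 and step size 100000; for
# fine-tuning, shrink the base lr tenfold and halve the step size.
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
scheduler = torch.optim.lr_scheduler.StepLR(
    optimizer, step_size=50000, gamma=0.1
)

print(optimizer.param_groups[0]["lr"])  # 0.001
```

In the training loop, call scheduler.step() once per iteration (or per epoch, with step_size rescaled accordingly) after optimizer.step().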
3. Freeze batch norm (or disable dropout)

Batch norm affects training because it tracks the mean and variance of each batch. When the network is frozen, BN should instead use its global (running) statistics:

def freeze_bn(self):
    for layer in self.modules():
        if isinstance(layer, nn.BatchNorm2d):
            layer.eval()

During training, model.train() switches every submodule back to training mode, so freeze_bn() must be called again immediately after it.
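A runnable sketch of this pattern (the two-layer Net is a toy assumption; only the freeze_bn method comes from the article):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(8)

    def forward(self, x):
        return self.bn(self.conv(x))

    def freeze_bn(self):
        # Put every BatchNorm2d layer in eval mode so it uses the
        # pre-trained running mean/variance, not batch statistics.
        for layer in self.modules():
            if isinstance(layer, nn.BatchNorm2d):
                layer.eval()

model = Net()
model.train()      # resets every submodule to training mode...
model.freeze_bn()  # ...so freeze_bn() must follow it each time

print(model.conv.training, model.bn.training)  # True False
```

If model.train() is called again later (e.g. after validation) without re-calling freeze_bn(), the BN layers silently go back to batch statistics, which is a common source of degraded fine-tuning results.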
4. Filter parameters

When constructing the optimizer, pass only the parameters that need to be updated, otherwise an error will be reported:

filter(lambda p: p.requires_grad, model.parameters())
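Putting the filter into an optimizer call looks like this (the small Sequential model is an illustrative assumption):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# Freeze the first layer; only the second should reach the optimizer.
for param in model[0].parameters():
    param.requires_grad = False

# Hand the optimizer only the trainable parameters.
params = filter(lambda p: p.requires_grad, model.parameters())
optimizer = torch.optim.SGD(params, lr=0.001, momentum=0.9)

print(sum(len(g["params"]) for g in optimizer.param_groups))  # 2
```

Only the second layer's weight and bias end up in the optimizer, so frozen parameters consume no optimizer state and cannot be updated by mistake.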