This article introduces how to implement a simple custom network layer with PyTorch, covering both layers without parameters and layers with parameters.
1. Layers without parameters
First, we construct a custom layer without any parameters. To build it, we just need to inherit the base layer class and implement the forward propagation function.
import torch
import torch.nn.functional as F
from torch import nn

class CenteredLayer(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, X):
        return X - X.mean()
Pass in some data to verify that the layer works properly:

layer = CenteredLayer()
print(layer(torch.FloatTensor([1, 2, 3, 4, 5])))
The output is as follows:
tensor([-2., -1.,  0.,  1.,  2.])

The output is centered around zero as expected, so the layer works correctly.
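As a quick sanity check (not part of the original walkthrough), we can also confirm that this layer registers no trainable parameters:

# CenteredLayer defines no nn.Parameter, so its parameter list is empty
print(list(CenteredLayer().parameters()))  # expected output: []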
Now we use our custom layer as a component in a more complex model and pass in data to verify it:

net = nn.Sequential(nn.Linear(8, 128), CenteredLayer())
Y = net(torch.rand(4, 8))
# the model has many parameters and outputs, so we print the mean of Y to confirm that it runs
print(Y.mean())
The results are as follows:
tensor(-5.5879e-09, grad_fn=<MeanBackward0>)

The mean is effectively zero up to floating-point precision, which confirms that the centering layer is applied inside the larger model.
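As an extra check that is not in the original article, the near-zero mean can also be verified numerically with a tolerance:

# reuse net and Y from the snippet above; the mean should be zero up to floating-point error
print(torch.isclose(Y.mean(), torch.tensor(0.0), atol=1e-6))  # tensor(True)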
2. Layers with parameters
Here we use the built-in nn.Parameter class to create parameters; it provides basic management functions (the parameters are registered on the module and tracked for gradients) and is more convenient to use.

A simple custom fully connected layer is implemented below; you can modify it according to your needs.
class MyLinear(nn.Module):
    def __init__(self, in_units, units):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(in_units, units))
        self.bias = nn.Parameter(torch.randn(units,))

    def forward(self, X):
        linear = torch.matmul(X, self.weight.data) + self.bias.data
        return F.relu(linear)
Next, instantiate the class and access its model parameters:
linear = MyLinear(5, 3)
print(linear.weight)
The results are as follows:
Parameter containing:
tensor([[-0.3708,  1.2196,  1.3658],
        [ 0.4914, -0.2487, -0.9602],
        [ 1.8458,  0.3016, -0.3956],
        [ 0.0616, -0.3942,  1.6172],
        [ 0.7839,  0.6693, -0.8890]], requires_grad=True)
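Because weight and bias are wrapped in nn.Parameter, they are registered on the module automatically. Here is a minimal sketch of the management this gives for free (the SGD optimizer and learning rate are illustrative assumptions, not from the original article):

# registered parameters show up by name together with their shapes ...
for name, param in linear.named_parameters():
    print(name, tuple(param.shape))  # weight (5, 3), bias (3,)
# ... and they can be handed directly to an optimizer
optimizer = torch.optim.SGD(linear.parameters(), lr=0.1)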
Then pass in some data to view the layer's output:

print(linear(torch.rand(2, 5)))

The results are as follows:

tensor([[1.2394, 0.0000, 0.0000],
        [1.3514, 0.0968, 0.6667]])

The zeros come from the ReLU in forward, which clips negative pre-activations to zero.
We can also build a model from custom layers in the same way as with the built-in fully connected layer:

net = nn.Sequential(MyLinear(64, 8), MyLinear(8, 1))
print(net(torch.rand(2, 64)))

The results are as follows:

tensor([[4.1416],
        [0.2567]])
3. Summary

We can design custom layers by subclassing the basic layer class (nn.Module). This lets us define flexible new layers that behave differently from any existing layer in the deep learning framework.

Once a custom layer is defined, it can be used in any context and network architecture.

Layers can have local parameters, which can be created with the built-in nn.Parameter class.
4. Reference
"hands-on deep learning"-hands-on deep learning 2.0.0-beta0 documentation
Https://zh-v2.d2l.ai/
Appendix: getting the number of layers of a network and the name of each layer in PyTorch

# create your own network (here torchvision's pretrained ResNet-50 is assumed)
from torchvision import models

model = models.__dict__["resnet50"](pretrained=True)
for index, (name, param) in enumerate(model.named_parameters()):
    print(str(index) + " " + name)
The results are as follows:
0 conv1.weight
1 bn1.weight
2 bn1.bias
3 layer1.0.conv1.weight
4 layer1.0.bn1.weight
5 layer1.0.bn1.bias
6 layer1.0.conv2.weight
7 layer1.0.bn2.weight
8 layer1.0.bn2.bias
9 layer1.0.conv3.weight
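The loop above enumerates parameter names; if you want the layers (modules) themselves, nn.Module also provides named_modules(). A minimal sketch of that variant (the printed type name is an illustrative addition, not from the original article):

from torchvision import models

model = models.__dict__["resnet50"](pretrained=True)
# named_modules() yields (name, module) pairs for the model and every submodule in its tree,
# so enumerating it gives both a layer count and each layer's name
for index, (name, module) in enumerate(model.named_modules()):
    print(index, name, type(module).__name__)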