

What is the Pytorch modeling process in Python

2025-04-06 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

The editor shares with you what the Pytorch modeling process in Python looks like. Most people probably don't know much about it yet, so this article is offered for your reference. I hope you learn a lot from reading it; let's get into it!

Generally speaking, training a neural network involves the following steps:

Import libraries

Set the initial values of the training parameters

Import and make the dataset

Define the neural network architecture

Define the training process

Train the model

Below, I will walk through each of these steps with annotated code.

1 Import libraries

import torch
from torch import nn
from torch.nn import functional as F
from torch import optim
from torch.utils.data import DataLoader
import torchvision
import torchvision.transforms as transforms

2 Set the initial values

# learning rate
lr = 0.15
# parameter of the optimization algorithm (used as momentum below)
gamma = 0.8
# number of samples in each mini-batch
bs = 128
# number of passes over the full dataset
epochs = 10

3 Import and make the dataset

This time we use the FashionMNIST image dataset. Each image is a 28 × 28 pixel array, and there are 10 clothing categories in total, such as dresses, sneakers, and bags.

Note: it takes a long time to run the download for the first time.

# import the dataset
mnist = torchvision.datasets.FashionMNIST(root='./Datastes', train=True, download=True, transform=transforms.ToTensor())

# make the dataset into batches
batchdata = DataLoader(mnist, batch_size=bs, shuffle=True, drop_last=False)
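As a quick optional sanity check (not part of the original walkthrough), the dataset object itself can tell you how many images it holds and what the 10 categories are called, via attributes that torchvision's FashionMNIST exposes:

# optional sanity check on the downloaded dataset
print(len(mnist))          # 60000 training images
print(mnist.data.shape)    # torch.Size([60000, 28, 28])
print(mnist.classes)       # the 10 category names, e.g. 'Dress', 'Sneaker', 'Bag', ...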

We can check the data:

for x, y in batchdata:
    print(x.shape)
    print(y.shape)
    break

# torch.Size([128, 1, 28, 28])
# torch.Size([128])

You can see that there are 128 samples in a batch, and each sample has the dimensions 1 × 28 × 28.
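If you would like to see an actual image rather than just its shape, a quick plot of the first sample in a batch is possible. This uses matplotlib, which is not imported in the original code, so treat it as an optional extra:

# optional: look at the first image in a batch (matplotlib is an extra dependency)
import matplotlib.pyplot as plt

for x, y in batchdata:
    plt.imshow(x[0].squeeze().numpy(), cmap="gray")   # x[0] has shape [1, 28, 28]; squeeze() drops the channel dimension
    plt.title("label: {}".format(y[0].item()))
    plt.show()
    break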

Then we determine the input and output dimensions of the model:

# input dimension
input_ = mnist.data[0].numel()           # 784
# output dimension
output_ = len(mnist.targets.unique())    # 10

4 Define the neural network architecture

We first use a fully connected layer with 128 neurons, activate it with ReLU, then map the result to the dimension of the labels and apply a (log-)softmax activation.

# define the neural network architecture
class Model(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear1 = nn.Linear(in_features, 128, bias=True)
        self.output = nn.Linear(128, out_features, bias=True)

    def forward(self, x):
        x = x.view(-1, 28 * 28)
        sigma1 = torch.relu(self.linear1(x))
        sigma2 = F.log_softmax(self.output(sigma1), dim=-1)
        return sigma2
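Before defining the training loop, a quick optional check (not in the original code) confirms that the model accepts a batch of images and returns one log-probability per class. The names tmp_net and fake_x below are placeholders invented for this test:

# optional shape check with a fake batch, using input_ and output_ from step 3
tmp_net = Model(input_, output_)
fake_x = torch.randn(16, 1, 28, 28)     # 16 random "images" with the same shape as FashionMNIST samples
out = tmp_net(fake_x)
print(out.shape)                        # torch.Size([16, 10])
print(out.exp().sum(dim=1))             # exponentiated log-softmax rows each sum to ~1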

5 Define the training process

In practice, we usually wrap the training of the model in a function, which can be broken down into the following steps:

Define the loss function and optimizer

Complete the forward propagation

Compute the loss

Backpropagation

Update the parameters

Zero the gradients

On top of these six core operations, we usually also need to monitor the training progress, the loss value, and the accuracy of the model.
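Stripped of all monitoring, the six core operations fit into just a few lines. The sketch below assumes a model instance net has already been created and uses the same loss (NLLLoss) and optimizer (SGD with momentum) as the full version that follows:

# bare-bones version of the six core steps (no monitoring), assuming net has been instantiated
criterion = nn.NLLLoss()                                    # 1. define the loss function ...
opt = optim.SGD(net.parameters(), lr=lr, momentum=gamma)    #    ... and the optimizer
for epoch in range(epochs):
    for x, y in batchdata:
        sigma = net(x)                # 2. forward propagation
        loss = criterion(sigma, y)    # 3. compute the loss
        loss.backward()               # 4. backpropagation
        opt.step()                    # 5. update the parameters
        opt.zero_grad()               # 6. zero the gradients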

The annotated code is as follows:

# a function that encapsulates the training of the model
def fit(net, batchdata, lr, gamma, epochs):
    # parameters: model architecture, data, learning rate, optimization algorithm parameter, number of passes over the data

    # 5.1 define the loss function
    criterion = nn.NLLLoss()
    # 5.1 define the optimization algorithm
    opt = optim.SGD(net.parameters(), lr=lr, momentum=gamma)

    # monitor progress: before the loop, not a single sample has been seen
    samples = 0
    # monitor accuracy: before the loop, the number of correct predictions is 0
    corrects = 0

    # loop over the full dataset several times
    for epoch in range(epochs):
        # train on each batch
        for batch_idx, (x, y) in enumerate(batchdata):
            # to be on the safe side, flatten the labels to 1 dimension to align with the samples
            y = y.view(x.shape[0])

            # 5.2 forward propagation
            sigma = net.forward(x)
            # 5.3 compute the loss
            loss = criterion(sigma, y)
            # 5.4 backpropagation
            loss.backward()
            # 5.5 update the parameters
            opt.step()
            # 5.6 zero the gradients
            opt.zero_grad()

            # monitor progress: with every batch trained, the model has seen x.shape[0] more samples
            samples += x.shape[0]
            # monitor accuracy: correctly predicted samples / total samples seen so far
            # get the predicted labels
            yhat = torch.max(sigma, -1)[1]
            # accumulate the number of correct predictions
            corrects += torch.sum(yhat == y)

            # every 200 batches, and at the end, print the model's progress
            if (batch_idx + 1) % 200 == 0 or batch_idx == (len(batchdata) - 1):
                # monitor model progress
                print("Epoch {}: [{}/{} {:.0f}%], Loss: {:.6f}, Accuracy: {:.6f}".format(
                    epoch + 1,
                    samples,
                    epochs * len(batchdata.dataset),
                    100 * samples / (epochs * len(batchdata.dataset)),
                    loss.data.item(),
                    float(100.0 * corrects / samples)))

6 Train the model

# set the random seed
torch.manual_seed(51)

# instantiate the model
net = Model(input_, output_)

# train the model
fit(net, batchdata, lr, gamma, epochs)

# Epoch 1: [25600/600000 4%], Loss: 0.524430, Accuracy: 69.570312
# Epoch 1: [51200/600000 9%], Loss: 0.363422, Accuracy: 74.984375
# ...
# Epoch 10: [600000/600000 100%], Loss: 0.284664, Accuracy: 85.771835

We have now trained the most basic neural network with Pytorch and can see the training results. You can copy the code and run it yourself!
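The fit() function above only reports accuracy on the training data. If you also want a number on data the model has never seen, a sketch along the same lines (the test split download and the evaluation loop below are extras, not part of the original article) could look like this:

# optional: accuracy on the FashionMNIST test split
mnist_test = torchvision.datasets.FashionMNIST(root='./Datastes', train=False,
                                               download=True, transform=transforms.ToTensor())   # same root directory as the training set
testdata = DataLoader(mnist_test, batch_size=bs, shuffle=False)

correct = 0
with torch.no_grad():                      # no gradients are needed for evaluation
    for x, y in testdata:
        yhat = torch.max(net(x), -1)[1]    # same prediction rule as in fit()
        correct += torch.sum(yhat == y)
print("test accuracy: {:.2f}%".format(float(100.0 * correct / len(mnist_test))))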

Although we did not use a complex model, the basic ideas are the same every time we build one.

This is the end of the article "What is the Pytorch modeling process in Python?". Thank you for reading! I hope the content shared here has helped you; if you want to learn more, welcome to follow the industry information channel!
