

How to solve the problem of memory blow-up when training a neural network with PyTorch

2025-01-18 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/01 Report --

In this article, the editor shares how to deal with memory blow-up when training a neural network with PyTorch. Most readers are probably not very familiar with the topic, so this article is offered for your reference; I hope you learn a lot from it. Let's get started!

Building an artificial neural network generally involves four steps (a minimal end-to-end sketch follows the list):

1. Load the original data.

2. Build the neural network.

3. Train the network on the data.

4. Test and verify the results.
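Before going through the steps in detail, here is a minimal sketch of how the four steps usually fit together in PyTorch. The data, model shape, and hyperparameters below are placeholders chosen for illustration, not values from this article:

import torch
from torch import nn, optim

# 1. Load the original data (random placeholder tensors standing in for a real dataset)
x_train, y_train = torch.randn(100, 1), torch.randn(100, 1)

# 2. Construct the neural network
model = nn.Sequential(nn.Linear(1, 10), nn.ReLU(), nn.Linear(10, 1))

# 3. Train on the data
optimizer = optim.SGD(model.parameters(), lr=0.1)
loss_func = nn.MSELoss()
for step in range(100):
    loss = loss_func(model(x_train), y_train)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# 4. Test and verify (gradients are not needed here)
with torch.no_grad():
    print("final loss on the placeholder data:", loss_func(model(x_train), y_train).item())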

Data loading for a PyTorch neural network, taking the MNIST handwritten-digit data as an example. A few plotting and encoding helpers (saved as utils.py) come first:

import torch
import matplotlib.pyplot as plt

def plot_curve(data):
    # plot a 1-D sequence of values, e.g. the loss curve
    fig = plt.figure()
    plt.plot(range(len(data)), data, color="blue")
    plt.legend(["value"], loc="upper right")
    plt.xlabel("step")
    plt.ylabel("value")
    plt.show()

def plot_image(img, label, name):
    # show the first six images of a batch together with their labels
    fig = plt.figure()
    for i in range(6):
        plt.subplot(2, 3, i + 1)
        plt.tight_layout()
        plt.imshow(img[i][0] * 0.3081 + 0.1307, cmap="gray", interpolation="none")
        plt.title("{}: {}".format(name, label[i].item()))
        plt.xticks([])
        plt.yticks([])
    plt.show()

def one_hot(label, depth=10):
    # convert integer class labels into one-hot vectors
    out = torch.zeros(label.size(0), depth)
    idx = torch.LongTensor(label).view(-1, 1)
    out.scatter_(dim=1, index=idx, value=1)
    return out

The data itself is then loaded through torchvision and DataLoader:

import torch
from torch import nn                     # building blocks for the network
from torch.nn import functional as F     # commonly used functions
from torch import optim                  # optimization toolkit
import torchvision                       # vision datasets and transforms
import matplotlib.pyplot as plt
from utils import plot_curve, plot_image, one_hot

batch_size = 512

# step 1: load the dataset
train_loader = torch.utils.data.DataLoader(
    torchvision.datasets.MNIST("minist_data", train=True, download=True,
                               transform=torchvision.transforms.Compose([
                                   torchvision.transforms.ToTensor(),
                                   torchvision.transforms.Normalize((0.1307,), (0.3081,))])),
    batch_size=batch_size, shuffle=True)

test_loader = torch.utils.data.DataLoader(
    torchvision.datasets.MNIST("minist_data", train=False, download=True,
                               transform=torchvision.transforms.Compose([
                                   torchvision.transforms.ToTensor(),
                                   torchvision.transforms.Normalize((0.1307,), (0.3081,))])),
    batch_size=batch_size, shuffle=False)

x, y = next(iter(train_loader))
print(x.shape, y.shape)
plot_image(x, y, "image")
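Note that one_hot is defined in utils.py but never called in the snippet above. As a quick illustration (the labels here are made up), it turns integer class labels into rows of a one-hot matrix:

labels = torch.tensor([3, 7])   # two hypothetical class labels
print(one_hot(labels))
# tensor([[0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],
#         [0., 0., 0., 0., 0., 0., 0., 1., 0., 0.]])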

Taking the construction of a simple neural network for a regression problem as an example

The specific implementation code is as follows:

import torch
import torch.nn.functional as F   # the activation functions all live here

x = torch.unsqueeze(torch.linspace(-1, 1, 100), dim=1)   # x data (tensor), shape=(100, 1)
y = x.pow(2) + 0.2 * torch.rand(x.size())                # noisy y data (tensor), shape=(100, 1)

class Net(torch.nn.Module):   # inherit torch's Module (fixed)
    def __init__(self, n_feature, n_hidden, n_output):
        # n_feature: number of inputs, n_hidden: neurons in the hidden layer, n_output: number of outputs
        super(Net, self).__init__()   # inherit the parent __init__ (fixed)
        # define what form each layer takes
        self.hidden = torch.nn.Linear(n_feature, n_hidden)    # hidden layer, linear output
        self.predict = torch.nn.Linear(n_hidden, n_output)    # output layer, linear output

    def forward(self, x):
        # x is the input data; forward defines the forward pass of the network,
        # combining the layers declared in __init__ one by one
        x = F.relu(self.hidden(x))   # activation applied to the hidden layer's linear output
        x = self.predict(x)          # output layer value
        return x

net = Net(n_feature=1, n_hidden=10, n_output=1)
print(net)   # structure of net
"""
Net (
  (hidden): Linear (1 -> 10)
  (predict): Linear (10 -> 1)
)
"""

# the optimizer is the training tool
optimizer = torch.optim.SGD(net.parameters(), lr=0.2)   # pass in all of net's parameters and the learning rate
loss_func = torch.nn.MSELoss()   # mean squared error between predictions and targets

import matplotlib.pyplot as plt
plt.ion()   # interactive mode for real-time plotting

for t in range(200):                  # training steps
    prediction = net(x)               # feed x to net; each step outputs a prediction
    loss = loss_func(prediction, y)   # must be (1. nn output, 2. target)

    optimizer.zero_grad()             # clear the gradients left over from the previous step
    loss.backward()                   # error backpropagation, compute the parameter updates
    optimizer.step()                  # apply the updates to net's parameters

    if t % 5 == 0:                    # plot every five steps to show the learning process
        plt.cla()
        plt.scatter(x.data.numpy(), y.data.numpy())
        plt.plot(x.data.numpy(), prediction.data.numpy(), 'r-', lw=5)
        plt.text(0.5, 0, 'Loss=%.4f' % loss.data.numpy(), fontdict={'size': 20, 'color': 'red'})
        plt.pause(0.1)

plt.ioff()
plt.show()
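As for the memory blow-up the title asks about: in loops like the one above, memory most often climbs because tensors that are still attached to the computation graph are kept around (for example, appending loss itself to a Python list), or because evaluation runs with gradients enabled. Below is a minimal sketch of the usual remedies, reusing the net, x, y, loss_func, and optimizer from the regression example; it illustrates the pattern rather than reproducing code from the original article, and the right fix always depends on where the memory is actually going:

losses = []
for t in range(200):
    prediction = net(x)
    loss = loss_func(prediction, y)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # store a plain Python float, not the loss tensor: item() detaches the value,
    # so each step's computation graph can be freed instead of accumulating in memory
    losses.append(loss.item())

# evaluation / validation: no_grad() tells autograd not to build a graph at all,
# which keeps memory flat when running the model on held-out data
with torch.no_grad():
    final_loss = loss_func(net(x), y).item()
print("final loss:", final_loss)

# On a GPU, torch.cuda.empty_cache() can additionally return cached blocks to the driver,
# but the two habits above are what stop the graphs from piling up in the first place.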

That is all there is to the article "How to solve the problem of memory blow-up when training a neural network with PyTorch". Thank you for reading! I hope what has been shared here is helpful; if you would like to learn more, you are welcome to follow the industry information channel.



