Many newcomers to PyTorch are not sure how to use add_graph in TensorBoard to record a model's network structure. To help with that, the following walks through a complete example in detail; readers who need this can follow along, and I hope you gain something from it.
from torch.utils.tensorboard import SummaryWriter
import torch
import torchvision
from torchvision import datasets, transforms
from torch.autograd import Variable


class Model(torch.nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.conv1 = torch.nn.Sequential(
            # input: torch.Size([64, 1, 28, 28])
            # Conv2d's main parameters are the number of input channels, the number of
            # output channels, the kernel size, the stride and the padding.
            # output size = 1 + (input size - kernel size + 2*padding) / stride
            torch.nn.Conv2d(1, 64, kernel_size=3, stride=1, padding=1),
            # output: torch.Size([64, 64, 28, 28])
            torch.nn.ReLU(),
            # output: torch.Size([64, 64, 28, 28])
            torch.nn.Conv2d(64, 128, kernel_size=3, stride=1, padding=1),
            # output: torch.Size([64, 128, 28, 28])
            torch.nn.ReLU(),
            # MaxPool2d's main parameters are the pooling window size, the stride and the padding.
            torch.nn.MaxPool2d(stride=2, kernel_size=2),
            # output: torch.Size([64, 128, 14, 14])
        )
        self.dense = torch.nn.Sequential(
            # input: torch.Size([64, 14*14*128])
            # class torch.nn.Linear(in_features, out_features, bias=True)
            torch.nn.Linear(14 * 14 * 128, 1024),
            # output: torch.Size([64, 1024])
            torch.nn.ReLU(),
            # torch.nn.Dropout helps prevent the convolutional network from overfitting
            # during training: it zeroes out activations with a given random probability
            # (0.5 by default if nothing is set), so the trained model does not depend
            # too heavily on the weights of any particular part.
            torch.nn.Dropout(p=0.5),
            torch.nn.Linear(1024, 10),
            # output: torch.Size([64, 10])
        )

    def forward(self, x):
        # input: torch.Size([64, 1, 28, 28])
        x = self.conv1(x)
        # output: torch.Size([64, 128, 14, 14])
        # view() flattens the feature maps into one row per sample: torch.Size([64, 14*14*128])
        x = x.view(-1, 14 * 14 * 128)
        x = self.dense(x)
        # output: torch.Size([64, 10])
        return x


transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5], std=[0.5]),
])
data_train = datasets.MNIST(root="./data/", transform=transform, train=True, download=True)
data_loader_train = torch.utils.data.DataLoader(dataset=data_train, batch_size=64, shuffle=True)
# images, labels = next(iter(data_loader_train))  # one real batch, torch.Size([64, 1, 28, 28])

images = torch.randn(64, 1, 28, 28)  # a dummy batch of the same shape works just as well
model = Model()
writer = SummaryWriter()
writer.add_graph(model, input_to_model=images, verbose=False)  # trace the model once and record its graph
writer.flush()
writer.close()
# View the result with: tensorboard --logdir=runs
The result: after running tensorboard --logdir=runs, the GRAPHS tab in TensorBoard shows the network structure of the model.
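For readers who only need the graph logging itself, here is a minimal sketch of the same pattern without the MNIST data pipeline. The class name TinyNet and the log directory runs/graph_demo are illustrative choices for this sketch, not part of the original example.

import torch
from torch.utils.tensorboard import SummaryWriter


class TinyNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(28 * 28, 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, 10),
        )

    def forward(self, x):
        # flatten each sample before the fully connected layers
        return self.net(x.view(x.size(0), -1))


model = TinyNet()
dummy_input = torch.randn(8, 1, 28, 28)  # any tensor with the model's expected input shape
writer = SummaryWriter(log_dir="runs/graph_demo")
writer.add_graph(model, input_to_model=dummy_input)  # traces the model and writes the graph event
writer.close()
# tensorboard --logdir=runs

add_graph works by tracing the model on the example input, so the input only needs the right shape and dtype; its actual values do not matter.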
Was the above content helpful to you? If you want to learn more about this topic, please follow the related articles; thank you for your support.