This article is a detailed introduction to how to use torch.flatten() and torch.nn.Flatten() in PyTorch. The explanations are detailed, the steps are clear, and each case comes with a runnable example. I hope it helps you resolve any doubts about these two functions.
torch.flatten(x) is equivalent to torch.flatten(x, 0): by default it flattens the tensor into a one-dimensional vector, starting from the first dimension. torch.flatten(x, 1) instead starts flattening from the second dimension, leaving the first dimension intact.
import torch

x = torch.randn(2, 4, 2)
print(x)

z = torch.flatten(x)
print(z)

w = torch.flatten(x, 1)
print(w)

Output is:

tensor([[[-0.9814,  0.8251],
         [ 0.8197, -1.0426],
         [-0.8185, -1.3367],
         [-0.6293,  0.6714]],

        [[-0.5973, -0.0944],
         [ 0.3720,  0.0672],
         [ 0.2681,  1.8025],
         [-0.0606,  0.4855]]])
tensor([-0.9814,  0.8251,  0.8197, -1.0426, -0.8185, -1.3367, -0.6293,  0.6714,
        -0.5973, -0.0944,  0.3720,  0.0672,  0.2681,  1.8025, -0.0606,  0.4855])
tensor([[-0.9814,  0.8251,  0.8197, -1.0426, -0.8185, -1.3367, -0.6293,  0.6714],
        [-0.5973, -0.0944,  0.3720,  0.0672,  0.2681,  1.8025, -0.0606,  0.4855]])
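The default arguments can also be spelled out explicitly. Here is a minimal sketch using the start_dim and end_dim keyword names from the torch.flatten signature, which also shows that flattening everything matches reshape(-1):

import torch

x = torch.randn(2, 4, 2)
# torch.flatten(x) is shorthand for torch.flatten(x, start_dim=0, end_dim=-1)
assert torch.equal(torch.flatten(x), torch.flatten(x, start_dim=0, end_dim=-1))
# Flattening all dimensions gives the same result as reshape(-1)
assert torch.equal(torch.flatten(x), x.reshape(-1))
print(torch.flatten(x, 1).shape)  # torch.Size([2, 8])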
torch.flatten(x, 0, 1) flattens the first and second dimensions (dims 0 through 1) into a single dimension.
import torch

x = torch.randn(2, 4, 2)
print(x)

# The first dimension has length 2 and the second has length 4,
# so the flattened dimension has length 2 * 4 = 8
w = torch.flatten(x, 0, 1)
print(w.shape)
print(w)

Output is:

tensor([[[-0.5523, -0.1132],
         [-2.2659, -0.0316],
         [ 0.1372, -0.8486],
         [-0.3593, -0.2622]],

        [[-0.9130,  1.0038],
         [-0.3996,  0.4934],
         [ 1.7269,  0.8215],
         [ 0.1207, -0.9590]]])
torch.Size([8, 2])
tensor([[-0.5523, -0.1132],
        [-2.2659, -0.0316],
        [ 0.1372, -0.8486],
        [-0.3593, -0.2622],
        [-0.9130,  1.0038],
        [-0.3996,  0.4934],
        [ 1.7269,  0.8215],
        [ 0.1207, -0.9590]])
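Note that end_dim also accepts negative indices, counted from the last dimension. A small sketch of both forms, using the same x = torch.randn(2, 4, 2):

import torch

x = torch.randn(2, 4, 2)
# Flatten dims 1 through 2, keeping dim 0 intact: 2 x (4*2)
print(torch.flatten(x, 1, 2).shape)   # torch.Size([2, 8])
# Negative indices count from the end: 0 through -2 means dims 0 and 1
print(torch.flatten(x, 0, -2).shape)  # torch.Size([8, 2])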
torch.nn.Flatten() is meant to be used inside neural networks, where the input is a batch of data and the first dimension is the batch dimension. Usually each sample should be flattened to one dimension rather than pulling the whole batch into one dimension, so torch.nn.Flatten() starts flattening from the second dimension by default.
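This default is easy to verify on a bare tensor before the full convolutional example below; a minimal sketch (the module's signature is torch.nn.Flatten(start_dim=1, end_dim=-1)):

import torch

x = torch.randn(32, 1, 5, 5)
flatten = torch.nn.Flatten()  # start_dim=1, end_dim=-1 by default
print(flatten(x).shape)       # torch.Size([32, 25]) - batch dim kept
# Both bounds are configurable; flattening everything, batch included:
print(torch.nn.Flatten(0, -1)(x).shape)  # torch.Size([800])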
import torch

# A random batch of 32 single-channel 5*5 images
x = torch.randn(32, 1, 5, 5)

model = torch.nn.Sequential(
    # Input channels 1, output channels 6, 3*3 kernel, stride 1, padding 1
    torch.nn.Conv2d(1, 6, 3, 1, 1),
    torch.nn.Flatten()
)
output = model(x)
# With padding 1 the 5*5 input is effectively 7*7, so each spatial side
# becomes (7 - 3 + 1) = 5 and Flatten yields 6 * 5 * 5 = 150 features
print(output.shape)

Output is:

torch.Size([32, 150])
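In practice, Flatten usually feeds a Linear layer whose in_features must equal the flattened feature count. A sketch extending the example above; the 10-class Linear head here is a hypothetical addition, not part of the original example:

import torch

x = torch.randn(32, 1, 5, 5)
model = torch.nn.Sequential(
    torch.nn.Conv2d(1, 6, 3, 1, 1),   # output: (32, 6, 5, 5)
    torch.nn.Flatten(),               # output: (32, 150)
    torch.nn.Linear(150, 10),         # hypothetical 10-class head
)
print(model(x).shape)  # torch.Size([32, 10])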
That covers how to use torch.flatten() and torch.nn.Flatten() in PyTorch. To really master them, the best approach is to experiment with these examples yourself.