2025-02-24 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/01 Report--
This article mainly covers "what are the common pits in pytorch". The content is simple and clear; I hope it helps you resolve your doubts as you work through it.
1. pytorch transforms tensor

import numpy as np
import torchvision.transforms.functional as TF

x = np.random.randint(10, 100, (10, 10, 10))
x = TF.to_tensor(x)
print(x)
This function automatically normalizes the input data. For example, when we convert 0-255 (uint8) numpy image data to a tensor, the values are automatically scaled into the range 0-1.
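As a quick illustration, here is a plain-torch sketch of what to_tensor does to a uint8 image (the shape and values are made up for the example, and the manual permute/divide mimics the transform rather than calling torchvision):

```python
import numpy as np
import torch

# a fake 8-bit image: values in 0-255, shape (H, W, C)
img = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)

# to_tensor would (a) move channels to the front and
# (b) divide uint8 data by 255, giving floats in [0, 1]
t = torch.from_numpy(img).permute(2, 0, 1).float() / 255.0

print(t.shape)  # torch.Size([3, 4, 4])
print(t.min().item() >= 0.0 and t.max().item() <= 1.0)  # True
```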
2. The difference between stack and cat
Stack
x = torch.randn((1, 2, 3))
y = torch.randn((1, 2, 3))
z = torch.stack((x, y))  # default dim=0
print(z.shape)  # torch.Size([2, 1, 2, 3])
So the data after stack is easy to understand: z[0,:,:,:] is the data of x and z[1,:,:,:] is the data of y.
Cat
z = torch.cat((x, y))
print(z.size())  # torch.Size([2, 2, 3])
After cat, z[0,:,:] holds the values of x (without its leading dimension) and z[1,:,:] holds the values of y.
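To make the indexing concrete, a small check (variable names are mine; shapes follow the stack/cat example above):

```python
import torch

x = torch.randn(1, 2, 3)
y = torch.randn(1, 2, 3)

z_stack = torch.stack((x, y))  # shape (2, 1, 2, 3): a new leading dim
z_cat = torch.cat((x, y))      # shape (2, 2, 3): dim 0 simply grows

# after stack, z_stack[0] is exactly x;
# after cat, z_cat[0] is x's single (2, 3) slice
print(torch.equal(z_stack[0], x))   # True
print(torch.equal(z_cat[0], x[0]))  # True
```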
The most important difference is that the data after stack gains an extra dimension, while cat does not. A very simple example: suppose we train a detection model, where each label is a set of marked points, e.g. [x1, y1, x2, y2].
With a batch dimension the network sees Size: [batchsize, 4]. If we already have two batches of data, data1 of Size [128, 4] and data2 of Size [128, 4], and we need to put them together, the target is data of Size: [256, 4].
Obviously what we need to do is: torch.cat((data1, data2)).
If instead our data looks like this: there are 100 labels, each put into a list (data) as [[x1, y1, x2, y2], [x1, y1, x2, y2], ...], where data is a list of length 100 and each element is the tag of one picture with size [4]. We need to put them together into one tensor of Size: [100, 4].
Obviously what we are going to do is torch.stack(data). And note that the input to torch.stack is a list (or tuple) of tensors!
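Putting both cases side by side (sizes copied from the text; a sketch rather than the article's original code):

```python
import torch

# case 1: two existing batches of [x1, y1, x2, y2] labels
data1 = torch.randn(128, 4)
data2 = torch.randn(128, 4)
merged = torch.cat((data1, data2))  # the existing dim 0 grows: (256, 4)

# case 2: 100 per-image labels of size (4,) collected in a Python list
data = [torch.randn(4) for _ in range(100)]
batched = torch.stack(data)         # a new dim 0 is created: (100, 4)

print(merged.shape)   # torch.Size([256, 4])
print(batched.shape)  # torch.Size([100, 4])
```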
Addendum: cat, stack, transpose, permute, squeeze/unsqueeze in pytorch
PyTorch provides a set of commonly used tensor transformation operations.
Cat: concatenation
Splices the data along a given dimension; the number of dimensions stays the same after cat.
For example, the following code splices two 2-D tensors (one 2x3 and one 1x3); after splicing along dim 0 the result is a 3x3 tensor, still 2-D.
The code is as follows:
import torch
torch.manual_seed(1)
x = torch.randn(2, 3)
y = torch.randn(1, 3)
print(x, y)
Results:
 0.6614  0.2669  0.0617
 0.6213 -0.4519 -0.1661
[torch.FloatTensor of size 2x3]

-1.5228  0.3817 -1.0276
[torch.FloatTensor of size 1x3]
Concatenate the two tensors:
torch.cat((x, y), 0)
Results:
 0.6614  0.2669  0.0617
 0.6213 -0.4519 -0.1661
-1.5228  0.3817 -1.0276
[torch.FloatTensor of size 3x3]
More flexible splicing:
torch.manual_seed(1)
x = torch.randn(2, 3)
print(x)
print(torch.cat((x, x), 0))
print(torch.cat((x, x), 1))
Result
// x
 0.6614  0.2669  0.0617
 0.6213 -0.4519 -0.1661
[torch.FloatTensor of size 2x3]

// torch.cat((x, x), 0)
 0.6614  0.2669  0.0617
 0.6213 -0.4519 -0.1661
 0.6614  0.2669  0.0617
 0.6213 -0.4519 -0.1661
[torch.FloatTensor of size 4x3]

// torch.cat((x, x), 1)
 0.6614  0.2669  0.0617  0.6614  0.2669  0.0617
 0.6213 -0.4519 -0.1661  0.6213 -0.4519 -0.1661
[torch.FloatTensor of size 2x6]
Stack: stacking along a new dimension
Stack, by contrast, adds a new dimension. If two 1x2 tensors are stacked on dim 0, the result is a 2x1x2 tensor; stacked on dim 1, it becomes a 1x2x2 tensor.
See the code:
a = torch.ones([1, 2])
b = torch.ones([1, 2])
c = torch.stack([a, b], 0)  # stack along dim 0
Output:
(0 ,.,.) =
  1  1

(1 ,.,.) =
  1  1
[torch.FloatTensor of size 2x1x2]
c = torch.stack([a, b], 1)  # stack along dim 1
Output:
(0 ,.,.) =
  1  1
  1  1
[torch.FloatTensor of size 1x2x2]
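As a sanity check (my addition, not from the original text): stacking along a dimension is equivalent to unsqueezing each tensor at that dimension and concatenating there:

```python
import torch

a = torch.ones(1, 2)
b = torch.zeros(1, 2)

s = torch.stack([a, b], 1)                          # shape (1, 2, 2)
c = torch.cat([a.unsqueeze(1), b.unsqueeze(1)], 1)  # same result
print(torch.equal(s, c))  # True
print(s.shape)            # torch.Size([1, 2, 2])
```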
Transpose: swap two dimensions
The code is as follows:
torch.manual_seed(1)
x = torch.randn(2, 3)
print(x)
The original result of x:
 0.6614  0.2669  0.0617
 0.6213 -0.4519 -0.1661
[torch.FloatTensor of size 2x3]
Interchange the dimensions of x
x.transpose(0, 1)
Result
 0.6614  0.6213
 0.2669 -0.4519
 0.0617 -0.1661
[torch.FloatTensor of size 3x2]
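A related pit worth flagging here (my addition): transpose returns a view over the same storage, and that view is usually non-contiguous, so a subsequent .view() needs .contiguous() first:

```python
import torch

x = torch.randn(2, 3)
xt = x.transpose(0, 1)     # a view sharing x's storage
print(xt.is_contiguous())  # False

# xt.view(6) would raise a RuntimeError here;
# copy the data into contiguous memory first
flat = xt.contiguous().view(6)
print(flat.shape)          # torch.Size([6])
```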
Permute: multi-dimension interchange, a more flexible transpose
Permute is a more flexible transpose: it can rearrange all the dimensions of the original data at once, while the data itself remains unchanged.
The code is as follows:
x = torch.randn(2, 3, 4)
print(x.size())
x_p = x.permute(1, 0, 2)  # the original dim 1 moves to position 0, dim 0 to position 1, dim 2 stays
print(x_p.size())
Results:
torch.Size([2, 3, 4])
torch.Size([3, 2, 4])
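A typical use of permute (an illustrative sketch; the channels-last layout and sizes are my assumption, not from the text): converting an image batch from NHWC to NCHW:

```python
import torch

imgs = torch.randn(8, 32, 32, 3)     # N, H, W, C (channels last)
imgs_chw = imgs.permute(0, 3, 1, 2)  # reorder to N, C, H, W
print(imgs_chw.shape)                # torch.Size([8, 3, 32, 32])
```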
Squeeze and unsqueeze
They are often used to add or remove dimensions, for example adding a batch dimension of size 1 when there is none.
squeeze(dim_n) removes dimension dim_n, i.e. drops that dimension if it holds exactly one element.
unsqueeze(dim_n) inserts a new dimension of size 1 at position dim_n.
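For the batch-dimension case mentioned above, a minimal sketch (the sizes are illustrative):

```python
import torch

img = torch.randn(3, 32, 32)  # C, H, W -- no batch dimension yet
batch = img.unsqueeze(0)      # insert batch dim of size 1: (1, 3, 32, 32)
back = batch.squeeze(0)       # drop it again: (3, 32, 32)
print(batch.shape)            # torch.Size([1, 3, 32, 32])
print(back.shape)             # torch.Size([3, 32, 32])
```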
The code:
# define the tensor
import torch
b = torch.Tensor(2, 1)
b.shape
Out[28]: torch.Size([2, 1])
# without an argument, remove every dimension that has a single element
b_ = b.squeeze()
b_.shape
Out[30]: torch.Size([2])
# with an argument: trying to remove dim 0 has no effect,
# because dim 0 holds two elements
b_ = b.squeeze(0)
b_.shape
Out[32]: torch.Size([2, 1])
# this one works, since dim 1 has a single element
b_ = b.squeeze(1)
b_.shape
Out[34]: torch.Size([2])
# add a dimension
b_ = b.unsqueeze(2)
b_.shape
Out[36]: torch.Size([2, 1, 1])

The above is all the content of the article "what are the common pits in pytorch?" Thank you for reading! I hope the shared content helps you; if you want to learn more, welcome to follow the industry information channel!