What are the common operations of Tensor in PyTorch

2025-03-31 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article walks through the common operations on Tensor in PyTorch. The content is straightforward and clearly organized, and we hope it helps resolve your doubts. Follow along with the editor as we work through each one.

1. Creating a Tensor the classic way

import torch

# place the tensor on the GPU
device = torch.device("cuda:0")
x = torch.tensor([1, 2], dtype=torch.float32, device=device, requires_grad=True)
w = sum(2 * x)
w.backward()
print(x.device)
print(x.dtype)
print(x.grad)

# torch.Tensor
y = torch.Tensor([1, 2, 3])  # equivalent to y = torch.FloatTensor([1, 2, 3]), a 32-bit floating-point tensor
y.requires_grad = True  # this constructor takes no requires_grad argument, so enable the gradient afterwards
# other commonly used constructors: torch.LongTensor, torch.ShortTensor, torch.IntTensor
w = sum(2 * y)
w.backward()
print(y.grad)
print(y.dtype)

Output:

cuda:0
torch.float32
tensor([2., 2.], device='cuda:0')
tensor([2., 2., 2.])
torch.float32
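As a quick sanity check on the gradient values printed above, here is a minimal sketch run on the CPU (so no CUDA device is assumed): since w = sum(2 * x), the derivative with respect to every element is 2, which is exactly what x.grad shows.

```python
import torch

# w = sum(2 * x), so dw/dx_i = 2 for every element of x.
x = torch.tensor([1.0, 2.0], requires_grad=True)
w = (2 * x).sum()
w.backward()
print(x.grad)  # tensor([2., 2.])
```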

Creation method similar to numpy

x = torch.linspace(1, 10, 10, dtype=torch.float32, requires_grad=True)
y = torch.ones(10)
z = torch.zeros((2, 4))
w = torch.randn((2, 3))  # samples from the standard normal distribution (mean 0, variance 1), i.e. Gaussian noise; rand samples uniformly instead
# see also torch.normal()
print(x)
print(y)
print(z)
print(w)

Output:

tensor([ 1.,  2.,  3.,  4.,  5.,  6.,  7.,  8.,  9., 10.], requires_grad=True)
tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1.])
tensor([[0., 0., 0., 0.],
        [0., 0., 0., 0.]])
tensor([[-0.6505,  1.3897,  2.2265],
        [-1.7815, -1.8194, -0.4143]])
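The code above leaves a question mark next to torch.normal(); as a brief sketch of the difference, torch.normal(mean, std) draws each element from a normal distribution with that element's own mean and standard deviation, whereas randn always uses mean 0 and std 1, and rand samples uniformly on [0, 1).

```python
import torch

# torch.normal(mean, std): per-element mean and std tensors.
a = torch.normal(mean=torch.zeros(2, 3), std=torch.ones(2, 3))
b = torch.rand(2, 3)  # uniform samples
print(a.shape)  # torch.Size([2, 3])
assert (b >= 0).all() and (b < 1).all()  # rand stays inside [0, 1)
```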

Convert from numpy

import numpy as np

np_data = np.arange(2, 13, 2).reshape((2, 3))
torch_data = torch.from_numpy(np_data)  # numpy to tensor
print('numpy', np_data)
print('torch', torch_data)

Output:

numpy [[ 2  4  6]
 [ 8 10 12]]
torch tensor([[ 2,  4,  6],
        [ 8, 10, 12]], dtype=torch.int32)
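One point worth knowing about torch.from_numpy, not stated in the original but documented PyTorch behavior: the resulting tensor shares memory with the source array, so mutating the numpy array is visible through the tensor.

```python
import numpy as np
import torch

# from_numpy does not copy: the tensor is a view over the numpy buffer.
a = np.zeros(3)
t = torch.from_numpy(a)
a[0] = 7.0   # mutate the numpy array...
print(t)     # ...and the change shows up in the tensor
```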

2. Combining tensors

import torch

x = torch.arange(0, 10, 1).reshape(2, 5)  # size = (2, 5)
y = torch.ones(10).reshape(2, 5)          # size = (2, 5)
print(x)
print(y)
w = torch.cat((x, y), dim=0)  # dim counts from the leftmost axis of the size by default; here the result is (2+2, 5) = (4, 5)
z = torch.cat((x, y), dim=1)  # (2, 10)
print(w, w.size())
print(z, z.size())
# see also stack()

Output:

tensor([[0, 1, 2, 3, 4],
        [5, 6, 7, 8, 9]])
tensor([[1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1.]])
tensor([[0., 1., 2., 3., 4.],
        [5., 6., 7., 8., 9.],
        [1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1.]]) torch.Size([4, 5])
tensor([[0., 1., 2., 3., 4., 1., 1., 1., 1., 1.],
        [5., 6., 7., 8., 9., 1., 1., 1., 1., 1.]]) torch.Size([2, 10])
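The code comment points at stack() without showing it; as a minimal sketch of the difference, stack joins tensors along a new dimension, so two (2, 5) tensors become a single (2, 2, 5) tensor rather than cat's (4, 5).

```python
import torch

# stack inserts a new dim=0: shape (2, 2, 5) instead of cat's (4, 5).
x = torch.arange(0, 10).reshape(2, 5).float()
y = torch.ones(10).reshape(2, 5)
s = torch.stack((x, y), dim=0)
print(s.size())  # torch.Size([2, 2, 5])
```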

3. Data type conversion

Method one

x = torch.rand((2, 2), dtype=torch.float32)
print(x.dtype)
x = x.double()
print(x.dtype)
x = x.int()
print(x)

Output:

torch.float32
torch.float64
tensor([[0, 0],
        [0, 0]], dtype=torch.int32)

Method two

x = torch.LongTensor((2, 2))
print(x.dtype)
x = x.type(torch.float32)
print(x.dtype)

Output:

torch.int64

torch.float32
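A third conversion route, not shown in the original but standard PyTorch API: Tensor.to() accepts a dtype (or a device) and returns a converted copy.

```python
import torch

# to() is the most general converter: it handles dtype and device changes.
x = torch.ones(2, 2, dtype=torch.int64)
y = x.to(torch.float32)
print(y.dtype)  # torch.float32
```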

4. Matrix operations

x = torch.arange(0, 4).reshape(2, 2)
print(x)
print(x * x)           # element-wise multiplication
print(torch.mm(x, x))  # matrix multiplication
print(x + 1)           # broadcasting
print(x.numpy())       # convert to numpy

Output:

tensor([[0, 1],
        [2, 3]])
tensor([[0, 1],
        [4, 9]])
tensor([[ 2,  3],
        [ 6, 11]])
tensor([[1, 2],
        [3, 4]])
[[0 1]
 [2 3]]
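The x + 1 line above relies on broadcasting; as a slightly richer sketch, a (2,) row vector added to a (2, 2) matrix is broadcast across both rows.

```python
import torch

# (2, 2) + (2,): the row vector is added to every row of the matrix.
x = torch.arange(0, 4).reshape(2, 2)   # [[0, 1], [2, 3]]
row = torch.tensor([10, 20])
print(x + row)  # [[10, 21], [12, 23]]
```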

5. Dimension change

These operations mainly add or remove dimensions of size 1.
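A minimal runnable sketch of both operations:

```python
import torch

x = torch.zeros(2, 3)
y = torch.unsqueeze(x, 0)  # insert a size-1 dim at position 0 -> (1, 2, 3)
z = torch.squeeze(y)       # drop all size-1 dims -> back to (2, 3)
print(y.shape, z.shape)    # torch.Size([1, 2, 3]) torch.Size([2, 3])
```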

torch.squeeze(input)         # remove the dimensions of size 1
torch.unsqueeze(input, dim)  # insert a dimension of size 1 at the specified position

That is all the content of "What are the common operations of Tensor in PyTorch". Thank you for reading! We hope this article has helped you; to learn more, welcome to follow the industry information channel!

© 2024 shulou.com SLNews company. All rights reserved.
