2025-03-26 Update From: SLTechnology News&Howtos
This article introduces how to use the Tensor class in PyTorch. It is fairly detailed and should be a useful reference; interested readers are encouraged to read it through to the end.
1. Tensors
Tensors are similar to NumPy's ndarrays; the difference is that Tensors can also be used on a GPU to accelerate computation.
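As a quick illustrative sketch (the variable names here are my own), the two APIs mirror each other closely for everyday operations:

```python
import numpy as np
import torch

# The same small computation, once with NumPy and once with PyTorch.
n = np.arange(6, dtype=np.float32).reshape(2, 3)
t = torch.arange(6, dtype=torch.float32).reshape(2, 3)

print(n.sum())         # 15.0
print(t.sum().item())  # 15.0 -- nearly identical APIs
```

The tensor version can later be moved to a GPU without changing the surrounding code, which is the practical payoff of this similarity.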
Import packages:
from __future__ import print_function
import torch
Construct an uninitialized 5x3 matrix:
x = torch.empty(5, 3)
print(x)
Out:
tensor([[ 1.4395e-36,  4.5848e-41,  1.4395e-36],
        [ 4.5848e-41,  1.4395e-36,  4.5848e-41],
        [ 1.4395e-36,  4.5848e-41,  2.8026e-45],
        [-1.9501e+00,  8.5165e+23,  0.0000e+00],
        [ 2.5223e-43,  0.0000e+00,  0.0000e+00]])
Construct a randomly initialized matrix:
x = torch.rand(5, 3)
print(x)
Out
tensor([[0.8074, 0.9175, 0.8109],
        [0.3313, 0.5902, 0.9179],
        [0.6562, 0.3283, 0.9798],
        [0.8218, 0.0817, 0.4454],
        [0.5934, 0.0040, 0.3411]])
Construct a zero-initialized matrix with data type long:
x = torch.zeros(5, 3, dtype=torch.long)
print(x)
Out
tensor([[0, 0, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0]])
Construct a tensor directly from data:
x = torch.tensor([5.5, 3])
print(x)
Out
tensor([5.5000, 3.0000])
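As a small supplementary sketch (the variable names are mine), torch.tensor infers the dtype from the data it is given, and you can override the inference explicitly:

```python
import torch

a = torch.tensor([5.5, 3])                     # floats -> torch.float32 inferred
b = torch.tensor([5, 3])                       # ints -> torch.int64 inferred
c = torch.tensor([5, 3], dtype=torch.float64)  # explicit dtype override

print(a.dtype, b.dtype, c.dtype)
```

This matters later when mixing tensors in arithmetic, since operations on mismatched dtypes may promote or fail.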
A new tensor can be created on the basis of an existing tensor; it inherits the shape and dtype attributes of the original tensor unless we explicitly override them.
x = x.new_ones(5, 3, dtype=torch.double)
print(x)
x = torch.randn_like(x, dtype=torch.float)
print(x)
Out
tensor([[1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.]], dtype=torch.float64)
tensor([[-0.0730, -0.0716, -0.8259],
        [-1.7004,  0.8790, -0.0659],
        [-0.8969,  0.8736, -0.6035],
        [-0.1539, -2.9178, -0.7456],
        [-0.0245,  0.4075,  1.4904]])
Get the size of a tensor:
print(x.size())
Out
torch.Size([5, 3])
torch.Size is in fact a tuple, so it supports all tuple operations.
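A brief sketch of what "supports all tuple operations" means in practice (variable names are my own):

```python
import torch

x = torch.zeros(5, 3)
size = x.size()          # torch.Size([5, 3])

rows, cols = size        # tuple unpacking works
print(rows, cols)        # 5 3
print(len(size))         # 2
print(size == (5, 3))    # True: compares equal to a plain tuple
```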
2. Four ways to perform Tensor addition
Method 1:
print(x + y)
Method 2:
print(torch.add(x, y))
Method 3: write the output to an additional tensor:
result = torch.empty(5, 3)
torch.add(x, y, out=result)
print(result)
Method 4: in-place replacement; the result is stored in y:
y.add_(x)
print(y)
All operations that modify a tensor in place have a trailing underscore; for example, x.copy_(y) and x.t_() both change x.
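A minimal sketch of the in-place naming convention (values chosen only for illustration):

```python
import torch

x = torch.ones(2, 2)
y = torch.full((2, 2), 3.0)

x.add_(y)     # trailing underscore: adds y to x in place
print(x)      # every entry is now 4

x.copy_(y)    # copies y's values into x in place
x.t_()        # transposes x in place
print(x)      # every entry is now 3
```

The underscore-free counterparts (x.add(y), x.t()) return new tensors and leave x untouched.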
You can use standard NumPy-like indexing:
print(x[:, 1])
Out
tensor([-0.0716,  0.8790,  0.8736, -2.9178,  0.4075])
Use torch.view to change the shape of a tensor:
x = torch.randn(4, 4)
y = x.view(16)
z = x.view(-1, 8)  # the size -1 is inferred from the other dimensions
print(x.size(), y.size(), z.size())
Out
torch.Size([4, 4]) torch.Size([16]) torch.Size([2, 8])
For a one-element tensor, use .item() to get its value as a Python number:
x = torch.randn(1)
print(x)
print(x.item())
Converting between a Torch Tensor and a NumPy array:
a = torch.ones(5)
print(a)
Out
tensor([1., 1., 1., 1., 1.])
b = a.numpy()
print(b)
Out:
[1. 1. 1. 1. 1.]
Changing the value of the tensor will also change the value of the NumPy array:
a.add_(1)
print(a)
print(b)
Out
tensor([2., 2., 2., 2., 2.])
[2. 2. 2. 2. 2.]
Converting a NumPy array into a PyTorch Tensor:
import numpy as np
a = np.ones(5)
b = torch.from_numpy(a)
np.add(a, 1, out=a)
print(a)
print(b)
Out
[2. 2. 2. 2. 2.]
tensor([2., 2., 2., 2., 2.], dtype=torch.float64)
All tensors on the CPU support conversion to and from NumPy, except CharTensor.
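A minimal sketch (variable names are my own) of this bridge in both directions, including the shared-memory behavior shown above:

```python
import numpy as np
import torch

# CPU tensor -> ndarray: the two share the same underlying memory.
t = torch.ones(3)
n = t.numpy()
t.add_(1)
print(n)   # [2. 2. 2.] -- the ndarray saw the in-place update

# ndarray -> CPU tensor: memory is shared here too.
a = np.zeros(3)
b = torch.from_numpy(a)
a += 1
print(b)   # tensor([1., 1., 1.], dtype=torch.float64)
```

Because memory is shared, neither direction copies data; an in-place change on one side is immediately visible on the other.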
3. CUDA Tensors
Tensors can be moved onto any device using the .to method.
# run this cell only if CUDA is available
if torch.cuda.is_available():
    device = torch.device("cuda")
    y = torch.ones_like(x, device=device)  # directly create a tensor on the GPU
    x = x.to(device)
    z = x + y
    print(z)
    print(z.to("cpu", torch.double))  # .to can also change the dtype
Out
tensor([-1.0620], device='cuda:0')
tensor([-1.0620], dtype=torch.float64)
That is all of the content of this article, "How to use the Tensor of PyTorch". Thank you for reading! I hope the content shared here helps you; for more related knowledge, welcome to follow the industry information channel!