This article shows you how to get started with Tensors. The content is concise and easy to understand, and I hope the detailed walkthrough below gives you something useful.
1. Getting started
1.1 Tensors
Tensors are similar to NumPy's ndarrays; in addition, Tensors can also be used on a GPU to accelerate computation.
from __future__ import print_function
import torch
Construct an uninitialized 5x3 matrix:
x = torch.empty(5, 3)
print(x)
Output:
tensor([[-9.0198e-17,  4.5633e-41, -2.9021e-15],
        [ 4.5633e-41,  0.0000e+00,  0.0000e+00],
        [ 0.0000e+00,  0.0000e+00,  0.0000e+00],
        [ 0.0000e+00,  0.0000e+00,  0.0000e+00],
        [ 0.0000e+00,  0.0000e+00,  0.0000e+00]])
Construct a randomly initialized matrix:
x = torch.rand(5, 3)
print(x)
Output:
tensor([[0.1525, 0.7689, 0.5664],
        [0.7688, 0.0039, 0.4129],
        [0.9979, 0.3479, 0.2767],
        [0.9580, 0.9492, 0.6265],
        [0.2716, 0.6627, 0.3248]])
Construct a 5x3 matrix filled with zeros, with dtype long (64-bit integer):
x = torch.zeros(5, 3, dtype=torch.long)
print(x)
Output:
tensor([[0, 0, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0]])
Construct a tensor directly from data:
x = torch.tensor([5.5, 3])
print(x)
Output:
tensor([5.5000, 3.0000])
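Note that torch.tensor infers the dtype from the data. A minimal added sketch (these two lines are an illustration, not part of the original tutorial):
print(torch.tensor([5.5, 3]).dtype)  # torch.float32: a float in the data makes the tensor floating point
print(torch.tensor([5, 3]).dtype)    # torch.int64: all-integer data stays integral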
Or create a new tensor based on an existing tensor. These methods reuse the attributes of the input tensor, such as dtype, unless the user provides new values:
x = x.new_ones(5, 3, dtype=torch.double)    # new_* methods take in sizes
print(x)
x = torch.randn_like(x, dtype=torch.float)  # override the dtype
print(x)                                    # the result has the same size
Output:
tensor([[1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.]], dtype=torch.float64)
tensor([[ 0.4228,  0.3279,  0.6367],
        [ 0.9233, -0.5232, -0.6494],
        [-0.1946,  1.7199, -0.1954],
        [ 0.1222,  0.7204, -1.3328],
        [ 0.1230, -0.5800,  0.4562]])
Get its size:
print(x.size())
Output:
torch.Size([5, 3])
[Note: torch.Size is in fact a tuple, so it supports all tuple operations.]
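For example, since torch.Size behaves like a tuple, it can be unpacked and indexed directly (a small added illustration, not from the original tutorial):
rows, cols = x.size()  # tuple unpacking
print(rows, cols)      # 5 3
print(x.size()[0])     # indexing also works: 5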
1.2 Operations
There are multiple syntaxes for tensor operations. In the following examples, we take addition as the first example.
Addition operation: syntax 1
y = torch.rand(5, 3)
print(x + y)
Output:
tensor([[ 0.0732,  0.9384, -0.2489],
        [-0.6905,  2.1267,  3.0045],
        [ 0.6199,  0.4936, -0.0398],
        [-2.0623, -0.5140,  1.6162],
        [ 0.3189, -0.0327, -0.5353]])
Addition operation: syntax 2
print(torch.add(x, y))
Output:
tensor([[ 0.0732,  0.9384, -0.2489],
        [-0.6905,  2.1267,  3.0045],
        [ 0.6199,  0.4936, -0.0398],
        [-2.0623, -0.5140,  1.6162],
        [ 0.3189, -0.0327, -0.5353]])
Addition operation: providing an output tensor as an argument
result = torch.empty(5, 3)
torch.add(x, y, out=result)
print(result)
Output:
tensor([[ 0.0732,  0.9384, -0.2489],
        [-0.6905,  2.1267,  3.0045],
        [ 0.6199,  0.4936, -0.0398],
        [-2.0623, -0.5140,  1.6162],
        [ 0.3189, -0.0327, -0.5353]])
Addition operation: in-place
# adds x to y
y.add_(x)
print(y)
Output:
tensor([[ 0.0732,  0.9384, -0.2489],
        [-0.6905,  2.1267,  3.0045],
        [ 0.6199,  0.4936, -0.0398],
        [-2.0623, -0.5140,  1.6162],
        [ 0.3189, -0.0327, -0.5353]])
[Note: any operation that mutates a tensor in place is post-fixed with an underscore (_). For example, x.copy_(y) and x.t_() will change x.]
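A minimal sketch of the underscore convention, using the copy_ and t_ methods mentioned above (the variable names here are illustrative):
a = torch.zeros(2, 3)
b = torch.ones(2, 3)
a.copy_(b)       # copies b's values into a, in place
a.t_()           # transposes a in place; a is now 3x2
print(a.size())  # torch.Size([3, 2])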
You can use standard NumPy-style indexing, with all the usual bells and whistles:
print(x[:, 1])
Output:
tensor([ 0.3279, -0.5232,  1.7199,  0.7204, -0.5800])
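A few more indexing patterns, added here as an illustration (not from the original text; all use standard indexing on the 5x3 tensor x):
print(x[0])        # first row
print(x[1:3, :2])  # rows 1-2, first two columns
print(x[x > 0])    # boolean mask: all positive elements, flattened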
Resizing: if you want to resize/reshape a tensor, you can use torch.view:
x = torch.randn(4, 4)
y = x.view(16)
z = x.view(-1, 8)  # the size -1 is inferred from the other dimensions
print(x.size(), y.size(), z.size())
Output:
torch.Size([4, 4]) torch.Size([16]) torch.Size([2, 8])
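The -1 works because view must preserve the total number of elements, so exactly one dimension can be left for PyTorch to infer. A small added sketch:
w = x.view(8, -1)  # 16 elements, so the -1 is inferred as 2
print(w.size())    # torch.Size([8, 2])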
If you have a one-element tensor, use .item() to get its value as a Python number:
x = torch.randn(1)
print(x)
print(x.item())
Output:
tensor([0.1550])
0.15495021641254425
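For tensors with more than one element, .item() raises an error; the documented .tolist() method converts to a (nested) Python list instead. A small added example:
v = torch.tensor([1.0, 2.0])
print(v.tolist())  # [1.0, 2.0]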
[Read later: 100+ tensor operations, including transposing, indexing, slicing, mathematical operations, linear algebra, random numbers, etc., are described in detail at https://pytorch.org/docs/torch.]
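As a quick taste of those operations, here is a small added sketch using only well-documented calls:
m = torch.randn(3, 4)
print(m.t().size())        # transpose: torch.Size([4, 3])
print(torch.mm(m, m.t()))  # matrix multiplication: 3x3 result
print(m.sum(), m.mean())   # reductions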
2. NumPy bridge
Converting a Torch Tensor to a NumPy array is a breeze (and vice versa). The Torch Tensor and the NumPy array share their underlying memory locations, so changing one will change the other.
2.1 Converting a Torch Tensor to a NumPy array
a = torch.ones(5)
print(a)
Output:
tensor([1., 1., 1., 1., 1.])
b = a.numpy()
print(b)
Output:
[1. 1. 1. 1. 1.]
See how the NumPy array's values change:
a.add_(1)
print(a)
print(b)
Output:
tensor([2., 2., 2., 2., 2.])
[2. 2. 2. 2. 2.]
2.2 Converting a NumPy array to a Torch Tensor
See how changing the NumPy array changes the Torch Tensor automatically:
import numpy as np
a = np.ones(5)
b = torch.from_numpy(a)
np.add(a, 1, out=a)
print(a)
print(b)
Output:
[2. 2. 2. 2. 2.]
tensor([2., 2., 2., 2., 2.], dtype=torch.float64)
All Tensors on the CPU except CharTensor support converting to a NumPy array and back.
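A minimal round-trip sketch (an added illustration):
t = torch.arange(4, dtype=torch.float32)  # a CPU tensor
n = t.numpy()                             # Tensor -> NumPy (shares memory)
t2 = torch.from_numpy(n)                  # NumPy -> Tensor (shares memory)
print(t2.equal(t))                        # True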
3. CUDA Tensors
Tensors can be moved onto any device using the .to method.
# run this cell only if CUDA is available
# We use torch.device objects to move tensors onto and off the GPU
if torch.cuda.is_available():
    device = torch.device("cuda")          # a CUDA device object
    y = torch.ones_like(x, device=device)  # directly create a tensor on the GPU
    x = x.to(device)                       # or just use the string .to("cuda")
    z = x + y
    print(z)
    print(z.to("cpu", torch.double))       # .to can also change the dtype
Output:
tensor([2.4519], device='cuda:0')
tensor([2.4519], dtype=torch.float64)
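A common device-agnostic pattern, added as an illustration (it uses only the calls shown above):
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.randn(1, device=device)  # works the same on CPU and GPU
print(x.device)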
Total run time of the script: (0 minutes 6.338 seconds)
The above is how to get started with Tensors. I hope you have picked up some useful knowledge or skills from this walkthrough.