What if PyTorch square root reports an error?

2025-03-26 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article mainly introduces what to do when taking a square root in PyTorch reports an error. It has some reference value; interested friends can refer to it. I hope you learn a lot after reading; below, let the editor take you through it.

Problem description

Initially, I used PyTorch to calculate a square root: create a tensor from range(), then take its square root.

import torch
a = torch.tensor(list(range(9)))
b = torch.sqrt(a)

The following error is reported:

RuntimeError: sqrt_vml_cpu not implemented for 'Long'

Reason

Data of type Long does not support sqrt. So why is tensor a of type Long? Because Python uses int by default when creating the list, and the data type becomes Long (int64) after converting from list to torch.Tensor.

print(a.dtype)

torch.int64

Solution method

Specify the data type as floating point in advance and re-execute:

b = torch.sqrt(a.to(torch.double))
print(b)

tensor([0.0000, 1.0000, 1.4142, 1.7321, 2.0000, 2.2361, 2.4495, 2.6458, 2.8284], dtype=torch.float64)
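As an alternative (a minimal sketch, assuming you are free to choose how the tensor is created), you can build the tensor with a floating-point dtype from the start, so no conversion is needed:

```python
import torch

# torch.arange with an explicit float dtype avoids the Long tensor entirely,
# so torch.sqrt works directly (variable names here are illustrative).
a = torch.arange(9, dtype=torch.float32)
b = torch.sqrt(a)
print(b.dtype)  # torch.float32
```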

Supplement: a detailed explanation of 20 common operations in PyTorch

Matrix and scalar

This is where each element of a matrix (tensor) operates with a scalar.

import torch
a = torch.tensor([1, 2])
print(a + 1)
>>> tensor([2, 3])

Hadamard product

This is the multiplication of two tensors of the same size, where corresponding elements are multiplied: this is the Hadamard product, also called element-wise multiplication.

a = torch.tensor([1, 2])
b = torch.tensor([2, 3])
print(a * b)
print(torch.mul(a, b))
>>> tensor([2, 6])
>>> tensor([2, 6])

torch.mul() and * are equivalent.

Of course, division is similar:

a = torch.tensor([1., 2.])
b = torch.tensor([2., 3.])
print(a / b)
print(torch.div(a, b))
>>> tensor([0.5000, 0.6667])
>>> tensor([0.5000, 0.6667])

As we can see, torch.div() is just /. Similarly, torch.add() is + and torch.sub() is -, but the operator forms are simpler and more commonly used.
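To make those equivalences concrete, here is a small sketch (variable names are illustrative) checking each function call against its operator form:

```python
import torch

a = torch.tensor([1., 2.])
b = torch.tensor([2., 3.])

# Each function is equivalent to the corresponding operator.
print(torch.add(a, b), a + b)  # tensor([3., 5.]) twice
print(torch.sub(a, b), a - b)  # tensor([-1., -1.]) twice
print(torch.mul(a, b), a * b)  # tensor([2., 6.]) twice
print(torch.div(a, b), a / b)  # tensor([0.5000, 0.6667]) twice
```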

Matrix multiplication

What if we want to multiply matrices in linear algebra?

This operation can be written in three ways:

torch.mm()

torch.matmul()

@ (this one needs to be remembered, otherwise it can be quite confusing when you encounter it)

a = torch.tensor([[1.], [2.]])
b = torch.tensor([2., 3.]).view(1, 2)
print(torch.mm(a, b))
print(torch.matmul(a, b))
print(a @ b)

This is for two-dimensional matrices. If the operands are multidimensional tensors, only torch.matmul() can be used. Wait, how do you matrix-multiply multidimensional tensors? In a multidimensional tensor, only the last two dimensions participate in the matrix operation; the earlier dimensions act like an index, for example:

a = torch.rand((1, 2, 64, 32))
b = torch.rand((1, 2, 32, 64))
print(torch.matmul(a, b).shape)
>>> torch.Size([1, 2, 64, 64])

a = torch.rand((3, 2, 64, 32))
b = torch.rand((1, 2, 32, 64))
print(torch.matmul(a, b).shape)
>>> torch.Size([3, 2, 64, 64])

This also multiplies, because of the automatic Broadcasting mechanism, which will be discussed later. In this case, the first dimension of b is copied three times so that it matches the size of a, and then the matrix multiplication proceeds.
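A small sketch of that broadcasting claim (reusing the shapes from the example above): broadcasting b should give the same result as explicitly repeating its first dimension three times:

```python
import torch

# b's leading dimension of size 1 is broadcast to match a's leading dimension.
a = torch.rand(3, 2, 64, 32)
b = torch.rand(1, 2, 32, 64)
out = torch.matmul(a, b)
print(out.shape)  # torch.Size([3, 2, 64, 64])

# Broadcasting matches an explicit expansion of b along dim 0.
explicit = torch.matmul(a, b.expand(3, 2, 32, 64))
print(torch.allclose(out, explicit))  # True
```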

Power and square

print('exponentiation')
a = torch.tensor([1., 2.])
b = torch.tensor([2., 3.])
c1 = a ** b
c2 = torch.pow(a, b)
print(c1, c2)
>>> tensor([1., 8.]) tensor([1., 8.])

Same as above, nothing more to say. The square root can be computed with torch.sqrt(), or of course with a ** 0.5.
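A quick sketch (sample values are illustrative) confirming that torch.sqrt() and raising to the power 0.5 agree element-wise:

```python
import torch

a = torch.tensor([1., 2., 4., 9.])
# Both forms compute the element-wise square root.
print(torch.sqrt(a))  # tensor([1.0000, 1.4142, 2.0000, 3.0000])
print(a ** 0.5)
```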

Logarithmic operation

When we were at school, we wrote ln for the base-e logarithm, but PyTorch does not use that notation.

In PyTorch, log is the natural logarithm (base e), while log2 and log10 are base 2 and base 10.

import numpy as np
print('logarithmic operation')
a = torch.tensor([2., 10., np.e])
print(torch.log(a))
print(torch.log2(a))
print(torch.log10(a))
>>> tensor([0.6931, 2.3026, 1.0000])
>>> tensor([1.0000, 3.3219, 1.4427])
>>> tensor([0.3010, 1.0000, 0.4343])

Approximation operations

.ceil() rounds up

.floor() rounds down

.trunc() takes the integer part

.frac() takes the fractional part

.round() rounds to the nearest integer

a = torch.tensor(1.2345)
print(a.ceil())
>>> tensor(2.)
print(a.floor())
>>> tensor(1.)
print(a.trunc())
>>> tensor(1.)
print(a.frac())
>>> tensor(0.2345)
print(a.round())
>>> tensor(1.)

Clipping operation

This limits a number to a range [min, max] that you set: values less than min become min, and values greater than max become max. This operation is used in some generative adversarial networks, for example the weight clipping in the original WGAN, to forcibly limit the values of the model's parameters.
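A minimal sketch of that kind of weight clipping (the critic model and the clip value c here are hypothetical, not taken from any particular paper's code): after each update, every parameter is clamped into [-c, c]:

```python
import torch
import torch.nn as nn

# Hypothetical one-layer "critic"; in practice this would be your model.
critic = nn.Linear(4, 1)
c = 0.01  # illustrative clip bound

# Clamp every parameter in place; no_grad so autograd does not track it.
with torch.no_grad():
    for p in critic.parameters():
        p.clamp_(-c, c)

print(all((p.abs() <= c).all() for p in critic.parameters()))  # True
```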

a = torch.rand(5)
print(a)
print(a.clamp(0.3, 0.7))

Thank you for reading this article carefully. I hope the article "What if PyTorch square root reports an error?" shared by the editor is helpful to everyone.
