What problems should be paid attention to when switching between Numpy and Pytorch

2025-02-24 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article shows the problems you need to pay attention to when converting between Numpy and Pytorch data. The content is simple and clear; I hope it helps resolve your doubts as we study "what problems need to be paid attention to when you switch between Numpy and Pytorch" together.

1.1 Numpy --> torch

Use the torch.from_numpy() conversion; note that the two share memory. An example follows:

import torch
import numpy as np

a = np.array([1, 2, 3])
b = torch.from_numpy(a)
np.add(a, 1, out=a)
print('converted a:', a)
print('converted b:', b)

# output:
# converted a: [2 3 4]
# converted b: tensor([2, 3, 4], dtype=torch.int32)

1.2 torch --> numpy

Use the .numpy() conversion; again, the two share memory. An example follows:

import torch
import numpy as np

a = torch.zeros((2, 3), dtype=torch.float)
c = a.numpy()
np.add(c, 1, out=c)
print(a)
print(c)

# output:
# a: tensor([[1., 1., 1.],
#            [1., 1., 1.]])
# c: [[1. 1. 1.]
#    [1. 1. 1.]]

It is important to note that if you change np.add(c, 1, out=c) in the program to c = c + 1, the two no longer appear to share memory. But the shared buffer was not broken in place: c = c + 1 allocates a new array and rebinds the name c to it, so c now refers to a different storage address while the tensor keeps the old buffer. You can use id(c) to confirm that the object c refers to has changed.
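To make the distinction concrete, here is a minimal sketch (assuming a recent NumPy and PyTorch) comparing an in-place update against rebinding the name:

```python
import numpy as np
import torch

a = torch.zeros((2, 3), dtype=torch.float)
c = a.numpy()

# In-place update: writes into the shared buffer, so the tensor sees it too.
np.add(c, 1, out=c)
print(torch.equal(a, torch.ones(2, 3)))  # True

# Rebinding: c = c + 1 allocates a NEW array and points the name c at it.
old_id = id(c)
c = c + 1
print(id(c) == old_id)  # False: c now refers to a different object
print(a[0][0].item())   # 1.0: the tensor kept the old buffer
```

The rule of thumb is that only operations writing through the existing buffer (out=, +=, slice assignment) are visible on both sides.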

Addendum: a problem to note when converting between tensor data and numpy data in PyTorch

In PyTorch, the common functions for converting numpy.array data to tensor data are torch.from_numpy(array) and torch.Tensor(array); the first is more commonly used.

Here's a look at the difference through code:

import numpy as np
import torch

a = np.arange(6, dtype=int).reshape(2, 3)
b = torch.from_numpy(a)
c = torch.Tensor(a)

a[0][0] = 10
print(a)  # [[10  1  2]
          #  [ 3  4  5]]
print(b)  # tensor([[10, 1, 2], [3, 4, 5]], dtype=torch.int32)
print(c)  # tensor([[0., 1., 2.], [3., 4., 5.]])

c[0][0] = 10
print(a)  # [[10  1  2]
          #  [ 3  4  5]]
print(b)  # tensor([[10, 1, 2], [3, 4, 5]], dtype=torch.int32)
print(c)  # tensor([[10., 1., 2.], [3., 4., 5.]])

print(b.type())  # torch.IntTensor
print(c.type())  # torch.FloatTensor

You can see that when you modify an element of the array a, the corresponding element of tensor b also changes, but tensor c stays the same. Conversely, modifying an element of tensor c leaves array a and tensor b unchanged.

This means that torch.from_numpy(array) behaves like a shallow copy, sharing memory with the array, while torch.Tensor(array) makes a deep copy of the array's data (and, as the torch.FloatTensor output above shows, also converts it to the default float dtype).
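When you want an independent tensor without the silent dtype change of torch.Tensor(array), two standard options are torch.tensor(array), which always copies and preserves the integer dtype, and torch.from_numpy(array).clone(), which makes the copy explicit. A small sketch:

```python
import numpy as np
import torch

a = np.arange(6).reshape(2, 3)

shared = torch.from_numpy(a)          # shares a's buffer
copied = torch.tensor(a)              # always copies, keeps the integer dtype
cloned = torch.from_numpy(a).clone()  # explicit independent copy

a[0][0] = 99
print(shared[0][0].item())  # 99: follows the array
print(copied[0][0].item())  # 0: unaffected by the change
print(cloned[0][0].item())  # 0: unaffected by the change
```

Preferring torch.tensor() or an explicit .clone() makes the copy-vs-share decision visible at the call site.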

The above is all the content of this article, "What problems should be paid attention to when switching between Numpy and Pytorch". Thank you for reading! I hope the shared content has been helpful to you.
