

How to use einops, the tensor manipulation power tool for PyTorch


This article mainly introduces how to use einops, the tensor manipulation power tool for PyTorch. Many people have questions about it in daily work, so the editor has consulted various materials and put together a simple, easy-to-follow guide. I hope it helps answer the question of how to use einops for tensor operations in PyTorch. Now, please follow the editor and study!

Installation:

pip install einops

Basic usage

The strength of einops is that it makes tensor dimension manipulation visual and explicit, so developers can write the operation exactly as they think it. For example:

from einops import rearrange

# rearrange elements according to the pattern
output_tensor = rearrange(input_tensor, 'h w c -> c h w')

The dimension swap is expressed with 'h w c -> c h w', which is similar to permute in PyTorch. However, einops's rearrange can go further:
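For comparison, here is a minimal sketch (the shapes are chosen for illustration, not from the original article) of the same swap done with the built-in permute:

import torch
from einops import rearrange

x = torch.randn(32, 32, 3)            # [h, w, c]

# einops: name the axes and describe the target order
y1 = rearrange(x, 'h w c -> c h w')   # [3, 32, 32]

# plain PyTorch: refer to the same axes only by position
y2 = x.permute(2, 0, 1)               # [3, 32, 32]

print(torch.equal(y1, y2))            # True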

from einops import rearrange
import torch

a = torch.randn(3, 9, 9)                            # [3, 9, 9]
output = rearrange(a, 'c (r p) w -> c r p w', p=3)
print(output.shape)                                 # [3, 3, 3, 9]

This is the more advanced usage: treat the intermediate dimension as r × p and supply the value of p, and einops automatically splits that dimension into 3 × 3. This completes the dimension transformation [3, 9, 9] -> [3, 3, 3, 9].

This kind of readable axis splitting is something PyTorch's built-in operations cannot match.
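The reverse direction works just as naturally; here is a minimal sketch (the tensor shape is chosen to mirror the example above) that merges the split axes back into one:

from einops import rearrange
import torch

a = torch.randn(3, 3, 3, 9)                   # [3, 3, 3, 9]

# merge r and p back into a single axis of size r * p
merged = rearrange(a, 'c r p w -> c (r p) w')
print(merged.shape)                           # [3, 9, 9]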

In addition, there are reduce and repeat, which are also easy to use.

from einops import repeat
import torch

a = torch.randn(9, 9)                           # [9, 9]
output_tensor = repeat(a, 'h w -> c h w', c=3)  # [3, 9, 9]

By specifying c, you control how many copies are stacked along the new axis.
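repeat can also tile along an existing axis; a minimal sketch (the factor n=2 is just for illustration):

from einops import repeat
import torch

a = torch.randn(9, 9)                     # [9, 9]

# repeat each element twice along the width: w becomes w * n
tiled = repeat(a, 'h w -> h (w n)', n=2)  # [9, 18]
print(tiled.shape)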

Take a look at reduce:

from einops import reduce
import torch

a = torch.randn(1, 3, 6, 8)                   # [b, c, h*h2, w*w2]
# 2x2 mean pooling over h and w, with channels moved to the last axis
output_tensor = reduce(a, 'b c (h h2) (w w2) -> b h w c', 'mean', h2=2, w2=2)
print(output_tensor.shape)                    # [1, 3, 4, 3]

The 'mean' here specifies how the pooling is done. The pattern should read naturally; if anything is unclear, feel free to leave a comment and ask.
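Other reduction operations work the same way; a minimal sketch (shapes and pooling sizes chosen for illustration) showing 'max' pooling and a global average pool:

from einops import reduce
import torch

x = torch.randn(2, 3, 8, 8)   # [b, c, h, w]

# 2x2 max pooling
pooled = reduce(x, 'b c (h h2) (w w2) -> b c h w', 'max', h2=2, w2=2)  # [2, 3, 4, 4]

# global average pooling: collapse h and w entirely
gap = reduce(x, 'b c h w -> b c', 'mean')                              # [2, 3]

print(pooled.shape, gap.shape)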

Advanced usage

einops layers can also be embedded directly in a PyTorch model, see:

# example given for pytorch, but code in other frameworks is almost identical
from torch.nn import Sequential, Conv2d, MaxPool2d, Linear, ReLU
from einops.layers.torch import Rearrange

model = Sequential(
    Conv2d(3, 6, kernel_size=5),
    MaxPool2d(kernel_size=2),
    Conv2d(6, 16, kernel_size=5),
    MaxPool2d(kernel_size=2),
    # flattening
    Rearrange('b c h w -> b (c h w)'),
    Linear(16 * 5 * 5, 120),
    ReLU(),
    Linear(120, 10),
)

Rearrange here is a subclass of nn.Module and can be placed directly into the model as a network layer.
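As a quick sanity check, a minimal sketch (assuming 32×32 RGB inputs, which is what the 16 * 5 * 5 flattened size implies) that runs a dummy batch through the model above:

import torch

# a batch of four 3-channel 32x32 images
dummy = torch.randn(4, 3, 32, 32)
out = model(dummy)
print(out.shape)                              # torch.Size([4, 10])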

This concludes the study of how to use einops, the tensor manipulation power tool for PyTorch. I hope it has resolved your doubts; combining theory with practice is the best way to learn, so go and try it! If you want to keep learning more related knowledge, please continue to follow the site, and the editor will keep working to bring you more practical articles!
