This article mainly introduces how local attention is generated. The content is detailed and easy to understand, the steps are simple and quick, and it has practical reference value. I believe you will gain something after reading it. Let's take a look.
Soft attention (which includes spatial attention and channel attention) learns to select fine-grained important pixels; it operates at the pixel level.
Hard attention (local attention) learns to search for coarse, potentially discriminative regions; it operates at the region level (a sketch of this branch appears after the code below).
The two are highly complementary, and using them together can improve the performance of a model.
The following sections describe how spatial attention, channel attention, and local attention are generated.
Channel attention:
Channel attention assigns a different weight to each channel. For example, if the shape of the horse shows up most clearly in channels 1 and 2, those channels naturally receive larger weights, while channels 3 and 4 receive smaller ones.
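As a minimal illustration of what per-channel weighting does (the weights below are made up for the example, not learned):

import torch

# Toy feature map: batch 1, 4 channels, 8 x 8 spatial size.
x = torch.randn(1, 4, 8, 8)

# Hypothetical channel weights: channels 1-2 matter more than 3-4.
w = torch.tensor([0.9, 0.8, 0.1, 0.2]).view(1, 4, 1, 1)

# Broadcasting multiplies every spatial location of a channel
# by that channel's weight.
y = x * w
print(y.shape)  # torch.Size([1, 4, 8, 8])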
Spatial attention:
Spatial attention takes the mean over the 64 channels to obtain a single (w x h) weight map. The mean operation learns the overall distribution across all channels and smooths out the odd ones. For example, channels 1 and 2 may depict the shape of the horse well while channels 3 and 4 do not (even though, in essence, they also respond to the horse). After the mean, the (w x h) weight map is shared by all channels, which gives channels 3 and 4 a certain amount of attention as well, so that they too can describe the shape of the horse.
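As a quick sketch of just this averaging step (the shapes here are illustrative, not from the article):

import torch

# Toy input: batch of 2, 64 channels, h x w = 256 x 128.
x = torch.randn(2, 64, 256, 128)

# Cross-channel mean: one (h x w) map shared by all 64 channels.
attn = x.mean(1, keepdim=True)   # (2, 1, 256, 128)

# Broadcasting applies the same spatial weights to every channel,
# so weaker channels get attention wherever the average response is strong.
out = x * attn
print(attn.shape, out.shape)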
Code:

import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvBlock(nn.Module):
    # Not defined in the original article; assumed to be the usual
    # convolution + batch norm + ReLU block, as in the HA-CNN reference code.
    def __init__(self, in_c, out_c, k, s=1, p=0):
        super(ConvBlock, self).__init__()
        self.conv = nn.Conv2d(in_c, out_c, k, stride=s, padding=p)
        self.bn = nn.BatchNorm2d(out_c)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))

class SpatialAttn(nn.Module):
    # For input x.shape = (32, 16, 256, 128): take the mean over the channel
    # dimension -> (32, 1, 256, 128), halve the map with a strided conv,
    # restore it with upsampling, then apply a 1x1 conv.
    def __init__(self):
        super(SpatialAttn, self).__init__()
        self.conv1 = ConvBlock(1, 1, 3, s=2, p=1)
        self.conv2 = ConvBlock(1, 1, 1)

    def forward(self, x):
        # global cross-channel averaging: (32, 16, 256, 128) -> (32, 1, 256, 128)
        x = x.mean(1, keepdim=True)
        # 3x3 conv with stride 2: -> (32, 1, 128, 64)
        x = self.conv1(x)
        # bilinear resizing back: -> (32, 1, 256, 128)
        x = F.interpolate(x, (x.size(2) * 2, x.size(3) * 2),
                          mode='bilinear', align_corners=True)
        # scaling conv (1x1): -> (32, 1, 256, 128)
        x = self.conv2(x)
        return x

class ChannelAttn(nn.Module):
    def __init__(self, in_channels, reduction_rate=16):
        super(ChannelAttn, self).__init__()
        assert in_channels % reduction_rate == 0
        self.conv1 = ConvBlock(in_channels, in_channels // reduction_rate, 1)
        self.conv2 = ConvBlock(in_channels // reduction_rate, in_channels, 1)

    def forward(self, x):
        # squeeze operation (global average pooling):
        # (32, 16, 256, 128) -> (32, 16, 1, 1)
        x = F.avg_pool2d(x, x.size()[2:])
        # excitation operation (two 1x1 conv layers):
        # (32, 16, 1, 1) -> (32, 1, 1, 1) -> (32, 16, 1, 1)
        x = self.conv1(x)
        x = self.conv2(x)
        return x

class SoftAttn(nn.Module):
    # Soft attention: for input (32, 16, 256, 128), the output of spatial
    # attention (32, 1, 256, 128) is multiplied with the output of channel
    # attention (32, 16, 1, 1), broadcasting to (32, 16, 256, 128).
    def __init__(self, in_channels):
        super(SoftAttn, self).__init__()
        self.spatial_attn = SpatialAttn()
        self.channel_attn = ChannelAttn(in_channels)
        self.conv = ConvBlock(in_channels, in_channels, 1)

    def forward(self, x):
        # x.shape = (32, 16, 256, 128)
        y_spatial = self.spatial_attn(x)   # (32, 1, 256, 128)
        y_channel = self.channel_attn(x)   # (32, 16, 1, 1)
        y = y_spatial * y_channel          # (32, 16, 256, 128)
        y = torch.sigmoid(self.conv(y))    # (32, 16, 256, 128)
        return y
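As a usage sketch (not part of the original article; the shapes are chosen to match the comments above), the SoftAttn module can be exercised like this:

import torch

# Hypothetical input: a batch of 32 feature maps with 16 channels.
x = torch.randn(32, 16, 256, 128)
attn = SoftAttn(in_channels=16)
y = attn(x)       # (32, 16, 256, 128), values in (0, 1) after the sigmoid
out = x * y       # apply the attention map to the features
print(y.shape, out.shape)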
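Note that the code above covers only the soft branch. As a hedged sketch of the hard (local) attention mentioned at the start, one common design, modeled on the STN-style regressor in HA-CNN (the names, region count, and init values here are illustrative), regresses the centers of a few discriminative regions:

import torch
import torch.nn as nn
import torch.nn.functional as F

class HardAttn(nn.Module):
    # Sketch of region-level (hard/local) attention: regress the (x, y)
    # centers of num_regions regions from globally pooled features.
    def __init__(self, in_channels, num_regions=4):
        super(HardAttn, self).__init__()
        self.num_regions = num_regions
        self.fc = nn.Linear(in_channels, num_regions * 2)
        # Illustrative init: start from evenly spaced horizontal stripes.
        self.fc.weight.data.zero_()
        self.fc.bias.data.copy_(torch.tensor(
            [0, -0.75, 0, -0.25, 0, 0.25, 0, 0.75], dtype=torch.float))

    def forward(self, x):
        # global average pooling: (n, c, h, w) -> (n, c)
        x = F.avg_pool2d(x, x.size()[2:]).view(x.size(0), -1)
        # region centers in [-1, 1], shape (n, num_regions, 2)
        theta = torch.tanh(self.fc(x))
        return theta.view(-1, self.num_regions, 2)

Each predicted (x, y) pair can then parameterize an affine grid (torch.nn.functional.affine_grid followed by grid_sample) to crop the corresponding region, which is how a region-level branch extracts its local parts.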
This is the end of the article on how local attention is generated. Thank you for reading! I believe you now have a working understanding of how spatial, channel, and local attention are generated.