🔌 Adding Attention Mechanisms
Pros and cons of the various attention modules
Adding attention modules to the model
In the model's `__init__`, a list of candidate attention modules is defined, and the `phi_attention` argument selects which one to attach to the P3 feature map (1 = SE, 2 = CBAM, 3 = ECA, 4 = CA; 0 leaves the model unchanged):

```python
# -----------------------------------------------#
#   Candidate attention modules
# -----------------------------------------------#
attention_blocks = [se_block, cbam_block, eca_block, CA_Block]

def __init__(self, anchors_mask, num_classes, phi, pretrained=False, phi_attention=0, pruned=1):
    ...
    # -----------------------------------------------#
    #   Attention initialization
    # -----------------------------------------------#
    self.phi_attention = phi_attention
    if 1 <= phi_attention <= 4:
        # 128 is the channel count of the 80x80 P3 feature map
        self.P3_attention = attention_blocks[phi_attention - 1](128)
```

In `forward`, the selected attention module is applied to P3 after the upsampling branch and before downsampling:

```python
        # 80, 80, 256 => 80, 80, 128
        P3 = self.conv3_for_upsample2(P3)
        if 1 <= self.phi_attention <= 4:
            P3 = self.P3_attention(P3)
        # 80, 80, 128 => 40, 40, 256
        P3_downsample = self.down_sample1(P3)
```
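The definitions of the four attention blocks are not shown on this page. As a reference point, here is a minimal sketch of an SE (Squeeze-and-Excitation) block in PyTorch; the project's actual `se_block` may differ in details such as the reduction ratio (`ratio=16` here is an assumption, not taken from this page):

```python
import torch.nn as nn

class se_block(nn.Module):
    # Squeeze-and-Excitation: global-average-pool the feature map,
    # pass the pooled vector through a two-layer bottleneck MLP,
    # and use the resulting per-channel weights to rescale the input.
    def __init__(self, channel, ratio=16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channel, channel // ratio, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channel // ratio, channel, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.size()
        y = self.avg_pool(x).view(b, c)    # squeeze: (b, c)
        y = self.fc(y).view(b, c, 1, 1)    # excitation: per-channel weights
        return x * y                       # rescale the input channels
```

Because the module is constructed as `attention_blocks[phi_attention - 1](128)`, each block in the list must accept the channel count as its first constructor argument, as `se_block` does above.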
Using the model improved with attention modules
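With the constructor signature shown above, enabling attention only requires passing `phi_attention` when building the model. The sketch below is a hypothetical usage example: the class name `YoloBody` and the argument values are assumptions; only the parameter list itself comes from this page.

```python
# Hypothetical usage: YoloBody and the argument values are assumptions.
anchors_mask = [[6, 7, 8], [3, 4, 5], [0, 1, 2]]  # typical YOLO anchor grouping

model = YoloBody(anchors_mask, num_classes=80, phi='s',
                 pretrained=False, phi_attention=1)  # 1 = SE block on P3
```

Setting `phi_attention=0` (the default) builds the unmodified model, so existing training scripts keep working without changes.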