Paper Title

AiATrack: Attention in Attention for Transformer Visual Tracking

Paper Authors

Shenyuan Gao, Chunluan Zhou, Chao Ma, Xinggang Wang, Junsong Yuan

Paper Abstract

Transformer trackers have achieved impressive advancements recently, where the attention mechanism plays an important role. However, the independent correlation computation in the attention mechanism could result in noisy and ambiguous attention weights, which inhibits further performance improvement. To address this issue, we propose an attention in attention (AiA) module, which enhances appropriate correlations and suppresses erroneous ones by seeking consensus among all correlation vectors. Our AiA module can be readily applied to both self-attention blocks and cross-attention blocks to facilitate feature aggregation and information propagation for visual tracking. Moreover, we propose a streamlined Transformer tracking framework, dubbed AiATrack, by introducing efficient feature reuse and target-background embeddings to make full use of temporal references. Experiments show that our tracker achieves state-of-the-art performance on six tracking benchmarks while running at a real-time speed.
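To make the "attention in attention" idea more concrete, below is a minimal PyTorch sketch of a single-head self-attention block whose raw correlation map is refined by an inner attention before the softmax, so that each correlation vector is adjusted by a consensus over all correlation vectors. This is a simplified illustration, not the authors' implementation: the inner attention here is parameter-free, whereas the paper's AiA module uses learned projections inside the inner attention and applies it to both self-attention and cross-attention blocks. Names such as `AiASelfAttention` are illustrative.

```python
# Sketch of the attention-in-attention (AiA) idea, assuming PyTorch and a single head.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AiASelfAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)

        # Raw correlation map of the outer attention: (batch, queries, keys).
        corr = torch.bmm(q, k.transpose(1, 2)) * self.scale

        # Inner attention over correlation vectors (the columns of `corr`):
        # pairwise similarity between correlation vectors, scaled by their length.
        inner_sim = torch.bmm(corr.transpose(1, 2), corr) * (corr.size(1) ** -0.5)
        inner_w = F.softmax(inner_sim, dim=-1)  # (batch, keys, keys)

        # Each correlation vector is refined by the consensus of all others
        # (residual connection), before the outer softmax is applied.
        corr = corr + torch.bmm(corr, inner_w.transpose(1, 2))

        attn = F.softmax(corr, dim=-1)          # outer attention weights
        return self.proj(torch.bmm(attn, v))


if __name__ == "__main__":
    layer = AiASelfAttention(dim=256)
    out = layer(torch.randn(2, 100, 256))       # -> torch.Size([2, 100, 256])
    print(out.shape)
```

The design point illustrated here is that the consensus step operates on the correlation map itself rather than on the features, which is what allows erroneous correlations (ones inconsistent with the other correlation vectors) to be suppressed before they influence the feature aggregation.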
