CVPR 2018 RASNet: Paper Notes on "Learning Attentions: Residual Attentional Siamese Network for Tracking"

  If I have misunderstood anything, corrections are welcome.

  The model in this paper, called RASNet, reformulates the correlation filter (CF) within the Siamese framework and proposes three attention mechanisms (general, residual, and channel). These attentions allow the feature representation trained offline to adapt to the specific target being tracked online, while avoiding overfitting.

  The traditional Siamese tracker uses a function f(z, x) to evaluate the similarity between the tracking target z and the search image x:

  $$f_{m,n}(z, x) = \sum_{c=1}^{C}\sum_{i=1}^{W}\sum_{j=1}^{H} \varphi_{i,j,c}(z)\,\varphi_{m+i,\,n+j,\,c}(x)$$
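
  To make this concrete, the following is a minimal PyTorch sketch (not the authors' code) of this plain cross correlation; it assumes the feature maps of z and x have already been extracted by the shared backbone, and the channel count and spatial sizes are purely illustrative.

```python
import torch
import torch.nn.functional as F

def cross_correlation(phi_z, phi_x):
    """SiamFC-style cross correlation.

    phi_z: template features, shape (1, C, Hz, Wz)
    phi_x: search features,   shape (1, C, Hx, Wx)
    Returns a single-channel response map of shape (1, 1, Hx-Hz+1, Wx-Wz+1).
    """
    # Using the template features as the convolution kernel sums over
    # i, j and c at every search position (m, n); PyTorch's conv2d is
    # in fact a cross correlation (no kernel flip).
    return F.conv2d(phi_x, phi_z)

# Toy example with random features (shapes roughly follow SiamFC).
phi_z = torch.randn(1, 256, 6, 6)
phi_x = torch.randn(1, 256, 22, 22)
print(cross_correlation(phi_z, phi_x).shape)  # torch.Size([1, 1, 17, 17])
```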

  The authors argue that different spatial positions should be weighted differently:

  $$f_{m,n}(z, x) = \sum_{c=1}^{C}\sum_{i=1}^{W}\sum_{j=1}^{H} \rho_{i,j,c}\,\varphi_{i,j,c}(z)\,\varphi_{m+i,\,n+j,\,c}(x)$$
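
  The weighted correlation can be obtained by folding the attention into the template features before correlating. Below is a minimal sketch under the same assumptions as above, where rho is assumed to be a tensor of the same shape as the template features (one weight per position and channel):

```python
import torch
import torch.nn.functional as F

def weighted_cross_correlation(phi_z, phi_x, rho):
    """Cross correlation with per-element attention weights.

    phi_z: template features, shape (1, C, Hz, Wz)
    phi_x: search features,   shape (1, C, Hx, Wx)
    rho:   attention weights, shape (1, C, Hz, Wz)
    """
    # Multiplying the kernel by rho is equivalent to inserting
    # rho_{i,j,c} inside the correlation sum.
    return F.conv2d(phi_x, phi_z * rho)
```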

  Here, $\rho$ is the attention weight, referred to as the "full attention". It is further divided into two parts: a dual attention $\bar{\rho}$ and a channel attention $\beta$. The dual attention $\bar{\rho}$ in turn consists of two parts, a general attention and a residual attention (the two terms in the formula below):

  $$\rho_{i,j,c} = \left(\bar{\rho}^{\,g}_{i,j} + \bar{\rho}^{\,r}_{i,j}\right)\beta_{c}$$

  The general attention encodes the commonality shared by all training samples.

  The residual attention learns the differences between individual tracked objects.

  The channel attention adapts the model to different contexts.
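
  As a rough, shape-level illustration of the decomposition above, the sketch below builds rho from a learned general map shared by all targets, a residual map predicted from the current template, and per-channel weights. The residual and channel branches here are simple stand-ins (a 1x1 convolution and a global-pooling gate), not the sub-networks used in the paper (which describes its own modules, e.g. an hourglass-like network for the residual attention), so treat this only as an assumption-laden sketch.

```python
import torch
import torch.nn as nn

class DualChannelAttention(nn.Module):
    """Sketch of rho_{i,j,c} = (general_{i,j} + residual_{i,j}) * beta_c."""

    def __init__(self, channels, hz, wz):
        super().__init__()
        # General attention: one spatial map shared across all targets.
        self.general = nn.Parameter(torch.zeros(1, 1, hz, wz))
        # Residual attention: predicted from the current target's template
        # features (stand-in for the paper's own residual-attention module).
        self.residual_net = nn.Conv2d(channels, 1, kernel_size=1)
        # Channel attention: one weight per feature channel (stand-in gate).
        self.channel_net = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, phi_z):
        dual = self.general + self.residual_net(phi_z)  # (1, 1, Hz, Wz)
        beta = self.channel_net(phi_z)                  # (1, C, 1, 1)
        return dual * beta                              # broadcasts to (1, C, Hz, Wz)
```

  Combined with the weighted_cross_correlation sketch above, the attention is computed from the template branch only and then folded into the correlation:

```python
attention = DualChannelAttention(channels=256, hz=6, wz=6)
phi_z = torch.randn(1, 256, 6, 6)     # template features
phi_x = torch.randn(1, 256, 22, 22)   # search features
rho = attention(phi_z)                # (1, 256, 6, 6)
response = weighted_cross_correlation(phi_z, phi_x, rho)  # (1, 1, 17, 17)
```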