SaberNet: Self-attention based effective relation network for few-shot learning

Authors:

Highlights:

• We design a novel Self-attention Based Effective Relation Network (SaberNet) for few-shot learning that leverages relations at three levels: among local details during feature extraction, among support samples, and across the channels of prototype-query pairs.

• We argue that conventional feature extraction is insufficient for few-shot learning and demonstrate the effectiveness of self-attention in feature extraction. The proposed SaberNet can infer feature relations and model spatial long-range dependencies across features.

• Extensive experiments and analyses demonstrate the effectiveness of the proposed framework: SaberNet achieves superior performance over other state-of-the-art methods on three challenging datasets. Moreover, we present a simple yet powerful baseline to investigate the effect of the backbone in few-shot learning.
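The two core ideas in the highlights — self-attention over spatial positions to model long-range dependencies, and scoring a prototype-query pair through their concatenated channels — can be sketched as follows. This is a minimal NumPy illustration under assumed shapes and a hypothetical tiny MLP relation head (`relation_score`, `W1`, `W2` are illustrative names, not the paper's actual layers or weights):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatial_self_attention(feat, Wq, Wk, Wv):
    """Scaled dot-product self-attention over the HW spatial positions
    of a (HW, C) feature map, modelling long-range dependencies."""
    q, k, v = feat @ Wq, feat @ Wk, feat @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (HW, HW) position-pair affinities
    attn = softmax(scores, axis=-1)           # each row sums to 1
    return feat + attn @ v                    # residual connection

def relation_score(proto, query, W1, b1, W2, b2):
    """Hypothetical relation head: concatenate prototype and query
    along the channel axis and score the pair with a tiny MLP."""
    pair = np.concatenate([proto, query], axis=-1)   # (2C,)
    h = np.maximum(pair @ W1 + b1, 0.0)              # ReLU hidden layer
    return float(h @ W2 + b2)                        # scalar relation score

rng = np.random.default_rng(0)
hw, c = 25, 64                                # e.g. a 5x5 feature map, 64 channels
feat = rng.standard_normal((hw, c))
Wq, Wk, Wv = (rng.standard_normal((c, c)) * 0.05 for _ in range(3))
attended = spatial_self_attention(feat, Wq, Wk, Wv)

# Pool to channel vectors and score one prototype-query pair.
proto = attended.mean(axis=0)
query = rng.standard_normal(c)
W1, b1 = rng.standard_normal((2 * c, 32)) * 0.05, np.zeros(32)
W2, b2 = rng.standard_normal(32) * 0.05, 0.0
score = relation_score(proto, query, W1, b1, W2, b2)
print(attended.shape, type(score))
```

The residual connection keeps the original local features while adding the globally attended context, which is the usual way self-attention is grafted onto a convolutional feature extractor.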


Keywords: Few-shot learning, Feature representation, Task analysis, Transformers

Article history: Received 25 June 2022, Revised 9 August 2022, Accepted 4 September 2022, Available online 7 September 2022, Version of Record 15 September 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.109024