Learnable dynamic margin in deep metric learning

Authors:

Highlights:

• We propose a new proxy-based loss that fully accounts for the intra-class variance of each class. We assign a learnable margin to each class, so as to better preserve the intra-class distribution in the embedding space.

• Our loss also takes the semantic relations between proxies into account, which enforces separation between classes and thus further preserves the intra-class distribution in the embedding space.

• A standard embedding network trained with our loss achieves state-of-the-art performance on several common metric-learning benchmarks.
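The per-class learnable margin described above can be sketched as follows. This is a minimal NumPy forward pass loosely modeled on a proxy-anchor-style formulation, not the authors' implementation: the function name, the `scale` parameter, and the exact way the margin enters the positive and negative terms are assumptions. In the actual method the margins would be trainable parameters optimized jointly with the embedding network and the proxies.

```python
import numpy as np

def proxy_margin_loss(embeddings, labels, proxies, margins, scale=16.0):
    """Hedged sketch: proxy-based loss with a per-class margin.

    embeddings: (N, D) L2-normalized sample embeddings
    labels:     (N,)   integer class labels
    proxies:    (C, D) L2-normalized class proxies
    margins:    (C,)   per-class margin (learnable in the actual method)
    """
    sims = embeddings @ proxies.T            # (N, C) cosine similarities
    num_classes = proxies.shape[0]
    onehot = np.eye(num_classes)[labels]     # (N, C) class membership mask
    # Positive term: pull each sample toward its own proxy, with the
    # similarity shifted by that class's margin.
    pos = np.log1p(np.sum(onehot * np.exp(-scale * (sims - margins)), axis=1))
    # Negative term: push each sample away from all other proxies,
    # again offset by the per-class margin.
    neg = np.log1p(np.sum((1 - onehot) * np.exp(scale * (sims + margins)), axis=1))
    return float(np.mean(pos + neg))
```

Both terms grow monotonically with the margin, so a larger learned margin for a class tightens that class's positive pull and widens its separation from other proxies.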

Keywords: Deep metric learning, Proxy-based loss, Adaptive margin, Image retrieval, Fine-grained images

Article history: Received 21 April 2022; Revised 14 July 2022; Accepted 7 August 2022; Available online 10 August 2022; Version of Record 17 August 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.108961