SO-softmax loss for discriminable embedding learning in CNNs

Authors:

Highlights:

• A generalized softmax loss reduces to various existing variants of the softmax loss.

• Goal optimization-based transformation constrains inter/intra-class cosine similarity.

• The proposed transformation unifies the cosine-similarity transformations used in existing losses.

• SO-softmax loss is proposed to enhance the discriminability of embeddings in CNNs.

• Extensive experiments show the superiority of SO-softmax over other counterparts.
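The highlights describe a family of softmax losses built on cosine similarity between L2-normalized embeddings and class weights. The paper's exact SO-softmax formulation is not given here, so the sketch below shows only the generic cosine-similarity softmax loss that such variants build on; the function name `cosine_softmax_loss` and the scale parameter `s` are illustrative assumptions, not the authors' API.

```python
import numpy as np

def cosine_softmax_loss(x, W, y, s=16.0):
    """Cross-entropy over scaled cosine similarities (a sketch of the
    generic cosine-softmax family; `s` is a hypothetical scale factor)."""
    # L2-normalize embeddings (rows) and class weights (columns) so that
    # the logits become cosine similarities in [-1, 1]
    x_n = x / np.linalg.norm(x, axis=1, keepdims=True)
    W_n = W / np.linalg.norm(W, axis=0, keepdims=True)
    logits = s * (x_n @ W_n)                      # scaled cosine similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(y)), y].mean()
```

Variants in this family (and, per the highlights, the proposed goal-optimization-based transformation) differ in how the cosine term for the target class is further transformed before the softmax, e.g. by margins or quadratic mappings, to pull same-class embeddings together and push different-class embeddings apart.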


Keywords: Convolutional neural networks, Cosine similarity, Cross entropy loss, Quadratic transformation, Embedding learning, Softmax

Article history: Received 15 July 2021, Revised 19 May 2022, Accepted 26 June 2022, Available online 27 June 2022, Version of Record 12 July 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.108877