Hierarchical distillation learning for scalable person search

Authors:

Highlights:

• We investigate for the first time the scalability problem in person search. Solving it is fundamental to scaling up deep learning solutions to person search in real-world applications.

• We formulate a hierarchical distillation learning (HDL) approach for more discriminative knowledge transfer from a stronger teacher model to an efficient student model (see the generic distillation sketch after this list).

• We design a simple and effective teacher model for joint person search learning, which greatly facilitates knowledge distillation by avoiding transfer between structurally inconsistent teacher and student models.

• We demonstrate the cost-effectiveness and performance advantages of our HDL over state-of-the-art alternatives on three person search benchmarks.
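The HDL approach itself is detailed in the paper; as a point of reference for the teacher-to-student transfer mentioned above, below is a minimal sketch of generic logit-level knowledge distillation in PyTorch (in the style of Hinton et al., 2015), not the paper's hierarchical variant. The function name `distillation_loss` and the hyperparameters `T` and `alpha` are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic soft-target distillation loss (Hinton et al., 2015).

    Combines a KL term on temperature-softened logits with the usual
    cross-entropy on ground-truth labels. T and alpha are illustrative
    hyperparameters, not values taken from the HDL paper.
    """
    # Soften both distributions with temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence, scaled by T^2 to keep gradient magnitudes comparable
    # to the cross-entropy term.
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```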

Keywords: Person search, Person re-identification, Person detection, Knowledge distillation, Scalability, Model inference efficiency

Article history: Received 23 March 2020, Revised 16 November 2020, Accepted 24 January 2021, Available online 1 February 2021, Version of Record 6 February 2021.

Paper link: https://doi.org/10.1016/j.patcog.2021.107862