Progressive privileged knowledge distillation for online action detection

Authors:

Highlights:

• We are the first to introduce knowledge distillation to online action detection.

• The online action detection model is trained with privileged information.

• A curriculum learning strategy is designed based on the difficulty of knowledge distillation.

• Only a subset of the student's hidden features is required to learn from the teacher.

• Results show better performance, especially at the early stage of actions.
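The partial feature distillation idea in the highlights can be illustrated with a minimal, hypothetical sketch: a teacher trained with privileged (e.g., full-sequence) information guides only the first few hidden features of the online student model, leaving the remaining features free for the detection task itself. The function name, feature sizes, and values below are illustrative assumptions, not the paper's actual implementation.

```python
def partial_distill_loss(student_feat, teacher_feat, n_distill):
    """Mean squared error over only the first n_distill hidden features.

    student_feat / teacher_feat: lists of per-sample feature vectors.
    Features beyond index n_distill are ignored, so the student is only
    partially constrained to mimic the privileged teacher.
    """
    total = 0.0
    for s, t in zip(student_feat, teacher_feat):
        total += sum((si - ti) ** 2 for si, ti in zip(s[:n_distill], t[:n_distill]))
    return total / (len(student_feat) * n_distill)

# Toy features: the first two dimensions match the teacher, the rest diverge
teacher = [[1.0, 2.0, 3.0, 4.0], [0.0, 1.0, 0.0, 1.0]]
student = [[1.0, 2.0, 9.0, 9.0], [0.0, 1.0, 9.0, 9.0]]

# Distilling only the first 2 features ignores the mismatched tail
print(partial_distill_loss(student, teacher, 2))  # → 0.0
```

A curriculum, as described in the highlights, could then gradually adjust how strongly (or over how many features) this loss is applied as training progresses.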


Keywords: Online action detection, Knowledge distillation, Privileged information, Curriculum learning

Review timeline: Received 13 August 2021, Revised 15 February 2022, Accepted 23 April 2022, Available online 29 April 2022, Version of Record 6 May 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.108741