Multi-granularity for knowledge distillation
Authors:
Highlights:
• Multi-granularity property of knowledge is introduced into knowledge distillation.
• A multi-granularity self-analyzing module is proposed for constructing multi-granularity knowledge.
• Two distillation schemes are designed for transferring multi-granularity knowledge.
• Extensive experiments demonstrate the effectiveness of the proposed mechanism.
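The highlights describe a multi-granularity mechanism built on knowledge distillation. The excerpt does not detail the paper's own modules, so as background here is a minimal sketch of the standard (Hinton-style) distillation loss such mechanisms extend: the student is trained to match the teacher's temperature-softened class distribution. The function names and the temperature value are illustrative, not from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis (numerically stable)."""
    z = (logits - logits.max(axis=-1, keepdims=True)) / T
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Standard distillation loss: KL(teacher || student) between
    temperature-softened distributions, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # soft predictions from the student
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(T * T * kl.mean())

# A student that already matches the teacher incurs zero loss.
teacher = np.array([[2.0, 1.0, 0.1]])
student = np.array([[1.5, 1.2, 0.3]])
assert abs(kd_loss(teacher, teacher)) < 1e-9
assert kd_loss(student, teacher) > 0.0
```

In practice this soft-target term is combined with the ordinary cross-entropy on ground-truth labels; a multi-granularity scheme would apply such matching at more than one level of label granularity.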
Abstract
Keywords: Knowledge distillation, Model compression, Multi-granularity distillation mechanism, Multi-granularity self-analyzing module, Stable excitation scheme
Article history: Received 25 April 2021; Revised 15 August 2021; Accepted 24 August 2021; Available online 31 August 2021; Version of Record 9 September 2021.
DOI: https://doi.org/10.1016/j.imavis.2021.104286