Surrogate network-based sparseness hyper-parameter optimization for deep expression recognition

Authors:

Highlights:

• A new iterative framework for hyper-parameter optimization in deep sparseness strategies is proposed, adapting hyper-parameter settings to different databases in facial expression recognition (FER).

• A simplified network is deployed as a surrogate for the original network during hyper-parameter optimization; Euclidean losses with unilateral back-propagation are introduced so the surrogate approximates the original network.

• The proposed algorithm automatically adapts deep network metrics to different databases with reasonable time complexity.

• The hyper-parameter optimization algorithm achieves competitive performance on six public benchmark expression databases.
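The iterative search loop the highlights describe — evaluating candidate sparseness hyper-parameters on a cheap surrogate and refining them with a heuristic optimizer — can be sketched roughly as follows. This is a minimal illustration, not the paper's method: `surrogate_score` is a hypothetical stand-in for training the simplified surrogate network and reading off its validation loss, and the random-perturbation search stands in for the paper's heuristic optimizer.

```python
import random

def surrogate_score(sparseness):
    """Hypothetical stand-in for the surrogate network's validation loss
    at a given sparseness hyper-parameter (the real method trains a
    simplified network with Euclidean losses to approximate the
    original deep network)."""
    return (sparseness - 0.3) ** 2  # toy objective with its minimum at 0.3

def optimize_sparseness(iterations=200, step=0.05, seed=0):
    """Iteratively refine the sparseness hyper-parameter on the cheap
    surrogate; only the final setting would be used to train the full
    expression-recognition network."""
    rng = random.Random(seed)
    best = rng.uniform(0.0, 1.0)          # random initial candidate
    best_loss = surrogate_score(best)
    for _ in range(iterations):
        # Heuristic step: perturb the current best candidate and keep
        # the perturbation only if the surrogate loss improves.
        cand = min(1.0, max(0.0, best + rng.uniform(-step, step)))
        loss = surrogate_score(cand)
        if loss < best_loss:
            best, best_loss = cand, loss
    return best

best = optimize_sparseness()
```

Because every candidate is scored on the surrogate rather than the full deep network, each iteration is cheap, which is what makes the per-database adaptation claimed in the highlights tractable.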

Keywords: Expression recognition, Deep sparseness strategies, Hyper-parameter optimization, Surrogate network, Heuristic optimizer

Article history: Received 3 January 2020, Revised 24 September 2020, Accepted 13 October 2020, Available online 14 October 2020, Version of Record 17 October 2020.

DOI: https://doi.org/10.1016/j.patcog.2020.107701