Enhanced task attention with adversarial learning for dynamic multi-task CNN
Authors:
Highlights:
• We propose a novel learning framework for multi-task CNNs that enhances task attention by tuning the TTC of the shared subnet in DMT-CNN with adversarial learning. The supervision introduced by adversarial learning makes each task subnet focus more on its target task without limiting model flexibility.
• We enhance task attention with a task discriminator shared by all layers of all subnets, improving the sharing mechanism.
• We design an even-label strategy to construct the adversarial learning structure in a dynamic multi-task model (a minimal sketch of one possible reading follows this list).
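The highlights do not spell out the implementation, so the following is only a minimal sketch of how a shared task discriminator and an "even-label" (uniform-distribution) adversarial target could be wired up in PyTorch. The names (TaskDiscriminator, discriminator_loss, even_label_loss), the dimensions, and the exact loss formulation are assumptions for illustration, not the authors' code.

```python
# Illustrative sketch only -- not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskDiscriminator(nn.Module):
    """Small classifier that predicts which task a feature map came from.
    Pooling to a fixed size lets one discriminator be shared by features
    taken from different layers and different task subnets (assumed design)."""
    def __init__(self, feat_dim: int, num_tasks: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)       # handle any spatial size
        self.fc = nn.Linear(feat_dim, num_tasks)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        x = self.pool(feats).flatten(1)
        return self.fc(x)                          # task logits

def discriminator_loss(logits: torch.Tensor, task_ids: torch.Tensor) -> torch.Tensor:
    # Discriminator objective: identify the task that produced the features.
    return F.cross_entropy(logits, task_ids)

def even_label_loss(logits: torch.Tensor) -> torch.Tensor:
    # "Even-label" adversarial target (one possible reading of the highlight):
    # push shared-subnet features toward a uniform task distribution so they
    # stay task-agnostic, while task-specific features remain identifiable.
    num_tasks = logits.size(-1)
    even = torch.full_like(logits, 1.0 / num_tasks)
    return F.kl_div(F.log_softmax(logits, dim=-1), even, reduction="batchmean")
```

In such a setup the discriminator is trained with discriminator_loss on task-labelled features, while the shared branch is trained against even_label_loss; features from layers with different channel counts would additionally need per-layer projections before one discriminator could be shared, a detail omitted here.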
Keywords: Deep learning, Adversarial learning, Multi-task learning
Article history: Received 22 March 2021, Revised 19 March 2022, Accepted 26 March 2022, Available online 28 March 2022, Version of Record 4 April 2022.
DOI: https://doi.org/10.1016/j.patcog.2022.108672