Hierarchical feature disentangling network for universal domain adaptation
Authors:
Highlights:
• We propose a novel Hierarchical Feature Disentangling Network (HFDN) for universal domain adaptation (UniDA), a more practical domain adaptation (DA) setting than closed-set, open-set, and partial DA.
• The proposed HFDN is the first to address the feature misalignment problem caused by both the domain gap and the category gap in UniDA.
• With the disentangled features, we propose an innovative knowledge transfer approach for UniDA, which bridges the domain gap via domain adversarial training to reduce domain shift, and leverages category-gap information to assign larger weights to samples from the common label set.
• Experiments demonstrate that our method achieves the best UniDA performance on benchmark datasets compared with existing methods developed under various domain adaptation assumptions.
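The reweighting idea in the highlights (larger weights for samples likely to come from the common label set) can be illustrated with a minimal sketch. The function below is a generic, assumed criterion, not the actual weighting used in HFDN: it favors samples whose classifier prediction is confident (low normalized entropy) and whose domain-discriminator output is ambiguous, both common heuristics for detecting shared-class samples in UniDA.

```python
import numpy as np

def common_class_weight(class_probs, domain_prob):
    """Illustrative sample weight for UniDA reweighting (a sketch, not
    the paper's formula).

    class_probs : per-class softmax probabilities for one sample.
    domain_prob : domain-discriminator output in [0, 1]; values near
                  0.5 mean the sample is hard to tell apart by domain.
    Returns a weight in [0, 1]; larger suggests a common-label-set sample.
    """
    eps = 1e-12
    p = np.clip(np.asarray(class_probs, dtype=float), eps, 1.0)
    # Normalized prediction entropy in [0, 1]: low for confident predictions.
    entropy = -(p * np.log(p)).sum(axis=-1)
    norm_entropy = entropy / np.log(p.shape[-1])
    # Domain ambiguity in [0, 1]: high when the discriminator is unsure.
    domain_ambiguity = 1.0 - 2.0 * abs(domain_prob - 0.5)
    # Average the two cues; confident + domain-ambiguous -> large weight.
    return 0.5 * (1.0 - norm_entropy) + 0.5 * domain_ambiguity
```

For example, a confidently classified, domain-ambiguous sample gets a large weight, while a sample with a uniform prediction and a clear-cut domain score gets a small one, so the adversarial alignment loss can be rescaled accordingly.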
Keywords: Universal domain adaptation, Feature disentanglement, Domain adversarial training, Sample reweighting
Article history: Received 7 December 2020; Revised 19 January 2022; Accepted 27 February 2022; Available online 28 February 2022; Version of Record 16 March 2022.
DOI: https://doi.org/10.1016/j.patcog.2022.108616