Unsupervised domain adaptation via distilled discriminative clustering

Authors:

Highlights:

• We propose to solve the unsupervised domain adaptation problem by distilled discriminative clustering.

• Motivated by the essential assumption for domain adaptability, we reformulate the domain adaptation problem as discriminative clustering of target data, given strong privileged information from the semantically related, labeled source data. By properly distilling discriminative source information for clustering of the target data, we aim to directly learn a classifier for the target data, with no explicit feature alignment.

• We present clustering objectives based on a robust variant of entropy minimization for reliable cluster separation, a soft Fisher-like criterion for inter-cluster isolation and intra-cluster purity and compactness, and a centroid classification objective that keeps cluster ordering consistent across domains. To distill discriminative source information for target clustering, we use parallel, supervised learning objectives on the labeled source data.

• We also provide geometric intuition illustrating how the constituent objectives of our method help learn class-wise pure and compact feature distributions.

• Experiments on five benchmarks show that our method establishes a new state of the art.
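To make the two generic clustering ingredients named above concrete, the sketch below illustrates a plain entropy-minimization loss and a soft Fisher-like within/between scatter ratio on soft cluster assignments. This is a minimal hypothetical numpy illustration, not the paper's exact formulation: the function names and the specific scatter weighting are assumptions for exposition.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy_min_loss(probs, eps=1e-8):
    # Mean Shannon entropy of the soft assignments; minimizing it
    # pushes predictions toward confident (one-hot) cluster labels.
    return -(probs * np.log(probs + eps)).sum(axis=1).mean()

def fisher_like_ratio(features, probs, eps=1e-8):
    # A soft Fisher-style criterion (illustrative): within-cluster scatter
    # divided by between-cluster scatter, with soft assignments as weights.
    # features: (N, D) embeddings; probs: (N, K) soft cluster assignments.
    mu = features.mean(axis=0)                       # global mean, (D,)
    w = probs.sum(axis=0) + eps                      # soft cluster sizes, (K,)
    centroids = (probs.T @ features) / w[:, None]    # soft centroids, (K, D)
    diffs = features[:, None, :] - centroids[None]   # (N, K, D)
    within = (probs[:, :, None] * diffs ** 2).sum() / len(features)
    between = (w[:, None] * (centroids - mu) ** 2).sum() / probs.sum()
    return within / (between + eps)                  # smaller is better
```

Minimizing the entropy term alone can collapse clusters; pairing it with a scatter-ratio term of this kind rewards embeddings whose clusters are internally compact yet mutually separated, which matches the intuition stated in the highlights.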

Keywords: Deep learning, Unsupervised domain adaptation, Image classification, Knowledge distillation, Deep discriminative clustering, Implicit domain alignment

Article history: Received 15 July 2021; Revised 25 January 2022; Accepted 7 March 2022; Available online 11 March 2022; Version of Record 15 March 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.108638