Cross-Domain Gated Learning for Domain Generalization
Authors: Dapeng Du, Jiawei Chen, Yuexiang Li, Kai Ma, Gangshan Wu, Yefeng Zheng, Limin Wang
Abstract
Domain generalization aims to improve the generalization capacity of a model by leveraging useful information from multi-domain data. However, learning an effective feature representation from such multi-domain data is challenging due to the domain shift problem. In this paper, we propose an information gating strategy, termed cross-domain gating (CDG), to address this problem. Specifically, we distill domain-invariant features by adaptively muting the domain-related activations in the feature maps. This feature distillation process prevents the network from overfitting to domain-related details and thereby improves the generalization ability of the learned feature representation. Extensive experiments are conducted on three public datasets. The results show that the proposed CDG training strategy effectively encourages the network to exploit the intrinsic features of objects in multi-domain data and achieves new state-of-the-art domain generalization performance on these benchmarks.
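To make the idea of "muting domain-related activations" concrete, below is a minimal PyTorch sketch of a channel-wise gating layer that suppresses a learned subset of feature-map activations. The module name, the sigmoid-gate design, and the reduction parameter are assumptions for illustration only; the paper's actual CDG mechanism, which operates across domains, may differ.

```python
# Illustrative sketch only: a generic activation-gating layer, not the paper's
# exact cross-domain gating (CDG) formulation.
import torch
import torch.nn as nn


class GatedFeatureBlock(nn.Module):
    """Mutes a subset of channel activations via a learned sigmoid gate."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # summarize each channel to one value
        self.gate = nn.Sequential(           # predict per-channel gates in [0, 1]
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        g = self.gate(self.pool(x).view(b, c)).view(b, c, 1, 1)
        # Channels whose gate values approach zero are effectively muted,
        # so downstream layers cannot rely on them.
        return x * g


if __name__ == "__main__":
    feats = torch.randn(2, 64, 16, 16)  # a batch of feature maps
    block = GatedFeatureBlock(64)
    print(block(feats).shape)           # torch.Size([2, 64, 16, 16])
```

In the paper's setting, such a gate would be driven by multi-domain information so that the suppressed activations are the domain-specific ones; the sketch above only shows the gating mechanics.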
Keywords: Domain generalization, Representation learning, Dropout, Information bottleneck
Paper link: https://doi.org/10.1007/s11263-022-01674-w