Source-free unsupervised domain adaptation for cross-modality abdominal multi-organ segmentation
Abstract
Domain adaptation is crucial for transferring knowledge from a labeled source CT dataset to an unlabeled target MR dataset in abdominal multi-organ segmentation. At the same time, it is highly desirable to avoid the high cost of annotating the target dataset and to protect the privacy of the source dataset. We therefore propose an effective source-free unsupervised domain adaptation method for cross-modality abdominal multi-organ segmentation that requires no access to the source dataset. The proposed framework comprises two stages. In the first stage, feature-map statistics-guided model adaptation combined with entropy minimization helps the top segmentation network segment the target images reliably. The pseudo-labels output by the top segmentation network guide the style compensation network to generate source-like images, and the pseudo-labels output by the middle segmentation network supervise the learning of the desired model (the bottom segmentation network). In the second stage, circular learning and pixel-adaptive mask refinement further improve the performance of the desired model. With this approach, we achieve satisfactory abdominal multi-organ segmentation performance, outperforming existing state-of-the-art domain adaptation methods. The proposed approach can also be easily extended to settings in which annotated target data are available: with only one labeled MR volume, its performance matches that of supervised learning. Furthermore, the approach proves effective for source-free unsupervised domain adaptation in the reverse direction (from MR to CT).
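The first-stage combination of the two loss terms named above can be illustrated with a short sketch. The following PyTorch code is a minimal illustration, not the authors' implementation: it assumes a BatchNorm-based segmentation network pretrained on the source CT data, minimizes the per-pixel prediction entropy on unlabeled target images, and matches target-batch feature statistics to the source statistics frozen in the BatchNorm running buffers. All names (segnet, lam, the toy network, and the loss weighting) are illustrative assumptions; the paper's actual objective and architecture are described in the full text.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def entropy_minimization_loss(logits):
    # logits: (B, C, H, W) segmentation logits on an unlabeled target batch.
    probs = F.softmax(logits, dim=1)
    log_probs = F.log_softmax(logits, dim=1)
    return -(probs * log_probs).sum(dim=1).mean()

def register_bn_hooks(model, store):
    # Record the input feature map of every BatchNorm2d layer on each forward pass.
    handles = []
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d):
            def hook(module, inputs, output, key=name):
                store[key] = inputs[0]
            handles.append(m.register_forward_hook(hook))
    return handles

def bn_statistics_loss(model, store):
    # Penalize the gap between the target batch's feature-map statistics and
    # the source statistics frozen in the BatchNorm running buffers.
    loss = torch.zeros((), device=next(model.parameters()).device)
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d) and name in store:
            feat = store[name]                              # (B, C, H, W)
            mu = feat.mean(dim=(0, 2, 3))
            var = feat.var(dim=(0, 2, 3), unbiased=False)
            loss = loss + F.mse_loss(mu, m.running_mean) \
                        + F.mse_loss(var, m.running_var)
    return loss

# Usage: one adaptation step on an unlabeled target batch. The tiny network
# below is a stand-in for a real source-pretrained segmentation model.
segnet = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.BatchNorm2d(8),
                       nn.ReLU(), nn.Conv2d(8, 5, 1))
segnet.eval()  # eval mode keeps the source BN statistics frozen
store, lam = {}, 0.1  # lam: illustrative weight for the statistics loss
handles = register_bn_hooks(segnet, store)
optimizer = torch.optim.Adam(segnet.parameters(), lr=1e-4)

target_batch = torch.randn(2, 1, 64, 64)  # unlabeled target MR slices
logits = segnet(target_batch)
loss = entropy_minimization_loss(logits) + lam * bn_statistics_loss(segnet, store)
optimizer.zero_grad()
loss.backward()
optimizer.step()
for h in handles:
    h.remove()
```

Keeping the network in eval mode is the design choice that makes this source-free: the BatchNorm buffers act as a compact summary of the source feature distribution, so the target model can be pulled toward the source statistics without ever touching the source images.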
Keywords: Abdominal organs segmentation, Source data free, Unsupervised domain adaptation, Style compensation, Data privacy protection
Article history: Received 29 November 2021, Revised 24 May 2022, Accepted 24 May 2022, Available online 2 June 2022, Version of Record 9 June 2022.
DOI: https://doi.org/10.1016/j.knosys.2022.109155