Weakly-Interactive-Mixed Learning: Less Labelling Cost for Better Medical Image Segmentation.

Publisher:
IEEE - Institute of Electrical and Electronics Engineers Inc.
Publication Type:
Journal Article
Citation:
IEEE J Biomed Health Inform, 2023, 27, (7), pp. 3270-3281
Issue Date:
2023-07
Common medical image segmentation tasks require large training datasets with pixel-level annotations, which are expensive and time-consuming to prepare. To overcome this limitation while still achieving the desired segmentation accuracy, a novel Weakly-Interactive-Mixed Learning (WIML) framework is proposed that makes efficient use of weak labels. On one hand, WIML uses weak labels to reduce the annotation time needed for high-quality strong labels through its Weakly-Interactive Annotation (WIA) component, which prudently introduces interactive learning into a weakly-supervised segmentation strategy. On the other hand, WIML uses weak labels together with very few strong labels to reach the desired segmentation accuracy through its Mixed-Supervised Learning (MSL) component, which boosts accuracy by providing strong prior knowledge during training. In addition, a multi-task Full-Parameter-Sharing Network (FPSNet) is proposed to better implement this framework. Specifically, to further reduce annotation time, attention modules (scSE) are integrated into FPSNet, improving class activation map (CAM) performance for the first time. To further improve segmentation accuracy, a Full-Parameter-Sharing (FPS) strategy is designed in FPSNet to alleviate overfitting of the segmentation task, which is supervised by very few strong labels. The proposed method is validated on the BraTS 2019 and LiTS 2017 datasets, and experiments demonstrate that WIML-FPSNet outperforms several state-of-the-art segmentation methods with minimal annotation effort.
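The abstract does not describe FPSNet's internals, but the scSE attention it integrates is the published concurrent spatial-and-channel squeeze-and-excitation block (Roy et al.): a channel path that recalibrates each feature channel from a global average pool, and a spatial path that recalibrates each location via a 1x1 convolution, with the two paths combined elementwise. The following is a minimal NumPy sketch of that block only; the weight shapes, the reduction ratio, and the elementwise-max combination are illustrative assumptions, not details taken from this paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def scse(feat, w1, w2, w_spatial):
    """Concurrent spatial & channel squeeze-and-excitation (sketch).

    feat      : (C, H, W) feature map
    w1, w2    : FC weights of the channel path, shapes (C//r, C) and (C, C//r)
    w_spatial : (C,) weights of the 1x1-conv spatial path
    """
    C, H, W = feat.shape
    # --- channel squeeze-and-excitation (cSE) ---
    z = feat.mean(axis=(1, 2))                  # global average pool -> (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))   # FC -> ReLU -> FC -> sigmoid
    cse = feat * s[:, None, None]               # rescale each channel
    # --- spatial squeeze-and-excitation (sSE) ---
    q = sigmoid(np.tensordot(w_spatial, feat, axes=([0], [0])))  # (H, W)
    sse = feat * q[None, :, :]                  # rescale each location
    # combine the two recalibrated maps (elementwise max is one common variant)
    return np.maximum(cse, sse)
```

In a segmentation network such a block is dropped in after a convolutional stage; the output has the same shape as the input, so it can be inserted without changing the surrounding architecture.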