TY - JOUR
T1 - Multiatlas-Based Segmentation Editing with Interaction-Guided Patch Selection and Label Fusion
AU - Park, Sang Hyun
AU - Gao, Yaozong
AU - Shen, Dinggang
N1 - Funding Information:
This work was supported in part by NIH grant CA140413.
Publisher Copyright:
© 2016 IEEE.
PY - 2016/6
Y1 - 2016/6
AB - We propose a novel multiatlas-based segmentation method for the segmentation editing scenario, where an incomplete segmentation is given along with a set of existing reference label images (used as atlases). Unlike previous multiatlas-based methods, which depend solely on appearance features, we incorporate interaction-guided constraints to find appropriate atlas label patches in the reference label set and to derive their weights for label fusion. Specifically, user interactions provided on the erroneous parts are first divided into multiple local combinations. For each combination, the atlas label patches that match both the interactions and the previous segmentation well are identified. The segmentation is then updated through voxelwise label fusion of the selected atlas label patches, with weights derived from the distance of each underlying voxel to the interactions. Since atlas label patches matched to different local combinations are used in the fusion step, our method can account for various local shape variations during the segmentation update, even with only limited atlas label images and user interactions. Moreover, since our method depends on neither image appearance nor sophisticated learning steps, it can be easily applied to general editing problems. To demonstrate its generality, we apply it to editing segmentations of the prostate in CT, the brainstem in CT, and the hippocampus in MR images. Experimental results show that our method outperforms existing editing methods on all three datasets.
KW - Distance-based voting
KW - Segmentation editing
KW - Interaction-guided editing
KW - Label fusion
UR - http://www.scopus.com/inward/record.url?scp=84976420232&partnerID=8YFLogxK
U2 - 10.1109/TBME.2015.2491612
DO - 10.1109/TBME.2015.2491612
M3 - Article
C2 - 26485353
AN - SCOPUS:84976420232
SN - 0018-9294
VL - 63
SP - 1208
EP - 1219
JO - IEEE Transactions on Biomedical Engineering
JF - IEEE Transactions on Biomedical Engineering
IS - 6
M1 - 7299268
ER -