Purpose: Accurate localization of the prostate in CT images plays an important role in image-guided radiation therapy. However, the task faces two main challenges: the low contrast of CT images, which makes the prostate difficult to localize, and the unpredictable presence of bowel gas. In this work, an automatic prostate localization method is proposed to address these two challenges.

Methods: The proposed method extracts anatomical features from the input images, and the best features at distinctive image regions are learnt to localize the prostate. An online update mechanism is adopted to adaptively combine inter-patient and patient-specific information during the learning process. An explicit similarity measure function is built on the learnt features to align the planning image with the treatment images. The prostate in a treatment image can thus be localized by transforming the segmented prostate in the planning image to the space of the treatment image.

Results: We evaluate the proposed method on a 3D CT prostate dataset consisting of 10 patients, each with more than 10 CT scans. Manual segmentations provided by clinical experts are also available. Segmentation accuracy is evaluated with two quantitative measures: the centroid distance and the Dice ratio between the estimated prostate and the manually segmented prostate. For all patients, the 25th and 75th percentiles of the centroid distances are within 1 mm, and the average Dice ratio reaches 89%.

Conclusions: The proposed method achieves high prostate segmentation accuracy in CT images. Most importantly, it is highly flexible in clinical application, as high segmentation accuracy can be achieved even when only the planning image of the underlying patient is available.
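The two evaluation measures named in the Results are standard and can be sketched concretely. Below is a minimal NumPy illustration (not the authors' code) of the Dice ratio and the centroid distance between an estimated and a reference binary 3D mask; the function names and the isotropic 1 mm default spacing are assumptions for illustration.

```python
import numpy as np

def dice_ratio(pred, ref):
    """Dice overlap between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    denom = pred.sum() + ref.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, ref).sum() / denom

def centroid_distance(pred, ref, spacing=(1.0, 1.0, 1.0)):
    """Euclidean distance (in mm) between the centroids of two masks,
    scaled by the voxel spacing along each axis."""
    c_pred = np.array(np.nonzero(pred)).mean(axis=1)
    c_ref = np.array(np.nonzero(ref)).mean(axis=1)
    return float(np.linalg.norm((c_pred - c_ref) * np.asarray(spacing)))
```

For example, two identical masks give a Dice ratio of 1.0 and a centroid distance of 0 mm; a mask shifted by one voxel along one axis (with 1 mm voxels) gives a centroid distance of 1.0 mm.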
ASJC Scopus subject areas
- Radiology, Nuclear Medicine and Imaging