Chen, Yaxi;
Ivanova, Aleksandra;
Saeed, Shaheer U;
Hargunani, Rikin;
Huang, Jie;
Liu, Chaozong;
Hu, Yipeng;
(2024)
Segmentation by Registration-Enabled SAM Prompt Engineering Using Five Reference Images.
In: Modat, Marc and Simpson, Ivor and Špiclin, Žiga and Bastiaansen, Wietske and Hering, Alessa and Mok, Tony CW, (eds.)
Biomedical Image Registration: 11th International Workshop, WBIR 2024, Held in Conjunction with MICCAI 2024, Marrakesh, Morocco, October 6, 2024, Proceedings.
Springer: Cham, Switzerland.
Text: 2407.17933v1.pdf (Accepted Version). Access restricted to UCL open access staff until 6 October 2025.
Abstract
The recently proposed Segment Anything Model (SAM) is a general tool for image segmentation, but it requires additional adaptation and careful fine-tuning for medical image segmentation, especially for small, irregularly shaped, and boundary-ambiguous anatomical structures such as the knee cartilage of interest in this work. Repaired cartilage, after certain surgical procedures, exhibits imaging patterns unseen during pre-training, posing further challenges for using models like SAM with or without general-purpose fine-tuning. To address this, we propose a novel registration-based prompt engineering framework for medical image segmentation using SAM. This approach uses established image registration algorithms to align the new image (to be segmented) with a small number of reference images, without requiring segmentation labels. The spatial transformations generated by registration align either the new image or pre-defined point-based prompts, before these are used as input to SAM. This strategy, requiring as few as five reference images with defined point prompts, effectively prompts SAM for inference on new images without needing any segmentation labels. Evaluation on MR images from patients who received cartilage stem cell therapy yielded Dice scores of 0.89, 0.87, 0.53, and 0.52 for segmenting the femur, tibia, and femoral and tibial cartilages, respectively. This outperforms atlas-based label fusion and is comparable to supervised nnUNet, an upper-bound fair baseline in this application, both of which require full segmentation labels for reference samples. The code is available at: https://github.com/chrissyinreallife/KneeSegmentWithSAM.git.
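The abstract outlines the pipeline at a high level: register a reference image to the new image without using any segmentation labels, carry the reference's pre-defined point prompts through the resulting spatial transformation, and use the transformed points to prompt SAM. Below is a minimal sketch of that idea, assuming SimpleITK for the registration step and the official segment-anything package for prompting; the file names, prompt coordinates, model variant, and checkpoint path are illustrative placeholders rather than the authors' actual configuration, which lives in the linked GitHub repository.

```python
# Sketch: registration-driven transfer of point prompts to SAM.
# Assumes 2D image slices; all paths and coordinates below are placeholders.
import numpy as np
import SimpleITK as sitk
from segment_anything import sam_model_registry, SamPredictor

# Reference slice (with pre-defined point prompts) and the new slice to segment.
ref_img = sitk.ReadImage("reference_slice.nii.gz", sitk.sitkFloat32)
new_img = sitk.ReadImage("new_slice.nii.gz", sitk.sitkFloat32)

# Pre-defined point prompts on the reference image, in physical coordinates
# (e.g. points placed inside the femoral cartilage) -- illustrative values.
ref_points = [(85.0, 120.0), (92.0, 131.0)]

# Register the new image (moving) to the reference image (fixed);
# no segmentation labels are needed for this step.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(
    sitk.CenteredTransformInitializer(
        ref_img, new_img, sitk.AffineTransform(2),
        sitk.CenteredTransformInitializerFilter.GEOMETRY,
    )
)
reg.SetInterpolator(sitk.sitkLinear)
transform = reg.Execute(ref_img, new_img)

# The estimated transform maps points from the fixed (reference) domain to the
# moving (new) domain, so it carries the reference prompts onto the new image.
new_points_phys = [transform.TransformPoint(p) for p in ref_points]
new_points_idx = np.array(
    [new_img.TransformPhysicalPointToContinuousIndex(p) for p in new_points_phys]
)

# Prompt SAM on the new image with the transformed points.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth")
predictor = SamPredictor(sam)
slice_2d = sitk.GetArrayFromImage(new_img)            # (H, W) float array
rgb = np.stack([slice_2d] * 3, axis=-1)                # SAM expects 3-channel uint8
rgb = (255 * (rgb - rgb.min()) / (np.ptp(rgb) + 1e-8)).astype(np.uint8)
predictor.set_image(rgb)
masks, scores, _ = predictor.predict(
    point_coords=new_points_idx.astype(np.float32),    # (N, 2), (x, y) pixel order
    point_labels=np.ones(len(new_points_idx), dtype=np.int64),  # all foreground
)
```

The paper also describes the alternative of warping the new image into the reference space instead of warping the prompts; in that variant the same prompts are reused unchanged and the predicted mask is mapped back through the inverse transformation.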
Type: Proceedings paper
Title: Segmentation by Registration-Enabled SAM Prompt Engineering Using Five Reference Images
Event: WBIR: International Workshop on Biomedical Image Registration (WBIR 2024)
ISBN-13: 978-3-031-73479-3
DOI: 10.1007/978-3-031-73480-9_19
Publisher version: https://doi.org/10.1007/978-3-031-73480-9_19
Language: English
Additional information: This version is the author accepted manuscript. For information on re-use, please refer to the publisher's terms and conditions.
UCL classification: UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Mechanical Engineering
URI: https://discovery.ucl.ac.uk/id/eprint/10205356