MuSIC: Multi-Sequential Interactive Co-Registration for Cancer Imaging Data based on Segmentation Masks
Abstract
In gynecologic cancer imaging, multiple magnetic resonance imaging (MRI) sequences are acquired per patient to reveal different tissue characteristics. However, after image acquisition, the anatomical structures can be misaligned in the various sequences due to changing patient location in the scanner and organ movements. The co-registration process aims to align the sequences to allow for multi-sequential tumor imaging analysis. However, automatic co-registration often leads to unsatisfying results. To address this problem, we propose the web-based application MuSIC (Multi-Sequential Interactive Co-registration). The approach allows medical experts to co-register multiple sequences simultaneously based on a pre-defined segmentation mask generated for one of the sequences. Our contributions lie in our proposed workflow. First, a shape matching algorithm based on dual annealing searches for the tumor position in each sequence. The user can then interactively adapt the proposed segmentation positions if needed. During this procedure, we include a multi-modal magic lens visualization for visual quality assessment. Then, we register the volumes based on the segmentation mask positions. We allow for both rigid and deformable registration. Finally, we conducted a usability analysis with seven medical and machine learning experts to verify the utility of our approach. Our participants highly appreciate the multi-sequential setup and see themselves using MuSIC in the future.
Best Paper Honorable Mention at VCBM2022
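The shape-matching step can be illustrated with a short sketch: SciPy's dual annealing optimizer searches over a 3D offset of the tumor segmentation mask and scores each candidate position against the target sequence. This is a minimal, hypothetical example and not the MuSIC implementation; the function name, the intensity-based cost, and the assumption that both sequences share one voxel grid are ours.

# Hypothetical sketch of the dual-annealing mask-position search (not the
# authors' code): find the offset that places the tumor mask from the
# reference sequence onto another sequence.
import numpy as np
from scipy.ndimage import shift as nd_shift
from scipy.optimize import dual_annealing


def mask_position_search(mask, reference, target, max_offset=20.0):
    """Estimate the (z, y, x) offset that places `mask` onto `target`.

    mask      -- binary tumor segmentation drawn on the reference sequence
    reference -- MRI volume the mask belongs to
    target    -- MRI volume of another sequence, same shape as `reference`
    """
    mask = mask.astype(bool)
    ref_mean = reference[mask].mean()   # simple appearance model of the tumor

    def cost(offset):
        # Move the mask by the candidate offset; order=0 keeps it binary.
        shifted = nd_shift(mask.astype(np.uint8), offset, order=0).astype(bool)
        if not shifted.any():           # mask pushed outside the volume
            return 1e12
        # Penalize positions where the target intensities under the shifted
        # mask deviate strongly from the tumor appearance in the reference.
        return float(np.mean((target[shifted] - ref_mean) ** 2))

    bounds = [(-max_offset, max_offset)] * 3
    result = dual_annealing(cost, bounds, maxiter=200, seed=0)
    return result.x, result.fun         # proposed offset and its cost

In such a setup, the proposed offsets would serve as the initial segmentation positions that the user adapts interactively before the rigid or deformable registration is run.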
T. Eichner, E. Mörth, K. S. Wagner-Larsen, N. Lura, I. S. Haldorsen, E. Gröller, S. Bruckner, and N. N. Smit, "MuSIC: Multi-Sequential Interactive Co-Registration for Cancer Imaging Data based on Segmentation Masks," in Eurographics Workshop on Visual Computing for Biology and Medicine, 2022. doi:10.2312/vcbm.20221190
[BibTeX]
@inproceedings {EichnerMoerth2022MuSIC,
booktitle = {Eurographics Workshop on Visual Computing for Biology and Medicine},
editor = {Renata G. Raidou and Björn Sommer and Torsten W. Kuhlen and Michael Krone and Thomas Schultz and Hsiang-Yun Wu},
title = {{MuSIC: Multi-Sequential Interactive Co-Registration for Cancer Imaging Data based on Segmentation Masks}},
author = {Eichner, Tanja* and Mörth, Eric* and Wagner-Larsen, Kari S. and Lura, Njål and Haldorsen, Ingfrid S. and Gröller, Eduard and Bruckner, Stefan and Smit, Noeska N.},
note = {Best Paper Honorable Mention at VCBM2022},
project = {ttmedvis},
year = {2022},
abstract = {In gynecologic cancer imaging, multiple magnetic resonance imaging (MRI) sequences are acquired per patient to reveal different tissue characteristics. However, after image acquisition, the anatomical structures can be misaligned in the various sequences due to changing patient location in the scanner and organ movements. The co-registration process aims to align the sequences to allow for multi-sequential tumor imaging analysis. However, automatic co-registration often leads to unsatisfying results. To address this problem, we propose the web-based application MuSIC (Multi-Sequential Interactive Co-registration). The approach allows medical experts to co-register multiple sequences simultaneously based on a pre-defined segmentation mask generated for one of the sequences. Our contributions lie in our proposed workflow. First, a shape matching algorithm based on dual annealing searches for the tumor position in each sequence. The user can then interactively adapt the proposed segmentation positions if needed. During this procedure, we include a multi-modal magic lens visualization for visual quality assessment. Then, we register the volumes based on the segmentation mask positions. We allow for both rigid and deformable registration. Finally, we conducted a usability analysis with seven medical and machine learning experts to verify the utility of our approach. Our participants highly appreciate the multi-sequential setup and see themselves using MuSIC in the future.
},
publisher = {The Eurographics Association},
ISSN = {2070-5786},
ISBN = {978-3-03868-177-9},
DOI = {10.2312/vcbm.20221190},
pdf = {pdfs/EichnerMoerth_2022.pdf},
thumbnails = {images/EichnerMoerth_2022.PNG},
images = {images/EichnerMoerth_2022.PNG},
}