Contrast-Enhancing Lesion Segmentation in Multiple Sclerosis: A Deep Learning Approach Validated in a Multicentric Cohort.
Details
Serval ID
serval:BIB_801C9431AA9A
Type
Article: article from journal or magazine.
Collection
Publications
Institution
Title
Contrast-Enhancing Lesion Segmentation in Multiple Sclerosis: A Deep Learning Approach Validated in a Multicentric Cohort.
Journal
Bioengineering
ISSN
2306-5354 (Print)
ISSN-L
2306-5354
Publication state
Published
Issued date
22/08/2024
Peer-reviewed
Yes
Volume
11
Number
8
Language
English
Notes
Publication types: Journal Article
Publication Status: epublish
Abstract
The detection of contrast-enhancing lesions (CELs) is fundamental for the diagnosis and monitoring of patients with multiple sclerosis (MS). In clinical practice, this task is time-consuming and suffers from high intra- and inter-rater variability, yet only a few studies have proposed automatic approaches for CEL detection. This study aimed to develop a deep learning model that automatically detects and segments CELs in clinical Magnetic Resonance Imaging (MRI) scans. A 3D UNet-based network was trained on clinical MRI from the Swiss Multiple Sclerosis Cohort. The dataset comprised 372 scans from 280 MS patients: 162 showed at least one CEL, while 118 showed no CELs. The input dataset consisted of T1-weighted images acquired before and after gadolinium injection, together with FLuid-Attenuated Inversion Recovery (FLAIR) images. The sampling strategy was based on a white matter lesion mask to confirm the existence of real contrast-enhancing lesions. To overcome the dataset imbalance, a weighted loss function was implemented. The Dice Score Coefficient, True Positive Rate, and False Positive Rate were 0.76, 0.93, and 0.02, respectively. Based on these results, the model developed in this study might well be considered for clinical decision support.
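The abstract mentions a weighted loss to counter class imbalance and reports a Dice Score Coefficient. As an illustration only (the paper's exact loss formulation and weights are not given here), a minimal NumPy sketch of a class-weighted binary cross-entropy and the Dice score might look like this; the `pos_weight` value is a hypothetical placeholder, not the study's parameter:

```python
import numpy as np

def weighted_bce(pred, target, pos_weight=10.0, eps=1e-7):
    """Class-weighted binary cross-entropy: up-weights the rare
    positive (lesion) voxels to counter dataset imbalance.
    `pos_weight` is an illustrative value, not taken from the paper."""
    pred = np.clip(pred, eps, 1.0 - eps)  # avoid log(0)
    loss = -(pos_weight * target * np.log(pred)
             + (1.0 - target) * np.log(1.0 - pred))
    return loss.mean()

def dice_score(pred_mask, target_mask, eps=1e-7):
    """Dice Score Coefficient between two binary masks."""
    pred_mask = pred_mask.astype(bool)
    target_mask = target_mask.astype(bool)
    inter = np.logical_and(pred_mask, target_mask).sum()
    return (2.0 * inter + eps) / (pred_mask.sum() + target_mask.sum() + eps)
```

With `pos_weight > 1`, a missed lesion voxel (false negative) is penalized more heavily than a spurious one (false positive), which is the usual motivation for weighting the loss in sparse-lesion segmentation.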
Keywords
automatic segmentation, deep learning, gadolinium contrast-enhancing lesions, multiple sclerosis
Pubmed
Web of science
Open Access
Yes
Create date
09/09/2024 13:34
Last modification date
29/10/2024 7:21