Personalized Automatic Sleep Staging with Single-night Data: A Pilot Study with Kullback-Leibler Divergence Regularization.
Details
Serval ID
serval:BIB_1DE1FEB72404
Type
Article: article from journal or magazine.
Collection
Publications
Institution
Title
Personalized Automatic Sleep Staging with Single-night Data: A Pilot Study with Kullback-Leibler Divergence Regularization.
Journal
Physiological measurement
ISSN
1361-6579 (Electronic)
ISSN-L
0967-3334
Publication state
Published
Issued date
01/07/2020
Peer-reviewed
Oui
Volume
41
Number
6
Pages
064004
Language
English
Notes
Publication types: Journal Article
Publication Status: epublish
Abstract
Brain waves vary between people. This work aims to improve automatic sleep staging for longitudinal sleep monitoring via personalization of algorithms based on individual characteristics extracted from sleep data recorded during the first night.
Because data from a single night are scarce, making model training difficult, we propose a Kullback-Leibler (KL) divergence regularized transfer-learning approach to address this problem. We employ the pretrained SeqSleepNet (i.e. the subject-independent model) as a starting point and finetune it with the single-night personalization data to derive the personalized model. This is done by adding the KL divergence between the output of the subject-independent model and that of the personalized model to the loss function during finetuning. In effect, the KL-divergence regularization prevents the personalized model from overfitting to the single-night data and from straying too far from the subject-independent model.
Experimental results on the Sleep-EDF Expanded database, consisting of 75 subjects, show that sleep staging personalization with single-night data is possible with the help of the proposed KL-divergence regularization. On average, we achieve a personalized sleep staging accuracy of 79.6%, a Cohen's kappa of 0.706, a macro F1-score of 73.0%, a sensitivity of 71.8%, and a specificity of 94.2%.
We find that the approach is robust against overfitting and that it improves accuracy by 4.5 percentage points over the baseline without personalization and by 2.2 percentage points over the baseline with personalization but without regularization.
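The regularized finetuning objective described above can be sketched as a cross-entropy term on the personalization night plus a KL penalty tying the personalized model's predicted stage distribution to that of the subject-independent model. This is a minimal numpy illustration, not the authors' implementation: the weight `lam`, the direction of the KL divergence, and the function names are assumptions for the sketch.

```python
import numpy as np

def softmax(logits):
    """Convert per-epoch sleep-stage logits to probabilities."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_regularized_loss(personal_logits, pretrained_logits, labels, lam=0.1):
    """Cross-entropy on the single-night labels plus lam * KL between the
    frozen subject-independent model's output and the personalized model's
    output, discouraging the finetuned model from drifting too far."""
    p_pers = softmax(personal_logits)       # personalized model predictions
    p_pre = softmax(pretrained_logits)      # subject-independent predictions
    n = len(labels)
    ce = -np.log(p_pers[np.arange(n), labels] + 1e-12).mean()
    kl = (p_pre * (np.log(p_pre + 1e-12)
                   - np.log(p_pers + 1e-12))).sum(axis=-1).mean()
    return ce + lam * kl
```

With `lam = 0` the objective reduces to plain finetuning; larger values of the (hypothetical) weight pull the personalized model back toward the pretrained SeqSleepNet.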
Keywords
Algorithms, Databases, Factual, Humans, Pilot Projects, Polysomnography/methods, Sleep, Sleep Stages
Pubmed
Web of science
Open Access
Yes
Create date
11/01/2024 18:05
Last modification date
18/01/2024 15:04