L-SeqSleepNet: Whole-cycle Long Sequence Modeling for Automatic Sleep Staging.

Details

Resource 1: Request a copy. Under indefinite embargo.
UNIL restricted access
Status: Public
Version: Final published version
License: Not specified
Serval ID
serval:BIB_9BE4A24B8351
Type
Article: an article from a journal or magazine.
Collection
Publications
Institution
Title
L-SeqSleepNet: Whole-cycle Long Sequence Modeling for Automatic Sleep Staging.
Journal
IEEE Journal of Biomedical and Health Informatics
Authors
Phan H., Lorenzen K.P., Heremans E., Chén O.Y., Tran M.C., Koch P., Mertins A., Baumert M., Mikkelsen K.B., de Vos M.
ISSN
2168-2208 (Electronic)
ISSN-L
2168-2194
Editorial status
Published
Publication date
10/2023
Peer-reviewed
Yes
Volume
27
Issue
10
Pages
4748-4757
Language
English
Notes
Publication types: Journal Article
Publication Status: ppublish
Abstract
Human sleep is cyclical, with a period of approximately 90 minutes, implying long temporal dependencies in sleep data. Yet exploiting this long-term dependency when developing sleep staging models has remained unexplored. In this work, we show that while encoding the logic of a whole sleep cycle is crucial to improving sleep staging performance, the sequential modelling approach in existing state-of-the-art deep learning models is inefficient for that purpose. We therefore introduce a method for efficient long sequence modelling and propose a new deep learning model, L-SeqSleepNet, which takes whole-cycle sleep information into account for sleep staging. Evaluating L-SeqSleepNet on four distinct databases of various sizes, we demonstrate state-of-the-art performance across three different EEG setups, including scalp EEG in conventional polysomnography (PSG), in-ear EEG, and around-the-ear EEG (cEEGrid), even with a single EEG channel as input. Our analyses also show that L-SeqSleepNet is able to alleviate the predominance of N2 sleep (the majority class) in the predictions, bringing down errors in the other sleep stages. Moreover, the network becomes considerably more robust: for all subjects on which the baseline method performed exceptionally poorly, performance improves significantly. Finally, computation time grows only at a sub-linear rate as the sequence length increases.
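The abstract does not detail the architecture. Purely as an illustration of the general idea of efficient long sequence modelling, below is a minimal, hypothetical PyTorch sketch that folds a whole-cycle sequence of per-epoch features into subsequences, models short-range context within each subsequence, and then models whole-cycle context across subsequence summaries. All names (LongSequenceSleepStager, fold, the layer sizes) and the folding scheme are assumptions of this sketch, not the authors' implementation.

# Hypothetical sketch of hierarchical long-sequence modelling for sleep staging.
# NOT the paper's code: names, sizes, and the folding scheme are illustrative
# assumptions based only on the abstract.
import torch
import torch.nn as nn

class LongSequenceSleepStager(nn.Module):
    def __init__(self, epoch_feat_dim=128, hidden_dim=64, num_stages=5, fold=10):
        super().__init__()
        self.fold = fold  # number of 30-s epochs per subsequence
        # Intra-subsequence model: captures short-range stage transitions.
        self.intra = nn.LSTM(epoch_feat_dim, hidden_dim, batch_first=True,
                             bidirectional=True)
        # Inter-subsequence model: runs over only L / fold steps, which is
        # what keeps the cost growth sub-linear in the sequence length L.
        self.inter = nn.LSTM(2 * hidden_dim, hidden_dim, batch_first=True,
                             bidirectional=True)
        self.classifier = nn.Linear(4 * hidden_dim, num_stages)

    def forward(self, x):
        # x: (batch, L, epoch_feat_dim), one feature vector per 30-s epoch,
        # with L large enough to span a whole ~90-minute sleep cycle.
        b, L, d = x.shape
        assert L % self.fold == 0, "sequence length must be divisible by fold"
        # Fold the long sequence into (L // fold) subsequences of length fold.
        chunks = x.view(b * (L // self.fold), self.fold, d)
        intra_out, _ = self.intra(chunks)            # per-epoch, local context
        # Summarise each subsequence (mean over its epochs) ...
        summary = intra_out.mean(dim=1).view(b, L // self.fold, -1)
        inter_out, _ = self.inter(summary)           # whole-cycle context
        # ... and broadcast the whole-cycle context back to every epoch.
        glob = inter_out.repeat_interleave(self.fold, dim=1)
        local = intra_out.view(b, L, -1)
        return self.classifier(torch.cat([local, glob], dim=-1))  # (b, L, stages)

With 30-second epochs, for example, L = 200 spans roughly 100 minutes, i.e., a whole sleep cycle, while the inter-subsequence LSTM only unrolls over L / fold = 20 steps.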
PubMed
Record created
11/01/2024 19:05
Record last modified
19/01/2024 8:12