Improving Automatic Sleep Staging Via Temporal Smoothness Regularization.

Details

Resource 1
Download: Temporal smoothness regularization.pdf (508.92 KB)
State: Public
Version: Final published version
License: Not specified
Serval ID
serval:BIB_9C520F6DD4B4
Type
Inproceedings: an article in a conference proceedings.
Collection
Publications
Title
Improving Automatic Sleep Staging Via Temporal Smoothness Regularization.
Title of the conference
ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Author(s)
Phan Huy, Heremans Elisabeth, Chén Oliver Y., Koch Philipp, Mertins Alfred, De Vos Maarten
Publisher
IEEE
Publication state
Published
Issued date
04/06/2023
Peer-reviewed
Yes
Language
English
Abstract
We propose a regularization method, called temporal smoothness regularization, for training deep neural networks for automatic sleep staging in small-data settings. Intuitively, we constrain the cross-entropy losses of any two adjacent epochs in the sequential input to be as close to each other as possible. The regularization reflects the slowly-transitioning nature of the sleep process, which implies small information changes between two consecutive sleep epochs. Via the regularization, we essentially discourage the network from overfitting to these small changes. Our experiments show that training the SeqSleepNet base network with the proposed regularization improves performance over the baseline without the regularization. Furthermore, our method achieves performance on par with the state of the art while outperforming other existing methods.
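The idea described in the abstract can be sketched as follows: compute the per-epoch cross-entropy losses over a sequence of sleep epochs, then add a penalty on the difference between losses of adjacent epochs. This is a minimal illustrative sketch, not the paper's implementation; the exact penalty form (absolute vs. squared difference), the weight `lam`, and the function names are assumptions.

```python
import math

def cross_entropy(probs, label):
    """Cross-entropy loss of one epoch given predicted class probabilities."""
    return -math.log(probs[label])

def smoothness_penalty(losses):
    """Mean absolute difference between losses of adjacent epochs
    (assumed penalty form; the paper may use a different variant)."""
    return sum(abs(a - b) for a, b in zip(losses, losses[1:])) / (len(losses) - 1)

def regularized_loss(prob_seq, label_seq, lam=0.1):
    """Average cross-entropy over the sequence plus the weighted
    temporal-smoothness penalty (lam is a hypothetical weight)."""
    losses = [cross_entropy(p, y) for p, y in zip(prob_seq, label_seq)]
    return sum(losses) / len(losses) + lam * smoothness_penalty(losses)
```

For example, a three-epoch sequence with slowly drifting predictions incurs only a small penalty, while a sequence whose per-epoch losses jump sharply is penalized more, discouraging the network from fitting abrupt changes between consecutive epochs.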
Keywords
Automatic sleep staging, transfer learning, regularization, temporal smoothness, SeqSleepNet.
Create date
11/01/2024 19:05
Last modification date
22/04/2024 16:59