The Potential of Ensemble‐Based Automated Sleep Staging on Single‐Channel EEG Signal From a Wearable Device
Salfi, Federico; Corigliano, Domenico; Amicucci, Giulia; D'Atri, Aurora; Ferrara, Michele
2026-01-01
Abstract
Machine-learning-based sleep staging models have achieved expert-level performance on standard polysomnographic (PSG) data. However, their application to EEG recorded by wearable devices remains limited by non-conventional referencing montages and the lack of benchmarking against PSG. Here, we tested whether an ensemble of state-of-the-art staging algorithms can reliably classify sleep from a customised configuration of the ZMax headband, adapted to record a fronto-mastoid EEG channel. Thirty-five nights of simultaneous ZMax and PSG recordings were acquired in a home setting, yielding 250.02 h of data from 10 healthy participants. PSG data were scored according to AASM criteria by two independent experts, with discrepancies resolved to obtain a consensus hypnogram. ZMax signals were processed using four machine-learning algorithms (YASA, U-Sleep, SleepTransformer, DeepResNet), whose predictions were combined into a final ensemble scoring through soft voting. Quantitative/qualitative analyses of NREM slow waves and spindles evaluated the preservation of microstructural features across recording systems. The ensemble scoring achieved almost perfect agreement with human consensus staging (night-level mean ± SD; accuracy = 88.83% ± 2.84%, Cohen's κ = 84.10% ± 4.52%, and Matthews Correlation Coefficient = 84.54% ± 4.23%). It showed excellent classification efficiency for REM (F1-score = 93.99%), N3 (89.53%), N2 (87.93%), and wakefulness (86.37%), with lower performance for N1 (53.20%). Microstructural comparisons confirmed strong correspondence between ZMax and PSG signals. These findings support the deployment of an ensemble scoring approach based on state-of-the-art sleep staging algorithms on ultra-minimal EEG setups. This paradigm advances the integration of data from modern wearable technologies into traditional PSG-based sleep research, overcoming longstanding barriers to ecological, large-scale sleep monitoring.
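The soft-voting scheme described above can be sketched in a few lines: each model outputs a per-epoch probability distribution over the five AASM stages, the distributions are averaged across models, and the ensemble label is the stage with the highest mean probability. This is a minimal illustrative sketch only; the array shapes, stage ordering, and random placeholder probabilities are assumptions, not the paper's actual implementation or the real model outputs.

```python
import numpy as np

# Hypothetical per-epoch class probabilities from four staging models,
# each of shape (n_epochs, 5) over the stages W, N1, N2, N3, REM.
# Random Dirichlet draws stand in for real model outputs.
rng = np.random.default_rng(0)
n_epochs, n_stages = 6, 5
model_probs = [rng.dirichlet(np.ones(n_stages), size=n_epochs) for _ in range(4)]

# Soft voting: average the probability vectors across the four models,
# then take the stage with the highest mean probability in each epoch.
mean_probs = np.mean(model_probs, axis=0)          # shape (n_epochs, 5)
ensemble_labels = np.argmax(mean_probs, axis=1)     # one stage index per epoch

stages = np.array(["W", "N1", "N2", "N3", "REM"])
print(stages[ensemble_labels])
```

In practice, the resulting per-epoch labels would then be compared against the consensus hypnogram with the agreement metrics reported in the abstract (accuracy, Cohen's κ, Matthews Correlation Coefficient, per-stage F1-scores), all of which are available in scikit-learn's `metrics` module.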


