Sensors (Basel). 2020 Mar 2;20(5):1360. doi: 10.3390/s20051360.

Sleep Apnea Detection with Polysomnography and Depth Sensors


Martin Schätz et al.

Abstract

This paper pursues two goals: to show that various depth sensors can record breathing rate with the same accuracy as the contact sensors used in polysomnography (PSG), and to prove that breathing signals from depth sensors are as sensitive to breathing changes as PSG records. The breathing signal from depth sensors can be used for classification of sleep apnea events with the same success rate as with PSG data. The recent development of computational technologies has led to a big leap in the usability of range imaging sensors. New depth sensors are smaller and offer a higher sampling rate, better resolution, and greater precision. They are widely used for computer vision in robotics, but they can also serve as non-contact and non-invasive systems for monitoring breathing and its features. The breathing rate can easily be represented as the frequency of the recorded signal. All tested depth sensors (MS Kinect v2, RealSense SR300, R200, D415 and D435) are capable of recording depth data with enough precision in depth sensing and sampling frequency in time (20-35 frames per second (FPS)) to capture the breathing rate. The spectral analysis shows a breathing rate between 0.2 Hz and 0.33 Hz, which corresponds to the breathing rate of an adult person during sleep. To test the quality of the breathing signal processed by the proposed workflow, a neural network classifier (simple competitive NN) was trained on a set of 57 whole-night polysomnographic records with sleep apnea events classified by a sleep specialist. The resulting classifier can mark all apnea events with 100% accuracy when compared to the classification of the sleep specialist, which is useful for estimating the number of events per hour. When compared to the sleep specialist's classification of polysomnographic breathing signal segments, which is used for calculating the length of each event, the classifier achieves an F1 score of 92.2% and an accuracy of 96.8% (sensitivity 89.1%, specificity 98.8%). The classifier also proves successful when tested on breathing signals from MS Kinect v2 and RealSense R200 with simulated sleep apnea events. The whole process can be fully automatic after implementation of automatic chest area segmentation of the depth data.
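As a minimal illustration of the spectral step described in the abstract, the sketch below (not the authors' code; the function name, the synthetic two-minute signal, and the restriction of the peak search to the reported 0.2-0.33 Hz band are assumptions made for illustration) estimates the dominant breathing frequency of a chest-region depth signal with an FFT:

```python
# A minimal sketch, not the authors' implementation: estimate breathing rate
# from a depth-sensor chest signal by spectral analysis, as described above.
# The chest signal is assumed to be the mean depth of a segmented chest
# region per frame; all names and parameters here are illustrative.

import numpy as np

def breathing_rate_hz(chest_depth, fps, band=(0.2, 0.33)):
    """Return the dominant frequency (Hz) of the chest-depth signal inside `band`."""
    signal = np.asarray(chest_depth, dtype=float)
    signal = signal - signal.mean()          # remove the static sensor-to-chest distance

    spectrum = np.abs(np.fft.rfft(signal))   # magnitude spectrum
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)

    # Search only the band reported for a sleeping adult (0.2-0.33 Hz);
    # this assumes the record is long enough to resolve that band.
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    if not in_band.any():
        raise ValueError("Record too short to resolve the breathing band.")
    return freqs[in_band][np.argmax(spectrum[in_band])]

if __name__ == "__main__":
    # Synthetic example: two minutes of a 0.25 Hz "breathing" motion at 30 FPS.
    fps = 30.0
    t = np.arange(0, 120, 1.0 / fps)
    chest_depth = 1500.0 + 5.0 * np.sin(2 * np.pi * 0.25 * t)   # depth in millimetres
    print(f"Estimated breathing rate: {breathing_rate_hz(chest_depth, fps):.2f} Hz")
```

Run on the synthetic 0.25 Hz signal, the script prints an estimate close to 0.25 Hz; on real depth recordings the chest area would first have to be segmented from each frame, as noted at the end of the abstract.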

Keywords: breathing analysis; computational intelligence; depth sensors; human-machine interaction; image processing; signal processing.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Comparison of spectral analysis of 8-hour records from (a) PSG, (b) MS Kinect v2 resampled to 10 Hz, and (c-f) shorter signals from RealSense depth sensors.
Figure 2. Training data with classification and an example of a selected central apnea event.
Figure 3. Comparison of sleep apnea classification. All signals have a sampling frequency of 10 Hz.
