Non-contact Doppler radar based prediction of nocturnal body orientations using deep neural network for chronic heart failure patients

Publication Type:
Conference Proceeding
2017 International Conference on Electrical and Computing Technologies and Applications (ICECTA 2017), 2018-January, pp. 1-5
© 2017 IEEE. Sleep is crucial in daily life, playing a key role in physical and mental health. Monitoring sleep body orientations and movements is important because of their relationship to particular diseases, e.g., obstructive sleep apnea, insomnia, and periodic limb movement disorder. Analyzing sleep body orientations also helps determine sleep quality and irregular sleeping patterns. However, current non-invasive technologies for monitoring sleep body orientation are not well suited to long-term continuous monitoring because of their restrictions on mobility and comfort. This paper proposes a system that applies a feature extraction process based on wavelet packet decomposition to extract features describing the non-contact Doppler radar signatures caused by body orientations. A database of 24 chronic heart failure patients is used for training, validation, and testing of the non-contact body orientation prediction. These patients were diagnosed with New York Heart Association heart failure Class II or III and underwent full polysomnography for the diagnosis of sleep apnea, disordered sleep, or both. The patients' data are randomly concatenated and partitioned in the ratio of 50% for 'Training', 15% for 'Validation', and 35% for 'Test'. Across the 'Test' dataset, with a total sleep duration of 65 hours, body orientation prediction achieved a correct classification rate of 99.2% over five classes: 'Prone', 'Upright', 'Supine', 'Right', and 'Left'. The misclassification rate is 0.8%. A potential application is non-contact continuous monitoring of nocturnal body orientations in the home.
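The front end of the pipeline described above (wavelet packet decomposition of each radar epoch into features, followed by a random 50/15/35 partition) can be sketched as below. This is a minimal illustration only: the abstract does not specify the wavelet basis, decomposition depth, or feature type, so a Haar basis and leaf-node energies are assumed here.

```python
import numpy as np

def haar_wpd_energies(signal, depth=3):
    """Energy of each wavelet-packet leaf node under a Haar basis.

    Hypothetical feature extractor: the paper does not state its
    wavelet, depth, or features. Returns 2**depth energies.
    """
    nodes = [np.asarray(signal, dtype=float)]
    for _ in range(depth):
        next_nodes = []
        for x in nodes:
            if len(x) % 2:                          # pad odd-length node
                x = np.append(x, 0.0)
            a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # approximation coeffs
            d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail coeffs
            next_nodes.extend([a, d])
        nodes = next_nodes
    return np.array([np.sum(n ** 2) for n in nodes])

def split_indices(n, seed=0):
    """Random 50% / 15% / 35% train/validation/test split, as in the abstract."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_tr, n_va = int(0.50 * n), int(0.15 * n)
    return idx[:n_tr], idx[n_tr:n_tr + n_va], idx[n_tr + n_va:]

# Example: one 8-sample radar epoch -> 8 leaf-energy features,
# then a 50/15/35 split over 100 epochs.
features = haar_wpd_energies(np.ones(8), depth=3)
train_idx, val_idx, test_idx = split_indices(100)
```

The resulting feature vectors would feed a deep neural network classifier over the five orientation classes ('Prone', 'Upright', 'Supine', 'Right', 'Left'); the network architecture is not described in the abstract and is omitted here.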