An Interpretable Deep Learning Framework for Health Monitoring Systems: A Case Study of Eye State Detection using EEG Signals

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
2020 IEEE Symposium Series on Computational Intelligence (SSCI 2020), 2021, pp. 211-218
Issue Date:
2021-01-05
Abstract:
Effective monitoring and early detection of deterioration in patients play an essential role in healthcare, including minimizing the number of emergency encounters and reducing the length of hospital stays and patient re-admission rates. Cutting-edge methods in artificial intelligence (AI) can significantly improve outcomes. However, the difficulty of interpreting these black-box models presents a serious problem for the healthcare industry: when selecting a model, a trade-off between accuracy and interpretability must often be made. In this paper, we propose an interpretable framework capable of real-time prediction. To demonstrate the predictive power of the framework, a case study on eye state detection using electroencephalogram (EEG) signals was employed to investigate how a deep neural network (DNN) model makes a prediction and how that prediction can be interpreted. The promising results suggest that more advanced models can be employed in healthcare solutions without sacrificing interpretability.
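The record contains no code, so the sketch below is only a rough illustration of the kind of pipeline the abstract describes: a small feed-forward network classifying eye state from multichannel EEG features, followed by a generic interpretation step (permutation importance). The synthetic 14-channel data, the network size, and the choice of permutation importance are all assumptions for illustration and are not the authors' framework or interpretation method.

```python
# Hypothetical sketch (not the paper's implementation): a small DNN classifier
# for eye-state detection from EEG features, plus permutation importance as one
# possible way to interpret which channels drive the prediction.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.inspection import permutation_importance

# Random placeholder data shaped like the 14-channel UCI "EEG Eye State" data.
rng = np.random.default_rng(0)
n_samples, n_channels = 2000, 14
X = rng.normal(size=(n_samples, n_channels))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n_samples) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# A small feed-forward network standing in for the DNN in the case study.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0),
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))

# Interpretation step: rank channels by how much shuffling each one degrades
# held-out accuracy.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for ch in np.argsort(result.importances_mean)[::-1][:5]:
    print(f"channel {ch}: importance {result.importances_mean[ch]:.3f}")
```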