Beyond doctors: Future health prediction from multimedia and multimodal observations

Publication Type: Conference Proceeding
Citation: MM 2015 - Proceedings of the 2015 ACM Multimedia Conference, 2015, pp. 591-600
Issue Date: 2015-10-13
File: p591-nie.pdf (Published version, Adobe PDF, 1.55 MB)
© 2015 ACM. Although chronic diseases cannot be cured, they can be effectively controlled as long as we understand their progression from current observational health records, which often take the form of multimedia data. A large and growing body of literature has investigated the disease progression problem. However, far too little attention has been paid to jointly considering the following three observations about chronic disease progression: 1) the health statuses at different time points are chronologically similar; 2) the future health status of each patient can be comprehensively revealed by the current multimedia and multimodal observations, such as visual scans, digital measurements, and textual medical histories; and 3) the discriminative capabilities of different modalities vary significantly across specific diseases. In light of these observations, we propose an adaptive multimodal multi-task learning model to co-regularize the modality agreement, the temporal progression, and the discriminative capabilities of the different modalities. We theoretically show that our proposed model is a linear system. Before training our model, we address the missing data problem via a matrix factorization approach. Extensive evaluations on a real-world Alzheimer's disease dataset verify the effectiveness of our proposed model. It should be noted that our model is also applicable to other chronic diseases.
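The abstract states that missing observations are filled in via matrix factorization before the model is trained, but does not give the formulation. Below is a minimal sketch of low-rank imputation by gradient descent on the observed entries only, assuming a simple squared-loss objective with ridge regularization; the function name `mf_impute` and the parameters `rank`, `lam`, `lr`, and `n_iters` are illustrative choices, and the paper's actual factorization may differ.

```python
import numpy as np

def mf_impute(X, rank=5, lam=0.1, n_iters=200, lr=0.01, seed=0):
    """Fill missing entries (NaN) of X with a low-rank approximation U @ V.T."""
    rng = np.random.default_rng(seed)
    mask = ~np.isnan(X)                      # True where a measurement was observed
    X_obs = np.where(mask, X, 0.0)
    n, m = X.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(n_iters):
        R = mask * (U @ V.T - X_obs)         # residual restricted to observed entries
        U -= lr * (R @ V + lam * U)          # gradient step on the regularized squared loss
        V -= lr * (R.T @ U + lam * V)
    X_hat = U @ V.T
    return np.where(mask, X, X_hat)          # keep observed values, fill only the gaps

# Hypothetical example: rows are patients, columns are multimodal measurements,
# with some measurements missing at a given visit.
X = np.array([[1.0, 2.0, np.nan],
              [2.0, np.nan, 6.0],
              [np.nan, 4.0, 9.0]])
print(mf_impute(X, rank=2))
```

The completed matrix can then serve as the input feature matrix for whichever downstream predictor is trained, which is the role the abstract assigns to the imputation step.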