Compatible Transformer for Irregularly Sampled Multivariate Time Series

Publisher:
Institute of Electrical and Electronics Engineers (IEEE)
Publication Type:
Conference Proceeding
Citation:
2023 IEEE International Conference on Data Mining (ICDM), 2023, pp. 1409-1414
Issue Date:
2023-01-01
Filename: 1721068.pdf (Published version, Adobe PDF, 399.42 kB)
Abstract:
To analyze multivariate time series, most previous methods assume regular subsampling of time series, where the interval between adjacent measurements and the number of samples remain unchanged. In practice, data collection systems can produce irregularly sampled time series due to sensor failures and interventions. However, existing methods designed for regularly sampled multivariate time series cannot directly handle irregularity owing to misalignment along both the temporal and variate dimensions. To fill this gap, we propose Compatible Transformer (CoFormer), a transformer-based encoder that achieves comprehensive temporal-interaction feature learning for each individual sample in irregular multivariate time series. In CoFormer, we view each sample as a unique variate-time point and leverage intra-variate/inter-variate attentions to learn sample-wise temporal/interaction features based on intra-variate/inter-variate neighbors. With CoFormer as the core, we can analyze irregularly sampled multivariate time series for many downstream tasks, including classification and prediction. We conduct extensive experiments on 3 real-world datasets and validate that the proposed CoFormer significantly and consistently outperforms existing methods. Code will be available at https://github.com/MediaBrain-SJTU/CoFormer.
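To make the abstract's idea concrete, below is a minimal PyTorch sketch (not the authors' released implementation; see the linked repository for that) of treating each observation (time, variate, value) as a token and restricting self-attention with intra-variate and inter-variate masks. All names (CoFormerSketch, d_model, the linear value/time embeddings, the classification head) are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: sample-wise intra-/inter-variate attention over an
# irregularly sampled multivariate series, where each observation is one token.
import torch
import torch.nn as nn


class CoFormerSketch(nn.Module):
    def __init__(self, num_variates: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.value_proj = nn.Linear(1, d_model)           # embed observed value
        self.time_proj = nn.Linear(1, d_model)            # embed continuous timestamp
        self.variate_emb = nn.Embedding(num_variates, d_model)
        self.intra_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.inter_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, 2)                 # e.g., binary classification

    def forward(self, times, variates, values):
        # times, values: (L,) float tensors; variates: (L,) long tensor of variate ids.
        tokens = (self.value_proj(values[:, None])
                  + self.time_proj(times[:, None])
                  + self.variate_emb(variates)).unsqueeze(0)   # (1, L, d_model)

        same_variate = variates[:, None] == variates[None, :]  # (L, L) bool
        eye = torch.eye(len(times), dtype=torch.bool)

        # Intra-variate attention: each token attends only to tokens of its own variate.
        intra_mask = ~same_variate                              # True = blocked
        h, _ = self.intra_attn(tokens, tokens, tokens, attn_mask=intra_mask)
        tokens = self.norm1(tokens + h)

        # Inter-variate attention: attend to other variates (self kept so no row is empty).
        inter_mask = same_variate & ~eye
        h, _ = self.inter_attn(tokens, tokens, tokens, attn_mask=inter_mask)
        tokens = self.norm2(tokens + h)

        return self.head(tokens.mean(dim=1))                    # pooled sequence-level logits


# Toy usage: 6 observations from 3 variates at irregular timestamps.
model = CoFormerSketch(num_variates=3)
times = torch.tensor([0.0, 0.7, 1.3, 2.1, 2.2, 3.9])
variates = torch.tensor([0, 1, 0, 2, 1, 0])
values = torch.randn(6)
print(model(times, variates, values).shape)  # torch.Size([1, 2])
```

The masks are what make the tokens "compatible" with irregular sampling here: no temporal grid or imputation is needed, because each observation simply attends to whichever intra-variate or inter-variate neighbors actually exist.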