Part-based Collaborative Spatio-temporal Feature Learning for Cloth-changing Gait Recognition
- Publisher:
- IEEE
- Publication Type:
- Conference Proceeding
- Citation:
- Proceedings - International Conference on Pattern Recognition, 2020, pp. 2057-2064
- Issue Date:
- 2020-01-01
Closed Access
Filename | Description | Size
---|---|---
Part-based_Collaborative_Spatio-temporal_Feature_Learning_for_Cloth-changing_Gait_Recognition.pdf | Published version | 574.28 kB
This item is closed access and not available.
Over the past decades, many gait recognition methods have been proposed using different techniques. However, in real-world scenarios, clothing variations cause a drop in recognition rate for most of these methods. In this paper, a part-based spatio-temporal feature learning method is therefore proposed to tackle the problem of clothing variations in gait recognition. First, based on anatomical properties, the human body is segmented into two kinds of regions: those affected and those unaffected by clothing variations. A learning network is then proposed to extract the principal spatio-temporal features from the unaffected regions. Unlike most part-based methods, which use spatial or temporal features alone, our method combines the two in a more collaborative manner. Snapshots are created for each gait sequence from the H-W and T-W views: stable spatial information is embedded in the H-W view, while rich temporal information is embedded in the T-W view, and an inherent relationship exists between the two. A collaborative spatio-temporal feature is thus formed by concatenating this correlated spatial and temporal information. The robustness and efficiency of the proposed method are validated by experiments on CASIA Gait Dataset B and the OU-ISIR Treadmill Gait Dataset B, on both of which it achieves state-of-the-art results.
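To make the two-view construction concrete, here is a minimal sketch of how a gait silhouette sequence, viewed as a T×H×W volume, can be projected onto the H-W and T-W planes and the two views concatenated into one descriptor. The averaging projection, the function names, and the use of raw flattened views as features are illustrative assumptions, not the authors' exact network formulation; the abstract only specifies that correlated spatial (H-W) and temporal (T-W) information is concatenated.

```python
# Sketch of the two-view snapshot idea from the abstract, assuming a binary
# silhouette sequence stored as a (T, H, W) numpy array. Averaging is an
# assumed projection; the paper's learning network is not reproduced here.
import numpy as np

def view_snapshots(silhouettes: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Project a (T, H, W) gait volume onto the H-W and T-W planes."""
    hw_view = silhouettes.mean(axis=0)  # average over time: stable spatial cue, shape (H, W)
    tw_view = silhouettes.mean(axis=1)  # average over height: temporal cue, shape (T, W)
    return hw_view, tw_view

def collaborative_feature(silhouettes: np.ndarray) -> np.ndarray:
    """Concatenate the flattened spatial and temporal views into one descriptor."""
    hw_view, tw_view = view_snapshots(silhouettes)
    return np.concatenate([hw_view.ravel(), tw_view.ravel()])

# Example: 30 frames of 64x44 silhouettes. In the paper's setting the volume
# would first be restricted to the body regions unaffected by clothing.
seq = (np.random.rand(30, 64, 44) > 0.5).astype(np.float32)
feat = collaborative_feature(seq)
print(feat.shape)  # (64*44 + 30*44,) = (4136,)
```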