Integrating simplified inverse representation and CRC for face recognition

Publication Type:
Conference Proceeding
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2015, vol. 9426, pp. 171-183
© Springer International Publishing Switzerland 2015. Representation-based classification methods (RBCMs) have attracted much attention in the last decade. An RBCM represents the test sample as a linear combination of the training samples and then classifies it according to the minimum reconstruction residual. Recently, an interesting concept, Inverse Representation (IR), was proposed; it is the inverse process of conventional RBCMs. IR uses the test sample's information to represent each training sample, and the resulting classification of the training samples serves as a useful supplement to the final classification. The related algorithm CIRLRC, which integrates IR and linear regression classification (LRC) by score fusion, shows superior classification performance. However, CIRLRC has two main drawbacks. First, the IR in CIRLRC is not pure, because the test vector contains some training sample information. Second, it is computationally inefficient, because CIRLRC must solve C separate linear equation systems to classify a test sample, where C is the number of classes. We therefore present a novel method integrating simplified IR (SIR) and collaborative representation classification (CRC), named SIRCRC, for face recognition. In SIRCRC, only the test sample's information is used in SIR, and CRC is faster than LRC, so only one linear equation system needs to be solved. Extensive experimental results on face databases show that SIRCRC is very competitive with both CIRLRC and state-of-the-art RBCMs.