A new multimodal biometrics for personal identification using machine learning techniques
- Publication Type: Thesis
- Issue Date: 2008
Closed Access
Filename | Description | Size
---|---|---
01Front.pdf | contents and abstract | 5.48 MB
02Whole.pdf | thesis | 75.49 MB
This item is closed access and not available.
NO FULL TEXT AVAILABLE. This thesis contains third-party copyright material.

Multiple biometrics are used to compensate for the limitations of unimodal
biometrics. Many fusion or combination techniques of different biometrics have
been proposed for personal identification and verification in the past. Among many
biometric characteristics, the face is considered the least intrusive single
modality that can be deployed in real-world visual surveillance
environments. However, unimodal face recognition suffers from variations in
pose, illumination, and facial expression that arise in unconstrained
environments. Considerable research has addressed pose and lighting
variations, and additional biometrics (such as gait and ear shape) have been
proposed for integration with the face to improve recognition performance
when recognizing faces at a distance. However, little research attention has
been paid to facial expression changes. In most of the literature, facial
expression changes are treated as noise that degrades recognition
performance. However, can these intra-personal variations be used as another
behavioral biometric, and can they also assist extra-personal separation to
improve personal identification performance? Our hypothesis is that the
dynamic information of intra-personal facial behavior can serve not only as
another behavioral biometric but can also assist extra-personal separation to
improve recognition performance. We design various experiments to validate
and support this hypothesis. First, we discuss the facial expression
variation problem; second, we introduce facial behavior as another single
behavioral biometric; finally, we propose a framework that integrates facial
appearance and facial expression features to improve personal identification
performance. Our experimental results show that facial behavior can not only
serve as another behavioral biometric in a single modality, but can also
assist extra-personal separation in multimodal fusion to improve personal
identification performance.
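The thesis's exact fusion framework is not reproduced here, but a common way to integrate two modalities such as facial appearance and facial expression features is score-level fusion. A minimal sketch, assuming cosine-similarity matchers per modality and a weighted-sum fusion rule; the weight `w` and all function names are illustrative assumptions, not the author's method:

```python
import numpy as np

def cosine_score(probe, gallery):
    """Cosine similarity between a probe vector and each gallery row."""
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return gallery @ probe

def fuse_and_identify(appearance_probe, appearance_gallery,
                      expression_probe, expression_gallery, w=0.7):
    """Weighted-sum score-level fusion of two modalities.

    `w` (hypothetical parameter) weights the appearance scores against
    the expression (behavior) scores; each modality's scores are
    min-max normalized to [0, 1] before fusion.
    """
    s_app = cosine_score(appearance_probe, appearance_gallery)
    s_exp = cosine_score(expression_probe, expression_gallery)

    def minmax(s):
        # normalize scores to [0, 1]; epsilon guards a constant score vector
        return (s - s.min()) / (s.max() - s.min() + 1e-12)

    fused = w * minmax(s_app) + (1 - w) * minmax(s_exp)
    # identify the probe as the gallery subject with the highest fused score
    return int(np.argmax(fused)), fused
```

In this design, expression features contribute even when appearance is ambiguous (e.g. at a distance), which is one way a behavioral cue can assist extra-personal separation.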