Unsupervised transfer learning for target detection from hyperspectral images
- Publication Type: Journal Article
- Published in: Neurocomputing, 2013, vol. 120, pp. 72-82
Target detection has long been of great interest in hyperspectral image analysis, and feature extraction from target samples and their counterpart backgrounds is the key to the problem. Traditional target detection methods rely on a relatively fixed feature representation for all pixels under observation; for example, the RX detector applies the same distance measure to every pixel. However, the best separation usually comes from specific targets and backgrounds: theoretically, the purest target and background pixels, or the constructive endmembers in the subspace model. Using the features of these most representative pixels to train a concentrated subspace is therefore expected to enhance the separability between targets and backgrounds. Meanwhile, transferring the discriminative information learned from these training data to the much larger set of test data, which lies in a different feature space with a different data distribution, remains a challenge. Here, the idea of transfer learning from interactive video annotation is employed. Building on this transfer learning framework, several issues are addressed, and the proposed method is named unsupervised transfer learning based target detection (UTLD). First, extreme target and background pixels are generated by robust outlier detection, providing the target and background samples for transfer learning. Second, background pixels are computed from the root points of a segmentation method so as to preserve the distribution of the backgrounds after dimensionality reduction. Third, a sparsity constraint is imposed on the transfer learning procedure; with this constraint, a simpler and more concentrated subspace with a clear physical meaning can be constructed. Extensive experiments show that the performance is comparable to state-of-the-art target detection methods. © 2013 Elsevier B.V.
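As context for the baseline the abstract contrasts against: the classical global RX detector scores every pixel with the same Mahalanobis distance from scene-wide background statistics, rather than adapting the measure to representative targets and backgrounds as UTLD does. A minimal NumPy sketch, where the function name, toy data cube, and injected anomaly are illustrative assumptions and not part of the paper:

```python
import numpy as np

def rx_detector(cube):
    """Global RX anomaly detector: Mahalanobis distance of each pixel
    from the scene-wide mean/covariance (one fixed measure for all pixels)."""
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b)               # (num_pixels, bands)
    mu = pixels.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(pixels, rowvar=False))  # pseudo-inverse for stability
    diff = pixels - mu
    # per-pixel quadratic form (x - mu)^T Sigma^{-1} (x - mu)
    scores = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return scores.reshape(h, w)

# Toy cube: flat Gaussian background with one anomalous pixel injected.
rng = np.random.default_rng(0)
cube = rng.normal(0.0, 1.0, size=(8, 8, 5))
cube[4, 4] += 10.0                             # anomaly in all bands
scores = rx_detector(cube)
print(np.unravel_index(scores.argmax(), scores.shape))  # location of top score
```

The fixed global statistics are exactly the limitation the abstract points out: a single covariance model cannot favor the purest target and background pixels the way a subspace trained on representative samples can.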