Learning From Very-Few Labeled Examples with Soft Labels

IEEE Computer Society
Publication Type:
Conference Proceeding
2010 IEEE International Conference on Image Processing (ICIP 2010) - Proceedings, 2010, pp. 3869-3872
In this paper we propose Softboost, a novel boosting algorithm that combines the merits of transductive and inductive learning to attack the problem of learning from very few labeled training examples. In the transductive stage, soft labels of both the labeled and unlabeled samples are estimated by a Markovian propagation procedure. In the subsequent inductive stage, to efficiently handle out-of-sample data, we learn a weighted combination of simple rules in boosting style, each of which maximizes the confidence-weighted inter-class Kullback-Leibler (KL) divergence under the current data distribution. Finally, experiments on a toy dataset and the USPS handwritten digits are presented to demonstrate its effectiveness.
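The transductive stage described above can be illustrated with a standard Markov-chain label-propagation sketch: build a row-stochastic transition matrix from pairwise affinities, then iterate it to diffuse the seed labels into soft labels for every sample. This is a generic illustration of the idea, not the authors' exact procedure; the function name, the Gaussian affinity, and the `alpha` anchoring parameter are all assumptions for the sake of a runnable example.

```python
import numpy as np

def propagate_soft_labels(X, y, n_classes, sigma=1.0, alpha=0.9, n_iter=50):
    """Estimate soft labels by iterating a Markov transition matrix.

    X: (n_samples, n_features) data; y: class index for labeled points,
    -1 for unlabeled. Returns an (n_samples, n_classes) soft-label matrix.
    Illustrative sketch only -- not the paper's exact propagation scheme.
    """
    # Gaussian affinity between all pairs of samples
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Row-normalize to obtain a Markov transition matrix
    P = W / W.sum(axis=1, keepdims=True)

    # Initialize: one-hot rows for labeled points, uniform for unlabeled
    F = np.full((len(X), n_classes), 1.0 / n_classes)
    labeled = y >= 0
    F[labeled] = np.eye(n_classes)[y[labeled]]
    Y0 = F.copy()

    for _ in range(n_iter):
        # One propagation step, anchored to the initial seed labels
        F = alpha * (P @ F) + (1 - alpha) * Y0
        F[labeled] = Y0[labeled]  # clamp the labeled rows
    return F
```

Because each row of `P` sums to one, every iteration keeps the rows of `F` as valid probability distributions, so the output can be read directly as the soft labels that the subsequent inductive (boosting) stage would consume.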