TY - CONF
TI - Learning From Very-Few Labeled Examples with Soft Labels
AU - Mu, Y
AU - Xu, M
AU - Yan, S
AB - In this paper we propose Softboost, a novel boosting algorithm that combines the merits of transductive and inductive learning to attack the problem of learning from very few labeled training examples. In the transductive stage, soft labels of both the labeled and unlabeled samples are estimated via a Markovian propagation procedure. In the subsequent inductive stage, to efficiently handle out-of-sample data, we learn a weighted combination of simple rules in boosting style, each of which maximizes the confidence-weighted inter-class Kullback-Leibler (KL) divergence under the current data distribution. Finally, experiments on a toy dataset and the USPS handwritten digits are presented to demonstrate its effectiveness.
JO - IEEE International Conference on Image Processing
PB - IEEE Computer Society
CY - Hong Kong
SP - 3869
EP - 3872
PY - 2010/01/01
DA - 2010/01/01
Y1 - 2010/01/01
Y2 - 2026/04/29
ER -