TCSST: transfer classification of short & sparse text using external data

Publisher:
ACM
Publication Type:
Conference Proceeding
Citation:
Proc. of the 21st ACM Conference on Information and Knowledge Management (CIKM-12), 2012, pp. 764-772
Issue Date:
2012-01
Files in This Item:
2011008002OK.pdf (1.07 MB, Adobe PDF)
Short & sparse text is becoming more prevalent on the web, in forms such as search snippets, micro-blogs and product reviews. Accurately classifying short & sparse text has emerged as an important yet challenging task. Existing work has considered utilizing external data (e.g. Wikipedia) to alleviate data sparseness by appending topics detected from the external data as new features. However, training a classifier on features concatenated from different spaces is not straightforward, since the features have different physical meanings and different significance to the classification task. Moreover, it exacerbates the "curse of dimensionality" problem. In this study, we propose a transfer classification method, TCSST, which exploits external data to tackle the data sparsity issue. The transfer classifier is learned in the original feature space. Considering that labels for the external data may not be readily available or sufficient, TCSST further exploits unlabeled external data to aid the transfer classification. We develop novel strategies that allow TCSST to iteratively select high-quality unlabeled external data to help with the classification. We evaluate the performance of TCSST on both benchmark and real-world data sets. Our experimental results demonstrate that the proposed method is effective in classifying very short & sparse text, consistently outperforming existing and baseline methods.
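The abstract describes an iterative loop in which high-quality unlabeled external data are selected to augment the classifier. The sketch below is only an illustration of that general idea using a generic self-training-style loop with a confidence-threshold selection rule and a logistic regression classifier; these specific choices are assumptions and do not reproduce the actual TCSST algorithm from the paper.

    # Illustrative sketch only: self-training-style selection of high-confidence
    # unlabeled external examples. The classifier and selection criterion are
    # assumptions, not the TCSST algorithm itself.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def iterative_external_selection(X_labeled, y_labeled, X_external,
                                     confidence_threshold=0.9, max_rounds=5):
        """Repeatedly train a classifier and absorb confidently labeled external data."""
        X_train, y_train = X_labeled.copy(), y_labeled.copy()
        X_pool = X_external.copy()
        clf = LogisticRegression(max_iter=1000)

        for _ in range(max_rounds):
            if len(X_pool) == 0:
                break
            clf.fit(X_train, y_train)
            proba = clf.predict_proba(X_pool)
            confident = proba.max(axis=1) >= confidence_threshold
            if not confident.any():
                break
            # Move the confidently predicted external examples into the training set,
            # using the classifier's own predictions as pseudo-labels.
            pseudo_labels = clf.classes_[proba[confident].argmax(axis=1)]
            X_train = np.vstack([X_train, X_pool[confident]])
            y_train = np.concatenate([y_train, pseudo_labels])
            X_pool = X_pool[~confident]

        clf.fit(X_train, y_train)
        return clf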