Collective classification via discriminative matrix factorization on sparsely labeled networks

Publication Type:
Conference Proceeding
Citation:
Proceedings of the ACM International Conference on Information and Knowledge Management (CIKM 2016), 24-28 October 2016, pp. 1563-1572
Issue Date:
2016-10-24
File: p1563-zhang.pdf (published version, Adobe PDF, 1.73 MB)
Abstract:
© 2016 Copyright held by the owner/author(s). We address the problem of classifying sparsely labeled networks, where labeled nodes in the network are extremely scarce. Existing algorithms, such as collective classification, have been shown to be effective for jointly deriving the labels of related nodes by exploiting label dependencies among neighboring nodes. However, when the network is sparsely labeled, most nodes have too few or even no connections to labeled nodes. This makes it very difficult to leverage supervised knowledge from labeled nodes to accurately estimate label dependencies, and thereby substantially degrades classification accuracy. In this paper, we propose a novel discriminative matrix factorization (DMF) based algorithm that effectively learns a latent network representation by exploiting topological paths between labeled and unlabeled nodes, in addition to nodes' content information. The main idea is to use matrix factorization to obtain a compact representation of the network that fully encodes nodes' content information and network structure, and to let the discriminative power inferred from labeled nodes directly benefit collective classification. We formulate a new matrix factorization objective function that integrates network representation learning with empirical loss minimization for classifying node labels. An efficient optimization algorithm based on conjugate gradient methods is proposed to solve the new objective function. Experimental results on real-world networks show that DMF yields substantial performance gains over state-of-the-art baselines on sparsely labeled networks.
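
Note: the abstract does not state the objective function itself. As a rough, hypothetical illustration of the kind of joint formulation it describes (matrix factorization of network structure combined with an empirical classification loss over the few labeled nodes, optimized by conjugate gradient), one might write a sketch like the Python below. The variable names, the logistic loss, the regularization weights lam and mu, and the use of SciPy's general-purpose CG optimizer are all assumptions for illustration, not the paper's actual DMF formulation.

import numpy as np
from scipy.optimize import minimize

# Toy, hypothetical sketch of a DMF-style joint objective: factorize a
# network adjacency matrix A ~= U @ V.T while fitting a linear classifier
# w on the latent factors U of the few labeled nodes, so that supervised
# signal shapes the learned representation. The loss and all names here
# are illustrative assumptions, not the paper's actual formulation.

rng = np.random.default_rng(0)
n, k = 30, 4                                   # nodes, latent dimension
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.maximum(A, A.T)                         # symmetric toy adjacency
labeled = np.arange(5)                         # only 5 labeled nodes
y = rng.integers(0, 2, size=5) * 2.0 - 1.0     # labels in {-1, +1}
lam, mu = 1.0, 0.1                             # loss weight, regularizer

def unpack(theta):
    U = theta[:n * k].reshape(n, k)
    V = theta[n * k:2 * n * k].reshape(n, k)
    w = theta[2 * n * k:]
    return U, V, w

def objective(theta):
    U, V, w = unpack(theta)
    recon = 0.5 * np.sum((A - U @ V.T) ** 2)        # structure term
    margin = y * (U[labeled] @ w)                   # classifier scores
    emp = lam * np.sum(np.logaddexp(0.0, -margin))  # logistic loss
    reg = 0.5 * mu * (np.sum(U**2) + np.sum(V**2) + np.sum(w**2))
    return recon + emp + reg

# Conjugate gradient optimization; a real implementation would supply a
# closed-form gradient rather than rely on numerical differentiation.
theta0 = rng.standard_normal(2 * n * k + k) * 0.01
res = minimize(objective, theta0, method="CG", options={"maxiter": 50})
U, V, w = unpack(res.x)
pred = np.sign(U @ w)                               # labels for all nodes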