Large-Cone Nonnegative Matrix Factorization

Publication Type:
Journal Article
IEEE Transactions on Neural Networks and Learning Systems, 2017, 28 (9), pp. 2129 - 2142
© 2012 IEEE. Nonnegative matrix factorization (NMF) has been greatly popularized by its parts-based interpretation and the effective multiplicative updating rule for searching local solutions. In this paper, we study the problem of how to obtain an attractive local solution for NMF, one that not only fits the given training data well but also generalizes well to unseen test data. Based on the geometric interpretation of NMF, we introduce two large-cone penalties for NMF and propose large-cone NMF (LCNMF) algorithms. Compared with NMF, LCNMF obtains bases comprising a larger simplicial cone, and therefore has three advantages: (1) the empirical reconstruction error of LCNMF is usually smaller; (2) the generalization ability of the proposed algorithm is much more powerful; and (3) the obtained bases of LCNMF have a low-overlapping property, which enables the bases to be sparse and makes the proposed algorithms very robust. Experiments on synthetic and real-world data sets confirm the efficiency of LCNMF.
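For context, the multiplicative updating rule the abstract refers to is the standard Lee–Seung scheme for plain NMF. The sketch below shows that baseline only — it does not include the large-cone penalties of LCNMF, whose exact form is not given here; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-10, seed=0):
    """Baseline NMF via Lee-Seung multiplicative updates (not LCNMF):
    find nonnegative W (m x r) and H (r x n) that locally minimize
    the Frobenius reconstruction error ||V - WH||_F^2."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Strictly positive random initialization keeps the updates well-defined.
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Elementwise multiplicative updates preserve nonnegativity
        # and monotonically decrease the reconstruction error.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a random nonnegative matrix into 5 basis vectors.
V = np.abs(np.random.default_rng(1).random((20, 15)))
W, H = nmf_multiplicative(V, r=5)
err = np.linalg.norm(V - W @ H, "fro")
```

Geometrically, the columns of W span a simplicial cone containing (approximately) the data columns of V; LCNMF's penalties favor solutions in which that cone is larger.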