Incorporating the loss function into discriminative clustering of structured outputs

Publication Type:
Journal Article
Citation:
IEEE Transactions on Neural Networks, 2010, 21 (10), pp. 1564-1575
Issue Date:
2010-10-01
Abstract:
Clustering using the Hilbert-Schmidt independence criterion (CLUHSIC) is a recent clustering algorithm that maximizes the dependence between cluster labels and data observations according to the Hilbert-Schmidt independence criterion (HSIC). It is unique in that structural information on the cluster outputs can be easily utilized in the clustering process. However, while the choice of the loss function is known to be very important in supervised learning with structured outputs, we show in this paper that CLUHSIC implicitly uses the often inappropriate zero-one loss. We propose an extension called CLUHSICAL (Clustering using HSIC and loss) which explicitly considers both the output dependency and the loss function. Its optimization problem has the same form as that of CLUHSIC, except that its partition matrix is constructed in a different manner. Experimental results on a number of datasets with structured outputs show that CLUHSICAL often outperforms CLUHSIC in terms of both structured loss and clustering accuracy. © 2010 IEEE.
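The quantity at the heart of the abstract is the empirical HSIC between a kernel on the data and a kernel on the cluster labels, with the label kernel built from a partition matrix Pi and an output-structure matrix A. The sketch below is an illustration only, not code from the paper: the RBF kernel choice, the toy two-blob data, and the function name hsic are all assumptions made for this example.

```python
import numpy as np

def hsic(K, L):
    """Empirical HSIC between kernel matrices K (data) and L (labels).

    HSIC(K, L) = tr(K H L H) / (n - 1)^2, where H = I - (1/n) 11^T is the
    centering matrix. Larger values indicate stronger dependence.
    """
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy data (an assumption for illustration): two well-separated blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (10, 2)), rng.normal(3, 0.3, (10, 2))])

# RBF kernel on the inputs.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)

# Label kernel L = Pi A Pi^T, with Pi the n x c one-hot partition matrix
# and A encoding structure on the cluster outputs; A = I corresponds to
# unstructured clustering.
A = np.eye(2)

Pi = np.zeros((20, 2))          # partition matching the two blobs
Pi[:10, 0] = 1.0
Pi[10:, 1] = 1.0

Pi_bad = np.zeros((20, 2))      # interleaved (mismatched) partition
Pi_bad[::2, 0] = 1.0
Pi_bad[1::2, 1] = 1.0

print(hsic(K, Pi @ A @ Pi.T))        # higher: labels follow the data structure
print(hsic(K, Pi_bad @ A @ Pi_bad.T))  # lower: labels ignore the structure
```

In this toy example the partition that matches the two blobs attains the higher HSIC value, which is the dependence CLUHSIC maximizes; per the abstract, CLUHSICAL keeps the same form of optimization problem but constructs the partition matrix differently so that a structured loss, rather than the implicit zero-one loss, is taken into account.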