ICIRD: Information-Principled Deep Clustering for Invariant, Redundancy-Reduced and Discriminative Cluster Distributions.

Publisher:
MDPI
Publication Type:
Journal Article
Citation:
Entropy (Basel), 2025, 27(12), 1200
Issue Date:
2025-11-26
Deep clustering aims to discover meaningful data groups by jointly learning representations and cluster probability distributions. Yet existing methods rarely consider the underlying information characteristics of these distributions, causing ambiguity and redundancy in cluster assignments, particularly when different augmented views are used. To address this issue, this paper proposes a novel information-principled deep clustering framework for learning invariant, redundancy-reduced, and discriminative cluster probability distributions, termed ICIRD. Specifically, ICIRD is built upon three complementary modules for cluster probability distributions: (i) conditional entropy minimization, which increases assignment certainty and discriminability; (ii) inter-cluster mutual information minimization, which reduces redundancy among cluster distributions and sharpens separability; and (iii) cross-view mutual information maximization, which enforces semantic consistency across augmented views. Additionally, a contrastive representation mechanism is incorporated to provide stable and reliable feature inputs for the cluster probability distributions. Together, these components enable ICIRD to jointly optimize both representations and cluster probability distributions in an information-regularized manner. Extensive experiments on five image benchmark datasets demonstrate that ICIRD outperforms most existing deep clustering methods, particularly on fine-grained datasets such as CIFAR-100 and ImageNet-Dogs.
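The three information-theoretic objectives named in the abstract can be sketched on soft cluster-assignment matrices. The snippet below is a minimal numpy illustration, not the paper's implementation: it assumes each view's output is an n-by-k matrix of cluster probabilities (rows sum to 1), estimates cross-view mutual information from the empirical joint P1^T P2 / n, and uses off-diagonal soft co-assignment mass as a simple proxy for inter-cluster redundancy; the function names are ours.

```python
import numpy as np

def conditional_entropy(p, eps=1e-12):
    # Mean per-sample entropy of the cluster probabilities (n x k).
    # Minimizing it sharpens individual assignments (module i).
    return float(-(p * np.log(p + eps)).sum(axis=1).mean())

def inter_cluster_redundancy(p):
    # Illustrative proxy for module (ii): off-diagonal mass of the
    # soft co-assignment matrix P^T P / n. Zero when assignments
    # are one-hot, i.e., clusters do not overlap.
    co = p.T @ p / p.shape[0]
    return float(co.sum() - np.trace(co))

def cross_view_mutual_information(p1, p2, eps=1e-12):
    # Module (iii): mutual information between the cluster variables
    # of two augmented views, from the empirical joint J = P1^T P2 / n.
    joint = p1.T @ p2 / p1.shape[0]          # k x k joint distribution
    m1 = joint.sum(axis=1, keepdims=True)    # marginal, view 1
    m2 = joint.sum(axis=0, keepdims=True)    # marginal, view 2
    return float((joint * (np.log(joint + eps)
                           - np.log(m1 @ m2 + eps))).sum())
```

For confident, consistent one-hot assignments across views, conditional entropy and redundancy go to zero while cross-view mutual information reaches its maximum log k; a uniform assignment behaves in the opposite way, which is the regime the combined objective is meant to avoid.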