
Learning the Precise Feature for Cluster Assignment

journal contribution
posted on 28.05.2021, 10:57 by Y Gan, X Dong, Huiyu Zhou, F Gao, J Dong
Clustering is one of the fundamental tasks in computer vision and pattern recognition. Recently, deep clustering methods (algorithms based on deep learning) have attracted wide attention with their impressive performance. Most of these algorithms combine deep unsupervised feature learning and standard clustering together. However, the separation of feature extraction and clustering leads to suboptimal solutions, because the two-stage strategy prevents representation learning from adapting to subsequent tasks (e.g., clustering according to specific cues). To overcome this issue, efforts have been made in the dynamic adaptation of representation and cluster assignment, whereas current state-of-the-art methods suffer from heuristically constructed objectives, with representation and cluster assignment alternately optimized. To further standardize the clustering problem, we formulate the objective of clustering as finding a precise feature as the cue for cluster assignment. Based on this, we propose a general-purpose deep clustering framework which, for the first time, radically integrates representation learning and clustering into a single pipeline. The proposed framework exploits the powerful ability of recently developed generative models for learning intrinsic features, and imposes an entropy minimization on the distribution of cluster assignments via a variational algorithm. Experimental results show that the performance of our method is superior, or at least comparable, to the state-of-the-art methods on handwritten digit recognition, face recognition and object recognition benchmark datasets.
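The entropy-minimization idea mentioned in the abstract can be illustrated with a minimal sketch: each sample's soft cluster assignment is a probability distribution, and driving its Shannon entropy down pushes the sample toward a single cluster. This is only an illustration of the general principle, not the authors' variational algorithm; the `assignment_entropy` helper and the example logits are assumptions for demonstration.

```python
import numpy as np

def assignment_entropy(logits):
    """Shannon entropy of soft cluster-assignment distributions,
    one row of logits per sample (softmax applied internally)."""
    # Numerically stable softmax over the cluster dimension.
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    # Small epsilon guards against log(0).
    return -(p * np.log(p + 1e-12)).sum(axis=1)

# Sharpening the logits (e.g., training with an entropy penalty)
# lowers the entropy, i.e., commits each sample to one cluster.
logits = np.array([[1.0, 0.5, 0.2]])
print(assignment_entropy(logits))        # diffuse assignment: high entropy
print(assignment_entropy(10 * logits))   # sharpened assignment: low entropy
```

A clustering loss that adds this entropy term to a representation-learning objective is one common way to couple the two stages into a single pipeline, which is the high-level strategy the abstract describes.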

History

Author affiliation

School of Informatics

Version

AM (Accepted Manuscript)

Published in

IEEE Transactions on Cybernetics

Publisher

Institute of Electrical and Electronics Engineers

ISSN

1083-4419

Acceptance date

30/04/2021

Copyright date

2021

Available date

28/05/2021

Language

en