Supporting Clustering with Contrastive Learning

Abstract

Unsupervised clustering aims to discover the semantic categories of data according to some distance measure in the representation space. However, at the beginning of the learning process, different categories often overlap in the representation space, which makes it hard for distance-based clustering to achieve good separation between categories. To this end, we propose Supporting Clustering with Contrastive Learning (SCCL) – a novel framework that leverages contrastive learning to promote better separation. We assess the performance of SCCL on short text clustering tasks and show that SCCL significantly advances the state of the art on six out of eight benchmark datasets, with 3%-11% improvement in Accuracy and 4%-15% improvement in Normalized Mutual Information. Furthermore, our quantitative analysis demonstrates the effectiveness of SCCL in leveraging the strengths of both bottom-up instance discrimination and top-down clustering to achieve better intra-cluster and inter-cluster distances when evaluated against the true labels.
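The combination of bottom-up instance discrimination and top-down clustering described above can be sketched as a joint objective: an instance-wise contrastive (InfoNCE-style) loss over augmentation pairs plus a soft clustering loss. The sketch below is a minimal NumPy illustration of that idea, not the paper's implementation; the specific loss forms (InfoNCE with a temperature, and a DEC-style Student's t-kernel assignment matched to a sharpened target via KL divergence) and all names and hyperparameters here are assumptions for illustration.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Instance-wise contrastive loss (assumed InfoNCE form):
    z1[i] and z2[i] are embeddings of two augmentations of the
    same text; each is pulled toward its pair, pushed from the rest."""
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity
    sim = z @ z.T / tau
    n = len(z1)
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def clustering_loss(z, centers, alpha=1.0):
    """Top-down clustering head (assumed DEC-style): soft assignments
    from a Student's t kernel, sharpened into a target distribution,
    and matched with a KL divergence."""
    d2 = ((z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1) / 2)
    q = q / q.sum(axis=1, keepdims=True)               # soft assignments
    p = q ** 2 / q.sum(axis=0)                         # sharpened target
    p = p / p.sum(axis=1, keepdims=True)
    return (p * np.log(p / q)).sum(axis=1).mean()      # KL(p || q)

# Toy batch: 8 texts, 16-dim embeddings, 3 clusters.
rng = np.random.default_rng(0)
z1, z2 = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))
centers = rng.normal(size=(3, 16))
joint_loss = info_nce(z1, z2) + clustering_loss(z1, centers)
```

In a full model both terms would be minimized jointly over a shared encoder, so the contrastive term scatters instances apart while the clustering term pulls them toward their assigned centers.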

Publication
North American Chapter of the Association for Computational Linguistics, NAACL 2021
Dejiao Zhang
Senior Applied Scientist