Attentive Representation Learning with Adversarial Training for Short Text Clustering
Short text clustering plays a far-reaching role in semantic analysis and underpins multiple applications such as corpus summarization and information retrieval. However, it inevitably suffers from the severe sparsity of short text representations, leaving previous clustering approaches far from satisfactory. In this paper, we present a novel attentive representation learning model for short text clustering, wherein cluster-level attention is proposed to capture the correlation between text representations and cluster representations. Relying on this, representation learning and clustering for short text are seamlessly integrated into a unified framework. To further facilitate model training, we apply adversarial training to the unsupervised clustering setting by adding perturbations to the cluster representations. The model parameters and perturbations are optimized alternately through a minimax game. Extensive experiments on three real-world short text datasets demonstrate the superiority of the proposed model over several strong competitors, verifying that adversarial training yields a substantial performance gain.
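The cluster-level attention and perturbation mechanism summarized above might be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the surrogate loss (negative log of the maximum assignment weight), the FGSM-style sign perturbation, the dimensions, and all function names are assumptions, since the abstract does not give the exact formulation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def cluster_attention(h, C):
    # attention between a text representation h (d,) and
    # K cluster representations C (K, d)
    alpha = softmax(C @ h)          # soft assignment weights over clusters
    return alpha, alpha @ C         # weights and attended cluster summary

def adversarial_perturbation(h, C, eps=0.05):
    # FGSM-style bounded perturbation on the cluster representations,
    # using a numerical gradient of a stand-in clustering loss
    # (the paper's actual loss is not specified in the abstract)
    def loss(Cp):
        alpha, _ = cluster_attention(h, Cp)
        return -np.log(alpha.max() + 1e-12)
    grad, delta = np.zeros_like(C), 1e-5
    for idx in np.ndindex(C.shape):
        Cd = C.copy()
        Cd[idx] += delta
        grad[idx] = (loss(Cd) - loss(C)) / delta
    return eps * np.sign(grad)      # worst-case step within an L-inf ball

rng = np.random.default_rng(0)
h = rng.normal(size=8)              # hypothetical text representation
C = rng.normal(size=(4, 8))         # hypothetical cluster representations
alpha, ctx = cluster_attention(h, C)
r = adversarial_perturbation(h, C)
alpha_adv, _ = cluster_attention(h, C + r)
```

In the alternating minimax game described in the abstract, such a perturbation `r` would be chosen to maximize the clustering loss while the model parameters are updated to minimize it; here the numerical gradient merely stands in for backpropagation.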