Short-Text Topic Modeling via Non-negative Matrix Factorization Enriched with Local Word-Context Correlations
Tian Shi, Chandan K. Reddy
Abstract
As a prevalent form of social communication on the Internet, billions of short texts are generated every day, and discovering knowledge from them has gained considerable interest from both industry and academia. Short texts carry limited contextual information and are sparse, noisy, and ambiguous; hence, automatically learning topics from them remains an important challenge. To tackle this problem, we propose a semantics-assisted non-negative matrix factorization (SeaNMF) model to discover topics in short texts. It effectively incorporates word-context semantic correlations into the model, where the semantic relationships between words and their contexts are learned from the skip-gram view of the corpus. The SeaNMF model is solved using a block coordinate descent algorithm. We also develop a sparse variant of SeaNMF that achieves better model interpretability. Extensive quantitative evaluations on various real-world short-text datasets demonstrate the superior performance of the proposed models over several state-of-the-art methods in terms of topic coherence and classification accuracy. A qualitative semantic analysis demonstrates the interpretability of our models by discovering meaningful and consistent topics. With its simple formulation and superior performance, SeaNMF can serve as an effective standard topic model for short texts.
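To illustrate the kind of joint factorization the abstract describes, here is a minimal sketch: a term-document matrix A and a word-context correlation matrix S are factored with a shared word-topic factor W, and the blocks are updated alternately. The function name `seanmf_sketch`, the multiplicative-update solver, and the toy co-occurrence matrix are assumptions for illustration only; they are not the authors' exact formulation, update rules, or released code.

```python
# Illustrative sketch (not the authors' implementation) of a SeaNMF-style objective:
#   minimize ||A - W H||_F^2 + alpha * ||S - W Wc^T||_F^2   s.t. W, Wc, H >= 0
# A: term-document matrix, S: word-context correlation matrix (e.g., derived from
# skip-gram-style co-occurrence statistics). The paper uses block coordinate descent;
# here each block is updated with standard multiplicative rules as a stand-in solver.
import numpy as np

def seanmf_sketch(A, S, k, alpha=1.0, n_iter=200, eps=1e-10, seed=0):
    """A: (n_words, n_docs), S: (n_words, n_words), k: number of topics."""
    rng = np.random.default_rng(seed)
    n_words, n_docs = A.shape
    W  = rng.random((n_words, k))   # word-topic factor
    Wc = rng.random((n_words, k))   # context-topic factor
    H  = rng.random((k, n_docs))    # topic-document factor

    for _ in range(n_iter):
        # Update H with W fixed.
        H  *= (W.T @ A) / (W.T @ W @ H + eps)
        # Update Wc with W fixed (S assumed symmetric).
        Wc *= (S.T @ W) / (Wc @ (W.T @ W) + eps)
        # Update W with H and Wc fixed.
        W  *= (A @ H.T + alpha * (S @ Wc)) / (W @ (H @ H.T) + alpha * W @ (Wc.T @ Wc) + eps)
    return W, Wc, H

if __name__ == "__main__":
    # Toy data; top-weighted rows of each column of W give topic keywords.
    A = np.random.poisson(0.3, size=(50, 200)).astype(float)
    S = (A @ A.T > 0).astype(float)    # crude word-word co-occurrence proxy
    W, Wc, H = seanmf_sketch(A, S, k=5)
    print(np.argsort(-W[:, 0])[:10])   # indices of top words for topic 0
```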
Tian Shi, Kyeongpil Kang, Jaegul Choo, Chandan K. Reddy: Short-Text Topic Modeling via Non-negative Matrix Factorization Enriched with Local Word-Context Correlations. WWW 2018: 1105-1114
Publication Details
- Date of publication: April 10, 2018
- Conference: World Wide Web Conference (WWW 2018)
- Page number(s): 1105-1114