Tian Shi, Liuqing Li, Ping Wang, Chandan K. Reddy

Abstract

Unsupervised aspect detection (UAD) aims to automatically extract interpretable aspects and identify aspect-specific segments (such as sentences) from online reviews. However, recent deep learning-based topic models, specifically aspect-based autoencoders, suffer from several problems, such as extracting noisy aspects and poorly mapping the aspects discovered by models to the aspects of interest. To tackle these challenges, in this paper, we first propose a self-supervised contrastive learning framework and an attention-based model equipped with a novel smooth self-attention (SSA) module for the UAD task, in order to learn better representations for aspects and review segments. Second, we introduce a high-resolution selective mapping (HRSMap) method to efficiently assign aspects discovered by the model to the aspects of interest. We also propose a knowledge distillation technique to further improve aspect detection performance. Our methods outperform several recent unsupervised and weakly supervised approaches on publicly available benchmark user-review datasets. Aspect interpretation results show that the extracted aspects are meaningful, have good coverage, and can easily be mapped to aspects of interest. Ablation studies and attention-weight visualizations also demonstrate the effectiveness of SSA and the knowledge distillation method.
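The two core ingredients named in the abstract, an attention module that produces a segment representation and a contrastive objective over those representations, can be illustrated with a toy sketch. This is not the authors' implementation: the uniform-mixing form of the smoothing, the InfoNCE-style loss, and all function names here are illustrative assumptions.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def smooth_self_attention(word_vecs, smooth=0.1):
    """Toy self-attention over a segment's word vectors.

    Each word is scored against the mean segment vector, and the
    softmax weights are then mixed with a uniform distribution
    ("smoothing") so no single word dominates the representation.
    The uniform-mixing form of the smoothing is an assumption, not
    the paper's exact SSA formulation.
    """
    n = len(word_vecs)
    mean = [sum(col) / n for col in zip(*word_vecs)]
    weights = softmax([dot(w, mean) for w in word_vecs])
    weights = [(1 - smooth) * w + smooth / n for w in weights]
    seg = [0.0] * len(mean)
    for w, vec in zip(weights, word_vecs):
        seg = [s + w * x for s, x in zip(seg, vec)]
    return weights, seg

def contrastive_loss(anchor, positive, negatives, tau=0.5):
    """Generic InfoNCE-style contrastive loss on cosine similarity:
    pull the anchor segment toward its positive and push it away
    from negative samples."""
    def cos(u, v):
        return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))
    pos = math.exp(cos(anchor, positive) / tau)
    neg = sum(math.exp(cos(anchor, n) / tau) for n in negatives)
    return -math.log(pos / (pos + neg))
```

In this sketch the smoothed weights still sum to one, and the loss is smaller when the positive is genuinely similar to the anchor than when a negative is, which is the behavior a contrastive framework relies on.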

Tian Shi, Liuqing Li, Ping Wang, Chandan K. Reddy: A Simple and Effective Self-Supervised Contrastive Learning Framework for Aspect Detection. AAAI 2021: 13815-13824


Publication Details

Date of publication:
May 18, 2021
Conference:
AAAI Conference on Artificial Intelligence
Page number(s):
13815-13824
Volume:
35
Issue Number:
15