Two-Stage Attention Network for Aspect-Level Sentiment Classification

Abstract

Currently, most attention-based models adopt a single-stage attention process when generating context representations toward an aspect, but they lack a deliberation process: a generated, aspect-related representation is used directly as the final output without further polishing. In this work, we introduce a deliberation process to model the context and further polish the attention weights, and propose a two-stage attention network for aspect-level sentiment classification. The network uses a two-level attention model with an LSTM, where the first-stage attention generates a raw aspect-related representation and the second-stage attention polishes and refines that raw representation through deliberation. Since the deliberation component has global information about what the representation to be generated might be, it has the potential to produce a better aspect-related representation by looking a second time into the hidden states produced by the LSTM. Experimental results on the Laptop dataset of SemEval-2016 Task 5 indicate that our model achieves a state-of-the-art accuracy of 76.56%.
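The two-stage design described above can be sketched roughly as follows. This is a minimal PyTorch illustration assuming simple concatenation-based scoring layers (attn1, attn2) and arbitrary dimensions; it is not the paper's exact formulation, only an outline of the idea that the second (deliberation) attention re-reads the LSTM hidden states conditioned on the first-stage representation.

```python
# Minimal sketch of a two-stage (deliberation) attention module.
# Layer shapes, scoring functions, and names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoStageAttention(nn.Module):
    def __init__(self, emb_dim, hidden_dim, aspect_dim, num_classes):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Stage 1: score each context hidden state against the aspect.
        self.attn1 = nn.Linear(hidden_dim + aspect_dim, 1)
        # Stage 2 (deliberation): re-score the same hidden states, conditioned
        # on both the aspect and the raw first-stage representation.
        self.attn2 = nn.Linear(hidden_dim + aspect_dim + hidden_dim, 1)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, context_emb, aspect_vec):
        # context_emb: (batch, seq_len, emb_dim); aspect_vec: (batch, aspect_dim)
        h, _ = self.lstm(context_emb)                       # (batch, seq_len, hidden_dim)
        seq_len = h.size(1)
        aspect_exp = aspect_vec.unsqueeze(1).expand(-1, seq_len, -1)

        # Stage 1: raw aspect-related representation.
        score1 = self.attn1(torch.cat([h, aspect_exp], dim=-1)).squeeze(-1)
        alpha1 = F.softmax(score1, dim=-1)                  # (batch, seq_len)
        raw_rep = torch.bmm(alpha1.unsqueeze(1), h).squeeze(1)

        # Stage 2: deliberation looks into the hidden states a second time,
        # using the raw representation as global information.
        raw_exp = raw_rep.unsqueeze(1).expand(-1, seq_len, -1)
        score2 = self.attn2(torch.cat([h, aspect_exp, raw_exp], dim=-1)).squeeze(-1)
        alpha2 = F.softmax(score2, dim=-1)
        refined_rep = torch.bmm(alpha2.unsqueeze(1), h).squeeze(1)

        return self.classifier(refined_rep)
```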

Publication
Neural Information Processing
Hua Xu
Tenured Associate Professor, Associate Editor of Expert Systems with Applications, Ph.D. Supervisor
Xiaomin Sun
Associate Professor, Master Supervisor
Junhui Deng
Professor, Master Supervisor