Two-Stage Attention Network for Aspect-Level Sentiment Classification

Abstract

Currently, most attention-based approaches adopt a single-stage attention process when generating the aspect-related context representation, but they lack a deliberation process: the generated aspect-related representation is used directly as the final output without further polishing. In this work, we introduce a deliberation process into context modeling to further polish the attention weights, and propose a two-stage attention network for aspect-level sentiment classification. The network combines a two-level attention model with an LSTM, where the first-stage attention generates a raw aspect-related representation and the second-stage attention polishes and refines this raw representation through deliberation. Since the deliberation component has global information about what the representation to be generated might be, it has the potential to produce a better aspect-related representation by looking into the LSTM hidden states a second time. Experimental results on the Laptop dataset of SemEval-2016 Task 5 indicate that our model achieves state-of-the-art accuracy of 76.56%.
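
The two-stage idea can be sketched roughly as below. This is a minimal PyTorch sketch, not the paper's exact architecture: the bidirectional LSTM encoder, mean-pooled aspect embedding, additive-style attention scoring, layer sizes, and the class name `TwoStageAttentionNet` are all illustrative assumptions. Its only purpose is to show how a second attention pass can re-read the same hidden states while conditioning on the raw representation produced by the first pass.

```python
# Minimal sketch of a two-stage (deliberation) attention network, assuming PyTorch.
# All hyperparameters and scoring forms are assumptions for illustration.
import torch
import torch.nn as nn


class TwoStageAttentionNet(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=150, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        enc_dim = 2 * hidden_dim
        # First-stage attention: score each hidden state against the aspect.
        self.attn1 = nn.Linear(enc_dim + embed_dim, 1)
        # Second-stage (deliberation) attention: re-score the hidden states,
        # now also conditioned on the raw representation from stage one.
        self.attn2 = nn.Linear(enc_dim + embed_dim + enc_dim, 1)
        self.classifier = nn.Linear(enc_dim, num_classes)

    def forward(self, context_ids, aspect_ids):
        h, _ = self.lstm(self.embed(context_ids))        # (B, T, enc_dim)
        aspect = self.embed(aspect_ids).mean(dim=1)      # (B, embed_dim)
        T = h.size(1)
        asp = aspect.unsqueeze(1).expand(-1, T, -1)

        # Stage 1: raw aspect-related representation.
        a1 = torch.softmax(
            self.attn1(torch.cat([h, asp], dim=-1)).squeeze(-1), dim=-1)
        raw = torch.bmm(a1.unsqueeze(1), h).squeeze(1)   # (B, enc_dim)

        # Stage 2: deliberation looks at the hidden states a second time,
        # using the raw representation as global information.
        raw_exp = raw.unsqueeze(1).expand(-1, T, -1)
        a2 = torch.softmax(
            self.attn2(torch.cat([h, asp, raw_exp], dim=-1)).squeeze(-1), dim=-1)
        refined = torch.bmm(a2.unsqueeze(1), h).squeeze(1)
        return self.classifier(refined)


# Usage with dummy batch: 4 sentences of length 20, aspects of length 3.
model = TwoStageAttentionNet(vocab_size=5000)
logits = model(torch.randint(0, 5000, (4, 20)), torch.randint(0, 5000, (4, 3)))
print(logits.shape)  # torch.Size([4, 3])
```

The design choice the abstract emphasizes is that the second stage does not attend to new inputs; it re-weights the same LSTM hidden states, but its scores are informed by the first-stage output, which serves as the "global information" guiding the refinement.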

Publication
Neural Information Processing
徐华
Tenured Associate Professor, Associate Editor of Expert Systems with Applications, doctoral supervisor
孙晓民
Associate Professor, master's supervisor
邓俊辉
Professor, master's supervisor