Position Weight with Multi-Head Attention for Aspect-Based Sentiment Classification
DOI: 10.23977/mcee2020.041
Author(s)
Ping Ji, Xiaohong Liu
Corresponding Author
Ping Ji
ABSTRACT
The purpose of aspect-based sentiment classification (ABSC) is to predict the sentiment polarity of a sentence or document with respect to specific aspects. Most existing methods use coarse-grained attention mechanisms and overlook the relationship between sentiment polarity and contextual content. In this paper, we propose a model called PW-MHA, in which a pre-trained BERT is applied. PW-MHA uses the multi-head attention mechanism to capture the relationships between words within a sentence. Position weights are used for classification, based on the idea that a context word closer to the target is more likely to be important for that target. We evaluated the PW-MHA model on three datasets: the laptop and restaurant datasets from SemEval-2014, and the ACL-14 Twitter dataset. The results show that PW-MHA achieves competitive performance on all three datasets.
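The core idea of combining position weights with multi-head attention can be illustrated with a minimal NumPy sketch. Note that the specific weighting formula (a linear decay with distance from the aspect term) and the toy attention with identity projections are assumptions for illustration; the abstract does not give the paper's exact formulas.

```python
import numpy as np

def position_weights(seq_len, aspect_start, aspect_end):
    # Linear decay: words closer to the aspect term get weights near 1.
    # (Illustrative formula; the paper's exact weighting is an assumption here.)
    w = np.ones(seq_len)
    for i in range(seq_len):
        if i < aspect_start:
            w[i] = 1.0 - (aspect_start - i) / seq_len
        elif i > aspect_end:
            w[i] = 1.0 - (i - aspect_end) / seq_len
    return w

def multi_head_attention(x, num_heads):
    # Toy scaled dot-product self-attention with identity projections,
    # split across heads along the feature dimension.
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    outputs = []
    for h in range(num_heads):
        q = k = v = x[:, h * d_head:(h + 1) * d_head]
        scores = q @ k.T / np.sqrt(d_head)
        attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
        attn /= attn.sum(axis=-1, keepdims=True)
        outputs.append(attn @ v)
    return np.concatenate(outputs, axis=-1)

# Scale context embeddings by position weight before attention, so words
# near the aspect term contribute more to the attended representation.
x = np.random.rand(6, 8)  # 6 tokens, 8-dim embeddings (e.g. from BERT)
w = position_weights(6, aspect_start=2, aspect_end=3)
out = multi_head_attention(x * w[:, None], num_heads=2)
```

In a full model, the attended representation would be pooled and passed to a softmax classifier over sentiment polarities.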
KEYWORDS
Aspect-based sentiment classification, position weights, multi-head attention
1. Introduction