Transformer-based multi-head attention network for aspect-based sentiment classification

Abhinandan Shirahatti, Vijay Rajpurohit, Sanjeev Sannakki


Aspect-based sentiment classification (ABSC) is vital in helping manufacturers identify the pros and cons of their products and features. In recent years there has been a tremendous surge of interest in ABSC, since it predicts the sentiment polarity of an aspect term within a sentence rather than of the sentence as a whole. Most existing methods rely on recurrent neural networks and attention mechanisms, which fail to capture global dependencies in the input sequence and thus lose information; other methods use sequence models, but training them is tedious. Here, we propose the multi-head attention transformation (MHAT) network, which utilizes a transformer encoder to minimize training time for ABSC tasks. First, we use pre-trained global vectors for word representation (GloVe) for word and aspect-term embeddings. Second, part-of-speech (POS) features are fused with MHAT to capture the grammatical structure of an input sentence, which most existing methods have neglected. On the SemEval 2014 dataset, the proposed model consistently outperforms state-of-the-art methods on aspect-based sentiment classification tasks.
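To make the core mechanism concrete, below is a minimal NumPy sketch of multi-head self-attention, the building block of the transformer encoder the abstract describes. The weight matrices, dimensions, and random inputs are illustrative stand-ins, not the paper's trained parameters; in the actual MHAT model the input rows would be GloVe word/aspect embeddings fused with POS features.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """One multi-head self-attention layer over a sequence of token vectors X."""
    seq_len, d_model = X.shape
    d_k = d_model // num_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Split the model dimension into per-head sub-spaces: (heads, seq_len, d_k).
    def split(M):
        return M.reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)
    # Scaled dot-product attention, computed independently in each head,
    # attends over the whole sequence at once (global dependencies).
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_k)
    heads = softmax(scores) @ Vh                       # (heads, seq_len, d_k)
    # Concatenate heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

# Illustrative dimensions only (the paper's hyperparameters are not given here).
rng = np.random.default_rng(0)
seq_len, d_model, num_heads = 5, 8, 2
X = rng.standard_normal((seq_len, d_model))            # stand-in for GloVe + POS inputs
W = [rng.standard_normal((d_model, d_model)) / np.sqrt(d_model) for _ in range(4)]
out = multi_head_attention(X, *W, num_heads)
print(out.shape)  # one context-aware vector per token: (5, 8)
```

Because every token attends to every other token in a single matrix product, no recurrence is needed, which is the property the abstract credits for reduced training time.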


Aspect; Attention; Parts-of-speech; Sentiment; Transformer



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


The Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).
