1(School of Electronic & Information Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China)
2(School of Electronic & Information Engineering, Tianshui Normal University, Tianshui 741001, China)
Abstract: Aspect-based sentiment analysis is a fine-grained form of text sentiment analysis. The traditional approach combines a Long Short-Term Memory (LSTM) neural network with an attention mechanism, but it does not actually consider the connection between the aspect terms and the sentence context; moreover, a static language model is usually used in the pre-training phase, so the input word vectors cannot be adjusted as needed. To address these two problems, this paper proposes an aspect sentiment analysis model based on Ordered Neurons LSTM and the self-attention mechanism (ON-LSTM-SA). First, the corpus is pre-trained with deep contextualized word representations, Embeddings from Language Models (ELMo). Second, the ON-LSTM neural network is used in the hidden layer and trained over the context in both the left and right directions to obtain the hierarchical structure relationship between the aspect terms and the sentence. Finally, internal word dependencies are computed with the self-attention mechanism. The model was evaluated on the Laptop, Restaurant and Twitter datasets from SemEval 2014 and SemEval 2017, improving by 2.1%, 5.9% and 6.5%, respectively, over the traditional LSTM model.
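The abstract describes a three-stage pipeline: ELMo contextual embeddings, a bidirectional ON-LSTM encoder, and a self-attention layer. The following is a minimal sketch of that pipeline, not the paper's implementation: a standard bidirectional nn.LSTM stands in for the ON-LSTM layer, random vectors stand in for ELMo output, nn.MultiheadAttention stands in for the paper's self-attention formulation, and the class names, dimensions, and three-way polarity output are illustrative assumptions.

```python
# Minimal sketch of the ON-LSTM-SA pipeline described in the abstract.
# Assumptions (not from the paper): nn.LSTM substitutes for ON-LSTM,
# random tensors substitute for ELMo embeddings, and nn.MultiheadAttention
# substitutes for the paper's self-attention mechanism.
import torch
import torch.nn as nn

class AspectSentimentSketch(nn.Module):
    def __init__(self, embed_dim=1024, hidden_dim=256, num_classes=3):
        super().__init__()
        # Bidirectional recurrent encoder over the sentence context
        # (the paper uses ON-LSTM; nn.LSTM is a simplifying substitute).
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Self-attention over the encoder states to model word dependencies.
        self.self_attn = nn.MultiheadAttention(2 * hidden_dim, num_heads=4,
                                               batch_first=True)
        # Sentiment classifier (3 classes assumed: positive/neutral/negative).
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, elmo_embeddings):
        # elmo_embeddings: (batch, seq_len, embed_dim) contextual word vectors.
        states, _ = self.encoder(elmo_embeddings)
        attended, _ = self.self_attn(states, states, states)
        # Mean-pool the attended states and predict sentiment polarity.
        return self.classifier(attended.mean(dim=1))

# Usage with dummy inputs standing in for ELMo output.
model = AspectSentimentSketch()
dummy = torch.randn(2, 20, 1024)   # batch of 2 sentences, 20 tokens each
logits = model(dummy)              # (2, 3) sentiment class scores
```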