ABSTRACT
Recurrent neural networks (RNNs) have recently achieved satisfactory performance on sentiment analysis tasks. As a variant of the RNN, the gated recurrent unit (GRU) model excels at handling long text sequences, but its performance is limited on short texts. To effectively extract the contextual information of network comments while also exploiting the ability of convolutional neural networks (CNNs) to capture local features, we propose a gated convolutional neural network (GCNN) model that combines a GRU with a CNN. In this paper, we compare the proposed GCNN model with CNN and RNN models on standard datasets. Experimental results show that the GCNN model achieves state-of-the-art performance on sentiment analysis tasks, with F1-scores of 94.9% on the hotel comments (ChnSentiCorp) corpus and 90.0% on the IMDB corpus.
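The abstract's core idea, using a CNN to capture local n-gram features and a GRU to model their sequential context before classifying sentiment, can be sketched in plain numpy. This is a minimal, hypothetical forward pass only: the layer sizes, the conv-then-GRU ordering, and all parameter names are assumptions for illustration, not the paper's exact GCNN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conv1d(x, w, b):
    """Valid 1-D convolution over time with ReLU.
    x: (T, d_in) token embeddings; w: (k, d_in, d_out) filters."""
    k = w.shape[0]
    out = np.stack([np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1])) + b
                    for t in range(x.shape[0] - k + 1)])
    return np.maximum(out, 0.0)

def gru_last_state(x, p):
    """Run a GRU over x (T, d_in) and return the final hidden state."""
    h = np.zeros(p["Wz"].shape[1])
    for xt in x:
        z = sigmoid(xt @ p["Wz"] + h @ p["Uz"])            # update gate
        r = sigmoid(xt @ p["Wr"] + h @ p["Ur"])            # reset gate
        h_tilde = np.tanh(xt @ p["Wh"] + (r * h) @ p["Uh"])  # candidate
        h = (1 - z) * h + z * h_tilde
    return h

def gcnn_forward(tokens, emb, conv_w, conv_b, gru_p, w_out, b_out):
    x = emb[tokens]                    # (T, d_emb) word embeddings
    feats = conv1d(x, conv_w, conv_b)  # CNN: local n-gram features
    h = gru_last_state(feats, gru_p)   # GRU: contextual summary
    return sigmoid(h @ w_out + b_out)  # positive-sentiment probability

# Toy dimensions and randomly initialized (untrained) parameters.
d_emb, d_conv, d_hid, vocab = 8, 6, 5, 50
emb = rng.standard_normal((vocab, d_emb)) * 0.1
conv_w = rng.standard_normal((3, d_emb, d_conv)) * 0.1
conv_b = np.zeros(d_conv)
gru_p = {name: rng.standard_normal(shape) * 0.1
         for name, shape in [("Wz", (d_conv, d_hid)), ("Uz", (d_hid, d_hid)),
                             ("Wr", (d_conv, d_hid)), ("Ur", (d_hid, d_hid)),
                             ("Wh", (d_conv, d_hid)), ("Uh", (d_hid, d_hid))]}
w_out = rng.standard_normal(d_hid) * 0.1
b_out = 0.0

p = gcnn_forward(np.array([1, 4, 7, 2, 9, 3]), emb, conv_w, conv_b,
                 gru_p, w_out, b_out)
print(float(p))  # a probability in (0, 1)
```

The design point the abstract makes is visible here: the convolution sees only a fixed window of three tokens at a time (local features), while the GRU carries state across the whole sequence (context), so the classifier receives both kinds of information.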