The Use of Gated Recurrent Unit with First Order Probability for Sentiment Analysis


Samar Khudair Abbas, Loay E. George



Gated recurrent unit, deep learning, recurrent neural network, sentiment analysis


Sentiment analysis is one of the important recent subjects in the classification field, and it has grown rapidly with the use of deep learning. With the widespread use of the Internet, social media platforms, forums, survey sites, and many bloggers produce massive amounts of information in the form of customer assessments, feelings, points of view, debates, and opinions about social news, products, trademarks, protocols, videos, etc. Text analysis is therefore an important task for any system that processes strings to extract useful data. In this paper, effective deep learning methods are applied to sentiment analysis to address the problems of text analysis, using a recurrent neural network, specifically the Gated Recurrent Unit (GRU). In addition, noisy words are removed to reduce the search space. To evaluate the system's performance, a set of tests was applied to three datasets. The first and second datasets were collected from IMDB and consist of movie reviews expressed in long English sentences; the third dataset is a collection of English-language tweets with short sentences, gathered through the Twitter Search API using keyword search. The conducted tests on the developed system gave accuracies ranging from 68% to 88%, and processing time was reduced by about 89% compared with the results of other recently published works. Experimental results on these datasets demonstrate that the proposed models can learn effective features and obtain superior performance over the baseline models.
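To make the GRU component concrete, the following is a minimal NumPy sketch of the standard GRU recurrence (update gate, reset gate, candidate state), with the final hidden state used as a fixed-length summary of a review. This is an illustrative re-implementation of the generic GRU equations, not the paper's actual code; all function and parameter names here are the author's own choices for the sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step over a word vector x and previous hidden state."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde              # gated interpolation

def init_params(input_dim, hidden_dim, rng):
    """Small random weights for the three gate/candidate blocks (illustrative)."""
    def mat(r, c):
        return rng.standard_normal((r, c)) * 0.1
    params = []
    for _ in range(3):  # one (W, U, b) triple each for z, r, and h_tilde
        params += [mat(hidden_dim, input_dim),
                   mat(hidden_dim, hidden_dim),
                   np.zeros(hidden_dim)]
    return tuple(params)

def encode(sequence, hidden_dim, params):
    """Run the GRU over a sequence of word vectors; the final state
    summarises the whole review and would feed a sentiment classifier."""
    h = np.zeros(hidden_dim)
    for x in sequence:
        h = gru_step(x, h, params)
    return h
```

Because the candidate state is squashed by tanh and the new state is a convex combination of the previous state and the candidate, every component of the hidden state stays in (-1, 1), which keeps the encoding of long IMDB-style reviews numerically stable.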


