Open Access

ARTICLE


DeBERTa-GRU: Sentiment Analysis for Large Language Model

by Adel Assiri1, Abdu Gumaei2,*, Faisal Mehmood3,*, Touqeer Abbas4, Sami Ullah5

1 Department of Informatics for Business, College of Business, King Khalid University, Abha, 61421, Saudi Arabia
2 Department of Computer Science, College of Computer Engineering and Sciences, Prince Sattam bin Abdulaziz University, Al-Kharj, 11942, Saudi Arabia
3 School of Electrical and Information Engineering, Zhengzhou University, Zhengzhou, 450001, China
4 Department of Computer Science and Technology, Beijing University of Chemical Technology, Beijing, 100029, China
5 Department of Computer Science, Government College University Faisalabad, Faisalabad, Punjab, 38000, Pakistan

* Corresponding Authors: Abdu Gumaei. Email: email; Faisal Mehmood. Email: email

(This article belongs to the Special Issue: Advance Machine Learning for Sentiment Analysis over Various Domains and Applications)

Computers, Materials & Continua 2024, 79(3), 4219-4236. https://doi.org/10.32604/cmc.2024.050781

Abstract

Modern technological advancements have made social media an essential part of daily life, allowing individuals to share thoughts, emotions, and ideas. Sentiment analysis evaluates whether the sentiment of a text is positive, negative, neutral, or another personal emotion, in order to understand the sentiment context of the text. Sentiment analysis is essential in business and society because it informs strategic decision-making. It also faces challenges arising from lexical variation, unlabeled datasets, and long-distance correlations within text. Sequence models process tokens sequentially, which increases execution time, whereas Transformer models reduce computation time through parallel processing. This study uses a hybrid deep learning strategy that combines the strengths of Transformer and sequence models while avoiding their respective limitations. In particular, the proposed model integrates Decoding-enhanced Bidirectional Encoder Representations from Transformers (BERT) with disentangled attention (DeBERTa) and the Gated Recurrent Unit (GRU) for sentiment analysis. DeBERTa maps words into a compact, semantic word-embedding space, and the GRU captures long-distance contextual semantics. The proposed hybrid model achieves an F1-score of 97% on the Twitter Large Language Model (LLM) dataset, considerably higher than the performance of recent techniques.
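To make the GRU half of the hybrid concrete, the sketch below implements the standard GRU cell equations in NumPy: an update gate and a reset gate decide how much of the previous hidden state to keep at each step, which is what lets the recurrent layer carry long-distance context. This is an illustrative toy, not the authors' implementation; the 4-dimensional input vectors stand in for the word embeddings that DeBERTa would produce, and all dimensions and weights are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU step: gates decide how much past state to keep."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # blend old and new

# Toy run: 4-dim inputs (stand-ins for DeBERTa embeddings), 3-dim hidden state.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = tuple(
    m for _ in range(3)
    for m in (rng.standard_normal((d_h, d_in)),
              rng.standard_normal((d_h, d_h)),
              np.zeros(d_h))
)
h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):  # a 5-token sequence
    h = gru_cell(x, h, params)
print(h.shape)
```

Because the new state is a convex combination of the previous state and a tanh candidate, every hidden component stays bounded in (-1, 1), which keeps the recurrence numerically stable over long sequences.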

Cite This Article

APA Style
Assiri, A., Gumaei, A., Mehmood, F., Abbas, T., Ullah, S. (2024). DeBERTa-GRU: sentiment analysis for large language model. Computers, Materials & Continua, 79(3), 4219-4236. https://doi.org/10.32604/cmc.2024.050781
Vancouver Style
Assiri A, Gumaei A, Mehmood F, Abbas T, Ullah S. DeBERTa-GRU: sentiment analysis for large language model. Comput Mater Contin. 2024;79(3):4219-4236. https://doi.org/10.32604/cmc.2024.050781
IEEE Style
A. Assiri, A. Gumaei, F. Mehmood, T. Abbas, and S. Ullah, “DeBERTa-GRU: Sentiment Analysis for Large Language Model,” Comput. Mater. Contin., vol. 79, no. 3, pp. 4219-4236, 2024. https://doi.org/10.32604/cmc.2024.050781



Copyright © 2024 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.