Open Access
ARTICLE
An Efficient Long Short-Term Memory Model for Digital Cross-Language Summarization
1 Department of CSE, B V Raju Institute of Technology, Narsapur, Medak, T.S, 502 313, India
2 Department of IT, Vasavi College of Engineering, Hyderabad, T.S, 500089, India
3 Department of CSE, K.S.R.M College of Engineering, Kadapa, A.P, 516003, India
4 Department of IT, C.B.I.T, Gandipet, Hyderabad, Telangana, 500075, India
5 Department of CSE, B V Raju Institute of Technology, Narsapur, Medak, T.S, 502 313, India
* Corresponding Author: Purnachand Kollapudi. Email:
Computers, Materials & Continua 2023, 74(3), 6389-6409. https://doi.org/10.32604/cmc.2023.034072
Received 05 July 2022; Accepted 15 September 2022; Issue published 28 December 2022
Abstract
The rise of social networking has produced a growing volume of Internet-accessible digital documents written in many languages. Without Cross-Language Text Summarization (CLTS), such documents must be evaluated manually; CLTS processes source documents in disparate languages and generates summaries in a target language. This processing must capture the contextual semantics of the documents through a suitable decoding scheme. This paper presents a multilingual cross-language model for abstractive summarization of digital documents. The proposed model, Hidden Markov Model LSTM Reinforcement Learning (HMMlstmRL), operates in three stages. First, a Hidden Markov Model computes keyword scores over the cross-language words for clustering. Second, bi-directional long short-term memory networks extract keywords in the cross-language process. Finally, HMMlstmRL applies a voting scheme within reinforcement learning to identify and extract the final keywords. The performance of the proposed HMMlstmRL is 2% better than that of a conventional bi-directional LSTM model.
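To make the three-stage pipeline concrete, the following is a minimal sketch (not the authors' code) of how the stages described above could be wired together: an HMM-style keyword score per token, a bi-directional LSTM keyword scorer, and a simple voting step that combines the two. All class names, layer sizes, and the voting rule are assumptions for illustration only.

```python
# Hypothetical sketch of the HMMlstmRL stages described in the abstract.
import torch
import torch.nn as nn


class BiLSTMKeywordScorer(nn.Module):
    """Scores each token of a sentence as keyword / non-keyword."""

    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.score = nn.Linear(2 * hidden_dim, 1)  # one keyword logit per token

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> logits: (batch, seq_len)
        emb = self.embed(token_ids)
        hidden, _ = self.bilstm(emb)
        return self.score(hidden).squeeze(-1)


def vote(hmm_scores: torch.Tensor, lstm_logits: torch.Tensor,
         threshold: float = 0.5) -> torch.Tensor:
    """Toy voting step: a token is kept as a keyword only if both scorers agree."""
    lstm_probs = torch.sigmoid(lstm_logits)
    return ((hmm_scores > threshold) & (lstm_probs > threshold)).float()


if __name__ == "__main__":
    model = BiLSTMKeywordScorer(vocab_size=10_000)
    tokens = torch.randint(0, 10_000, (2, 12))   # dummy token batch
    hmm_scores = torch.rand(2, 12)               # stand-in for HMM keyword posteriors
    keywords = vote(hmm_scores, model(tokens))
    print(keywords.shape)                        # torch.Size([2, 12])
```

In the paper, the voting step is driven by reinforcement learning rather than the fixed threshold rule used in this sketch; the sketch only shows where such a combination would sit in the pipeline.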
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.