Search Results (1)
  • Open Access

    ARTICLE

    Neural Machine Translation Models with Attention-Based Dropout Layer

Huma Israr*, Safdar Abbas Khan, Muhammad Ali Tahir, Muhammad Khuram Shahzad, Muneer Ahmad, Jasni Mohamad Zain*

    CMC-Computers, Materials & Continua, Vol.75, No.2, pp. 2981-3009, 2023, DOI:10.32604/cmc.2023.035814 - 31 March 2023

    Abstract In bilingual translation, attention-based Neural Machine Translation (NMT) models are used to achieve synchrony between input and output sequences and to capture the notion of alignment. NMT models have obtained state-of-the-art performance for several language pairs. However, there has been little work exploring useful architectures for Urdu-to-English machine translation. We conducted extensive Urdu-to-English translation experiments using Long Short-Term Memory (LSTM), Bidirectional Recurrent Neural Network (Bi-RNN), Statistical Recurrent Unit (SRU), Gated Recurrent Unit (GRU), Convolutional Neural Network (CNN), and Transformer architectures. Experimental results show that Bi-RNN and LSTM with an attention mechanism, trained iteratively on a scalable data set, make precise predictions on unseen…
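    The abstract refers to attention-based NMT, in which the decoder computes an alignment over encoder states at each step. As a rough illustration only (not the paper's attention-based dropout layer), a minimal global dot-product attention step, with hypothetical names `luong_attention`, `decoder_state`, and `encoder_states`, might look like:

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        # numerically stable softmax over the given axis
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def luong_attention(decoder_state, encoder_states):
        # scores: dot product between the decoder state and each encoder state
        scores = encoder_states @ decoder_state      # shape (T,)
        # alignment weights over the T source positions
        weights = softmax(scores)
        # context vector: weighted sum of encoder states, shape (d,)
        context = weights @ encoder_states
        return context, weights

    # toy example: 5 source tokens, hidden size 8 (illustrative values)
    rng = np.random.default_rng(0)
    encoder_states = rng.standard_normal((5, 8))
    decoder_state = rng.standard_normal(8)
    ctx, w = luong_attention(decoder_state, encoder_states)
    ```

    In a full model the context vector is concatenated with the decoder state before predicting the next target token; the specific dropout placement studied in the article is not reproduced here.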
