Open Access

ARTICLE

Dependency-Based Local Attention Approach to Neural Machine Translation

Jing Qiu1, Yan Liu2, Yuhan Chai2, Yaqi Si2, Shen Su1,*, Le Wang1,*, Yue Wu3

1 Cyberspace Institute of Advanced Technology (CIAT), Guangzhou University, Guangzhou, 510006, China.
2 Department of Information Science and Engineering, Hebei University of Science and Technology, Shijiazhuang, 050000, China.
3 USC Information Sciences Institute, Marina del Rey, CA 90292, USA.

* Corresponding Authors: Shen Su; Le Wang.

Computers, Materials & Continua 2019, 59(2), 547-562. https://doi.org/10.32604/cmc.2019.05892

Abstract

Recently, dependency information has been used in various ways to improve neural machine translation (NMT). For example, dependency labels can be added to the hidden states of source words, or the contiguous information of a source word can be extracted from the dependency tree, learned independently, and incorporated into the NMT model as a unit in various ways. However, these works are limited to using dependency information to enrich the hidden states of source words. Since many works in Statistical Machine Translation (SMT) and NMT have demonstrated the validity and potential of dependency information, we believe there are still other ways to apply it within the NMT architecture. In this paper, we explore a new way to use dependency information to improve NMT. Building on the local attention mechanism, we present the Dependency-based Local Attention Approach (DLAA), a new attention mechanism that allows the NMT model to trace the dependency words related to the word currently being translated. Our work also indicates that dependency information can help supervise the attention mechanism. Experimental results on the WMT 17 Chinese-to-English translation task training datasets show that our model is effective and performs distinctly well on long sentence translation.
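The abstract only sketches DLAA at a high level, so the following is a minimal NumPy sketch of the general idea it describes: restricting local attention at each decoding step to the source words that are dependency neighbors of the source position currently being attended to. The function names (dependency_mask, dlaa_attention), the one-hop default window, and the assumption that the decoder supplies an aligned source position are illustrative choices for this sketch, not the authors' exact formulation.

    import numpy as np

    def dependency_mask(heads, window=1):
        """Boolean mask M where M[i, j] is True if source word j is within
        `window` hops of word i in the dependency tree (including i itself).
        `heads` is a list where heads[i] is the index of word i's head, -1 for root."""
        n = len(heads)
        adj = np.zeros((n, n), dtype=bool)
        for i, h in enumerate(heads):
            adj[i, i] = True
            if h >= 0:
                adj[i, h] = True
                adj[h, i] = True
        mask = adj.copy()
        for _ in range(window - 1):
            # expand the neighborhood by one more hop along dependency arcs
            mask = mask | ((mask.astype(int) @ adj.astype(int)) > 0)
        return mask

    def dlaa_attention(query, keys, values, heads, aligned_pos, window=1):
        """Toy dependency-based local attention for one decoding step.
        query: (d,), keys/values: (n, d); attention is restricted to the
        dependency neighborhood of the source position `aligned_pos`."""
        mask = dependency_mask(heads, window)[aligned_pos]       # (n,)
        scores = keys @ query / np.sqrt(keys.shape[1])           # scaled dot-product scores
        scores = np.where(mask, scores, -np.inf)                 # hide non-neighbor source words
        weights = np.exp(scores - scores.max())                  # softmax over the masked scores
        weights = weights / weights.sum()
        return weights @ values, weights

    # Toy usage: "the cat sat", with heads the->cat, cat->sat, sat->ROOT
    heads = [1, 2, -1]
    rng = np.random.default_rng(0)
    keys = rng.normal(size=(3, 4))
    values = rng.normal(size=(3, 4))
    query = rng.normal(size=4)
    context, weights = dlaa_attention(query, keys, values, heads, aligned_pos=1)
    print(weights)  # nonzero weight on all three words, since all neighbor "cat"

In a full NMT model the aligned position would typically be predicted by the decoder (as in standard local attention), and the masked weights would replace or supervise the global attention distribution; this sketch only illustrates the dependency-restricted scoring step.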

Keywords


Cite This Article

APA Style
Qiu, J., Liu, Y., Chai, Y., Si, Y., Su, S. et al. (2019). Dependency-based local attention approach to neural machine translation. Computers, Materials & Continua, 59(2), 547-562. https://doi.org/10.32604/cmc.2019.05892
Vancouver Style
Qiu J, Liu Y, Chai Y, Si Y, Su S, Wang L, et al. Dependency-based local attention approach to neural machine translation. Comput Mater Contin. 2019;59(2):547-562. https://doi.org/10.32604/cmc.2019.05892
IEEE Style
J. Qiu et al., “Dependency-Based Local Attention Approach to Neural Machine Translation,” Comput. Mater. Contin., vol. 59, no. 2, pp. 547-562, 2019. https://doi.org/10.32604/cmc.2019.05892

Copyright © 2019 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
