Open Access

ARTICLE


An Improved Time Feedforward Connections Recurrent Neural Networks

by Jin Wang1,2, Yongsong Zou1, Se-Jung Lim3,*

1 School of Hydraulic & Environmental Engineering, Changsha University of Science & Technology, Changsha, 410014, China
2 School of Computer & Communication Engineering, Changsha University of Science & Technology, Changsha, 410014, China
3 AI Liberal Arts Studies, Division of Convergence, Honam University, Gwangju-si, 62399, Korea

* Corresponding Author: Se-Jung Lim.

Intelligent Automation & Soft Computing 2023, 36(3), 2743-2755. https://doi.org/10.32604/iasc.2023.033869

Abstract

Recurrent Neural Networks (RNNs) have been widely applied to temporal problems such as flood forecasting and financial data processing. On the one hand, traditional RNN models amplify the gradient problem because of their strict serial time dependency, making it difficult to realize a long-term memory function. On the other hand, RNN cells are highly complex, which significantly increases computational complexity and wastes computational resources during model training. In this paper, an improved Time Feedforward Connections Recurrent Neural Networks (TFC-RNNs) model was first proposed to address the gradient problem. A parallel branch was introduced so that the hidden state at time t − 2 can be transferred directly to time t without passing through the nonlinear transformation at time t − 1, which effectively improves the long-term dependency modeling of RNNs. Then, a novel cell structure named Single Gate Recurrent Unit (SGRU) was presented. This cell structure reduces the number of parameters in the RNN cell and consequently lowers computational complexity. Next, SGRU was applied to TFC-RNNs to form a new TFC-SGRU model that resolves both difficulties. Finally, the performance of the proposed TFC-SGRU was verified through several experiments on long-term memory and anti-interference capability. Experimental results demonstrated that the proposed TFC-SGRU model can capture useful information over a span of 1500 time steps and effectively filter out noise. The TFC-SGRU model also achieves better accuracy than the LSTM and GRU models on language-processing tasks.
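To make the abstract's two ideas concrete, the sketch below pairs a single gate (the SGRU idea) with a skip connection that feeds the hidden state from two steps back directly into the current update (the time feedforward connection). It is a minimal illustration in PyTorch, not the authors' published formulation: the class name TFCSGRUCell, the specific gating equation, and the use of the gate to blend in h_{t−2} are all assumptions made for this example.

```python
import torch
import torch.nn as nn

class TFCSGRUCell(nn.Module):
    """Hypothetical single-gate cell with a time feedforward connection.

    A single gate z replaces the GRU's update/reset gate pair, cutting the
    cell's parameter count; h_{t-2} reaches time t without passing through
    the nonlinear transformation at t-1. Equations are illustrative only.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x_t, h_prev, h_prev2):
        xh = torch.cat([x_t, h_prev], dim=-1)
        z = torch.sigmoid(self.gate(xh))          # single gate (SGRU idea)
        h_tilde = torch.tanh(self.candidate(xh))  # candidate state
        # Time feedforward connection: h_{t-2} flows straight into h_t,
        # bypassing the nonlinearity applied at step t-1.
        return z * h_tilde + (1.0 - z) * h_prev2

# Unrolling keeps a two-step history of hidden states.
cell = TFCSGRUCell(input_size=8, hidden_size=16)
x = torch.randn(100, 4, 8)                 # (time, batch, features)
h_prev = h_prev2 = torch.zeros(4, 16)
for x_t in x:
    h_prev, h_prev2 = cell(x_t, h_prev, h_prev2), h_prev
```

Because the parallel branch gives gradients a two-step path that skips one nonlinearity per hop, error signals decay more slowly during backpropagation through time, which is the mechanism behind the improved long-term dependence the abstract describes.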

Keywords


Cite This Article

APA Style
Wang, J., Zou, Y., & Lim, S. (2023). An improved time feedforward connections recurrent neural networks. Intelligent Automation & Soft Computing, 36(3), 2743-2755. https://doi.org/10.32604/iasc.2023.033869
Vancouver Style
Wang J, Zou Y, Lim S. An improved time feedforward connections recurrent neural networks. Intell Automat Soft Comput. 2023;36(3):2743-2755. https://doi.org/10.32604/iasc.2023.033869
IEEE Style
J. Wang, Y. Zou, and S. Lim, “An Improved Time Feedforward Connections Recurrent Neural Networks,” Intell. Automat. Soft Comput., vol. 36, no. 3, pp. 2743-2755, 2023. https://doi.org/10.32604/iasc.2023.033869



Copyright © 2023 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.