Jin Wang1,2, Yongsong Zou1, Se-Jung Lim3,*
Intelligent Automation & Soft Computing, Vol.36, No.3, pp. 2743-2755, 2023, DOI:10.32604/iasc.2023.033869
- 15 March 2023
Abstract Recurrent Neural Networks (RNNs) have been widely applied to temporal problems, such as flood forecasting and financial data processing. On the one hand, traditional RNN models amplify the gradient issue due to their strict serial time dependency, making it difficult to realize a long-term memory function. On the other hand, RNN cells are highly complex, which significantly increases computational complexity and wastes computational resources during model training. In this paper, an improved Time Feedforward Connections Recurrent Neural Networks (TFC-RNNs) model was first proposed to address the gradient issue. A parallel…