Open Access

ARTICLE


Hyperparameter Tuning for Deep Neural Networks Based Optimization Algorithm

D. Vidyabharathi1,*, V. Mohanraj2

1 Sona College of Technology, Computer Science and Engineering, Salem, 636005, India
2 Sona College of Technology, Information Technology, Salem, 636005, India

* Corresponding Author: D. Vidyabharathi. Email: email

Intelligent Automation & Soft Computing 2023, 36(3), 2559-2573. https://doi.org/10.32604/iasc.2023.032255

Abstract

For training present-day Neural Network (NN) models, the standard technique is to use decaying Learning Rates (LR). Most of these techniques start with a large LR and decay it several times over the course of training. Decaying the LR has been shown to improve both generalization and optimization. Other parameters, such as the network size, the number of hidden layers, dropout to avoid overfitting, and batch size, are typically chosen by heuristics. This work proposes an Adaptive Teaching Learning Based (ATLB) Heuristic to identify the optimal hyperparameters for diverse networks. Three Deep Neural Network architectures are considered for classification: Recurrent Neural Networks (RNN), Long Short Term Memory (LSTM), and Bidirectional Long Short Term Memory (BiLSTM). The proposed ATLB is evaluated with several learning rate schedulers: Cyclical Learning Rate (CLR), Hyperbolic Tangent Decay (HTD), and Toggling between Hyperbolic Tangent Decay and Triangular mode with Restarts (T-HTR). Experimental results show performance improvements on the 20Newsgroup, Reuters Newswire, and IMDB datasets.
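For reference, the abstract names the CLR and HTD learning rate schedulers. Below is a minimal Python sketch of how these schedules are commonly defined (CLR triangular policy, Smith 2017; HTD, Hsueh et al. 2019); the hyperparameter values (base_lr, max_lr, step_size, and the tanh bounds) are illustrative assumptions, not values taken from the paper.

    import math

    def clr_triangular(step, base_lr=1e-4, max_lr=1e-2, step_size=2000):
        # Cyclical Learning Rate, triangular policy: the LR ramps linearly
        # from base_lr up to max_lr and back down once every 2*step_size steps.
        cycle = math.floor(1 + step / (2 * step_size))
        x = abs(step / step_size - 2 * cycle + 1)
        return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)

    def htd(step, total_steps, lr0=1e-2, lower=-6.0, upper=3.0):
        # Hyperbolic Tangent Decay: lr0/2 * (1 - tanh(L + (U - L) * t / T)),
        # decaying smoothly from roughly lr0 toward 0 over total_steps steps.
        return lr0 / 2.0 * (1.0 - math.tanh(lower + (upper - lower) * step / total_steps))

    if __name__ == "__main__":
        # Print both schedules at a few points of a 10,000-step run (illustrative only).
        for t in (0, 2500, 5000, 7500, 10000):
            print(f"step {t:>5}: CLR={clr_triangular(t):.5f}  HTD={htd(t, 10000):.5f}")

The T-HTR variant evaluated in the paper toggles between the HTD curve and a triangular (restarting) mode; its exact switching rule is not reproduced here.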

Keywords


Cite This Article

D. Vidyabharathi and V. Mohanraj, "Hyperparameter tuning for deep neural networks based optimization algorithm," Intelligent Automation & Soft Computing, vol. 36, no. 3, pp. 2559–2573, 2023.



This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.