Open Access

ARTICLE

Efficient Training of Multi-Layer Neural Networks to Achieve Faster Validation

Adel Saad Assiri*

Management Information Systems Department, College of Business, King Khalid University, Abha, Saudi Arabia

* Corresponding Author: Adel Saad Assiri.

Computer Systems Science and Engineering 2021, 36(3), 435-450. https://doi.org/10.32604/csse.2021.014894

Abstract

Artificial neural networks (ANNs) are one of the most actively studied topics in computer science and artificial intelligence due to their potential and advantages in analyzing real-world problems in various disciplines, including but not limited to physics, biology, chemistry, and engineering. However, ANNs lack several key characteristics of biological neural networks, such as sparsity, scale-freeness, and small-worldness. The concept of sparse and scale-free neural networks has been introduced to fill this gap. Network sparsity is implemented by removing weak weights between neurons during the learning process and replacing them with random weights. At initialization, however, such a network is still fully connected, which means the number of weights is four times the number of neurons. In this study, considering that a biological neural network has some degree of initial sparsity, we design an ANN with a prescribed level of initial sparsity. The neural network is tested on handwritten digits, Arabic characters, CIFAR-10, and Reuters newswire topics. Simulations show that it is possible to reduce the number of weights by up to 50% without losing prediction accuracy. Moreover, in all cases, the testing time is dramatically reduced compared with that of fully connected ANNs.
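To make the two mechanisms described above concrete, the following is a minimal NumPy sketch of (a) initializing a weight matrix with a prescribed level of initial sparsity and (b) one prune-and-regrow step that removes the weakest surviving weights and replaces them with random connections. This is an illustrative sketch, not the paper's implementation: the function names `sparse_init` and `prune_and_regrow`, the Gaussian weight scale, the uniform random mask, the rewiring fraction `zeta`, and the 784-by-300 example layer are all assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)


def sparse_init(n_in, n_out, sparsity=0.5):
    """Create a weight matrix with a prescribed fraction of connections
    removed at initialization ("initial sparsity"); the Gaussian scale and
    the uniform mask are illustrative choices, not the paper's scheme."""
    weights = rng.normal(0.0, 0.1, size=(n_in, n_out))
    mask = rng.random((n_in, n_out)) >= sparsity  # keep ~ (1 - sparsity) of weights
    return weights * mask, mask


def prune_and_regrow(weights, mask, zeta=0.3):
    """One sparsity-evolution step: drop the weakest fraction `zeta` of the
    surviving weights and add the same number of new random connections."""
    active = np.flatnonzero(mask)
    n_swap = int(zeta * active.size)
    # remove the smallest-magnitude active weights
    weakest = active[np.argsort(np.abs(weights.flat[active]))[:n_swap]]
    mask.flat[weakest] = False
    weights.flat[weakest] = 0.0
    # regrow the same number of connections at random inactive positions
    inactive = np.flatnonzero(~mask)
    reborn = rng.choice(inactive, size=n_swap, replace=False)
    mask.flat[reborn] = True
    weights.flat[reborn] = rng.normal(0.0, 0.1, size=n_swap)
    return weights, mask


# Example: a hypothetical 784-by-300 layer (e.g., flattened MNIST digits
# feeding a first hidden layer), pruned once between training epochs.
W, M = sparse_init(784, 300, sparsity=0.5)
W, M = prune_and_regrow(W, M, zeta=0.3)
```

Starting from a sparse mask, rather than a fully connected layer, is what reduces the weight count (by up to 50% in the reported simulations) and shortens testing time, since fewer multiplications are needed in the forward pass.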

Keywords


Cite This Article

A. Saad Assiri, "Efficient training of multi-layer neural networks to achieve faster validation," Computer Systems Science and Engineering, vol. 36, no. 3, pp. 435–450, 2021.



This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.