Open Access
ARTICLE
Activation Functions Effect on Fractal Coding Using Neural Networks
Department of Mathematics, Faculty of Sciences and Arts, Najran University, KSA
* Corresponding Author: Rashad A. Al-Jawfi. Email:
Intelligent Automation & Soft Computing 2023, 36(1), 957-965. https://doi.org/10.32604/iasc.2023.031700
Received 25 April 2022; Accepted 06 July 2022; Issue published 29 September 2022
Abstract
Activation functions play an essential role in converting the output of an artificial neural network into nonlinear results; without this nonlinearity, the network's results would be less accurate. Any nonlinear function except a polynomial can supply this nonlinearity, and the activation function must be differentiable for backpropagation learning. The objective of this study is to determine the best activation function for approximating each fractal image. Different results have been obtained using MATLAB and Visual Basic programs, and they indicate that bounded functions are more helpful than other functions. The nonlinearity of the activation function is important when using neural networks to code fractal images, because the coefficients of the Iterated Function System differ according to the type of fractal. The most commonly chosen activation function is the sigmoid, which produces only positive values. Other functions, such as tanh or arctan, whose values can be positive or negative depending on the network input, tend to train neural networks faster. The coding speed of a fractal image therefore varies with the activation function chosen for each fractal shape. In this paper, we provide the appropriate activation functions for each type of iterated function system, helping the network to identify the transformations of the system.
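For illustration, the sketch below (not part of the original article) evaluates the three activation functions discussed in the abstract together with their derivatives, the quantities that backpropagation requires. It is a minimal MATLAB sketch under our own assumptions; the variable names and the plotting range are illustrative only.

% Minimal sketch (assumed, not from the article): the three activation
% functions named in the abstract and the derivatives backpropagation uses.
% Note the output ranges: the sigmoid is strictly positive, while tanh and
% arctan take both signs.
x = linspace(-5, 5, 201);

sig  = 1 ./ (1 + exp(-x));   % sigmoid, output in (0, 1)
dsig = sig .* (1 - sig);     % derivative of the sigmoid

th  = tanh(x);               % tanh, output in (-1, 1)
dth = 1 - th.^2;             % derivative of tanh

at  = atan(x);               % arctan, output in (-pi/2, pi/2)
dat = 1 ./ (1 + x.^2);       % derivative of arctan

plot(x, sig, x, th, x, at);
legend('sigmoid', 'tanh', 'arctan');
title('Candidate activation functions');

Because the sigmoid is bounded in (0, 1) while tanh and arctan are bounded but symmetric about zero, their derivative behavior differs, which is one way to read the abstract's observation that sign-symmetric activations tend to train faster.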
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.