Open Access
ARTICLE
A Finite Difference Method and Effective Modification of Gradient Descent Optimization Algorithm for MHD Fluid Flow Over a Linearly Stretching Surface
1 Stochastic Analysis & Optimization Research Group, Department of Mathematics, Air University, PAF Complex E-9, Islamabad, 44000, Pakistan.
2 Department of Mathematics, Comsats University, Chak Shahzad Campus, Islamabad, 44000, Pakistan.
3 Department of Electrical and Computer Engineering, Comsats University, Islamabad, Wah Campus, Wah Cantt, 47040, Pakistan.
* Corresponding Author: Mairaj Bibi. Email: .
Computers, Materials & Continua 2020, 62(2), 657-677. https://doi.org/10.32604/cmc.2020.08584
Abstract
The present contribution is concerned with the construction and application of a numerical method for a fluid flow problem over a linearly stretching surface, together with a modification of the standard gradient descent algorithm for solving the resulting difference equations. The flow problem is formulated from the continuity and Navier-Stokes equations, and these PDEs are reduced to a boundary value problem by applying suitable similarity transformations. A central finite difference method is proposed that achieves third-order accuracy using three grid points. The stability conditions of the proposed method, combined with a Gauss-Seidel iterative procedure, are found using the von Neumann stability criterion, and the order of the finite difference method is established by applying a Taylor series expansion to the discretised equation. The presently modified optimisation algorithm is also compared with the Gauss-Seidel iterative method and with the standard Newton's method of optimisation. It is concluded that the modified algorithm converges in only a few iterations for a small value of the parameter it contains, whereas the standard descent algorithm may require millions of iterations to converge. The present modification of the steepest descent method converges faster than both the Gauss-Seidel method and the standard steepest descent method, and it may also overcome the deficiency of a singular Hessian that can arise in Newton's method in some optimisation problems.
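A minimal illustrative sketch of the workflow described above (not the paper's own scheme): the paper builds a compact three-point central-difference method of third-order accuracy for the similarity-reduced boundary value problem and solves the resulting algebraic system by Gauss-Seidel iteration and by a modified gradient descent method. None of those details are reproduced here; instead, the classical second-order three-point stencil is applied to the model problem -u''(x) = f(x) with u(0) = u(1) = 0, simply to show how a three-point discretisation produces an algebraic system and how Gauss-Seidel sweeps compare with plain steepest descent on it. All function and variable names below are hypothetical.

```python
# Illustrative only: classical three-point stencil on a linear model problem,
# solved by Gauss-Seidel sweeps and by standard steepest descent.
import numpy as np


def assemble(n, f):
    """Three-point stencil for -u'' = f on n interior nodes of (0, 1)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return A, f(x), x


def gauss_seidel(A, b, sweeps=2000):
    """Plain Gauss-Seidel sweeps for A u = b."""
    u = np.zeros_like(b)
    for _ in range(sweeps):
        for i in range(len(b)):
            u[i] = (b[i] - A[i, :i] @ u[:i] - A[i, i + 1:] @ u[i + 1:]) / A[i, i]
    return u


def steepest_descent(A, b, iters=20000):
    """Standard steepest descent on the quadratic 0.5*u^T A u - b^T u."""
    lr = 1.0 / np.linalg.eigvalsh(A).max()  # safe fixed step length
    u = np.zeros_like(b)
    for _ in range(iters):
        u -= lr * (A @ u - b)               # gradient of the quadratic
    return u


if __name__ == "__main__":
    # -u'' = pi^2 sin(pi x) on (0, 1) has the exact solution u = sin(pi x).
    A, b, x = assemble(20, lambda x: np.pi**2 * np.sin(np.pi * x))
    exact = np.sin(np.pi * x)
    print("Gauss-Seidel max error:    ", np.abs(gauss_seidel(A, b) - exact).max())
    print("Steepest descent max error:", np.abs(steepest_descent(A, b) - exact).max())
```

Both iterations converge to the same discrete solution; the contrast in iteration counts needed for a fixed tolerance is what motivates the modified descent method discussed in the paper, whose details are given in the body of the article rather than in this sketch.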
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.