Influence Activation Function in Approximate Periodic Functions Using Neural Networks

Luma N. M. Tawfiq
Ala K. Jabber

Abstract

The aim of this paper is to design fast neural networks to approximate periodic functions. The design uses fully connected networks, with links between all nodes in adjacent layers, which can speed up approximation, reduce approximation failures, and increase the possibility of obtaining the globally optimal approximation. The suggested network is trained with the Levenberg-Marquardt training algorithm and is then accelerated by choosing the most suitable activation function (transfer function), which gives a very fast convergence rate for networks of reasonable size. In all algorithms, the gradient of the performance function (energy function) is used to determine how to adjust the weights so that the performance function is minimized, and the back-propagation algorithm is used to increase the speed of training.
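The following is a minimal sketch, not the authors' code, of the kind of experiment the abstract describes: a small fully connected network fitted to a periodic target with Levenberg-Marquardt optimization, with the approximation error compared across two activation functions. The target function, hidden-layer size, and parameter initialization are illustrative assumptions.

```python
# Hedged sketch: approximate a periodic function (sin x, assumed target) with a
# single-hidden-layer fully connected network trained by Levenberg-Marquardt,
# comparing two activation (transfer) functions.
import numpy as np
from scipy.optimize import least_squares

x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x)          # periodic target function (illustrative choice)
H = 10                 # hidden-layer size (assumed)

def unpack(p):
    # Parameter vector p = [input weights (H), hidden biases (H),
    #                       output weights (H), output bias (1)]
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def net(p, x, act):
    w1, b1, w2, b2 = unpack(p)
    hidden = act(np.outer(x, w1) + b1)   # fully connected input -> hidden layer
    return hidden @ w2 + b2              # linear output layer

def residuals(p, act):
    # Residuals whose sum of squares Levenberg-Marquardt minimizes
    return net(p, x, act) - y

rng = np.random.default_rng(0)
for name, act in [("tanh", np.tanh),
                  ("logistic", lambda z: 1.0 / (1.0 + np.exp(-z)))]:
    p0 = rng.normal(scale=0.5, size=3 * H + 1)
    sol = least_squares(residuals, p0, args=(act,), method="lm")
    mse = np.mean(sol.fun ** 2)
    print(f"{name:8s}  function evaluations={sol.nfev:4d}  MSE={mse:.2e}")
```

Running the sketch prints the number of function evaluations and the final mean squared error for each activation function, which is one simple way to compare convergence speed and approximation quality for a network of fixed size.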

Article Details

How to Cite
Influence Activation Function in Approximate Periodic Functions Using Neural Networks. (2017). Ibn AL-Haitham Journal For Pure and Applied Sciences, 27(1), 306-313. https://jih.uobaghdad.edu.iq/index.php/j/article/view/396
Section
Mathematics
