Influence of Activation Function in Approximating Periodic Functions Using Neural Networks

Authors

  • Luma N. M. Tawfiq
  • Ala K. Jabber

Keywords:

Artificial neural network, Training network, Activation Function

Abstract

The aim of this paper is to design fast neural networks to approximate periodic functions; that is, to design fully connected networks, with links between all nodes in adjacent layers, that can speed up approximation, reduce approximation failures, and increase the likelihood of reaching the globally optimal approximation. The suggested network is trained with the Levenberg-Marquardt algorithm and then accelerated by choosing the most suitable activation function (transfer function), one with a very fast convergence rate for networks of reasonable size. In all algorithms, the gradient of the performance function (energy function) is used to determine how to adjust the weights so that the performance function is minimized, and the back-propagation algorithm is used to increase the speed of training.
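As a minimal illustrative sketch of the setup described above (not the authors' exact configuration), the snippet below trains a small fully connected 1-H-1 network to approximate a periodic target, sin(2πx), using a Levenberg-Marquardt least-squares solver (SciPy's MINPACK wrapper) and compares two hidden activation functions. The layer width, sample count, target function, and activation choices are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import least_squares

H = 10                        # hidden-layer width (assumed)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x)   # periodic target function (assumed example)

def unpack(p):
    """Split the flat parameter vector into layer weights and biases."""
    w1 = p[:H]                # input-to-hidden weights (single input)
    b1 = p[H:2*H]             # hidden biases
    w2 = p[2*H:3*H]           # hidden-to-output weights
    b2 = p[3*H]               # output bias
    return w1, b1, w2, b2

def forward(p, x, act):
    """Fully connected 1-H-1 network with a selectable hidden activation."""
    w1, b1, w2, b2 = unpack(p)
    hidden = act(np.outer(x, w1) + b1)   # shape (n_samples, H)
    return hidden @ w2 + b2              # linear output layer

def residuals(p, act):
    return forward(p, x, act) - y

rng = np.random.default_rng(0)
activations = [("tanh", np.tanh),
               ("logistic", lambda z: 1.0 / (1.0 + np.exp(-z)))]

for name, act in activations:
    p0 = rng.normal(scale=0.5, size=3 * H + 1)
    # method="lm" applies the Levenberg-Marquardt algorithm to the residuals.
    fit = least_squares(residuals, p0, args=(act,), method="lm")
    rmse = np.sqrt(np.mean(fit.fun ** 2))
    print(f"{name:>8s}: RMSE = {rmse:.2e}, function evaluations = {fit.nfev}")
```

Comparing the final RMSE and the number of function evaluations across activation functions mirrors, in a toy setting, the paper's idea of selecting the transfer function that gives the fastest convergence for a reasonably sized network.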

Published

23-Apr-2017

Section

Mathematics
