Influence of the Activation Function on Approximating Periodic Functions Using Neural Networks
Keywords:
Artificial neural network, Training network, Activation Function

Abstract
The aim of this paper is to design fast neural networks for approximating periodic functions, that is, fully connected networks with links between all nodes in adjacent layers, which can speed up approximation, reduce approximation failures, and increase the possibility of obtaining the globally optimal approximation. We train the suggested networks with the Levenberg-Marquardt training algorithm and then speed them up by choosing the most suitable activation function (transfer function), which gives a very fast convergence rate for networks of reasonable size. In all algorithms, the gradient of the performance function (energy function) is used to determine how to adjust the weights so that the performance function is minimized, and the backpropagation algorithm is used to increase the speed of training.
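To make the setup concrete, the sketch below fits a small fully connected network to one period of a sine wave with a Levenberg-Marquardt least-squares solver, minimizing the sum-of-squared-errors performance function. This is only an illustrative approximation of the approach described above: the single hidden layer, the tanh activation, the hidden-layer width, and the use of SciPy's solver in place of the paper's exact training routine are all assumptions, not details taken from the paper.

```python
# Minimal sketch (assumed architecture and solver, not the paper's exact setup):
# a one-hidden-layer network with tanh activation fitted to sin(x) by
# Levenberg-Marquardt minimization of the squared-error performance function.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Training data: one period of a periodic target function.
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x)

n_hidden = 10  # hidden-layer width (illustrative choice)

def unpack(theta):
    """Split the flat parameter vector into layer weights and biases."""
    w1 = theta[:n_hidden]                  # input -> hidden weights
    b1 = theta[n_hidden:2 * n_hidden]      # hidden biases
    w2 = theta[2 * n_hidden:3 * n_hidden]  # hidden -> output weights
    b2 = theta[-1]                         # output bias
    return w1, b1, w2, b2

def forward(theta, x):
    """Fully connected network: tanh hidden layer, linear output."""
    w1, b1, w2, b2 = unpack(theta)
    hidden = np.tanh(np.outer(x, w1) + b1)  # shape (n_samples, n_hidden)
    return hidden @ w2 + b2

def residuals(theta):
    """Error vector whose squared norm is the performance (energy) function."""
    return forward(theta, x) - y

theta0 = rng.normal(scale=0.5, size=3 * n_hidden + 1)

# method="lm" selects SciPy's Levenberg-Marquardt least-squares routine.
result = least_squares(residuals, theta0, method="lm")

print("final sum-squared error:", np.sum(result.fun ** 2))
```

Swapping `np.tanh` for another activation (e.g. a logistic sigmoid) in `forward` is one way to compare how the choice of transfer function affects convergence speed and final error on the same periodic target.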