Faster Convergent Artificial Neural Networks
DOI: https://doi.org/10.24297/ijct.v17i1.7106

Keywords: Artificial Neural Networks, Backpropagation algorithm, Eigenvalues, Hessian Matrix, Learning Rate

Abstract
This paper proposes a novel fast-convergence algorithm for artificial neural networks (ANNs) in which the learning rate is based on the eigenvalues of the Hessian matrix associated with the input data. That is, the learning rate applied in the backpropagation algorithm changes dynamically with the data used for training. The best choice of learning rate for fast convergence to an accurate value is derived. The proposed fast-convergence algorithm is applied to a traditional multilayer ANN architecture with feed-forward and backpropagation techniques, and the strategy is tested on various functions learned by the ANN through training. Learning curves obtained with learning rates calculated by the proposed method are compared against learning curves obtained with an arbitrary learning rate to demonstrate the usefulness of the technique. The study shows that convergence to accurate values can be achieved far more quickly (a reduction in iterations by a factor of one hundred) using the proposed approach, which is presented with derivations, pertinent examples, and the resulting learning curves.
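The paper's derivation is not reproduced on this page. As a rough illustration of the underlying idea only, the sketch below assumes the classical linear case, where the Hessian of the mean-squared error reduces to the input correlation matrix R = XᵀX/N and a learning rate inside the stable range η < 2/λ_max can be read off from its eigenvalues. The variable names and the specific choice η = 1/λ_max are assumptions made for this sketch, not necessarily the paper's exact formula.

import numpy as np

# Minimal sketch: for a single linear unit trained on mean-squared error,
# the Hessian of the loss is the input correlation matrix R = X^T X / N,
# so its largest eigenvalue bounds the stable learning-rate range.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # 200 training samples, 3 input features
w_true = np.array([1.5, -2.0, 0.5])     # assumed target weights for the demo
y = X @ w_true                          # targets from the assumed linear map

R = X.T @ X / len(X)                    # Hessian of the MSE for a linear unit
lambda_max = np.linalg.eigvalsh(R)[-1]  # largest eigenvalue (ascending order)
eta = 1.0 / lambda_max                  # one common choice inside eta < 2/lambda_max

w = np.zeros(3)
for _ in range(100):
    grad = X.T @ (X @ w - y) / len(X)   # gradient of the MSE
    w -= eta * grad                     # gradient-descent update

print(w)  # approaches w_true in far fewer iterations than a small arbitrary eta

Because the eigenvalues here depend only on the input data, the learning rate can be recomputed whenever the training data change, which is the dynamic behavior the abstract describes.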