Normalization Methods for Backpropagation: A Comparative Study


Bibliographic Details
Main Authors: Adel S. Eesa, Wahab Kh. Arabo
Format: Article
Language: English
Published: University of Zakho 2017-12-01
Series: Science Journal of University of Zakho
Subjects:
Online Access: https://sjuoz.uoz.edu.krd/index.php/sjuoz/article/view/440
Description
Summary: Neural Networks (NN) have been used by many researchers to solve problems in several domains, including classification and pattern recognition; Backpropagation (BP) is one of the most well-known artificial neural network models. Constructing effective NN applications relies on characteristics such as the network topology, the learning parameters, and the normalization approach applied to the input and output vectors. The input and output vectors for BP need to be normalized properly in order to achieve the best performance of the network. This paper applies several normalization methods to several UCI datasets and compares them to find the normalization method that works best with BP. Norm, Decimal scaling, Mean-Mad, Median-Mad, Min-Max, and Z-score normalization are considered in this study. The comparative study shows that Mean-Mad and Median-Mad outperform all the remaining methods, while the worst results are produced by the Norm method.
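The six normalization methods named in the abstract can be sketched as follows. This is a minimal illustration using their common textbook definitions (the paper's exact formulas may differ slightly); the function names and the use of NumPy are this sketch's own assumptions, not taken from the article.

```python
import numpy as np

def norm(x):
    # Norm: divide each value by the Euclidean (L2) norm of the vector.
    return x / np.linalg.norm(x)

def decimal_scaling(x):
    # Decimal scaling: divide by 10**j, where j is the smallest integer
    # such that every scaled value has absolute value below 1.
    j = int(np.floor(np.log10(np.max(np.abs(x))))) + 1
    return x / (10.0 ** j)

def min_max(x, lo=0.0, hi=1.0):
    # Min-Max: linearly rescale values into the range [lo, hi].
    return lo + (x - x.min()) * (hi - lo) / (x.max() - x.min())

def z_score(x):
    # Z-score: center on the mean, scale by the standard deviation.
    return (x - x.mean()) / x.std()

def mean_mad(x):
    # Mean-Mad: center on the mean, scale by the mean absolute
    # deviation from the mean.
    mad = np.mean(np.abs(x - x.mean()))
    return (x - x.mean()) / mad

def median_mad(x):
    # Median-Mad: center on the median, scale by the median absolute
    # deviation from the median (robust to outliers).
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    return (x - med) / mad

# Example: normalize one feature column before feeding it to a BP network.
feature = np.array([2.0, 4.0, 6.0, 100.0])
print(min_max(feature))   # values rescaled into [0, 1]
print(median_mad(feature))
```

The robustness to outliers of the Mad-based scalers (an outlier like 100.0 above barely affects the median and its absolute deviations) is one plausible reason they could outperform the other methods in the paper's comparison.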
ISSN: 2663-628X; 2663-6298