A Normalization Methods for Backpropagation: A Comparative Study


Adel S. Eesa
Wahab Kh. Arabo

Abstract

Neural networks (NN) have been used by many researchers to solve problems in several domains, including classification and pattern recognition, and Backpropagation (BP) is one of the most well-known artificial neural network models. Constructing effective NN applications depends on characteristics such as the network topology, the learning parameters, and the normalization approach used for the input and output vectors. The input and output vectors for BP need to be normalized properly in order to achieve the best performance of the network. This paper applies several normalization methods to several UCI datasets and compares them to find the normalization method that works best with BP. Norm, Decimal scaling, Mean-Mad, Median-Mad, Min-Max, and Z-score normalization are considered in this study. The comparative study shows that Mean-Mad and Median-Mad outperform all the remaining methods, while the worst results are produced by the Norm method.
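The six methods named in the abstract follow standard textbook definitions; a minimal sketch of each, assuming the conventional formulas (the paper's exact variants may differ), could look like this in Python with NumPy:

```python
import numpy as np

def min_max(x):
    # Min-Max: rescale to the range [0, 1]
    return (x - x.min()) / (x.max() - x.min())

def z_score(x):
    # Z-score: centre on the mean, scale by the standard deviation
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    # Decimal scaling: divide by 10^j, the smallest power of ten
    # that brings every |value| below 1
    j = int(np.floor(np.log10(np.abs(x).max()))) + 1
    return x / 10.0 ** j

def norm(x):
    # Norm: divide the vector by its Euclidean length
    return x / np.linalg.norm(x)

def mean_mad(x):
    # Mean-Mad: centre on the mean, scale by the mean absolute deviation
    mad = np.abs(x - x.mean()).mean()
    return (x - x.mean()) / mad

def median_mad(x):
    # Median-Mad: centre on the median, scale by the median absolute deviation
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    return (x - med) / mad

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
print(min_max(x))  # [0.   0.25 0.5  0.75 1.  ]
```

The robust variants (Mean-Mad, Median-Mad) are less sensitive to outliers than Min-Max or Z-score, which is consistent with the paper's finding that they pair well with BP.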


Article Details


Author Biographies

Adel S. Eesa, University of Zakho

Dept. of Computer Science, Faculty of Science, University of Zakho, Kurdistan Region - Iraq (adel.eesa@uoz.edu.krd).

Wahab Kh. Arabo, University of Zakho

Dept. of Computer Science, Faculty of Science, University of Zakho, Kurdistan Region - Iraq.

How to Cite

Eesa, A. S., & Arabo, W. K. (2017). A Normalization Methods for Backpropagation: A Comparative Study. Science Journal of University of Zakho, 5(4), 319-323. https://doi.org/10.25271/2017.5.4.381
