ABSTRACT
During the last twenty years, Akaike's Information Criterion (AIC) has had a fundamental impact on statistical model evaluation problems. This paper studies the general theory of Akaike's Information Criterion (AIC) and applies it to determine the optimal architecture of a neural network. Neural networks have been used to solve a variety of classification problems. The computational properties of many possible network designs have been analyzed, but the decision as to which of several competing network architectures is "best" for a given problem remains subjective.
A relationship between the optimal neural network architecture and statistical model identification is described, and a derivation of Akaike's Information Criterion (AIC) is given.
Key words: neural network, Multi-Layered Perceptron, Maximum Likelihood, Kullback-Leibler Information, Entropy, Akaike's Information Criterion.
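For reference, the criterion discussed throughout is, in its standard form,
\[
\mathrm{AIC} = -2\ln\hat{L} + 2k,
\]
where \(\hat{L}\) is the maximized likelihood of the fitted model and \(k\) is the number of independently adjusted parameters; among competing network architectures, the one minimizing AIC is preferred.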