Performance comparison of model selection criteria by generated experimental data


Bibliographic Details
Main Authors: Mavrevski Radoslav, Milanov Peter, Traykov Metodi, Pencheva Nevena
Format: Article
Language:English
Published: EDP Sciences 2018-01-01
Series:ITM Web of Conferences
Online Access:https://doi.org/10.1051/itmconf/20181602006
Description
Summary:In Bioinformatics and other areas, model selection is the process of choosing, from a set of candidate models of different classes, the model that provides the best balance between goodness of fit to the data and model complexity. Many criteria exist for evaluating mathematical models for data fitting. The main objectives of this study are: (1) to fit artificial experimental data with models of increasing complexity; (2) to test whether two well-known criteria, Akaike's information criterion (AIC) and the Bayesian information criterion (BIC), can correctly identify the model used to generate the artificial data; and (3) to assess and compare empirically the performance of AIC and BIC.
ISSN:2271-2097
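
The workflow the abstract describes (generate artificial data from a known model, fit candidates of increasing complexity, score each with AIC and BIC) can be sketched as below. The quadratic true model, noise level, sample size, and polynomial candidate family are illustrative assumptions, not the paper's actual experimental protocol; the least-squares AIC/BIC formulas assume Gaussian errors.

```python
import numpy as np

def aic_bic(y, y_pred, k):
    # For least-squares fits with Gaussian errors:
    #   AIC = n*ln(RSS/n) + 2k,   BIC = n*ln(RSS/n) + k*ln(n)
    # where RSS is the residual sum of squares and k the number of
    # fitted parameters. BIC penalizes complexity more for n > 7.
    n = len(y)
    rss = np.sum((y - y_pred) ** 2)
    return n * np.log(rss / n) + 2 * k, n * np.log(rss / n) + k * np.log(n)

# Generate artificial data from a known (quadratic) model plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(0.0, 1.0, x.size)

# Fit candidate polynomial models of increasing complexity and score them.
scores = {}
for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    y_pred = np.polyval(coeffs, x)
    k = degree + 1  # polynomial coefficients = fitted parameters
    scores[degree] = aic_bic(y, y_pred, k)

# Each criterion selects the candidate with the lowest score.
best_aic = min(scores, key=lambda d: scores[d][0])
best_bic = min(scores, key=lambda d: scores[d][1])
```

A criterion "correctly identifies" the generating model when the selected degree equals the true one (here 2); comparing how often AIC and BIC do so over repeated noise realizations gives the kind of empirical performance comparison the study describes.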