The Best Fit Bayesian Hierarchical Generalized Linear Model Selection Using Information Complexity Criteria in the MCMC Approach


Bibliographic Details
Main Authors: Endris Assen Ebrahim, Mehmet Ali Cengiz, Erol Terzi
Format: Article
Language: English
Published: Hindawi Limited, 2024-01-01
Series: Journal of Mathematics
Online Access: http://dx.doi.org/10.1155/2024/1459524
collection DOAJ
description Both the frequentist and Bayesian schools of statistics have developed statistical tools and model-selection methods for collected data and measurements. Model selection approaches have advanced because comparing complicated hierarchical models is difficult when linear predictors vary by grouping variable and the effective number of model parameters is not well defined. Many regression model selection criteria rely on the maximum likelihood (ML) point estimate of the parameters and the log-likelihood of the dataset. This paper applies the information complexity criterion (ICOMP), the Bayesian deviance information criterion (DIC), and the widely applicable information criterion (WAIC) of BRMS to hierarchical linear models fitted to repeated-measures data, using one simulated and two real data examples. The Fisher information matrix for the Bayesian hierarchical model, with both fixed and random parameters, is derived under maximum a posteriori (MAP) estimation. Using Gibbs sampling and hybrid Hamiltonian Monte Carlo approaches, six different models were fitted to three distinct application datasets, and the best-fitting candidate models were identified under each dataset for the two MCMC approaches. The Bayesian hierarchical (mixed-effects) linear model with random intercepts and random slopes, estimated with the Hamiltonian Monte Carlo method, best fits the two application datasets. Information complexity (ICOMP) is a better indicator of the best-fitting models than DIC and WAIC. In addition, the information complexity criterion showed that hierarchical models estimated with gradient-based Hamiltonian Monte Carlo fit best and converge faster than those estimated with the gradient-free Gibbs sampling method.
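The two selection criteria named in the abstract can be illustrated with a short sketch. This is not the authors' code: the formulas are the commonly stated textbook forms (Watanabe's WAIC computed from an S-draws-by-N-observations matrix of pointwise log-likelihoods, and Bozdogan-style ICOMP built from the C1 complexity of the inverse Fisher information matrix), and the function names and toy inputs are illustrative.

```python
import numpy as np

def waic(log_lik):
    """WAIC on the deviance scale, -2 * (lppd - p_waic), from an
    (S posterior draws x N observations) matrix of pointwise log-likelihoods."""
    S = log_lik.shape[0]
    # lppd: log of the posterior-mean likelihood, summed over observations
    lppd = np.sum(np.logaddexp.reduce(log_lik, axis=0) - np.log(S))
    # p_waic: posterior variance of the log-likelihood, summed over observations
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

def icomp(neg2loglik, inv_fisher):
    """Bozdogan-style ICOMP(IFIM): -2 log L + 2 * C1(F^-1), where
    C1(S) = (s/2) log(tr(S)/s) - (1/2) log det(S) and s = rank(S)."""
    s = np.linalg.matrix_rank(inv_fisher)
    c1 = 0.5 * s * np.log(np.trace(inv_fisher) / s) \
         - 0.5 * np.linalg.slogdet(inv_fisher)[1]
    return neg2loglik + 2.0 * c1

# Toy checks: for a spherical covariance C1 = 0, so ICOMP reduces to -2 log L;
# the fake log-likelihood draws stand in for MCMC output.
print(icomp(100.0, np.eye(3)))                      # -> 100.0
rng = np.random.default_rng(0)
ll = rng.normal(-1.0, 0.1, size=(1000, 50))         # fake posterior log-liks
print(waic(ll))
```

The C1 term penalizes ill-conditioned (highly correlated) parameter covariance structures, which is why the abstract can rank models with the same likelihood differently under ICOMP than under DIC or WAIC.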
issn 2314-4785
Author affiliations: Endris Assen Ebrahim (Debre Tabor University); Mehmet Ali Cengiz (Ondokuz Mayis University); Erol Terzi (Ondokuz Mayis University)