The Bayesian Expectation-Maximization-Maximization for the 3PLM
The current study proposes an alternative feasible Bayesian algorithm for the three-parameter logistic model (3PLM) from a mixture-modeling perspective, namely, the Bayesian Expectation-Maximization-Maximization (Bayesian EMM, or BEMM). As a new maximum likelihood estimation (MLE) alternative to the...
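For context on the model named above, here is a minimal sketch of the three-parameter logistic model (3PLM) and the mixture reading that EMM-type algorithms build on. The notation ($a_j$, $b_j$, $g_j$ for item discrimination, difficulty, and guessing; $\theta_i$ for ability) is the conventional one and is not taken from this record:

$$
P(X_{ij}=1 \mid \theta_i) \;=\; g_j + (1 - g_j)\,P^{*}_{ij},
\qquad
P^{*}_{ij} \;=\; \frac{1}{1 + \exp\{-a_j(\theta_i - b_j)\}} .
$$

Read as a two-component mixture, examinee $i$ either "knows" item $j$ (with probability $P^{*}_{ij}$) and answers correctly, or does not know it and guesses correctly with probability $g_j$. Augmenting the data with this latent know/guess indicator is the device that mixture-modeling estimators of the 3PLM, such as the EMM family described in the abstract, typically exploit.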
Main Authors: | Shaoyang Guo, Chanjin Zheng |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A. 2019-05-01 |
Series: | Frontiers in Psychology |
Subjects: | 3PL; Bayesian EMM; Bayesian EM; mixture modeling; estimation |
Online Access: | https://www.frontiersin.org/article/10.3389/fpsyg.2019.01175/full |
collection | DOAJ |
description | The current study proposes an alternative feasible Bayesian algorithm for the three-parameter logistic model (3PLM) from a mixture-modeling perspective, namely, the Bayesian Expectation-Maximization-Maximization (Bayesian EMM, or BEMM). As a new maximum likelihood estimation (MLE) alternative to the marginal MLE EM (MMLE/EM) for the 3PLM, the EMM can explore the likelihood function much better, but it may still suffer from the unidentifiability problem indicated by occasional extremely large item parameter estimates. Traditionally, this problem has been remedied by the Bayesian approach, which leads to Bayes modal estimation (BME) in IRT estimation. The current study mimics the Bayes modal estimation method and develops the BEMM, which, as a combination of the EMM and the Bayesian approach, brings in the benefits of both methods. The study also devises a supplemented EM method to estimate the standard errors (SEs). A simulation study and two real data examples indicate that the BEMM can be more robust against changes in the priors than Bayes modal estimation. The mixture-modeling idea and this algorithm can be naturally extended to other IRT models with guessing parameters and to the four-parameter logistic model (4PLM). |
id | doaj.art-40fedfe8161c4ae09d560728b25508a2 |
issn | 1664-1078 |
doi | 10.3389/fpsyg.2019.01175
author affiliations | Shaoyang Guo: Institute of Curriculum and Instruction, Faculty of Education, East China Normal University, Shanghai, China; Chanjin Zheng: Department of Educational Psychology, Faculty of Education, East China Normal University, Shanghai, China and Words up your way, Beijing, China