What Is the Maximum Likelihood Estimate When the Initial Solution to the Optimization Problem Is Inadmissible? The Case of Negatively Estimated Variances
The default procedures of the software programs Mplus and lavaan tend to yield an inadmissible solution (also called a Heywood case) when the sample is small or a parameter is close to the boundary of the parameter space. In factor models, a negatively estimated variance often occurs...
Main Authors: | Steffen Zitzmann, Julia-Kim Walther, Martin Hecht, Benjamin Nagengast |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-06-01 |
Series: | Psych |
Subjects: | maximum likelihood; Heywood case; inadmissible solution; confirmatory factor analysis |
Online Access: | https://www.mdpi.com/2624-8611/4/3/29 |
_version_ | 1827657362180145152 |
author | Steffen Zitzmann; Julia-Kim Walther; Martin Hecht; Benjamin Nagengast |
author_facet | Steffen Zitzmann; Julia-Kim Walther; Martin Hecht; Benjamin Nagengast |
author_sort | Steffen Zitzmann |
collection | DOAJ |
description | The default procedures of the software programs Mplus and lavaan tend to yield an inadmissible solution (also called a Heywood case) when the sample is small or a parameter is close to the boundary of the parameter space. In factor models, a negatively estimated variance often occurs. One strategy for dealing with this is to fix the variance to zero and then estimate the model again in order to obtain the estimates of the remaining model parameters. In the present article, we present one possible approach for justifying this strategy. Specifically, using a simple one-factor model as an example, we show that the maximum likelihood (ML) estimate of the variance of the latent factor is zero when the initial solution to the optimization problem (i.e., the solution provided by the default procedure) is a negative value. The basis of our argument is the very definition of ML estimation, which requires that the log-likelihood be maximized over the parameter space. We present the results of a small simulation study, which was conducted to evaluate the proposed ML procedure and compare it with Mplus’ default procedure. We found that the proposed ML procedure increased estimation accuracy compared to Mplus’ procedure, rendering the ML procedure an attractive option for dealing with inadmissible solutions. |
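A minimal numerical sketch of the boundary argument is given below. It is not the authors' code and does not reproduce their simulation design: the loadings, residual variances, sample size, population factor variance, and random seed are all hypothetical, and only the factor variance psi is estimated (with the other parameters held fixed) to keep the example short. It uses numpy and scipy to maximize the normal log-likelihood once without the admissibility constraint and once over the admissible region psi >= 0; whenever the unconstrained maximizer is negative, the constrained ML estimate lands on the boundary, i.e., at zero.

```python
# Minimal sketch (not the authors' code): ML estimation of the factor
# variance psi in a one-factor model y_j = lambda_j * eta + eps_j,
# with hypothetical loadings and residual variances held fixed.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

lam = np.array([1.0, 0.8, 0.6])          # assumed (fixed) loadings
theta = np.diag([1.0, 1.0, 1.0])         # assumed (fixed) residual variances
n = 20                                   # small sample -> Heywood cases are likely
psi_true = 0.05                          # population factor variance near the boundary

# Simulate a small data set from this population.
eta = rng.normal(0.0, np.sqrt(psi_true), size=n)
y = eta[:, None] * lam + rng.multivariate_normal(np.zeros(3), theta, size=n)
S = np.cov(y, rowvar=False, bias=True)   # ML-type sample covariance matrix

def neg_loglik(psi):
    """Negative normal log-likelihood (up to a constant) as a function of psi."""
    sigma = psi * np.outer(lam, lam) + theta
    sign, logdet = np.linalg.slogdet(sigma)
    if sign <= 0:                        # implied covariance matrix not positive definite
        return 1e10
    return 0.5 * n * (logdet + np.trace(np.linalg.solve(sigma, S)))

# "Initial" solution: psi is allowed to go negative (the likelihood is defined
# only while the implied covariance matrix stays positive definite, which for
# these loadings requires psi > -0.5).
unconstrained = minimize_scalar(neg_loglik, bounds=(-0.45, 5.0), method="bounded")
# ML over the parameter space: psi restricted to the admissible region [0, inf).
constrained = minimize_scalar(neg_loglik, bounds=(0.0, 5.0), method="bounded")

print("unconstrained psi-hat:", round(unconstrained.x, 4))
print("constrained   psi-hat:", round(constrained.x, 4))
# Whenever the unconstrained psi-hat is negative, the constrained estimate
# sits on the boundary (psi-hat = 0), matching the article's argument.
```

The same logic carries over to the full model in which loadings and residual variances are estimated jointly; the sketch only isolates the variance parameter so that the boundary behavior is easy to see, and rerunning it with different seeds shows how often small samples produce a negative unconstrained estimate.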
first_indexed | 2024-03-09T22:41:10Z |
format | Article |
id | doaj.art-f1212a6c2e714e3db61399192947f1b0 |
institution | Directory Open Access Journal |
issn | 2624-8611 |
language | English |
last_indexed | 2024-03-09T22:41:10Z |
publishDate | 2022-06-01 |
publisher | MDPI AG |
record_format | Article |
series | Psych |
spelling | doaj.art-f1212a6c2e714e3db61399192947f1b0; 2023-11-23T18:39:19Z; eng; MDPI AG; Psych (2624-8611); 2022-06-01; vol. 4, no. 3, pp. 343-356; doi:10.3390/psych4030029; What Is the Maximum Likelihood Estimate When the Initial Solution to the Optimization Problem Is Inadmissible? The Case of Negatively Estimated Variances; Steffen Zitzmann (Hector Research Institute of Education Sciences and Psychology, University of Tübingen, 72072 Tübingen, Germany); Julia-Kim Walther (Hector Research Institute of Education Sciences and Psychology, University of Tübingen, 72072 Tübingen, Germany); Martin Hecht (Faculty of Humanities and Social Sciences, Helmut Schmidt University, 22043 Hamburg, Germany); Benjamin Nagengast (Hector Research Institute of Education Sciences and Psychology, University of Tübingen, 72072 Tübingen, Germany); abstract as in the description field; https://www.mdpi.com/2624-8611/4/3/29; maximum likelihood; Heywood case; inadmissible solution; confirmatory factor analysis |
spellingShingle | Steffen Zitzmann; Julia-Kim Walther; Martin Hecht; Benjamin Nagengast; What Is the Maximum Likelihood Estimate When the Initial Solution to the Optimization Problem Is Inadmissible? The Case of Negatively Estimated Variances; Psych; maximum likelihood; Heywood case; inadmissible solution; confirmatory factor analysis |
title | What Is the Maximum Likelihood Estimate When the Initial Solution to the Optimization Problem Is Inadmissible? The Case of Negatively Estimated Variances |
title_sort | what is the maximum likelihood estimate when the initial solution to the optimization problem is inadmissible the case of negatively estimated variances |
topic | maximum likelihood; Heywood case; inadmissible solution; confirmatory factor analysis |
url | https://www.mdpi.com/2624-8611/4/3/29 |