A Double-Penalized Estimator to Combat Separation and Multicollinearity in Logistic Regression

Bibliographic Details
Main Authors: Ying Guan, Guang-Hui Fu
Format: Article
Language: English
Published: MDPI AG, 2022-10-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/10/20/3824
Description
Summary: When developing prediction models for small or sparse binary data with many highly correlated covariates, logistic regression often encounters separation or multicollinearity problems, resulting in serious bias and even the nonexistence of standard maximum likelihood estimates. The combination of separation and multicollinearity makes logistic regression more difficult still, yet few studies have addressed separation and multicollinearity simultaneously. In this paper, we propose a double-penalized method called lFRE to combat separation and multicollinearity in logistic regression. lFRE combines the log-F-type penalty with the ridge penalty. The results indicate that, compared with other penalty methods, lFRE not only effectively removes bias from predicted probabilities but also attains the minimum mean squared prediction error. In addition, a real dataset is used to compare the performance of the lFRE algorithm with that of several existing methods. The results show that lFRE is strongly competitive and can serve as an alternative algorithm in logistic regression for solving separation and multicollinearity problems.
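The abstract describes adding a log-F-type penalty and a ridge penalty to the logistic log-likelihood. The sketch below is only an illustration of that general idea, assuming the standard log-F(m, m) penalty form, (m/2)β − m·log(1 + e^β) per coefficient, together with an L2 term; the function names, tuning parameters, and optimizer are illustrative and are not taken from the paper, whose lFRE algorithm may be implemented differently.

```python
# Illustrative sketch (not the paper's exact lFRE implementation): a
# double-penalized logistic regression combining a log-F(m, m)-type penalty
# with a ridge penalty, fitted by direct numerical optimization.
import numpy as np
from scipy.optimize import minimize

def neg_penalized_loglik(beta, X, y, m=1.0, lam=0.1):
    """Negative of: binomial log-likelihood + log-F(m, m) penalty + ridge penalty."""
    eta = X @ beta
    # Bernoulli log-likelihood with logit link (numerically stable form).
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
    # log-F(m, m) penalty on the slopes; the intercept (column 0) is left unpenalized.
    b = beta[1:]
    logf_pen = np.sum(0.5 * m * b - m * np.logaddexp(0.0, b))
    # Ridge penalty on the slopes.
    ridge_pen = -lam * np.sum(b ** 2)
    return -(loglik + logf_pen + ridge_pen)

def logf_ridge_fit(X, y, m=1.0, lam=0.1):
    """Fit the double-penalized logistic model; X must include an intercept column."""
    beta0 = np.zeros(X.shape[1])
    res = minimize(neg_penalized_loglik, beta0, args=(X, y, m, lam), method="BFGS")
    return res.x

# Toy data exhibiting both separation and near-collinearity.
rng = np.random.default_rng(0)
x1 = rng.normal(size=30)
x2 = x1 + rng.normal(scale=0.01, size=30)   # nearly collinear with x1
y = (x1 > 0).astype(float)                  # perfectly separated by x1
X = np.column_stack([np.ones(30), x1, x2])
print(logf_ridge_fit(X, y))                 # finite, shrunken coefficients
```

On this toy design the unpenalized maximum likelihood estimate does not exist (coefficients diverge under separation), whereas the doubly penalized objective stays concave and bounded, so the optimizer returns finite, shrunken slopes. Leaving the intercept unpenalized mirrors common practice for ridge-type penalties.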
ISSN: 2227-7390