Central double cross-validation for estimating parameters in regression models


Bibliographic Details
Main Author: Chye, Rou Shi
Format: Thesis
Language: English
Published: 2016
Subjects:
Online Access:http://eprints.utm.my/80959/2/ChyeRouShiMFS2016.pdf
_version_ 1796863292607561728
author Chye, Rou Shi
author_facet Chye, Rou Shi
author_sort Chye, Rou Shi
collection ePrints
description Ridge regression, the lasso, the elastic net, forward stagewise regression and least angle regression require a solution path and a tuning parameter, λ, to estimate the coefficient vector. It is therefore crucial to find the ideal λ. Cross-validation (CV) is the most widely used method for choosing the ideal tuning parameter from the solution path. CV essentially splits the original sample into two parts: one part is used to develop the regression equation, which is then applied to the other part to estimate the risk of every model. The final model is the one with the smallest estimated risk. However, CV does not give consistent results, because it suffers from overfitting and underfitting effects during model selection. In the present study, a new method for estimating parameters in best-subset regression, called central double cross-validation (CDCV), is proposed. In this method, CV is run twice with different numbers of folds. CDCV therefore maximizes the use of the available data, enhances model selection performance and builds a new, stable CV curve. The final model chosen is the one whose error is less than ?? standard error above the smallest CV error. CDCV was compared with existing CV methods in determining the correct model via a simulation study with different sample sizes and correlation settings. The simulation study indicates that the proposed CDCV method has the highest percentage of identifying the right model and the lowest Bayesian information criterion (BIC) value across the simulated settings. The results showed that CDCV can select the right model and prevent underfitting and overfitting. CDCV is therefore recommended as a good alternative to the existing methods in the simulation settings.
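The CV procedure the abstract describes — split the sample, fit on one part, estimate prediction risk on the other, and pick the tuning parameter with the smallest estimated risk — can be sketched as below. This is a minimal illustration of standard k-fold CV for the ridge λ, not the thesis's CDCV method (whose exact two-stage construction is not specified here); the synthetic data, λ grid, and fold count are all illustrative assumptions.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: (X'X + lam*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def cv_error(X, y, lam, k=5, seed=0):
    """Mean squared prediction error of ridge(lam), estimated by k-fold CV."""
    rng = np.random.default_rng(seed)   # fixed seed: same folds for every lam
    idx = rng.permutation(len(y))
    errs = []
    for test in np.array_split(idx, k):
        train = np.setdiff1d(idx, test)             # develop the equation ...
        beta = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((y[test] - X[test] @ beta) ** 2))  # ... evaluate risk
    return float(np.mean(errs))

# Illustrative data: 3 active predictors, 2 pure-noise predictors.
rng = np.random.default_rng(42)
X = rng.standard_normal((100, 5))
beta_true = np.array([2.0, -1.0, 0.5, 0.0, 0.0])
y = X @ beta_true + 0.3 * rng.standard_normal(100)

grid = [0.01, 0.1, 1.0, 10.0, 100.0]
errors = {lam: cv_error(X, y, lam) for lam in grid}
best_lam = min(errors, key=errors.get)   # smallest estimated risk wins
```

Because the same fold assignment is reused for every λ, the CV errors are directly comparable along the grid; heavy shrinkage (λ = 100 here) is visibly penalized relative to the best λ.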
first_indexed 2024-03-05T20:24:34Z
format Thesis
id utm.eprints-80959
institution Universiti Teknologi Malaysia - ePrints
language English
last_indexed 2024-03-05T20:24:34Z
publishDate 2016
record_format dspace
spelling utm.eprints-80959 2019-07-24T00:13:24Z http://eprints.utm.my/80959/ Central double cross-validation for estimating parameters in regression models Chye, Rou Shi QA Mathematics Ridge regression, the lasso, the elastic net, forward stagewise regression and least angle regression require a solution path and a tuning parameter, λ, to estimate the coefficient vector. It is therefore crucial to find the ideal λ. Cross-validation (CV) is the most widely used method for choosing the ideal tuning parameter from the solution path. CV essentially splits the original sample into two parts: one part is used to develop the regression equation, which is then applied to the other part to estimate the risk of every model. The final model is the one with the smallest estimated risk. However, CV does not give consistent results, because it suffers from overfitting and underfitting effects during model selection. In the present study, a new method for estimating parameters in best-subset regression, called central double cross-validation (CDCV), is proposed. In this method, CV is run twice with different numbers of folds. CDCV therefore maximizes the use of the available data, enhances model selection performance and builds a new, stable CV curve. The final model chosen is the one whose error is less than ?? standard error above the smallest CV error. CDCV was compared with existing CV methods in determining the correct model via a simulation study with different sample sizes and correlation settings. The simulation study indicates that the proposed CDCV method has the highest percentage of identifying the right model and the lowest Bayesian information criterion (BIC) value across the simulated settings. The results showed that CDCV can select the right model and prevent underfitting and overfitting. CDCV is therefore recommended as a good alternative to the existing methods in the simulation settings. 2016-07 Thesis NonPeerReviewed application/pdf en http://eprints.utm.my/80959/2/ChyeRouShiMFS2016.pdf Chye, Rou Shi (2016) Central double cross-validation for estimating parameters in regression models. Masters thesis, Universiti Teknologi Malaysia, Faculty of Science. http://dms.library.utm.my:8080/vital/access/manager/Repository/vital:120286
spellingShingle QA Mathematics
Chye, Rou Shi
Central double cross-validation for estimating parameters in regression models
title Central double cross-validation for estimating parameters in regression models
title_full Central double cross-validation for estimating parameters in regression models
title_fullStr Central double cross-validation for estimating parameters in regression models
title_full_unstemmed Central double cross-validation for estimating parameters in regression models
title_short Central double cross-validation for estimating parameters in regression models
title_sort central double cross validation for estimating parameters in regression models
topic QA Mathematics
url http://eprints.utm.my/80959/2/ChyeRouShiMFS2016.pdf
work_keys_str_mv AT chyeroushi centraldoublecrossvalidationforestimatingparametersinregressionmodels