Sparse Poisson regression via mixed-integer optimization.
We present a mixed-integer optimization (MIO) approach to sparse Poisson regression. The MIO approach to sparse linear regression was first proposed in the 1970s, but has recently received renewed attention due to advances in optimization algorithms and computer hardware. In contrast to many sparse estimation algorithms, the MIO approach has the advantage of finding the best subset of explanatory variables with respect to various criterion functions. In this paper, we focus on a sparse Poisson regression that maximizes the weighted sum of the log-likelihood function and the L2-regularization term. For this problem, we derive a mixed-integer quadratic optimization (MIQO) formulation by applying a piecewise-linear approximation to the log-likelihood function. Optimization software can solve this MIQO problem to optimality. Moreover, we propose two methods for selecting a limited number of tangent lines effective for piecewise-linear approximations. We assess the efficacy of our method through computational experiments using synthetic and real-world datasets. Our methods provide better log-likelihood values than do conventional greedy algorithms in selecting tangent lines. In addition, our MIQO formulation delivers better out-of-sample prediction performance than do forward stepwise selection and L1-regularized estimation, especially in low-noise situations.
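The piecewise-linear device the abstract describes can be illustrated with a small, solver-free sketch (not the authors' code). In the Poisson log-likelihood, the nonlinear term involves exp(z); because exp is convex, every tangent line lies below it, so the pointwise maximum of a few tangent lines gives a piecewise-linear underestimate that tightens as tangent points are added. The interval [-2, 2], the evaluation grid, and the equally spaced tangent points below are illustrative choices, not the paper's selection methods.

```python
import math

def tangent_line(t):
    """Tangent line to exp at the point t: z -> exp(t) * (1 + z - t)."""
    return lambda z: math.exp(t) * (1.0 + z - t)

def pwl_exp(z, tangent_points):
    """Piecewise-linear underestimate of exp(z): the pointwise
    maximum of the tangent lines at the chosen points."""
    return max(tangent_line(t)(z) for t in tangent_points)

# Worst-case gap exp(z) - pwl_exp(z) on a grid over [-2, 2],
# for increasingly fine sets of equally spaced tangent points.
grid = [-2.0 + 4.0 * k / 400 for k in range(401)]
max_gap = {}
for n in (3, 5, 9):
    pts = [-2.0 + 4.0 * k / (n - 1) for k in range(n)]
    max_gap[n] = max(math.exp(z) - pwl_exp(z, pts) for z in grid)
```

In the MIQO formulation itself, each exp term would be replaced by an auxiliary variable constrained to lie above every tangent line, which keeps the problem quadratic; the paper's contribution includes choosing a small, effective set of tangent points rather than an equispaced grid as above.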
Main Authors: | Hiroki Saishu, Kota Kudo, Yuichi Takano |
---|---|
Format: | Article |
Language: | English |
Published: | Public Library of Science (PLoS), 2021-01-01 |
Series: | PLoS ONE |
Online Access: | https://doi.org/10.1371/journal.pone.0249916 |
author | Hiroki Saishu, Kota Kudo, Yuichi Takano
collection | DOAJ |
description | We present a mixed-integer optimization (MIO) approach to sparse Poisson regression. The MIO approach to sparse linear regression was first proposed in the 1970s, but has recently received renewed attention due to advances in optimization algorithms and computer hardware. In contrast to many sparse estimation algorithms, the MIO approach has the advantage of finding the best subset of explanatory variables with respect to various criterion functions. In this paper, we focus on a sparse Poisson regression that maximizes the weighted sum of the log-likelihood function and the L2-regularization term. For this problem, we derive a mixed-integer quadratic optimization (MIQO) formulation by applying a piecewise-linear approximation to the log-likelihood function. Optimization software can solve this MIQO problem to optimality. Moreover, we propose two methods for selecting a limited number of tangent lines effective for piecewise-linear approximations. We assess the efficacy of our method through computational experiments using synthetic and real-world datasets. Our methods provide better log-likelihood values than do conventional greedy algorithms in selecting tangent lines. In addition, our MIQO formulation delivers better out-of-sample prediction performance than do forward stepwise selection and L1-regularized estimation, especially in low-noise situations. |
format | Article |
id | doaj.art-d10898f258bb4ff88c6cb6761b7cb3d9 |
institution | Directory Open Access Journal |
issn | 1932-6203 |
language | English |
publishDate | 2021-01-01 |
publisher | Public Library of Science (PLoS) |
series | PLoS ONE |
citation | PLoS ONE 16(4): e0249916 (2021). doi:10.1371/journal.pone.0249916
title | Sparse Poisson regression via mixed-integer optimization. |
url | https://doi.org/10.1371/journal.pone.0249916 |