A Semismooth Newton-Based Augmented Lagrangian Algorithm for the Generalized Convex Nearly Isotonic Regression Problem

Bibliographic Details
Main Authors: Yanmei Xu, Lanyu Lin, Yong-Jin Liu
Format: Article
Language: English
Published: MDPI AG 2025-02-01
Series: Mathematics
Subjects:
Online Access: https://www.mdpi.com/2227-7390/13/3/501
Description
Summary: The generalized convex nearly isotonic regression problem addresses a least squares regression model that incorporates both sparsity and monotonicity constraints on the regression coefficients. In this paper, we introduce an efficient semismooth Newton-based augmented Lagrangian (Ssnal) algorithm to solve this problem. We demonstrate that, under reasonable assumptions, the Ssnal algorithm achieves global convergence and exhibits a linear convergence rate. Computationally, we derive the generalized Jacobian of the proximal mapping of the generalized convex nearly isotonic regression regularizer and exploit its second-order sparsity when applying the semismooth Newton method to the subproblems of the Ssnal algorithm. Numerical experiments on both synthetic and real datasets demonstrate that our algorithm significantly outperforms first-order methods in efficiency and robustness.
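For context, nearly isotonic regression is commonly formulated by penalizing positive consecutive differences of the coefficients; a plausible sketch of the generalized convex variant described in the summary, assuming an added ℓ1 sparsity term with tuning parameters λ1, λ2 ≥ 0 (the paper's exact regularizer may differ), is

\[
\min_{x \in \mathbb{R}^{n}} \; \tfrac{1}{2}\|y - x\|_{2}^{2} \;+\; \lambda_{1}\|x\|_{1} \;+\; \lambda_{2}\sum_{i=1}^{n-1}\max(x_{i} - x_{i+1},\, 0),
\]

where y ∈ ℝ^n is the observed response, the ℓ1 term promotes sparsity, and the second penalty discourages violations of monotonicity. Under this reading, the Ssnal algorithm applies an augmented Lagrangian method to a reformulation of such a problem and solves each subproblem with a semismooth Newton method built on the generalized Jacobian of the regularizer's proximal mapping, as stated in the summary above.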
ISSN: 2227-7390