Summary: | The generalized convex nearly isotonic regression problem addresses a least squares regression model that incorporates both sparsity and monotonicity constraints on the regression coefficients. In this paper, we introduce an efficient semismooth Newton-based augmented Lagrangian (Ssnal) algorithm for solving this problem. We prove that, under mild assumptions, the Ssnal algorithm converges globally and attains a linear convergence rate. Computationally, we derive the generalized Jacobian of the proximal mapping of the generalized convex nearly isotonic regression regularizer and exploit the second-order sparsity when applying the semismooth Newton method to the subproblems arising in the Ssnal algorithm. Numerical experiments on both synthetic and real datasets demonstrate that our algorithm significantly outperforms first-order methods in both efficiency and robustness.
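For orientation, a plausible form of the underlying optimization problem is sketched below. This is an assumption on our part, modeled on the standard nearly isotonic regression penalty of Tibshirani, Hoefling, and Tibshirani (2011) augmented with an $\ell_1$ sparsity term; the paper's exact formulation may differ:

$$
\min_{x \in \mathbb{R}^n} \; \frac{1}{2}\|Ax - b\|_2^2 \;+\; \lambda_1 \|x\|_1 \;+\; \lambda_2 \sum_{i=1}^{n-1} \max(x_i - x_{i+1},\, 0),
$$

where $A$ is the design matrix, $b$ the response vector, $\lambda_1 \ge 0$ induces sparsity in the coefficients, and $\lambda_2 \ge 0$ penalizes violations of a nondecreasing ordering, relaxing the hard isotonic constraint $x_1 \le x_2 \le \cdots \le x_n$ into a penalty.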