Proximal methods for the latent group lasso penalty


Bibliographic Details
Main Authors: Villa, Silvia, Rosasco, Lorenzo Andrea, Mosci, Sofia, Verri, Alessandro
Other Authors: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
Format: Article
Language: English
Published: Springer US 2016
Online Access: http://hdl.handle.net/1721.1/103284
https://orcid.org/0000-0001-6376-4786
Description
Summary: We consider a regularized least squares problem with regularization by structured sparsity-inducing norms, which extend the usual ℓ₁ and group lasso penalties by allowing the subsets to overlap. Such regularizations lead to nonsmooth problems that are difficult to optimize, and in this paper we propose a suitable version of an accelerated proximal method to solve them. We prove convergence of a nested procedure obtained by composing an accelerated proximal method with an inner algorithm for computing the proximity operator. By exploiting the geometrical properties of the penalty, we devise a new active set strategy, thanks to which the inner iteration is relatively fast, thus guaranteeing good computational performance of the overall algorithm. Our approach allows us to deal with high-dimensional problems without pre-processing for dimensionality reduction, leading to better computational and prediction performance than state-of-the-art methods, as shown empirically on both toy and real data.
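To illustrate the kind of accelerated proximal scheme the abstract refers to, the following is a minimal NumPy sketch of FISTA applied to a group lasso least squares problem. It assumes non-overlapping groups, where the proximity operator has a closed form (block soft-thresholding); in the latent/overlapping setting studied in the paper, this prox has no closed form and is instead computed by the inner algorithm with the active set strategy. All function and variable names here are illustrative, not taken from the paper.

```python
import numpy as np

def group_soft_threshold(w, groups, tau):
    # Closed-form prox of tau * sum_g ||w_g|| when the groups
    # do NOT overlap: block soft-thresholding of each group.
    out = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= tau else (1.0 - tau / norm) * w[g]
    return out

def fista_group_lasso(X, y, groups, lam, n_iter=500):
    # Accelerated proximal (FISTA) iteration for
    #   min_w 0.5 * ||X w - y||^2 + lam * sum_g ||w_g||.
    d = X.shape[1]
    L = np.linalg.norm(X, 2) ** 2        # Lipschitz constant of the gradient
    w = np.zeros(d)
    z = w.copy()                         # extrapolated point
    t = 1.0
    for _ in range(n_iter):
        grad = X.T @ (X @ z - y)
        w_new = group_soft_threshold(z - grad / L, groups, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = w_new + ((t - 1.0) / t_new) * (w_new - w)
        w, t = w_new, t_new
    return w
```

A typical usage: with data generated from a vector supported on a single group, the iterate recovers that group and drives the others exactly to zero, which is the structured sparsity effect the penalty is designed for. Replacing the closed-form prox call with an iterative inner solver yields the nested procedure whose convergence the paper analyzes.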