New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
Motivated by recent work of Renegar, we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem $f^* = \min_{x \in Q} f(x)$, where we...
Main Authors: Freund, Robert Michael; Lu, Haihao
Other Authors: Massachusetts Institute of Technology. Department of Mathematics; Sloan School of Management
Format: Article
Language: English
Published: Springer Berlin Heidelberg, 2018
Online Access: http://hdl.handle.net/1721.1/116877 ; https://orcid.org/0000-0002-1733-5363 ; https://orcid.org/0000-0002-5217-1894
Description: Motivated by recent work of Renegar, we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem $f^* = \min_{x \in Q} f(x)$, where we presume knowledge of a strict lower bound $f_{\mathrm{slb}} < f^*$. [Indeed, $f_{\mathrm{slb}}$ is naturally known when optimizing many loss functions in statistics and machine learning (least-squares, logistic loss, exponential loss, total variation loss, etc.) as well as in Renegar's transformed version of the standard conic optimization problem; in all these cases one has $f_{\mathrm{slb}} = 0 < f^*$.] We introduce a new functional measure called the growth constant $G$ for $f(\cdot)$, which measures how quickly the level sets of $f(\cdot)$ grow relative to the function value and which plays a fundamental role in the complexity analysis. When $f(\cdot)$ is non-smooth, we present new computational guarantees for the Subgradient Descent Method and for smoothing methods that can improve existing computational guarantees in several ways, most notably when the initial iterate $x^0$ is far from the optimal solution set. When $f(\cdot)$ is smooth, we present a scheme for periodically restarting the Accelerated Gradient Method that can also improve existing computational guarantees when $x^0$ is far from the optimal solution set, and in the presence of added structure we present a scheme using parametrically increased smoothing that further improves the associated computational guarantees.
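The description above centers on knowing a strict lower bound $f_{\mathrm{slb}} < f^*$. Purely as an illustration of how such a bound can be used algorithmically, the following is a minimal sketch of projected subgradient descent with a Polyak-type step size in which the known $f_{\mathrm{slb}}$ stands in for the unknown optimal value $f^*$. The function names, the toy least-absolute-deviations problem, and the step-size rule are illustrative assumptions only; they are not the specific methods, growth-constant analysis, or guarantees developed in the article.

```python
import numpy as np

def projected_subgradient_descent(f, subgrad, project, x0, f_slb, max_iters=1000):
    """Projected subgradient descent for min_{x in Q} f(x).

    Uses a Polyak-type step size with the known strict lower bound f_slb in
    place of the unknown optimal value f*.  This is a standard construction
    shown only to illustrate how knowledge of f_slb can drive the step-size
    choice; it is not necessarily the rule analyzed in the paper.
    """
    x = project(np.asarray(x0, dtype=float))
    best_x, best_val = x.copy(), f(x)
    for _ in range(max_iters):
        g = subgrad(x)                      # any subgradient of f at x
        gnorm2 = float(np.dot(g, g))
        if gnorm2 == 0.0:                   # zero subgradient: x is optimal
            break
        step = (f(x) - f_slb) / gnorm2      # Polyak-type step using f_slb
        x = project(x - step * g)           # projected subgradient step onto Q
        val = f(x)
        if val < best_val:                  # keep the best iterate found so far
            best_x, best_val = x.copy(), val
    return best_x, best_val

# Example (illustrative): minimize ||Ax - b||_1 over the unit Euclidean ball,
# a loss for which f_slb = 0 is a natural lower bound.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    f = lambda x: np.abs(A @ x - b).sum()
    subgrad = lambda x: A.T @ np.sign(A @ x - b)
    project = lambda x: x / max(1.0, np.linalg.norm(x))
    x_hat, val = projected_subgradient_descent(f, subgrad, project,
                                               np.zeros(5), f_slb=0.0)
    print(val)
```

Because $f_{\mathrm{slb}}$ is computable while $f^*$ is not, the step size above is always available; when $f_{\mathrm{slb}} = f^*$ it reduces to the classical Polyak step.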
Citation: Freund, Robert M., and Haihao Lu. “New Computational Guarantees for Solving Convex Optimization Problems with First Order Methods, via a Function Growth Condition Measure.” Mathematical Programming, vol. 170, no. 2, Aug. 2018, pp. 445–477.
ISSN: 0025-5610; 1436-4646
DOI: https://doi.org/10.1007/s10107-017-1164-1
Rights: Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/); Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society