A new perspective on boosting in linear regression via subgradient optimization and relatives
We analyze boosting algorithms [Ann. Statist. 29 (2001) 1189–1232; Ann. Statist. 28 (2000) 337–407; Ann. Statist. 32 (2004) 407–499] in linear regression from a new perspective: that of modern first-order methods in convex optimization. We show that classic boosting algorithms in linear regression,...
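One of the classic boosting algorithms the abstract refers to is incremental forward stagewise regression (FS_ε), whose connection to subgradient-style first-order steps can be sketched as follows. This is an illustrative sketch only, not the paper's code: the function name, step size, and toy data are assumptions for demonstration.

```python
import numpy as np

def forward_stagewise(X, y, eps=0.01, n_steps=2000):
    """Incremental forward stagewise regression (FS_eps).

    Each iteration moves the coefficient of the predictor most
    correlated with the current residual by a small fixed amount eps,
    a coordinate-wise (sub)gradient-like step on the least-squares loss.
    """
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                       # current residual
    for _ in range(n_steps):
        corr = X.T @ r                 # correlations with the residual
        j = np.argmax(np.abs(corr))    # most correlated predictor
        delta = eps * np.sign(corr[j])
        beta[j] += delta               # tiny step on coordinate j
        r -= delta * X[:, j]           # update the residual
    return beta

# toy usage: unit-norm predictors, sparse true coefficients
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
X /= np.linalg.norm(X, axis=0)
y = X @ np.array([2.0, 0.0, -1.5, 0.0, 0.0]) + 0.05 * rng.standard_normal(100)
beta = forward_stagewise(X, y)
```

The small fixed step size eps is what gives boosting its implicit regularization: stopping after few steps yields shrunken, sparse coefficients, while many steps approach the least-squares fit.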
Main Authors: Freund, Robert M.; Grigas, Paul; Mazumder, Rahul
Other Authors: Massachusetts Institute of Technology. Operations Research Center
Format: Article
Published: Institute of Mathematical Statistics, 2018
Online Access: http://hdl.handle.net/1721.1/115300 https://orcid.org/0000-0002-1733-5363 https://orcid.org/0000-0002-5617-1058
Similar Items
- AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods
  by: Freund, Robert M., et al.
  Published: (2014)
- An Extended Frank–Wolfe Method with “In-Face” Directions, and Its Application to Low-Rank Matrix Completion
  by: Grigas, Paul, et al.
  Published: (2018)
- New analysis and results for the Frank–Wolfe method
  by: Freund, Robert Michael, et al.
  Published: (2016)
- New Analysis and Results for the Conditional Gradient Method
  by: Freund, Robert M., et al.
  Published: (2013)
- Methods for convex optimization and statistical learning
  by: Grigas, Paul (Paul Edward)
  Published: (2017)