Essays on Algorithmic Learning and Uncertainty Quantification

Bibliographic Details
Main Author: Vijaykumar, Suhas
Other Authors: Chernozhukov, Victor
Format: Thesis
Published: Massachusetts Institute of Technology, 2023
Online Access: https://hdl.handle.net/1721.1/151512
ORCID: https://orcid.org/0000-0001-8383-5617
Description
Summary: The thesis consists of three essays. The first, titled “Localization, Convexity, and Star Aggregation,” develops new analytical tools based on the offset Rademacher complexity for studying stochastic optimization in non-convex domains, including statistical prediction and model aggregation problems. Using these tools, I show that a simple procedure called the star algorithm can recover near-optimal convergence rates for non-parametric logistic regression in non-convex models. The second essay, titled “Kernel Ridge Regression Inference,” introduces a new technique for deriving sharp, non-asymptotic, uniform Gaussian approximations for partial sums in a reproducing kernel Hilbert space, which is then applied to construct uniform confidence bands for the widely used kernel ridge regression algorithm. The third and final essay, titled “Frank-Wolfe Meets Metric Entropy,” uses ideas from asymptotic geometry to derive new dimension-dependent and domain-specific lower bounds for conditional gradient algorithms, a class of optimization procedures that includes the popular Frank-Wolfe algorithm and many of its variants. Such algorithms have found extensive use in machine learning and high-dimensional statistics, motivating a more thorough analysis of their limitations in high-dimensional problems.
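
The star algorithm named in the first essay is, in its classical two-step form (often attributed to Audibert), simple to state: run empirical risk minimization over the model class on one half of the sample, then minimize again over the star hull of the class around the first-stage minimizer. Below is a minimal Python sketch of that classical procedure, using squared loss over a finite class for concreteness; the essay itself studies logistic regression, and the grid of mixing weights here is purely illustrative.

    import numpy as np

    def star_algorithm(F, X1, y1, X2, y2, alphas=np.linspace(0.0, 1.0, 101)):
        """Classical two-step star procedure over a finite class F of
        prediction functions, sketched with squared loss (illustrative)."""
        def risk(f, X, y):
            return np.mean((f(X) - y) ** 2)

        # Step 1: ERM over the (possibly non-convex) class F on sample half 1.
        f_hat = min(F, key=lambda f: risk(f, X1, y1))

        # Step 2: minimize on sample half 2 over the star hull of F around
        # f_hat, i.e. all mixtures a * f_hat + (1 - a) * g with g in F.
        best, best_risk = f_hat, risk(f_hat, X2, y2)
        for g in F:
            for a in alphas:
                h = lambda X, a=a, g=g: a * f_hat(X) + (1.0 - a) * g(X)
                r = risk(h, X2, y2)
                if r < best_risk:
                    best, best_risk = h, r
        return best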
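
Kernel ridge regression, the object of the second essay's confidence bands, has a well-known closed-form solution evaluating as f_hat(x) = k(x)^T (K + n*lam*I)^{-1} y, where K is the kernel Gram matrix. A minimal numpy sketch of the point estimator follows, assuming an RBF kernel and illustrative hyperparameters lam and gamma; the essay's Gaussian approximation and band construction are not reproduced here.

    import numpy as np

    def rbf_kernel(A, B, gamma=1.0):
        # Gram matrix K[i, j] = exp(-gamma * ||A[i] - B[j]||^2).
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * sq)

    def krr_fit_predict(X, y, X_test, lam=1e-2, gamma=1.0):
        """Kernel ridge regression point estimate: minimize
        (1/n) * sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2, whose solution
        evaluates as f_hat(x) = k(x)^T (K + n*lam*I)^{-1} y."""
        n = len(y)
        K = rbf_kernel(X, X, gamma)
        alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
        return rbf_kernel(X_test, X, gamma) @ alpha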
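
Finally, the conditional gradient methods analyzed in the third essay avoid projection steps by calling a linear minimization oracle over the constraint set at each iteration. The sketch below shows the classical Frank-Wolfe iteration with the standard step size 2/(t+2) over an l1 ball, whose oracle returns a signed, scaled coordinate vector; it illustrates the algorithm class, not the essay's lower-bound constructions.

    import numpy as np

    def frank_wolfe_l1(grad, x0, radius=1.0, steps=200):
        """Classical Frank-Wolfe iteration on the l1 ball of given radius:
        s_t = argmin_{||s||_1 <= radius} <grad(x_t), s>,
        x_{t+1} = x_t + (2 / (t + 2)) * (s_t - x_t)."""
        x = np.array(x0, dtype=float)
        for t in range(steps):
            g = grad(x)
            i = np.argmax(np.abs(g))          # oracle picks the steepest coordinate
            s = np.zeros_like(x)
            s[i] = -radius * np.sign(g[i])    # signed, scaled basis vector
            x += (2.0 / (t + 2)) * (s - x)    # convex-combination update
        return x

    # Example: minimize ||x - b||^2 over the unit l1 ball.
    b = np.array([0.3, -0.2, 0.9])
    x_star = frank_wolfe_l1(lambda x: 2.0 * (x - b), np.zeros(3))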