Learning and optimization in the face of data perturbations

Thesis: Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, May 2020

Bibliographic Details
Main Author: Staib, Matthew James.
Other Authors: Stefanie Jegelka.
Format: Thesis
Language: English
Published: Massachusetts Institute of Technology, 2020
Subjects: Electrical Engineering and Computer Science
Online Access: https://hdl.handle.net/1721.1/127004

Notes
Cataloged from the official PDF of the thesis. Includes bibliographical references (pages 145-163). 241 pages, application/pdf.
MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy: http://dspace.mit.edu/handle/1721.1/7582

Abstract
Many problems in the machine learning pipeline boil down to maximizing the expectation of a function over a distribution. This is the classic problem of stochastic optimization. There are two key challenges in solving such problems: 1) the function is often non-convex, making optimization difficult; 2) the distribution is not known exactly, but may be perturbed adversarially or otherwise obscured. Each issue is individually challenging enough to warrant a substantial body of work of its own, but addressing both simultaneously remains difficult. This thesis addresses problems at the intersection of non-convexity and data perturbations, along two dual lines of inquiry: first, we build perturbation-aware algorithms with guarantees for non-convex problems; second, we seek to understand how data perturbations can be leveraged to enhance non-convex optimization algorithms. Along the way, we study new types of data perturbations and their connection to generalization.