Testing, Learning, and Optimization in High Dimensions
In this thesis we study two separate problems: (1) what is the sample complexity of testing the class of Determinantal Point Processes, and (2) how can optimization and generalization of deep neural networks be analyzed beyond their linear approximation? For the first problem, we characterize the optimal sample complexity up to logarithmic factors by proposing nearly matching upper and lower bounds. For the second problem, we propose a new regime for the parameters and the algorithm of a three-layer network model that goes beyond the Neural Tangent Kernel (NTK) approximation; as a result, we introduce a new data-dependent complexity measure that generalizes the NTK complexity measure introduced by [Arora et al., 2019a]. We show that, despite nonconvexity, a variant of stochastic gradient descent (SGD) converges to a good solution for which we prove a novel generalization bound proportional to our complexity measure.
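As background for the first problem (this sketch is not from the thesis itself): an L-ensemble Determinantal Point Process over a ground set of n items assigns each subset S the probability P(S) = det(L_S) / det(L + I), where L_S is the principal submatrix of a positive semidefinite kernel L indexed by S. A minimal sketch enumerating these probabilities for a toy 2×2 kernel:

```python
import itertools
import numpy as np

def dpp_probabilities(L):
    """Probability of every subset S under an L-ensemble DPP:
    P(S) = det(L_S) / det(L + I)."""
    n = L.shape[0]
    Z = np.linalg.det(L + np.eye(n))  # normalization constant
    probs = {}
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            # det of the principal submatrix; the empty subset has det 1
            sub = L[np.ix_(S, S)]
            probs[S] = (np.linalg.det(sub) if S else 1.0) / Z
    return probs

# Toy kernel: diagonal entries are item "qualities", off-diagonals encode similarity
L = np.array([[2.0, 0.5],
              [0.5, 1.0]])
p = dpp_probabilities(L)
```

The probabilities sum to one by construction, and similar items (large off-diagonal entries) are less likely to co-occur, which is the diversity-promoting behavior that makes testing and learning DPPs interesting.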
Main Author: | Gatmiry, Khashayar |
---|---|
Other Authors: | Stefanie Jegelka |
Format: | Thesis |
Published: | Massachusetts Institute of Technology, 2022 |
Online Access: | https://hdl.handle.net/1721.1/144927 |
---|---|
author | Gatmiry, Khashayar |
author2 | Stefanie Jegelka |
collection | MIT |
description | In this thesis we study two separate problems: (1) what is the sample complexity of testing the class of Determinantal Point Processes, and (2) how can optimization and generalization of deep neural networks be analyzed beyond their linear approximation? For the first problem, we characterize the optimal sample complexity up to logarithmic factors by proposing nearly matching upper and lower bounds. For the second problem, we propose a new regime for the parameters and the algorithm of a three-layer network model that goes beyond the Neural Tangent Kernel (NTK) approximation; as a result, we introduce a new data-dependent complexity measure that generalizes the NTK complexity measure introduced by [Arora et al., 2019a]. We show that, despite nonconvexity, a variant of stochastic gradient descent (SGD) converges to a good solution for which we prove a novel generalization bound proportional to our complexity measure. |
format | Thesis |
id | mit-1721.1/144927 |
institution | Massachusetts Institute of Technology |
publishDate | 2022 |
publisher | Massachusetts Institute of Technology |
record_format | dspace |
spelling | mit-1721.1/144927 2022-08-30T03:52:03Z Testing, Learning, and Optimization in High Dimensions Gatmiry, Khashayar Stefanie Jegelka Kelner, Jonathan Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science S.M. 2022-08-29T16:21:31Z 2022-08-29T16:21:31Z 2022-05 2022-06-21T19:25:24.953Z Thesis https://hdl.handle.net/1721.1/144927 In Copyright - Educational Use Permitted Copyright MIT http://rightsstatements.org/page/InC-EDU/1.0/ application/pdf Massachusetts Institute of Technology |
title | Testing, Learning, and Optimization in High Dimensions |
url | https://hdl.handle.net/1721.1/144927 |