Representation and transfer learning using information-theoretic approximations

Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, May, 2020

Bibliographic Details
Main Author: Qiu, David.
Other Authors: Lizhong Zheng.
Format: Thesis
Language: eng
Published: Massachusetts Institute of Technology, 2020
Subjects: Electrical Engineering and Computer Science
Online Access: https://hdl.handle.net/1721.1/127008
Description
Thesis: Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, May 2020. Cataloged from the official PDF of the thesis. Includes bibliographical references (pages 119-127). 127 pages, application/pdf.

Abstract: Learning informative and transferable feature representations is a key aspect of machine learning systems. Mutual information and Kullback-Leibler divergence are principled and widely used metrics for measuring feature relevance and performing distribution matching, respectively. However, clean formulations of machine learning algorithms based on these information-theoretic quantities typically require density estimation, which can be difficult in high-dimensional problems. A central theme of this thesis is to translate these formulations into simpler forms that are more amenable to limited data. In particular, we modify local approximations and variational approximations of information-theoretic quantities to propose algorithms for unsupervised and transfer learning. Experiments show that the representations learned by our algorithms perform competitively against popular methods of higher complexity.

Rights: MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, available at http://dspace.mit.edu/handle/1721.1/7582.
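The abstract's mention of variational approximations of information-theoretic quantities can be illustrated with the classical Donsker-Varadhan lower bound, KL(P‖Q) = sup_f E_P[f] − log E_Q[e^f], which replaces density estimation with an optimization over critic functions evaluated on samples. A minimal sketch, not the thesis's actual algorithm: the 1-D Gaussian toy data and the linear critic family are assumptions made here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (illustrative assumption): P = N(1, 1), Q = N(0, 1).
# The true divergence is KL(P || Q) = mu^2 / 2 = 0.5.
mu = 1.0
xp = rng.normal(mu, 1.0, 100_000)   # samples from P
xq = rng.normal(0.0, 1.0, 100_000)  # samples from Q

def dv_bound(a):
    """Donsker-Varadhan lower bound using a linear critic f(x) = a * x,
    estimated purely from samples -- no density estimation needed."""
    return a * xp.mean() - np.log(np.exp(a * xq).mean())

# Grid-search the critic slope; every slope yields a valid lower bound
# on KL(P || Q), so the maximum is the tightest estimate in this family.
slopes = np.linspace(0.0, 2.0, 201)
best = max(dv_bound(a) for a in slopes)
print(f"variational KL estimate: {best:.3f}")
```

For this Gaussian pair the optimal critic happens to be linear, so the simple family above is enough to approach the true value of 0.5; in high dimensions the critic is typically a neural network trained by gradient ascent on the same bound.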