Advancing deep active learning & data subset selection: unifying principles with information-theory intuitions


Bibliographic Details
Main Author: Kirsch, A
Other Authors: Gal, Y
Format: Thesis
Language: English
Published: 2023
Subjects: Deep learning (Machine learning); Uncertainty (Information theory); Machine learning; Data reduction
description <p>At its core, this thesis aims to enhance the practicality of deep learning by improving the label and training efficiency of deep learning models. To this end, we investigate data subset selection techniques, specifically active learning and active sampling, grounded in information-theoretic principles. Active learning improves label efficiency, while active sampling enhances training efficiency.</p> <p>Supervised deep learning models often require extensive training with labeled data. Label acquisition can be expensive and time-consuming, and training large models is resource-intensive, hindering their adoption outside academic research and "big tech."</p> <p>Existing methods for data subset selection in deep learning often rely on heuristics or lack a principled information-theoretic foundation. In contrast, this thesis examines several objectives for data subset selection and their applications within deep learning, striving for a more principled approach inspired by information theory.</p> <p>We begin by disentangling epistemic and aleatoric uncertainty in single-forward-pass deep neural networks, which provides helpful intuitions and insights into different forms of uncertainty and their relevance for data subset selection. We then propose and investigate various approaches for active learning and data subset selection in (Bayesian) deep learning. Finally, we relate various existing and proposed approaches to approximations of information quantities in weight or prediction space.</p> <p>Underpinning this work is a principled and practical notation for information-theoretic quantities that includes both random variables and observed outcomes. This thesis demonstrates the benefits of working from a unified perspective and highlights the potential impact of our contributions to the practical application of deep learning.</p>
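The information-theoretic acquisition objectives the abstract alludes to can be illustrated with a small sketch. One widely used score of this kind is the BALD mutual information between a candidate point's label and the model parameters, estimated as predictive entropy minus expected entropy over stochastic forward passes (e.g. MC dropout). This is a generic illustration assuming class probabilities are available from several sampled models; it is not code from the thesis itself, and `bald_scores` is a hypothetical helper name.

```python
import numpy as np

def bald_scores(probs, eps=1e-12):
    """Estimate the BALD mutual information I[y; theta | x] per pool point.

    probs: array of shape (S, N, C) — class probabilities from S stochastic
    forward passes (sampled models) over N candidate points with C classes.
    """
    mean = probs.mean(axis=0)                                          # (N, C) posterior-predictive
    predictive_entropy = -(mean * np.log(mean + eps)).sum(axis=-1)     # H[y | x, D]
    expected_entropy = -(probs * np.log(probs + eps)).sum(axis=-1).mean(axis=0)  # E_theta H[y | x, theta]
    # High score = sampled models are individually confident but disagree,
    # i.e. epistemic (reducible) uncertainty — the points worth labeling.
    return predictive_entropy - expected_entropy

# Example: two sampled models that disagree confidently on one point,
# versus two that agree — only the first point has high epistemic uncertainty.
probs = np.array([
    [[1.0, 0.0], [1.0, 0.0]],   # model sample 1
    [[0.0, 1.0], [1.0, 0.0]],   # model sample 2
])
scores = bald_scores(probs)      # first score ≈ log 2, second ≈ 0
```

Scoring candidates this way and labeling the top-k is the simplest acquisition loop; batch-aware variants instead score whole subsets jointly to avoid selecting redundant near-duplicates.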
id oxford-uuid:3799959f-1f39-4ae5-8254-9d7e54810099
institution University of Oxford