Provable variational inference for constrained log-submodular models

© 2018 Curran Associates, Inc. All rights reserved. Submodular maximization problems appear in several areas of machine learning and data science, as many useful modelling concepts, such as diversity and coverage, satisfy this natural diminishing-returns property. Because the data defining these functions, as well as the decisions made with the computed solutions, are subject to statistical noise and randomness, it is arguably necessary to go beyond computing a single approximate optimum and to quantify its inherent uncertainty. To this end, we define a rich class of probabilistic models associated with constrained submodular maximization problems. These capture log-submodular dependencies of arbitrary order between the variables, but also satisfy hard combinatorial constraints: the variables are assumed to take on one of a (possibly exponentially large) set of states, which form the bases of a matroid. To perform inference in these models we design novel variational inference algorithms, which carefully leverage the combinatorial and probabilistic properties of these objects. In addition to providing completely tractable and well-understood variational approximations, our approach results in the minimization of a convex upper bound on the log-partition function. The bound can be efficiently evaluated using greedy algorithms and optimized using any first-order method. Moreover, for the case of facility location and weighted coverage functions, we prove the first constant-factor guarantee in this setting: an efficiently certifiable e/(e − 1) approximation of the log-partition function. Finally, we empirically demonstrate the effectiveness of our approach on several instances.
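To make the abstract's two key notions concrete, here is a minimal illustrative sketch (not code from the paper): a weighted coverage function, which is submodular, and the classical greedy maximizer under a cardinality constraint (the simplest matroid constraint). The example data and function names are hypothetical.

```python
# Illustrative sketch, not from the paper: F(S) = total weight of ground
# elements covered by the sets indexed by S. Such weighted coverage
# functions are monotone submodular.

def coverage(S, sets, weights):
    """F(S): total weight of ground elements covered by the chosen sets."""
    covered = set().union(*(sets[i] for i in S)) if S else set()
    return sum(weights[e] for e in covered)

# Hypothetical example data.
sets = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"c", "d"}}
weights = {"a": 1.0, "b": 2.0, "c": 1.5, "d": 0.5}

# Diminishing returns: the marginal gain of adding set 1 is no larger
# in a bigger context ({2}) than in the empty one.
gain_empty = coverage({1}, sets, weights) - coverage(set(), sets, weights)
gain_after = coverage({1, 2}, sets, weights) - coverage({2}, sets, weights)
assert gain_after <= gain_empty

def greedy(k, sets, weights):
    """Greedy maximization of F under |S| <= k (a uniform matroid);
    classically a (1 - 1/e)-approximation for monotone submodular F."""
    S = set()
    for _ in range(k):
        best = max(set(sets) - S,
                   key=lambda i: coverage(S | {i}, sets, weights))
        S.add(best)
    return S
```

The paper's variational bound on the log-partition function is evaluated with greedy routines of this flavor; the e/(e − 1) guarantee it proves for facility location and weighted coverage is the inference-side counterpart of the (1 − 1/e) maximization guarantee sketched above.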


Bibliographic Details
Main Authors: Djolonga, J., Jegelka, S., Krause, A.
Format: Article (Conference Paper)
Language: English
Published: 2021
Institution: Massachusetts Institute of Technology
Online Access: https://hdl.handle.net/1721.1/132303
Published in: Advances in Neural Information Processing Systems. Neural Information Processing Systems (NIPS).
Rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.