Hybrid models with deep and invertible features
We propose a neural hybrid model consisting of a linear model defined on a set of features computed by a deep, invertible transformation (i.e. a normalizing flow). An attractive property of our model is that both p(features), the density of the features, and p(targets|features), the predictive distribution, can be computed exactly in a single feed-forward pass. We show that our hybrid model, despite the invertibility constraints, achieves similar accuracy to purely predictive models. Yet the generative component remains a good model of the input features despite the hybrid optimization objective. This offers additional capabilities such as detection of out-of-distribution inputs and enabling semi-supervised learning. The availability of the exact joint density p(targets, features) also allows us to compute many quantities readily, making our hybrid model a useful building block for downstream applications of probabilistic deep learning.
Main Authors: | Nalisnick, E, Matsukawa, A, Teh, Y, Gorur, D, Lakshminarayanan, B |
---|---|
Format: | Conference item |
Published: | Proceedings of Machine Learning Research, 2019 |
---|---|
author | Nalisnick, E Matsukawa, A Teh, Y Gorur, D Lakshminarayanan, B |
collection | OXFORD |
description | We propose a neural hybrid model consisting of a linear model defined on a set of features computed by a deep, invertible transformation (i.e. a normalizing flow). An attractive property of our model is that both p(features), the density of the features, and p(targets|features), the predictive distribution, can be computed exactly in a single feed-forward pass. We show that our hybrid model, despite the invertibility constraints, achieves similar accuracy to purely predictive models. Yet the generative component remains a good model of the input features despite the hybrid optimization objective. This offers additional capabilities such as detection of out-of-distribution inputs and enabling semi-supervised learning. The availability of the exact joint density p(targets, features) also allows us to compute many quantities readily, making our hybrid model a useful building block for downstream applications of probabilistic deep learning. |
id | oxford-uuid:9fd08712-5556-4a22-a872-9654c9b06556 |
institution | University of Oxford |
title | Hybrid models with deep and invertible features |
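The abstract describes computing both p(features), via the change-of-variables formula for an invertible flow, and p(targets|features), via a linear head, in a single feed-forward pass. The following is a minimal illustrative sketch of that idea (not the authors' implementation), using a toy elementwise affine flow and a logistic-regression head; all parameter values here are arbitrary placeholders:

```python
# Toy hybrid model: invertible affine flow + linear predictive head.
# One forward pass yields log p(x) and log p(y | x), hence the exact joint.
import numpy as np

rng = np.random.default_rng(0)

# Invertible feature map: z = a * x + b (elementwise affine flow).
a = np.array([1.5, 0.8])      # nonzero scales keep the map invertible
b = np.array([0.2, -0.1])

def flow_forward(x):
    """Map inputs to features; return features and log |det Jacobian|."""
    z = a * x + b
    log_det = np.sum(np.log(np.abs(a)))
    return z, log_det

def log_p_features(z):
    """Standard-normal base density on the features."""
    return -0.5 * np.sum(z ** 2) - 0.5 * z.size * np.log(2 * np.pi)

# Linear (logistic) predictive head on the features: p(y | z).
w = np.array([0.7, -1.2])
c = 0.3

def log_p_target(y, z):
    """Bernoulli log-likelihood of binary target y given features z."""
    logits = w @ z + c
    log_sig = -np.logaddexp(0.0, -logits)       # log sigmoid(logits)
    log_one_minus = -np.logaddexp(0.0, logits)  # log(1 - sigmoid(logits))
    return y * log_sig + (1 - y) * log_one_minus

# Single feed-forward pass: exact log p(x) and log p(y, x).
x = rng.normal(size=2)
z, log_det = flow_forward(x)
log_px = log_p_features(z) + log_det       # change of variables
log_joint = log_px + log_p_target(1, z)    # log p(y=1, x)
print(log_px, log_joint)
```

Because log p(x) is exact, thresholding it gives the out-of-distribution detection capability the abstract mentions, while log p(y, x) supports the semi-supervised and downstream uses.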