Bayesian Reasoning with Trained Neural Networks
We showed how to use trained neural networks to perform Bayesian reasoning in order to solve tasks outside their initial scope. Deep generative models provide prior knowledge, and classification/regression networks impose constraints. The tasks at hand were formulated as Bayesian inference problems, which we approximately solved through variational or sampling techniques. The approach built on top of already trained networks, and the addressable questions grew super-exponentially with the number of available networks. In its simplest form, the approach yielded conditional generative models. However, multiple simultaneous constraints constitute elaborate questions. We compared the approach to specifically trained generators, showed how to solve riddles, and demonstrated its compatibility with state-of-the-art architectures.
Main Authors: | Jakob Knollmüller, Torsten A. Enßlin |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-05-01 |
Series: | Entropy |
Subjects: | reasoning; generative models; uncertainty quantification; deep learning |
Online Access: | https://www.mdpi.com/1099-4300/23/6/693 |
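The abstract describes combining a pre-trained generative prior with a constraint network via Bayesian inference. The following is a minimal illustrative sketch (hypothetical, not the authors' code): a linear map stands in for a trained generator, a Gaussian likelihood stands in for a constraint network, and the latent posterior is found by MAP estimation with gradient ascent.

```python
import numpy as np

# Toy stand-ins: G(z) = A @ z plays the role of a trained generative network
# mapping latent z ~ N(0, I) to data space; a Gaussian likelihood around an
# observed value y plays the role of a constraint (classifier/regression) net.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 2))           # "generator" weights (fixed, pre-trained)
y = np.array([1.0, -0.5, 0.3, 2.0])   # observed constraint in data space
sigma2 = 0.5                          # constraint (likelihood) variance

def log_posterior_grad(z):
    # Gradient of log p(z | y) = -0.5||z||^2 - ||y - Az||^2 / (2 sigma2) + const
    return -z + A.T @ (y - A @ z) / sigma2

# MAP estimate by plain gradient ascent on the log-posterior over z.
z = np.zeros(2)
for _ in range(2000):
    z += 0.01 * log_posterior_grad(z)

# For this linear-Gaussian toy the MAP is available in closed form,
# z* = (I + A^T A / sigma2)^{-1} A^T y / sigma2, so we can check convergence.
z_exact = np.linalg.solve(np.eye(2) + A.T @ A / sigma2, A.T @ y / sigma2)
```

In the paper's setting the generator and constraint are nonlinear networks, so the posterior is intractable and is approximated variationally or by sampling; the same objective structure (prior term plus constraint term in latent space) carries over.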
author | Jakob Knollmüller; Torsten A. Enßlin |
collection | DOAJ |
description | We showed how to use trained neural networks to perform Bayesian reasoning in order to solve tasks outside their initial scope. Deep generative models provide prior knowledge, and classification/regression networks impose constraints. The tasks at hand were formulated as Bayesian inference problems, which we approximately solved through variational or sampling techniques. The approach built on top of already trained networks, and the addressable questions grew super-exponentially with the number of available networks. In its simplest form, the approach yielded conditional generative models. However, multiple simultaneous constraints constitute elaborate questions. We compared the approach to specifically trained generators, showed how to solve riddles, and demonstrated its compatibility with state-of-the-art architectures. |
format | Article |
id | doaj.art-cdea0798b9ac428a9654166731a0da4b |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
publishDate | 2021-05-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
citation | Entropy, Vol. 23, No. 6, Article 693 (2021-05-01) |
doi | 10.3390/e23060693 |
affiliations | Jakob Knollmüller: Physics Department, Technical University Munich, Boltzmann-Str. 2, 85748 Garching, Germany; Torsten A. Enßlin: Max Planck Institute for Astrophysics, Karl-Schwarzschild-Str. 1, 85748 Garching, Germany |
title | Bayesian Reasoning with Trained Neural Networks |
topic | reasoning; generative models; uncertainty quantification; deep learning |
url | https://www.mdpi.com/1099-4300/23/6/693 |