Probabilistic Models with Deep Neural Networks

Recent advances in statistical inference have significantly expanded the toolbox of probabilistic modeling. Historically, probabilistic modeling has been constrained to very restricted model classes, where exact or approximate probabilistic inference is feasible. However, developments in variational inference, a general form of approximate probabilistic inference that originated in statistical physics, have enabled probabilistic modeling to overcome these limitations: (i) Approximate probabilistic inference is now possible over a broad class of probabilistic models containing a large number of parameters, and (ii) scalable inference methods based on stochastic gradient descent and distributed computing engines allow probabilistic modeling to be applied to massive data sets. One important practical consequence of these advances is the possibility of including deep neural networks within probabilistic models, thereby capturing complex non-linear stochastic relationships between the random variables. These advances, in conjunction with the release of novel probabilistic modeling toolboxes, have greatly expanded the scope of applications of probabilistic models, and allowed the models to take advantage of the recent strides made by the deep learning community. In this paper, we provide an overview of the main concepts, methods, and tools needed to use deep neural networks within a probabilistic modeling framework.
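The abstract refers to embedding deep neural networks inside probabilistic models and fitting them with scalable, stochastic-gradient variational inference. The sketch below is not code from the article; it is a minimal illustration, written with the Pyro probabilistic programming toolbox (one example of the kind of toolbox the paper surveys), of a latent variable model whose likelihood is parameterized by a neural network. The model structure, dimensions, learning rate, and toy data are assumptions chosen purely for illustration.

```python
import torch
import torch.nn as nn
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoNormal
from pyro.optim import Adam

# Neural network mapping a 2-dimensional latent code to the mean of a
# 5-dimensional observation (dimensions are arbitrary for this sketch).
decoder = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 5))

def model(x):
    # Register the network's weights so SVI optimizes them jointly with
    # the variational parameters.
    pyro.module("decoder", decoder)
    with pyro.plate("data", x.shape[0]):
        # Standard-normal prior over a per-datapoint latent variable z.
        z = pyro.sample("z", dist.Normal(torch.zeros(2), torch.ones(2)).to_event(1))
        # The network induces a non-linear stochastic relationship z -> x.
        loc = decoder(z)
        pyro.sample("obs", dist.Normal(loc, 0.1).to_event(1), obs=x)

guide = AutoNormal(model)  # mean-field Gaussian variational approximation
svi = SVI(model, guide, Adam({"lr": 1e-2}), loss=Trace_ELBO())

x = torch.randn(100, 5)    # toy data standing in for a real data set
for step in range(2000):
    loss = svi.step(x)     # one stochastic-gradient step on the ELBO
```

Here the AutoNormal guide plays the role of the variational approximation, and each call to svi.step performs the kind of stochastic-gradient update of the evidence lower bound that the abstract describes.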

Bibliographic Details
Main Authors: Andrés R. Masegosa (Department of Mathematics, Center for the Development and Transfer of Mathematical Research to Industry (CDTIME), University of Almería, 04120 Almería, Spain); Rafael Cabañas (Istituto Dalle Molle di Studi sull’Intelligenza Artificiale (IDSIA), CH-6962 Lugano, Switzerland); Helge Langseth (Department of Computer Science, Norwegian University of Science and Technology, NO-7491 Trondheim, Norway); Thomas D. Nielsen (Department of Computer Science, Aalborg University, DK-9220 Aalborg, Denmark); Antonio Salmerón (Department of Mathematics, Center for the Development and Transfer of Mathematical Research to Industry (CDTIME), University of Almería, 04120 Almería, Spain)
Format: Article
Language: English
Published: MDPI AG, 2021-01-01
Series: Entropy, Vol. 23, No. 1, Article 117
ISSN: 1099-4300
DOI: 10.3390/e23010117
Subjects: deep probabilistic modeling; variational inference; neural networks; latent variable models; Bayesian learning
Online Access: https://www.mdpi.com/1099-4300/23/1/117