An Efficient Learning Procedure for Deep Boltzmann Machines

We present a new learning algorithm for Boltzmann Machines that contain many layers of hidden variables. Data-dependent statistics are estimated using a variational approximation that tends to focus on a single mode, and data-independent statistics are estimated using persistent Markov chains. The use of two quite different techniques for estimating the two types of statistic that enter into the gradient of the log likelihood makes it practical to learn Boltzmann Machines with multiple hidden layers and millions of parameters. The learning can be made more efficient by using a layer-by-layer "pre-training" phase that initializes the weights sensibly. The pre-training also allows the variational inference to be initialized sensibly with a single bottom-up pass. We present results on the MNIST and NORB datasets showing that Deep Boltzmann Machines learn very good generative models of hand-written digits and 3-D objects. We also show that the features discovered by Deep Boltzmann Machines are a very effective way to initialize the hidden layers of feed-forward neural nets which are then discriminatively fine-tuned.
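The abstract describes a two-part estimate of the log-likelihood gradient: a data-dependent term obtained by variational (mean-field) inference with the data clamped, and a data-independent term obtained from persistent Markov chains. The sketch below is a minimal illustration of that idea, not the authors' code: one parameter update for a two-hidden-layer binary DBM, with biases, pre-training, and learning-rate schedules omitted; layer sizes, iteration counts, and the learning rate are illustrative assumptions.

# Minimal sketch (assumed, simplified): one stochastic gradient step for a
# two-hidden-layer binary Deep Boltzmann Machine.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dbm_update(v_data, W1, W2, chain_v, chain_h2, rng,
               n_meanfield=10, n_gibbs=5, lr=1e-3):
    # W1: (n_visible, n_hidden1), W2: (n_hidden1, n_hidden2)
    # v_data: (N, n_visible) minibatch; chain_v, chain_h2: persistent fantasy particles.
    N, M = len(v_data), len(chain_v)

    # Data-dependent statistics: mean-field (variational) inference with v clamped.
    mu2 = np.full((N, W2.shape[1]), 0.5)
    for _ in range(n_meanfield):
        mu1 = sigmoid(v_data @ W1 + mu2 @ W2.T)   # h1 gets bottom-up and top-down input
        mu2 = sigmoid(mu1 @ W2)
    pos_W1 = v_data.T @ mu1 / N
    pos_W2 = mu1.T @ mu2 / N

    # Data-independent statistics: block Gibbs sweeps on persistent Markov chains.
    v, h2 = chain_v, chain_h2
    for _ in range(n_gibbs):
        h1 = (rng.random((M, W1.shape[1])) < sigmoid(v @ W1 + h2 @ W2.T)).astype(float)
        v = (rng.random((M, W1.shape[0])) < sigmoid(h1 @ W1.T)).astype(float)
        h2 = (rng.random((M, W2.shape[1])) < sigmoid(h1 @ W2)).astype(float)
    neg_W1 = v.T @ h1 / M
    neg_W2 = h1.T @ h2 / M

    # Approximate log-likelihood gradient: data-dependent minus data-independent term.
    W1 += lr * (pos_W1 - neg_W1)
    W2 += lr * (pos_W2 - neg_W2)
    return W1, W2, v, h2   # chain states persist across minibatches

A caller would keep chain_v and chain_h2 between minibatches, e.g. initialized once with rng = np.random.default_rng(0) and chain_v = rng.random((100, 784)).round(); the persistence of these chains is what lets a few Gibbs sweeps per update track the model distribution.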

Bibliographic Details
Main Authors: Salakhutdinov, Ruslan; Hinton, Geoffrey
Other Authors: Joshua Tenenbaum
Published: 2010
Subjects: Deep learning; Graphical models; Boltzmann Machines
Online Access: http://hdl.handle.net/1721.1/57474
Institution: Massachusetts Institute of Technology
Report Number: MIT-CSAIL-TR-2010-037
Physical Description: 32 p.
Format: application/pdf
Record ID: mit-1721.1/57474