Reversible Jump MCMC Simulated Annealing for Neural Networks
We propose a novel reversible jump Markov chain Monte Carlo (MCMC) simulated annealing algorithm to optimize radial basis function (RBF) networks. This algorithm enables us to maximize the joint posterior distribution of the network parameters and the number of basis functions. It performs a global search in the joint space of the parameters and number of parameters, thereby surmounting the problem of local minima. We also show that by calibrating a Bayesian model, we can obtain the classical AIC, BIC and MDL model selection criteria within a penalized likelihood framework. Finally, we show theoretically and empirically that the algorithm converges to the modes of the full posterior distribution in an efficient way.
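The core idea in the abstract can be illustrated with a toy sketch: treat the number of RBF centres as part of the state, propose "birth" (add a centre) and "death" (remove a centre) moves, and accept them with an annealed Metropolis rule so the chain settles on a mode of a penalized-likelihood cost. This is a minimal illustration only, not the paper's algorithm: the data, kernel width, noise variance, penalty, and cooling schedule are assumptions, and the dimension-matching terms of the true reversible-jump acceptance ratio are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative; the paper's experiments differ).
x = np.linspace(-3, 3, 60)
y = np.sin(2 * x) + 0.1 * rng.standard_normal(x.size)

def rbf_design(x, centres, width=0.5):
    """Gaussian RBF design matrix with a constant (bias) column."""
    if len(centres) == 0:
        return np.ones((x.size, 1))
    phi = np.exp(-((x[:, None] - np.asarray(centres)[None, :]) ** 2)
                 / (2 * width ** 2))
    return np.hstack([np.ones((x.size, 1)), phi])

def cost_fn(centres, penalty=2.0):
    """Penalized least-squares cost: fit error plus a complexity penalty.

    A penalty proportional to the coefficient count mimics the AIC/BIC/MDL
    penalized-likelihood criteria mentioned in the abstract (constants here
    are arbitrary choices for the toy problem).
    """
    phi = rbf_design(x, centres)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    resid = y - phi @ w
    return 0.5 * (resid @ resid) / 0.01 + penalty * phi.shape[1]

# Reversible-jump-style moves: "birth" adds a basis function, "death"
# removes one, so the chain explores model dimension and parameters jointly.
centres = []
cost = cost_fn(centres)
for i in range(2000):
    T = max(0.05, 0.995 ** i)                      # geometric cooling
    proposal = list(centres)
    if len(proposal) == 0 or rng.random() < 0.5:
        proposal.append(rng.uniform(-3, 3))        # birth move
    else:
        proposal.pop(rng.integers(len(proposal)))  # death move
    new_cost = cost_fn(proposal)
    # Annealed Metropolis acceptance (Jacobian/dimension-matching terms of
    # the full reversible-jump ratio are omitted for brevity).
    if np.log(rng.random()) < (cost - new_cost) / T:
        centres, cost = proposal, new_cost

print(len(centres), float(cost))
```

As the temperature `T` falls, uphill moves become rare and the chain concentrates on low-cost configurations, which is how simulated annealing turns posterior sampling into mode finding.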
Main Authors: | Andrieu, C; de Freitas, N; Doucet, A |
---|---|
Format: | Conference item |
Published: | Morgan Kaufmann, 2000 |
author | Andrieu, C de Freitas, N Doucet, A |
id | oxford-uuid:f5d3e741-d2f3-4a00-ae69-85bd94ddd0c2 |
institution | University of Oxford |