Parameter Learning of Bayesian Network with Multiplicative Synergistic Constraints
Learning the conditional probability table (CPT) parameters of Bayesian networks (BNs) is a key challenge in real-world decision support applications, especially when there are limited data available. The traditional approach to this challenge is introducing domain knowledge/expert judgments that are encoded as qualitative parameter constraints.
Main Authors: | Yu Zhang, Zhiming Hu |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-07-01 |
Series: | Symmetry |
Subjects: | multiplicative synergistic; Bayesian networks; parameter learning; limited data |
Online Access: | https://www.mdpi.com/2073-8994/14/7/1469 |
_version_ | 1797415353233440768 |
---|---|
author | Yu Zhang; Zhiming Hu |
author_facet | Yu Zhang; Zhiming Hu |
author_sort | Yu Zhang |
collection | DOAJ |
description | Learning the conditional probability table (CPT) parameters of Bayesian networks (BNs) is a key challenge in real-world decision support applications, especially when limited data are available. The traditional approach to this challenge is to introduce domain knowledge or expert judgments encoded as qualitative parameter constraints. In this paper, we focus on multiplicative synergistic constraints; the negative and positive multiplicative synergy constraints considered here are symmetric. To integrate multiplicative synergistic constraints into the learning of Bayesian network parameters, we propose four methods for handling them, based on the idea of the classical isotonic regression algorithm. The four methods are evaluated in simulations on the lawn moist model and the Asia network, and we compare them with the maximum likelihood estimation (MLE) algorithm. Simulation results show that the proposed methods outperform the MLE algorithm in parameter-learning accuracy and can refine the MLE results into more accurate parameter estimators. The proposed methods can reduce the dependence of parameter learning on expert experience. Combining these constraint methods with Bayesian estimation can improve the accuracy of parameter learning under small-sample conditions. |
first_indexed | 2024-03-09T05:47:26Z |
format | Article |
id | doaj.art-b4dda3ee0ff84776aebb00b22ee665e7 |
institution | Directory Open Access Journal |
issn | 2073-8994 |
language | English |
last_indexed | 2024-03-09T05:47:26Z |
publishDate | 2022-07-01 |
publisher | MDPI AG |
record_format | Article |
series | Symmetry |
spelling | doaj.art-b4dda3ee0ff84776aebb00b22ee665e7 | 2023-12-03T12:20:14Z | eng | MDPI AG | Symmetry | 2073-8994 | 2022-07-01 | vol. 14, no. 7, art. 1469 | doi:10.3390/sym14071469 | Parameter Learning of Bayesian Network with Multiplicative Synergistic Constraints | Yu Zhang (School of Mathematics and Economics, Bigdata Modeling and Intelligent Computing Research Institute, Hubei University of Education, Wuhan 430205, China); Zhiming Hu (School of Statistics and Mathematics, Zhejiang Gongshang University, Hangzhou 310018, China) | https://www.mdpi.com/2073-8994/14/7/1469 | multiplicative synergistic; Bayesian networks; parameter learning; limited data |
spellingShingle | Yu Zhang; Zhiming Hu; Parameter Learning of Bayesian Network with Multiplicative Synergistic Constraints; Symmetry; multiplicative synergistic; Bayesian networks; parameter learning; limited data |
title | Parameter Learning of Bayesian Network with Multiplicative Synergistic Constraints |
title_full | Parameter Learning of Bayesian Network with Multiplicative Synergistic Constraints |
title_fullStr | Parameter Learning of Bayesian Network with Multiplicative Synergistic Constraints |
title_full_unstemmed | Parameter Learning of Bayesian Network with Multiplicative Synergistic Constraints |
title_short | Parameter Learning of Bayesian Network with Multiplicative Synergistic Constraints |
title_sort | parameter learning of bayesian network with multiplicative synergistic constraints |
topic | multiplicative synergistic; Bayesian networks; parameter learning; limited data |
url | https://www.mdpi.com/2073-8994/14/7/1469 |
work_keys_str_mv | AT yuzhang parameterlearningofbayesiannetworkwithmultiplicativesynergisticconstraints AT zhiminghu parameterlearningofbayesiannetworkwithmultiplicativesynergisticconstraints |
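The abstract describes constraining CPT parameters learned from limited data so that a multiplicative synergy condition holds. As a rough illustration only (not the paper's four isotonic-regression-based methods), the sketch below computes MLE estimates of P(Y=1 | A, B) for a binary child with two binary parents, then rescales the four entries so that the positive multiplicative synergy constraint P(y|1,1)·P(y|0,0) ≥ P(y|1,0)·P(y|0,1) is satisfied. The geometric-mean projection and the count data are invented for demonstration.

```python
import math

def mle_cpt(counts):
    """MLE of P(Y=1 | A=a, B=b) from {(a, b): (num_y1, num_total)} counts."""
    return {k: s / n for k, (s, n) in counts.items()}

def enforce_positive_synergy(theta, eps=1e-12):
    """Rescale the four CPT entries so that
    theta[(1,1)] * theta[(0,0)] >= theta[(1,0)] * theta[(0,1)].
    When the constraint is violated, both products are pulled to their
    geometric mean; otherwise theta is returned unchanged.
    Illustrative projection only, not the paper's methods."""
    lhs = theta[(1, 1)] * theta[(0, 0)]
    rhs = theta[(1, 0)] * theta[(0, 1)]
    if lhs >= rhs:
        return dict(theta)
    ratio = (lhs + eps) / (rhs + eps)  # < 1 when the constraint is violated
    up = ratio ** -0.25                # factor > 1 applied to the lhs pair
    down = ratio ** 0.25               # factor < 1 applied to the rhs pair
    return {
        (1, 1): min(1.0, theta[(1, 1)] * up),
        (0, 0): min(1.0, theta[(0, 0)] * up),
        (1, 0): theta[(1, 0)] * down,
        (0, 1): theta[(0, 1)] * down,
    }
```

Scaling both sides of the inequality toward their geometric mean keeps the adjustment symmetric between the two products, echoing the symmetry between the negative and positive synergy constraints that the abstract highlights.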