Model Weighting for One-Dependence Estimators by Measuring the Independence Assumptions


Bibliographic Details
Main Authors: Hua Lou, Gaojie Wang, Limin Wang, Musa Mammadov
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Subjects: Averaged one-dependence estimators; discriminative weights; conditional independence
Online Access: https://ieeexplore.ieee.org/document/9169616/
author Hua Lou
Gaojie Wang
Limin Wang
Musa Mammadov
collection DOAJ
description Superparent one-dependence estimators (SPODEs) are a popular family of semi-naive Bayesian network classifiers, and averaged one-dependence estimators (AODE) provide efficient single-pass learning with competitive classification accuracy. All the SPODEs in AODE are treated equally and carry the same weight. Researchers have proposed applying information-theoretic metrics, such as mutual information or conditional log-likelihood, to assign discriminative weights. However, when dealing with different instances, the independence assumptions of different SPODEs may hold to different extents. Highly scalable learning algorithms are urgently needed to approximate the ground-truth attribute dependencies implicit in training data. In this study, we take each instance as the target and investigate extensions to AODE that measure the independence assumption of each SPODE and assign weights accordingly. The proposed approach, called independence-weighted AODE (IWAODE), is validated on 40 benchmark datasets from the UCI machine learning repository. Experimental results reveal that the resulting weighted SPODEs deliver computationally efficient low-bias learning, proving to be a competitive alternative to state-of-the-art single and ensemble Bayesian network classifiers (such as tree-augmented naive Bayes, the k-dependence Bayesian classifier, and WAODE-MI).
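The weighted-SPODE averaging described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's IWAODE: the authors' per-instance independence measure is not stated in the abstract, so the `spode_weights` hook below defaults to uniform weights (which recovers plain AODE), and the class name, Laplace smoothing, and counting scheme are assumptions for the sketch.

```python
import numpy as np

class WeightedAODE:
    """Sketch of AODE over discrete attributes with a pluggable
    per-instance SPODE weighting hook (uniform weights = plain AODE)."""

    def __init__(self, n_classes, n_attrs, n_vals):
        self.C, self.A, self.V = n_classes, n_attrs, n_vals
        # N(y, x_i): counts for each superparent attribute value per class
        self.pair = np.zeros((n_attrs, n_classes, n_vals))
        # N(y, x_i, x_j): counts for each child given class and superparent
        self.triple = np.zeros((n_attrs, n_attrs, n_classes, n_vals, n_vals))
        self.n = 0

    def fit(self, X, y):
        for xv, c in zip(X, y):
            self.n += 1
            for i in range(self.A):
                self.pair[i, c, xv[i]] += 1
                for j in range(self.A):
                    self.triple[i, j, c, xv[i], xv[j]] += 1
        return self

    def predict_proba(self, x, spode_weights=None):
        # P(y, x) ~ sum_i w_i * P(y, x_i) * prod_{j != i} P(x_j | y, x_i),
        # one SPODE per superparent attribute i, Laplace-smoothed.
        w = np.ones(self.A) if spode_weights is None else np.asarray(spode_weights, float)
        scores = np.zeros(self.C)
        for c in range(self.C):
            for i in range(self.A):
                p = (self.pair[i, c, x[i]] + 1.0) / (self.n + self.C * self.V)
                for j in range(self.A):
                    if j != i:
                        p *= (self.triple[i, j, c, x[i], x[j]] + 1.0) / \
                             (self.pair[i, c, x[i]] + self.V)
                scores[c] += w[i] * p
        return scores / scores.sum()
```

An instance-dependent scheme in the spirit of IWAODE would compute `spode_weights` from the test instance itself, scoring how well each SPODE's independence assumption holds for that instance, before calling `predict_proba`.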
issn 2169-3536
doi 10.1109/ACCESS.2020.3016984
volume 8
pages 150465-150477
affiliation Hua Lou: Department of Software and Big Data, Changzhou College of Information Technology, Changzhou, China
affiliation Gaojie Wang: College of Computer Science and Technology, Jilin University, Changchun, China
affiliation Limin Wang (https://orcid.org/0000-0001-7742-669X): College of Computer Science and Technology, Jilin University, Changchun, China
affiliation Musa Mammadov: School of Information Technology, Deakin University, Burwood, VIC, Australia
topic Averaged one-dependence estimators
discriminative weights
conditional independence