Self-Adaptive Attribute Value Weighting for Averaged One-Dependence Estimators


Bibliographic Details
Main Authors: Limin Wang, Jie Chen, Yang Liu, Minghui Sun
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8984347/
Description
Summary: Of the numerous proposals for weakening the attribute independence assumption of Naive Bayes, averaged one-dependence estimators (AODE) learns by extrapolation from marginal to full-multivariate probability distributions and has demonstrated reasonable improvement in classification performance. However, all the one-dependence estimators in AODE are assigned the same weight, and their probability estimates are combined linearly. This work presents an efficient and effective attribute value weighting approach that assigns discriminative weights to different super-parent one-dependence estimators for different instances by identifying the differences among these one-dependence estimators in terms of log likelihood. The proposed approach is validated on widely used benchmark datasets from the UCI machine learning repository. Experimental results show that the proposed approach achieves a bias-variance trade-off and is a competitive alternative to state-of-the-art Bayesian and non-Bayesian learners (e.g., tree augmented Naive Bayes and logistic regression).
ISSN:2169-3536
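
The idea summarized above — combining super-parent one-dependence estimators (SPODEs) with per-instance weights derived from each SPODE's log likelihood on the test instance — can be sketched as follows. This is a minimal illustrative sketch assuming discrete attributes and Laplace smoothing; the specific weighting rule (normalizing each SPODE's likelihood of the instance) is one plausible reading of the abstract, not the authors' exact scheme.

```python
import numpy as np

def spode_log_joint(X, y, x, parent, n_classes, n_values, alpha=1.0):
    """Log P(c, x) under the SPODE rooted at attribute `parent`,
    estimated from training data (X, y) with Laplace smoothing.
    Returns an array of log joint probabilities, one per class."""
    n, d = X.shape
    log_p = np.zeros(n_classes)
    for c in range(n_classes):
        mask_c = y == c
        # joint count of class c and the super-parent value x[parent]
        mask_cp = mask_c & (X[:, parent] == x[parent])
        n_cp = mask_cp.sum()
        log_p[c] = np.log((n_cp + alpha) / (n + alpha * n_classes * n_values))
        for j in range(d):
            if j == parent:
                continue
            # P(x_j | c, x_parent): each child depends on class + super-parent
            n_jcp = (mask_cp & (X[:, j] == x[j])).sum()
            log_p[c] += np.log((n_jcp + alpha) / (n_cp + alpha * n_values))
    return log_p

def weighted_aode_predict(X, y, x, n_classes, n_values):
    """Self-adaptive weighted AODE sketch: instead of averaging all
    SPODEs uniformly, weight each one by the (normalized) likelihood
    it assigns to the test instance x, so SPODEs that explain x
    better contribute more to the combined posterior."""
    d = X.shape[1]
    log_joints = np.array([spode_log_joint(X, y, x, p, n_classes, n_values)
                           for p in range(d)])       # shape (d, n_classes)
    # log-likelihood of x under each SPODE: log sum_c P(c, x)
    ll = np.logaddexp.reduce(log_joints, axis=1)     # shape (d,)
    w = np.exp(ll - ll.max())
    w /= w.sum()                                     # instance-specific weights
    posterior = (w[:, None] * np.exp(log_joints)).sum(axis=0)
    return int(np.argmax(posterior))
```

Plain AODE corresponds to setting all entries of `w` to `1/d`; the per-instance weights are the only change, which is why the approach stays linear in the number of attributes per SPODE evaluation.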