Upgrading the Fusion of Imprecise Classifiers
Imprecise classification is a relatively new task within Machine Learning. The difference from standard classification is that not only is one state of the variable under study determined, but also the set of states that cannot be ruled out because there is not enough information against them….
Main Authors: | Serafín Moral-García, María D. Benítez, Joaquín Abellán |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-07-01 |
Series: | Entropy |
Subjects: | imprecise classification; Credal Decision Trees; ensembles; bagging; combination technique |
Online Access: | https://www.mdpi.com/1099-4300/25/7/1088 |
_version_ | 1797589405375922176 |
author | Serafín Moral-García; María D. Benítez; Joaquín Abellán |
author_facet | Serafín Moral-García; María D. Benítez; Joaquín Abellán |
author_sort | Serafín Moral-García |
collection | DOAJ |
description | Imprecise classification is a relatively new task within Machine Learning. The difference from standard classification is that not only is one state of the variable under study determined, but also the set of states that cannot be ruled out because there is not enough information against them. For imprecise classification, a model called the Imprecise Credal Decision Tree (ICDT), which uses imprecise probabilities and maximum entropy as the information measure, has been presented. A difficult and interesting task is how to combine this type of imprecise classifier. A procedure based on the minimum level of dominance has been presented; although it is a very strong combination method, it has the drawback of a considerable risk of erroneous predictions. In this research, we use the theory of the second best to argue that the aforementioned type of combination can be improved through a new procedure built by relaxing the constraints. The new procedure is compared with the original one in an experimental study on a large set of datasets and shows an improvement. |
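The abstract describes the fusion idea only at a high level. The following minimal Python sketch illustrates the intuition behind relaxing a strict, dominance-style fusion of set-valued predictions; the function name `fuse_imprecise_predictions`, the vote-counting rule, and the `slack` parameter are illustrative assumptions, not the exact procedure defined in the article.

```python
from collections import Counter

def fuse_imprecise_predictions(prediction_sets, slack=0):
    """Fuse the set-valued outputs of an ensemble of imprecise classifiers.

    prediction_sets: one set of candidate class labels per base classifier
        (the states that classifier could not rule out).
    slack: 0 mimics a strict fusion that keeps only the labels retained by
        the largest number of base classifiers; a value greater than 0
        relaxes that constraint and also keeps labels whose support falls
        within `slack` votes of the maximum.
    """
    # Count how many base classifiers keep each label as non-dominated.
    support = Counter(label for pred in prediction_sets for label in pred)
    max_support = max(support.values())
    return {label for label, votes in support.items() if votes >= max_support - slack}

# Three hypothetical ICDT-style base classifiers, each returning a set of labels.
predictions = [{"A", "B"}, {"A"}, {"A", "C"}]
print(fuse_imprecise_predictions(predictions, slack=0))  # keeps only A (strict fusion)
print(fuse_imprecise_predictions(predictions, slack=2))  # keeps A, B and C (relaxed fusion)
```

Under this toy rule, increasing the slack trades precision (smaller output sets) for caution, which is the kind of constraint relaxation the abstract argues for.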
first_indexed | 2024-03-11T01:06:11Z |
format | Article |
id | doaj.art-f0029c290f6c4dc29998712a79f9e5d1 |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
last_indexed | 2024-03-11T01:06:11Z |
publishDate | 2023-07-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
spelling | doaj.art-f0029c290f6c4dc29998712a79f9e5d1; 2023-11-18T19:14:38Z; eng; MDPI AG; Entropy; 1099-4300; 2023-07-01; 25(7):1088; doi:10.3390/e25071088; Upgrading the Fusion of Imprecise Classifiers; Serafín Moral-García, María D. Benítez, Joaquín Abellán (Department of Computer Science and Artificial Intelligence, University of Granada, 18012 Granada, Spain); abstract as in the description field; https://www.mdpi.com/1099-4300/25/7/1088; imprecise classification; Credal Decision Trees; ensembles; bagging; combination technique |
spellingShingle | Serafín Moral-García; María D. Benítez; Joaquín Abellán; Upgrading the Fusion of Imprecise Classifiers; Entropy; imprecise classification; Credal Decision Trees; ensembles; bagging; combination technique |
title | Upgrading the Fusion of Imprecise Classifiers |
title_full | Upgrading the Fusion of Imprecise Classifiers |
title_fullStr | Upgrading the Fusion of Imprecise Classifiers |
title_full_unstemmed | Upgrading the Fusion of Imprecise Classifiers |
title_short | Upgrading the Fusion of Imprecise Classifiers |
title_sort | upgrading the fusion of imprecise classifiers |
topic | imprecise classification; Credal Decision Trees; ensembles; bagging; combination technique |
url | https://www.mdpi.com/1099-4300/25/7/1088 |
work_keys_str_mv | AT serafinmoralgarcia upgradingthefusionofimpreciseclassifiers AT mariadbenitez upgradingthefusionofimpreciseclassifiers AT joaquinabellan upgradingthefusionofimpreciseclassifiers |