Decision Rules Derived from Optimal Decision Trees with Hypotheses
Conventional decision trees use queries, each of which is based on a single attribute. In this study, we also examine decision trees that additionally use queries based on hypotheses; such queries are similar to the equivalence queries considered in exact learning. Earlier, we designed dynamic programming algorithms that compute the minimum depth and the minimum number of internal nodes of decision trees with hypotheses. Modifications of these algorithms, considered in the present paper, allow us to construct decision trees with hypotheses that are optimal with respect to the depth or to the number of internal nodes. We compare the length and coverage of decision rules extracted from optimal decision trees with hypotheses and from optimal conventional decision trees to determine which are preferable as a tool for representing information. To this end, we conduct computer experiments on various decision tables from the UCI Machine Learning Repository, as well as on decision tables for randomly generated Boolean functions. The results show that, in many cases, the decision rules derived from decision trees with hypotheses are better than those extracted from conventional decision trees.
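The abstract compares decision rules by their length and coverage. Below is a minimal Python sketch of those two measures for rules read off a conventional decision tree; the nested-dict tree encoding, the toy decision table, and all function names are illustrative assumptions, not the dynamic programming procedure described in the paper.

```python
# Minimal sketch (not the authors' algorithm): each root-to-leaf path of a
# decision tree yields one decision rule; we score rules by length and coverage.

def extract_rules(node, conditions=()):
    """Walk a tree of nested dicts; each root-to-leaf path yields one rule:
    a tuple of (attribute, value) conditions plus the decision at the leaf."""
    if not isinstance(node, dict):          # leaf: node is a decision label
        return [(conditions, node)]
    rules = []
    attr = node["attribute"]
    for value, child in node["branches"].items():
        rules += extract_rules(child, conditions + ((attr, value),))
    return rules

def length(rule):
    """Length of a rule = number of conditions on its left-hand side."""
    conditions, _ = rule
    return len(conditions)

def coverage(rule, table):
    """Coverage of a rule = number of rows of the decision table that satisfy
    all of its conditions and carry the same decision as the rule."""
    conditions, decision = rule
    return sum(
        1
        for row in table
        if row["decision"] == decision
        and all(row[attr] == value for attr, value in conditions)
    )

if __name__ == "__main__":
    # Hypothetical decision table over two binary attributes x1, x2.
    table = [
        {"x1": 0, "x2": 0, "decision": 0},
        {"x1": 0, "x2": 1, "decision": 1},
        {"x1": 1, "x2": 0, "decision": 1},
        {"x1": 1, "x2": 1, "decision": 1},
    ]
    # A conventional decision tree for this table (queries on single attributes).
    tree = {
        "attribute": "x1",
        "branches": {
            0: {"attribute": "x2", "branches": {0: 0, 1: 1}},
            1: 1,
        },
    }
    for rule in extract_rules(tree):
        print(rule, "length =", length(rule), "coverage =", coverage(rule, table))
```

Shorter rules with larger coverage are preferable for representing information, which is the criterion used in the paper's comparison.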
Main Authors: Mohammad Azad, Igor Chikalov, Shahid Hussain, Mikhail Moshkov, Beata Zielosko
Format: Article
Language: English
Published: MDPI AG, 2021-12-01
Series: Entropy
Subjects: decision rule; decision tree; representation of information; hypothesis
Online Access: https://www.mdpi.com/1099-4300/23/12/1641
ISSN: 1099-4300
DOI: 10.3390/e23121641 (Entropy, vol. 23, no. 12, article 1641, 2021)

Author affiliations:
Mohammad Azad: Department of Computer Science, College of Computer and Information Sciences, Jouf University, Sakaka 72441, Saudi Arabia
Igor Chikalov: Intel Corporation, 5000 W Chandler Blvd, Chandler, AZ 85226, USA
Shahid Hussain: Department of Computer Science, School of Mathematics and Computer Science, Institute of Business Administration, University Road, Karachi 75270, Pakistan
Mikhail Moshkov: Computer, Electrical and Mathematical Sciences & Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal 23955-6900, Saudi Arabia
Beata Zielosko: Institute of Computer Science, Faculty of Science and Technology, University of Silesia in Katowice, Będzińska 39, 41-200 Sosnowiec, Poland