A multi-voter multi-commission nearest neighbor classifier

Many improved versions of the k-nearest neighbor (KNN) classifier have been proposed that minimize the total distance to multiple nearest neighbors (multi-voter) in each class instead of using majority voting, such as the local mean-based pseudo nearest neighbor (LMPNN), which gives better decisions. In this paper, a new KNN variant...

Full description

Bibliographic Details
Main Authors: Suyanto Suyanto, Prasti Eko Yunanto, Tenia Wahyuningrum, Siti Khomsah
Format: Article
Language: English
Published: Elsevier 2022-09-01
Series: Journal of King Saud University: Computer and Information Sciences
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S1319157822000313
_version_ 1811281010987368448
author Suyanto Suyanto
Prasti Eko Yunanto
Tenia Wahyuningrum
Siti Khomsah
author_facet Suyanto Suyanto
Prasti Eko Yunanto
Tenia Wahyuningrum
Siti Khomsah
author_sort Suyanto Suyanto
collection DOAJ
description Many improved versions of the k-nearest neighbor (KNN) classifier have been proposed that minimize the total distance to multiple nearest neighbors (multi-voter) in each class instead of using majority voting, such as the local mean-based pseudo nearest neighbor (LMPNN), which gives better decisions. In this paper, a new KNN variant called the multi-voter multi-commission nearest neighbor (MVMCNN) is proposed and examined for its ability to enhance LMPNN. As the name suggests, MVMCNN uses several commissions, each of which calculates the total distance between the given query point (test pattern) and its k pseudo nearest neighbors using the LMPNN scheme. The decision class is the one that minimizes these total distances; hence, the decision in MVMCNN is made more locally than in LMPNN. Evaluation based on 10-fold cross-validation shows that the proposed multi-commission scheme improves on the original (single-commission) LMPNN. Compared with two single-voter models, KNN and the Bonferroni Mean Fuzzy k-Nearest Neighbor (BM-FKNN), the proposed MVMCNN also gives lower mean error rates and higher precision, recall, and F1 scores, indicating that the multi-voter model provides better decisions than the single-voter ones.
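To illustrate the decision rule summarized in the abstract, the following is a minimal Python sketch of the multi-voter multi-commission idea. It is not the authors' reference implementation: forming commissions with k-means clustering inside each class (standing in for the c-means step named in the subject terms), the function names, and the parameters k and n_commissions are assumptions made for illustration only.

import numpy as np
from sklearn.cluster import KMeans

def lmpnn_total_distance(query, points, k):
    # Distances from the query to every point of this commission.
    d = np.linalg.norm(points - query, axis=1)
    # k nearest members, ordered from nearest to farthest.
    nearest = points[np.argsort(d)[:k]]
    # j-th pseudo nearest neighbor = local mean of the j nearest members.
    pseudo = np.cumsum(nearest, axis=0) / np.arange(1, len(nearest) + 1)[:, None]
    # LMPNN score: total distance from the query to the k pseudo neighbors.
    return np.linalg.norm(pseudo - query, axis=1).sum()

def mvmcnn_predict(query, X, y, k=5, n_commissions=3):
    # Hypothetical sketch: split each class into commissions via k-means
    # (an assumption; the paper's exact commission-forming step may differ),
    # score every commission with the LMPNN total distance, and return the
    # class whose best commission is closest to the query.
    best_label, best_score = None, np.inf
    for label in np.unique(y):
        pts = X[y == label]
        n_c = max(1, min(n_commissions, len(pts)))
        groups = KMeans(n_clusters=n_c, n_init=10).fit_predict(pts)
        for c in range(n_c):
            commission = pts[groups == c]
            if len(commission) == 0:
                continue
            score = lmpnn_total_distance(query, commission,
                                         min(k, len(commission)))
            if score < best_score:
                best_label, best_score = label, score
    return best_label

# Toy usage with made-up data: the query sits near class 0.
X = np.array([[0., 0.], [0.2, 0.1], [1., 1.], [1.1, 0.9], [5., 5.], [5.1, 4.9]])
y = np.array([0, 0, 0, 0, 1, 1])
print(mvmcnn_predict(np.array([0.9, 1.0]), X, y, k=2, n_commissions=2))  # -> 0

In this sketch, each commission's score is the LMPNN total distance from the query to the cumulative local means of its k nearest members, and the predicted class is the one owning the commission with the smallest score; this is what makes the decision more local than a single-commission LMPNN over the whole class.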
first_indexed 2024-04-13T01:25:04Z
format Article
id doaj.art-146d6a0bc5cf4fe78ab25b508de2e577
institution Directory Open Access Journal
issn 1319-1578
language English
last_indexed 2024-04-13T01:25:04Z
publishDate 2022-09-01
publisher Elsevier
record_format Article
series Journal of King Saud University: Computer and Information Sciences
spelling doaj.art-146d6a0bc5cf4fe78ab25b508de2e577
2022-12-22T03:08:38Z
eng
Elsevier
Journal of King Saud University: Computer and Information Sciences
1319-1578
2022-09-01
Vol. 34, No. 8, pp. 6292–6302
A multi-voter multi-commission nearest neighbor classifier
Suyanto Suyanto (School of Computing, Telkom University, Bandung, Indonesia; corresponding author)
Prasti Eko Yunanto (School of Computing, Telkom University, Bandung, Indonesia)
Tenia Wahyuningrum (Faculty of Informatics, Institut Teknologi Telkom Purwokerto, Indonesia)
Siti Khomsah (Faculty of Informatics, Institut Teknologi Telkom Purwokerto, Indonesia)
http://www.sciencedirect.com/science/article/pii/S1319157822000313
Nearest neighbor classifier
c-Means clustering
Machine learning
Multi-commission
Multi-voter
spellingShingle Suyanto Suyanto
Prasti Eko Yunanto
Tenia Wahyuningrum
Siti Khomsah
A multi-voter multi-commission nearest neighbor classifier
Journal of King Saud University: Computer and Information Sciences
Nearest neighbor classifier
c-Means clustering
Machine learning
Multi-commission
Multi-voter
title A multi-voter multi-commission nearest neighbor classifier
title_full A multi-voter multi-commission nearest neighbor classifier
title_fullStr A multi-voter multi-commission nearest neighbor classifier
title_full_unstemmed A multi-voter multi-commission nearest neighbor classifier
title_short A multi-voter multi-commission nearest neighbor classifier
title_sort multi voter multi commission nearest neighbor classifier
topic Nearest neighbor classifier
c-Means clustering
Machine learning
Multi-commission
Multi-voter
url http://www.sciencedirect.com/science/article/pii/S1319157822000313
work_keys_str_mv AT suyantosuyanto amultivotermulticommissionnearestneighborclassifier
AT prastiekoyunanto amultivotermulticommissionnearestneighborclassifier
AT teniawahyuningrum amultivotermulticommissionnearestneighborclassifier
AT sitikhomsah amultivotermulticommissionnearestneighborclassifier
AT suyantosuyanto multivotermulticommissionnearestneighborclassifier
AT prastiekoyunanto multivotermulticommissionnearestneighborclassifier
AT teniawahyuningrum multivotermulticommissionnearestneighborclassifier
AT sitikhomsah multivotermulticommissionnearestneighborclassifier