Classification of Durian Types Using Features Extraction Gray Level Co-Occurrence Matrix (GLCM) and K-Nearest Neighbors (KNN)


Bibliographic Details
Main Authors: Frencis Matheos Sarimole, Achmad Syaeful
Format: Article
Language: English
Published: Yayasan Pendidikan Riset dan Pengembangan Intelektual (YRPI) 2022-09-01
Series: Journal of Applied Engineering and Technological Science
Online Access: https://journal.yrpipku.com/index.php/jaets/article/view/959
Description
Summary: Durian is one of the most popular fruits because of its delicious taste and distinctive aroma. Durian varieties differ in shape, thorn structure, and color, and the parts of one fruit can differ from those of another. Care is needed when selecting fruit, because consumers generally find it difficult to distinguish Durian types by physical appearance, owing to limited knowledge of the varieties, and sorting them requires considerable time and accuracy. A method is therefore needed to sort Durian types effectively and efficiently, namely image segmentation and classification of Durian fruit types to assist consumers. Gray Level Co-Occurrence Matrices (GLCM) are used for feature extraction, while the K-Nearest Neighbor (KNN) method measures the proximity between a test image and the training images based on the texture and color of the Durian fruit. Features are extracted with the GLCM method at angles of 0°, 45°, 90°, and 135°, and the resulting features are classified with KNN using K = 3. The study used 1,281 training images and 321 test images and achieved an accuracy of 93%.
ISSN: 2715-6087, 2715-6079
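
The abstract outlines a two-stage pipeline: GLCM texture features extracted at 0°, 45°, 90°, and 135°, followed by a KNN classifier with K = 3. Below is a minimal Python sketch of such a pipeline; the specific GLCM properties (contrast, homogeneity, energy, correlation), the scikit-image/scikit-learn APIs, and the helper names are assumptions for illustration, not the authors' published implementation.

import numpy as np
from skimage.color import rgb2gray
from skimage.util import img_as_ubyte
from skimage.feature import graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier

# The four co-occurrence angles named in the abstract: 0°, 45°, 90°, 135°.
ANGLES = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]

def glcm_features(rgb_image):
    # Convert to 8-bit grayscale, then build a normalized, symmetric GLCM
    # at pixel distance 1 for each of the four angles.
    gray = img_as_ubyte(rgb2gray(rgb_image))
    glcm = graycomatrix(gray, distances=[1], angles=ANGLES,
                        levels=256, symmetric=True, normed=True)
    # Four common Haralick-style properties; the paper's exact feature
    # set may differ. Each property yields one value per angle.
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def classify(train_images, train_labels, test_images):
    # train_images/train_labels and test_images stand in for the study's
    # 1,281 training and 321 test images.
    X_train = np.array([glcm_features(im) for im in train_images])
    X_test = np.array([glcm_features(im) for im in test_images])
    knn = KNeighborsClassifier(n_neighbors=3)  # K = 3 as in the paper
    knn.fit(X_train, train_labels)
    return knn.predict(X_test)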