Deep learning mango fruits recognition based on TensorFlow Lite


Bibliographic Details
Main Authors: Mas Rina Mustaffa, Aainaa Azullya Idris, Lili Nurliyana Abdullah, Nurul Amelina Nasharuddin
Format: Article
Language: English
Published: Universitas Ahmad Dahlan, 2023-11-01
Series: IJAIN (International Journal of Advances in Intelligent Informatics)
Online Access: http://ijain.org/index.php/IJAIN/article/view/1368
Description
Summary: Agricultural images such as fruits and vegetables have previously been recognised and classified using image analysis and computer vision techniques. Mangoes, however, are currently classified manually, whereby mango sellers must laboriously identify mangoes by hand, which is time-consuming and tedious. In this work, TensorFlow Lite was used as a transfer learning tool. Transfer learning is a fast approach to resolving classification problems effectively with small datasets. This work involves six categories: four specific mango types (Harum Manis, Langra, Dasheri, and Sindhri), a category for other types of mangoes, and a non-mango category. Each category comprises 100 images, split 70/30 between the training and testing sets. A mobile-based application was developed that distinguishes the various types of mangoes using the proposed transfer learning method. The results of the conducted experiment show that the adopted transfer learning approach can achieve an accuracy of 95% for mango recognition. A preliminary user acceptance survey was also carried out to investigate the users' requirements, the effectiveness of the proposed functionalities, and the ease of use of the proposed interfaces, with promising results.
ISSN: 2442-6571, 2548-3161
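
The article does not publish its training code; the sketch below only illustrates the kind of transfer-learning pipeline the abstract describes (a small six-category image dataset with a 70/30 split, a pretrained backbone, and export to TensorFlow Lite for a mobile application). The MobileNetV2 backbone, the mango_dataset/ folder layout (one sub-folder per category), and all hyperparameters are assumptions for illustration, not details taken from the paper.

import tensorflow as tf

IMG_SIZE = (224, 224)  # assumed input size for the pretrained backbone
NUM_CLASSES = 6        # four mango types + other mangoes + non-mango

# Assumed folder layout: mango_dataset/<category>/<image files>
# A 70/30 train/test split, as described in the abstract.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "mango_dataset", validation_split=0.3, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=16)
test_ds = tf.keras.utils.image_dataset_from_directory(
    "mango_dataset", validation_split=0.3, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=16)

# Transfer learning: freeze an ImageNet-pretrained backbone and train
# only a small classification head on the mango images.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=test_ds, epochs=10)

# Convert the trained Keras model to TensorFlow Lite so it can be
# bundled into a mobile application for on-device mango recognition.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("mango_classifier.tflite", "wb") as f:
    f.write(converter.convert())

On a phone, the exported mango_classifier.tflite file would be loaded with the TensorFlow Lite Interpreter and fed camera frames resized to the same input dimensions used during training.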