Parasitic egg recognition using convolution and attention network
Abstract Intestinal parasitic infections (IPIs) caused by protozoan and helminth parasites are among the most common infections in humans in low- and middle-income countries. IPIs affect not only the health status of a country but also its economic sector. Over the last decade, pattern recognition a...
Main Authors: | Nouar AlDahoul, Hezerul Abdul Karim, Mhd Adel Momo, Francesca Isabelle F. Escobar, Vina Alyzza Magallanes, Myles Joshua Toledo Tan |
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2023-09-01 |
Series: | Scientific Reports |
Online Access: | https://doi.org/10.1038/s41598-023-41711-3 |
author | Nouar AlDahoul, Hezerul Abdul Karim, Mhd Adel Momo, Francesca Isabelle F. Escobar, Vina Alyzza Magallanes, Myles Joshua Toledo Tan |
collection | DOAJ |
description | Abstract Intestinal parasitic infections (IPIs) caused by protozoan and helminth parasites are among the most common infections in humans in low- and middle-income countries. IPIs affect not only the health status of a country but also its economic sector. Over the last decade, pattern recognition and image processing techniques have been developed to automatically identify parasitic eggs in microscopic images. Existing identification techniques still suffer from diagnostic errors and low sensitivity. Therefore, a more accurate and faster solution is still required to recognize parasitic eggs and classify them into several categories. The novel Chula-ParasiteEgg dataset of 11,000 microscopic images, proposed at ICIP 2022, was utilized to train various methods, including convolutional neural network (CNN)-based models and convolution-and-attention (CoAtNet)-based models. The experiments conducted show the high recognition performance of the proposed CoAtNet, which was fine-tuned on microscopic images of parasitic eggs. The CoAtNet produced an average accuracy of 93% and an average F1 score of 93%. This finding opens the door to integrating the proposed solution into automated parasitological diagnosis. |
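The abstract describes fine-tuning a pretrained CoAtNet on the Chula-ParasiteEgg microscopic images and evaluating it with accuracy and F1 score. A minimal sketch of that kind of transfer-learning setup follows; it assumes the timm library's `coatnet_0_rw_224` weights, a torchvision ImageFolder layout of the images, 11 egg categories, and illustrative hyperparameters, none of which are taken from the paper itself.

```python
# Hedged sketch of the transfer-learning setup the abstract describes:
# fine-tune an ImageNet-pretrained CoAtNet on parasitic-egg images.
# Assumptions (not taken from the paper): the timm model name
# "coatnet_0_rw_224", an ImageFolder directory layout, 11 egg
# categories, and all hyperparameters below.
import torch
import timm
from torch import nn, optim
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

NUM_CLASSES = 11                         # assumed number of egg categories
DATA_DIR = "chula_parasite_egg/train"    # hypothetical dataset path

# 224x224 inputs with ImageNet normalization to match the pretrained backbone.
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder(DATA_DIR, transform=tfm)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True, num_workers=4)

device = "cuda" if torch.cuda.is_available() else "cpu"
# Load a pretrained CoAtNet and replace its classification head.
model = timm.create_model("coatnet_0_rw_224", pretrained=True,
                          num_classes=NUM_CLASSES).to(device)

criterion = nn.CrossEntropyLoss()
optimizer = optim.AdamW(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):                   # epoch count is illustrative only
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}: last batch loss {loss.item():.4f}")
```

Reusing the pretrained convolution and attention blocks while swapping in a new classification head is the standard recipe implied by "fine-tuned on microscopic images"; accuracy and F1 (as reported in the abstract) would then be computed on a held-out split.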
format | Article |
id | doaj.art-c5db59b0fe59436eb4538f4e55d2ce18 |
institution | Directory Open Access Journal |
issn | 2045-2322 |
language | English |
publishDate | 2023-09-01 |
publisher | Nature Portfolio |
record_format | Article |
series | Scientific Reports |
affiliations | Nouar AlDahoul: Computer Science, New York University; Hezerul Abdul Karim: Faculty of Engineering, Multimedia University; Mhd Adel Momo: Fleet Management Systems and Technologies; Francesca Isabelle F. Escobar, Vina Alyzza Magallanes, Myles Joshua Toledo Tan: Department of Natural Sciences, University of St. La Salle |
title | Parasitic egg recognition using convolution and attention network |
url | https://doi.org/10.1038/s41598-023-41711-3 |