Capsule robot pose and mechanism state detection in ultrasound using attention-based hierarchical deep learning

Abstract

Ingestible robotic capsules with locomotion capabilities and an on-board sampling mechanism have great potential for non-invasive diagnostic and interventional use in the gastrointestinal tract. Real-time tracking of the capsule's location and operational state is necessary for clinical application, yet remains a significant challenge. To this end, we propose an approach that can simultaneously determine the mechanism state and in-plane 2D pose of millimeter-scale capsule robots in an anatomically representative environment using ultrasound imaging. Our work proposes an attention-based hierarchical deep learning approach and adapts the success of transfer learning to solve the multi-task tracking problem with a limited dataset. To train the neural networks, we generate a representative dataset of a robotic capsule within ex-vivo porcine stomachs. Experimental results show that the accuracy of capsule state classification is 97%, and the mean estimation errors for orientation and centroid position are 2.0 degrees and 0.24 mm (1.7% of the capsule's body length) on the hold-out test set. Accurate detection of the capsule while it is manipulated by an external magnet in a porcine stomach and colon is also demonstrated. These results suggest that our proposed method has the potential to advance wireless capsule-based technologies by providing accurate detection of capsule robots in clinical scenarios.
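The abstract describes an attention-based network that solves two tasks at once: classifying the capsule's mechanism state and regressing its in-plane 2D pose (centroid position and orientation). The record does not specify the actual architecture, so as a rough, purely illustrative sketch of the general idea (attention-weighted pooling of backbone features feeding separate classification and regression heads), a NumPy toy version might look like the following. All array sizes, the two-state assumption, and the weight matrices are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy feature map standing in for a (transfer-learned) backbone output:
# 64 spatial locations, each with a 32-dimensional feature vector.
features = rng.standard_normal((64, 32))

# Spatial attention: score every location, then pool a weighted sum of
# features so informative regions (e.g. the capsule) dominate.
w_att = rng.standard_normal((32, 1))
attn = softmax(features @ w_att, axis=0)   # (64, 1), weights sum to 1
pooled = (attn * features).sum(axis=0)     # (32,) attended descriptor

# Two task-specific heads sharing the pooled descriptor:
w_state = rng.standard_normal((32, 2))     # assume 2 mechanism states
w_pose = rng.standard_normal((32, 3))      # (x, y, theta) in-plane pose

state_logits = pooled @ w_state            # classification head
pose = pooled @ w_pose                     # regression head
```

In a real multi-task setup the two heads would be trained jointly, typically with a weighted sum of a cross-entropy loss on `state_logits` and a regression loss (e.g. L2) on `pose`.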

Bibliographic Details
Main Authors: Xiaoyun Liu, Daniel Esser, Brandon Wagstaff, Anna Zavodni, Naomi Matsuura, Jonathan Kelly, Eric Diller
Format: Article
Language: English
Published: Nature Portfolio, 2022-12-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-022-25572-w
ISSN: 2045-2322
Author affiliations:
Xiaoyun Liu: Department of Mechanical and Industrial Engineering, University of Toronto
Daniel Esser: Department of Mechanical Engineering, Vanderbilt University
Brandon Wagstaff: University of Toronto Institute of Aerospace Studies, University of Toronto
Anna Zavodni: Division of Cardiology, Department of Medicine, University of Toronto
Naomi Matsuura: Department of Materials Science and Engineering and Institute of Biomedical Engineering, University of Toronto
Jonathan Kelly: University of Toronto Institute of Aerospace Studies, University of Toronto
Eric Diller: Department of Mechanical and Industrial Engineering, University of Toronto