Automated annotation and quantitative description of ultrasound videos of the fetal heart

Bibliographic Details
Main Authors: Bridge, C; Ioannou, C; Noble, J
Format: Journal article
Published: Elsevier, 2016
Institution: University of Oxford

Full description

Interpretation of ultrasound videos of the fetal heart is crucial for the antenatal diagnosis of congenital heart disease (CHD). We believe that automated image analysis techniques could make an important contribution towards improving CHD detection rates. However, to our knowledge, no previous work has been done in this area. With this goal in mind, this paper presents a framework for tracking the key variables that describe the content of each frame of freehand 2D ultrasound scanning videos of the healthy fetal heart. This represents an important first step towards developing tools that can assist with CHD detection in abnormal cases. We argue that it is natural to approach this as a sequential Bayesian filtering problem, due to the strong prior model we have of the underlying anatomy and the ambiguity of the appearance of structures in ultrasound images. We train classification and regression forests to predict the visibility, location, and orientation of the fetal heart in the image, and the viewing plane label, from each frame. We also develop a novel adaptation of regression forests for circular variables to handle the prediction of cardiac phase. Using a particle-filtering-based method to combine predictions from multiple video frames, we demonstrate how to filter this information to give a temporally consistent output at real-time speeds. We present results on a challenging dataset gathered in a real-world clinical setting, compare them with expert annotations, and achieve accuracy comparable to the levels of inter- and intra-observer variation.
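
The sequential Bayesian filtering formulation alluded to above is the standard recursion; writing x_t for the hidden state at frame t (heart visibility, position, orientation, viewing plane label, and cardiac phase) and z_t for the observation extracted from that frame, the posterior is updated as

    p(x_t \mid z_{1:t}) \;\propto\; p(z_t \mid x_t) \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid z_{1:t-1})\, \mathrm{d}x_{t-1}

The particle filter mentioned in the abstract approximates this integral with a finite set of weighted samples, since a state space mixing a discrete view label with continuous position, orientation, and phase admits no closed-form update.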
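The paper's adaptation of regression forests to circular targets such as cardiac phase is not reproduced here, but it would rest on standard directional statistics. The sketch below (Python; function names are illustrative and not taken from the paper) shows the circular mean a leaf could use for its prediction, and a size-weighted circular-variance split score analogous to the variance-reduction criterion of ordinary regression trees.

    import numpy as np

    def circular_mean(angles):
        # Mean direction of a set of angles (radians): average the unit
        # vectors (cos a, sin a) and take the angle of the resultant.
        return float(np.arctan2(np.mean(np.sin(angles)), np.mean(np.cos(angles))))

    def circular_variance(angles):
        # 1 - mean resultant length: 0 if all angles agree, near 1 if
        # they are spread uniformly around the circle.
        return float(1.0 - np.hypot(np.mean(np.sin(angles)), np.mean(np.cos(angles))))

    def split_score(left_angles, right_angles):
        # Quality of a candidate tree split: size-weighted circular
        # variance of the two children (lower is better).
        n = len(left_angles) + len(right_angles)
        return (len(left_angles) * circular_variance(left_angles)
                + len(right_angles) * circular_variance(right_angles)) / n

Unlike an arithmetic mean, circular_mean correctly treats phases just below 2*pi and just above 0 as near neighbours, which is what makes it a sensible leaf statistic for a periodic quantity like cardiac phase.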
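Similarly, a minimal bootstrap particle filter step of the kind that could fuse per-frame forest predictions into a temporally consistent track is sketched below. The transition and likelihood callables are placeholders for the paper's motion model and forest-derived observation model; none of this code is taken from the authors' implementation.

    import numpy as np

    def particle_filter_step(particles, weights, transition, likelihood, rng):
        # Predict: push each state hypothesis through the stochastic motion model.
        particles = np.array([transition(p, rng) for p in particles])
        # Update: re-weight each hypothesis by how well it explains the new frame.
        weights = weights * np.array([likelihood(p) for p in particles])
        weights = weights / weights.sum()
        # Systematic resampling: duplicate probable particles, discard the
        # rest, and reset to uniform weights.
        n = len(particles)
        positions = (rng.random() + np.arange(n)) / n
        indices = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
        return particles[indices], np.full(n, 1.0 / n)

After each step, a filtered estimate can be read off as the weighted mean of the particle set, using a circular mean for the orientation and phase components.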