A Feature-Driven Active Framework for Ultrasound-Based Brain Shift Compensation
© 2018, Springer Nature Switzerland AG. A reliable ultrasound (US)-to-US registration method to compensate for brain shift would substantially improve image-guided neurosurgery. Developing such a registration method is challenging due to factors such as tumor resection, the complexity of brain pathology, and the demand for fast computation. We propose a novel feature-driven active registration framework. Landmarks and their displacements are first estimated from a pair of US images using corresponding local image features. A Gaussian Process (GP) model is then used to interpolate a dense deformation field from the sparse landmarks. The GP kernel parameters are estimated using variograms and a discrete grid search. If necessary, the user can actively add new landmarks, based on the image context and a visualization of the uncertainty measure provided by the GP, to further improve the result. We retrospectively demonstrate our registration framework as a robust and accurate brain shift compensation solution on clinical data.
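The core numerical step described above, interpolating a dense deformation field from sparse landmark displacements with a Gaussian Process while exposing a predictive uncertainty that can guide the active placement of new landmarks, can be illustrated with a short sketch. The following Python/NumPy code is not the authors' implementation: the RBF kernel, its fixed hyperparameters, the noise level, and the 2-D toy landmarks are all illustrative assumptions (the paper estimates kernel parameters from variograms and a discrete grid search, and operates on 3-D ultrasound volumes).

```python
import numpy as np

def rbf_kernel(a, b, length_scale=20.0, variance=4.0):
    """Squared-exponential kernel between two point sets (illustrative hyperparameters)."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_deformation_field(landmarks, displacements, query_points, noise=1e-2):
    """Interpolate a dense deformation field from sparse landmark displacements.

    landmarks:     (N, D) landmark coordinates in the fixed US image
    displacements: (N, D) observed landmark displacements (moving - fixed)
    query_points:  (M, D) pixel/voxel coordinates where the field is needed
    Returns the (M, D) predicted displacement and an (M,) predictive standard
    deviation that can be visualized to decide where new landmarks would help most.
    """
    K = rbf_kernel(landmarks, landmarks) + noise * np.eye(len(landmarks))
    Ks = rbf_kernel(query_points, landmarks)                          # (M, N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, displacements))   # K^{-1} y, (N, D)
    mean = Ks @ alpha                                                 # GP posterior mean, one column per axis
    v = np.linalg.solve(L, Ks.T)                                      # (N, M)
    var = rbf_kernel(query_points, query_points).diagonal() - np.sum(v**2, axis=0)
    return mean, np.sqrt(np.maximum(var, 0.0))

# Toy 2-D example: four matched landmarks, field queried on a coarse grid.
landmarks = np.array([[10., 10.], [40., 12.], [15., 45.], [42., 40.]])
displacements = np.array([[1.5, 0.2], [0.8, -0.4], [2.0, 1.1], [0.5, 0.3]])
xs, ys = np.meshgrid(np.linspace(0, 50, 6), np.linspace(0, 50, 6))
grid = np.column_stack([xs.ravel(), ys.ravel()])
field, std = gp_deformation_field(landmarks, displacements, grid)
print(field.shape, std.shape)  # (36, 2) (36,)
```

Regions where the returned standard deviation is large are natural candidates for the user to add landmarks, mirroring the active refinement loop described in the abstract.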
Main Authors: | Luo, Jie; Toews, Matthew; Machado, Ines; Frisken, Sarah; Zhang, Miaomiao; Preiswerk, Frank; Sedghi, Alireza; Ding, Hongyi; Pieper, Steve; Golland, Polina; Golby, Alexandra; Sugiyama, Masashi; Wells III, William M. |
---|---|
Other Authors: | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
Format: | Article |
Language: | English |
Published: | Springer International Publishing, 2021 |
Online Access: | https://hdl.handle.net/1721.1/137470 |
author | Luo, Jie; Toews, Matthew; Machado, Ines; Frisken, Sarah; Zhang, Miaomiao; Preiswerk, Frank; Sedghi, Alireza; Ding, Hongyi; Pieper, Steve; Golland, Polina; Golby, Alexandra; Sugiyama, Masashi; Wells III, William M. |
author2 | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
collection | MIT |
description | © 2018, Springer Nature Switzerland AG. A reliable ultrasound (US)-to-US registration method to compensate for brain shift would substantially improve image-guided neurosurgery. Developing such a registration method is challenging due to factors such as tumor resection, the complexity of brain pathology, and the demand for fast computation. We propose a novel feature-driven active registration framework. Landmarks and their displacements are first estimated from a pair of US images using corresponding local image features. A Gaussian Process (GP) model is then used to interpolate a dense deformation field from the sparse landmarks. The GP kernel parameters are estimated using variograms and a discrete grid search. If necessary, the user can actively add new landmarks, based on the image context and a visualization of the uncertainty measure provided by the GP, to further improve the result. We retrospectively demonstrate our registration framework as a robust and accurate brain shift compensation solution on clinical data. |
format | Article |
id | mit-1721.1/137470 |
institution | Massachusetts Institute of Technology |
language | English |
publishDate | 2021 |
publisher | Springer International Publishing |
record_format | dspace |
spelling | mit-1721.1/137470; last modified 2022-10-01T23:25:40Z. Issued: 2018. Made available in DSpace: 2021-11-05T14:13:57Z (original submission 2019-05-30T12:33:07Z). Type: Article (http://purl.org/eprint/type/ConferencePaper). ISSN: 0302-9743; 1611-3349. Handle: https://hdl.handle.net/1721.1/137470. Citation: Luo, Jie, Toews, Matthew, Machado, Ines, Frisken, Sarah, Zhang, Miaomiao et al. 2018. "A Feature-Driven Active Framework for Ultrasound-Based Brain Shift Compensation." Language: en. DOI: 10.1007/978-3-030-00937-3_4. License: Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/). File format: application/pdf. Publisher: Springer International Publishing. Source: arXiv. |
title | A Feature-Driven Active Framework for Ultrasound-Based Brain Shift Compensation |
url | https://hdl.handle.net/1721.1/137470 |