A Clifford Analytic Signal-Based Breast Lesion Segmentation Method for 4D Spatial-Temporal DCE-MRI Sequences

Bibliographic Details
Main Authors: Liang Wang, Haocheng Shen, Jun Zhang, Yanchun Zhu, Cheng Jiang
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8944006/
Description
Summary: Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is increasingly used for lesion detection in breast cancer diagnosis because it provides spatial-temporal information. However, the massive and complex 4D spatial-temporal DCE-MRI data make the diagnosis process lengthy and error-prone. Moreover, normal fibroglandular tissue is occasionally enhanced through background parenchymal enhancement (BPE), which can degrade the performance of current algorithms. We propose a new method using a 3D Clifford analytic signal (CAS) approach for breast lesion segmentation of DCE-MRI data. A 2D temporal image is constructed from all the 2D DCE-MRI slices acquired at different scanning time points on a given transverse plane, following the CAS approach. A 3D Clifford temporal image (CTI) is then constructed by successively stacking these temporal images. Compared with the traditional DCE-MRI subtraction image, the proposed CTI distinguishes lesion regions more clearly, both visually and quantitatively. Finally, we employ a fully convolutional network (FCN) model for breast lesion segmentation, using the CTI as one of its inputs. Experimental results on an independent public dataset (TCIA QIN breast DCE-MRI) and a private in-house breast DCE-MRI dataset (TBD) show that the proposed method outperforms current methods, both qualitatively and quantitatively.
ISSN: 2169-3536
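
The summary above outlines a pipeline in which each transverse plane's time series of 2D slices is collapsed into a 2D temporal image and the per-plane temporal images are then stacked into a 3D volume (the CTI) that feeds an FCN. The following is a minimal sketch of that stacking pipeline, not the authors' implementation: the true Clifford analytic signal transform is not described in this record, so the ordinary 1D analytic signal (scipy.signal.hilbert) along the temporal axis is used as a stand-in, and the function names, array shapes, and envelope-based summary are illustrative assumptions.

```python
# Rough sketch of the temporal-image / CTI construction outlined in the summary.
# ASSUMPTIONS: the ordinary 1D analytic signal (scipy.signal.hilbert) along the
# time axis stands in for the paper's Clifford analytic signal; shapes, function
# names, and the max-envelope summary are illustrative, not the authors' method.
import numpy as np
from scipy.signal import hilbert


def temporal_image(dce_slices: np.ndarray) -> np.ndarray:
    """Collapse one transverse plane's time series into a 2D 'temporal image'.

    dce_slices: array of shape (T, H, W), the same transverse plane at T
    scanning time points. Returns an (H, W) float image built from the
    envelope (magnitude) of the analytic signal along time.
    """
    # Analytic signal along the temporal axis (axis=0); its magnitude is the envelope.
    analytic = hilbert(dce_slices.astype(np.float64), axis=0)
    envelope = np.abs(analytic)
    # Summarize the temporal envelope per pixel (peak enhancement here).
    return envelope.max(axis=0)


def clifford_temporal_image(dce_4d: np.ndarray) -> np.ndarray:
    """Stack per-plane temporal images into a 3D volume (CTI analogue).

    dce_4d: array of shape (T, Z, H, W). Returns an array of shape (Z, H, W),
    which could then serve as one input channel to a segmentation FCN.
    """
    planes = [temporal_image(dce_4d[:, z]) for z in range(dce_4d.shape[1])]
    return np.stack(planes, axis=0)


if __name__ == "__main__":
    # Toy 4D sequence: 6 time points, 4 transverse planes, 64x64 pixels.
    rng = np.random.default_rng(0)
    dce = rng.random((6, 4, 64, 64)).astype(np.float32)
    cti = clifford_temporal_image(dce)
    print(cti.shape)  # (4, 64, 64): one temporal image per transverse plane
```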