Classification for Breast Ultrasound Using Convolutional Neural Network with Multiple Time-Domain Feature Maps
Ultrasound (US) imaging is widely utilized as a diagnostic screening method, and deep learning has recently drawn attention for analyzing US images to assess the pathological status of tissues. While low image quality and poor reproducibility are common obstacles in US analysis, small dataset size is an additional limitation for deep learning because it hampers generalization.
Main Authors: | Hyungsuk Kim; Juyoung Park; Hakjoon Lee; Geuntae Im; Jongsoo Lee; Ki-Baek Lee; Heung Jae Lee |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-10-01 |
Series: | Applied Sciences |
Subjects: | medical ultrasound; breast US images; deep learning; convolutional neural network; B-mode image; entropy image |
Online Access: | https://www.mdpi.com/2076-3417/11/21/10216 |
_version_ | 1827678322115477504 |
author | Hyungsuk Kim; Juyoung Park; Hakjoon Lee; Geuntae Im; Jongsoo Lee; Ki-Baek Lee; Heung Jae Lee
author_facet | Hyungsuk Kim; Juyoung Park; Hakjoon Lee; Geuntae Im; Jongsoo Lee; Ki-Baek Lee; Heung Jae Lee
author_sort | Hyungsuk Kim |
collection | DOAJ |
description | Ultrasound (US) imaging is widely utilized as a diagnostic screening method, and deep learning has recently drawn attention for analyzing US images to assess the pathological status of tissues. While low image quality and poor reproducibility are common obstacles in US analysis, small dataset size is an additional limitation for deep learning because it hampers generalization. In this work, a convolutional neural network (CNN) using multiple feature maps, such as entropy and phase images, as well as a B-mode image, was proposed to classify breast US images. Although B-mode images contain both anatomical and textural information, traditional CNNs have difficulty abstracting features automatically, especially with small datasets. In the proposed CNN framework, two distinct feature maps were derived from each B-mode image and used as additional inputs for training the CNN. These feature maps can also be computed from the evaluation data and applied to the CNN separately for the final classification decision. Experimental results with 780 breast US images in three categories (benign, malignant, and normal) showed that the proposed CNN framework using multiple feature maps exhibited better performance than the traditional CNN using B-mode images alone for most deep network models.
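The description outlines the framework: two time-domain feature maps (an entropy image and a phase image) are derived from each B-mode image and fed to the CNN alongside the original image. The record does not give the exact feature-map definitions, so the following Python snippet is only a minimal sketch under common assumptions: sliding-disk Shannon entropy for the entropy map and the instantaneous phase of the axial analytic signal for the phase map. The function names, window radius, and axis choice are illustrative, not the authors' implementation.

```python
# Minimal sketch (not the authors' exact pipeline) of deriving two
# time-domain feature maps -- a local-entropy image and a phase image --
# from one B-mode image, so each map can be used as a separate CNN input.
# Disk radius and the Hilbert-transform axis are assumptions.
import numpy as np
from scipy.signal import hilbert
from skimage.filters.rank import entropy
from skimage.morphology import disk
from skimage.util import img_as_ubyte


def entropy_map(bmode: np.ndarray, radius: int = 5) -> np.ndarray:
    """Local Shannon entropy over a sliding disk neighborhood.
    Assumes non-negative envelope/B-mode amplitudes."""
    bmode_u8 = img_as_ubyte(bmode / (bmode.max() + 1e-12))  # rank filter needs uint8
    return entropy(bmode_u8, disk(radius)).astype(np.float32)


def phase_map(bmode: np.ndarray) -> np.ndarray:
    """Instantaneous phase along the axial (depth) direction via the
    analytic signal; axis=0 assumes rows correspond to depth."""
    analytic = hilbert(bmode.astype(np.float32), axis=0)
    return np.angle(analytic).astype(np.float32)  # values in [-pi, pi]


def build_inputs(bmode: np.ndarray) -> dict:
    """Return the three single-channel inputs used to train the CNN(s)."""
    return {
        "bmode": bmode.astype(np.float32),
        "entropy": entropy_map(bmode),
        "phase": phase_map(bmode),
    }


if __name__ == "__main__":
    fake_bmode = np.abs(np.random.randn(256, 256))  # stand-in for a real image
    for name, img in build_inputs(fake_bmode).items():
        print(name, img.shape, img.dtype)
```

At evaluation time, the same maps would be computed for each test image, each map applied to the CNN separately, and the per-map outputs combined (e.g., by score averaging or majority vote; the exact combination rule is not specified in this record) to produce the final benign/malignant/normal decision.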
first_indexed | 2024-03-10T06:06:40Z |
format | Article |
id | doaj.art-890fb61d22b74595ab69b870c3c0e465 |
institution | Directory Open Access Journal |
issn | 2076-3417 |
language | English |
last_indexed | 2024-03-10T06:06:40Z |
publishDate | 2021-10-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
spelling | doaj.art-890fb61d22b74595ab69b870c3c0e465 (2023-11-22T20:29:31Z), eng, MDPI AG, Applied Sciences, ISSN 2076-3417, published 2021-10-01, Vol. 11, Iss. 21, Article 10216, doi:10.3390/app112110216. "Classification for Breast Ultrasound Using Convolutional Neural Network with Multiple Time-Domain Feature Maps" by Hyungsuk Kim, Juyoung Park, Hakjoon Lee, Geuntae Im, Jongsoo Lee, Ki-Baek Lee, and Heung Jae Lee (all: Department of Electrical Engineering, Kwangwoon University, Seoul 01897, Korea). URL: https://www.mdpi.com/2076-3417/11/21/10216. Keywords: medical ultrasound; breast US images; deep learning; convolutional neural network; B-mode image; entropy image.
spellingShingle | Hyungsuk Kim; Juyoung Park; Hakjoon Lee; Geuntae Im; Jongsoo Lee; Ki-Baek Lee; Heung Jae Lee; Classification for Breast Ultrasound Using Convolutional Neural Network with Multiple Time-Domain Feature Maps; Applied Sciences; medical ultrasound; breast US images; deep learning; convolutional neural network; B-mode image; entropy image
title | Classification for Breast Ultrasound Using Convolutional Neural Network with Multiple Time-Domain Feature Maps |
title_full | Classification for Breast Ultrasound Using Convolutional Neural Network with Multiple Time-Domain Feature Maps |
title_fullStr | Classification for Breast Ultrasound Using Convolutional Neural Network with Multiple Time-Domain Feature Maps |
title_full_unstemmed | Classification for Breast Ultrasound Using Convolutional Neural Network with Multiple Time-Domain Feature Maps |
title_short | Classification for Breast Ultrasound Using Convolutional Neural Network with Multiple Time-Domain Feature Maps |
title_sort | classification for breast ultrasound using convolutional neural network with multiple time domain feature maps |
topic | medical ultrasound; breast US images; deep learning; convolutional neural network; B-mode image; entropy image
url | https://www.mdpi.com/2076-3417/11/21/10216 |
work_keys_str_mv | AT hyungsukkim classificationforbreastultrasoundusingconvolutionalneuralnetworkwithmultipletimedomainfeaturemaps AT juyoungpark classificationforbreastultrasoundusingconvolutionalneuralnetworkwithmultipletimedomainfeaturemaps AT hakjoonlee classificationforbreastultrasoundusingconvolutionalneuralnetworkwithmultipletimedomainfeaturemaps AT geuntaeim classificationforbreastultrasoundusingconvolutionalneuralnetworkwithmultipletimedomainfeaturemaps AT jongsoolee classificationforbreastultrasoundusingconvolutionalneuralnetworkwithmultipletimedomainfeaturemaps AT kibaeklee classificationforbreastultrasoundusingconvolutionalneuralnetworkwithmultipletimedomainfeaturemaps AT heungjaelee classificationforbreastultrasoundusingconvolutionalneuralnetworkwithmultipletimedomainfeaturemaps |