BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network
The aim of this study was to investigate the potential of a machine learning algorithm to classify breast cancer solely by the presence of soft tissue opacities in mammograms, independent of other morphological features, using a deep convolutional neural network (dCNN). Soft tissue opacities were classified based on their radiological appearance using the ACR BI-RADS atlas...
Main Authors: | Albin Sabani, Anna Landsmann, Patryk Hejduk, Cynthia Schmidt, Magda Marcon, Karol Borkowski, Cristina Rossi, Alexander Ciritsis, Andreas Boss |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-06-01 |
Series: | Diagnostics |
Subjects: | breast neoplasms; mammography; neural networks, computer; machine learning; artificial intelligence |
Online Access: | https://www.mdpi.com/2075-4418/12/7/1564 |
_version_ | 1827624167037468672 |
---|---|
author | Albin Sabani; Anna Landsmann; Patryk Hejduk; Cynthia Schmidt; Magda Marcon; Karol Borkowski; Cristina Rossi; Alexander Ciritsis; Andreas Boss |
author_facet | Albin Sabani; Anna Landsmann; Patryk Hejduk; Cynthia Schmidt; Magda Marcon; Karol Borkowski; Cristina Rossi; Alexander Ciritsis; Andreas Boss |
author_sort | Albin Sabani |
collection | DOAJ |
description | The aim of this study was to investigate the potential of a machine learning algorithm to classify breast cancer solely by the presence of soft tissue opacities in mammograms, independent of other morphological features, using a deep convolutional neural network (dCNN). Soft tissue opacities were classified based on their radiological appearance using the ACR BI-RADS atlas. We included 1744 mammograms from 438 patients to create 7242 icons by manual labeling. The icons were sorted into three categories: “no opacities” (BI-RADS 1), “probably benign opacities” (BI-RADS 2/3) and “suspicious opacities” (BI-RADS 4/5). A dCNN was trained (70% of data), validated (20%) and finally tested (10%). A sliding window approach was applied to create colored probability maps for visual impression. Diagnostic performance of the dCNN was compared to human readout by experienced radiologists on a “real-world” dataset. The accuracies of the models on the test dataset ranged between 73.8% and 89.8%. Compared to human readout, our dCNN achieved a higher specificity (100%, 95% CI: 85.4–100%; reader 1: 86.2%, 95% CI: 67.4–95.5%; reader 2: 79.3%, 95% CI: 59.7–91.3%), and the sensitivity (84.0%, 95% CI: 63.9–95.5%) was lower than that of human readers (reader 1: 88.0%, 95% CI: 67.4–95.4%; reader 2: 88.0%, 95% CI: 67.7–96.8%). In conclusion, a dCNN can be used for the automatic detection as well as the standardized and observer-independent classification of soft tissue opacities in mammograms independent of the presence of microcalcifications. Human decision making in accordance with the BI-RADS classification can be mimicked by artificial intelligence. |
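The description above outlines the full workflow: manually labeled icons sorted into three BI-RADS-derived classes, a dCNN trained on a 70%/20%/10% train/validation/test split, and a sliding-window pass over whole mammograms to produce colored probability maps. The Python sketch below illustrates that workflow in a Keras-style setup; the layer configuration, the 128-pixel icon size, the window stride and all hyperparameters are assumptions chosen for illustration, since this record does not specify the authors' actual implementation.

```python
# Illustrative sketch only: the record does not publish the network
# architecture or hyperparameters, so the layer sizes, the 128x128 icon
# size and the stride below are assumptions, not the authors' setup.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3   # "no opacities" (BI-RADS 1), "probably benign" (2/3), "suspicious" (4/5)
ICON_SIZE = 128   # assumed edge length of the manually labeled icons

def build_dcnn(input_shape=(ICON_SIZE, ICON_SIZE, 1)):
    """Small CNN classifier for the three opacity categories.
    The abstract reports training on 70% of the 7242 icons, with 20%
    for validation and 10% for the final test."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def sliding_window_probability_map(model, mammogram, window=ICON_SIZE, stride=64):
    """Slide a window over a 2-D mammogram array and collect the class
    probabilities at each position, yielding a coarse per-class map that
    can be rendered as a colored overlay."""
    h, w = mammogram.shape
    rows = (h - window) // stride + 1
    cols = (w - window) // stride + 1
    prob_map = np.zeros((rows, cols, NUM_CLASSES), dtype=np.float32)
    for i in range(rows):
        for j in range(cols):
            patch = mammogram[i * stride:i * stride + window,
                              j * stride:j * stride + window]
            patch = patch[np.newaxis, ..., np.newaxis].astype(np.float32)
            prob_map[i, j] = model.predict(patch, verbose=0)[0]
    return prob_map
```

For the colored maps described in the abstract, one channel of the resulting probability map (for example, the “suspicious opacities” class) can be upsampled to the mammogram resolution and blended over the original image for visual impression.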
first_indexed | 2024-03-09T12:02:02Z |
format | Article |
id | doaj.art-ba852106417c4c48943257fe07b9309c |
institution | Directory Open Access Journal |
issn | 2075-4418 |
language | English |
last_indexed | 2024-03-09T12:02:02Z |
publishDate | 2022-06-01 |
publisher | MDPI AG |
record_format | Article |
series | Diagnostics |
spelling | doaj.art-ba852106417c4c48943257fe07b9309c | 2023-11-30T23:02:45Z | eng | MDPI AG | Diagnostics | 2075-4418 | 2022-06-01 | 12(7):1564 | 10.3390/diagnostics12071564 | BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network | Albin Sabani; Anna Landsmann; Patryk Hejduk; Cynthia Schmidt; Magda Marcon; Karol Borkowski; Cristina Rossi; Alexander Ciritsis; Andreas Boss (all: Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland) | The aim of this study was to investigate the potential of a machine learning algorithm to classify breast cancer solely by the presence of soft tissue opacities in mammograms, independent of other morphological features, using a deep convolutional neural network (dCNN). Soft tissue opacities were classified based on their radiological appearance using the ACR BI-RADS atlas. We included 1744 mammograms from 438 patients to create 7242 icons by manual labeling. The icons were sorted into three categories: “no opacities” (BI-RADS 1), “probably benign opacities” (BI-RADS 2/3) and “suspicious opacities” (BI-RADS 4/5). A dCNN was trained (70% of data), validated (20%) and finally tested (10%). A sliding window approach was applied to create colored probability maps for visual impression. Diagnostic performance of the dCNN was compared to human readout by experienced radiologists on a “real-world” dataset. The accuracies of the models on the test dataset ranged between 73.8% and 89.8%. Compared to human readout, our dCNN achieved a higher specificity (100%, 95% CI: 85.4–100%; reader 1: 86.2%, 95% CI: 67.4–95.5%; reader 2: 79.3%, 95% CI: 59.7–91.3%), and the sensitivity (84.0%, 95% CI: 63.9–95.5%) was lower than that of human readers (reader 1: 88.0%, 95% CI: 67.4–95.4%; reader 2: 88.0%, 95% CI: 67.7–96.8%). In conclusion, a dCNN can be used for the automatic detection as well as the standardized and observer-independent classification of soft tissue opacities in mammograms independent of the presence of microcalcifications. Human decision making in accordance with the BI-RADS classification can be mimicked by artificial intelligence. | https://www.mdpi.com/2075-4418/12/7/1564 | breast neoplasms; mammography; neural networks, computer; machine learning; artificial intelligence |
spellingShingle | Albin Sabani; Anna Landsmann; Patryk Hejduk; Cynthia Schmidt; Magda Marcon; Karol Borkowski; Cristina Rossi; Alexander Ciritsis; Andreas Boss | BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network | Diagnostics | breast neoplasms; mammography; neural networks, computer; machine learning; artificial intelligence |
title | BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network |
title_full | BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network |
title_fullStr | BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network |
title_full_unstemmed | BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network |
title_short | BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network |
title_sort | bi rads based classification of mammographic soft tissue opacities using a deep convolutional neural network |
topic | breast neoplasms; mammography; neural networks, computer; machine learning; artificial intelligence |
url | https://www.mdpi.com/2075-4418/12/7/1564 |
work_keys_str_mv | AT albinsabani biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork AT annalandsmann biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork AT patrykhejduk biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork AT cynthiaschmidt biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork AT magdamarcon biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork AT karolborkowski biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork AT cristinarossi biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork AT alexanderciritsis biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork AT andreasboss biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork |