Extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer
Abstract One drawback of existing artificial intelligence (AI)-based histopathological prediction models is the lack of interpretability. The objective of this study is to extract p16-positive oropharyngeal squamous cell carcinoma (OPSCC) features in a form that can be interpreted by pathologists using an AI model. We constructed a model for predicting p16 expression using a dataset of whole-slide images from 114 OPSCC biopsy cases. We used the clustering-constrained attention-based multiple-instance learning (CLAM) model, a weakly supervised learning approach. To improve performance, we incorporated tumor annotation into the model (Annot-CLAM) and achieved a mean area under the receiver operating characteristic curve of 0.905. Utilizing the image patches on which the model focused, we examined the features of model interest via histopathologic morphological analysis and cycle-consistent adversarial network (CycleGAN) image translation. The histopathologic morphological analysis evaluated the histopathological characteristics of image patches, revealing significant differences in the numbers of nuclei, the perimeters of the nuclei, and the intercellular bridges between p16-negative and p16-positive image patches. Using the CycleGAN-converted images, we confirmed that the conversion significantly changed the sizes and densities of nuclei. This novel approach improves interpretability in histopathological morphology-based AI models and contributes to the identification of clinically valuable histopathological morphological features.
Main Authors: | Masahiro Adachi, Tetsuro Taki, Naoya Sakamoto, Motohiro Kojima, Akihiko Hirao, Kazuto Matsuura, Ryuichi Hayashi, Keiji Tabuchi, Shumpei Ishikawa, Genichiro Ishii, Shingo Sakashita |
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2024-02-01 |
Series: | Scientific Reports |
Online Access: | https://doi.org/10.1038/s41598-024-55288-y |
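The abstract describes a CLAM-based weakly supervised model, which aggregates patch-level features from a whole-slide image into a single slide-level prediction using attention-based multiple-instance learning. A minimal sketch of that attention pooling mechanism is shown below; this is an illustration only, not the authors' implementation: the projection matrix `V` and scoring vector `w` are random toy parameters here, whereas CLAM learns them end to end and additionally uses gated attention and clustering constraints.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_mil_pool(patch_feats, V, w):
    """Attention-based MIL pooling: score each patch embedding,
    normalize the scores into attention weights, and return the
    attention-weighted sum as the slide-level representation.

    patch_feats: (n_patches, d) patch embeddings
    V: (d, h) hidden projection; w: (h,) scoring vector
    """
    hidden = np.tanh(patch_feats @ V)   # (n, h) hidden representation
    scores = hidden @ w                 # (n,) unnormalized attention
    attn = softmax(scores)              # (n,) weights summing to 1
    slide_repr = attn @ patch_feats     # (d,) slide-level embedding
    return slide_repr, attn

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 16))        # 8 toy patches, 16-dim features
V = rng.normal(size=(16, 4))
w = rng.normal(size=(4,))
slide, attn = attention_mil_pool(feats, V, w)
```

The attention weights `attn` are what make this family of models partially interpretable: the patches with the highest weights are the ones the study inspects for morphological features.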
_version_ | 1797274587533148160 |
author | Masahiro Adachi, Tetsuro Taki, Naoya Sakamoto, Motohiro Kojima, Akihiko Hirao, Kazuto Matsuura, Ryuichi Hayashi, Keiji Tabuchi, Shumpei Ishikawa, Genichiro Ishii, Shingo Sakashita |
author_facet | Masahiro Adachi, Tetsuro Taki, Naoya Sakamoto, Motohiro Kojima, Akihiko Hirao, Kazuto Matsuura, Ryuichi Hayashi, Keiji Tabuchi, Shumpei Ishikawa, Genichiro Ishii, Shingo Sakashita |
author_sort | Masahiro Adachi |
collection | DOAJ |
description | Abstract One drawback of existing artificial intelligence (AI)-based histopathological prediction models is the lack of interpretability. The objective of this study is to extract p16-positive oropharyngeal squamous cell carcinoma (OPSCC) features in a form that can be interpreted by pathologists using an AI model. We constructed a model for predicting p16 expression using a dataset of whole-slide images from 114 OPSCC biopsy cases. We used the clustering-constrained attention-based multiple-instance learning (CLAM) model, a weakly supervised learning approach. To improve performance, we incorporated tumor annotation into the model (Annot-CLAM) and achieved a mean area under the receiver operating characteristic curve of 0.905. Utilizing the image patches on which the model focused, we examined the features of model interest via histopathologic morphological analysis and cycle-consistent adversarial network (CycleGAN) image translation. The histopathologic morphological analysis evaluated the histopathological characteristics of image patches, revealing significant differences in the numbers of nuclei, the perimeters of the nuclei, and the intercellular bridges between p16-negative and p16-positive image patches. Using the CycleGAN-converted images, we confirmed that the conversion significantly changed the sizes and densities of nuclei. This novel approach improves interpretability in histopathological morphology-based AI models and contributes to the identification of clinically valuable histopathological morphological features. |
first_indexed | 2024-03-07T15:01:26Z |
format | Article |
id | doaj.art-806b75dc683d4dd09ddc6442cc76e37c |
institution | Directory Open Access Journal |
issn | 2045-2322 |
language | English |
last_indexed | 2024-03-07T15:01:26Z |
publishDate | 2024-02-01 |
publisher | Nature Portfolio |
record_format | Article |
series | Scientific Reports |
spelling | doaj.art-806b75dc683d4dd09ddc6442cc76e37c | 2024-03-05T19:08:39Z | eng | Nature Portfolio | Scientific Reports | 2045-2322 | 2024-02-01 | 14 | 1 | 1-12 | 10.1038/s41598-024-55288-y | Extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer | Masahiro Adachi, Tetsuro Taki, Naoya Sakamoto, Motohiro Kojima (Department of Pathology and Clinical Laboratories, National Cancer Center Hospital East); Akihiko Hirao (Division of Pathology, National Cancer Center Exploratory Oncology Research and Clinical Trial Center); Kazuto Matsuura, Ryuichi Hayashi (Department of Head and Neck Surgery, National Cancer Center Hospital East); Keiji Tabuchi (Department of Otolaryngology, Head and Neck Surgery, University of Tsukuba); Shumpei Ishikawa (Division of Pathology, National Cancer Center Exploratory Oncology Research and Clinical Trial Center); Genichiro Ishii, Shingo Sakashita (Department of Pathology and Clinical Laboratories, National Cancer Center Hospital East) | https://doi.org/10.1038/s41598-024-55288-y |
spellingShingle | Masahiro Adachi, Tetsuro Taki, Naoya Sakamoto, Motohiro Kojima, Akihiko Hirao, Kazuto Matsuura, Ryuichi Hayashi, Keiji Tabuchi, Shumpei Ishikawa, Genichiro Ishii, Shingo Sakashita | Extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer | Scientific Reports |
title | Extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer |
title_full | Extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer |
title_fullStr | Extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer |
title_full_unstemmed | Extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer |
title_short | Extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer |
title_sort | extracting interpretable features for pathologists using weakly supervised learning to predict p16 expression in oropharyngeal cancer |
url | https://doi.org/10.1038/s41598-024-55288-y |
work_keys_str_mv | AT masahiroadachi extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT tetsurotaki extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT naoyasakamoto extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT motohirokojima extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT akihikohirao extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT kazutomatsuura extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT ryuichihayashi extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT keijitabuchi extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT shumpeiishikawa extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT genichiroishii extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer AT shingosakashita extractinginterpretablefeaturesforpathologistsusingweaklysupervisedlearningtopredictp16expressioninoropharyngealcancer |
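The headline metric in the abstract, a mean area under the receiver operating characteristic curve of 0.905, can be computed from slide-level prediction scores and binary p16 labels. The following is a minimal, dependency-free sketch using the rank-based (Mann-Whitney) identity AUROC = P(score of a random positive > score of a random negative); the function name and toy data are illustrative, not from the study.

```python
def auroc(labels, scores):
    """AUROC via the Mann-Whitney identity: the fraction of
    positive/negative pairs ranked correctly, counting ties as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two p16-negative and two p16-positive slides.
print(auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

This pairwise formulation is O(n_pos * n_neg); for large evaluation sets a sort-based implementation (as in scikit-learn's `roc_auc_score`) is the practical choice.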