Classification of head and neck cancer from PET images using convolutional neural networks

Bibliographic Details
Main Authors: Henri Hellström, Joonas Liedes, Oona Rainio, Simona Malaspina, Jukka Kemppainen, Riku Klén
Format: Article
Language: English
Published: Nature Portfolio, 2023-06-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-023-37603-1
author Henri Hellström
Joonas Liedes
Oona Rainio
Simona Malaspina
Jukka Kemppainen
Riku Klén
collection DOAJ
description Abstract The aim of this study was to develop a convolutional neural network (CNN) for classifying positron emission tomography (PET) images of patients with and without head and neck squamous cell carcinoma (HNSCC) and other types of head and neck cancer. A PET/magnetic resonance imaging scan with 18F-fluorodeoxyglucose (18F-FDG) was performed on 200 head and neck cancer patients, 182 of whom were diagnosed with HNSCC, and the locations of the cancer tumors were marked on the images with a binary mask by a medical doctor. The models were trained and tested with five-fold cross-validation on the primary data set of 1990 2D images, obtained by dividing the original 3D images of 178 HNSCC patients into transaxial slices, and on an additional test set of 238 images from the patients with head and neck cancers other than HNSCC. A shallow and a deep CNN were built using the U-Net architecture to classify the data into two groups based on whether an image contains cancer or not. The impact of data augmentation on the performance of the two CNNs was also considered. According to our results, the best model for this task in terms of area under the receiver operating characteristic curve (AUC) is the deep augmented model, with a median AUC of 85.1%. The four models had the highest sensitivity for HNSCC tumors in the root of the tongue (median sensitivities of 83.3–97.7%), in the fossa piriformis (80.2–93.3%), and in the oral cavity (70.4–81.7%). Although the models were trained with only HNSCC data, they also had very good sensitivity for detecting follicular and papillary carcinoma of the thyroid gland and mucoepidermoid carcinoma of the parotid gland (91.7–100%).
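To make the described workflow concrete, below is a minimal sketch (Python with TensorFlow/Keras and scikit-learn) of a binary PET-slice classifier evaluated with five-fold cross-validation and AUC. It is not the authors' code: the study used shallow and deep U-Net-based models, whereas this sketch uses a plain convolutional encoder, and the input size (128×128), layer widths, training epochs, and slice-level fold splitting are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of a cancer / no-cancer PET-slice
# classifier with five-fold cross-validation and AUC evaluation, loosely
# following the setup described in the abstract. Input size, filter counts,
# epochs, and the slice-level split are assumptions for illustration.
import numpy as np
import tensorflow as tf
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score

def build_shallow_cnn(input_shape=(128, 128, 1)):
    """A small convolutional encoder ending in a sigmoid output."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

def cross_validate(x, y, epochs=20, batch_size=32):
    """x: (n_slices, H, W, 1) transaxial PET slices; y: 0/1 slice labels
    (1 if the physician-drawn tumour mask for that slice is non-empty)."""
    aucs = []
    skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for train_idx, test_idx in skf.split(x, y):
        model = build_shallow_cnn(x.shape[1:])
        model.compile(optimizer="adam", loss="binary_crossentropy")
        # Data augmentation (e.g. flips/rotations of x[train_idx]) could be added here.
        model.fit(x[train_idx], y[train_idx],
                  epochs=epochs, batch_size=batch_size, verbose=0)
        probs = model.predict(x[test_idx], verbose=0).ravel()
        aucs.append(roc_auc_score(y[test_idx], probs))
    return np.median(aucs)
```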
first_indexed 2024-03-13T01:56:35Z
format Article
id doaj.art-9fdb55474bbd441597a19d720ae9c98b
institution Directory Open Access Journal
issn 2045-2322
language English
last_indexed 2024-03-13T01:56:35Z
publishDate 2023-06-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
affiliation Turku PET Centre, University of Turku and Turku University Hospital (all authors)
title Classification of head and neck cancer from PET images using convolutional neural networks
url https://doi.org/10.1038/s41598-023-37603-1