How Deeply to Fine-Tune a Convolutional Neural Network: A Case Study Using a Histopathology Dataset

Accurate classification of medical images is of great importance for correct disease diagnosis. Automating medical image classification is highly desirable because it can provide a second opinion, or even a better classification, when experienced medical staff are in short supply. Convolutional neural networks (CNNs) improved the image classification domain by eliminating the need to manually select which features to use to classify images. Training a CNN from scratch requires very large annotated datasets, which are scarce in the medical field. Transfer learning of CNN weights from another large, non-medical dataset can help overcome the problem of medical image scarcity. Transfer learning consists of fine-tuning CNN layers to suit the new dataset. The main questions when using transfer learning are how deeply to fine-tune the network and what difference this makes to generalization. In this paper, all of the experiments were run on two histopathology datasets using three state-of-the-art architectures, to systematically study the effect of block-wise fine-tuning of CNNs. Results show that fine-tuning the entire network is not always the best option, especially for shallow networks; fine-tuning only the top blocks can instead save both time and computational power and produce more robust classifiers.
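The block-wise fine-tuning the abstract describes amounts to loading pretrained weights, freezing the network, and then unfreezing only the top-most blocks plus the new classifier head before retraining. A minimal, framework-agnostic sketch of that selection logic is below; the block names and the `fine_tune_top_blocks` helper are hypothetical illustrations, not the authors' code.

```python
# Sketch of block-wise fine-tuning: treat a pretrained network as an
# ordered list of blocks and unfreeze only the top k of them (the new
# classifier head counts as the top-most block).
from dataclasses import dataclass

@dataclass
class Block:
    name: str
    trainable: bool = False  # frozen by default, as after loading pretrained weights

def fine_tune_top_blocks(blocks, k):
    """Unfreeze the top k blocks; keep all earlier blocks frozen."""
    for block in blocks:
        block.trainable = False
    for block in blocks[len(blocks) - k:]:
        block.trainable = True
    return blocks

# A VGG16-like layout: five convolutional blocks plus a classifier head.
net = [Block(f"conv_block_{i}") for i in range(1, 6)] + [Block("classifier")]
fine_tune_top_blocks(net, k=2)  # tune only conv_block_5 and the classifier
print([b.name for b in net if b.trainable])
```

In a real framework the same decision is expressed by toggling per-layer trainability flags (e.g. a layer's `trainable` attribute in Keras, or `requires_grad` on parameters in PyTorch); the paper's question is how large `k` should be for a given dataset and architecture depth.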

Bibliographic Details
Main Authors: Ibrahem Kandel, Mauro Castelli
Format: Article
Language: English
Published: MDPI AG, 2020-05-01
Series: Applied Sciences
Subjects: convolutional neural network; image classification; transfer learning; medical images; deep learning; fine-tuning
Online Access:https://www.mdpi.com/2076-3417/10/10/3359
DOI: 10.3390/app10103359
ISSN: 2076-3417
Author affiliations: Nova Information Management School (NOVA IMS), Universidade Nova de Lisboa, Campus de Campolide, 1070-312 Lisboa, Portugal (both authors)
Indexed in: Directory of Open Access Journals (record doaj.art-849d0fff95324678b11f51cbef82c6d8)