Offloading the computational complexity of transfer learning with generic features

Deep learning approaches are generally complex, requiring extensive computational resources and exhibiting high time complexity. Transfer learning is a state-of-the-art approach for reducing the demand for computational resources by reusing pre-trained models without compromising accuracy or performance. In conventional studies, pre-trained models are trained on datasets from different but similar domains and therefore carry many domain-specific features. The computational requirements of transfer learning depend directly on the number of features, which comprise both domain-specific and generic features. This article investigates the prospects of reducing the computational requirements of transfer learning models by discarding domain-specific features from a pre-trained model. The approach is applied to breast cancer detection using the Curated Breast Imaging Subset of the Digital Database for Screening Mammography (CBIS-DDSM) dataset and evaluated with metrics such as precision, accuracy, recall, F1-score, and computational requirements. Discarding domain-specific features up to a specific limit provides significant performance improvements and reduces the computational requirements in terms of training time (by approx. 12%), processor utilization (by approx. 25%), and memory usage (by approx. 22%). The proposed transfer learning strategy increases accuracy (by approx. 7%) while offloading computational complexity.
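The abstract does not give implementation details, but its core idea, keeping only the generic early-layer features of a pre-trained network and discarding the deeper, domain-specific layers before fine-tuning, can be illustrated with a minimal Keras sketch. The backbone (VGG16), the cut point (block3_pool), the input size, and the classification head below are illustrative assumptions, not the authors' configuration.

import tensorflow as tf
from tensorflow.keras import layers, models

# Pre-trained backbone; VGG16 on ImageNet is an assumed choice for illustration.
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))

# Keep only the early blocks, which capture generic edge/texture features,
# and discard the deeper, domain-specific blocks of the pre-trained model.
cut_layer = "block3_pool"  # hypothetical cut point, not taken from the article
generic = models.Model(inputs=base.input,
                       outputs=base.get_layer(cut_layer).output)
generic.trainable = False  # transfer the generic features without retraining them

# Small task-specific head for binary mammogram classification.
model = models.Sequential([
    generic,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),  # e.g., benign vs. malignant
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])
model.summary()

Because fewer pre-trained layers are retained and those layers are frozen, both the parameter count and the per-epoch cost drop, which is the effect the article quantifies in terms of training time, processor utilization, and memory usage.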

Bibliographic Details
Main Authors: Muhammad Safdar Ali Khan, Arif Husen, Shafaq Nisar, Hasnain Ahmed, Syed Shah Muhammad, Shabib Aftab
Format: Article
Language: English
Published: PeerJ Inc. 2024-03-01
Series: PeerJ Computer Science
Subjects: Generic features, Domain specific features, Deep learning, Cancer detection, Cancer classification, Transfer learning
Online Access: https://peerj.com/articles/cs-1938.pdf
author Muhammad Safdar Ali Khan
Arif Husen
Shafaq Nisar
Hasnain Ahmed
Syed Shah Muhammad
Shabib Aftab
collection DOAJ
description Deep learning approaches are generally complex, requiring extensive computational resources and exhibiting high time complexity. Transfer learning is a state-of-the-art approach for reducing the demand for computational resources by reusing pre-trained models without compromising accuracy or performance. In conventional studies, pre-trained models are trained on datasets from different but similar domains and therefore carry many domain-specific features. The computational requirements of transfer learning depend directly on the number of features, which comprise both domain-specific and generic features. This article investigates the prospects of reducing the computational requirements of transfer learning models by discarding domain-specific features from a pre-trained model. The approach is applied to breast cancer detection using the Curated Breast Imaging Subset of the Digital Database for Screening Mammography (CBIS-DDSM) dataset and evaluated with metrics such as precision, accuracy, recall, F1-score, and computational requirements. Discarding domain-specific features up to a specific limit provides significant performance improvements and reduces the computational requirements in terms of training time (by approx. 12%), processor utilization (by approx. 25%), and memory usage (by approx. 22%). The proposed transfer learning strategy increases accuracy (by approx. 7%) while offloading computational complexity.
format Article
id doaj.art-e52eedd3ae8a409dade49873af42170a
institution Directory Open Access Journal
issn 2376-5992
language English
publishDate 2024-03-01
publisher PeerJ Inc.
record_format Article
series PeerJ Computer Science
spelling PeerJ Computer Science 10:e1938 (2024-03-01), PeerJ Inc., ISSN 2376-5992, DOI 10.7717/peerj-cs.1938. Offloading the computational complexity of transfer learning with generic features. Muhammad Safdar Ali Khan, Arif Husen, Shafaq Nisar, Hasnain Ahmed, Syed Shah Muhammad, Shabib Aftab (all: Department of Computer Science and Information Technology, Virtual University of Pakistan, Lahore, Punjab, Pakistan). Online access: https://peerj.com/articles/cs-1938.pdf
title Offloading the computational complexity of transfer learning with generic features
topic Generic features
Domain specific features
Deep learning
Cancer detection
Cancer classification
Transfer learning
url https://peerj.com/articles/cs-1938.pdf